GPT-3 can’t save you today. GPT-3, released by OpenAI in private beta, dazzled the world in 2020 with incredible demos: generating SQL queries from plain English, writing code, and doing what looks like reading comprehension. Yes, reading comprehension is hard even for humans (think SAT and GRE). We covered this exciting news in our article GPT-3 past present future, and covered NLP fundamentals in our Getting Started with NLP article. It is just as important to follow up with what GPT-3 cannot do: its limitations, weaknesses, and mistakes. That is hard to assess without full access to GPT-3 yet, but we did some early investigation and present our findings in this article.
We were also able to get hands-on with the GPT-3 playground. First let’s talk about Sam Altman, then about what really happens inside GPT-3.
This article is a work in progress and is being continuously updated. Any feedback is welcome. We write all kinds of programming, data science, machine learning, and deep learning articles. To support us, please subscribe on Medium, clap, or, if you feel like it, leave a comment below.
All machine learning models are defined and limited by the data they consume. After all, GPT-3 learned from a huge slice of sensible, random, factual, and fictional content scraped from the internet: forums, Wikipedia, Twitter, and so on.
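This limitation is easy to see in miniature. The toy sketch below (our own illustration, not GPT-3 code) trains a tiny bigram language model on a twelve-word corpus; it can only ever emit words, and word sequences, that appeared in its training data. GPT-3 is vastly larger and more sophisticated, but the same principle applies: what it never consumed, it cannot reliably produce.

```python
import random

# Toy bigram language model: a table mapping each word to the list of
# words that followed it in the training corpus.
corpus = "the cat sat on the mat the dog sat on the rug".split()

bigrams = {}
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams.setdefault(prev, []).append(nxt)

def generate(start, length=6, seed=0):
    """Generate text by repeatedly sampling a word seen after the last one."""
    random.seed(seed)
    out = [start]
    for _ in range(length - 1):
        followers = bigrams.get(out[-1])
        if not followers:  # dead end: nothing ever followed this word in training
            break
        out.append(random.choice(followers))
    return " ".join(out)

print(generate("the"))
# Every generated word comes from the corpus; the model cannot invent
# a word (or, by analogy, a fact) it never saw during training.
```

The analogy is loose, of course: GPT-3 generalizes across billions of parameters rather than looking up literal bigrams. But its knowledge ceiling is still set by its training data, which is exactly why its mix of factual and fictional internet text matters.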