The GPT-3 paper from OpenAI is live. I really appreciate OpenAI's work, but feel somewhat disappointed. ► Increasing the number of parameters is a technical challenge. But without new approaches, it boils down to server resources and money. GPT-3 has 175 billion parameters, more than 100x GPT-2's 1.5 billion. Cost estimates are $7-12 million to train the model! The cost is mind-blowing. ► Collecting more data is not a research problem. ► No open-source model or code again?? Paper: Github repo:
Language Models are Few-Shot Learners


Continuing on my previous GPT-3 insight. A well-written criticism of GPT-3's capabilities. Leaving all the cherry-picked examples aside, GPT-3 gives some surprisingly good answers, but then fails on similar examples. Full article: All data:
GPT-3, Bloviator: OpenAI’s language generator has no idea what it’s talking about
Justas Janauskas asked me: "Mangirdai, I would love to ask you a question and you to answer it publicly. Can you explain in simple words what GPT-3 is, how it works, who can use it, and what can be done with it?" My answer:

GPT-3 was released at the beginning of June by OpenAI. It is an invite-only API. GPT-3 is one of the most hyped technologies in recent years. In my opinion it is overhyped by new startups trying to ride the hype for a product launch or fundraising. Let's dig deeper.

GPT-3 is a transformer-based language model, the successor to GPT-2 and GPT-1. Transformer-type neural networks are an ongoing trend, demonstrating better performance. The idea behind GPT-3 is simple: it is trained to correctly predict the next word in a sentence. The model can be adapted to other tasks as well.

GPT-3 works with text. It can answer questions and generate articles. It can even translate text into terminal commands or generate code (although such demos should be taken with a grain of salt). GPT-3's real innovation is its ability to adapt to new (unseen) tasks. Good results, sometimes state-of-the-art results, can be achieved by showing it only 16-32 examples of a new task. This is important: training neural networks from zero is very hard.

The GPT-3 API lowers the barrier for product development by small startups. Unfortunately, the model is not open source and is available only as an invite-only API (the API will be paid going forward). On the other hand, similar results could be achieved with GPT-2/BERT/other models with custom training. Some good reviews so far:
GPT-3: Careful First Impressions
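The few-shot adaptation described above works by packing a handful of input/output examples into the prompt itself, with no gradient updates. A minimal sketch of how such a prompt is assembled (the exact format and the `build_few_shot_prompt` helper are illustrative assumptions, not OpenAI's actual API):

```python
def build_few_shot_prompt(task_description, examples, query):
    """Assemble a few-shot prompt: a task description, then a few
    input/output example pairs, then the new query the model completes."""
    lines = [task_description, ""]
    for inp, out in examples:
        lines.append(f"Input: {inp}")
        lines.append(f"Output: {out}")
    # The model is asked to predict the next words after the final "Output:"
    lines.append(f"Input: {query}")
    lines.append("Output:")
    return "\n".join(lines)

examples = [("cheese", "fromage"), ("dog", "chien")]
prompt = build_few_shot_prompt("Translate English to French.", examples, "cat")
print(prompt)
```

Because the model only ever predicts the next word given the preceding text, conditioning it on these examples is enough to steer it toward the task, which is why 16-32 demonstrations can replace training from scratch.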

Thomas Desimpel · Angel Investor, Polyglot, Real Estate Investor
Thanks for sharing Mangirdas Adomaitis I learned something new and interesting 👌
Congrats to the Rasa team, which just raised a $26mln Series B to continue building an open source company. I used the Rasa framework to build chatbots. Rasa is open source and free of charge for commercial use. I want to discuss investments in open source. A16Z is not doing charity here: 1️⃣ Just 12 months ago Rasa raised $13mln from Accel. 2️⃣ Rasa's annual recurring revenue increased 3x in the last 12 months. 3️⃣ Downloads grew 6-fold in the last 14 months. 4️⃣ Some customers: Adobe, Deutsche Telekom, BMW, Airbus. Companies building open source earn money from consulting work. Later they might offer a paid service with advanced features (security, versioning, etc.) for corporate clients. It seems to be a long-term play. In retrospect, Rasa could have built a closed-source SaaS business. ► Why go with open source while the end goal is still profit? ► Does anyone know more financially successful businesses going open source from day one?

Danielius Visockas · Software Engineer @ Qoorio; Burger geek; Sound processing
Prisma @ is a similar example. It's cool to see open source businesses growing!