Justas Janauskas asked me: "Mangirdai, I would love to ask you a question and have you answer it publicly. Can you explain in simple words what GPT-3 is, how it works, who can use it, and what can be done with it?"

My answer: GPT-3 was released at the beginning of June by OpenAI as an invite-only API. It is one of the most hyped technologies in recent years. In my opinion it is overhyped by new startups trying to ride the hype for a product launch or fundraising. Let's dig deeper.

GPT-3 is a transformer-based language model, the successor to GPT-2 and GPT-1. Transformer-type neural networks are an ongoing trend, demonstrating better and better performance. The idea behind GPT-3 is simple: it is trained to correctly predict the next word in a sentence. The model can then be adapted to other tasks as well.

GPT-3 works with text. It can answer questions, generate articles, even translate text into terminal commands or generate code (although such demos should be taken with a grain of salt). GPT-3's real innovation is its ability to adapt to new (unseen) tasks: good, sometimes state-of-the-art, results can be achieved by showing it only 16-32 examples of the new task. This is important, because training neural networks from zero is very hard.

The GPT-3 API lowers the barrier for product development by small startups. Unfortunately, the model is not open source and is available only as an invite-only API (which will become paid going forward). On the other hand, similar results can be achieved with GPT-2/BERT/other models and custom training.

Some good reviews so far: https://blog.rasa.com/gpt-3-careful-first-impressions/ https://minimaxir.com/2020/07/gpt3-expectations/
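To make the "adapt with a handful of examples" idea concrete, here is a minimal sketch using the openly available GPT-2 through the Hugging Face transformers library (GPT-3 itself is only reachable via the invite-only API). The translation prompt is just an illustration of the few-shot setup; GPT-2 small will likely not translate well, the point is only the mechanics: the model sees the examples as plain text and keeps predicting the next word.

```python
# Minimal few-shot prompting sketch with GPT-2 (openly available),
# assuming the Hugging Face `transformers` package is installed.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Task description plus a few examples, written as plain text.
prompt = (
    "Translate English to French:\n"
    "sea otter => loutre de mer\n"
    "cheese => fromage\n"
    "plush giraffe => girafe peluche\n"
    "good morning =>"
)

input_ids = tokenizer.encode(prompt, return_tensors="pt")
output = model.generate(
    input_ids,
    max_length=input_ids.shape[1] + 5,   # predict only a few more tokens
    do_sample=False,                     # greedy: take the most likely next word
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0][input_ids.shape[1]:]))
```

No gradient updates happen here; the same model, prompted differently, can answer questions or continue an article. That is the "few-shot" trick GPT-3 scales up.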

Thomas Desimpel, Angel Investor, Polyglot, Real Estate Investor
Thanks for sharing, Mangirdas Adomaitis. I learned something new and interesting 👌
2 months ago

Continuing on my previous GPT-3 insight: well-written criticism of GPT-3's capabilities. Cherry-picked examples aside, GPT-3 gives some surprisingly good answers, but then fails on similar examples. Full article ("GPT-3, Bloviator: OpenAI's language generator has no idea what it's talking about"): https://www.technologyreview.com/2020/08/22/1007539/gpt3-openai-language-generator-artificial-intelligence-ai-opinion/ All data: https://cs.nyu.edu/faculty/davise/papers/GPT3CompleteTests.html
Congrats to the Rasa team, which just raised a $26M Series B to continue building an open-source company. I used the Rasa framework to build chatbots. Rasa is open source and free of charge for commercial use. https://techcrunch.com/2020/06/23/rasa-raises-26m-led-by-a16z-for-its-open-source-conversational-ai-platform/

I want to discuss investments in open source. A16Z is not doing charity here:
1️⃣ Just 12 months ago Rasa raised $13M from Accel.
2️⃣ Rasa's annual recurring revenue increased 3x in the last 12 months.
3️⃣ Downloads grew sixfold in the last 14 months.
4️⃣ Some customers: Adobe, Deutsche Telekom, BMW, Airbus.

Companies building open source earn money from consulting work. Later they might offer a paid service with advanced features (security, versioning, etc.) for corporate clients. It seems to be a long-term play. In retrospect, Rasa could have built a closed-source SaaS business.
► Why go with open source when the end goal is still profit?
► Does anyone know other financially successful businesses that went open source from day one?

Danielius Visockas, Software Engineer @ Qoorio; Burger geek; Sound processing
Prisma @ https://www.prisma.io/ is a similar example. It's cool to see open source businesses growing!
The GPT-3 paper from OpenAI is live. I really appreciate OpenAI's work, but feel somewhat disappointed..
► Increasing the number of parameters is a technical challenge, but without new approaches it boils down to server resources and money. GPT-3 has 175 billion parameters, more than 100x compared to GPT-2. Cost estimates are $7-12 million to train the model! The cost is mind-blowing.
► Collecting more data is not a research problem.
► Still no open-source model or code??
Paper ("Language Models are Few-Shot Learners"): https://arxiv.org/abs/2005.14165 GitHub repo: https://github.com/openai/gpt-3
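As a sanity check on the 175 billion figure, here is a rough back-of-envelope calculation, assuming the architecture numbers published in the paper (96 layers, model width 12288) and the common approximation of about 12 * d_model^2 weight parameters per transformer layer (attention projections plus the feed-forward block), ignoring embeddings, biases and layer norms.

```python
# Rough parameter-count estimate for GPT-3 from the paper's published
# architecture (96 layers, d_model = 12288). The 12 * d_model^2 per-layer
# figure is an approximation: 4*d^2 for the attention projections plus
# 8*d^2 for the feed-forward block; embeddings and biases are ignored.
n_layers = 96
d_model = 12288

per_layer = 12 * d_model ** 2             # weight matrices in one transformer block
total = n_layers * per_layer
print(f"~{total / 1e9:.0f}B parameters")  # prints ~174B, close to the reported 175B
```

This also makes the scaling point concrete: doubling d_model roughly quadruples the weight count, so compute and cost grow very fast without any new research ideas.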