It is not surprising that the CEO of OpenAI is enthusiastic about his company’s language processing tool, the artificial intelligence chatbot ChatGPT.
But some of Sam Altman’s comments in a highly anticipated fireside chat smacked of hyperbole.
He asserted that AI is ‘the next platform’ for explosive innovation, economic empowerment, and value creation.
Altman also claimed the AI revolution will be bigger than the Industrial Revolution or any prior tech revolution, with the potential to create ‘unimaginable economic growth and prosperity’.
He may eventually be right, but we reserve the right to be at least a little skeptical of that type of proselytizing.
That said, here are five points from Altman that stood out to analysts from Morgan Stanley Research, along with three companies well-positioned to benefit.
Altman defines Artificial General Intelligence (AGI) as any system more powerful than humans at cognitive work.
He says to think of OpenAI as a ‘reasoning engine’ and *not* a fact database.
He acknowledged challenges, saying that while OpenAI will never reach 100% accuracy, it *will* surpass human accuracy.
He talked about the immense scale, capital and compute required to train/run AI systems, which made MSFT the perfect partner and is also a reason NVDA stock has been working.
The detailed commentary is bulleted below, but five points stood out most, with implications for NVDA, MSFT & GOOGL:
1. The scale, compute (GPU) and capital requirements for AI models are much higher than anticipated. The high capital requirements will lead to a relatively small number of players who can create underlying base models, with a middle layer on top where players ‘fine tune’ the models for specific verticals. The emphasis on capital requirements and compute is why NVDA stock has been working, and why OpenAI is working with MSFT as its exclusive cloud partner.
2. On AI cost per query, Altman said his original estimate of ‘a few cents’ per query is ‘down a lot now’. “We can stress about the current gross margin or believe in the curve of technology that will get cheaper and cheaper.” He said chips do get cheaper all the time, but the main driver of cost reductions is research: OpenAI recently reduced the cost of ChatGPT by 10x versus the prior model through extraordinary research gains. The cost-per-query comment and faith in cost reductions are a relief for GOOGL given concerns about higher costs. He also said GOOGL is switching to OpenAI’s approach with ‘incredible vigor’ and that he is ‘sure they will do great work’.
3. Early AI use cases expected this year include AI legal helpers, AI medical advisors, and AI education tutors. This creates ‘economic empowerment’, as small businesses can use AI for legal documents, customer service responses, data entry, etc. instead of paying for those tasks. The legal and educational focus has created an overhang on stocks like LZ and CHGG.
4. People are surprised that creative work is impacted more by AI than ‘mundane’ jobs, but Altman said that makes sense given the lower ‘precision’ required in creative use cases. For instance, DALL-E can be used to make a logo, and if just one of 40 attempts is good, that’s a positive outcome. By contrast, mistakes in using AI to drive a car would have serious consequences. That said, AI precision is getting better ‘every week’.
5. In terms of challenges, Altman called out ‘hallucinations’ (AI responding confidently without justification in training data) and inaccuracies, though both are improving quickly. On risks, Altman is focused on getting AGI to be as ‘beneficial and safe’ as possible. He emphasized safety several times.