😲 It's F***ing Bonkers

Generative AI is Moving So Fast

Hello, and thanks for reading One AI Thing. Get smarter about artificial intelligence, one thing at a time.

👀 Today’s Thing: It's F***ing Bonkers

Y’all remember the Maxell guy, right?

🤖 Last week, I got to listen to some of the best 🧠 in the machine intelligence industry talk about generative AI. TL;DR: Everything is moving so fast, it’s f***ing bonkers. Actually, I think that’s a direct quote from one of said brains.

🎧 Speaking of brains, check out my 2019 interview with Pieter Abbeel on how to teach robots new skills.

📖 Backstory

☞ This all took place when I spoke at an investment event, as part of a Generative AI Day. It was fun! It was the first time I’d gone to a tech event in person, let alone spoken to a room full of industry folk, in who knows how long. Podcasting is awesome, but it’s been all remote since Covid. So big thanks 🙏 to Ten13 for inviting me!

☞ Along with me, a VC, and a startup founder, four software engineers spoke at the event. They’re all leaders in the AI/machine learning space, and all four are current or former Meta employees. While their talks were fairly different, one common takeaway was that AI — and generative AI, specifically — is moving faster than pretty much any technology any of them had ever seen before. That was my takeaway, anyway.

☞ Meta was left out of the recent White House summit of AI leaders, but it has made serious inroads with developers since releasing LLaMA, an open-source large language model (LLM). And there’s a lot of smart money being placed on open source as the best way to develop LLMs long term.

🔑 Keys to Understanding

🥇 While parameter count isn’t the only measure of performance, LLMs are generally compared by their number of parameters — where bigger is usually better. The more parameters a model has, the more capacity it has to learn patterns from its training data. The number of parameters in state-of-the-art LLMs has risen sharply, from 340 million in 2018 (Google BERT) to in excess of 1 trillion in 2021 (Google GLaM) and 2023 (Huawei PanGu-Σ and OpenAI GPT-4).
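To put that growth in perspective, here’s a quick back-of-the-envelope calculation using the figures above (treating “in excess of 1 trillion” as a conservative lower bound):

```python
# Rough growth in state-of-the-art LLM parameter counts, per the figures above
bert_2018 = 340e6   # Google BERT, 2018: 340 million parameters
glam_2021 = 1e12    # Google GLaM, 2021: over 1 trillion (lower bound)

growth = glam_2021 / bert_2018
print(f"At least a {growth:,.0f}x increase in about three years")
```

Nearly a three-thousand-fold jump in roughly three years — and that understates it, since GLaM and GPT-4 are reportedly well past the 1-trillion mark.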

🥈 People who know about markets and finance don’t expect things to slow down anytime soon, either:

  • Global AI investment surged from $12.75 billion in 2015 to $93.5 billion in 2021, and the market is projected to reach $422.37 billion by 2028. Already over $2B has been invested in generative AI, up 425% since 2020, according to the Financial Times (via Forbes).

  • The generative AI market is projected to grow from USD 11.3 billion in 2023 to USD 51.8 billion by 2028 at a compound annual growth rate (CAGR) of 35.6%.
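For the curious, that projected CAGR checks out arithmetically against the two market figures — a quick sketch:

```python
# Sanity-check the implied CAGR from the projection above:
# $11.3B (2023) -> $51.8B (2028), i.e. 5 years of compounding
start, end, years = 11.3, 51.8, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # prints "Implied CAGR: 35.6%"
```

In other words, the projection assumes the market grows by a bit more than a third every year for five straight years.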

🥉 This excerpt from a Fast Company article about whether AI will end reality as we know it (😱) pretty perfectly sums up how I felt as I tried to put my little talk together in the weeks leading up to Gen AI Day:

As I started to write the original version of this piece, more news about AI’s advancements kept breaking. I wrote and rewrote what I already had. At one point, I threw it all away. I figured this future was going to be impossible to articulate in a traditional structure of a journalistic feature, so I turned to sci-fi prototyping, a technique used by futurists and organizations like the Pentagon to prepare for what’s coming.

No, I didn’t use sci-fi prototyping. But as soon as I started digging into the latest in Gen AI, my head started spinning. It’s. All. Moving. So. Fast!

🕵️ Need More?

Searching for a certain kind of AI thing? Reply to this email and let me know what you'd like to see more of.

Until the next thing,

- Noah

p.s. Want to sign up for the One AI Thing newsletter or share it with a friend? You can find me here.