Google I/O was an AI evolution, not a revolution
At Google's I/O developer conference, the company made its case to developers, and to some extent consumers, for why its bets on AI put it ahead of rivals. At the event, the company unveiled a revamped AI-powered search engine, an AI model with an expanded context window of 2 million tokens, AI helpers across its suite of Workspace apps like Gmail, Drive and Docs, tools to integrate its AI into developers' apps, and even a future vision for AI, codenamed Project Astra, which can respond to sight, sound, voice and text combined.
While each advance on its own was promising, the onslaught of AI news was overwhelming. Though these big events are obviously aimed at developers, they're also an opportunity to wow end users with the technology. But after the flood of news, even somewhat tech-savvy consumers may be asking themselves: wait, what's Astra again? Is it the thing powering Gemini Live? Is Gemini Live sort of like Google Lens? How is it different from Gemini Flash? Is Google actually making AI glasses, or is that vaporware? What's Gemma, what's LearnLM… what are Gems? When is Gemini coming to your inbox, your docs? How do I use these things?
If you know the answers to those, congratulations, you're a TechCrunch reader. (If you don't, click the links to get caught up.)
What was missing from the overall presentation, despite the enthusiasm of individual presenters and the whooping cheers from the Google employees in the crowd, was a sense of a coming AI revolution. If AI will ultimately lead to a product that profoundly shapes the direction of technology the way the iPhone shaped personal computing, this was not the event where it debuted.
Instead, the takeaway was that we're still very much in the early days of AI development.
On the sidelines of the event, there was a sense that even Googlers knew the work was unfinished. During a demo of how AI could compile a student's study guide and quiz within moments of uploading a multihundred-page document (an impressive feat), we noticed that the quiz answers weren't annotated with the sources cited. When asked about accuracy, an employee admitted that the AI gets things mostly right, and that a future version would point to sources so people could fact-check its answers. But if you have to fact-check the answers, how reliable is an AI study guide at preparing you for the test in the first place?
In the Astra demo, a camera mounted over a table and linked to a large touchscreen let you do things like play Pictionary with the AI, show it objects, ask questions about those objects, have it tell a story and more. But the use cases for how these abilities will apply to everyday life weren't readily apparent, despite the technical advances that, on their own, are impressive.
For example, you could ask the AI to describe objects using alliteration. In the livestreamed keynote, Astra saw a set of crayons and responded, "creative crayons colored cheerfully." Neat party trick.
When I challenged Astra in a private demo to guess the object in a scribbled drawing, it correctly identified the flower and house I drew on the touchscreen right away. But when I drew a bug (one bigger circle for the body, one smaller circle for the head, little legs off the sides of the big circle), the AI stumbled. Is it a flower? No. Is it the sun? No. The employee guided the AI to guess something that was alive. I added two more legs for a total of eight. Is it a spider? Yes. A human would have seen the bug immediately, despite my lack of artistic ability.
To give you a sense of where the technology is today: Google staff didn't allow recording or photographs in the Astra demo room. They also had Astra running on an Android smartphone, but you couldn't see the app or hold the phone. The demos were fun, and the tech that made them possible is certainly worth exploring, but Google missed an opportunity to showcase how its AI technology will impact your everyday life.
When are you going to need to ask an AI to come up with a band name based on an image of your dog and a stuffed tiger, for example? Do you really need an AI to help you find your glasses? (These were other Astra demos from the keynote.)
This is hardly the first time we've watched a technology event filled with demos of an advanced future that lack real-world applications, or that pitch minor conveniences as significant upgrades. Google, for instance, has teased its AR glasses in previous years, too. (It even parachuted skydivers into I/O wearing Google Glass, a project launched over a decade ago that has since been killed off.)
After watching I/O, it feels like Google sees AI as just another means to generate additional revenue: pay for Google One AI Premium if you want its product upgrades. Perhaps, then, Google won't make the first huge consumer AI breakthrough. As OpenAI CEO Sam Altman recently mused, the original idea for OpenAI was to develop the technology and "create all sorts of benefits for the world."
"Instead," he said, "it now looks like we'll create AI and then other people will use it to create all sorts of amazing things that we all benefit from."
Google seems to be in the same boat.
Still, there were times when Google's Astra AI seemed more promising. If it could correctly identify code or suggest how to improve a system based on a diagram, it's easier to see how it could be a useful work companion. (Clippy, evolved!)
There were other moments when the real-world practicality of AI shone through, too: a better search tool for Google Photos, for instance. Plus, having Gemini's AI in your inbox to summarize emails, draft responses or list action items could help you finally get to inbox zero, or some approximation of it, more quickly. But can it clear out your unwanted but non-spam emails, smartly organize messages into labels, make sure you never miss an important message, and offer an overview of everything that needs your action the moment you log in? Can it summarize the most important news from your email newsletters? Not quite. Not yet.
In addition, some of the more complex features, like AI-powered workflows or the receipt organization that was demoed, won't roll out to Labs until September.
When thinking about how AI will impact the Android ecosystem (Google's pitch for the developers in attendance), there was a sense that even Google can't yet make the case that AI will help Android woo users away from Apple's ecosystem. "When is the best time to switch from iPhone to Android?" we asked Googlers of varying ranks. "This fall" was the general response. In other words, Google's fall hardware event, which should coincide with Apple's embrace of RCS, an upgrade to SMS that will make Android messaging more competitive with iMessage.
Simply put, consumer adoption of AI in personal computing devices may require new hardware developments (AR glasses? a smarter smartwatch? Gemini-powered Pixel Buds?), but Google isn't yet ready to reveal, or even tease, its hardware updates. And as we've already seen with the underwhelming launches of Humane's Ai Pin and Rabbit's R1, hardware is still hard.
Though much can be done today with Google's AI technology on Android devices, Google's accessories, like the Pixel Watch and Wear OS, the system that powers it, were largely overlooked at I/O beyond some minor performance improvements. Its Pixel Buds earbuds didn't even get a shout-out. In Apple's world, these accessories help lock users into its ecosystem and could someday connect them with an AI-powered Siri. They are critical pieces of its overall strategy, not optional add-ons.
Meanwhile, there's a sense of waiting for the other shoe to drop: that is, Apple's WWDC. The tech giant's Worldwide Developers Conference promises to unveil Apple's own AI agenda, perhaps through a partnership with OpenAI… or even Google. Will it be competitive? How can it be, if the AI can't integrate as deeply into the OS as Gemini can on Android? The world is waiting for Apple's response.
With a fall hardware event, Google has time to review Apple's launches and then attempt to craft its own AI moment that's as powerful, and as immediately understandable, as Steve Jobs' introduction of the iPhone: "An iPod, a phone, and an Internet communicator. An iPod, a phone… are you getting it?"
People got it. But when will they get Google's AI in the same way? Not from this I/O, at least.