In 1975, psychologist Donald Campbell coined what became known as Campbell's Law: "The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor."
Today, we might call this the "AI narrative trap." We're so focused on the flashy demos, bold pronouncements, and model comparisons that we risk missing the real story unfolding beneath the surface – the story of endurance, not just speed, and of products, not just raw technology. This brings to mind the pre-season press conference for the AI Championship.
The reigning champ, Alphabet (represented by a calm Sundar Pichai), sits quietly, having dominated the "Search Game" for years.
Across the room, OpenAI (with ChatGPT as their star rookie), backed by Microsoft's Satya Nadella, is loud. "Search is over!" they declare. "We're changing the game!" The media echoes: "GOOGLE IN CRISIS!"
But Sundar remained unfazed, focused on the long game – a game that's ultimately about building solutions, not just showcasing tools.
Alphabet sits at the center of this paradox.
For months, the narrative has been straightforward: Google is playing defense, desperately trying to protect its Search empire from AI disruption. OpenAI is the innovator; Google, the incumbent, desperately trying to catch up.
But here's the thing about narratives: they're usually wrong precisely when everyone agrees on them.
The reality is that Alphabet isn't playing defense – it's orchestrating what might be the most ambitious expansion of its capabilities since the company's founding.
This isn't a story about protecting Search; it's about expanding the very definition of what Google can be, by building AI-native products, not just bolting AI onto existing ones.
Beyond the Headlines: Reading Between the Lines of Q4 2024
The most important numbers in Alphabet's Q4 earnings weren't the ones that made the headlines.
Yes:
✅ Revenue grew 12% to $96.5 billion.
☁️ Cloud revenue hit $12 billion, up 30%.
💰 The company plans to spend a staggering $75 billion on capital expenditure in 2025.
But the real story lies in three numbers that most observers glossed over:
These numbers tell a different story than the "Google playing catch-up" narrative.
They suggest something more profound: Alphabet isn't just adapting to AI; it's using AI to expand the very surface area of its business, by creating new ways for users to interact with information and technology.
🔄 Think about Circle to Search.
This isn't just a defensive move to protect Search – it's an expansion of what "search" means. When every pixel on your screen becomes a potential search interface, you're not just defending territory; you're creating new territory. When Lens handles 20 billion visual searches monthly, with most being purely incremental to traditional search, you're not replacing queries – you're creating new ones that weren't possible before. These are AI-native experiences, not just features added to existing products.
☁️ The same pattern emerges in Cloud.
The capacity constraints that led to a slight revenue miss weren't a sign of weakness – they were a symptom of demand outstripping supply. When your problem is that customers want more of your product than you can currently deliver, that's not a bad problem to have. It's the kind of problem that justifies a $75 billion capital expenditure plan.
Finally, the company highlighted a pivot to efficiency, with margin expansion helped by measures such as optimizing its real estate footprint.
Alphabet's Underappreciated Moat: The Flywheel Effect and the Power of Products
There's an old saying in technology: "First-mover advantage lasts until someone builds something better." But there's a corollary that's less discussed: "Network effects compound until they become nearly impossible to replicate." And a new corollary is emerging in the AI era: "Model advantage lasts until someone builds a better product."
This is where the real story of Alphabet's AI advantage begins. It's not just about who launched chatbots first – it's about the intricate web of advantages that compound on each other, creating a "wide and deepening moat," and about the ability to translate those advantages into compelling user experiences. Remember that initial press conference? While the newcomers were talking a big game, Alphabet was quietly playing the game, building infrastructure, refining its models, and crafting AI-native products.
A. End-to-End AI Synergies: The Full Stack Advantage… and Beyond
Most companies building AI today are like race car drivers who rent their cars, hire different mechanics for each race, and race on borrowed tracks. Alphabet, on the other hand, builds its own cars, trains its own mechanics, and owns its tracks.
This full-stack approach – from custom TPUs to efficient data centers to the Gemini model family – creates a compound effect that's easy to miss if you're looking at each piece in isolation.
As CEO Sundar Pichai noted,
But it's not just about the full stack; it's about what Alphabet does with it.
It's about taking those powerful models and building products like NotebookLM, which reimagines how we interact with documents; or AI-powered podcast and video tools, which empower creators in new ways. This is where Alphabet's strategy diverges from the "Copilot" approach of simply adding AI assistants to existing products.
These aren't just incremental improvements – they're compounding advantages that make each new AI innovation more efficient and cost-effective.
But here's where it gets interesting: Each piece of this stack doesn't just work better because it's optimized for the others – it also generates data that improves the entire system. Seven products with over two billion users each create a data flywheel that's unprecedented in scale and diversity. When Pichai mentions that Cloud customers now consume eight times more compute capacity for AI compared to 18 months ago, he's not just talking about growth – he's describing a flywheel spinning faster and faster. This flywheel doesn't just improve models; it improves products.
B. The Market is Totally Missing This: It's About Products, Not Just Models
The financial press, the tech blogs, even many investors – they're all obsessed with the model wars. Who has the biggest language model? Who has the fastest training times?
This focus on raw technological horsepower misses the fundamental point: models are ingredients, not meals. Value accrues to whoever turns raw capability into products people actually use.
And this is where Alphabet is quietly, but decisively, pulling ahead.
The market is overlooking the profound difference between having a powerful AI engine and building a compelling, user-centric, AI-native product.
Microsoft and OpenAI, with their Copilot-centric strategy, are playing a different – and, in the long run, arguably less ambitious – game.
1. Model vs. Product: A Chasm, Not a Gap
We need to stop thinking of LLMs like GPT-4 or Gemini as finished products. They are powerful ingredients, but they are not the meal. They are like having the world's most advanced kitchen appliance – capable of amazing feats, but useless without a chef, a recipe, and a hungry customer.
✅ Models are Tools, Products are Solutions:
A language model, in isolation, can generate text. But a product solves a problem. It takes that raw capability and packages it in a way that is:
🔹 Intuitive and Accessible: The UI and UX are paramount. Alphabet's Circle to Search transforms the entire screen into a search interface.
🔹 Deeply Integrated: A true AI-native product connects to other tools, data sources, and platforms. Think of how Alphabet is integrating AI across Search, YouTube, Workspace, and even its hardware.
🔹 Continuously Learning: A great AI product isn't static; it gets better over time. Alphabet's seven products, each with over two billion users, provide an unparalleled advantage. This is the flywheel in action.
🔹 Economically Viable: A successful product needs a sustainable business model. Alphabet's diverse revenue streams and experience give it a significant edge.
🔹 Widely Distributed: All of the above counts for nothing if users can't get it.
2. Alphabet's AI-Native Product Masterclass
Alphabet isn't just talking about AI; they're building it into the fabric of their products, and creating entirely new products designed from the ground up to leverage AI. This is the crucial distinction: AI-native.
NotebookLM: This isn't your grandfather's note-taking app. It's a fundamental reimagining of how we interact with information. Users can "converse" with their documents, ask complex questions, synthesize insights, and generate new content. This is AI as a thought partner.
AI-Powered Podcasts & Video: Google isn't just adding automatic transcription; they're transforming the entire media creation and consumption experience. AI-generated summaries, personalized recommendations, and tools for creators – these are not just features; they're redefining what's possible. Veo and Imagen aren't just about pretty pictures; they're about empowering creators.
Circle to Search & Lens: These aren't incremental improvements to Search; they're new paradigms for information discovery, blurring the lines between the physical and digital.
Workspace: Gemini is not a bolt-on feature here. These aren't isolated experiments; they're part of a deliberate push to embed AI deeply into every aspect of Alphabet's business. This is the antithesis of the "Copilot" approach.
3. Microsoft's Copilot: A Valuable Enhancement, But Not a Revolution
Microsoft's Copilot strategy is not a failure. Adding AI assistants to Office 365, Windows, and GitHub is a smart move in the short term. It leverages Microsoft's existing strengths:
🔹 Rapid Deployment: Bolting a Copilot onto existing products is faster.
🔹 Massive User Base: Microsoft can instantly offer AI to millions.
🔹 Enterprise Dominance: The Copilot model fits Microsoft's enterprise focus.
But – and this is a critical but – the Copilot approach has inherent limitations:
🔹 Incremental, Not Transformative: Copilots are helpers, not game-changers. They are enhancements, not revolutions. They are features, not foundational shifts.
🔹 "Feature, Not Product" (The Risk of Commoditization): If every software package has a Copilot, does the Copilot itself become a commodity?
🔹 Tethered to the Past: The success of the Copilot model is linked to the success of the underlying products. This is a dependency, not a strength.
🔹 The Innovation Ceiling: Bolting AI onto existing products limits the potential for truly groundbreaking, AI-first experiences.
4. Microsoft's Enterprise Reach vs. Alphabet's Product Universe
Microsoft has a powerful advantage in enterprise sales. Their "suite" approach is a proven model.
But Alphabet's advantage is arguably even more significant: a vast and diverse product ecosystem that touches nearly every aspect of digital life.
This gives Alphabet far more "surface area" to deploy AI, more opportunities to collect data, and more avenues to create AI-native products. It's the difference between having a single storefront and owning the entire mall.
5. Building Products is Hard: The Unspoken Truth
Here's what the market is underestimating: building great products is incredibly difficult. It's not just about technology; it's about:
🔹 Deep User Empathy: Understanding what users actually need.
🔹 Design Thinking: Crafting intuitive, delightful user experiences.
🔹 Iterative Development: Constantly testing, learning, and refining.
🔹 Cross-Functional Collaboration: Bringing together engineers, designers, product managers, and marketers.
While Microsoft has a history of building software, their focus on Copilots raises a question: are they prioritizing incremental improvements over fundamental innovation?
Alphabet, in contrast, appears to be embracing the messy, challenging, but rewarding, process of building AI-native products. This is a riskier strategy, but also the one most likely to yield transformative results.
C. The "Hidden" Value of Non-Search Assets (and Their AI-Native Potential):
Think of Alphabet like an iceberg. Search, the visible part above water, gets all the attention. But beneath the surface lies a vast network of assets that provide fertile ground for AI-native innovation.
🔹 YouTube: On the surface, it's a video platform. But dig deeper and you'll find it's become the most popular podcast listening platform in the U.S. In the living room, viewers globally streamed over 1 billion hours of YouTube content daily in 2024. This isn't just about serving ads; it's about becoming a full-spectrum media platform, ripe for AI-powered creation, recommendation, and interaction.
🔹 Waymo: Now averages over 150,000 trips per week. While many see it as a moonshot, it's actually a massive data collection engine for AI development, and a potential platform for real-world AI agents.
D. The Expanding Definition of "Search":
Many worry about AI "cannibalizing" search queries. In practice, AI agents enable users to complete more complex tasks and research, and new innovations such as Circle to Search and Lens open up a whole range of new use cases. These products are AI-native, not AI added on.
E. The Enterprise Opportunity: Beyond Infrastructure, Into Solutions
The cloud story at Alphabet isn't just about renting compute power – it's about providing AI capabilities to enterprises, and increasingly, about offering AI-powered solutions, not just infrastructure.
The evidence? Vertex AI's customer base grew 5x year-over-year, with usage up 20x.
F. Cost Discipline: The Hidden Accelerant
Alphabet's approach to cost discipline is the hidden accelerant. CFO Anat Ashkenazi emphasized that the massive $75 billion capex plan for 2025 isn't just about building capacity – it's about building the right kind of capacity. Most of the planned capex is earmarked for "short-lived assets," while operating expenses actually decreased 1% year-over-year.
Addressing the Risks (and Counterarguments)
Charlie Munger once said that the best way to minimize risk isn't to avoid it, but to seek out risk that others misunderstand.
The Search Disruption Myth (and the Rise of AI-Native Search)
The conventional wisdom is that AI chatbots pose an existential threat to Google's search business.
But Q4's results tell a different story: AI isn't cannibalizing Search. While the newcomers were busy with press conferences, Alphabet was quietly integrating AI into Search, and building entirely new search experiences.
Consider three data points that challenge the disruption narrative:
🔹 AI Overviews are monetizing "at approximately the same rate" as traditional search
🔹 Circle to Search has already captured 10% of search volume
🔹 Lens is processing 20 billion visual searches monthly, "most of which are purely incremental"
This is expansion, driven by AI-native search products.
The ROI Question: Are AI Investments Worth It? (Products, Not Just Models)
The $75 billion capex number for 2025 has raised eyebrows. But the key isn't how much you spend – it's what you build.
Alphabet's full-stack approach and its focus on AI-native products mean every dollar of AI investment works harder. It's not just about having the best model; it's about having the best products that leverage that model.
The proof is in the metrics. This isn't speculative R&D – it's investment meeting rapidly growing demand for AI-powered solutions.
The Regulatory Wild Card
Alphabet faces real regulatory scrutiny. But that pressure might actually strengthen its competitive position in AI: navigating regulation isn't just about compliance – it's about building institutional knowledge. And increased scrutiny creates a moat of its own.
Conclusion: Beyond Search: The New Google Universe, Powered by AI-Native Products
Jeff Bezos once observed that people often ask him what's going to change in the next 10 years, but they rarely ask what's not going to change.
For Alphabet, what won't change is the fundamental human desire to understand, to discover, to connect, and to get things done. What is changing is how dramatically AI expands the possibilities, and how products, not just models, will be the key.
This isn't about protecting Search – it's about expanding what information discovery, productivity, and connection can mean. The trash-talking newcomers might have grabbed the early headlines, but Alphabet was building infrastructure, refining models, and, crucially, crafting AI-native products. Now the scoreboard is telling a different story – one not just of technological prowess, but of product vision.
Consider:
🔹 Circle to Search turning every pixel into a search interface.
🔹 Gemini evolving into an agent capable of complex tasks.
🔹 Cloud customers doubling their commitments.
🔹 YouTube dominating podcasts and expanding into AI-driven content creation.
🔹 Waymo deploying autonomous vehicles at scale.
These aren't defensive moves. They're product-driven expansions.
What makes this expansion particularly powerful is how it compounds. Each new product feeds data into the flywheel. When you have seven products with over two billion users each, the compound effects become staggering. This isn't just about having the best AI; it's about having the best AI-powered ecosystem.
The real question isn't whether Google can compete in AI. The question is how many new markets will AI-powered products allow Google to enter and potentially dominate?
We're talking about Alphabet becoming the operating system for our AI-enabled future.
Now, with AI, we're seeing the potential for a similar expansion – from finding information to understanding it, from discovering possibilities to acting on them, from organizing the world's information to helping organize our lives, all powered by AI-native products.
The $75 billion capex isn't just about maintaining Alphabet's position. It's an investment in expanding what's possible.
So, as Alphabet continues its AI journey, perhaps the most important question to ask is: What new verbs will AI create, and will Google own them?
By Kristal Investment Desk
February 26, 2025