In the mid-1980s, my relationship with technology was defined by a 3:00 AM time slot. The university's mainframe was a humming, climate-controlled beast housed in a fortress a mile away. It was too busy during the day for undergraduates. So, we walked through the dark streets to sit at dumb terminals, writing code in silence. There were no windows, no mouse, and no "undo" button. You typed your logic, submitted it to the queue, and waited. If you were lucky, you got a printout the next morning on a stack of fanfold paper, creased along its neat perforations.

It wasn't just the students who lived in this rigid world. When I started my working life selling computers for IBM a few years later, the hierarchy was clear. Real money was made on mainframes and the mid-range AS/400s. We viewed personal computers as toys that never amounted to more than a rounding error in the revenue forecast. Selling software was infra dig. Even for a salesforce aspiring to join the revered pinstripe-suited gang, talking about software felt like discussing the grease one applied to sell the machine. So, we discussed solutions.

Of course, in less than a decade, the tech that one knew, worked with, and sold for decades at a time was dead. As words like the Internet and applications made their entrance, the jargon had to retire an entire vocabulary: dumb terminals, "master-slave", compilers. Those investing at the time would tell even more forceful stories of how the selection processes and habits that had made money in previous decades turned wasteful within a few years. A bigger and faster change is afoot now. The software as we know it is dead. The hardware as we know it is dead. The internet as we know it is dead. And the industry as we know it is dead. It is impossible to say anything with conviction about where we are headed, so the need to keep learning and stay flexible is higher than ever. But investing based on old habits, or predicting the future from existing trends, some obscure theory, or a long history, is fraught with risk.


Software is Dead as We Know It

The first casualty of this new era is the very thing investors have prized most for two decades: the software moat. We have operated on the assumption that writing code is hard, expensive, and creates a durable asset. That assumption collapsed for me last week. Despite being a novice programmer, I used a just-announced IDE to build a highly specific Chrome Extension for the GenInnov investment process. It took me half a day. A year ago, I attempted something far less sophisticated with a small team of programmers, and we could not produce anything useful despite trying for weeks. Even six months ago, when we wrote about the massive shift in programming automation (link), it was met with a few eyerolls and accusations of hype. Today, the biggest critics of AI productivity are ready to concede that software development is being completely overhauled.

When a novice can build enterprise-grade tools in an afternoon, the economics of the software industry invert. We are moving from a world of "software scarcity" (where you pay a premium for a tool because you cannot build it) to one of "software abundance" (where the tool is generated on demand). The barriers to entry that protected the 80% gross margins of SaaS companies are dissolving. If anyone can spin up a bespoke CRM or a data scraper for the cost of a few tokens' worth of electricity, a monthly subscription for a rigid, off-the-shelf alternative becomes one of the first items cut to cover the rising electricity bill!

But the disruption goes deeper than the creation of code. The consumption of software, desktop applications and mobile apps alike, is dying too. For twenty years, we loved the "program" model: distinct, walled gardens where you go to do a specific task. You open a spreadsheet to calculate, a browser to search, and a word processor to write. These containers are now liabilities. Generative AI does not want to work in a silo; it needs to traverse your entire digital life to be useful. It needs to see the email, the spreadsheet, and the calendar simultaneously.

This is why our desktops are beginning to look like graveyards of dead icons. We are instinctively moving away from "best-of-breed" applications that lock data away, and toward fluid interfaces where the boundaries between tools disappear. The most useful software from here on is no longer a "destination" you visit; it is a layer that sits over everything. The companies that built their empires on trapping users inside a proprietary interface are finding that users are breaking the lock.

The result is becoming difficult to ignore as we end 2025. Beyond the personal machines cluttered with old, untouched "must-have" applications, the number of native applications opened on any given day on a knowledge-worker laptop is falling towards zero. The work has moved into browsers, and it is shifting again there: into all-in-one workspaces, side panels, and tools called agents. For many of us, the number of unused SaaS logins is soaring as well. Unlike the dead application icons on the machine, those logins carry a monthly cost.

In a world where a note-taking app is also a coding surface, a research assistant, and a meeting summariser, and where video and photo editing happens through chatboxes, browsers are the new operating system. This, too, may change, but what is established is that we are witnessing the end of software as a distinct product category. For investors, the danger is clinging to a picture of software that looks tidy on a sector map but no longer exists on a screen. We were trained to think in terms of products, licenses, modules, and "feature roadmaps." The real action now sits in flows, embeddings, permissions, and how easily a system can lean on the rest of the digital world. Software, in a broad sense, is not going away; it is dissolving into something more permeable, becoming a utility, generated on the fly, invisible and boundless. But if the software layer is dissolving into a commodity, the value must migrate somewhere else: to the distribution network that feeds it data. And that network, the Internet itself, is facing an even more existential crisis.

The Internet is Dead as We Know It

Within the AI Mode of Google Search this morning, a query expectedly moved me away from the list of blue links. But with the new dynamic features, I also got an immersive, webpage-like result built instantly, just for me. Google's generative-UI output wrapped charts, synthesized reasoning, and a suggested path forward around the text answer, a hint that even the text-based AI chatboxes as we know them may soon be dead as well.

The same week, ChatGPT quietly turned into a shopping front end. Shopping Research now reads product pages, reviews, and prices across the web and produces a personalised buyer's guide, with Instant Checkout letting you complete the purchase without ever touching a merchant's site or an e-commerce marketplace.

In both cases, the "Web", that tangled mess of SEO-optimized pages we are used to wading through, was entirely absent. The user is no longer surfing the web; the AI is surfing it for them and presenting a clean, sanitized summary. The "page" is no longer a fixed document waiting to be fetched. It is an experience assembled live, for each user, for each question.

B2C apps like Uber or Shopify are holding steady for now, but long before these new features, in July 2025, Similarweb spotted a 6.7% decline in global search referrals to 1,000 major domains. AI summaries now devour 35% of U.S. queries and are slashing click-throughs on top results. Independent studies suggest click-through rates have roughly halved, from around 15% to about 8%, and some publishers report traffic drops of 50–80% on affected queries. Gartner estimates a 25% drop in search volume by 2026, and that may prove optimistic.

We see this shift tearing through sectors that seemed stable. Education had the MOOC era, where one learnt from superb content created by the best educators; that content has now turned static, and it is being replaced by AI tutors that adapt each sentence to the student's level of confusion. The hyperpersonalization on offer cuts across every strand of the old model, and education is just one example.

Take another example: social media, where the "feed" is under pressure as human creation struggles to compete with infinite, machine-generated engagement loops. Some 69% of marketers now deploy AI for content generation, birthing hyperlocal feeds. More importantly, if the future is dominated by query agents that draw on your AI history and return threaded summaries, content consumption will sit far less within the feed providers' control, given the appeal of trading the endless scroll for a few probabilistic pings.

Even payments are about to break as they are rebuilt for machine agents. OpenAI has its open Agentic Commerce Protocol, while Google has introduced the Agent Payments Protocol: effectively the earliest tools for AI agents, machines in other words, to make payments without human involvement and, down the line, with as few intermediaries as possible. A lot more innovation is likely in the quarters ahead, and it could put pressure on the vast swath of intermediaries that benefit from payment workflows established before these new rails arrived.
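To make the shift concrete, here is a minimal, hypothetical sketch of what agent-initiated checkout could look like in principle: a human grants a bounded mandate once, and the machine transacts within it. The field names and flow below are illustrative assumptions, not the actual Agentic Commerce Protocol or Agent Payments Protocol schemas.

```python
# Toy illustration of an agent-initiated checkout with a spending mandate.
# NOT the actual ACP or AP2 specification; names and flow are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class Mandate:
    """A human-granted, bounded permission the agent must stay within."""
    user_id: str
    max_amount_usd: float
    expires_at: datetime


@dataclass
class CheckoutRequest:
    """What an agent might submit to a merchant endpoint on the user's behalf."""
    mandate: Mandate
    merchant_id: str
    item_sku: str
    amount_usd: float


def authorise(req: CheckoutRequest) -> bool:
    """The merchant (or payment rail) checks the mandate, not the human."""
    within_budget = req.amount_usd <= req.mandate.max_amount_usd
    not_expired = datetime.now(timezone.utc) < req.mandate.expires_at
    return within_budget and not_expired


if __name__ == "__main__":
    mandate = Mandate("user-123", max_amount_usd=200.0,
                      expires_at=datetime(2026, 1, 1, tzinfo=timezone.utc))
    req = CheckoutRequest(mandate, "merchant-xyz", "sku-42", amount_usd=59.0)
    print(authorise(req))  # True while under budget and before expiry
```

The design point, under these assumptions, is the standing mandate: the intermediaries that today verify a human at every step are replaced by a permission that machines check programmatically, which is exactly where the pressure on payment workflows comes from.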

The internet is now a substrate of APIs, feeds, embeds, and protocols that models swim through while the user talks to one or two front ends that feel like “software” but are really windows into everything. It’s being strip-mined in real time, its once-sacred pages reduced to training fodder for models that no longer need to send you there. That creates a new kind of bottleneck. And that brings us to the most observed, and yet least discussed part: the utterly changed nature of hardware. 

Hardware is Dead as We Know It

This brings us full circle, back to my 3:00 AM escapades in college. Those were days spent in front of dumb terminals, with an all-powerful, inscrutable computing behemoth hidden somewhere behind them. The personal computing devices of the subsequent era seemed to mark an irreversible stage of evolution. As we never tired of saying, we carried more computing power in our pockets than the first moon landing required.

For now, the pendulum is reversing sharply, and our devices, regardless of the market capitalization of some of the stocks linked to them, are turning into their own kind of dumb terminals.

The "brain" itself has undergone a complete biological replacement. The Central Processing Unit (CPU) has been demoted. It is no longer the CEO of the computer; it is a glorified middle manager, shuffling papers while the real work is done by massive, specialized arrays of GPUs, TPUs, and LPUs (Language Processing Units). The spend on data centres in 2025 could exceed total spend on PCs and Laptops, and by some calculations may surpass that on mobile phones in a few more years. If the era of the "do-it-all" processor is over, the worry about ROI should be more about the spend still on the front end, with ever-declining utility.

Your laptop, your phone, and your corporate data center are becoming increasingly inadequate. The best edge inference chips top out at roughly 50 TOPS of INT8 compute, while Gemini 3 Flash Thinking already demands 1,200 TOPS sustained for the full multimodal chain. The utility share of the front-end device has fallen below 4% of total compute cycles for a typical knowledge-worker query, down from around 95% in 2019. The remaining 96% occurs in hyperscale data centres, with implications for data-centre chip revenue that could be 10x the money spent on traditional servers next year.
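Taken at face value, those figures imply a gap the edge cannot close. A back-of-the-envelope check, treating the numbers cited in the paragraph above as assumptions rather than independent measurements:

```python
# Back-of-the-envelope check using the figures cited above
# (treated as assumptions, not independent measurements).
edge_tops = 50          # rough ceiling of today's edge inference chips (INT8)
required_tops = 1_200   # sustained demand cited for a full multimodal chain

shortfall = required_tops / edge_tops
print(f"Demand vs. edge supply: {shortfall:.0f}x")      # ~24x gap

local_share = 0.04      # share of compute cycles left on the device
remote_share = 1 - local_share
print(f"Cycles running off-device: {remote_share:.0%}")  # 96%
```

Even a 24x shortfall cannot be closed by a device refresh cycle or two, which is why the work migrates to the grid rather than waiting for the pocket to catch up.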

The local-memory-in-a-box world is also ending. Old hardware was organised around a neat stack: CPU, some DRAM slots, a storage controller, and an upgrade cycle where you "put in more RAM" every few years. AI workloads are breaking that architecture as well. The unit of compute is no longer a single board but a "Pod" or a "Supercluster": thousands of chips wired together with interconnects (such as NVLink or InfiniBand) that act as a nervous system a hundred times faster than the internet. We aren't building computers anymore; we are building synthetic organisms the size of warehouses. The hot data often sits in stacked high-bandwidth memory (HBM) on the same package as the accelerator, with separate pools of cheaper DRAM and flash hanging off the system. New interconnects like CXL let servers treat external memory appliances as if they were local, pooling DRAM across a rack so the whole rack can share it.
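For a rough sense of scale behind the "nervous system" comparison, here is a small sketch that sets a modern accelerator interconnect against ordinary network links. The bandwidth figures are public headline numbers used as assumptions (NVLink 4 at roughly 900 GB/s per GPU, 100 Gbit/s data-centre Ethernet, a 1 Gbit/s consumer connection), so treat the ratios as order-of-magnitude illustrations, not benchmarks.

```python
# Order-of-magnitude comparison of intra-pod vs. ordinary network bandwidth.
# Headline figures used as assumptions, not measurements.
nvlink_gbit_per_gpu = 900 * 8       # ~900 GB/s per GPU (NVLink 4), in Gbit/s
datacentre_ethernet_gbit = 100      # a common 100 Gbit/s Ethernet link
home_broadband_gbit = 1             # a fast consumer internet connection

print(f"vs. 100G Ethernet : {nvlink_gbit_per_gpu / datacentre_ethernet_gbit:,.0f}x")
print(f"vs. home broadband: {nvlink_gbit_per_gpu / home_broadband_gbit:,.0f}x")
```

Under these assumptions the in-pod fabric is roughly seventy times faster than a fast data-centre Ethernet link and thousands of times faster than the connection most of us call "the internet", which is why the pod, not the box, is now the unit of compute.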

Effectively, the comforting idea that "my application runs on one machine with its own memory" is quietly being retired. Today, the cutting-edge silicon isn't going into your laptop or phone; it is going into the grid. This may change again, but either way, the devices in our offices and pockets are going obsolete faster than the advanced scientific calculators of my college days ever did.

Conclusion: The Industry is Dead as We Know It

Old habits die hard. I am writing you a 3,000-word essay to be read in an email inbox, a format that feels increasingly like sending a smoke signal in the age of Starlink. I feel that if I attach a hero image and hit send, I can pretend someone will read it instead of asking an agent for the 12-second version. We are all cosplaying modernity while clutching rituals already fossilized.

This denial bleeds into how we invest. We still sit in investment committees and demand multiyear revenue forecasts, pretending we can see across a horizon that changes every Tuesday. We treat "SaaS retention" and "brand moats" as laws of physics. We stack spreadsheets with margin assumptions as if pricing, competition, and regulation remain the same slow-moving variables. Excel is our favourite nostalgia device. In it, we build valuation models for castles made of sand, assuming that a B2C giant or a smartphone maker has a divine right to rule forever.

The industry we celebrated, with its reliable compounders, "Rule of 40" scorecards, and infinite hiring plans, is walking dead. We are unable to accept the decision-making implications of not knowing, or rather, of not being able to know. So, we keep underwriting deals with no clear exit path and lean on rule-of-thumb convictions drawn from trends that worked in years past.

The hardest admission is simple: we cannot know what the future of "tech" will look like. Maybe, one day, the tech trends will become more predictable again, but while in transition, it is important to accept the lack of visibility ahead. The arguments above show that hardware is becoming a utility, software is becoming a commodity, and the internet is becoming a ghost town. To survive the next decade, we must be willing to torch our own playbooks before the market does it for us, because every aspect of tech as we knew it is afire. What remains are shifting fault lines of compute, data, and behaviour. The only durable edge now is to accept that the old is gone, even before we strap in to make judgments on the new.
