Opportuna Newsletter #8 | May-25 Edition

Don’t Call It a Comeback (But Maybe It Is): Sunlight, Silicon & Secondary Markets

Finally, some sun. And not just outside—IPOs are starting to show signs of life too. We continue to see plenty of opportunities to deploy capital, especially in secondaries. But these deals come with hidden cracks, and navigating them takes more than enthusiasm—it takes real technical know-how. Stay tuned here: much more to come!

If you’d like to catch up in person, we’ll be in The Hague on May 21 for a cybersecurity event, and at Gitex Europe in Berlin from May 22–23. We're planning to be back in Berlin in early June as well. Spring has us on the road.

This edition focuses—as ever—on the convergence of private and public tech investing. Our vertical this month: semiconductors. Startups in the space are experiencing a revival, turbocharged by the AI spending wave.

In “Current Topics,” we break down how AI is reshaping semiconductor demand—GPUs, high-bandwidth memory, packaging, and connectors are all in the spotlight. And in “AI Is Rewiring the Semiconductor Stack,” we look ahead: the next 5–10 years of innovation, and the companies pushing the frontier.

Special thanks to Michele Barbazza, our advisor on the semiconductor vertical, who helped put together this outstanding list.

🚨 Highlights of the Month

This week eToro, the Israeli stock-brokerage platform, went public, pricing its IPO at $52 a share. Shares closed at $67 on Wednesday, an opening-day pop of roughly 29%. First-day pops are back, and that is a positive sign that the IPO window is reopening. In the last month, Figma and Chime also filed to go public.

It has been a fascinating month for the future of Big Tech. OpenAI launched a shopping experience inside ChatGPT, is building a social network, and hired Fiji Simo (see why we think this matters here). Then came the revelation that Apple is exploring a move to AI search in its browser. Here is a quote worth reflecting upon from Eddy Cue, Apple’s SVP of Services:
“Cue said that, in order to improve, the AI players would need to enhance their search indexes. But, even if that doesn’t happen quickly, they have other features that are ‘so much better that people will switch.’ ‘There’s enough money now, enough large players, that I don’t see how it doesn’t happen,’ he said, referring to a switch from standard search to AI.”

Secondaries have received a lot of attention recently. Bloomberg wrote a piece on why the market is so hot right now. We found this article particularly on point:

1. VCs registering as RIAs (registered investment advisers) will allow them to take part in more secondary transactions, bringing capital and discipline into the asset class

2. Key risk in the asset class: lack of transparency

3. One fascinating takeaway: roughly a third of transactions get cancelled.

📈 Chart of the Month: Nvidia Now 30x Intel

Source: Koyfin, May 13th

🌐 Current Topics: AI Is Behind the Semiconductor Surge

The semiconductor industry is experiencing a profound shift driven by escalating AI demand. Annapurna Labs, acquired by Amazon a decade ago, exemplifies this change: it now produces chips like Trainium for Amazon’s own AI training and supplies Project Rainier, Amazon’s supercomputer developed for Anthropic.

AI spending is heavily concentrated among the top hyperscalers—Amazon, Meta, Google, and Microsoft—whose combined capex is set to approach $300 billion in 2025, roughly double their 2023 expenditures. Despite the scale of the capital involved, these companies retain substantial financial flexibility, with capex consuming less than 60% of their operating cash flow. Additionally, investments in cloud infrastructure generate returns as vendors such as Anthropic and OpenAI utilize their services.

GPUs have become central to AI computing, commanding around 90% of the market, significantly boosting Nvidia's market cap, now 30 times larger than Intel’s. AI servers also require advanced memory solutions such as HBM, essential for rapid data processing, alongside sophisticated assembly technologies including specialized die bonders, micro-bumping, and Through-Silicon Vias (TSVs). These needs also fuel demand for high-performance interconnects between GPUs and other server components.

Investors have focused on listed entities: Nvidia (GPUs), SK Hynix (HBM), BE Semiconductor (die bonders), Astera Labs (semiconductor-based connectivity), and Credo (serial connectivity solutions).

🧭 LT: AI Is Rewiring the Semiconductor Stack

Looking beyond the next two years, AI will continue reshaping semiconductor demand, shifting from training towards inference, with growing importance for edge applications across the automotive and mobile sectors. The dominant GPUs face competition from application-specific integrated circuits (ASICs) tailored for AI workloads.

A diverse group of disruptors challenges Nvidia's hegemony. Established innovators—Ampere, Cerebras, SambaNova, and Graphcore (acquired by SoftBank in 2024, reportedly for around $500m)—target extensive infrastructure deployments. Ampere leverages ARM-based architectures for efficient cloud inference, while Cerebras bets on wafer-scale engines, minimizing chip-to-chip communication for massive model training.

Newer specialized firms like Groq, Tenstorrent, d-Matrix, Furiosa, Recogni, and Lambda Labs each tackle distinct aspects of AI acceleration. Groq focuses on deterministic, ultra-low latency inference. Tenstorrent develops programmable chips with novel on-chip networking, and d-Matrix pioneers in-memory computing for energy-efficient inference. Furiosa and Recogni specialize in niche applications, including automotive perception.

Photonics emerges as another transformative technology, addressing traditional electrical bottlenecks in data movement and computation. Companies like Ayar Labs build chip-adjacent optical communication solutions, significantly enhancing data transfer efficiency between processors and memory in AI systems.

More ambitiously, Celestial AI and Lightmatter explore photonic computing—leveraging light for actual computations, potentially surpassing electrical computation in speed and efficiency. Celestial AI aims to utilize photonics for memory-centric computations, while Lightmatter focuses on heavy mathematical processing required by AI training and inference.

The integration of photonics into semiconductor designs promises faster, more energy-efficient AI hardware, breaking current computational barriers. This fundamental shift in semiconductor architecture not only addresses AI's voracious data demands but also sets a new direction for semiconductor innovation, reshaping the industry's long-term landscape.

Source: Michele Barbazza


📌 Conclusion

That’s it for this edition—sunshine, silicon, and a healthy dose of secondaries. The semiconductor stack is evolving fast, and AI is both the wrecking ball and the architect. We’ll keep tracking the companies reshaping the infrastructure layer of tech, and sharing the opportunities (and risks) we see across public and private markets.

If you’re building in this space or looking for exposure, let’s talk. Get in touch here.

And if you’re in The Hague or Berlin later this month, drop us a line—we’d love to catch up over coffee or something stronger.

Until next time,
The Opportuna team