AI has undoubtedly been one of the most transformative – and talked-about – technologies of recent times. What can we look forward to in 2025? Noam Mizrahi, Chief Technology Officer at Marvell, has a front-row view of AI’s next era. He also brings a global perspective: he is based at Marvell’s site in Israel, spoke to us from India, and was at Marvell’s headquarters in Santa Clara, Calif., last week.
In this Q&A, Noam shares his vision for how silicon and other foundational technologies can power AI’s future, what changes we might see in data center architecture, and the role customization and efficiency will play in sustaining AI’s rapid growth.
Q: With the AI boom now entering its third year, do you expect the pace of innovation to continue?
Noam Mizrahi: Up until now, the industry has largely followed a “Bigger is Better” path – building larger models to achieve better performance. But we should ask ourselves whether “Bigger” is still better. Can smaller, specialized models do the same or better job than a single, giant model? I believe they can.
Smaller models are often more efficient to train, easier to update, and better suited to specific tasks. You could think of them like human experts – doctors, lawyers, engineers – each specializing in their own fields. Similarly, we might see a shift toward a network of smaller, interconnected models, what I call “the Internet of Models.” These models could collaborate to deliver precise, cost-effective insights, much like how the internet itself functions as a collection of interconnected websites.
We will still have larger models that serve as foundation models, reasoning models and new, yet-to-come types, all working together as part of the “Internet of Models.” But I expect smaller, domain-expert models to make up the bulk of it. The industry’s future lies in balancing efficiency, specialization, and innovation.
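To make the idea a bit more concrete, here is a minimal, purely illustrative sketch of how an “Internet of Models” router might dispatch a query to smaller domain-expert models. The expert names and the simple keyword-based routing are hypothetical stand-ins for whatever lightweight classifier or foundation model would handle the routing in practice.

```python
# Illustrative sketch only: a toy "Internet of Models" router that sends each
# query to a small domain-expert model instead of one giant general model.
# The expert names and keyword routing below are hypothetical placeholders.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ExpertModel:
    name: str
    answer: Callable[[str], str]   # stand-in for a small, specialized model

def build_experts() -> Dict[str, ExpertModel]:
    return {
        "medical": ExpertModel("medical", lambda q: f"[medical expert] {q}"),
        "legal":   ExpertModel("legal",   lambda q: f"[legal expert] {q}"),
        "silicon": ExpertModel("silicon", lambda q: f"[silicon expert] {q}"),
    }

def route(query: str, experts: Dict[str, ExpertModel]) -> str:
    # In practice a lightweight classifier (or a foundation model) would pick
    # the expert; simple keyword matching keeps the sketch self-contained.
    keywords = {"diagnosis": "medical", "contract": "legal", "layout": "silicon"}
    for word, domain in keywords.items():
        if word in query.lower():
            return experts[domain].answer(query)
    return experts["silicon"].answer(query)   # arbitrary fallback for the demo

if __name__ == "__main__":
    experts = build_experts()
    print(route("Review this supply contract clause", experts))
```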
Q. Can you share an example of how AI is making a significant impact at Marvell – either customer-facing or internally?
Noam Mizrahi: AI presents an incredible opportunity for Marvell. One of the biggest areas we’re focusing on is the need to customize silicon at scale and optimize every detail. We work closely with our customers to bring their vision to life, and these advancements, especially in data centers, open up billion-dollar opportunities for us.
But from a wider, industry-wide perspective, we’re at an inflection point where we can optimize the entire chip-making process – from algorithms and coding to verification, validation, physical design, software development, productization and more. This will drive silicon technology to the next level, reduce development costs, and accelerate adoption of AI across the industry.
As an engineer, I sometimes think about the notion that AI could replace certain tasks or even jobs. But this is something we’ve seen before. Thirty years ago, when we were doing chip layouts by hand, the introduction of automated layout tools that could handle millions and then billions of transistors raised similar concerns about the tools taking over. I believe similar concerns came up when the machines of the industrial revolution were introduced. But what actually happened is that this evolution of technology enabled people to grow into the next phase and direct the machines toward even more complex and demanding tasks – tasks that are still controlled by humans.
The same will happen now – AI won’t replace humans so much as empower those who understand AI technology to do even more. That’s why it’s so critical to stay on top of AI advancements.
Q: How can silicon and other foundational technologies accelerate to meet AI’s challenges?
Noam Mizrahi: I see silicon technology as one of the key enablers of AI’s present and future growth. It’s the fundamental piece – without it, much of the innovation AI brings wouldn’t be possible. While AI algorithms have been around for a long time, what has changed over the last few years is that we now have the compute platforms to handle the necessary computations. That shift has been the catalyst for the incredible advancements we’re seeing today.
To push AI forward, we also need chips that are not only higher performing but also more efficient. For instance, reducing power consumption is critical. The data centers being built today to host next-generation AI clusters – with hundreds of thousands and even a million AI accelerators – require the power supply of a large city, and the demand keeps growing. That trend is simply not sustainable. Any technology that reduces overall power consumption will be extremely important. But we must innovate across every element of the AI data center: AI accelerator chips, memory, connectivity, storage, and switching.
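A rough back-of-the-envelope check makes the city-scale comparison plausible; the per-accelerator figure below is an assumed round number for illustration, not a Marvell specification.

```python
# Back-of-the-envelope only: ~1 kW per accelerator (chip plus its share of
# memory, networking and cooling) is an assumed round number, not a spec.
accelerators = 1_000_000
watts_per_accelerator = 1_000          # assumed, order-of-magnitude
cluster_power_gw = accelerators * watts_per_accelerator / 1e9
print(f"~{cluster_power_gw:.1f} GW")   # ~1 GW, comparable to a large city's demand
```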
Q: What changes do you foresee in data center architecture to better power AI in 2025?
Noam Mizrahi: One key trend I see is the intensification of customization. Every element in an AI data center – from accelerators to connectivity – needs to be tailored to specific use cases for a perfect fit. At the scale we’re operating, every wasted milliwatt (mW) turns into a megawatt (MW), and every non-optimized nanosecond translates into significant inefficiencies. Customizing each component is critical to managing costs and energy consumption effectively.
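The milliwatt-to-megawatt point is simple arithmetic at hyperscale: one milliwatt wasted in a component that is replicated a billion times across a data center becomes a full megawatt of load. The component count below is an illustrative assumption.

```python
# One wasted milliwatt per instance, replicated a billion times across an AI
# data center (an illustrative count), adds up to a full megawatt of load.
wasted_w_per_instance = 1e-3     # 1 mW
instances = 1_000_000_000        # assumed number of links/components at scale
print(f"{wasted_w_per_instance * instances / 1e6:.0f} MW")   # -> 1 MW
```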
Another major shift is the move toward “Optics Everywhere” within AI data centers. Optical connectivity is the only technology that can scale to meet the bandwidth, power, and reach requirements of AI platforms, at the required efficiency and reliability. While copper will continue to be used where possible, optics will increasingly take over as data centers scale. I envision a future where AI accelerators integrate silicon photonics-based optical engines directly into their packages, enabling faster and more efficient communication between components.
Finally, there will be a shift in focus from training to inference. While training has been the primary emphasis in the AI world, the spotlight is already beginning to move toward inference. Inference – where artificial intelligence meets reality – is how companies capitalize on their AI investments. To drive wider AI adoption, businesses will need to optimize inference platforms down to the last bit, making them more cost-effective and better aligned with specific use cases and user needs. Without widespread, efficient inferencing, even the best-trained models have limited real-world impact. As that focus shift happens, we will see AI platforms optimized more for the efficiency and cost of inference, in some cases at the expense of peak performance.
Q. What’s a book you’re excited to read in 2025?
Noam Mizrahi: I really love reading, and I’m always excited about books that offer fresh perspectives. One book I’m looking forward to diving into is Chip War by Chris Miller. I had the chance to hear him speak at a recent conference in India, and his insights really resonated with me. While we all know the importance of semiconductors and silicon in today’s world, Dr. Miller provided a unique angle that brought new clarity. It’s a book I can’t wait to get into.
Noam’s insights remind all of us that AI’s next frontier isn’t just about making things faster or bigger – it’s about making them smarter, more efficient, and more sustainable. As we look to 2025 and beyond, the possibilities are endless.
####
Marvell and the M logo are trademarks of Marvell or its affiliates. Please visit www.marvell.com for a complete list of Marvell trademarks. Other names and brands may be claimed as the property of others.
This blog contains forward-looking statements within the meaning of the federal securities laws that involve risks and uncertainties. Forward-looking statements include, without limitation, any statement that may predict, forecast, indicate or imply future events or achievements. Actual events or results may differ materially from those contemplated in this blog. Forward-looking statements are only predictions and are subject to risks, uncertainties and assumptions that are difficult to predict, including those described in the “Risk Factors” section of our Annual Reports on Form 10-K, Quarterly Reports on Form 10-Q and other documents filed by us from time to time with the SEC. Forward-looking statements speak only as of the date they are made. Readers are cautioned not to put undue reliance on forward-looking statements, and no person assumes any obligation to update or revise any such forward-looking statements, whether as a result of new information, future events or otherwise.