Nvidia’s AI chip game is about to get even more exciting!
The tech giant just reported a jaw-dropping $14 billion profit in a single quarter, driven by surging demand for its AI chips. Even bigger news: CEO Jensen Huang revealed that Nvidia will now release a new AI chip architecture every year instead of every two years.
Huang spilled the beans during Nvidia’s Q1 fiscal 2025 earnings call, stating, “I can announce that after Blackwell, there’s another chip. We’re on a one-year rhythm.” This marks a significant shift from the previous two-year cadence, which saw Ampere launch in 2020, Hopper in 2022, and Blackwell in 2024.
Analyst Ming-Chi Kuo recently reported that the next architecture, “Rubin,” might debut in 2025, bringing us the R100 AI GPU. Huang’s comments suggest that this report is spot on.
But it’s not just AI chips that are speeding up. Huang confirmed that Nvidia will accelerate the production of all its chips, including CPUs, GPUs, networking NICs, and switches. “We’re going to take them all forward at a very fast clip,” he said.
Nvidia’s new AI GPUs are designed to be backward-compatible, making it easy for customers to upgrade their data centres without a hitch. Huang explained, “Customers will easily transition from H100 to H200 to B100 in their existing data centres.”
The demand for Nvidia’s AI GPUs is skyrocketing. Huang noted, “We expect demand to outstrip supply for some time, as we transition to H200, as we transition to Blackwell. Everyone’s anxious to get their infrastructure online.”
Huang also made a compelling FOMO argument: “The next company who reaches the next major plateau gets to announce a groundbreaking AI, and the second one after that gets to announce something that’s 0.3 percent better.”
Interestingly, Nvidia’s CFO said automotive will be the company’s “largest enterprise vertical within data centre this year,” pointing to Tesla’s purchase of 35,000 H100 GPUs for its “Full Self-Driving” system. Meanwhile, consumer internet remains a strong growth vertical, with Meta planning to have over 350,000 H100 GPUs in operation by year’s end.
Hold onto your hats, folks! Nvidia’s annual AI chip revolution is here, and it’s going to be a wild ride.