Nvidia Only Has Four Critical Customers

Posted by Kirhat | Wednesday, September 25, 2024

Nvidia reported about US$ 30 billion in revenue in its second-quarter earnings results. That is not surprising given the company's financial position. What was surprising was the AI chipmaker's disclosure that nearly half of that revenue came from only four customers.

While the customers' identities are hidden in Nvidia’s 10-Q regulatory filing, "Customer A" accounted for 14 percent of Nvidia’s revenue, two other customers accounted for 11 percent each, and a fourth accounted for 10 percent. Those sales, which together represent about 46 percent of its revenue, or roughly US$ 13.8 billion, "were primarily attributable to the Compute and Networking segment," the company said.

"We have experienced periods where we receive a significant amount of our revenue from a limited number of customers, and this trend may continue," Nvidia said in the filing.

Some analysts suspect that Nvidia’s top buyers include Alphabet, Amazon, Meta, Microsoft, and Tesla, all of them major players in the generative artificial intelligence boom. ChatGPT maker OpenAI could be among them as well.

Alphabet
Google announced in July that it was making its fastest, most cost-efficient model, Gemini 1.5 Flash, available in the unpaid version of its Gemini chatbot in over 40 languages and more than 230 countries and territories.

In August, Google brought back Gemini’s ability to generate images of people, after having to pause the feature in February, when users pointed out that Gemini was generating historically inaccurate images of people, including racially diverse Nazi-era German soldiers.

Amazon
Months after Nvidia unveiled its next-generation, highly sought-after Blackwell chips, Amazon Web Services, the world's largest cloud computing provider, "fully transitioned" its order from Nvidia’s Hopper chips to Blackwell because the "window between Grace Hopper and Grace Blackwell was small," the company told the Financial Times.

In April, Amazon chief executive Andy Jassy said in his annual letter to shareholders that the company is "optimistic that much of this world-changing AI will be built on top of AWS." While Amazon spent about US$ 1.5 billion on Nvidia’s H100 chips in 2023, the company has also built its own chips: Trainium for training AI models and Inferentia for inference.

Meta
Meta said it trained its latest Llama model, Llama 3.1 405B, with over 16,000 of Nvidia’s H100 GPUs, or graphics processing units. Meanwhile, Nvidia announced an Nvidia AI Foundry service for enterprises and nation states to build “supermodels” with Llama 3.1 405B.

Meta’s Llama models are used to power its AI chatbot, Meta AI, which is available on Facebook, Instagram, and other platforms. Mark Zuckerberg, Meta’s chief executive, told Bloomberg the company is working on Llama 4.

Microsoft
In September, Microsoft unveiled "the next wave" of its Copilot artificial intelligence tools in its suite of work apps.

The AI-powered features include Business Chat, which pulls together web data, work data, and business data into a new tool called Copilot Pages, where human- and AI-generated content can be edited, added to, and shared between work teams. Microsoft chief executive Satya Nadella said the new tool was part of his "daily habit."
