Summary: Meta, led by CEO Mark Zuckerberg, is investing billions in Nvidia's H100 GPUs to build a massive compute infrastructure for AI research and projects. By the end of 2024, Meta aims to have 350,000 of these GPUs, with total expenditures potentially reaching $9 billion. This move is part of Meta's push toward artificial general intelligence (AGI), putting it in competition with firms like OpenAI and Google's DeepMind. AI and computing investments are a key part of Meta's 2024 budget, with AI positioned as its largest single investment area.
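As a rough sanity check on those figures (assuming, purely for the arithmetic, that the ~$9 billion covers the 350,000 H100s alone and excludes networking, power, and facilities):

```latex
% Back-of-the-envelope implied unit price, under the assumption above
\[
\frac{\$9 \times 10^{9}}{350{,}000 \ \text{GPUs}} \approx \$25{,}700 \ \text{per GPU}
\]
```

That implied unit price sits at the low end of the $25,000–$40,000 range commonly cited for H100s, which seems plausible at Meta's order volume.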
Of course they do, but my point was that I doubt Meta is locked into this generation.
The article says "by the end of the year" they will spend billions.
"Spend billions" does not equal "hand over cash and take home GPUs." It'll mean a contract worth that amount, with delivery terms defined over time. Even over the course of a year, there's likely to be newer product than Hopper.
When you receive product, you pay for it; that's what spending means. You may have a contract for future product, but you don't pay for it in advance, because SOX accounting rules kick in. And a chip development cycle commonly runs at least 10 months.