AI: Lift-off for Elon's xAI. RTZ #371

As early as it is in this AI Tech Wave, there’s no question that Elon Musk is an ‘origin’ player in the AI game. His fabled involvement goes back to before OpenAI existed, to his AI debates with Google co-founder Larry Page, and of course to his co-founding of OpenAI with Sam Altman a bit later.

That origin story set the stage for a lot that has happened in AI since, from the ‘accel’ vs the ‘doomer’-driven ‘decel’ movements, to the soon-to-be-less-important debate over ‘open’ vs ‘closed’ AI models and businesses.

So of course, the media everywhere is at full attention on reports that Elon Musk has closed over $6 billion in a Series B funding round, at a $24 billion post-money valuation, for his AI venture xAI. Much of it comes from close investors in his other ventures, including the recent X/Twitter adventure, itself recently adorned with the AI bling of Grok 1.5. All with an unambiguous ambition to “Understand the Universe”.

And of course, with an xAI ‘Gigafactory of Compute’ to be built with much of the proceeds, a lot of which will flow into the revenue coffers of AI GPU infrastructure kingmaker Nvidia and its CEO Jensen Huang, for both its current Hopper H100 and upcoming Blackwell B100 AI chips and systems infrastructure.

Axios does a good job of summarizing the scene thus far in Elon’s AI venture:

“Elon Musk’s AI startup — xAI — announced that it has raised $6 billion in one of the largest venture capital funding rounds of all time.”

“Why it matters: This could help Musk begin to catch up with ChatGPT maker OpenAI, which he co-founded before leaving in a dispute that’s become litigious.”

“The intrigue: Some of the xAI investors, such as Andreessen Horowitz and Sequoia Capital, also backed OpenAI.”

“Zoom in: All of the $6 billion is new money, Musk tells Axios, rather than shares “given” to investors in Musk’s takeover of Twitter/X.”

  • “That said, there’s plenty of overlap in the investor list, and it’s unclear if the original Twitter/X backers also received additional equity in xAI, which Musk says is now valued at $24 billion.”

  • “In addition to Andreessen and Sequoia, investors in the Series B round include Valor Equity Partners, Vy Capital, Fidelity, Craft Ventures, Prince Alwaleed Bin Talal and Kingdom Holding.”

“The bottom line: xAI didn’t say much about what the new money will go toward, but The Information recently reported that it plans to build a massive new supercomputer — the “Gigafactory of compute” — possibly in partnership with Oracle.”

Of course, as I’ve highlighted before, the world of hyperscale LLM AI model companies turns on their ability to harness unrelenting amounts of ongoing data sources and feeds. That goes in spades for everyone from OpenAI/Microsoft to Meta, Google, Amazon, Apple and many others.

As the WSJ highlights, it’s the one area where Elon counts on the way his cluster of companies is well-positioned to provide training and inference data for xAI going forward:

“One selling point for xAI, according to its investors, is Musk’s other businesses, which collect valuable data that could be used to train the startup’s AI models and give it a leg up over competitors.”

“Musk is using X data to train Grok, which is delivering news summaries for the Stories feature on X. In the future, Musk could also use visual data from Tesla cars for model training and integrate xAI’s technology into Tesla’s Optimus humanoid robot, investors say.”

“Musk has talked publicly about both the need for massive amounts of data to train models and the amount of data that Tesla collects.”

“The two sources of unlimited data are synthetic data and real world video,” he said during an interview on X Spaces last month. “Tesla has a pretty big advantage in real-world video.”

Whether the specific data feeds from Tesla EVs and X/Twitter are general enough for xAI’s larger ambition to ‘understand the true nature of the universe’, AND to catch up with or surpass OpenAI’s GPT-x, Google’s Gemini, and other LLM AIs to come, is anyone’s guess.

Time to train and deploy next-generation models matters, given the exponential nature of AI Scaling Laws and the intense race already under way amongst LLM AI hyperscaler companies.
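For context, the ‘Scaling Laws’ mentioned here refer to the empirical finding that a model’s loss falls predictably, as a power law, with more parameters and more training data. A common way to write this, following the Chinchilla-style fits (my gloss as context, not from the original text, with illustrative symbols):

```latex
% Empirical loss as a function of model size N (parameters)
% and training data D (tokens), Chinchilla-style:
L(N, D) \;=\; E \;+\; \frac{A}{N^{\alpha}} \;+\; \frac{B}{D^{\beta}}
% with total training compute roughly C \approx 6\,N\,D,
% so a fixed compute (and time) budget trades off N against D.
```

The practical upshot: more compute stood up sooner means a bigger N and D trained earlier, which is why time-to-train is such a competitive lever.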

As Nvidia founder/CEO Jensen Huang explained the urgency of time-to-train for LLM AI models in the latest earnings call Q&A:

“Let me give you an example of time being really valuable, why this idea of standing up a data center instantaneously is so valuable and getting this thing called time to train is so valuable. The reason for that is because the next company who reaches the next major plateau gets to announce a groundbreaking AI. And the second one after that gets to announce something that’s 0.3% better. And so the question is, do you want to be repeatedly the company delivering groundbreaking AI or the company delivering 0.3% better?”

“And that’s the reason why this race, as in all technology races, the race is so important. And you’re seeing this race across multiple companies because this is so vital to have technology leadership, for companies to trust the leadership and want to build on your platform and know that the platform that they’re building on is going to get better and better. And so leadership matters a great deal.”

“Time to train matters a great deal. The difference between time to train that is three months earlier just to get it done, in order to get time to train on three-months project, getting started three months earlier is everything. And so it’s the reason why we’re standing up Hopper systems like mad right now because the next plateau is just around the corner.”

Elon Musk is not waiting. He’s getting his share of Nvidia chips and AI data center infrastructure, as fast as possible. He is not going to be left out of this AI Tech Wave race. That’s for sure. It’s time for xAI lift-off.

Pass the popcorn, please, and… Stay tuned.

(NOTE: The discussions here are for information purposes only, and not meant as investment advice at any time. Thanks for joining us here.)
