
China's Plans to Make AI a Utility

Episode Transcript

Speaker 1

Bloomberg Audio Studios, podcasts, radio news.

Speaker 2

Earlier this year, our colleague James Mayger took a trip to the middle of the desert in the remote region of Xinjiang, China.

Speaker 3

Xinjiang is the western third of China.

Basically it borders Mongolia, Kazakhstan, Uzbekistan, and where we went on this trip is situated in a river valley.

It's got these lovely snow capped mountains overlooking the town.

Really was a stunningly beautiful place, but very hard to get to.

Speaker 2

James was there on a lead generated by Bloomberg's data journalism team that a handful of Chinese companies were building some forty data centers across the region and planned to power them with tens of thousands of Nvidia chips.

Chips that the US banned from being exported to China in twenty twenty two.

China has criticized the US moves to expand restrictions on its access to semiconductor technology, saying that they will harm supply chains and the world economy.

Speaker 1

They want to build an AI industry that can go head to head with the US.

Speaker 2

Bloomberg data reporter Andy Lin led the team's investigation.

Speaker 1

In modern warfare, AI is playing a larger role, so the US is worried about China developing AI capacity alongside its high end military.

Speaker 2

Since the Biden administration's initial restrictions on advanced AI chips, the US has tightened export controls, putting them at the center of tensions with Beijing.

Just last week, Bloomberg learned of plans by the Trump administration to further restrict the shipments of AI chips to Thailand and Malaysia, part of an effort to prevent them from being potentially smuggled into China.

Speaker 1

So by building these data centers and building up the computing power, they aim to build a local, domestic AI industry that can go head to head with OpenAI, Alphabet, Amazon, and Meta in the US.

Speaker 2

This is the Big Take Asia from Bloomberg News.

I'm Oanh Ha.

Every week we take you inside some of the world's biggest and most powerful economies and the markets, tycoons, and businesses that drive this ever shifting region.

Today on the show, China's AI ambition rises from the desert.

China is fast becoming a world leader in artificial intelligence. It's one of the top consumers of semiconductors, and as DeepSeek's latest model shows, Chinese companies are hungry to create top tier AI technology to rival international competitors.

But Andy says the US government's export controls have created a real problem for China by limiting its access to Nvidia's prized semiconductors.

Speaker 1

Nvidia is the industry standard for any AI training operation, so, for example, OpenAI, one of the most famous AI companies in the world, uses Nvidia chips for training its ChatGPT models.

If you want to build a sophisticated AI model, you need to use a lot of Nvidia chips.

So Nvidia chips are sought after all over the world for AI development, because every country wants its own AI.

Speaker 2

Industry. And he says China's ultimate goal is to have its own source of advanced AI chips, ones that are on par with Nvidia's offerings.

For now, though, Nvidia's semiconductors are the best product on the market, and Chinese companies are eager to access them.

Speaker 1

Since the US introduced these export controls, I heard sources in China saying they are having a hard time getting large volumes of Nvidia chips for their operations.

I think this slows China's access to these chips, but doesn't stop them from getting them outright.

Speaker 2

The US government has no official consensus on how many restricted Nvidia chips are currently in China.

Two senior Biden officials estimated that China had around twenty five thousand chips, but most of the people Bloomberg spoke to say there isn't an agreed upon estimate.

That's why when Andy found documents laying out the volume of Nvidia chips that companies hope to obtain for these data centers, he was.

Speaker 1

Surprised, because most of the people I talked to would say they assumed some development of these data centers using Nvidia chips was ongoing, but not companies declaring to the world that they're going to use banned Nvidia chips in this large a volume.

Speaker 2

In the fourth quarter of last year, the Chinese government approved a total of thirty nine data center investment projects in Xinjiang and the neighboring Qinghai province.

They've built out plans to use more than one hundred and fifteen thousand Nvidia processors.

Speaker 1

If you wanted to compare this scale to what the US has, the tech companies there will usually claim that if you have more than one hundred thousand Nvidia chips in one place, then you're in a position to develop state of the art models which can possibly compete with Gemini, ChatGPT.

Speaker 2

These kinds of models. There's no explanation in the official documents of how companies plan to acquire the banned Nvidia chips, and it could all be aspirational.

In a response from Nvidia, the company said that posting a web page about restricted products is not the same as successfully licensing, building and operating a data center, and that Nvidia does not provide any support or repairs for restricted products.

In the meantime, the companies listed in the filings, state officials and central government representatives in Beijing all declined to comment when asked to explain.

Still the building continues in the desert.

Most of these data centers are located in a single compound in Yiwu County, set up by the local government in Xinjiang.

These data centers house big computers, servers, and other components and are used to process, store, and distribute data.

And while they might be located far from big cities like Shanghai or Shenzhen, startups in these cities can send their models to the data centers for training and optimization.

Speaker 1

Theoretically, any company in China can access that computing power.

Essentially, you can send a request to the data centers in Xinjiang saying, hey, we want to use this many chips to train our models from the eastern regions, and then the data centers in Xinjiang will produce a result for you, maybe in a few minutes or even a few seconds.

Yeah, so these are all connected, because you don't need to be present at the data center to use the computing power.
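The submit-and-wait flow Andy describes can be sketched in a few lines of Python. This is a toy illustration only: the class and method names (`DataCenter`, `TrainingJob`, `submit`) are invented for this sketch and do not correspond to any real scheduling API mentioned in the episode.

```python
from dataclasses import dataclass

# Toy sketch of "compute as a utility": a startup in an eastern city
# submits a training request, and a scheduler standing in for a
# Xinjiang data center accepts or rejects it based on capacity.
# All names here are illustrative, not a real API.

@dataclass
class TrainingJob:
    model_name: str
    gpus_requested: int

class DataCenter:
    def __init__(self, total_gpus: int):
        self.total_gpus = total_gpus

    def submit(self, job: TrainingJob) -> str:
        if job.gpus_requested > self.total_gpus:
            return f"rejected: only {self.total_gpus} GPUs available"
        # In reality this would schedule the training run and stream
        # results back over the network; here we just acknowledge it.
        return f"accepted: {job.model_name} on {job.gpus_requested} GPUs"

center = DataCenter(total_gpus=115_000)
print(center.submit(TrainingJob("shanghai-startup-llm", 2_100)))
```

The point of the sketch is simply that the requester never touches the hardware: capacity lives in one place, and clients interact with it remotely, like drawing from a grid.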

Speaker 2

Xinjiang is an ideal location for large scale data centers.

Land is cheap and rental costs are low.

The region also has cool weather, which helps offset the heat generated by the servers.

Most importantly, it's a hub for green energy.

On the way to look for the data centers, James Mayger stumbled across miles and miles of renewable energy installations.

Speaker 3

The desert was just a dusty, rocky plain, and scattered across it was windmill after windmill, solar panel installation after solar panel installation. And then there was also this power generation tower, where the company uses mirrors to concentrate sunlight onto a tower filled with molten salt, and the heat generated by concentrating all that sunlight is then used to generate electricity.

And you can see this like twenty thirty kilometers away because the sunlight is so blinding as it's concentrated onto this tower.

Speaker 2

This combination of green energy and advanced computing fits in with Beijing's economic push for sustainable development.

After the break, what is China's master plan to build these data centers?

We zoom in on one company to find out.

Bloomberg's James Mayger spent three days in Xinjiang tracking down the data centers Andy Lin and his team had found on paper. Yiwu.

Speaker 3

Is pretty remote, even in Xinjiang, which is pretty remote in China.

To get there, it was a four hour plane ride from Beijing to the city of Hami, and then from Hami to Yiwu is another three and a half hour car ride up into the mountains.

Speaker 2

Using the data team's coordinates, James was able to locate massive building complexes in the desert right where Andy said the data centers were being built.

Speaker 3

There was a lot of construction in Yiwu, and there was some construction going on up in the area with all the solar panels.

I've never seen a data center before.

I mean, this is the first time I've seen a data center.

But they look like what I imagined a data center would look like.

You have a large building which is three, four or five stories tall, has almost no windows.

They obviously don't need a lot of light.

Having a lot of windows is going to make it harder to keep the temperature inside the data center controlled at the proper level.

Speaker 2

It sounds like they had certainly all the hallmarks of being a data center.

Speaker 1

Yes.

Speaker 2

Among all the projects examined in the investigation, one caught the team's attention.

It involves a company controlled by Nyocor, an energy firm that's partially owned by the state government, based in Tianjin in northern China.

Nyocor's main business is supplying solar and wind power.

And I asked Andy why a green energy firm, of all companies, would be building data centers.

Speaker 1

Yeah.

So you don't usually think that green energy companies delve into the data center business, but in China, Beijing has been encouraging green energy companies to invest in AI data centers because they want to lower the carbon footprint.

That will create a win-win.

Speaker 2

According to the documents Bloomberg found, Nyocor plans to build a data center with six hundred and twenty five servers.

Speaker 1

It is one of the largest projects we found in the batch of the documents.

Based on their investment approvals, they need two thousand, one hundred GPU chips.

That's a lot, because if one H100 chip costs around twenty thousand dollars, then that's a very big amount of money just for a mid sized green energy company in China.
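The arithmetic behind Andy's point is easy to check. A back-of-envelope calculation, assuming the roughly twenty thousand dollars per chip cited in the episode applies to every GPU (the actual chip mix and pricing are not stated in the documents):

```python
# Back-of-envelope cost estimate using the figures from the episode.
# Assumption: a flat $20,000 per GPU; real prices vary by model and market.
price_per_gpu_usd = 20_000

# Nyocor's approved project: 2,100 GPU chips
nyocor_gpus = 2_100
nyocor_cost = nyocor_gpus * price_per_gpu_usd
print(f"Nyocor project: ${nyocor_cost:,}")  # $42,000,000

# Region-wide plans: more than 115,000 Nvidia processors
region_gpus = 115_000
region_cost = region_gpus * price_per_gpu_usd
print(f"Region-wide plans: ${region_cost:,}")  # $2,300,000,000
```

On those assumptions, the single Nyocor project implies over forty million dollars in chips alone, and the region-wide plans run into the billions, which is why the scale of the filings surprised the reporting team.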

Speaker 2

Nyocor is selling its computing power to Infinigence AI, one of the largest AI infrastructure companies in China.

The company has raised one billion yuan, or almost one hundred and forty million dollars, in funding since it was founded in twenty twenty three.

Nyocor declined to comment and Infinigence AI couldn't be reached for a response, but last year, the CEO of Infinigence AI said in an interview with local media that the goal is to make computing power more accessible to AI companies all across China.

Speaker 1

Their CEO said they aim to create a computing power system that allows AI developers across the mainland to just log into their system and get all the computing power they want, just like opening a water tap to get the water you want.

Speaker 2

Computing power on tap.

But in order to make computing power accessible anywhere, anytime, there's an issue that needs to be addressed first.

Xinjiang is China's westernmost region.

Most of the AI companies that need the computing power are based in big cities like Shanghai, far in the east.

That's more than two thousand miles apart, about the distance from Chicago to Los Angeles.

Andy says.

The Chinese government plans to bridge this supply-demand gap by building what they call computing power corridors.

Speaker 1

You can think about this as the water resource management system in China.

They have been moving the water from the southern region to the northern region, which is suffering from drought.

So they have been moving crucial resources around in the mainland, and now they want to move the computing power.

Speaker 2

So in some ways they're thinking of computing power like a utility, right, and the computing power corridor would be like the power grid.

Speaker 1

Yeah, yeah, that's what they are aiming for.

They're building infrastructure like more cables along the corridors, so that they can encourage more AI developers to build infrastructure in the western regions.

Speaker 2

This is all part of China's plan to dominate in AI.

But in order for China to get there, it needs all these things to come together.

Its own high tech chips that can rival Nvidia's, computing power on tap, and an industry with cutting edge innovations.

And Andy says building such a national network could encourage more Chinese companies to jump into the AI space without needing to build their own data centers.

That would help startups come up with more innovations like DeepSeek did with its chatbot, and ultimately narrow the gap between the US and China.

Speaker 1

I think one of the most powerful aspects of this utility for AI computing power is that it will not be controlled by just a couple of tech giants.

So if you make a comparison between the US and China.

In the US, computing power is mostly owned by a handful of tech giants.

OpenAI, Meta, Alphabet. But if you make it a utility, every developer across the mainland can just log into a system and get as much computing power as they want.

Then it will create a very distributed system for AI innovation, so that anyone who has an idea can just create their own model, and then we'll have a more collaborative environment for these developments, which is totally different from what we have in the US.

Speaker 2

There's a lot of uncertainty about this model.

Will China be able to create a cheap and reliable power source that everyone can tap into? Will it be able to acquire the computer chips that AI development demands? And will China be able to make those chips itself, rather than relying on a US company like Nvidia?

Speaker 1

What we found in Xinjiang and Qinghai with this scale of data center development shows that Beijing is trying very hard to nurture its own domestic AI industry so that it can match the tech giants from the West.

Speaker 2

This is The Big Take Asia from Bloomberg News.

I'm Oanh Ha.

To get more from The Big Take and unlimited access to all of Bloomberg dot Com, subscribe today at Bloomberg dot com slash podcast offer.

If you like the episode, make sure to subscribe and review The Big Take Asia wherever you listen to podcasts.

It really helps people find the show.

Thanks for listening. See you next time.