China’s AI firms are cleverly innovating around chip bans
Tweaks to software blunt the shortage of powerful hardware
TODAY’S TOP artificial-intelligence (AI) models rely on large numbers of cutting-edge processors known as graphics processing units (GPUs). Most Western companies have no trouble acquiring them. Llama 3, the newest model from Meta, a social-media giant, was trained on 16,000 H100 GPUs from Nvidia, an American chipmaker. Meta plans to stockpile 600,000 more before year’s end. xAI, a startup founded by Elon Musk, has built a data centre in Memphis powered by 100,000 H100s. And though OpenAI, another big model-maker, is tight-lipped about its GPU stash, it had its latest processors hand-delivered by Jensen Huang, Nvidia’s boss, in April.
This article appeared in the Science & technology section of the print edition under the headline “Miniature model-building”