Hello Betamax,

China-based AI company DeepSeek is making a subtle but important argument about the future of AI: the next leap may come not from more GPUs but from learning how to think around constraints.

Earlier this month, researchers from DeepSeek and Peking University published a paper introducing Engram, a training method designed to sidestep one of modern AI's most stubborn limits: GPU memory. As models grow larger and are pushed to handle longer, more complex inputs, memory has struggled to keep up.

The idea behind Engram is simple. Instead of keeping everything in memory at once, the model uses memory only when needed, separating storage from computation to reduce strain on the system. The result is a model that scales more efficiently rather than one that simply grows heavier.

Tested on a 27-billion-parameter model, Engram showed stronger benchmark performance and improved long-context handling, a weak spot for many large models. That matters because handling long inputs makes AI more useful in real life.

The timing is also telling. DeepSeek is reportedly preparing another model launch in February. If Engram works outside the lab, it could give DeepSeek an edge at a time when GPUs are scarce and expensive.

Engram questions whether "the only path forward is bigger closed models with more GPUs," Amit Verma, founding head of engineering and AI at Neuron7.ai, tells Tech in Asia. Still, he notes that Engram doesn't eliminate the need for GPUs; it uses them better, freeing them from acting like costly storage and letting them focus on reasoning. That could level the playing field for smaller competitors.

Then there is the Easter egg. In the Engram code, DeepSeek wrote: "Only Alexander the Great could tame the horse Bucephalus." It references the story of how, as a boy, the famous military hero did what no one else could, taming the horse with his brain, not his brawn. The metaphor lands.
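To make the core idea concrete, here is a minimal sketch of what "separating storage from computation" can look like in practice. This is not DeepSeek's actual Engram code; it is a hypothetical illustration in which a large table lives in cheap host memory and only the handful of entries a given step needs is fetched into the accelerator's working set.

```python
import numpy as np

class OffloadedStore:
    """Hypothetical sketch: a big table kept in host memory,
    serving small slices on demand instead of residing on the GPU."""

    def __init__(self, num_entries: int, dim: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        # The full table stays in (cheap, plentiful) host memory.
        self.table = rng.standard_normal((num_entries, dim)).astype(np.float32)

    def fetch(self, indices):
        # Only the requested rows would be copied to scarce GPU memory.
        return self.table[np.asarray(indices)]

store = OffloadedStore(num_entries=100_000, dim=8)
needed = [3, 42, 99_999]           # entries this step actually needs
working_set = store.fetch(needed)  # a small slice, not the whole table

print(working_set.shape)  # the compute side sees only (3, 8)
```

The point of the sketch is the asymmetry: storage scales with the table's size, while per-step compute and GPU memory scale only with the few rows fetched.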
While much of the AI industry is still relying on more hardware, the DeepSeek team is suggesting a smarter way to work around limits rather than win through brute strength, much like what the young Alexander did. Engram may not rewrite AI training overnight, but it shows that the next phase of progress belongs to whoever uses GPUs more intelligently.

Beyond model training, China is galloping ahead in another race: robotics. Scale, labor economics, and state support are aligned in the country, a point Southeast Asia has yet to reach.

Samreen Ahmad, journalist