The newest AI boom pitch: Host a mini data center at your home

Such a distributed computing network makes sense in that “computation for AI inference can and should be distributed at the ‘edge,’ deployed on smaller platforms closer to population centers and users,” said Benjamin Lee, a computer architect and engineer at the University of Pennsylvania, in correspondence with Ars. “The strategy could impose much smaller impacts on the grid because inference requires a few GPUs, unlike training which requires thousands of them working in concert,” he said.

However, AI inference tasks can be as varied as document question-and-answer, software code generation, and multi-turn conversations—each with different computational requirements and performance expectations, Lee cautioned. So it will be important to ensure …

→ Continue reading at Ars Technica
