Tiny AI shows the promise of making machine learning energy efficient: we can now run powerful AI algorithms on our phones. In their quest to build ever more powerful algorithms, researchers have been using ever greater amounts of data and computing power, and relying on centralized cloud services. That process not only generates alarming amounts of carbon emissions but also limits the speed and privacy of AI applications. A countertrend of tiny AI is changing that. New algorithms can shrink existing deep-learning models without losing their capabilities. Meanwhile, an emerging generation of specialized AI chips promises to pack more computational power into tighter physical spaces, and to train and run AI on far less energy.
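One common family of shrinking techniques is post-training quantization: storing a model's weights in 8-bit integers instead of 32-bit floats, cutting memory and bandwidth roughly fourfold at the cost of a small, bounded rounding error. The sketch below is illustrative only (the random weight matrix stands in for one layer of a trained network, and the simple symmetric scheme is one of several in use), not the method of any specific system:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical weight matrix standing in for one layer of a trained network.
w = rng.normal(0.0, 0.1, size=(256, 256)).astype(np.float32)

# Symmetric 8-bit quantization: one scale factor maps floats onto int8.
scale = np.abs(w).max() / 127.0
w_int8 = np.clip(np.round(w / scale), -127, 127).astype(np.int8)

# Dequantize to estimate the information lost.
w_dequant = w_int8.astype(np.float32) * scale

size_ratio = w.nbytes / w_int8.nbytes   # int8 storage is a quarter the size
max_err = np.abs(w - w_dequant).max()   # rounding error is at most half a step

print(f"size reduction: {size_ratio:.1f}x")
print(f"max error: {max_err:.6f} (quantization step = {scale:.6f})")
```

Because every value is rounded to the nearest representable level, the worst-case error is half a quantization step, which is why well-calibrated 8-bit weights usually leave a network's accuracy nearly unchanged.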