Your AI PC needs moar 🐏


Those NPUs are specialized parallel processors that crunch the numbers of these virtual neural networks, with their billions of "parameters", all at the same time.

If the model isn't in RAM, the NPU can't be fed data quickly enough to give you fast results. That slows down how quickly an LLM replies, as well as features such as real-time translation or image generation.

So if you want AI features available at the push of a button, or by simply speaking to your computer, the model essentially needs to stay resident in memory, reserving as much RAM as it requires.
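To get a feel for why those billions of parameters demand so much memory, here's a rough back-of-the-envelope sketch. The model size and byte-per-parameter figures are illustrative assumptions, not specs of any particular product:

```python
# Rough sketch: estimate the RAM needed just to keep a model's weights
# resident in memory (activations and OS overhead come on top of this).

def model_ram_gib(params_billions: float, bytes_per_param: float) -> float:
    """RAM in GiB needed to hold the model weights alone."""
    return params_billions * 1e9 * bytes_per_param / 2**30

# Hypothetical 7-billion-parameter model at 16-bit (2-byte) precision:
print(f"{model_ram_gib(7, 2):.1f} GiB")    # ~13.0 GiB for weights alone

# The same model quantized to 4-bit (0.5 bytes per parameter):
print(f"{model_ram_gib(7, 0.5):.1f} GiB")  # ~3.3 GiB
```

Quantization (shrinking each parameter to fewer bits) is exactly why on-device AI is feasible at all, but even a compressed model claims a sizable chunk of RAM for as long as you want it ready at the push of a button.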
