Your AI PC needs moar 🐏
While having an AI model take up a few GB of storage might not sound like a big deal, keep in mind that the entire model has to sit in RAM for the technology to perform well. NPUs are specialized parallel processors that crunch the numbers across a virtual neural network with billions of "parameters" at once. If the model isn't in RAM, the NPU can't be fed data fast enough to deliver quick results. That slows down everything from an LLM's replies to features like real-time translation and image generation. So if you want AI available at the push of a button, or by simply speaking to your computer, the model essentially needs to reserve that much RAM for itself.
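To get a feel for the numbers, here's a rough back-of-the-envelope sketch (not a benchmark) of how parameter count and numeric precision translate into a model's RAM footprint. The 7-billion-parameter figure and the precision levels are illustrative assumptions, not a reference to any specific product:

```python
def model_ram_gib(params_billions: float, bits_per_param: int) -> float:
    """Estimate the RAM needed just to hold a model's weights, in GiB.

    This ignores activation memory and runtime overhead, so real
    usage is somewhat higher.
    """
    total_bytes = params_billions * 1e9 * bits_per_param / 8
    return total_bytes / 2**30  # bytes -> GiB

# A hypothetical 7-billion-parameter model at common precisions:
for bits in (16, 8, 4):
    print(f"{bits}-bit weights: ~{model_ram_gib(7, bits):.1f} GiB")
```

This is why quantization (storing weights in 8-bit or 4-bit formats instead of 16-bit) matters so much for on-device AI: it can cut the RAM a model demands by a factor of two to four.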