Anker, the company best known for phone chargers and power banks, just announced it built its own AI processor, and the move challenges a core assumption of the AI chip industry: that serious artificial intelligence hardware belongs in the cloud.
The Thus chip represents a quiet but significant shift in who can manufacture AI hardware and where that intelligence actually runs. For years, the AI chip market has been dominated by a handful of companies optimizing for massive data centers and cloud computing. Anker’s approach is different: a processor designed to run AI directly on small devices like earbuds, speakers, and smart home gadgets, without sending data to distant servers.
- The Power Revolution: Anker’s Thus chip uses compute-in-memory design to reduce AI processing power consumption by eliminating constant data movement across traditional chip architectures.
- The Privacy Shift: Local AI processing means voice commands and audio data never leave your device, sharply reducing cloud-based surveillance risks.
- The Market Challenge: A consumer electronics company successfully developed custom AI silicon, breaking the traditional semiconductor industry’s barriers to entry.
According to Anker CEO Steven Yang, the Thus chip uses a design principle called compute-in-memory, which fundamentally rethinks how AI processors work. Yang explained that existing AI chips store the model parameters (the mathematical weights that make an AI system function) in one location, then physically shuttle that data between memory and compute units during every inference, or prediction. This constant data movement wastes power and creates bottlenecks, especially in devices with limited battery life.
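Yang's point about data movement can be made concrete with a back-of-envelope model. The Python sketch below uses widely cited per-operation energy estimates (roughly 640 pJ for a 32-bit DRAM read versus a few pJ for a multiply-accumulate, from Mark Horowitz's 2014 ISSCC survey); the model size and figures are illustrative assumptions, not Anker's numbers.

```python
# Back-of-envelope energy model for one inference of a small audio model.
# Per-operation figures are rough, widely cited estimates (Horowitz,
# ISSCC 2014); real chips vary widely, so only the ratio matters here.

DRAM_READ_PJ = 640.0   # fetch one 32-bit word from off-chip DRAM
MAC_PJ = 4.0           # one 32-bit multiply-accumulate

def inference_energy_uj(n_params: int, weights_in_dram: bool) -> float:
    """Energy in microjoules for one pass that touches every weight once."""
    compute = n_params * MAC_PJ
    # Conventional design: every weight is fetched from memory each inference.
    # Compute-in-memory: weights stay where the math happens, so no fetch.
    movement = n_params * DRAM_READ_PJ if weights_in_dram else 0.0
    return (compute + movement) / 1e6  # picojoules -> microjoules

N = 1_000_000  # hypothetical 1M-parameter on-device audio model
conventional = inference_energy_uj(N, weights_in_dram=True)
in_memory = inference_energy_uj(N, weights_in_dram=False)
print(f"conventional: {conventional:.0f} uJ, in-memory: {in_memory:.0f} uJ")
```

On these assumptions the weight traffic, not the arithmetic, accounts for over 99 percent of the energy per inference, which is exactly the cost compute-in-memory removes.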
What Makes Compute-in-Memory Different from Traditional AI Chips?
The Thus chip is the world’s first neural-net compute-in-memory AI audio chip, according to Anker’s announcement. By integrating memory and computation in the same place, the processor can perform complex AI calculations using far less power than traditional designs. The chip is also smaller than conventional AI processors, making it practical for devices where space is at a premium.
This matters because it changes what’s possible in consumer electronics. Today, when you use AI features on a small device—voice recognition in earbuds, noise cancellation in headphones, smart responses in home devices—most of that processing either happens in the cloud or requires a chip designed for much larger devices. Both options come with tradeoffs: cloud processing means your audio data travels to a server, raising privacy concerns and creating latency. Using oversized chips in tiny devices wastes battery and generates unnecessary heat.
- Traditional AI chips waste power moving data between memory and processing units hundreds of times per second
- Compute-in-memory eliminates this bottleneck by performing calculations where data is stored
- Result: significantly lower power consumption and smaller chip size for battery-powered devices
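The latency side of the cloud tradeoff described above can also be put in rough numbers. This sketch compares shipping a two-second voice clip to a server against running a small model on-device; every figure is an assumption chosen for illustration, not a measurement of Thus or any Anker product.

```python
# Back-of-envelope latency for one voice command, cloud vs. on-device.
# All numbers below are illustrative assumptions.

# Cloud path: ship the audio to a server, infer there, return the result.
clip_s = 2.0                 # length of the voice command
bytes_per_s = 16_000 * 2     # 16 kHz, 16-bit mono PCM
uplink_mbps = 2.0            # modest mobile uplink
upload_ms = clip_s * bytes_per_s * 8 / (uplink_mbps * 1e6) * 1000
rtt_ms = 80                  # network round trip
server_ms = 30               # server-side inference
cloud_ms = upload_ms + rtt_ms + server_ms

# Local path: no network at all; a small model runs on the device itself.
local_ms = 20                # hypothetical on-device inference budget

print(f"cloud: ~{cloud_ms:.0f} ms, on-device: ~{local_ms} ms")
```

Even with generous network assumptions, the cloud path pays for the upload and the round trip before any inference happens, which is why on-device processing feels instant by comparison.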
How Does Local AI Processing Change Privacy?
Anker plans to integrate Thus into its audio devices, mobile accessories, and IoT products. The company hasn’t announced specific products yet, but the chip’s design suggests it could enable features like on-device voice commands, real-time audio processing, and AI-powered smart home functionality without requiring constant internet connectivity or server processing.
The Thus announcement also signals something broader about the AI industry’s architecture. For the past several years, the narrative around AI has centered on scale—larger models, more parameters, more computational power. That story benefited companies like Nvidia, which sells the specialized processors that power data centers running massive language models. But Anker’s move suggests the next phase of AI adoption may not be about bigger systems, but smarter distribution: pushing intelligence to the edge, to the devices people actually hold and wear.
This approach has privacy implications that matter to everyday users. When AI processing happens locally on your device rather than in the cloud, your data doesn’t need to leave your possession. A voice command processed by Thus on your earbud stays on your earbud. Audio from a smart speaker stays on the speaker. This is a technical advantage that also happens to align with growing user concerns about data collection and surveillance.
Why Can a Charger Company Build Its Own AI Chip?
The Thus chip also challenges the assumption that AI development requires the resources of a trillion-dollar company. Anker is a significant consumer electronics manufacturer, but it’s not a chip design giant like Intel or AMD, and it’s not a cloud computing company like Google or Amazon. The fact that Anker could develop and announce its own AI processor suggests that custom silicon for AI is becoming more accessible to companies outside the traditional semiconductor elite.
- Traditional AI chip development required massive R&D budgets and semiconductor expertise
- New design tools and manufacturing partnerships are lowering barriers to entry
- Consumer electronics companies can now create specialized processors for their specific use cases
Whether Thus becomes a standard in the industry depends partly on whether Anker’s execution matches its ambitions. The company needs to deliver products that actually work, prove that the power savings are real, and demonstrate that local AI processing on these devices provides meaningful benefits to users. It also needs to convince other manufacturers to adopt Thus or similar chips, rather than relying on existing solutions.
What Does This Mean for the Future of AI?
The announcement arrives as the AI industry faces growing pressure on multiple fronts: energy consumption from massive data centers, privacy concerns about data collection, and questions about whether centralized AI models are the only path forward. Analysts covering the AI semiconductor market report rising interest in specialized processors built for specific applications rather than general-purpose cloud computing.
Anker’s Thus chip doesn’t solve those problems entirely, but it offers a different answer to the question of where AI should run. Watch for actual Thus-powered products in the coming months—that’s when we’ll know if this challenge to the current order is real.
