## The Edge Inference Bet

NVIDIA dominates AI training and datacenter inference. But 80% of future AI inference will happen locally — on factory floors, in autonomous vehicles, inside retail cameras, at defense installations, on robots. These are workloads whose data cannot tolerate cloud latency, cannot risk leaving the device, and cannot depend on internet connectivity. Axelera AI is building the chips for that future.

## What the Chips Do

**Metis AIPU** (shipping now): 214 TOPS per chip in an M.2 form factor — the same slot used by NVMe SSDs. 4-8W power draw. $149 per card. 3,200 frames per second on ResNet-50. Already deployed across 500+ customers in manufacturing, retail, defense, robotics, and agriculture.

**Europa AIPU** (shipping H1 2026): The next generation. 629 TOPS per chip in a 45W envelope. Eight second-generation AI cores paired with 16 RISC-V vector-processing cores. 64 GB of memory capacity. Axelera claims 3-5x better performance per watt than NVIDIA's L40 and Jetson products in the same edge category. Manufactured by Samsung.

## The Architecture Advantage

Three engineering choices separate Axelera from conventional GPU approaches:

**Digital In-Memory Computing (D-IMC):** Instead of shuttling data between memory and processor (the von Neumann bottleneck every GPU still faces), Axelera turns each SRAM cell into a compute element. Matrix multiplications happen in place, without data movement. This is the source of the power-efficiency gap.

**RISC-V Dataflow Architecture:** An open ISA with no ARM or x86 licensing. It enables rapid iteration and aligns with the EU's open-source hardware sovereignty strategy. Europa's dedicated RISC-V vector cores handle pre- and post-processing natively.

**Standard CMOS Manufacturing:** D-IMC uses proven SRAM technology on standard fabrication processes — no exotic materials, no bleeding-edge nodes. That means lower manufacturing costs and easier supply-chain scaling.
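The data-movement argument behind D-IMC can be illustrated with a toy model. The sketch below is purely illustrative — it is not Axelera's actual design — and simply counts operand fetches in a naive matmul where every operand crosses the memory bus, versus a layout where weights stay resident in the compute array and only activations are streamed in:

```python
# Toy model contrasting data movement in a conventional matmul with an
# in-memory-compute layout. Illustrative only; not Axelera's D-IMC design.

def von_neumann_matmul(A, B):
    """Naive matmul: every operand is 'fetched' from memory per MAC."""
    n, k, m = len(A), len(B), len(B[0])
    fetches = 0
    C = [[0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            for p in range(k):
                fetches += 2          # fetch A[i][p] and B[p][j]
                C[i][j] += A[i][p] * B[p][j]
    return C, fetches

def in_memory_matmul(A, B):
    """Weights B stay resident in the array; only activations move."""
    n, k, m = len(A), len(B), len(B[0])
    fetches = 0
    C = [[0] * m for _ in range(n)]
    for i in range(n):
        for p in range(k):
            fetches += 1              # stream activation A[i][p] once
            a = A[i][p]
            for j in range(m):        # all m MACs happen beside stored weights
                C[i][j] += a * B[p][j]
    return C, fetches

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
c1, f1 = von_neumann_matmul(A, B)
c2, f2 = in_memory_matmul(A, B)
assert c1 == c2
print(f1, f2)  # 16 vs 4: same result, 4x less operand traffic in this toy
```

In real hardware the gap is about energy, not instruction count — moving a word across a memory bus costs far more energy than the multiply itself — but the toy captures the structural point: keeping weights stationary removes most of the traffic.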
The practical result: for edge inference, comparable or better throughput at a fraction of the power, cost, and physical footprint.

## The European Sovereignty Play

This is not just a chip company. It is a geopolitical statement. Axelera is backed by the European Innovation Council Fund, received a EUR 61.6M EuroHPC grant through the DARE Project (Digital Autonomy with RISC-V for Europe), and its Series C included BlackRock alongside Dutch and Belgian sovereign investment vehicles. The RISC-V architecture eliminates dependency on ARM and x86 licensing — aligning with the EU Chips Act's digital autonomy agenda.

Europe currently has near-zero presence in AI accelerator chips. Axelera is positioned as the continent's answer to NVIDIA, AMD, and Qualcomm for edge inference.

## The Numbers

$450M+ raised across equity, grants, and venture debt. The February 2026 Series C of $250M+ — led by Innovation Industries with BlackRock and SiteGround Capital — is the **largest AI semiconductor investment in European history**. 500+ customers across defense, manufacturing, retail, agritech, robotics, and telecommunications. Distribution partnerships with Lenovo, Dell, Advantech, and Arduino. Manufacturing by Samsung and TSMC.

## The Team

Co-founded in 2021 by Fabrizio Del Maffeo (CEO) and Evangelos Eleftheriou (CTO), a former IBM Fellow who invented key storage technologies. Headquartered in Eindhoven, Netherlands — the Dutch semiconductor corridor alongside NXP, ASML, and Philips. ~250 employees.

## The Market

The edge AI market is projected to grow from $25B (2025) to $119B by 2033. As AI agents move from chatbots to robots, autonomous vehicles, and industrial systems, the demand for on-device inference that is fast, private, and power-efficient will compound.

A $149 Metis card running 24/7 replaces ongoing cloud compute bills. That unit economics argument — capex once vs opex forever — is what makes edge inference inevitable at scale.
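The capex-vs-opex claim is easy to check with back-of-envelope arithmetic. The cloud rate below is an illustrative assumption, not a quoted price, and the comparison ignores electricity, integration, and replacement costs:

```python
# Back-of-envelope break-even: one-time $149 Metis card vs ongoing cloud
# inference. The $0.50/hour cloud rate is an assumed figure for illustration.

CARD_PRICE = 149.00      # Metis M.2 card price, from the article
CLOUD_RATE = 0.50        # assumed $/hour for a comparable cloud inference instance
HOURS_PER_DAY = 24

breakeven_hours = CARD_PRICE / CLOUD_RATE
breakeven_days = breakeven_hours / HOURS_PER_DAY
print(f"Break-even after {breakeven_hours:.0f} hours "
      f"(~{breakeven_days:.1f} days of 24/7 use)")
# Break-even after 298 hours (~12.4 days of 24/7 use)
```

Under these assumptions a continuously running workload pays for the card in under two weeks; every hour after that is the "opex forever" the cloud alternative keeps charging.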