Key Takeaways
- PIMIC is an AI semiconductor startup founded by Subi Krishnamurthy, focusing on innovative processing-in-memory (PiM) technology for ultra-low-power AI solutions.
- The company plans to launch two ultra-efficient AI silicon chips at CES 2025, promising 10x to 20x power savings tailored for edge applications.
- PIMIC's Jetstreme™ Processing-in-Memory architecture enables efficient AI inference on edge devices, addressing the increasing performance demands of tiny to large AI models.
Subi Krishnamurthy is the Founder and CEO of PIMIC, an AI semiconductor company pioneering processing-in-memory (PiM) technology for ultra-low-power AI solutions. With over 30 years of experience in silicon design and product development, Subi has led the mass production of 12+ silicon projects and holds 30+ patents. He began his leadership journey at Force10 Networks, advancing networking silicon as a lead designer and architect, and later served as Executive Director and CTO of Dell Networking, driving technology strategy, product architecture and technology partnerships.
Subi founded Viveka Systems to innovate in networking software and silicon solutions and later consulted for various companies on Smart NICs, AI pipelines, gaming silicon, and AI inference engines. Subi holds an M.S. in Computer Science from Southern Illinois University, Carbondale, and a Bachelor of Engineering in Computer Science from the National Institute of Technology, Tiruchirappalli.
Tell us about your company?
PIMIC is a groundbreaking AI semiconductor startup delivering highly efficient edge AI solutions with unparalleled performance and energy savings. PIMIC’s proprietary Jetstreme™ Processing-in-Memory (PIM) acceleration architecture brings remarkable gains in AI computing efficiency by addressing the key requirements in edge environments, including low power, compact design, and superior AI model parameter update performance. PIMIC is set to launch two ultra-efficient AI model silicon chips for edge applications at CES 2025, delivering 10x to 20x power savings. We are also advancing our efforts on a breakthrough AI inference silicon platform designed for large-scale models, with a focus on achieving unprecedented efficiency.
What problems are you solving?
By delivering the most efficient and scalable AI inference platform for tiny to large AI models, PIMIC’s solutions meet or exceed the rapidly increasing performance and efficiency demands of agentic AI workflows and large multimodal models. Our solutions also address the need to run AI inference tasks seamlessly and effectively on local, battery-powered devices at the edge.
What application areas are your strongest?
Initially, PIMIC’s focus is on tiny AI model inference applications such as keyword spotting and single-microphone noise cancellation (running at 20 µA and 150 µA respectively) for wearables and other battery-operated devices. These solutions deliver 10x to 20x power savings while reducing system costs through a highly integrated design.
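To put those current draws in perspective, here is a rough back-of-envelope battery-life estimate. The battery figures are illustrative assumptions (a typical 220 mAh CR2032 coin cell, continuous operation, no other system loads), not numbers from PIMIC:

```python
# Illustrative battery-life estimate for always-on edge AI workloads.
# Assumptions (not from PIMIC): a 220 mAh CR2032 coin cell,
# continuous operation, and no other system loads.

def battery_life_hours(capacity_mah: float, current_ua: float) -> float:
    """Hours of continuous operation at a given average current draw."""
    return capacity_mah * 1000.0 / current_ua  # mAh -> uAh, then divide by uA

CAPACITY_MAH = 220.0  # typical CR2032 coin cell (assumed)

for task, current_ua in [("keyword spotting", 20.0),
                         ("noise cancellation", 150.0)]:
    days = battery_life_hours(CAPACITY_MAH, current_ua) / 24.0
    print(f"{task} at {current_ua:.0f} uA: roughly {days:.0f} days")
```

At 20 µA, a single coin cell would in principle last over a year of continuous operation, which is why microamp-level draw matters for always-on wearables.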
What keeps your customers up at night?
Our customers are finding that rapidly growing AI model sizes, complex agentic workflows, and multimodal models demand far more inference compute than the architectures of current edge AI silicon can provide. The demand for inference compute performance is set to far exceed what existing hardware can deliver, creating a significant gap. This challenge necessitates a new generation of silicon with breakthrough improvements in efficiency and performance.
What does the competitive landscape look like and how do you differentiate?
Most AI inference silicon architectures currently on the market were designed over the past six years. These older designs are struggling to meet the performance and efficiency demands of rapidly evolving AI modeling.
PIMIC’s solutions are built on a brand-new architecture that incorporates a number of AI innovations to significantly improve efficiency and scalability, including our proprietary Jetstreme™ Processing-in-Memory (PIM) technology. Our focus is on delivering an efficient, scalable silicon platform capable of handling everything from tiny to large AI models with billions of parameters, offering significant PPA (performance, power, area) advantages that we believe can keep up with performance demands and enable the latest AI models to run seamlessly and effectively on any local edge device. PIMIC’s first two AI inference silicon chips based on this architecture have already demonstrated 10x to 20x improvements in PPA compared to competitors. We are confident that PIMIC holds a distinct edge in addressing the future needs of AI inference.
What new features/technology are you working on?
We are leveraging our Jetstreme Processing-in-Memory (PIM) architecture, together with a number of other critical silicon innovations, to dramatically improve compute efficiency and scalability, enabling the next generation of AI modeling.
How do customers normally engage with your company?
We have a flexible approach: we provide unpackaged chips, packaged SoCs, or ASIC solutions tailored to specific functional requirements.
What challenges are you solving for edge devices in particular?
Edge devices—devices that act as endpoints between the data center and the real world—encompass a wide range of products, all with challenging performance requirements. Edge devices generally fall into two main categories: tiny edge devices and high-performance edge devices. PIMIC’s solutions address the challenges of both categories of device.
Tiny Edge Devices:
These devices, often located near sensors, must operate with extremely low power and cost constraints to achieve widespread adoption. The primary challenges for this category include energy efficiency, cost optimization, and low latency for real-time response.
High-Performance Edge Devices:
Devices such as smartphones, smart TVs, and AI-powered PCs must run large AI models in real time, ensuring seamless user interactions by balancing computational demands, latency, privacy, and energy efficiency. The key challenges include overcoming hardware limitations in power, memory bandwidth, and computational throughput to enable advanced AI tasks locally, all while scaling to meet the performance demands of the latest AI models mentioned earlier.
About PIMIC
Founded in 2022 and based in Cupertino, California, PIMIC is an AI semiconductor company specializing in ultra-efficient silicon solutions for edge AI applications. The company’s chip products deliver industry-leading performance and power efficiency, enabling advanced AI capabilities in compact, low-power devices. With a focus on empowering devices at the edge, PIMIC aims to redefine how AI is integrated into everyday technology.
For more information, visit www.pimic.ai.