            

2026 Outlook with Steve Roddy of Quadric

by Daniel Nenni on 01-27-2026 at 8:00 am


Tell us a little bit about yourself and your company.

I am the Chief Marketing Officer at Quadric, where I have spent the past four years helping scale the company’s market presence and customer engagement. Quadric is a pure-play IP licensing company that has been operating for more than seven years. We specialize in a truly unique, fully programmable AI inference processor designed for edge and device-level inference, enabling customers to deploy advanced AI workloads without sacrificing flexibility or efficiency.

What was the most exciting high point of 2025 for your company? 

2025 was a breakout growth year for Quadric. Revenue expanded dramatically, reaching the eight-figure range, and multiple customers progressed deep into their tape-out cycles, positioning us to see customer silicon in 2026. We capped off the year by closing a very strong Series C funding round, which further validates both our technology and our long-term market opportunity (announced Jan 14, 2026).

What was the biggest challenge your company faced in 2025? 

The biggest challenge was managing the pace of growth, both in terms of team expansion and customer demand. We roughly doubled our team size and scaled our sales organization to engage with an order of magnitude more prospective customers. It was a classic "good news / bad news" scenario: rapidly growing interest in our technology required more people, more demos, more benchmarks, and more infrastructure, fast.

On the technology side, the most significant shift was the explosive demand for running LLMs and SLMs directly on devices. In 2025, the conversation changed almost overnight from “Is it possible to run an LLM on device?” to “We must run LLMs on device.” On-device LLMs moved from experimental to mainstream far faster than most of the industry anticipated.

How is your company’s work addressing this challenge? 

In 2025, we made major investments in our software infrastructure to enable efficient execution of LLMs on the Chimera processor platform. Unlike traditional CNN- or vision-centric models, modern language models require advanced techniques such as key-value caching (KV cache), which go well beyond simple graph compilation.

Our Chimera Graph Compiler (CGC) ingests AI models, generates optimized C++ representations of those graphs, and targets efficient execution on our processor. However, enabling high-performance LLM inference required additional application-level C++ code beyond graph execution alone. This is where Chimera is fundamentally different from conventional NPU “accelerators.” Chimera runs full C++ applications—not just fragments of an AI model—entirely on the processor.

As a result, we now support a complete software stack for token-based models, including launch, prefill, and KV caching, all running natively on Chimera with no reliance on a companion CPU.
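To make the prefill and KV-caching terminology above concrete, here is a minimal, self-contained sketch of how an autoregressive decoder reuses cached keys and values. This is a generic illustration, not Quadric code: the names (`KVCache`, `prefill`, `decode_step`) and the single-head, NumPy-based attention are assumptions for the example, standing in for what a real runtime would do per layer and per head.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # toy model dimension

# Hypothetical projection weights, stand-ins for one attention head's Q/K/V matrices.
Wq, Wk, Wv = (rng.standard_normal((D, D)) for _ in range(3))

class KVCache:
    """Stores keys/values for all past tokens so they are computed only once."""
    def __init__(self, d):
        self.K = np.empty((0, d))
        self.V = np.empty((0, d))

    def append(self, k, v):
        self.K = np.vstack([self.K, k])
        self.V = np.vstack([self.V, v])

def attend(q, K, V):
    # Scaled dot-product attention of one query over all cached positions.
    scores = K @ q / np.sqrt(len(q))
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ V

def prefill(cache, prompt):
    # Prefill phase: ingest the whole prompt, filling the cache in one pass.
    for x in prompt:
        cache.append(Wk @ x, Wv @ x)

def decode_step(cache, x):
    # Decode phase: the new token's query attends over every cached key/value;
    # only one new K/V pair is computed per generated token.
    cache.append(Wk @ x, Wv @ x)
    return attend(Wq @ x, cache.K, cache.V)

cache = KVCache(D)
prefill(cache, rng.standard_normal((5, D)))       # 5-token prompt
out = decode_step(cache, rng.standard_normal(D))  # generate one token
```

The point of the cache is visible in `decode_step`: each generated token adds one key/value pair instead of reprocessing the entire sequence, which is why this bookkeeping (and the application logic around it) goes beyond what plain graph compilation of a CNN would require.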

What do you think the biggest growth area for 2026 will be, and why?

The biggest growth area in 2026 will be edge-resident generative AI—particularly LLMs, VLMs, and agent-based models running locally on devices. Market drivers such as latency, power efficiency, data privacy, cost control, and system resilience are pushing intelligence out of the datacenter and onto devices across automotive, industrial, consumer, and infrastructure markets. Customers are no longer willing to compromise between performance and programmability, and that shift strongly favors architectures designed for long-term flexibility.

How is your company’s work addressing this growth? 

Quadric is uniquely positioned to support this growth because Chimera is fully programmable and future-proof by design. As models evolve—and they are evolving rapidly—customers can deploy new networks, operators, and software techniques without changing hardware. Our ability to run complete AI applications in C++, including complex control flow and memory management, enables customers to deploy sophisticated generative AI workloads today and adapt them over time. This dramatically reduces risk for silicon designers planning products with long lifecycles.

What conferences did you attend in 2025 and how was the traffic?

In 2025, Quadric participated in a range of leading AI, semiconductor, and embedded systems conferences. Across the board, traffic and engagement were exceptionally strong. Booth conversations were deeper and more technically informed than in prior years, reflecting a more mature market where customers are actively evaluating deployment strategies rather than simply exploring concepts.

Will you participate in conferences in 2026? Same or more as 2025?

Yes, 2026 will be a significant expansion year for us. We started the year with a big presence at CES in Las Vegas, and throughout the full year we plan to attend more events, increase sponsorships, and focus more on vertical-specific conferences.

How do customers normally engage with your company?

Engagement typically begins with technical briefings and application discussions, followed by hands-on evaluations and benchmarking. Because Chimera™ is highly programmable, customer engagements are often collaborative and long-term.

Additional comments? 

The pace of change in AI is unprecedented, but it is also creating tremendous opportunity for companies willing to rethink traditional hardware and software boundaries. At Quadric, we believe programmability is the key to sustainable AI innovation, and we are excited to help our customers bring advanced intelligence to the edge—without compromise.
