MegaChips takes aim at edge AI in US with ASIC program

Japanese ASIC provider rolling out AI partner program so customers can get by without in-house AI expertise


Japanese ASIC provider MegaChips is rolling out an AI partner program pitched as allowing organizations to deliver AI capabilities without requiring in-house experts. It is also expanding into the US market to sell its full ASIC design services to American tech companies.

MegaChips is a global fabless chip company that claims to work closely with its customers' design teams to deliver the tech they need in silicon. The AI Partner Program marks the company's entry into the global edge AI chip market, estimated to be worth $9 billion in 2020, with MegaChips claiming there is growing demand for embedded AI solutions.

For edge AI chips, think of silicon designed to power devices at the network edge, where data processing may demand an instant response, power may be limited, and sending all the data back to the cloud or a datacenter for processing may be costly in bandwidth terms and add latency.

According to MegaChips, the AI Partner Program benefits include access to a dedicated team of engineers to identify the best ways to implement desired AI functions, custom "proof of concept" demonstrations, and optimization of the complete system. It also eliminates any need for customers to have their own silicon implementation teams, it said.

The AI Partner Program appears to build on two key technology partnerships that MegaChips recently established. One is with BrainChip Holdings Ltd, an Australian AI firm, to license its edge AI solution, the Akida neuromorphic processor.

"By providing Akida's on-chip learning and ultra-low power Edge AI capabilities as an integrated technology in MegaChips ASIC solutions, we are able to bring practical, cutting-edge capabilities to the edge that ensure power efficiency without compromising accuracy," said BrainChip VP of Worldwide Sales and Marketing Rob Telson.

The other is a stake in Quadric.io, a US firm that has developed an AI inference processor for edge devices that integrates AI and DSP (digital signal processing) functions, enabling it to accelerate all the steps involved in processing data from a variety of sensors, according to the firm.

MegaChips also announced it is now offering its full-service ASIC solution in the US, which director of business development, Douglas Fairbairn, said is an opportunity to take its edge AI nous stateside.

At the other end of the scale, chip designer Esperanto Technologies recently showcased a processor with more than 1,000 general-purpose RISC-V cores, aimed at delivering faster and more energy-efficient AI inference performance.

The Register reported last year how AI chip startups are focusing on performance-per-dollar and energy efficiency as their competitive selling point against solutions using general-purpose processors. ®
