Fujitsu picks model-maker Cohere as its partner for the rapid LLM-development dance

Will become exclusive route to market for joint projects


Fujitsu has made a "significant investment" in Toronto-based Cohere, a developer of large language models and associated tech, and will bring the five-year-old startup's wares to the world.

The relationship has four elements, one of which will see the two work on a Japanese-language LLM that's been given the working title Takane. Fujitsu will offer Takane to its Japanese clients. Takane will be based on Cohere's latest LLM, Command R+, which we're told features "enhanced retrieval-augmented generation capabilities to mitigate hallucinations."
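Retrieval-augmented generation, the capability Cohere highlights in Command R+, works by fetching relevant documents and grounding the model's prompt in them, so the model answers from real sources rather than inventing facts. A minimal sketch of the idea, with an invented toy corpus and a naive word-overlap retriever standing in for Cohere's actual (unpublished) pipeline:

```python
# Sketch of retrieval-augmented generation (RAG): retrieve documents
# relevant to the query, then prepend them to the prompt so the LLM
# grounds its answer in them. Corpus, scoring, and prompt format are
# illustrative assumptions, not Cohere's implementation.

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query."""
    q_words = set(query.lower().split())
    return sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Ground the prompt in retrieved context before it reaches the model."""
    context = "\n".join(retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Takane is a Japanese-language LLM built on Cohere's Command R+.",
    "Fujitsu offers private cloud deployments for regulated industries.",
    "Arm announced a new chip for embedded inference.",
]
prompt = build_prompt("What model is Takane based on?", corpus)
```

A production system would use embedding-based vector search rather than word overlap, but the hallucination-mitigation logic is the same: the model is told to answer only from retrieved text.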

The duo will also build models "to serve the needs of global businesses."

The third element of the relationship will see Fujitsu appointed the exclusive provider of jointly developed services. The pair envisage those services as private cloud deployments "to serve organizations in highly regulated industries including financial institutions, the public sector, and R&D units."

The fourth and final element of the deal will see Takane integrated with Fujitsu's generative AI amalgamation technology – a service that selects, and if necessary combines, models to get the best tools for particular jobs.
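Fujitsu hasn't published how its amalgamation service decides which model handles a given job, but the routing idea it describes can be sketched as a simple registry lookup. The registry contents and tag-matching rule below are invented for illustration:

```python
# Hedged sketch of a model-selection ("amalgamation") layer: route each
# request to whichever registered model matches the task's tags, falling
# back to a default. The registry and matching rule are assumptions made
# for illustration, not Fujitsu's actual logic.

MODEL_REGISTRY = {
    "japanese": "takane",           # Japanese-language tasks
    "retrieval": "command-r-plus",  # RAG-heavy workloads
}
DEFAULT_MODEL = "general-llm"

def select_model(task_tags: set[str]) -> str:
    """Return the first registered model whose tag matches the task."""
    for tag, model in MODEL_REGISTRY.items():
        if tag in task_tags:
            return model
    return DEFAULT_MODEL
```

Combining models, the other half of Fujitsu's pitch, would extend this with an ensemble step; selection alone is shown here to keep the sketch minimal.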

It's 2024, so no IT services provider can afford not to be developing generative AI assets and partnerships. To do otherwise is to risk missing out on the chance of winning business in the hottest new enterprise workload for years, and thereby forgetting the time-honored enterprise sales tactic of "land and expand." At worst – if things go pear-shaped – such projects end up as siloed apps that become legacy tech and can be milked for years.

This deal is notable, given the likes of OpenAI, Mistral AI, and Anthropic are seen as the LLM market leaders worthy of ring-kissing by global tech players.

By partnering with Canadian Cohere, Fujitsu has taken a different path – and perhaps differentiated itself.

Cohere is not, however, a totally left-field choice. Nvidia and Cisco have invested in the biz, and its models are sufficiently well regarded and in demand that AWS, Microsoft, and Hugging Face have all included its wares in their ModelMarts. ®
