Doubleword doubles down on NVIDIA collaboration to give enterprises control over their AI with NVIDIA NIM microservices integration
June 11, 2025

Paris, France - June 11, 2025 - Doubleword, the leading self-hosted inference platform for enterprises, today announced that it is doubling down on its collaboration with NVIDIA. Fresh from a $12 million Series A round, Doubleword is expanding its platform by integrating NVIDIA NIM microservices. This will enable its customers to deploy a range of LLMs on NVIDIA infrastructure, all while benefiting from the enhanced monitoring, observability, and scalability provided by Doubleword.

Key highlights of the collaboration include:

  • Expanded Inference Platform: Doubleword now integrates NVIDIA universal LLM NIM microservices, letting enterprises tap model-specific AI power with ease.
  • Universal GPU Compatibility: The platform runs on any NVIDIA GPU - including the latest releases - delivering top speed and maximum flexibility.
  • Sovereign AI by Design: Doubleword's on-prem inference stack, wrapped around NVIDIA NIM microservices capabilities, ensures enterprises have the tools they need to maintain control, privacy, and trust in their AI deployments.
  • Integration with NVIDIA AI Blueprints: Doubleword integrates seamlessly with NVIDIA AI Blueprints, enabling teams to follow best practices in application design and fully leverage their AI infrastructure without the burden of managing complex underlying systems.
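For context on what the NIM integration looks like in practice: NIM microservices expose an OpenAI-compatible HTTP API, so a model served this way can be queried with a standard chat-completions request. The sketch below only builds such a request payload; the endpoint path, port, and model name in the comments are illustrative assumptions, not details from this announcement.

```python
import json


def build_chat_request(model: str, prompt: str, max_tokens: int = 64) -> dict:
    """Build an OpenAI-compatible chat-completions payload,
    as accepted by a NIM microservice's /v1/chat/completions route."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


# A NIM container typically serves this API at http://<host>:8000/v1
# (host, port, and model name here are assumptions for illustration).
payload = build_chat_request("meta/llama-3.1-8b-instruct", "What is inference?")
print(json.dumps(payload, indent=2))
```

Because the API surface is OpenAI-compatible, existing client libraries and tooling can usually point at a self-hosted NIM endpoint with only a base-URL change.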

Doubleword was founded in London, before the launch of ChatGPT, by Meryem Arik (CEO), Dr. Jamie Dborin (CSO), and Dr. Fergus Finn (CTO) to solve the inference problem. The team recently raised $12 million from Europe's leading B2B investor, Dawn Capital, and is now on a mission to help enterprises overcome one of the biggest barriers to large-scale enterprise AI adoption: self-hosted inference.

Inference is where AI delivers real-world value: from answering questions to generating images, it transforms models into business outcomes. As AI adoption grows, inference has become mission-critical - a capability enterprises must own and control - but it brings with it the enormous task of building and maintaining performant, scalable inference infrastructure. With the integration of NVIDIA NIM microservices, Doubleword expands its full-stack solution that simplifies deployment and enhances the effectiveness of enterprise AI initiatives. It offers enterprises of all sizes a production-ready, future-proofed self-hosted inference platform.

“We’re delighted to further support our customers in owning and scaling their AI through this initiative with NVIDIA. By integrating the universal LLM NIM microservices, we’re making it even easier for enterprises to deploy state-of-the-art AI models, fully optimized for their target NVIDIA hardware deployments.”

— Meryem Arik, CEO & Co-founder, Doubleword

About Doubleword

Doubleword is a self-hosted inference platform purpose-built for enterprises. Committed to making self-hosting AI as easy as using third-party APIs, Doubleword is on a mission to help enterprises own and control their AI. Doubleword was founded by AI researchers and has received backing from top investors and industry leaders, including Dawn Capital, Hugging Face CEO Clément Delangue, and Dataiku CEO Florian Douetteau. To learn more about Doubleword’s Self-Hosted Inference Platform, visit doubleword.ai.
