March 27, 2026

27/3 Weekly Update: Doubleword CLI and OCR model release

Meryem Arik
Big week at Doubleword, lots shipped - here's what's new.

Introducing dw - the Doubleword CLI 👩🏽‍💻

You can now manage everything from your terminal. dw is a command-line tool for the Doubleword high-volume inference platform: manage projects, prepare and manipulate JSONL files, run batches, stream results, and orchestrate multi-stage inference pipelines - all without leaving your shell. Docs →

pip install dw-cli
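The batch workflow the CLI wraps is built around JSONL files, one inference request per line. As a rough sketch of preparing such a file in Python (the field names here are an illustrative assumption, not Doubleword's documented request schema):

```python
import json

# Two example prompts to batch; real jobs would have thousands.
prompts = [
    "Summarize the attached contract.",
    "Extract all dates from the attached invoice.",
]

# One self-contained JSON object per line; the "custom_id" /
# "messages" field names are assumed for illustration.
with open("batch.jsonl", "w") as f:
    for i, prompt in enumerate(prompts):
        request = {
            "custom_id": f"req-{i}",
            "messages": [{"role": "user", "content": prompt}],
        }
        f.write(json.dumps(request) + "\n")
```

Because every line parses independently, batch systems can shard, stream, and resume huge files without loading them whole, which is what makes JSONL the usual interchange format for high-volume inference.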

Three new OCR models 📄

We've added three state-of-the-art OCR models to the platform:

  • deepseek-ai/DeepSeek-OCR-2
  • allenai/olmOCR-2-7B-1025-FP8
  • lightonai/LightOnOCR-2-1B-bbox-soup

OCR is one of the best fits for high-volume inference - documents don't care about latency, and the volumes can be enormous. Our Chief Scientist Jamie wrote about why the bitter lesson applies here too. Read Jamie's OCR blog →
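To make the OCR-as-batch idea concrete, here is a sketch of building one batch entry that sends an image to one of the models above, using the OpenAI-compatible chat format many inference platforms accept; the exact schema Doubleword expects is an assumption here, not taken from its docs:

```python
import base64
import json

def ocr_request(custom_id: str, image_bytes: bytes, model: str) -> str:
    """Build one JSONL line asking `model` to transcribe an image.

    The request body mirrors the common OpenAI-style chat schema with an
    inline base64 data URL; field names are assumed for illustration.
    """
    b64 = base64.b64encode(image_bytes).decode("ascii")
    body = {
        "custom_id": custom_id,
        "model": model,
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": "Transcribe all text in this image."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{b64}"}},
            ],
        }],
    }
    return json.dumps(body)

# Placeholder bytes stand in for a real PNG.
line = ocr_request("doc-0001", b"\x89PNG...", "deepseek-ai/DeepSeek-OCR-2")
```

Since no interactive user is waiting on any single page, thousands of such lines can be queued as one batch and processed whenever capacity is cheapest.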

Also this week:

  • ⚡️ 10-20x faster batch uploads: large batches now upload in a fraction of the time.
  • 🦞 OpenClaw integration: if you're running an OpenClaw agent, you can now route background inference through Doubleword to cut costs. Setup guide →
  • 🏥 Use case: OpenMed used Doubleword to annotate 119K medical images at 93% accuracy for under $500, then fine-tuned three small models - best result: +15% exact match. Read more →

Give the CLI a try - install instructions are in the docs. Let us know what you think.
