OpenCode + BlueClaw: Escape the AI Credit Treadmill

Your Claude credits hit zero at 11pm on a Tuesday.

You were in the middle of a refactor. The agent was actually being helpful for once. Then: “You’ve reached your usage limit.” Cool. Productivity over.

Or maybe you’re on OpenAI’s pay-as-you-go and watching the meter tick. Every prompt costs something. Every iteration of an agent loop adds up. By the end of the month you’re looking at a bill that makes you wonder if you should just hire a junior.

There’s a way out. This post is how.


The Credit Treadmill

Commercial LLM APIs sell you a story: drop in a key, pay per token, scale forever. The reality:

  1. Credits run out. Right when you need them most. Mid-refactor, mid-debug, mid-flow.
  2. Costs scale unpredictably. An agentic coding tool can burn through tokens fast. Tool calls, retries, context refills — it adds up.
  3. Your code goes through their servers. Every prompt, every file, every snippet. Read the ToS. Some of it sticks.
  4. You’re locked in. Switch providers and your tooling breaks. The “open” ecosystem isn’t very open.

There’s a better way.


What BlueClaw Is

BlueClaw is an open market for idle GPU compute.

GPU providers across the network have capacity sitting idle: overnight, off-peak, between workloads. That idle compute is wasted compute. BlueClaw lets those providers sell that capacity into an open market. Buyers (you) get LLM tokens at a fraction of what the centralized providers charge.

The pitch is simple:

  • Significantly cheaper tokens than the commercial APIs. The market prices idle compute down because the alternative is letting it sit.
  • OpenAI-compatible API. If your tool speaks OpenAI, it speaks BlueClaw.
  • No data harvesting. BlueClaw exists for people who don’t want their prompts mined, logged, and recycled into someone else’s training set.
  • No rug pull. You aren’t at the mercy of a single provider’s pricing whims or rate-limit policies.

If you’re building coding agents — or just using one — BlueClaw is the path off the credit treadmill.


What OpenCode Is

OpenCode is an open-source, terminal-based AI coding agent — think Claude Code or Cursor, but you bring your own model provider. It’s MIT-licensed, runs locally, and treats providers as pluggable configuration rather than a hardcoded dependency.

That last part is the unlock. OpenCode + a custom provider config = your tool, your model, your bill.


The Setup

Five steps. Should take less than ten minutes start to finish.

Step 1 — Install OpenCode

npm i -g opencode-ai

That’s it. One global install.

Step 2 — Sign up for BlueClaw

Head over to blueclaw.network and sign up.

Signup is public. Beta cohorts run in batches of 25, with the first picks of each cohort going to the BlueClaw network, and the current cohort wraps up soon. If you want a free seat in the next cohort, sign up now and get in line.

→ Sign up for the beta: blueclaw.network

→ Follow @BlueClawNetwork on X for cohort announcements and supported-model updates.

Once you’re in, head to the portal and grab your API key.

Step 3 — Configure OpenCode

OpenCode reads its config from ~/.config/opencode/opencode.json. Create it (or edit it) so it looks like this:

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "blueclaw": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "BlueClaw Network",
      "options": {
        "baseURL": "https://openai.blueclaw.network/v1"
      },
      "models": {
        "Qwen3.6-27B": {
          "name": "Qwen3.6-27B",
          "cost": {
            "input": 0.4,
            "output": 1.2
          },
          "limit": {
            "context": 196608,
            "input": 188416,
            "output": 8192
          }
        },
        "Qwen3-Coder-30B-A3B-Instruct": {
          "name": "Qwen3-Coder-30B-A3B-Instruct",
          "cost": {
            "input": 0.4,
            "output": 1.2
          },
          "limit": {
            "context": 65536,
            "input": 65536,
            "output": 8192
          }
        }
      }
    }
  }
}

A few notes on what’s happening here:

  • npm: "@ai-sdk/openai-compatible" tells OpenCode to use the generic OpenAI-compatible adapter from the Vercel AI SDK. No custom code required.
  • baseURL points at BlueClaw’s OpenAI-compatible endpoint.
  • The models block declares which models you want exposed, with cost-per-million-token and context-window metadata so OpenCode can display usage info accurately.

Heads-up: the supported model lineup evolves. Always check the BlueClaw portal for the latest list of supported models and update the models block to match. Adding a model name OpenCode doesn’t know about will fail silently — keep this config in sync.
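If you want to see what OpenCode will actually send through that baseURL, here's a minimal sketch in Python. It only builds the standard OpenAI-compatible chat-completions request body; the model name comes from the config above, and the BLUECLAW_API_KEY name in the curl comment is a placeholder, not an official variable.

```python
import json

# BlueClaw exposes an OpenAI-compatible endpoint, so the request body
# follows the standard chat-completions shape. The model name must
# match an entry in your opencode.json "models" block.
BASE_URL = "https://openai.blueclaw.network/v1"

def chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

body = chat_request("Qwen3-Coder-30B-A3B-Instruct", "Say hello")
print(json.dumps(body, indent=2))

# Send it with any HTTP client, e.g.:
#   curl $BASE_URL/chat/completions \
#     -H "Authorization: Bearer $BLUECLAW_API_KEY" \
#     -H "Content-Type: application/json" \
#     -d @body.json
```

Any tool that can produce this shape can talk to the endpoint — which is exactly why the generic @ai-sdk/openai-compatible adapter works without custom code.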

Step 4 — Connect

Launch opencode in your project directory, then run:

/connect

It will prompt you for your API key — paste the one from the BlueClaw portal — and ask you to pick a model. Done.

Step 5 — Track Your Usage

Watch your token spend any time at portal.blueclaw.network/#usage. Real numbers, real-time. No mystery bills at the end of the month.
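The portal numbers are easy to sanity-check by hand. Using the per-million-token prices declared in the config above ($0.40 input, $1.20 output), a rough estimator looks like this; the defaults are just the config values, and actual prices are whatever the market sets at the time.

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_price: float = 0.4,
                  output_price: float = 1.2) -> float:
    """Estimate spend in dollars from per-million-token prices."""
    return (input_tokens * input_price
            + output_tokens * output_price) / 1_000_000

# A heavy agent session: 2M input tokens, 250k output tokens.
print(f"${estimate_cost(2_000_000, 250_000):.2f}")  # → $1.10
```

That's a long agentic session for about the price of a vending-machine snack — the arithmetic is the whole pitch.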


The Architecture

Here’s what’s happening under the hood when you ask OpenCode to refactor a function:

┌─────────────────────────────────────────────────────────┐
│                  YOUR TERMINAL                          │
│  ┌───────────────────────────────────────────────────┐  │
│  │  opencode (CLI)                                   │  │
│  │  - reads ~/.config/opencode/opencode.json         │  │
│  │  - uses @ai-sdk/openai-compatible                 │  │
│  └────────────────────┬──────────────────────────────┘  │
└───────────────────────┼─────────────────────────────────┘

                        ▼ OpenAI-compatible request
        ┌───────────────────────────────────┐
        │  openai.blueclaw.network/v1       │  ← Auth + routing
        └───────────────┬───────────────────┘
                        ▼
        ┌───────────────────────────────────┐
        │  BLUECLAW NETWORK                 │
        │  Open market — bids matched to    │
        │  available idle GPU capacity      │
        └───────────────┬───────────────────┘
                        ▼
        ┌───────────────────────────────────┐
        │  GPU Provider (idle capacity)     │
        │  Runs the requested model         │
        └───────────────────────────────────┘

Your code never touches a centralized provider’s servers. Your prompts aren’t logged into someone’s training pipeline. You pay for compute that would otherwise sit idle, at market rates the market sets.


Why This Works (the Economics)

Centralized providers price tokens based on what the market will bear. They have margin to protect, datacenters to amortize, training runs to fund. Those costs end up on your invoice.

Idle GPU compute is different. The hardware exists. The electricity is already running. The capacity is already there — it’s just unused. Selling that capacity at any price above zero is profit for the provider.

That’s the BlueClaw thesis: an open market for compute that would otherwise be wasted, priced by supply and demand, with no surveillance overhead baked into the bill.

The result for you:

Concern                   Centralized API                           BlueClaw
──────────────────────────────────────────────────────────────────────────────
Cost per million tokens   Premium pricing                           Significantly cheaper
Rate limits               Hard caps that interrupt your flow        No artificial throttling
Data privacy              Prompts may be logged/used for training   Not mined for training
Provider lock-in          One vendor, take it or leave it           Open market, OpenAI-compatible
Predictability            Per-request surprise bills                Visible usage in the portal

The Beta Cohort — Get In

Beta access is run in rolling cohorts of 25 seats. The current cohort is wrapping up soon, which means the next batch is about to open. Signup is public, but seats are limited and the first picks of each cohort come from the BlueClaw network.

If you’ve been hunting for an alternative to expensive commercial AI coding tools — this is the moment.

→ Sign up here: blueclaw.network

→ Follow for cohort updates: @BlueClawNetwork

Get in the queue. Follow the X account so you don’t miss the next cohort opening or supported-model announcements.


What’s Next

I’ll be posting more on running real agentic workloads on decentralized infrastructure — coding tools, autonomous agents, the works. The pattern is the same across all of it: commercial APIs are a treadmill, decentralized compute is the off-ramp.

If you build coding tools, run agents, or just write a lot of code with AI assistance — this stack is worth ten minutes of your time.


Get Involved

Questions, feedback, or war stories about running out of Claude credits at 11pm? Find me on X at @mikezupper.


The future of AI tooling isn’t a handful of providers metering your every keystroke. It’s open markets, open models, and infrastructure you actually own. BlueClaw is a step toward that future. Get on the train.