Oliver Servín

January 30, 2026

How I use one AI subscription for everything

I refuse to pay $100+ per month for ChatGPT, Claude Code, Raycast AI, Perplexity, Gemini, and the next ten AI subscriptions launching this year. Here's how I get all their value for the price of one.

The subscription fatigue problem

AI is everywhere now, and every vendor wants you to pay for their walled garden.

  • ChatGPT Plus: $20/month
  • Claude Code: $20/month
  • Raycast AI Pro: $20/month
  • Perplexity Pro: $20/month
  • Google Gemini Advanced: $20/month

That's $100+ per month just to access the same frontier models across different interfaces.

The economics don't add up, but the bigger problem is what you're actually buying: vendor lock-in.

When you subscribe to a specific AI service, you're not just paying for access to a model. You're paying for an experience that's tied to that vendor's ecosystem. Your subscription isn't portable. You can't take it with you when you switch tools.

This matters because tools change. Vendors change their terms. Sometimes they change them dramatically.

Case in point: On January 27, 2026, Anthropic blocked third-party tools like OpenCode from accessing Claude models. Developers who had built their workflows around OpenCode suddenly lost access to Claude unless they migrated to Anthropic's official Claude Code CLI.

That's the reality of vendor lock-in: your workflow is held hostage.

My strategy: one subscription, portable workflow

I took a different approach. Instead of paying for multiple vendor-specific subscriptions, I invested in one OpenAI-compatible subscription that works everywhere.

My choice: Z.ai coding subscription.

Here's why this matters:
  • One API endpoint that works with tools supporting custom providers
  • OpenAI-compatible, so it plugs into any tool that supports BYOK (Bring Your Own Key)
  • Includes MCP (Model Context Protocol) for web searches and content reading
  • Cost-effective compared to paying for five separate subscriptions

The key is choosing tools that embrace custom providers instead of walled gardens.
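
Because the provider speaks the OpenAI API, anything built on an OpenAI SDK can talk to it just by overriding the base URL. Here's a minimal TypeScript sketch using the official openai package; the endpoint and model id are the ones from my Z.ai setup, and ZAI_API_KEY is a placeholder for wherever you keep the key.

// Minimal sketch: point the official OpenAI SDK at an OpenAI-compatible provider.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.z.ai/api/coding/paas/v4", // any OpenAI-compatible endpoint
  apiKey: process.env.ZAI_API_KEY,                // the provider's key, not OpenAI's
});

const response = await client.chat.completions.create({
  model: "glm-4.7", // whatever model id your provider exposes
  messages: [{ role: "user", content: "Summarize BYOK in one sentence." }],
});

console.log(response.choices[0].message.content);

Every BYOK tool below is doing some version of this under the hood: same request format, different base URL and key.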

My daily workflow

I work primarily from my desktop computer. Being at my desk is how I shift my mind to work mode, so that's where all my AI interactions happen.

Raycast for quick queries and writing cleanup

I use Raycast all day for fast AI interactions:
  • Quick AI for one-off questions (just hit Tab after typing your query)
  • "Improve Writing" command for grammar and spelling cleanup
  • Custom AI commands for specific tasks

Raycast promotes their own subscription plan for their AI tools, but I'm not subscribed to it. Instead, I configured Raycast to use my Z.ai subscription as a custom provider.

OpenCode for coding (avoiding Claude Code lock-in)

For coding, I use OpenCode instead of Anthropic's official Claude Code CLI. Why? Because I want the freedom to switch models and providers without changing my workflow.

With OpenCode, I can:
  • Use GPT-5 for coding
  • Switch to Gemini if I want
  • Use local models via Ollama for free
  • Configure any OpenAI-compatible endpoint

I'm not locked into any specific model. I'm locked into my workflow.
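
OpenCode picks up custom providers from its opencode.json config file. The exact schema can shift between releases, so treat this as a rough sketch of what an OpenAI-compatible provider entry looks like rather than a copy-paste recipe, and check the current OpenCode docs; the endpoint, key variable, and model id are the same assumptions as in the Z.ai setup above.

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "zai": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Z.ai",
      "options": {
        "baseURL": "https://api.z.ai/api/coding/paas/v4",
        "apiKey": "{env:ZAI_API_KEY}"
      },
      "models": {
        "glm-4.7": { "name": "GLM 4.7" }
      }
    }
  }
}

Once the provider is registered, switching models is a config edit, not a workflow change.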

The configuration tradeoff

Here's the thing: choosing tools with custom provider support requires manual configuration. It's not as frictionless as clicking "Subscribe to Raycast AI" and being done.

For Raycast, the setup looks like this:
  1. Open Raycast Settings → AI → Custom Providers
  2. Click "Reveal Providers Config" to open providers.yaml
  3. Configure your custom provider:

providers:
  # Z.ai's OpenAI-compatible coding endpoint registered as a custom provider
  - id: z.ai
    name: Z.ai
    base_url: https://api.z.ai/api/coding/paas/v4
    api_keys:
      zia: YOUR_ZAI_API_KEY
    models:
      - id: glm-4.7
        name: GLM 4.7
      - id: glm-4.6
        name: GLM 4.6

Save the file and restart Raycast.

The tradeoff is ongoing maintenance:
  • When new models are released, you have to manually update the config file
  • You need to keep track of which models are available from your provider
  • There's no auto-discovery or model list updates

When Z.ai released GLM 4.7, I had to update my config file by hand to add it alongside GLM 4.6. It's a few minutes of work every few months.
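
Concretely, the edit was a couple of new lines in the models list of providers.yaml:

    models:
      - id: glm-4.7   # new entry added by hand when the model shipped
        name: GLM 4.7
      # ...existing entries stay as they are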

Is it worth it? Absolutely.

What I get in return

Save $100+ per month

Instead of paying for ChatGPT, Claude Code, Raycast AI, Perplexity, and Gemini separately, I pay for one subscription that powers them all.

That's $1,200+ per year in subscriptions I no longer pay for, minus the cost of the one I keep.

Portable workflow

All my AI interactions flow through the same subscription and models. No friction switching between tools. It's all the same brain, just different interfaces.

If a tool changes its terms or becomes less useful, I can switch to another tool without losing my subscription. My workflow isn't held hostage by any vendor.

Freedom from lock-in

When Anthropic blocked OpenCode from accessing Claude models, users who had built their workflow around OpenCode had to either:
  • Switch to Claude Code (official, but locked to Claude)
  • Keep using OpenCode with a different provider (GPT, Gemini, or any OpenAI-compatible endpoint)

I chose option 2 because my models were already coming through my Z.ai subscription. My workflow didn't change. I just changed the model.

That's the power of portability.

The future: tools that embrace custom providers

Smart tools will win by embracing custom providers. Walled gardens will lose as users demand portability.

The tools that thrive are the ones that let you bring your own subscription:
  • Raycast (supports BYOK and custom providers via providers.yaml)
  • OpenCode (supports OpenAI-compatible endpoints)
  • Cursor (works with official API keys from multiple providers)

The tools that force you into their subscription are betting on lock-in. The ones that give you choice are betting on portability.

I'm betting on portability.

You're not cheap, you're strategic

Choosing tools that support custom providers isn't about being cheap. It's about avoiding vendor lock-in and maintaining control over your workflow.

When you pay $20 for Claude Code, you're not just paying for Claude access. You're paying to be locked into Anthropic's ecosystem. When you pay $20 for ChatGPT, you're paying to be locked into OpenAI's ecosystem.

Multiply that across five tools, and you're spending $100 a month to be locked into five different ecosystems.

The smarter approach: Invest in one high-quality subscription that offers OpenAI-compatible APIs, then choose tools that let you use it.

Next steps

If you're tired of paying for multiple AI subscriptions or worried about vendor lock-in, here's what to do:

  1. Audit your current AI subscriptions. How much are you paying monthly? What services are they tied to?
  2. Look for tools that support custom providers. Key terms to search for: "BYOK," "custom LLM endpoints," "Bring Your Own Key," "OpenAI-compatible."
  3. Pick one high-quality subscription. Make sure it offers OpenAI-compatible APIs and the models you need (the sketch after this list shows one quick way to check).
  4. Reconfigure your tools. It's manual upfront, but you save money and gain portability.
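
For step 3, one quick sanity check before committing to a provider is to list what its endpoint actually exposes, using the same OpenAI SDK pointed at the custom base URL. Not every OpenAI-compatible provider implements the models route, so treat this as optional; the base URL and key variable are the same assumptions as in the earlier sketch.

// Quick check: list the model ids an OpenAI-compatible endpoint exposes.
// If the provider doesn't implement the /models route, this call will fail,
// and the provider's documentation is the source of truth instead.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.z.ai/api/coding/paas/v4",
  apiKey: process.env.ZAI_API_KEY,
});

const models = await client.models.list();
for (const model of models.data) {
  console.log(model.id);
}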

The bottom line

The best AI subscription isn't the one from the biggest vendor. It's the one you control.

One subscription. Portable workflow. No vendor lock-in.

That's how I use AI without going broke or getting held hostage by any single vendor.

About Oliver Servín

Working solo at AntiHQ, a one-person software company.