ai.rud.is

ollama-usage

hrbrmstr

I play with many local and remote AI providers/tools, one of which is Ollama Cloud. I don’t know how they’re serving up models so cheaply, but the Pro plan goes a pretty long way, and it makes it fairly painless to test out other giant models to see how they’re catching up to Opus (or its forthcoming big, scary sibling).

The fine folks at Ollama Cloud have yet to provide an API endpoint for one to check usage, and they also have not seen fit to support dark mode, which means I go blind at 0-dark-30 when doing a quick check.

Until they do, I made a tiny Go CLI — ollama-usage — that uses a saved cookie (pulled from the Firefox cookie store on macOS, or supplied on the command line or in a file) to fetch usage stats from the Ollama Cloud dashboard.
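There’s not much magic under the hood: grab the cookie, staple it onto a request, hit the dashboard. Here’s a stripped-down sketch of the idea — the URL is a placeholder, and the cookie value shown is obviously fake; the real endpoint is whatever the dashboard actually calls:

```go
package main

import (
	"fmt"
	"net/http"
)

// newUsageRequest builds a GET request carrying the saved session cookie.
// The URL below is a placeholder, not the dashboard's real endpoint.
func newUsageRequest(cookie string) (*http.Request, error) {
	req, err := http.NewRequest(http.MethodGet, "https://ollama.com/", nil)
	if err != nil {
		return nil, err
	}
	// The whole trick: replay the browser's cookie so the dashboard
	// thinks we're a logged-in session.
	req.Header.Set("Cookie", cookie)
	req.Header.Set("Accept", "application/json")
	return req, nil
}

func main() {
	req, err := newUsageRequest("session=REDACTED")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL, "cookie set:", req.Header.Get("Cookie") != "")
}
```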

Running it can either provide a pretty view:

Ollama Cloud Usage (pro plan)

  Session:    2.1% used  (resets in 49 minutes)
  Weekly:    19.8% used  (resets in 2 days)

  Session █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░   2.1%
  Weekly  ████████░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  19.8%
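(The bars are nothing fancy — round the percentage into a run of filled blocks and pad the rest; roughly:)

```go
package main

import (
	"fmt"
	"math"
	"strings"
)

// bar renders a fixed-width usage bar, e.g. "█████░░░░░" for 50% at width 10.
func bar(pct float64, width int) string {
	filled := int(math.Round(pct / 100 * float64(width)))
	if filled > width {
		filled = width // clamp anything over 100%
	}
	return strings.Repeat("█", filled) + strings.Repeat("░", width-filled)
}

func main() {
	fmt.Printf("Session %s %5.1f%%\n", bar(2.1, 41), 2.1)
	fmt.Printf("Weekly  %s %5.1f%%\n", bar(19.8, 41), 19.8)
}
```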

or JSON output:

{
  "plan": "pro",
  "session_percent": 2.1,
  "session_resets_at": "2026-04-03T16:00:00Z",
  "weekly_percent": 19.8,
  "weekly_resets_at": "2026-04-06T00:00:00Z",
  "fetched_at": "2026-04-03T15:11:00Z"
}

Kick the tyres, shoot me a PR, and hope we don’t need this for much longer (how hard can it be to make this an API endpoint?).
