The End of the All-You-Can-Use AI Model
The shift marks a fundamental reset in how GitHub monetizes its flagship AI coding assistant. Under the current request-based system, a quick chat question and a multi-hour autonomous coding session cost users the same amount, even though the latter consumes vastly more computational resources. GitHub has absorbed much of the escalating inference cost behind that usage, but the current premium request model is no longer sustainable, explained Mario Rodriguez, GitHub's chief product officer, in a blog post.
The economics became untenable as AI model capabilities improved and developer demand surged following high-profile agent breakthroughs in early 2026. GitHub, along with competitors like Anthropic and Google, faced mounting infrastructure costs that subscription revenue could no longer cover.
How GitHub AI Credits Will Replace Premium Requests
GitHub is introducing a virtual currency called GitHub AI Credits, valued at $0.01 each. Usage will be metered by input, output, and cached tokens, each priced according to the model used. Subscription prices themselves remain unchanged: Copilot Pro stays at $10/month, Pro+ at $39/month, Business at $19/user/month, and Enterprise at $39/user/month.
However, the allotments differ sharply. Copilot Pro subscribers receive 1,000 AI Credits monthly, while Pro+ subscribers get 3,900. Business and Enterprise customers receive 1,900 and 3,900 credits per user respectively, though existing customers enjoy higher allocations (3,000 and 7,000) through September 1, 2026, as a transition incentive.
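The mechanics above reduce to simple arithmetic: token usage is converted to dollars at per-model rates, then to credits at $0.01 apiece. The sketch below illustrates that conversion; the per-million-token rates and model name are hypothetical placeholders (GitHub has not published them here), and only the $0.01-per-credit value and the 1,000-credit Pro allowance come from the announcement.

```python
# Hedged sketch: estimating GitHub AI Credit consumption for one session.
# Per-million-token rates and the model name are hypothetical placeholders;
# only the $0.01-per-credit value is from the announcement.

CREDIT_VALUE_USD = 0.01  # 1 AI Credit = $0.01

# Hypothetical per-model rates, in USD per million tokens.
HYPOTHETICAL_RATES = {
    "example-model": {"input": 3.00, "output": 15.00, "cached": 0.30},
}

def credits_used(model, input_tokens, output_tokens, cached_tokens=0):
    """Convert token usage for one model into AI Credits."""
    r = HYPOTHETICAL_RATES[model]
    cost_usd = (
        input_tokens / 1e6 * r["input"]
        + output_tokens / 1e6 * r["output"]
        + cached_tokens / 1e6 * r["cached"]
    )
    return cost_usd / CREDIT_VALUE_USD

# A Copilot Pro allowance of 1,000 credits represents $10 of metered usage.
monthly_credits = 1000
session = credits_used("example-model", 50_000, 10_000, 20_000)
print(f"Session used {session:.1f} of {monthly_credits} credits")
```

Under these assumed rates, a session with 50k input, 10k output, and 20k cached tokens burns about 30.6 credits, roughly 3% of a Pro subscriber's monthly 1,000.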
Once users exhaust their monthly allowance, they can define an overflow budget or simply wait for the next billing cycle. Code completions and Next Edit Suggestions remain unlimited on paid plans, ensuring basic functionality continues even after credits are depleted.
The Token Price Shock Ahead
Premium model pricing will spike significantly under the new regime. Anthropic’s Opus 4.7, currently subject to a 7.5x multiplier, will jump to 27x. OpenAI’s GPT-5.4 multiplier will rise from 1x to 6x. Different models meter tokens at different rates, making cost prediction difficult.
GitHub plans to soften the blow by introducing a preview bill experience in early May, giving users and admins visibility into projected costs before the June 1 transition.
Users on annual subscription plans can cancel for a pro-rated refund or downgrade to Copilot Free upon expiration; those plans will not renew.
Industry-Wide Price Corrections Accelerating
GitHub’s move follows similar actions by Anthropic and Google, both of which have tightened usage limits on subsidized services. OpenAI debuted a $100 subscription tier to boost adoption of its Codex model while contemplating an end to unlimited usage altogether. Cloud providers including AWS and Azure have battled capacity challenges as demand outpaced infrastructure investment.
The industry’s pivot away from unlimited plans reflects a hard reality: venture-backed generosity has run into the physics of inference. When serving costs exceed subscription revenue, companies must recalibrate pricing or absorb unsustainable losses, as GitHub signaled last week by suspending new Copilot plan creation.
What Developers Should Watch
The June 1 transition date gives developers two months to assess their actual usage patterns and plan budgets accordingly. Usage-based billing introduces unpredictability; token consumption varies by prompt complexity, model choice, and tool integration. The preview bill feature will be critical for teams managing Copilot costs across multiple developers.
Existing annual subscribers face the steepest cliff: premium model multipliers climbing as high as 27x will make long-running sessions prohibitively expensive unless overflow budgets are carefully managed.