Official Introduction
What Can This Tool Do for You?
Scenario 1: Switching to a Better IDE
I have a Kiro account that gives me free monthly access to Claude Sonnet 4.5, but the Kiro IDE isn’t very comfortable. I’d prefer using Claude Code or Cursor for coding, but don’t want to pay extra for API access.
Scenario 2: Sharing Quota with Other Tools
I still have plenty of unused Claude Code quota this month — instead of letting it go to waste, I’d like to redirect it to Cherry Studio for chatting, or use it as an API backend for my AI Agent project.
Scenario 3: Unified Management of Multiple AI Accounts
I own several accounts (Kiro, Gemini CLI, Qwen), and I want to manage them collectively — automatically using whichever has available quota.
ProxyCast solves these problems — it converts your existing AI client credentials into standard OpenAI-compatible APIs, enabling any OpenAI-compatible tool to leverage your free quotas.
How It Works
```
 Your AI Client Credentials          ProxyCast           Any OpenAI-Compatible Tool
┌─────────────────────┐           ┌─────────────┐        ┌─────────────────────┐
│  Kiro OAuth         │           │             │        │  Claude Code        │
│  Gemini OAuth       │   ───▶    │  Local API  │  ───▶  │  Cherry Studio      │
│  Qwen OAuth         │           │  Proxy      │        │  Cursor / Cline     │
│  ...                │           │  Service    │        │  Your AI Agent      │
└─────────────────────┘           │             │        └─────────────────────┘
                                  └─────────────┘
```
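Conceptually, the proxy's job in the diagram above comes down to a small translation step: accept an OpenAI-format request, attach the stored OAuth credential for the selected provider, and forward it upstream. The sketch below is purely illustrative — the function and field names are assumptions, not ProxyCast's actual internals:

```python
# Hypothetical sketch of the credential-attachment idea (not ProxyCast's
# real code): an OpenAI-style request comes in, and the proxy pairs it
# with the stored OAuth token before forwarding upstream.

def build_upstream_request(openai_request: dict, credential: dict) -> dict:
    """Attach the provider's OAuth access token to an incoming request."""
    return {
        "url": credential["upstream_url"],
        "headers": {"Authorization": f"Bearer {credential['access_token']}"},
        "body": openai_request,  # payload is already OpenAI-compatible
    }

req = build_upstream_request(
    {"model": "claude-sonnet-4-5", "messages": [{"role": "user", "content": "Hi"}]},
    {"upstream_url": "https://example.invalid/v1", "access_token": "oauth-token"},
)
print(req["headers"]["Authorization"])  # Bearer oauth-token
```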
Difference from AIClient-2-API
ProxyCast is the desktop version of AIClient-2-API, offering a more user-friendly GUI and one-click operations without requiring command-line configuration.
Core Features
Multi-Provider Unified Management
- Kiro Claude - Free usage of Claude Sonnet 4.5 via OAuth
- Gemini CLI - Bypass Gemini’s free-tier limits via OAuth
- Gemini API Key - Load balancing across multiple accounts, model exclusion support
- Qwen (Tongyi Qianwen) - Use Qwen3 Coder Plus via OAuth
- OpenAI Codex - Access GPT models via OAuth
- iFlow - Supports both OAuth and Cookie authentication
- Vertex AI - Google Cloud AI platform with model alias support
- Custom OpenAI - Configure custom OpenAI-compatible APIs
- Custom Claude - Configure custom Claude APIs
User-Friendly GUI
- Dashboard - Service status monitoring, API testing panel
- Provider Management - One-click credential loading, token refresh, default provider switching
- Settings Page - Server configuration, port settings, API key management
- Log Viewer - Real-time logging, operation tracking
Intelligent Credential Management
- Automatically detects credential file changes (polled every 5 seconds)
- One-click read of local OAuth credentials
- Automatic token refresh upon expiration
- Environment variable export (.env format)
- Automatic Quota Failover - Automatically switches to next available credential when quota is exceeded
- Preview Model Fallback - Attempts preview versions when primary model quota is exhausted
- Per-Key Proxy - Individually configure proxy for each credential
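The "Automatic Quota Failover" behavior above can be sketched as a simple loop over the credential pool: try each credential in turn and move to the next when the upstream answers with a quota error. The names and error handling here are illustrative assumptions, not ProxyCast's real implementation:

```python
# Hedged sketch of quota failover: iterate the credential pool, falling
# through to the next credential when one hits its quota.

class QuotaExceeded(Exception):
    """Raised when the upstream reports the quota is exhausted (e.g. HTTP 429)."""

def call_with_failover(credentials, send):
    """Try each credential until one succeeds or the pool is exhausted."""
    failed = []
    for cred in credentials:
        try:
            return send(cred)
        except QuotaExceeded:
            failed.append(cred)  # quota hit: try the next credential
    raise RuntimeError(f"all {len(failed)} credentials exhausted")

# Example: the first account is out of quota, the second one works.
def fake_send(cred):
    if cred == "kiro-account-1":
        raise QuotaExceeded("429 quota exceeded")
    return f"ok via {cred}"

print(call_with_failover(["kiro-account-1", "kiro-account-2"], fake_send))
# ok via kiro-account-2
```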
Security & Management
- TLS/HTTPS Support - Optional HTTPS encrypted communication
- Remote Management API - Remotely manage configurations and credentials via API
- Access Control - Supports localhost restriction and key-based authentication
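The access-control combination above (localhost restriction plus key-based authentication) amounts to two checks per request. A minimal sketch, with hypothetical names — not ProxyCast's actual request handler:

```python
# Illustrative sketch of localhost restriction + key-based authentication.

def is_request_allowed(client_ip: str, auth_header: str, api_key: str,
                       localhost_only: bool = True) -> bool:
    """Reject non-local clients (when restricted) and mismatched keys."""
    if localhost_only and client_ip not in ("127.0.0.1", "::1"):
        return False
    return auth_header == f"Bearer {api_key}"

print(is_request_allowed("127.0.0.1", "Bearer proxycast-key", "proxycast-key"))  # True
print(is_request_allowed("10.0.0.5", "Bearer proxycast-key", "proxycast-key"))   # False
```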
Amp CLI Integration
- Supports the /api/provider/{provider}/v1/* routing pattern
- Model mapping - Map unavailable models to available alternatives
- Management endpoint proxy - Proxy authentication and account functions
Full API Compatibility
- /v1/chat/completions - OpenAI Chat API
- /v1/models - Model list
- /v1/messages - Anthropic Messages API
- /v1/messages/count_tokens - Token counting
- /api/provider/{provider}/v1/* - Amp CLI routing
- /v0/management/* - Remote management API
Interface Screenshots
Dashboard - Service Control & API Testing
Provider Pool - Multi-Credential Management
API Server - Routing & Logs
Settings Page - Server Configuration
AI Clients - Client Configuration
MCP Server Management
Prompts Management
Quick Start
Download & Install
Download the appropriate package from the Releases page:
- macOS (Apple Silicon): ProxyCast_x.x.x_aarch64.dmg
- Windows (x64): ProxyCast_x.x.x_x64-setup.exe
Default Credential File Locations
| Provider | Default Path | Description |
|---|---|---|
| Kiro | ~/.aws/sso/cache/kiro-auth-token.json | Kiro OAuth Token |
| Gemini | ~/.gemini/oauth_creds.json | Gemini CLI OAuth |
| Qwen | ~/.qwen/oauth_creds.json | Tongyi Qianwen OAuth |

Tip: ~ refers to the user home directory (/Users/username on macOS, C:\Users\username on Windows)
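Since the table's default paths all start with ~, a quick stdlib-only check can confirm where they land on your machine and whether the credential files exist (the paths are the documented defaults; everything else is just illustration):

```python
# Expand the documented default credential paths and check for their presence.
from pathlib import Path

DEFAULT_CREDS = {
    "Kiro": "~/.aws/sso/cache/kiro-auth-token.json",
    "Gemini": "~/.gemini/oauth_creds.json",
    "Qwen": "~/.qwen/oauth_creds.json",
}

for provider, raw in DEFAULT_CREDS.items():
    path = Path(raw).expanduser()  # ~ -> /Users/name (macOS) or C:\Users\name (Windows)
    print(f"{provider}: {path} (exists: {path.exists()})")
```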
Usage Steps
- Launch the App - Open ProxyCast
- Load Credentials - Go to Provider Management page and click “Load Credentials”
- Start Service - Click “Start Server” on the Dashboard
- Configure Clients - In tools like Cherry Studio or Cline, set:

API Base URL: http://localhost:3001/v1
API Key: proxycast-key
API Usage Examples
OpenAI Chat Completions
```bash
curl http://localhost:3001/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer proxycast-key" \
  -d '{
    "model": "claude-sonnet-4-5-20250514",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ],
    "stream": true
  }'
```
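Because the request above sets "stream": true, the response arrives as Server-Sent Events, one "data:" line per chunk in the OpenAI delta format. A minimal offline sketch of assembling the text from such a stream (the sample lines below are hand-written, not a captured response):

```python
# Extract the text deltas from an OpenAI-style SSE stream (sample data).
import json

sample_stream = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo!"}}]}',
    "data: [DONE]",
]

text = ""
for line in sample_stream:
    body = line.removeprefix("data: ")
    if body == "[DONE]":  # sentinel marking the end of the stream
        break
    delta = json.loads(body)["choices"][0]["delta"]
    text += delta.get("content", "")

print(text)  # Hello!
```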
Anthropic Messages API
```bash
curl http://localhost:3001/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: proxycast-key" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4-5-20250514",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'
```
Development Build
Prerequisites
- Node.js >= 20.0.0
- Rust >= 1.70
- pnpm or npm
Local Development
```bash
# Install dependencies
npm install

# Start dev server
npm run tauri dev
```
Build for Release
```bash
# Build production version
npm run tauri build
```
Open Source License
This project is open-sourced under the GNU General Public License v3 (GPLv3).
Acknowledgments
- AIClient-2-API - Core logic reference
- Tauri - Cross-platform desktop framework
- shadcn/ui - UI component library
Disclaimer
Risk Notice
This project (ProxyCast) is intended solely for educational and research purposes. Users assume all risks associated with its use. The author assumes no responsibility for any direct, indirect, or consequential damages arising from the use of this software.
Third-Party Service Responsibility
This project is an API proxy tool and does not provide any AI model services. All AI model services are provided by their respective third-party providers (e.g., Google, Anthropic, Alibaba Cloud). Users must comply with the terms of service and policies of each third-party provider when accessing these services through this tool.
Data Privacy Statement
This project runs locally and does not collect or upload any user data. However, users should take care to protect their API keys and other sensitive information while using this tool.