
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

_(`uv` will handle dependencies based on `pyproject.toml` when you run the server)_
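
   Optionally, you can sanity-check the install by printing the `uv` version (you may need to open a new shell first so your `PATH` picks it up):

```bash
uv --version
```
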
3. **Configure Environment Variables**:
Copy the example environment file:

```bash
cp .env.example .env
```

Edit `.env` and fill in your API keys and model configurations (a sample `.env` sketch follows the mapping notes below):

- `ANTHROPIC_API_KEY`: (Optional) Needed only if proxying _to_ Anthropic models.
- `OPENAI_API_KEY`: Your OpenAI API key (Required if using the default OpenAI preference or as a fallback).
- `GEMINI_API_KEY`: Your Google AI Studio (Gemini) API key (Required if `PREFERRED_PROVIDER=google`).
- `PREFERRED_PROVIDER` (Optional): Set to `openai` (default) or `google`. This determines the primary backend for mapping `haiku`/`sonnet`.
- `BIG_MODEL` (Optional): The model to map `sonnet` requests to. Defaults to `gpt-4.1` (if `PREFERRED_PROVIDER=openai`) or `gemini-2.5-pro-preview-03-25`.
- `SMALL_MODEL` (Optional): The model to map `haiku` requests to. Defaults to `gpt-4.1-mini` (if `PREFERRED_PROVIDER=openai`) or `gemini-2.0-flash`.

**Mapping Logic:**

- If `PREFERRED_PROVIDER=openai` (default), `haiku`/`sonnet` map to `SMALL_MODEL`/`BIG_MODEL` prefixed with `openai/`.
- If `PREFERRED_PROVIDER=google`, `haiku`/`sonnet` map to `SMALL_MODEL`/`BIG_MODEL` prefixed with `gemini/` _if_ those models are in the server's known `GEMINI_MODELS` list (otherwise the proxy falls back to the OpenAI mapping).
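
As a concrete illustration, here is a sample `.env` that prefers Google models. The key values are placeholders, and the model names are simply the defaults described above:

```bash
# Sample .env (values are placeholders; model names are the documented defaults)
GEMINI_API_KEY="your-google-ai-studio-key"
OPENAI_API_KEY="your-openai-key"            # kept as a fallback
PREFERRED_PROVIDER="google"
BIG_MODEL="gemini-2.5-pro-preview-03-25"    # sonnet requests map here
SMALL_MODEL="gemini-2.0-flash"              # haiku requests map here
```
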
4. **Run the server**:

```bash
uv run uvicorn server:app --host 0.0.0.0 --port 8082 --reload
```

_(`--reload` is optional, for development)_
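
   Once the server is up, a quick way to smoke-test it is to send an Anthropic-style request directly to the proxy. This is only a sketch: it assumes the proxy serves the standard `/v1/messages` endpoint on port 8082 and that the placeholder `x-api-key` is not checked locally (the real provider keys come from `.env`); the `haiku` model name is there just so the mapping logic kicks in:

```bash
curl http://localhost:8082/v1/messages \
  -H "content-type: application/json" \
  -H "x-api-key: placeholder" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-3-haiku-20240307",
    "max_tokens": 64,
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```
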
### Using with Claude Code 🎮
1. **Install Claude Code** (if you haven't already):

```bash
npm install -g @anthropic-ai/claude-code
```

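   If you want to confirm the CLI installed correctly, printing its version is a reasonable check (this assumes the `claude` binary is on your `PATH` and supports a `--version` flag):

```bash
claude --version
```
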
2. **Connect to your proxy**:

```bash
ANTHROPIC_BASE_URL=http://localhost:8082 claude
```

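   If you prefer not to prefix every run, you can export the variable for the whole shell session instead; the URL assumes the proxy from the previous section is listening locally on port 8082:

```bash
export ANTHROPIC_BASE_URL=http://localhost:8082
claude
```
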
3. **That's it!** Your Claude Code client will now use the configured backend models (defaulting to Gemini) through the proxy. 🎯
## Model Mapping 🗺️
The proxy automatically maps Claude models to either OpenAI or Gemini models based on the configured model:
| Claude Model | Default Mapping | When BIG_MODEL/SMALL_MODEL is a Gemini model |