39 changes: 39 additions & 0 deletions .dockerignore
@@ -0,0 +1,39 @@
# Git
.git
.gitignore

# Environment
.env
.env.example

# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
env/
venv/
ENV/
*.egg-info/
dist/
build/

# Docker
Dockerfile
docker-compose.yml
.dockerignore

# Tests
tests.py

# Documentation
README.md
CLAUDE.md
*.png

# Editor files
.vscode/
.idea/
*.swp
*.swo
5 changes: 5 additions & 0 deletions .gitattributes
@@ -0,0 +1,5 @@
# Shell scripts require LF
*.sh text eol=lf
# Batch scripts require CRLF
*.bat text eol=crlf
*.cmd text eol=crlf
87 changes: 87 additions & 0 deletions CLAUDE.md
@@ -0,0 +1,87 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

This project is a proxy server that allows you to use Anthropic clients (like Claude Code) with Gemini or OpenAI backends via LiteLLM. It translates API requests between Anthropic's format and OpenAI/Gemini formats, enabling Claude clients to use non-Anthropic models.

## Key Commands

### Running the Server

```bash
# Start the server with hot reloading (for development)
uv run uvicorn server:app --host 0.0.0.0 --port 8082 --reload

# Start the server without hot reloading (for production)
uv run uvicorn server:app --host 0.0.0.0 --port 8082
```

### Running Tests

```bash
# Run all tests
python tests.py

# Skip streaming tests
python tests.py --no-streaming

# Run only simple tests (no tools)
python tests.py --simple

# Run only tool-related tests
python tests.py --tools-only

# Run only streaming tests
python tests.py --streaming-only
```

## Dependencies and Setup

This project uses Python 3.10+ and manages dependencies with `uv`. Key dependencies include:

- fastapi: Web framework
- uvicorn: ASGI server
- httpx: HTTP client
- pydantic: Data validation
- litellm: LLM API abstraction for model mapping
- python-dotenv: Environment variable management
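
A `pyproject.toml` dependency block matching the list above might look like the following (a sketch; the exact entries and any version pins in the repository may differ):

```toml
[project]
name = "claude-code-proxy"
requires-python = ">=3.10"
dependencies = [
    "fastapi",
    "uvicorn",
    "httpx",
    "pydantic",
    "litellm",
    "python-dotenv",
]
```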

## Configuration

For detailed configuration instructions, including environment variables and model settings, please refer to the "Setup" and "Model Mapping" sections in [README.md](README.md).

## Architecture

The proxy server works by:

1. Receiving requests in Anthropic's API format
2. Translating the requests to OpenAI/Gemini format via LiteLLM
3. Sending the translated request to the target provider
4. Converting the response back to Anthropic format
5. Returning the formatted response to the client
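
As a rough illustration of steps 1–2, a minimal translation from an Anthropic-style request body to an OpenAI-style one could look like this (a simplified sketch, not the actual `server.py` logic — the real proxy delegates format handling to LiteLLM and covers many more fields):

```python
def anthropic_to_openai(body: dict) -> dict:
    """Sketch: map an Anthropic Messages API payload to OpenAI chat format."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI expects it as the first chat message.
    if body.get("system"):
        messages.append({"role": "system", "content": body["system"]})
    messages.extend(
        {"role": m["role"], "content": m["content"]} for m in body["messages"]
    )
    return {
        "model": body["model"],
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
    }
```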

Key components:

- `server.py`: Main FastAPI application with all proxy logic
- `tests.py`: Test suite for verifying proxy functionality

## Development Patterns

When working on this project, follow these guidelines:

1. Use type hints and Pydantic models for data validation
2. Log important events for debugging
3. Handle errors gracefully and provide informative error messages
4. Write tests for new functionality
5. Maintain backward compatibility with the Anthropic API
6. Keep model mappings updated as new models are released
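
For guideline 1, a request model in the style the proxy might use (a hypothetical sketch, not copied from `server.py`):

```python
from pydantic import BaseModel, Field


class MessagesRequest(BaseModel):
    """Hypothetical shape of an incoming Anthropic-style request."""

    model: str
    messages: list[dict]
    max_tokens: int = Field(default=1024, gt=0)
    stream: bool = False
```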

## Code Style Guidelines

1. DO NOT add inline comments for code that is self-explanatory
2. Only add comments for complex logic that requires explanation
3. Keep code clean and readable without relying on comments
4. Use descriptive variable and function names instead of comments
5. Use comments sparingly and only when they add genuine value
6. For Docker and shell scripts, avoid obvious comments - the commands should be self-explanatory
24 changes: 24 additions & 0 deletions Dockerfile
@@ -0,0 +1,24 @@
FROM python:3.10-slim

RUN useradd -m appuser

WORKDIR /app

RUN pip install --no-cache-dir uv

COPY pyproject.toml ./
COPY server.py ./

RUN chown -R appuser:appuser /app

USER appuser

RUN uv venv && . .venv/bin/activate && uv pip install -e .

EXPOSE 8082

ENV PYTHONUNBUFFERED=1
ENV PATH="/app/.venv/bin:$PATH"

CMD ["/app/.venv/bin/uvicorn", "server:app", "--host", "0.0.0.0", "--port", "8082"]
83 changes: 83 additions & 0 deletions README.md
@@ -17,6 +17,41 @@ A proxy server that lets you use Anthropic clients with Gemini or OpenAI models

### Setup 🛠️

You can set up the proxy server either manually or using the provided setup script.

#### Option 1: Using Setup Script (Recommended)

1. **Clone this repository**:
```bash
git clone https://github.com/1rgs/claude-code-openai.git
cd claude-code-openai
```

2. **Run the setup script**:
```bash
./setup.sh
```

This script will:
- Check your Python version
- Install uv if not already installed
- Create and activate a virtual environment
- Install all dependencies
- Create a .env file if one doesn't exist

3. **Update your API keys**:
Edit the `.env` file with your API keys.

Note: The `.env.example` file must always be present in the project root directory.
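
A minimal `.env` might look like the following (variable names taken from `docker-compose.yml`; all values are placeholders):

```bash
OPENAI_API_KEY=sk-your-openai-key
GEMINI_API_KEY=your-gemini-key
PREFERRED_PROVIDER=openai
BIG_MODEL=gpt-4.1
SMALL_MODEL=gpt-4.1-mini
```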

4. **Run the server**:
```bash
source venv/bin/activate
python -m uvicorn server:app --host 0.0.0.0 --port 8082 --reload
```

#### Option 2: Manual Setup

1. **Clone this repository**:
```bash
git clone https://github.com/1rgs/claude-code-openai.git
@@ -153,6 +188,54 @@ This proxy works by:

The proxy handles both streaming and non-streaming responses, maintaining compatibility with all Claude clients. 🌊

## Docker Deployment 🐳

You can run the proxy server using Docker for easier deployment:

### Using Docker Compose

1. **Configure Environment**:
```bash
cp .env.example .env
```
Edit `.env` with your API keys as described in the Setup section.

2. **Build and Run**:
```bash
docker compose up -d
```

3. **Connect**:
```bash
ANTHROPIC_BASE_URL=http://localhost:8082 claude
```

### Manual Docker Build

1. **Build the Docker image**:
```bash
docker build -t claude-code-proxy .
```

2. **Run the container**:
```bash
docker run -d \
-p 8082:8082 \
-e OPENAI_API_KEY="your-openai-key" \
-e PREFERRED_PROVIDER="openai" \
--name claude-code-proxy \
claude-code-proxy
```

Alternatively, use environment variables from your .env file:
```bash
docker run -d \
-p 8082:8082 \
--env-file .env \
--name claude-code-proxy \
claude-code-proxy
```

## Contributing 🤝

Contributions are welcome! Please feel free to submit a Pull Request. 🎁
23 changes: 23 additions & 0 deletions docker-build.sh
@@ -0,0 +1,23 @@
#!/bin/bash
[ "$1" = -x ] && shift && set -x
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

set -e

cd "${DIR}"

TAG=$1

if [[ -z "${TAG}" ]]; then
TAG=${CLAUDE_CODE_PROXY_DOCKER_TAG}
fi

if [[ -z "${TAG}" ]]; then
TAG=latest
fi

LOCAL_DOCKER_IMG=claude-code-proxy:${TAG}

docker build -t "${LOCAL_DOCKER_IMG}" .
13 changes: 13 additions & 0 deletions docker-compose-start.sh
@@ -0,0 +1,13 @@
#!/bin/bash
[ "$1" = -x ] && shift && set -x
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

set -e

cd "${DIR}"

docker compose build

docker compose up -d

docker compose logs -f claude-code-proxy
23 changes: 23 additions & 0 deletions docker-compose.yml
@@ -0,0 +1,23 @@
services:
claude-code-proxy:
build:
context: .
dockerfile: Dockerfile
ports:
- "8082:8082"
environment:
- OPENAI_API_KEY=${OPENAI_API_KEY}
- GEMINI_API_KEY=${GEMINI_API_KEY}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
- PREFERRED_PROVIDER=${PREFERRED_PROVIDER:-openai}
- BIG_MODEL=${BIG_MODEL:-gpt-4.1}
- SMALL_MODEL=${SMALL_MODEL:-gpt-4.1-mini}
volumes:
- ./.env:/app/.env:ro
restart: unless-stopped
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:8082/"]
interval: 30s
timeout: 10s
retries: 3
start_period: 5s
33 changes: 33 additions & 0 deletions docker-push.sh
@@ -0,0 +1,33 @@
#!/bin/bash
[ "$1" = -x ] && shift && set -x
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

set -e

cd "${DIR}"

TAG=$1

if [[ -z "${TAG}" ]]; then
TAG=${CLAUDE_CODE_PROXY_DOCKER_TAG}
fi

if [[ -z "${TAG}" ]]; then
TAG=latest
fi

LOCAL_DOCKER_IMG=claude-code-proxy:${TAG}

if [[ -z "${DOCKER_REMOTE_REGISTRY}" ]]; then
echo "DOCKER_REMOTE_REGISTRY is not set" >&2

exit 1
fi

REMOTE_DOCKER_IMG=${DOCKER_REMOTE_REGISTRY}/${LOCAL_DOCKER_IMG}

docker tag "${LOCAL_DOCKER_IMG}" "${REMOTE_DOCKER_IMG}"

docker push "${REMOTE_DOCKER_IMG}"