Merged

Commits (22)
b4cccf1  chore(internal): detect missing future annotations with ruff (stainless-app[bot], Oct 10, 2025)
29874b0  chore: bump `httpx-aiohttp` version to 0.1.9 (stainless-app[bot], Oct 17, 2025)
cb29768  fix(client): close streams without requiring full consumption (stainless-app[bot], Oct 29, 2025)
1b7f280  chore(internal/tests): avoid race condition with implicit client cleanup (stainless-app[bot], Oct 30, 2025)
46f738d  chore(internal): grammar fix (it's -> its) (stainless-app[bot], Nov 3, 2025)
785c446  chore(package): drop Python 3.8 support (stainless-app[bot], Nov 10, 2025)
009ca0d  fix: compat with Python 3.14 (stainless-app[bot], Nov 10, 2025)
eb03850  fix(compat): update signatures of `model_dump` and `model_dump_json` for Pydantic v1 (stainless-app[bot], Nov 11, 2025)
3b9a132  chore(internal): codegen related update (stainless-app[bot], Nov 21, 2025)
4b8f9b7  fix: ensure streams are always closed (stainless-app[bot], Dec 18, 2025)
8c1ce31  chore(deps): mypy 1.18.1 has a regression, pin to 1.17 (stainless-app[bot], Nov 27, 2025)
2b01e25  chore: update lockfile (stainless-app[bot], Dec 2, 2025)
e04fade  chore(docs): use environment variables for authentication in code snippets (stainless-app[bot], Dec 2, 2025)
aab06ad  fix(types): allow pyright to infer TypedDict types within SequenceNotStr (stainless-app[bot], Dec 8, 2025)
edaf4a2  chore: add missing docstrings (stainless-app[bot], Dec 8, 2025)
9223e75  chore(internal): add missing files argument to base client (stainless-app[bot], Dec 15, 2025)
cc25ae3  chore: speedup initial import (stainless-app[bot], Dec 16, 2025)
7a5d301  fix: use async_to_httpx_files in patch method (stainless-app[bot], Dec 17, 2025)
2f2aed5  feat(api): manual updates (stainless-app[bot], Dec 18, 2025)
a87c139  feat(api): manual updates (stainless-app[bot], Dec 18, 2025)
345a9f0  feat(api): manual updates (stainless-app[bot], Dec 18, 2025)
f633a12  release: 0.6.0 (stainless-app[bot], Dec 18, 2025)
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
 {
-  ".": "0.5.0"
+  ".": "0.6.0"
 }
2 changes: 1 addition & 1 deletion .stats.yml
@@ -1,4 +1,4 @@
configured_endpoints: 7
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/meta%2Fllama-api-edf0a308dd29bea2feb29f2e7f04eec4dbfb130ffe52511641783958168f60a4.yml
openapi_spec_hash: 23af966c58151516aaef00e0af602c01
-config_hash: 431a8aed31c3576451a36d2db8f48c25
+config_hash: 416c3d950e58dbdb47588eaf29fa9fa5
36 changes: 36 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,41 @@
# Changelog

## 0.6.0 (2025-12-18)

Full Changelog: [v0.5.0...v0.6.0](https://github.com/meta-llama/llama-api-python/compare/v0.5.0...v0.6.0)

### Features

* **api:** manual updates ([345a9f0](https://github.com/meta-llama/llama-api-python/commit/345a9f0554fe30991f936078a3242e44c0cca302))
* **api:** manual updates ([a87c139](https://github.com/meta-llama/llama-api-python/commit/a87c139abe8dc412688855d2ea8226e02d3d1376))
* **api:** manual updates ([2f2aed5](https://github.com/meta-llama/llama-api-python/commit/2f2aed50d52ca6d1e23523a6d4a2469445feb088))


### Bug Fixes

* **client:** close streams without requiring full consumption ([cb29768](https://github.com/meta-llama/llama-api-python/commit/cb29768aa1a4c3d9e8e94ef28cfa40a856618fb4))
* compat with Python 3.14 ([009ca0d](https://github.com/meta-llama/llama-api-python/commit/009ca0d914ec813285e1f195d645871f9cd3d6df))
* **compat:** update signatures of `model_dump` and `model_dump_json` for Pydantic v1 ([eb03850](https://github.com/meta-llama/llama-api-python/commit/eb03850c5905d443da89b71fe8306af8cf5d7062))
* ensure streams are always closed ([4b8f9b7](https://github.com/meta-llama/llama-api-python/commit/4b8f9b7b7f63e0d72daf9bd24c3f12c424040c6d))
* **types:** allow pyright to infer TypedDict types within SequenceNotStr ([aab06ad](https://github.com/meta-llama/llama-api-python/commit/aab06adc22ed41bd16af636c3bc94e08b9bf2c82))
* use async_to_httpx_files in patch method ([7a5d301](https://github.com/meta-llama/llama-api-python/commit/7a5d3019d53edd2a3b92c8aa91971aa3421ae758))
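
Together, the two stream fixes above mean a response stream is cleaned up even when it is not fully consumed. A minimal sketch of the now-safe pattern, assuming the stream object is a context manager as in other Stainless-generated SDKs (the model id and the early exit are illustrative):

```python
import os

from llama_api_client import LlamaAPIClient

client = LlamaAPIClient(api_key=os.environ.get("LLAMA_API_KEY"))

# Leaving the `with` block closes the stream even though we break early,
# so the underlying HTTP connection is released without draining it.
with client.chat.completions.create(
    messages=[{"role": "user", "content": "Hello"}],
    model="Llama-4-Maverick-17B-128E-Instruct-FP8",  # illustrative model id
    stream=True,
) as stream:
    for chunk in stream:
        break  # stop after the first chunk; cleanup still happens
```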


### Chores

* add missing docstrings ([edaf4a2](https://github.com/meta-llama/llama-api-python/commit/edaf4a2677b2c2a5d4b89a96ac1de289430c6957))
* bump `httpx-aiohttp` version to 0.1.9 ([29874b0](https://github.com/meta-llama/llama-api-python/commit/29874b0abe332ac6c10d44fa93088bd13a4b793f))
* **deps:** mypy 1.18.1 has a regression, pin to 1.17 ([8c1ce31](https://github.com/meta-llama/llama-api-python/commit/8c1ce316a22980fd33b11bacb6f23d3166322f13))
* **docs:** use environment variables for authentication in code snippets ([e04fade](https://github.com/meta-llama/llama-api-python/commit/e04fade6f4b6c8ce315e337a7f58b7b72a981a28))
* **internal/tests:** avoid race condition with implicit client cleanup ([1b7f280](https://github.com/meta-llama/llama-api-python/commit/1b7f2809275a7c39377d5841dd77e52bad1476ed))
* **internal:** add missing files argument to base client ([9223e75](https://github.com/meta-llama/llama-api-python/commit/9223e753f32938fc39e0866ce8f86d4fbcef37ec))
* **internal:** codegen related update ([3b9a132](https://github.com/meta-llama/llama-api-python/commit/3b9a132f8271d76851669aedab9ee880806a83e8))
* **internal:** detect missing future annotations with ruff ([b4cccf1](https://github.com/meta-llama/llama-api-python/commit/b4cccf14ca10e965177a0904692646da0c2892f0))
* **internal:** grammar fix (it's -> its) ([46f738d](https://github.com/meta-llama/llama-api-python/commit/46f738d2b1bf8c210607abe2d1acc46f8361895d))
* **package:** drop Python 3.8 support ([785c446](https://github.com/meta-llama/llama-api-python/commit/785c4468206841d6c9d172b2733b0bcf053dcce6))
* speedup initial import ([cc25ae3](https://github.com/meta-llama/llama-api-python/commit/cc25ae390cb3e10aa517638ca25ba60b9e6b8b07))
* update lockfile ([2b01e25](https://github.com/meta-llama/llama-api-python/commit/2b01e25f540a589b7bedef48a3f18e0d4bba8d7d))

## 0.5.0 (2025-10-01)

Full Changelog: [v0.4.0...v0.5.0](https://github.com/meta-llama/llama-api-python/compare/v0.4.0...v0.5.0)
11 changes: 6 additions & 5 deletions README.md
@@ -3,7 +3,7 @@
<!-- prettier-ignore -->
[![PyPI version](https://img.shields.io/pypi/v/llama_api_client.svg?label=pypi%20(stable))](https://pypi.org/project/llama_api_client/)

-The Llama API Client Python library provides convenient access to the Llama API Client REST API from any Python 3.8+
+The Llama API Client Python library provides convenient access to the Llama API Client REST API from any Python 3.9+
application. The library includes type definitions for all request params and response fields,
and offers both synchronous and asynchronous clients powered by [httpx](https://github.com/encode/httpx).

@@ -94,14 +94,15 @@ pip install 'llama_api_client[aiohttp] @ git+ssh://git@github.com/meta-llama/lla
Then you can enable it by instantiating the client with `http_client=DefaultAioHttpClient()`:

```python
+import os
import asyncio
from llama_api_client import DefaultAioHttpClient
from llama_api_client import AsyncLlamaAPIClient


async def main() -> None:
    async with AsyncLlamaAPIClient(
-        api_key="My API Key",
+        api_key=os.environ.get("LLAMA_API_KEY"),  # This is the default and can be omitted
        http_client=DefaultAioHttpClient(),
    ) as client:
        create_chat_completion_response = await client.chat.completions.create(
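            # The remaining arguments are collapsed in this diff view; the values
            # below are a hedged, illustrative completion (message content and
            # model id are assumptions, not part of the diff).
            messages=[{"role": "user", "content": "Hello"}],
            model="Llama-4-Maverick-17B-128E-Instruct-FP8",
        )
        print(create_chat_completion_response)


asyncio.run(main())
```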
@@ -281,15 +282,15 @@ client.with_options(max_retries=5).chat.completions.create(

### Timeouts

-By default requests time out after 1 minute. You can configure this with a `timeout` option,
+By default requests time out after 10 minutes. You can configure this with a `timeout` option,
which accepts a float or an [`httpx.Timeout`](https://www.python-httpx.org/advanced/timeouts/#fine-tuning-the-configuration) object:

```python
from llama_api_client import LlamaAPIClient

# Configure the default for all requests:
client = LlamaAPIClient(
-    # 20 seconds (default is 1 minute)
+    # 20 seconds (default is 10 minutes)
    timeout=20.0,
)
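```

For finer-grained control, the same `timeout` option also accepts an [`httpx.Timeout`](https://www.python-httpx.org/advanced/timeouts/); a minimal sketch, with illustrative values:

```python
import httpx
from llama_api_client import LlamaAPIClient

# Cap each phase of the request separately (all values are illustrative):
client = LlamaAPIClient(
    timeout=httpx.Timeout(60.0, read=5.0, write=10.0, connect=2.0),
)
```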

@@ -490,7 +491,7 @@ print(llama_api_client.__version__)

## Requirements

-Python 3.8 or higher.
+Python 3.9 or higher.

## Contributing

30 changes: 18 additions & 12 deletions pyproject.toml
Original file line number Diff line number Diff line change
@@ -1,30 +1,32 @@
[project]
name = "llama_api_client"
version = "0.5.0"
version = "0.6.0"
description = "The official Python library for the llama-api-client API"
dynamic = ["readme"]
license = "MIT"
authors = [
{ name = "Llama API Client", email = "[email protected]" },
]

dependencies = [
  "httpx>=0.23.0, <1",
  "pydantic>=1.9.0, <3",
  "typing-extensions>=4.10, <5",
  "anyio>=3.5.0, <5",
  "distro>=1.7.0, <2",
  "sniffio",
]
-requires-python = ">= 3.8"
+requires-python = ">= 3.9"
classifiers = [
"Typing :: Typed",
"Intended Audience :: Developers",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Programming Language :: Python :: 3.14",
"Operating System :: OS Independent",
"Operating System :: POSIX",
"Operating System :: MacOS",
@@ -39,14 +41,14 @@ Homepage = "https://github.com/meta-llama/llama-api-python"
Repository = "https://github.com/meta-llama/llama-api-python"

[project.optional-dependencies]
aiohttp = ["aiohttp", "httpx_aiohttp>=0.1.8"]
aiohttp = ["aiohttp", "httpx_aiohttp>=0.1.9"]

[tool.rye]
managed = true
# version pins are in requirements-dev.lock
dev-dependencies = [
"pyright==1.1.399",
"mypy",
"mypy==1.17",
"respx",
"pytest",
"pytest-asyncio",
@@ -141,7 +143,7 @@ filterwarnings = [
# there are a couple of flags that are still disabled by
# default in strict mode as they are experimental and niche.
typeCheckingMode = "strict"
pythonVersion = "3.8"
pythonVersion = "3.9"

exclude = [
"_dev",
@@ -224,6 +226,8 @@ select = [
"B",
# remove unused imports
"F401",
+# check for missing future annotations
+"FA102",
# bare except statements
"E722",
# unused arguments
@@ -246,6 +250,8 @@ unfixable = [
"T203",
]

extend-safe-fixes = ["FA102"]

[tool.ruff.lint.flake8-tidy-imports.banned-api]
"functools.lru_cache".msg = "This function does not retain type information for the wrapped function's arguments; The `lru_cache` function from `_utils` should be used instead"

114 changes: 63 additions & 51 deletions requirements-dev.lock
@@ -12,40 +12,45 @@
-e file:.
aiohappyeyeballs==2.6.1
    # via aiohttp
-aiohttp==3.12.8
+aiohttp==3.13.2
    # via httpx-aiohttp
    # via llama-api-client
-aiosignal==1.3.2
+aiosignal==1.4.0
    # via aiohttp
-annotated-types==0.6.0
+annotated-types==0.7.0
    # via pydantic
-anyio==4.4.0
+anyio==4.12.0
    # via httpx
    # via llama-api-client
-argcomplete==3.1.2
+argcomplete==3.6.3
    # via nox
async-timeout==5.0.1
    # via aiohttp
-attrs==25.3.0
+attrs==25.4.0
    # via aiohttp
backports-asyncio-runner==1.2.0
    # via pytest-asyncio
-certifi==2023.7.22
+certifi==2025.11.12
    # via httpcore
    # via httpx
-colorlog==6.7.0
+colorlog==6.10.1
    # via nox
dependency-groups==1.3.1
    # via nox
-dirty-equals==0.6.0
+dirty-equals==0.11
-distlib==0.3.7
+distlib==0.4.0
    # via virtualenv
-distro==1.8.0
+distro==1.9.0
    # via llama-api-client
-exceptiongroup==1.2.2
+exceptiongroup==1.3.1
    # via anyio
    # via pytest
-execnet==2.1.1
+execnet==2.1.2
    # via pytest-xdist
-filelock==3.12.4
+filelock==3.19.1
    # via virtualenv
-frozenlist==1.6.2
+frozenlist==1.8.0
    # via aiohttp
    # via aiosignal
h11==0.16.0
@@ -56,82 +61,89 @@ httpx==0.28.1
    # via httpx-aiohttp
    # via llama-api-client
    # via respx
-httpx-aiohttp==0.1.8
+httpx-aiohttp==0.1.9
    # via llama-api-client
humanize==4.13.0
    # via nox
-idna==3.4
+idna==3.11
    # via anyio
    # via httpx
    # via yarl
-importlib-metadata==7.0.0
+importlib-metadata==8.7.0
-iniconfig==2.0.0
+iniconfig==2.1.0
    # via pytest
markdown-it-py==3.0.0
    # via rich
mdurl==0.1.2
    # via markdown-it-py
-multidict==6.4.4
+multidict==6.7.0
    # via aiohttp
    # via yarl
-mypy==1.14.1
+mypy==1.17.0
-mypy-extensions==1.0.0
+mypy-extensions==1.1.0
    # via mypy
-nodeenv==1.8.0
+nodeenv==1.9.1
    # via pyright
-nox==2023.4.22
+nox==2025.11.12
-packaging==23.2
+packaging==25.0
    # via dependency-groups
    # via nox
    # via pytest
pathspec==0.12.1
    # via mypy
-platformdirs==3.11.0
+platformdirs==4.4.0
    # via virtualenv
-pluggy==1.5.0
+pluggy==1.6.0
    # via pytest
-propcache==0.3.1
+propcache==0.4.1
    # via aiohttp
    # via yarl
-pydantic==2.11.9
+pydantic==2.12.5
    # via llama-api-client
-pydantic-core==2.33.2
+pydantic-core==2.41.5
    # via pydantic
-pygments==2.18.0
+pygments==2.19.2
    # via pytest
    # via rich
pyright==1.1.399
-pytest==8.3.3
+pytest==8.4.2
    # via pytest-asyncio
    # via pytest-xdist
-pytest-asyncio==0.24.0
+pytest-asyncio==1.2.0
-pytest-xdist==3.7.0
+pytest-xdist==3.8.0
-python-dateutil==2.8.2
+python-dateutil==2.9.0.post0
    # via time-machine
pytz==2023.3.post1
    # via dirty-equals
respx==0.22.0
-rich==13.7.1
+rich==14.2.0
-ruff==0.9.4
+ruff==0.14.7
setuptools==68.2.2
    # via nodeenv
-six==1.16.0
+six==1.17.0
    # via python-dateutil
-sniffio==1.3.0
+sniffio==1.3.1
    # via anyio
    # via llama-api-client
-time-machine==2.9.0
+time-machine==2.19.0
-tomli==2.0.2
+tomli==2.3.0
    # via dependency-groups
    # via mypy
    # via nox
    # via pytest
-typing-extensions==4.12.2
+typing-extensions==4.15.0
    # via aiosignal
    # via anyio
    # via exceptiongroup
    # via llama-api-client
    # via multidict
    # via mypy
    # via pydantic
    # via pydantic-core
    # via pyright
    # via pytest-asyncio
    # via typing-inspection
-typing-inspection==0.4.1
+typing-inspection==0.4.2
    # via pydantic
-virtualenv==20.24.5
+virtualenv==20.35.4
    # via nox
-yarl==1.20.0
+yarl==1.22.0
    # via aiohttp
-zipp==3.17.0
+zipp==3.23.0
    # via importlib-metadata