
Deepseek support not working #3032

@harryslimes


Bug Report: DeepSeek Provider Issues

Summary

The DeepSeek model provider is broken in Letta due to an undefined type reference and missing streaming support.

Environment

  • Letta version: Latest (as of October 11, 2025)
  • DeepSeek models affected: All (deepseek-chat, deepseek-reasoner, etc.)
  • Deployment: Docker (dev-compose.yaml)

Issue 1: NameError - _Message is not defined

Description

When attempting to use DeepSeek as a model provider, the application crashes with a NameError when loading the DeepSeek client module.

Steps to Reproduce

  1. Configure an agent with DeepSeek as the LLM provider
  2. Attempt to send a message to the agent
  3. Server crashes with NameError

Error Traceback

File "/app/letta/llm_api/llm_client.py", line 90, in create
    from letta.llm_api.deepseek_client import DeepseekClient
File "/app/letta/llm_api/deepseek_client.py", line 60, in <module>
    def map_messages_to_deepseek_format(messages: List[ChatMessage]) -> List[_Message]:
                                                                             ^^^^^^^^
NameError: name '_Message' is not defined

Root Cause

The deepseek_client.py file references an undefined type _Message in two locations:

  • Line 60: Function return type annotation
  • Line 104: Function parameter type annotation

The type _Message was never imported or defined in the module.
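
For illustration, the failing pattern and one possible repair look roughly like the sketch below. This is a minimal, self-contained approximation: the alias for _Message is an assumption (an OpenAI-style message dict), not the type the original author intended, and the real function body in deepseek_client.py is not reproduced here.

    # Minimal sketch of the failing pattern and one possible repair.
    # "_Message" is assumed to be an OpenAI-style message dict; the type
    # actually intended by deepseek_client.py may differ.
    from typing import Any, Dict, List

    _Message = Dict[str, Any]  # hypothetical alias standing in for the missing type


    def map_messages_to_deepseek_format(messages: List[Dict[str, Any]]) -> List[_Message]:
        # Reshape each message into the dict form DeepSeek's OpenAI-compatible API expects.
        return [{"role": m["role"], "content": m["content"]} for m in messages]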

Issue 2: Streaming Not Supported

Description

When attempting to use streaming with DeepSeek models, the application fails with a "Streaming not supported" error, even though DeepSeek's API is OpenAI-compatible and the DeepseekClient already has streaming implemented.

Steps to Reproduce

  1. Configure an agent with DeepSeek as the LLM provider
  2. Attempt to send a streaming message to the agent (e.g., via /v1/agents/{agent_id}/messages/stream; see the example request after these steps)
  3. Request fails with ValueError
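
A request along the following lines reproduces the failure. This is a hedged reproduction sketch: only the endpoint path comes from this report, while the payload fields, local server address/port, and agent ID are placeholders.

    # Hypothetical reproduction script; payload shape, host, and agent ID are assumptions.
    import requests

    AGENT_ID = "agent-00000000"  # placeholder agent configured with a DeepSeek model

    resp = requests.post(
        f"http://localhost:8283/v1/agents/{AGENT_ID}/messages/stream",
        json={"messages": [{"role": "user", "content": "hello"}]},
        stream=True,
    )
    # Expected outcome today: an error response containing
    # "ValueError: Streaming not supported for provider deepseek"
    for line in resp.iter_lines():
        print(line)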

Error Message

ValueError: Streaming not supported for provider deepseek

Root Cause

The streaming adapters do not include ProviderType.deepseek in their provider checks (see the sketch after this list), even though:

  1. DeepSeek uses an OpenAI-compatible API
  2. The DeepseekClient extends OpenAIClient and has stream_async() implemented
  3. The streaming infrastructure is already in place
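
The check in question is, in spirit, something like the following. The names and structure are illustrative, not the actual adapter code; the point is that adding ProviderType.deepseek to the allowed providers is the only change needed.

    # Hypothetical sketch of the provider gate inside the streaming adapters.
    ALLOWED_STREAMING_PROVIDERS = {"openai", "anthropic"}  # "deepseek" is absent today


    def assert_streaming_supported(provider: str) -> None:
        # Raises the error quoted below for any provider not in the allow-list.
        if provider not in ALLOWED_STREAMING_PROVIDERS:
            raise ValueError(f"Streaming not supported for provider {provider}")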

Impact

  • Severity: High - Complete provider unavailability
  • Affected Users: Anyone attempting to use DeepSeek models
  • Workaround: None available without code changes

Additional Notes

The DeepseekClient already has proper OpenAI-compatible streaming support implemented via the stream_async() method; the streaming adapters simply need to recognize DeepSeek as a valid streaming provider.
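
To make that concrete, the relationship is roughly the following simplified sketch (not the actual class bodies): because DeepseekClient extends OpenAIClient and DeepSeek's API speaks the same wire format, the inherited streaming path already produces valid chunks once the adapter gate allows the provider.

    # Simplified sketch: DeepseekClient reuses the OpenAI-compatible streaming path,
    # so no new streaming code is required once the adapters accept the provider.
    class OpenAIClient:
        async def stream_async(self, request):
            # yields OpenAI-style chat-completion chunks over SSE
            ...


    class DeepseekClient(OpenAIClient):
        # Only endpoint/auth details differ; the streamed chunk format is the same.
        base_url = "https://api.deepseek.com/v1"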
