agent_k.infra.providers

Model factory and provider configuration.

Model configuration for AGENT-K.

@notice: |
    Model configuration for AGENT-K.

@dev: |
    See module for implementation details and extension points.

@graph:
    id: agent_k.infra.providers
    provides:
        - agent_k.infra.providers:get_model
        - agent_k.infra.providers:create_devstral_model
        - agent_k.infra.providers:create_openrouter_model
        - agent_k.infra.providers:is_devstral_model
        - agent_k.infra.providers:DEVSTRAL_MODEL_ID
        - agent_k.infra.providers:DEVSTRAL_BASE_URL
    pattern: model-factory

@similar:
    - id: agent_k.infra.config
      when: "General configuration; this module builds model instances."

@agent-guidance:
    do:
        - "Use agent_k.infra.providers as the canonical home for this capability."
    do_not:
        - "Create parallel modules without updating @similar or @graph."

@human-review:
    last-verified: 2026-01-26
    owners:
        - agent-k-core

(c) Mike Casale 2025. Licensed under the MIT License.

create_devstral_model

create_devstral_model(model_id: Annotated[str, Doc('Devstral model identifier.')] = DEVSTRAL_MODEL_ID, base_url: Annotated[str | None, Doc('Base URL for the Devstral endpoint.')] = None) -> OpenAIChatModel

Create a Devstral model instance for a local LM Studio server.

This creates an OpenAI-compatible model that connects to a local
LM Studio server running Devstral.

@dev: |
    See module for behavior details and invariants.

@notice: |
    Creates an OpenAIChatModel configured for Devstral.

@factory-for:
    id: pydantic_ai.models.openai:OpenAIChatModel
    rationale: "Centralizes local LM Studio configuration."
    singleton: false
    cache-key: model_id

@canonical-home:
    for:
        - "devstral model creation"
    notes: "Use create_devstral_model for local Devstral endpoints."
Source code in agent_k/infra/providers.py
def create_devstral_model(
    model_id: Annotated[str, Doc("Devstral model identifier.")] = DEVSTRAL_MODEL_ID,
    base_url: Annotated[str | None, Doc("Base URL for the Devstral endpoint.")] = None,
) -> OpenAIChatModel:
    """Create a Devstral model instance for a local LM Studio server.

    This creates an OpenAI-compatible model that connects to a local
    LM Studio server running Devstral.

    @dev: |
        See module for behavior details and invariants.

    @notice: |
        Creates an OpenAIChatModel configured for Devstral.

    @factory-for:
        id: pydantic_ai.models.openai:OpenAIChatModel
        rationale: "Centralizes local LM Studio configuration."
        singleton: false
        cache-key: model_id

    @canonical-home:
        for:
            - "devstral model creation"
        notes: "Use create_devstral_model for local Devstral endpoints."
    """
    url = base_url or DEVSTRAL_BASE_URL

    return OpenAIChatModel(
        model_id,
        provider=OpenAIProvider(
            base_url=url,
            api_key="not-required",  # Local LM Studio doesn't require auth
        ),
    )

create_openrouter_model

create_openrouter_model(model_id: Annotated[str, Doc('OpenRouter model identifier.')]) -> OpenAIChatModel

Create a model instance using OpenRouter.

OpenRouter provides access to many models including Devstral, Claude,
GPT-4, and more through a unified API.

@dev: |
    See module for behavior details and invariants.

@notice: |
    Creates an OpenAIChatModel configured for OpenRouter.

@factory-for:
    id: pydantic_ai.models.openai:OpenAIChatModel
    rationale: "Centralizes OpenRouter provider configuration."
    singleton: false
    cache-key: model_id

@canonical-home:
    for:
        - "openrouter model creation"
    notes: "Use create_openrouter_model for OpenRouter endpoints."
Source code in agent_k/infra/providers.py
def create_openrouter_model(model_id: Annotated[str, Doc("OpenRouter model identifier.")]) -> OpenAIChatModel:
    """Create a model instance using OpenRouter.

    OpenRouter provides access to many models including Devstral, Claude,
    GPT-4, and more through a unified API.

    @dev: |
        See module for behavior details and invariants.

    @notice: |
        Creates an OpenAIChatModel configured for OpenRouter.

    @factory-for:
        id: pydantic_ai.models.openai:OpenAIChatModel
        rationale: "Centralizes OpenRouter provider configuration."
        singleton: false
        cache-key: model_id

    @canonical-home:
        for:
            - "openrouter model creation"
        notes: "Use create_openrouter_model for OpenRouter endpoints."
    """
    return OpenAIChatModel(model_id, provider=OpenRouterProvider())
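Note that no API key is passed here: `OpenRouterProvider()` must locate credentials on its own, typically from the environment. A hedged pre-flight check (`OPENROUTER_API_KEY` is the conventional variable name; confirm it against your pydantic-ai version):

```python
import os


def has_openrouter_credentials() -> bool:
    # OPENROUTER_API_KEY is assumed to be the variable OpenRouterProvider
    # reads; a missing or empty value means the provider cannot authenticate.
    return bool(os.environ.get("OPENROUTER_API_KEY"))
```

Checking this before calling `create_openrouter_model` turns a late authentication failure into an early, actionable error.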

get_model

get_model(model_spec: Annotated[str, Doc('Model specification string.')]) -> Model | str

Get a model instance based on specification string.

Supports:

- Standard pydantic-ai model strings (e.g., 'anthropic:claude-3-haiku-20240307')
- Local Devstral (e.g., 'devstral:local')
- OpenRouter models (e.g., 'openrouter:mistralai/devstral-small-2505')

@notice: |
    Resolves model specs into provider-specific model objects when needed.

@dev: |
    Returns a string for standard pydantic-ai model specs.

@factory-for:
    id: pydantic_ai.models:Model
    rationale: "Centralized model resolution for all agents."
    singleton: false
    cache-key: model_spec

@canonical-home:
    for:
        - "model resolution"
    notes: "Use get_model to normalize model specs."

Source code in agent_k/infra/providers.py
def get_model(model_spec: Annotated[str, Doc("Model specification string.")]) -> Model | str:
    """Get a model instance based on specification string.

    Supports:
    - Standard pydantic-ai model strings (e.g., 'anthropic:claude-3-haiku-20240307')
    - Local Devstral (e.g., 'devstral:local')
    - OpenRouter models (e.g., 'openrouter:mistralai/devstral-small-2505')

    @notice: |
        Resolves model specs into provider-specific model objects when needed.

    @dev: |
        Returns a string for standard pydantic-ai model specs.

    @factory-for:
        id: pydantic_ai.models:Model
        rationale: "Centralized model resolution for all agents."
        singleton: false
        cache-key: model_spec

    @canonical-home:
        for:
            - "model resolution"
        notes: "Use get_model to normalize model specs."
    """
    if model_spec.startswith("devstral:"):
        # Parse devstral model specification for local LM Studio
        suffix = model_spec[len("devstral:") :]

        if suffix == "local":
            # Use default local LM Studio configuration
            return create_devstral_model()
        elif suffix.startswith("http"):
            # Custom base URL provided
            return create_devstral_model(base_url=suffix)
        else:
            # Assume suffix is a model ID
            return create_devstral_model(model_id=suffix)

    if model_spec.startswith("openrouter:"):
        # Parse OpenRouter model specification
        model_id = model_spec[len("openrouter:") :]
        return create_openrouter_model(model_id)

    # Return string for standard pydantic-ai model resolution
    # (e.g., 'anthropic:claude-3-haiku-20240307', 'openai:gpt-4o')
    return model_spec
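The dispatch above is pure string routing, so it can be illustrated without pydantic-ai installed. A minimal re-implementation that returns `(provider, detail)` label tuples in place of model objects (a sketch for illustration only, not part of the module's API):

```python
def resolve_spec(model_spec: str) -> tuple[str, str]:
    """Mirror get_model's prefix dispatch, returning labels instead of models."""
    if model_spec.startswith("devstral:"):
        suffix = model_spec[len("devstral:"):]
        if suffix == "local":
            return ("devstral", "default")    # default LM Studio configuration
        if suffix.startswith("http"):
            return ("devstral-url", suffix)   # custom base URL
        return ("devstral-id", suffix)        # explicit model ID
    if model_spec.startswith("openrouter:"):
        # Everything after the prefix is the OpenRouter model ID.
        return ("openrouter", model_spec[len("openrouter:"):])
    # Standard pydantic-ai specs pass through unchanged as strings.
    return ("passthrough", model_spec)
```

One consequence of checking `suffix.startswith("http")` before the model-ID branch: a Devstral model ID that happens to begin with "http" would be misread as a URL, which is worth keeping in mind when naming local models.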

is_devstral_model

is_devstral_model(model_spec: Annotated[str, Doc('Model specification string.')]) -> bool

Check if a model specification refers to Devstral.

@dev: |
    See module for behavior details and invariants.

@notice: |
    Returns true when the model spec targets Devstral.
Source code in agent_k/infra/providers.py
def is_devstral_model(model_spec: Annotated[str, Doc("Model specification string.")]) -> bool:
    """Check if a model specification refers to Devstral.

    @dev: |
        See module for behavior details and invariants.

    @notice: |
        Returns true when the model spec targets Devstral.
    """
    return model_spec.startswith("devstral:")
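Since the check is a plain prefix test (reproduced here verbatim so it can run standalone), callers can use it to branch on local-versus-cloud behavior before a model object is ever built:

```python
def is_devstral_model(model_spec: str) -> bool:
    # A spec targets Devstral iff it uses the 'devstral:' scheme prefix.
    return model_spec.startswith("devstral:")
```

For example, an agent can skip cloud-only concerns such as API-key validation when `is_devstral_model(spec)` is true, since the local LM Studio endpoint requires no auth.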