The Chunkr AI API allows you to configure the LLMs used to process your documents. The LLM configuration is applied to your segments during the segment_processing step; see the Segment Processing documentation to learn more. This is how you can configure the LLMs:
llm_processing=LlmProcessing(
    llm_model_id="chunkr-parse-1-thinking",                       # primary model for segment processing
    fallback_strategy=FallbackStrategy.model("chunkr-parse-1"),   # fall back to this model if the primary fails
    max_completion_tokens=4096,                                   # cap on tokens generated per response
    temperature=0.0                                               # deterministic output
)

LLM Processing Options

The LlmProcessing configuration controls which language models are used for processing segments and provides fallback strategies if the primary model fails.
| Field | Type | Description | Default |
| --- | --- | --- | --- |
| llm_model_id | String | The ID of the model to use for processing. If not provided, the system default model will be used. | System default |
| fallback_strategy | FallbackStrategy | Strategy to use if the primary model fails. | System default |
| max_completion_tokens | Integer | Maximum number of tokens to generate in the model response. | None |
| temperature | Float | Controls randomness in model responses (0.0 = deterministic, higher = more random). | 0.0 |
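Per the defaults above, you only need to set the fields you care about; anything you omit (such as llm_model_id and fallback_strategy) falls back to the system default. A minimal sketch, where the 2048-token cap is just an illustrative value:

llm_processing=LlmProcessing(
    max_completion_tokens=2048,  # cap the length of each model response
    temperature=0.0              # keep responses deterministic
)
# llm_model_id and fallback_strategy are omitted, so the system defaults apply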

Fallback Strategies

When working with language models, reliability is important. Chunkr provides three fallback strategies to handle cases where your primary model fails (see the sketch after this list):
  • FallbackStrategy.none(): No fallback will be used. If the primary model fails, the operation will return an error.
  • FallbackStrategy.default(): Use the system default fallback model.
  • FallbackStrategy.model("model-id"): Specify a particular model ID to use as a fallback. This gives you explicit control over which alternative model should be used.
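As a quick sketch, the three strategies are constructed like this (the model ID "chunkr-parse-1" is reused from the configuration example above; substitute any model from the Available Models table):

from chunkr_ai.models import FallbackStrategy

no_fallback = FallbackStrategy.none()          # error out if the primary model fails
default_fallback = FallbackStrategy.default()  # fall back to the system default model
explicit_fallback = FallbackStrategy.model("chunkr-parse-1")  # fall back to a specific model ID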

Example Usage

Here’s how to configure LLM processing in different scenarios:
from chunkr_ai import Chunkr
from chunkr_ai.models import (
    Configuration,
    LlmProcessing,
    FallbackStrategy
)

chunkr = Chunkr()

# Use Gemini Pro 2.5 with no fallback strategy
config = Configuration(
    llm_processing=LlmProcessing(
        llm_model_id="gemini-pro-2.5",
        fallback_strategy=FallbackStrategy.none(),
        temperature=0.0
    )
)

chunkr.upload("path/to/file", config)
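Another scenario, shown as a sketch: keep the system default primary model but fall back to a specific model if it fails. The model ID "chunkr-parse-1" is reused from the earlier snippet, and the 1024-token cap is only an illustrative value:

# Keep the system default primary model, but fall back to "chunkr-parse-1" on failure
config_with_fallback = Configuration(
    llm_processing=LlmProcessing(
        fallback_strategy=FallbackStrategy.model("chunkr-parse-1"),
        max_completion_tokens=1024,
        temperature=0.0
    )
)

chunkr.upload("path/to/file", config_with_fallback)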

Available Models

The following models are currently available for use with Chunkr:
Note: This table is dynamically generated by fetching data from our API. Model availability may change over time.