How do I make the DeepSeek V3.1 model "think"?
Last updated: August 29, 2025
DeepSeek-V3.1 is a hybrid model that supports both thinking and non-thinking modes. To turn on reasoning, pass the extra `chat_template_kwargs={"thinking": True}` configuration in your request to the `deepseek-ai/DeepSeek-V3.1` model.
An example call would look like this:
```python
from together import Together

client = Together()

stream = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-V3.1",
    messages=[
        {"role": "user", "content": "What is the most expensive sandwich?"}
    ],
    chat_template_kwargs={"thinking": True},
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta
    # Show reasoning tokens if present
    if hasattr(delta, "reasoning") and delta.reasoning:
        print(delta.reasoning, end="", flush=True)
    # Show content tokens if present
    if hasattr(delta, "content") and delta.content:
        print(delta.content, end="", flush=True)
```
Please note that for this model, function calling only works in non-thinking (`"thinking": False`) mode.
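If you do need tool use, the request shape might look like the sketch below. The `get_weather` tool schema is purely illustrative (not a Together built-in), and the payload would be passed to `client.chat.completions.create(**request)` just as in the example above:

```python
# Illustrative tool schema -- get_weather is a hypothetical function you would
# implement yourself, not part of the Together API
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a given city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

# Request payload for a tool-calling turn; note thinking is disabled,
# since function calling requires non-thinking mode for this model
request = {
    "model": "deepseek-ai/DeepSeek-V3.1",
    "messages": [
        {"role": "user", "content": "What's the weather in Paris right now?"}
    ],
    "tools": tools,
    "chat_template_kwargs": {"thinking": False},
}

# client.chat.completions.create(**request) would then return a response whose
# choices[0].message.tool_calls holds any tool invocations the model made.
```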