II. Calling the API
1. Calling the DeepSeek-V3 model
# Please install the OpenAI SDK first: `pip3 install openai`
from openai import OpenAI

# Create a client that points at the official DeepSeek API endpoint
client = OpenAI(api_key="<DeepSeek API Key>", base_url="https://api.deepseek.com")

# "deepseek-chat" is the model name for DeepSeek-V3
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False,
)

print(response.choices[0].message.content)
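The response follows the OpenAI Chat Completions format, so a multi-turn conversation simply appends the assistant's reply to the message list before the next request. A minimal sketch, reusing the client defined above (the follow-up question is only illustrative):

# Keep the running history in a list and append each turn to it
messages = [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Hello"},
]
first = client.chat.completions.create(model="deepseek-chat", messages=messages)
messages.append({"role": "assistant", "content": first.choices[0].message.content})

# Hypothetical follow-up question, sent together with the full history
messages.append({"role": "user", "content": "What can you help me with?"})
second = client.chat.completions.create(model="deepseek-chat", messages=messages)
print(second.choices[0].message.content)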
2. Calling the DeepSeek-R1 model
# Please install the OpenAI SDK first: `pip3 install openai`
from openai import OpenAI

# Same client as before, pointed at the official DeepSeek API endpoint
client = OpenAI(api_key="<DeepSeek API Key>", base_url="https://api.deepseek.com")

# "deepseek-reasoner" is the model name for DeepSeek-R1
response = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False,
)

print(response.choices[0].message.content)
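With deepseek-reasoner the API also returns the model's chain of thought in a separate reasoning_content field on the message, alongside the final answer in content. A short sketch, reusing the response from the call above:

# The reasoning model exposes its chain of thought separately from the answer
message = response.choices[0].message
print("Reasoning:", message.reasoning_content)  # intermediate reasoning
print("Answer:", message.content)               # final reply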
III. Calling a Locally Deployed Model
1. Installing Ollama locally (omitted)
2. Installing DeepSeek-R1 locally
ollama run deepseek-r1:1.5b
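Once the pull finishes, a quick way to confirm the model is registered with the local Ollama server is to query its /api/tags endpoint (a minimal check, assuming the default port 11434):

import json
import urllib.request

# List the models known to the local Ollama server
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    tags = json.load(resp)

print([m["name"] for m in tags.get("models", [])])  # should include "deepseek-r1:1.5b"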
3. Calling the local Ollama API
# Please install the OpenAI SDK first: `pip3 install openai`
from openai import OpenAI

# Ollama exposes an OpenAI-compatible API over plain HTTP on port 11434;
# the api_key is required by the SDK but ignored by Ollama
client = OpenAI(api_key="ollama", base_url="http://localhost:11434/v1/")

response = client.chat.completions.create(
    model="deepseek-r1:1.5b",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False,
)

print(response.choices[0].message.content)
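The call above disables streaming; with stream=True the same OpenAI SDK yields the reply incrementally, which is often more comfortable with a local model. A minimal streaming sketch under the same setup:

# Stream the reply piece by piece instead of waiting for the full message
stream = client.chat.completions.create(
    model="deepseek-r1:1.5b",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()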