
Implementing the DeepSeek Large-Model API Interface

Posted on 2025-02-07 15:18 by 且行且思

II. Calling the API

1. Calling the DeepSeek-V3 model

# Please install OpenAI SDK first: `pip3 install openai`
 
from openai import OpenAI
 
client = OpenAI(api_key="<DeepSeek API Key>", base_url="https://api.deepseek.com")
 
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False
)
 
print(response.choices[0].message.content)
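The call above sets `stream=False`, so the full reply arrives in one response. For interactive use you can set `stream=True` and consume the reply incrementally. A minimal sketch (the `stream_chat` helper is my own wrapper, not part of the SDK; it assumes the same OpenAI-compatible client shown above):

```python
def stream_chat(client, model, messages):
    """Yield text fragments as the server streams them back.

    Works with any OpenAI-compatible client (DeepSeek, Ollama, ...).
    """
    response = client.chat.completions.create(
        model=model, messages=messages, stream=True
    )
    for chunk in response:
        delta = chunk.choices[0].delta.content
        if delta:  # some chunks (e.g. the role-only opener) carry no text
            yield delta

# Usage with the client constructed earlier:
# for fragment in stream_chat(client, "deepseek-chat",
#                             [{"role": "user", "content": "Hello"}]):
#     print(fragment, end="", flush=True)
```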

2. Calling the DeepSeek-R1 model

# Please install OpenAI SDK first: `pip3 install openai`
 
from openai import OpenAI
 
client = OpenAI(api_key="<DeepSeek API Key>", base_url="https://api.deepseek.com")
 
response = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False
)
 
print(response.choices[0].message.content)
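Per DeepSeek's API documentation, `deepseek-reasoner` returns its chain of thought in a separate `reasoning_content` field on the message, alongside the final `content`. A small hedged helper (my own; the `getattr` guard keeps it safe against models or servers that omit the field):

```python
def split_reasoning(message):
    """Return (chain_of_thought, final_answer) from a chat message.

    `reasoning_content` is DeepSeek-specific and may be absent, so we
    fall back to None rather than raising AttributeError.
    """
    reasoning = getattr(message, "reasoning_content", None)
    return reasoning, message.content

# msg = response.choices[0].message
# thoughts, answer = split_reasoning(msg)
# print(answer)
```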

 

III. Calling a Locally Deployed Instance

1. Installing Ollama locally (omitted)

2. Installing DeepSeek-R1 locally

ollama run deepseek-r1:1.5b
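After the pull completes, you can confirm the model is available by querying Ollama's REST API: `GET /api/tags` lists the installed models. A minimal sketch using only the standard library (the default port 11434 is assumed; `has_model` is a hypothetical helper for the check):

```python
import json
from urllib.request import urlopen

def installed_models(base="http://localhost:11434"):
    """Return the model names the local Ollama server reports."""
    with urlopen(base + "/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

def has_model(names, wanted="deepseek-r1:1.5b"):
    """Pure check, split out so it can be tested without a server."""
    return wanted in names

# print(has_model(installed_models()))
```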

 

3. Calling the local Ollama API

# Please install OpenAI SDK first: `pip3 install openai`
 
from openai import OpenAI
 
client = OpenAI(api_key="ollama",  # Ollama ignores the key; any non-empty string works
    base_url="http://localhost:11434/v1/")  # note: plain http, not https
 
response = client.chat.completions.create(
    model="deepseek-r1:1.5b",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False
)
 
print(response.choices[0].message.content)
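When R1 is served through Ollama's OpenAI-compatible endpoint, the chain of thought typically arrives inline in `content`, wrapped in `<think>...</think>` tags, rather than in a separate field. A small sketch to strip it and keep only the final answer (assuming that tag convention holds for your build):

```python
import re

# Non-greedy DOTALL match so multi-line reasoning blocks are removed whole.
THINK_RE = re.compile(r"<think>.*?</think>\s*", re.DOTALL)

def strip_think(text):
    """Remove <think>...</think> reasoning blocks, keeping the answer."""
    return THINK_RE.sub("", text).strip()

# answer = strip_think(response.choices[0].message.content)
```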