OpenAI compatibility

February 8, 2024

Ollama now has built-in compatibility with the OpenAI Chat Completions API, making it possible to use more tools and applications with Ollama locally.

Setup

Start by downloading Ollama and pulling a model such as Llama 2 or Mistral:

ollama pull llama2

Usage

cURL

To invoke Ollama's OpenAI-compatible API endpoint, use the same OpenAI format and change the hostname to http://127.0.0.1:11434:

curl http://127.0.0.1:11434/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "llama2",
        "messages": [
            {
                "role": "system",
                "content": "You are a helpful assistant."
            },
            {
                "role": "user",
                "content": "Hello!"
            }
        ]
    }'
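The same request can be made from Python's standard library without any extra dependencies. This is a minimal sketch, assuming an Ollama server is running on 127.0.0.1:11434; the `build_chat_body` and `send_chat` helper names are our own, not part of any API.

```python
import json
import urllib.request

# Assumed local Ollama endpoint (adjust if your server runs elsewhere).
OLLAMA_URL = "http://127.0.0.1:11434/v1/chat/completions"

def build_chat_body(model: str, messages: list) -> bytes:
    """Serialize an OpenAI-style chat completion request body."""
    return json.dumps({"model": model, "messages": messages}).encode("utf-8")

def send_chat(model: str, messages: list) -> dict:
    """POST the request to the local Ollama server and decode the JSON reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_chat_body(model, messages),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Build (but do not send) the same body as the curl example above.
body = build_chat_body("llama2", [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
```

Calling `send_chat("llama2", ...)` returns the parsed response dict once the server is up.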

OpenAI Python library

from openai import OpenAI

client = OpenAI(
    base_url='http://127.0.0.1:11434/v1',
    api_key='ollama', # required, but unused
)

response = client.chat.completions.create(
  model="llama2",
  messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who won the world series in 2020?"},
    {"role": "assistant", "content": "The LA Dodgers won in 2020."},
    {"role": "user", "content": "Where was it played?"}
  ]
)
print(response.choices[0].message.content)
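Behind `response.choices[0].message.content`, the client is parsing a standard Chat Completions JSON body. A minimal sketch of that shape follows; the field values here are illustrative placeholders, not real model output.

```python
import json

# Illustrative response following the OpenAI Chat Completions schema
# (values are made up for demonstration, not real model output).
raw = json.dumps({
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "model": "llama2",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "It was played in Texas."},
            "finish_reason": "stop",
        }
    ],
})

def first_reply(payload: str) -> str:
    """Extract the assistant's message from a chat completion response body."""
    return json.loads(payload)["choices"][0]["message"]["content"]

print(first_reply(raw))  # prints: It was played in Texas.
```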

OpenAI JavaScript library

import OpenAI from 'openai'

const openai = new OpenAI({
  baseURL: 'http://127.0.0.1:11434/v1',
  apiKey: 'ollama', // required but unused
})

const completion = await openai.chat.completions.create({
  model: 'llama2',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})

console.log(completion.choices[0].message.content)

Examples

Vercel AI SDK

The Vercel AI SDK is an open-source library for building conversational streaming applications. To get started, clone the example repo using create-next-app:

npx create-next-app --example https://github.com/vercel/ai/tree/main/examples/next-openai example
cd example

Then, make the following two edits in app/api/chat/route.ts to update the chat example to use Ollama:

const openai = new OpenAI({
  baseURL: 'http://127.0.0.1:11434/v1',
  apiKey: 'ollama',
});
const response = await openai.chat.completions.create({
  model: 'llama2',
  stream: true,
  messages,
});

Next, run the app:

npm run dev

Finally, open the example app in your browser at http://127.0.0.1:3000.

Autogen

Autogen is a popular open-source framework by Microsoft for building multi-agent applications. For this example, we'll use the Code Llama model:

ollama pull codellama

Install Autogen:

pip install pyautogen

Then create a Python script example.py to use Ollama with Autogen:

from autogen import AssistantAgent, UserProxyAgent

config_list = [
  {
    "model": "codellama",
    "base_url": "http://127.0.0.1:11434/v1",
    "api_key": "ollama",
  }
]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})

user_proxy = UserProxyAgent("user_proxy", code_execution_config={"work_dir": "coding", "use_docker": False})
user_proxy.initiate_chat(assistant, message="Plot a chart of NVDA and TESLA stock price change YTD.")

Finally, run the example to have the assistant write the code to plot the chart:

python example.py

More to come

This is initial, experimental support for the OpenAI API. Future improvements under consideration include:

  • Embeddings API
  • Function calling
  • Vision support
  • Logprobs

GitHub issues are welcome! For more information, see the OpenAI compatibility documentation.