OpenManus Local Ollama Installation and Usage Guide
Requirements
- Python 3.9 or later (see the quick check right after this list)
- Git
- Supported operating systems: Windows/Linux/macOS
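A minimal, standalone way to confirm the Python requirement before installing (not part of OpenManus itself):
import sys

# OpenManus needs Python 3.9+; fail early with a readable message otherwise
assert sys.version_info >= (3, 9), f"Python 3.9+ required, found {sys.version}"
print("Python version OK:", sys.version.split()[0])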
Installation
1. Clone the repository
git clone https://github.com/mannaandpoem/OpenManus.git
cd OpenManus
2. Create a virtual environment (recommended)
python -m venv venv
# Windows
venv\Scripts\activate
# Linux/MacOS
source venv/bin/activate
3. Install dependencies
pip install -r requirements.txt
4. Install Ollama (optional)
If you want to use a local model, you need to install Ollama:
- Visit the Ollama website to download and install it
- Pull the model you need:
ollama pull llama3.2 # or another supported model
- Start the Ollama service:
ollama serve
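To confirm the service is reachable and the model has been pulled, here is a minimal check against Ollama's local REST API (this assumes the default port 11434; the /api/tags endpoint lists downloaded models):
import json
from urllib.request import urlopen

# Ask the local Ollama service which models are available
with urlopen("http://127.0.0.1:11434/api/tags") as resp:
    names = [m["name"] for m in json.load(resp)["models"]]
print(names)  # expect an entry like "llama3.2:latest"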
5. Install Playwright (for browser automation)
playwright install
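To verify the browser installation, a short standalone sketch using Playwright's synchronous API (not part of OpenManus itself):
from playwright.sync_api import sync_playwright

# Launch a headless Chromium and load a page to confirm Playwright works
with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto("https://example.com")
    print(page.title())  # prints "Example Domain" on a working install
    browser.close()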
Basic Usage
1. Configuration
Before running, configure the config/config.toml file:
# Global LLM configuration
[llm] #OLLAMA:
api_type = 'ollama'
model = "llama3.2" # The LLM model to use
base_url = "http://127.0.0.1:11434/v1" # API endpoint URL
api_key = "ollama" # Your API key
max_tokens = 4096 # Maximum number of tokens in the response
temperature = 0.0 # Controls randomness
[llm.vision] #OLLAMA VISION:
api_type = 'ollama'
model = "lama3.2-visionl" # The vision model to use
base_url = "http://127.0.0.1:11434/v1" # API endpoint URL for vision model
api_key = "ollama" # Your API key for vision model
max_tokens = 4096 # Maximum number of tokens in the response
temperature = 0.0 # Controls randomness for vision model
# Optional configuration, Search settings.
[search]
# Search engine for agent to use. Default is "Google", can be set to "Baidu" or "DuckDuckGo" or "Bing".
engine = "Baidu"
# Fallback engine order. Default is ["DuckDuckGo", "Baidu", "Bing"] - will try in this order after primary engine fails.
fallback_engines = [ "Baidu", "Bing"]
# Seconds to wait before retrying all engines again when they all fail due to rate limits. Default is 60.
retry_delay = 60
# Maximum number of times to retry all engines when all fail. Default is 3.
max_retries = 3
# Language code for search results. Options: "en" (English), "zh" (Chinese), etc.
lang = "zh"
# Country code for search results. Options: "us" (United States), "cn" (China), etc.
country = "cn"
# Optional configuration, browser settings.
[browser]
headless = false # set to true to run the browser without a visible window
[mcp]
server_reference = "app.mcp.server" # default server module reference
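The [llm] block points OpenManus at Ollama's OpenAI-compatible /v1 endpoint, so the same settings can be sanity-checked outside OpenManus with the openai Python client (a minimal sketch; assumes the openai package is installed and llama3.2 has been pulled):
from openai import OpenAI

# Same endpoint, placeholder key and model as in config/config.toml
client = OpenAI(base_url="http://127.0.0.1:11434/v1", api_key="ollama")
reply = client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(reply.choices[0].message.content)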
2. Run the example
python run_flow.py
3. Basic API call
The model is taken from the [llm] section of config/config.toml, so no Ollama-specific arguments are needed when creating the agent; note that the agent's run method is asynchronous:
import asyncio
from app.agent.manus import Manus

async def main():
    # Ollama settings (model, base_url, api_key) come from config/config.toml
    agent = Manus()
    response = await agent.run("your instruction")
    print(response)

asyncio.run(main())
Troubleshooting
1. Playwright issues
If the Playwright browser fails to launch, make sure that:
- playwright install has been run successfully
- the required system dependencies are installed
- your firewall allows the browser to access the network
2. API calls fail
- Ollama:
- Make sure the Ollama service is running
- Check that base_url in config/config.toml is correct
- Verify that the required model has been pulled
3. Memory usage
- 8 GB of RAM or more is recommended
- Large tasks may require adjusting memory limits
Getting Help
- Open an issue on GitHub Issues
- Join the Discord community
- Check the official documentation