🚀 FastGPT Private Deployment: A Complete Guide

📋 Requirements

Hardware requirements

Minimum:
  CPU: 4 cores
  Memory: 8 GB
  Storage: 50 GB
  Network: stable internet connection

Recommended:
  CPU: 8+ cores
  Memory: 16 GB+
  Storage: 100 GB+ SSD
  Network: 10 Mbps+ bandwidth

Software environment

Required software:
  - Docker: >= 20.10.0
  - Docker Compose: >= 2.0.0
  - Git: latest version

Operating systems:
  - Ubuntu 20.04+ (recommended)
  - CentOS 7+
  - Windows Server (with Docker support)
  - macOS (development and testing only)
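Before going further, it is worth verifying the software prerequisites above from a shell. A minimal sketch: the `version_ge` helper is my own and relies on `sort -V` for version-aware ordering.

```shell
# version_ge A B: succeed when version A >= version B (helper of my own)
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n 1)" = "$2" ]
}

# Compare the installed Docker / Compose versions against the minimums
docker_ver=$(docker version --format '{{.Server.Version}}' 2>/dev/null || echo 0)
compose_ver=$(docker compose version --short 2>/dev/null || echo 0)

version_ge "$docker_ver" 20.10.0 && echo "Docker OK ($docker_ver)" \
  || echo "Docker missing or older than 20.10.0"
version_ge "$compose_ver" 2.0.0 && echo "Compose OK ($compose_ver)" \
  || echo "Compose missing or older than 2.0.0"
```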

🐳 Docker Compose Deployment (Recommended)

1. Get the source code

# Clone the repository
git clone https://github.com/labring/FastGPT.git
cd FastGPT

# Check out a stable release
git checkout v4.9.14

2. Prepare the configuration

# Enter the deployment directory
cd projects/app/docker

# Copy the template configuration files
cp .env.template .env
cp config.json.template config.json

3. Edit the configuration files

Edit the .env file:
# Database credentials
MONGO_PASSWORD=your_mongo_password
PG_PASSWORD=your_postgres_password

# Service port
PORT=3000

# Initial admin password
DEFAULT_ROOT_PSW=your_admin_password
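Rather than typing the two database passwords by hand, you can generate random values. A minimal sketch using /dev/urandom; the `rand_pw` helper is my own:

```shell
# rand_pw N: print an N-character alphanumeric password (helper of my own)
rand_pw() {
  tr -dc 'A-Za-z0-9' < /dev/urandom | head -c "${1:-24}"
}

# Emit ready-to-paste lines for the .env file
echo "MONGO_PASSWORD=$(rand_pw 24)"
echo "PG_PASSWORD=$(rand_pw 24)"
```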
Edit the config.json file:
{
  "SystemParams": {
    "gitBranch": "v4.9.14",
    "chatApiKey": "",
    "vectorMaxProcess": 15,
    "qaMaxProcess": 15,
    "pgHNSWEfSearch": 100
  },
  "llmModels": [
    {
      "model": "gpt-3.5-turbo",
      "name": "GPT-3.5-turbo",
      "apiKey": "YOUR_OPENAI_API_KEY",
      "baseUrl": "https://api.openai.com/v1",
      "maxTokens": 4000,
      "maxTemperature": 1.2
    }
  ],
  "vectorModels": [
    {
      "model": "text-embedding-ada-002",
      "name": "OpenAI-Ada",
      "apiKey": "YOUR_OPENAI_API_KEY", 
      "baseUrl": "https://api.openai.com/v1",
      "dbConfig": {}
    }
  ]
}
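JSON does not allow comments or trailing commas, and a malformed config.json is a common cause of startup failures, so it pays to validate the file before starting the stack. A quick check built on Python's standard `json.tool` module (the `check_json` wrapper is my own):

```shell
# check_json FILE: report whether FILE parses as JSON
# (python3 -m json.tool exits non-zero on a syntax error)
check_json() {
  if python3 -m json.tool "$1" > /dev/null 2>&1; then
    echo "$1: valid JSON"
  else
    echo "$1: INVALID JSON"
  fi
}

check_json config.json
```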

4. Start the services

# Start all services
docker-compose up -d

# Check service status
docker-compose ps

# Tail the logs
docker-compose logs -f fastgpt

⚙️ Configuration Details

LLM configuration

OpenAI configuration
{
  "model": "gpt-4",
  "name": "GPT-4",
  "apiKey": "sk-xxxxxxxx",
  "baseUrl": "https://api.openai.com/v1",
  "maxTokens": 8000,
  "maxTemperature": 1.2,
  "vision": true
}
Chinese LLM providers
// Alibaba Qwen (Tongyi Qianwen)
{
  "model": "qwen-max",
  "name": "Qwen-Max",
  "apiKey": "sk-xxxxxxxx",
  "baseUrl": "https://dashscope.aliyuncs.com/compatible-mode/v1",
  "maxTokens": 6000
}

// DeepSeek
{
  "model": "deepseek-chat",
  "name": "DeepSeek Chat", 
  "apiKey": "sk-xxxxxxxx",
  "baseUrl": "https://api.deepseek.com/v1",
  "maxTokens": 4000
}

Embedding model configuration

// Locally hosted BGE model
{
  "model": "bge-large-zh-v1.5",
  "name": "BGE-Large-ZH",
  "baseUrl": "http://localhost:6006/v1", 
  "dbConfig": {
    "dimensions": 1024
  }
}

// OpenAI Embedding
{
  "model": "text-embedding-3-large",
  "name": "OpenAI-Embedding-3-Large",
  "apiKey": "sk-xxxxxxxx",
  "baseUrl": "https://api.openai.com/v1",
  "dbConfig": {
    "dimensions": 3072
  }
}

🛠️ Local Model Deployment

1. Deploying with Ollama

# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Pull models
ollama pull qwen2.5:7b
ollama pull bge-m3:latest

# Start the server (skip if the installer already registered it as a service)
ollama serve
Configuring FastGPT to use Ollama:
{
  "llmModels": [
    {
      "model": "qwen2.5:7b",
      "name": "Qwen2.5-7B",
      "baseUrl": "http://host.docker.internal:11434/v1",
      "apiKey": "ollama",
      "maxTokens": 4000
    }
  ],
  "vectorModels": [
    {
      "model": "bge-m3:latest", 
      "name": "BGE-M3",
      "baseUrl": "http://host.docker.internal:11434/v1",
      "apiKey": "ollama"
    }
  ]
}

2. Deploying with Xinference

# Install Xinference
pip install xinference

# Start the local server
xinference-local --host 0.0.0.0 --port 9997

# Manage models through the web UI
# http://localhost:9997

🌐 Reverse Proxy

Nginx configuration

server {
    listen 80;
    server_name your-domain.com;
    
    # Redirect HTTP to HTTPS
    return 301 https://$server_name$request_uri;
}

server {
    listen 443 ssl;
    server_name your-domain.com;
    
    # SSL certificate
    ssl_certificate /path/to/cert.pem;
    ssl_certificate_key /path/to/key.pem;
    
    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        
        # WebSocket support
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}

📊 Database Management

MongoDB

# Connect to MongoDB (on MongoDB 5.0+ images, use mongosh instead of mongo)
docker exec -it fastgpt-mongo mongo -u myusername -p mypassword

# Back up the database
docker exec fastgpt-mongo mongodump -u myusername -p mypassword -d fastgpt -o /backup

# Restore the database
docker exec fastgpt-mongo mongorestore -u myusername -p mypassword -d fastgpt /backup/fastgpt

PostgreSQL

# Connect to PostgreSQL
docker exec -it fastgpt-pg psql -U username -d postgres

# Back up the database
docker exec fastgpt-pg pg_dump -U username fastgpt > backup.sql

# Restore the database
docker exec -i fastgpt-pg psql -U username fastgpt < backup.sql
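The backup commands above can be wrapped in a small date-stamped script suitable for cron. A sketch under my own conventions: the `backup_path` helper and directory layout are mine, and the commented `docker exec` lines reuse the container names and credentials from the examples above.

```shell
# backup_path DIR NAME: build a date-stamped backup file path (helper of my own)
backup_path() {
  printf '%s/%s-%s.sql' "$1" "$2" "$(date +%Y%m%d)"
}

BACKUP_DIR=${BACKUP_DIR:-/backup/fastgpt}
PG_FILE=$(backup_path "$BACKUP_DIR" fastgpt-pg)
echo "PostgreSQL backup target: $PG_FILE"

# Uncomment to run against the containers from this guide:
# mkdir -p "$BACKUP_DIR"
# docker exec fastgpt-pg pg_dump -U username fastgpt > "$PG_FILE"
# docker exec fastgpt-mongo mongodump -u myusername -p mypassword \
#   -d fastgpt -o "$BACKUP_DIR/mongo-$(date +%Y%m%d)"
```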

🔧 Troubleshooting

1. Services fail to start

# Check for port conflicts
netstat -tulpn | grep :3000

# Check disk space
df -h

# Restart the services
docker-compose restart

2. Out of memory

# Limit memory in docker-compose.yml (note: legacy docker-compose v1 needs
# the --compatibility flag for deploy.resources limits to take effect)
services:
  fastgpt:
    deploy:
      resources:
        limits:
          memory: 4G
        reservations:
          memory: 2G

3. API connectivity issues

# Test API connectivity
curl -X POST https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-3.5-turbo","messages":[{"role":"user","content":"Hello"}]}'

🚀 Production Optimization

Performance tuning

# docker-compose.yml tuning
version: '3.8'
services:
  fastgpt:
    restart: always
    logging:
      driver: "json-file"
      options:
        max-size: "100m"
        max-file: "3"
    deploy:
      resources:
        limits:
          cpus: '4.0'
          memory: 8G
        reservations:
          cpus: '2.0'
          memory: 4G

Monitoring

# Add a health check (under the fastgpt service in docker-compose.yml)
healthcheck:
  test: ["CMD", "curl", "-f", "http://localhost:3000/api/system/getInitData"]
  interval: 30s
  timeout: 10s
  retries: 3
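The same health endpoint can be polled from the client side after `docker-compose up` to wait until the service is actually ready, instead of sleeping an arbitrary amount of time. A minimal sketch; the `wait_for` helper is my own:

```shell
# wait_for MAX CMD...: retry CMD once per second until it succeeds,
# failing after MAX attempts
wait_for() {
  max=$1; shift
  n=0
  until "$@" > /dev/null 2>&1; do
    n=$((n + 1))
    [ "$n" -ge "$max" ] && return 1
    sleep 1
  done
}

# Usage against the health endpoint from the healthcheck above:
# wait_for 30 curl -f http://localhost:3000/api/system/getInitData \
#   && echo "FastGPT is up"
```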

📋 Deployment Checklist

Pre-deployment checks ✅

□ Sufficient server resources
□ Docker environment healthy
□ Stable network connectivity
□ DNS records configured
□ SSL certificate ready
□ API keys valid

Post-deployment verification ✅

□ Services start cleanly
□ Web UI reachable
□ Database connections healthy
□ LLM calls succeed
□ File upload works
□ Chat flow passes a smoke test

🎯 Summary

Self-hosting FastGPT is fairly straightforward. The key points:

  1. Environment: make sure the Docker setup and hardware resources are adequate
  2. Configuration: set up the LLM and embedding-model APIs correctly
  3. Security: use HTTPS, strong passwords, and a firewall
  4. Operations: schedule backups, monitor logs, and tune performance

Once deployed, you will have a fully private AI knowledge-base platform! 🎉
