The server already had an Ollama installation; I wanted to reinstall it and ran into a series of problems along the way.
Install guide: https://github.com/ollama/ollama/blob/main/docs/linux.md
Model page: https://ollama.com/library/deepseek-r1:1.5b
I. Installing the new Ollama
All steps below are run as root.
1. Uninstall the existing Ollama
# Remove the ollama service:
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service

# Remove the ollama binary from your bin directory (either /usr/local/bin, /usr/bin, or /bin):
sudo rm $(which ollama)

# Remove the downloaded models and the Ollama service user and group:
sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama

# Remove installed libraries:
sudo rm -rf /usr/local/lib/ollama
# Remove the old libraries:
sudo rm -rf /usr/lib/ollama
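Before reinstalling, it can help to confirm the cleanup actually removed everything. A minimal sketch (the path list simply mirrors the removal commands above):

```shell
# Sketch: list anything the uninstall left behind; prints "leftovers:none" when clean.
leftovers=""
for p in /etc/systemd/system/ollama.service /usr/share/ollama \
         /usr/local/lib/ollama /usr/lib/ollama; do
  if [ -e "$p" ]; then leftovers="$leftovers $p"; fi
done
# also check whether an ollama binary is still on PATH
if command -v ollama >/dev/null 2>&1; then leftovers="$leftovers $(command -v ollama)"; fi
echo "leftovers:${leftovers:-none}"
```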
2. Install
curl -fsSL https://ollama.com/install.sh | sh
This command downloads very slowly from inside China, so install manually instead:
# Download and extract the package:
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
# If the server cannot reach the URL, open
# https://ollama.com/download/ollama-linux-amd64.tgz in a browser on your local
# machine and copy the file to the server.
sudo tar -C /usr -xzf ollama-linux-amd64.tgz

# Start Ollama:
ollama serve
# It starts fine but keeps the terminal occupied with log output, so run it in
# the background instead (or as a systemd service, see the next step):
# ollama serve &

# In another terminal, verify that Ollama is running:
ollama -v
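When the archive is copied over manually, a truncated download is easy to miss and produces a confusing tar error later. Checking the gzip stream first catches this. A sketch — `check_tgz` is a hypothetical helper name, demonstrated on a throwaway archive standing in for ollama-linux-amd64.tgz:

```shell
# check_tgz: verify a .tgz file's gzip stream is intact before extracting
check_tgz() {
  if gzip -t "$1" 2>/dev/null; then echo "archive OK"; else echo "archive corrupt"; fi
}

# Demo on a tiny archive built on the fly (stand-in for ollama-linux-amd64.tgz)
tmp=$(mktemp -d)
echo hello > "$tmp/f"
tar -C "$tmp" -czf "$tmp/demo.tgz" f
check_tgz "$tmp/demo.tgz"            # archive OK

head -c 10 "$tmp/demo.tgz" > "$tmp/broken.tgz"   # simulate a truncated download
check_tgz "$tmp/broken.tgz"          # archive corrupt
```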
3. Set up Ollama as a startup service
# Adding Ollama as a startup service (recommended)
# Create a user and group for Ollama:
sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)

# Create a service file in /etc/systemd/system/ollama.service:
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"

[Install]
WantedBy=multi-user.target

# Then start the service:
sudo systemctl daemon-reload
sudo systemctl enable ollama
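The unit file above can also be written non-interactively, which is handy when provisioning more than one machine. A sketch — it writes to a temporary path here; on the real server the target would be /etc/systemd/system/ollama.service:

```shell
# Write the unit file in one step (the quoted heredoc keeps $PATH literal;
# systemd does not expand shell variables anyway).
unit=$(mktemp)   # real target: /etc/systemd/system/ollama.service
cat > "$unit" <<'EOF'
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"

[Install]
WantedBy=multi-user.target
EOF
grep -q '^ExecStart=/usr/bin/ollama serve$' "$unit" && echo "unit written"
# then: sudo systemctl daemon-reload && sudo systemctl enable ollama
```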
4. Expose the model service externally
This makes the models callable from platforms such as Dify and RAGFlow.
vim /etc/systemd/system/ollama.service

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_PORT=11434"
Environment="OLLAMA_ORIGINS=*"

[Install]
WantedBy=multi-user.target

sudo systemctl daemon-reload
sudo systemctl restart ollama

II. Problems encountered
1. Error: listen tcp 127.0.0.1:11434: bind: address already in use
# Command
ollama serve
# Error
Error: listen tcp 127.0.0.1:11434: bind: address already in use
Find the process occupying port 11434 and terminate it:
# Check which process holds the port
sudo lsof -i :11434
# Kill it
kill -9 PID
# Start again
ollama serve
If the problem persists — a new Ollama process reappears with a different PID each time you kill it, because the systemd service (Restart=always) respawns it:
root@user-NF5280M6:/home/lzm/Downloads# sudo lsof -i :11434
COMMAND     PID USER   FD   TYPE    DEVICE SIZE/OFF NODE NAME
ollama  3558609 root    3u  IPv4 274643309      0t0  TCP localhost:11434 (LISTEN)
root@user-NF5280M6:/home/lzm/Downloads# kill -9 3558609
root@user-NF5280M6:/home/lzm/Downloads# sudo lsof -i :11434
COMMAND     PID USER   FD   TYPE    DEVICE SIZE/OFF NODE NAME
ollama  3563660 root    3u  IPv4 274711630      0t0  TCP localhost:11434 (LISTEN)

# List all Ollama processes
ps aux | grep ollama
# Forcefully terminate all Ollama processes
pkill ollama
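Rather than copying the PID out of the lsof output by hand, that step can be scripted. A sketch — `port_pid` is a hypothetical helper, demonstrated on the captured lsof output from above; live usage would be `sudo lsof -i :11434 | port_pid`:

```shell
# port_pid: read `lsof -i :PORT` output on stdin, print the unique PIDs
port_pid() {
  tail -n +2 | awk '{print $2}' | sort -u
}

sample='COMMAND     PID USER   FD   TYPE    DEVICE SIZE/OFF NODE NAME
ollama  3558609 root    3u  IPv4 274643309      0t0  TCP localhost:11434 (LISTEN)'
printf '%s\n' "$sample" | port_pid    # prints 3558609
```

Note that when the service is managed by systemd, `sudo systemctl stop ollama` is the cleaner fix, since Restart=always respawns anything you kill.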
2. Model download times out
ollama run deepseek-r1:1.5b
[GIN] 2025/06/27 - 10:42:02 | 200 | 42.146µs | 127.0.0.1 | HEAD /
[GIN] 2025/06/27 - 10:42:02 | 404 | 209.992µs | 127.0.0.1 | POST /api/show
pulling manifest ⠏ time=2025-06-27T10:42:12.655+08:00 level=INFO source=images.go:713 msg="request failed: Get \"https://registry.ollama.ai/v2/library/deepseek-r1/manifests/1.5b\": dial tcp: lookup registry.ollama.ai on 127.0.0.53:53: read udp 127.0.0.1:36213->127.0.0.53:53: i/o timeout"
[GIN] 2025/06/27 - 10:42:12 | 200 | 10.00749667s | 127.0.0.1 | POST /api/pull
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/deepseek-r1/manifests/1.5b": dial tcp: lookup registry.ollama.ai on 127.0.0.53:53: read udp 127.0.0.1:36213->127.0.0.53:53: i/o timeout
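The failure above is a DNS timeout (the lookup of registry.ollama.ai via the local resolver at 127.0.0.53 never answers), not an HTTP error. A quick way to confirm it is name resolution rather than Ollama itself (a sketch):

```shell
# If resolution fails, the host cannot reach the registry by name and a
# mirror or a different DNS server is needed.
if getent hosts registry.ollama.ai >/dev/null 2>&1; then
  dns_status="DNS OK"
else
  dns_status="DNS lookup failed"
fi
echo "$dns_status"
```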
Switch to a domestic mirror:
Aliyun: https://registry.ollama.ai
DeepSeek official mirror: https://ollama.deepseek.com
Zhejiang University mirror: https://ollama.zju.edu.cn
ModelScope: https://ollama.modelscope.cn
mkdir -p ~/.ollama
cat <<EOF > ~/.ollama/config.json
{
  "registry": {
    "mirrors": {
      "registry.ollama.ai": "https://ollama.deepseek.com"
    }
  }
}
EOF

sudo systemctl restart ollama
sudo systemctl status ollama
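A stray quote or comma in config.json is easy to introduce, and a malformed file may simply be ignored, so it is worth validating the JSON before restarting the service. A minimal sketch — it writes to a temporary path here instead of ~/.ollama/config.json:

```shell
# Validate the mirror config before installing it
cfg=$(mktemp)   # real target: ~/.ollama/config.json
cat > "$cfg" <<'EOF'
{
  "registry": {
    "mirrors": {
      "registry.ollama.ai": "https://ollama.deepseek.com"
    }
  }
}
EOF
if python3 -m json.tool "$cfg" >/dev/null 2>&1; then
  echo "config OK"
else
  echo "malformed JSON"
fi
```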
Reference: "Deploying Ollama models locally via domestic sources" (ollama国内镜像源, CSDN blog)
III. Running results
1. Server side: Ollama on Ubuntu
2. Platform side: Dify