Installing Ollama on FreeBSD and Trying the DeepSeek r1 Model

Published: 2025-02-10


Installing Ollama on FreeBSD

It can be installed directly with pkg:

sudo pkg install ollama

When the installation finishes, the following message is printed:

You installed ollama: the AI model runner.

To run ollama, please open 2 terminals.
1. In the first terminal, please run:
   $ OLLAMA_NUM_PARALLEL=1 OLLAMA_DEBUG=1 LLAMA_DEBUG=1 ollama start
2. In the second terminal, please run:
   $ ollama run mistral

This will download and run the AI model "mistral".
You will be able to interact with it in plain English.

Please see https://ollama.com/library for the list
of all supported models.

The command "ollama list" lists all models downloaded
into your system.

Running Ollama

Following the prompt, run the command below in bash (note that the default shell on FreeBSD may be csh, so you may need to switch to bash first):

OLLAMA_NUM_PARALLEL=1 OLLAMA_DEBUG=1 LLAMA_DEBUG=1 ollama start
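The `VAR=value command` prefix syntax above is Bourne-shell specific, which is why the text suggests switching to bash. If you prefer to stay in FreeBSD's default csh/tcsh, a sketch of a portable alternative using the standard env(1) utility:

```shell
# env(1) sets variables for a single command only and works the same
# from any shell, so this can be run from csh/tcsh without switching:
env OLLAMA_NUM_PARALLEL=1 OLLAMA_DEBUG=1 LLAMA_DEBUG=1 ollama start
```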

Then, in a second terminal, start a model, for example deepseek-r1:1.5b:

ollama run deepseek-r1:1.5b
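Besides the interactive prompt that `ollama run` gives you, the `ollama start` process also serves an HTTP API on port 11434 (the same port that matters for the firewall fix below). As a quick sanity check that the server is reachable, a sketch assuming the default port and an already-pulled model:

```shell
# Send one non-streaming prompt to the local Ollama server.
# Assumes `ollama start` is running and deepseek-r1:1.5b has been pulled.
curl http://127.0.0.1:11434/api/generate -d '{
  "model": "deepseek-r1:1.5b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

If this times out rather than returning JSON, the problem is almost certainly the firewall, as described in the debugging section.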

If this reports an error, check whether the firewall is blocking the connection; opening the ollama port fixes it.

 

Debugging

Ollama reports an error on startup

ollama run tinyllama
Error: Head "http://127.0.0.1:11434/": dial tcp 127.0.0.1:11434: i/o timeout

It turned out to be a pf firewall problem. The fix:

In /etc/pf.conf, add port 11434 to the allowed services:

tcp_services = "{ ftp, ssh, smtp, domain, www, pop3, auth, pop3s, 8000, 8080, 11434 }"
block in all
# pass out proto tcp to any port $tcp_services keep state
pass in proto tcp to any port $tcp_services keep state
pass in proto icmp from any to any

Then restart the service:

service pf restart
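Before restarting, the edited ruleset can be checked for syntax errors, and afterwards the loaded rules can be inspected to confirm port 11434 is now passed. A sketch using standard pfctl flags:

```shell
# Dry-run parse of /etc/pf.conf without loading it (-n = check only)
sudo pfctl -nf /etc/pf.conf
# After the restart, list the active rules and confirm 11434 appears
sudo pfctl -sr | grep 11434
```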

Done. Now, when ollama is started again, it downloads the model automatically.

 

