Warning: could not connect to a running Ollama instance

Ollama large-model error


root@DESKTOP-I3BHMQ9:/home/hhi# ollama --version
Warning: could not connect to a running Ollama instance
Warning: client version is 0.1.47

root@DESKTOP-I3BHMQ9:/home/hhi#


root@DESKTOP-I3BHMQ9:/home/hhi# ollama run llama3:8b
Error: could not connect to ollama app, is it running?

Solution:

Both messages mean the same thing: the Ollama server process is not running, so the client has nothing to connect to. Start the server by running: ollama serve
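
A minimal sketch of the full recovery flow, assuming a manual (non-systemd) setup like the WSL environment above. Backgrounding with nohup is only one option; keeping ollama serve open in its own terminal works just as well, and installs that registered the systemd service can use systemctl start ollama instead:

root@DESKTOP-I3BHMQ9:/home/hhi# nohup ollama serve > ollama.log 2>&1 &   # keep the server running in the background
root@DESKTOP-I3BHMQ9:/home/hhi# ollama --version                         # the connection warning disappears once the server is reachable
root@DESKTOP-I3BHMQ9:/home/hhi# ollama run llama3:8b                     # retry the model; it is pulled first if not already present locally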
