docker / genai-stack

Langchain + Docker + Neo4j + Ollama

Docker stuck at creating container

aturevich opened this issue · comments

Hi, on Windows 11 inside WSL 2, I'm running:
docker run -it --rm -v ./ollama_files:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
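Note that `-it` keeps the Ollama server attached to the terminal, so after startup the session just sits on the server log. A minimal alternative sketch (same flags as above, only detached) is to run the container in the background and tail its log separately:

```shell
# Start the same container detached instead of attached to the terminal
docker run -d --rm -v ./ollama_files:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Follow the server log from the host; Ctrl-C stops the log view, not the container
docker logs -f ollama
```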

Getting stuck at:

2024/06/28 11:29:42 routes.go:1064: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE: OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_MODELS:/root/.ollama/models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR: OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-06-28T11:29:42.101Z level=INFO source=images.go:730 msg="total blobs: 0"
time=2024-06-28T11:29:42.103Z level=INFO source=images.go:737 msg="total unused blobs removed: 0"
time=2024-06-28T11:29:42.103Z level=INFO source=routes.go:1111 msg="Listening on [::]:11434 (version 0.1.47)"
time=2024-06-28T11:29:42.104Z level=INFO source=payload.go:30 msg="extracting embedded files" dir=/tmp/ollama2712116269/runners
time=2024-06-28T11:29:44.796Z level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu_avx cpu_avx2 cuda_v11 rocm_v60101 cpu]"
time=2024-06-28T11:29:44.812Z level=INFO source=types.go:98 msg="inference compute" id=0 library=cpu compute="" driver=0.0 name="" total="15.4 GiB" available="14.1 GiB"
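The "Listening on [::]:11434" line suggests the server actually started and is waiting for requests rather than hanging; with `-it` it simply stays in the foreground. One way to check, from a second terminal on the host (assuming the `-p 11434:11434` mapping above), is to query the Ollama HTTP API:

```shell
# Returns a JSON list of locally available models if the server is up
curl http://localhost:11434/api/tags
```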

I tried different versions of Docker, with no success so far.
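Separately from Docker versions: on Windows 11, WSL 2 caps the memory visible to Docker (by default roughly half of the host's RAM). If memory turns out to be the limit, the cap can be raised via `%UserProfile%\.wslconfig` on the Windows side; a sketch, with the value as an example only:

```ini
# %UserProfile%\.wslconfig (Windows side); apply with: wsl --shutdown
[wsl2]
memory=12GB
```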

total="15.4 GiB" available="14.1 GiB"

It seems your system doesn't have enough memory; try using a smaller model in Ollama.
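With the container from the original command still running, a smaller model can be pulled and tried via `docker exec`; the model tag below is just one example of a small model from the Ollama library:

```shell
# Pull a small model inside the running "ollama" container, then chat with it interactively
docker exec -it ollama ollama pull phi3:mini
docker exec -it ollama ollama run phi3:mini
```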