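The script below is a start.sh for running Ollama on an Intel GPU through IPEX-LLM. A minimal invocation sketch, assuming the script is saved as /llm/scripts/start.sh inside the container and that DEVICE is set to one of the device types ipex-llm-init accepts (Arc is used here purely as an example):

DEVICE=Arc bash /llm/scripts/start.sh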
#!/bin/bash
cd /llm/scripts/

# Set the recommended IPEX-LLM environment for the GPU
echo "START.SH: START IPEX LLM INIT"
source ipex-llm-init --gpu --device "$DEVICE"
chmod +x start-ollama.sh

echo "START.SH: START OLLAMA"
# Initialize Ollama first
mkdir -p /llm/ollama
cd /llm/ollama
init-ollama

# Offload all model layers to the GPU
export OLLAMA_NUM_GPU=999
# Enable Level Zero SysMan so the runtime can query device memory
export ZES_ENABLE_SYSMAN=1
# Persist the SYCL kernel cache to avoid recompiling kernels on later runs
export SYCL_CACHE_PERSISTENT=1

# [optional] Under most circumstances the following environment variable improves
# performance, but in some cases it can also cause performance degradation
export SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS=1

# [optional] To run on a single GPU, limit device selection with the line below;
# this may improve performance
export ONEAPI_DEVICE_SELECTOR=level_zero:0

# Start the Ollama service (blocks until the service exits)
./ollama serve
echo "START.SH: OLLAMA STOPPED"
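Once ./ollama serve is up, the service can be checked from another shell. A minimal sketch, assuming the default Ollama listen address of 127.0.0.1:11434 and using llama3 purely as a placeholder model name:

# List the SYCL devices the runtime can see; the index shown is what
# ONEAPI_DEVICE_SELECTOR=level_zero:<index> refers to (requires oneAPI's sycl-ls)
sycl-ls

# Confirm the Ollama HTTP API is responding (default port 11434)
curl http://127.0.0.1:11434/api/tags

# Pull and run a model against the running service ("llama3" is a placeholder)
cd /llm/ollama
./ollama pull llama3
./ollama run llama3 "Hello"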