Global AI and Data Science


Server crashes after starting vLLM and Ollama [Power9]

  • 1.  Server crashes after starting vLLM and Ollama [Power9]

    Posted Fri March 14, 2025 09:36 AM

    Architecture:        ppc64le
    Model name:          POWER9
    OS: AlmaLinux 8.10
    ---------------
    I have 4x Tesla V100-SXM2-32GB GPUs
    NVIDIA-SMI 550.54.15         
    Driver Version: 550.54.15   
    CUDA Version: 12.4
    Cuda compilation tools, release 12.4, V12.4.131
    ------------------
    ---------- vLLM try [https://community.ibm.com/community/user/powerdeveloper/blogs/manjunath-kumatagi/2024/06/27/run-vllm-on-ppc64le-architecture]
    I installed vLLM via Docker. It runs well on CPU.
    When I try to run it on GPU, the whole server crashes.
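    For reference, a minimal sketch of how the container launch might look so the crash output is preserved. The image name below is a placeholder, not the actual image from the blog; substitute whatever you built there:

    ```shell
    # Hypothetical launch sketch: expose the GPUs and tee the container
    # output to a file, so the last messages before the crash survive.
    # "vllm-ppc64le:latest" and the model are placeholders.
    docker run --rm --gpus all \
        -p 8000:8000 \
        vllm-ppc64le:latest \
        --model facebook/opt-125m \
        2>&1 | tee vllm-gpu.log
    ```

    This assumes the NVIDIA Container Toolkit is installed so `--gpus all` works; if the whole machine reboots, the tail of `vllm-gpu.log` is the first thing worth posting.
    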


    ----------- Ollama try

    Same story with Ollama.
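    If it helps others reproduce this, a sketch of how the Ollama logs could be captured (assuming Ollama is installed natively; the service unit name may differ on your setup):

    ```shell
    # If Ollama runs as a systemd service, pull the log from the boot
    # that crashed (-b -1 selects the previous boot):
    journalctl -u ollama -b -1 --no-pager > ollama-crash.log

    # Or run the server in the foreground with verbose logging enabled:
    OLLAMA_DEBUG=1 ollama serve 2>&1 | tee ollama-debug.log
    ```
    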

    Does anyone know why this happens?



    ------------------------------
    Bekbolat Kurmetbek
    ------------------------------


  • 2.  RE: Server crashes after starting vLLM and Ollama [Power9]

    Posted Mon March 17, 2025 10:41 PM

    "It crashes" is not a very detailed problem report. Can you post some error messages?
    Furthermore, there is a long discussion on the blog you show a link to. Perhaps you can join that discussion.
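    A sketch of what would be worth collecting after the next crash, assuming you can get back into the machine (a full server crash on V100 hardware often leaves NVIDIA Xid errors in the kernel log):

    ```shell
    # Current GPU state: ECC errors, power caps, temperatures.
    nvidia-smi -q > gpu-state.txt

    # Kernel messages -- look for "NVRM: Xid" lines and OOM kills:
    dmesg --ctime | tail -n 200 > dmesg-tail.txt

    # If the box rebooted, errors from the previous boot (requires
    # persistent journald storage):
    journalctl -b -1 -p err --no-pager > lastboot-errors.txt
    ```

    Posting those three files alongside the exact command you ran would make the report much easier to act on.
    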



    ------------------------------
    Jack Woehr
    Senior Consultant
    Seiden Group LLC
    Beulah CO
    3038478442
    ------------------------------