Turbonomic

 View Only
  • 1.  IBM Turbonomic for GPU Optimization

    IBM Champion
    Posted Wed July 10, 2024 04:35 AM
    Hi Team,

I have just come across a few articles and would like more insight on the matter, so I am turning to the most reliable source of truth.

Is IBM Turbonomic capable of rightsizing Intel or NVIDIA GPUs for workloads such as inferencing, fine-tuning, and prompting? In addition, how could this capability support billing (GPU type, usage duration, data processed), metering (active GPU usage time), and payment processing (produced from the metered GPU usage)?
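    To make the billing/metering idea above concrete, here is a minimal illustrative sketch. This is not a Turbonomic API or billing model; the rate card, record shape, and values are all assumptions, showing only how a bill could be derived from metered active GPU time per GPU type:

    ```python
    # Illustrative only: hypothetical rate card and usage records.
    # Billing is derived from metering: (GPU type, active hours) -> cost.

    # Assumed hourly rates per GPU type (made-up values)
    HOURLY_RATE = {"A100": 3.00, "L4": 0.80, "T4": 0.35}

    def bill(usage_records):
        """Sum cost over metered usage records of (gpu_type, active_hours)."""
        total = 0.0
        for gpu_type, active_hours in usage_records:
            total += HOURLY_RATE[gpu_type] * active_hours
        return round(total, 2)

    # Metered usage: active GPU time only, as the metering step would report it
    records = [("A100", 2.5), ("T4", 10.0)]
    print(bill(records))  # 2.5*3.00 + 10.0*0.35 = 11.0
    ```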

    Any reference material or demo would be highly appreciated.

    Kind regards


    Sandeep Koul

    Lead Solution Architect – Emerging Offerings

    Distributed and Hybrid Cloud

    Wipro Full Stride Cloud Service

    Wipro Limited

    T: +31 651549099 || Connect with me on LinkedIn

     

    Being Respectful | Being Responsive | Always Communicating | Demonstrating Stewardship | Building Trust

     

     




  • 2.  RE: IBM Turbonomic for GPU Optimization

    Posted Mon July 15, 2024 03:42 PM

    Hello Sandeep, what type of environment are you referring to? Turbonomic does support GPU optimization on-prem, in the cloud, and in containers. Specific to inferencing, we support that in Kubernetes and Red Hat OpenShift environments. See these blogs: https://community.ibm.com/community/user/aiops/blogs/murtuza-mukadam/2024/05/22/scale-gen-ai-workloads-in-kubernetes
    and https://community.ibm.com/community/user/aiops/blogs/paul-carley/2024/06/27/turbonomic-support-for-gpu-optimization-on-contain

    Any other questions, let me know. Thanks!



    ------------------------------
    Jason Shaw
    Product Manager | Turbonomic
    IBM Software | IT Automation
    ------------------------------




  • 4.  RE: IBM Turbonomic for GPU Optimization

    Posted Tue July 16, 2024 10:00 AM

    Hello Sandeep!

    Adding to Jason's message, we support NVIDIA only, and we look at historical GPU usage via DCGM stats to determine previous and expected future utilization. We do not specifically look at usage patterns with respect to training or inferencing; we just focus on the utilization of the GPU cores.
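    To make the DCGM point concrete: NVIDIA's dcgm-exporter publishes per-GPU utilization as the Prometheus metric `DCGM_FI_DEV_GPU_UTIL`, which is the kind of signal a rightsizing engine can trend over time. A minimal sketch of reading that metric from a Prometheus text exposition (the sample payload and parsing code are mine for illustration, not Turbonomic code):

    ```python
    import re

    # Sample payload hand-written for illustration; real output comes from
    # the dcgm-exporter /metrics endpoint.
    SAMPLE = """\
    # HELP DCGM_FI_DEV_GPU_UTIL GPU utilization (in %).
    # TYPE DCGM_FI_DEV_GPU_UTIL gauge
    DCGM_FI_DEV_GPU_UTIL{gpu="0",UUID="GPU-aaaa"} 87
    DCGM_FI_DEV_GPU_UTIL{gpu="1",UUID="GPU-bbbb"} 12
    """

    def gpu_utilization(metrics_text):
        """Return {gpu_index: utilization_percent} for each GPU reported."""
        pattern = re.compile(
            r'DCGM_FI_DEV_GPU_UTIL\{[^}]*gpu="(\d+)"[^}]*\}\s+([\d.]+)')
        return {int(g): float(v) for g, v in pattern.findall(metrics_text)}

    print(gpu_utilization(SAMPLE))  # {0: 87.0, 1: 12.0}
    ```

    Sampling this gauge periodically and keeping the history is enough to compute the "previous and expected future utilization" described above.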



    ------------------------------
    Juan Angel Muñoz
    ------------------------------



  • 5.  RE: IBM Turbonomic for GPU Optimization

    IBM Champion
    Posted Tue July 16, 2024 10:13 AM
    Hi Juan,
    Thanks for adding more inputs.
    Is there any clarity on plans to add other GPU manufacturers to Turbo's capabilities?

    E.g.: Intel, AMD, Qualcomm, Apple, ASUS, EVGA, etc.

    Kind regards

    Sandeep Koul






  • 6.  RE: IBM Turbonomic for GPU Optimization

    Posted Wed July 17, 2024 08:35 AM