Global AI & Data Science

Train, tune and distribute models with generative AI and machine learning capabilities


Using GPU Acceleration for Deep Learning (Keras) on our IBM-Macs, feasible, practical?


    Posted Fri October 07, 2022 05:34 AM
    Hi team

    Configuring an ML environment to use GPU acceleration on machines with Nvidia cards is pretty straightforward; however, for our Macs with AMD cards it doesn't seem to be so easy. I found PlaidML (https://github.com/plaidml/plaidml) and followed the installation instructions, but I'm getting errors when using Keras.
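    For reference, a minimal sketch of how PlaidML is typically wired in as the Keras backend, assuming `plaidml-keras` is installed and `plaidml-setup` has already been run once to select the AMD GPU. A common source of errors is importing Keras before the backend environment variable is set:

    ```python
    import os

    # This must run BEFORE keras is imported anywhere in the process;
    # otherwise Keras falls back to its default (TensorFlow) backend.
    os.environ["KERAS_BACKEND"] = "plaidml.keras.backend"

    # import keras  # subsequent Keras calls would now run on PlaidML / the AMD GPU
    ```

    Note this applies to the standalone `keras` package, not `tensorflow.keras`, which ignores the `KERAS_BACKEND` variable.
    
    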

    I was wondering whether it is worth investing more time trying to fix the environment (I use PyCharm), or whether the performance gains we would get are not worth the effort. Any feedback will be most welcome.

    ------------------------------
    Daniel Lopez Sainz
    ------------------------------



    #AIandDSSkills
    #DataandAILearning