Machine Learning with eGPU

Discussion in 'Linux Virtual Machine' started by JacobK7, Nov 5, 2020.

  1. JacobK7

    JacobK7 Bit poster

    Messages:
    1
    I've read that Parallels Desktop supports eGPUs now. Does anyone know if this is an option for machine learning development? I'd like to do TensorFlow development on my MacBook Pro 16" in a Linux virtual machine that's making use of an Nvidia card connected as an eGPU.

    Is this possible?

    Thanks!
     
  2. Ajith1

    Ajith1 Parallels Support

    Messages:
    2,719
    Parallels virtual machines cannot directly access an eGPU, so it is not possible to use one inside a virtual machine.
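
    You can confirm this from inside the Linux guest. Below is a minimal, hedged sketch (the helper name `nvidia_devices` is ours, not a Parallels or TensorFlow API) that filters `lspci` output for NVIDIA PCI devices; in a Parallels VM it is expected to report none, since the eGPU is not passed through to the guest.

    ```python
    import shutil
    import subprocess

    def nvidia_devices(lspci_output: str) -> list:
        """Return the lspci lines that mention an NVIDIA device."""
        return [line for line in lspci_output.splitlines()
                if "nvidia" in line.lower()]

    # Only attempt the live check when lspci is actually available.
    if __name__ == "__main__" and shutil.which("lspci"):
        out = subprocess.run(["lspci"], capture_output=True, text=True).stdout
        devices = nvidia_devices(out)
        print(devices if devices else "No NVIDIA device visible in this VM")
    ```

    If this prints no NVIDIA device, TensorFlow inside the VM will likewise fall back to CPU-only execution, regardless of the eGPU attached to the host.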
     
