Use iPad M1 processor as GPU

Hello,

I’m currently working on TinyML (ML on the edge) using the Google Colab platform. Having exhausted my free compute units, I’m being prompted to pay. I’ve been considering leveraging the GPU capabilities of my M1 iPad and my Intel-based Mac. Both devices have Thunderbolt ports capable of connections up to 40 Gb/s. Since I’m primarily using a classification model, extensive GPU usage isn’t necessary.

I’m looking for assistance or guidance on utilizing the iPad’s processor as an eGPU on my Mac, possibly through an API or Apple technology. Any help would be greatly appreciated!

Answered by DTS Engineer in 796971022

Hello @HemantATindia,

Please file an enhancement request for this capability using Feedback Assistant.

Best regards,

Greg

Hi Hemant, to the best of my knowledge there is no officially supported method for using an iPad as an eGPU for a macOS system. If you are interested in running ML workloads directly on your iPad, consider using TensorFlow Lite (LiteRT): https://ai.google.dev/edge/litert, or ExecuTorch: https://github.com/pytorch/executorch.
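
If you go the TensorFlow Lite / LiteRT route, inference on a classification model takes only a few lines. Below is a minimal sketch using the Python interpreter API; the model file name and the dummy input are placeholders, not anything specific to your project.

```python
# Minimal sketch: run an exported .tflite classification model with the
# TensorFlow Lite interpreter (the standalone LiteRT runtime exposes an
# equivalent Interpreter API). "classifier.tflite" is a placeholder path.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="classifier.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Build a dummy input matching the model's expected shape and dtype;
# replace this with your preprocessed image/feature tensor.
shape = input_details[0]["shape"]
dtype = input_details[0]["dtype"]
dummy_input = np.zeros(shape, dtype=dtype)

interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()

scores = interpreter.get_tensor(output_details[0]["index"])
print("Predicted class index:", int(np.argmax(scores)))
```

On the iPad itself you would typically run the same .tflite model through the TensorFlow Lite / LiteRT Swift or C++ runtime (or convert it to Core ML), but the workflow is the same: load the model, allocate tensors, set the input, invoke, and read the output.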

I've created a feedback/enhancement request for this, albeit worded a little less specifically than this question. I'd love to see the ability to repurpose M-series hardware to facilitate wider concurrency of AI/ML workloads; a sensible implementation would simply allow some of the concurrent work to be parceled off onto the adjacent, isolated Thunderbolt-attached hardware for separate processing.