
Nvidia GPU Acceleration For Mac!

4 min read · Mar 9, 2025

Impossible, right? After all, Nvidia GPUs don’t work with Macs. Well, actually, it is possible using the latest innovation from Juice Labs. In this blog, I’ll show you step-by-step how to run your CUDA-based app on Mac, accelerated by a remote Nvidia GPU, all without any code changes!

Apple GPU, Nvidia GPU, and CUDA

So why would you want to run CUDA apps on Mac? The latest Macs have decent GPUs. In fact, many popular AI frameworks (PyTorch, Hugging Face’s Transformers library, and vLLM, to name a few) have started to support native Mac GPU acceleration, bypassing CUDA and instead leveraging Apple’s Metal or MLX software libraries.
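For instance, if you already have a recent PyTorch build installed on your Mac (an assumption, not something this walkthrough requires), you can check for the Metal (MPS) backend right from the terminal:

    # Ask PyTorch whether the Mac's GPU is visible via the Metal (MPS) backend
    python3 -c "import torch; print(torch.backends.mps.is_available())"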

All that said, the best AI trains and runs in the cloud. Apple has little presence in the data center, where Nvidia maintains its dominance. Nvidia is light years ahead of its competitors (Intel, AMD, Google, Microsoft, Meta, Amazon, Cerebras, Groq, Tenstorrent, etc.) not only because of its decade-long head start in hardware innovation, but also because its CUDA software stack powers the lion’s share of AI applications that train and serve models.

So how do you connect the dots from CUDA apps running on Mac to Nvidia GPUs in the data center? JuiceLabs has introduced a unique GPU-over-IP solution that’s truly friction-free: it does not require re-architecting CUDA applications or changing a single line of code. I’ll take you through how to use these tools yourself on your Mac. Alright, let’s get started!

Get Started With JuiceLabs

You will need the following:

  • A Mac (obviously!)
  • Familiarity with running CLI commands in Mac’s terminal console.
  • Ideally, some experience running CUDA apps on Linux or Windows machines with an attached Nvidia GPU.

Next, go to the JuiceLabs website and sign up for a free trial account. In the final step, you should download the Linux installation tarball. The file is called juice-gpu-linux.tar.gz. Don’t do anything with that file yet; we’ll deal with it later.

Set Up Your Mac

You should probably update your Mac to the latest operating system. I tested with Sequoia 15.3.1 (I have an M2 MacBook Pro). You likely have brew installed already, but if not, follow these instructions.
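For reference, Homebrew’s official installer is a single command (this is the standard invocation at the time of writing):

    # Install Homebrew using the official installer script
    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"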

You might have been wondering why I had you download the Linux installer. As some of you know, Linux binaries are usually built for x86 host processors (Intel and AMD), but recent Macs use ARM processors. In the next step, I’ll take you through installing x86 virtualization software called colima, which lets you run those Linux apps on your Mac.

Open the terminal app on your Mac and run the commands shown in the gist below:
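In rough outline, the gist’s commands look like the sketch below. The CPU, memory, and disk sizes are just illustrative; use the exact values from the gist.

    # Install colima via Homebrew
    brew install colima

    # Create and start an x86_64 Linux VM named "juice"
    # (sizing flags below are illustrative; take the real values from the gist)
    colima start juice --arch x86_64 --cpu 4 --memory 8 --disk 60

    # Confirm the VM is up and running
    colima list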

Those steps take you through first installing colima, then creating an Ubuntu Linux virtual machine (named juice), and finally ensuring the virtual machine is running. Note that the gist above does not show the output of every command, but you should make sure each one runs successfully.

Set Up Your Virtual Machine

The next commands take you through logging in to the virtual machine you just created using ssh, installing a few Linux system packages, downloading the latest Anaconda Python distribution, and finally installing Python 3.10, CUDA 12.1.1, and PyTorch 2.6.0.
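In outline, that session looks something like the sketch below. The Anaconda installer filename and the package commands are assumptions on my part; follow the gist for the authoritative versions (including the CUDA 12.1.1 pieces).

    # Log in to the "juice" VM over ssh
    colima ssh --profile juice

    # Inside the VM: install basic build tools and utilities
    sudo apt-get update && sudo apt-get install -y build-essential wget

    # Download and install Anaconda
    # (replace <release> with the current installer name from repo.anaconda.com/archive)
    wget https://repo.anaconda.com/archive/Anaconda3-<release>-Linux-x86_64.sh
    bash Anaconda3-<release>-Linux-x86_64.sh -b
    source ~/anaconda3/bin/activate

    # Create a Python 3.10 environment and install PyTorch 2.6.0
    conda create -y -n juice python=3.10
    conda activate juice
    pip install torch==2.6.0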

As before, I don’t show the output of each command in the gist above, so you need to make sure each command completes successfully.

PyTorch -> CUDA -> Nvidia GPU On Your Mac

Now your system is ready to test a simple PyTorch script that leverages CUDA to communicate with an Nvidia GPU. The following gist shows a complete session. When you try it yourself, you will need to adjust a few of the command arguments; I’ve added a relevant comment near each such command:
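The PyTorch side of that session boils down to something like the sketch below. The Juice client steps (extracting juice-gpu-linux.tar.gz, authenticating with your trial account, and launching python through the client) should follow the gist and the trial instructions; the tensor sizes here are just illustrative.

    # Extract the Juice client you downloaded earlier
    tar -xzf juice-gpu-linux.tar.gz

    # Launch python through the Juice client as shown in the gist, then verify
    # that CUDA sees the remote Nvidia GPU:
    python -c "import torch; print(torch.cuda.is_available()); print(torch.cuda.get_device_name(0))"

    # Run a small matrix multiply on the remote GPU (sizes are illustrative)
    python -c "import torch; x = torch.rand(2048, 2048, device='cuda'); print((x @ x).sum().item())"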

Next Steps With DeepSeek Models

Admittedly, running a couple of lines of PyTorch is not the same as running a real, fully featured CUDA app. In this blog, I just wanted to get your Mac set up properly and familiarize you with JuiceLabs’ utilities. In the next blog, we’ll do something a bit more advanced: we’ll run a CUDA app that loads and runs a DeepSeek model (for a sneak peek, check out the YouTube video below). I’ll also show you some benchmarks I’ve run that compare “juiced” models vs running them natively on my Mac.

If you liked this blog, please hit the like button. And don’t forget to subscribe to my Medium feed to get a notification when the next blog is out!

Disclaimer: I am not currently affiliated with Apple, Nvidia, or JuiceLabs. I’m just a tech evangelist working at the intersection of hardware and artificial intelligence.

Written by George Williams

AI Consultant/Evangelist. Previous: Head of AI at Smile Identity. Dir of DS at GSIT, CDS at Sophos/Capsule 8, Senior DS at Apple, Motiv, Researcher at NYU.
