Apple Just Let Nvidia Back on the Mac. Sort Of.

5 min read 1 source clear_take
├── "This driver finally solves the CUDA gap that has kept Macs out of serious ML and GPU computing workflows"
│  └── top10.dev editorial (top10.dev) → read below

The editorial argues that CUDA dominance in ML training and inference has made Macs unusable for serious machine learning work. With Nvidia eGPU support via a notarized driver, Mac users can finally access the CUDA ecosystem without abandoning their platform, changing the fundamental equation for ML engineers on macOS.

├── "Apple's explicit approval signals a meaningful thaw in the Apple-Nvidia cold war after nearly a decade of hostility"
│  ├── naves (Hacker News, 222 pts) → read

The submission highlights Apple's notarization of the driver as the key detail — this isn't a hack running with SIP disabled, but a signed driver that Apple reviewed and approved. Given the history of Apple refusing to sign Nvidia's Mojave web drivers in 2018 and dropping all eGPU support with Apple Silicon, this represents a reversal of longstanding policy.

│  └── The Verge (The Verge) → read

The Verge's report emphasizes that this is the first time Nvidia GPUs have officially worked with any Mac since Apple dropped support in macOS Mojave in 2018. The notarization process means Apple explicitly reviewed and permitted the driver's distribution, marking a notable shift in the relationship.

└── "The practical impact extends well beyond ML to 3D rendering, video production, and scientific computing"
  └── top10.dev editorial (top10.dev) → read below

The editorial argues that while CUDA for ML gets the headlines, Nvidia's RTX hardware ray tracing and GPU acceleration are equally critical for 3D rendering, video production, and scientific computing. These professional workflows have been locked out of the Mac ecosystem since 2018, and this driver reopens them.

What happened

Apple has approved a driver that enables Nvidia external GPUs to function with Arm-based Macs. For anyone who's tracked the Apple-Nvidia cold war over the past decade, this sentence alone is newsworthy. Nvidia GPUs haven't officially worked with any Mac since Apple dropped Nvidia support in macOS Mojave (2018), and Apple Silicon Macs launched in 2020 with zero eGPU support of any kind — even for AMD cards that worked on Intel Macs.

The driver appears to have passed Apple's notarization process, meaning Apple has explicitly reviewed and approved its distribution. This isn't a hacky kernel extension running with SIP disabled — it's a signed, notarized driver that plays by Apple's security rules. The Verge's report sparked a Hacker News thread (222+ points), with the developer community reacting with a mix of disbelief and cautious optimism.

To understand why this matters, you need to understand the timeline. Apple included Nvidia GPUs in Macs until around 2014. A series of GPU failures in MacBook Pros (2008-2011) soured the relationship. By 2016, Apple was all-AMD for discrete GPUs. In 2018, Nvidia released macOS Mojave-compatible web drivers that Apple never signed, effectively killing Nvidia on Mac. When Apple Silicon arrived, eGPU support was dropped entirely — even for the AMD cards that had worked via Thunderbolt 3 on Intel Macs.

Why it matters

The CUDA problem is real. For machine learning engineers, Nvidia isn't optional — it's the platform. CUDA dominates ML training and inference. PyTorch and TensorFlow's GPU acceleration is Nvidia-first, everything-else-second. Apple's Metal and the Neural Engine are technically capable, but framework support remains thin. If you're training models, you need CUDA. If you need CUDA, you can't use a Mac. This driver changes that equation, even if imperfectly.
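In practice, the "changed equation" shows up as one extra branch in device selection. A minimal sketch, assuming PyTorch is installed: prefer CUDA (which an Nvidia eGPU would expose), fall back to Apple's MPS backend, then CPU. The `pick_device` helper name is illustrative, not from any library.

```python
import torch

def pick_device() -> torch.device:
    """Prefer CUDA (e.g. an Nvidia eGPU), then Apple's MPS backend, then CPU."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
x = torch.randn(4, 4, device=device)
print(f"running on {device}: {(x @ x).shape}")
```

On an Apple Silicon Mac today this resolves to `mps`; the point of the driver is that the first branch could now fire on a Mac at all.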

The same applies to 3D rendering, video production, and scientific computing. Nvidia's RTX cards with hardware ray tracing, NVENC encoding, and CUDA compute are the industry standard for these workloads. Professionals who prefer macOS for their daily workflow have been forced to maintain separate Windows or Linux machines (or cloud instances) for GPU-heavy work. An eGPU with a capable Nvidia card could consolidate that workflow onto a single machine.

But let's be precise about what this is and isn't. This is a third-party driver that Apple chose not to block. It's not Apple shipping Nvidia support in macOS. It's not Apple and Nvidia announcing a partnership. The distinction matters because Apple controls the entire stack — they could revoke notarization, change driver APIs, or simply not test against this driver in future macOS updates. Anyone building a workflow around this should understand they're building on a foundation that Apple tolerates, not one Apple guarantees.

The Thunderbolt bandwidth constraint is also significant. Thunderbolt 4 (standard on current Macs) provides 40 Gbps, which creates a bottleneck for high-end GPUs. An RTX 4090 over Thunderbolt will not perform like an RTX 4090 in a native PCIe 4.0 x16 slot. For ML training on large models, this bandwidth limitation means you're leaving substantial performance on the table. For inference, lighter rendering, and CUDA development, the penalty is more manageable.
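The size of that bottleneck is easy to put a rough number on. A back-of-envelope comparison using nominal link rates (ignoring encoding and protocol overhead, and the fact that Thunderbolt bandwidth is shared with other traffic):

```python
# Nominal link rates, in Gbit/s. Real-world throughput is lower for both.
TB4_GBPS = 40                 # Thunderbolt 4: 40 Gbit/s total
PCIE4_X16_GBPS = 16 * 16      # PCIe 4.0: ~16 Gbit/s per lane x 16 lanes = 256 Gbit/s

ratio = PCIE4_X16_GBPS / TB4_GBPS
print(f"A native PCIe 4.0 x16 slot has roughly {ratio:.1f}x "
      f"the link bandwidth of Thunderbolt 4")
```

Roughly a 6x gap in host-to-GPU bandwidth, which is why workloads that constantly shuttle data (large-model training) suffer far more than workloads that load once and compute (inference, CUDA development).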

The deeper signal

The interesting question isn't whether this driver works well enough. It's why Apple approved it at all.

One reading: Apple is quietly acknowledging that Apple Silicon's GPU capabilities, while impressive for integrated graphics, aren't sufficient for the professional GPU compute market. The M-series chips have gotten progressively more powerful — the M4 Ultra reportedly offers GPU performance competitive with discrete cards — but they can't match a dedicated Nvidia card with 24GB+ of VRAM for ML workloads. By allowing a third-party Nvidia driver, Apple can serve the pro market without officially blessing a competitor's silicon or admitting a gap in their own lineup.

Another reading: Apple's notarization approval is procedural, not strategic. The driver met the technical and security requirements, so it was approved. Reading corporate intent into a bureaucratic process is a classic tech-press mistake.

The truth is probably somewhere in between. Apple is meticulous about what runs on its platform. Notarization isn't rubber-stamped. But approving a third-party driver also gives Apple plausible deniability — they didn't build Nvidia support, they just didn't block someone else from doing it.

The Hacker News community reaction reflects this ambiguity. Some see it as a thaw in the Apple-Nvidia relationship and the beginning of real GPU flexibility on Mac. Others point out that eGPU over Thunderbolt is a compromised experience, and that Apple could pull the rug at any time. The pragmatists are already asking the right question: does it work well enough for my specific use case?

What this means for your stack

If you're a Mac-based ML engineer who's been running CUDA workloads on cloud instances or a separate Linux box, this is worth evaluating — but don't burn your bridges. Start with inference and development workflows where Thunderbolt bandwidth is less of a constraint. Training large models will still be bottlenecked.

If you're making hardware purchasing decisions, the calculus just shifted: a MacBook Pro plus an Nvidia eGPU enclosure might be a viable single-machine setup for mixed workflows. The cost of a Thunderbolt eGPU enclosure ($200-400) plus an Nvidia card is non-trivial, but it's cheaper than maintaining two machines. Test your specific workload before committing.

For teams managing developer environments, this adds a new configuration to support. Driver updates, macOS compatibility, and Thunderbolt hot-plug behavior will all need testing. Don't standardize on this setup for your team until it's survived at least one macOS major version update without breaking.

If you're building GPU-accelerated applications, you now need to consider a new deployment target: Nvidia-on-Mac via eGPU. CUDA code should work, but driver quirks, power management, and hot-plug scenarios may introduce edge cases you haven't seen on Linux or Windows.

Looking ahead

The real test comes with the next macOS release. If Apple continues to approve this driver — and especially if they make any accommodations for external GPU support at the OS level — it signals a genuine shift. If the driver breaks with the next major macOS version and Apple shrugs, we'll know this was a brief window, not a policy change. For now, it's the most significant crack in the Apple-Nvidia wall since 2018. Treat it accordingly: interesting enough to prototype against, too fragile to bet your infrastructure on.

Hacker News 457 pts 207 comments

Apple approves driver that lets Nvidia eGPUs work with Arm Macs

→ read on Hacker News
