How Fast a GPU for AI Denoise and Focus Stacking? Distributed GPU Support for AI Denoise?

re: Apple 2023 MacBook Pro M3 Max: my Perspective on Apple’s Powerful New Laptops

Apple 2023 MacBook Pro M3 Max

MPG tested both the 14-inch and 16-inch models of the Apple 2023 MacBook Pro M3 Max 128GB / 4TB, model Z1AU002AK and model Z1AF001AN. See all Apple MacBook Pro M3 Max.

The world’s best review of the 2023 MacBook Pro M3 Max is here:

REVIEWED: 16-inch and 14-inch 2023 MacBook Pro M3 Max

See also my video walkthrough of all the test results.

How fast a GPU?

The fastest GPU I have is the 60-core in my 2023 Mac Pro M2 Ultra. There is a 76-core variant which I now rather wish I had, but another $1000 for ~26% faster felt a bit steep when I purchased it, and I could not foresee just how intensively I would be using the GPU.

I use Adobe Camera Raw AI Denoise + Enhance Details with every RAW file I process. At 21 seconds each, it takes minutes for even a small focus stack or aperture series (eg a 10-frame stack is already 3.5 minutes). On the M3 Max it takes 36 seconds or so for each file, or more. Worse, I am locked out of using PS while it does its thing (Adobe tells me this is because things become unstable if AI processing runs while regular PS operations are conducted).

An M3 Ultra (dual M3 Max) would mean 80 GPU cores, turning in a time of roughly 36.5/2, ie ~18 seconds each, and maybe as little as 10-15 seconds since there would be no thermal throttling in a properly-cooled Mac Studio or Mac Pro. And 10-15 seconds is a big improvement over 21 seconds. OTOH, I cannot easily afford an M3 Ultra even if one existed right now.

I’d like to see something approaching 5 seconds each. That implies ~240 GPU cores, and that’s assuming perfect scalability (dubious). But we are not likely to see 120 GPU cores soon, let alone 240, or see more than incremental speed improvements per GPU core. Therefore...
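
(Sanity-checking that arithmetic before moving on: a minimal Python sketch, assuming per-file time scales perfectly with GPU core count, which as noted above is dubious.)

# Rough scaling estimate: per-file AI Denoise time vs GPU core count,
# assuming perfect scaling (optimistic). Baseline is the 60-core M2 Ultra.
baseline_cores, baseline_seconds = 60, 21.0

def per_file_seconds(cores: int) -> float:
    return baseline_seconds * baseline_cores / cores

for cores in (60, 76, 80, 120, 240):
    print(f"{cores:3d} cores -> ~{per_file_seconds(cores):.1f} seconds per file")
# 240 cores -> ~5.3 seconds, hence ~240 cores for the hoped-for ~5 seconds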

Apple 2023 MacBook Pro M3 Max

Request to Adobe: reintroduce distributed GPU computing

Why can’t I have a 2nd or 3rd or 4th machine to share the GPU computing load for batch operations like AI Denoise?

There were and are “render farm” software solutions that allow networked computers to share in the computing tasks. So long as the computing jobs take a lot longer than the time to transfer the data, this works great.

Adobe states that more than one GPU cannot be used on Apple M1/M2/M3 Macs, but that is not at issue here. In terms of engineering, it is trivially easy: transmit a RAW file to another machine, then transfer the finished result back, all over 10 gigabit ethernet, eg ~1200 MB/sec of bandwidth (even 1 gigabit would be acceptable, since I/O can be done concurrently with processing). It doesn’t get much simpler than a basic client/server protocol. It could be done over https or ssh, if only Adobe would provide a command line. The only logic needed is distributing the load, and that is not hard.
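
To make that concrete, here is a minimal sketch of such a client/server handoff in Python. This is not Adobe’s API; the denoise_cli command the worker shells out to is hypothetical (no such Adobe command line exists today), and the rest is plain sockets:

# Worker/client sketch: ship a RAW file to another machine, process it there,
# and return the finished result. Length-prefixed blobs over a TCP socket.
import socket, struct, subprocess, tempfile, pathlib

PORT = 5151  # arbitrary port for the worker service

def recv_exact(sock, n):
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed connection")
        buf += chunk
    return buf

def send_blob(sock, data):
    # 8-byte big-endian length prefix, then the bytes
    sock.sendall(struct.pack("!Q", len(data)) + data)

def recv_blob(sock):
    (size,) = struct.unpack("!Q", recv_exact(sock, 8))
    return recv_exact(sock, size)

def serve_forever():
    """Worker: receive a RAW file, denoise it locally, send the result back."""
    with socket.create_server(("", PORT)) as srv:
        while True:
            conn, _addr = srv.accept()
            with conn:
                raw = recv_blob(conn)
                with tempfile.TemporaryDirectory() as tmp:
                    src = pathlib.Path(tmp, "input.raw")
                    dst = pathlib.Path(tmp, "output.dng")
                    src.write_bytes(raw)
                    # Hypothetical command line; Adobe offers nothing like this yet.
                    subprocess.run(["denoise_cli", str(src), "-o", str(dst)], check=True)
                    send_blob(conn, dst.read_bytes())

def process_remotely(host, raw_path):
    """Client: ship one RAW file to a worker machine, get the finished file back."""
    with socket.create_connection((host, PORT)) as conn:
        send_blob(conn, pathlib.Path(raw_path).read_bytes())
        return recv_blob(conn)

A coordinator that hands files to several such workers is the only remaining piece, and as noted, that load-distribution logic is the easy part.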

There are two distributed approaches here, but only one is viable for most of us:

An even simpler option would be for Adobe to provide a command line that could be invoked from as many machines on the LAN as desired, each doing its own batch. But that approach takes my time and attention to script, so I deem it less useful. And to my knowledge, Adobe offers no such command line.

Distributed computing on local LAN using 10 gigabit ethernet

I have a 2021 MacBook Pro M1 Max just sitting there unused while at home (it’s my travel machine). While it’s only 40% of the speed of my Mac Pro M2 Ultra for AI Denoise, it could still process about 1/3 of the files in the time it takes the Mac Pro to do the other 2/3, for a time savings of about 30%, eg ~15 seconds per file instead of 21. And if I had a 76-core Mac Studio also on tap, the time could be cut by more than half. Now we’re talking!
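
As a sanity check, here is the arithmetic behind that split, a quick Python sketch using the figures above (actual times vary with file size and thermals):

# Split a batch between the Mac Pro M2 Ultra (21 s/file) and the M1 Max
# (~40% of its speed), in proportion to speed.
m2_ultra_rate = 1 / 21.0              # files per second
m1_max_rate = 0.40 * m2_ultra_rate

combined_rate = m2_ultra_rate + m1_max_rate
m1_max_share = m1_max_rate / combined_rate   # ~29%, ie roughly 1/3 of the files
effective_seconds = 1 / combined_rate        # ~15 s per file vs 21 s alone

print(f"M1 Max share of the batch: {m1_max_share:.0%}")
print(f"Effective time per file: {effective_seconds:.0f} seconds")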

Cloud-based GPU processing

This is highly appealing in that massive computing power could be applied, but...

Nice idea, but uploading a 120MB RAW file and downloading a ~400MB result requires an awfully fast internet connection; otherwise the data transfer takes longer than just doing it locally (eg at 20 MB/sec, the 400MB download alone would take 20 seconds, and my internet is less than half that speed). A non-starter, at least until gigabit internet is commonplace, and few of us have that.
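
The back-of-envelope version, counting both the upload and the download at that same 20 MB/sec (roughly a 160 Mbps connection, already faster than mine):

# Does cloud processing pay off? Compare transfer time alone to just running
# AI Denoise locally (~21 seconds per file on the 60-core M2 Ultra).
upload_mb, download_mb = 120, 400     # RAW file up, ~400MB finished result back
link_mb_per_sec = 20                  # optimistic home internet (~160 Mbps)
local_seconds = 21

transfer_seconds = (upload_mb + download_mb) / link_mb_per_sec
print(f"Transfer alone: {transfer_seconds:.0f} s vs {local_seconds} s locally")
# 26 s of transfer per file: slower than local even with an infinitely fast cloud GPU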
