Good and inexpensive GPU for PP?

West Coast Birder

Hey guys,

I'm looking for a decent GPU card for doing PP. I don't play games, so the kinds of metrics that gamers require from their graphics cards are unimportant to me. Right now my computer only has on-processor graphics, so doing LR Denoise AI takes forever, about 10 minutes per photo.

Can you recommend a decent graphics card that will speed this up and not break the bank?

My current computer:

Intel Core i7-10700K CPU @ 3.80GHz
32 GB RAM
Windows 10 Pro

Thanks!
 
I recently swapped out a three-year-old Gigabyte 1050 Ti 4GB card I used for my photos for a 12GB card, in an attempt at future-proofing my PC. For a relatively low-memory card it performed OK in Lightroom 5.7 without any hesitation, but I'm not familiar with LR Denoise and its requirements, so it may not be up to the task. Something you would need to check, maybe. Similar items are readily available on fleabay in the UK for about £90. The only trouble is that if one has been overclocked, its life may have been shortened. Also, beware of the unbranded look-alike 'knock-offs' flooding Amazon and eBay: new items go for about £80, while genuine brands sell for £200+.
 
I've always used Nvidia cards. Can't say I've ever had one fail.
Their prices have really risen lately, though, with some going for over $1k.
 
Problem is that I am using my on-CPU graphics, which is fine and dandy for simply driving my monitors but struggles mightily with any serious computation. Editing a modest photo (not 60 MP) off a 7D Mark II or an R7 takes 10-ish minutes. Basically, I get it started and fix a coffee, and when I'm back it is generally about 75% done. That does neither my time effectiveness nor my stomach lining any good :giggle:
 
I have a 2019 MacBook Pro with a 9th-gen i9 and the AMD Radeon Pro 550M graphics card with 4GB of RAM. LR Denoise takes me about 65 seconds for R5 CRAW images.

That's an old graphics card, so I would think almost anything you can buy today with 4GB or more will make your life much more pleasant. Obviously the faster (more $$) the better, but I don't know where the "good enough" spot is.
 
For image processing, just about any modern card will do fine; no need to spend more than $100 USD. I think the easiest thing to do is head to Amazon, type "graphics card" in the search box, and set the price range to $50 to $100, and you'll get a bunch to look at. Any of those will most likely outperform your integrated graphics significantly, if for no other reason than freeing up system memory.
 
My Setup below.

Intel i7 9700 CPU @ 3.00GHz
16 GB RAM
Windows 10 home

I know this is a lower-end Nvidia card, so not so expensive over in the UK.
Nvidia GTX 1660 Ti
6GB RAM

LR Denoise AI on a Canon 7D Mk II ISO 16,000 file, with the Denoise amount at 60, took 44 seconds.
 
ATI/AMD Radeon discrete graphics cards have always served me well, and typically carry
a far less hefty price of admission than Nvidia.

There are many manufacturers producing cards, with some building for both GPU camps
and others offering models from only one. Sapphire and EVGA are two makers that each stick
to a single camp (AMD and Nvidia respectively), while Gigabyte is an agnostic brand offering both.

ATI(now AMD) has never let me down since Windows 95.
Way back when, I had gawdawful trouble with a crappy Trident driver,
went back to the vendor, asked about it, and ATI was the immediate answer.
The problems cleared up as soon as the card was installed with drivers,
and I've used ATI ever since.

I don't do hardcore bleeding-edge gaming, and ATI has served me well in all other aspects
of video, games, photo processing, disc duplication, and everything else. My current card is now a few years old,
yet it serves up 4K/UHD/HDR to my monitors and TV, throws HDMI surround audio to my receiver,
doesn't lag or jitter with the games I play, and still cost far less than the so-called 'equivalent' Nvidia of the time.

I'll always recommend red over green in this camp.
 
1 grand? :yikes:

I was thinking closer to $100 lol...
Depends on the card; there are plenty of options below $1k. The $1k+ cards are usually for serious gamers with disposable income, or for those doing 3D modelling or other professional work that requires a GPU. A recent-generation mid-range card can be had for around $300.

A quick Google search found this thread, which includes some users testing the performance with various GPUs: https://community.adobe.com/t5/lightroom-classic-discussions/gpu-benchmark-denoise-ai/td-p/13779983

TL;DR: a $300 mid-range GPU will take around 30 seconds for a 60MP image, while a $1k+ GPU can do the same task in about 14 seconds.
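Those per-image times add up fast over a real shoot, which is where the GPU upgrade really pays off. Here's a rough back-of-the-envelope sketch using the seconds-per-image figures quoted above (the tier labels are just illustrative, not specific cards):

```python
# Rough batch-time estimate for LR Denoise AI.
# Seconds per ~60MP image, taken from the figures quoted in this thread.
seconds_per_image = {
    "integrated graphics": 600,   # ~10 minutes, as in the original post
    "$300 mid-range GPU": 30,
    "$1k+ high-end GPU": 14,
}

def batch_minutes(images: int, secs_per_image: float) -> float:
    """Total denoise time in minutes for a batch of images."""
    return images * secs_per_image / 60

for tier, secs in seconds_per_image.items():
    print(f"{tier}: {batch_minutes(100, secs):.0f} min for 100 images")
```

For a 100-image cull that's over 16 hours on integrated graphics versus under an hour on a $300 card, which is why even a modest GPU transforms the Denoise workflow.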
 
I was using a low-power GPU with 2GB of RAM, mainly as an HDMI output for my monitor, on my aging Win10 Pro machine. Denoise in LrC was taking "40 minutes" lol....

In early 2023 Photoshop required higher specs, and I got a super deal on a GTX 1650 4GB. It didn't require any extra power connector, and now my Denoise takes 1 to 2 minutes.
It would be faster on your PC, because mine is a 3rd-gen i5 from 2013.
 
40 minutes, ouch... I was using a 3-year-old ultrabook with an iGPU, and my R5 images were taking 8-9 minutes each; I thought that was painful while I was on the road. My 3080 and a more modern CPU take about 10 seconds on a 45MP file.
 
Yeah, my previous 2GB GPU was 'very basic' and it's an old CPU. Glad to have 16GB of RAM...
 
As a side note, if you go the Nvidia route, download the 'Studio' driver and not the 'Game Ready' driver. When I first got the machine, I wrongly assumed the fastest driver would be the Game Ready one; after trial and error, the Studio driver turned out to be far faster in Lightroom and Photoshop.
 
I'm running a Ryzen 9 3950X CPU and recently upgraded from a GTX 1080 with 8GB of memory to an RTX 4070 with 12GB that I picked up during a Black Friday sale. It now chews through 42MB Leica DNG files in 15-20 seconds, and 69MB Sony RAW files in 30 seconds. That's roughly half the time it took with the older GTX 1080.

One thing to check before you buy a graphics card: compare the card's power requirements against your power supply's rating to make certain the PSU can handle it.
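The check itself is simple arithmetic: add the card's rated board power to a rough estimate of the rest of the system, then keep some headroom. A minimal sketch of that calculation (the wattage figures below are illustrative assumptions, not measured values; check your card's spec sheet and PSU label for real numbers):

```python
def psu_ok(psu_watts: int, gpu_board_power: int, rest_of_system: int,
           headroom: float = 0.2) -> bool:
    """True if the PSU covers the estimated total draw plus a safety margin.

    headroom=0.2 means we want the PSU rated at least 20% above peak draw,
    a common rule of thumb for efficiency and aging.
    """
    estimated_draw = gpu_board_power + rest_of_system
    return psu_watts >= estimated_draw * (1 + headroom)

# Illustrative example: 500W PSU, 200W GPU, ~250W for CPU/drives/fans.
# 450W * 1.2 = 540W needed, so a 500W unit falls short here.
print(psu_ok(500, 200, 250))
```

Also worth remembering that cards above roughly 75W need a dedicated PCIe power connector from the PSU, not just the slot, so check the cabling as well as the wattage.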
 
Great advice about power requirements, but also look at motherboard/bus and other
requirements for any given system.

I always had more power available than required, since I had moved to SSDs in favor of HDDs
while I slowly updated the GPU.
My most recent update was a motherboard/CPU/RAM purchase to bring myself
into compliance with upcoming software; the ATI/AMD Radeon video card,
though a few years old, is nowhere near underpowered or lacking.

I'm not using software specifically optimized for GPU-centric enhancements,
but I'd note that even a base-level GPU upgrade can, and often will, improve a system's
overall performance and speed; that has been my experience.

And as mentioned above, there may be different driver versions for a given card,
which makes it worthwhile to look into.
 