Nvidia 1000M vs 2000M with W520 FHD
About to pull the trigger on a W520 with FHD video. Video card options are the Nvidia Quadro 1000M or 2000M. The ONLY difference from an Nvidia product chart is the number of CUDA cores (1000M = 96 cores, 2000M = 192 cores). Doubling the CUDA cores not only increases the power consumption (45W vs. 55W), but also adds $250 to the W520 purchase price.
I'm not into gaming, at all, nor am I a graphic artist/designer. But I do need the FHD's extra screen space for my work with many open windows updating concurrently in real time.
Wikipedia explains that CUDA cores are parallel processors that apply the same operation to many pieces of video data at once, so frames render faster.
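For the curious, the idea is data parallelism: the same small operation applied to thousands of pixels at once, split across many cores. A rough CPU-side sketch of that idea in plain Python (no GPU involved; `brighten` and `parallel_map` are just illustrative names, not anything from CUDA):

```python
# Illustrative only: a CPU analogue of what CUDA cores do in hardware.
# Each "worker" applies the same per-pixel operation to its own slice.
from concurrent.futures import ThreadPoolExecutor

def brighten(pixels, amount=10):
    # The per-element kernel: identical work on every pixel value.
    return [min(255, p + amount) for p in pixels]

def parallel_map(pixels, workers=4):
    # Split the frame into chunks and process them concurrently,
    # mimicking many cores each handling part of the image.
    chunk = (len(pixels) + workers - 1) // workers
    slices = [pixels[i:i + chunk] for i in range(0, len(pixels), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as ex:
        results = ex.map(brighten, slices)
    return [p for part in results for p in part]

frame = list(range(0, 256, 16))   # a tiny fake grayscale "frame"
print(parallel_map(frame)[:4])    # → [10, 26, 42, 58]
```

The point of doubling CUDA cores is simply doubling how many of those slices get worked on at the same time — which only pays off if your software actually issues that kind of parallel work.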
Anyone know where the 96 CUDA core Nvidia 1000M adapter fails to deliver with FHD? Trying to neither waste $ nor be penny wise and pound foolish.
Re: Nvidia 1000M vs 2000M with W520 FHD
I bet the 1000M is pretty similar to (perhaps a little less powerful than) the 880M in the W510. I think you'll be perfectly fine. I know the whole CUDA thing has got everyone excited, but how much of a difference is this really going to make to the average user, even when editing video? I would think a fast hard drive/SSD would be more important.
W510 4319-2PU, X201 tablet 2985-C6U, HP 8740W DC2
Re: Nvidia 1000M vs 2000M with W520 FHD
caonen wrote: I'm not into gaming, at all, nor am I a graphic artist/designer.
Then I would get the less expensive card. Usually the better GPUs generate more heat and drain the battery more quickly. You might even want to look at the T520 with the Intel GPU, which is more than enough to drive the FHD screen for most uses. Why pay extra for something you're not going to use? Multitasking and general responsiveness are more related to the memory and hard drive.
E7440
Re: Nvidia 1000M vs 2000M with W520 FHD
It seems unanimous (including Lenovo's sales techies) that there is no good reason, for my needs, to pay for the extra CUDA cores.
re. T520 v W520. I configured both to look for savings, but surprisingly, after adding the few extras I needed, the T ended up $150-200 more than the W model.
Thanks to all.
Re: Nvidia 1000M vs 2000M with W520 FHD
I've always thought that CUDA is nonsense, overall. Unless you are buying a $1,500 graphics card, it can't possibly make a difference in your day-to-day computing.
W510 4319-2PU, X201 tablet 2985-C6U, HP 8740W DC2
Re: Nvidia 1000M vs 2000M with W520 FHD
Is there any difference other than these CUDA cores? I checked a chart on the nVidia site and the 1000M and the 2000M look the same, except the 1000M is listed at 35W versus 45W for the 2000M. Are there any differences in external display support or OpenGL power?
I work as a graphic designer, so I use InDesign, Photoshop, Illustrator, Acrobat and Bridge. I checked with Adobe, and they told me that only Premiere uses CUDA. So if the OpenGL power is the same, it is pointless to have the 2000M for my use. Can anybody confirm this information?
Re: Nvidia 1000M vs 2000M with W520 FHD
commander,
Looking at Nvidia's mobile product comparison chart, it corroborates the chart you reference, except for the power requirements of the 1000M (45W) and 2000M (55W). As you observed, aside from power draw, the ONLY difference between these two adapters is the number of CUDA cores (96 vs. 192). The chart lists OpenGL at 3.3 for both adapters. Not sure whether any other measures in the chart refer to external displays, but they are absolutely identical between the 1000M and 2000M.
The url for the chart I see is: http://www.nvidia.com/docs/IO/11761/com ... -final.pdf
Re: Nvidia 1000M vs 2000M with W520 FHD
Thanks. It is strange how this basic information is hidden. Can you imagine buying a graphics card whose manufacturer doesn't tell you the clock speeds, supported monitors, etc.?
Volker
Re: Nvidia 1000M vs 2000M with W520 FHD
commander wrote: Can you imagine that you will buy a graphic card and the manufacturer doesn't tell you the clock speeds, monitor supports, etc?
Can you imagine buying a graphics card whose manufacturer doesn't tell you which commands the card accepts from the host system, so you could write your own interface code if you wanted? All you get from NVidia is a piece of hardware and a binary-only driver; a few years down the road their driver will stop supporting your hardware, and you'll be stuck with an old driver that may or may not work with OS updates.
Re: Nvidia 1000M vs 2000M with W520 FHD
I just ordered the Quadro 2000M setup with my W520. I use video editing/creation software that is aware of how many CUDA cores you have, and the process is sped up if you have more cores. Granted, it's not as many cores as a GTX 590, but it's twice what the 1000M has.
2011 MacBook Pro 17" non-glare 2820QM 8GB 256GB SSD
T61P - 6459CTO - T9300 - 15.4 WUXGA ++ E6600 desktop, HP dv6000, P4 desktop w/ RAMBUS
Previous: P3 desktop, P1 desktop, 386-40Mhz desktop, 386-20 Mhz desktop, 10Mhz TURBO XT clone, Commodore Amiga 1000, 128, 64, VIC-20
Re: Nvidia 1000M vs 2000M with W520 FHD
Oliver26n wrote: I've always thought that CUDA is nonsense, overall.
Yeah, pure nonsense... Some people use it to help cure cancer, others to help stop global warming, etc. - but who would care about stuff like that?
Re: Nvidia 1000M vs 2000M with W520 FHD
Well sticking strictly to the original poster's requirements then yes, the 1000M is the way to go. Your FHD will be perfectly fine.
The main reasons I am going with the 2000M are threefold; the first two are mentioned in the previous posts -
1) video processing
2) CUDA-based "charity computing" scientific research projects I do like Folding@Home
and then, of course,
3) gaming. This sucker should be able to run virtually any game I throw at it for the next couple years.
In terms of pure computational oomph per dollar, the $250 to move from the 1000M to 2000M is a fairly decent deal in this market segment, and WAY better than coughing up $500 (minimum) to upgrade to the Extreme Edition CPU.
But if you want to keep your W520 below $2,000, the 1000M will still handily get the job done (at 96 cores, it's still got three times as many cores as the old Quadro 570M in my T61p).
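To put rough numbers on that "oomph per dollar" point (prices as quoted in this thread; a back-of-the-envelope sketch, not a benchmark):

```python
# Back-of-the-envelope value check using the prices quoted in this thread.
cores_1000m, cores_2000m = 96, 192
upgrade_price = 250          # $ to go from the 1000M to the 2000M on the W520

extra_cores = cores_2000m - cores_1000m      # 96 additional cores
dollars_per_extra_core = upgrade_price / extra_cores
print(f"${dollars_per_extra_core:.2f} per extra CUDA core")  # → $2.60 per extra CUDA core
```

Of course, cores-per-dollar only matters if your workload (video encoding, Folding@Home, games) can actually keep those cores busy.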
thinkbigger
Re: Nvidia 1000M vs 2000M with W520 FHD
One more twist you may find interesting. I'm not a gamer, but I find a high resolution crucial for my work with HTML, PDF, and DOC files. After waiting 5 years to replace my T61, I think I'll finally go for the W520, in spite of the worse vertical display resolution, sigh (1080 px vs. 1200).
Currently, I'm also running two external TFTs with a resolution of 2560*1600 each. When I get the W520, I'd like to add a third one with the same resolution. This seems to be possible with the ThinkPad Mini Dock Plus Series 3 (170W) if I use its two DisplayPorts plus the DisplayPort on the W520. Lenovo says that I can use the three DPs simultaneously, but doesn't state the maximum resolution (http://support.lenovo.com/en_US/detail. ... MIGR-76617).
Three questions:
1. Is this high resolution (4800*2560) really possible with the W520?
2. Does 1000M vs. 2000M make any difference for high resolutions? (I assume not)
3. Would the same work for the T520? It takes the same Mini Dock Plus, but has only the integrated graphics card. So maybe it's possible, but really slow?
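In case the 4800*2560 figure above looks odd: it works out if the three 2560*1600 panels are rotated to portrait (1600 wide by 2560 tall each) and placed side by side — which also matches the 3200*2560 two-screen desktop mentioned later in this thread. A quick sanity check (the portrait arrangement is my inference, not something Lenovo documents):

```python
# Sanity check: three 2560x1600 panels rotated to portrait (1600 wide, 2560 tall),
# arranged side by side and spanned into one desktop.
panels = [(1600, 2560)] * 3              # (width, height) per portrait panel

desktop_w = sum(w for w, _ in panels)    # widths add up side by side
desktop_h = max(h for _, h in panels)    # height is that of the tallest panel
print(f"{desktop_w}x{desktop_h}")        # → 4800x2560
```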
Re: Nvidia 1000M vs 2000M with W520 FHD
thinkbigger wrote: 1. Is this high resolution (4800*2560) really possible with the W520? 2. Does 1000M vs. 2000M make any difference for high resolutions? 3. Would the same work for the T520?
1. Impossible; only one DVI and one DP port can be used simultaneously, and the DVI on the MDP3 supports up to 1920*1200.
2. No difference.
3. The T520 with the Nvidia card works the same.
thinkbigger
Re: Nvidia 1000M vs 2000M with W520 FHD
chairsky wrote: 1. Impossible; only one DVI and one DP port can be used simultaneously, and the DVI on the MDP3 supports up to 1920*1200. 3. The T520 with the Nvidia card works the same.
1. Good point. Looks like I have to stick with two 2560*1600 screens as long as I don't use a separate ViDock. Curses on Lenovo for making the dock's DVI single-link only (my 5-year-old dock has dual-link).
3. Really? The second graphics card wouldn't even be used?
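Some background on why single-link DVI tops out around 1920*1200: a single TMDS link is capped at a 165 MHz pixel clock, and a mode's required clock is its total pixels (active plus blanking) times the refresh rate. A quick estimate using the standard CVT reduced-blanking totals (quoted from memory; double-check the timing tables before relying on them):

```python
# Required pixel clock = horizontal total * vertical total * refresh rate.
# Totals below are the standard CVT reduced-blanking timings for each mode.
SL_DVI_LIMIT_HZ = 165_000_000   # single-link TMDS clock ceiling

modes = {
    "1920x1200@60": (2080, 1235),   # fits on single-link DVI
    "2560x1600@60": (2720, 1646),   # needs dual-link DVI or DisplayPort
}

for name, (htotal, vtotal) in modes.items():
    clock = htotal * vtotal * 60
    verdict = "OK on SL-DVI" if clock <= SL_DVI_LIMIT_HZ else "exceeds SL-DVI"
    print(f"{name}: {clock / 1e6:.1f} MHz -> {verdict}")
```

So 1920*1200@60 needs about 154 MHz (just under the limit), while 2560*1600@60 needs roughly 269 MHz — hence the DP-to-dual-link-DVI adapters.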
Re: Nvidia 1000M vs 2000M with W520 FHD
1. You can try the VGA port, but I'm not sure it supports 2560*1600 resolution.
3. The T520 with the Nvidia card is equipped with Nvidia Optimus by default, so the two GPUs support two external screens.
thinkbigger
Re: Nvidia 1000M vs 2000M with W520 FHD
I got my W520, the dock, and DP-to-dual-link-DVI adapters, tried it, and -- surprise -- I could run both 2560*1600 screens at full resolution FROM THE DOCK! The result is a sweet 3200*2560 desktop, plus the TP display.
This is different from what Lenovo and trustworthy people (in this forum) deemed possible. I didn't need to use the DP on the W520.
Makes me wonder if I may be able to add another (third) WQXGA screen by using the DP on the W520.
Re: Nvidia 1000M vs 2000M with W520 FHD
thinkbigger wrote: I got my W520, dock, DP-DL-DVI adapters, tried, and -- surprise -- I could run both 2560*1600 screens in full resolution FROM THE DOCK! This is different from what Lenovo and trustworthy people (in this forum) deemed possible.
That sounds cool. Are you connecting the monitors to the DVI and DP ports, respectively?
My bad for the misleading info: according to Lenovo, the DVI port is not a DVI-D one. If it is, then it rocks.
thinkbigger
Re: Nvidia 1000M vs 2000M with W520 FHD
chairsky wrote: That sounds cool. Are you connecting the monitors on the DVI and DP port respectively? My bad for the misleading info.
Unfortunately, you were right with regard to the DVI ports. They are both only single-link DVI; even the dock for my old T60p had dual-link DVI. So what I use are two DisplayPort-to-dual-link-DVI adapters. The DP on the laptop is still free.
Lenovo makes it look as if you could use only one DP from the dock at a time; that's now disproved. Taking the laptop out with two WQXGA screens attached is really fast now, as is re-attaching (just take it out / drop it in). Also, it looks like the 1000M I have isn't even used much; there's only minimal fan noise. It drives the external screens, and the other graphics card drives the laptop screen. Good to have two graphics cards... I hope that also saves battery power on the road when using only the Intel graphics.
I don't have a third screen to try, but the fact that there is a free DP on the laptop makes it look like it could drive another WQXGA? That would seriously rock!
Colonel O'Neill
Re: Nvidia 1000M vs 2000M with W520 FHD
The Quadros are limited to a total of three screens, for whatever reason.
The lesser Quadro NVS's are able to use up to four screens total.
Stupid NVidia.
W520: i7-2720QM, Q2000M at 1080/688/1376, 21GB RAM, 500GB + 750GB HDD, FHD screen & MB168B+
X61T: L7500, 3GB RAM, 500GB HDD, XGA screen, Ultrabase
Y3P: 5Y70, 8GB RAM, 256GB SSD, QHD+ screen
thinkbigger
Re: Nvidia 1000M vs 2000M with W520 FHD
3 screens WQXGA on Quadro 1000M
1 screen 1080p on Intel
Sounds possible?
Colonel O'Neill
Re: Nvidia 1000M vs 2000M with W520 FHD
I meant in total, including the integrated card.
W520: i7-2720QM, Q2000M at 1080/688/1376, 21GB RAM, 500GB + 750GB HDD, FHD screen & MB168B+
X61T: L7500, 3GB RAM, 500GB HDD, XGA screen, Ultrabase
Y3P: 5Y70, 8GB RAM, 256GB SSD, QHD+ screen