
Display Adapter Default 16 Bit, not 32 Bit, why?

Posted: Thu Apr 06, 2006 6:29 pm
by Stefan Bruckel
Not normally an issue, but one of my photo editing programs advised to change from 16 bit to 32 bit, which I did. Since the system is capable of displaying 32 bit rather than 16 bit, I assume the default is that 16 bit is plenty for most applications but takes less memory to process, hence the 16 bit default?

Other than processor speed, any other reason to change to 16 bit for normal use?

Posted: Sat Apr 08, 2006 4:55 am
by dr_st
I think you're pretty safe working in 32bit all the time.

Posted: Sat Apr 08, 2006 7:53 am
by Stefan Bruckel
Seems that way, and that's the setting I have had on all my other laptops in the past. I was just curious why the default setting would be less than the best. My guess is that it's sufficient for just about every application, except maybe for some specific photo editing or similar video applications... and since 32 bit color rendering requires more resources than 16 bit, maybe that's why they picked 16 as the default.

I am still trying to resolve jerky video when playing DVDs, and maybe setting it back to 16 bit will help there. We'll see.

Posted: Sat Apr 08, 2006 8:54 am
by jeremivw
32bit gets you nowhere on the LCD. Even the 15" IPS UXGA that I've got won't do more than 16bit color...it's just a limitation of the LCD technology.

If you're running an external monitor, then go for it, but on the LCD you're just wasting video RAM to make it keep up with millions of colors when the LCD is only actually showing you a couple hundred thousand...

Posted: Sat Apr 08, 2006 9:02 am
by pundit
jeremivw wrote:32bit gets you nowhere on the LCD. Even the 15" IPS UXGA that I've got won't do more than 16bit color...it's just a limitation of the LCD technology.

If you're running an external monitor, then go for it, but on the LCD you're just wasting video RAM to make it keep up with millions of colors when the LCD is only actually showing you a couple hundred thousand...
Are you sure about this? The first thing I noticed when I booted up the computer was that the colour gradients weren't rendered as smoothly as they could be, and I rushed to check whether it was on 16-bit. And voilà, it was.

All this was within the first 30 seconds after the first-boot stuff completed.

The moment I changed it, all was fine again. Clearly, there is a noticeable difference---and I don't see how that's possible if it is impossible for the screen to render more than 16 bit colour.

Posted: Sat Apr 08, 2006 9:13 am
by jeremivw
pundit wrote:Are you sure about this?
I'm positive. These LCDs do not display 32bit color.

That being said, maybe I misspoke when I stated that 32bit gets you nowhere...I too have noticed those gradients on bootup...

Now, if I remember correctly, 32bit is 16 million colors and 16bit is like 212,000 or so. In this case, according to the BOE-Hydis spec, only hundreds of thousands of colors are supported. I've only seen a few high end desktop LCDs that support millions of colors...none on laptops.

Since I too have seen what you speak of, there must be something else going on, too. But I'm sure about the LCD color support, man.

Posted: Sat Apr 08, 2006 9:18 am
by pundit
jeremivw wrote:That being said, maybe I misspoke when I stated that 32bit gets you nowhere...I too have noticed those gradients on bootup...

Now, if I remember correctly, 32bit is 16 million colors and 16bit is like 212,000 or so. In this case, according to the BOE-Hydis spec, only hundreds of thousands of colors are supported. I've only seen a few high end desktop LCDs that support millions of colors...none on laptops.

Since I too have seen what you speak of, there must be something else going on, too. But I'm sure about the LCD color support, man.
OK, let me see if this is what might be going on.

Maybe the LCD screen (when the graphics are in 32-bit mode) doesn't do millions of colours, but still more than my eye can discern; something it can't pull off in 16-bit? Because that's why I haven't once worked in 16-bit in years; the instant something happens and my computer is working in 16-bit mode, I see a difference---for the worse.

Additional video memory use or computational expense is a small price to pay.

Posted: Sat Apr 08, 2006 9:31 am
by jeremivw
pundit wrote:OK, let me see if this is what might be going on.

Maybe the LCD screen (when the graphics are in 32-bit mode) doesn't do millions of colours, but still more than my eye can discern; something it can't pull off in 16-bit? Because that's why I haven't once worked in 16-bit in years; the instant something happens and my computer is working in 16-bit mode, I see a difference---for the worse.

Additional video memory use or computational expense is a small price to pay.
That's a decent explanation, so I'll accept it. I just moved mine back up to 32 bit and can definitely see the difference, too.

Checked the spec and sure enough it states Color Support: 262,000 colors (16bit) for the IPS UXGA LCD. Doesn't that suck? Everything else being equal, you'd expect 16 mil on your state-of-the-art laptop...at least I would. I was running (and seeing - ok, as much as my eyes could) 16 mil on CRTs 15 years ago!

BTW: agreed on the price to pay, but I was just trying to answer the thread, man. It's curious why it comes defaulted to 16bit...almost as if THEY KNOW (and are watching to see what we do!)... :roll:

EDIT: Woot! Finally no more Freshman for me!

Posted: Sat Apr 08, 2006 9:45 am
by pundit
jeremivw wrote:Checked the spec and sure enough it states Color Support: 262,000 colors (16bit) for the IPS UXGA LCD. Doesn't that suck? Everything else being equal, you'd expect 16 mil on your state-of-the-art laptop...at least I would. I was running (and seeing - ok, as much as my eyes could) 16 mil on CRTs 15 years ago!
While it's cool to be aware of specs and such, sometimes you just got to follow your eyes. I don't expect my laptop to be able to do the millions of colours that 32-bit claims---I just expect that I don't see obvious visual artifacts.

But even today, given a choice between a high-end LCD display and a high-end CRT display, I'll still gladly take the CRT. The colour rendition is usually superior on the CRT.

Posted: Sat Apr 08, 2006 9:55 am
by jeremivw
pundit wrote:But even today, given a choice between a high-end LCD display and a high-end CRT display, I'll still gladly take the CRT. The colour rendition is usually superior on the CRT.
I certainly agree...but...

This IPS I'm lookin' at bridges that gap (CRT > LCD) better than any portable I've seen to date...

Posted: Sat Apr 08, 2006 2:52 pm
by dr_st
Just a small correction:

262K colors is 18-bit
16.7M colors is 24-bit
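dr_st's figures are easy to verify: the number of distinct colors in an n-bit mode is simply 2^n. A quick sanity check in Python (just arithmetic, not anything from the thread):

```python
# Distinct colors for common display bit depths: 2**bits.
for bits in (16, 18, 24):
    print(f"{bits}-bit -> {2**bits:,} colors")
# 16-bit -> 65,536 colors
# 18-bit -> 262,144 colors
# 24-bit -> 16,777,216 colors
```

So the 262K figure in the panel spec corresponds to 6 bits per channel (18-bit), and 16.7M corresponds to 24-bit, exactly as stated; a "16-bit" desktop mode is only 65,536 colors.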

Posted: Sun Apr 09, 2006 6:35 pm
by astro
The reason you see the difference is not because of the screen, but because of Windows.

When you tell Windows you're running in 16-bit it will probably apply some dodgy one-size-fits-all colour dithering to everything it paints. Not sure how it does this.

When you're in 24-bit or higher it doesn't bother doing this.
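astro hedges on how Windows actually does this, and so will I. One classic technique is ordered (Bayer) dithering: add a small position-dependent offset before quantizing, so flat bands break up into a mix of adjacent levels that the eye averages out. A minimal Python sketch of the idea, purely illustrative and not Windows' actual algorithm:

```python
# Minimal sketch of ordered (Bayer) dithering before quantization --
# an illustration of the general idea, not Windows' actual algorithm.
BAYER_2X2 = [[0, 2],
             [3, 1]]  # threshold pattern, values 0..3

def quantize(value, bits):
    """Truncate an 8-bit value to the top `bits` bits, scaled back to 0..255."""
    step = 256 // (1 << bits)
    return (value // step) * step

def dither_pixel(value, x, y, bits):
    """Add a sub-step offset from the Bayer matrix, then quantize."""
    step = 256 // (1 << bits)
    offset = BAYER_2X2[y % 2][x % 2] * step // 4
    return quantize(min(255, value + offset), bits)

# A smooth 8-bit ramp quantized straight to 6 bits collapses into flat
# bands; the dithered version alternates between adjacent levels instead.
row = [dither_pixel(v, x, 0, 6) for x, v in enumerate(range(64, 72))]
```

Without the offset every value in a band quantizes identically; with it, pixels near a band boundary alternate between the two neighbouring levels, which hides the step.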

Posted: Mon Apr 10, 2006 6:07 pm
by dbertoni5000
Looking at the IBM support site, the color support for both the 14.1" and 15" 1400x1050 displays says this:

Max colors or gray shades 16777216

Which works out to 24-bit color.

I don't know where the color support is stated as 262k, but I suspect it's wrong.
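The arithmetic behind that IBM figure checks out: taking log base 2 of the color count recovers the bit depth. A one-line sanity check:

```python
import math

# 16,777,216 colors from the IBM support site -> bits per pixel.
colors = 16_777_216
print(int(math.log2(colors)))  # prints 24
```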

Posted: Tue Apr 11, 2006 3:58 pm
by beeblebrox
Actually, most (if not all) notebook displays use a 6-bit converter per color channel, so the maximum gradient resolution is 3x6 = 18 bit.

An 18-bit value does not fit memory word sizes neatly, so you get 16 bit instead. You lose 2 bits of information, and that shows up as banded gradients.

If you had an 18- to 24-bit framebuffer you would be fine, but that memory format does not exist for mass markets. The next step is 32 bit: you waste a lot of memory, but you get smooth gradients.
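The 2-bit loss beeblebrox describes can be sketched concretely. A 16-bit framebuffer is typically packed as RGB565, so red and blue keep only their top 5 bits; neighbouring 8-bit shades then collapse into the same stored word, which is exactly what banding is. A small illustration (my own packing helper, assuming RGB565 layout, not anything from the thread):

```python
# Pack 8-bit R, G, B into a 16-bit RGB565 word, keeping only the top
# 5, 6 and 5 bits of each channel respectively.
def pack_rgb565(r, g, b):
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

# Four adjacent red shades survive as only two distinct 16-bit values,
# so a smooth ramp turns into visible steps on screen.
distinct = {pack_rgb565(r, 0, 0) for r in (102, 103, 104, 105)}
print(len(distinct))  # prints 2
```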

I am using an external Samsung 193p, which has 24-bit color (it is a PVA panel, not a TN panel like notebooks use). The difference from the ThinkPad screen is phenomenal; the contrast and color brilliance are beyond belief.

IPS (FlexView) displays have far better contrast due to their dual crystal liners; however, the electronics are still 6 bit per channel. :-(


Hope I could help...