Compatible Support Forums
doniej

which is best to run... 16 or 32 bit color?

Recommended Posts

Could someone help me out? What should I use on my computer... 16-bit or 32-bit color, and why? I have it set to 16-bit at the moment.


Do you mean in games or on the desktop? Either way, 32-bit is the way to go on most newer video cards, simply because it gives you more colors. With 32-bit color you can also do things like alpha blending and better-looking transparency effects. I'm sure you can find more on the web, but 32-bit is the best way to go.


Well, the difference between 24-bit and 32-bit is that 32-bit adds an alpha (transparency) channel on top of 24-bit color. For desktop usage you don't really need 32-bit unless you're using a program that uses transparency. As for 16-bit vs. 24-bit, it depends on whether the graphics you're displaying need 16-bit or 24-bit color.

 

Mostly you'll only ever need 16-bit on the desktop, unless you're running a flashy desktop with tons of colors and transparencies, which means 32-bit.
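For reference, here's a quick sketch of how the usual pixel formats pack their channels and how many colors each can show. RGB565 for 16-bit and ARGB8888 for 32-bit are the common layouts on PC cards, though exact channel order can vary by driver:

```python
# Common pixel formats: bits per red, green, blue, and alpha channel.
# (RGB565 and ARGB8888 are the usual layouts; order may vary by driver.)
formats = {
    "16-bit (RGB565)":   (5, 6, 5, 0),
    "24-bit (RGB888)":   (8, 8, 8, 0),
    "32-bit (ARGB8888)": (8, 8, 8, 8),
}

for name, (r, g, b, a) in formats.items():
    colors = 2 ** (r + g + b)  # alpha adds transparency, not extra colors
    print(f"{name}: {colors:,} colors, {a} bits of alpha")
```

So 16-bit gives you 65,536 colors, while 24-bit and 32-bit both give 16,777,216; the extra 8 bits in 32-bit are the alpha channel the post above mentions.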


Stuff looks like crap in 16-bit. I always use 32-bit. As long as your video card is AGP, 32-bit won't be a burden.

Quote:
Stuff looks like crap in 16-bit. I always use 32-bit. As long as your video card is AGP, 32-bit won't be a burden.


I completely agree. I have a wallpaper that looks horrible at 16-bit, but 32-bit works fine, and the difference in resource use is minimal (if there is a difference at all, hehe).


As was stated in the other posts, it depends on the "flashiness", aka the "extra junk" or "coolness", of your desktop. On a basic 2K or XP desktop with all of the extras removed you won't notice much of a difference except in the icons. It's really more about personal taste than anything else; some people are pickier than others.


Running a 32-bit desktop used to be a problem back in the day of 2 MB and 4 MB video cards; the "true color" bump ate a lot of the video RAM. On today's 64 MB and 128 MB cards, heck, even a 32 MB card, it barely taxes the video subsystem at all. Even a 1 MB bitmap loaded as desktop wallpaper doesn't slow down the new stuff.
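The framebuffer math backs this up. A quick sketch (assuming a single screen buffer, ignoring double buffering and any textures):

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes of video RAM one screen buffer needs at a given color depth."""
    return width * height * bits_per_pixel // 8

# 1024x768 at 32-bit needs 3 MB -- more than an old 2 MB card even had,
# while a 32 MB or larger card barely notices it.
for bpp in (16, 24, 32):
    mb = framebuffer_bytes(1024, 768, bpp) / (1024 * 1024)
    print(f"1024x768 @ {bpp}-bit: {mb:.2f} MB")
```

That's why the old cards choked: 3 MB for the desktop alone simply didn't fit in 2 MB of video RAM, so 16-bit (1.5 MB) was the only option.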


If you have a Voodoo card, its 16-bit output is about as good as 32-bit; that doesn't hold for anything else, though.

I had an Nvidia card and in 16-bit I could notice a horrible pattern in my game textures. Yes, it's in all Nvidia cards, and the pattern is deliberate.

Don't believe me? Play Unreal Tournament, turn the brightness up to max, and look at a dark corner or a dark, simple texture.

No company nowadays even tries to make 16-bit color look good. Now they're cheapening 32-bit so you have to buy the GeForce FX with its 128-bit color.

Ugh, video card companies piss me off.
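The "deliberate pattern" described above is ordered dithering: the driver adds a small position-dependent offset before truncating colors to 16-bit, so smooth banding turns into a regular texture. A minimal sketch of the idea (the tiny 2x2 Bayer matrix here is illustrative; real hardware uses larger matrices):

```python
# Ordered (Bayer) dithering: add a position-dependent bias before
# quantizing an 8-bit channel down to 5 bits (the red/blue channels
# of 16-bit RGB565). The bias pattern is what shows up as a visible
# texture in dark, flat areas.
BAYER_2X2 = [[0, 2],
             [3, 1]]

def dither_to_5bit(value, x, y):
    step = 255 / 31                              # gap between 5-bit levels
    bias = (BAYER_2X2[y % 2][x % 2] / 4 - 0.5) * step
    biased = max(0, min(255, value + bias))
    return max(0, min(31, round(biased / step)))

# A flat dark gray quantizes to different levels depending on position:
row = [dither_to_5bit(16, x, 0) for x in range(4)]
print(row)  # -> [1, 2, 1, 2]
```

Without the bias every pixel of that flat gray would land on the same 5-bit level; with it, neighboring pixels alternate between levels, which averages out to the right shade but is exactly the checkered pattern you can spot in a dark Unreal Tournament corner.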


^^^

Well said.

It's like this: take an image that fades from one color to another and try it at both 16-bit and 32-bit. The 32-bit version will show a smooth transition from color to color, whereas at 16-bit you are likely to see obvious bands where the color changes.
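A little arithmetic shows why: a sketch counting how many distinct steps a smooth gradient collapses to at each channel depth (5 or 6 bits per channel in 16-bit RGB565 vs. 8 bits per channel in 32-bit color).

```python
# Count the distinct output levels a smooth 0..255 gradient collapses to
# when each color channel is stored with fewer bits.
def quantize(value, bits):
    levels = (1 << bits) - 1                 # highest representable level
    return round(value / 255 * levels)

for bits in (5, 6, 8):
    distinct = len({quantize(v, bits) for v in range(256)})
    print(f"{bits}-bit channel: {distinct} distinct steps")
```

A 5-bit channel only has 32 steps to cover the whole fade, so each band is wide enough to see; at 8 bits per channel there are 256 steps and the transition looks continuous.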

