doniej Posted February 1, 2003

Could someone help me out? What should I use on my computer... 16-bit or 32-bit color, and why? I have it set to 16-bit at the moment. ;(
ThC 129 Posted February 2, 2003

Do you mean in games or on the desktop? Either way, 32-bit is the way to go on most newer video cards, simply because it gives you more colors. 32-bit color also lets the card do more with each pixel, like alpha blending and better-looking transparency effects. I'm sure you can find more detail on the web, but the best choice is 32-bit.
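To illustrate what ThC 129 means by alpha blending, here is a minimal Python sketch (my own, not from the thread) of the standard "source over" blend that a stored alpha channel makes possible; the function name and values are just for illustration.

```python
def blend_over(src_rgb, src_alpha, dst_rgb):
    """Blend an RGB color with alpha (0.0-1.0) over an opaque RGB background."""
    return tuple(
        round(src_alpha * s + (1.0 - src_alpha) * d)
        for s, d in zip(src_rgb, dst_rgb)
    )

# A half-transparent red element over a white desktop:
print(blend_over((255, 0, 0), 0.5, (255, 255, 255)))  # -> (255, 128, 128)
```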
DosFreak Posted February 3, 2003

Well, the difference between 24-bit and 32-bit is that 32-bit gives you an extra transparency (alpha) channel on top of the 24 bits of color. For desktop usage you don't really need 32-bit unless you're running a program that uses transparency. As for 16-bit versus 24-bit, it depends on whether the graphics you're displaying need 16-bit or 24-bit color. Mostly you'll only ever need 16-bit at the desktop, unless you're running a flashy desktop with tons of colors and transparencies, which means 32-bit.
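DosFreak's breakdown comes down to how many bits each channel gets. As a rough sketch (my own, assuming the common RGB565 layout for 16-bit, RGB888 for 24-bit, and RGB888 plus an 8-bit alpha channel for 32-bit), packing and unpacking a color shows the precision that 16-bit throws away:

```python
def to_rgb565(r, g, b):
    """Pack 8-bit-per-channel RGB into a 16-bit RGB565 value (precision is lost)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def from_rgb565(pixel):
    """Unpack back to 8-bit channels; the values no longer match exactly."""
    r = (pixel >> 11) & 0x1F
    g = (pixel >> 5) & 0x3F
    b = pixel & 0x1F
    # Scale the 5/6-bit channels back up to the 0-255 range.
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

original = (200, 100, 50)
print(from_rgb565(to_rgb565(*original)))  # -> (205, 101, 49): close, but not exact
```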
Champion_R Posted February 4, 2003

Stuff looks crap in 16-bit. I always use 32-bit. As long as your video card is AGP, 32-bit won't be a burden.
el_vago32 Posted February 5, 2003

Quote: Stuff looks crap in 16-bit. I always use 32-bit. As long as your video card is AGP, 32-bit won't be a burden.

I completely agree. I have a wallpaper that looks horrible in 16-bit, but it looks fine in 32-bit, and the difference in resource usage is minimal (if there is any difference at all, hehe).
Champion_R Posted February 5, 2003

Even the Windows XP Welcome screen looks rubbish in 16-bit mode.
karendar Posted February 7, 2003

To be perfectly honest, I see no apparent difference between 16-bit and 32-bit.
DosFreak Posted February 7, 2003

As was stated in the other posts, it depends on the "flashiness" (aka the "extra junk" or "coolness") of your desktop. On a basic 2K or XP desktop with all of the extras removed, you won't notice much of a difference except in the icons. It's really more about personal taste than anything else. Some people are pickier than others.
jwl812 Posted February 7, 2003

Running a 32-bit desktop used to be a problem back in the days of 2MB and 4MB video cards; the jump to "true" color ate a lot of the video RAM. On today's 64MB and 128MB cards, heck even a 32MB card, it barely taxes the video subsystem at all. Even a 1MB bitmap loaded as desktop wallpaper doesn't slow the new stuff down.
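The numbers behind jwl812's point are easy to check. A back-of-the-envelope calculation of my own (counting only a single screen's framebuffer and ignoring double-buffering or any extra buffers the driver keeps):

```python
def framebuffer_mb(width, height, bits_per_pixel):
    """Size of one screen's framebuffer in megabytes."""
    return width * height * (bits_per_pixel // 8) / (1024 * 1024)

for bpp in (16, 32):
    print(f"1024x768 at {bpp}-bit: {framebuffer_mb(1024, 768, bpp):.2f} MB")

# 1024x768 at 16-bit: 1.50 MB
# 1024x768 at 32-bit: 3.00 MB
# Painful on a 2-4 MB card, a rounding error on a 64 MB one.
```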
pr0iv2 Posted February 7, 2003

If you have a Voodoo card, its 16-bit output looks about as good as 32-bit, but that's not true of anything else. I had an Nvidia card, and in 16-bit I could see a horrible pattern in my game textures. Yes, it's in all Nvidia cards, and the pattern is deliberate. Don't believe me? Play Unreal Tournament, turn the brightness up to max, and look at a dark corner or a dark, plain texture. No company even tries to make 16-bit color look good anymore; instead they're cheapening 32-bit so you have to buy the GeForce FX with its 128-bit color. Ugh, video card companies piss me off.
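The repeating pattern pr0iv2 describes is characteristic of ordered (Bayer) dithering, which cards use when squeezing colors down to 16-bit. The sketch below is a generic textbook version of the idea, not Nvidia's actual hardware algorithm:

```python
# A 4x4 Bayer threshold matrix; values 0..15 give a position-dependent offset.
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_to_5bits(value, x, y):
    """Reduce an 8-bit channel value to 5 bits, nudged by an ordered-dither offset."""
    offset = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0 - 0.5  # in -0.5 .. +0.5
    step = 255 / 31                                         # one 5-bit quantization step
    nudged = min(255, max(0, value + offset * step))
    return round(nudged / step)                             # 0..31

# The offset depends only on screen position, so a flat dark texture quantizes to a
# repeating 4x4 pattern of slightly different shades -- the kind of thing you notice
# in a dark Unreal Tournament corner with the brightness turned up.
print([dither_to_5bits(20, x, 0) for x in range(8)])
# -> [2, 2, 2, 3, 2, 2, 2, 3]: a fixed repeating pattern instead of one flat level
```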
GTwannabe Posted February 11, 2003

That pattern you speak of is highly visible in the Thief series. Fugly Nvidia 16-bit color...
Mr.Guvernment Posted February 11, 2003

^^^ Well said. It's like taking an image that fades from one color to another: try it in both 16-bit and 32-bit, and the 32-bit version will show a smooth transition from color to color, whereas in 16-bit you are likely to see obvious bands where the color changes.
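Mr.Guvernment's gradient test is easy to reproduce in a few lines (my own sketch): take a smooth 8-bit fade and keep only the 5 bits of precision that 16-bit RGB565 gives the red and blue channels.

```python
fade = list(range(256))                     # a smooth 8-bit gradient, 256 shades
fade_16bit = [(v >> 3) << 3 for v in fade]  # keep only the top 5 bits of each shade

print(len(set(fade)))        # 256 distinct levels at 24/32-bit
print(len(set(fade_16bit)))  # 32 distinct levels at 16-bit -> visible stair-step bands
```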