[quote]The display controller IC contains a 256-entry CLUT. Although the CLUT supports a palette of 32,768 colors, many of the possible colors do not look acceptable on the display. Due to the nature of color LCD technology, some colors are dithered or exhibit noticeable flicker. Apple has developed new gamma tables for these displays that minimize flicker and optimize available colors. With these gamma tables, the effective range of the CLUT for the active matrix color display is about 24,000 colors; for the DualScan color display, the effective range is about 4,000 colors.[/quote]
That latter number (the roughly 24,000 effective colors) is bigger than 12 bits' worth, which implies that even *if* the LCD's interface is only 12 bit, the video hardware may do some sort of time-based multiplexing/dithering to increase the available color range (resulting in the "flicker"?). Perhaps that's the source of the "64 grey" claim, *or* perhaps whoever typed that up confused the LCD's capabilities with the external video port. (15/16 bit "highcolor" RAMDACs, which is what the developer note says the CLUT is part of, usually only had 6 bit DACs.) All we really know is that the video hardware *logically* supports "256 greys".
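As a sanity check on that claim, here's a quick back-of-the-envelope Python sketch using the figures from the developer note quoted above (nothing measured from real hardware, just the logarithm):
[code]
import math

# How many bits of color depth does each figure from the developer note imply?
for label, colors in [("12 bit interface", 2 ** 12),
                      ("active matrix, effective", 24000),
                      ("DualScan, effective", 4000),
                      ("15 bit CLUT palette", 32768)]:
    print(label, colors, "colors =", round(math.log2(colors), 1), "bits")

# 12 bit interface 4096 colors = 12.0 bits
# active matrix, effective 24000 colors = 14.6 bits   <- more than 12 bits
# DualScan, effective 4000 colors = 12.0 bits
# 15 bit CLUT palette 32768 colors = 15.0 bits
[/code]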
[quote]Is it possible to define 512 distinct gradations of gray in a 24 bit color graphic?[/quote]
Not true greys, no. You could certainly make an image that included the 256 real greys alongside another 256 shades in which the color values for R, G, and B are not exactly the same. (I.e., in addition to using #000000 through #FFFFFF you could add some arbitrary stripes where you subtract 1 from one of the RGB channels, e.g. #FEFFFF.) But by definition those extra "greys" will actually be subtly "sepia toned". WHICH IS CHEATING.
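For what it's worth, here's a tiny Python sketch of that cheat: 512 distinct 24 bit values that all read as "gray" on screen, half of which are true neutrals and half of which are off by one in a single channel (which channel gets tweaked is arbitrary, purely for illustration):
[code]
# 512 distinct near-gray values: 256 true neutrals plus 256 values with one
# channel off by one (i ^ 1 flips the low bit, so it never leaves 0..255).
def near_gray_ramp():
    shades = []
    for i in range(256):
        shades.append((i, i, i))       # true gray, R = G = B
        shades.append((i, i, i ^ 1))   # blue off by one -> faintly tinted
    return shades

ramp = near_gray_ramp()
assert len(set(ramp)) == 512
print("#%02X%02X%02X" % ramp[1])       # "#000001" rather than a true gray
[/code]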
[quote]If the CLUT is capable of 32,768, then it's 15 bit, meaning there are 32 shades of gray at most.[/quote]
The misleading part is that the resolution of the CLUT doesn't necessarily tell you the resolution of the DACs. ***Using a CLUT for 15 bit color is actually unusual... (*** EDIT: Rest of this thought deleted. Glancing again at the piece of the manual I quoted, I see the CLUT in this PowerBook doesn't do 15 bit indexed color; its palette register is 8 bits long. Derp. In my defence, the verbiage of the service manual is nonstandard, since it says "CLUT" to refer to what in a VGA card would be the entire "RAMDAC", of which the CLUT proper, aka the palette register, is only one part. Anyway...) All PC "highcolor" VGA cards that did 15 or 16 bit color, aka "Thousands of colors" in Mac terminology, still had *at least* six bit DACs in their RAMDACs; they just couldn't use the entire range when in the highcolor mode. (Which leads to the odd situation that you could actually have more true grays running in an 8 bit grayscale mode than you could in "thousands of colors" mode, because, as you note, the effective grayscale resolution is only 5 bit in "thousands of colors" but it's six or 8 bit in the indexed mode.)
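To spell out the arithmetic behind the 32/64/256 figures in this exchange, here's a short Python sketch (the RGB555 packing shown is the generic highcolor convention, not anything specific to this PowerBook's hardware):
[code]
# In 15 bit "highcolor" (RGB555) each channel gets 5 bits, so a true gray
# (R = G = B) can only take 2**5 = 32 values.
def pack_rgb555(r5, g5, b5):
    """Pack three 5 bit channel values into one 15 bit highcolor word."""
    return (r5 << 10) | (g5 << 5) | b5

print(hex(pack_rgb555(31, 31, 31)))   # 0x7fff, the brightest 15 bit "white"

# In 8 bit indexed mode the gray count is limited by the DAC width instead:
print(2 ** 5, 2 ** 6, 2 ** 8)         # 32 (RGB555), 64 (6 bit DAC), 256 (8 bit DAC)
[/code]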
Hard to say. If the panel is 6-bit, and the CLUT can use 6 of the 8 bits in the register in grayscale mode, it is possible it could display 64 shades of gray.
[attachment=0]Screen Shot 2013-08-29 at 7.56.33 PM.png[/attachment]
so does this mean they are telling a fib?
[quote]In principle you could get more shades of grey if, instead of feeding a grayscale monitor conventionally, you combined the outputs of all three D/As via a resistor network (or some other method), but you'd have to write a custom driver to make a sensible grayscale palette out of it.[/quote]
Interesting thought. I wonder if the "driver" side of things could be taken care of in one of the FPGA/CPLD based panel driver boards from Chinese eBay sellers.
The "Driver" issues would actually be OS and application related. Let's say that you were to, via whatever method, modify a video card so it supported something ridonkulous like 24 bit grayscale. (Most accurate way I can think of, assuming we're going oldschool and trying to do this with an analog monitor output, would be to make a board that had three very-high-speed 8 bit analog-to-digital converters on one side and a single 24 bit DAC on the other, connected to the single input on a true monochrome monitor. Digitize the incoming RGB signals into three bytes of data, stack them up, and use that on the output side as a single 24 bit value. Easy Peasy, relatively, The simplest method would be to simply use a different value resistor on each RGB line to combine them into one signal; see the comment below as to why from a practical standpoint the results would probably be indistinguishable.) The problem now is that you'll have to describe to the OS that it should treat its 24 bit framebuffer as grayscale, IE, use the three bytes that were RGB into a linear set of luminosity values instead of three sets of luminosity+chroma; if you simply keep working with 24 bit images as 24 bit color you might be able to display some pretty pictures but they won't make a lick of sense if you view them on a normal machine. "Deep Color" support (IE, support for 30/36/48 bit color depths) has been in Windows for a few years but googling seems to indicate it's *not* really in OS X yet. (Oddly there *was* apparently a 30 bit deep color card made by Radius for Macs in the 90's that was supported by PhotoShop.). I have no idea if "Deep Color" support comes along with an understanding of extended grayscale only images, but it would seem to be a prerequisite for really making use of a deeper grayscale framebuffer.Interesting thought. I wonder if the "driver" side of things could be taken care of in one of the FPGA/CPLD based panel driver boards from Chinese ebay sellers.
[quote]Apparently the LCD in the 540 is a true 6-bit panel.[/quote]
That doesn't actually surprise me. 6 bit grey resolution was what Mono VGA was capable of, so it would make sense they'd manufacture panels that support it.
[quote][quote]Apparently the LCD in the 540 is a true 6-bit panel.[/quote]That doesn't actually surprise me. 6 bit grey resolution was what Mono VGA was capable of, so it would make sense they'd manufacture panels that support it.[/quote]
What's interesting about this is that it means the CLUT register is capable of either 256 colors (3.3.2.0.0) or, at a minimum, a 6-bit linear grayscale value (6.0.0). Considering that grayscale values wider than a single color channel are possible in the register, technically it should be possible for it to output true 256 grayscale values with a true 8-bit grayscale panel attached to it. Of course, at that time, the most one ever saw was 6-bit panels. However, it would be fun to find a true 8-bit grayscale panel and see if the CLUT could drive it at the full 8-bit range. The OS is capable of handling 256 grays, so I don't see why not. A fun little hack to try. Unfortunately, far beyond my capabilities.
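To make the 64-versus-256 distinction concrete, here's a small Python sketch of a linear 256-entry grayscale CLUT and what a 6 bit panel would effectively make of it (pure arithmetic, not the PowerBook's actual register layout):
[code]
# 256 requested grays (R = G = B) versus what a 6 bit panel can resolve.
full_clut = [(i, i, i) for i in range(256)]      # linear 8 bit grayscale ramp

def as_seen_by_6bit_panel(entry):
    """Drop the low two bits of each channel, as a 6 bit panel effectively does."""
    r, g, b = entry
    return (r >> 2, g >> 2, b >> 2)              # 0 .. 63 per channel

distinct = {as_seen_by_6bit_panel(e) for e in full_clut}
print(len(full_clut), "grays requested ->", len(distinct), "actually distinct")
# 256 grays requested -> 64 actually distinct
[/code]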
I'm sure with the color display the control panel says "256 greys". I would just wager that in practice they're dithered, not true.