Also at WWDC were Apple's new Cinema Displays. Yeah, they look all right, but I am not going to get one – both for their price and for the lack of DVI output on my machines. While I did like the idea of ADC, getting rid of it probably isn't the worst thing; most people just don't (un)plug their screens that often. It does mean, though, that people are back to the good old times of AppleVision displays, with multiple plugs coming out of a single cable.
I thought that having FireWire as well as USB ports on the screen was a neat idea. With more and more devices around, this makes it easier to quickly plug in the iPod, say. In the pictures, though, the placement of those sockets doesn't look like it will be too easy to reach – most likely a tradeoff between a pleasing look and ease of use. But why do they only offer two freaking USB ports? Surely having four wouldn't cost a lot more. Apple are notoriously stingy with those ports, and that's a bad idea, particularly with more and more devices becoming available, and cheaply so. Having a keyboard, a mouse, a USB stick, a graphics tablet, a digital camera, a printer and a scanner doesn't seem unusual today, and it means you soon run out of ports. Particularly as USB can't carry a lot of power to bus-powered devices, you may end up having to put an additional powered hub next to your pretty >€1000 display.
But let's move on to the big things. Whoa, that 30" display looks huge. But will it be any good? It looks so huge that you probably can't see all of it without moving your head. Perhaps that's a bit too much. In addition, contrary to what Steve said, 4 million pixels just aren't that many. In 1993, my LC III could drive around half a million pixels; in 1996 my Power Mac could drive almost 2 million; my three-year-old PowerBook can drive almost 3 million (on two screens) – and now people are asked to shell out for a cutting-edge graphics card just to attach that display. Greatness? Badly designed technology, I'd say.
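For reference, the numbers behind the keynote claim are easy to check – this little sketch assumes the 30" model's published native resolution of 2560×1600:

```python
import math

# Assumption: 2560x1600 is the 30" Cinema Display's published native resolution.
width_px, height_px = 2560, 1600
diagonal_in = 30

# Total pixel count -- the "4 million pixels" from the keynote.
pixels = width_px * height_px
print(pixels)  # 4096000

# Pixel density: pixels along the diagonal divided by the diagonal in inches.
ppi = math.hypot(width_px, height_px) / diagonal_in
print(round(ppi, 1))  # about 100.6 ppi
```

So the "4 million pixels" and the "100 ppi" figures are really two sides of the same spec sheet.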
Also, I'd like to note that this new generation of displays finally sees Apple making a statement about resolutions again. Their new displays are advertised as having a "100ppi optimum resolution":
After years of experience, Apple engineers have discovered the ideal resolution to display both sharp text and graphics — a pixel density of about 100 pixels per inch (ppi).
Now what does that tell us? After twenty years of good old 72 ppi, Apple has finally upped that number a bit. But – as a look at their OS suggests – they are also going to stick with assuming a fixed resolution, the UI remaining non-scalable. As we all know, this has benefits for usability, since auto-scaling interfaces tend to look crappy or even be unusable. On the other hand, finding a good way to make the UI scale – be it to keep UI elements readable on ultra-high-resolution screens, or to let old people and the visually impaired enjoy the system more – would be a great step in future-proofing the system.
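To illustrate what a resolution-independent UI would involve – a hypothetical sketch, not how Mac OS X actually works – sizes would be specified in typographic points and converted to device pixels using the screen's actual density, so a menu bar keeps the same physical size on any display:

```python
def points_to_pixels(points: float, screen_ppi: float) -> int:
    """Convert a size in typographic points (1/72 inch) to device pixels.

    Hypothetical helper for illustration: at 72 ppi, points and pixels
    coincide, which is exactly the assumption a fixed-resolution UI bakes in.
    """
    return round(points * screen_ppi / 72)

# A 13 pt label stays roughly 13/72 of an inch tall regardless of density:
for ppi in (72, 100, 200):
    print(ppi, points_to_pixels(13, ppi))
# 72 ppi -> 13 px, 100 ppi -> 18 px, 200 ppi -> 36 px
```

The point is that only the conversion step knows about the screen's ppi; everything above it keeps working as pixel densities climb.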
With Apple boldly stating that 100 ppi is the "ideal resolution", it looks like they are trying to postpone facing this particular challenge. To me the statement reads like:
"100 ppi is a resolution at which our screens don't look too outdated and people can still see the menu bar at the fixed size we envisioned for it." Ideal indeed.
With screen resolutions, sizes and GPU power increasing at the same rate as before, this approach may still be good for a few years – but the word "innovative" doesn't exactly come to mind.