576 words on Mac OS X
One appeal of Core Image is that it makes developers’ lives easier. It provides a number of tools and filters for manipulating images, taking the maths and technicalities out of developers’ hands and letting them focus on the results they want to achieve. At the same time, the technology tries to deliver good performance for these image operations by streamlining the combination of several steps and potentially making use of the vast processing power available in the machine’s graphics chip rather than just its CPU. People even talked about this being a way to exploit that processing power for scientific applications – but I haven’t seen any of that yet.
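That streamlining works roughly like lazy evaluation: filters merely describe transformations, and the whole chain is only compiled and executed when a result is actually requested, so several steps can be fused into a single pass over the pixels. Here is a rough sketch of that idea in plain Python – all the names are hypothetical, this is not Apple’s API:

```python
# Sketch of the idea behind Core Image's deferred rendering: filters are
# only descriptions, and the whole chain is fused into one pass over the
# pixels when the result is finally requested. Names are hypothetical.

class Filter:
    """Wraps a per-pixel function; chaining builds a graph, not new images."""
    def __init__(self, fn, upstream=None):
        self.fn = fn
        self.upstream = upstream

    def then(self, fn):
        # No pixels are touched here -- we just extend the description.
        return Filter(fn, upstream=self)

    def render(self, pixels):
        # Collect the chain, then apply every step in one pass per pixel.
        chain = []
        node = self
        while node is not None:
            chain.append(node.fn)
            node = node.upstream
        chain.reverse()
        out = []
        for p in pixels:
            for fn in chain:
                p = fn(p)
            out.append(p)
        return out

# Example: brighten, then invert, an 8-bit grayscale "image".
brighten = Filter(lambda p: min(p + 40, 255))
pipeline = brighten.then(lambda p: 255 - p)
print(pipeline.render([0, 100, 250]))  # -> [215, 115, 0]
```

The point of deferring the work is that the intermediate “brightened” image never has to be written to memory – which is exactly the kind of saving that matters when the pixels live in video RAM.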
By and large this technology is a pretty bold move which should certainly give us fancier graphics in the future. It is not entirely clear to me, though, how it fits in with Apple shipping a sizeable share of their computers today (MacBook, Mac mini and the smallest iMac) without a particularly strong graphics chip. And I sometimes think that Core Image’s optimisations – even now, a year after these machines were released – aren’t particularly good in that situation. Just using Photo Booth with some image effects makes you realise that it’s fun, but the time needed to apply the effects is large enough to make it non-realtime. Watching the machine’s CPU usage while doing that reveals that the CPU spends at most 50% of its time on the task, and my assumption is that Core Image uses the GPU for most of the processing and fails to utilise the CPU, which is just idling. These effects aren’t phenomenally difficult to compute, and I’m sure a machine which can decode full-screen H.264 video – something fairly complex – with half of its processing power is powerful enough to apply a few transformations to far fewer pixels.
This theory may be corroborated by watching Quartz Composer closely. Playing with it on a machine with a weak graphics system (both the graphics processor’s speed and the amount and speed of the video memory may play a role here, I suppose) can easily bring you into a situation where the ‘Rendering Load’ in Quartz Composer is maxed out and only low frame rates are achieved while the computer’s CPU is mostly idle.
Another problem I have been seeing with graphics processing in Quartz Composer (which I suppose is closely related to Core Image, at least) ever since I got the MacBook with its weak graphics system is that results are not merely a little imprecise but massively imprecise. I am not talking about super-subtle differences here but about clearly visible ones. These are quite easy to reproduce when rendering text and comparing the results from a MacBook Pro with those from a MacBook, say. [Sandvox users can easily see this in the Cathedral or Bubble Bath designs, where the replacement text looks great on a computer with a dedicated graphics chip and crappily blurred on a computer without one in current OS X versions.] This makes the practicality of these new technologies somewhat questionable, as it suggests they are buggy in a way that makes the results of processing depend very strongly on the hardware you are using.
Well, no more so than just about any other graphics-oriented technology. How many times have I picked up a PC game box only to see that you can’t play it at the full frame rate with the special effects turned on unless you have a bad-ass video card? Too many times to count. Sure, you can play it… but you won’t have the best experience unless you’ve got the hardware it wants to run on.
The difference here is that Core Image is not some add-on application but a “core” (ha ha) piece of the OS. This makes me question Apple’s decision to use a graphics chip that borrows RAM from the system memory. Because you are absolutely correct… it will diminish the user experience for those apps dependent on Core Image or the Quartz Extreme compositor (the “original” Quartz Compositor didn’t use hardware GLSL). I guess it’s the only way to make entry-level machines like the MacBook hit the price point they needed, and it helps distinguish the line from the “Pro” machines.
Still, in many respects, I have to put some of the blame for the poor user experience on developers who don’t adequately warn their customers of the requirements… or don’t program a proper fallback routine for machines that lack the hardware to handle the intended task. The OS X Dashboard is a good example of this. If you have adequate hardware, you get a nice ripple effect when you place a widget. If you don’t have the hardware to support the ripple, it is turned off. This way the placement of widgets may not look as pretty, but it is fast, efficient and usable… all because it is programmed with a proper fallback.
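The “proper fallback” pattern is simple enough to sketch generically: probe the hardware once, then pick the fancy or the plain code path. The capability check here is hypothetical – a real app would query the graphics stack (for instance, whether the GPU supports fragment programs):

```python
# Generic sketch of graceful fallback, as in the Dashboard ripple example.
# `gpu_supports_effects` stands in for a real capability query.

def ripple_effect(x, y):
    return f"widget at ({x}, {y}) with ripple"   # eye candy on capable hardware

def plain_placement(x, y):
    return f"widget at ({x}, {y})"               # fast and usable everywhere else

def place_widget(x, y, gpu_supports_effects):
    if gpu_supports_effects:
        return ripple_effect(x, y)
    return plain_placement(x, y)

print(place_widget(10, 20, gpu_supports_effects=False))  # -> widget at (10, 20)
```

The important part is that the decision is made by the application, once, up front – rather than letting a weak GPU grind through an effect it cannot render at an acceptable frame rate.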
With all that being said, I trust that since you are an Apple developer you have played around with the Core Animation package that’s part of Leopard? Talk about sweet! It’s almost too good, and I am a little scared of some of the wacky interface elements developers might come up with, since it makes even the most complex transitions almost effortless.
I think Core Image is in quite a different league from high-end gaming. In that area the limitations are quite well understood, and people buy specific software just to use it.
Because of its ‘Core’ status, though, people shouldn’t need to worry about their computer’s graphics power – just as in the Dashboard example you mention, where graceful fallback is included.
But I think the potential Core Image difficulties go deeper. Take the Sandvox example I mentioned (which may not strictly be Core Image but Quartz Composer): as far as I can tell there is no way to predict the large difference between what a MacBook Pro and a MacBook do from reading Apple’s documentation. It’s something that took everybody by surprise. And it’s something – static text rendering – which shouldn’t depend on your video hardware, as even a ten-year-old computer can handle that perfectly well in real time with its CPU alone.
In other cases the developers may be to blame, as you say. How many applications use Core Image these days for things that absolutely require a powerful graphics card? (Examples?) My impression is that in most cases the benefit of these technologies is more for the developer than for the user. Which is a good thing, of course, but it is also something where the user may inadvertently lose out on some functionality or be required to start caring about technical details of his computer which he doesn’t want to care about. In particular, I wonder whether it is really clear how to test an application using Core Image and friends. On how many machines and graphics setups does a developer need to test to be sure things work? And what kind of differences can you expect?
And you are totally right about Core Animation. I think it’s quite similar to Core Image in that it is a great long-term plan because it significantly reduces the complexity of implementing animations. That’s a great thing. However, it will most likely see even more abuse than Core Image. Particularly in user interfaces I think that animations are dangerous and merit very thorough investigation before being implemented. Otherwise those eye-candy effects will become tiring and time-wasting by the third time you see them.
Examples of animation: fast and smooth page-down scrolling in web browsers is a good one, as it helps you keep track of where the previous bottom of the screen ends up after scrolling. Submenus sliding out of their menu items rather than just appearing – the default setting in Windows XP – is a bad one, because it will drive you crazy once you have to select something from a submenu more than twice (Nikon Scan software, I’m looking at you!). Another very difficult example is the resize-and-fade of windows when their content changes (as in System Preferences). Having seen several implementations of such effects, I got the impression that the margin within which the speed and other parameters of the fade make it look good and smooth rather than awkward and ugly is really small. And I am pretty sure that many developers will give us too many such effects without putting enough effort into fine-tuning those details, just because they are very easy to set up. [Which sounds a bit like RealBasic… it isn’t a bad thing per se – but it enables many people who shouldn’t unleash their GUIs to do just that, so it ends up having a bit of a bad reputation.]
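How narrow that margin is becomes obvious once you write the timing curve down: the difference between a smooth fade and an awkward one comes down to a couple of parameters. A rough sketch – the cubic ease-in-out used here is just one common choice of timing function, nothing specific to Apple’s frameworks:

```python
# A fade's perceived quality hangs on its timing curve and duration.
# Cubic ease-in-out: slow start, fast middle, slow finish (t in 0..1).

def ease_in_out(t):
    if t < 0.5:
        return 4 * t * t * t
    return 1 - ((-2 * t + 2) ** 3) / 2

def fade_opacity(t, duration=0.25):
    """Opacity at time t (seconds) of a fade-in lasting `duration` seconds."""
    progress = min(max(t / duration, 0.0), 1.0)
    return ease_in_out(progress)

# Halfway through, the curve is at exactly 0.5 -- but get the duration
# wrong by even a factor of two and the same fade feels sluggish or abrupt.
print(fade_opacity(0.125))  # -> 0.5
```

Everything here is one line to change, which is exactly the problem: the effect is trivially easy to set up, while finding the values that feel right still takes careful tuning.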