A nice thing about high level frameworks like Cocoa is that they do most of the ugly work for you. An example for that is mouse tracking. Usually you don’t even have to worry about it. The framework knows where the mouse cursor is and it also knows where your UI elements are. And it will deduce all the necessary consequences from that. You don’t have to do anything.
That’s fine and dandy until you want to have custom mouse behaviour. That requires some extra effort, and I am tempted to say that it’s not entirely clear how to implement it, simply because there are several ways to do it.
The most traditional way of achieving that would be to run in a tight event loop and evaluate the mouse position after each step. That’s the way people did things in the System 7 days and thankfully they are considered ridiculously effortful today. In fact, I’m not even sure how you’d use such a technique in Cocoa.
The other ways of working on this use NSViews. I’d say that’s the way things are supposed to be. Yet, when using NSViews there seem to be several ways to do mouse handling, none of which is as straightforward to use as I’d like it to be. A single, well-documented method that kicks butt would have been preferable.
The first and most basic way to work with mouse events would again be to track all mouse events that may be related to a window. Cocoa won’t send you those by default; you need to use the window’s -setAcceptsMouseMovedEvents: method to activate them. My experiences with this were mixed. In particular, my impression was that I kept receiving plenty of events even when the mouse wasn’t inside the window in question. But even if all of that worked as expected, tracking every single mouse movement and operating on that would seem a rather clumsy way of doing things. This being the 21st century and all.
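A minimal sketch of that first approach, assuming a custom NSView subclass (the method bodies below are my own illustration, not from any Apple sample):

```objc
// Ask the window to forward mouse moved events once the view is
// placed in one. The view also has to be in the responder chain
// (e.g. be the first responder) to actually receive them.
- (void)viewDidMoveToWindow {
    [[self window] setAcceptsMouseMovedEvents:YES];
}

- (BOOL)acceptsFirstResponder {
    return YES;
}

// Cocoa then delivers every single movement here.
- (void)mouseMoved:(NSEvent *)event {
    NSPoint p = [self convertPoint:[event locationInWindow] fromView:nil];
    NSLog(@"mouse at %@", NSStringFromPoint(p));
}
```

Evaluating the position in -mouseMoved: on every event is exactly the firehose approach described above, which is why it feels so clumsy.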
The second way of manipulating things is the cursor rects of an NSView. You can set up a number of rectangles and assign an NSCursor to each of them. If the mouse cursor enters a rectangle, its image changes to the specified one. This is probably a nice and simple way of doing things, with the view automatically calling your -resetCursorRects override when the view’s size changes, giving you a chance to update things for the new situation.
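A sketch of such an override, splitting a hypothetical view into two halves with different cursors (the layout is made up for illustration):

```objc
// Cocoa calls this automatically whenever the cursor rects need
// rebuilding, e.g. after the view is resized.
- (void)resetCursorRects {
    NSRect leftHalf, rightHalf;
    // Split the view's bounds into a left and a right half.
    NSDivideRect([self bounds], &leftHalf, &rightHalf,
                 NSWidth([self bounds]) / 2.0, NSMinXEdge);
    [self addCursorRect:leftHalf cursor:[NSCursor pointingHandCursor]];
    [self addCursorRect:rightHalf cursor:[NSCursor crosshairCursor]];
}
```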
That’s probably useful for a number of situations but it doesn’t let you actually do things when the mouse cursor enters one of those specified rectangles.
To do that, you will need the third method: the slightly more sophisticated tracking rectangles. After setting the rectangles up, you will be called whenever the mouse cursor enters or leaves one of them. Some interactivity can be achieved this way.
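A sketch of the tracking-rectangle setup; the `trackingTag` ivar is assumed for this example so the rectangle can be removed again later:

```objc
// Register a classic tracking rectangle covering the whole view.
- (void)viewDidMoveToWindow {
    trackingTag = [self addTrackingRect:[self bounds]
                                  owner:self
                               userData:NULL
                           assumeInside:NO];
}

// Cocoa calls these when the cursor crosses the rectangle's edge.
- (void)mouseEntered:(NSEvent *)event {
    NSLog(@"entered tracking rect %ld", (long)[event trackingNumber]);
}

- (void)mouseExited:(NSEvent *)event {
    NSLog(@"left tracking rect %ld", (long)[event trackingNumber]);
}
```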
But in Mac OS X 10.5, Cocoa introduced the fourth player: the slightly more powerful NSTrackingArea. It looks like it’s supposed to be the generalisation of everything that was there before. You can simply tell your tracking area which events it should respond to, and it will then call your view’s mouse entered, exited, moved and cursor update methods accordingly.
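A basic setup might look like this; the `wholeViewArea` ivar and the @"name" userInfo key are my own inventions for the example:

```objc
// Rebuild the tracking area whenever Cocoa asks for it, e.g.
// after the view changes size.
- (void)updateTrackingAreas {
    [super updateTrackingAreas];
    if (wholeViewArea != nil) {
        [self removeTrackingArea:wholeViewArea];
        [wholeViewArea release]; // manual reference counting
    }
    wholeViewArea = [[NSTrackingArea alloc]
        initWithRect:[self bounds]
             options:(NSTrackingMouseEnteredAndExited
                      | NSTrackingMouseMoved
                      | NSTrackingActiveInKeyWindow)
               owner:self
            userInfo:[NSDictionary dictionaryWithObject:@"wholeView"
                                                 forKey:@"name"]];
    [self addTrackingArea:wholeViewArea];
}

// The userInfo dictionary is the only handle for telling several
// areas apart later on.
- (void)mouseEntered:(NSEvent *)event {
    NSString *name = [[[event trackingArea] userInfo] objectForKey:@"name"];
    NSLog(@"entered area %@", name);
}
```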
That’s the theory anyway. Things still aren’t quite as good as a first look suggests. Despite the thing being named ‘area’, it is nothing but a rectangle with horizontal and vertical sides. I also found it to send plenty of mouseMoved: calls I didn’t ask for. And I found tracking areas somewhat inconvenient to use simply because they are hard to identify: they don’t have a simple name as an identifier but only an inconvenient-to-access userInfo dictionary.
And those are the obvious things. Bonus questions, to be answered from the documentation, might be the following non-trivial but still obvious ones: what happens when tracking areas overlap? Will I receive messages for each of them? If so, in which order? If not, which of the areas will be considered ‘on top’?
From there we can move on to the more interesting field of moving tracking areas around. Any control-like view with moving parts probably does that. Despite NSTrackingArea being an object, it is immutable. Meaning that you set its location and the type of events it responds to when creating it, and that you’ll have to delete and re-create it when things move around. Not the kind of elegance I was hoping for. Even more so as that way of working means that moving a tracking area kills its history. When a tracking area is relocated such that the mouse cursor ends up outside of it after the relocation, you won’t receive a message about that, simply because the newly created tracking area cannot know about the change.
This effectively means that you are back to the technique of doing manual checks whenever things move around just to be on the safe side - which is exactly the stuff you wanted the clever framework to do for you. Not too impressive.
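The remove-and-recreate dance plus the manual safety check could be sketched like this; `knobArea` and `mouseInKnob` are ivars assumed for the example:

```objc
// 'Move' a tracking area by replacing it, then check by hand
// whether the relocation changed the cursor's inside/outside
// state, since the new area cannot send that message itself.
- (void)moveKnobTrackingAreaToRect:(NSRect)newRect {
    if (knobArea != nil) {
        [self removeTrackingArea:knobArea];
        [knobArea release]; // manual reference counting
    }
    knobArea = [[NSTrackingArea alloc]
        initWithRect:newRect
             options:(NSTrackingMouseEnteredAndExited
                      | NSTrackingActiveInKeyWindow)
               owner:self
            userInfo:nil];
    [self addTrackingArea:knobArea];

    // Manual check replacing the missing enter/exit message.
    NSPoint inWindow = [[self window] mouseLocationOutsideOfEventStream];
    NSPoint inView = [self convertPoint:inWindow fromView:nil];
    BOOL inside = NSPointInRect(inView, newRect);
    if (inside != mouseInKnob) {
        mouseInKnob = inside;
        // … synthesise the enter/exit handling by hand here …
    }
}
```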
And don’t get me started on coordinates. All coordinates are floating point numbers. It’s just that if you don’t round them to integral values, things start behaving strangely. In particular because there seem to be two distinct mouse positions: some ‘actual’ floating point mouse position and the rounded mouse position you get from -mouseLocationOutsideOfEventStream. The bitter irony with that is that Cocoa will call your -mouseEntered: method as soon as the mouse cursor’s non-rounded coordinates are inside the tracking rectangle. Yet the cursor coordinates you receive at that moment, after the rounding that happens before they are passed to you, may not reflect that: check with NSPointInRect whether or not you are inside the rectangle of the tracking area that just alerted you about the mouse cursor, and you may learn that the mouse cursor isn’t actually in there.
This doesn’t make terribly much sense to me and it effectively means that you’re heading for trouble as soon as you start using non-integer coordinates for anything, even non-visible tracking rectangles. Which of course makes you wonder where exactly a developer benefits from everything being on a floating point basis internally.
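One defensive workaround I can think of (my own sketch, nothing official) is to keep everything integral yourself and re-test before acting:

```objc
// Snap the tracking rectangle to whole-number coordinates with
// NSIntegralRect before testing, so the comparison agrees with
// the rounded cursor position Cocoa hands out.
- (void)mouseEntered:(NSEvent *)event {
    NSPoint p = [self convertPoint:[event locationInWindow] fromView:nil];
    NSRect r = NSIntegralRect([[event trackingArea] rect]);
    if (!NSPointInRect(p, r)) {
        return; // cursor not actually inside; ignore the message
    }
    // … handle the genuine enter here …
}
```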
Please share your own experiences with Cocoa mouse tracking.
Not quite sure if my recollection is correct, but I believe that non-integral cursor positions only (mainly?) occur when using a Wacom tablet or possibly similar input device. An advantage of this may be that for drawing apps you can use the higher-than-screen resolution to get smoother curves, even when having the canvas at zoom levels lower than 100%. A not so nice side effect is that screenshots where you manually select the rectangle to capture (using Apple-Shift-4) may end up blurry, as the window server (or the screenshot app or whatever) cuts out the screen region using the non-integral rect. I guess I should have filed this as a bug report with Apple, but I kept using a simple workaround (select the rect with the trackpad instead of the tablet).
Something I have noticed is that mouse moved events are always sent, by-and-large, to the last window you clicked on. I say by-and-large because I know of at least one instance where this is not true, but either way it’s not what I want. I want the mouse moved event to be sent to the window under the cursor and that doesn’t happen. Bummer.
I found this in one of the sample applications from Apple:
```objc
NSTrackingArea *area = [[NSTrackingArea alloc]
    initWithRect:[containingView frame]
         options:(NSTrackingMouseEnteredAndExited
                  | NSTrackingActiveInActiveApp
                  | NSTrackingInVisibleRect)
           owner:self
        userInfo:nil];
[containingView addTrackingArea:area];
[area release];
```
This code, I don’t know how, automatically resizes the tracking area to fit the new size of the view. Greetings!