Monday, December 22, 2008

Cocoa Memory Management

It becomes evident, thanks to the mass centralisation of the neverending september effect that is stackoverflow, that despite the large number of electrons expended on documenting the retain/release/autorelease reference counting mechanism for managing memory in Cocoa, Cocoa Touch, UIKit, AppKit, Foundation, GNUstep, Cocotron and Objective-C code, very few people are reading that. My purpose in this post is not to re-state anything which has already been said. My purpose is to aggregate information I've found on the topic of managing memory in Cocoa, so I can quote this post in answers to questions like these.

In fact, I've already answered this question myself, as How does reference counting work? As mentioned in the FAQ, I actually answered the question "how do I manage object lifecycles in (Cocoa|GNUstep|Cocotron)"? It's actually a very violently distilled discussion, so it's definitely worth checking out the references (sorry) below.

Apple have a very good, and complete, Memory Management Programming Guide for Cocoa. They also provide a Garbage Collection Programming Guide; remember that Objective-C garbage collection is opt-in on 10.5 and above (and unavailable on iPhone OS or earlier versions of Mac OS X). GNUsteppers reading along should remember that the garbage collector available with the GNU objc runtime is entirely unlike the collector documented in Apple's guide. GNUstep documentation contains a similar guide to memory management, as well as going into more depth about memory allocation and zones. Apple will also tell you how objects in NIBs are managed.

The article which gave me my personal eureka moment was Hold Me, Use Me, Free Me by Don Yacktman. Stepwise has another article, very simple rules for memory management in Cocoa by mmalc, which is a good introduction though with one caveat. While the table of memory management methods at the top of the article is indeed accurate, it might give you the impression that keeping track of the retain count is what you're supposed to be doing. It's not :). What you're supposed to be doing is balancing your own use of the methods for any given object, as described in rules 1 and 2 of "Retention Count rules" just below that table.
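To make that balancing concrete, here's a minimal sketch of the two situations the rules describe; the Widget and WidgetFactory classes are invented purely for illustration:

- (void)doSomethingWithWidgets
{
    // I created this one with +alloc, so I own it and must balance that with -release.
    Widget *mine = [[Widget alloc] init];
    [mine frobnicate];
    [mine release];

    // I didn't create this one and haven't retained it, so I mustn't release it;
    // if I needed it to outlive the current scope I'd retain it now and release it later.
    Widget *borrowed = [WidgetFactory sharedWidget];
    [borrowed frobnicate];
}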

James Duncan Davidson's book "Learning Cocoa with Objective-C" has not been updated in donkey's years, but its section on memory management is quite good, especially the diagrams and the "rules of thumb" summary. Luckily, that section on memory management is the free sample on O'Reilly's website.

If reading the theoretical stuff is all a bit too dry, the Mac Developer Network have a rather comprehensive memory management training video which is $9.99 for non-MDN members and free for paid-up members.

Finally, Chris Hanson has written a good article on interactions between Cocoa memory management and Objective-C exceptions; if you're using exceptions this is a good discussion of the caveats you might meet.

Wikipedia == fail

On the same day that Fark announce the Wikipedia irony, I'd like to point out a similar situation I saw just today.

This is from the Susie Dent entry discussion page:

IMDb relies on the contributions of the public, so isn't an overly reliable source.

Sunday, December 21, 2008

better security, not always more security

Today's investigative investigations have taken me to the land of Distributed Objects, that somewhat famous implementation of the Proxy pattern used for intra-process, inter-process and inter-machine communication in Cocoa. Well, used by people who measure whether it's a performance hog, rather than those who merely quote that it is; as a hint, it was indeed a significant overhead when your CPU was a 25MHz 68030 and your network link a 10BASE-2 coaxial wire. These days we can spend our way around those problems freely.

Specifically, I wondered whether I should add discussion of the authentication capabilities in PDO to the FAQ entry. Not that it's frequently asked - indeed, it's a NAQ - but because getting mentions of security into a Usenet FAQ is likely to get newbies thinking about security, which is possibly a good thing (for the world, not so much for my uniquely employable attributes). But I decided against it; the subject is interesting, though not because of the technicality but because of the philosophy.

Distributed Objects works by sending NSPortMessage messages over NSConnection connections. The connections and message-passing bumph are peer-to-peer, but DO adds some client-server distinction by having servers register their vended connections with name servers and clients look up the interesting vendors in said name servers. By default, anything goes; all connections are honoured and all clients serviced. There are two security features (both implemented as delegate methods) baked into DO, though. The more interesting of the two is the authentication.

The reason that the authentication feature is interesting is that it's implemented in such a way as to make non-security-conscious developers question the security. The end sending the NSPortMessage includes some data based on the constituent parts of the message, and the end receiving the message decides whether to accept it based on knowledge of the constituents and of the data. On the face of it, this looks like shared-secret encryption, with the shared secret being the algorithm used to hash the port message. It also appears to have added no security at all, because the message is still sent in plain text. In fact, what this gives us is more subtle.

All that we know is that given the source information and the sender's authentication data, the receiver gets to decide whether to accept the sender's message. We don't necessarily know the way that the receiver gets to that decision. Perhaps it hashes the information using the same algorithm as the sender. Perhaps it always returns YES. Perhaps it always expects the authentication data to be 42. On the other hand, perhaps it knows the public key of the sender, and the authentication data is a signature derived from the content and the sender's private key. Or perhaps the "authentication data" isn't used at all, but the source material gives the server a chance to filter malicious requests.
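To make the shape of the thing concrete, here's a minimal sketch of just one possible answer, a shared-secret digest. The two delegate methods are NSConnection's real hooks; the class name, the secret and the choice of SHA-1 are all invented for illustration, and any other answer to "should I accept this message?" fits the same two methods:

#import <Foundation/Foundation.h>
#import <CommonCrypto/CommonDigest.h>

@interface SharedSecretAuthenticator : NSObject
{
    NSData *secret;   // shared out-of-band with the other end; set elsewhere
}
@end

@implementation SharedSecretAuthenticator

- (NSData *)digestForComponents: (NSArray *)components
{
    NSMutableData *material = [NSMutableData dataWithData: secret];
    for (id component in components)
    {
        // Port messages carry NSData and NSPort components; only hash the data parts.
        if ([component isKindOfClass: [NSData class]])
            [material appendData: component];
    }
    unsigned char md[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1([material bytes], (CC_LONG)[material length], md);
    return [NSData dataWithBytes: md length: sizeof(md)];
}

// Sender side: attach some authentication data to the outgoing message.
- (NSData *)authenticationDataForComponents: (NSArray *)components
{
    return [self digestForComponents: components];
}

// Receiver side: decide whether to process the message. Returning NO rejects it.
- (BOOL)authenticateComponents: (NSArray *)components withData: (NSData *)signature
{
    return [[self digestForComponents: components] isEqualToData: signature];
}

@end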

Now all of that is very interesting. We've gone from a system which looked to be based on a shared secret, to one which appears to be based on whichever authentication approach we decide is appropriate for the task at hand. Given a presumed-safe inter-process link, we don't need to be as heavyweight about security as to require PKI; whereas if the authentication were provided by a secure tunnel such as DO-over-SSL, we'd have no choice but to accept the cost of the PKI infrastructure. Given the expectation of a safe server talking to hostile clients, the server (or, with some amount of custom codery, a DO proxy server) can even sanitise or reject malicious messages. Or it could both filter requests based on authentication and on content. The DO authentication mechanism has baked in absolutely zero policy about how authentication should proceed, by letting us answer the simple question: should this message be processed? Yes or no? Choose an approach to answering this question based not on what you currently believe could never be circumvented, but on what you currently believe is sufficient for the environment in which your DO processes will live. If a shared secret is sufficient and adds little overhead, then do that, rather than 4096-bit asymmetric encryption.

By the way, the second security feature in DO is the ability to refuse a connection when it's requested. This allows a DO server to survive a DoS attempt, even from a concerted multitude of otherwise permitted clients.
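That refusal is just another delegate method; a sketch (the counter and the limit are invented bookkeeping, the method itself is NSConnection's) could be as blunt as:

- (BOOL)connection: (NSConnection *)parentConnection
    shouldMakeNewConnection: (NSConnection *)newConnection
{
    // Refuse further connections once we think we're saturated; existing ones carry on.
    return numberOfLiveConnections < kConnectionLimit;
}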

Thursday, December 11, 2008

Whither the codesign interface?

One of the higher-signal-level Apple mailing lists with a manageable amount of traffic is apple-cdsa, the place for discussing the world's most popular Common Data Security Architecture deployment. There's currently an interesting thread about code signatures, which asks the important question: how do I make use of code signatures?

Well, actually, it's not so much about how I can use code signatures, but how the subset of Clapham Omnibus riders who own Macs (a very small subset, as the combination of overheating batteries in the old G4 PowerBooks and combustible bendy-busses means they don't stay around very long) can use code signatures. Unfortunately, the answer currently seems to be "not much", with little sign of that changing. The code signing process and capability is actually pretty damned cool, and a nice security feature which I'll be talking about at MacDev 09. It's used to good effect on the iPhone, where it and FairPlay DRM are part of that platform's locked-down execution environment.

The only problem is, there's not much a user can do with it. It's pretty hard to find out who signed a particular app; in fact the only thing you can easily do is discover that the same entity signed two versions of the same app, and you discover that through the absence of an interface, not through any form of dialogue or confirmation. That means that when faced with the "Foobar app has changed. Are you sure you still want to allow it to [whatever]" prompt, many users will be unaware of the implications of the question. Those who are aware, and (sensibly) want to find out why the change has occurred, will quickly become frustrated. Therefore everyone's going to click "allow", which rather reduces the utility of the feature :-(.

Is that a problem yet? Well, I believe it is, even though there are few components yet using the code signature information in the operating system. And it's precisely that allow-happy training which I think is the issue. By the time the user interfaces and access control capabilities of the OS have developed to the point where code signing is a more useful feature (and believe me, I think it's quite a useful one right now), users will be in the habit of clicking 'allow'. You are coming to a sad realisation; allow or deny?

Monday, December 01, 2008

You keep using that word. I do not think it means what you think it means.

In doing a little audience research for my spot at MacDev 2009, I've discovered that the word "security" to many developers has a particular meaning. It seems to be consistent with "hacker-proof", and as it could take most of my hour to set the record straight in a presentation context, here instead is my diatribe in written form. Also in condensed form; another benefit of the blog is that I tend to want to wrap things up quickly as the hour approaches midnight.

Security has a much wider scope than keeping bad people out. A system (any system, assume I'm talking software but I could equally be discussing a business process or a building or something) also needs to ensure that the "good" people can use it, and it might need to respond predictably, or to demonstrate or prove that the data are unchanged aside from the known actions of the users. These are all aspects of security that don't fit the usual forbiddance definition.

You may have noticed that these aspects can come into conflict, too. Imagine that with a new version of OS X, your iMac no longer merely takes a username and password to log a user in, but instead requires that an Apple-approved security guard - who, BTW, you're paying for - verifies your identity in an hour-long process before permitting you use of the computer. In the first, "hacker-proof" sense of security, this is a better system, right? We've now set a much higher bar for the bad guys to leap before they can use the computer, so it's More Secure™. Although, actually, it's likely that for most users this behaviour would just get on one's wick really quickly as they discover that checking Twitter becomes a slow, boring and expensive process. So in fact by over-investing in one aspect of security (the access control, also sometimes known as identification and authorisation) my solution reduces the availability of the computer, and therefore the security is actually counter-productive. Whether it's worse than nothing at all is debatable, but it's certainly a suboptimal solution.

And I haven't even begun to consider the extra vulnerabilities that are inherent in this new, ludicrous access control mechanism. It certainly looks to be more rigorous on the face of things, but exactly how does that guard identify the users? Can I impersonate the guard? Can I bribe her? If she's asleep or I attack her, can I use the system anyway? Come to that, if she's asleep then can the user gain access? Can I subvert the approval process at Apple to get my own agent employed as one of the guards? What looked to be a fairly simple case of a straw-man overzealous security solution actually turns out to be a nightmare of potential vulnerabilities and reduced effectiveness.

Now I've clearly shown that having a heavyweight identification and authorisation process with a manned guard post is useless overkill as far as security goes. This would seem like a convincing argument for removing the passport control booths at airports and replacing them with a simple and cheap username-and-password entry system, wouldn't it? Wouldn't it?

What I hope that short discussion shows is that there is no such thing as a "most secure" application; there are applications which are "secure enough" for the context in which they are used, and there are those which are not. But the same solution presented in different environments or for different uses will push the various trade-offs in desirable or undesirable directions, so that a system or process which is considered "secure" in one context could be entirely ineffective or unusable in another.

Free apps with macdev ticket

The Mac Developer Network currently have a special offer running until Christmas Eve: get a free copy of Changes and Code Collector Pro with your ticket. Both are useful apps for any developer's arsenal.

Sunday, November 23, 2008

Some bloody genius

Link to the image, because I know it's too wide for the Blogger template to display properly.

Tuesday, November 04, 2008

More on MacDev

Today is the day I start preparing my talk for MacDev 2009. Over the coming weeks I'll likely write some full posts on the things I decide not to cover in the talk (it's only an hour, after all), and perhaps some teasers on things I will be covering (though the latter are more likely to be tweeted).

I'm already getting excited about the conference, not only because it'll be great to talk to so many fellow Mac developers but due to the wealth of other sessions which are going to be given. All of them look really interesting though I'm particularly looking forward to Bill Dudney's Core Animation talk and Drew McCormack's session on performance techniques. I'm also going to see if I can get the time to come early to the user interface pre-conference workshop run by Mike Lee; talking to everyone else at that workshop and learning from Mike should both be great ways to catch up on the latest thoughts on UI design.

By the way, if you're planning on going to the conference (and you may have guessed that I recommend doing so), register early because the tickets are currently a ton cheaper. Can't argue with that :-).

I'm sorry, I haven't a Clu

One of my many "repeat-until-funny" jokes; anyway, here is what I have to say on Mr. Cluley's blog regarding iPh0wn.

Wednesday, October 29, 2008

Solaris iPhone Edition

Apple's one new feature in Snow Leopard is support for Exchange, which if not squarely an Enterprise lure is certainly bait for medium businesses. But here we hit Apple's perennial problem; they want to sell more into businesses (because that's where at least 2/3 of all PC money is to be made) but they want to design their systems for home users. When a system is designed to cover every possible potential use for a computer we end up with Windows, which is the kind of "few things to all people" solution that Apple are - rightly - keen to avoid. But as Tim Cook's "state of the Mac" segment in the recent laptop event showed, one of Apple's biggest growth areas is education which is organised along enterprisey lines.

Their solution thus far has been a partial one; we get Mac OS X which is basically a consumer OS, and then we get Mac OS X server which is the same OS with a few configuration changes and extra apps to support being used as a workgroup server. This is less distinct than the changes between Mac OS X and iPhone OS X, but the principle is the same; the same technology is used in different ways, so we get different interfaces to it. Note that these aren't really very divergent products - a UNIX expert could set up an Open Directory Master on a standard Mac OS X box were they so inclined. We get the Mac Pro and the XServe as nods to the existence of more powerful hardware than the iMac. While Apple do have a network of business development managers, enterprise sales people, sales engineers and so on who can support larger customers, their capabilities and freedom are restricted by working on a consumer product in a consumer organisation.

Assuming that Apple aren't going to retreat and consolidate all of their effort on the consumer/prosumer, the logical plan seems to be "the same only more so"; carry on the scheme of applying a common technology base to multiple markets, but with the product interfaces and configurations being specific to the role in which they'll be used. Empower those enterprise sales, support and development teams to make the changes required in both the shared technology base and the domain-specific parts in order to advance their own cause. Allow them to do so in such a way that the consumer focus of the standard products is not diluted. To do all this, what Apple would need is to clearly delineate their Core OS, Consumer OS and Server OS engineering groups, while adding staff, expertise and intellectual property to their Server OS, Server Hardware and Enterprise Support groups.

The bit about "adding staff, expertise and intellectual property to their Server OS, Server Hardware and Enterprise Support groups" can be easily achieved by using the Blue Peter principle. Here's one I prepared earlier. And no, I'm not going mad. Sun have plenty of experience in supporting larger customers and what marketing people like to call vertical markets, and have some good technology: hardware, operating systems software, enterprise services and applications. Their only problem is that they can't make any money on it. On the other hand with Apple it seems that the money is there to be made, and the problem is stepping up to that plate without compromising the consumer products. Consolidating Mac OS X [+ Server] and Solaris 10 would not be trivial but is not beyond the realms of fantasy. NeXTSTEP ran on SPARC hardware, and as we know that Mac OS X runs on PPC, two different Intel architectures and ARM it's likely that the effort to port Mac OS X to SPARC would not be great. But perhaps more useful in the short term is that OpenStep ran on Solaris before, and could do again. Even though Sun have switched Solaris to a SYSV-derived platform, due to Apple's recent push for standardisation with Leopard the two OS are likely more source-code compatible than NeXTSTEP and SunOS 4 ever were. Getting Cocoa up on Solaris would mean that application portability (for the sorts of apps that server admins will want - including Apple's own server admin tools, not for OmniDazzle) becomes viable while the combined company (Snapple?) concentrate on integrating the core tech. They could even get Jonathan Schwartz to do the coding.

Another factor in this proposition is that JAVA is cheap. Apple currently have about $20B in cash and Sun's shares are worth $3.6B, but taking into account that Sun have lost 98% of their dot-com-boom value without slowing their R&D projects, the value for money when you want them for their tech, smarts and goodwill rather than their user base is astounding.

Oh, and speaking of JAVA, what about Java? Java currently represents Sun's main income due to the licensing scheme, but Apple's investment in the platform has declined over time from the Rhapsody days of "everything is Java"; currently the available Java on Mac OS X lags behind Sun's version and isn't ppc64 compatible. The WebObjects team (and hence the Apple store and iTunes) have a heavy Java investment, while other teams have dropped Java (Cocoa) and still others eschew it completely. The iPhone has a very busy developer ecosystem - and absolutely no Java. Where the hypothetical Snapple would leave Java is entirely open, but the option of packaging up the combined company's Java assets and re-selling them would seem unnecessary, unless you thought that even $3.6B was too much to pay.

Sunday, October 19, 2008

All you never wanted to know about temporary files and were too ambivalent to ask

In the beginning, there was mktemp. And it was good.


Actually, that's a load of rubbish, it wasn't good at all. By separating the "give me the name of a temporary file" and "open the file" stages, there's a chance for an attacker to create the temporary file with the name you've chosen between the stages, or create a symlink with the same name.


Next there came mkstemp. And that was better. But not by much. mkstemp opens the file for you as well as choosing a name, so the file is guaranteed not to exist before you try to use it, and will definitely have the ownership and permissions you want.


There is yet another step which the über-paranoid application could take in order to ensure that no other process can see its temporary files, and that is to unlink the file as soon as you get it back. Unfortunately there's no "mkstempr" function (and it might get confused with the equally non-existent mkstemp_r), so this is still a two-stage operation. Unlinking a file which you have open removes it from the directory listing, but doesn't change the fact that you have it open; it's now exclusively yours.
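Putting the whole dance together, a minimal sketch in C (the path is arbitrary and error handling is kept deliberately thin) looks like this:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    // The Xs are replaced by mkstemp, which also creates and opens the file,
    // so there is no race between choosing a name and using it.
    char template[] = "/tmp/mytemp.XXXXXX";
    int fd = mkstemp(template);
    if (fd == -1) { perror("mkstemp"); return EXIT_FAILURE; }

    // Step two for the über-paranoid: unlink it immediately. The file no longer
    // appears in any directory listing, but the open descriptor keeps it alive.
    if (unlink(template) == -1) { perror("unlink"); }

    const char secrets[] = "nobody else can open this now\n";
    write(fd, secrets, strlen(secrets));
    close(fd);   // at this point the storage is reclaimed
    return EXIT_SUCCESS;
}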

Wednesday, October 15, 2008

It was asked for: the "features" post

Someone anonymous once said:
I'm intrigued by your feature comment. Please publish said blog post!
Where said comment was:
The fact that I have stopped using the word 'feature' in many contexts is an entire blog post and a few therapy sessions in itself.
So here, for your delectation, is that entire blog post.

When you're trying to decide what software people want, and indeed how to tell them that they want whatever software they're going to get instead, that's marketing (mainly - it's partly sales, and there's yet another tangential post on why I occasionally deliberately conflate marketing and sales). Marketing works in terms of features, which for the purposes of marketing means "properties or qualities of the software which we think might make people interested in that software".

When you're trying to decide what software to build, or trying to build the software, more specific terms are used. Initially people split requirements into two distinct groups, functional (what the system is capable of) and non-functional (how the system goes about its capabilities), but a more precise organisation is often needed. For instance, a requirement of system security might result in both functional and non-functional aspects of the system being specified.

Of course, some or all of the capabilities are also features; in fact it's generally true that the set of all features, the set of all known requirements and the set of things the customer wants are intersecting subsets of the set of all possible qualities of a software system. Companies without an intersection between any two of these sets tend to go out of business very quickly. But the sets rarely perfectly overlap.

For instance, it's a feature of Windows 7 that it's named differently from Windows Vista, because Microsoft's marketing requires that customers believe that they've put Vista behind them. However, it's also a feature of Windows 7 that it not be very distinct from Vista, because marketing require that application compatibility doesn't get broken. Hence we have the interesting situation that Windows 7 is also Windows 6.1. And if Microsoft think they're being innovative in that version numbering policy, they should try looking up the history of SunOS/Solaris version numbers. BTW, indeed I haven't switched my SUNW tag to JAVA, because I already use the java tag to mean the Java language and the Java platform. Marketing people can be funny sometimes.

Another example, less confusing though more contradictory, is Apple's Snow Leopard collateral. The fact that marketing are telling us there are no new features in Snow Leopard means that "no features" is something they believe we might want to buy, which in turn makes it a feature… confused?

So anyway, I try to avoid using the word "feature" when I'm talking about software, because I'm usually instead talking about a capability or property of a software system, and not about marketing that software system. For instance, in Properties about a year on I described properties as a capability of the Objective-C 2.0 language, which indeed they are. It happens that properties is also a feature of the language (don't believe that programming languages have marketing departments? What else do Apple's tech evangelists do, if it isn't marketing?), but in the case of that post I was talking about what can be done with properties, how properties can be used, and not how they can switch developers to Leopard from Tiger or .NET.

And in other news, it seems that badly-parked tech company founder Mercs are back in fashion.

Sunday, October 05, 2008

The view(s) from the hotel

Which two buildings are across the street from my current location?

Trend HQ

and

Symantec HQ

Anyone might think it had been deliberately chosen for comedy purposes…

Saturday, October 04, 2008

Properties about a year on

Leopard has now been out for nearly a year, which means that (publicly) we've had Objective-C 2.0 for the same amount of time. At the release many developers were champing at the bit to talk about the new language capabilities[*], including properties. There were arguments on both sides of the divide, and even a little bit of discussion. But now that we've been using these things for a while, and because I'm bored awake grouchy vocal opinionated, let's have a look back at what they've given us.

There is a broken abstraction in traditional Objective-C, which is the accessor-method-as-property-declaration. Essentially an object can give you two things: work (i.e. it can do stuff) and state information (i.e. it can say stuff about itself and let you change it). In traditional object-oriented languages, because 'saying' and 'changing' are verbs which can be 'done', the two have both been expressed using the same method (heh) as the expression of work. This is not the case in much object-oriented design; for instance, in UML a class always has separate "attributes" and "operations".

Properties fix up this abstraction by giving us orthogonal ways to express the two concepts. Work is done in methods; state is got/changed in properties. Now it may be that the state information is actually backed by a method (although it may bang on the ivar directly; more below), but we don't need to know that any more than we need to know in the interface that a property is synthesized or dynamic. All we do need to know is that it is there for us to use, and has certain attributes such as being read-only.
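As a minimal sketch (the class is invented), the split now looks like this:

@interface Rocket : NSObject
{
    float thrust;   // implementation detail; the 32-bit runtime still needs the ivar declared
}
// State: something the object will tell you about itself (read-only, in this case).
@property (readonly) float thrust;

// Work: something the object does.
- (void)launch;
@end

@implementation Rocket
@synthesize thrust;
- (void)launch { thrust = 100.0f; /* and the rest of the work */ }
@end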

The "on more below" bit is that discussions of KVC-like mechanisms - such as KVC :-) - often involve someone pointing out that they break encapsulation, because it's possible to access an @private ivar with no accessors by retrieving it by key. That's really thinking about the design of a class in terms of the way it's executed rather than its interface contract with the developer, because the @private ought to tell the developer not to touch that particular ivar. Properties neither help nor hinder breakage from the execution side, but from the design side they do provide a stronger distinction between "properties I'm telling you about in the interface" and "things you shouldn't touch". Now we can all get back to using the class's interface to observe how to use it, and that C struct bit at the top to observe how to extend it, as nature intended. It's both a blessing and a curse that Objective-C allows things to appear in source files which don't make it into the executable code, but that doesn't stop such information being useful to the developer in the same way that code comments can be read but not executed.

One of the popular complaints about ObjC properties is the syntax for referring to them in methods (OK, or indeed in functions), where it is argued that myObject.someProperty = 4; doesn't readily tell you whether myObject is an ObjC object, a C struct or a C union. That seems to be at worst a straw man argument to me, and at best a hypothetical issue; in well-designed software it will be rare to mix code at various levels of abstraction except in limited circumstances such as adapter classes. Besides, if the code has been written such that it can be inspected or reviewed (i.e. to some agreed style and standard) and the reviewer is paying attention then it will be easy to distinguish use of the various types. At some conceptual level the C . and Objective-C . operator are doing the same thing anyway; they're both saying "this attribute of that thing".

[*]The fact that I have stopped using the word 'feature' in many contexts is an entire blog post and a few therapy sessions in itself.

Sunday, September 28, 2008

MacDev 2009!

It's a long way off, but now is a good time to start thinking about the MacDev '09 conference, organised by the inimitable Scotty of the Mac Developer Network. This looks like being Europe's closest answer to WWDC, but without all those annoying "we call this Interface Builder, and we call this Xcode" sessions. Oh, and a certain Sophist Mac software engineer will be talking about building a secure Cocoa application.

Thursday, September 25, 2008

Rhetoric, smoothly outlined

Something I did a number of years ago (I could tell you how many, couldn't I? If I could remember; I think it must have been 7) was to study critical analysis. That's the application of linguistics and sociology to, well, basically to refusing to believe anything people say to you ever again. As an example of how it's useful to someone who isn't a professional rhetorician, here's a discussion of the things I read in The iPhone Store Impending Disaster Myth. Mainly because that article is fairly close to the top of my RSS feed reader.

The first thing to note is the use of loaded language in the title - the hyperbolic phrase "impending disaster" and its syzygy with the word "myth" clearly setting the author's stall out. This is reinforced by the first paragraph:

According to the predictable opinion scribes [...] They’re wrong, here’s why.

That first sentence fragment paints the subjects of the author's post as thoughtless machines, churning out page after page of text reinforcing their unchanging opinion. Ironically that is exactly what we are about to read for the next several paragraphs. It's a convenient amalgamation of two rhetorical techniques; most obviously it is an ad hominem (to the man) argument. Attention is diverted away from the discussion of Apple's app store and onto the people with which the author disagrees. This then is the beginning of a straw man which will be constructed toward the end of the piece, sowing the seed in the reader's mind that the author's opponent does not have a relevant argument.

The final sentence, "they're wrong, here's why", is a trademark of this particular author (or maybe that's an example of confirmation bias on my part) and actually renders the rest of the article meaningless for most people. It tells us that the rest of the article is a repudiation (for why it isn't a refutation, read on, but the point of this sentence is some verbal sleight of hand to make you believe that a refutation is to follow) of the position the author has defined for the "predictable opinion scribes", which is either going to make you believe that what's coming up will be an excellent riposte or a boring diatribe, depending on the opinion you've already formed about this author. All that the remaining part of the article needs to do is to fill up past the end of the page so that you believe the riposte/diatribe really exists, and it performs this task with aplomb.

What happens from here is actually rather subtle. The author outlines the position he intends to oppose, followed by "here’s[sic] the facts they’re missing". But the next few sections, from "Developers, Developers, Developers" to "Why Platforms Win" contain an opinionated retrospective on the computing industry, using links to the author's own articles as references. Opinionated? Well, count the number of times the phrase "third rate, old technology" appears. It's actually only four, but it moves from what "IBM, Microsoft, and the PC cloners [Oxford comma sic]" were doing to "the Microsoft strategy". There's enough filler (26 paragraphs and 10 linked articles in the same style by the same author) that it could be easy to forget that segue occurred. A fact which doesn't escape the author:

If you made it this far, you may have forgotten that the first argument against Apple vetoing apps

Too right we might have forgotten. What we haven't forgotten is that we were told "here's why" the app store naysayers were wrong, but have actually been told why Lotus 1-2-3 outsold Visicalc. The author's argument follows the pattern "B follows A. C. Therefore A." Loosely the argument could be described as a "red herring fallacy", although the word I prefer for what the intervening text underwent is "contextomy".

Anyway, before we got here, our author let his façade slip a little:

Now let’s hammer away at the sappy pleading on behalf of developers who want Apple to cater to their whims due to the attractive populist concept of fairness in doing so.

Ooops! Now, do we think that the author is for or against people who disagree with Apple? Anyway, enough backtracking. Why don't we move forward from the end of my previous <q>?

[...] is that its decisions are unpredictable and arbitrary.

Now read the rest of that section. There's a good amount of text to describe why these decisions aren't arbitrary. Whatever happened to unpredictable? Oh, and for bonus points, look for where the final paragraph contradicts the earlier thrust of the section and reinforces the notion that arbitrary rejections have occurred.

The rest of the article carries on in the same vein, and having seen the way in which I automatically parse the earlier part you can probably guess how my cynical mind interprets the rest of the text. Oh, and speaking of cynicism, if you're still wondering why this is a repudiation and not a refutation, then my evil little mind-play trick worked! You've read at least part of every paragraph in the hope of getting the information I promised at the beginning; if only I'd put some adverts in the post somewhere. So to refute means to prove to be false, whereas to repudiate means to reject. The article we've just looked at is an internally inconsistent expression of the author's opinion, no proof having occurred. It's also an example of the informal fallacy of suppressed correlative. Apple's practices can't be bad, because Microsoft's practices are bad and Apple's are better than Microsoft's.

Well, that was fun! The next time you're talking to your boss (or better, your marketing people), listen out for those rhetorical devices and remember to stay critical :-).

Wednesday, September 24, 2008

AppleScript, for once

AppleScript isn't something I write much about; in fact this is the first post I've ever created on the topic. But AppleScript, like the Services menu and Automator, provides that most useful of usability enhancements: the ability to use multiple applications together without fulfilling the marketing requirements of having to look at them all.

As an example, a folder action script might let me combine the Finder with any other application, such as, choosing completely at random, Sophos Anti-Virus:

on adding folder items to this_folder after receiving these_items
  set theScript to "/usr/bin/sweep -nc"
  repeat with i from 1 to number of items in these_items
    set thePath to POSIX path of item i of these_items
    set theScript to theScript & space & thePath
  end repeat
  set theScript to theScript & space & "--quarantine:mode=000"
  do shell script theScript
end adding folder items to


That script then scans any file which appears in a particular folder and locks it if it contains a virus (up to a point). But that's not really the point; the point is that I haven't actually had to use any of the target apps in order to get this combined functionality. It's like I was able to summon the Megazord without having to actually talk to the individual Power Rangers. Erm, or something. And that, really, is how a computer should work; I didn't buy OmniFocus so that I could look at its icon, or a splash screen; I bought it because it can manage my lists of things to do. And I got iCal in order to manage events in time. If I have things to do at specific times, then I ought to be able to combine the two, and the computer can do the work involved. After all, that is why I bought the computer.

Monday, September 15, 2008

Overdoing the risk management

I own a notebook. In fact, I own several notebooks. One in particular has an interesting feature (where I use "feature" in the "different from the competition, though we don't know whether anyone actually needs it" sense); inside the front cover is space to write your address, and a dollar value reward available to the person who returns the notebook.

Now the notebook itself is probably worth about $20, but on the face of it a used notebook is worth less than a pristine notebook, with a full notebook having no value. Presumably the value of the reward should be related to the value of the notes contained within it, and therefore can't be ascertained until I've filled the notebook up. But then if I were to lose it before filling in the pages, I would not have entered an interim value; and if I had then whenever I made new notes I would need to update the worth of the book.

And who should be footing the bill, anyway? Are my musings of any financial benefit to me, or if my employers get more worth from them should they be contributing to the reward fund? Could I possibly make the same notes again were I to lose this book? Could I pay someone with a lower salary than mine to have thoughts with a similar monetary value? Would someone else who came across my notebook be able to extract the same worth from the contents than me? If so, should I write in an encrypted fashion? How much more would that cost me? Should the reward factor in the costs of decrypting the contents, possibly reverse-engineering the method if I've forgotten it?

Do ideas depreciate? Clearly patentable ideas do, will my ideas be patentable? Will I be able to benefit from the patents? If someone finds the notebook and returns it, are the ideas still patentable? What about non-patentable thoughts, do they all depreciate at a constant rate? Should the reward value be a function of time?

Clearly the only people who can answer all of these questions upfront, and therefore the people who can use this reward feature with confidence, are the people whose ideas can be modelled with a waterfall development process. Take Terry Pratchett; he might know that the content of one notebook equates to roughly 50% of a novel, and that each novel is worth £200k, and therefore the value to him of the notebook is less than £100k. A thought process which eventually results in a cash value for a notebook. For those of us whose ideas are somewhat more iterative (read: chaotic), this seems like a complete misfeature.

Monday, September 08, 2008

Me.com. Your identity, everywhere.

Title linky goes to a Sophos blog post I wrote about the relative success of MobileMe phishing scams, and the insecurity of MobileMe web access.

Friday, September 05, 2008

Apple 2, iamleeg 0

So, my few-year-old iPod decided it had had enough, and with pay day having only just passed I thought maybe it would be nice to get a new one. What's happened today? Got the new one home, and it won't work at all (searching for "error 1434" isn't particularly useful, either). However, the one that previously broke, having now been taken apart, started working again. So my 20GB 4G iPod is now humming along nicely (running Podzilla), and my 160GB classic is b0rked :-(.

Tuesday, September 02, 2008

The twitter sitter hit a bitter critter

Yup, more on the subject of a home-grown Twitter client. This time, posting and sorting out the UI somewhat have both been achieved:



Posting tweets is amazingly simple - just take the tweet and stuff it into the body of a POST NSURLRequest. The Twitter API even handily returns the posted tweet, so the same code which parses the friends timeline can also insert the new tweet.
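Roughly, and hedged as a sketch rather than the actual project code (the URL is the Twitter API of the day, and the credentials handling is left out, which is what the Keychain wishlist item is for), it amounts to little more than:

// Build a POST request whose body is the status update, then fire it off.
NSString *tweet = @"Writing my own Twitter client, like everyone else";
NSString *body = [@"status=" stringByAppendingString:
    [tweet stringByAddingPercentEscapesUsingEncoding: NSUTF8StringEncoding]];

NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
    [NSURL URLWithString: @"http://twitter.com/statuses/update.xml"]];
[request setHTTPMethod: @"POST"];
[request setHTTPBody: [body dataUsingEncoding: NSUTF8StringEncoding]];

// The response is the posted tweet, so the timeline-parsing code can insert it.
[NSURLConnection connectionWithRequest: request delegate: self];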

So, where to go next? Well, I'm getting bored of typing my password in all the time, so Keychain support would be nice; @reply buttons and perhaps searching too. I'm going to need cache management soon as well.

Mac user Gmail account hack

Today I found, in the MacInTouch reader reports, the news that a Mac user discovered his Gmail account had been taken over. He writes:


I woke up this morning and looked at my gmail and thought, gee that's weird, it won't accept my password. I figured it was a glitch and tried it on my iphone, same thing.

Then I asked for a password reset. When I got back into the account, found a bunch of sent emails from a Nigerian scammer. I also looked at the ip history in gmail and noticed the weird IP, which of course came from Nigeria.

This relates well to a point I've made repeatedly in podcasts and papers; namely that having information worth stealing is not a Windows-only situation. As more data is stored "in the cloud", the security of the cloud and of the way we use it becomes as important as what is going on in our own computers. Having a weak Facebook password compromised will work just as well if you're on Trusted Solaris as Windows.


In other news, yesterday's Twitter client is not really much further along, because a thunderstorm has meant I've unplugged all of my electronics (the laptop isn't plugged in to anything, obviously). I am now very grateful to MarsEdit for having offline editing capability, otherwise I'd have to try and remember all this stuff later ;-)

Monday, September 01, 2008

A better bit o' twitter than the bitter twitter Tommy Titter bought

Just because everyone these days writes a Twitter client:



This was actually a quick hack project to make up for the fact that I missed CocoaHeads tonight (due to a combination of an uninteresting phone call, and a decision to recover from the phone call by using the rest of my petrol tank). Really it was just an excuse to play with some APIs: the tweets are grabbed by the controller using NSURLConnection, then some NSXML/XPath extracts the useful information (or not, it is Twitter after all) and puts it into the model. There are many things which need to happen before this is at all a useful Twitter client; the ability to write back and nicer formatting are just the starters. Shiny Core Animation twitting ought to happen.
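For the curious, the grabbing-and-extracting is hardly any code at all. This sketch uses NSXMLDocument's URL initialiser rather than the NSURLConnection delegate dance, and the element names come from Twitter's XML of the time; treat it as an illustration rather than the project itself:

// Fetch the friends timeline and pull the interesting bits out with XPath.
NSURL *url = [NSURL URLWithString: @"http://twitter.com/statuses/friends_timeline.xml"];
NSXMLDocument *doc = [[NSXMLDocument alloc] initWithContentsOfURL: url
                                                           options: 0
                                                             error: NULL];
NSArray *statuses = [doc nodesForXPath: @"//status" error: NULL];
for (NSXMLNode *status in statuses)
{
    NSString *text = [[[status nodesForXPath: @"text" error: NULL] lastObject] stringValue];
    NSString *who  = [[[status nodesForXPath: @"user/screen_name" error: NULL] lastObject] stringValue];
    NSLog(@"%@: %@", who, text);   // the real thing puts these into the model instead
}
[doc release];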

Still, not bad for two hours I think.

Fuzzing as a security testing tool

Google have a new browser project, called Chrome, and in their introduction they explain perfectly, through the medium of images, how fuzzing works.


Of course, as anyone could tell you, if you take a thousand monkeys and a thousand typewriters and put them all in a room for long enough, you will end up with a thousand broken typewriters, ten fat monkeys and 990 monkey skeletons.

Friday, August 29, 2008

Walking a mile dans ses chaussures

The word 'translator' has an interesting history. In the Anglo-Saxon language, 'wealhstod' meant "learned in Welsh" more or less, and described someone who could parley with the important members of the local British tribes. As is often the case with invasions, the British started to use the word, so the Welsh title 'Gwalstawt' means "interpreter of tongues", i.e. the Welsh word for "can speak another language" originally meant "can speak Welsh" (there's another word more closely related to Breton treiñ or Cornish trélya in Welsh, too; trosi).



Anyway, to see what localisation people go through during the l10n process, I decided the best thing to do was to try it myself. To save the time it would have taken to write an internationalised app, I used someone else's; namely TextEdit. Here's the result after about 90 minutes of work:



Trahtendebyrdenne

The first thing to notice is that I haven't actually got much done yet. I've started working on the main menu NIB file (Edit.nib), and I'm about halfway through that. At this rate, it would take me at least a (working) day to finish - granted I'm no expert at the task, so I'm having to make a more heroic effort on otherwise "standard" translations than most localisers would. Although I do have a glossary to help. Even so, TextEdit is a fairly simple app; it's easy to see that even if the translation became a mechanical process, translating a complex program would take a long time.



The other thing you might have noticed is that Mac OS X doesn't actually support Old English, and yet that's the language of my translation. There's a simple trick here; convince Mac OS X that it does support Old English ;-). Type this command in the Terminal:


$ defaults write NSGlobalDomain AppleLanguages '(ang, en, /* other languages */)'

and Robert, as they say, is your father's brother. Apps will now look for localised resources in 'ang.lproj' when they start, so that's where your Old English resources live.

Wednesday, August 27, 2008

Next CocoaHeads Swindon meeting

1st September (that's this coming Monday), in the Glue Pot, Swindon. 8:00pm start. Chris Walters will talk about, well, something, and we'll be drinking beer, listening and occasionally chipping in. See you there!

Wednesday, August 13, 2008

Back from holiday

I went on holiday to Stockholm this week. Of the ~1.5GB I downloaded from the camera, this was the photo I thought most apt to describe the experience on this blog:

Ubuntu

Tuesday, July 29, 2008

Cocoa#, Mono and Me

My great application:


Currency Converter.net


Yeah, OK, not so great. But this Inverse Hoffman is the result of a couple of hours hacking in Mono, with Cocoa#. My app's largely based on the Stupid Word Counter tutorial, though it's a from-scratch implementation of the famous Apple/NeXT sample application in C#.


Firstly, a little history. My first encounter with .NET was back in about 2003 at a Microsoft Developer Roadshow in the car park (and later lecture theatre) in Oxford's comlab. I was particularly interested in their discussions of cross-platform capability, Project Rotor (I kept in touch with one of the Rotor developers) and so on, but really didn't see much exciting in .NET. Nonetheless, being a fair man, I took my beta CDs of Windows 2003 and Visual Studio .NET and gave them a whirl. Unfortunately, still not much interesting. Largely due to buggy betas and a lack of beta documentation.


Now accessing Cocoa from non-ObjC languages is nothing new; we've been doing it from Perl, Python, Ruby and Java for ages, and particularly old farts might even remember Objective-Tcl. Why should I care about Cocoa#? Well for a start, there are likely to be a lot of Windows developers out there with some (language that boils down to MS IL eventually) skills who want to produce Mac applications, and it'd be interesting to see what we'll end up with when they do. And it's always fun to learn a new language, anyway ;-)


Good points


  • Real NIB files. No really, that interface is genuinely an IB 3.x NIB based on the Cocoa Application template. The objects inside it are Objective-C objects, and there are good old outlets and actions (I haven't yet investigated whether Cocoa Bindings would work).

  • macpack. A command-line tool which takes your IL executable and wraps it up in a Cocoa application bundle, ready for drag-deployment.

  • Good inline bridging information. Unlike, say, PyObjC the language bridge isn't completely dynamic, but unlike the Java bridge or JIGS you don't have to keep a separate manual mapping of real classes onto ObjC shams. For instance, here's the controller from Currency Converter.net, complete with class, ivar and method exports:

namespace info.thaesofereode.CurrencyConverter
{
    [Cocoa.Register("CurrencyConverterController")]
    public partial class CurrencyConverterController : Cocoa.Object
    {
        public CurrencyConverterController(System.IntPtr native_object) : base(native_object) {}

        //Cocoa IBOutlets
        [Cocoa.Connect]
        private Cocoa.TextField inputCurrency;
        [Cocoa.Connect]
        private Cocoa.TextField outputCurrency;
        [Cocoa.Connect]
        private Cocoa.TextField conversionRate;

        //Cocoa IBAction
        [Cocoa.Export("calculate:")]
        public void calculate(Cocoa.Object sender)
        {
            //get the rate from the view
            System.String rate = conversionRate.Value;
            CurrencyConverterModel.Rate = System.Convert.ToDouble(rate);
            //get the currency
            System.String input = inputCurrency.Value;
            System.Double output = CurrencyConverterModel.convert(System.Convert.ToDouble(input));
            //update the UI
            outputCurrency.Value = System.Convert.ToString(output);
        }
    }
}

Bad points


  • Not very Cocoa-like wrapper classes. I think this is deliberate; they've gone for making the Cocoa shim look like a .NET interface because after all, we're programming from .NET. This is a bit disappointing as I'm more familiar with PyObjC and the Perl-ObjC-Bridge where the APIs are left pointedly alone, but given the target audience of Cocoa# it's unsurprising.

  • MonoDevelop. Luckily, using it isn't mandated.

So, overall, one more good point than bad (and a tentative two, if you overlook MonoDevelop); a pretty good initial evaluation and I might give this a deeper scrape.

Tuesday, July 22, 2008

Microblogging

For a long time, I deliberately avoided microblogs like twitter. I thought that they were simply an acknowledgement that people want to be published more than they want to have something to say. However, it would be rude of me to completely disavow the medium without actually giving it a go.

To that end, I may indeed be iamleeg on twitter, as soon as twitter actually finishes processing the signup form.

I'd like to point out that one problem I'm going to have is brevity - I have spent 650 characters telling you what my username is. Constraining myself to SMS-sized wibblings will indeed be tricksy.

Monday, July 21, 2008

Common sense writ large

Looking at the bottom of Apple's Q3 results, as indeed with any similar publication from a US publicly-traded company, we see the following text.

This press release contains forward-looking statements including without limitation those about the Company’s estimated revenue and earnings per share. These statements involve risks and uncertainties, and actual results may differ. Risks and uncertainties include without limitation potential litigation from the matters investigated by the special committee of the board of directors and the restatement of the Company’s consolidated financial statements; unfavorable results of other legal proceedings; the effect of competitive and economic factors, and the Company’s reaction to those factors, on consumer and business buying decisions with respect to the Company’s products; war, terrorism, public health issues, and other circumstances that could disrupt supply, delivery, or demand of products; continued competitive pressures in the marketplace; the Company’s reliance on sole service providers for iPhone in certain countries; the continued availability on acceptable terms of certain components and services essential to the Company’s business currently obtained by the Company from sole or limited sources; the ability of the Company to deliver to the marketplace and stimulate customer demand for new programs, products, and technological innovations on a timely basis; the effect that product transitions, changes in product pricing or mix, and/or increases in component costs could have on the Company’s gross margin; the effect that product quality problems could have on the Company’s sales and operating profits; the inventory risk associated with the Company’s need to order or commit to order product components in advance of customer orders; the effect that the Company’s dependency on manufacturing and logistics services provided by third parties may have on the quality, quantity or cost of products manufactured or services rendered; the Company’s dependency on the performance of distributors and other resellers of the Company’s products; the Company’s reliance on the availability of third-party digital content; and the potential impact of a finding that the Company has infringed on the intellectual property rights of others. More information on potential factors that could affect the Company’s financial results is included from time to time in the Company’s public reports filed with the SEC, including the Company’s Form 10-K for the fiscal year ended September 29, 2007; its Forms 10-Q for the quarters ended December 29, 2007 and March 29, 2008; and its Form 10-Q for the quarter ended June 28, 2008, to be filed with the SEC. The Company assumes no obligation to update any forward-looking statements or information, which speak as of their respective dates.

Erm, like, duh. Stuff which we say might happen in the future might not actually happen. Really? You've got to get out of the financial industry; there's a lucrative career ahead of you in construction.

Friday, July 18, 2008

Designing a secure Cocoa application

That's the title of next month's CocoaHeads Swindon, and I'll be leading the presentation/discussion. So if you want to learn a little about how to ensure your Cocoa app doesn't give away the keys to the kingdom, or have some experiences to share with the rest of the group, come along! We'll be at the Glue Pot, which is nice and near the train station as well as reasonably close to a car park. We'll be congregating at 7:00 but will wait for everyone to be settled with a beer in their hand before starting ;-).

Monday, July 14, 2008

Ah, the sweet sound of my own voice

The title is a linky to the press release regarding the edition of Sophos Podcasts I recorded with Carole, and which has now (clearly) gone live. In it we mainly talk about the data theft and Macs technical paper I've already posted about. This is the first podcast I've ever been involved in, so any feedback you have (apart from telling me that I sound drunk in my first sentence, I'm not sure what that's all about) will be most welcome!

Tuesday, July 08, 2008

CocoaHeads Swindon

Just got back from the first meeting of Swindon CocoaHeads, featuring a bunch of people who live nowhere near Swindon, some good beer and the occasional discussion of Cocoa. Special mention to Scott who came all the way from sunny Warsaw to be with us!

Tonight's event was an informal, "what do we want from Swindon CocoaHeads?" event, but it looks like being successful enough that we'll be doing it again. The format will be a presentation or directed conversation, followed by general chit-chat about all things Cocoa. In fact, um, I may have volunteered to give the first presentation at the next meeting. The subject is: well, that would be telling, wouldn't it… ;-). You'll have to find out by coming along to the Glue Pot in Swindon at 8pm on Monday, August 4th. Look out for further announcements and a mailing list over at Scotty's place in the forthcoming month!

Tuesday, June 24, 2008

Objective-C NAQs

Never-Asked Questions :-)

In Code Complete 2 §6.5, Steve McConnell presents a list of class-related design issues that "vary significantly depending on the language". So why don't we look at them for Objective-C? Especially as I can't find anyone else who's done so based on a couple of Google searches… N.B. as ever, I'm really considering Cocoa + Objective-C here, as the language itself doesn't provide enough policy for many of the issues to have an answer (only method resolution and the existence of isa are defined by the runtime and the compiler - ignoring details like static string instances, @protocol and so on).

Behaviour of overridden constructors and destructors in an inheritance tree. Constructors are simple. If you can initialise an object, return it. If you can't, return nil. The -[NSObject init] method is documented as simply returning self.
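In sketch form, the conventional pattern looks something like this (the class, instance variable and parameter are invented purely for the example, not taken from any real framework):

- (id)initWithName: (NSString *)aName
{
    if ((self = [super init]) == nil)
        return nil;              // superclass couldn't initialise, so neither can we
    if (aName == nil)
    {
        [self release];          // can't initialise without a name; clean up and bail
        return nil;
    }
    name = [aName copy];
    return self;
}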

For destructors, we have to split the discussion into twain; half for garbage collection and half for not. In the world of the Apple GC, destruction is best done automatically, but if you need to do any explicit cleanup then your object implements -finalize. The two rules are that you don't know whether other objects have already been finalized or not, and that you must not resurrect your object. In the world of non-GC, an object is sent -dealloc when it has been released enough times not to stick around. The object should then free any memory claimed during its lifetime, and finally call [super dealloc]. Note that this means objects which use the voodoo cohesion of "OK, I'll clean up all my resources in -dealloc" probably aren't going to work too well if the garbage collection switch is flipped.
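To make that concrete, here's a rough sketch of both halves (the instance variables are made up for the example; under GC you'd only bother with -finalize if there were non-object resources to release):

// reference-counted world
- (void)dealloc
{
    [name release];    // relinquish ownership of everything this object claimed
    free(buffer);      // including any malloc'd memory
    [super dealloc];   // always last
}

// garbage-collected world
- (void)finalize
{
    free(buffer);      // only non-object resources; other objects may already be finalized
    [super finalize];
}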

Behaviour of constructors and destructors under exception-handling conditions. Exceptions are a rarity in Cocoa code (we didn't have language-level exceptions for the first decade, and then there was a feeling that they're too expensive; C++ programmers like the Leopard 64-bit runtime because ObjC exceptions and C++ exceptions are compatible but that's not the case in the NeXT runtime) but obviously a constructor is an atomic operation. You can either initialise an object or you can't; there's no half-object. Therefore if an -init… method catches an exception, the thing to do is unwind the initialisation and return nil. The only thing I can think to say for having exceptions around destruction time is "don't", but maybe a commenter will have better ideas.
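If you did end up initialising something that can throw, the unwind-and-return-nil approach might look like this sketch (HypotheticalParser is, as the name suggests, invented for the example):

- (id)initWithContentsOfFile: (NSString *)path
{
    if ((self = [super init]) == nil)
        return nil;
    @try
    {
        parser = [[HypotheticalParser alloc] initWithFile: path];   // may throw
    }
    @catch (NSException *localException)
    {
        [self release];   // unwind the half-built object
        self = nil;
    }
    return self;
}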

Importance of default constructors. All classes implement -init, except those that don't ;-). Usually if there's a parameterised initialiser (-initWithFoo:) then that will be the designated initialiser, and -init will just return [self initWithFoo: someReasonableDefaultValueForFoo];. If there's no reasonable default, then there's no -init. And sometimes you just shouldn't use -init, for instance NSNull or NSWorkspace.
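As a sketch (the class and the default value are invented for the example), the usual arrangement is:

- (id)initWithTimeout: (NSTimeInterval)aTimeout   // designated initialiser
{
    if ((self = [super init]) != nil)
        timeout = aTimeout;
    return self;
}

- (id)init
{
    return [self initWithTimeout: 30.0];   // some reasonable default
}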

Time at which a destructor or finalizer is called. See How does reference counting work? for a discussion of when objects get dealloced. In the garbage-collected world, an object will be finalized at some time after it is no longer reachable in the object graph. When that is exactly really depends on the implementation of the collector and shouldn't be relied on. Temporal cohesion ("B happens after A, therefore A has been done before B") is usually bad.

Wisdom of overriding the language's built-in operators, including assignment and equality. This is such a brain-damaged idea that the language doesn't even let you do it.

How memory is handled as objects are created and destroyed or as they are declared and go out of scope. How does reference counting work? How do I start using the garbage collector?

Tuesday, June 17, 2008

WWDC day crosspost

WWDC summary on SophosLabs blog, written by yours truly. I wonder how many Sophos readers (I should probably avoid the term sophist, shouldn't I?) have an interest in Mac and iPhone stuff (and subsequently how many comment via the e-mail address on the Labs blog).

Monday, June 16, 2008

Local KDC on Leopard

via Nigel Kersten, a great description of the operation of Leopard's built-in local KDC. I think the most exciting thing about the local KDC is the Bonjour support; could we see simple cross-system trust in the near future? Could there be someone in the world who can actually make Kerberos simple?

Friday, June 13, 2008

WWDC day 00000000000000000000000000000101

Technically day five isn't over yet, but with just one session and a taxi ride remaining I doubt I'm going to get anything prematurely wrong here. In fact, I'm no longer entirely sure that I'll remain awake through the rest of the day so even if something entirely different does happen (still haven't heard anything about IXKit this millennium) I expect I'd miss it.

Planning an east-bound flight is always difficult, because an 11-hour flight through eight time zones "takes" 19 hours. So if I sleep on the plane I'll wake up in the middle of the afternoon UK time, and going back to sleep at newly-local night-time might be complicated. Conversely, if I don't sleep on the plane then I won't be going to sleep until at least 9am currently-local time, for a total of about 28 hours uptime. I was just talking to someone who recommended fasting; I'm not sure whether it's worth going through that hardship for an unverified theory (stuff all of that "in the name of science" crap). The other option would be to go into work 3pm-11pm for the beginning of the week, but I'm not sure everyone else will appreciate having their meetings moved to the middle of the night and I don't think the canteen's open that late either ;-).

Still, of the three WWDCs I've been to this is definitely in the top ten list of interesting and exciting WWDCs. There definitely still are fellow NeXT fans in the woodwork (actually, *tap* *tap* I think it's gypsum board) who worm their way out during these events; I guess that most have no motivation not to be working on Cocoa. Actually, I don't recall having seen Andrew Stone this week, and I know of a couple of other guys who aren't here, but I have definitely managed to gain some traction for the phrase "NeXTSTEP Mobile" ;-)

WWDC day four

Not so many sessions attended today - partly because I've reached the limit of what human physiology can achieve on a diet of coffee and doughnuts. But also due to ducking out of sessions to meet with ex-NeXT guys, ex-Lighthouse guys and ex-colleagues for most of the afternoon. Of course, this was followed by the beer bash, no longer the campus bash which we all know and fondly remember (or at least we all fondly remember queueing for a couple of hours at each end for the coach) but still a good event. Speaking of ex-Lighthouse guys, Wylie and I spent a bit of time putting Sun's world to rights (essentially, if they were to stop diluting the Java brand, stop pulling themselves in every which way and try to make some money, they might do OK).

So tomorrow is the NeXT meetup at the beginning of lunch, I have a lab appointment at exactly 12 which is a little annoying but I'll try to make it over. If the G4 cube is the right shape, wrong CPU and Real Men's Objects use the NX prefix, then come over to the front of the Moscone West after the second session.

Thursday, June 12, 2008

WWDC: afp548 Venn competition

I, with some help from Ken and the guys, did a better job.

WWDC day three

Today was not quite a full day, so I managed to spend about an hour or so milling around Yerba Buena park, doing a little gift shopping and generally existing outside of the Mascarpone centre. Also put in another update to the c.l.o-c FAQ (though I didn't update the date! oops…) to better discuss retain/release memory management. There are still situations (GNUstep, Cocotron, and some particular OS X environments) where garbage collection is either unavailable or unsuitable, and it turns out that a lot of questions on either the group or cocoa-dev recently have been about the memory management system.

In the evening, after dinner at Chevy's with the u.c.s.m guys (and yet more discussion of the Sophos feature set), I checked in to the Apple Design Awards for the first 30 mins or so. All of the winners (and runners-up) I saw were worthy applications, although it was interesting to note that while most of the "productivity" app winners were from small shops, both of the games (runner-up was Command and Conquer 3, winner was Guitar Hero 3) were from large studios, ports of Windows/console games and in existing long-running series. They're both great games, too, of course. Ian was, of course, rooting for Delicious Library 2, and sadly disappointed ;-).

This was all followed by the AFP548 beer bash over at Thirsty Bear. Yet more "oh, you're from Sophos? Yeah, let me say this one thing…", which I really enjoy because if something's either good or bad enough to be the subject of a beer-fuelled rant in a party, I should probably hear about it. Also submitted a couple of entries to Peter's "Leopard bug Venn diagram" contest; I'm not sure I could do a better job of describing the situation than that but for those who were there, I came up with: Ω = "Closed/Duplicate" and Ω = "NeXTSTEP 7.0".

Wednesday, June 11, 2008

WWDC aside

None of Perl, Cocoa or some weird XML toolkit came up with the "simple things simple, complex things possible" quote.

WWDC day 2

Tuesday is typically associated with "starting to learn stuff" at WWDC, and today was definitely spent learning stuff. Learning, for instance, that doughnuts are considered adequate breakfast material, or that if you are prepared to pay the tiny wi-fi fee at Starbucks you can get faster INTARWEBS than all the people on the steps outside the Mascarpone. It was also a day of nice food; I've just got back from dinner with Alex at the Chevy's, having had lunch with Michael at the Ozuma sushi restaurant. More discussing Sophos-for-Mac with customers and potential customers ensued (apparently we've saved some people from the "hell that is Norton", and I point out for legal purposes that that's a direct quote and not my own opinion or words), and I got some pretty good photos of the Bay Bridge and treasure island which I'll upload as soon as I locate the USB cable.

Tuesday, June 10, 2008

ObjC FAQ update

Added a question about the ObjC 2.0 garbage collector. Sorry it's been so long in coming! I'll try and add a few more ObjC 2.0 questions over the coming days.

WWDC - day one

The WWDC keynote is always an odd event to attend. It's put on for the benefit of the investors and the media, with the developers being invited purely to act as braying masses expressing their adulation for His Steveness. It's rare for any technical content to make it into the session, except in unavoidable cases such as the 2005 keynote. The focus of that was the Intel transition, so by necessity there had to be some technical justification of the switch.

With this in mind, it's not hard to see that the keynote can be a somewhat dull affair. Obviously as both an Apple customer and member of the "economic ecosystem" of the Mac, it's always good to be as informed as possible of the company's position and direction. That said, yesterday's keynote (no wi-fi in this hotel, so a late post) contained less of interest to me than usual.

As I mentioned I'm financially dependent on Apple (in an indirect sense of course; I'm paid to write Mac software for Sophos, therefore no Mac = no job at Sophos), though as I'm not an indie dev I have a bit more of a comfort buffer than many people. The enterprise iPhone video Steve showed was basically a backslap in front of the shareholders; look, there are people who really do use this stuff! Then the laundry list of every developer who's downloaded the SDK and managed to get something to compile; interesting to see the wealth of different domains into which the iPhone is entering, but seriously. Two demos, three tops. Not all four thousand of the known apps. Good to see TEH CHEAP being applied to the 3G iPhone, though; I may have to have a discussion with Orange about a PAC when that's available.

Which left Mobile Me. This is actually a pretty cool reboot of iTools^W.Mac, OK it looks like there might be no more iCards but on the other hand the Mobile Me syncing is really beneficial. I can see that becoming more of a cash cow for Apple, though mainly because they opened it up to the PC; people who have an iPod and Windoze could buy MM to synchronise their contacts, mail and so on, as well as getting webmail access (and webmail access which doesn't suck balls as much as Exchange's OWA, may I add). That then might make them more amenable to the Halo Effect and the purchase of a Mac down the line.

The rest of the day was interesting but obviously undisclosable, except for the evening I spent in a couple of bars down the financial district (the Golden%Braeburn event at 111 Minna, where I went with Steffi from BNR and a couple of Cocotron committers; then Dave's bar where I met Nigel and most of Apple UK). Conversation ranged from Sophos feature requests to the drinkability of American IPAs; all good stuff!

Monday, June 09, 2008

WWDC part 0

Well, here it is, the pre-WWDC "I'm jetlagged so you have to put up with my wittering" post. I'm just waiting for a softwareupdate to finish so that I can go out with my camera, taking some early-morning pictures before heading off to stand in line for the Stevenote. I was out for beers with Ian and Neil last night; we'd all heard rumours of a 5 a.m. start to the queue. On the two previous occasions that I've been, 9 a.m. has been sufficient; but with the sellout nature of the event it's likely that the room will fill up rather quickly so we've compromised on a 7 a.m. start. Actually, forget the 5 a.m. nonsense: there's a line of overnight campers - I can't decide whether they're deliberately trying to re-enact a Joy of Tech cartoon, or actually have nothing to do with their lives.

Friday, June 06, 2008

This means business

This is the design of the business card I'll be taking to WWDC. Let's look at some notable features.



  • Photo in top left. I don't know about you, but I find it much easier to remember what someone looks like than who they are - this card is available so that people can combine the two.

  • Plenty of blank space. The back is also entirely empty and can be written on. When people exchange business cards there'll usually be some context, it can help you remember what that is if you write it down. For instance, I've got cards from previous WWDCs with handwritten notes like "webobjects employer", "bindings", "parallels openstep" and "huge hat" (bonus points for identifying all four).

  • Questionable source code snippet. This is a deliberate ploy to annoy fellow-developers with our compulsive attention to detail, thus helping to cement the meeting in their mind as well as providing an inoffensive conversation starter.

  • Minimal contextual detail. I'm pretty sure I won't get through all 250 cards in one sitting, so they ought to remain relevant for as long as possible.

Wednesday, June 04, 2008

Little hack to help with testing

Want the ability to switch in different test drivers, mock objects, or other test-specific behaviour? Here's a pattern I came up with (about a year ago) to do that in a GNUstep test tool, which can readily be used in Cocoa:


NSString *driverClassName = [[NSUserDefaults standardUserDefaults] stringForKey: @"Class"]; // the argument domain means -Class on the command line wins
Class driverClass = NSClassFromString(driverClassName);  // Nil if no class by that name is loaded
id myDriver = [[driverClass alloc] init];                 // instantiate whichever driver was named

With a healthy dose of no, seriously, don't do this in production code, you now have the ability to specify your test driver on the command-line like this:


$ ./myTestingTool -Class GLTestDriver

This uses the oft-neglected behaviour of NSUserDefaults, in which it parses the executable's command-line arguments to create a defaults domain, higher in priority than even the user's preferences file. You can use that behaviour in a graphical app too, where it comes in handy when working in Xcode. It then uses a combination of the runtime's duck typing and introspection capabilities to create an instance of the appropriate class.
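One refinement, if you want the tool to keep working when no -Class argument is supplied, is to fall back on the production class; something like this sketch (GLRealDriver is a made-up name standing in for whatever your real driver class is):

Class driverClass = NSClassFromString([[NSUserDefaults standardUserDefaults]
    stringForKey: @"Class"]);
if (driverClass == Nil)
    driverClass = [GLRealDriver class];   // no override given, so use the real thing
id myDriver = [[driverClass alloc] init];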

Saturday, May 31, 2008

Wistfully Wonderful Den of Coders

It's the time of the year to acknowledge that yes, I am going to WWDC this year. Left it a bit last minute to get the flights and the hotel, but everything is in place now so hopefully I'll see some of you guys/gals there. This is the first year that there's been anything going on that isn't Mac (it isn't the first year there's been non-MacDev, though; since the first WWDC I attended in 2005 there's always been an IT track occupying around 20-25% of the sessions, though not much lab space). There have been mixed impressions of that - a representative sample:


About Time

New developers might screw up the experience

New developers might realise how cool Leopard is


I think this is going to be an exciting conference, especially for the new developers. I've never been as a newbie; in 2005 I'd already been doing GNUstep, WebObjects, Cocoa and NeXTSTEP development for varying numbers of years, though admittedly without particular expertise. From a perfeshunal perspective I'm not amazingly excited about iPhone development; I might drop in to a few of the sessions just to see what the state of play is, what people are interested in, what apps they're creating and so on. No, for me this is the first year that I've actually got a project in full swing over the conference week so I'll be most interested in heading down to the labs and getting mmalc to write my code - er, finding out what I could improve.


And, of course, the networking (by which I mean the going out for beers and food every night)…

Thursday, May 22, 2008

Managers: Don't bend it that far, you'll break it!

Go on then, what's wrong with the words we already have? I think they're perfectly cromulent, it's very hard to get into a situation where the existing English vocabulary is insufficient to articulate one's thoughts. I expect that linguists and lexicographers have some form of statistic measuring the coverage in a particular domain of a language's expression; I also expect that most modern languages have four or five nines of coverage in the business domain.


So why bugger about with it? Why do managers (and by extension, everyone trying to brown-nose their way into the management) have to monetise that which can readily be sold[1]? Why productise that which can also be sold? Why incentivise me when you could just make me happy? Why do we need to touch base, when we could meet (or, on the other hand, we could not meet)? Do our prospectives really see the value-add proposition, or are there people who want to buy our shit?


Into the mire which is CorpSpeak treads the sceadugenga that is TechRepublic, Grahames yrre bær. The first word in their UML in a Nutshell review is "Takeaway". Right, well, I don't think they're about to give us a number 27 with egg-fried rice. (As a noun, that meaning appears only in the Draft Additions to the OED from March 2007.) Nor is there likely to be some connection with golf. All right, let's read on.


UML lets you capture, document, and communicate information about an application and its design, so it's an essential tool for modeling O-O systems. Find out what's covered in O'Reilly's UML in a Nutshell and see if it belongs in your library.

Ah, that would be a précis, unless I'm very much mistaken. Maybe even a synopsis. Where did you get the idea this was a takeaway? I can't even work out what the newspeak meaning for takeaway might be. Had I not seen the linked review, I would have guessed it meant the "if you take away one idea from this article, make it this" part of the article. In other words, if you're so stupid that you can only remember one sentence from a whole page, we'll even tell you which sentence you should concentrate on. This use[2] doesn't fit with that retroactive definition though, because the conclusion which can be drawn from the above-quoted paragraph is that one might want to read the whole article. I would much rather believe that management types in a hurry would remember the subsequent sentence as their only recollection of the article.


UML in a Nutshell: A Desktop Quick Reference is not misnamed.

[1]You may argue that the word should be spelled "monetize", as the word most probably came from American English, but it doesn't matter because it doesn't bloody exist. Interestingly, the verb sell originated in the Old English verb sellan, meaning to give, with no suggestion of barter or trade.


[2]Language usage is the only place I'll admit the existence of the word usage.

Monday, May 05, 2008

Social and political requirements gathering

I was originally going to talk about API: Design Matters and Cocoa, but, and I believe the title of this post may give this away, I'm not going to now. That's made its way into OmniFocus though, so I'll do it sooner or later. No, today I'm more likely to talk about The Cathedral and the Bazaar, even though that doesn't seem to fit the context of requirements gathering.


So I've been reading a few papers on Requirements Engineering today, most notably Goguen's The Dry and the Wet. One of the more interesting and subtle conclusions to draw from such a source (or at least, it's subtle if you're a physics graduate who drifted into Software Engineering without remembering to stop being a physicist) is the amount of political influence in requirements engineering. Given that it costs a couple of orders of magnitude more to mend a broken requirement in maintenance than in requirements-gathering (Boehm knew this back in 1976), you'd think that analysts would certainly leave their own convictions at the door, and would try to avoid the "write software that management would like to buy" trap too.


There are, roughly speaking, three approaches to requirements elicitation. Firstly, the dry, unitarian approach where you assume that like a sculpture in a block of marble, there is a single "ideal" system waiting to be discovered and documented. Then there's the postmodern approach, in which any kind of interaction between actors and other actors, or actors and the system, is determined entirely by the instantaneous feelings of the actors and is neither static nor repeatable. The key benefit brought by this postmodern approach is that you get to throw out any idea that the requirements can be baselined, frozen, or in any other way rendered static to please the management.


[That's where my oblique CatB reference comes in - the Unitary analysis model is similar to ESR's cathedral, and is pretty much as much of a straw man in that 'purely' Unitary requirements are seldom seen in the real world; and the postmodern model is similar to ESR's bazaar, and is similarly infrequent in its pure form. The only examples I can think of where postmodern requirements engineering would be at all applicable are in social collaboration tools such as Facebook or Git.]


Most real requirements engineering work takes place in the third, intermediate realm; that which acknowledges that there is a plurality among the stakeholders identified in the project (i.e. that the end-user has different goals from his manager, and she has different goals than the CEO), and models the interactions between them in defining the requirements. Now, in this realm software engineering goes all quantum; there aren't any requirements until you look for them, and the value of the requirements is modified by the act of observation. A requirement is generated by the interaction between the stakeholders and the analyst, it isn't an intrinsic property of the system under interaction.


And this is where the political stuff comes in. Depending on your interaction model, you'll get different requirements for the same system. For instance, if you're of the opinion that the manager-charge interaction takes on a Marxist or divisive role, you'll get different requirements than if you use an anarchic model. That's probably why Facebook and Lotus Notes are completely different applications, even though they really solve the same problem.


Well, in fact, Notes and Facebook solve different problems, which brings us back to a point I raised in the second paragraph. Facebook solves the "I want to keep in contact with a bunch of people" problem, while Notes solves the "we want to sell a CSCW solution to IT managers" problem. Which is itself a manifestation of the political problem described over the last few paragraphs, in that it represents a distortion of the interaction between actors in the target environment. Of course, even when that interaction is modelled correctly (or at least with sufficient accuracy and precision), it's only valid as long as the social structure of the target environment doesn't change - or some other customer with a similar social structure comes along ;-)


This is where I think that the Indie approach common in Mac application development has a big advantage. Many of the Indie Mac shops are writing software for themselves and perhaps a small band of friends, so the only distortion of the customer model which could occur would be if the developer had a false opinion of their own capabilities. There's also the possibility to put too many "developer-user" features in, but as long as there's competition pushing down the complexity of everybody's apps, that will probably be mitigated.

Thursday, May 01, 2008

The Dock should be destroyed, or at least changed a lot

I found an article about features Windows should have but doesn't, which I originally got to from OSNews' commentary on the feature list. To quote the original article:


The centerpiece of every Mac desktop is a little utility called the Dock. It's like a launchpad for your most commonly used applications, and you can customize it to hold as many--or as few--programs as you like. Unlike Windows' Start Menu and Taskbar, the Dock is a sleek, uncluttered space where you can quickly access your applications with a single click.

Which OSNews picked up on:


PCWorld thinks Windows should have a dock, just like Mac OS X. While they have a point in saying that Windows' start menu and task bar are cumbersome, I wouldn't call the dock a much better idea, as it has its own set of problems. These two paradigms are both not ideal, and I would love someone to come up with a better, more elegant solution.

The problem I have with the Dock (and had with the LaunchPad in OS/2, the switcher in classic Mac OS, and actually less so with the task bar, though that and the Start Menu do suffer this problem) is that their job basically involves allowing the internal structure of the computer to leak into the user's experience. Do I really want to switch between NeoOffice Writer, KeyNote and OmniOutliner, or do I want to switch between the document I'm writing, the presentation I'm giving about the paper and the outline of that paper? Actually the answer is the latter, the fact that these are all in different applications is just an implementation detail.


So why does the task bar get that right? Well, up until XP when MS realised how cluttered that interface (which does seem to have been lifted from the NeXT dock) was getting, each window had its own entry in the task bar. Apart from the (IMO, hideously broken) MDI paradigm, this is very close to the "switch between documents" that I actually want to perform. The Dock and the XP task bar have similar behaviour, where you can quickly switch between apps, or with a little work can choose a particular document window in each app. But as I said, I don't work in applications, I work in documents. This post is a blog post, not a little bit of MarsEdit (in fact it will never be saved in MarsEdit because I intend to finish and publish it in one go), the web pages I referenced were web pages, not OmniWeb documents, and I found them from an RSS feed, not a little bit of NetNewsWire. These are all applications I've chosen to view or manipulate the documents, but they are a means, not an end.


The annoying thing is that the Dock so flagrantly breaks something which other parts of Mac OS X get correct. The Finder uses Launch Services to open documents in whatever app I chose, so that I can (for instance) double-click an Objective-C source file and have it open in Xcode instead of TextEdit. Even though both apps can open text files, Finder doesn't try to launch either of them specifically, it respects the fact that what I intend to do is edit the document, and how I get there is my business. Similarly the Services menu lets me take text from anywhere and do something with it, such as creating an email, opening it as a URL and so on. Granted some app authors break this contract by putting their app name in the Service name, but by and large this is a do something with stuff paradigm, not a use this program to do something one.


Quick Look and Spotlight are perhaps better examples. If I search for something with Spotlight, I get to see that I have a document about frobulating doowhackities, not that I have a Word file called "frobulating_doowhackities.doc". In fact, I don't even necessarily have to discover where that document is stored; merely that it exists. Then I hit space and get to read about frobulating doowhackities; I don't have to know or care that the document is "owned" by Pages, just that it exists and I can read it. Which really is all I do care about.

Thursday, April 24, 2008

Yeah, we've got one of those

Title linkey (which I discovered via slashdot) goes to an interview in DDJ with Paul Jansen, the creator of the TIOBE Programmer Community Index, which ranks programming languages according to their web presence (i.e. the size of the community interested in those languages). From the interview:


C and C++ are definitely losing ground. There is a simple explanation for this. Languages without automated garbage collection are getting out of fashion. The chance of running into all kinds of memory problems is gradually outweighing the performance penalty you have to pay for garbage collection.

So, to those people who balked at Objective-C 2.0's garbage collection, on the basis that it "isn't a 4GL", I say who cares? Seemingly, programmers don't - or at least a useful subset of Objective-C programmers don't. I frequently meet fellow developers who believe that if you don't know which sorting algorithm to use for a particular operation, and how to implement it in C with the fewest temporary variables, you're not a programmer. Bullshit. If you don't know that, you're not a programmer who should work on a foundation framework, but given the existence of a foundation framework the majority of programmers in the world can call list.sort() and have done with it.
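For what it's worth, the Cocoa spelling of that one-liner is just as oblivious to the algorithm underneath (assume names is an NSArray of NSStrings):

NSArray *sorted = [names sortedArrayUsingSelector: @selector(compare:)];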


Memory management code is in the same bucket as sorting algorithms - you don't need for everybody to be good at it, you need for enough people to be good at it that everyone else can use their memory management code. Objective-C 2.0's introduction of a garbage collector is acknowledgement of this fact - look at the number of retain/release-related problems on the cocoa-dev list today, to realise that adding a garbage collector is a much bigger enhancement to many developers' lives than would be running in a VM, which would basically go unnoticed by many people and get in the way of the others trying to use Instruments.


Of course, Objective-C and Apple's developer tools have a long history of moving from instrumental programming (this is what the computer must do) to declarative programming (this is what I am trying to achieve, the computer must do it). Consider InterfaceBuilder. While Delphi programmers could add buttons to their views, they then had to override that button's onClick() method to add some behaviour. IB and the target-action approach allow the programmer to say "when this button is clicked, that happens" without having to express this in code. This is all very well, but many controls on a view are used to both display and modify the value of some model-level property, so instead of writing lots of controller code, let's just declare that this view binds to that model, and accesses it through this controller (which we won't write either). In fact, rather than a bunch of boilerplate storage/accessors/memory management model-level code, why don't we just say that this model has that property and let someone who's good at writing property-managing code do the work for us? Actually, coding the model seems a bit silly; let's just say that we're modelling this domain entity and let someone who's good at entity modelling do that work, too.
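The declared-properties stage of that progression looks roughly like this in Objective-C 2.0 (the Employee class is invented for the example):

@interface Employee : NSObject
{
    NSString *name;
}
@property (copy) NSString *name;   // declare the property...
@end

@implementation Employee
@synthesize name;                  // ...and let the compiler write the accessors
@end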


In fact, with only a little more analysis of the mutation of Objective-C and the developer tools, we could probably build a description of the hypothetical Cen Kase, the developer most likely to benefit from developing in Cocoa. I would expect a couple of facts to hold; firstly that Cen is not one of the developers who believes that stuff about sorting algorithms, and secondly that the differences between my description of Cen and the description used by Apple in their domain modelling work would fit in one screen of FileMerge on my iBook.

Monday, April 21, 2008

Tracking the invisible, moving, unpredictable target

An idea which has been creeping up on me from the side over the last couple of weeks hit me square in the face today. No matter what standards we Cocoa types use to create our user interfaces, the official Aqua HIG, the seemingly-defunct IndieHIG, or whatever, ultimately producing what is considered a usable (or humane, if you like) interface for Mac OS X is not only difficult, but certainly unrepeatable over time.


The "interface" part of a Cocoa user interface is already hard enough to define, being a mash-up of sorts, and to differing degrees, between the Platinum HIG which directs the default behaviour of some of the *Manager controls and the OpenStep HIG which describes the default behaviour of most, if not all, of the AppKit controls. If that isn't enough, there is an inexact intersection - some controls work differently in (what are loosely called, and I'm not getting into the debate) Cocoa apps than in Carbon apps. There have also been innovative additions on top of the aforementioned guides, such as sheets, unified toolbars and (the already legacy) textured interfaces. There have been subtractions from both - miniwindows still exist but nobody uses 'em, and window shading went west with Rhapsody.


But all of that is related to the user interface, not to user interaction (I'm in the middle of reading Cooper's The Inmates Are Running the Asylum, I'm going to borrow some terminology but studiously avoid discussing any of the conclusions he presents until I'm done reading it). It's possible to make HIG-compliant inspectors, or HIG-compliant master-detail views, or HIG-compliant browser views and so on. It's also possible to make non-compliant but entirely Mac HID views, coverflow views, sidebars and so on. But which is correct? Well, whichever people want to use. But how do you know which people want to use? Well, you could get them to use them, but as that's typically left until the beta phase you could ask usability gurus instead. Or you could take the reference implementation approach - what would Apple (or Omni, or Red Sweater, or whoever) do?


Well, what Apple would do can, I think, be summed up thus: Apple will continue doing whatever Apple were previously doing, until the Master User takes an interest in the project, then they do whatever the Master User currently thinks is the pinnacle of interaction design. The Master User acts a little like an eXtreme Programming user proxy, only with less frequent synchronisation, and without actually consulting with any of the other 26M users. The Master User is like a reference for userkind, if it all works for the Master User then at least it all works for one user, so everyone else will find it consistent, and if they don't find it painful they should enjoy that. The official job title of the Master User role is Steve.


All of this means that even inside Apple, the "ideal" usability experience is only sporadically visited, changes every time you ask and doesn't follow any obvious trend such as would be gained by normalisation over the 26M users. Maybe one day, the Master User likes inspectors. Then another day he likes multi-paned, MDI-esque interaction. On a third day he likes master-detail control, in fact so much so that he doesn't want to leave the application even when it's time to do unrelated work. Of course you don't rewrite every application on each day, so only the ones that he actually sees get the modernisation treatment.


So now we come back to the obvious, and also dangerous, usability tactic which is so prevalent on the Windows platform, and one which I consciously abhor but subconsciously employ all the time: "I'm the developer, so I'll do it my way". Luckily there are usability, QA and other rational people around to point out that I'm talking shite most of the time, but the reasoning goes like this. I'm a Mac user, and have been for a long time. In fact, I might know more about how this platform works than anyone within a couple of miles of here, therefore(?) I know what makes a good application. One problem which affects my personal decisions when trying to control the usability is that I'm only tangentially a Mac person; I'm really a very young NeXTStep person who just keeps current with software and hardware updates. That means I have a tendency to inspector my way out of any problem, and to eschew custom views and Core Animation in favour of "HIG is king" standard controls, even when other applications don't. And the great thing is that due to the Moving Target reference implementation, I can find an application which does something "my" way, if that will lend credence to my irrational interface.


The trick is simply to observe that taking pride in your work and expressing humility at your capabilities are not mutually exclusive. If tens of other Mac users are telling me they don't like the way it works, and I'm saying it's right, apply Occam's razor.


And if there isn't enough fun for you in one usability experience, a bunch of us are presumably going to be providing the iPhone HIG-compliant V on top of our Ms and Cs before long.