MacWorld Lampoons Grand Central
Originally published on macresearch.org, around 2009. Reproduced from the author's archive; some links may no longer resolve.
MacWorld.com has reprinted a piece from InfoWorld by Neil McAllister. The basic premise of McAllister’s article is that technologies promising greater parallel performance, like Snow Leopard’s Grand Central, are useful on web servers but will do nothing to change our experience on the desktop, and that most developers will consider them too difficult to bother adopting.
McAllister writes:
Adding cores and trumpeting greater CPU power through concurrency is great marketing for hardware vendors like Intel—and Apple—but the truth is that today’s high-end PCs have long since exceeded the performance needs of casual users. It’s little wonder that netbooks are the hot hardware trend. Forget multiple cores; customer demand is even driving clock speeds downward. Complex processing is moving outward, into the cloud.
While it is true that mail clients and word processing apps will probably see the least benefit from the new approach to concurrency, there are plenty of commonly used apps that will benefit. Probably the most important app in our arsenal, the web browser, is an obvious candidate. The opportunities for improvement through concurrency are many and varied.
Netbooks may be ‘hot’ at the moment, but they are still a relatively small part of the total PC/Mac market. Most people use their computers for media apps like iTunes, iMovie, iPhoto, and GarageBand. If all you do on your computer is open email and tweet in your Twitter client, you don’t need multi-core or Grand Central, but if — like me — you are transcoding H.264 video, these technologies can’t come soon enough. The potential for concurrency in any data-intensive app is enormous.
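To make the point concrete, here is a minimal sketch of the kind of data parallelism a transcoder can exploit: frames are largely independent, so chunks of them can be processed on different cores at once. This is illustrative Python, not a real encoder; `encode_chunk` is a hypothetical stand-in for the CPU-heavy codec work.

```python
from concurrent.futures import ThreadPoolExecutor

def encode_chunk(frames):
    # Placeholder for CPU-heavy per-chunk work (e.g. H.264 encoding).
    # Here we just transform each "frame" so the result is checkable.
    return [f * 2 for f in frames]

def encode_parallel(frames, workers=4, chunk_size=8):
    # Split the frame list into chunks, one unit of work per chunk.
    chunks = [frames[i:i + chunk_size] for i in range(0, len(frames), chunk_size)]
    # Hand the chunks to a pool of workers; map preserves chunk order,
    # so the output can simply be flattened back together.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(encode_chunk, chunks)
    return [f for chunk in results for f in chunk]
```

With real codec work in `encode_chunk` (and process-based workers to sidestep the interpreter lock), the wall-clock time scales down with the number of cores — exactly the payoff a single-threaded app leaves on the table.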
Rather than demonstrating that multi-core is marketing hype, McAllister has just shown the fundamental difference between Mac and Windows users. Mac users create more than spreadsheets and textual documents on their computers. They want to edit video, transcode it, and post it to YouTube. They want to index the facial features of thousands of photos, and have their computer automatically pick out individuals it recognizes. They want to create a screencast in full HD. You name it: there are countless applications where more grunt would improve our experience as users.
McAllister also unwittingly points to another major difference between the Mac and Windows communities: the developers.
That’s all well and good, but it still implies a sea change away from programming practices that have served the developer community well for the last 20 years or more. Grand Central will make it easier to write parallelizable software, but it’s not any kind of fairy dust that will allow existing software to take advantage of multiple cores when it couldn’t before. Developers will still need to change the way they think to write good concurrent software, including picking up practices from the world of functional programming (such as closures). As the saying goes, you can lead an old dog to water, but you can’t make it ride a horse.
In my experience, the developers he is talking about are not Mac developers. I have absolutely no doubt that Mac developers can and will adopt Apple’s new approach to concurrency. In Leopard, Apple introduced a few new classes into its Cocoa frameworks — NSOperation and NSOperationQueue — that work much the same way as Grand Central, but at a higher level of abstraction. Those classes have already been adopted to simplify concurrency in many apps.
McAllister is just spouting the same nonsense that has always been leveled at performance improvements. I could edit and print a great looking document on the original Mac 25 years ago; have all of the performance gains that followed been for naught? Of course not. Tasks like text editing require next to no computational resources, but there are plenty of tasks that do. If anything, the number of applications for which a user needs high performance is increasing, not decreasing. Ten years ago, very few people edited video, or even had a digital photo collection. Consumer media is an area that is still far from performance saturation, the point at which improvements to hardware no longer benefit the user. Think about that the next time you are waiting 3 hours for your computer to export a new home movie.