To Bob Warfield's point that the multicore crisis is already upon us: yes, even on the desktop.
Here's an exercise -- think of your favorite or most frustrating desktop applications. Maybe a browser, a mail reader, a presentation or drawing app. Whichever application comes to mind, it almost certainly consists of long stretches of sequential code. And a good bit of that code could be concurrent -- not in the sense of a parallel algorithm, but in the sense that there is no logical reason for C to follow B to follow A other than that the languages we use reinforce that style.
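To make that concrete, here is a minimal Java sketch -- the step names (fetchHeaders, loadAddressBook, renderSidebar) are hypothetical stand-ins for A, B, and C, not anything from a real application. Written sequentially they run one after another; declared as independent tasks they are free to overlap, because nothing in the problem orders them.

```java
import java.util.List;
import java.util.concurrent.*;

public class IndependentSteps {
    // Hypothetical stand-ins for steps A, B, and C: none depends on another's result.
    static String fetchHeaders()    { return "headers"; }
    static String loadAddressBook() { return "contacts"; }
    static String renderSidebar()   { return "sidebar"; }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(3);

        // Sequential style runs A, then B, then C, for no reason the problem demands.
        // Concurrent style: declare the three independent steps and let them overlap.
        Callable<String> a = IndependentSteps::fetchHeaders;
        Callable<String> b = IndependentSteps::loadAddressBook;
        Callable<String> c = IndependentSteps::renderSidebar;

        // Join only at the point where all three results are actually needed.
        for (Future<String> f : pool.invokeAll(List.of(a, b, c))) {
            System.out.println(f.get());
        }
        pool.shutdown();
    }
}
```

The three steps may now execute in any interleaving; the only ordering left in the code is the join where all three results are actually needed.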
Any time you see an hourglass, you see an opportunity for concurrency.
I've been trying out Yahoo's beta email application. It's fine, but it really cries out to be developed in a truly concurrent system. Moving more applications into an environment that is worse than the modern desktop, and nowhere near securely concurrent, is just a shame. Modern browsers are horrible platforms. We need a new browser model that is concurrent and secure. We need a new desktop that is concurrent and connected.
Both the desktop and the browser are poorly suited to the many-core laptops and desktops on the way. Not to mention five or ten years from now, when we may well be deluged with so much cheap iron looking to do something more useful for us than running bloated sequential code.
Programming shared-memory threads in C# with transactional memory will not get us any closer to where we need to be. We need to think differently, with languages that support that thinking. It's too bad we're still stuck in the 1970s.
Tony Hoare developed monitors in the early 1970s, then replaced them with concurrent message passing in the late 1970s. Then Java brought us monitors again in the mid-1990s.
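For the record, here is roughly what that mid-1990s Java monitor looks like -- a minimal one-slot buffer sketch (the class name is made up), with the object's intrinsic lock as the monitor and wait/notify doing the condition signalling:

```java
// A classic Java monitor: a one-slot bounded buffer guarded by the object's
// intrinsic lock, with wait/notify as the condition signalling.
public class OneSlotBuffer<T> {
    private T item;               // shared state, protected by the monitor
    private boolean full = false;

    public synchronized void put(T t) throws InterruptedException {
        while (full) {
            wait();               // block inside the critical section until there is room
        }
        item = t;
        full = true;
        notifyAll();              // wake any consumer waiting in take()
    }

    public synchronized T take() throws InterruptedException {
        while (!full) {
            wait();               // block until a producer has put something
        }
        full = false;
        notifyAll();              // wake any producer waiting in put()
        return item;
    }
}
```

All of the coordination lives inside critical sections guarded by one lock.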
Over a decade later we're still twiddling bits in critical sections, stuck in stretches of sequential code. It's ludicrous to think that's the natural human problem-solving model. The tools have shaped our thinking. And I am rambling.
4 comments:
BeOS made excellent use of multi-core desktop machines back in the 1990s. They did it using shared-memory threads in C++, but with a very strong (and easy to use) message-passing concurrency model. A separate GUI-only thread for every window!
I miss BeOS.
Correction: Per Brinch Hansen invented monitors.
Thanks. In his HOPL-II paper, "Monitors and Concurrent Pascal: a personal history", Per Brinch Hansen credits Hoare, Dahl, and Dijkstra, as well as himself, with contributing to the idea.
Hoare and Hansen published independently in 1973. Hoare described them theoretically and Hansen implemented them in Concurrent Pascal.
Close enough for rhetoric. :-)
Couple of thoughts...
For better or worse, browsers are poor platforms for many things, including parallelism. Most browsers adhere to an informal limit of at most two in-flight requests per domain. You may want to have 10 XHRs in flight fetching various parts of your mail, but Firefox won't do that without hacks like Fasterfox, and other browsers simply won't.
As for monitors in Java, that's certainly one way to do things, but you can also build competing-consumer approaches in Java (Mina's in-VM pipes, for example), or by layering Java on top of OS constructs like dbus, Unix sockets, or even TCP. I'm not sure that's cheating; even Erlang relies on OS facilities like kernel threads (via pthreads, for example) for some of its internals.
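As a rough illustration of that point, here is a minimal competing-consumers sketch in plain java.util.concurrent -- no Mina, dbus, or sockets, and the message strings and thread counts are made up -- with several consumers pulling work off one shared in-VM queue:

```java
import java.util.concurrent.*;

// Competing consumers over a shared in-VM queue: whichever consumer is free
// takes the next item; the producer never names a particular consumer.
public class CompetingConsumers {
    public static void main(String[] args) throws Exception {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        ExecutorService consumers = Executors.newFixedThreadPool(3);

        // Each consumer competes for the next available item.
        for (int i = 0; i < 3; i++) {
            final int id = i;
            consumers.submit(() -> {
                try {
                    while (true) {
                        String item = queue.take();       // blocks until work arrives
                        if (item.equals("POISON")) break; // shutdown sentinel
                        System.out.println("consumer " + id + " handled " + item);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }

        // Producer: hand out work, then one sentinel per consumer.
        for (int i = 0; i < 10; i++) queue.put("message-" + i);
        for (int i = 0; i < 3; i++) queue.put("POISON");

        consumers.shutdown();
        consumers.awaitTermination(10, TimeUnit.SECONDS);
    }
}
```

Whether the hand-off happens in a BlockingQueue, Mina's pipes, or a Unix socket, the shape is the same: producers and consumers meet at a queue rather than inside a critical section.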
Where I'm most confused is the statement that it's ludicrous to think sequential code is the natural human problem-solving model. Most people I know queue up, get their coffee, take the cup, and pour the milk. I don't really see many people who start to pour the milk before they have the coffee in hand. In fact, I don't ever see anyone grab the milk before they have the coffee. Trivial example, but I think one of the main reasons so many people struggle with parallel programming is that people do tend to think in serial, not parallel. We're not wired for SMP in our consciousness (subconsciously, yes, but most of us can't control that). Maybe I missed the point there, not sure.