Jon Udell highlights a technology lost in the late 1980s "A.I. Winter", but which kind of hung around and is now coming back into its own. I'm a veteran of the A.I. Winter and was disappointed when very practical tools like KnowledgeCraft and GoldWorks were lost in the hype of those days. Looks like GoldWorks is still available.
Some more resources to add to Jon's list...
Phil Windley draws some good points out of the interview with Grady Booch. But, and there's always a 'but' when I read something, my philosophy is there's no point in not being a critical reader. If, in the worst case, I end up being wrong or mistaken, I've still learned something.
Grady Booch writes just after Phil's last quote... One of the advantages of platforms such as .NET is that they codify a lot of those hard things so that global decisions can be made intelligently, then carried out locally by the individual developers.
I think this is untrue. I don't know if Booch is trying to flatter Microsoft, or the readership of the dotNET magazine, or if he really believes this.
I wish the journalist had pushed for more detail. Why does Grady believe this to be so? Wouldn't that answer help the interview's readers in how they use dotNET?
For one thing, I don't believe this is true for any current toolset. I'm not singling out dotNET. There is too much complexity in software development. Yes, we're building more complex systems than in the past, but that's no excuse... we should be taking unnecessary complexity out and putting more facility in. Facile: most software is not.
From an interview with Grady Booch in dotNET magazine...
Q: What is the future of Web services and GXA in conjunction with traditional development?
Booch: I'm pretty jazzed with the direction of Web services. We look at organizations that are already effective at building component-based systems. Meaning they know how to componentize their stuff and deploy them that way. For them the move to Web services is actually a small step. Plus it's not a difficult step, because it represents a different packaging for the things they're already exposing. But there's value in it because now it allows them to transcend a particular platform.
I'm working with a group I think I can mention publicly: MedBiquitous, a project at Johns Hopkins led by Dr. Peter Green. The essence of the project is to try to tie together the various medical research agencies and data across a variety of different specializations in medicine, so that you break down the barriers of specialization. That's a gross simplification of what they're doing but it's pretty cool stuff. The essence of their architecture is based upon Web services, which enables them to make key architectural decisions that transcend particular platforms and technologies.
How helpful is this for you? As a reader and developer, should you expect more from an interview with an industry leader?
This seems to me to be filler in between the ads. I'm disappointed.
I strongly recommend that people join the ACM and use its Digital Library and read some of the things that influenced all of us in the CSCW discipline, e.g. Turoff's early work in conferencing, Flores' work on speech acts/workflow, etc. It's never been more relevant. Now that people have realized that we need to move beyond the Web, there are tons of ideas ripe for the picking in terms of innovative architectures for interpersonal communications & cooperative work e.g. awareness & event notification architectures, real-time & replicated architectures, etc. Explore SIGGROUP papers; head down to Group '03 this fall, and blog what you hear.
This probably is obvious: I couldn't have said it better myself.
I've been a reviewer for several major conferences over the years. The striking reaction I get over and over is how few connections people make to the work that has taken place in our community. Software is forever changing on the surface, but when you get more than skin deep, there are so many common themes, sources of inspiration, and even words of caution.
In an earlier posting, I characterized one axis of the debate as robustness versus agility, and wondered how we can have our cake and eat it too. Sam Ruby pointed to WS-Transaction, to which Patrick Logan replied "I think we need a rethinking of databases, messages, and coordination." I'm sure that's true. Meanwhile, what to do?
Exactly the right question to ask. Here is the kernel of my response.
Something further. Consider passing documents using the following grammar in an XML namespace as a header to the exchanged document.
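As a rough illustration of the idea (the namespace URI and element names below are invented for this sketch; they are not the grammar referenced above), a coordination header carried inside the exchanged document might look like:

```xml
<!-- Hypothetical: a business document carrying a coordination header
     in its own namespace; all names here are illustrative only -->
<order xmlns="http://example.org/orders"
       xmlns:coord="http://example.org/coordination">
  <coord:header>
    <coord:conversation>c-2003-06-15-001</coord:conversation>
    <coord:reply-to>http://example.org/acme/inbox</coord:reply-to>
    <coord:expires>2003-06-16T00:00:00Z</coord:expires>
  </coord:header>
  <!-- ...body of the exchanged document... -->
</order>
```

The point is that the coordination information rides along with the document itself, in its own namespace, rather than living in out-of-band protocol machinery.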
Peter Nolan writes about Metaphor Computer Systems from the listserv at datawarehousing.com. If you have information about this company or its products you can add that to a wiki page at MetaphorComputingSystems. Endorsements for simple systems like this should be an inspiration to developers. I'm not sure if Meta5 is all or a subset of the original product.
If I said M3 it was a typo...Metaphor Computer Systems (M4) was founded in the early 80s (might have been 1982) and released a product called Data Interpretation System (DIS) in 1984 if my memory serves correctly. Ralph Kimball was one of the co-founders.
I'm not sure there are many places where 'the history of Metaphor' is documented. I wish there were; it was quite a company. In 1997 I got the opportunity to personally say thanks to Ralph for his part in M4 because it changed the direction I was heading in.
In 1991 I had one of those 'ah-ha' moments when I saw DIS demonstrated to a customer of mine. As I watched, it took about 5 minutes to realise that what I was seeing was the way end user computing was 'meant to be'. I was completely 'sold' on DIS as 'the way' data analysis/analytical applications would be built in the future. So was the customer. They bought the product.
This is another of those industry 'if-onlys': if only IBM had known what they had, it could have been the office desktop we all use today. (IBM bought Metaphor in 1988. So in 1988 IBM owned an 'office desktop' far superior to MS Office, which would not exist for a few years yet. Instead of pushing DIS they pushed Office Vision, which died.)
The list of features in DIS was endless. The folks at M4 broke so much new ground. They had some 400+ customers by 1993 and their customers were household names.
I wish there were some demos or screen shots still around to explain to people what it could do.
The most important thing about DIS was that the IT staff were no longer required to develop analytical applications. The biggest benefit of this was that the business users did not need to 'externalise' or 'communicate' their requirements to anyone. Working on the DIS desktop they could try out their ideas, and if they turned out to be good they could be 'packaged' into an analytical app. Yes, I am talking 1986 here. And, as per my previous comment, the 'intellectual capital' of the analyst who created the application was captured with the application to a very significant extent, because any other business analyst could read it. They barely needed any training to be able to read even a quite complex application.
One time an actuary customer of mine borrowed the manuals and from scratch, with no training and no help from anyone else, wrote a 30-year death experience analysis application in 3 weeks. It was that easy and that good to use DIS.
DIS was 'so cool' you didn't even need a keyboard. We used to put our keyboards on top of the screens and they would stay there for weeks on end. Try writing an analytical app today without a keyboard!
Unfortunately, at the time IBM bought Metaphor the future of the world was PS/2, OS/2, MicroChannel, 8514A graphics adapters and Token Ring (at least according to IBM). When moving the software from proprietary to IBM hardware, IBM specified ONLY IBM hardware. There were a lot of other reasons why Metaphor experienced trouble operating inside IBM. (They were not the only ones; who remembers 'Rolm' telephones today?)
DIS was all 32-bit, and it was the first fully 32-bit app released on OS/2. But OS/2 was 'doomed', and by the time Windows 95 came out the opportunity to get DIS out into the marketplace had pretty much gone away and MS Office had a stranglehold on the Windows 3.1 desktop.
Eventually the product was stabilised and then finally withdrawn from marketing. I think that was around 1997/8.
But there are many of us out there who look at what we could do then and what we can do today, and wonder how it can be that there is so much that is still so hard to do now that was so easy then. Call me nostalgic! ;-)
From Jon Udell's item on J2EE/EJB...
You need your business logic available to multiple applications. What if those other applications aren't Java-based?
EJB is designed for CORBA clients.
There are problems, though: firewalls and interop. Unfortunately, just as CORBA was rounding third and heading for home on those issues as well as the price/performance issue, SOAP threw a monkeywrench into the works. Too bad, really, that CORBA could not have been the foundation of an "Enterprise SOAP" that evolved toward some of the flavor of XML messaging.
Also, CORBA 3.0 incorporates its own "Component Model", an improved EJB model that is language neutral. Maybe too little, too late.
I just stumbled upon these WS usage scenarios. I've not seen them before, and have read through only a couple of the discussion messages. (Someone probably pointed me to this list some time ago. I "stumble upon" things this way now and again.)
Apropos of my interests, the latest message in the discussion is "REST, uniformity and semantics". Actually, REST shows up about every 10-20 messages. When I am awake tomorrow I'll wonder whether the conversation ever goes anywhere, and what else is discussed.
From the thoughts of Monty Widenius (MySQL)... What I am looking forward to is diskless computers, a little similar to PDAs today, but with much better capacity. Hard disks will be used more as very large backup devices for things that you don't need regularly.
Battery-backed RAM would be an incredible boost to performance and simplification to implementing ACID transactions. The other big win for large memory databases is to distinguish read-only data, where you could get rid of most semaphores as well. That's not done enough today.
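A tiny Python sketch of the point (the data and names are invented for illustration): once data is known to be read-only, readers can skip locking entirely; only mutable shared state still needs a semaphore.

```python
import threading

# Read-only lookup table, frozen after load. Because no thread ever
# mutates it, any number of readers can use it with no locking at all.
PRICE_TABLE = {"AAPL": 150, "MSFT": 250}

def total(symbols):
    # Pure reads of immutable data: no semaphore on this path.
    return sum(PRICE_TABLE[s] for s in symbols)

# Mutable shared state, by contrast, still needs coordination.
counter = 0
counter_lock = threading.Lock()

def bump(n):
    global counter
    for _ in range(n):
        with counter_lock:  # lock needed only on the read-write path
            counter += 1

threads = [threading.Thread(target=bump, args=(1000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(total(["AAPL", "MSFT"]))  # 400
print(counter)                  # 4000
```

The win Widenius is pointing at is the first path: if a database can segregate data it knows is read-only, every access on that path sheds its synchronization cost.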
For habit's sake we insist on running OLAP from OLTP systems designed 20-30 years ago. Newer designs can fit far more read-only data in RAM too.
(This is a good time to be a computer scientist. Lots of change over the next five years.)