"I have a mind like a steel... uh... thingy." Patrick Logan's weblog.


Saturday, October 20, 2007

Programming with Streams

Michael Lucas-Smith illustrates programming with streams and a helpful "functional" protocol added by the ComputingStreams package from Cincom Smalltalk's public repository. Michael makes the case that more code should be based on streams instead of collections, and should use a common protocol like ComputingStreams instead of custom, one-off methods that won't play well with others. Streams are a better abstraction than collections because they are inherently "delayed" and scale all the way up to infinitely long streams.

Streams are interesting creatures, fundamental to programming if you've got them available. See Structure and Interpretation of Computer Programs ("Streams Are Delayed Lists") and Richard Waters' Series package for Common Lisp for more examples.
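
Here, as a rough sketch rather than the ComputingStreams protocol itself, is the SICP-style "delayed list" idea in plain Javascript: a stream is a head value plus a thunk for the rest, so nothing downstream is computed until something forces it.

    // A minimal "delayed list" sketch, not the ComputingStreams protocol:
    // a stream is a head value plus a thunk that produces the tail.
    function cons(head, tailThunk) {
        return { head: head, tail: tailThunk };
    }

    // The infinite stream of integers starting at n.
    function integersFrom(n) {
        return cons(n, function () { return integersFrom(n + 1); });
    }

    // Lazily map a function over a stream; the tail stays delayed.
    function mapStream(f, s) {
        return cons(f(s.head), function () { return mapStream(f, s.tail()); });
    }

    // Realize only the first n elements as an ordinary array.
    function take(s, n) {
        var result = [];
        while (n > 0) {
            result.push(s.head);
            s = s.tail();
            n = n - 1;
        }
        return result;
    }

    // Nothing is computed until take() forces it:
    // take(mapStream(function (x) { return x * x; }, integersFrom(1)), 5)
    // => [1, 4, 9, 16, 25]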

This Smalltalk package is also similar to Avi Bryant's Relational Object Expressions (ROE). ROE uses objects that speak the standard Smalltalk collections protocol to delay "realizing" complete, in-memory collections, which lets the underlying collections be enormous numbers of rows in tables on disk. In the case of ROE, the objects delay the eventual composition and execution of SQL, including the conditional expressions that may have narrowed the size of the selection.
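
To make the flavor concrete, here is a toy sketch in Javascript rather than Smalltalk, and not ROE's actual protocol: a "relation" object answers a collection-style select by accumulating conditions, and only assembles the SQL when a result is actually demanded.

    // A toy sketch of the ROE idea, not ROE's actual protocol: a "relation"
    // answers select by composing conditions instead of materializing rows,
    // and the SQL is only assembled when a result is demanded.
    function relation(table, conditions) {
        conditions = conditions || [];
        return {
            // Like Smalltalk's select:, but returns another delayed relation.
            select: function (conditionSql) {
                return relation(table, conditions.concat([conditionSql]));
            },
            // Only here would the SQL be built and handed to the database.
            asSql: function () {
                var sql = 'SELECT * FROM ' + table;
                if (conditions.length > 0) {
                    sql = sql + ' WHERE ' + conditions.join(' AND ');
                }
                return sql;
            }
        };
    }

    // relation('orders').select("status = 'open'").select('total > 100').asSql()
    // => "SELECT * FROM orders WHERE status = 'open' AND total > 100"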

Tuesday, October 16, 2007

Closures and Objects

I am not sure what Steve Dekorte is saying...

A language that uses objects on the bottom can use them for everything, but a language with closures on the bottom needs other types or "atoms" for things like numbers, lists, etc and then scatter around functions for operating on those other types. If you care about simplicity, consistency and organization, this is a big difference.
If you have objects, you still need compiler support for closures.

But if you have closures, you need no compiler support for objects.

Either way you need compiler support for other literals like numbers.

Unless you want to encode numbers in the lambda calculus. And even then, closures win.
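
The closures-to-objects direction is the familiar trick. A tiny Javascript sketch (the names are mine, purely illustrative): the instance variables live in the enclosing scope, and a dispatch closure plays the role of the object.

    // A hypothetical point "object" built from nothing but closures:
    // the instance variables live in the enclosing scope, and the
    // returned dispatch function plays the role of the object.
    function makePoint(x, y) {
        return function (message) {
            switch (message) {
                case 'x': return x;
                case 'y': return y;
                case 'moveBy':
                    return function (dx, dy) { return makePoint(x + dx, y + dy); };
                default:
                    throw new Error('does not understand: ' + message);
            }
        };
    }

    // var p = makePoint(1, 2);
    // p('x')                 // => 1
    // p('moveBy')(3, 4)('y') // => 6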

Your name is different, but you're really a Unix system too, aren't you?

Microsoft's Unix from back in the day, Xenix...

Sunday, October 14, 2007

Simplified Javascript: Cruft Reduced

A bunch of comments just showed up, June 21, 2008.

Like it or not (and FWIW, I like it quite a bit), JSON has made an impression on networked information exchange. Douglas Crockford, who started the whole JSON thing, has a chapter in Beautiful Code on Pratt-style parsing. His example parser is written in, and happens to parse, something he calls "simplified javascript".
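
The Pratt idea itself is small. Here is a rough sketch in ordinary Javascript, handling only arithmetic. It is not Crockford's parser; the tokenizer and binding powers are mine for illustration. Each infix operator gets a binding power, and expression(rbp) keeps folding in operators as long as they bind more tightly than rbp.

    // A rough sketch of Pratt-style (top-down operator precedence) parsing
    // for plain arithmetic; this is not Crockford's parser, just the idea.
    function parse(source) {
        var tokens = source.match(/\d+|[+\-*\/()]/g) || [];
        var pos = 0;
        var bindingPower = { '+': 10, '-': 10, '*': 20, '/': 20 };

        function peek() { return tokens[pos]; }
        function next() { var t = tokens[pos]; pos = pos + 1; return t; }

        // nud: how a token behaves at the start of an expression.
        function nud(token) {
            if (token === '(') {
                var inner = expression(0);
                next(); // consume the closing ')'
                return inner;
            }
            return Number(token);
        }

        // led: how an infix token combines with the expression to its left.
        function led(token, left) {
            var right = expression(bindingPower[token]);
            switch (token) {
                case '+': return left + right;
                case '-': return left - right;
                case '*': return left * right;
                case '/': return left / right;
            }
        }

        // The heart of Pratt parsing: keep consuming infix operators
        // while they bind more tightly than the current right binding power.
        function expression(rbp) {
            var left = nud(next());
            while (pos < tokens.length && bindingPower[peek()] > rbp) {
                left = led(next(), left);
            }
            return left;
        }

        return expression(0);
    }

    // parse('2 + 3 * 4')   // => 14
    // parse('(2 + 3) * 4') // => 20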

Simplified JavaScript is just the good stuff, including:
  • Functions as first class objects. Functions in Simplified JavaScript are lambdas with lexical scoping.
  • Dynamic objects with prototypal inheritance. Objects are class-free. We can add a new member to any object by ordinary assignment. An object can inherit members from another object.
  • Object literals and array literals. This is a very convenient notation for creating new objects and arrays. JavaScript literals were the inspiration for the JSON data interchange format.
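
Those three ingredients fit together in just a few lines; all of the names below are illustrative only.

    // First-class functions, object literals, and prototypal inheritance
    // together; all of the names here are illustrative.
    var animal = {
        name: 'animal',
        speak: function () { return this.name + ' makes a sound'; }
    };

    // Prototypal inheritance: the new object delegates to its parent.
    function inherit(parent) {
        function F() {}
        F.prototype = parent;
        return new F();
    }

    var dog = inherit(animal);
    dog.name = 'dog';                     // add a member by ordinary assignment
    dog.speak = function () { return this.name + ' barks'; };

    var animals = [animal, dog];          // an array literal
    var sounds = [];
    for (var i = 0; i < animals.length; i += 1) {
        sounds.push(animals[i].speak());  // functions are ordinary values
    }
    // sounds => ['animal makes a sound', 'dog barks']
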
Implementing some of the currently popular "scripting" languages, in particular "full" Javascript, Python, and Ruby, will make evident the amount of cruft they carry for no apparent reason. Lisp traditionally has had some cruft, but, arguably, nothing like what can be found in these languages. Such cruft makes these languages significantly more difficult to implement than Lisp or Smalltalk (two amazingly "cruft-reduced" languages given their age).

Simplified Javascript would make a decent base for a scripting language. I have pointed this out over the last couple of months to a couple of people looking at full Javascript, first for an Erlang-based server and now for Croquet.

People already in the more or less "end user scripting" space tend to have some Javascript knowledge. This subset has good functional and OO capabilities, comes with a simple parser implementation already(!), and, from implementing a bit of interpretation code, seems amenable to an elisp/emacs level of performance, which is more than sufficient for an interactive GUI system. Compiling to native or byte code would not be too difficult if better performance were needed.

Simplified Javascript not only eliminates the uglier and more difficult-to-implement bits; it also eliminates a lot of bad security problems.

And so, in designing a new scripting capability, combining Simplified Javascript with a capability-based approach to authority, as in the E programming language, could have some benefits. And JSON for "data"... in such an environment you could go ahead and use "eval" to implement the JSON parser.
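
That "just eval it" reader is nearly a one-liner. The sketch below assumes an environment where eval confers no authority, which is exactly the capability-confined setting described above; in an ordinary browser page this would be a bad idea.

    // A sketch of the "just eval it" JSON reader alluded to above;
    // acceptable only where eval confers no authority (for example, inside
    // a capability-confined interpreter), not in an ordinary browser page.
    function parseJson(text) {
        // Parenthesize so a top-level object literal is read as an
        // expression rather than mistaken for a block statement.
        return eval('(' + text + ')');
    }

    // parseJson('{"name": "streams", "sizes": [1, 2, 3]}')
    // => { name: 'streams', sizes: [1, 2, 3] }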

Well, that would be a heck of a lot better than the current browser/javascript situation, which is horrendous. Now we've got a truly undesirable "legacy" situation in the browser. Apparently Mark Miller or one of those E/Capability folks is at Google now and they're looking at something like the browser those folks built for DARPA. (Found in this powerpoint, if you don't mind opening those up in OpenOffice.)

Innovation

From Business Week on improving what already works vs. finding new things that work better...

As once-bloated U.S. manufacturers have shaped up and become profitable global competitors, the onus shifts to growth and innovation, especially in today's idea-based, design-obsessed economy. While process excellence demands precision, consistency, and repetition, innovation calls for variation, failure, and serendipity.

Indeed, the very factors that make Six Sigma effective in one context can make it ineffective in another. Traditionally, it uses rigorous statistical analysis to produce unambiguous data that help produce better quality, lower costs, and more efficiency. That all sounds great when you know what outcomes you'd like to control. But what about when there are few facts to go on—or you don't even know the nature of the problem you're trying to define?

"New things look very bad on this scale," says MIT Sloan School of Management professor Eric von Hippel, who has worked with 3M on innovation projects that he says "took a backseat" once Six Sigma settled in. "The more you hardwire a company on total quality management, [the more] it is going to hurt breakthrough innovation," adds Vijay Govindarajan, a management professor at Dartmouth's Tuck School of Business. "The mindset that is needed, the capabilities that are needed, the metrics that are needed, the whole culture that is needed for discontinuous innovation, are fundamentally different."

Planned RESTful Services

Stu says stuff about reuse...

From this viewpoint, "build it and maybe people will use it later" is a bad thing. SOA proponents really dislike this approach, where one exposes thousands of services in hopes of serendipity -- because it never actually happens.

Yet, on the Web, we do this all the time. The Web architecture is all about serendipity, and letting a thousand information flowers bloom, regardless of whether it serves some greater, over arching, aligned need. We expose resources based on use, but the constraints on the architecture enables reuse without planning. Serendipity seems to result from good linking habits, stable URIs, a clear indication of the meaning of a particular resource, and good search algorithms to harvest & rank this meaning.

This difference is one major hurdle to overcome if we are to unify these two important schools of thought, and build better information systems out of it.

The early "mashups" were certainly serendipitous. I would expect most service providers on the web today are planning for machine-to-machine to a much greater extent than even a couple of years ago.

Is it really "serendipitous" when an organization follows those good web practices and enters into formal-enough agreements with participants, say in an intra-enterprise services situation? No, that seems very much "planned". The biggest hurdle may be a stronger demand to negotiate terms for support and upgrades.

This is necessary for any kind of service, networked or otherwise, that carries significant risks to stakeholders. If you buy SAP systems and bring them in-house, or if you network out to Salesforce, you negotiate terms of service with in-house IT, with the vendors, with the ISPs, and so on.


About Me

Portland, Oregon, United States
I'm usually writing from my favorite location on the planet, the Pacific Northwest of the U.S. I write for myself only, and unless otherwise specified my posts here should not be taken as representing an official position of my employer. Contact me at my gee mail account, username patrickdlogan.