"Part of the problem is that natural language support is a time-consuming and specialized endeavor, especially if multiple languages are to be supported. It should be in the operating system."

Why would this be "in the operating system"?
This could be a "service" that would fit well into a service-oriented architecture. As such, it could be OS-independent and run locally, on the LAN, or on the Internet. Such services could also compete (e.g., on a blackboard), cooperate (also on a blackboard), and/or evolve independently (e.g., within a service-oriented architecture).
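As a rough sketch of what location-independence could look like (all names here are invented for illustration, not an actual API), an application could be written against a small service interface, leaving open whether interpretation happens in-process, on the LAN, or across the Internet:

```python
from abc import ABC, abstractmethod

class NLService(ABC):
    """Hypothetical location-transparent natural language service."""

    @abstractmethod
    def parse(self, text: str, language: str) -> dict:
        """Return some structured interpretation of the text."""

class LocalNLService(NLService):
    """Runs in-process: the fast, offline path."""

    def parse(self, text: str, language: str) -> dict:
        # Trivial stand-in for a real interpreter.
        return {"language": language, "tokens": text.split()}

class RemoteNLService(NLService):
    """Would forward the same call to a LAN or Internet endpoint."""

    def __init__(self, endpoint: str):
        self.endpoint = endpoint

    def parse(self, text: str, language: str) -> dict:
        # e.g., POST {"text": ..., "language": ...} to self.endpoint
        raise NotImplementedError("network transport omitted in this sketch")

def application_code(nl: NLService) -> dict:
    # The application depends on the interface, not on where it runs,
    # so local and remote implementations can compete or evolve independently.
    return nl.parse("natural language support", "en")

print(application_code(LocalNLService())["tokens"])
```

The point of the sketch is only that the component interface, not the deployment location, is what the application binds to.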
Update: Wes responds...
"The problem is that natural language support requires maximum performance and should be available offline. In an application that uses it, the functionality will be tightly integrated with the operation of the application. It should be available as a library in the same way OpenGL or data-access libraries are available in Windows today."

I can see that a natural language service would be compute-intensive and memory-intensive. I don't see the connection between an application and the service being so intensive, though. The communication bandwidth, it seems, could be relatively small compared to the space and time required to compute the results.
Combine this with the need for rapid evolution of the capability itself, and the result is a need for flexibility at the component interface. Why rely on one vendor for this, even if it is Microsoft?
As for offline capability, I agree, but don't we need a general offline capability for many kinds of services? A service-oriented architecture should not be assumed to be always-connected and always-remote.
Finally, comparing this to OpenGL: I suspect the high bandwidth needed between the 3D engine and the display device is a different characteristic from natural language interpretation. Isn't NL a relatively isolated computational problem?