What did Farmer Brown Plant?
Mood:
hug me
Now Playing: Pumpkin Patch Pooch - Giant squash plant makes grease-spot out of family pet (excessive go and lots of poo)
Topic: Memories
Before we dive off in the potty pool here, let's take a glance back at the kind of visions Bill Gates had for XML-based technology.
I know this is seeing the world from Microsoft's view, and you may ask what that's got to do with anything, but there's a reason for looking at how Bill Gates viewed the future. It's what he expected to leave as a legacy... not what he's currently stuck with.
Note the January 2003 retirement of the code name "Palladium" and the use of "now referred to as the next-generation secure computing base for Windows".
Microsoft was ready to take this to the field. The marketing posture was cast in that phrase. Have they brought that to the field? Not yet.
But THIS was back in the day when Bill was feeling free and easy.
http://www.microsoft.com/presspass/press/2002/Jul02/07-24NETDayUmbrellaPR.mspx
Bill Gates Details Vision for Phase Two of .NET and Future of Web Services
Microsoft Announces Release Candidate 1 of Windows .NET Server, Previews Next Wave of Platform Technologies
EDITORS' UPDATE, January 25, 2003 -- Microsoft has discontinued use of the code name "Palladium." The new components being developed for the Microsoft® Windows® Operating System, which are described in this article under the code name "Palladium," are now referred to as the next-generation secure computing base for Windows.
REDMOND, Wash., July 24, 2002 — Microsoft Corp.'s Chairman and Chief Software Architect Bill Gates today outlined the company's vision and road map for phase two of Microsoft® .NET, the company's two-year-old software initiative for connecting information, people, disparate systems and devices. This next phase continues to build upon the XML-based interoperability of Web services, broadening the benefits to individuals, developers and organizations of all sizes. These efforts encompass software investments to help break down the technological barriers between people, systems and organizations as well as barriers to greater knowledge, trust and everyday use.
This next wave of technology investment builds upon today's Web services foundation to provide tangible benefits for the IT industry and goes beyond it to support dynamic business relationships between companies. Likewise, information workers will realize even greater productivity gains than they have in the past decade as Web services unlock critical information and enable them to make better business decisions. At the same time, Microsoft is providing a platform for innovation and opportunity that not only serves the needs of customers, but also of partners and the industry as a whole.
"In just two years, we've gone from debut to delivery of the first generation of Microsoft .NET. It's incredibly gratifying to see both its technology and its value to customers proven in the marketplace," Gates said during a briefing for press and analysts. "The broad industry consensus around XML-based Web services gives us a tremendous foundation for breakthrough work in many areas. The focus of phase two of .NET is on software that creates connected customer experiences that transform the way people live and work."
Breaking Down Barriers to a Connected World
Building upon the first phase of .NET that included the delivery of Visual Studio® .NET, a comprehensive suite of developer tools launched earlier this year, as well as the broad support for Extensible Markup Language (XML) and Web services across Microsoft's line of .NET Enterprise Servers, Jim Allchin, group vice president of the Platforms Group, outlined five areas of focus for the future. These areas capture the breadth of the company's future platform investments, all focused on breaking down technological barriers:
• Breaking down barriers between systems and organizations. Tackling the problem of making it easier to connect different businesses and computer systems in a networked world, Microsoft described how it is advancing XML-based Web services working with the industry to provide a comprehensive foundation for distributed computing. Specifically, Microsoft demonstrated technologies that advance the XML Web services foundation to meet the requirements of businesses connecting their own disparate systems securely and reliably. Allchin also announced release candidate 1 (RC1) for Windows® .NET Server, which includes native support for the .NET Framework and will prove to be the most productive platform available for developing, deploying and managing XML Web services. Windows .NET Server will be one of the first products of the second phase of .NET.
• Breaking down barriers to trust. Identifying security, privacy and reliability as critical to realizing trustworthy computing, Microsoft detailed key investments to advance these goals, including "Palladium," a recently disclosed effort to create a new architecture for building trusted hardware and software systems. Microsoft also demonstrated forthcoming Microsoft Passport privacy and consent tools offering users more control of personal information in their digital world. In particular, Allchin demonstrated new technologies that will allow Passport users to easily and explicitly control their personal information on a site-by-site basis, enabling a richer and more private online experience.
• Breaking down barriers between people. Every communications mechanism -- e-mail, phone, instant messaging, group collaboration tools -- forces individuals to adapt to its approach. Microsoft's vision for next-generation communications uses Web services to enhance digital meetings and group collaboration and provides information-agent technology to unify and manage disparate communications mechanisms. Microsoft demonstrated its future direction for real-time communications and collaboration (RTC) server software code-named "Greenwich."
• Breaking down barriers to knowledge. As the volume of digital information continues to explode, a key goal is to help people not only keep up with the growth, but to effectively harness and distill information into knowledge and appropriate action. Microsoft showcased tools and technologies that will help developers and IT professionals unlock information and more readily analyze, visualize, share and act on that information. Highlighted technologies included the next version of SQL Server™, code-named "Yukon," with technologies that will be the first step toward Microsoft's vision of unified data, as well as the forthcoming SQL Server Notification Services for SQL Server 2000, which provides a highly scalable notifications system to alert individuals about new or updated data across a variety of delivery channels.
• Breaking down barriers to everyday use. Creating next-generation digital user experiences that are more useful and compelling and that work simply is a goal for users and technologists alike. Today, millions of people listen to or download their favorite music from the Internet and more than 35 percent of U.S. households take and store pictures with a digital camera and PC. Yet, it is still a challenge to seamlessly tie together the variety of experiences in a way that is useful and intrinsic to users and their needs. Microsoft highlighted key upcoming technologies designed to advance the quality of the user experience, including Windows XP Media Center Edition.
Customer Experiences
During the briefing, Microsoft executives Eric Rudder and Jeff Raikes each addressed how technology investments in the next phase of .NET will benefit a number of customer audiences by delivering an experience very different than the past.
Developers
Eric Rudder, senior vice president of the Developer and Platform Evangelism Division, provided an overview of future versions of Microsoft's flagship developer tool, Visual Studio .NET. Rudder highlighted forthcoming versions of Visual Studio "Everett" edition and Visual Studio for "Yukon." The products will be designed to take advantage of Windows .NET Server and "Yukon," respectively. Rudder also demonstrated Web Matrix, an easy-to-use Web development tool recently released to the Web, which has had tremendous response from the broader developer community as evidenced by more than 100,000 downloads.
IT Professionals
.NET is aimed squarely at three of the biggest IT pain points: connecting disparate systems inside the organization and with business partners, addressing the applications backlog through improved developer productivity, and helping IT "do more with less" in the current economic climate. Rudder also outlined a renewed focus on deployment and operations, including efforts to use Web services infrastructure to make management intrinsic to all applications as well as integrated management solutions that combine development, deployment and operations into a unified process for managing the applications life cycle and delivering customer benefit. The company demonstrated for the first time its "Server Manager Project," which builds on the capabilities of Microsoft Operations Manager and Application Center to deliver end-to-end service management of Web-based applications to allow quicker analysis and resolution of problems.
Information Workers
Jeff Raikes, group vice president of the Productivity and Business Services Group, discussed the challenges that businesses and information workers are currently facing -- such as productivity, organizational and IT efficiency, and disconnected islands of data -- and outlined the key investments Microsoft is making to address these issues. Raikes announced version two of the Office XP Web Services Toolkit, tools that use XML to unlock the data within organizations in a way that is useful and relevant for information workers. Raikes also articulated a vision for the future of the various software investments that companies make on behalf of their information workers, including productivity applications, business applications and collaboration software. Each of these categories is evolving to take advantage of XML Web services and will drive better decision-making, collaboration and productivity.
Consumers
Throughout the day, Microsoft demonstrated how XML Web services will enable a broad array of rich and compelling next-generation user experiences that will break down the barriers to trust, everyday use and people. Specifically, Microsoft showed how forthcoming Passport technology will enable users to have more fine-grained control over how their personal information is managed online as well as how Windows XP Media Center Edition will bring the power of the Windows-based PC to home entertainment. In addition, Microsoft demonstrated future technologies that will enable a unified treatment of people and groups across a range of applications. Microsoft also demonstrated future technology that integrates XML-based Web services, allowing for new visualization and presentation via an immersive, multimedia experience.
Microsoft also highlighted MSN® 8, the newest version of MSN, which will debut this fall, to deliver a smart client with rich offline capabilities that incorporate building-block services such as Passport and .NET Alerts. In addition, MSN 8 will feature dramatically improved spam protection, online safety and security features such as parental-control features and virus protection, as well as a personalized user experience through a new My MSN home page. And, because it deploys updated software seamlessly in the background with availability as a subscription service independent of underlying access, MSN 8 is an example of software as a service in both the technical and business sense.
Enabling Partner Opportunity
Microsoft's technology investments go beyond creating new customer experiences and extend to incredible opportunity for the industry in general and industry partners in particular. Microsoft outlined the principles of the company's long-standing commitment to industry partners, including low cost of doing business and how .NET provides the industry's best total cost of ownership benefits by enabling partners to take advantage of existing skills, investments and assets; faster time to market via an integrated suite of highly productive developer tools and consistent programming across all tiers of an application; and increased revenue opportunities through relevant solutions for IT professionals, business decision-makers and information workers in any industry. Such principles foster the development of a dynamic, healthy partner ecosystem comprising business application vendors, systems integrators, service providers and hardware vendors of all types and sizes.
Founded in 1975, Microsoft (Nasdaq "MSFT") is the worldwide leader in software, services and Internet technologies for personal and business computing. The company offers a wide range of products and services designed to empower people through great software -- any time, any place and on any device.
Microsoft, Visual Studio, Windows and MSN are either registered trademarks or trademarks of Microsoft Corp. in the United States and/or other countries.
The names of actual companies and products mentioned herein may be the trademarks of their respective owners.
Note to editors: If you are interested in viewing additional information on Microsoft, please visit the Microsoft Web page at http://www.microsoft.com/presspass/ on Microsoft's corporate information pages. Web links, telephone numbers and titles were correct at time of publication, but may since have changed. Journalists and analysts may contact Microsoft's Rapid Response Team for additional assistance.
Posted by Portuno Diamo at 12:33 AM EDT
Updated: Tuesday, 29 April 2008 1:01 AM EDT
P. Douglas :
One thing I would like to know: does the author prefer using web apps over comparable desktop apps? E.g. does the author prefer using a web email app over a desktop email client? Doesn't he realize that most people who use Windows Live Writer, prefer using the desktop client over in-browser editors? Doesn't he realize that most people prefer using MS Office over Google Apps by a gigantic margin? The author needs to compare connected desktop apps (vs. regular desktop apps) to browser apps, to gauge the viability of the former. There is no indication that connected desktop apps are going to fade over time, as they can be far richer, and more versatile than the browser. In fact, these types of apps appear to be growing in popularity.
Besides, who wants to go back to the horrible days of thin client computing? In those days, users were totally at the mercy of sys admins. They did not have the empowerment that fat PCs brought. I just don't understand why pundits keep pushing for the re-emergence of thin client computing, when it is fat PCs which democratized computing, and allowed them to write the very criticisms of the PC they are now making.
Posted by P. Douglas | April 30, 2008 3:50 PM
portuno :
"I just don't understand why pundits keep pushing for the re-emergence of thin client compputing, when it is fat PCs which democratized computing, and allowed them to write the very criticisms about the PC they are now doing."
Because businesses and consumers see the move toward offloading the computing burdens from the client to other resources as a smart move. That's why.
Pundits are only reporting what the trends tell them is happening.
Posted by portuno | April 30, 2008 3:59 PM
P. Douglas :
"Because business and consumerism sees the move toward offloading the computing burdens from the client to other resources as a smart move. That's why."
Why is this a smart move? If the PC can provide apps with far richer interfaces that have more versatile utilities, how is the move to be absolutely dependent on computing resources in the cloud (and an Internet connection) better? It is one thing to augment desktop apps with services to enable users to get the best of both (the desktop and Internet) worlds, it is another thing to forgo all the advantages of the PC, and take several steps back to cloud computing of old (the mainframe). Quite frankly, if we kept on pursuing cloud computing from the 70s, there would be no consumer market for computing, and the few who would 'enjoy' it, would probably be confined to manipulating text data on green screen monitors.
"Pundits are only reporting what the trends tell them is happening."
Pundits are ignoring the trends towards connected desktop applications (away from regular desktop apps), which are proving to be more appealing than regular desktop apps and browser-based apps.
Posted by P. Douglas | April 30, 2008 4:21 PM
portuno :
Why is this a smart move?
"If the PC can provide apps with far richer interfaces that have more versatile utilities, how is the move to be absolutely dependent on computing resources in the cloud (and an Internet connection) better?"
The PC can't provide apps with richer processing. The interfaces SHOULD be on the client, but the processing resources needed to address any particular problem do not need to be on the client.
The kind of processing that can be done on a client doesn't need the entire library of functions available on the client.
If your hardware could bring in processing capabilities as they became necessary, the infrastructural footprint would be much smaller.
The amount of juggling the kernel would have to do to keep all things computational ready for just that moment when you might want to fold a protein or run an explosives simulation, would be reduced to the things the user really wants and uses.
An OS like Vista carries far too much burden in terms of memory used and processing speeds needed. THAT is the problem and THAT is why Vista will become the poster child for dividing up content and format and putting that on the client with whatever functionality is appropriate for local computing.
This isn't your grandfather's thin client.
"It is one thing to augment desktop apps with services to enable users to get the best of both (the desktop and Internet) worlds, it is another thing to forgo all the advantages of the PC, and take several steps back to cloud computing of old (the mainframe)."
Why does everyone always expect the extremes whenever they confront the oncoming wave of a disruption event? What is being made available is the proper delegation of processing power and resource burden.
You rightly care about a fast user interface experience. But, you assume the local client is always the best place to do the processing of the content that your UI is formatting.
The amount of processing necessary to build or provide the content that will be displayed by your formatting resources can be small or large. It is better to balance your checkbook on your client. It is better to fold a protein on a server, then pass back the necessary interface data, so you get to see how the protein folding is done in only a few megabytes... instead of terabytes.
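Just to make that delegation concrete, here's a minimal sketch in Python. The endpoint and function names are made up for illustration; the point is only the split between a small local job and a big remote one that returns just the data the interface needs:

    # Sketch only: hypothetical endpoint and names, not a real service.
    import json
    import urllib.request

    def balance_checkbook(transactions):
        # Small job: do it locally on the client.
        return sum(t["amount"] for t in transactions)

    def fold_protein_remotely(sequence):
        # Big job: ship the request to a server and get back only the
        # few megabytes of interface data needed to draw the result,
        # not the terabytes the simulation itself chews through.
        req = urllib.request.Request(
            "https://example.com/fold",              # hypothetical endpoint
            data=json.dumps({"sequence": sequence}).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)                   # e.g. coordinates to render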
"Quite frankly, if we kept on pursuing cloud computing from the 70s, there would be no consumer market for computing, and the few who would 'enjoy' it, would probably be confined to manipulating text data on green screen monitors."
We couldn't continue mainframing from that time because there was not a ubiquitous transport able to pass the kind of interface data needed outside of the corporate infrastructure.
Local PCs gave small businesses the ability to get the computing power in their mainframe sessions locally. And, until Windows, we had exactly that thin client experience on the "PC".
Windows gave us an improved "experience," but at the cost of a race to keep hardware current on a kind of planned-obsolescence schedule.
We are STILL chasing the "experience" on computers that can do everything else BUT format content well - it's why "Glass" is the key improvement in Vista, is it not? It's why the "ribbon" is an "enhancement" and not just another effort to pack more functionality into an application interface...
THE INTERFACE. Not the computing. The interface; a particular amount of content formatted and displayed. Functionality is what the computer actually does when you press that pretty button or sweep over that pretty video.
Mainframes that are thirty years old connected to a beautiful modern interface can make modern thin client stations sing... and THAT is what everyone has missed in this entire equation.
Web platforming allows a modernization of legacy hardware AND legacy software without having to touch the client. When you understand how that happens, you will quickly see precisely what the pundits are seeing. That's why I said: 'Pundits are only reporting what the trends tell them is happening.'
"Pundits are ignoring the trends towards connected desktop applications (away from regular desktop apps) which is proving to be more appealing than regular desktop apps and browser based apps."
Do you know WHY "Pundits are ignoring the trends towards connected desktop applications"? Because there aren't any you can get to across the internet! At least until very recently.
If you're on your corporate intranet, fine. But, tell me please, just how many "connected desktop applications" are there? Microsoft certainly has few, and THAT's even on their own network protocols.
THAT is what's ridiculous.
XML allows applications to connect. Microsoft invented SOAP to do it (and SOAP is an RPC system using XML as the conduit) and they can't do that very well. Only on the most stable and private networks.
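For the record, the plumbing itself is simple - a SOAP call is just an XML envelope POSTed over HTTP. A bare-bones sketch using only the Python standard library (the service URL, SOAPAction and method name here are hypothetical):

    # Sketch only: the URL, SOAPAction and method name are made up.
    import urllib.request

    envelope = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <GetQuote xmlns="http://example.com/stocks">
          <symbol>MSFT</symbol>
        </GetQuote>
      </soap:Body>
    </soap:Envelope>"""

    req = urllib.request.Request(
        "http://example.com/stockservice",            # hypothetical endpoint
        data=envelope.encode("utf-8"),
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": "http://example.com/stocks/GetQuote",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode("utf-8"))            # XML response from the server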
DO IT ON THE INTERNET and the world might respect MSFT.
The result of Microsoft not being on the internet is that their own operating system is being forced into islandhood while the rest of the industry takes the internet as its territory.
It's an architectural thing and there's no getting around those. It's the same thing you get when you build a highway interchange. It's set in concrete and that's the way the cars are going to have to go, so get used to it.
Lamenting the death of a dinosaur is always unbecoming. IBM did it when The Mainframe met the end of its limits in throughput and reach. The PC applied what the mainframe could do on the desk.
Now, you need a desktop with literally the computing power of many not-so-old mainframes to send email, shop for shoes, and write letters to granny. Whose idea of proper usage is this? Those who want a megalith to prop up their monopoly.
The world wants different.
Since there are broadband leaps being carved out in the telecommunications industry, the server can do much more with what we all really want to do than a costly stranded processor unable to reach out and touch even those of its own kind much less the rest of the world's applications.
The mentality is technological bunkerism and is what happens in the later stages of disruption. It took years for this to play out on IBM.
It's taken only six months to play out on Microsoft and it's only just begun. We haven't even reached the tipping point and we can see the effect accelerating from week to week.
It's due to the nature of the media through which the change is happening. With PCs, the adoption period was years. With internet services and applications, the adoption period is extremely fast.
Posted by portuno | April 30, 2008 11:55 PM
P. Douglas :
"The kind of processing that can be done on a client doesn't need the entire library of functions available on the client.
If your hardware could bring in processing capabilities as they became necessary, the infrastructural footprint would be much smaller.
The amount of juggling the kernel would have to do to keep all things computational ready for just that moment when you might want to fold a protein or run an explosives simulation, would be reduced to the things the user really wants and uses."
How then do you expect to work offline? I have nothing against augmenting local processing with cloud processing, but part of the appeal of the client is being able to do substantial work offline during no connection or imperfect / limited network / Internet connection scenarios. Believe me, for most people, limited network / Internet connection scenarios occur all the time. Also, the software + software services architecture minimizes bandwidth demands allowing more applications and more people to benefit from an Internet connection at a particular node. In other words, the above architecture is much more efficient than a dumb terminal architecture, or the one that you are advocating. This means that e.g. in a scenario where you have a movie being downloaded to your Xbox 360, several podcasts automatically being downloaded to your iTunes or Zune client software, your TV schedule being updated in Media Center, your using a browser, etc., and the above being multiplied for several users and several devices at a particular Internet connection, the software + software services architecture is seen to be far better and more practical than a dumb terminal architecture.
Posted by P. Douglas | May 1, 2008 8:04 AM
portuno :
@ P. Douglas,
"How then do you expect to work offline?"
Offline work can be done by a kernel dedicated to the kind of work needed at the time. In other words, instead of a megalith kernel (Vista's is 200MB+) running all functions, you place a kernel (an agent can be ~400KB) optimized for the specific kind of work to be done. This kernel can be very small (because it won't be doing ALL processing - only the processing necessary for the tasks selected - it can be only one of multiple kernels interconnected for state determinism), and the resources are available online or offline (downloaded when the task is selected).
The big "bugaboo" during the AJAX development efforts in 2005 and 2006 was "how do you work offline"? The agent method places an operational kernel on the client which is a mirror (if necessary) of the processing capability on the remote server. When the system is "online", the kernel cooperates with the server for tasking and processing. When the client is "offline", the local agent does the work, then synchs up the local state with the server when online returns.
No online-offline bugaboo. Just a proper architecture. That's what was needed, and AJAX doesn't provide that processing capability. All AJAX was originally intended to do was reduce the latency between client button push, server response, and client update.
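A rough sketch of that agent pattern in Python, with made-up names (the sync URL and task set are hypothetical): the same task runs locally or remotely, and locally computed state is queued and synched up when the connection returns:

    # Sketch only: hypothetical agent; the sync endpoint and tasks are made up.
    import json
    import urllib.request

    class Agent:
        def __init__(self, sync_url):
            self.sync_url = sync_url
            self.pending = []                        # local state awaiting sync

        def run_task(self, task, payload, online):
            if online:
                return self._run_remote(task, payload)
            result = self._run_local(task, payload)  # mirrored local capability
            self.pending.append({"task": task, "result": result})
            return result

        def _run_local(self, task, payload):
            # Only the processing needed for the selected task lives here.
            if task == "sum":
                return sum(payload)
            raise ValueError("task not available offline: " + task)

        def _run_remote(self, task, payload):
            req = urllib.request.Request(
                self.sync_url,
                data=json.dumps({"task": task, "payload": payload}).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                return json.load(resp)

        def sync(self):
            # When the connection returns, push the locally computed state up.
            for item in self.pending:
                self._run_remote("sync", item)
            self.pending.clear()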
"...part of the appeal of the client is being able to do substantial work offline during no connection or imperfect / limited network / Internet connection scenarios."
Correct. And you don't need a megalithic operating system to do that. What you DO need is an architecture that's fitted to take care of both kinds of processing with the most efficient resources AT THE TIME. Not packaged and lugged around waiting for the moment.
"...limited network / Internet connection scenarios occur all the time."
Agreed. So the traditional solution is to load everything that may ever be used onto the client? Why don't we use that online time to pre-process what can be done and load the client with the post-processing that is most likely for that task set?
"Also, the software + software services architecture minimizes bandwidth demands allowing more applications and more people to benefit from an Internet connection at a particular node. In other words, the above architecture is much more efficient than a dumb terminal architecture, or the one that you are advocating."
"More efficient" at the cost of much larger demands on local computing resources. Much larger demands on memory (both storage and runtime). Much larger demands on processor speed (the chip has to run the background load of the OS plus any additional support apps running to care for the larger processing load you've accepted).
You will find there will be no "dumb terminals" in the new age. A mix of resources is what the next age requires and a prejudice against a system that was limited by communications constraints 20 years ago doesn't address the problems brought forward by crammed clients.
"This means that e.g. in a scenario where you have a movie being downloaded to your Xbox 360, several podcasts automatically being downloaded to your iTunes or Zune client software, your TV schedule being updated in Media Center, your using a browser, etc., and the above being multiplied for several users and several devices at a particular Internet connection, the software + software services architecture is seen to be far better and more practical than a dumb terminal architecture."
At a much higher cost in hardware, software, maintenance and governance.
Companies are not going to accept your argument when a fit client method is available. The fat client days are spelled out by economics and usefulness.
Because applications can't interoperate (Microsoft's own Office XML format defies interoperation for Microsoft - how is the rest of the world supposed to interoperate?), they are limited in what pre-processing, parallel processing or component processing can be done. The only model most users have any experience with is the fat client model... and the inefficiencies of that model are precisely what all the complaining is about today.
Instead of trying to justify that out-moded model, the industry is accepting a proper mix of capabilities and Microsoft has to face the fact (along with Apple and Linux) that a very large part of their user base can get along just fine with a much more efficient, effective and economical model - being either thin client or fit client.
It's a done deal and the fat client people chose to argue the issues far too late because the megaliths that advocate fat client to maintain their monopolies and legacies no longer have a compelling story.
The remote resources and offloaded burdens tell a much more desirable story.
People listen.
Posted by portuno | May 1, 2008 12:07 PM
P. Douglas :
"Offline work can be done by a kernel dedicated to the kind of work needed at the time. In other words, instead of a megalith kernel (Vistas is 200MB+) running all functions, you place a kernel (an agent can be ~400KB) optimized for the specific kind of work to tbe done. This kernel can be very small (because it won't be doing ALL processing - only the processing necessary for the tasks selected - it can be only one of multiple kernels interconnected for state determinism) and the resources available online or offline (downloaded when the task is selected)."
I don't quite understand what you are saying. Are you saying computers should come with multiple, small, dedicated Operating Systems (OSs)? What do you do then when a user wants to run an application that uses a range of resources spanning the services provided by these multiple OSs? Do you understand the headache this will cause developers? Instead of having to deal with a single coherent set of APIs, they will have to deal with multiple overlapping APIs. Also it seems to me that if an application spans multiple OSs, there will be significant latency issues. E.g. if OS A is servicing 3 applications, and one of the applications (App 2) is being serviced by OS B, App 2 will have to wait until OS A is finished servicing requests made by the 2 other applications. What you are suggesting would result in unnecessary complexity, and would wind up being overall more resource intensive than a general purpose OS - like the kinds you find in Windows, Mac, and Linux.
"The agent method places an operational kernel on the client which is a mirror (if necessary) of the processing capability on the remote server. When the system is "online", the kernel cooperates with the server for tasking and processing. When the client is "offline", the local agent does the work, then synchs up the local state with the server when online returns."
The software + services architecture is better because: of the reasons I indicated above; a user can reliably do his work on the client (i.e. he is not at the mercy of an Internet connection); data can be synched up just like in your model.
""More efficient" at the cost of much larger demands on local computing resources. Much larger demands on memory (both storage and runtime). Much larger demands on processor speed (the chip has to run the background load of the OS plus any additional support apps running to care for the larger processing load you've accepted)."
Local computing resources are cheap enough and are far more dependable than the bandwidth requirements under your architecture.
"At a much higher cost in hardware, software, maintenance and governance.
Companies are not going to accept your argument when a fit client method is available. The fat client days are spelled out by economics and usefulness."
Thin client advocates have been saying this for decades. The market has replied that the empowerment and versatility advantages of the PC outweigh whatever maintenance savings there are in thin client solutions. In other words, it is a user's overall productivity which matters (given the resources he has), and users are overall much more productive and satisfied with PCs than they are with thin clients.
Posted by P. Douglas | May 1, 2008 2:27 PM
portuno :
P. Douglas:
"I don't quite understand what you are saying. Are you saying computers should come with multiple, small, dedicated Operating Systems (OSs)? "
What do you think Windows 7 will be? More of the same aggregated functionality in a shrinkwrapped package? Would you not make the OS an assembly of interoperable components that could be distributed and deployed when and where needed, freeing the user's machine to use its hardware resources for the user experience rather than as a hot box for holding every DLL ever made?
"unnecessary complexity"????
Explain to me how a single OS instance running many threads is less complex than multiple OS functions running their own single threads and passing results and state to downstream (or upstream if you need recursion) processes.
What I've just described is a fundamental structure in higher-end operating systems for mainframes. IBM is replacing a system with thousands of servers with only 33 mainframes. What do you think is going on inside those mainframes? And why can't that kind of process work just as well in a single client, or a client connected to a server, or a client connected to many servers AND many clients fashioned into an ad hoc supercomputer for the period needed?
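If it helps, here's a toy sketch of that structure in Python - small, single-purpose workers, each on its own thread, passing results and state downstream instead of one monolith juggling everything (the stages are made up for illustration):

    # Sketch only: a toy pipeline of single-purpose workers passing results downstream.
    import queue
    import threading

    def stage(work, inbox, outbox):
        # Each worker does one job and hands its result to the next stage.
        while True:
            item = inbox.get()
            if item is None:           # shutdown signal travels downstream too
                outbox.put(None)
                return
            outbox.put(work(item))

    q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
    threading.Thread(target=stage, args=(lambda x: x * 2, q1, q2)).start()   # double
    threading.Thread(target=stage, args=(lambda x: x + 1, q2, q3)).start()   # add one

    for n in (1, 2, 3):
        q1.put(n)
    q1.put(None)

    while (out := q3.get()) is not None:
        print(out)                     # prints 3, 5, 7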
"Thin client advocates have been saying this for decades."
The most dangerous thing is to say "this road has been this way for years" while driving into the future with your eyes closed.
If your position were correct, we would never be having this conversation. But, we ARE having this conversation because the industry is moving forward and upward and leaving behind those who say "...advocates have been saying this for decades...".
Yada Yada Yada
Posted by portuno | May 1, 2008 3:48 PM