Wednesday, 23 November 2011
Going back to basics

To prepare for VCSY's November 27, 2011 filing of its response to Interwoven's claim construction brief, I'm going back to some basics to refresh the view:

 

Was .NET all a mistake?
Written by Ian Elliot  
Wednesday, 03 August 2011 09:04

http://www.i-programmer.info/professional-programmer/i-programmer/2830-was-net-all-a-mistake.html
...the recent unsettling behavior at Microsoft has caused me to re-evaluate my .NET experiences and think hard about where it all came from and where it is all going.

The Windows API technology moved on from C functions to an object oriented technology - COM - and this worked well with Visual Basic as long as it was one of the more sophisticated forms of COM, either ActiveX or COM Automation. Working with COM was, and still is, not an easy thing to do from C++ but most programmers learned how to live with it.

(more at URL)
---------------------------------------

COM was, and is, the Component Object Model; here's a wiki article [1] and a July 2000 tutorial [2] on COM.

It's important to understand at least an overview of Microsoft COM so you can understand the reason for the arbitrary object (component) structures that can be built with VCSY IP and the reason for the war.

We start here: "Component Object Model (COM) is a binary-interface standard for software componentry introduced by Microsoft in 1993. It is used to enable interprocess communication and dynamic object creation in a large range of programming languages."

Stop. First thing to remember: VCSY's IP is a component object model for markup languages, which are not binary but textual.
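To make that contrast concrete, here is a minimal sketch (in Python, with invented names like "Adder") of what a textual component description buys you: where a binary interface is fixed at compile time, a markup description can be read and inspected at runtime.

```python
import xml.etree.ElementTree as ET

# A binary interface (COM-style) is fixed at compile time: the caller must
# know the exact method layout in advance. A textual (markup) component
# description can instead be discovered by inspecting the text at runtime.
component_xml = """
<component name="Adder">
  <method name="add" args="2"/>
</component>
"""

root = ET.fromstring(component_xml)
methods = {m.get("name"): int(m.get("args")) for m in root.findall("method")}
print(root.get("name"), methods)  # the interface is readable from the text itself
```

The component name and method layout here are hypothetical; the point is only that the description travels as text, not as a compiled binary contract.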

[2] "The term COM is often used in the Microsoft software development industry as an umbrella term that encompasses the OLE, OLE Automation, ActiveX, COM+ and DCOM technologies."

Meaning: a variety of software objects treated as components, which developers could use to build larger applications, make up the Microsoft way of creating software and of integrating Microsoft software with non-Microsoft software. (A component is a tiny application; its purpose is to facilitate control of the various capabilities of the operating system, using an abstract set of graphics and text to aid the human developer.)

VCSY IP is a way to build objects[521] that are used as components of larger applications[744],[629]:
[3] "Object Linking and Embedding (OLE) is a technology developed by Microsoft that allows embedding and linking to documents and other objects."

So in Microsoft technology there was a way to build software objects that were to act as processing components that can be plugged (essence of the "plug and play" concept) into connection with each other. The interaction of the specialized capabilities of each component combined to create complex software. Microsoft built its software empire of the 90's and today on COM.
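A toy illustration of that plug-and-play idea, with hypothetical component names: any object exposing the same small interface can be plugged into the pipeline, and the combination of specialized components yields the larger behavior.

```python
# Minimal sketch of "plug and play": each component exposes the same small
# interface (.process), so components can be wired together without either
# one knowing the other's internals. Names here are illustrative, not COM.

class Upper:
    def process(self, text):
        return text.upper()

class Exclaim:
    def process(self, text):
        return text + "!"

def run_pipeline(components, text):
    for c in components:          # anything with .process() plugs in
        text = c.process(text)
    return text

print(run_pipeline([Upper(), Exclaim()], "hello"))  # HELLO!
```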

But more specifically, DCOM: the Distributed Component Object Model.
[4] "Distributed Component Object Model (DCOM) is a proprietary Microsoft technology for communication among software components distributed across networked computers. DCOM, which originally was called "Network OLE", extends Microsoft's COM, and provides the communication substrate under Microsoft's COM+ application server infrastructure. It has been deprecated in favor of the Microsoft .NET Framework."

[629] http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PALL&p=1&u=/netahtml/PTO/srchnum.htm&r=1&f=G&l=50&s1=7,716,629.PN.&OS=PN/7,716,629&RS=PN/7,716,629

[1] wiki: http://en.wikipedia.org/wiki/Component_Object_Model
[2] tutorial: http://www.codeproject.com/KB/COM/comintro.aspx
[3] http://en.wikipedia.org/wiki/Object_Linking_and_Embedding
[4] wiki DCOM: http://en.wikipedia.org/wiki/Distributed_Component_Object_Model

-----------------------------

VCSY patents:

7076521
6826744
7716629

------------------------------

 

We've seen the focus in the claims construction document of 10-11-11 on arbitrary objects and arbitrary object libraries.

Here's a little background as to how Microsoft and Interwoven walked into the corner.

First we can learn a bit of useful information here in May 2005:
http://www.internetnews.com/ent-news/article.php/3504726/Interwoven+in+Microsofts+Gold+Circle.htm


I'm ready to write about the definitions as demonstrated against the history Microsoft and Interwoven had all the way back here:
http://www.microsoft.com/presspass/press/2000/jun00/InterwovenPR.mspx
(June 28, 2000)

by fleshing out what Microsoft and Interwoven and others were trying to do based on Microsoft's own words here:
http://news.cnet.com/2100-1001-241804.html
(June 13, 2000)

http://news.cnet.com/Microsoft-reveals-plans-for-Web-based-software-services/2100-1001_3-242273.html
(June 22, 2000)

here:
http://news.cnet.com/2100-1001-242277.html
(June 22, 2000)

here:
http://news.cnet.com/Commentary-Microsofts-Jupiter-expedition/2009-1069_3-1009533.html
(May 23, 2003)

and here:
http://news.cnet.com/Microsoft-dooms-Jupiter,-readies-BizTalk/2100-1012_3-5160976.html
(February 18, 2004)

and all to end up here:
http://www.zdnet.com/blog/microsoft/more-on-microsoft-jupiter-and-what-it-means-for-windows-8/8373
(January 6, 2011)

...for starters. A whole lot of smoke and hand waving, but Microsoft sloughed the work off to partners in 2004 instead of taking the shot themselves. And we'll see how the "Jupiter" concept has been carried from back then to now.

 

-----------------------------------

Housekeeping. Cleaning up files and putting them here for reference and access. Ragingbull has been down now for four days for "maintenance". The last time this kind of thing happened they came back up with all files prior to January 21, 2003 gone.

-----------------------------------

Since I'm going to assail sensibilities in the area of markup languages, I feel we should also establish a view of markup languages:

=====================
http://en.wikipedia.org/wiki/Markup_language

=====================

http://en.wikipedia.org/wiki/Collaborative_Application_Markup_Language

http://en.wikipedia.org/wiki/Functional_programming

http://en.wikipedia.org/wiki/Declarative_programming

http://en.wikipedia.org/wiki/Logic_programming


Will Microsoft get to a point where a true base for a semantic language is developed?

===========================
PDF is from 2003
http://www.superfrink.net/athenaeum/e2.pdf

(The author argues against using XML as a programming language while proceeding to demonstrate a shallow view of what a programming language can do.)

XML places a processing burden on sending and receiving systems.

Serialisation refers to the process of placing the parts onto a sequential stream that may, or may not, reflect their functional, spatial, or other relationships.

Data streams have been delimited and encoded in various ways since the inception of computers. Annotation has been less common, and is the aspect that now leads many people to suggest that XML data streams are self-describing.

It is partly the potential richness of the annotation of the data stream, together with the adherence to some predefined set of rules(schema), that has led to a misconception, by many in the IT industry, that data streams expressed in XML are, somehow, self describing in a way that their meaning can always be discerned by the receiving system. Nothing could be further from the truth!

It has to "parse" it and it has to "process it". It does the former using generic software that merely understands the structure and syntax. It does the latter by applying what is often described as "business logic" or "business rules" to the data items, so that the data can be validated, placed in appropriate locations in the receiver’s persistent store ( for later processing ), or shipped out to a user interface or ..., etc.

ERROR: The meaning of a <pzagg> cannot be supplied along with the message. BACKTRACKING [or if it were, the messages would be unworkably huge, and would still need to be expressed in terms of an agreed set of lower-level components from which the business logic is built].

Both the patterns and the rules have to be known a priori! That is the essence of business logic. That is the essence of almost any software that needs to process messages. If the receiving system receives an XML tag that is not in its vocabulary, it can do very little with it. It cannot magically impute meaning to it, just as we humans cannot impute meaning to a <pzagg> or a <mzdcyy>.
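The point can be shown directly: a generic XML parser accepts a &lt;pzagg&gt; happily, because parsing understands only structure and syntax, but without an a priori vocabulary the receiver can do nothing with it. A minimal Python sketch, with an invented message:

```python
import xml.etree.ElementTree as ET

# Generic parsing understands only structure and syntax; "business logic"
# supplies the meaning. An unknown tag parses fine but cannot be processed.
message = "<order><qty>3</qty><pzagg>???</pzagg></order>"
root = ET.fromstring(message)              # parsing succeeds: syntax is valid

handlers = {"qty": lambda e: int(e.text)}  # the agreed-upon, a priori vocabulary

for child in root:
    handler = handlers.get(child.tag)
    if handler is None:
        print("cannot process:", child.tag)   # <pzagg> has no imputed meaning
    else:
        print(child.tag, "=", handler(child))
```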

No amount of XML formatting is going to make these systems interoperable. The business logic at each end needs to change. They both need to conform to a common conceptual model.

CORRECT: XML is not intended for consumption by humans, but by machines! And machines are not humans! AND WOEFULLY MISUNDERSTOOD.

DUHHHH: XML was developed to allow flexibility in the way documents could be marked up. The tags to be used for annotating parts of the document, could be specified in a DTD (Document Type Descriptor ) which could be used by receiving systems to "understand" the document structure, and to process it according to the receiving system's requirements. WOOPS, HE JUST DESCRIBED HOW XML CAN TRANSFER UNDERSTANDING FROM HUMAN TO MACHINE AND THROUGH THE MACHINES USING THE xml. SO WHAT'S THE BEEF OF THE ENTIRE FIRST SECTION OF HIS PAPER?

GOOBER: The advent of business to business e-commerce required a mechanism for data and process interoperability, based on exchange of information using web (HTTP) protocols. XML seemed like a good option. It had already been used for limited exchange of web-based resources, using RDF (Resource Descriptor Framework) and RSS (RDF Site Summary). However, the number of concepts that needed to be represented was very small. As XML started to be used to cover a broader range of more complex information structures, the schema syntax became more complex. XML-Schema is a particularly complex specification, and although endorsed and promoted by W3C (World Wide Web Consortium), is only one of a number of schema representations in use. Yet its very existence, and the fact that it substantially overcomes many of the shortcomings of DTDs, has led to a belief by many that it "solves" the semantic interoperability problem, by adequately describing almost any document structure, even if it only does this in the semantic frame of reference of the source system. It is this last rider, that is overlooked by many proponents of XML.

OUR GUY JUST SHOT HIMSELF IN THE FOOT: XML is very useful for describing the structure of data streams; it can tag individual items of data; it can qualify such items; it can associate individual items with others in the stream; it can group items together; and, augmented with XML-Schema or its alternatives, it can allow for constraining, validation, data-typing and other structural niceties that hitherto have been difficult to achieve across a range of operating systems and APIs (Application Programming Interfaces). (DIFFICULT TO ACHIEVE ACROSS A RANGE OF OPERATING SYSTEMS AND APIs. An API is a machine to machine interface just as a graphics page is a human to machine interface.)
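As a rough sketch of what that schema-backed constraining and data-typing looks like in practice (the schema dict below is a hypothetical stand-in for a real XSD, and the element names are invented):

```python
import xml.etree.ElementTree as ET

# Hand-rolled sketch of what XML-Schema-style tooling provides: structural
# constraints and data-typing checked against a declared rule set.
schema = {"price": float, "count": int}   # required elements and their types

def validate(xml_text, schema):
    root = ET.fromstring(xml_text)
    for tag, typ in schema.items():
        node = root.find(tag)
        if node is None:
            return False                  # required element missing
        try:
            typ(node.text)                # data-typing constraint
        except (TypeError, ValueError):
            return False
    return True

print(validate("<item><price>9.95</price><count>3</count></item>", schema))   # True
print(validate("<item><price>cheap</price><count>3</count></item>", schema))  # False
```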

And the critical question: "But what is XML a de facto standard for? What should we adopt it for?" This is where he demonstrates his shortsightedness.



So how would you have a system understand the language it's about to process? Provide a service to feed knowledge about the system into the processor before it processes the language. If the processor doesn't have the ability or authority to digest the process instructions, the processor doesn't get to participate. That's how security should be provided: not by trying to spot a bad deed in a sea of general-purpose capabilities, as in the current paradigm with its hodgepodge of languages and operating elements.

So XML is an attractive programming language because it offers one common, agreed-upon way of parsing and digesting instructions.
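A toy sketch of both ideas at once, using invented tag names: instructions arrive as markup, one generic parser digests them, and the processor refuses any instruction it has not been granted authority over, rather than trying to spot bad deeds after the fact.

```python
import xml.etree.ElementTree as ET

# Instructions arrive as markup; a generic parser digests them; the
# processor refuses anything outside its granted capabilities.
# Tag names ("say", "erase") are invented for illustration.
program = "<program><say text='hi'/><erase disk='all'/></program>"

granted = {"say"}                  # capabilities this processor may exercise

def execute(xml_text):
    results = []
    for op in ET.fromstring(xml_text):
        if op.tag not in granted:
            results.append(("refused", op.tag))   # no authority, no participation
        else:
            results.append(("ok", op.get("text")))
    return results

print(execute(program))
```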

http://msdn.microsoft.com/en-us/library/ms973912.aspx
What's Wrong with the Windows API?

Why not just continue using the Windows API in the .NET environment? You certainly can, using the .NET Platform Invocation Services (referred to as "P/Invoke"). From the Visual Basic developer's point of view, calling the Windows API is no more difficult than using the familiar Declare statement. Using the Windows API has some serious drawbacks in the .NET world, however, and you might consider doing anything you can to avoid it. For example:

    The .NET common language runtime is meant to be platform non-specific. Whenever you use a Windows API call, you're tying your code to the specific platform (that is, a specific version of Windows, and Windows itself, as opposed to some other operating system) on which you wrote it. Converting your code to another platform, should that ever become necessary, will require modifications to each line of code that uses the API call.
    Calling the Windows API (or any unmanaged code in Dlls) from .NET isn't as simple as it was in Visual Basic 6.0. Limitations on the way structures work makes it tricky to pass structures to API calls, for example. In addition, API declarations from Visual Basic 6.0 will need to be altered, because of changes to data types and stricter type conversions.
    Techniques for using the Windows API (and external code, in general) differ from language to language. If your intent is to work in multiple .NET languages, you'll need to master different techniques for each language.
    Code that calls the Windows API requires users calling that code to have permissions to do so. This can compromise the security scheme for your application, and you'll need to plan ahead for this requirement.

The point here is simple: although you can continue to use the Windows API from Visual Basic .NET applications, you generally should look for alternatives provided by the .NET Framework if you can. Although the intent of the .NET Framework was not to shield you from ever having to use Windows functionality directly, the Framework does supply a large number of classes that can help replace your reliance on Windows API calls.
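The same tradeoff shows up outside .NET too. A small Python analogy (stdlib only): hardcoding a platform detail is the raw-API style and ties the code to one OS, while the framework abstraction stays portable.

```python
import pathlib

# Hardcoding "\\" (a Windows-only path separator) is the raw-API style:
windows_only = "data" + "\\" + "config.txt"   # breaks on other platforms

# pathlib is the framework-style alternative: one call, every platform.
portable = pathlib.Path("data") / "config.txt"
print(portable.parts)  # ('data', 'config.txt') on any OS
```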
=============================


Posted by Portuno Diamo at 12:12 PM EST
Updated: Wednesday, 23 November 2011 12:31 PM EST