What is 'Internet Systematics'?
It is an evolving pattern of thinking concerning an interpretation of the Internet computing paradigm. It is the result of observing, as well as participating in, the process of building the global Internet over the last 15 years.
The background has been the effort to displace the von Neumann computing paradigm, a prominent line of thinking in Computer Science in the 80s, in order to solve the software crisis.
The starting date may be considered the early 90s, when European research was advancing the idea of the national research network. It was the issue of choosing between the ISO/OSI model and the IETF/TCP-IP model that fired questions such as 'what is a network?' and 'which model is the winner?'
The above questions fused with the ongoing line of thinking described above to give birth to InternetSystematics, a term coined in 1999. The core concept is the type of systems that make up the global Internet and how they evolve.
Why talk about it now?
Because enough supporting data has been collected about it that it now seems useful to connect this line of thinking to other important contemporary domains, such as that of the IRTF's end-to-end research group, which seeks the successor architecture for the next-generation Internet.
It seems to us that a conceptual picture of the global Internet and its evolution is rather useful in general, so we will proceed with its publication.
Another reason is that it has taken us quite a long time to understand where InternetSystematics leads. Why is it important to talk about it? OK, it is a kind of Internet model, but what is its use? Why should one be interested in the concept of metacomputing created and demonstrated by Valentin Turchin, whose works serve as our reference model?
The answer to the above is that we firmly believe the world is staging the process of Meta-Artificial Intelligence, the successor to Turing's Artificial Intelligence, which was staged within the closed premises of Bletchley Park in the 50s, a term that had enormous consequences for our lives. Perhaps something similar awaits this new term.
It is only very recently that we have been able to comprehend net-automation advances, our basic theoretical result, as the definition of a new kind of process: man-machine intelligence evolution. Here again, Turchin's concept of a meta-mechanical process has been the inspirational force.
Talk about Internet Systematics is informationally quite complex; it needs an advanced medium to be communicated. The present 'interface' will become the front end of a more sophisticated medium based on wiki technology, currently under construction.
Note: whatever is in bold needs further blogging and referencing; please await further communication on it.
8 Comments:
Reading a recent interview with T. Berners-Lee about the Semantic Web, I notice his anxiety to clarify the 'down to earth' automation aspects of the SW against the Artificial Intelligence orientation implied by the interviewer's questions. This is in line with IS's interpretation of Net advances as simple automation steps.
Here is an interview about the Semantic Web:
http://www.consortiuminfo.org/bulletins/semanticweb.php
In Anglo-Saxon usage, IS should read 'Internet Systemics', but since the source language is Greek, where the word 'system' is the base concept, '-atic' is the proper derivative suffix, rather than '-ic', which bears no meaning there.
Another 'clever' comment about the IS ontology is that it provides a name to attach to a process, and its output, that does not stay still. At some point it was the Telekom, then the ARPANET, and years later the WWW, the Semantic Web, etc.
Also, to remember: the attempt to describe Internet Systematics on Blogger has proved that the chaotic nature of the task does not lend itself to marshalling. The only good thing about the blog form is the simple day-by-day mapping of the authoring output, which can be time-indexed: what was I producing one month ago? Moving towards a wiki form may help.
Last clever comment: the meta-level aspects of practising Internet Systematics, perhaps allowing a 'system'-based interpretation, are also on the agenda.
What are these aspects?
How did it start? Why did it go in a certain direction? How did it get legitimized? What were the noticeable transitions?
A variety of evaluations of possible applications?
Can it be used as a model to predict issues?
To collect findings? A kind of filter.
I find great similarities with the O'Reilly Emerging Technology Conference.
Note that in Turing's time the model for computing was 'chess playing'. The problem of lacking a model for the Internet has already been expressed, along with the comment that the von Neumann model, already stressed in all directions, does not suffice any more for Internet innovations.