Thursday, June 29, 2006

Research agenda

My PhD research (1983) concerned the effects of a distributed computing architecture on functional programming: a simulation study using the SASL interpreter (D. Turner, 1979), which I parallelized, together with application programs expressed as SASL scripts.
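The core argument behind that parallelization can be sketched in a few lines of modern Python (not SASL, and only a toy, not the actual interpreter): in a pure functional language the arguments of a strict function have no side effects, so independent subexpressions can be evaluated concurrently without changing the result. The helper names here are my own.

```python
from concurrent.futures import ThreadPoolExecutor

def par_apply(f, *thunks):
    """Evaluate the argument thunks concurrently, then apply f.
    Safe for pure expressions: evaluation order cannot matter."""
    with ThreadPoolExecutor() as pool:
        args = list(pool.map(lambda t: t(), thunks))
    return f(*args)

def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# The two recursive calls are independent, hence parallelizable.
result = par_apply(lambda a, b: a + b, lambda: fib(10), lambda: fib(12))
```

A real SASL implementation would do this by parallel graph reduction rather than thread pools; the sketch only illustrates the referential-transparency argument that makes it sound.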

The next phase was a deeper investigation into the nature of this novel programming paradigm (as John Backus proclaimed it in his ACM Turing Award lecture, published in 1978). Seeking answers to why such a 'nice' tool did not become the killer trend in IT while the infamous PC (personal computer) did, I re-examined the philosophy and general outlook of functional programming given by V. Turchin (1988). There I settled my question by accepting his thesis that what matters is the evolutionary character of computing (a long-term trend), not the one-hop destructive replacement (a short-term trend) that has happened in IT up to now. We have to learn to construct systems that have an evolution capability embedded in them. This is his 'metacomputing' paradigm (1996). Turchin implemented the concept in a tool called a 'supercompiler' (from 'supervise' and 'compile': it supervises the execution of a program in order to optimise it).
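The flavour of "supervise and optimise" can be conveyed with a toy Python sketch (the rule and all names are mine for illustration; Turchin's actual supercompiler works by driving and generalizing configurations, which is far more powerful): a driver rewrites an expression tree using a semantics-preserving rule, here map fusion, so that two traversals collapse into one.

```python
def optimise(expr):
    """Rewrite ('map', f, ('map', g, xs)) into one fused map."""
    if isinstance(expr, tuple) and expr[0] == 'map':
        _, f, arg = expr
        arg = optimise(arg)
        if isinstance(arg, tuple) and arg[0] == 'map':
            _, g, xs = arg
            # Fuse the two passes: map f (map g xs) == map (f . g) xs
            return ('map', lambda x, f=f, g=g: f(g(x)), xs)
        return ('map', f, arg)
    return expr

def evaluate(expr):
    """Reference interpreter for the tiny expression language."""
    if isinstance(expr, tuple) and expr[0] == 'map':
        _, f, xs = expr
        return [f(x) for x in evaluate(xs)]
    return expr

prog = ('map', lambda x: x + 1, ('map', lambda x: x * 2, [1, 2, 3]))
fused = optimise(prog)
```

The optimised program computes the same result while traversing the list only once, which is the spirit (if not the machinery) of supercompilation.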

My theoretical work with the Net offers an evolutionary interpretation of the process that builds the Net by adding new components (sub-systems) to it; although it replaces some old ones, it nonetheless accumulates into an ever-evolving complex architecture. So we have a chained construction continuously advancing into something that I seek to comprehend. In a sense I applied the idea of 'metacomputing' to the evolution of the Internet as a method to solve the issue I raised.

My research job in networking in the late 80s quickly brought me to the issue of developing a model of the Net, within the general framework that a novel computing paradigm was emerging. From there I tried to identify its basic principles, much the same way things are done in mainstream computer science. At some point I found 'similar' quests and got encouragement; Rohit Khare's articles are an example. Gradually the early model mentioned above acquired a bigger set of characteristics, which gave me a sense of forecasting Net events and developments. An example is the selection of the WWW line rather than the initial Gopher line, in the early 1990s, as the prevailing trend. This was due to the fact that WWW had greater scaling capability, based on an automation mechanism (using DNS, introducing the URL and HTML), against the manually operated mechanism of Gopher and its tables.

I was lucky to be a member of a Working Group at the time concerned with Networked Information Resources; there I was exposed to the thinking that led to the Web. This led me later to the awareness, as a corollary, that I had created an innovation radar, since I had similar successes in forecasting Net advances. This was my way of 'proving' the correctness of the model.
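The scaling contrast can be made concrete. A URL is machine-parseable, so any client can decompose it and locate the resource automatically (the host part being resolved via DNS), whereas a Gopher menu entry had to be maintained by hand in server tables. A minimal sketch with Python's standard library (the URL itself is hypothetical):

```python
from urllib.parse import urlsplit

def locate(url):
    """Split a URL into the pieces a client needs: the scheme tells it
    which protocol to speak, the hostname is resolved via DNS, and the
    path is handed to the server. No hand-maintained menu required."""
    parts = urlsplit(url)
    return parts.scheme, parts.hostname, parts.path

scheme, host, path = locate("http://info.example.org/hypertext/WWW/TheProject.html")
```

Because every step is mechanical, the scheme scales with the number of servers without any central editorial effort, which is the automation advantage referred to above.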
The present phase of my research is concerned with wrapping up and presenting the model: a convergence of the computing and communication developments of the past up to the present.

Basic questions I have been dealing with:

  • what is this entity that is emerging,
  • what are its origins, and
  • what are its distinguishing points of development?

No doubt, meeting the slogan 'the Net is the computer' gives me a great sense of confidence in my research. What kind of computer is it?

Another angle of my work is that I got involved with Net developments, such as the introduction of multimedia into the Net by the MIME technology (N. Borenstein, 1991), a fact that led me to his associated work such as Computational Email, which matched my novel-machine philosophy.
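MIME's idea — wrapping heterogeneous content types inside one message so that ordinary mail transport can carry multimedia — survives essentially unchanged in Python's standard email package, a descendant of Borenstein's design. A minimal sketch (the message content is of course my own):

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

# One envelope, several typed parts: the receiving agent dispatches
# on each part's Content-Type header instead of assuming plain text.
msg = MIMEMultipart()
msg['Subject'] = 'A multimedia message'
msg.attach(MIMEText('Hello in plain text', 'plain'))
msg.attach(MIMEText('<p>Hello in <b>HTML</b></p>', 'html'))
```

The same mechanism extends to images, audio, and arbitrary binary data via further `MIMEImage`/`MIMEApplication` parts.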
Another example is the introduction of CIDR into the Net, taking place as an in-flight machine change (1995). A collection of facts like these is going to be integrated into my model: an unfolding chain of steps, each adding a component to the Net. What this chain looks like is my next challenge.
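What CIDR changed can be shown in a few lines with Python's standard ipaddress module (the addresses below are illustrative private-range networks): prefix-based aggregation lets several contiguous class-C-sized networks collapse into a single routing entry, which is what kept the global routing tables tractable.

```python
import ipaddress

# Four contiguous /24 networks (old class-C-sized blocks)...
nets = [ipaddress.ip_network('192.168.0.0/24'),
        ipaddress.ip_network('192.168.1.0/24'),
        ipaddress.ip_network('192.168.2.0/24'),
        ipaddress.ip_network('192.168.3.0/24')]

# ...aggregate into one CIDR prefix, i.e. one routing-table entry.
summary = list(ipaddress.collapse_addresses(nets))
```

That this redesign of the addressing scheme was deployed on the live Internet, without stopping it, is exactly the 'in-flight machine change' referred to above.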
About 1999 I constructed a definite picture of the Net: a global (distributed) computing machine, in fact, but not the autonomous device of Turing; rather, the joint man-machine enterprise that Turchin (1996) calls a meta-mechanical machine. The user of the machine is an organic part of it. Turchin's goal is constructive mathematics, so perhaps I am on the right path, since meta-mathematics (Hilbert, 1930) established computing. My model of distributed computing and communication uses concepts from the same background field, and this fact perhaps supports my approach. What I am saying (taking the risk) is that since the origin of computing certainly lies in the 'meta' aspects of mathematics, my fiddling with the 'meta' of computing (modulated as distributed computing, i.e. an evolution step) is promising. This is the idea behind naming this blog 'meta-artificial': the phase after AI. Do not get me wrong, seeking a new computing paradigm has led me into this!

In addition, I hope the presentation of this work will assist the global debate about the Semantic Web, the next step that is being constructed now. I will interpret this step as a new Meta-mechanical Automation Quantum that extends the global Virtual Von Neumann machine. A recent interview (T. Berners-Lee, 2006) about the nature of the Semantic Web fits well as the next frame of the picture I have created of the evolving Net.

These two concepts "Meta-mechanical Automation Quantum" and "global Virtual Von Neumann machine" are the pillars of my model of the Net.