Personal History Of Computing

Lorenzo (Mec-iS) ・ 8 min read

Philosophical foundations for digital programming


Some introductory words

"After the tool, responding only to his hand; after the machines, covering complex tasks and operations but subject to his will; here man is, delegating to automata the care of managing and thinking in his place, on the basis of apparently rational criteria." — H.J. Martin, 1988

I — On contemporary society

“If you want peace, prepare to set yourself free”
“Si vis pacem para remissionem”
“Se vuoi la pace, prepara la liberazione” — A. Capitini

As young, reckless troglodytes in this new Neolithic of competences, we struggle in search of a feeble fire handed down by superior ancestors, who stand beside a new kind of machine-mind demiurge (crafter): Information [1]. Total immersion in media shapes our decisions; three centuries after political individualism, individual action reaches critical mass by leveraging electronics. The flames of valuable minorities (myself, or any other individual) can reach people and be carried on. We are fording on once again, carrying weak flames on thin branches from camp to camp, struggling to remember, to hold a clear representation of human cognition.

This constant intellectual (Latin inter + lego: to collect, to gather [2]) process of camping and moving between representations and frameworks, typical of digitally intensive jobs (communication, finance, coding, programming, engineering, automation, ...), suggests new shades of colour for a "nomadic" approach to frameworks and "theories". Mostly because immobility can breed dogmatism, and because obsolescence is an almost omnipresent threat: to avoid it, we try to build tools to hold up this logical 'architrave' (a cornerstone for our Archive? [11]) [3], which is surely a mathematical, physical, algorithmic and, finally, complex tool. Let's iterate and see what we can say by defining the words "theories", "systems" and "frameworks" in the context of highly interconnected digital networks. This can help us build a caliper to measure reality fairly, with the right amount of subjectivity: an Existentialism for Computability (what we are going to define as being-search). What strings do knowledge networks (networks of digital media) pull on logical thinking and on the Self? We argue that it is the cybernetic yoking of the digital networks themselves, with their new languages, artificial or natural. We draw the shapes of these structures leveraging, among others, the concept of authenticity, aiming to highlight individual possibilities inside the networks that shape and translate Information.

Let's start from this datum: each frame on the monitor of a present-day computer (1920x1080 pixels) carries roughly the same amount of Information the whole telegraphic network transmitted in months during its early days (1920x1080x32 bits can hold approximately 1,300,000 messages). This unit of measure (the bit) has both a logical foundation, as the measure of a decision-searching process, and a physical one, as the expression of a set of signals on a wire (the first point is evident if you consider how telephone numbers could address a receiver in the early age of the telephone). Accustomed as we are to the tools of a natural-language-based (socio-economical) background, we face hidden issues when processing masses of digital information. Behind the general misconceptions about the required tooling lies a missing epistemological update; fortunately, new theories and frameworks come to the rescue; they are just hard to find, and it is a struggle to keep them consistently up to date. We are drifting from a lack of knowledge sources to an over-abundance of them. This demands a constant adjustment of our filtering abilities within "frameworks", with the aim of understanding The Medium, the Internet. As a race driver is conscious of the movement of the engine's pistons, we all need an epistemological structure to interact and stack Information together with digital machines, to feel the "breathing" of the network and its rising or falling currents. The way to such a structure implies a worldwide "philosophic harmonisation of logic", based on two main components in the same feedback loop: Mathematics and Humanism. This is what has been happening with global software products. What is the state of this comprehension? We need to unravel it in a way that preserves diversity while dancing on a common layer; we need the peculiarity of existence, the outcome of searching over an unprecedented number of sources of knowledge.
Ultimately frameworks, like most tools, incorporate knowledge to ease usage, or even to provide black-boxed usage. Is there any difference between these kinds of usage?
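The frame-versus-telegraph comparison above is plain arithmetic, and can be sketched as follows. The figure of 50 bits per telegraph message (roughly ten characters in 5-bit Baudot code) is our assumption, chosen because it reproduces the ~1.3 million estimate in the text:

```python
# Bits carried by one full-HD frame at 32 bits per pixel.
WIDTH, HEIGHT, BITS_PER_PIXEL = 1920, 1080, 32
bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL

# Assumed size of one early telegraph message:
# about ten characters encoded in 5-bit Baudot code.
BITS_PER_TELEGRAM = 10 * 5

messages_per_frame = bits_per_frame // BITS_PER_TELEGRAM
print(bits_per_frame)      # 66355200 bits in a single frame
print(messages_per_frame)  # 1327104, i.e. ~1.3 million messages
```

A single screen refresh, under these assumptions, really does hold on the order of a million early telegrams' worth of raw bits.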

The sensation of a third-party user experience applied to cognitive inputs (estrangement), multiplied across the incredible set of digital media we use, is the central knot of the quipu [4] describing the necessity of moving from framework to framework. The movement/process itself becomes Science (Systems Thinking [5]), and the importance of meta-skills (reading, writing, learning, possibly very fast) surges. Each of these experiences needs representation (reading) and binary decision-making (writing) finely tuned to the medium delivering the experience, to allow understanding. The feedback loop dances between man and medium/machine, and the third space in between: the interface. Reading a proper representation that allows interaction with the medium via the interface (somebody called this loop Cyberspace or, in some quite weird times in the past, Will-Representation [6]) is a matter of knowing the codes, languages and metalanguages on which everything, society, economics, communication, nowadays happens to run. We call this way of solving the problem (aka managing the fire) the 'medium-driven approach to the being-search', as distinct from a theory- (or proof-) driven approach to Knowledge. Some day-by-day practical implementations of this are research, "code literacy" or, more generally, pattern recognition. This now happens not by observing the manifest behaviour of units (that sounds theory-driven, doesn't it?), but by studying the tools they massively use (a more "theoristic" approach) and how those tools are leveraged by the units; this is possibly one way to get some gist of the concept of Complexity. The preferred vehicles of this flow among computing models, media and societies are scientific discourses [8] that, in this case, are hybridisations of natural and artificial languages (algorithms); to handle them we need to absorb the tools of Cybernetics and Computer Science.
As a starting point, what we are trying to do here is to characterise what "framework", "method" and "technique" actually mean in this new context.

II — A nomadic approach: continuous differential adjustment

“And the reason that such complexity is not usually seen in human artifacts (tools or machines) is just that in building these we tend in effect to use programs that are specially chosen to give only behaviour simple enough for us to be able to see that it will achieve the purpose we want.” — S. Wolfram, 2002

Chain production (Fordism-Taylorism) was the way to mass production until recent times. For some time now, industries have developed fully automated facilities, or even entire industry segments, whose output is defined exclusively by machines controlled by a few technicians. This shift from the tool to the robot has opened attractive scenarios for businesses in the services industry, through the reallocation of workforce and improvements in automated tasks and calculations. The goods produced by these facilities have been conquering the markets and finding a place in our lives; through their shapes and materials but, above all, because of the huge immaterial knowledge they carry. How do they influence the psychology and sociology of people? What is their impact on the personal sphere of individuals? Which level of adaptation is necessary to use them consistently? These software-intensive objects (software intended here as the immaterial added value embodied in the products) bring a peculiar epistemological style compared to Tayloristic goods, one that makes them the top vehicles for a new metabolism of knowledge. This trend is becoming even more evident with 3D-printed products and, eventually, future SPIMEs (cloud-based blueprints deployed over automated industrial feedback loops [7]). SPIMEs are ultimately a "fully artificial being-search", while human beings are nowadays, in the context of cognition, a "human-machine being-search".

The process of "to be" [9], being-search for us, is the adaptation of the shapes we use in logical thinking, aiming at a "continuous analysis of different types of transformation" (M. Foucault, 1978; in our interpretation we take "transformation" in the meaning defined by Ashby [10]). Probably the most important group of transformations is the framework/paradigm shift, which appears to happen much more often in cognitive systems operating in highly interconnected networks (what some call the "liquidity" of opinion). It means going beyond the fixed "shape that programs the empirical world by the imposition of his teleology". We assume here that this is a process of abandoning the interpretation of shift as discontinuity,

“… with the aim of removing the terms of totalitarian history from epistemé, using instead differential analysis.” — M. Foucault

This kind of approach, in the end, is typical of well-established media. The modern approach to the experiment (a "proof-based epistemology", or complete Formal Logic, whose first cracks appeared with Gödel) was an imposition of teleology (a mechanical system of actions and reactions). In need of new "stable" shapes, we experience abstract and concrete objects in our representation; their designers impose a representation of reality using, according to Foucault and Wolfram, a kind of unilateral schema on the empirical world. We are witnessing faster changes of representation as Making and Coding grow in popularity. How did programming concepts make a switch of paradigm possible? How does this paradigm switch touch our cognition? How is this ledger of meta-tools (programming languages, protocols, data pipes) that is the Internet radically changing the way people do Science? Some points: less stability, more decision-making in selecting sources, more powerful frameworks (collections of shapes and/or transformations) but ones more prone to evolution and/or obsolescence. And, finally, probably the biggest issue we face: how can we handle being in societies with a high degree and frequency of framework/paradigm shifts?

Do the computational sciences (see computational Linguistics, or computational Biology, Genetics, ...) have the nature of a 'search for code' more than a search for theories? This kind of search has involved mainly mathematical and philosophical works, reaching a terrible conceptual breakthrough (summed up by Hofstadter, 1979) and realised cognitively by the widely distributed Internet. The keyword of this unveiling track is the word medium. Medium as a vehicle, a tool, a machine/medium (a system that is part of a teleological loop): an inter-braiding of human, machine and interface; a way of mediating between increasing heights in the morphology of Knowledge. This is an overview of what we can say so far about the being-search; picking up the first part of our sentence above, "human-machine" or "machine-mind", it is up to us, digital-intensive workers, to give a practical example of what a proper (truthful, authentic) collaboration between humans and computational devices [12] through languages, frameworks and paradigms for software means.

[1] https://www.scientificamerican.com/article/claude-e-shannon-founder/
[2] https://en.wiktionary.org/wiki/lego#Latin
[3] https://en.wiktionary.org/wiki/architrave#English
[4] https://en.wiktionary.org/wiki/khipu#Quechua
[5] https://en.wikiversity.org/wiki/Systems_Thinking
[6] http://critique-of-pure-reason.com/schopenhauers-key-concepts-1-representation-vostellung/
[7] https://en.wikipedia.org/wiki/Spime#Concept
[8] http://www.massey.ac.nz/~alock/theory/foucault.htm
[9] as starting point: https://en.wikipedia.org/wiki/Being_and_Time
[10] https://en.wikipedia.org/wiki/W._Ross_Ashby#Publications
[11] https://pdfs.semanticscholar.org/24d2/3b944b269356f88dcefd5536b02c4636bc3f.pdf
[12] https://en.wikipedia.org/wiki/Garry_Kasparov#Chess_and_computers

Copyright @ Lorenzo github.com/Mec-iS 2007-2019
This is a re-edited, corrected and completed edition of the blog post published here
