Thoughts on the connected timelines presented in The Modern History of Computing, Web at 25: Revisiting Tim Berners-Lee’s Amazing Proposal, and What is Web 2.0.
To study evolution is to discuss the interactions of countless elemental building blocks coming together to form the most basic of living things — the cell. It has a limited purpose and, simultaneously, it is a recognized miracle of connected, symbiotic reactions and systems. It’s so miraculous, scientists only need to find one living cell on another planet to stake fundamental claims that could alter our understanding of who we are in this universe.
Just one cell is the singular representation of an infinite potential for life. But to accomplish complex life, the cell has to follow a few rules. It needs a code, of sorts, to live by. The code has to be flexible, allowing the cell to adapt to different environments. The code has to promote connectivity and cooperation with other cells. The code has to be able to be passed on and, with the help of other cells, be modified or rewritten over time to adjust to changes in its surroundings. Those are some of the key aspects required for a cell to combine with other cells and become a tissue, which might work in conjunction with other tissue to create an organ, which might work with other organs to create a system, which may function in a collective of systems to create, for lack of a better word, a being.
Through the course of three collective histories, we see the evolution of computers as not only the creation of machines meant to help humans perform computations but, ultimately, systems now working together to create their own intelligence. The loom and other rudimentary technologies provide the building blocks and, ultimately, inspiration for the mechanical functions of a slightly more complex apparatus. Babbage theorizes what’s possible, and his successors begin building with the mechanical technologies of the day. Turing, well ahead of his time, recognizes the need for computers to rewrite their own instructions in order to “think” progressively better. By the 1950s, advances in chemistry and electronics realize a century of computational work in the form of an actual computer. The miracle cell was now real.
But it wasn’t connected.
Tim Berners-Lee, collector of observations and available tools, not only saw the communication value of computer connectivity, he envisioned a shared data set and eventually imagined (and named) a collective identity we now know as the World Wide Web. Every computer, connected to every computer, would be more valuable than any single computer ever could be. With a few common concepts, Berners-Lee connected the cells into a simple (at the time) system.
It comes as no surprise that the first efforts of Web entrepreneurs to profit from the network mimicked the marketing efforts of the other prevailing economic sectors. It’s what they knew. Working on the proven economic principles of the time, the Web 1.0 philosophy began building capital systems based on proprietary code, copyrighted sales models, and strategies designed to own the specific environment of a maximum number of users. Applying ’90s-era economics to a more advanced, organically organized system didn’t work, and the dot-com failures proved that this was an altogether new economy and ecosystem.
From the ashes of the dot-com bust rose something well beyond a new technology. In fact, the newly-prevailing technologies such as RSS feeds and blogs were truly simplified examples of available technology. What changed was how they were introduced (not applied). A top-down, we-provide-value-to-clients model was turned inside out, allowing instead for the clients (users) to act as symbiotic cells in a system where they provide and increase value to each other.
The basic building blocks became the cell. The cell, once connected, became the system. The system, with user-driven input, is now writing, rewriting, and refining its own collective intelligence. Today we stand at the edge of the next curve. Whether it’s called AI or bots or some other trending term, Turing’s “computer brain” — the collection of parts and systems and symbiotic code that thinks and adapts and thinks better the next time — is a reality in its early stages. It’s at this point we begin to ask …
Is what began as a cell, about to become a being?

I think you have a point about the extent of intelligence and capacity for independent thinking. I can appreciate the sentiment that it starts with a cell. When we think of brainstorming, we think of all the possibilities; this is what the computer does, but with a far quicker response time than a human could ever manage. Could this “being” outsmart humanity?
fascinating thoughts and questions here. the cell metaphor is an interesting one, and with all the news about artificial intelligence and machine learning, it’s valid to ask what all the digital technology cells are evolving into.
what role do we play in that evolution? will there be a day when we won’t be able to control these tools at all? (in some sense, it already feels like we only have so much control… but where exactly does our human influence start/stop?)
It’s interesting to me that storage is now so cheap that nothing gets deleted. Nothing. Even though I’ve “deleted” my Facebook account (it felt great), I know all of that data still exists. Add to that the sheer amount of data being collected: My phone knows where I am at all times; Target knows everything I bought there; my watch knows my heart rate. The only remaining component for computers is, as you stated, the AI component that synthesizes this data into other usable information (which is already being done by marketers, to some degree).
It makes you think twice when you begin typing a search into Google and it seems to know the exact, obscure thing you were about to type.
Every person I know who has deleted their Facebook account shares your sense of relief. They’ve all insinuated that posting and viewing other people’s posts was like having a full- or part-time job. The difference between the full- and part-time job depended on how honest they were about the time they spent posting and/or lurking.