[Recording Start]
Rick Rosner: I agree with the notion that everything, whether it has evolved naturally or been developed intentionally, represents an ordering of the world.
There was a significant period in human history, which could be termed the biblical era, where humans were perceived as distinct and superior to the rest of creation. Prior to this era, it’s plausible that humans, to the extent that they conceptualized themselves, saw themselves as merely another part of the animal kingdom, not exalted in any way. This perspective is suggested by cave paintings and other ancient artifacts, indicating that early humans viewed themselves as integrated into the natural world, without a distinct hierarchy. Then, as civilization progressed, there emerged a belief that humans were somehow divinely appointed as rulers over nature. However, since the Renaissance and the advent of scientific discovery, we’ve been gradually realizing that biologically, we’re not fundamentally different from other animals. Would you agree with that assessment?
Regarding Martin Luther King Jr.’s famous quote about the arc of the moral universe bending toward justice, it can be argued that every form of order is a form of information processing, exploiting regularities in the environment. Thus, the arc of the information processing universe is bending toward AI or increasingly sophisticated forms of information processing. Although not directly related, it’s interesting to note that since we started discussing these topics eight years ago, AI has become mainstream. It reminds me of how interracial couples have become widely accepted in culture, with only fringe opposition. Similarly, AI is now a common selling point for products, though many people, including myself, only have a semi-clear understanding of it. AI used to be associated with either apocalyptic scenarios or robotic companions, but now it’s marketed as an advanced technology capable of sophisticated algorithms and understanding customer preferences.
Let’s shift gears from this tangent.
It’s worth mentioning the different time scales of evolution and innovation. Initially, after the universe formed, there were billions of years with no life, followed by the emergence of life on Earth about four billion years ago, starting with simple organisms that didn’t do much. This period of slow evolution lasted for billions of years. Then, roughly 600 million years ago, evolution accelerated with the advent of complex animals and plants, leading to a rapid increase in species competition and diversity. The arrival of humans marked a significant increase in the speed of innovation. So, we have these epochs: no life, simple life, rapid evolutionary life, and the human era. Now, we might be entering a fifth era dominated by AI and machine learning, which could further accelerate these processes. Would you say that’s a reasonable framework?
One naive argument I tend to make, as I am admittedly naive in this area, is that the technology behind machine learning processes information faster than humans do. We still have only tentative ideas about how humans physically create thoughts and memories. In our discussions, I think we’ve agreed that much of it happens in the dendrites, where constant rewiring stores information. But it’s not just rewiring; before that, there’s reweighting of brain circuits, where some neural pathways are strengthened and others weakened. This ‘soft learning’ eventually becomes ‘hardwired’ as dendritic activity solidifies effective thought patterns on a micro-level. Does that sound like a reasonable argument?
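The ‘soft learning’ versus ‘hardwiring’ distinction above can be sketched in code. This is a toy illustration only, not a model of actual neurobiology; the class name, the consolidation threshold, and the update rule are all invented for the example:

```python
# Toy sketch of 'soft' reweighting followed by 'hardwiring'.
# All names and thresholds are illustrative assumptions, not neuroscience.

class ToyConnection:
    def __init__(self):
        self.weight = 0.5       # soft, adjustable connection strength
        self.hardwired = False  # once True, the weight is frozen

    def reinforce(self, delta):
        """Strengthen or weaken the connection ('soft learning')."""
        if self.hardwired:
            return  # a hardwired pathway no longer changes
        self.weight = min(1.0, max(0.0, self.weight + delta))
        # A repeatedly reinforced pathway eventually 'solidifies'.
        if self.weight >= 0.95:
            self.hardwired = True

conn = ToyConnection()
for _ in range(10):
    conn.reinforce(0.1)  # repeated use strengthens the pathway

print(conn.weight, conn.hardwired)
```

The point of the sketch is only the two regimes: a gradual, reversible adjustment phase followed by a frozen, consolidated state.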
However, while this process in the human brain isn’t slow, it’s certainly slower than what machine learning can achieve. Then again, there’s the issue that humans are generalists, while machine learning, at this point, requires specific programming for specific tasks. That specificity is what lets us harness the power of machine learning, which is not yet general-purpose. Take Watson on Jeopardy, for instance. It doesn’t understand the questions; it’s merely a probability engine. Watson’s ‘knowledge’ that Tycho Brahe was associated with Prague is purely probabilistic, not genuine understanding. It’s an association engine, much like we are, but we have such a rich network of associations that it feels like consciousness.
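The ‘association engine’ idea can be made concrete with a small sketch: candidates are scored purely by accumulated association strength, with no comprehension involved. The association strengths below are invented for illustration and reflect nothing about Watson’s actual architecture:

```python
# Toy 'association engine': no understanding, only co-occurrence
# strengths between clue words and candidate answers.
# All figures are made-up illustrative assumptions.

associations = {
    "tycho":       {"Tycho Brahe": 0.9, "Johannes Kepler": 0.3},
    "observatory": {"Tycho Brahe": 0.6, "Galileo Galilei": 0.5},
    "prague":      {"Tycho Brahe": 0.7, "Franz Kafka": 0.8},
}

def answer(clue_words):
    """Score candidates by summing association strengths; pick the max."""
    scores = {}
    for word in clue_words:
        for candidate, strength in associations.get(word, {}).items():
            scores[candidate] = scores.get(candidate, 0.0) + strength
    return max(scores, key=scores.get) if scores else None

print(answer(["tycho", "prague"]))  # association, not comprehension
```

Given both clue words, ‘Tycho Brahe’ simply accumulates the highest total score; the engine has no idea who he was.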
Machine learning is incredibly fast and powerful once it’s adapted for a specific task, like playing Go. The initial programming, which includes the rules and objectives, is done by humans. After that, machine learning rapidly excels, becoming the best at that specific task. The future development of machine learning will involve less human input in setting up the task, allowing AI to frame its own challenges. A small example of this evolution is seen in Google Translate. It’s suspected to operate as a massive AI association engine across numerous languages and phrases, utilizing a kind of meta-language. This meta-language acts as a central nexus for concepts like ‘love,’ where it understands how this concept interrelates with other words in various languages. So, while some translations are direct and hardwired, more complex phrases might pass through this central associative nexus. This mechanism in Google Translate, acting as an intermediary meta-language, is a step towards a truer AI that can frame its own questions and develop expertise. Does that make sense?
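The meta-language idea above can be sketched as translation that pivots through a shared concept layer rather than through direct word-to-word rules. Real neural translation systems use learned vector representations rather than symbolic concept IDs; the mappings below are purely illustrative assumptions:

```python
# Toy sketch of a 'meta-language' nexus: words from several languages
# map into a shared concept space, and translation pivots through it.
# The concept IDs and vocabulary here are invented for illustration.

to_concept = {
    ("en", "love"):  "CONCEPT_LOVE",
    ("fr", "amour"): "CONCEPT_LOVE",
    ("de", "Liebe"): "CONCEPT_LOVE",
}

from_concept = {
    ("CONCEPT_LOVE", "en"): "love",
    ("CONCEPT_LOVE", "fr"): "amour",
    ("CONCEPT_LOVE", "de"): "Liebe",
}

def translate(word, src, dst):
    """Translate by pivoting through the shared concept layer."""
    concept = to_concept.get((src, word))
    if concept is None:
        return None  # no associative path for this word
    return from_concept.get((concept, dst))

print(translate("amour", "fr", "de"))  # pivots fr -> concept -> de
```

The design point is that adding a new language only requires mapping it into and out of the central nexus, not building direct rules against every other language.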
I agree with that, and the term that comes to mind is ‘wherewithal.’ Before the existence of DNA and RNA, there wasn’t a highly efficient mechanism for passing on information to facilitate reproduction. Theories about what preceded DNA and RNA suggest that in certain environments, like specific silts, there were chemical conditions conducive to forming membranes, which are essential for cell structures. The formation of these proto-cells and the inclusion of proto-genetic material could lead to rudimentary reproduction systems, but it was a slow process due to the lack of sufficient wherewithal. However, once a genetic system was established, gene-based evolution could occur, marking a significant step forward.
The transition from apes to humans, or proto-apes, represents another phase change. When evolutionary glitches began producing creatures with additional brain matter, these animals fared better due to their increased behavioral flexibility, supported by the wherewithal for more complex brain functions. This led to a gradient favoring more brain development.
In the context of AI, there’s a similar phase change. Remember the computers of the 80s and 90s? They lacked the wherewithal for advanced functions. Most computers still don’t possess it, but we’re beginning to see AI systems that are just starting to have the necessary capabilities. Through intentional design and sheer capacity increase, we’re moving towards a significant shift. You can’t just link thousands of old-school computers and expect consciousness to emerge. But if you design computers with AI in mind and increase their processing capacity, we might see phase changes that push these systems towards becoming generalists, capable of framing their own questions and driving their own development, much like humans, but more powerful. Did that come across semi-clearly?
To achieve AI with human-like capabilities, it’s not just about flexible AI programming; there also needs to be a vast amount of circuitry involved.
[Recording End]
Authors
Rick Rosner
American Television Writer
Scott Douglas Jacobsen
Founder, In-Sight Publishing
In-Sight Publishing
License and Copyright
License
In-Sight Publishing by Scott Douglas Jacobsen is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Based on a work at http://www.rickrosner.org.
Copyright
© Scott Douglas Jacobsen, Rick Rosner, and In-Sight Publishing 2012-Present. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Scott Douglas Jacobsen, Rick Rosner, and In-Sight Publishing with appropriate and specific direction to the original content.