Ask A Genius 85 – Connectome and Genome (2)
Scott Douglas Jacobsen and Rick Rosner
February 10, 2017
*This session has been edited for clarity and readability.*
Scott: You mentioned the digital trace someone leaves. So, if you take the currently popular social media like Facebook, Twitter, and Instagram, people might be able to somehow backtrack how someone processes the kind of information those platforms capture, and then build a rough map of how that person’s brain is laid out over time.
Rick: If you limit your second-level Turing Test to just tweets, you might be able to do a human-mediated imposter of somebody’s tweets (Encyclopædia Britannica, 2016). Hundreds of people are doing that with Trump’s tweets. Every time he tweets, people make parody tweets of whatever he says on Twitter.
If you could human-mediate somebody’s tweets, then you could build software that is not as good as humans at some parts, but better than humans at other parts. In the same way, you can do computer-based textual analysis to find trends that people weren’t previously aware of.
Trends in what kinds of verbs and nouns he uses. Things people were only vaguely aware of. But you still have to run it by people at this point because computers can’t write decent tweets. Even Watson is backed up by teams of dozens of people, because the statistical patterns it finds have to be interpreted by people (TechTarget, 2017).
You can run a computer analysis of Trump’s tweets. You could find things that the people who write fake Trump tweets are only vaguely aware of, and once those things are made clear, the fake Trump tweeters would become more effective at their job of fake tweeting.
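The kind of textual analysis described here can be sketched very simply: count which words a body of tweets over-uses, and surface patterns a human imitator might only be vaguely aware of. The sample tweets and the function below are invented purely for illustration; they are not drawn from any real dataset or from the conversation itself.

```python
# Toy sketch of word-frequency analysis over a set of tweets.
# The sample tweets are invented for illustration only.
from collections import Counter

def word_frequencies(tweets):
    """Return a Counter of lowercase word counts across all tweets."""
    counter = Counter()
    for tweet in tweets:
        # Strip common punctuation and normalize case before counting.
        counter.update(word.strip(".,!?").lower() for word in tweet.split())
    return counter

sample_tweets = [
    "Tremendous crowd tonight. Amazing!",
    "The failing media is very unfair. Sad!",
    "Tremendous success, believe me.",
]

freqs = word_frequencies(sample_tweets)
print(freqs.most_common(3))
```

A real analysis would go further, e.g. comparing these counts against a baseline corpus and tagging parts of speech, but even raw frequencies make characteristic word choices visible.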
There was an episode of Black Mirror where a woman’s boyfriend dies. She orders a simulation of him based on his social media presence. Since it is a science fiction program, the simulation is eerily accurate. That’s where the creepiness of the episode comes in.
Along with higher and higher degrees of fidelity in replicating somebody’s behaviour, and then eventually their inner life, there will be numerical indicators of how accurate that replication might be.
We know that we’re okay with less than 100% replication because we change from day to day. Nobody wants to live the same day over. Even in Groundhog Day, the same day happens over and over, but Bill Murray’s character accumulates information.
We are okay with forgetting information. It doesn’t bug us: all of the things that we’ve forgotten, some of the things that we thought we’d always remember and don’t. We are okay with the degree of fidelity with which we reproduce ourselves from day to day.
Since beggars can’t be choosers, we’ll probably be okay with not-great levels of technical resurrection when those are the only means of resurrection. From day to day, we have better than 99.9% fidelity.
Anything we liked about ourselves yesterday, we can find in ourselves today to 99.9%+ accuracy. Somebody said, “Don’t you wish you had the innocence and wonder you had at 8 years old?” We can’t do that.
We can sit, think, and remember. Maybe, we can replicate the feeling of us as 8-year-olds to about 60% fidelity. Although, that leads to us needing to figure out what we mean by fidelity because most of the experiences for the 8-year-old are in there, but the brain architecture has changed too much.
So, you need things that trigger memory. It is not that you are constantly remembering things from when you were 8 years old; you remember them when circumstances prompt it. We need to learn more about brain architecture and consciousness. I assume that replication will become acceptable to big enough segments of the population to be commercially viable when replication offers 70–80% fidelity.
However, I don’t know how far off that is, or how far along we are, to know what 70–80% fidelity would look like. We will figure it out. Eventually, we will be able to replicate people’s consciousness at a fidelity only a few degrees worse than our daily fidelity, however we decide to define it. It will eventually become good enough to be in the high 90s, where if somebody is dying and doesn’t want to, they will be able to come back with 96% accuracy. There will be a bunch of stuff that is lost.
More stuff will be retained than lost. They’ll still have some version of themselves, or something they can accept as a version of themselves, which is not too far from the person they used to be. There are processes associated with illness and aging that reduce our fidelity. Alzheimer’s is a disastrous destruction of fidelity. I’ve heard of something called ‘pump head’ (Fogoros, 2017). That’s not the technical term for it.
When someone is stuck on a heart-lung machine for 8 or 10 hours during cardiac bypass or some other surgical procedure where they need to shut down the heart, the mechanical pump doesn’t have as smooth an action as your own heart, and it beats up your blood cells.
The battered blood cells tend to clump together and make little clots or blockages. A lot of people coming out of heart surgery, within the few weeks after, lose a lot of their identity because they’ve had a lot of teeny little strokes from the beat-up blood cells making lots of little blockages in the brain.
It reduces the fidelity, the definition, the sharpness of moment-to-moment awareness. It sucks the joy out of people because it is like being wrapped in gauze at all different kinds of levels. My step-dad had cardiac bypass. It blunted his emotions.
Not that he was ever super emotional, but I was talking with my mom today, and the doctor said, “Yeah, people lose a feeling for things because their brains get beaten up.” Eventually, most people who get pump head are able to have their brains establish new pathways to work around some of the damage.
People come back to themselves over a period of months, even years. Plus, people get used to the new, reduced definition of moment-to-moment awareness. So, you already see different levels of fidelity.
We will see more mechanically aided fidelity. Now, to go off on a tangent, you and I and other people have talked about how the different subsystems of the brain have to understand each other for consciousness to exist and for the brain to process information efficiently.
Every specialist subsystem in the brain needs to have a rough understanding of the work product of every other specialist subsystem, which, at first thought, makes you think that each part of the brain needs to have developed this language of understanding.
Some kind of translation mode that lets it understand what every other part of the brain is telling it, which seems like a big pain in the ass technically, biologically. It seems like a huge burden that every little part of the brain has to understand every other part.
But if you look at the information in consciousness as a universe, with its own space and time, it may be that the language of understanding is tacitly built in, because the different clumps of information in the brain have shared histories with each other.
They developed along with each other. If you’re looking at information as the universe, the information looks like it came from a Big Bang with a shared history being generated as matter clumps up and emits gravitationally derived energy that travels throughout the rest of the universe, which makes the universe more and more defined (Wollack, 2014).
It is the apparent expansion of space. You start with a hot, small, undefined universe. Then you end up, an apparent few billion years later, with objects in space being fairly precisely defined relative to the overall size of the universe. Maybe that shared history builds in its own tacit understanding.
So, you have these clumps of information that can be seen as galaxies if you’re considering the analogy to extend to our actual universe. You can ask, “How does one galaxy of information understand what’s going on in another galaxy of information?”
The answer is that they were once very close, spatially, and as they’ve grown apart, they have been continually exchanging or beaming energy past each other, with that energy being absorbed into the scale and shape of space, making space apparently expand. Maybe that constant flooding of every galaxy with every other galaxy’s energy, or flooding of the universe that contains all of these galaxies with photons that lose energy, leaves the lost energy as tacit information shared with space and the objects that space contains.
Maybe you get that understanding, not for free, but without going to any lengths beyond the natural processes of the universe, with those natural processes being seen as informational, as information acting according to the rules of information.
Of course, we’re limited by only seeing a momentary slice of the universe’s understanding of itself, which is proportional to the apparent age of the universe. We’ve only been astronomically observing the universe for a tiny temporal slice of that self-understanding. If it takes 30 billion years for the universe to have a thought, then we’re only going to have a 300-year slice of that information. So, we don’t understand anything, but we have a different way of understanding it.
Where the universe doesn’t understand its own information as a universe, it understands that information as what it means: a model of the world that the universe is getting information from.
- Encyclopædia Britannica. (2016, March 14). Turing test. Retrieved from https://www.britannica.com/technology/Turing-test.
- Fogoros, R.N. (2017, January 7). Pump Head – Cognitive Impairment After Bypass Surgery. Retrieved from https://www.verywell.com/pump-head-cognitive-impairment-after-bypass-surgery-1745241.
- TechTarget. (2017). IBM Watson supercomputer. Retrieved from http://whatis.techtarget.com/definition/IBM-Watson-supercomputer.
- Wollack, E.J. (2014, January 24). Foundations of Big Bang Cosmology. Retrieved from https://map.gsfc.nasa.gov/universe/bb_concepts.html.
Fogoros (2017), Pump Head – Cognitive Impairment After Bypass Surgery, states:
A study from Duke University, published in the New England Journal of Medicine in February, 2001, confirms what many doctors have suspected, but have been reluctant to discuss with their patients: A substantial proportion of patients after coronary artery bypass surgery experience measurable impairment in their mental capabilities.
In the surgeons’ locker room, this phenomenon (not publicized for obvious reasons) has been referred to as “pump head.”
In the Duke study, 261 patients having bypass surgery were tested for their cognitive capacity (i.e. mental ability) at four different times: before surgery, six weeks, six months, and five years after bypass surgery. Patients were deemed to have significant impairment if they had a 20% decrease in test scores.
This study had three major findings:
- Cognitive impairment does indeed occur after bypass surgery. This study should move the existence of this phenomenon from the realm of locker room speculation to the realm of fact.
- The incidence of cognitive impairment was greater than most doctors would have predicted. In this study, 42% of patients had at least a 20% drop in test scores after surgery.
- The impairment was not temporary, as many doctors have claimed (or at least hoped).
The decrease in cognitive capacity persisted for 5 years.
Fogoros, R.N. (2017, January 7). Pump Head – Cognitive Impairment After Bypass Surgery. Retrieved from https://www.verywell.com/pump-head-cognitive-impairment-after-bypass-surgery-1745241.
Wollack (2014), Foundations of Big Bang Cosmology, states:
The Big Bang model of cosmology rests on two key ideas that date back to the early 20th century: General Relativity and the Cosmological Principle. By assuming that the matter in the universe is distributed uniformly on the largest scales, one can use General Relativity to compute the corresponding gravitational effects of that matter. Since gravity is a property of space-time in General Relativity, this is equivalent to computing the dynamics of space-time itself. The story unfolds as follows:
Given the assumption that the matter in the universe is homogeneous and isotropic (The Cosmological Principle) it can be shown that the corresponding distortion of space-time (due to the gravitational effects of this matter) can only have one of three forms, as shown schematically in the picture at left. It can be “positively” curved like the surface of a ball and finite in extent; it can be “negatively” curved like a saddle and infinite in extent; or it can be “flat” and infinite in extent – our “ordinary” conception of space. A key limitation of the picture shown here is that we can only portray the curvature of a 2-dimensional plane of an actual 3-dimensional space! Note that in a closed universe you could start a journey off in one direction and, if allowed enough time, ultimately return to your starting point; in an infinite universe, you would never return.
Before we discuss which of these three pictures describe our universe (if any) we must make a few disclaimers:
- Because the universe has a finite age (~13.77 billion years) we can only see a finite distance out into space: ~13.77 billion light years. This is our so-called horizon. The Big Bang Model does not attempt to describe that region of space significantly beyond our horizon – space-time could well be quite different out there.
- It is possible that the universe has a more complicated global topology than that which is portrayed here, while still having the same local curvature. For example it could have the shape of a torus (doughnut). There may be some ways to test this idea, but most of the following discussion is unaffected.
Matter plays a central role in cosmology. It turns out that the average density of matter uniquely determines the geometry of the universe (up to the limitations noted above). If the density of matter is less than the so-called critical density, the universe is open and infinite. If the density is greater than the critical density the universe is closed and finite. If the density just equals the critical density, the universe is flat, but still presumably infinite. The value of the critical density is very small: it corresponds to roughly 6 hydrogen atoms per cubic meter, an astonishingly good vacuum by terrestrial standards! One of the key scientific questions in cosmology today is: what is the average density of matter in our universe? While the answer is not yet known for certain, it appears to be tantalizingly close to the critical density.
Wollack, E.J. (2014, January 24). Foundations of Big Bang Cosmology. Retrieved from https://map.gsfc.nasa.gov/universe/bb_concepts.html.
Scott Douglas Jacobsen
Editor-in-Chief, In-Sight Publishing
American Television Writer
License and Copyright
In-Sight Publishing and In-Sight: Independent Interview-Based Journal by Scott Douglas Jacobsen is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Based on a work at www.in-sightjournal.com and www.rickrosner.org.
© Scott Douglas Jacobsen, Rick Rosner, and In-Sight Publishing and In-Sight: Independent Interview-Based Journal 2012-2017. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Scott Douglas Jacobsen, Rick Rosner, and In-Sight Publishing and In-Sight: Independent Interview-Based Journal with appropriate and specific direction to the original content.