Sudden Disruption - seeking simple answers to complex problems, and in the process, disrupting the status quo in technology, art and neuroscience. The Gnostic Neuron - Part 1 - A Simple Model of a Complex Brain (posted February 4, 2023; updated January 8, 2024).<p></p><div class="separator" style="clear: both; text-align: center;"><br /></div><br /> Part 1<div><p>(First version posted July 17, 2020)</p><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEikKnaqMcsmy3xUih_RTd6hOOowzWuxfXtOXogRbKydU1MftPvbK3qKsDasiuXovVaOtRsGp8t7TlUF-_EHA9IJG_vS2Ruk_m3Av0BqZjWANvEfJCT8O3GrGXJMXvCwqVNidhec-y4p1pfhR0dylBcneB4rxyUdL68Rgzv4UZ6YYPUO1pFVfT4/s525/7azrm3.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="475" data-original-width="525" height="581" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEikKnaqMcsmy3xUih_RTd6hOOowzWuxfXtOXogRbKydU1MftPvbK3qKsDasiuXovVaOtRsGp8t7TlUF-_EHA9IJG_vS2Ruk_m3Av0BqZjWANvEfJCT8O3GrGXJMXvCwqVNidhec-y4p1pfhR0dylBcneB4rxyUdL68Rgzv4UZ6YYPUO1pFVfT4/w640-h581/7azrm3.jpg" width="640" /></a></div><br /><p><br /></p><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">What if I told you… that Morpheus never actually said the above words?</span></p><p><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">What if I told you that what 
you “know” about the above quote is the result of the Mandela Effect, and so isn’t actually true? Does that make the stated assertion a lie? (Which once again validates everything you know?) Knowledge can be a slippery business.</span></p><p><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">What if I told you the reason for this false memory was that this meme was a better one-line summary of what Morpheus DID say during this pivotal disclosure? And that this better summary was independently created by some and passed from person to person over the last couple of decades to the extent that it replaced the original script of the movie in our collective cultural memory.</span></p><p><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">If you recognize the quote, you probably remember it as spoken by Morpheus from the movie, “The Matrix”. But your memory is wrong. Morpheus never uttered the above line. Go ahead, look it up. Or watch the movie again. I did. </span></p><p><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">So is the above quote a glitch in the Matrix? Nope. It’s a bit of</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> false</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> knowledge created by you, me, and many others from a cultural distillation of the conversation Morpheus had with Neo during Neo’s introduction to the “real world” in the film. The consequences of that scene in the film were so emotionally dramatic that you constructed knowledge about it which has ultimately become a collective cultural cue, as others have done something similar with the experience. </span></p><p><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">That first part, “What if I told you…”, is meant to make you challenge what you think you know. The second half invalidates that knowledge. 
The quote embodies such a jarring summary that it has even become a meme on the internet for issues both trivial and profound. Trivial when creating humor, and profound in its impact. (By the way, cueing you with that visual image and text sets you up for the Mandela Effect.)</span></p><p><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">If you're like me and most others, you probably knew FOR SURE that Morpheus actually said the above line. So much for the accuracy of memories. So much for what we KNOW. I present this meme as an example of the actual nature of knowledge, which is far less reliable than we generally think, and yet in other ways, far more useful than the actual truth, or what we know “for sure”.</span></p><p><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The essence of the above assertion is that what Neo had experienced all of his life was not reality, but a computer simulation. That was the jarring part.</span></p><p><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">“Ironically, this is not far from the truth.”</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> - actually said Morpheus</span></p><p><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Except for the “computer” part and the “D” cell energy aspects, a simulation is a reasonable way to describe how the brain models the world. So is our experience of life a simulation? Yep. Actually, a sparse one, but the consequences are just as extraordinary as what was depicted in the movie. 
Just without the Kung Fu, or ability to dodge bullets.</span></p><p><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The simulation running in your brain is actually a dynamic collection of ionic neural signals in a cauldron of chemistry, but we’re getting the cart way ahead of the horse. We first need to understand the nature of these signals and the chemistry they affect, and how they, in turn, are affected by other signals and chemistry. This is where I need to unplug you from the Matrix of your left-brain with its perspective of information technology and bring you into</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> the wisdom</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> </span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">of </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">your more knowledgeable right-mind.</span></p><p><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"></span></p><p><br /></p><p><br /></p><div><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">"</span><b style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">If you can't explain it simply, you don't understand it well enough.</b><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">" - Albert Einstein</span></div><span><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I know why neurons fire, and I </span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">understand it well</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> enough to explain it in a</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> relatively </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">simple fashion, especially for such a difficult topic. I’m serious. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Researching the nature of neural connection and the concept of “knowledge” led me to a startling conclusion based on a single radical, yet simple idea:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Neurons create knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">More specifically, neurons literally create and define knowledge at the instant that they fire, and then they use this knowledge to cue scripts of muscle movement, yielding behavior. Such "knowledge" could also be described as a decision to fire, releasing its chemical signal. What does this even mean? How can biology make a decision, creating something as abstract as knowledge, let alone define it? 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Most knowledge can not be expressed as language, nor are words even needed for this pervasive and dynamic body of internal knowledge. Yet words are literally the expression of knowledge, a representation of knowledge outside of the skull and apart from the body.</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> Knowledge is the ethereal relationship between things.</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is often confused with information, but a bit of knowledge is very different from an ordinary binary bit and would require many more of the digital kind to encode what it delivers. The encoding of knowledge is dependent upon what it moves or might move, and how this movement affects the world before being re-sensed in a continuous loop with the world. 
</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Information is how knowledge is managed outside of the neuron or the brain in macro. Information is an attempt to objectify knowledge, but never quite hits the mark. It's typically a disembodied and refined REpresentation of specific knowledge fixed in some medium in the real world outside the skull. </span><span style="font-family: Arial; font-size: 18.6667px; white-space-collapse: preserve;">Knowledge knows. Information merely informs.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Data and information are more formalized, and frozen, knowledge. Knowledge is mostly internal and changing. Information is mostly external and fixed, but not exclusively so in either case. Knowledge is ancient, and information is modern; but again, not exclusively so in either case. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Am I being redundant with what might at first appear to be a minor exception? Yes, and on purpose. Repetition is part of how knowledge becomes information. It's why chanting came into existence: knowledge striving to be stored as information in the oral tradition. Both knowledge and information remain critical in this "age" of information. As long as we don't forget that it started with knowledge. 
Which is the point.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge occurs far more often, and with far less quality, than is generally assumed. The trick is in how we define and think about knowledge. If we relax and expand its definition in a very specific way, some fairly magical things begin to happen in modeling our multifaceted neurons, brain, and our world in general. The key is to understand the actual nature of knowledge. And how neurons create it. Neurons create knowledge, and knowledge is everything that isn't real. Knowledge is ethereal. This assertion calls for detailed clarification, which I’ll provide in due course, but here’s a quick overview: </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It's widely assumed that knowledge and information are the same, or at least very similar things. They are not. They somewhat share a spectrum of quality and utility. 
Knowledge is pervasive and ORGANIC proto-information subjectively relative to a neuron, the brain, or a person.</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> Most k</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">nowledge generation is inherently biological, and there's far too much of it to even think about most of the time.</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Information is knowledge objectively viewed from the outside of a neuron, brain, or person. It's usually a more refined, abstracted, and relatively tiny subset of knowledge managed consciously in a physical form, such as words in the form of sounds or written text. This paragraph is a RE-presentation of knowledge elevated to the form of digital information. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Information can be sent as a signal if both ends agree upon its meaning typically represented by a state in some medium such as this text. This type of information signal is the objective form of knowledge.</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> Knowledge can be sent as a signal as well, but agreement is not required.</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> Instead, meaning evolves. The knowledge signaled by a neuron is far more dynamic but far less accurate and consistent. Agreement as to its meaning is a constantly changing process. Knowledge means what knowledge moves. Neurons only aspire to achieve consistency. 
They often fail gloriously. But typically in some useful fashion.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Information is normally defined by more objective consensus usually represented in some medium outside the skull. These information "states" only change for logical reasons. </span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In contrast, neuronal knowledge starts from within and is inherently</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> subjective,</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> analog, ephemeral,</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> and ethereal</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">. It's also often surprisingly incorrect, and even illogical. Each bit of knowledge is the product of a specific neuron, at a specific moment, and only exists for that moment, useful or not. Knowledge is far more pervasive but far less reliable than information. Information informs us. Knowledge moves us. Sometimes. 
The exception is the key difference between objective information and subjective knowledge.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">If you're a technologist, the idea that knowledge is more primal and more organic than information should challenge your understanding of information theory, but the truth of this assertion is intuitively built into our language. You may already have a "feel" for this assertion. I'll try to flesh it out for you. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Of course, knowledge can also be captured in an inanimate book, but that's a RE-presentation of knowledge becoming information. The genesis of knowledge has an organic association created by our culture and language. Would you say that a door "knows" how to close? Even if it's spring-loaded? Why not? Even writing the question is intuitively awkward. Yet, "you" could be comfortably described as knowing how to close a door. Such language would be in good form. Also, this description of subjective and organic knowledge is not limited to humans. A horse knows the way home. A dog may know how to roll over. But would you ever attribute such "knowledge" to the typical automobile even if "turning over" is how we often describe the engine starting? That engine doesn't "know" how to turn over. The point is that we intuitively KNOW the difference between authentic organic intelligence and the artificial kind. So far. 
Modern </span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">AI is finally blurring the line.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">(It was while watching videos of Tesla automobiles using Full Self-Driving that I first noticed the drivers comfortably describing the car's operation in more organic terms, crossing a sacred line between the living and the non-living - "</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">the car can see the pedestrian", or "the car knows where that bicycle is headed".)</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">I'm getting a bit far afield for this introduction, but such subtle differences are just the beginning. 
There's much, much more to introduce.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><b>The brain in macro, and even each neuron, are multifaceted.</b> </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">How can a lone neuron with only a single output be multifaceted? That answer for me was the key to breaking a logjam of logic - metaphorically. Such a thing is possible because there are multiple ways of creating the same knowledge in the same neuron just as there are multiple ways of using such knowledge from the same neuron, such as freeze, fight, or flight, to name three of the most obvious examples. The ratio of activating versus inhibiting</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> synapses in each neuron is a critical hint.</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> Each facet in a single neuron both competes and cooperates with the other facets for control of when to fire the chemical signal for that</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> given neuron, neural net, or muscle group of the body.</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> And one facet need not preclude another. 
Or it may.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">This multifaceted yet unitary aspect of knowledge creation from a single neuron could be compared to a reversible robe in its simplest form. Such a robe may appear different to the world and even feel different when wrapped around you, yet keep you essentially just as warm, worn either way. Now think of such a robe with even more than two sur-faces (making it multi-faced and multifaceted), a type of multivariant robe that yields an invariant result. The key is understanding that the essential bit of knowledge in this case is warmth. This idea can also be described as "flexible invariance" which may seem like a contradiction in terms or even a paradox, but only if you think about it logically. The neuron has no such restriction.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">In a similar respect, the brain in macro form has different ways of coming to know the relationships between things in its environment from multiple senses in the same, or similar, time frames. Our brain also has multiple ways of responding to such complex experiences. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Multiple faces confront the world for both input and output, sense and behavior. Sensing does not determine actual behavior. 
Neurons and the brain in macro do.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">This multifaceted nature of the neuron and brain, in general, is the reason for the seemingly contradictory behavior we describe as cognitive dissonance, passive-aggressive behavior, and hypocrisy in general. Multifaceted and competing neural nets also explain delusion, hypnosis, false memory, and optical illusion. Indeed, what we come to "know" are various types of illusion, some more useful than others, by degrees. Understanding our multifaceted nature is key to managing our seemingly conflicting behavior. Yes, the details get a bit complex, but would you expect anything less from such an efficient and resilient survival solution as the</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> brain</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">? 
And the</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> neuron</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">?</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><b>Knowledge Cues Scripts</b> </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Most significantly from an information theory perspective, neither knowledge nor its signal is stored as a fixed “state” in the neuron, or anywhere else in the brain. Instead of storing states, neurons evolve a very specific “sensitivity” to each experience much like an immune cell becomes sensitive to a specific pathogen, except more flexible and adaptable, making neurons much less "stately" than even an immune response.</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> When a similar circumstance reoccurs, that neuron may fire again in recognition of that specific bit of what is best described as approximate biological knowledge, and then adjusts its sensitivity to be a more effective cue for this particular bit of knowledge at the next opportunity. 
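</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">To make that "sensitivity" idea concrete, here is a toy sketch of my own; it is an illustration, not the author's model and not a claim about real neurons, and every name in it is invented. The unit stores no record of past inputs, only a sensitivity profile: it fires when a cue matches that profile closely enough, and each firing nudges the profile toward the cue that triggered it:</span></p>

```python
# Hypothetical "gnostic" unit (illustration only): no stored states,
# just a sensitivity profile that every firing reshapes.

def make_neuron(n_inputs, threshold=0.3, rate=0.1):
    """Start with uniform sensitivity across all inputs."""
    return {"sensitivity": [1.0 / n_inputs] * n_inputs,
            "threshold": threshold,
            "rate": rate}

def cue(neuron, signal):
    """Fire if the signal matches the sensitivity profile closely enough."""
    match = sum(s * w for s, w in zip(signal, neuron["sensitivity"]))
    if match < neuron["threshold"]:
        return False  # no recognition, so no "knowledge" this time
    # Firing is the moment of knowing; afterwards the profile shifts
    # toward this cue, so the unit recognizes it a little better next time.
    neuron["sensitivity"] = [w + neuron["rate"] * (s - w)
                             for s, w in zip(signal, neuron["sensitivity"])]
    return True

n = make_neuron(3)
print(cue(n, [1.0, 0.0, 0.0]))        # True: the unit fires on this cue
print(round(n["sensitivity"][0], 3))  # 0.4: now more sensitive to input 0
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Nothing persists between cues except the profile itself, which is the sense in which such a unit is "stateless" compared to a computer's stored bits. 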
Again, this is similar to what happens when the body re-encounters a pathogen, just more flexible.</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> (Well,</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> only if a single disease could be caused by multiple pathogens, but that would be pushing the metaphor a bit too far.)</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> An immune response is driven by a type of hyper-specific knowledge used to help keep our bodies alive, healthy, and reproducing. So is neuronal knowledge, in a more flexible and dynamic fashion.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">A neuron’s knowledge has a utility that is quite different from that of information, but no less significant. As other neurons fire, their specific knowledge joins in a convergent and cascading but sparse map of semiotic simulation that has evolved to create more abstract meaning from any particular experience. Each neuron knows something different, but it only knows that thing for the instant that it fires and then prepares to know that thing even better the next time it occurs in the world. 
Neurons only fire when they are cued by that thing from reality or imagination (AKA Global Neuronal Workspace), and that thing is best described as ethereal knowledge.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">An ionically mediated chemical signal representing this knowledge also diverges out to any other neurons that might find it useful. Ultimately, these somewhat divergent, but mostly convergent and hierarchically organized experience nets both compete and cooperate to form cues that drive scripts of muscle movement known as behavior. Each movement we make is informed by a </span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">crescendo of</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> convergent knowledge. How is this knowledge encoded? Mostly, it's not. At least not in the same way that information is encoded in a computer. 
Instead, neurons DEcode the world as they create knowledge, and this knowledge is constantly changing, much like the reality we encounter daily in our lives.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In the temporal background, typically out of the critical path, the cortex creates models of the world using a form of this stateless, signal-based simulation expressed as chemical feelings from both sides of the brain. We call these predictions emotions. Through the trick of priming, they increase the probability of physical movement we call behavior as the word e-motion implies. Processing thoughts in our left brain, and envisioning solutions in our right, are both higher-order forms of this emotionally driven effort. Emotions make our imagination "real" so that we'll respond in a way</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> similar to a stimulus from the world, only next time hopefully before it happens, yielding prediction</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Dreams are the off-line version of this type of chemo-semiotic stateless simulation, a type of practice run for the next real-world encounter, sorting out what we learned from forming fresh neural connections the 
day before, all while keeping our muscles carefully inhibited, but the emotions active. Dreams help to hone and firm up this stateless "memory" at night as a follow-up to the real-world sensitivity adjustments that have occurred during the day. This process is known as up- and down-regulation of neural connections, a form of biological normalization, somewhat similar to what we do with an information database. But also quite different. The result ranges from primal proto-knowledge joining together to drive increasing abstraction all the way up to information, and ultimately something approaching the truth. But by degrees. And over time.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The key to understanding the brain is this fresh perspective that neurons create knowledge, and that most knowledge is created by neurons. It’s only the quality and character of this knowledge that varies, and varies widely. 
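</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Purely as a toy illustration (the function, the numbers, and the update rule are mine, invented for the sketch, not drawn from neuroscience), the nightly up- and down-regulation just mentioned can be pictured as a uniform rescaling of connection sensitivities back toward a baseline total, preserving what was learned relatively while restoring overall balance:</span></p>

```python
def up_down_regulate(weights, target_total=1.0, rate=0.1):
    """Toy homeostatic normalization: nudge the summed synaptic
    sensitivity back toward a baseline total, scaling every input
    uniformly so the learned ratios between them are preserved."""
    total = sum(weights)
    if total == 0:
        return weights
    scale = 1.0 + rate * (target_total / total - 1.0)
    return [w * scale for w in weights]

# A day's learning leaves these connections running "hot" (total 1.6)...
weights = [0.2, 0.9, 0.1, 0.4]
# ...and repeated nightly passes settle the total back toward baseline.
for night in range(50):
    weights = up_down_regulate(weights)
print(round(sum(weights), 2))  # settles near 1.0; relative strengths intact
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The point of the sketch is only the shape of the process: a global renormalization that keeps relative knowledge while restoring balance, unlike the row-by-row updates of an information database.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">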
Only once we begin to focus on what each neuron knows, and how knowledge dynamically changes, can we begin to build a simple model of a complex brain.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><b>The Neurophilosophy of Language</b> </span></p><div><span><br /></span></div><div><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">If you’ve spent any time studying neuroscience or human behavior, this idea of neurons creating and defining knowledge may at first seem comical, radical, bizarre, or worse - meaningless. My first reaction was to laugh out loud. My second was, could it be this simple? I couldn’t look away. </span></div><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As I worked with the details of neuronal communication, I soon discovered that the macro consequences of this gnostic model were so dramatic and answered so many questions about human behavior that my macro experience began to eclipse the work I was doing in the nano context with the synapses. 
This neo-gnostic model of neurons ultimately changed how I understand </span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">the world and even philosophy itself, which is of course the appreciation of such knowledge</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">It's now hard to see neurons as anything other than creators of knowledge. And that’s just the beginning. The concept changes not just how I see neurons and the brain, but also how I understand </span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">human behavior</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">. I now see adaptive knowledge behind the actions of everyone I meet. This model is dramatically shifting my perspective of everything. Like green letters dropping down the screens from the movie, “The Matrix”, I see bits of primal knowledge coming together in life to form effective behavior and ultimately emergent insight about everything I experience. This transformation is what I wish to share, but I'm torn between continuing to explore this model and describing its nature in this blog post. I'll try to do both in hopes that each will inform the other.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Am I delusional? Perhaps. 
But with a clear understanding of this first principle of the neuron and its multifaceted nature, the brain begins to make a lot more sense. The trick is to generalize and broaden the concept of knowledge while recognizing its multifaceted genesis. Once I understood that neurons literally created and defined knowledge, figuring out how this happened became a lot easier and revealed the brain's multifaceted architecture, and vice versa, yielding a map of astounding complexity largely based on this one simple principle.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Even more surprisingly, the concept illuminates language as a Rosetta Stone of brain architecture hiding in plain sight. The connectome of the brain is ultimately reflected in our language and culture, but by degrees. 
This evolutionary trick has evolved to yield knowledge, information, and ultimately, wisdom.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Words are literally the expression of this knowledge in the process of becoming disembodied information. When pre-motor neurons fire, they cue a script of choreographed muscle movements in the diaphragm, throat, tongue, and lips to create sounds. Or in the fingers to produce writing. So is virtually every other form of expression, from dance to mathematics.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What I’m about to present is not merely the redefinition of the word "knowledge". It’s a radically different understanding of what it means to define all words, which are only a very tiny subset of all knowledge. Knowledge is also likely the basis for all thought and imagination. 
From this perspective, etymology may shed light on harder problems. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The most probable path to understanding the hard problem of consciousness is to understand the brain, and the most probable path to understanding the brain is to understand the neuron. It's also </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">easier to address the simple problem first. Later we can speculate about chemo-semiotic consciousness.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><b>Scripts Both Compete and Cooperate to Yield a Multifaceted Brain</b> </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In due course, I’ll describe a collection of tricks that evolution has used to evolve a new way to evolve. 
(Well, knowledge is</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> only about a billion years old, so fairly new.)</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> It yields a very different, yet powerful way of thinking about the brain. And reality. No, I don’t understand all the tricks of the brain, only a relative few. But these tricks are applied disproportionately yielding a shadow of an overview that has for me become a simple model of the brain. Needless to say, understanding the nature of this chemical, signal, and biology-based knowledge has extraordinary application in our everyday interactions with the world, from science to art, and especially, philosophy. It informs everything you can imagine. And many things you can't.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Yes, I realize how audacious this claim is, probably better than most. I’ve been casually working on this problem for decades, but more intensely over the last few years. I’ve collected well over a thousand pages of technical descriptions, alternate versions, notes, and references, but all of that detail would only distract us at this point.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">A comprehensive model of anything needs to account for all known observations. This of course is currently impractical in the case of the brain. 
There’s simply too much data to even review, let alone validate (at least by any one person). We need a simple model of the brain first. That starts with a framework, or better yet, an overview of a model. We can fill in the details as our understanding evolves.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Whether we realize it or not, we each manage a default model of the brain along with our model for human behavior. We use it daily in various ways. It's just how the mind works. Being part of nature itself, the brain too abhors a vacuum. If your exposure to our technical media about the brain is typical, your personal brain model likely involves electrical metaphors, computers, and processing your thoughts in a sequential fashion. After that, the details are likely lost in shadow, because most of that model is simply wrong. But not completely.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Many think of neurons as logic devices or memory elements (which can be derived from electronic logic). For decades, so did I. But neurons have far more in contrast than in common with such metaphors. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">If you're like me, you may have a feeling that there's just something about this tech approach that doesn't seem quite right. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">We each know different things about the brain depending upon our own individual research and experience. Striving for a fresh approach, here's how I manage my model of the brain - start from the most general and work in new detail as I validate each observation. But it helps greatly to have that first principle understood - that multifaceted neurons create knowledge.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Here's a fun game: each time you use the word "know" or "knowledge", look outward into the world and think about how you came to know this thing and what your level of conviction is. Question everything. So, what do you know?</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">After that, the challenge is to generalize in a way that incorporates what we know, yet keep those generalizations broad enough to account for all the detail we’ve yet to discover. A fool’s errand? 
Perhaps, but here's the hyper-simplified model of the brain I now use to understand this challenging mystery.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: transparent; color: black; font-family: arial; font-size: medium; font-style: normal; font-variant: normal; font-weight: 700; text-decoration: none; vertical-align: baseline; white-space: pre;">A Simple Model of a Complex Brain</span></p></span><span><span style="font-family: arial; font-size: medium;"><br /></span><div><span style="font-family: arial; font-size: medium;"><span id="docs-internal-guid-df32a0a9-7fff-f6cc-ce47-fe536c5d6213"><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The body delivers millions of neural signals</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> to the brain</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">, each of which represents a bit of knowledge about the world in chemical form. These signals are best understood as theatrical cues which both compete and cooperate in a converging and increasingly abstract fashion to drive scripts of muscle movement known as behavior, which in turn sometimes affects the world, which can once again be sensed. This process happens in a continuous loop with that world. Or these cues may signal glands to release internal chemistry which interacts with these chemical signals in a similar fashion also forming a dynamic loop within the body and especially the brain. In both cases, these two macro loops help to refine and normalize interactions with each real-world encounter. Or internal emotional feelings. 
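</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Purely as an illustration, the looped competition just described can be sketched in a few lines of code. Everything here (the script names, the numeric sensitivities, the reward rule) is invented for the sketch; it is a cartoon of the idea, not a biological simulation:</span></p>

```python
# Toy sketch of the loop described above: converging cues prime competing
# behavior "scripts", the most strongly cued script fires, and feedback
# from the world adjusts each script's sensitivity for the next pass.

scripts = {"reach": 0.5, "withdraw": 0.5}   # cue-to-behavior sensitivities

def respond(cues, scripts):
    """Scripts compete: activation = sensitivity x converged cue strength;
    the most activated script wins and drives behavior."""
    return max(scripts, key=lambda name: scripts[name] * cues.get(name, 0.0))

def feedback(scripts, behavior, good_outcome, rate=0.2):
    """The world answers back: up-regulate a script that paid off,
    down-regulate one that did not."""
    scripts[behavior] *= (1 + rate) if good_outcome else (1 - rate)

# Repeated encounters in which reaching toward this cue keeps paying off:
for encounter in range(5):
    behavior = respond({"reach": 1.0, "withdraw": 0.8}, scripts)
    feedback(scripts, behavior, good_outcome=(behavior == "reach"))

# The refined knowledge now outweighs a somewhat stronger opposing cue:
print(respond({"reach": 0.5, "withdraw": 0.8}, scripts))  # prints "reach"
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The loop is the whole point of the toy: each pass with the world refines the stored sensitivities, so accumulated "knowledge" eventually shapes behavior more than any single raw cue.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">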
In the process, both ionic signals and their chemistry refine and validate the accumulated knowledge of that experience in the form of adjusted sensitivity. Or not. The exceptions can be critical.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In their competition and cooperation, these cues and scripts of neural connection have formed in layers within each side of a single vertical division, left and right, providing for necessary isolation</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> to create the multifaceted nature of the macro brain. These sides and layers are best imagined as creatures from our evolutionary past. Each of these critters has many different ways of dealing with the world. As you come to know how each creature net is cued, you’ll begin to better understand your own behavior. A thousand critters each apply one of their thousand tricks to yield a million survival solutions. There are obviously too many to keep in mind. Fortunately, their application is disproportionate, even extremely disproportionate. But understanding even a few of these tricks can be quite useful in understanding the brain, and ourselves.</span></p> <br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For instance, think of the cues that drive human competition, consumption, and reproduction. There are many, but only a relative few dominate most of the results in a form best described as sparse signaling creating a map of your body and the world in general. 
If the “executive” in your mind can intercept and redirect even a few of these more common cues, it can change your life dramatically. There are many self-help books that apply these techniques without ever understanding the neural details of how they work. OK, the above may be a bit too complex for now. Ignore these last three paragraphs. If you can.</span></p><br /></span><span style="white-space: pre;"><b><br /></b></span></span></div><div><span style="font-family: arial; font-size: medium;"><span style="white-space: pre;"><b>An even simpler summary of a simple model of the brain:</b></span></span></div><div><span id="docs-internal-guid-3de19a11-7fff-3179-494c-7f979acda4f8"><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">- Neurons sense the world to biologically create primal knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">- Chemical signals converge to create even more abstract knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">- This knowledge cues scripts of muscle movement known as behavior.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">- Behavior affects the world and body, and in turn is affected by the world and body, forming 
dynamic loops with reality, normalizing, refining, and validating neuronal knowledge with each repetition.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">- This knowledge is published widely by the neuron's axon delivering chemical signals to form a type of stateless, semiotic simulation using sparsely decoded maps of reality that help increase the probability of survival and reproduction.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">- Our conscious worldmapculus is one such map becoming both the source and the object of this ultimate expression, in a Zen fashion. It's a</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> simulation using dynamically looped signals to create an ethereal representation of reality in our skull, paradoxically.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Still too complex?</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">How about this:</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span 
style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><b><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">Neurons create knowledge which is used to </span></b><b><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">cue scripts of muscle movement we describe as behavior. These cues and scripts of multifaceted neural nets both</span></b><b><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> compete</span></b><b><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> and</span></b><b><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> cooperate</span></b><b><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> to </span></b><b><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">yield a multifaceted brain</span></b><b><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> needed to survive in a complex world.</span></b></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><b><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></b></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><b><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">We are each a thousand creatures that have evolved a million tricks over a billion years.</span></b></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial;"><span style="font-size: 18.6667px; white-space: pre-wrap;">or even more simply:</span></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><b><span style="font-family: Arial; 
font-size: 18.6667px; white-space: pre-wrap;"><br /></span></b></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><b><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">Neurons create knowledge yielding a skull full of chemical cues and scripts of muscle movement that help us survive and replicate. </span></b></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><b><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"><br /></span></b></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">That's about as simple as I can manage for now. Just think of your brain as a collection of</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> competing</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> and</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> cooperating</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> theatrical cues and scripts. Explore the interactions of these cues and scripts introspectively. It may provide a better understanding of how you deal with the world. Like mindfulness (closely related to knowing), this neo-gnostic approach will begin to make more sense and yield more useful results. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">If you’ve read this post more than once, it may seem to have changed. That’s because it probably did. 
I used to have a section here about assertion salad, which I broke out as a separate post that I now use as a summary. What’s useful today may not be useful tomorrow, or worse, may even become distracting. If I'm correct about this prime assertion, the consequences are as cosmic as the brain itself. It informs all of human knowledge, science, philosophy, and art. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">I want to keep my thinking flexible and plan to treat this content as a dynamic document much like a monitored Wiki which will evolve as I get useful feedback. Initially, it will be progressively published here as a series of dynamic blog posts. Feel free to follow or link, and share as you will. Check back later for new versions. </span></p></span></div></span></div><div><span><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">If the above summary about the brain speaks to you in any way, you’ve likely spent a great deal of time thinking about philosophy, the brain, and/or human behavior. I hope I can help you along your path, and you, along mine. If you’re purely a spectator, that’s fine for now. But I hope you’ll get involved in this effort to understand the brain. Perhaps I should clarify who I am, and who you are as my audience. My work history is steeped in computers, business, and technology, but not biology. To a significant degree, I'm writing these blog posts for myself, and to myself. I read them often. But I also need to include you as a critical element in this exercise. 
That's part of this multifaceted process.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">You are likely very interested in the topic or you wouldn't have read this far. I'm sure most will bail within the first few paragraphs. But those who truly understand the nature of this challenge will likely entertain even crazy ideas if it helps them in any way to understand the brain. That makes you more like me in your imagination and conviction regarding this quest. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">To be candid, I’m making much of this up as I go along, so I need your feedback. Here’s how I hope to inspire it:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’ll start with an important question to help frame the problem which has been informed by this neo-gnostic model. The exploration of this question will be followed by some unlearning critical to finding a fresh start and solid ground. Then I'll describe why and how neurons create knowledge and actually define knowledge. 
Next, we'll take a trip starting with the first animal and then forward through history to imagine how evolution might have created this amazing result.</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> I’ll present an evolving description of the brain starting with a single neuron and ending with a simple model of the human brain. If using this simple model itself to inform a fresh thesis seems like circular reasoning, it’s not. It’s merely a circular presentation. Modeling the brain ultimately starts with the neuron. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">So will I.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’ll describe the ideas that informed this model in the way I came to know them over my lifetime of subjective experience, especially the parts I had to unlearn. That’s the reason some of this presentation will be a memoir. 
Here’s a sample:</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiF2Bv5UFM9JVfEXxKGIGlyc6oPN7rTxexOPFCg4AIqR-OwcHLh_pWPovUcqT7IlfJnUeiOTV2TRisC0y-kLyyD_qHBHzFjEugviDUf0rfLbWqvOVKgJyo_loNhp8FQgZf27MJ822prO3igY2Wi-kY3kGSvbkEBWf6e5sAUTXDfOd5cjSNY1VI/s2066/PXL_20220813_164618453.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="1162" data-original-width="2066" height="225" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiF2Bv5UFM9JVfEXxKGIGlyc6oPN7rTxexOPFCg4AIqR-OwcHLh_pWPovUcqT7IlfJnUeiOTV2TRisC0y-kLyyD_qHBHzFjEugviDUf0rfLbWqvOVKgJyo_loNhp8FQgZf27MJ822prO3igY2Wi-kY3kGSvbkEBWf6e5sAUTXDfOd5cjSNY1VI/w400-h225/PXL_20220813_164618453.jpg" width="400" /></a></div><br /><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span><p></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; 
font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><b>Flying</b></span></p><div><br /></div><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">My very first memory was from when I was about three years old and sitting on a rock wall in front of my grandmother’s house where I lived. Above is a current photo. This wall was already falling down 67 years ago. Most of the rocks have now been used for other projects, but at the time I was straddling not only the wall but also that remaining concrete post that originally held a gate. At the time of this memory, all that was left of this gate was a single board of the frame held by one bolt at its center. Now o</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">nly the bolt-hole remains. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">I don’t know what happened to the gate or the other bolts, but the remaining one allowed this board to rotate about the face of the post to a horizontal position. As a typical three-year-old fascinated by airplanes, I’d put my feet on this board which became a wing. I could bank left or right. This seat, post, and board became my airplane, not unlike Snoopy’s doghouse which I discovered years later. I recall flying my "airplane" and going to many places in my mind. I remember it well. 
Or do I?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">A couple of years later my father took me on a real airplane ride with a friend of his. As a five-year-old, I had to sit in my dad's lap, but I got to fly a real airplane for a few minutes. Thirteen years later I had my pilot’s license, followed by an instrument rating. Flying for me has always been a joy, inspiring an immense sense of freedom.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’ve since wondered many times about this first “memory” of "flying" and how it was stored in my brain. Did my later aviation ambitions affect the content or recollection? Decades later my grandmother told me I’d spent hours on that rock wall as a child. Did my memory simply come from hers? Or did I modify the genesis of my own memory? Are memories real? Or ethe-real?</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It's unlikely she would have known about the dynamics of that board, nor did she mention it at the time, yet that aspect remains vivid, leading me to think the memory was mine. 
Or was this memory created anew at the moment before I typed this sentence into this blog post? A bit of both I suspect.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As we proceed, I will mostly ignore genetics, imaging, brain waves, and the rest of the more recent technical fields, especially anything having to do with the electron (once I carefully dismiss it). What’s left? Chemistry, connection, and the concept of knowledge. Oh, and a bit of theory about evolution informed by the practices of Tao and Zen. But first I need to challenge some common assumptions with a very important question, then plant a seed of doubt about the limits of information theory, and even science itself. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">One last thing before you proceed. I may be wrong about neurons creating knowledge as a first principle, but if I AM wrong, what IS the first principle of the neuron? What exactly does its signal mean? And how can we build a model of the brain if we don't clearly understand this first principle? Finally, if not neurons, </span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> from where does knowledge spring? Whatever your perspective and convictions about the brain, these questions need to be asked. And answered. 
</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">While you consider them, here's that first important question to be addressed in the next post:</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span style="background-color: white; color: #222222; font-size: 18.6667px; font-weight: 700; text-indent: 10px;">How can the most profound and studied object in the world be so poorly understood?</span></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Continue: </span></p></span><h3 class="post-title entry-title" style="background-color: whitesmoke; font-family: "Trebuchet MS", verdana, sans-serif; font-size: 19.5px; text-indent: 10px;"><a href="https://suddendisruption.blogspot.com/2020/12/our-missing-model-of-brain.html" style="border-bottom: 1px dashed red; color: black; text-decoration-line: none;">The Gnostic Neuron - Part 2 - Our Missing Model of the Brain</a></h3></div><div><br /></div><div><span><br /></span></div><div><span><br /></span></div>Sudden 
Disruption

The Gnostic Neuron - Part 2 - Our Missing Model of the Brain (posted 2023-02-03, updated 2023-12-31)<p></p><div class="separator" style="clear: both; text-align: center;"><div style="text-align: left;"><span style="background-color: white; font-size: large;"><b>Our Missing Model of the Brain</b></span></div><span id="docs-internal-guid-e370ab40-7fff-d294-8562-0a7821fee041"><div style="text-align: left;"><span style="background-color: white;"><br /></span></div></span><span><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"><Originally posted July 17, 2020></span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"><br /></span></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEjGZ5j_8CAjU15SiR_FtoIlUh_j8JIAAzPCJ0sCsqMA3tRMUJ4KXJ5Z45kS7cS1U6XNmqk5zrBrp7t4vp8yQn0iLRFhFIHse58njQj3OPU1QDMffW72cRaiUU-Wv_rG0dequ1qMCCmQ13KctNx4WWL52b3MrHPAazr0sRTBByQIJVggx0amsDo=s612" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><span style="background-color: white;"><img border="0" data-original-height="612" data-original-width="612" height="640" src="https://blogger.googleusercontent.com/img/a/AVvXsEjGZ5j_8CAjU15SiR_FtoIlUh_j8JIAAzPCJ0sCsqMA3tRMUJ4KXJ5Z45kS7cS1U6XNmqk5zrBrp7t4vp8yQn0iLRFhFIHse58njQj3OPU1QDMffW72cRaiUU-Wv_rG0dequ1qMCCmQ13KctNx4WWL52b3MrHPAazr0sRTBByQIJVggx0amsDo=w640-h640" width="640" /></span></a></div><span style="background-color: white;"><br /></span><div style="text-align: left;"><span 
style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-weight: 700; white-space: pre-wrap;"><br /></span></div><div style="text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-weight: 700; white-space: pre-wrap;"><br /></span></div><div style="text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-weight: 700; white-space: pre-wrap;">How can the most profound and studied object in the world be so poorly understood?</span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’m of course talking about the brain. And “profound” is an understatement. Without our brain, nothing else matters. Without your brain, there is no you. Our brain creates our reality, and mediates our interaction with the world. We </span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-style: italic; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">are</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> our brains. This view of the brain is not new. 
In the 4th century BC, Hippocrates grasped the significance of the brain surprisingly well:</span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“Men ought to know that from nothing else but the brain comes joy, delights, laughter, and sports, and sorrows, griefs, despondency, and lamentations. And by this, in an especial manner, we acquire wisdom and knowledge, and see and hear and know what are foul and what are fair, what are bad and what are good, what are sweet, and what are unsavory. ... And by the same organ we become mad and delirious, and fears and terrors assail us. ... All these things we endure from the brain. ...In these ways I am of the opinion that the brain exercises the greatest power in the man.” -</span><span style="color: #1155cc; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; text-decoration-line: underline; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;"> <a href="http://classics.mit.edu/Hippocrates/sacred.html">On the Sacred Disease</a></span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> - Hippocrates</span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span 
style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In spite of all that has been learned since, this ancient and intuitive summary remains one of the best and most concise descriptions of how our mind experiences our brain. Not only does “the brain exercise the greatest power in the man”, but in everything every man has ever done. Pick a topic. As you think about it, your brain informs your understanding.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For instance, how does simple matter yield the complex experience we call our mind? Is matter not as simple as we think? Or is consciousness not as complex? Perhaps neither. And both. We seem to have too much data about the brain and no model. Let's try to correlate the components with the result. </span></span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you act on your thoughts by writing them down as I have just done, it’s that collection of neurons we call our brain that clearly has ultimate control. You can not think about, nor do anything that does not involve your brain. 
You can not </span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-style: italic; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">be</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> without your brain.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And that’s just your brain. And that’s just right now. While subjectively critical, most of our individual brains will have little impact on the world at large. But collectively, all the brains that have ever existed have literally controlled everything that has ever been done. Our brains create our culture. Our brain is the key to all relationships, politics, economics, science, art, and philosophy. Yes, profound without doubt, but only loosely linked with behavior.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As for studied, has any other object gotten more attention, especially in the last few decades? Does any other intellectual challenge present as much data? And has any other effort yielded fewer conclusions? 
We’ve had a “Decade of the Brain”, a “New Century of the Brain”, and have even treated the brain like a “moon shot” during Obama’s “Brain Initiative”. Yet, none of this rhetoric mattered. We still don’t have a useful model, nor even much consensus about how it really works. Is the brain too complicated for the mind to comprehend? Unlikely, but let’s take a closer look.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The complexity of the brain is astounding. You’ve probably heard the quantifications. Each of our brains has trillions of connections between billions of neurons to monitor millions of sensors, all to control thousands of movements using hundreds of muscles for one primary purpose - survival. The number of possible combinations and behaviors is greater than the number of atoms in the universe. And that’s just one brain. Each seems to be a little different. And each is constantly changing.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Neuroanatomy has taken the brain apart and reduced it to components. Much of the brain has been mapped by function, at least in a macro sense. But when we look closer, these “areas” and other brain “components” have few clear boundaries. Or specific functions. 
Most are fuzzy at the edges where millions of fibers deliver signals from one part to another.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">With heroic effort involving injury and death, various functions have been attributed to various lumps, gyri, and sulci. But this localization is only by degrees. If we try to get specific about what exactly happens where, exception becomes the rule, and rules become the exception. Brain function appears to be both localized and distributed at the same time. It’s a paradox.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Also, most of the brain is clearly divided left and right. These two halves are only connected at the bottom, center, and back. Plus the most obvious central connection, the corpus callosum, has a profoundly inhibitory nature. In both directions. Why? Even the cerebellum and brainstem have definite bilateral symmetry, and some functional division in their structure and operation. Sense and control are mostly separated in the spine, dividing the peripheral nervous system in a completely different fashion, and at ninety degrees to the bilateral symmetry of the brain itself. So is the brain truly divided? Or unified? The answer is obviously yes, without question. 
Which presents another paradox.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It’s not just the brain that’s complex, it’s also the neuron. In the nano context, we’ve collected an astounding amount of data involving types of neurons, neurotransmitters, glial structures, genetics, and of course, nano, micro, and macro chemistry, each with their own functional domain bleeding into the others. We understand how the neuron creates a signal but not what that signal means. We have a clear understanding of how all of this happens, but not exactly why. At least not in the nano context. Yet.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As we zoom back out conceptually, various groups of neurons “project” their axons from one area to another. Some detailed connections have been mapped, but between the nano and the macro context, most of the micro connectome remains in shadow. Should neuronal function be associated with the location of their cell bodies and dendrites? Or the majority of their axon terminations? Specifically, what connects to where? 
And why?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><h2 dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 10pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Biology of Behavior</span></h2><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now for understanding that result. Again, we seem to have too many answers. Even more challenging than neurophysiology or chemistry is characterizing function. The brain is where sensory input gets converted into muscle movement. We define this as behavior. This behavioral database of course includes all animal life, but even limiting it to human history, it’s still overwhelming in scale, content, and variety. The range of possible human behavior is astounding. Why did Socrates drink the poison? What led Joseph Stalin and Mao Tse-Tung to direct the deaths of tens of millions of their own people? Was it simply to remain in power? If so, how did Henry II assemble an empire spanning England and half of France with relatively little bloodshed? Simply different management styles? 
Why so many answers, but no good ones?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Generalize from a trillion behaviors, then apply them to yourself. Why do you do each thing you do each day? Your behavior is far from random, but its true course can be difficult to divine, and at the same time, amazingly easy to rationalize. Behavior ranges from primal to sophisticated and obvious to enigmatic, with no clear boundaries from one limit to the next, much like the physiology of the brain itself. And that’s an important clue. Function follows form.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As individuals, we each have an introspective experience. It’s our own private view of our brain from the inside. Much of the time our behavior seems reasonable and organized. But is it? How many times per day are you surprised by your own actions? Think carefully. True self-awareness does not come easily. Where might these surprising visions, thoughts, and actions come from? How much of our thinking is conscious? 
How much is hidden in layers of mystery even from our conscious subjective experience?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Scaling outward, how do your behaviors contrast and conform to those around you? And those more distant? Plus, each brain is changing dynamically from moment to moment. Repeating psychological experiments on the same subject often yields different results. Consistency is elusive, as recent brain-imaging meta-studies have shown. The brain is plastic by degrees, and in critical phases. So is the resulting behavior. Parts of the brain enlarge and contract over time as if challenged by our environment, and how we choose to deal with it. London taxi drivers, whose posterior hippocampi enlarge as they master the city’s streets, are a well-known example. Form follows function.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Multiply these behaviors by the trillions of creatures and all the people that have ever existed. Now correlate that with what we know about neuroanatomy, chemistry, genetics, neuroscience, and all the other academic fields we’ve enlisted in this effort. Generalizing from such a broad and changing base of information is like trying to nail an ocean of Jello to an infinite moving wall. What goes where? And why? 
For how long?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And yet, the brain is not random. As Hippocrates noted, behaviors flow from within the skull. As does subjective experience. So far we have nothing to disprove his observation. We’re left with billions of neurons doing mysterious things to yield trillions of complex behaviors. In short, the brain is a tangle, a modern Gordian Knot, and perhaps just as difficult to unravel. Or in the case of the brain, to understand.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you’re not familiar with the Gordian Knot, it’s a parable about a very large and complex ball of rope with one loop attached to an ox cart. It was said that anyone who could manage this knot and uncouple the cart would become the King of Gordium (in Phrygia, in modern-day Turkey). This royal test was much like the Sword in the Stone, except the challenge was a tangle. When Alexander the Great encountered this test, he simply drew his sword and cut the loop. 
Then he took the kingdom by force.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Though both the Sword in the Stone and the Gordian Knot crowned a King, you might be tempted to conclude that Alex cheated. If you require the solution to conform to the spirit of the problem as presented, you’d be correct. But it could also be argued that Alex was just thinking outside the box. Or that might makes right. The story contains several possible lessons depending upon your values, sensibilities, and perspective. And that’s the point. It’s only one example of our mind entertaining multiple ways of looking at a problem. And its solutions. That’s another important hint, but for now, the concept of the Gordian Knot is a useful way to encapsulate the mystery of the brain. 
Speaking of childish stories, let’s take a break before we continue with this important question.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Learning to Ride a Bike</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I got my first bicycle for Christmas when I was six years old, and of course, I didn’t know how to ride. My dad had gotten me a full-sized, 26-inch Schwinn. He probably figured I would grow into it, and so was trying to be cost-effective. Or perhaps he wanted to present not only a gift but also a challenge, which he certainly did. This Schwinn was made of steel, and nobody made training wheels for a bike that big. I could barely pick it up. And of course, other than watching bigger kids, I had no clue how to make it work. 
For weeks I just pushed it around with one foot on the pedal trying to get used to its weight.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">My dad was busy with work but our babysitter from across the street said she would teach me how to ride. We went to the school grounds where there was lots of smooth, flat pavement and nothing to run into. She said the weight wasn’t as important as keeping my balance. I asked her to explain. She said words wouldn’t help much. I just needed to get a feel for it. This was my first encounter with intuitive learning, a kind of Zen experience.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">She held the bike while I got on, then kept it upright as she pushed me up and down the basketball court. Unfortunately, I could only reach the pedals when they were in the top half of their rotation. So I had to start with the pedals near the top and could only push them halfway down, first one side, then the other. Between the weight, pedals, and balance I had my hands full, and so did she. We had to take breaks as she was doing all the hard work and heavy lifting. But to her credit, she kept at it. And so did I. 
I really wanted to be able to ride this shiny new Christmas present.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">While resting at one point, she explained that steering was the key to keeping my balance. So words did matter, but something else mattered more. Getting the feel for it was apparently critical. She was right. A few more tries and I finally found that feel. I was gliding by myself before I knew it. Looking back she was no longer holding me up. I was jubilant. That’s when I fell over. There was no way my legs would reach the ground. But I didn’t care. She was right. I had felt that balance. I knew it was there. I knew I could find it again if only I could learn to get off (and on) without crashing.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Later, on my own, I discovered that if I leaned the bike against a fence with the pedal in the right place, I could get on, push off and keep going as I bounced from side to side to make the pedals work. I still had to find a good place to jump off and catch the bike before it hit the ground. I remember riding in big circles a lot delaying the dismount. 
</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Then I tried something radical. I put my leg through the frame below the top bar. This way I was able to pedal and also start and stop when needed. Well, mostly. It looked goofy as hell but it made the landings easier. Either way, I rode that bike for a long time hanging off the side. If you think this is too strange to have actually happened, remember that truth is often stranger than fiction. When my legs finally got longer I was able to sit on the seat again. And even then it was only to coast as I had to bounce from side to side to keep the bike going.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">By the way, my younger brother also got the same bike in a different color that Christmas. He soon copied my "through the frame" style of pedaling. He continued to use that "through the frame" style long after I moved on to regular riding. Years later we both upgraded to Schwinn Stingrays which were lower to the ground and solved all the problems. 
Too bad we didn't start with the Stingrays.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The point is, not all learning is logical. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And there’s more than one way to ride a bike.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Or skin a cat.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><h2 dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 10pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Intuitive Modeling</span></h2><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; 
margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“A theory can be proved by experiment; but no path leads from experiment to the birth of a theory.” - Albert Einstein</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Open any book on neuroscience. Usually within the first few pages will be some disclaimer about the lack of a useful brain model. Jeff Hawkins of the Redwood Center for Theoretical Neuroscience put it concisely in his</span><a href="https://www.youtube.com/watch?v=G6CVj5IQkzk&t=31s" style="text-decoration-line: none;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span><span style="color: #1155cc; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; text-decoration-line: underline; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;">2004 TED talk</span></a><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> about his book, “On Intelligence”: </span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span 
style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">(</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-style: italic; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Sorry for the interruption, but I need to add this note as of late March, 2021 because most of what you’re about to read was written years ago, though only posted in July of 2020. I wish to point out that Jeff Hawkins has now published a second book that gets much closer to my model than perhaps anything I’ve read so far. Unfortunately, Jeff does not make that final conceptual leap about neurons creating knowledge.</span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-style: italic; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Jeff and I share both a background in computers and a fascination with the brain. I agree with most of his constraints and many of his assertions, but we part company when it comes to the nature of knowledge. 
And how he thinks about the brain.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-style: italic; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Jeff’s new book is titled, "A Thousand Brains", though his thousand brains are quite different from the thousand creatures I'll be describing below. Like in his first book, Jeff presents some wonderful thought experiments which add to his very insightful observations, but his description of neuron firing as "spikes" indicates hours spent staring at an oscilloscope. This reflects his more technical approach. Even still, he gets close to my thesis near the bottom of page 125, “The knowledge is the model.” I did some counts on a few pages of this new book. Jeff actually uses the word “knowledge” more than he does “reference frames”, the key to his thesis. Though he does include a whole chapter about how to preserve knowledge, he doesn't seem to see how neurons might create it.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-style: italic; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Jeff, (like many others), sets out to model the neocortex before understanding the nature of the neuron itself. The neocortex being accessible and obvious might seem like a good place to start, but it was also the most recent major structure in the brain to evolve. This puts Jeff at a grave disadvantage. 
His view of how language is “processed” is literally the opposite of mine. He sees language as top-down deconstruction, and actually attributes aspects of his column reference frame model to the genesis of language where “features” are “stored”, again reflecting computer metaphors. He then applies this architecture recursively - “it is reference frames all the way down.” Down to where? Unlike turtles, the brain is not infinite. And practical recursion normally has a base case. I've found language generation in the brain to be far simpler, beginning with the neuron. In contrast with his view, I believe language is an external expression of internally generated neuronal knowledge. It reflects the gnostic nature of the neuron itself, as I’ll shortly present from the oldest to the newest parts of the brain.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-style: italic; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In any case, I highly recommend both of Jeff’s books about the brain for their many useful concepts and insights about prediction</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">.)</span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I now 
return you to my earlier quote from Jeff:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">"We have tons of data and very little theory." 
To drive home this deficit, Jeff then offered a quote from decades before by Nobel laureate Francis Crick, "We don't even have a framework."</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Not even a framework! Well, this is embarrassing. Why all the intellectual abdication? After all, we have an emergent field of AI - "Artificial Intelligence". How can we create an artificial version if we don't understand the biological version? Yet, there's actually very little in common between human intelligence and the artificial kind. So what exactly IS the nature of intelligence and the brain?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Some generalizations must certainly be more probable than others, even if extraordinarily broad. Or completely wrong. Error tends to invoke useful counterpoint. Where are our sweeping generalizations about the brain? We need a new perspective. We need a fresh approach. We need a Rosetta Stone of the brain. And most of all, we need a radical idea to break this logical logjam of data. 
Some wild speculation would be more useful than none at all.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ironically, modeling is one of the things the brain does best. We model the world constantly and intuitively. We can’t help it. This modeling ranges from casual and even subconscious, to formal, detailed, and explicit. The most useful models may even become external and detailed mathematical simulations. Or computer programs. Thus, the field of AI.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The more intuitive models take many forms.</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> (Incidentally, these intuitive models can also be useful for understanding the neuron itself, but back to the macro.)</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> We model the actions of other people to predict their behavior, typically without realizing it. This is known as theory of mind. Other models are conscious but still casual. 
Their complexity ranges from sparse to rich depending upon how much attention we pay to each topic.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For instance, you may know more about psychology than I, or the detailed “proofs” of philosophy, but I likely know more about how a computer works. I’ve designed most major aspects of computer systems, from the bare metal of logic up through the ALU, processor, storage and I/O. But most of my career has been spent in software from hex coding and assembly through the BIOS, operating system, computer languages, and finally, application software. I understand the logic of AND, OR, NOT, along with how assignment, loop, and decision make up the essence of a Turing machine which represents any possible computer. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I understand the details of how bits are converted from digital to analog yielding the emergent result of music, which may bring a tear to your eye. I also understand the limits of computers and computability. You can probably do something similar in some other field, whether it’s technical or artistic. 
It’s what each of us attends to that allows us to populate our respective models of the world.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Using associative maps, allegory, and metaphor, we also model the tools we use, the work we do, and the places we live, all to great advantage. But it’s still a different experience, a different model, and a different advantage for each of us. You think about things I dismiss. And vice versa. We each have our casual and more technical models of the world. And our model of the brain is part of that world, casual or not.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Since you’re reading this, you likely entertain some default but at least conscious model of the brain. It may involve the concept of hard-wired “circuits”, programming new habits, or just processing your thoughts and feelings. Did you notice that each of these is a tech metaphor? Or perhaps your model might involve more explicit concepts of electronics, brain waves, genetics, or imaging. Again, each is a field of technology. 
</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For decades, my casual model of the brain focused on chemistry, ionics, and logic, yet my model was never viscerally satisfying. The closer I looked the clearer it became that the brain contrasted with technology more than it resembled it. I came to realize these default tech metaphors and fields of study were distorting my thinking.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Blinded by Science?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 12pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">"The model we choose to use to understand something determines what we find." - Dr. 
Iain McGilchrist</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Science is built on characterizing and quantifying consistency. Once defined, these consistent objects become tools to be relied upon. Those things not well defined remain in shadow, even when they are important, like the brain. Has science taken us down some kind of blind alley? Could science, the prime tool of validation, be the very thing blinding us to the nature of the brain? Or is it that science has become so hyper-specialized that we can no longer generalize effectively? Why do we not have a big picture of the brain? Are computers getting in the way? Or perhaps the <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6822296/" target="_blank">status quo of academia</a>?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Today, billions of dollars are being spent to understand this slippery object. As noted, the 1990s were declared the “Decade of the Brain.” That decade produced yet another tsunami of data, but again, few conclusions. This data is also a logjam waiting to be released. We’re now well into the new millennium and we still don’t have a useful model of the brain. 
Below is a more recent quote from Ed Boyden who leads brain investigation at the MIT Media Lab:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><a href="https://www.edge.org/conversation/ed_boyden-how-the-brain-is-computing-the-mind" style="text-decoration-line: none;"><span style="background-color: white; color: #1155cc; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; text-decoration-line: underline; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;">How the Brain Is Computing the Mind</span></a></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Despite the title, Ed explains very little about how the brain works, though he does acknowledge the challenge. And the title itself provides the same important clue - why would Ed presume the brain might “compute” the mind? And it’s not just Ed. Various forms of computer thinking remain our default metaphor of the brain in spite of its poor fit.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The contrasts between the brain and computer have been well known for decades. 
Nobel laureate Gerald Edelman effectively challenged the computer metaphor in several of his books. Yet this tech approach continues to guide most of the effort and consume most of the resources, as it unfortunately did in his day as well.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If we think of the brain as a computer, it follows that neurons somehow represent state machines, conforming to information theory. This is not the case. If the brain were some kind of computer, we would expect it to be fast, digital, synchronous, serial, hard-wired, and deterministic in its operation. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The brain is the very opposite in each of these major aspects. It’s relatively slow, surprisingly analog, mostly asynchronous, profoundly parallel, and quite plastic. Instead of consistent answers, the brain often yields an indeterminate result in a very uncomputer-like fashion. But it’s not just the computer metaphor that causes problems. It’s science itself. 
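</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">To make the contrast concrete, here is a minimal sketch of my own (the state machine is textbook; the neuron is a toy leaky integrate-and-fire caricature with arbitrary parameter values, not a claim about real neurons):</span></p>

```python
import random

def fsm_step(state, bit):
    # A deterministic finite state machine tracking the parity of 1-bits:
    # the same state and the same input always yield the same next state.
    # This repeatability is what the computer metaphor predicts.
    table = {("even", 0): "even", ("even", 1): "odd",
             ("odd", 0): "odd", ("odd", 1): "even"}
    return table[(state, bit)]

def neuron_step(potential, inputs, threshold=1.0, leak=0.9, noise=0.2):
    # A crude leaky integrate-and-fire sketch: analog accumulation,
    # leakage, and injected noise mean identical inputs need not produce
    # identical outputs twice -- slow, analog, and indeterminate.
    potential = potential * leak + sum(inputs) + random.gauss(0, noise)
    if potential >= threshold:
        return 0.0, True       # fire, then reset
    return potential, False    # accumulate quietly
```

<p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Call fsm_step a thousand times with the same arguments and you get the same answer every time; call neuron_step a thousand times and the noise term gives a different trajectory each run.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">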
Let’s get back to Ed Boyden:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“The reason is that the brain is such a mess and it’s so complicated, we don’t know for sure which technologies and which strategies and which ideas are going to be the very best.“</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The very best? How about any approach at all? And if we don’t know “for sure”, might it help to know something by degrees? Keep this demand for a determinant model in the back of your mind for later consideration. Don't get me wrong. I love science and technology almost as much as logic, but I don't believe that science has an answer for everything. At times it even blinds us from the true nature of things. For now, let’s evaluate the rest of this quote.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As yet another example, Ed too leads with “technologies.” Why would we expect the brain to be understood in terms of technology? 
The brain certainly didn’t come out of a factory. The brain evolved. And yet technology has been the prime strategy for modeling the brain throughout most of recorded history. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div></span><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt;"><span><span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The brain has in turn been compared with an aqueduct, telegraph, clock, telephone, steam power, computer, and lately, the internet. Each has been the most advanced technology of its time. Some are now trying to understand the brain in terms of superconducting quantum computation. </span></span></span><span><span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Though complex, I doubt the brain’s operation is quite that exotic. Or technical. And the distraction gets worse. It’s almost as if science itself has become our latest “new” technology. </span></span></span></p><span><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The first test of science is consistency. Though not random, the brain’s operation is often not consistent. This is a major challenge for science, and perhaps one reason for our missing model. Science requires that experiments produce repeatable results. 
The brain violates this with impunity, switching from one answer to another as it dynamically tunes itself to its environment. Hypocrisy is common in human behavior. When we overlay technical metaphors, things get worse. Soon these metaphors are steeped in rationalization and confusion when the true test of any model or metaphor is simplicity and utility in sorting out the data. Without a useful model, the data just piles up. We need a way to break this logjam, and the key is likely more intuitive than logical.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The technical approach to understanding the brain is like deconstructing a Boeing to understand how a bird flies, and just as useful. The Boeing applies thrust to a fixed-wing in a fairly crude fashion but also flies much faster. The bird’s solution to flight is far more subtle and elegant. But slow. Which is better? Neither. Each has advantages depending upon requirements for load, speed, and maneuverability. And that’s the point. There is no one perfect answer. Nature has evolved many different ways to fly as demonstrated by bumblebees, hummingbirds, and even gliding snakes. 
Human methods are just the most recent, and most clumsy in spite of their awesome power and speed.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This tech / biological contrast is not limited to the skies. Something similar happens on land, and even at sea in terms of movement. The wheel forever changed how we travel. Like human flight, it allows for greater speed, load, and distance at the cost of maneuverability. But not always. A bicycle is a hybrid of tech and biological methods for moving over land. It allows a human muscle to achieve greater speed than running, and also greater efficiency than any other application of the wheel.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At sea, a similar contrast exists, and also similar hybrid solutions. Powerboats will get you there faster using the brute force of a motor. Swimming is elegant but quite limiting. Sailing applies the best of both worlds when speed and capacity are critical requirements. 
Sailing works </span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-style: italic; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">with</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> the wind even when sailing into it - quite elegant.</span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Learning to fly provides one more interesting comparison when trying to find a useful model of the brain. Just over a century ago the consensus was that man would never fly. Heavier than air human flight was thought to be beyond our reach, but there had been many attempts, and even a few hints of how to proceed.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Still, after years of trying, by December of 1903, Samuel Langley had spent the entire Smithsonian budget plus $50,000 from the Department of the Army trying to fly using the brute force of a steam engine and fixed flight surfaces. We might describe this as the technical approach to flight - a predetermined and consistent solution. 
Following many attempts (some even modestly successful), the final version of his airplane crashed into the Potomac. The pilot, Charles Manly, was pulled from the wreckage alive, but the humiliation was complete. After decades of effort, and after spending a literal fortune from the government, Dr. Langley finally gave up.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Only a couple of months before, on October 9th, 1903, <a href="https://junkscience.com/wp-content/uploads/2016/01/102025405.pdf" target="_blank">the New York Times punctuated this failure</a> (and wasted public money) by publishing an op-ed stating that man would never fly in a million years. Literally nine weeks later, with far less funding, the Wright brothers proved them wrong. The Wright brothers applied a more hybrid “bicycle” approach to flight which was consistent with their background. Using a lighter gasoline engine, and having a human actively balancing the control surfaces in an organic fashion (as suggested by the New York Times column), were the key elements that were different from Langley’s effort. Orville Wright finally took to the air. 
That first flight was controlled by a biological brain, not a perfectly calculated and trimmed airfoil.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The point is, whether you wish to travel by land, sea, or sky, solutions range from biological to technical. Biological is more subtle and effective. The technical approach applies more power and speed, but in many ways is far more crude. Tech succeeds, but in a different, more clumsy way. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This challenge of a brain model is similar. Electronic computers simulate the world using complex systems operating at the speed of light in a mostly serial fashion - the tech approach. But why might we think there’s only one way to simulate the world when there are so many ways to fly? The biological approach, which the brain uses, is slower but much more subtle and elegant. And in many ways, it’s far more effective. Especially when survival is involved. How many ways are there to simulate the world? Might IBM, Google and Tesla's current AI efforts actually be a hybrid form of intelligence like sailing is to swimming? What is the nature of this more biological type of simulation? What is the brain really doing? 
And what part does the neuron play?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">These were the questions I should have kept in mind, but for decades I too had searched for the logic systems of the brain, and for how its state machine might be encoded by some special form or analog of logic. My approach was technical, but I was about to see the first hint of one possible bioanalogical, stateless alternative - DEcoding reality.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The Gnostic Neuron</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Here’s one final acknowledgment of our missing brain model. 
It’s the opening line from the issue dedicated to the brain from Scientific American in 2014, The New Century of the Brain:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“Despite a century of sustained research, brain scientists remain ignorant of the workings of the three-pound organ that is the seat of all conscious human activity.”</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Pessimistically, the article then immediately cites an interesting discovery as just another mysterious loop in our Gordian Knot:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“... the discovery of a “Jennifer Aniston neuron” was something like a message from aliens, a sign of intelligent life in the universe but without any indication about the meaning of the transmission. 
We are still completely ignorant of how the pulsing electrical activity of that neuron influences our ability to recognize Aniston’s face and then relate it to a clip from the television show Friends. For the brain to recognize the star, it probably has to activate a large ensemble of neurons, all communicating using a neural code that we have yet to decipher.”</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you haven’t heard about the “Jennifer Aniston neuron”, here’s a quick summary of this remarkable work by <a href="https://www.researchgate.net/publication/7770938_Invariant_Visual_Representation_by_Single_Neurons_in_the_Human_Brain" target="_blank">Rodrigo Quian Quiroga from UCLA in 2005</a>. It began when a patient was being prepared for brain surgery to treat epilepsy. As part of that process, selected neurons were monitored while the subject was shown photos of various places, people, and things. In this case, the chosen neuron fired when the patient was shown a picture of Jennifer Aniston as the character Rachel. Even more remarkably, that same neuron fired no matter how "Rachel" was presented. This was amazingly consistent for a neuron. Whether it was her spoken name, her written name, her photograph, or other likenesses, all of them worked as long as the cue seemed to capture some essence of the character Rachel. 
I immediately recognized this quality as the “invariance” described by Jeff Hawkins in his book, “On Intelligence”, referred to above.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This remarkable discovery nicely demonstrates a “gnostic” neuron, or “grandmother cell”. Such neurons are now described as a subset of “concept” cells, though the “grandmother cell” itself began as a joke at a neuroscience lecture in 1969. Yet this was no joke. This was real, and after 15 years has yet to be effectively challenged. The results were independently verified when “Luke Skywalker”, “Bill Clinton”, and “Halle Berry” neurons were found in other tests. Some of these neurons even fired when a cartoon of the subject was presented. There are many other examples, and they all demonstrate invariant knowledge by firing when the essence of said subjects or characters was detected in ANY form.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The idea of a gnostic neuron is philosophically profound: literally, the expression of knowledge taking the form, in this case, of Jennifer Aniston's "Rachel". This neuron recognized that one specific character out of the thousands of people that this particular epilepsy patient experienced during her life. That’s an impressive trick. 
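To make "invariance" concrete, here is a toy sketch of my own (the cue strings and the mapping are illustrative assumptions, not data from the study): many different surface forms of a cue collapse onto one concept, and the concept cell responds to the concept, not the form.

```python
# Toy illustration of an invariant "concept cell".
# The cues and mapping below are invented for illustration only.
CUE_TO_CONCEPT = {
    "photo of Aniston as Rachel":     "Rachel",
    "written name: Jennifer Aniston": "Rachel",
    "spoken name: Rachel":            "Rachel",
    "cartoon of Rachel":              "Rachel",
    "photo of the Eiffel Tower":      "Eiffel Tower",
}

def concept_cell(concept: str, cue: str) -> bool:
    """Fire only when the cue's recognized essence matches this cell's concept."""
    return CUE_TO_CONCEPT.get(cue) == concept

assert concept_cell("Rachel", "cartoon of Rachel")              # any form works...
assert not concept_cell("Rachel", "photo of the Eiffel Tower")  # ...but only for Rachel
```

The point of the sketch is that the cell's selectivity lives on the concept side of the mapping; how a real neuron builds that mapping is exactly the open question.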
How did this neuron come by this knowledge? And what significance does it have in breaking through this logical logjam of brain data?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’ve included the above pessimistic assessment of the Jennifer Aniston discovery because I reached the very opposite conclusion. For me, this gnostic neuron was not a message from aliens. It was a critical hint. The moment I read about the Jennifer Aniston neuron I literally stopped in mid-bite. I was eating lunch. The moment remains vivid.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is the key to philosophy, or at least its object of love. As a computer architect, I’ve had a lifelong professional interest in what computers have in common, and in contrast, with the brain. Computers process information. Higher level knowledge is similar to information, but not the same thing. Like many other technologists, not only had I been mischaracterizing the neuron, I’d also been mischaracterizing knowledge. I’d spent decades analyzing neurons as logic devices, trying to understand what kind of systems these neurons might form, or how to “code” memory as suggested by the Scientific American article above. 
But like so many other technologists, I had the wrong perspective.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In that instant, for me, the problem changed. Instead of dismissing this result as a message from aliens, I began to wonder what all the other neurons “knew”, and how they came to know it. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At that moment, the neuron ceased being a slippery state machine and became associated with acquiring knowledge. 
I began researching what it might mean to “know” something, and how a neuron might perform this amazing trick.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Fire</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ignoring for the time being how, let me present why knowledge might be the key to modeling the brain. Simply accept the assertion that neurons magically create knowledge at the instant that they fire. Here’s an example:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Imagine a neuron that can sense smoke, another that can feel heat, and finally, a third that can detect light (all well-characterized by neuroscience). When each of these neurons triggers we can assume that each of these things is experienced by the person in question at that moment. 
When each of these neurons senses its tuned condition, it creates a neural pulse in that instant, which can be thought of as knowledge taking the form of a signal reflecting that specific experience. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now imagine a fourth neuron tuned to sense a specific pattern from these three neurons when they all occur within a constrained amount of time, the essence of synchronicity. This fourth neuron thus combines signals from these first three neurons (and the events they indicate) to form an abstraction of knowledge we will call “fire”.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If all three source neurons trigger at the same time, they create an association, and this fourth neuron will trigger indicating that something is burning in the world, a useful bit of knowledge quite distinct from the knowledge of smoke, heat, and light.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: 
pre-wrap;">Now further imagine that this fourth neuron is connected to a script of other motor neurons whose muscles compress the diaphragm, adjust the vocal cords and manage the tongue and lips of the person in question. When these three original source neurons trigger in unison, they trigger the fourth that captures the pattern, and this knowledge about this fire will escape the body and alert the rest of the tribe as the word “fire” issues forth from that person’s mouth. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At that moment, the unique sound that makes up the word "fire" becomes a new thing in the world. A thing to be sensed by others so that they may mobilize without actually smelling, seeing or feeling the result of the physical fire. The word fire has become a useful abstraction re-presented to others without the need for the actual experience. These vibrations in the air connect one human to another, not unlike the chemistry that connects one neuron to another at the synapse.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It’s easy to see from this simple example how all words might each be represented by a single neuron dedicated to a specific bit of knowledge, and how language itself might be an external form of the brain’s internal architecture. 
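Now f">
The smoke-heat-light story above can be sketched as a toy coincidence detector. This is my own minimal illustration, not physiology: the neuron names, the 0.1-second window, and the all-or-nothing threshold are all assumptions made for clarity.

```python
from dataclasses import dataclass, field

@dataclass
class ToyNeuron:
    """Toy coincidence detector: fires when all inputs spike within `window` seconds."""
    name: str
    inputs: list                    # names of the source neurons it listens to
    window: float = 0.1             # assumed coincidence window, in seconds
    last_spike: dict = field(default_factory=dict)

    def receive(self, source: str, t: float) -> bool:
        """Record a spike from `source` at time `t`; return True if this neuron now fires."""
        self.last_spike[source] = t
        return all(t - self.last_spike.get(s, float("-inf")) <= self.window
                   for s in self.inputs)

# Three sensory neurons feed a fourth, "fire", neuron.
fire = ToyNeuron("fire", inputs=["smoke", "heat", "light"])

fire.receive("smoke", t=0.00)      # smoke alone: no abstraction yet
fire.receive("heat", t=0.05)       # smoke + heat: still not "fire"
if fire.receive("light", t=0.08):  # all three within the window: association!
    print("fire")                  # the motor "script": the word escapes the mouth
```

Had the light spike arrived at, say, t=0.2, the smoke spike would have aged out of the window and the fourth neuron would stay silent: in this sketch, association requires synchronicity.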
To summarize, words in verbal or written form are an external expression of internal neuronal knowledge. Decursively. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I realize this simple description requires a great leap of faith based on the radical notion that neurons create knowledge, so if it challenges your sensibilities, relax for now. I’ll continue with how I came to understand this model of the brain. Other detailed examples including the important “how” of this example will be provided.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Our Left-Brain and Right-mind</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I thought about this possible gnostic nature of the neuron for another seven years. No, not all the time, but a lot. 
I was still trying to understand how a neuron might come to know something when I happened across</span><a href="https://www.youtube.com/watch?v=dFs9WO2B8uI" style="text-decoration-line: none;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span><span style="color: #1155cc; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; text-decoration-line: underline; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;">this video</span></a><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">. It is a TED summary of “The Master and His Emissary, The Divided Brain and the Making of the Western World”, by Dr. Iain McGilchrist. If you’ve watched this 12-minute RSA video, (which is brilliant for its own reasons), you’ll understand why I read (</span><a href="https://suddendisruption.blogspot.com/search/label/Divided%20Brain" style="text-decoration-line: none;"><span style="color: #1155cc; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; text-decoration-line: underline; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;">and reviewed</span></a><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">) the book.</span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; 
font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This book deals with the divided brain at the macro level, as opposed to Jennifer and her neuron where I’d been probing possible forms of logic at the nano level. And yet his descriptions were vaguely familiar. Dr. McGilchrist begins by noting the physical asymmetry of the brain, and uses it to support his model of operational asymmetry and malleable dichotomy throughout the rest of the book. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I use the terms left-brain and right-mind for a similar reason. Our mind is the subjective ethereal experience of our physical brain. Think in terms of music. The material brain is like a musical instrument; the ethereal result is the music of the mind. This left-brain, right-mind association is to remind me of the world as managed by each respective side of the brain. The left-brain objectifies things of interest. 
They stand apart, so need labels, titles, and indexes as access methods to find things in a</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> delayed but</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> deterministic fashion.</span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Our right-mind evaluates things according to their impact on us personally, and so subjectively and immediately in time and space. Experience,</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> association,</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> and imagination are its access methods - intuitively. </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The left-brain prefers things normalized and defined so they can be used as components in constructing other thinking. Our right-mind keeps its options open and close by as it watches for threats and opportunities, and in the process, solves the homunculus issue created by the left-brain. 
</span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Of course, we each also have a right-brain and left-mind, which accounts for the exceptions in this broad generalization. </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Dr. McGilchrist also notes that our left-brain naively thinks of itself as the whole brain, and prefers to define our world as logically consistent. The cause and effect of science are how our left-brained "Executive" models the world. He seeks THE answer. Objective technology is the result. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In contrast, our right-mind knows the world is not entirely consistent, nor completely random. It lives in that undefined yet personalized middle ground. Mysticism, art, and intuition are the results. 
Our right-mind correctly treats our whole body and whole brain as a collection of subjective survival solutions, where one solution need not preclude another.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The first half of this book describes the brain in objective and definitive terms, hallmarks of the left-brain. The second half of the book presents culture and art in a more subjective fashion. It reaches for conceptual connection, as a mystic might.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Our left-brain’s "Scientist" dominates the implementation of neuroscience and its metaphors at the direction of our left-brain's "Executive". This leaves little room for the speculations of our more intuitive right-minded "Mystic" to direct our right-minded “Artist”. Ironically, and consistent with McGilchrist’s concept, our right-minded Mystic knows more about our left-brained Executive than that Executive knows about our Mystic. At least in a holistic sense. 
</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">(Sorry Iain, the whole Nietzsche, Master / Emissary story works for your theme, but it conflates subservience in a hierarchical relationship with the more complementary but equal nature of a divided brain. If there is subservience, the left-brain seems to be unaware of it. As I’ll shortly present, the creation of knowledge occurs on that line between yin and yang, not master and servant. I agree with your Nietzsche story that our left-brain has run amok at times in recent history, and especially now. Though complementary in most ways, the two halves of the brain are ultimately far more equal, at least in opportunity, if not always in operation. A left-brained Scientist directed by his Executive, and a right-minded Artist directed by a Mystic better describe both the subservient and egalitarian nature of this complementary architecture. Going forward, I'll use these four titles (and many others) in my personal org chart as opposed to only two.)</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It was also just before reading the Divided Brain that I discovered Oliver Sacks’ wonderful writing about neurological deficits associated with physiological injury or disease. It wasn’t just Oliver's writing I appreciated. 
His powers of observation and correlation were those of a modern Sherlock Holmes, except more subjective, which is what makes his stories so much fun to read. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Each Oliver Sacks case now took on new meaning in the context of left-brain and right-mind as I explored the gnostic nature of the neuron. Dr. Sacks intuitively used the model I’m about to describe without knowing it explicitly. I will provide examples in due course. He also suggested we move beyond objective and subjective to explore the brain with a trajective approach, but, (believe it or not), I’m trying to keep things simple so won’t wander down that rabbit hole at this point. I will summarize by saying that if McGilchrist is correct, there's a rich opportunity, both intellectually and personally, in trying to learn what our right-mind might have to say about the last hundred and fifty years of brain research. 
If only it could speak.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Why Words Matter</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Returning to lateralization, here’s one example of why the dominance of left-brained language is so distracting:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">You do not have an amygdala. Nor do you have a hippocampus. You don’t even have a cortex. You have two of each. Your skull contains two amygdalae, two hippocampi, and two cortices, one each for left and right. This spell checker does not even recognize any of these alternate plurals. 
And how often do you even encounter the more common ones - cortexes, amygdalas, and hippocampuses? This oversight is especially grievous when referring to a single brain. These three terms even sound alien. Do these plurals matter to modern neuroscience? Of course they do. Are they ever used? It's quite rare. It's as if our language-dominant left-brain is denying the very existence of our right-brain and its major components. Our left-brain can't bring itself to represent another side it doesn't believe even exists. Neuroscience is so steeped in its left-brained world that it won't admit that the right-brain is even there in terms of functionality or anatomy. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Think about the last time you read about the cortex. It was likely presented as a singular and unified whole, as I have just done. But as McGilchrist presents, the very opposite is true. The surface of these two cortices does not touch topologically, and their corpus callosum signaling is mostly about inhibition (a useful hint). It’s the same with the amygdalae and hippocampi, along with all the other lateralized and duplicated parts. The brain is vividly, profoundly, and obviously divided both physically and operationally. Words matter. They affect our thinking. And also how we feel. Our left-brain likes to believe it controls the entire body, including the whole brain. All of the time. The absence of these plurals is just one example of our left-brain with its language dominating our narratives, and our view of the world. 
</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This advantage with words doesn’t mean the left-brain is a villain with a superweapon called language. Quite the opposite. The left-brain is often left innocently wondering what happened in its struggles with what I’ve come to think of as the silent and secret tyranny of the right-mind. From the left-brain’s perspective, the right-mind doesn’t even exist. The left-brain tries to ignore both the right-brain and right-mind. From the left-brain’s perspective, this mysterious “other” side of the brain remains in shadow but will often simply take control at key moments, leaving the left-brain to quickly (and often inaccurately) rationalize the result, which it does with grace honed from practiced experience. Even when completely wrong. This quirky dynamic nicely explains cognitive dissonance</span><span style="font-family: Arial; font-size: 18.6667px; white-space-collapse: preserve;">, passive-aggressive behavior,</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space-collapse: preserve;"> and the ubiquity of hypocrisy in our culture, as well as many other enigmatic yet commonly observed aspects of human behavior. Contrast pain and pleasure. 
Oh, that we had time to walk down that rabbit hole right now.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The result of this architecture is a contest played out in the corpus callosum in the same way it’s played out between two neurons competing to create dichotomous knowledge in the nano context. Since the left-brain controls most language, it tends to dominate verbal and written description. We never get to hear, (or read about), the model of the brain that the right-mind intuitively understands. Fortunately, this more intuitive model still shows up as hints in our language and culture. A right-minded template of the brain is hiding in plain sight, as I’ll shortly describe using something I've come to call decursion.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">To summarize, Dr. McGilchrist’s work not only described the divided brain, his theme suggests one possible reason we don’t have a model for the brain. It’s that our left-brain doesn’t like the answer it’s found, and so inhibits our more intuitive right-mind, which of course has no voice. 
For those who have studied neuroscience, their left-brain believes simulation requires access methods, electrical communication, and logic states, but can’t find where these states are stored, or even anywhere logic is consistently applied in the brain. Our right-mind knows better but gets inhibited conceptually on any topic dealing with electricity, brain data, logic, or science.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ultimately, the more rational left half of our brain denies the fruits of right-minded intuition and ends up in a logic trap much like Zeno’s Paradox as presented in the "Divided Brain". If you’re not familiar with Zeno’s Paradox, I’ll present my father’s version as told to me when I was a teenager. My father was always telling dirty jokes, and I don’t believe he even realized the philosophical history behind this one:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">"An engineer and a scientist were brought into a dark room containing a large one-way mirror. On the other side of the mirror was a small white room which was empty except for a beautiful naked woman standing against the back wall. 
Both men were instructed that they could shortly enter the room with the naked woman but could step only half the distance to her with each step.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The scientist put his head in his hands and wept, knowing he could never achieve his objective. The engineer simply smiled, realizing he could get close enough for all practical purposes."</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What each of these men “knew” was correct, yet they reached opposite conclusions. Knowledge is subjective, reflecting our individual talents, experience, and perspective.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Is it not likely that the reason for our missing brain model is a very similar logic trap? 
Fortunately, a right-minded “Mystic” (or artistic engineer), can span a towering paradox in a single and final stride, getting him close enough for all practical purposes.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I casually speak of our left-brained Executive/Scientist and right-minded Mystic/Artist as if Dr. McGilchrist has laid the matter of multiple entities in our skull to rest. And so he has. But which aspects of our behavior lie on which side? And why? Is dichotomy the only aspect of our brain that forms physical boundaries? The brain is clearly multifaceted both physically and operationally. But how many faces do we present to the world? And why do we generally have this subjective experience of a unified mind? 
Time for another story from my past:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Ricky Morrison </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I first met Ricky Morrison when I was seven years old and in second grade, but not on the first day. He didn’t show up until late September. Ricky actually started the year in the special-ed class but was soon mainstreamed into ours. I don’t know why the teacher put him next to me, but she asked me to be nice to Ricky, to help him with his work.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ricky was ugly, awkward, and clumsy. His head, and especially his forehead, were larger than normal, even for his big frame. He outweighed me by at least fifty percent. In retrospect, I think he had already been held back a grade, maybe more. 
</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ricky tended to slobber and drool, which he usually caught with the sleeve of his canvas jacket. The mucus buildup took on a sheen near the cuffs. He almost never took off this jacket. I once asked him why. He told me he would get in trouble if he lost it, so he left it on, even during hot weather.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ricky had another curious habit. If he wasn’t being forced to stay in his seat, he was always on his way somewhere else. And “somewhere else” constantly changed. He would move around the classroom from place to place but only stay a second, then off to another. I would often see him heading down a hall, stop abruptly, then head a different direction as if following some internal radio instructions. This was just one of his more bizarre behaviors. 
We never talked about it.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ricky rarely spoke, but often stared intensely. When he did speak, his voice was high and had a nasal quality. His words were hard to understand, but he would glare at you if the meaning was important. I cannot remember a time when he actually smiled. His countenance was generally dull. Well, at least when it wasn’t intense around the eyes. When he wasn’t trying to get my attention.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As his helper, I put his name on his papers so the teacher wouldn't lose track. He had trouble writing his name. Perhaps that’s why I remember it after all these years. He rarely added much to the page, though he could when he wanted. He wasn’t as dumb as everyone thought. I remember showing him where the "World Book" encyclopedias were. These books were my favorite thing in the classroom. We of course couldn’t read them at the time, but I showed him how the pictures could tell stories. 
Mostly, he wasn’t interested in school, but I did see him get correct answers on his papers now and then.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">His general appearance and tendency to stare often provoked confrontation. But his size and volatile nature usually kept the other kids at bay. He lived in fear of adult authority, but little else. You could see it in how his eyes would dart around when a teacher approached, and how he would bristle if they asked a question. I came to wonder how many of his issues were caused by his limited abilities, and how many by innocent confrontations with impatient teachers.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Weeks later we were getting ready to leave for the day and he commanded, “Come”. I was curious, so I followed, with no detours this time. We walked up to the local Frosty-Freeze on Main Street. An attractive woman emerged from the back and introduced herself. It was his mom. She seemed pleased that I was with him. She brought out a large basket of French fries and set it on the table between us, then went back to the kitchen. I only got a few. Ricky ate them as if he were starved, and shortly pulled the basket across the table so I had to reach farther. He glared at me every time I took one. 
It seemed that he realized he was supposed to share but wasn’t comfortable actually doing it.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">On another day we were walking over to the Frosty Freeze after school and some fourth-graders began teasing him. I was close by so grabbed him by the jacket trying to move him along the path. He slipped his arms out of the jacket and dropped into a threatening stance addressing the older boys. He was out of his jacket. He was out of his league. This was serious. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Then he did something strange. He bit down on the base of his left thumb and made a fist with his right as he crouched down. It was his idea of defense, and it worked. Perhaps he had seen the posture on some TV show, adding the bite for effect. I’m not sure. The older boys laughed at him, but also backed away. I grabbed him by the shirt and pulled him along until he noticed I had his jacket. He put it back on. 
I never saw him actually fight anyone but heard about one time when he had gotten a bloody nose and had to go to the principal's office.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Starting in third grade I moved to a desert community east of Tucson, Arizona. I didn’t see Ricky again until years later when I returned to northern California and we both attended the same high school. He clearly recognized me from years before but only said, “Hi.” That was it. No smile, nothing. I asked him how he’d been, but no reply. I don’t think we were ever really friends, at least not in the normal sense. His social behaviors were largely missing. I was simply part of his known world. Maybe he trusted me more than others. And perhaps naively, I trusted him.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For me at the time, Ricky was an example of how different people had different ways of not only perceiving, but also dealing with the world. True, most were not as different as Ricky, but I began to notice how he reacted to the same events I experienced, but in different ways. We each have our own methods of dealing with the world. I remember wondering at the time, what went on in Ricky’s head? I was fascinated by human behavior. Ricky was such a vivid example. 
The difference between us was a hint of something important, a subjective shadow in the back of my mind. Or have I just now created these distinctions all these years later? I’m not sure, but Ricky remains vivid in my memory.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I also observed how others reacted to Ricky. Most saw him as abhorrent, a creature to be avoided. But I didn’t. Ricky was interesting. I wasn’t enchanted by Ricky, but I did have compassion for him. In a selfish way, for me, Ricky was a subject of study. But whatever his IQ, Ricky was still a person. I felt he should be given the opportunity to explore the world like the rest of us.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In another time and culture, Ricky might well have been honored as an oracle, to be consulted in various ways as a source of alternate wisdom in difficult times. I now believe that we come to know things with our right-mind, and we come to understand things with our left-brain. And also the inverse. Wisdom is created in the tension between knowledge and understanding, a type of Yin and Yang, whichever side of our skull creates it. But certainly, the brain is more complex than just the simple trick of dichotomy. What about all the other possible tricks? 
Let's leave Ricky for the moment and return to how the brain is divided.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The Triune Model</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Dealing with multiple brain parts was not new for me. I had introspectively tried something similar in 1977 when I first read Carl Sagan’s book, “The Dragons of Eden”. It was mostly a popularization of Dr. Paul MacLean’s “Triune” model of the brain. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Dr. MacLean’s model divided the brain into three layers of ascension - the reptilian, paleomammalian, and neomammalian. Each is ascribed characteristics associated with the implied group of species. The paleomammalian experience is subtle and quixotic; actions of the reptilian brain are more obvious. Think back to any of your movements that were so quick they surprised you. 
They are more likely to be reptilian. Comparisons with Freud’s Id, Ego, and Superego are obvious and often made in the process of dismissing this Triune theory by associating it with some of Freud's more challenging ideas. But might that be throwing out the baby with the bathwater? I've found that both Freud and MacLean have contributed significantly to understanding the brain. At least for me.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As I explored the possibilities at the time, I remember thinking that the main problem with the Triune model was that these three delineations had too many exceptions. As the brain has become more characterized and functions more localized, the lines between these entities have become blurred, and having only three layers seems far too limiting. Plus, we now know that the top two and a half layers have independent versions for the left and right sides of the brain as noted above. Perhaps five creatures might have been a better fit. But allowing five is only tugging on a string that reveals fifty more. After that, things get complicated. 
</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As for limiting the number of major “parts” of the brain to three or five, other significant neuronal structures don’t even reside within the skull at all, such as neural control of the heart and gut. These obvious peripheral control functions are even more primal than Dr. MacLean’s reptile. In spite of these problems and others, the Triune concept has value. There is significant evidence that our brains are layered phylogenetically from the spine up, out, and forward along an axis of sophistication. These three, five, and perhaps more layers may represent successful tricks evolved by various other creatures back through our evolutionary past.</span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Our prefrontal brain (on both sides) seems to provide for the most abstract executive (and mystical) functions. The basal ganglia (being somewhat less lateralized) deal with the more primal, far less sophisticated than even a reptile. 
These three anchor the endpoints of this split axis of sophistication within the skull.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Note that I don’t use the word “systems” to describe any of this complex functionality. One of the most valuable aspects of the Triune model (and Carl’s presentation) is that it’s biological and subjective from these creatures’ perspectives. “Systems” would take us back into the tech world with all its rigid definitions. I will use the term sparingly and present technical comparisons mostly for contrast.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Setting aside specific functions and their locations for now, the broader concept that the brain is somewhat layered in our evolutionary history remains useful. A whole class of behavior known as reflexes can be understood as largely independent creatures living in the spine. Along with the above five parts, should we not add the spine, heart, and gut for a total of eight? A “gut feel” is often how we describe conviction. In any case, we will soon visit a few others. 
But before we let our left-brain limit the number of these layered creatures, let’s consider more abstract behavior.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">How Many Parts?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In “Frames of Mind: The Theory of Multiple Intelligences”, Howard Gardner describes more than eight types of intelligence. Might these types of intelligence be implemented by eight or more relatively independent areas of the brain?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And Dr. McGilchrist was not the first to suggest a dual nature of the brain. 
For completely different reasons, Freud was one of the first to recognize a mind with at least two aspects by contrasting our conscious and subconscious nature.</span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">More recently, Daniel Kahneman’s book, "Thinking, Fast and Slow"</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> describes the operation of the brain in two competing “systems” reflected in the title. Daniel was also careful to make clear that his ideas have nothing to do with the vertical separation of the two cortices. But again, are “fast and slow” represented by a physical or operational boundary somewhere in the brain? Perhaps a lizard contrasted with a mammal? Maybe some other creature in our past? There are so many possibilities, and for now, it’s important that one concept need not preclude another, no matter how we slice up the mind and its underlying brain.</span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In "A Thousand Brains", Jeff Hawkins likewise models the brain as a thousand parts, but these parts are general and homogeneous, as opposed to specialized and dedicated in function. In a more macro context, he refers to an "old brain" as opposed to our "neo" cortex (singular!). 
His old-brain, new-brain model is similar in many ways to the Triune model (minus one brain part) and has similarities with Kahneman's fast and slow versions. Do any of these limits add clarity?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In any case, we have Iain McGilchrist effectively characterizing a brain divided left and right, Paul MacLean doing something similar in three levels from primitive to advanced, Howard Gardner describing eight types of intelligence with no physical allocation, an attribution also left undefined by Daniel Kahneman with his two “systems”, Hawkins with his old and new brains, and Freud with our conscious and subconscious. And this is the shortlist of those who have presented the brain as having multiple facets, with apparently multiple dimensions and aspects of control.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It’s also interesting that there is no agreement as to how the brain is sliced or managed by these various creatures from our phylogenetic history. Yet reference to multifaceted behavior is so common in our language and literature that whole sections of our vocabulary are dedicated to the idea. There are literally thousands of examples. 
</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Abraham Lincoln referred to “the better angels of our nature”. He didn’t specify how many. Shakespeare liked to confuse us with pairs of twins, and their deceptions. Often these twins had contrasting natures. Or similar ones. Were these literary characters actually devices representing multifaceted meta-characters? And of course, René Descartes recognized a duality in our nature, if not a reasonable physical implementation. Was his attempt to separate mind from matter driven by an ultimately multifaceted aspect of internal versus external modeling of our left-right divided brain? Or was he simply protecting the sanctity of the soul while being unable to move beyond the dichotomy of mind and body to entertain other parts? Our literature and history are rife with examples of multiple aspects of our mind’s experience, and likely, the multiple competing and cooperating behaviors created by a multifaceted brain.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So if the physical brain and some introspective experience are multifaceted, should not its operating model also conform? Perhaps we have no model of the brain because there isn’t one. 
Perhaps it’s because there are many, one for each part or creature in our evolutionary past. And if the brain has many independent parts, how many? And how independent might they be? What constitutes the sovereignty of pumping blood compared to moving the gut? Are we really looking for more masters? More emissaries? How about a few minions? Certainly viewing our divided brain as simply Master and Emissary (or even Mystic/Artist and Executive/Scientist) does not account for the observed and extensively multifaceted nature of our behavior.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">After all, we can drive down the road actively debating talk radio in our minds while eating a cheeseburger and picking our teeth. This requires at the very least five independent entities all operating in parallel, each deferring to the others as needed. Some will have to be inhibited in their operation at any given moment. Someone watches the road. Someone drives the car. Someone listens to the narrative. Someone bites the burger. Another manages dental hygiene. And that’s not even counting all the autonomic and/or peripheral functions happening in the background, such as heartbeat, breathing, and digestion to get that cheeseburger down. 
A single-digit creature count is likely a gross underestimation and oversimplification.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you’re now trying to imagine what it might feel like to have multiple creatures in your skull, don’t bother. You already know. If this thesis is correct, it’s what you experience each day. Mostly we feel unified, with transitions that are surprisingly smooth. It’s only when you have to grab the wheel to get you back on the road that you realize there is much more happening below the surface of your conscious mind. Every moment of every day.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So how many operational parts make up our brain? It’s easy to imagine a brain/body with tens, or even hundreds of independent “systems” all working in various degrees of contention and harmony to yield a single and seemingly unified experience. To keep this multi-brain idea flexible and open-ended, let’s work with a nice round number, say a thousand creatures, while we explore functional boundaries and behavioral sovereignty. 
All while trying to avoid tech metaphors.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Our Complex Brain</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">A multifaceted brain explains so much about human behavior. It explains why people don’t keep promises. It explains why people lie. It explains a great deal about sex. It explains how people can change their minds so quickly. It explains how people can be so self-destructive. It answers those questions I posed above about Socrates, Henry II, Stalin, and Mao. It explains your doubt about what I’m presenting right now. It even explains mine.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">From a tech perspective, a multifaceted brain resolves so many issues about the subtlety of our more obvious behaviors. 
The very nature of a complex system is that it’s made up of multiple subsystems, each with its own agenda. In order to understand the behavior of a complex system, it’s important to know when each subsystem is in control, when they are competing, and when they are cooperating.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If control of the body’s muscles can be instantly and dynamically switched from one system to another, the result can be highly adaptive. This is especially true when one model or architecture need not preclude the others. And that’s the trick in a macro sense. Such a parallel and/or contention-resolving method would also explain the extraordinary resilience and reliability of the brain. As Malcolm from “Jurassic Park” said, “Life finds a way”.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Returning to a more right-minded perspective, these "subsystems" are better thought of as “creatures”. That’s likely how they evolved. Being able to introspectively feel the experience is especially challenging when multiple things are happening at the same time. 
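Before we set the tech framing aside, the kind of dynamic switching described above - several subsystems bidding for the same muscles, the winner acting while the rest are inhibited - can be sketched in a few lines. This is purely my own toy illustration with hypothetical creature names, not a model of actual neural arbitration:

```python
# Toy winner-take-all arbitration among competing "creatures".
# Each creature bids an urgency for the body's shared muscles at every
# moment; the strongest bid takes control and the rest are inhibited.

def arbitrate(bids):
    """Given {creature: urgency} bids, return (winner, inhibited)."""
    winner = max(bids, key=bids.get)
    inhibited = [name for name in bids if name != winner]
    return winner, inhibited

# One moment on the road, the lane drifts and road-watching out-bids
# the burger; the next moment, with the road clear, the burger wins.
moment_1 = {"watch_road": 0.9, "steer": 0.7, "bite_burger": 0.4, "pick_teeth": 0.2}
moment_2 = {"watch_road": 0.3, "steer": 0.5, "bite_burger": 0.8, "pick_teeth": 0.2}
print(arbitrate(moment_1)[0])  # watch_road
print(arbitrate(moment_2)[0])  # bite_burger
```

The point of the sketch is only that control can shift fluidly, moment by moment, with no one creature remaining master all the time.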
But it’s fun to deconstruct, to think about, and to feel.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This raises other issues. What is the nature of each of these thousand creatures? How do we characterize them? What are their operational boundaries? What are their capabilities? What are their limits? And if we actually have these creatures in our skull, how are they physically organized? Left/right? Up/down? Front/back? Core to periphery? All of the above? Even more significantly, how are they operationally organized? What connects to what? Who has control? Who gives consent? When and why? </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Finally, if we have these multiple entities in our brain, how might this control be arbitrated? For me, these were familiar issues of computer architecture. Contention resolution of parallel computer operation is one of the more challenging aspects of computer science. I’ve had to deal with it on several occasions over the decades with varying degrees of success. It’s not an easy problem to solve. 
At least not for a left-brained programmer, which is one reason the concept has been so poorly implemented, even in current multi-core silicon.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In an ironic comparison with the brain (and for completely different reasons), most computer cores are quiet most of the time waiting for other cores to complete a process, making overall operation inherently serial and dependent upon whatever result they might be waiting for. Then the transitions of control are relatively clumsy and crude. I won’t bore you with more clunky detail. In contrast, the biological brain manages arbitration of these many facets with a casual elegance that would lock a computer in a tight loop, or the logic trap of a “deadly embrace” - the deadlock that arises when two processes each hold a resource the other needs. But let’s not slip back into the world of tech quite yet.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">While reading the “Divided Brain”, I recognized something else. These concepts of control forced me to step back from my detailed work with the neuron for a broader view of the brain. In one of Dr. 
McGilchrist’s lectures, he notes that while the lateralization of human behavior as a field of study has been largely ignored, the study of lateralization in other animals continued, and helped greatly in his research and writing. Why would we ignore lateralization in ourselves, but not less complex lifeforms? Is it the same reason we believe we are fundamentally different from other animals? Such thinking is the height of hubris, which is perhaps our left-brain blinding our right-mind in some way. It's far more likely that we are only different from other animals by degrees, even when these degrees yield apparently dramatic differences. It’s only the disproportionate effects of emergent results in these degrees that separate us from bonobos and chimps. That and another superpower of the left-brain - denial.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Our left-brain does not easily accept the idea of other creatures in our skull. But the reality of at the very least a divided brain is obvious. I too found the concept challenging when I first read “The Dragons of Eden”. The thought of a lizard in my brain was distasteful at best, but I did seriously consider the possibility. Perhaps that’s why I found the ideas in “The Master and His Emissary” subjectively less shocking. 
(Carl Sagan had also addressed the issue of left and right brain differences, but more to insightfully contrast serial and parallel operation of the brain which I’ll address in due course).</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In any case, it’s not easy to think about sharing your skull with multiple entities. Still, the evidence is overwhelming in our language and our culture. Fortunately, this multifaceted approach allows us to more easily deconstruct the brain, and more effectively characterize its parts and processes. A multifaceted architecture solves so many problems in forming a useful model of the brain.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So what does the Triune brain (and the other division models) have to do with a simply Divided Brain? For me, it was deja vu all over again. The more I compared the vertically divided brain with the core-layered Triune model, the more similarities I found with a third venue - my work with neurons trying to acquire knowledge in a nano context. In all three cases, and as noted above, arbitrating control was the key. After months of study, I discovered that each of these three models might use a similar method of arbitrating control. 
If I could solve this challenge for two, three, or six creatures, I could solve it for a thousand. And I have.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Before we leave the topic of complexity, let’s contrast it with the concept of complicated. Complicated implies something opaque and nebulous. It’s how our right-mind dismisses the issue. In contrast, if we are effective at deconstructing complexity, things get simpler, a quality especially appreciated by our left-brain. How do we even keep track of what goes where? And does it really matter?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">But Not Exclusively So</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As Dr. McGilchrist described the nature of the divided brain on the macro level, I began to see parallels with what I was finding at the nano-level of the neuron. 
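To make that nano-level parallel concrete, a single neuron's tug-of-war can be reduced to a toy sum of signed inputs - a deliberately crude sketch of excitation versus inhibition on my part, nothing more:

```python
# Crude point-neuron sketch: excitatory inputs push toward firing,
# inhibitory inputs (negative weights) push back; the cell fires
# only if the net drive reaches its threshold.

def fires(inputs, weights, threshold=1.0):
    """inputs: 0/1 activity per synapse; weights: signed strengths."""
    drive = sum(x * w for x, w in zip(inputs, weights))
    return drive >= threshold

# Two active excitatory synapses alone clear the threshold...
print(fires([1, 1, 0], [0.7, 0.6, -0.9]))  # True
# ...but an active inhibitory synapse vetoes the same excitation.
print(fires([1, 1, 1], [0.7, 0.6, -0.9]))  # False
```

Even this caricature captures the dynamic tension: the same excitation yields opposite outcomes depending on what is inhibiting it at that moment.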
Could evolution be using the same tricks over and over? The dynamic tension created by the divided brain was in many ways similar to what happens within a neuron as some inputs try to activate, and others work to inhibit firing. I began to call this similarity “decursive” in contrast with recursive, to be described shortly. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The thesis of the “Divided Brain” is that our left-brain has grown more dominant over the last few thousand years, its influence waxing largely because of the success of technology, and because of our left-brain’s lopsided advantage in expressing this success using language. Even if you don’t follow Dr. McGilchrist to his thematic conclusion, he clearly demonstrates the differences between the two sides of our brain and how arbitrating control presents a challenge: one side might do one thing, “but not exclusively so”. And the other, the opposite, but with the same exception - “not exclusively so”.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">When I first read this bidirectional mitigating clause, I thought of it as a lack of conviction. I soon changed my mind. Dr. 
McGilchrist was describing something both subtle and significant about the two halves of the brain - they both compete and cooperate dynamically as control shifts from one side to the other in a macro context. But strangely, they do not collaborate (sharing labor), and their cooperation is often unintended, even unknowingly performed as he describes in his patients who have had their corpus callosum severed. This strange dynamic also occurs within the layers of the Triune (or more layered) brain in a micro context. And finally at the level of individual neurons in a nano context. That alternative minority case in each context can be quite important for survival.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For instance, language resides in the left-brain, but not exclusively so. And facial recognition occurs on the right, but not exclusively so. Both sides do both. But only to a degree, thus the exceptions might be best described as right-brained and left-minded.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Here’s another way to describe this subtlety. The left-brain likes to quantify and define things. It likes to find THE answer and deny the rest. It uses these definitions to construct dependencies which often become logically rigid in a serial fashion. 
In contrast, the right-mind keeps its options open. One solution need not preclude another. It’s constantly comparing and reviewing scenarios in parallel, dismissing the complex as merely complicated, and granting certainty only by degrees. You might say our left-brain likes to define things with "facts," or at the very least assertions, and our right-mind would be happy to end any generalization with, "or something like that." At least it would if it had a voice.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">One way to think about this dichotomy (and lack of conviction) is that there is likely a majority (or more common) response from the side that is most commonly associated with any given challenge. But there’s also a minority (or backup) solution standing by if the majority response fails, or for some other reason is not cued. That cueing is the key to transferring control. I’ll be describing this in more detail later on, but suffice it to say, something similar happens within the peripheral nervous system, the layers of each side of the brain, neuronal nets within these layers, and even within the neuron itself (which is where I've spent most of my time). 
Let’s return to Ricky at my high school to understand the value of a hot standby, which is also the value of an alternative and minority method.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Ricky’s Triumph</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">A few years after returning to northern California, our entire high school junior class was in the cafeteria taking some annual test. As usual, I was seated at the geeks’ table with my cousin Dave Cline and a few others. When the time was up, we were instructed to put down our pencils and hand in our papers. We were then told we had to remain seated for the next 45 minutes. That’s when school officially got out. The teachers didn’t want us wandering around campus, which I find reasonable now, but didn’t at the time. While sitting at this table, I remember discussing continental drift and other weighty topics while we lamented this boring challenge to our personal freedom. (Interestingly, the idea that continents drifted like pond scum was a concept that had virtually no hard data to back it up when introduced about fifty years earlier. But by the late 1960s, it was a hot topic with lots of valid evidence. 
I expect something similar to happen with the idea that neurons create knowledge. Or not.)</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Anyway, we were deep in debate when all of a sudden, I saw Ricky Morrison stand up across the room. He started for the main door, but three teachers literally ran to intercept him. By this time Ricky was a big guy, more than 200 pounds, and fairly lean. But the teachers were experienced with his brash physical behavior. They were ready to block the door. All of a sudden Ricky turned on his heels and went in the opposite direction. I’d seen this move before. Now he was in the lead. Ricky pushed through the emergency exit at the other end of the room and was gone. The alarm sounded. The door banged closed. The room became quiet. The teachers stopped and stared. 
The silence was broken by nervous laughter from some of the students.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">One of the guys at our table sneered with derision, and stated to no one in particular, “Retard!”</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">My cousin Dave countered with this observation which I’ll always remember, “That ‘retard’ is enjoying his freedom while the rest of us sit here in envy. So who is actually smarter? And who is retarded?”</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Dave was right. Here was a room full of students who failed to answer that final question on that day’s intelligence test, the one about personal sovereignty. Ricky was the only one who got it right. At that moment I realized intelligence depended on context and perspective. Sure, like Alexander the Great before him, Ricky broke the rules. But he also solved a problem. Ricky didn’t think outside the box, he lived outside the box. 
At least the socially acceptable box. This gave him an advantage. And he exploited it. That day, Ricky was no retard.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Even then I saw parallels with my computer work. I’d been studying Boolean algebra and had just finished designing my first ALU (Arithmetic Logic Unit). I had demonstrated it for the science review and was selected to present it to the local Rotary Club. Yes, I was that geeky, even in 1967, well before geeky was cool.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’d also been reading Freud and B.F. Skinner, but Desmond Morris’s “Naked Ape” was a favorite. I found human behavior fascinating, and like other computer geeks, wondered about the parallels between not only baboons and humans, but also between humans and machines. Even at that time, the possibility that a computer might become smarter than a human was being suggested. 
But the question at hand, the question presented by Dave that afternoon was, who was the smartest guy in the room?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ricky’s feint is often seen in football - head-fake an obvious objective, then break the other way. But Ricky wasn’t on the football team. He’d evolved this particular solution somewhere else. I’d seen the prototype years before. You might say Ricky’s weird habit in how he moved had tricked the teachers. I don’t believe he even thought about his actions that day. At least not like you or I might. He just wasn’t that concerned. He had the same instruction as the rest of us (remain in your seat), but he used a different personal script, something more primal, something more innocent. And in this case, something more effective.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I realized at that point, Ricky had the same desire to leave as the rest of us, but his behavior did not take into account the social contract. He simply didn’t honor those constraints, that inhibition. 
This allowed him to overcome this one minor challenge in his life by using a different script.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Perhaps we all have such alternative scripts, but how do they get triggered? When? And why? I’ve since spent most of my life in computer design and business management, but humans continue to fascinate me most. Like many technologists at the time, I wondered about the behavioral “program” we had in our brain, and why Ricky’s was so different from mine. What did he know that we didn’t? Or was it his lack of knowledge? I’ve since discovered that it actually depends upon the nature of knowledge. And the individual.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">One thing was certain, at that moment, for this particular challenge, Ricky was the smartest guy in the room. Or was he? Should I have used the word "certain" in the above sentence? Is any evaluation of the circumstances that certain? And should I remove “that” from this prior sentence when presenting a superlative? Or are all superlatives naive? Or even the word "all" in this last sentence. 
See how</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> language and</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> logic have a major blind spot? One can get lost for hours in such a paradox which can quickly be resolved by our right-mind if we let it. </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">Can certainty be expressed by degrees? </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Of course not, but before we get distracted, don’t we all have a bit of Ricky hiding somewhere in our skull? When the standard approach fails, don’t we revert to something more crude and powerful? When push comes to shove, doesn’t it make sense to bolt for the exits and push through the door? Ricky was just a bit more claustrophobic than the rest of us on this particular afternoon. 
And less concerned with the consequences.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Here’s a question to ponder in the back (or quite probably, front) of your mind while I set up the concept of decursion:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For this situation, was Ricky really the smartest guy in the room?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><h2 dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 10pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Contie</span></h2><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I need to present a bit of housekeeping before I move on. 
I could have titled this section “contexts”, but it’s hard to hear the subtle plural of context so I’ve stolen a method used for singular Latin words that end in “us”, I believe a trick applied by the Romans for a similar reason. The reason we need a plural for context is that the complexity of the brain demands it, otherwise we’ll get hopelessly lost. Since I’m about to play fast and loose with the definition of decursion, I might as well coin another term - contie. There. Doesn’t that sound better? At least there's a clear contrast, and isn't that bit of knowledge useful at times?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Seriously, the brain scales many orders of magnitude in complexity. If we’re describing one context, how do we differentiate it from another? For instance, Dr. McGilchrist can be said to have described the brain in a macro-context - that of an individual human, and that’s fine as far as it goes. But it doesn’t go far enough. If I arbitrarily deconstruct the brain into a thousand creatures and the process is useful, is it really arbitrary? Utility is the test. It will be applied over and over as I proceed. For instance, we could describe these creatures as living in a milli-context, and their tricks in a micro-context, with individual neurons described in a nano-context. 
That’s a lot of contie.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Going the other direction for a moment, applying decursion outside the skull could be described in a kilo-context for tribes, a mega-context for different cultures or nations, and maybe even a giga-context for the internet, plus or minus an order of magnitude. Or two. There’s no need to get too specific at this point. We don’t yet know how much room we’re going to need in order to model this Gordian Knot we call a brain. For the sake of this presentation, I’ll define the following contie each separated by three orders of magnitude, with some exceptions:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Macro-context</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> - an individual human, or a few people, and perhaps the two sides of the brain within the skull.</span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; 
font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Milli-context</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> - the realm in which our creatures live - the many layers of the brain.</span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Micro-context</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> - where these creatures’ tricks are implemented, mostly the connectome, about a thousand neurons per trick, plus or minus.</span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Nano-context</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> - within the neuron and between them, but not strictly so.</span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" 
style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This gives us nine orders of magnitude to describe the brain and scale the concept of decursion within the skull, which I'll address next.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">But before we leave the topic, I'll note that context and contie are more than just words. A context can also be understood as a bit of knowledge expressed by all the neurons that feed the neuron in question. Yep, context is knowledge too. I'll save the details for later. 
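To keep the contie straight as they come up, the whole scheme fits in a small lookup table. This is only my rough sketch of the scale of each contie in neurons involved - the "about a thousand neurons per trick" figure comes from the text, while the milli and macro counts are my own illustrative assumptions:

```python
# Approximate scale of each contie, in neurons involved.
# Only the micro figure ("about a thousand neurons per trick")
# comes from the text; the rest are illustrative guesses,
# each contie separated by roughly three orders of magnitude.
CONTIE = {
    "nano":  1,              # within (and between) individual neurons
    "micro": 1_000,          # one creature's trick, mostly the connectome
    "milli": 1_000_000,      # a creature - the many layers of the brain
    "macro": 1_000_000_000,  # an individual human, or a few people
}

for name, scale in CONTIE.items():
    print(f"{name}-context: ~{scale:,} neurons")
```

The kilo- through giga-contie outside the skull would extend the same table upward, plus or minus an order of magnitude. Or two.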
</span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><h2 dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 10pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: &quot;Trebuchet MS&quot;; font-size: 14pt; white-space: pre-wrap;">Decursion</span></h2><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“To understand recursion, one must first understand recursion.” - Stephen Hawking</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This paradox is literally a joke that makes fun of our left-brain. Our right-mind knows better because we obviously are able to understand recursion. Thus the humor. What exactly happens if you don’t have to define something in terms of itself? Or define it at all? Our left-brain goes around trying to define everything it encounters so it can be associated with a word and be manipulated, but that’s a naive behavior in many cases. Some things defy definition, yet obviously exist. 
Such as faith, conviction, or love.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Fortunately, these things can be largely understood without defining them using words</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> and are better managed with our right-mind.</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> Stephen Hawking demonstrates the limits of logic with his assessment of the recursion paradox. And he nicely captures the essence of the issue, which is all about capturing the essence of an issue. Before we end up in a hopeless tangle, let me unwind this pretzel logic by inverting recursion’s definition into something more manageable, and opposite, in a somewhat derivative and decurrent fashion.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you’re not familiar with recursion, it is the act of defining something in terms of itself. Recursion repeatedly applies the same mathematical or computational step until a base case is reached. It’s a little like (but not the same thing as) performing the long division algorithm over and over until the remainder is no longer significant. 
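If a concrete sketch helps, here is recursion in miniature. The factorial function is my own illustrative choice (Python), not an example from the text:

```python
def factorial(n: int) -> int:
    """Factorial, defined in terms of itself."""
    if n <= 1:                       # base case: the descent stops here
        return 1
    return n * factorial(n - 1)      # recursive case: wind down toward 1

print(factorial(5))  # 5 * 4 * 3 * 2 * 1 = 120
```

Each call defers its multiplication until the base case answers, and the pending work then unwinds back up to the final result.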
In contrast to long division, with recursion, the process is then unwound to provide a final answer. Recursion is one approach to solving various computer problems. Coastline fractals, of the kind Mandelbrot famously measured, are a visual example. Or seeing how a fern frond appears to be made up of smaller fern fronds, decurrently. Another example is the Romanesco cauliflower. Biologically similar structures can be successively deconstructed at each level, recursively. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">McGilchrist describes our left-brain as being somewhat recursive in its approach to defining the world. I began to wonder what our right-mind’s alternative approach might be, which led me immediately to appropriate the word decursion and apply it in a contrastingly new context - the brain itself.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white;"><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As the inverse of recursion, decursion</span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-style: italic; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> constructs </span><span style="color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; 
white-space: pre-wrap;">from the smallest fern stem element in steps upward to the whole frond. Both recursion and decursion contain the essence of the object in each context. It would actually be more consistent with other "re" and "de" prefixes to swap these two, but that would be too confusing at this point. It’s mostly a matter of where you start - at the bottom or the top. Evolution and biology more likely started at the bottom. Decursion honors that approach. And it has more utility for our current challenge than left-brained recursion, as I’ll shortly demonstrate. We also don't have to worry about the "turtles all the way down" issue when the sky's the limit.</span></span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Decursion is the opposite of recursion in several important ways. Recursion deals with the issue objectively and at arm's length, much like the concept of stimulus-response. Decursion honors the artist's more subjective approach. Decursion is the right-mind's more intimate and expansive alternative to our left-brain’s reductionist approach. 
Instead of turtles all the way down, it's knowledge creation methods all the way up.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Decursion replicates a base method, but not in a mathematically or logically definitive way. Decursion is not simply derivation either. Instead, it mimics that base case with increasing adaptation and sophistication. I believe that once evolution discovers a new trick, it doesn’t like to let go; it applies that trick over and over in a decursive fashion. </span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">About a billion years ago, evolution discovered a new way to evolve by inducing movement in response to knowledge about the world. Animals came into existence at the same moment as knowledge. Going forward, many lifeforms did not have to die in order for evolution to proceed. 
It was the beginning of a new kingdom and a new era, decursively.</span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As evolution applies decursion, it captures the essence of the thing, just not in a perfectly defined form or algorithm. More significantly, decursion provides a template for understanding the brain. And it doesn’t stop there.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Here is a grim but useful way to think about decursion - war. I've mentioned that neurons seem to both compete and cooperate to create knowledge. Putting aside my assertion for a moment, think of the political and practical aspects of war. 
It's largely implemented as acts of</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> competition</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> and</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> cooperation</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">. Let's take the idea a bit further and explore the parallels with the modern world of sports, where</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> both</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> teamwork and individual champions are important factors in success. Sports become a metaphor for war because they are decursive of war (and likely also the inverse - another chicken and egg?). War is decursive of sports. 
Many battles between primal tribes were much like sporting events, even when players were injured or even killed.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This decursive template not only describes the neuron, micro-brain structures, and the divided brain itself; this evolutionarily decursive expression</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> also</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> escapes the skull and finds form in our language, art, media, dance, government, and even finance. Verbal language is decursive of neurons creating knowledge; written language is decursive of verbal language. Memes are a right-minded version of both. For me, decursion provided the map of developmental history which I desperately needed, a sort of Rosetta Stone translating between contie. The concept also finally yielded some useful metaphors of the brain. 
Here’s one example:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Instead of thinking of the brain as a computer, think of it as a Wall Street stock exchange where each of millions of remote investors is a neuron with their own unique ways of predicting the market in a nano context. They each "know" their version of how and when to buy low and sell high. Thousands of market managers in New York channel these decisions to a few brokers on the exchange floor where in a micro context these methods and systems compete and cooperate to find the highest and best use of capital as they dynamically price any given stock issue. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At least that’s how it used to happen. Nowadays brokers are automated, but in the past, they too were humans representing neurons. 
When you stand back from the process, these</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> competing</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> and</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> cooperating</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> investors with their own financial methods can be thought of as creating useful knowledge, aka the investor's alpha. But until you understand how each method works, they are mystical tricks of the financial trade. And they are tricks of evolution that have directed the capital needed for survival and the growth of civilization. No one investor has all the answers, or all the knowledge, but brought together as a system, it works surprisingly well. Most of the time.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">OK, how about another hierarchically convergent and decursive example of brain architecture? Think of our bicameral congress as a brain where each member represents a micro-context state, county, or precinct of nano-context voters, competing and cooperating to promote the best form of government in an ultimately macro context, defined as a bill that becomes the law of the land. Note how this example is steeped in dichotomy. 
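For the more technical reader, the exchange metaphor above can be sketched as a toy simulation. This is my own construction, not a model from the text: the "investors", their bias parameters, and the exponential weighting scheme are all invented for illustration. Each predictor competes for influence based on how close its predictions land, while all of them cooperate, through a weighted blend, to set one consensus price.

```python
# Each "investor" is a tiny predictor with its own fixed rule (a bias),
# plus a weight representing its current influence on the floor.
investors = [{"bias": -1.0 + i * 0.04, "weight": 1.0} for i in range(51)]

def market_estimate(signal):
    """Cooperation: the floor price is a weight-blended consensus of
    every investor's individual prediction."""
    total = sum(inv["weight"] for inv in investors)
    return sum(inv["weight"] * (signal + inv["bias"]) for inv in investors) / total

def settle(signal, true_value):
    """Competition: investors whose predictions land closer to the
    realized value keep more influence; the rest lose it."""
    for inv in investors:
        error = abs((signal + inv["bias"]) - true_value)
        inv["weight"] *= 0.5 ** error  # exponential penalty for error

# After a few settlement rounds, influence concentrates on the better
# predictors and the consensus converges toward the realized value.
for _ in range(20):
    settle(signal=10.0, true_value=10.2)

print(round(market_estimate(10.0), 2))  # close to 10.2
```

The point of the sketch is the architecture, not the finance: no single predictor has all the knowledge, but competition over influence combined with cooperation in the blend yields a consensus better than most of its parts.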
I'll be describing how the brain and even neurons do something similar, for similar reasons.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And if you want to understand the hierarchy of the brain, think Army. Some decisions are made by privates, others by captains, but the most important ones go all the way up to the general, or even the president. Neural layers work in a similar way, on both sides of the brain. Or both sides of the battle.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Here's another example of decursion. It's a metaphor of simulation that artists may appreciate - the theater. Performance art likely started around the campfire. It still goes on today in many forms. From mimicry to scary stories, the audience is asked to suspend disbelief as they listen to a narrative and watch the expressions of the storyteller. (For the more technical, this could be described as a form of sparse coding. I mention it to bring you scientists along a path. 
Artists can ignore the sparse aspect for now.)</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In time, such campfire performances became more formal oral histories, refining verbal language into a talent now largely lost to history. Fortunately, some of these stories were captured in written form tens or hundreds of thousands of years later. For how long was language limited to oral presentation? Perhaps a million years? And it took only a few thousand years more before actual theatrical scripts were written, capturing not only the words spoken, but also the stage direction and movement of Shakespeare's actors. This can be thought of as rebalancing McGilchrist's left-brain with a more cultural right-mind, which brings us finally to the theater.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Once a more formal stage was built and costumes made, the masks of ancient theater were left in the dressing room and suspending disbelief became easier. Modern comedy and drama are the results. Movies only refined and replicated the experience. Decursively. From campfire mimicry to modern virtual reality and electronic gaming, each of these art forms delivers an increasingly decursive version of the one that came before. How much emotion they evoke from their audience is a test of their quality. 
From pain to pleasure, from laughter to tears, what we see in the theater or the VR headset is a decursive re-cuing of what our ancestors experienced in life.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Sure, these performances sometimes have rough edges. And all of the above metaphors are rife with operational failure, but then so is the brain. As are the actual stock market and army. Need I say Congress is not immune? Fortunately, the brain makes up for it in each case with parallel resilience. And in various ways, so does the theater, the army, the stock market, and congress. Failure,</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> competition</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">, and</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> cooperation</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> are how all of these examples hone knowledge, decursively.</span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: 
#222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><b>Neurons create knowledge.</b></span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><b style="color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Cues are decursive of neurons creating knowledge.</b></p><div style="text-align: left;"><b><span></span></b></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><b style="color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Words are decursive of neurons creating knowledge.</b></p><div style="text-align: left;"><span style="background-color: white;"><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt;"><b style="color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Memes are decursive of neurons creating knowledge.</b></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt;"><b style="color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Maps are decursive of the brain's sparse signaling.</b></p><div><span></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt;"></p><div style="-webkit-text-stroke-width: 0px; color: black; font-family: "Times New Roman"; font-size: medium; font-style: normal; font-variant-caps: normal; font-variant-ligatures: normal; font-weight: 400; letter-spacing: normal; orphans: 2; text-align: left; text-decoration-color: initial; text-decoration-style: initial; text-decoration-thickness: initial; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px;"><span></span></div><p></p><p dir="ltr" style="-webkit-text-stroke-width: 0px; color: black; font-family: "Times New Roman"; font-size: medium; font-style: normal; font-variant-caps: normal; 
font-variant-ligatures: normal; letter-spacing: normal; line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; orphans: 2; text-align: left; text-decoration-color: initial; text-decoration-style: initial; text-decoration-thickness: initial; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px;"><span style="color: #222222; font-family: Arial;"><span style="font-size: 18.6667px; white-space: pre-wrap;">And there are so many more examples - virtually every form of knowledge. Even t</span></span><span style="color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">his blog post is decursive of neurons creating knowledge. Think in terms of experience driving expression, useful or not. Going forward, I'll</span><span style="color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> be using the concept of decursion to help model the brain in the macro context, and also using these macro examples to inform the nano nature of the neuron. 
Keep an open mind.</span></p></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><h2 dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 10pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Tao of Zen and Zen of Tao</span></h2><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><a href="https://aeon.co/videos/what-zen-buddhist-riddles-reveal-about-knowledge-and-the-unknowable" style="text-decoration-line: none;"><span style="background-color: white; color: #1155cc; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; text-decoration-line: underline; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;">"What Zen Buddhist riddles reveal about knowledge and the unknowable"</span></a></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Buddhism, Tao, and Zen are religions roughly associated with India, China, and Japan. Putting the actual practices aside, I find some of the ideas most interesting, especially with regard to the brain. 
My generalizations about Tao and Zen will not satisfy masters of either, but the underlying concepts provide examples of how an important aspect of neuron and brain architecture has escaped the skull and found form in eastern culture, summarized as the wisdom of Buddhism.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For our purposes, I’ll present Tao as literally meaning “the path” or “the way”. Enlightenment may come from walking this path, an inherently serial process. A path can also be described as a technique, an algorithm, or a process. Tao is more left-brained, but not exclusively so. The article “the” implies the superlative, the singular. This one and only path to enlightenment (even if a different path for each person) is singular and superlative, and may have decursively evolved into the more monotheistic religions of the west. Our left-brain is always looking for “the one” as presented in the movie, “The Matrix”, an obvious reference to the messiah, a concept common across western cultures.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In contrast, the term Zen usually stands alone (or sometimes with a following association, but no article needed). 
In some ways, Zen is both the essence and exception of Tao, and vice versa, similar to the contrasts of the divided brain. Zen is often described by what it isn’t, as opposed to what it is. In contrast to Tao, Zen is knowing the nature of a thing without walking a path, an aspect we might attribute to the right-mind. Also, one person’s enlightenment need not be the same as another’s. And one solution or enlightenment need not preclude another. Thus, articleless Zen. But back to Tao for now.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In the micro context, neurons connect from one to another, forming pathways - actually, many pathways (in contrast with circuits, as I'll address shortly). We have tens of millions of neural sensors. We have only a few hundred muscles, but these muscles can be applied in complex sequences to form about a million scripts of movement, plus or minus, making the brain inherently convergent from sensors to muscles. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">To clarify, these largely parallel paths begin with tens of millions of neurons sensing the world and converge down to moving a few hundred muscles (or collections of muscles in a serial fashion). These muscle movements will hopefully affect the world in some way beneficial to the subject at hand. 
This affected world can then once again be sensed, completing a never-ending operational loop specifically including this individual within the world. Tao is serial in nature, and so exists over time in serial feedback loops. But not exclusively so.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The knowledge that Zen acquires lives in the moment, and so exists in parallel with other similar knowledge at any given moment. With Zen, we observe all aspects of a thing, struggling not to define it before we are enlightened by its nature. Or not. Then somewhere a neuron fires and we understand its nature. It all happens at once, with no steps to be taken. Time is not a factor. Zen does not exist in a temporal frame as Tao must in order to be sequential. This Tao and Zen dichotomy exists all through the brain in our nano, micro, millia, and macro contie, decursively.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In the nano context, a collection of inputs needs to be present at the same time in order for that neuron to fire (or at least within a primed window of time to push this metaphor from an instant to a moment). Zen occurs in that moment. 
Not the moment before, not the moment after.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In a micro context, thousands of inputs all happening at once can be thought of as the basis for what we call associative memory (which is not actually stateful memory at all, as I’ll present later). Before we lose the path of our Tao and digress into a mess of logic, let’s combine this Zen moment with our Tao path to yield a convergent hierarchy, the most common “network” in the brain. I use the term network here loosely in that the brain’s hierarchies are obviously not orthogonal, nor even very consistent. But they must ultimately and inherently be mostly convergent, as is the result.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In a macro context, these two competing approaches yield hierarchical</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> competition</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> and</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> cooperation</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 
14pt; white-space: pre-wrap;"> as the two sides of the brain operate in parallel, and also serially, not unlike the way McGilchrist describes them. The left brain tends to engage the world in a serial fashion, but not exclusively so. It helps us to manage time and provides a temporal framework for language and our view of reality. In contrast, the right-mind tends to engage the world in a parallel fashion, but not exclusively so. At that moment.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What happens at the level of the neuron also happens at the micro and millia network context as well as the macro-level of the left and right brain. Oh, and it’s the very same decursively replicated architecture we see in the stock exchange and congress as presented above. Tao and the left-brain are more rational, but not exclusively so. The left-brain tends to present a thesis using logic. The right-mind is more insightful as it challenges with the antithesis, but not exclusively so. Together they form a synthesis yielding wisdom. Hopefully.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If my abused definitions of these concepts replicated decursively in various contie seem a bit flighty, it’s meant to be. We’re looking at it from the top down. 
The objective at this point is to keep things general until we form a more useful framework for the brain. Decursion is one of the tools I’ve used to get there. If it doesn’t make sense right now, relax. When we build from the bottom up, the model will become a bit more obvious. For now, it’s time to revisit the smartest guy in the room. We’re not yet done with Ricky. Or Dave.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><h2 dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 10pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Smarter Guy in the Room</span></h2><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Thinking back at this point in my life, Ricky Morison not only inspired my interest in human behavior; in later analysis, his actions taught me something very important about knowledge. In a more subjective model of the brain, knowledge approaches truth like an asymptote or Zeno's paradox - it never arrives. Definitions are merely ways to grasp things. Superlatives, like truth, are aspirations. All are illusions, starting out as knowledge. 
Each can be useful in its own context.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As the door slammed shut behind him, Ricky was now free to do whatever Ricky does. The rest of us were left to realize what Ricky could not have known without imagination. In this case, Dave’s imagination. Dave had implied that Ricky was the smartest guy in the room, but wasn’t Dave the smartest guy in the room for having the imagination to recognize Ricky’s brilliance? </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">But wait! If Dave was the smartest guy in the room, then his recognition of Ricky’s superlative was invalid, obviously leaving Ricky and Dave in a paradox for the title.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">See how easy it is to end up in a left-brain logic trap? They happen more often than you might realize. We normally dismiss them using denial. 
In order to be trapped, you have to think about them logically, and then limit yourself to only those rules. And that’s the error we might make - logic. Fortunately, we typically ignore logic traps and deal with them using our right-mind. Otherwise, we would all be constantly getting stuck in some catatonic state. Some "deadly embrace".</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In any case, Ricky accomplished that which the rest of us only imagined, after the fact. And I believe he did it without using imagination at all. Ricky was obviously cheating, but was his cheating not intuitive? His solution certainly was simple, but obviously violated the rules. Which is the point. In a more Zen fashion, Ricky walked his Tao path right out of the room, and left conformance to the wind. The rest of us only observed. So does enlightenment flow from walking the path? Or simply understanding its nature? Does nothing matter until something moves? And does it matter how we gain the knowledge from this little exercise as long as we come to know it?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The point is, the standard for knowledge is constantly changing. As soon as it’s defined, it needs redefinition. Ask any day trader. Ask any Senator. 
Knowledge is as fluid and flexible as information is rigid and defined. Before debate, you may know one thing; after debate, the opposite. Though rare, it does happen. Minds do change. And for the little issues, constantly.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So, which makes someone smarter, the Tao of walking the path, or the Zen of knowing what it means? I would suggest that, like Zen, it is all of the above. And none. It depends upon your perspective. And to some degree, luck. The world is not deterministic.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">According to Occam’s Razor, the simplest solution is likely to be the correct one. Did Ricky have a breakthrough as he set off that exit alarm? Is this what the knowledge of enlightenment looks like? There’s no correct answer of course. There is no smartest guy in the room. But what might happen if we treated the brain as a collection of such challenges? Such koans? Such tricks? 
Might we find that brain model we seek?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><h2 dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 10pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Enlightened by Art?</span></h2><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“If you can’t replicate the work and get the same outcome, then it’s not science.</span></p><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you can replicate the work and get the same outcome, it’s not art.” - Seth Godin</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">To clarify the above quote, I’ll paraphrase another of my favorites originally by Theodore Roethke: </span></p><div style="text-align: left;"><span style="background-color: 
white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“Art is that which everything else isn’t.”</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Continuing from the last sections, this quote may seem a bit Zen. And that’s the point. Art is not a “thing” to be defined or managed by our left-brain. Art is often enigmatic. So is Zen. Art allows us to discover novel “things” which are initially undefined, but not exclusively so. As these new things move from our right-mind to left-brain, they become “grasped” and better defined over time. Once characterized, these tricks become methods. These methods are then applied in the steps of an algorithm to be repeated in the more serial fashion of a machine. Once their more consistent nature is established, they cease to be art and become part of science. 
To gain traction for this challenge, we will need to approach the problem as neuron-art in contrast with neuroscience.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The right-mind deals with novelty - things not yet defined. Artistic things are discovered in a moment of intuitive enlightenment. Once our right-mind shares these things by altering the focus of our attention, the left-brain defines them and uses them as components, or deconstructs them into their subcomponents, sometimes recursively. But not exclusively so. More later on how this transition (and many others) actually occurs as knowledge moves around the brain.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">After reading “The Divided Brain” and realizing that once again our left-brain had failed to objectively model the brain, I began to wonder what a more subjective, a more biological model of the brain might entail. McGilchrist inspired me to explore this possibility. How does our right-mind see the brain? What is the Zen of the neuron, as opposed to the Tao of a neural pathway? This last question became my personal koan. And the notion allowed me to tease out the first principle of the neuron - that neurons create knowledge. 
</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Zen is about understanding the nature of things without defining them. Zen lives in the realm of the right-mind. Zen is a muse. What does Zen have to do with the brain? And Art? For me, it was the simplest approach to forming a useful model of the brain: to treat the brain as a collection of evolutionary tricks to be characterized and defined.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As for defining art, the above Theodore Roethke quote simply means art begins where science leaves off. As noted above, the left-brain deals with science, but not exclusively so. The right-mind deals with the things in life which have not yet been defined, but not exclusively so. These proto-things can also be generalized as art. 
</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Oliver Sacks reached the same conclusion about understanding the brain, and mind:</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“What is this mystery which passes any method or procedure, and is essentially different from algorithm or strategy? It is art.” - from “Awakenings”</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Over the decades I have made many attempts to model the brain, but none of them felt right. None were viscerally satisfying. None of them left me at peace with the problem. 
Until now.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">That has been my test. Was this simply my right-mind objecting to my more technical left-brain conclusions? Does that matter, if we are able to build a model of the brain that is useful? What if we treat the brain as a Zen koan? What if we take a more subjective and indeterminate approach? What if we play with ideas instead of working with them?</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’m going to describe this gnostic model, hopefully without defining it, at least until we get down to the neuron. And even after that, I shall endeavor to keep things flexible as we come to understand evolution’s tricks and ultimately describe them as methods.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I believe many others, perhaps millions, have used this or similar approaches throughout history. 
Many have likely reached similar conclusions over time, but have described the experience as spiritual or even as enlightenment. Or worse, they have been unable to describe what they discovered at all, because of our left-brain’s reluctance to express the right-mind's discoveries in language. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">How many times have you been at a loss for words? This might have been more likely because what you wished to express was coming from your right-mind. Or you were not able to describe it logically or in a way that would satisfy your left-brain. Perhaps you struggled with some art form. But just because you couldn’t find the words doesn’t mean your conclusion was invalid or useless. It just wasn’t directly accessible by your left-brain. </span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I will take you along a path that is similar to the one I have walked both logically and intuitively, but not definitively. Still, science will not be ignored. Virtually everything I present will ultimately be testable with repeatable results. Or it's not useful science. It's mostly how I get there that will be intuitive. Our left-brain may not admit that we already have a simple model of the brain in our right-mind, but it’s there. 
The left-brain may not describe it in words.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><p dir="ltr" style="line-height: 1.8; margin-bottom: 0pt; margin-top: 0pt; text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">But I will.</span></p><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Continued:</span></div><div style="text-align: left;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></div><div style="text-align: left;"><h3 class="post-title entry-title" style="background-color: whitesmoke; font-family: "Trebuchet MS", verdana, sans-serif; font-size: 19.5px; text-indent: 10px;"><a href="https://suddendisruption.blogspot.com/2020/12/unlearning-brain-metaphors.html" style="border-bottom: 1px dashed red; color: black; text-decoration-line: none;">The Gnostic Neuron - Part 3 - Unlearning Brain Metaphors</a></h3></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><span style="background-color: white;"><br /></span></div><div style="text-align: left;"><br /></div></span></div>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-129135281596867352023-02-02T05:51:00.012-08:002023-12-31T16:13:36.329-08:00The Gnostic Neuron - Part 3 - Unlearning Brain Metaphors<p><span style="font-size: large;"><b>Unlearning Brain Metaphors</b></span></p><p>First posted 
02-14-22</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEgkq7BOJBXuq0cIOVF8XlRlOx8mXqzigZBd4YwdsT93jTFEHzSGnG6vNDvXDrAHF379JasNX4d8oU8kzCBLqUv9wPJ-FrE62KY3x6OiaDoL3GDoSuSMfTlVwXU-tBbamrll59AHzObKdvG_0AMXHkbKW-GEoXx0EmJHvANeCxmkBMJcUqjX6oY=s640" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="360" data-original-width="640" height="360" src="https://blogger.googleusercontent.com/img/a/AVvXsEgkq7BOJBXuq0cIOVF8XlRlOx8mXqzigZBd4YwdsT93jTFEHzSGnG6vNDvXDrAHF379JasNX4d8oU8kzCBLqUv9wPJ-FrE62KY3x6OiaDoL3GDoSuSMfTlVwXU-tBbamrll59AHzObKdvG_0AMXHkbKW-GEoXx0EmJHvANeCxmkBMJcUqjX6oY=w640-h360" width="640" /></a></div><p><br /></p><p><span style="font-family: "Trebuchet MS"; font-size: 24pt; white-space: pre-wrap;">Unlearning Brain Metaphors</span></p><span id="docs-internal-guid-283aa883-7fff-0056-4b4b-5715c8948fd5"><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Setting aside for now why we don't have a consensus model of the brain, I'm going to present one possible approach to resolving this deficit, without having to know for sure why it's missing. 
Let's start with the classic first lessons from Zen:</span></p><br /><h2 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><span style="font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Zen Koan: “A Cup of Tea”</span></h2><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Nan-in, a Japanese master during the Meiji era (1868–1912), received a university professor who came to inquire about Zen.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Nan-in served tea. He poured his visitor’s cup full, and then kept on pouring.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The professor watched the overflow until he no longer could restrain himself. “It is overfull. No more will go in!”</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“Like this cup,” Nan-in said, “you are full of your own opinions and speculations. 
How can I show you Zen unless you first empty your cup?”</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So it is with the brain. Our cup is certainly full of somewhat disorganized data, mis-metaphors, and preconceptions that simply don’t fit, which makes them worse than useless. They actually distract us from the true nature of the brain. And not in a good way. We need to set them aside. Or somehow distract ourselves from these distractions. It’s not easy. Try not to think about a flying pig while we discuss how such a pig might be made to fly.</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">How will we empty our cup? For me, this began decades ago when I began to note the differences between neural biology and computer architecture. I’m in no way an expert in biology but am intimately familiar with how a computer works. For me, what I discovered about the biology of the brain took a long time to accept. Your mileage may vary. Even so, the contrasts are striking, and I’ll try to make them even more vivid. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I started this process by literally setting up lists of what brains and computers have in common, and what they have in contrast. I suggest you also start such a list for yourself. It will help you find conviction when challenging some very ingrained ideas - such as the brain being electrical in its nature. Ultimately, vivid descriptions won’t be enough. Even once you realize how the brain is not like a computer, the knowledge does not tell you much about what the brain IS like. Our brain does not like an information vacuum. This won’t be easy.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The brain will try to hold on to distracting associations until you can distract it with new and better ones - that is, until we discover reasonable alternative explanations and get them firmly in place. I had to force myself to ignore these default metaphors, all of them, and treat the brain as a complete mystery, which was the conclusion I ultimately came to only a few years ago. It’s not an easy process and is best described as unlearning, as the Zen koan above nicely illustrates.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Once I began to let go of these preconceptions, things actually started getting easier, less frustrating. And more fun. 
I’ll provide many examples, but ultimately, you’ll have to reach your own conclusion on how this aspect of understanding (and misunderstanding) goes to the core of our missing model. If you’re a technologist, this section will be more challenging. If not, you have an advantage. You’ll have less to unlearn. Now let’s address this more mystical approach before we deal with the electrical issues.</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Brain As Mystery </span></h3><div><br /></div><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Treating the brain as a playful mystery has many advantages. It allows for a wonderful flexibility as we work our way back to more useful generalizations. By doing this, my casual model of the brain soon morphed into a collection of evolutionary tricks not yet understood, at least not by me. It’s a humbling experience after decades of serious study, and not an easy frame of mind to maintain. But instead of being something I had to work at, I let the brain become a toy, something for me to play with conceptually. My Assertion Salad in the final post was just one way of generalizing. Nothing was sacred in my thinking, and this remains the case. 
It’s also how science in general should be treated, or else science may blind you to the actual nature of things.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Mystery, of course, is only one way of looking at the problem. But so far for me, it’s yielding a much more satisfying model of the brain, even if it’s only a simple, cursory, and casual one. So suspend disbelief for a time. Forget everything you know about the brain. This approach is actually one of the more powerful aspects of our right-mind. Keep your options open. Now, what is the nature of these “tricks”?</span></p><div><span><br /></span></div><div><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ultimately applied in physical form, evolutionary tricks have been created by various proto-creatures down through our phylogenetic history. I refer to these critters as proto-creatures because they are not the same animals that currently walk the earth. Every modern species we observe today has been evolving for exactly the same amount of time. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">These creatures are the leaves of our evolutionary tree. In contrast, proto-creatures emerge at the branch points of our evolutionary tree. They are the prototypes of what we see in their current form, but each in turn has a phylogenetic history that can be best described as a long line of proto-creatures, one at each branch. And each has found a way to survive by evolving various tricks. 
This of course includes primates along our particular path through evolution out to our current leaf - humans.</span></div><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This evolutionary path has many branches and alternatives, not all of which are in human evolutionary history. For instance, there are at least two versions of eyeballs. In some ways humans got the inferior ones. Also, bioluminescence has independently evolved at least 40 different times, so it shows up all over the evolutionary bush. These are just two of evolution’s tricks. There are many, many more, and one need not preclude another. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><div><span><div><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><b>"The purpose of thinking is to let the ideas die instead of us dying."</b> - Alfred North Whitehead</span></div><div><br /></div></span></div><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Another idea I’d like to present is the high cost of Darwinian evolution. It's expensive. A whole lot of critters have to die in order for a very few to change. For this reason, I believe that evolution has evolved a more cost-effective way to evolve, and the brain is at the focus of this effort. 
Though truly wondrous, the brain is just another one of evolution’s tricks, or better understood as a collection of them, each evolved by a different creature in our evolutionary past.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It’s important to note, these are not a magician’s tricks. A professional magician’s performance is all about human deception. Fortunately, evolution has no such agenda (yes, I’m subjectively anthropomorphizing evolution itself as it helped me understand its nature). Evolution’s agenda is replication and survival. Deception in this context is not typically important. These tricks are merely clever ways of doing things. Evolution has mostly left these tricks out in plain sight for us to discover and observe. Once we understand them, they become methods for our left-brain to engage, but for now, let’s leave most of these tricks as playthings for our right-mind. This will involve many things not well defined, as well as the relationships between these things, especially other humans, as they are the things in our lives that we come to know most intimately. I’m of course talking about personal feelings - love, hate, anger, and joy as the shortlist. These too will become our playthings.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Before I leave this proposal to treat the brain as a complete mystery, I need to contrast the concept with the engineering approach to dealing with the unknown. Engineers like to treat any subject in question as a “black-box” whose inputs are to be activated while observing its outputs. 
The concept is useful for teasing out the more consistent aspects of any object at hand, but less so for more dynamic challenges, or when we limit ourselves to being mere observers.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In the neuroscience community, this black-box approach is known as “stimulus-response,” the distillation of Pavlov's conditioning. It’s one way of exploring the brain from the outside - a more objective engineering perspective. The approach has actually been popular since Galvani first applied electricity to that frog’s leg, but since has mostly led to frustration. An alternative might honor the differences between our left-brain and right-mind by suggesting a less consistent but more useful model. Our left-brain sees the world and even neurons as a collection of things (or tools) to be manipulated. Our right-mind knows better. We do not control neurons. Neurons control us.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Try replacing the cause and effect of "stimulus-response" with "sense-decide-signal." Who decides when to signal? The neuron of course. This reflects the knowledge it's created and the sovereignty of that neuron, but I'm spoiling the plot. Try using the more macro and theatrical concept of, “cues and scripts,” where the actor makes the decision. 
This fresh perspective might be described as subjective versus objective since the subject neuron is in control, and not the investigator. I will explain in more detail shortly.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As noted, we can approach these mysteries as evolutionary tricks and then treat them as Zen koans standing between us and enlightenment. Our unlearning objective is to leave behind our preconceptions of the brain, especially those having to do with technical metaphors. I have spent my life working with electricity, electronics, logic, and computer architecture. I’ve only spent a few years without them when thinking about the brain. This is still not easy for me, but I will do my best to not distract you from the true nature of the neuron as I distract you from these unfortunate metaphors and the reality that electricity is actually anathema to the neuron. So let’s empty our cup. And also have a care as to which koans we embrace. We don’t want to get lost in the details before we form a useful framework. </span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">False Metaphors and Distracting Words </span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Most neuroscientists think of neurons as logic devices or memory elements, even when their background is in biology or medicine. They apply these more technical metaphors without understanding the differences in depth. 
Brains are not state machines, nor do they conform to information theory in many respects.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">For most of my life, I too shared this view. But over time I've come to learn that neurons have far more in contrast than in common with such metaphors. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">If you're like me, you may have a feeling that there's just something about this tech approach that doesn't seem right. We need a new way of thinking about the problem.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you’re a student of neuroscience, this “artistic” approach will stand in stark contrast to almost everything you know about the brain, especially the words and metaphors we use to describe the brain such as spikes, conduction, hard-wired, and anything to do with electricity. For this reason, I’ll find alternatives for the more blinding technical concepts as I proceed. 
For now, I want to point out the most distracting terms without losing the advantages of distraction for lateral thinking.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">One exception to misapplying technical terms is the word "circuit". With electricity, electrons tend to flow from, quite literally, the ground or negative terminal of a battery or other power source. Using metal pathways, these electrons may pass through all kinds of convoluted logic before ultimately returning to the positive terminal of the battery, completing a circle of sorts. These metal pathways are best described as circuits, not unlike a Circuit Judge who would travel from town to town managing cases until he returned to the city Courthouse in days of yore.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Confusingly, the brain does something similar but not in an electrical sense. Instead, we have neurons with long axons which (almost) connect to other neurons in an analogical fashion, one to another, in complex ways. 
A simple version would be neurons sensing the world, which then might trigger muscle movement creating simple behaviors, which in turn might affect the world in some way, which neurons once again</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> sense</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> to start the process over. This ultimate circle of activity between the brain and the world forms a circuit of sorts, but not an electrical one. Instead, you might call it an analogical circuit of knowledge as I'll present. So circuits in the brain are a useful concept and description, unlike most other computer metaphors.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Fortunately, there are also other non-technical descriptions of the brain that apply surprisingly well in that they are more intuition-based, being inspired by the right-mind. For instance, “tension” better describes how we feel as our neurons prime for movement, in contrast with “action potential”. Yes, “potential” is a more accurate term for quantifying electrical charge, but we need to let go of our electrical metaphors. And the “action” part is archaic in that most neuronal firing does not lead to actual muscle movement, potential or not. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Lately, even the term “firing” is being replaced with “spikes”. This is likely because that’s how ionic charge appears on an oscilloscope. 
Since the pervasive use of electrical metaphors is part of our problem, I’ll mostly stick with the older metaphor of “fire”, partly because of how a gun is triggered (and more recently, people). </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Seeking more useful metaphors, consider that humans have been managing fires for more than 400,000 years. The propagation of neural signals across the brain has far more in common with wildfires (and backfires) than with spiking electricity. Fire yields direct experience, so it is the more primal and intuitive term. I will apply it generously. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’m also going to retrieve the word “bit” from the tech world. With few exceptions, when I use this description of quantity I will not be referring to a binary digit, but to the older sense of a small amount of something - in most cases, a small “bit” of knowledge, which, as you’ll see, is very different from a binary digit. These two contrasting bit definitions still have much in common in that they both represent an elemental quantity. 
Contrasting the digital with the analog versions of "bit" will become very important as we proceed.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And since we’re on the topic, you’ll see that I use the word “cue” to describe the cascade of neuronal firing, at least until we get to the business end of any neural pathway which I describe as scripts of muscle movement. And that more modern use of the word “triggering” will certainly be helpful depending upon who is pulling the trigger. I’ll note differences in other word use as we proceed. Now let’s address the most egregious distraction in trying to model the brain:</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Neuron is Ionic, Not Electronic, Nor Even Electrical</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And the brain in general is neither; it biologically relies</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> on chemical signaling.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Electricity and electronics is all about the movement of electrons in 
conductors and semiconductors. The neuron has neither; it relies on ion migration to work its magic, and on chemistry for actual signaling.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If your background is in technology or even medicine, you may find the above assertion challenging. But the more you learn about electricity and electronics compared with the ionic nature of biology, the more startling the contrasts become. Most everyone has some confusion as to the differences between ionic and electronic. Thinking about the brain in electrical terms distracts from the neuron’s actual, and simpler, ionic nature. Hans Berger’s development of the EEG (electroencephalogram) would have been of far more value if he had known about ion migration in the axon’s membrane instead of interpreting the cause of brain waves as electrical in nature.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And it’s not just the tech world that harbors this confusion. It’s our broader culture in general. Hollywood presents electricity as the magic spark that restarts the heart, much like the Frankenstein story. And it works. Sort of. Sometimes. Well actually, not very often. Restarting the heart with electricity is nowhere near as effective as depicted on television, where a jolt to the heart usually does the trick. In reality, not so much. And there’s an important reason for these failures to re-spark life. Electrical stimulation is actually an abomination to most biology (some aspects of knitting bones possibly being an exception). 
But ionic sensing can be of great utility in various ways.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Inside the brain, electricity is anathema to the neuron. It’s only in the cells’ recovery from this type of assault that a defibrillator sometimes restores the heart’s biologically ionic rhythms. Note that even “electric” eels use “electricity” as an offensive weapon. Ironically, even this weapon is mostly ionic in its generation, but my objective here is not to actually argue the issue scientifically. I just want to raise a doubt and provide a fresh perspective - the un-electric nature of the actually ionic neuron and chemical brain. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Let’s back up a bit. You may have heard that in 1780 a guy named Luigi Galvani touched a scalpel to a frog’s leg just as a spark of static charge made the frog’s leg move. This became known at the time as “animal electricity”, and the metaphor still haunts us today. A fellow Italian named Alessandro Volta set out to replicate the work but attributed the electricity to the metal of the scalpel and not biology. In the process of proving his point, Volta invented the electrical battery, giving him an advantage in the debate. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ironically, Galvani and Volta were both right and both wrong in various respects. Batteries are mostly ionic. It’s the wires and silicon outside of the battery that deal with electricity and electronics. Also ironically, Galvani was correct to think that the “electrical” response of the frog’s leg was quite different from what was happening in Volta’s metal wires, or even his own metal scalpel. Again, the contrast is the same as comparing what happens inside of a battery (ionic) with what happens outside of a battery in wires and silicon (electronic). This contrast is critical to understanding the actual nature of the neuron. I suggest diving deep into the topic if you have any doubts. The rest of us will play with ions a bit.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ask yourself this: is the nature of the brain really electrical in the same way that the telegraph or computer is electrical? If you understand both even modestly, that answer has to be no, not at all. The brain does not rely on electromagnetic propagation. Instead, it partly uses ion migration for signaling, which is dramatically slower. What is often referred to as "electrical" charge is actually ionic charge, and its detection is merely a side effect of what's really important to the neuron - cascading ion migration. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The signaling may seem similar, but the medium of the signaling is dramatically different. 
It just so happens that ions within neurons are somewhat similar to those within a battery, but axon firing has almost nothing in common with the signals carried by electrons in metal wires. The point is, electrons are not the critical element involved in the operation of the neuron or the brain in general. But ions are. And that is figuratively, literally, and physically a very big difference. Electronics is all about the physics of the electron and signals using electromotive force. Ionic signaling in the neuron is far more chemical in nature.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">You may have heard that if an electron were a pea, most ions would be the size of a dump truck. And these dump trucks move through channels in the axon’s membrane in a cascading fashion, like dominoes falling in turn. This action delivers a signal quickly in human terms, but these cascading ionic signals come nowhere near the speed of signals in a computer. If none of this makes any sense right now - great! Ignore the difference for now. You’re halfway to unlearning the electrical nature of the neuron. Here’s an easier way to think about ionic neural conduction - don’t. Think chemistry instead.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Outside the neuron, the brain is best described as biological, with most neural communication delivered by tiny puffs of chemistry at the synapse where neurons almost touch one another. 
Only within neurons, and only because ion migration within the axon membrane polarizes (and depolarizes) these internal liquids, do we need to discuss charge at all. If it seems like I’m splitting hairs on this topic, I'm not. The differences between ions and electrons are dramatic and very important. Ionic charge from this “dump truck” (plus polarity) is actually the opposite of our electron “pea” (minus polarity). This is the same plus and minus you’ll find on the ends of a battery. As you may know, it’s important not to confuse the two ends of a battery, and it’s even more important not to confuse the two types of charge within the neuron.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The practical difference between ionic and electrical charge is the same difference between cascading ionic polarization moving at about 200 MPH, and electromagnetic propagation which occurs at the speed of light - or at about 1,000,000,000 MPH (I rounded up to a billion miles per hour to make things more dramatic, but the value is fairly close). The point is, this is a huge difference in signaling speed. And again, if none of this makes any sense - even better! Ignore these electrical details with impunity! You’re well on your way to becoming innocently unlearned.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">OK, only one more comparison of what happens inside of a battery with what happens outside of the battery. Inside, ions migrate and build up a charge. Outside the battery, electrons move freely along a metal conductor. 
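</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you want to see just how lopsided that speed contest is, a few lines of arithmetic will do it. The 15 cm brain width below is my own round assumption, as is treating 200 MPH as one single cascade speed:</span></p>

```python
# Rough arithmetic for the speed gap described above: a ~15 cm brain
# crossed by cascading ionic polarization (~200 MPH) versus
# electromagnetic propagation at the speed of light.

MPH_TO_MPS = 0.44704                  # miles per hour -> meters per second
BRAIN_WIDTH_M = 0.15                  # assumed brain width: ~15 cm
ION_CASCADE_MPS = 200 * MPH_TO_MPS    # ~89 m/s
LIGHT_MPS = 299_792_458.0             # speed of light in a vacuum

ionic_time = BRAIN_WIDTH_M / ION_CASCADE_MPS   # roughly 1.7 milliseconds
em_time = BRAIN_WIDTH_M / LIGHT_MPS            # roughly half a nanosecond

print(f"ionic cascade: {ionic_time * 1e3:.2f} ms")
print(f"electromagnetic: {em_time * 1e9:.2f} ns")
print(f"speed ratio: {LIGHT_MPS / ION_CASCADE_MPS:,.0f} to 1")
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Milliseconds against nanoseconds - a ratio in the millions. That is the gulf hiding behind the casual word “electrical.”</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">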
There are no such metal conductors inside the brain, and none inside of a neuron. Because of its ionic nature, the brain acts like one big chunk of metal in electrical terms. An electrical signal will cross the entire brain in a nanosecond without discrimination as to type of tissue or fluid. Nothing in the brain insulates or isolates such possible electromagnetic propagation. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In contrast, myelin somewhat limits ion migration out of a leaky axon. Water metaphors - leaky hoses - are far more useful for the neuron than wires. Myelin is not an electrical insulator. There are no insulators separating these fibers in electrical terms. By the way, none of these details are new to science, but it may seem that way depending on how much you rely on electrical metaphors in your thinking.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Unfortunately, this ionic nature of the neuron allows electricity to flash across a brain as quickly as a bolt of lightning, disrupting the delicate ionic balance of each neuron in its path. For decades electro-shock therapy was applied to “reset” the brain. The result changed brain operation in random ways for a time as the brain recovered its ionic homeostasis. 
Neural damage remains an open question for this type of therapy, though the brain’s slow return to ionic homeostasis is consistent with the gradual clinical recovery seen in such patients. But again, arguing the point is beyond the scope of what I’m presenting, so I won’t. Instead, I’ll just challenge your assumptions as we proceed.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It’s true that what appears to be electro-”motive” force (EMF) radiates from neurons (and their axons). But it’s actually ionic charge that’s being detected. It’s the opposite in polarity, and it doesn’t “move” much at all. It’s this ionic charge that we measure with an EKG (or EEG as noted above). But this electro-”motive” force has nothing to do with most neuron-to-neuron signaling, which happens chemically, not ionically, in virtually all cases. Indeed, if you characterize the neuron as a black box, you can ignore ions completely. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Even more confusingly, if electricity is applied to the neuron in any way, it actually invalidates the operation of that neuron, at least for a time. In some cases it may even be harmful, as noted above, but there’s no need to go into detail here. 
If you're not familiar with the difference between electronic and ionic, don't worry - many neuro-technologists aren't either, or the metaphor would not enjoy such wide acceptance. So relax. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">We’re just having fun for now. Here’s how I came to know the neuron.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And also, how I came to unknow it.</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Snakes and Neurons </span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">During the summer between my sophomore and junior years of high school, I returned to Tucson and worked with my grandfather at the Skyline Country Club. He was the greenskeeper at their golf course during and after construction. We’d go to work at 5 P.M. to avoid watering in the worst of the summer heat. I’d spend my nights driving around the desert carrying sprinkler heads and turning water valves on and off in the dark. This was long before the electrical automation of watering systems. And we didn’t bother with headlamps. 
Such lights were far heavier and more awkward than they are today.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At the time, Skyline was a new course and only a few of the paths were paved. Golf carts weren’t fast enough to get the work done and most of the roads we used were raw desert sand and rock, thus the need for something rugged. It was only a couple of decades after World War II and surplus Jeeps were a cheap solution. This job was also where I learned to drive as I didn’t have my license yet.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I especially appreciated that Jeep. It kept me up and away from the rattlesnakes. The headlights helped. At least most of the time. One night while carrying an armload of sprinkler heads I actually stepped over a rattlesnake in the dark. I was walking with my grandfather at the time. His eyes were better than mine and he pointed out that I was safe. It couldn’t strike until it had coiled. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As I said, this was a new course. All the fresh water had brought lots of animal life out of the Santa Catalina mountains. And rattlesnakes followed the other critters. Stepping on a rattlesnake was only one of the dangers. 
My grandfather had a side business of capturing the snakes and selling them to the University where they did who knows what. Several times I had to hold a gunny sack in the dark while my grandfather dropped a snake in. Once he even caught a coral snake, but it was small and somehow got out of the sack in the cab of my grandfather’s pickup. We never did find it again. For a while, I rode with my feet up on the seat. My grandfather didn’t seem to mind the snakes.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">After sunset, we’d stop on top of the hill above the maintenance shop and have “lunch”. This was also the far end of the parking lot for the main clubhouse. While we ate, my grandfather would tell me stories about the coal mines of Kentucky. He was my age when he first worked in the mines and noted that they were dangerous enough, but union sabotage and company bulls with clubs and guns during the labor disputes were a far greater risk. It’s one reason he started a tie mill. This allowed him to get away from the picket lines. He described some of the violence. He said snakes and cave-ins were a minor threat in comparison to what humans could do to each other. He noted that because of such conflict and war, humans were the most dangerous animal on earth. I’ve since learned that mosquitoes are worse. But humans are a close second. 
Rattlesnakes don’t even make the shortlist.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">On one of those nights at the golf course, I finished my change-ups early so was taking a break in the far corner of the clubhouse parking lot. The sun had long ago set and there was an excellent view of the city from this very dark location. A large thunderstorm was moving up from Mexico. Lightning flashed around the edges to set the mood for an unlikely encounter. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Just then a car pulled up a couple of spaces over. The headlights had not illuminated my position. A guy got out and opened his trunk. I think that’s when I startled him. A high school kid in a surplus army jeep was not what he expected to find in the dark. I put him at ease by explaining I was waiting for my sprinklers to finish. Perhaps to regain his composure he began a conversation that started with the lights of the city beneath the storm but quickly shifted to something far more enlightening. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It turns out this guy was a grad student at the university. He was studying neurons. Since logic was my current fascination, I asked him how neurons might perform a logical evaluation. 
I already knew that neurons moved signals asymmetrically across the cell but didn’t know how logic was integrated into the process. </span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Asymmetric Communication </span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Asymmetry in this context simply means that signals generally come in one side of any neuron and go out the other, from input to output, not unlike a logic element sending signals over wires to another logic element. The point is, the signal does not seem to internally back-propagate, at least not directly. Setting a backfire in a forest has a similar effect. If you have a bit of wind, the fire will only burn in one direction. So it is with the firing of a neuron. The fired signal tends to go in one direction, from input to output and onward to the next inputs (but as a chemical signal, as noted,</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> not</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> electrical).</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I had already designed an ALU (arithmetic logic unit) and understood such logic intimately, so I asked how neurons might be connected to perform these functions normally associated with a computer. 
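</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For readers who want the textbook answer I was fishing for: the classic idealization is the McCulloch-Pitts threshold unit of 1943 - sum the weighted inputs, fire if a threshold is reached. The sketch below is that idealization in a few lines, and precisely the kind of logic metaphor this essay will later ask you to unlearn:</span></p>

```python
# McCulloch-Pitts style threshold "neurons" wired up as logic gates.
# This is the classic idealization, shown only to illustrate the metaphor.

def threshold_unit(weights, threshold):
    """Return a unit that fires (1) when its weighted input sum reaches threshold."""
    def unit(*inputs):
        total = sum(w * x for w, x in zip(weights, inputs))
        return 1 if total >= threshold else 0
    return unit

AND = threshold_unit([1, 1], threshold=2)   # both inputs must fire
OR  = threshold_unit([1, 1], threshold=1)   # either input suffices
NOT = threshold_unit([-1], threshold=0)     # a single inhibitory connection

def XOR(a, b):
    # exclusive-or needs a small circuit of such units
    return AND(OR(a, b), NOT(AND(a, b)))
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It works on paper, which is exactly why the metaphor is so seductive - and, as we’ll see, so misleading about what real neurons do.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">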
I think my understanding of logic and its parallels in the brain was his second surprise of the night. I’d been studying electronics since fourth grade. I asked how neural logic might be a factor in controlling behavior. He’d obviously also thought about the topic extensively and described the neuron to me in the following way. It’s how I came to both understand (and misunderstand) what neurons did:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you’re not familiar with the neuron, you have an advantage, and far less to unlearn, so I’ll keep this simple. Our bodies are made up of trillions of cells in a bag of salt water. These cells may be muscle cells, fat cells, and of course, neurons, among other types. One of the first and most impressive tricks of evolution is the cell membrane. It allows the cell to control, in many respects, what to let in and what to keep out. Each cell is contained within its cell membrane. Muscle cells allow the body to move in various ways, expressing behavior. Fat cells store energy for later use. There are trillions of other cells in the body that perform many other evolutionary tricks. Neurons number in the tens of billions. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Like all cells, neurons have a cell membrane, a nucleus, and ways to provide energy for the cell, but they also do something no other type of cell does (at least not at the speed that neurons do it). This special evolutionary trick is that neurons signal other neurons using very small bits of neurochemistry, which ultimately tells muscle cells when to move. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The grad student that night described these signals as traveling asymmetrically, from input to output. I asked how neurons might electrically encode information. (I didn’t learn about state machines until much later.) As it turned out, he didn’t know the answer. To this day, no one else has been able to model how neurons might encode logic “states” representing information. That’s because they don’t. The closest thing a neuron has to a “state” is a dynamically evolving sensitivity to specific conditions in the world, and since this sensitivity is different each time the neuron fires, it’s not technically a state, or memory in the conventional sense.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The brain does shift moods as hormones wax and wane, and neurons too dynamically change sensitivities in a similar fashion, but describing either as states would be inaccurate. Yet both are a form of chemical signaling in the nano and macro context. Chemical diffusion typically has an aspect of temporal inertia, but the result is not very state-like. Instead of “encoding”, neurons have a way of evolving this sensitivity using something I’ve come to call analogic, which I’ll describe shortly.</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> Consistent “states” encoded either electrically or chemically have yet to be found in the brain, but neurons clearly signal one another. This raised an obvious question at the time, and still does - what does this signal mean? 
Information theory does not deal with the meaning of signals specifically, but it does require that both sender and receiver agree on what a signal means for it to be useful. Neurons do not.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">Understanding what these neural signals mean, and how they can create simulations of reality, is our objective. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">How neurons know when to tell the muscles to move (and how much) is critical to such simulations, but for now, let’s just describe neurons as asymmetrical chemical communicators of knowledge. This simply means that neurons create and deliver signals along their most significant fiber, called the axon, but generally in one direction - asymmetrically. These axons may be much shorter than a millimeter or up to several feet long. They tend to branch out and connect to other neurons at the far end (often far away from the source neuron’s cell body, but not exclusively so).</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Neurons also have input fibers. These are called dendrites and are much shorter, tending to form close to the neuron’s cell body, or soma. Dendrites can be thought of as relatively short whiskers on the input side of the neuron. This is where the real magic happens. The axon typically protrudes out the other side of the neuron at a bulge called the hillock and extends for some distance as noted. 
The hillock is also known as the “trigger zone” and will be quite important when we get around to understanding how neurons create knowledge. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Dendrites support even smaller fibers called spines, which host connections called synapses, where other neurons’ axons almost connect at the input side of the synapse, but not quite. There is a very small gap in the synapse between the axon’s part of the cell membrane and the cell membrane of the next neuron’s spine and dendrite. This gap plays a critical role in the communication of any two neurons. It’s where the firing neuron delivers a very small amount of chemistry to the next neuron. This chemical is described as a neurotransmitter, and the site where it’s delivered on the next neuron is called a neuroreceptor. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Transmitters and receivers harken back to radio signals, which travel at the speed of light as EMF. Though they are distracting terms, I don’t have better alternatives so far. These neuronal transmitters and receivers are actually chemical ports. Try not to think of them in terms of radio communication. They are not. 
Unfortunately, these words will have to do for now.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Getting back to signal meaning, how does agreement happen between neurotransmitters and neuroreceptors? Strangely enough, I now believe that it doesn’t. At least not in any logical way. Instead, this signal represents an enigmatic bit of knowledge (which we’ll explore shortly). This converging and cascading chemical knowledge forms a pathway from sensor to muscle, but this signal is not deterministic as required by information theory. Perhaps we need to define a new knowledge theory to contrast with information theory. I’ll put it on my To-Do list.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Until then, the "state" aspect of memory does not apply to the brain, at least not in the conventional sense. Neurons do not deliver states as described by information theory. Instead, they evolve a sensitivity for a particular condition not unlike an immune response detecting a virus which it has experienced before, but we’re getting a bit too detailed. For now, it’s best to think of neurons as creators of a magic signal that axons deliver at a distance in the form of nano-chemistry, asymmetrically. That’s it. That’s all we need to know about neurons. At least for now. 
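</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The contrast between a stored state and an evolving sensitivity can be sketched in a few lines of code. This is a toy illustration only - the class name, the threshold, and the 1.5/0.9 adjustment factors are all invented, not biology:</span></p>

```python
# Toy contrast: a logic gate returns the same output for the same
# input forever; this unit's sensitivity drifts with every firing.
# Purely illustrative -- not a claim about real neuron chemistry.
class AdaptiveUnit:
    def __init__(self, threshold=1.0):
        self.threshold = threshold

    def stimulate(self, strength):
        fired = strength >= self.threshold
        if fired:
            self.threshold *= 1.5   # briefly less sensitive after firing
        else:
            self.threshold *= 0.9   # slowly more sensitive while quiet
        return fired

unit = AdaptiveUnit()
# The identical stimulus produces different responses over time:
print([unit.stimulate(1.0) for _ in range(6)])
# → [True, False, False, False, False, True]
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">A logic gate handed the same input six times returns the same output six times; this unit does not, because each firing reshapes its sensitivity to the next stimulus. 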
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For you technologists, let me cushion the blow a bit. Think of intraneural communication as a series of dominos lined up within the axon. Ionic tension may push over the first domino and start a cascading collapse that ends with the last domino dumping a bit of chemistry into the next synapse. Or if you don’t like dominos, replace them with butterfly wings caught in a cascading line of electrostatic discharge. I will elaborate later. Beyond that, we risk slipping back into our electrical metaphors. After all, everyone knows that sometimes dominos don't do what you expect. And with butterflies, anything can happen.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The next topic to unlearn is that neurons are not average.</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Disproportionality </span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">You may have heard that we only use ten percent of our brain. This myth is now largely demoted, but actually retains some utility on the nano level. 
The misunderstanding was caused by average oxygen consumption rates measured in the brain in the macro context. Extreme disproportionality is the reason for these oxygen observations and their misinterpretation. These early average estimates were both dramatically understated and overstated. More modern measurements show that only about one percent of neurons are in the process of firing at any moment, but the rate can vary significantly depending upon individual activity and immediate experience. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Most of the time, most neurons are not firing. Many individual neurons are quiet for minutes, weeks, or even years. But some neurons fire a lot, in some cases, almost all the time. This disproportionate firing rate actually reflects what happens in the outside world. Sensory neurons that detect detailed changes in the world fire most often, with interneurons firing less often as abstraction increases at each succeeding step along the neural pathway. This is how signals create simulation as they move up any neural pathway, creating knowledge that ranges from concrete to abstract. Motor neurons of course fire more often when there’s lots of movement involved, but between sensor and motor, most of the middle parts of the pathways are far less active than even movement requires. But I’m spoiling the surprise. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In the brain, Pareto’s principle applies. Even more so. Think of it as hyper-Pareto. Whose law covers 999 to 1? 9999 to 1? 
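</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">To get a feel for ratios that lopsided, here is a toy sketch with invented numbers (purely illustrative, not measured neural data): draw per-neuron firing rates from a heavy-tailed distribution and ask what share of all firing the busiest one percent produce.</span></p>

```python
import random

random.seed(42)

# Toy model: per-neuron "firing rates" drawn from a heavy-tailed
# Pareto distribution. Invented numbers, purely illustrative --
# not a claim about measured neural data.
def pareto_rate(alpha=1.1):
    # Inverse-CDF sampling of a Pareto(alpha) variable, minimum 1.0
    u = random.random()
    return (1.0 - u) ** (-1.0 / alpha)

rates = sorted((pareto_rate() for _ in range(100_000)), reverse=True)
total = sum(rates)
top_one_percent = sum(rates[: len(rates) // 100])

print(f"busiest 1% of neurons account for {100 * top_one_percent / total:.0f}% of all firing")
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">With a tail that heavy, the busiest one percent typically account for well over half of the total, and no average firing rate comes close to describing any single neuron. 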
Disproportionality occurs by orders of magnitude in various ways throughout the brain.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The point is, statistics and average measurements are of little value when modeling the brain. Most neurons are inactive most of the time. Brain architecture, the number of connections, and active firing rates are all extremely disproportionate. Applying averages in the brain, much like generalizing about human behavior, is a fool’s errand. It sort of works, but you can’t count on it, and for similar reasons. We need to back away from means, medians, and statistics in general as we explore the brain. We need to leave these powerful tools of science behind for a while.</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Brain Waves and Imaging are Gross </span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Though medical imaging has seen amazing progress and utility over the last century, and especially the last few decades, when it comes to brain modeling, most imaging is grossly over-interpreted and misunderstood. Much like brain waves were a generation before, brain imaging now does more harm than good. 
Here’s why:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">When I first gained access to minicomputers in the early 1970s, I discovered that I could hold a small AM radio up to the sides of the CPU (central processing unit) and actually listen to programs being executed. You can try it yourself if you have an old AM radio and a cell phone (which, of course, is a computer). Hear the static? It’s not nearly as random as you might at first conclude. There are definite patterns in that noise. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Of course, even with old and relatively slow minicomputers, these sounds were not caused by individual computer instructions, but the flow of the program could definitely be heard in a gross sense. Today it sounds almost random because the frequencies are so much higher, causing so many more electrical transitions per second. But if you listen with a transistor radio held up to the side of an old Digital PDP-8, you’ll hear more order and more rhythms from the speaker than you will from a cell phone. I discovered these noises in a simpler time. I even wrote programs to yield a type of crude percussion music, though without actual musical notes. This was before Moog was popular.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Something similar is happening with brain imaging today. 
We observe the highest-order rhythms of brain activity, or even more crudely, areas of increased oxygen demand. Correlating these images or movies with behavior is like trying to predict a single FedEx delivery by watching rush hour traffic from 30,000 feet. Try to follow a single vehicle while looking out the window of an airplane from cruise altitude next time you fly. It’s easy to keep track of a large truck on the freeway, but when they get into the city, don’t blink. Actually, CPU noise or watching cities from an airplane are very generous metaphors compared to fMRI and other imaging methods, which offer much lower resolution. They are more like trying to track that FedEx truck from the moon. Is looking at clouds from space meaningful when trying to understand freeway traffic? Not very. It’s like trying to predict behavior using phrenology during the 19th century.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Imaging is especially useless when mapping chemical communication between neurons at the nano level. Very high-resolution imaging is a bit more useful in the micro context. With more sensitive equipment, one can even sense ionic signals enabling a monkey to move a robotic arm. This is ultimately useful, but only in a very gross sense. Finer control, and more importantly, neuron sovereignty is sacrificed on this altar of electricity. 
Better brain interfacing will flow from a better understanding of the neuron.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In summary, such macro views of the brain are just mush and blur, not devoid of meaning, but almost certainly and dramatically over-interpreted. Scope and context are critical to exploration, but the devil is in the details. Brain waves and imaging are of little value on the macro scale. I won’t be bothering with either beyond this warning. We need to avoid this distraction no matter how pretty or how entertaining the images are. For now, simply ignore brain imaging and “electrical” brain waves. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now for our most distracting metaphor.</span></p><br /><h2 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><span style="font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Brain is Not a Computer</span></h2><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“For more than a century, the single nerve cell has served as the structural and functional unit of brain activity. Pioneers of cognitive science enlisted the neuron doctrine as the foundation of the brain’s putative computational capacities. 
Each neuron was conceived as an on-off switch presumed capable of acting as a logic gate, enabling information to be ‘digitized’ (turned into ones or zeros) and thereby ‘encoded’. Single neurons were assumed to perform complex encoding tasks, including for places, faces and locations in space; a Nobel Prize was awarded on this basis.” - Pamela Lyon - Flinders University, Adelaide</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I include Pamela’s assessment to note how ingrained the computer is in thinking about neuroscience. By far, the most common metaphor applied to the brain is the computer. This is largely because of the emergent impact of the computer upon our culture and lifestyle, and also for some obviously similar aspects of their operation which are mostly a decursive macro illusion. The computer has thus become the very thing blinding us from the nature of the neuron and its collective expression, the brain. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Sure, the computer does a great job of accessing and manipulating information, but it rarely creates knowledge. And if it did, who would know? Most importantly, the operation of the brain has almost nothing in common with the operation of a computer. The challenge has been well addressed by Pamela Lyon noted above, Gerald Edelman, and many others for decades so I won’t go into much detail, but a few main points are important to challenge before we proceed with a gnostic model of the neuron. 
This is not easy for me. I love computers. They are just so unlike brains that it would be negligent to ignore the issues.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Here are the most important aspects that set the brain apart from a computer:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">A computer is fast, digital, consistent, synchronous, serial, fixed, objective, and most importantly, logical. In contrast, the brain could be described as largely opposite in each of these important aspects, but not exclusively so. Here are the main differences in chart form to help visualize the comparison:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">A Computer</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> 
</span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> The </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Brain</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Fast</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; 
font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Relatively slow, but oh so elegantly time-efficient</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Digital</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; 
font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Biologically analog yielding stateless digital signals</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Consistent</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; 
white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Mostly malleable with evolving consistency</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Synchronous</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Actually asynchronous but exploiting synchronicity</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; 
white-space: pre-wrap;">Serial</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Profoundly parallel only converging to serial</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Fixed</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; 
color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Predominately plastic, in critical phases, by degrees</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Objective</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; 
font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Surprisingly subjective, aspiring to the objective</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Logical</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; 
vertical-align: baseline; white-space: pre-wrap;"><span class="Apple-tab-span" style="white-space: pre;"> </span></span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ultimately bioanalogical, but not exclusively so</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Please note in the table above that I define computers in terms of only eight words, one for each aspect. These eight are not the only differences between computers and brains - far from it. There may be a thousand things that matter, but I want to keep this simple and obvious for now. The above list is easy to defend.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This table is an example of something technologists might recognize as sparse coding, which simply means finding the few things that matter most, then using as few binary bits as possible to encode those most significant things. Sparse coding works because of what’s not included, and not encoded. Maps are a good example. Only the important stuff gets included. The brain does this better and in a more elegant fashion than computers by sparsely DEcoding reality as dynamic maps. I’ll get to it shortly. 
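</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">As a rough sketch of the sparse coding idea (a toy illustration of my own, not how any brain or codec actually works), here are a few lines of Python that keep only the k strongest values and discard the rest - the map keeps the landmarks, not the terrain:</span></p>

```python
def sparse_code(signal, k):
    """Keep only the k largest-magnitude values; zero out the rest.

    A toy illustration of sparse coding: what is NOT encoded
    matters as much as what is.
    """
    if k <= 0:
        return [0] * len(signal)
    if k >= len(signal):
        return list(signal)
    # Magnitude of the k-th largest component sets the cutoff.
    cutoff = sorted((abs(x) for x in signal), reverse=True)[k - 1]
    kept = 0
    coded = []
    for x in signal:
        if abs(x) >= cutoff and kept < k:
            coded.append(x)
            kept += 1
        else:
            coded.append(0)
    return coded

# A "scene" with two significant features buried in small noise:
scene = [0.1, 7.0, -0.2, 0.05, -6.5, 0.3]
print(sparse_code(scene, 2))  # [0, 7.0, 0, 0, -6.5, 0]
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Everything below the cutoff costs no bits, because it is simply never represented.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">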
Hopefully, we won’t throw out the baby with the bathwater - that risk is the flip side of sparse coding, in contrast to computer systems, which tend to capture more data and sort it out later.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Also, in the brain column, I qualify each opposite aspect in contradiction, emphasizing that many exceptions are needed to describe the more multifaceted and competitive approach the brain uses to cooperate in parallel, yielding a somewhat serial result. I do this to avoid defining (or fixing) the brain’s version of each aspect. In truth, there are exceptions in both columns, but more so for the brain. Finally, I’ve added McGilchrist’s “but not exclusively so” to the final bioanalogical aspect, as it is the most confusing exception. At least it was for me. The result ranges from the dichotomy of definition to the threshold of an enigma. I do this so you will question my descriptions in more detail.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Even though it often leads to paradox, technologists tend to cling to a consistent, determinant and, most importantly, defined model of the subject at hand. Or its opposite, when contrasting in a binary fashion. 
As Bob Dylan might say, “When something’s not right - it’s wrong!” As we proceed, we’ll leave Bob behind and try to keep our thinking biologically flexible. Now I’m going to color outside the lines even more as I address some of these eight aspects, and a few others not listed, as I did with sparse coding above.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Are brains like computers?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Not very much.</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Elegance of Inaction </span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I previously compared a bird to a Boeing to suggest alternative ways of simulating the world, but the metaphor breaks down when signal transit times are compared for each solution. Signals in the brain travel at the speed of a very fast automobile, while signals in electronic computers travel at nearly the speed of light. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As noted by Jeff Hawkins in “On Intelligence”, this limited biological speed only allows for about a hundred neurons in any neural pathway from sensor to muscle, at least if cause and effect are to be preserved. Once the possible allocations between transient response and propagation delays are weighed against what neurons accomplish, the number of jumps could be somewhat fewer than a hundred, but not by much. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The bottom line is that the brain seems to be as elegant in what it accomplishes with this speed and the limited number of “jumps” as it is with its astounding power efficiency. It seems that the brain’s Zen nature is more about what it doesn’t do, in comparison to what a computer has to do, to accomplish a similar result. There seems to be sparse signaling not just in content, but also in time and energy. This is probably the most telling contrast between brains and computers. The brain accomplishes far more, using far less, in all three aspects - speed, power, and encoding.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As for cause and effect, it depends upon who’s in control. Plus, the number of “jumps” in any neural pathway is rarely average. Finally, there are ways around the temporal paradox, as described by more recent timing experiments in the cortices. 
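</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The hundred-step budget is easy to sanity-check with back-of-the-envelope arithmetic. The figures in this sketch are illustrative assumptions, not measurements:</span></p>

```python
def max_serial_steps(task_time_ms, step_time_ms):
    """Upper bound on the serial neuron-to-neuron "jumps" that fit
    inside one sensor-to-muscle task window."""
    return task_time_ms // step_time_ms

# Assumed figures: ~500 ms to recognize and act on a percept,
# ~5 ms per hop (axonal conduction plus synaptic integration).
print(max_serial_steps(500, 5))  # 100 serial steps, give or take
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The same half-second window holds millions of serial gate delays in silicon, which is what makes the brain’s hundred-step budget so striking.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">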
Where does behavior originate? It must be within the neuron. And when exactly? Whenever it decides. The details are beyond the scope of this contrasting exercise, but rest assured, the temporal paradox will not be ignored. It has a solution. For now, think in terms of an extraordinarily elegant tortoise, and forget about the computerized hare. </span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Phineas Gage </span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">My grandfather knew I was studying logic that summer on the golf course. And he knew I’d bought a book about the brain after my discussion with the grad student. My guess is that’s why he told me a story he’d heard about a guy who had an accident on one of the rail lines. He said he’d heard the story as a kid:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In the process of dynamiting a cut for a new track, a spark accidentally set off a black powder charge which drove a pike completely through this guy’s brain. According to my grandfather, the guy survived and went on to live a fairly normal life. I had my doubts at the time, and my grandfather admitted it happened way before he was born, but he believed that the story was true. 
I wasn’t so sure at the time but found the tale interesting nonetheless.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It was years later that I connected his story that night to Phineas Gage, one of the most famous brain injury cases of all time. My grandfather had not known Gage’s name, and I did not make the connection until decades later. Names are a bit of knowledge that allows us to connect things. The name Phineas Gage provides a handle to cue this vivid story. Since my grandfather worked in coal mines, this story would have been an important lesson even decades after the actual event: when preparing an explosive charge, it’s important to always work from the side of a borehole, not directly above it. Also, for me it made the point - the brain is resilient and has an amazing ability to recover from</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> injury, even</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> major injury. A computer would never survive such damage. Now for more unlearning.</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“It’s Only Analogical, Captain” - said Spock. Never. 
</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The prefix “ana” is Greek for “up”, “again”, or “apart”, and is widely applied in the field of biology. These three letters also prefix both “analog” and “analogy”. For me, these last two especially help to contrast technical or philosophical logic with what happens within the neuron and the brain. Logic is definitive. Analogic is logic by degrees, but not necessarily in a proportional fashion. “Logic by degrees” is undefined in Boolean algebra, much like “divide by zero” in arithmetic. Thus the need for ANAlogic. Analogic depends upon the chemistry of its context, or the “mood” of any given neuron. For instance, think of a logical decision in a relationship that can be altered by three drinks of alcohol. Those drinks alter behavior in a macro sense. Other mood shifts may occur in the micro and nano context as well. Analogic only approaches logic by doing something enigmatically similar - sometimes, or something like that. 
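</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">To caricature the contrast in code - a deliberately crude sketch of my own, not a model of any real neuron - compare a fixed Boolean AND with an “analogic” gate whose firing threshold drifts with a mood parameter standing in for the chemical context:</span></p>

```python
def logic_and(a, b):
    """Classical logic: definitive and context-free."""
    return a and b

def analogic_gate(inputs, weights, mood=0.0, threshold=1.0):
    """Toy "logic by degrees": a weighted sum fires only if it clears
    a threshold, and the threshold shifts with context.
    A positive mood lowers the bar; a negative mood raises it."""
    drive = sum(i * w for i, w in zip(inputs, weights))
    return drive >= threshold - mood

# Same inputs, different chemistry, different decision:
signals, weights = [0.6, 0.3], [1.0, 1.0]
print(analogic_gate(signals, weights, mood=0.0))  # does not fire
print(analogic_gate(signals, weights, mood=0.2))  # three drinks later, it fires
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The AND gate answers the same way every time; the analogic gate answers by degrees, depending on its “chemistry”.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">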
Bear with me as I compare analogic to logic.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The original Star Trek series premiered when I was in high school. As a fan of sci-fi, this new TV show caught my attention. It was love at first “to boldly go”. Watching Spock and his Vulcan nature, I became fascinated by the contrast between the logic of Vulcan culture and human behavior, with all of its messy exceptions. The show literally inspired me to study Boolean algebra. It’s also when I began reading Plato and others who addressed the topic. Finding meaning from endless logical arguments was so much fun I even considered studying law for a time, but the fever passed - there were too many exceptions. Both human behavior and law were too unpredictable and illogical for me.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">But challenges with alien creatures seem so simple if you just take them logically, as Spock did. I later wondered if Paul Simon’s only number-one solo hit, “50 Ways to Leave Your Lover”, was inspired by Spock’s logical culture. If you could present reasons in a logical manner, you might even keep your heart from breaking. Well, maybe.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At least Spock made it look easy. Even when Captain Kirk screwed things up yet somehow succeeded, it was easier to revert to logical analysis. 
If I ignored the human element, I could live in a perfectly determinant world. Logic was the key. Logic allowed me to wall off the messiness of the real world. Digital electronics, computers, and programming were all ways to live in this perfectly logical world. I fell in love with logic. The honeymoon lasted for decades.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Later, when Star Trek: The Next Generation elaborated on something called the “prime directive”, I took notice. This is the idea that one should try to never affect the object under study, only observe it. This seemed to challenge the “stimulus-response” model so popular in brain science for the last few hundred years. By honoring the prime directive, I remain critical of the “stimulus-response” approach, as it does not honor the genesis of the decision and can grossly distort the result, making any observations less useful for logical analysis. It was also the reason for my early concerns about any data captured by applying electricity to the brain.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Logic is the most clearly defined and determinant branch of mathematics. 
Logic makes integers, with their “divide by zero” and other issues, seem downright fluffy. Even merely ganging logic to encode the analog world opens up the challenges of range and resolution. And things get worse with real numbers. But logic by itself is almost pure and complete, the most challenging exception being how it’s applied to biology. And behavior. We won’t need to unlearn logic, but we will need to understand how technical logic likely evolved from the biological version. I refer to this as analogic because of the obvious similarities when it comes to creating knowledge. Logic works with clear definitions, like all mathematics. Analogic is more flexible, especially in the early phases in the nano context. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As a 14-year-old, I found the beauty and consistency of logic compelling. I’ve since spent most of my adult life with this tool close at hand and have used it widely. But relationships change over time, and so has this one. Don’t get me wrong. I still have a love of logic, but I also have a new lover - knowledge. This new relationship is far more accommodating because of its analogical nature. 
And because one love need not preclude another.</span></p><br /><h3 dir="ltr" style="line-height: 1.656; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Analog Versus Digital </span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">During much of grammar school and high school, my cousin Dave Cline and I shared more than just classes. We also shared a lab. Well, that’s what we called it. It was actually his sister’s playhouse, which she had long ago abandoned. This “lab” was a free-standing building of about 10 feet by 12 feet located behind his house. We built a bench along the back wall. Dave took the right half, I the left. Over the years we built bikes, rockets, radios, and other circuits in our “lab”. By the time we were in high school, it was mostly used for electronics. I was into the newest digital systems. Dave preferred analog. Many relate to the analog/digital dichotomy in reference to analog music, which has seen a recent resurgence. But the difference goes much deeper, even to the core of physics and philosophy. Our interest in the analog/digital dichotomy was all about electronics. It was a friendly competition of dueling designs. Cooperation might come later.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At one point we had both purchased oscilloscope kits, which we assembled. For those not familiar, an oscilloscope is a kind of TV for electronic waveforms. 
Because of our limited resources, these were inexpensive and very simple single-trace units. To make them more useful, we decided to add a dual-trace input circuit so we could compare two waveforms on the screen at once. My design digitally switched from one input source to another quickly enough to time-share the oscilloscope beam. Dave took the classic analog approach of mixing a square wave with the two input sources to be observed.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The design world today is almost completely digital, but in 1968, analog was the standard approach. Radio was analog. Television was analog. Even a few simple computers were analog. But the cool new computers were all digital. Philosophically, digital and analog are about as different as two approaches can be and still be called electronics. Both approaches were common at the time. As an exercise in design, we were reinventing the wheel with these new dual-trace circuits.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The world in which we live is mostly analog. Most performed music is analog. Temperature is analog. Dance (movement) is analog. But the representation of each can be digitized in various ways. In nature, virtually everything is analog. You can think of analog as smooth waveforms, infinitely variable. Old-fashioned volume controls with real knobs are a great example. Most of our interaction with the natural world is analog. 
From the chirp of a bird to the warmth of a kiss, we experience an analog world.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In contrast, the digital world is driven by logic and math. Anything in nature, such as sound or music, can be quantified by defining values of a certain resolution and range (the two parts of a floating-point number). Once digitized, these sounds from nature can be treated as numbers to be encoded, copied, and manipulated by computer programs. This digital world has another special quality - it’s determinant, meaning that a song sounds exactly the same each time you play it. Or at least it should (reality has exceptions). Manipulating these values using math yields a consistent result, more consistent than even nature itself - or it would be, if digital weren’t also part of nature. Let’s just say digital is less common in nature. Digital also allows for interchangeable components - the key to mass production. In contrast, analog often has to be tuned for each application, and over time, degrades. Digital is always the same. Or it doesn’t work at all, much like comparing Phineas Gage’s resilient brain to a computer, where a single failure can brick the device.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ironically, if you look closely enough at nature, some parts of the world become almost digital. Atoms and molecules are actually discrete. We only perceive them as analog because of their extraordinarily high resolution. Well, mostly. 
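</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The “resolution and range” mentioned above can be made concrete with a small sketch. The 8-bit depth and the plus-or-minus 1.0 range below are assumptions chosen for illustration, not taken from any particular audio standard:</span></p>

```python
def digitize(x, lo=-1.0, hi=1.0, bits=8):
    """Quantize an analog value: clamp it into [lo, hi] (the range),
    then snap it to one of 2**bits evenly spaced levels (the resolution)."""
    levels = 2 ** bits - 1
    x = min(max(x, lo), hi)                # range: clip what won't fit
    step = round((x - lo) / (hi - lo) * levels)
    return lo + step * (hi - lo) / levels  # resolution: nearest level

# Digital is determinant: the same input always yields the same output.
print(digitize(0.1234) == digitize(0.1234))  # True, every time
print(digitize(5.0))                         # out-of-range input clamps to 1.0
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Every analog nuance between two adjacent levels is simply lost - the trade the digital world makes for perfect repeatability.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">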
Smell, taste, and some aspects of light have certain digital qualities because of their molecular and quantum nature. Our neural sensors can detect a single molecule of odor, and a single photon of light. These vivid exceptions nicely demonstrate how exquisite our organic neural sensors can be.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">But in a macro context, we live in an analog world. So why would we bother with digital? Digitizing the world has some dramatic advantages. It makes our analog world easier to capture, store, and manipulate. That’s why our interaction with the world today has been almost completely digitized by technology.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Designing these oscilloscope enhancements at the time was a challenge. Integrated circuits and TTL were exotic and very expensive, certainly beyond our budgets. Many of our parts came from old transistor radios. My memory system on another project was literally made from relays taken from a pinball machine. For this project, we struggled to find transistors with matching characteristics. For these reasons, our designs had to be simple, even elegantly so.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In another application I actually pushed the limits of good power design by using a single transistor as an AND gate. 
After all, two of its three wires require the same polarity, and that final wire the opposite polarity in order for it to activate. This allowed me to use an analog transistor in a pseudo-digital fashion. At the time I jokingly thought of it as “analogical”. Decades later the term would take on new meaning.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It took a few weeks, but we both got our solutions to work reasonably well. Indeed, they had similar performance characteristics. To this point we had been quite secretive as to our implementation, even hiding our schematics. Now it was time to critique each other's work.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">With a digital perspective, I of course started with a logic design then figured out how to cost-effectively implement it using linear transistors, which was all I had. Dave’s design treated the inputs as if he were mixing music channels but at frequencies high enough to form fairly nice square waves, again implemented using similar linear transistors. That was not surprising. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What really got my attention was that when these two different designs were reduced to electronic schematics, the circuits were virtually identical. 
I was very much challenged by this outcome and compared the designs in various ways, only to conclude that no matter how you approached this particular problem, the optimum result was similar. It reminded me of the quantum nature of light being both a particle and a wave. The concept would become important decades later when sorting out the nature of the neuron compared to the computer.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Here's a video that nicely describes some of the issues between analog and digital that challenged me for years. 
The presentation </span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">remains firmly in the tech world, so in terms of unlearning, take it with a grain of salt:</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><b><a href="https://www.youtube.com/watch?v=GVsUOuSjvcg&t=1128s" target="_blank">Future Computers Will Be Radically Different</a></b></span></p><div><br /></div><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Mechanical Perfection </span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As children, we wish time would go faster, allowing us to do the things the bigger kids got to do, granting more freedom with each passing year. Once we reach the age of majority, we wish time would slow down so we could take advantage of these options. And it’s that way for the rest of our lives. 
At least so far.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">One of the things I liked about school in Tucson was that they began vocational training in 7th grade. The girls went to one classroom to study home economics, and the boys to another to learn mechanical drawing. I’d been looking forward to this topic for more than a year. Drawing was the key to expressing engineering design and schematics, already a personal interest. Schematics are a type of map showing how various elements of electronics are connected to one another. I was fascinated by them. I had even saved money and bought a collection of precision drawing instruments. My intention was to create perfect schematics and get an “A” in this class.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">On the first day I started by placing my paper perfectly on the drawing board. I don’t know how many pieces of masking tape I wasted trying to get the position against the T-square perfect, then putting tape on all four corners without moving the paper or ending up with tension between these four points of support on the slanted board. Seriously, much of that first hour was spent learning this skill. 
And that was just putting the paper in place.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Next I began to plot the assignment, but not with a pencil. It was too early for that. Instead, I used the very fine point of my high-tech divider to make a tiny hole where the first line would begin and another one where it would end. These holes were so small you could only find them if you knew exactly where to look. Then I would go on to plot the next line with another tiny hole. By the time I closed the loop there was a very small gap. My tiny holes were in the wrong place.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The problem of course was cumulative error. This was something I would learn a great deal about years later when I managed a survey crew. On this day, no matter how careful I was, it kept happening. Sometimes error averages out. Most of the time it doesn’t. In any case, I tore the paper off and started over. I didn’t want to get confused about which tiny hole was which attempt.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The same thing happened on my second try. On the third try I actually got some lines drawn, but other lines had a similar problem. 
I didn’t want to erase them because I could never make it look fresh again, so I started over one more time. This was only a one-hour class. Four days later everyone else was on their third assignment. I still hadn’t turned in my first. It still wasn’t perfect, but it was getting close.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As I tore off yet another attempt, the teacher took notice. He came over, set a piece of paper on my board and quickly taped it down. Then he grabbed my hand and drew a line with a triangle, not even using the T-square. Next he rotated the triangle and grabbed my hand to draw another line. In about five minutes the drawing was complete. He pulled it off the board, put a big “C” on the top and threw it on his desk. My first drawing was done. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I did OK with the other assignments and ended up with a “B” in the class, but I remained forever disillusioned about creating that perfect drawing. This was of course an example of perfect being the enemy of good. Or a demonstration of my teacher’s right-mind casually stepping over the towering paradox created by my left-brain seeking perfection. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For me, physical expression was a challenge. I tended to live in my head, a theoretical place. From the drafting class, I learned that perfection in the real world is an illusion, approachable only by degrees, and ultimately a fool's errand. But by degrees is never perfect. Another paradox. It’s one reason I was drawn to the digital world. Somewhat later, when doing logic design or coding, I could make things perfect. Or seemingly so.</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Cargo Cult </span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">You may have heard that there were isolated Pacific islands invaded by American soldiers during World War II. The native people watched these Americans create long, flat, and hard runways from coral and steel. Then large airplanes landed on these runways and disgorged all sorts of rectangular boxes. These plywood boxes contained all kinds of weapons, food, equipment, and tools - the stuff that makes an army function. Of course the natives got some of this opulent stuff in trade for helping the soldiers in various ways. 
And lots of empty boxes were left over.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">A couple of years later the soldiers loaded up most of their stuff and flew off, never to be seen again. When other westerners visited these islands years later, they found that the natives had used some of these empty wooden boxes to fashion crude “airplanes” which didn’t actually fly but were an attempt to encourage the real airplanes with all their cargo to return to the islands. These natives became known as a cargo cult, and similar behavior has shown up in various native people around the world at different times and in various ways. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The behavior is of course known as mimicry and is one of evolution’s most powerful tricks, which is why it’s so often applied. The more modern and technical term is simulation. Simulations are common in our hyper-modern digital world. Minecraft, Rec Room, or Roblox are more current and more vivid digital examples. Each allows the user to create perfect virtual worlds where they can control the outcomes in various but actually imperfect ways. 
One need only note the blocky features of these creations and the lack of intimate connection that we find in live theater. It's a </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">cargo cult result, but quite attractive to our left-brain as Dr. McGilchrist suggests in "The Divided Brain".</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Bizarro Logic</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">When I was the age to enjoy Roblox or Minecraft, the toy store had Lincoln Logs, Tinker Toys, and Erector sets. For my sons, it was Legos and Warcraft. I too enjoyed Populous and Polytopia once computers became common. These too are simulations just as are dolls representing people, or even a simple wooden stick that can become a rifle when you have the right frame of mind, and most importantly, the ability to suspend disbelief. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As a kid, one of my favorite simulations was comic books. They inspired complete worlds, many quite different from our own. One of the Superman subplots was something called <a href="https://superman.fandom.com/wiki/Bizarro">Bizarro World</a> where everything was expressed in crude form and everyone did the opposite of what was normally done back on Earth. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Bizarro people were ugly, frustrating, and even mean - typically sinister, and just the thing a young boy likes to explore. This place was kind of a contra-earth where everything was clunky, inverted, inside out, and backward. The actual planet was even a cube, well, once the organic Superman got done with it. Don’t ask about the geo-dynamics. Nothing worked like it did on Earth. This place was frustrating to think about, which is what made it fun. 
It was one of my favorite Superman venues.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Irony flows from trying to be perfectly contrary in multiple aspects at the same time. As you might imagine, the writers' many attempts quickly led to paradoxes. And so it is with logic and technology in our analog world. It may sound strange, but for me, computer technology and “perfection” have become a clunky and rigid Bizarro World version of the brain based on logic, as opposed to biologically authentic organic intelligence and intuition. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Analog electronics become Bizarro digital with their consistent square waves. Signals exist in the moment, while states are more persistent. These signals are forced into clocked synchronization in contrast with our biologically asynchronous reality. Simulation is managed using states instead of the signal-based simulation which biology has evolved. For me, Bizarro is the world of technology, not unlike a cargo cult in comic form. Ironically, at times technology works much better - thus airplanes, computers, and speedboats.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So what do cargo cults have to do with the Bizarro World and computer logic? 
They are both forms of mimicry somewhat imperfectly implemented to create the illusion of perfection. But brains are not logical. They are biological. Logic is a crude subset of biologics, as information is a crude and rigid subset of knowledge. Words are only a rough approximation of organic knowledge. Analogical is an easier way of bridging the differences.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Since logical systems will ultimately be useful in understanding and validating all the tricks evolution has created, let’s explore logic as the people of Bizarro World might. I’ll now describe the logic of a very simple example of homeostasis, a trick of evolution that finds form in biological “systems”.</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Robots are Bizarro Humans</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">There are two main types of flying protocols, at least for humans in airplanes - VFR (Visual Flight Rules) and IFR (Instrument Flight Rules). In the first case, the pilot in command is responsible for seeing and avoiding other aircraft. In the second case, air traffic controllers are responsible for keeping all aircraft separated. 
At times controllers will try to offload this responsibility by calling out the relative location, speed, and direction of other traffic. If the pilot acknowledges this traffic, the controller can go on to other work. The pilot then comes under VFR rules for at least that specific encounter.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">When I was training for my instrument flight rating, like every IFR student I had a plastic hood over my eyes so that I couldn’t look outside the cockpit, forcing me to rely only on instruments. My flight instructor worked the radio as needed. I remember one early lesson departing the Reno control area when departure control called traffic, “United heavy, eleven o’clock, 7 miles.” </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Out of “reflex”, I tried to look up, but my instructor swatted my hood and admonished me to track the instruments. He radioed back, “looking”. After a few seconds, he said, “ah, most of them miss us</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> anyway”. I of course found this humorous because positive air control and safety require seeing and missing EVERY one. Perfectly. 
But as noted by my instructor, that doesn’t always happen, and you have to call back, “negative traffic” so that Control can vector you to a safe path.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">There’s actually a very important trick to spotting air traffic. If you see the traffic moving within your visual frame of reference, you’re not going to collide. It’s the ones that don’t move within your field of view that you have to worry about. The process is dangerously counter-intuitive. Or is it counter-logical? Is that another airplane on a collision course? Or just a bug on the windshield?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Here’s another way of “looking” (pun intended) at this perception problem which I’ve encountered several times in various books. I’m not sure who first used the example so you’ll get a mixed-up version. The example is described like this:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In baseball, how does an outfielder catch a fly ball? 
If you’re an engineering student and you had to build a robot to accomplish the task, you might have the robot look at the fly ball long enough to determine its direction and velocity as a vector, then calculate the parabolic curve of the ball in flight considering Earth's gravity and have your robot proceed to that location. This might seem like a reasonable solution but it’s not what a human does. The robot approach is actually the more Bizarro method.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">A human outfielder will look at the ball, noting changes in displacement within his visual field. He will then begin moving in a way that decreases that displacement dynamically, which of course means that the two objects (human and baseball) move into a collision course. The outfielder then simply raises his glove to stop the ball from hitting him. This is the more biological method and baseball players do it without thinking. We might call this method, “subconsciously normalizing ball displacement in a visual field.” It’s just one of a million tricks baseball players come to learn through experience, and not in a classroom. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Early robot engineers spent months duplicating this ability in a far more crude, clumsy, and Bizarro fashion. 
AI (Artificial Intelligence) now at least seems to be refining this more technical approach into something more organic.</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Digital Consistency?</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines.” - Ralph Waldo Emerson, from “Self-Reliance”</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If Emerson had written this 150 years later, he might have included technologists on his list of the adoring. Consistency is certainly the key to science, if not its most basic requirement. And most digital technology is simply broken without consistency. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This is not the case for neurons. In spite of those beautifully digital ionic waveforms which are quite consistent in both amplitude and pulse width, neurons seem to have a mind of their own, at least to some degree. But that small degree often makes the difference between a neuron firing and not firing. 
Even though chemical release at the synapse is an all-or-nothing affair, the magic lies between the "all" and the "nothing"; it's managed upstream of the hillock. Is this actually "a mind of its own"? Not really. The term is generally reserved for a recursively higher-order form of this decision-making ability in the macro brain, but can you imagine each neuron deciding for itself? Keep that thought… in... mind.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Speaking of mind, do you ever hear the word "mind" used to describe what happens within a computer? Not so far. One of the key differences between a neuron and a logic gate is that the neuron knows both if and when to fire. And the neuron does so asynchronously. The "when" can vary dramatically from event to event for all kinds of reasons. The "if," even more so. We’ll explore some of them. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Though the ionic signals arriving at the next synapse are quite consistent, the resulting chemistry at and after the next synapse is anything but. This is where the analogical magic begins. 
Neurons seem to produce some kind of normalization of this apparently digital input signal into an analog form for this next neuron.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So what exactly happens between the synapse (along with all the other synapses informing this particular neuron) and the hillock, where a new signal may or may not be triggered? There is some kind of analogical integration going on, but what is its nature? Let’s explore a bit.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This hillock divides the neuron into two realms - the neuron body and its axon. From the synapse to the hillock, the neuron is not only analog but dramatically so in many different ways. Between the hillock and the next synapse, the ionic output signal is quite consistent. It could almost be described as digital. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The first realm is a bag of magical chemistry that may or may not trigger a new signal to deliver at a distance using the neuron’s second realm, the axon, and its delivering synapses. This second realm is a little easier to understand as it uses a subset of what’s happening in the first realm. We’ll get to it later. And the first? Well, that’s where knowledge is created, but for now, we’re only exploring how digital or how analog a neuron is. 
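The shape of this two-realm idea can be sketched with a toy "leaky integrate-and-fire" model in Python - a textbook caricature with invented numbers, not a claim about real synaptic chemistry: graded, leaky, analog summation below the hillock; a uniform, all-or-nothing spike above it.

```python
# Toy leaky integrate-and-fire neuron: analog between synapse and hillock,
# all-or-nothing beyond it.  (A textbook caricature, not real biology.)
def run_neuron(inputs, threshold=1.0, leak=0.9):
    """inputs: per-tick synaptic drive (graded, analog values).
    The membrane 'potential' leaks a little each tick; crossing the
    threshold at the hillock emits a uniform spike (1) and resets --
    every spike identical, no matter how the analog side got there."""
    v, spikes = 0.0, []
    for drive in inputs:
        v = v * leak + drive      # analog integration with leak
        if v >= threshold:        # the hillock's if-and-when decision
            spikes.append(1)      # digital-looking, all-or-nothing output
            v = 0.0               # reset after firing
        else:
            spikes.append(0)
    return spikes

# The same total input, delivered with different timing, fires differently:
burst = run_neuron([0.5, 0.5, 0.5, 0.0])    # clustered drive -> one spike
trickle = run_neuron([0.5, 0.0, 0.5, 0.0])  # spread out, leaks away -> silent
```

Notice that the "when" matters as much as the "how much" - the same inputs that fire the neuron when clustered simply leak away when spread out, which is one way of picturing the inconsistency described above.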
At this point, it's important to understand how the brain is unlike the computer. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span style="font-size: 18.6667px;">In Bizarro fashion, c</span>omputers are digital everywhere. Brain states are described as being turned on and off like a toggle switch. Computer logic and memory are perfectly consistent. But in the brain, neurons are only consistent enough to be called digital in the axon. Everywhere else, the neuron is mystically analog and frustratingly inconsistent. "Switching" in the brain is actually one neural layer gaining control over another as they compete and cooperate, or more generally shift their macro-chemistry using hormones as temporally extended signals.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For those who are more technical, those repeated ionic spikes from a given neuron may appear to be a pulse-coded modulation of some sort, and that may actually be the case, but not in the normal digital sense. These pulse trains are more likely an artifact of analog priming than actual coded values of some sort. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I too was captivated by the beauty of these waveforms as a teenager. But the closer I looked, the more problems I found with a digital interpretation. Still, I ignored these problems for decades. 
Slowly I began thinking of the neuron as having analog inputs producing a somewhat digital output. But that didn’t help much. It was all quite frustrating. Ultimately, even this model broke down leaving consistency only by degrees, and signals in contrast to states. Or something like that.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It took me years to discover that this conflict was completely resolved with a change in perspective from objective to subjective when dealing with the neuron itself. This was of course a form of subjective anthropomorphization. With this new tool, the relationship between a neuron’s inputs and output began to make a lot more sense. Once we yield our objective perspective, neurons become even more consistent, but never actually digital. Let it go, Grasshopper.</span></p><br /><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Asynchronicity</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">When I started college, microprocessors were yet to be invented. The Altair, Apple, and of course IBM PC computers were still years away. The relatively few computers that existed were either mainframes or minicomputers. Our modest College of the Redwoods didn’t have a computer anywhere on campus, just an 80-column card punch and sorter. 
I first learned the computer language BASIC on a Teletype connected through a 300-baud modem to an HP minicomputer at Berkeley, California. This Teletype was isolated in a storeroom in the physics lab because it made so much noise.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">When I signed up to learn FORTRAN, we had to literally create an 80-column card deck which was then driven to another campus. It took one or two days to get the result. One comma in the wrong place and you had to wait 48 hours to discover your error. You might say the learning experience was very loosely coupled in time, and virtually useless. Though it may have been loud, the response from that Teletype was almost immediate. I dropped the FORTRAN class. BASIC was similar, or at least close enough for the work that I was doing. I much later picked up FORTRAN as needed for specific projects after our campus got its own minicomputer. As one of the few computer geeks on campus, I helped install and manage this new HP minicomputer when it arrived. Direct connect screens dramatically improved the learning experience for both languages. And cut the noise level as well.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In the early 1970s, almost everything about computers was learned from stapled Xeroxed pages of schematics, flow charts, or source code. Then Altair made the cover of Popular Electronics. Soon books were published about the architectures of microprocessors. 
I’d already designed an ALU in high school but this was at a whole new scale. I remember discovering the first issue of Byte at a stereo store. Things happened quickly after that, but digital electronics was still a very new field of study. I helped a close friend who was part of the faculty at College of the Redwoods define his first curriculum for teaching TTL (Transistor-Transistor Logic). We’d spend hours late at night in his empty classroom debating the best way to design and present the ideas behind electronic logic. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I had this theory that computers could be much faster if the logic were simply asynchronous and didn’t have to wait for the standard clock signals. Some of these ideas were used years later when I designed the Sage computer, which I documented in another blog post. Other aspects were applied to understanding the neuron. You might say that waiting for a clock signal in TTL is a form of artificial or forced synchronicity. It happens on a grand scale in virtually all computers. But it has a cost in time efficiency.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Neuron communication is asynchronous, at least in most cases. Again, this assertion will challenge many of the more technical. Brain waves clearly show a great deal of what appears to be synchronicity in the brain. But this is largely the effect of brain operation, not its cause. 
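</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The cost of that forced synchronicity can be shown with a toy timing model of my own (the stage delays are invented numbers): a clocked design grants every stage a full period sized for the worst case, while an asynchronous chain hands off the moment each stage settles:</span></p>

```python
# Toy timing model: forced synchronicity vs. asynchronous hand-off.

def clocked_latency(stage_delays_ns):
    # Every stage gets one full clock period, sized for the slowest stage.
    period = max(stage_delays_ns)
    return period * len(stage_delays_ns)

def async_latency(stage_delays_ns):
    # Each stage passes its result along as soon as it settles.
    return sum(stage_delays_ns)

stages = [3, 10, 4, 5]          # hypothetical settle times in nanoseconds
print(clocked_latency(stages))  # 40: four ticks of a 10 ns clock
print(async_latency(stages))    # 22: the sum of actual settle times
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">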
The point is, brain “processing” is not driven by some synchronizing clock signal. It’s normally not even a synchronized process. The illusion of synchronicity is an artifact of parallel competing neurons cued by the same experience from the world. Brain waves just seem synchronous.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For instance, a given neuron may cue 37 different neural scripts, but only one (or none) may actually invoke physical movement as the 36 others are inhibited by various other cues in the “group”. It may seem that an ensemble of neurons is responding to some stimulus by coming together in an apparently synchronized fashion, but the very opposite is actually the case. Cause and effect seem inverted compared to a computer. Ultimately, a single neural script might induce the movement of a given muscle resource even though it may seem like many more were involved.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ironically, it’s also the asynchronous synchronicity of inputs from the world that is at the heart of one of the neuron’s first and most effective tricks for creating knowledge. Does that sound like a paradox? Good. Now relax your mind. 
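</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">That winner-by-inhibition idea can be sketched as a simple winner-take-all selection. The numbers, the threshold, and the subtractive inhibition rule are all my illustrative assumptions:</span></p>

```python
# Winner-take-all sketch: one cue excites several candidate "scripts",
# but inhibition from other cues lets at most one drive a muscle.

def select_script(excitations, inhibitions, threshold=0.5):
    """Net drive = excitation minus inhibition; the single strongest
    candidate fires if it clears threshold, otherwise none does."""
    net = [e - i for e, i in zip(excitations, inhibitions)]
    winner = max(range(len(net)), key=lambda k: net[k])
    return winner if net[winner] >= threshold else None

# Three candidate scripts cued by the same experience:
print(select_script([0.9, 0.7, 0.8], [0.1, 0.6, 0.5]))  # 0: one script wins
print(select_script([0.4, 0.3, 0.2], [0.2, 0.2, 0.2]))  # None: all inhibited
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">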
The concept won’t even be needed until I describe how neurons come to know a thing - asynchronously.</span></p><br /><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Parallel and Serial</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At their heart, computers are inherently serial and single-tasking. </span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Notwithstanding superclusters of identical GPUs and some modest success with “multi-core” and “neural net” functionality, most computers still operate in a highly serial fashion as do each element in these clusters. When controlled by a single clock signal, computers mostly do one thing after another. But they do it so quickly, they seem to be multiprocessing. This creates the illusion of doing many things at once. In general, computers simply don’t work well in parallel largely because they are constrained by tight synchronization and identical processing cells, a kind of left-brained mass production of logic.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In contrast and as noted earlier, the operation of the brain is profoundly parallel and multifaceted, becoming more serial as behavior is delivered. 
The contrast is also vividly and visually apparent between the right and left sides of our brains. Left-brained language is more serial. Right-minded visualization is more parallel. The right also has more parallel connections showing up as white matter. The left has more sequential neuron connections, showing up as gray matter. Even so, both sides are more parallel near the sensors and more serial near the muscles along each neural pathway.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Brains literally do many things at once, and these many “facets” seem to accomplish this remarkable parallel operation without getting in each other’s way. At least most of the time. One of the reasons that computers as described above struggle with parallel multiprocessing is managing contention resolution - simply which processor (or logical function) is in control at any given moment. This problem is exacerbated by forcing synchronicity from a single clock.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In contrast, neurons resolve this contention issue in a more asynchronous, finely-grained approach, literally at the neuron level. Each neuron resolves contention each time it fires. How this works has been one of my biggest personal challenges for decades. 
How this happens between the left and right brain was what initially inspired me to test the model more deeply. That testing led me to discover how contention is managed in other parts of the brain, and ultimately in the neuron itself, where I found something very interesting about control and consent, which I’ll share shortly.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The important part for now is, this profoundly parallel architecture is the key to the brain’s resilience and graceful degradation, meaning when one part fails, most of the rest of the brain keeps functioning in fairly normal fashion. Analysis of patient stroke data is a vivid demonstration of this resilience, which is mostly lacking in computer architecture.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Paradoxically, brain architecture is not just parallel, it’s both parallel and serial at the same time. Steps along a neural pathway or steps in a dance are both obviously serial, but knowledge converges in a parallel fashion. Simulations in the brain start out parallel near the sensors and become serial at the muscles. This is the Zen opposite of a computer, which is inherently serial and struggles to accomplish much of anything in a parallel fashion. 
Is this the Zen of Tao, and the Tao of Zen?</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Don’t Hardwire the Zen Nature of Memory </span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As a consequence of the difference between electronic and ionic, we also need to ignore the copper wire metaphor and think in more biological terms. Instead of wires, neurons typically rely on leaky hoses called axons moving ions around in a cauldron of chemistry. These fibers deliver signals from one place to another but are also influenced by this ionic stew of chemistry. Communication in the brain happens in multiple mediums, and in multiple ways, some broadly chemical, others relying on necessary cellular isolation. Only the neuron’s axon appears similar to wires with insulation, but that’s just an illusion. Axons with their “insulation” are not similar to metal wires in any way. And the connections as noted are often changing, not hard-wired at all.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Nothing in the brain is hard-wired, not even what we’re born with. Our first couple of years are dedicated to pruning what we don’t need based on our initial experiences, or the lack of them. What’s left forms a very sparse framework representing experiences during our first few years of life. 
From there, new and more subtle connections are made over the rest of our lives in various critical phases of learning. This “softwired” metaphor can be a little difficult to understand at first but is a very useful concept.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Our left-brain prefers to work with things that are fixed or at least change in a predictable fashion. “Hard wired” implies a predictable result. Consistent “states” are one way to describe such things. But in the brain nothing stays the same. Each time a neuron fires, it may adjust how much ionic tension is required from any given input signal to induce the neuron to fire the next time. There is no fixed logical relationship between the input and output of any given neuron. But there are analogical ones. Any apparent “states” in the brain are a high-level artifact (or illusion), as is human memory itself. Everything in the brain is plastic by degrees and in critical phases. It’s just a matter of when and how it changes. Like the world, nothing is fixed. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">There are no hard wires. Unlearn them. The "circuits" in the brain are not electrical signals nor even power circuits that got their name by completing a circle from the battery back to the battery. 
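</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">That per-firing adjustment can be sketched as a threshold that drifts each time the neuron fires, so an identical input does not guarantee an identical output. The adaptation rule and constants here are illustrative assumptions, not measured biology:</span></p>

```python
# Nothing stays the same: each firing raises the threshold (adaptation),
# and quiet periods let it slowly recover, so the same input can
# succeed one moment and fail the next.

def respond(inputs, threshold=1.0, fatigue=0.5, recovery=0.25):
    outputs = []
    for x in inputs:
        if x >= threshold:
            outputs.append(1)
            threshold += fatigue  # firing makes the next firing harder
        else:
            outputs.append(0)
            threshold = max(1.0, threshold - recovery)  # gradual recovery
    return outputs

# One stimulus, repeated five times, yields shifting behavior:
print(respond([1.2] * 5))  # [1, 0, 0, 1, 0]
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">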
There are no such power or signal circuits in the brain. Neural pathways instead start at neural sensors and converge down to scripts of muscle movement. The only circle they complete is a loop with the world, various internal feedback systems, or possibly in the micro context of imagination where physical looping of neural pathways is more likely to occur. These pathways are better described as dynamic neural pathways which start out being quite flexible and only become fixed by degrees and in phases over time. Or not. But forget electrical or electron "circuits". They don't exist in the skull.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Memory as suggested by information theory will be of little use in any model of the brain. The concept is a logic trap. The neuron does not store “states”. And “muscle memory” is not memory at all. Instead, the brain has an alternative way of simulating the world by using signals dynamically, yielding a reconstruction of the past. Is it memory? Not in a technical sense, but it can produce similar results in some cases. For now, it’s best to let go of the concept of memory altogether. I’ll address the topic in more detail later on.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“Cause and effect” nicely describes what happens between a motor neuron and the muscle it controls, but less so as you evaluate the connections back along the neural path towards the sensor. While “determinant” may apply to this last connection before movement is invoked, it’s less true of each step that precedes it. 
And in a mathematical sense, not much at all.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“Cause and effect” has far less of a correlation between most individual neurons than mathematical logic. And for the brain in a macro sense, very little. This is a very hard thing to unlearn, but critical to understanding the nature of the neuron. For now, relax your sense of a hardwired or consistent connection between most neurons. Consistency occurs by degrees. Think instead in terms of dynamic associative probability at each junction made up of multiple synapses. The quantification of the meaning of any neural input is controlled by the receiving part of the synapse, not the transmitting side. Or something like that.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Instead of hardwired, the brain is sort of soft-wired where the contrast between hardware and software is a useful comparison. In a computer, a logical function may be expressed in hardware or software, but hybrids are difficult. In the brain, analogical functions range between these limits as synapses are formed, upregulated, downregulated, and or decrease through atrophy over a lifetime. 
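</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The receiving-side control described above can be sketched directly. In this toy of my own (the weights are invented), the transmitting neuron sends the same uniform spike to every target, and each receiving synapse alone decides what that spike means:</span></p>

```python
# The sender transmits one uniform spike; each receiving synapse
# scales it by its own weight, so meaning lives on the receiving side.

def deliver(spike, receiver_weights):
    """One outgoing spike, many receiver-controlled interpretations."""
    return [spike * w for w in receiver_weights]

# The same spike excites one target, barely registers at another,
# and is read as inhibitory by a third:
print(deliver(1, [0.9, 0.1, -0.4]))  # [0.9, 0.1, -0.4]
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">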
Think biology, not copper wires.</span></p><div><span><br /></span></div><span id="docs-internal-guid-d262c1c5-7fff-7ac3-38aa-6c37cbbc4ac8"><h2 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><span style="font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Fixed by Degrees</span></h2><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“The moving finger writes; and, having writ,</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Moves on: nor all thy piety nor wit</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Shall lure it back to cancel half a line,</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Nor all thy tears wash out a word of it."</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> <span> - </span>From The Rubaiyat of Omar Khayyam</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; 
vertical-align: baseline; white-space: pre-wrap;">As for writing, the craft has changed dramatically since Omar’s time in the 12th century. Ink and paper were expensive and valuable tools then. It was important for writers to choose their words carefully before fixing them on paper, or in the actions of their lives.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Things are different today. We're no longer limited to quill and ink. We edit with impunity, changing content willy-nilly in electronic form as I’m doing with this blog post right now. Even on paper (which we discard by the millions of tons each year), we sometimes reprint versions of our work every few minutes until it looks just right in physical form.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For decades I’ve preferred electronic dashboards over paper because of the advantages of their more dynamic nature. I used to admonish subordinates for bringing me reports in printed form, not just because it killed a tree, but because “paper freezes disembodied information”, decreasing its flexibility even as it logs a more permanent history. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">When I designed and coded my text editor, Sudden View, I purposely left out the print function just to keep the content more flexible. It frustrated some of my customers, but I never relented. The point is, information is permanent only by degrees depending upon the medium in which it’s stored, ranging from being whispered in your ear or written on paper with a quill and ink, to carved in stone at the base of a building. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The same can be said for knowledge, even before it finds form in physical expression. 
Knowledge has to exist in the mind before it can take physical form as information, whether in spoken, electronic, or written form.</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> The permanence of knowledge is by degrees, even in the mind.</span></p><br /></span><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Surprisingly Subjective Neuron</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In the gas crisis of 1974, President Nixon asked for and got a national speed limit of 55 MPH. After two decades, and in an attempt to have it repealed, one of the legislators from Texas noted that “there are parts of west Texas that if you set out at 55 miles per hour, you’ll never arrive.” </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Obviously, his assertion is false in a mathematical sense, but his humor helped get the law repealed, so ended up being quite meaningful, at least on the Senate floor. A vehicle going at a certain rate on a west Texas road can be calculated to arrive at a specific time. But if you had to actually drive the course, at each moment along the highway, it might seem to be taking forever. The left-brain deals more effectively with the abstraction of time and its calculation. 
The right-mind lives in the moment and is frustrated by the lack of arrival as noted by the childish refrain, “are we there yet?” Laughter in the above legislative case flows from a type of race condition between the two sides of the brain. Or it doesn’t, depending upon the individual. Some people have no sense of humor.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So which side is correct? Which is true? It largely depends upon whether you seek an objective answer or a subjective one. Since the law WAS repealed, subjectivity won the day. Something similar happens not only with the macro brain but also with its controlling neurons. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><span style="color: #222222; font-family: &quot;Trebuchet MS&quot;; font-size: 18.6667px; font-weight: 700; white-space: pre-wrap;">Cues and Scripts versus Stimulus-response</span><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">"Cues and scripts" are the subjective alternative to the more objective "stimulus-response" model that has been popular for more than a hundred years. Unfortunately, stimulus-response relies on a determinant model of the world where every effect has a cause. 
But in the context of the neuron, it's often not true; or at least the cause can not be easily determined, bringing the effect into question. Think of the contrast in the macro context. Why do people do absurd things such as murdering their own children? Such decisions start with a neuron, and what they come to know, subjectively. Can such behavior ever be rationalized? Yes, depending upon what the subject comes to know about the event and how critical it is in their life.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“Objective” is an emergent abstraction of our more recent Bizarro culture. It requires at least two people to agree upon the thing held apart from both typically described as information. Computers are a higher-order form of such information management and so are inherently objective in nature, being set apart from any single individual.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In contrast, the brain and each neuron in it are by nature, inherently subjective. Neurons only come to know what is delivered to them in the form of chemistry at each synapse. The important difference between objective and subjective when comparing computers with the brain is a matter of who is in control of what and when. 
Let’s explore an example using logic gates.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Electronic logic requires perfect consistency in its evaluation of input signals; it always yields a determinant result, and that result will stay the same no matter how many times you apply the inputs. (Well, at least unless one or more of the inputs is a random number, but this is an edge case that we can explore later.) It could be argued that the inputs control the outputs in most cases. If such inputs come from the world, then a consistent stimulus should produce a consistent response. This is how most neuroscientists come to understand logic, and also how they try to evaluate the brain. But they typically fail. The reason is that the sovereignty of control does not flow from the world but is literally created within the neuron. The easiest way to understand this control is that neurons are subjective as opposed to objective. And so is the brain. Hopefully, the next section will help clarify this a bit. Just be ready to understand the neuron subjectively.</span></p><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 24pt;"><span style="color: #222222; font-family: &quot;Trebuchet MS&quot;; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Finally, Analogical Signals versus Logical States</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Logic is involved in approximately half of high-level thinking in a macro context. 
The other half is intuition. Without defining logic in detail, I’m going to assert that logic is not only a mathematical tool for reasoning, but also the basis for validating all control and computer systems. Logic is also a tool created by the world of the Bizarro Cargo Cult of technology. Logic is all or nothing, 0 or 1, true or false. There is no middle ground with logic. But we live much of our lives in that middle ground between true and false. Logic deals only with the limits - edge cases in a very different sense of the term.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Logic gates are electronic devices connected by copper pathways used to control deterministic systems, but they are of little use in the more flexible and dynamically analog world of biology. Fortunately, there is a bridge between these two worlds best described as analogical.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The objective here is to contrast logic with what happens within the neuron and the brain in general - analogic. For now, I’ll focus on neurons in a nano context. Later we can decursively apply most of these ideas to the brain in a macro context. As you might guess, this will not be an exercise in reasoning. 
It will be an intuitive quest to understand neurons as I have suggested above.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Like the computer, which is its crowning achievement, logic is digital, consistent, fixed, objective, and deterministic. As I’ve noted, the biological neuron has these characteristics only by degrees, or in many cases can be described in a way that is the very opposite. I have presented many examples above. Since signals that stand out from noise are the key to information theory, I’m going to summarize how I reached that almost opposite conclusion. Sometimes noise has utility. Or something like that.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Having a technical perspective, I initially assumed that the neuron was logical. At least some of the observations can be described that way, but the exceptions start early and become the rule, ultimately overwhelming the thesis that neurons have what’s needed to be logical. I’ll now focus on what caused many of those exceptions, the exceptions that literally changed my mind.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Neurons sense the world and create signals that somehow sparsely encode the important parts of what they discover. 
These “digital” signals are passed on to other neurons across synapses in the form of chemistry. Approximately half of these connections tend to activate these follow-on neurons, and the other half inhibit such activation. That is an important clue. Muscles are arranged in a similar fashion, and their control is managed the same way - opponent processing, as noted by Sir Charles Sherrington.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">There are a few hundred skeletal muscles in the human body. Most are arranged in pairs, allowing movement in both directions. These muscles both compete and cooperate to achieve gross displacement and fine motor control. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Decursively, the neurons that control these muscles also compete and cooperate by applying activation and inhibition in the micro context. Even within the neuron in the nano context, some synapses tend to activate, and others tend to inhibit. 
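</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Sherrington's opponent arrangement can be put into numbers. In this toy model (my own construction, not from the post), the difference between agonist and antagonist activation drives movement - the competition - while their sum sets joint stiffness - the cooperation.</span></p>

```python
# Toy model of an opponent (agonist/antagonist) muscle pair.
# Competition: the DIFFERENCE in activation moves the joint.
# Cooperation: the SUM of activation stiffens (braces) the joint.

def joint_state(agonist: float, antagonist: float) -> tuple:
    """Activations in [0, 1] -> (net_drive, stiffness)."""
    net_drive = agonist - antagonist
    stiffness = agonist + antagonist
    return net_drive, stiffness

relaxed = joint_state(0.3, 0.1)  # low co-contraction
braced = joint_state(0.8, 0.6)   # high co-contraction

# The same net movement command, delivered two very different ways:
assert abs(relaxed[0] - braced[0]) < 1e-9  # identical net drive
assert braced[1] > relaxed[1]              # but a much stiffer joint
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The same push can arrive loose or braced; a purely digital reading of "did it move?" misses half of what the pair is doing.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">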
Again, the architecture decursively allows for</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> competition</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> and</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> cooperation</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">, and for a similar reason.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What’s extraordinary, at least from a technical perspective, is that a given neuron will have synapses that both activate and inhibit the very same following neuron! If you’re a technical person, think about this assertion for a moment. Why would one neuron try to both activate AND inhibit the next neuron? These synapses would cancel each other out, at least in a digital sense. When that first neuron fires, the result is null. Nothing happens in the second neuron. Activation cancels inhibition. At least if these signals are truly digital. Logically, it simply makes no sense. 
Analogically, it might.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If these two neurons had more than these two opposing synapses, let’s say 3, 17, or 54, the connection is no longer digital - it becomes analog, taking a value that reflects the RATIO between activating and inhibiting synapses. Poof! The resulting signal transforms from digital into analog. The connection between these two neurons can now up-regulate or down-regulate the significance of any given digital signal by changing the ratio of activation and inhibition, allowing it to both</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> compete</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> and</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> cooperate</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> in this nano context. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">When you think of the resulting ionic tension created by this form of connection, the dendrites become a digital to analog converter (D/A), and the hillock becomes an analog to digital converter (A/D), all within the body of a single neuron. 
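</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Here is one way to put numbers to that picture. This sketch is my own construction under the post's assumptions, not an established model: a presynaptic spike stays all-or-nothing, but the mix of excitatory and inhibitory synapses it lands on converts it into a graded drive (the D/A step), and a threshold at the hillock converts that drive back into a spike (the A/D step).</span></p>

```python
# D/A then A/D inside a single connection: a 0/1 spike is weighted
# by the ratio of excitatory to inhibitory synapses (graded drive),
# then thresholded back into a 0/1 spike at the hillock.

def graded_drive(spike: int, n_excite: int, n_inhibit: int) -> float:
    """Dendrites as D/A: scale a 0/1 spike by the synaptic ratio in [-1, 1]."""
    ratio = (n_excite - n_inhibit) / (n_excite + n_inhibit)
    return spike * ratio

def hillock(drive: float, threshold: float = 0.25) -> int:
    """Axon hillock as A/D: fire only if the net drive crosses threshold."""
    return 1 if drive >= threshold else 0

# The same digital spike, up- or down-regulated by the synapse mix:
weak = graded_drive(1, n_excite=9, n_inhibit=8)     # ratio ~ 0.06
strong = graded_drive(1, n_excite=14, n_inhibit=3)  # ratio ~ 0.65

assert hillock(weak) == 0    # down-regulated: the spike dies here
assert hillock(strong) == 1  # up-regulated: the spike is passed on
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">By shifting the ratio, the connection tunes how much any given spike matters, without the spike itself ever being anything but all-or-nothing.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">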
And when you introduce a second source neuron at a different dendritic spine, a type of analog logic becomes possible, but only by degrees. And that’s just one of evolution’s decursive tricks. Here’s another that’s a bit easier to understand.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What happens when you get angry? That’s right. Your mood changes. This change is largely mediated by various hormones; I won’t bother getting into the detailed chemistry. The point is, your mood can be thought of as putting you into a different mode of operation in a macro sense. And also in a micro sense. And finally, in a nano sense. That’s right. Shifts in ambient macro chemistry affect micro and nano connections, allowing the analogical equation to change - some form of fight or flight is the likely result. This too is an evolutionary trick, something that helps keep us alive - adaptive chemistry yielding a form of analogic.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I'll now tease you with this one paragraph on the topic of the "quality" of analogic. Its "ANDness," "ORness," or how "naughty" (tending to invert a value) it might be at any given instant. 
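</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">To preview what "ANDness" and "ORness" by degrees might mean, here is a standard perceptron-style construction, offered as my own illustration of the tease rather than the author's model: a soft threshold unit slides continuously between AND-like and OR-like behavior, and a negative weight would make it "naughty," inverting its input.</span></p>

```python
import math

# A soft gate whose "ANDness" or "ORness" is set by a continuous
# threshold: high threshold -> AND-like, low threshold -> OR-like.

def soft_gate(a: float, b: float, threshold: float, gain: float = 8.0) -> float:
    """Graded truth value in (0, 1) for inputs in [0, 1]."""
    return 1.0 / (1.0 + math.exp(-gain * (a + b - threshold)))

AND_LIKE, OR_LIKE = 1.5, 0.5

# Both inputs on: both flavors of the gate report "true"...
assert soft_gate(1, 1, AND_LIKE) > 0.9
assert soft_gate(1, 1, OR_LIKE) > 0.9

# ...one input on: only the OR-like setting still fires.
assert soft_gate(1, 0, AND_LIKE) < 0.1
assert soft_gate(1, 0, OR_LIKE) > 0.9
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Every threshold between those two settings yields a gate that is partly AND and partly OR - logic by degrees.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">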
The very idea invalidates the all-or-nothing nature of a signal, and indeed, the very nature of logic as we've come to know it. At the same time, it allows for applying logic by degrees. I'll stop here. That's a useful hint for now. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">You can probably see how quickly things become complex when trying to understand what’s happening in the neurons according to Boolean algebra. And this is only describing two of nature’s tricks. There are many, many more. And one need not preclude another.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">OK, I'll give you one macro example of what I mean by analogical. Our right mind is more likely to reason by analogy, simply by making comparisons from similar events in our history, known as metaphor. This works reasonably well, but when you cooperatively also reason by logic, a la the Socratic method, the blend can produce remarkable results rightfully described as analogical wisdom. It's when things get out of balance that results can become either really bad or really good, reflecting McGilchrist's thesis about our left brain. But there's also the possibility of insight or epiphany. 
I realize such thinking can quickly lead to paradox, but also, in a few cases, to breakthroughs such as this one:</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The brain is a stack of divided, layered, constantly changing asynchronous neural nets cueing scripts of chemical signals reflecting various analog relationships from the world. This is in contrast with computers and their algorithms, which alter states using logical functions in a deterministic and synchronous manner. Think of neural nets as dynamic analog programmable logic arrays (a new type of silicon) stepping through scripts of muscle movements. The brain applies both functional and procedural languages in an analog fashion. The neuron, and the brain in macro, are analogical.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><div><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Before we end up down a rabbit hole, forget about everything I’ve just written in this post if you like, but consider the possibility that the neuron is not electrical in nature, and its ionic aspects are mostly internal to the neuron; that the brain is not built of logic gates, though it may have analogical aspects; and finally, that the mind is not a computer. It’s something far more powerful and elegant. 
Think of neurons as magical devices that create a bit of knowledge and then deliver a chemical signal at a distance to any other neuron that might be able to use this knowledge to help survive and replicate. Pretty simple, right?</span></div></span><div><span><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><span id="docs-internal-guid-283aa883-7fff-0056-4b4b-5715c8948fd5"><span style="color: #222222; font-family: "Trebuchet MS"; font-size: 18.6667px; font-weight: 700; white-space: pre-wrap;">Top-down or Bottom-up?</span></span><div><span><span style="color: #222222; font-family: Trebuchet MS;"><span style="font-size: 18.6667px; white-space: pre-wrap;"><b><br /></b></span></span></span></div><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Before I end this post, I want to clearly state an objection to the typical approach to modeling the brain - top-down versus bottom-up. So many of the books I've read about the brain tend to start with the cortex, either mapping, imaging, or modeling. And yet we don't have a useful and effective model of the brain. I believe this approach is a big part of the problem, and I believe this has occurred for largely technical reasons. 
It's how we do black-box analysis, but in this case, the approach is highly distracting, and not in a good way.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Evolution didn't start with a fully formed human brain, nor even the cortex, which is likely only about 100 million years old. This is quite recent considering all of evolution. The cortices are literally in the way of deeper exploration. I will suggest here that we simply cut off the top of the skull, take both cortices, and set them aside for a while. This will expose everything below, allowing us to more easily imagine starting from the bottom of the brain and working our way up. 
We'll get back to the cortex (both of them) in due course.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now, let’s explore the really fun part, the philosophical detail of a gnostic neuron.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><div><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Continued:</span></div><div><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div><div><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><h3 class="post-title entry-title" style="background-color: whitesmoke; color: black; font-family: "Trebuchet MS", verdana, sans-serif; font-size: 19.5px; text-indent: 10px; white-space: normal;"><a href="https://suddendisruption.blogspot.com/2022/03/entertaining-assertion-as-should-be.html" style="border-bottom: 1px dashed red; color: black; text-decoration-line: none;">The Gnostic Neuron - Part 4 - Neurons Create Knowledge</a></h3><div><br /></div></span></div><div><br 
/></div></span></div>The Gnostic Neuron - Part 4 - Neurons Create Knowledge<p> <originally posted on 03-15-22></p><span id="docs-internal-guid-72ebe2be-7fff-df03-2f74-c50c69129378"><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjedx4OEcAISXWVsQd_TDmSELYdt40Hjs0r6VPOFs07uRaXK8JU2ow-78dXq9MOqLJohZIcdcLaS8ZwupxQwncMHzWcMbvIIMTmPt89Ije7Z4JEpROV7EWEpzhbu2sS_0LS9tTQTa3enEo_NeXt-CEhFHw_aW8ct8w1CXUbWMx3Uxl5dKVGzYU/s960/175795389_623055789093294_8323484834082227588_n%20(1).jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="960" data-original-width="720" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjedx4OEcAISXWVsQd_TDmSELYdt40Hjs0r6VPOFs07uRaXK8JU2ow-78dXq9MOqLJohZIcdcLaS8ZwupxQwncMHzWcMbvIIMTmPt89Ije7Z4JEpROV7EWEpzhbu2sS_0LS9tTQTa3enEo_NeXt-CEhFHw_aW8ct8w1CXUbWMx3Uxl5dKVGzYU/w480-h640/175795389_623055789093294_8323484834082227588_n%20(1).jpg" width="480" /></a></div><br /><span style="font-family: &quot;Trebuchet MS&quot;; font-size: 32pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></h1><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><span style="font-family: &quot;Trebuchet MS&quot;; font-size: 32pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></h1><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><span style="font-family: &quot;Trebuchet MS&quot;; font-size: 32pt; font-variant-east-asian: normal; font-variant-numeric: 
normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></h1><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><span style="font-family: &quot;Trebuchet MS&quot;; font-size: 32pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Entertaining an Assertion</span></h1><br /><p dir="ltr" 
style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As should be obvious by now, the prime assertion that lies at the heart of this gnostic model requires only three words to describe in its simplest form:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">“Neurons Create Knowledge”</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’ve repeated it several times at this point. Now we’re going to explore the idea behind the assertion, but not in the way you might expect. Instead of presenting a technical, formal, or logical argument to support this assertion, I’m going to ask you to suspend disbelief for a time and even relax your intuition as I take you down a path I stumbled along while trying to understand the brain’s analogical nature. If you haven’t yet bothered to consider the consequences of the above assertion, I will now present some of them in a more literal fashion so we can play with them as ideas before accepting them even as hypotheses. 
Later these ideas may find more permanent form as thesis, becoming less hypo.</span></p><br /><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Permanence by Degree</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“The moving finger writes; and, having writ,</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Moves on: nor all thy piety nor wit</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Shall lure it back to cancel half a line,</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Nor all thy tears wash out a word of it.” </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">- f</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">rom The Rubaiyat of Omar Khayyam</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 
0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I present this quote again for a largely different reason. I'll now present how the right-mind comes to know these words. When I first read the above verse, I didn’t take it literally. As opposed to</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> actual</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> writing, I saw this quatrain as a metaphor for the things we do in our lives that can’t be undone. For me, and I think for most, this quote represents the choices we make, and how these choices affect our future, sometimes in more permanent ways. How are these choices made? Ultimately by the firing of a neuron somewhere. That is pretty well established. What drives this firing? Knowledge, even if neurons don't create it, and even if it has some more mystical source. We express ourselves in writing because of what we've come to know about some topic.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">An assertion documents an idea in words; and ideas are informed by knowledge.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Before a word is ever written (or even spoken), it exists only as knowledge in someone’s mind, as merely an idea. But once an idea is communicated and understood by someone else, it can be difficult to unknow in an objective sense. Some ideas are more permanent than paper, or even stone. 
They may even be captured in biology. And biology, with its organic will, can have greater impact than stone.</span></p><br /><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"><b>“Nothing is more powerful than an idea whose time has come.”</b> - Victor Hugo</span><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The quatrain above can also be understood in a deeper sense, as to the degree of permanence our choices make in the world. Some ideas are fleeting thoughts. Others are a challenge to unthink, such as the poorly applied electrical metaphors of the brain. Or any idea about knowledge whose time has finally come.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">If it hasn’t happened yet, what you’re about to read may profoundly change your life. It certainly has changed mine. Entertaining this model of the neuron and the brain has literally triggered a personal experience of transformation for me. It was something much more profound than I ever imagined it might become. It has been very difficult to describe in words, but its luminance has surrounded me, and still does. The experience soon took on the nature of an alternate reality and continues to gain conviction. It has now morphed into my normal way of understanding the world. 
I enjoy each moment of this emerging epiphany.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I now see people as a collection of what they know, essentially the sum of their relationships in the world. This is not such a strange perspective in the macro sense, but when seen from the nano perspective of the neuron, it takes on a whole new meaning. Of course, I cannot </span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">truly know</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> another's relationships. But many I can guess by observing how others respond to experiences</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> similar to ones I have had</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">. It comes down to what they know versus what I know about any given topic, at any given moment. This of course takes many forms because of personal and subjective history, but it's all interesting. And that’s just a taste of what I’m living right now. I see similar but even more primal knowledge when observing other animal life and comparing them to my own more primal behaviors. Fight or flight takes on new meaning. Neognosticism works on many levels.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What I suggest in the above paragraph might be described as simply empathy. And indeed, most of what I’m living can be presented in other fairly normal terms, but not the whole of it. 
The experience is both holistic and reductive as both sides of my brain are constantly analyzing these experiences in real time. I’ve gotten to the point that my old Bizarro worldview no longer gets as much attention, and the new one is still finding its legs. I will describe more of this later.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It’s time to offer a formal and final warning - what I’m about to describe can not be unseen; it can not be undone, or innocently ignored once it has been understood. It's very unlikely to once again become unknown. At least that’s been the case for me. I have no regrets. I awake each morning ready to proceed with all due haste. Your mileage may of course vary, but if the result is even close to what I’ve experienced in the last few years, it will certainly change your worldview. So if you like your world just as it is, you should simply stop reading at this point. I realize how delusional these last few paragraphs may seem as I write them, but I’m trying to be as candid, and explicit, as possible. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In any case, you’ve been warned. 
If you’re ready, let’s proceed.</span></p><br /><br /><span id="docs-internal-guid-fa0d9e39-7fff-3e92-de4a-d7d31af8c6a0"><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"><span style="font-size: large;">Neurons 101</span></span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Before we get to the heart of the matter, this is a good time to quickly review some of the more probable and useful generalizations about neurons. These are things generally accepted by most in the world of neuroscience.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Neurons have a clear input and a clear output, both of which take the form of chemical signals. I</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">nputs occur at the synapses of the dendrites with their spines which also ionically integrate these inputs along with other internal ambient chemistry. 
</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">At the other end of the neuron, chemical signals are expressed at the synapses near the termination of the axon.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Neurons apply a complex (but not that complicated) form of ionic integration to combine these chemical signals from other neurons to create a signal of their own. I describe these events as chemical</span><span style="font-family: Arial; font-size: 18.6667px; white-space-collapse: preserve;"> (in contrast with electrical)</span><span style="font-family: Arial; font-size: 14pt; white-space-collapse: preserve;"> signals, because from a neuron's perspective that is exactly what they are. The effects of any ionic charges are limited to the inside of the neuron. Firing does not create or change states in any neuron, but it does adjust sensitivity, often in an</span><span style="font-family: Arial; font-size: 18.6667px; white-space-collapse: preserve;"> ionically</span><span style="font-family: Arial; font-size: 14pt; white-space-collapse: preserve;"> analog fashion, again, in contrast with determinate logical shifts.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So far no one has demonstrated how or where any “state” might be stored in a neuron, at least not in any consistent or conventional sense. Even a neuron’s sensitivity to any input varies from moment to moment, so when and why it fires is best described as dynamic rather than fixed. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Whatever contribution a neuron’s signal produces is expressed as an ionically mediated chemical signal at a synapse near the end of its axon. An obvious question might be, “What is the nature of these signals?” We’ll address that directly.</span></p><br /><br /></span><h2 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><span style="font-family: Arial; font-size: 24pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Definition of Knowledge</span></h2><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">My personal satori actually flowed from the analogical nature of how neurons chemically communicate with each other, and somewhat later, how this process relates to knowledge in a macro sense. But that second step still took me by surprise. The philosophical implications of the prime assertion of the neuron has far more significance than neuroscience might suggest. It essentially is the foundation for the empiricism presented by John Locke but also supports rationalism through imagination. That will need to be explored when we get to consciousness. For now I don't want to get too distracted by the deeper philosophical consequences. Still, in many ways, the cultural follow-on aspects totally eclipsed the original more technical neuron work, so I’m going to present this philosophical version first. A review of the definition of knowledge is needed. Or should we define the WORD “knowledge” first? 
There is a significant difference between experience and the way we label it. Experience creates direct neuronal knowledge. Expressing a word verbally or in written form creates indirect knowledge to cue others on various topics.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Not being an etymologist, I didn’t draw the distinction between a thing and the WORD for that thing, so I glossed over this difference and dove into the epistemology, specifically the definition of the WORD “knowledge”. If you’re not familiar with these two quite different “e-mologies”, now is the time to get a handle on each. We’ll be dealing with both quite a bit. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ety-mology is the study of the history of words, and how their meanings evolve over time. Episte-mology is the study of knowledge and its underlying origin and etiology (yet another "e-ology" defined as the cause of something, especially in medicine or meaning.) 
Words are such decursively fun playthings, and the concepts behind these three will become even more playful as we proceed, but</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> let’s deal with the word for knowledge first as written words are our current form of communication.</span></p><div><span><br /></span></div><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Justified True Belief?</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">There’s of course much more on the topic, but serious analysis of knowledge was nicely documented by Plato in his description of a </span><a href="https://plato.stanford.edu/entries/plato-theaetetus/" style="text-decoration-line: none;"><span style="color: #1155cc; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; text-decoration-line: underline; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;">Socratic debate from his Theaetetus of 369 BC</span></a><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">. 
I’ll spare you the tedious arguments and instead present Wikipedia’s summary of Plato’s definition in three words - knowledge is “justified true belief”.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">These three words carry a heavy burden and balance three important underlying concepts to yield this definition. “Justified” describes evidence used to support truth. “True” sets knowledge apart objectively, and removes all doubt. “Belief” reintroduces doubt, but then quickly dismisses it with subjective conviction depending upon whether the evidence is justified, making things a bit circularly dependent. The three words together result in a kind of dynamic tension, but definitions in general don't allow for any possible dynamic nature of the experience of knowledge. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Justified by what? And how? Truth at least seems testable in an objective sense. And if true, is knowledge always true without qualification? If so, why do we need the other two words in the definition at all? Finally, “belief” begs the question of subjective conviction. Is knowledge dependent upon who believes? Or is a believer required at all? Is knowledge subjective or objective? Did you hear that tree fall in the woods? Me neither, but I know that it fell. Or do I? 
This definition of knowledge is all pretty messy, but apparently, the best we have.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">In any case, “justified true belief” has pretty much held for most of the last two thousand years - right up until 1963 when Edmund Gettier established that the “true” part is not always a required component - at least not logically. Leaving “truth” behind, we’re left with “justified belief”, an even weaker, less satisfying, increasingly subjective, and yet apparently more accurate remainder. Does this consensus definition now fall over like a two-legged stool? Or was Plato's three-word definition overdue for an update?</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">And it's not just the definition of knowledge that's been in question for thousands of years, there's also the quality of knowledge, however you define it. What each of us knows varies widely. Most of us end up wrong to some degree most of the time. So much for "facts". We need only refer to what each of us may know about some price in the stock market to understand the validity of this assertion and efficient market theory. And there are so many other examples. 
I'll let you pick your favorite.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’m obviously not the first to question the nature of knowledge only to discover this aporia. After all, we wouldn’t have the word “epistemology” if the field weren’t worthy of further study. Being its object of love, knowledge goes to the core of philosophy. I’m in no way qualified to debate philosophy, but I’ll shortly offer a right-minded alternate definition for the experience of knowledge by using words to describe the experience and let you decide. For now, we need to let “truth” go hide somewhere as Gettier logically established, or at the very least accept that truth is a relatively small subset of knowledge. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Letting “truth” go somewhere else for a while actually helped me a great deal. It may sound strange, but truth did not set me free. Its absence did, in a Zen sort of way. 
This simple exclusion led me to a new way of viewing knowledge inspired by a very strange aspect of neural reality supplied by knowing Jennifer Aniston.</span></p><div><span><br /></span></div><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Jennifer and Friends</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So, we’ve finally made our way back to the Gnostic Neuron and what it means to know Jennifer Aniston. Or anyone else.</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> Or any place.</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> Or anything. How do neurons come to know a person, place, or</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> thing? </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The idea seemed fantastic. At least it did for me. And for several years I had no concept of how a neuron might achieve this remarkable result. But the possibility that neurons could create knowledge resolved so many issues in the nano, micro and macro context that I just couldn’t leave the concept alone. 
Set aside for now how evolution might have accomplished this amazing trick, and just envision neurons knowing SOMEthing in a Zen sort of way.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What exactly did it mean for that specific neuron to come to know Jennifer Aniston? To be accurate, this neuron also fired for other actors from the TV show “Friends”, but it has been argued that it fired for them only because of their association with Jennifer. Forgetting her “Friends” for the moment, the response of this neuron remains an impressive demonstration of a type of knowledge, in this case, the ability to discriminate between thousands of people that this particular subject encountered in various ways during her lifetime. THAT is an amazing trick and it seems to border on being impossible. Or even alien. Yet evolution accomplished it. Discrimination between this person and that one is at the heart of the process, as we’ll later explore. Before we try to figure out how, let’s focus on the WHAT of neuronal knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I spent years trying to make sense of this type of knowledge, even in a more flexible form. I ultimately tried to characterize the subject’s recognition as not knowledge at all, but something very limited and specific to perhaps a very few neurons. 
(In hindsight, I should have gone the other direction - toward the more general, for I now realize that knowledge is the superset.)</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Actually, this same reductive approach has been taken by others trying to explain this neuron-knowledge phenomenon. Many believe gnostic (or concept) neurons are a relatively rare exception in the brain, the enigmatic leader in a group of neurons. Unfortunately, limiting the scope of knowledgeable neurons more easily allows the mystery to be ignored or dismissed as a bizarre exception, (at least by our left-brain with its superpower, denial.) </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">But I couldn’t let it go. After reviewing Rodrigo Quiroga’s videos and reading his book, I decided that gnostic neurons seem to be far more common than we might have at first guessed. So my thinking finally went in the opposite direction. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What if ALL neurons were inherently gnostic in their nature? 
Along with questions raised by, “The Divided Brain”, this idea led me back to interneuron chemical communication and my ultimate epiphany. To describe how I got there, I’ll use one of my favorite tricks of critical thinking: instead of seeking the answer to any question, turn the question into the premise, taking the form of an assertion. Then evaluate the results in the context of current brain data.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Instead of trying to understand how a neuron could come to know something, I simply accepted that it did, then explored the consequences. And the consequences for me were astounding. This little exercise in critical thinking not only begged for a redefinition of the word knowledge, it changed how I thought about the definition of all words. Let’s deal with these new constraints of knowledge first, or better described as the lack thereof. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is not something represented only by words or even actions. Indeed, (thanks to Dr. McGilchrist), most words and actions are re-presentations, or the result of knowledge, and not actual knowledge itself, thus the distinction between the definition for any experience and the word used to define it. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Neuronal knowledge is more primal, more basic, more atomic in this elemental sense. We come to know things we can’t even describe which is consistent with most knowledge being sub-cognitive. This is why we have the word “hunch” or the more modern expression, “vibe”, both of which reside on the boundary between conscious and unconscious. Only a very small subset of knowledge rises to the level of consciousness to be re-presented by words like these.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I now believe that most knowledge is created by neurons, and the expression of knowledge is not limited to the skull, nor the body, nor any given species. Before I explain a few of the myriad ways knowledge can be created (which in some cases can be quite complex), it will help to continue this simple game about defining words. Defining a word is a decursive version of what a neuron actually does when it fires. 
Words are literally the result of what happens when a neuron delivers that little packet of chemistry to the next synapse.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">How’s that for a radical thought? If you want to understand how neurons express knowledge, simply explore the Zen nature of words. For now, I’m going to follow this shortcut as to what it means for a neuron to know something, and simply suggest that it does. Suspend disbelief for a time, as is done in the theater, or in literature. Instead of thinking of a neuron’s knowledge as a message from aliens, think of it as the very core of a neuron's nature, arising between its dendrite and hillock, no matter how implausible it may seem at first. Let’s take a clue from one of my favorite Sherlock Holmes quotes:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">“Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth.”</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Following Sherlock’s advice, I accepted the observation to see where it might lead. That was the moment. That’s when everything changed for me. 
I now believe that the easiest way to proceed is not by debate nor logic, but simply to entertain this prime assertion and follow its path to enlightenment:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Neurons create knowledge, but not exclusively so.</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you haven’t already done so, at this point I suggest you take a break from reading and go for a walk to think about this assertion without prejudice…</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And even more walking… </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Until…</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">…</span></p><br 
/><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Welcome back. Let’s proceed.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Even if you only accept that some neurons know something, the next question is what exactly do they know? And what is the nature of this knowledge? Or if my prime assertion is correct, how does it inform models of neuron operation and ultimately, the amazing intuition, resilience, and effectiveness of the brain? It requires a bit of imagination to envision the emergent nature of many (non-digital) bits of knowledge coming together to create that one neuron’s extraordinary abstraction we call Jennifer Aniston.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So, how might a neuron create this knowledge? Were these special neurons? Or all of them? And how did this knowledge come together to yield behavior? That’s where I discovered an even bigger surprise. As I began to apply this gnostic approach at the macro level of the divided brain, human behavior started making a lot more sense. It was like people themselves were just higher-level neurons. The way they deal with consent, conflict, and control is similar to what I found at the nano-level of the neuron. That’s when the idea of decursion emerged. 
As I take you from the nano context of the neuron to the macro context of the brain, I can actually describe neurons in terms of human behavior, as long as that behavior is driven by a new broader definition of knowledge. I will provide plenty of examples, but first, here are a couple of important questions to address. They helped me in my quest:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Why does a neuron fire?</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And when exactly?</span></p><div><span><br /></span></div><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Significance</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Single-word answers are elegant, but are they accurate? Setting knowledge aside for a moment, have you ever seriously thought about why a neuron might fire? 
“Knowledge” is not the most primal answer. It’s more of the result. I treated this question as a koan and kept it in the back of my mind for years. The question is simple, but the answer, not so much. The answer arrived only after I’d spent a long time unlearning even the analogical nature of neural communication and had finally dismissed the whole approach as not having a scientific answer. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Though frustrated, I worked on other issues, leaving the problem in the back of my mind for years. Then one day I thought of the neuron not as a component of a neural pathway, but as a cell standing alone. I became that cell. That’s when it came to me. Sometimes it helps to anthropomorphize in a more intimate fashion, or just broaden your focus. It’s why I suggested above that you go for a walk. Leaving the objective behind for the moment, reach for a more subjective answer to the question. If you were a neuron, why would YOU fire? And when?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For me, a neuron fires because it has found something significant in its world, something that may be of utility. This of course immediately begs the question, “what is significant?” This has an easier answer - with help from Darwin, significance is “knowing” something that helps a creature or even a concept survive and replicate (not unlike a meme per Richard Dawkins). 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">This brings me back to the elegance of one-word answers. This is where Iain McGilchrist’s “Divided Brain” helped out. He described our left-brain as the tool our mind uses to grasp something, to nail it down. This is decursively similar to the nature of sparse coding, which is how we populate a map - only the most important stuff is included. What if neurons only fire when they detect important stuff - the very essence of significance?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Our left-brain prefers solo and simple answers. It’s always looking for “the” thing as opposed to “a” thing. Our right-mind likes to keep its options open. After a fashion, the two sides of our brain establish dichotomies for many aspects of the world. Our left-brain tries to drive any solution toward the endpoints; our right-brain mines the middle ground, tending away from the left-brain’s focus, creating a useful dynamic tension between the two sides. Together, the two sides make dichotomy a powerful tool of investigation and neural knowledge generation. 
Wisdom lies somewhere in the middle.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">When I was in high school, I would drive over to The College of the Redwoods to read a magazine simply called “Electronics”. You could only subscribe if you were a documented engineer, which, at 17 years old, I wasn’t. So I had to sit in the library to read this particular magazine. Not only did it have excellent content, but the ads were also from companies constantly bragging about their component specs, some of which were quite detailed. And very interesting, at least to me.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I remember at the time laughing at myself for studying the ads as much as I did the articles. In most publications, there is an editorial line between news and noise. This is also known as the line between content and spam in the current vernacular. It may be obvious, but this line gets adjusted as we learn more about the topic. Something similar happens with neurons. I ended up reading this magazine for decades. The more I learned about electronics, the more I ignored the ads. But the threshold of what I paid attention to was dynamic, and remains so. This is why advertising works. Well, when it does work, when you let it past your bullshit filter. The point is, every now and then ads can be content. Mostly the world is not black or white, true or false, right or wrong. We mine the areas between dichotomies for meaning all the time. This magazine allowed me to better appreciate the “Goldilocks” zone and challenge “absolutes”. I still do. 
But there are exceptions to the exceptions.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I will here assert that significance, unlike most things in life, does not occur by degrees. It either is, or it isn’t, much like a firing neuron. It’s how we think of logic values - true or false, right or wrong, good or bad. That last sentence took you from objective logic to subjective social judgment in only three steps, yet it seems reasonable. As Bob Dylan would say, “If something’s not right, it’s wrong.” </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">But is Bob actually correct? Your left-brain might agree, but hopefully, your right-mind would withhold judgment. Our left-brain prefers to deal with the endpoints of dichotomy; our right-mind, the middle ground. It’s decursively similar to what a neuron does when it fires.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Our left-brain likes to nail things down, define them using fixed words. It’s how we grasp things in the world, ideas included. It’s the same with knowledge, including its more limited higher form - information. 
Information is closed like a parabola. In contrast, knowledge is more like a hyperbola. Our right-mind likes to keep its options open in order to mine that middle ground. So it is with that realm between the neuron’s dendrite and hillock in the nano context. That’s where priming is managed until firing nails down a particularly significant thing.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So is knowledge simply finding something which might be significant for survival at that moment for that particular creature? Or that particular neuron? At that particular moment?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Wait before you answer. There’s more. “Significance” depends not only on context but also on perspective and the temporal aspects of synchronicity. This is where we need the answer about "when". The flash of a flame might be quite significant unless it’s accompanied by the visual image of someone lighting a cigarette. Then we ignore it. 
What’s objectively insignificant to me at this moment might become VERY significant to someone else in another moment depending upon context and experience.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowing? Significance? Objective or subjective? In this moment, but not the next? Are we headed into another logic trap? Another paradox? Not if you’ve studied Zen. I had already worked out how an analogical neuron could yield a type of “AND” and “NOT” function, or at least a fluffy relationship for each. I had even characterized the analogical version of the “master OR” configuration, also known as a CASE statement in some computer languages. In any case (pun intended), this work informed my understanding of significance and led me back to my inverted assertion. So to speak. Or something like that.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This understanding forced me to reevaluate my process of turning the question into the premise. It was a way of freeing me from the assumption that I knew what the Jennifer Aniston neuron knew. It no longer mattered to me what exactly that particular neuron knew at that moment. The nature of its knowledge was subjectively a matter of its (that neuron's) sovereignty, not mine. It was no longer a question of how a particular neuron came to know someone that I had defined as Jennifer Aniston. It became a question of what exactly that neuron knew at that moment. And yes, the words "Jennifer Aniston" are a useful REpresentation, but not the actual knowledge. Only the neuron could know what that was. 
So the question I should have originally been asking was, what does each neuron know when it fires? My "flipping the question into the premise trick" was merely a bridge to a new perspective. (This paragraph was added years later. I hope you find it useful.)</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’ve long thought of the line between yin and yang as a middle ground without dimension, without range. When is it the same for content and noise? Significant and insignificant? What I mean by this is, sometimes even spam is content. That line is the essence of discrimination. Which happens to be what a neuron does. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Significance is why a neuron fires. Discrimination is how it happens. But it's not just THAT it fires, it also fires WHEN it's most useful. Knowledge can best be described as a significant relationship between things that help survival and replication. The things are material. 
The relationship between them is ethereal, so knowledge is ethereal, at least until it is RE-presented in some material medium.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now that we understand that a neuron fires when it finds some significant relationship in its world, it’s time to consider another single-word answer - "knowledge" in more detail.</span></p><br /><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">From Where Does Knowledge Spring?</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Before we address the above question, let your mind wander just a bit more…</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What if creating knowledge is literally the nature of neurons? Or more primally, simply the nature of matter itself? Above I described the cell membrane as an important trick of evolution that allows for the ionic nature of the neuron and their magic ability to create knowledge. So do cell membranes create knowledge? They certainly have some osmotic aspects that are ultimately key to the process. But does this tenuous membrane qualify as a source of knowledge? 
I can honestly say I don’t know, but I’ve left this paragraph in to leave the question open and, more importantly, to go a step beyond and anchor one end of knowledge generation as an axiom, a first principle. So for now, cell membranes do not know what to let in and what to keep out. But they might. Now I’ll return to that same important question.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What if creating knowledge is literally the nature of neurons? (Or even their cell membranes?) </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What if knowledge is simply what neurons produce? How might this idea affect my casual brain model? It may seem like I’m just playing word games, but think about it carefully. For me it took weeks, but the ultimate epiphany was quick and dramatic. This simple idea changed everything in how I thought about the brain. And the mind. Indeed, a useful brain model can be logically (or more likely, analogically) derived from the simple assertion that neurons create knowledge. This is true even if you don’t know the actual nature of knowledge, or even how it is created. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The answer does not require decursive analysis, nor even logic, though both can be very helpful. The easy answer simply reflects the nature of a neuron, decursively (possibly from the nature of cell membranes). 
The point is, I’m more certain that most knowledge starts with the neuron, at least in the way I’m about to redefine knowledge. It’s neurons that ultimately create meaning in dance, language, memes, and a thousand other forms of expression. It all flows from knowledge, because…</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Neurons are the genesis of most knowledge.</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Once I had that assertion, the idea wouldn’t leave me alone. I looked for ways to invalidate it. Please let me know if you are able to accomplish that objective. I even thought about coining a new word for this type of primal organic knowledge as opposed to the more macro and abstract type we generally deal with day to day, but decided the word knowledge (and to know a thing) was actually the most accurate description I could find. 
Let me know if you come up with a better word than knowledge.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I played with the idea for months, but it just became more and more convincing. For me, it now seems impossible that neurons do NOT know something at the instant that they fire. So back to the questions that come to mind from this little exercise - what does each neuron know? And how exactly does it come by this knowledge?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The answers to these questions allowed for a dramatic conceptual simplification. A Zen moment. For now, I won’t bother returning to Socratic debate which is steeped in left-brain logic and language. Instead, to share my experience of that moment, I’ll return to that process of elimination and flip the premise:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If not from a neuron, from where does knowledge spring? From that apple in Eden? 
Carl Sagan almost reached that conclusion (or should I say genesis?) in “The Dragons of Eden.” Or does knowledge spring from the forehead of Zeus? Or his prefrontal cortex? That’s a useful hint. Even our right-minded myths point us in the more probable direction. If you haven’t already, take some time with this question. Treat it as a Zen koan. I did for years.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Creating knowledge is not limited to just that one Jennifer Aniston neuron, or the many other examples from split-brain surgery. Regardless of how you understand the poor historical definition of knowledge above, I will assert that ALL neurons know something at the moment they fire. Knowledge is the variable in multiple ways. It's not only the "limited" algebraic variable with its rigid structure, type, or values. It's something far more general, literally including everything anyone (or any animal) has ever known. This more generalized "variable" of knowledge becomes a matter of what exactly each neuron knows, and how this knowledge is acquired and applied.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As an answer and example of where knowledge comes from, let’s deconstruct the “World Book”, my favorite resource as a child. Where did all that knowledge come from? People, of course. Its knowledge came from writers and editors. And they got it from their teachers, who got their knowledge indirectly from those who originally wrote it down, either from direct experience of investigation or from actual witnesses. 
And where did they get this knowledge? Let’s use the direct experience case since it’s more to the point. That’s right. They lived it, then expressed it in movement taking the form of speaking or writing, both controlled by neurons. Ultimately, most knowledge starts with the firing of a single neuron, which brings us back to searching for a more useful definition of knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Slipping back into the world of logic and definitions for a moment, there are two ways of defining a word (which can represent a bit of knowledge). Words can be formally defined in terms of other words. This is the essence of association, but not identity. If the meaning behind one word were EXACTLY the same as another, why would both things need separate words to represent them? Even when some are quite similar, each word represents something unique. As you can clearly see, word-based definitions have a problem. They lead to a circular paradox - words defined by words, defined by the same words. If you want proof of this assertion, try satisfying a young child’s series of questions all beginning with “why”. Your words will ultimately be futile, likely exhausting your patience before your vocabulary. From the mind of a child, wisdom.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Fortunately, there’s another way to define things. It’s simply applying direct experience, such as feeling a hot stove. 
I will here suggest that defining words in terms of other words is the more limiting alternative in a left-brained Bizarro and indirect fashion. Right-minded intuition is far richer in meaning than any collection of words, including these. These paragraphs are only a sparse approximation, especially when it comes to defining something as important as the word knowledge. And defining the experience of knowledge? Well, you have to live it.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">That’s why it’s important to not only think about but also to anthropomorphically and subjectively envision a neuron collecting many bits of sensation or other knowledge to create that one thing that IT knows. The best way to really know something is to experience it. Otherwise, we’re just taking someone else’s “word” for it, (including all the logical issues presented by Plato and others). </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So, what’s it like to experience the creation of knowledge? It’s exactly as you might imagine - insightful. Sometimes. But most of the time we’re not even aware of the vast nature of this knowledge, let alone the experience. This is where introspection becomes a critical tool for understanding introspection’s limits. The creation of some knowledge can be brought in from the shadows, but most knowledge will always remain a mystery. Perhaps it’s time to reframe that definition of knowledge. 
Yes, I believe that time has ripened the word-fruit from our tree of… knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now for a new definition of the experience of knowledge, ironically, using words.</span></p><div><span><br /></span></div><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Redefining Knowledge</span></h3><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">"I know it when I see it." - Supreme Court Justice Potter Stewart</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Justice Stewart was referring to obscenity when he reached the above conclusion, and the movie in question did not qualify. At least according to him. Which was the point. We may not always be able to define a rule for some quality, but we still know it when we see it. So what is this thing we see? What is this thing we call knowledge? How do we define it? 
Or come to know it when we see it?</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Once we back into this working premise that neurons create something called knowledge, we need to define knowledge more clearly. This is especially important to those who are technically minded as they are more likely to apply their left-brain to the issue even if the answer lies in the realm of the right-mind. So, what is the nature of this new more generalized type of neuronal knowledge? Is there a way to define knowledge as a practical experience? I’ll here assert that there is, and that way is Zen simple. 
The definition again flows from</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> turning the question into the premise as noted above:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Knowledge is that which is created when a neuron fires,</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">…but not exclusively so.</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p> <br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you’re not laughing, don’t bounce just yet. Before you decide I’ve gone full-looney, relax, and actually entertain the idea. You’ve come this far. You might as well suspend disbelief a bit longer. Yes, I realize this seems like a case of circular reasoning or a paradox, but it’s not. As Iain McGilchrist noted, it's more like "Drawing Hands", by M.C. Escher. 
</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">The reason this definition is enigmatic is that there's more than one way to understand knowledge, and one way need not preclude another. Also, for a definition, it’s not very definitive, but the exceptions are relatively rare. It’s mostly a primal definition, a firstish principle. An epistemological axiom, with an exception. Or perhaps a few more. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Thinking about the issue leads us back to presenting a Zen koan - which of course, defies logic. But there it is. Play with it a bit. The simple idea that neurons create knowledge solves so many problems in modeling the brain that the answers we seek must at the very least lie somewhere in that direction. Neurons creating knowledge explains a great deal about language, art, science, and of course, philosophy. These are all expressions of human behavior. For me this idea was startling and I laughed about it for days before I began to take it seriously. Now I can’t see the world or neurons in any other way. But you are likely new to the idea, so let’s play with it a bit more. 
It takes time to adjust.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I could have asserted that neurons simply create knowledge, without defining knowledge, but that would beg more questions. This definition begs fewer and provides better answers. For now, it's more useful to have this definition than not. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For instance, what does it mean to create a thing? If the idea of “creation” challenges you, there’s an easier way to think about the topic. Think of neurons as complex active filters converting experience into knowledge, capturing an essential bit of the nature of the world as such a process implies, but not always in a consistent or determinant fashion. At this point, I must be clear. Neurons don't simply convert experience into knowledge. Neurons literally CREATE knowledge. Experience is simply the raw material neurons use in this creative process. I’ll describe how later on.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Though easier to think about, subjective experience conversion is not quite as mystical as creation, but it still leaves room for the magic to happen. The magic that happens between the dendrite and the hillock of the neuron is a reflection of the magic we experience in the world. 
The world is the source of experience and, even more importantly, a big part of a feedback loop through various neural pathways. Neurons are constantly working to perfect knowledge as a dancer might do with movement. Or something like that.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Once you grasp it, the implications of this perspective are profound. All knowledge takes on new meaning when redefined in this way. Let’s explore a few of these shifts in thinking. For instance, I’ve noted that neurons fire when they detect something significant in the world, at the very least significant relative to that neuron, and likely significant for the whole organism. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’ll go one step deeper for those seeking a handhold in mathematics. When any neuron fires, it divides that particular aspect of the world (informed by which type of neural sensor initially fires) into two mutually exclusive sets - almost enough and more than required. The edge between these two domains is where the magic happens, where knowledge is created. It’s that “S” line in the symbol for yin and yang. It’s the very edge of indecision. Or decision. 
It’s that creative spark (metaphorically ionic, not electronic).</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now overlay these two sets at angles with the sets created by another associated neuron sensing some different aspect of that same event in the same context - let’s say just enough heat and light for these two types of neurons. Where the yin and yang of these two edges cross defines a point of no dimension, yet with great significance for a creature just trying to stay alive and survive an actual fire in this particular example. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now add in a third neuron sensing yet another aspect of that context, perhaps the complex smell of smoke. Now imagine each of these three neurons, with their respective paired sets, backed by hundreds of other more primal neurons supporting these three points of knowledge, yielding this abstraction we call fire. This should give you just a glimpse of how specific (and abstract) any given convergent neural net can become. Going deeper at this point will challenge most readers, and this hint is enough for those who wish to chase the model using geometry and set theory. 
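</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For those who want that handhold made concrete, here is a minimal sketch in Python of the idea as I’ve described it. The neuron names, thresholds, and units are invented for illustration, not measured biology: each “neuron” is just a threshold splitting its slice of the world into two sets, and a convergent neuron fires only where all three edges cross.</span></p>

```python
# A sketch, not a simulation: each sensor "neuron" is a threshold that
# divides its aspect of the world into two mutually exclusive sets --
# "almost enough" below the edge, "more than required" above it.

def make_sensor(threshold):
    """Return a sensor that fires when its input crosses the threshold."""
    def sensor(value):
        return value >= threshold  # the edge between the two sets
    return sensor

# Thresholds and units are hypothetical.
heat = make_sensor(50.0)    # degrees C
light = make_sensor(800.0)  # lux
smoke = make_sensor(0.3)    # particulate density

def fire_neuron(temp, lux, particles):
    """Convergent neuron: the abstraction 'fire' is the region where
    all three edges have been crossed at once."""
    return heat(temp) and light(lux) and smoke(particles)

print(fire_neuron(65.0, 900.0, 0.5))  # all three edges crossed -> True
print(fire_neuron(65.0, 900.0, 0.1))  # heat and light, but no smoke -> False
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Each conjunction here is the crossing of two yin-yang edges; the conjunction of all three is the dimensionless point of great significance described above.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">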
Or more accurately, those lines and points between such sets.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Yes, I’ve left out a few hundred pages of very technical detail and debate, but most of that is about the “how” of creating knowledge as noted above. Once I realized that it was possible to understand the “what” of neuronal knowledge creation without knowing specifically “how”, I set out to describe knowledge generation in the simplest form possible - philosophy. Using these words.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For you technologists, think of a computer language without an input/output library or any kind of I/O function. The operation of any program in that context must by definition exist only within its mathematically complete world. I actually defined such a tiny language, and a friend wrote a compiler for it so we could play with the ideas as we debated the essence of entropy. 
And extropy.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Since I’m trying to present this concept in its simplest form, I’d like to first present what it allows for in modeling the brain before we get into the “how” of the neuron. I’m just going to leave these ideas here for now and express my experience as I applied them using decursive examples. Hopefully, by then it will make as much sense to you as it does to me.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If the idea that neurons create knowledge doesn’t make any sense at all, I suggest you simply stop reading at this point and imagine that it’s true. What are the consequences? Test them as you will. Your imagination will be different from mine, but all of that will likely be helpful. Think of ways to prove me wrong. I would love to entertain your debate. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now with your version in mind, what problems does it solve with respect to understanding human behavior? Take all the time you want. Hours, weeks, or years. 
I’ve had that luxury. The rest of what I present is based on this simple assertion, but for me, pieces started to fall into place right away. And yes, I’ve been down the “how” rabbit hole enough times to realize there are many answers to that question, and most are ultimately far simpler than the concept itself and its consequences. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At first, it was just a trickle, but over the next few weeks, it turned into a torrent of resolved paradoxes. I’m tempted to leave “how neurons come to know” as a student exercise. At least for now. Other questions are actually more important, and their answers more enlightening. This new definition of knowledge allows for a wonderful flexibility in modeling the brain, as I'll present in due course.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I realize your analysis of this quirky redefinition is likely to be quite different from mine. My perspective is a technical one filtered by intuition, art, and Zen. Because of my technical background, I’ve had to struggle to accept my own conclusions on each of these issues. 
Some of them may ultimately turn out to be completely wrong, but this perspective has been so useful over the last few years that I remain compelled to document my conclusions, so I will proceed.</span></p><div><span><br /></span></div><br /><h2 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><span style="font-family: Arial; font-size: 24pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Nature of Knowledge</span></h2><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">"Somewhere, something incredible is waiting to be known." - Carl Sagan.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Might that “something” be the nature of knowledge itself? </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Using this new definition of neuronal knowledge, how might we describe knowledge in general? Returning for a moment to the discussion of significance, what is it that any given neuron finds significant in the world? This question of course has a myriad of answers, but a generalization might be something that a given neuron has detected before that might help it survive, plus or minus. 
I note the variance because neurons seem to be constantly adjusting their sensitivity to specific prior conditions, causing the result to range from one end of a spectrum to another in multiple aspects. This is quite different from another source of biological knowledge which I should quickly address.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This newly defined neuronal knowledge should by now sound similar to an immune response where a mature T cell recognizes a specific prior infectious agent, but without needed adjustments. Gerald Edelman developed what he called the Theory of Neuronal Group Selection (TNGS) largely based on a similar idea. The theory hasn’t gotten much traction, though many of his ideas have been quite useful for me. Can an immune cell be said to know something about the infection it guards against? How might this type of “memory” be different from the memory we ascribe to the brain? I found it a useful hint. Much more on memory later.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What if we think of the entire world as a source of challenges, some from bacteria, some from viruses, but mostly various environmental conditions we can detect using smell, light, touch, temperature, and all the other neuronal sensors the body possesses? 
When one of these sensor neurons fires because it recognizes a specific condition or profile of conditions outside the body, might it then deliver to any subscribing neuron a chemical signal representing that unique profile or condition? Further, might this second interneuron sample this fairly standard chemical message in a way that it can best hone its sensitivity to what IT wants to know, in contrast to how it’s being informed? This would allow the sensor neuron to know one thing about the world at that moment, and the next neuron in the path to know something a bit different depending upon what other signals arrive within a temporal window of synchronicity. This “what if” is only one possibility, though a likely one. It also goes to the genesis of consent.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Before I get carried away with the details of chemical communication, let me try to generalize a bit about the impact of this new definition of knowledge. Let’s just summarize by saying neurons (sensor and otherwise) evolve a sensitivity to specific conditions in the world, and knowledge is a reflection of this sensitivity. I believe it’s fairly easy to see how this significant detection might reflect the relationship between that thing and the neuron and be quite useful for survival. THAT is the nature of knowledge - the relationships between things out there and ourselves as neurons, anthropomorphically speaking. 
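</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Here is one way to sketch that “temporal window of synchronicity” in Python. The window length and signal counts are purely hypothetical illustrations: a subscribing neuron fires only when enough incoming signals land close enough together in time.</span></p>

```python
# Hypothetical sketch of a temporal window of synchronicity: a subscribing
# neuron fires only if `required` signals arrive within `window` seconds
# of one another. All numbers are invented for illustration.

WINDOW = 0.05  # seconds

def fires(arrival_times, required=2, window=WINDOW):
    """Slide over the sorted arrival times, looking for `required`
    signals packed inside a single window."""
    times = sorted(arrival_times)
    for i in range(len(times) - required + 1):
        if times[i + required - 1] - times[i] <= window:
            return True
    return False

print(fires([0.10, 0.12]))  # 20 ms apart -> True
print(fires([0.10, 0.40]))  # 300 ms apart -> False
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In this toy model, what a given neuron “knows” depends not only on which signals arrive, but on when they arrive relative to one another.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">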
Neurons create knowledge, and so do people from their relationships with other people and other things.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Here are some of the more obvious consequences of redefining knowledge. This may seem like a backward approach, but I’m going to characterize knowledge as a collection of dichotomies, or more accurately, the continuums between dichotomies. This approach to describing knowledge is based on what I’ve learned at the nano level. Using decursion, we can tap into our cultural vocabulary, which gets us closer than any technical or mathematical approach. </span></p><br /><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; white-space: pre-wrap;">What Exactly Does a Neuron Know? </span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So if neurons really do create knowledge, what is the nature of this knowledge? I’m sure there are many other important aspects, but these ten have moved to the top of my list over the last few years. Others come and go. These aspects of knowledge are not fixed or definitive in any way. Instead, these aspects are best represented as spectrums anchored by dichotomy. 
Knowledge ranges:</span></p><br /><table dir="ltr" style="background-color: white; border-collapse: collapse; color: #222222; font-family: Arial; font-size: 14pt; line-height: 1.38;"><tbody><tr><th style="padding-right: 48pt; text-align: left;">From</th><th style="text-align: left;">To</th></tr><tr><td style="padding-right: 48pt;">Subjective</td><td>Objective</td></tr><tr><td style="padding-right: 48pt;">Ephemeral</td><td>Persistent</td></tr><tr><td style="padding-right: 48pt;">Novel</td><td>Expected</td></tr><tr><td style="padding-right: 48pt;">Fallible</td><td>Reliable</td></tr><tr><td style="padding-right: 48pt;">Capricious</td><td>Useful</td></tr><tr><td style="padding-right: 48pt;">Abstract</td><td>Concrete</td></tr><tr><td style="padding-right: 48pt;">Emergent</td><td>Reducible</td></tr><tr><td style="padding-right: 48pt;">Ethereal</td><td>Real</td></tr><tr><td style="padding-right: 48pt;">Signal</td><td>State</td></tr><tr><td style="padding-right: 48pt;">Relationship</td><td>Function</td></tr></tbody></table><div><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-weight: normal;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; 
These spectral">
white-space: pre-wrap;">These spectral associations obviously don’t replace Plato’s three words as a definition. We did that when a neuron fired. This description is not definitive; it’s just a more general way of understanding knowledge. These are not merely words to anchor meaning. They are the limits of spectrums.</span></span></h4><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-weight: normal;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></span></h4><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-weight: normal;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Think of knowledge as love. Does one word define it? The Greeks used 16 words in an attempt to corral a definition for love, but that just broke the definition into types, begging even more subtypes in a left-brained fashion. Love remains enigmatic. So is knowledge. Love is literally a type of knowledge. Also, these words are only those that have captured my attention so far. They are not definitive. They are just the opening of a door. 
Let’s step through.</span></span></h4><div><span><br /></span></div><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></h4><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Knowledge is Subjective, Striving to Become Objective</span></h4><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This first spectrum begins at a sensor neuron, the first in a series forming a neural pathway. Even if we knew why it fired when it did, could we quantify that knowledge? And if we could, this quantification might change in the next moment. Such knowledge is subjective to that neuron, at that moment, and what that knowledge means can change almost as quickly as it’s detected. Even though knowledge becomes more objective over time and as we move along any given neural path, it also becomes more dependent upon what each preceding neuron knows. Even operational objectivity is never truly achieved within a single skull. Knowledge is not “encoded” in any given neuron, but neurons do become sensitized to a specific bit of knowledge as they decode the world. Knowledge from these neurons converges on a relative consensus, with each step becoming almost objective near the muscle, where the process continues in a macro context between individuals, but never truly reaching its objective as a limit (circular reference intended). Knowledge is subjective in that its quality is relative to that neuron at that moment, and it changes from neuron to neuron, and moment to moment. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Much of this knowledge competes and cooperates with knowledge from those neurons around it. What this means is that what any neuron knows may be quite different from what a similarly informed adjacent neuron comes to know, as each reaches its own conclusion.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As knowledge converges into more specific cues, it’s better described by the limit at the other end of its continuum. Subjective knowledge becomes more objective just before it drives a script of muscle movement, but not exclusively so. 
We’ll explore why shortly.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Decursively in the macro context, what individual people come to know is also on this spectrum. You and I may have similar backgrounds and read the same paragraph yet come to know very different things about what that paragraph says. We may even reach opposite conclusions as to its meaning. The result is similar with neurons. What they come to know is sort of an analog function, better described as a mathematical relationship of how each of their triggers is primed and/or fired. So it is with people. Even in a given neuron (or person) consistency ranges from nearly random to almost deterministic, but not exclusively so. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I realize this sounds pretty fluffy, and it’s true only by degrees, but much neuronal knowledge is fairly consistent. Well, at least as consistent as human behavior, and for similar reasons. Neuronal knowledge also becomes more consistent as patterns evolve in life, and also with each step along a neural pathway yielding increased abstraction. But it never reaches its limit.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is a signal meant to cue or prime movement. 
It depends upon how that knowledge is literally applied in ratios of activation and inhibition by other neurons, and how it’s refined as it gets schematically closer to a muscle. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Or something like that.</span></p><br /><br /><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is Ephemeral, Striving to Be Persistent</span></h4><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In a fashion similar to the subjective-objective spectrum, primal neuronal knowledge is individually ephemeral, but there are tricks to make its sensitivity more persistent over time. 
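</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">One such trick can be caricatured in a few lines of code. This is only a toy sketch with invented numbers and names, not the chemistry described later: the spike itself is ephemeral (computed and immediately discarded), firing depends on the balance of activation and inhibition noted in the previous section, and each firing leaves a small persistent trace in the connection weights.</span></p>

```python
# Toy caricature of a neuron, with invented numbers; not a biophysical model.
class ToyNeuron:
    def __init__(self, weights, threshold=1.0, trace_rate=0.05):
        self.weights = list(weights)   # positive = excitatory, negative = inhibitory
        self.threshold = threshold
        self.trace_rate = trace_rate   # how strongly each firing persists

    def step(self, inputs):
        """One moment in time: the spike is ephemeral (returned, never stored),
        but firing leaves a lasting trace in the weights."""
        drive = sum(w * x for w, x in zip(self.weights, inputs))
        fired = drive >= self.threshold
        if fired:
            # Hebbian-flavored persistence: strengthen the active connections.
            self.weights = [w + self.trace_rate * x
                            for w, x in zip(self.weights, inputs)]
        return fired

n = ToyNeuron(weights=[0.6, 0.6, -0.4])   # two excitatory inputs, one inhibitory
print(n.step([1, 1, 1]))   # 0.6 + 0.6 - 0.4 = 0.8 < 1.0, so no spike: False
print(n.step([1, 1, 0]))   # 0.6 + 0.6 = 1.2 >= 1.0, so it fires: True
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The point of the sketch is the asymmetry: each spike vanishes as soon as it is computed, while the sensitivity it leaves behind accumulates.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">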
Examples to follow.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge only exists for about a thousandth of a second and then is lost, meaning that that specific bit of knowledge can’t be detected again - ever. Similar knowledge can be detected by that same neuron in the next moment, but because of its dynamic nature, may sometimes be quite different. This makes it difficult to simulate the world, but not impossible. Evolution has come up with some very elegant tricks using chemistry which I’ll describe in due course.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This ephemeral nature of knowledge is also why so many psychological tests fail repeatability and therefore are eliminated from being part of science. Science requires objective consistency in the macro context. 
So do people in order to manage their lives in the world effectively.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is a signal striving to become a state.</span></p><div><span><br /></span></div><div><span><br /></span></div><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;">Knowledge is Novel, Becoming Expected</span></h4><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;">This aspect of knowledge used to be "Random to Predictable", but I thought "Novel to Expected" might be a better description, so I left the original version included below just in case. 
Such is the dynamic, or maybe fluid, nature of knowledge.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;">At birth everything is novel and lots of neurons are firing, but the ones that deliver more utility from what they move over time tend to be the ones not pruned. Yet novelty holds great utility all through life, as demonstrated by the operation of our right mind. Note how the nano finds decursive form in the macro of the divided brain. Perhaps I should switch this back to capture some irony? Or is such ambivalence just my mood this morning?</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;">Whichever version you prefer, as each neuron accumulates more and stronger connections, what that neuron comes to "know" tends to settle in, and thus so does its knowledge. 
Knowledge is fluffy, becoming more useful, as noted in the next (now second) section below.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;"><br /></span></p><br /><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">(Knowledge is Random, Striving to be Predictable</span></h4><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">OK, random is actually the anchoring limit, not exactly the nature of knowledge. But it sometimes starts out that way. Think of the primal limit of knowledge as random initial conditions for a system designed to be self-tuning. I doubt there’s anything truly random about the firing of a neuron, but it can certainly seem like that on occasion.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Since the knowledge created by any neuron is dependent upon all the neurons that inform it, consistency is quite useful, but not required. 
If a given neuron’s knowledge is not of much value to the next neuron, the quality of that connection will atrophy over time.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Fortunately, the sensitivity for any given knowledge strengthens with each firing of that neuron and each neuron in the path. And since predictability holds great utility for survival, repeatability is the objective, and the actual result in many cases. B</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">ut only by degrees</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So, unfortunately, you can’t always count on it.)</span></p><br /><br /><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is Fallible, Striving to be Reliable</span></h4><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">"90 percent of everything is crap" - Theodore Sturgeon</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This third consequence of neuronal knowledge is perhaps the most challenging - that knowledge is not necessarily true in any logical or reliable sense. 
Indeed, “fallible” may be an understatement when assessing the quality of primal neuronal knowledge. Such knowledge may not even be close to the truth, though it tries to be. At its best, the quality of knowledge asymptotically approaches the truth but typically struggles to beat the flip of a coin, and in many cases for new experiences, knowledge may have less than a five percent chance of being accurate. Sometimes, even far less. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">When this idea is applied to individual humans decursively, the result is what we observe in politics - wide disagreement, with conviction. What you may know about any topic (such as modeling the brain) is likely quite different from what I know. We usually sort it out by seeking the answers using cooperation and competition to achieve consensus, just as neurons do. One of the main differences with humans is, once we express it in some more permanent form it becomes information (see information’s contrast with knowledge below). Neuronal knowledge does not have that luxury. It must remain flexible and adaptable, but by degrees.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This consequence may be difficult to accept, but not if you consider the impact of false positives compared to false negatives for each case. Often there are tradeoffs, even when the probabilities of each are in the single digits. The key to quality is the ultimate Darwinian consequence. 
This is only one way evolution has learned to evolve - by applying knowledge disproportionately, accurate or not.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The point is, what you know may be wrong. Keep an open mind.</span></p><br /><br /><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is Capricious, Striving to Be Useful</span></h4><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Which brings us to utility. Knowledge need only be right one time out of a thousand, if that thousandth time is the critical application that allows for survival. Knowledge is the reason that one out of 20 businesses succeeds. Does that invalidate the 19 that fail? That 20th may make up for all the rest, and that’s enough to be useful. Business success is a decursive example of such knowledge generation and application.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Even long shots have some utility for that neuron at that moment, so they are justified even if they can’t be defended logically. That’s where belief comes in, or if you prefer, faith. 
Belief is an even higher-order aspirational illusion that is closely related to truth, or at least its approximation. Such diametrically opposed knowledge may be less useful, though surprisingly not actually false. At least from a neuron’s perspective. All religions can’t be true at once, yet faith provides value for each, subjectively. And one (or more) may ultimately be the truth.</span></p><br /><br /><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is Abstract, Striving to be Concrete</span></h4><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge starts out as a kind of sparse abstraction of the world. 
Similar to the way a pinscreen captures a reflection of our face, the mind is a reflection of the brain's effort to capture knowledge of the world.</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> After only a few jumps from neuron to neuron this knowledge quickly becomes even more abstract which then informs a concrete response to the world to once again be evaluated.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you've never played with a pinscreen, it's a sculptor's tool. It's a useful plaything and great for capturing compound curves in a 3D space. A pinscreen is a collection of pins or nails set loosely in a</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> matrix of holes in a square piece of thick plywood. Each of these pins can move individually. When you press the pins from the block of wood up against your face it captures a three-dimensional image of who you are in inverse form, sparsely decoded. (The other end of the pins contains your positive image.) 
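</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The pinscreen’s inverse capture is easy to see in miniature. Here is a toy sketch with a made-up 4x4 grid of depth values standing in for a real surface: pressing the surface into the pins produces a negative on one side, and reading the far ends of the same pins recovers the positive.</span></p>

```python
# Toy pinscreen with invented numbers: each pin slides to match the depth
# of the pressed surface; one side holds the negative, the other the positive.
PIN_LENGTH = 10

face = [            # depth of the pressed "face" at each pin position
    [1, 2, 2, 1],
    [2, 4, 4, 2],
    [2, 4, 4, 2],
    [1, 2, 2, 1],
]

# Side pressed against the face: an inverse (negative) image.
negative = [[PIN_LENGTH - d for d in row] for row in face]

# The far ends of the same pins: the positive image.
positive = [[PIN_LENGTH - d for d in row] for row in negative]

print(negative[1][1])    # 6: the deepest feature pushes its pin furthest back
print(positive == face)  # True: the far side reproduces the original relief
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The capture is sparse in the same sense described above: sixteen pins can only approximate the surface, and a finer grid would capture more of its relationships.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">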
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Our brain does something similar when capturing images and other sensory input. It converts reality into an inverse signal simulation which we experience as the mind. The ethereal mind becomes pure abstraction or a type of negative reflection of our world which we manage internally, struggling to find concrete form that can be usefully applied. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Just as a neuron converts its knowledge back into physical form as a packet of chemistry at the end of the axon, our muscles convert our ethereal mind back into reality as a physical expression to affect the world, decursively. Neurons sense the physical world and convert it to ethereal knowledge, then back to material chemistry. Sensing is concrete. The abstraction is pure ethereal knowledge. 
More on the pinscreen later.</span></p><br /><br /><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is Emergent, Striving to become Reducible</span></h4><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The emergent aspect of neuronal knowledge is a little more difficult to understand, but fortunately, there are many excellent examples. An emergent property arises when many somewhat similar things come together to produce a result dramatically different from each of those individual things. Digital music is a useful metaphor. It’s created from thousands of electronic states delivered in a specific order to an electromagnet that modulates sound waves, ultimately yielding something transformational; in this case, a beautiful tune. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The brain does something similar in the visual cortex. It’s described in layers as V1, V2, V3... etc. These layers converge millions of pixels of knowledge to yield the emergent result of an ephemeral image. It takes surprisingly few layers to manage the abstraction we call a face. Perhaps Jennifer Aniston’s face. 
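</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">That layered convergence can be caricatured in code. This is a toy sketch with invented numbers, standing in for nothing anatomical: each "layer" simply pools neighboring values, so many pixel-level signals collapse, step by step, toward a single higher-level cue.</span></p>

```python
# Toy layered convergence: each layer averages neighboring values,
# shrinking many low-level signals into fewer, more abstract ones.
def pool(layer, size=2):
    """Average non-overlapping windows: one crude step of convergence."""
    return [sum(layer[i:i + size]) / size
            for i in range(0, len(layer), size)]

v1 = [0.0, 0.2, 0.9, 1.0, 1.0, 0.8, 0.1, 0.0]   # "pixel"-level signals
v2 = pool(v1)    # four mid-level values
v3 = pool(v2)    # two higher-level values
cue = pool(v3)   # a single emergent value

print(len(v1), len(v2), len(v3), len(cue))   # 8 4 2 1
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">None of the individual inputs resembles the final cue, yet the cue is fully determined by them, which is the sense of emergence borrowed here from digital music.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">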
Such a result can be described as emergent, similar to a song.</span></p><br /><br /><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is Ethereal, Only Modeling the Real</span></h4><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This final aspect is the continuum that sums up the difference between the brain and the mind, the dead and the living, the method and the magic. The brain is material. The mind is ethereal. Atoms and molecules are material. Knowledge is how they are arranged, reflecting the relationship between things. This arrangement is ethereal and dynamic.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">To clarify, when a neuron fires it creates ethereal knowledge, meaning the knowledge has no material substance but can affect material substance, as the axon uses ionic charge to move this knowledge along its length, then expresses it as a puff of chemistry at the synapses. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In a similar respect, knowledge is the ethereal song even though its representation is expressed by vibrating molecules of air in the material world. The brain is physical, made up of matter; the mind is an abstract illusion of the real, a thing of the ether, meaning literally from the other world, or more simply, the other. Thus ethe-real.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This does not mean that the mind is a ghost, but it may be. I wouldn’t draw that conclusion, but others might. From my perspective, something else may be going on - that our left-brain does not like to admit the ethereality of the right-mind. 
This is because of the isolation between the two sides of the brain, which is needed for our left-brain to get its work done.</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> I think of the divide in our brain (and our mind) as a necessary isolation as opposed to a mystery.</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> Ethereality is why we don't yet have a good model of the brain in spite of way too much information. This technical distraction is what makes consciousness a hard problem.</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> Our left-brain tends to deal with the physical things in the world. Our right-mind is more comfortable with the ethereal.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">In contrast with the left, our right-mind needs to know everything about the whole brain (and mind) to keep both of them safe, but mostly in the moment in case left-brain strategies fail, providing for a pervasive redundancy as our left-brain navigates time. </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">This divide allows our left-brain to remain grounded, and our right-minded mystic to soar, as may be needed for each to offer their individual illusions of the world. 
Now let's unify this experience in our skulls.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Relationships, Born of Reality, Probing the Ethereal</span></h4><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I want to be clear at this point. This project started out as a technical description of the neuron but has become a love letter to philosophy, exploring knowledge as the object of that love. I’ve had to leave the technical part in the closet for now in order to describe the ethereal aspects of knowledge creation. This model of the brain answers so many questions and validates so many aspects of other models, from Descartes to Freud to Skinner, but in various ways for each. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It's not a matter of this gnostic model being right or wrong. That's ultimately for science to settle. At this point, the question is, does this neognostic model provide a more useful way of thinking about the brain and human behavior? For me, the answer is obvious.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This gnostic way of thinking about neurons unifies the brain with the mind. Instead of the pineal gland, the spiritual magic happens between the dendrite and the hillock in each of billions of neurons competing and cooperating to create knowledge of the world needed to manage this illusion of reality - better described as ethereality since we can never be absolutely sure of the quality of the knowledge we acquire from our bodies. 
We only approach the limits of truth.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-weight: normal; white-space: pre-wrap;">For instance, what is the relationship between water and dirt?</span></h3><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">One form of this mixture we commonly describe as mud. It is one aspect of the relationship between dirt and water. The dirt, water, and mud are all real, but the relationship formed when we mix dirt and water, which we describe using the word mud, is ethereal, even though the pressure waves in our ears when we hear the word mud are also real. It's the relationship between dirt and water when mixed that is the knowledge, and that knowledge is a reflection of the world. 
The word "mud" is just how we represent this knowledge.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now extend this idea to the interactions of all the things you know of, including people. These interactions are known as relationships. The number of them is literally infinite. That is why neurons only create the knowledge about the most significant relationships. This is the same thing that happens when someone creates a map (which represents knowledge). The map can not contain ALL of the relationships about the ground in question. So the map maker only notes the ones which are most significant for the task at hand. For an army major, it's the rivers and hills. For a botanist, it's the flowers and the extent of the grasses. All of it is knowledge created by someone's neurons.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Again, language guides the way. 
The words theater and thesis flow from ethereal. The theater is a decursive version of our worldmap</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">culus, a type of virtual world or simulation created from the knowledge of billions of neurons in a semiotic fashion. Being so familiar with the theater, William Shakespeare intuitively understood the nature of this ethereality - "All the world's a stage". A thesis for the ethereal is what I here present.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Another way to look at these issues: people and things are material, but the relationships between these things have real aspects which are captured as ethereal knowledge. Here you may need to take a break and go for a walk again. I certainly did. This conclusion literally took miles to reach. What I'm writing, I do not write casually, and I won't be flippant. 
I am quite serious about these next few paragraphs.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Even without biology, I believe relationships exist, but they aren't RE-presented as signals and certainly not as things of the mind. Relationships do not have mass. They do not occupy space. But they do live on the boundary between real and ethereal. Until there were neurons, relationships existed but it took neurons to convert these relationships into knowledge represented as signals. Before I go too far with this idea, I will not exclude the possibility that signals representing knowledge might not be able to be created artificially or even possibly in some more alien form, or perhaps in some other more primal yet organic form. I try to keep an open mind, and I encourage you to do the same thing.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Here it's important to contrast knowledge with truth. 
The relationship between things is best described as truth, but reality is only known by degrees, so truth is only an aspiration of the mind. What each of us (and each neuron) knows about truth is merely a good-faith estimate.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For instance, the constant pi is the quantified ratio between the circumference of a circle and its diameter. Pi is not material. It does not have mass. Pi does not occupy space, but once objectively agreed upon, it can be REpresented in physical media as information about this particular ratio. It is a bit of knowledge describing that relationship. Knowledge about pi</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">asymptotically approaches the truth. 
Math is one type of knowledge reflecting the relationships between things in the real world, but math itself is ethereal.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Relationships are the surfaces between things and have many aspects which can be effectively managed. But knowledge about these relationships allows us to predict how things in our world may interact, and so we can optimize those interactions. Knowledge is not the relationship, but it can inform others about this relationship in an ethereal fashion. Knowledge is the RE-presentation of relationships between things.</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> Such concepts will be very important when we get around to exploring consciousness. 
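To make the pi example concrete, here is a small sketch (in Python, purely illustrative and not part of the original argument) of knowledge approaching, but never reaching, the truth of that ratio. The Leibniz series is standard mathematics; only the framing as "knowledge vs. truth" is the essay's.

```python
import math

def leibniz_pi(n_terms: int) -> float:
    """Approximate pi with the Leibniz series: 4 * (1 - 1/3 + 1/5 - 1/7 + ...)."""
    return 4.0 * sum((-1.0) ** k / (2 * k + 1) for k in range(n_terms))

# Each refinement brings our "knowledge" of pi closer to the truth,
# but no finite number of terms ever reaches it exactly.
for n in (10, 1_000, 100_000):
    approx = leibniz_pi(n)
    print(n, approx, abs(math.pi - approx))
```

The error shrinks with every term added, yet it never hits zero: a quantified version of knowledge as a good-faith estimate.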
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is a Signal, Striving to Become a State</span></h4><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br style="color: black; font-family: "Times New Roman"; font-size: medium; white-space: normal;" /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge is a signal, born of the material, but having an ethereal nature and striving to become a state as information.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br 
style="color: black; font-family: "Times New Roman"; font-size: medium; white-space: normal;" /></span></p><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;">Knowledge is a Relationship, Seeking Functionality</span></h4><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;"><br style="color: black; font-family: "Times New Roman"; font-size: medium; white-space-collapse: collapse;" /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;">If you explore the differences between a relationship and a function, this arc is a bit redundant, but I include it for the technically minded. 
It may help in your transition from thinking about information theory to the true nature of knowledge.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;"></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;"><br style="color: black; font-family: "Times New Roman"; font-size: medium; white-space-collapse: collapse;" /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><b>Dancing with the Consequences of Knowledge</b> </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Before we leave this challenging topic, I need to carefully present the second, and in some ways more critical, half of this process: the consequence of this prime assertion. This ethereal signal created by a neuron does not come into existence alone. It has a counterpart best described by the word dance. Yes, I mean literally the thing that Michael Jackson was so good at doing in the macro context. The neuron relies on muscles to perform even in the micro context, sometimes quite indirectly. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Without dance, there is no way for the neuron to affect the world and so no way for the world to once again affect the neuron. Without this loop made up of, w</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">orld - sense - decide - signal - converge - cue - script - movement - </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">w</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">orld - sense - decide - signal - converge - cue - script - movement - </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">w</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">orld - sense - decide - signal - converge - cue - script - movement - world...</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> there is no way for the neuron to hone its knowledge of this relationship between things. 
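The loop above - world, sense, decide, signal, converge, cue, script, movement, world - can be caricatured in a few lines of code. This is a toy sketch under invented numbers (the hidden "world relationship" and the learning step are assumptions for illustration, not biology):

```python
WORLD_RELATIONSHIP = 0.7  # the hidden aspect of the world to be known (invented)

def sense(world_value: float) -> float:
    """Sensing: the world impinges on the neuron (noise omitted for clarity)."""
    return world_value

def decide(knowledge: float, sensed: float) -> float:
    """Deciding: nudge current knowledge toward what was just sensed."""
    return knowledge + 0.5 * (sensed - knowledge)

knowledge = 0.0  # no knowledge yet
for _ in range(20):
    # Movement re-exposes the neuron to the world, closing the loop;
    # without that step there would be nothing new to sense.
    sensed = sense(WORLD_RELATIONSHIP)
    knowledge = decide(knowledge, sensed)

print(round(knowledge, 3))  # knowledge has converged toward the relationship
```

Each pass around the loop hones the estimate; cut the loop anywhere and the honing stops, which is the point the paragraph makes about dance.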
</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">Knowledge is merely a constantly changing approximation and REpresentation of this relationship between things in the form of a chemical signal. Dance is critical to the creation of knowledge. This movement is primed in the nano context. It is triggered in the micro context. It happens in the macro context. I will describe how evolution came up with this "trick" in due course, but here's one more observation about knowledge.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">Knowledge is the simplest thing I can imagine. It's simpler than matter.</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> It's simpler than movement, so of course, it</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">'s simpler than a more complex dance. It's even simpler than the signal that represents it. It's just a sign. 
And it's applied to allow the operation of the most complex thing so far known in the universe - the brain.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">I could write a whole book about the consequence of knowing. So could you. </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">These are the best words I’ve been able to find to describe these ideas.</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> So far. Perhaps yours will be better. 
I encourage you to write them.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge at Human Scale</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Do any of these ideas sound familiar? Once you begin to explore the consequences of gnostic neurons and brains, some of life's greatest challenges begin to make more sense, at least in an intuitive way. This simple model of a complex brain decursively helps to explain the more enigmatic aspects of war, human relationships, and even the divided brain itself.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you’ve read The Master and His Emissary, Dr. 
McGilchrist will be familiar: he presents the dichotomy of the right and left brain in a fashion similar to what I’m presenting, including the observation that the left-brain has come to dominate our culture in the last few centuries, as it has at other times. I'm sure each has had its turn many times in the last billion years. I threw in the continuum part as wisdom comes not from the limits of each side, but from the realm between them. Even science itself can be described as playful experiments of the right-mind struggling to become “facts” in the left-brain.</span></p><div><span><br /></span></div><span id="docs-internal-guid-970e61f4-7fff-afc3-2cac-63a28e0f1acb"><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;">Facts do not change. Knowledge does. What any given person knows about some event is often quite different from what another person might know about the same event, or even what that same person might know about that event at a different point in time.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;">Just because knowledge may be true at some point in time, that does not make it a fact. Even after the fact (entering the set of things we call history), what you know for sure may not always remain true. There have been all kinds of exceptions causing reversals in fact. 
The obvious argument then becomes, “well, it wasn’t actually true to begin with, and so not a fact.” </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;">This assertion is also true. It’s a fact. So how many tests must we run before we finally accept something as fact? And how long need we wait? The answers, which become “patently obvious to the most casual observer,” are an infinite number and forever. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;">As a matter of fact, facts do not exist. Nor does truth. Facts are merely more reliable, more stable forms of knowledge. For now. And truth? Merely an illusion.</span></p><div><span style="font-family: Arial; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space-collapse: preserve;"><br /></span></div></span><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And if you’ve studied human behavior, each of these continuums should also be familiar. I did not invent the words used to describe this model (well, maybe one - decursion). Most of these words and many of the ideas existed long before me. They were just less accessible, their knowledge living as it was in the realm of the right-mind. 
I just put them in this order by drawing them into my left-brain from my right-mind. I'll explain how later. For me, they present the range of what humans know and what we do with that knowledge. Examples could fill all the libraries, plus a soap opera. And they do. Yet there is much more to learn, so let’s get at it.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you review the descriptions of knowledge above, most of them can be applied to humans in a macro context without much change. I could go off in a thousand directions at once. For instance, hierarchical organizations in our culture outside the skull are organized much as they are within the brain. Decursively. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge finding form as information supplied by many soldiers in the front line might move up through the chain of command to give a general a useful insight into an attack in relatively few steps. It’s called raising an alarm, and takes many forms in war. Such knowledge allows the general to manage the battle more effectively. War is the ultimate example of competition; the collecting of battle knowledge at the risk of death, the ultimate example of cooperation. All of them are literally trying to stay alive in their own context. I earlier presented similar examples in the stock market and Congress. Refer back as needed. 
But let’s return to the neuron for now.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The quality of knowledge from any neuron depends upon the specific neuron in question, the event that drove its firing, and where it is along its neural pathway aspiring to control the body. This quality can range from barely significant at sensor neurons, all the way to quite accurate about life-changing reality at the most abstract neurons just before they generate behavior. How do we manage the consequences of such slippery knowledge? The best we can.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Most knowledge is not accessible consciously. Indeed, it’s a very small percentage of knowledge that rises to the level of awareness. How do we manage this vast and dynamic resource if we don’t even know the details of its slippery encoding? Fortunately, it mostly manages itself. That’s part of the “how.” And the brain in general takes care of the rest using alternative systems in a very dynamic fashion.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For instance, there are millions of neural sensors all over your body that are actively involved in maintaining homeostasis for hundreds and even thousands of aspects of our biology. 
The actual number depends upon what resolution you choose to observe these biological feedback loops, but let’s take a look at one of the more obvious examples.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Your heart has by default a free-running clock of neuronal firing represented by your sinus heart rate. This neuron, or neurons, knows when to fire, rhythmically. It’s why your heart can beat even outside of your body. For a time. When properly reconnected, this rate is upregulated and downregulated to match demand from your environment based upon what it’s come to “know” about that environment. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now, you might think what I’ve just described is obvious, but the actual details aren’t. Slight changes in temperature, available energy, and many different hormones all come together to constantly control your heart rate. You can understand this process without ever invoking the concept of knowledge, but each of these sensors (and many follow-on neurons) come to know the best way to manage your heart rate. 
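The up- and down-regulation just described behaves like a feedback controller. Below is a toy sketch; the rates, demand values, and gain are invented for illustration and are not cardiac physiology:

```python
RESTING_RATE = 60.0  # free-running "sinus" rate, beats per minute (invented)

def regulate(rate: float, demand: float, gain: float = 0.3) -> float:
    """One regulatory step: move the rate a fraction of the way toward demand."""
    return rate + gain * (demand - rate)

rate = RESTING_RATE
for _ in range(30):               # environment demands more output (a sprint, say)
    rate = regulate(rate, demand=150.0)
sprint_rate = rate                # upregulated toward demand

for _ in range(30):               # demand falls back to rest
    rate = regulate(rate, demand=60.0)
recovered_rate = rate             # downregulated again

print(round(sprint_rate), round(recovered_rate))
```

The point of the sketch is only the loop shape: sensors report, the rate is nudged, the body changes, and the sensors report again.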
You might not consider these subcognitive signals as knowledge, but when you think about it, why they fire is best described as useful knowledge, conscious or not. And WHEN they fire is critical to your survival.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This is a primal example of neuronal knowledge. A more abstract example might be agreeing to a marriage or other contract. A great deal of primal knowledge would normally inform the eventual firing of such a neuron. Along with veracity and consistency, it’s also useful to assess the quality and context of neuronal knowledge before allowing it to escape the skull and become the information that creates the sound of, “I do”. Have a care as to what you say or write. And when.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">You might think that words are just communication cues, but they are so much more. Words are the physical and sparse expression of neuronal knowledge, or at least a small percentage of all that knowledge. 
Which brings us to one of my favorite exercises - thinking of words as neurons firing, and of the implicit scale of neuronal knowledge that follows once you flip this brain mystery into an assertion.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Oxford English Dictionary has upwards of three hundred thousand words, each of a different flavor from the next. A great writer will find that ideal word that captures the meaning he’s trying to express. Now think of each of these words that any given person might “know” as neurons. And how their knowledge might vary from yours. Think of how some specific words inform others. Such networks generate meaning for that word as it cycles between the world and back to the brain. The idea captures how neurons express knowledge to the next neuron and, ultimately, to a series of muscles. At least in the abstract. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Sure, three hundred thousand words pale in comparison to the number of bits of knowledge even in one brain, but remember, each word is flavored by thousands of other neurons - words are the crowning and emergent result. It’s time for a more vivid comparison. Knowledge aspires to leave the skull and become information in the form of words or other symbols. Every single neuron firing represents a chemical signal. And these “symbols” are the key. Each symbol ultimately represents the knowledge of some neuron somewhere. 
</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Not there yet?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Try this. For a time, ignore computers and electricity completely; even forget about the underlying ionic aspects of the neuron. Ionic signals are encapsulated within the neuron’s cell membranes anyway. Instead, think of neurons as proto-words struggling to be born as information once their quality is honed. Let’s evaluate this comparison.</span></p><br /><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge Informs Information (and Vice Versa)</span></h3><div><span><br /></span></div><div><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">"We are buried beneath the weight of information, which is being confused with knowledge; quantity is being confused with abundance and wealth and happiness. We are monkeys with money and guns." 
- Tom Waits </span></div><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It's commonly accepted that data, and its more meaningful form, information, are more atomic, more primal than knowledge. But that's just our left-brain's more limited and Bizarro view. What if the very opposite is true? Collecting data is the process of quantifying and fixing various aspects of nature in a very left-brain fashion. If our right-mind could speak, how might it compare the two? If a neuron really does know the essence of Jennifer Aniston, does it not suggest that neurons have some quality that cannot be easily captured as data? Or information? </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It's often said that wisdom flows from knowledge, but do we ever ascribe that same quality to information? Or even its data? 
Does our right-mind know something about wisdom that our left-brain has no way to process?</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Using the above experiential definition that neurons create knowledge, we can explore not only the nature of knowledge but also the nature of information, and how the former possibly becomes the latter as words, verbal or written.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">First, there is the contrast of characterization and quantification from a more left-brained technical or “programming” perspective. Knowledge doesn’t have a “class”. Knowledge doesn’t have a “type”. Knowledge doesn’t define a fixed “variable”. Knowledge doesn’t even have a set “value” so of course doesn’t have a “range” or “resolution”. With knowledge, there’s nothing to encode, nor any way to encode it, at least in a technical sense. As soon as you try, you invalidate the effort of the neuron at hand. 
Star Trek’s Prime Directive is especially important when it comes to interfacing with any neuron or the brain in general.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Most neuronal knowledge is not meaningful in any numerical or logical fashion, but the subset of knowledge known as information generally qualifies in all six of the above aspects. And more. Information can be used to describe many aspects of a thing, including knowledge itself as is happening right now. As you read each word in this paragraph, new knowledge is being cued in your mind, each idea to be accepted or cast aside. And that discrimination is the very thing that neurons do; the essence of how neurons create knowledge - discrimination.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Using this simple assertion, what any given neuron knows becomes the analogical variable, and it is a variable in multiple senses of the word. The concept is not useful in the algebraic sense, but programmers will still try to apply the important aspects of data - class, type, variable, value, range, and resolution. 
These are how things are taken apart by our Bizarro left-brain when we wish to calculate them in a clearly defined way. Doing so with neuronal knowledge risks invalidating its creation and quality, thus the need for the Prime Directive.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So what is this more right-minded version of neuronal knowledge like? What is the experience of knowledge in contrast to the word we use to grasp it? The closest analogy I've found is what I consider a hybrid in the form of ChatGPT, a kind of Bizarro version of knowledge approximation and interaction. The biggest difference is that knowledge varies dramatically as to its probability of being true. Knowledge also varies in its degree of similarity with other closely related knowledge. Just don’t try to quantify or define it too quickly. The moment you do, knowledge becomes frozen as data, perhaps prematurely. Living knowledge becomes dead information.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In contrast to information, most knowledge is not only dynamically changing but literally ephemeral - most knowledge has no physical representation. Knowledge only lasts for about a thousandth of a second, at which time it needs to be re-sensed in order to exist again. It’s why we have the word re-member. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’ll now further compare and contrast knowledge with dynamic RAM in a computer. Though it needs to be refreshed quite often, information stored in dynamic RAM still represents a state and needs only to be refreshed to maintain that state. Knowledge only exists when being sensed, and for about a thousandth of a second later. Knowledge is expressed as a chemical signal, but only now. Information is represented by a state in the form of a word, electronic voltage, or part of a digital simulation. When this subset of knowledge manages to find form as an agreed-upon value, it becomes information. 
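</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As a toy sketch of this contrast (illustrative only - the class names and the one-millisecond lifetime are just stand-ins for the analogy, not claims about real hardware or real neurons):</span></p>

```python
# Toy contrast, not a simulation of real DRAM or real neurons:
# a refreshed state endures; a sensed signal exists only ~1 ms.

class DramCell:
    """Information: a state that persists as long as it is refreshed."""
    def __init__(self, value):
        self.value = value

    def refresh(self):
        self.value = self.value  # re-assert the same stored state

class NeuronSignal:
    """Knowledge: exists only briefly after being sensed."""
    LIFETIME_MS = 1.0  # the "about a thousandth of a second" above

    def __init__(self):
        self.sensed_at = None

    def sense(self, now_ms):
        self.sensed_at = now_ms  # firing: knowledge exists right now

    def exists(self, now_ms):
        return (self.sensed_at is not None
                and now_ms - self.sensed_at <= self.LIFETIME_MS)

cell = DramCell(0b1011)
cell.refresh()
assert cell.value == 0b1011        # state survives the refresh

sig = NeuronSignal()
sig.sense(now_ms=0.0)
assert sig.exists(now_ms=0.5)      # still "known" within the window
assert not sig.exists(now_ms=5.0)  # gone unless re-sensed (re-membered)
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The DRAM cell holds its state between refreshes; the signal must be re-created to exist at all. That asymmetry is the whole point of the analogy.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">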
Some information might even be true if such a thing exists, or at least approach truth asymptotically if it doesn't.</span></p><br /><br /><table style="background-color: white; border-collapse: collapse; color: #222222; font-family: Arial; font-size: 14pt;"><tbody><tr><th style="padding: 2pt 36pt 2pt 0; text-align: left;">Knowledge</th><th style="padding: 2pt 0; text-align: left;">Information</th></tr><tr><td style="padding: 2pt 36pt 2pt 0;">Subjective</td><td style="padding: 2pt 0;">Objective</td></tr><tr><td style="padding: 2pt 36pt 2pt 0;">Embodied</td><td style="padding: 2pt 0;">Disembodied</td></tr><tr><td style="padding: 2pt 36pt 2pt 0;">Adaptable</td><td style="padding: 2pt 0;">Fairly fixed</td></tr><tr><td style="padding: 2pt 36pt 2pt 0;">Ephemeral</td><td style="padding: 2pt 0;">More enduring</td></tr><tr><td style="padding: 2pt 36pt 2pt 0;">Signal</td><td style="padding: 2pt 0;">Mostly stateful</td></tr><tr><td style="padding: 2pt 36pt 2pt 0;">A flip of a coin</td><td style="padding: 2pt 0;">Closer to the truth</td></tr><tr><td style="padding: 2pt 36pt 2pt 0;">Mostly subconscious</td><td style="padding: 2pt 0;">Usually conscious</td></tr><tr><td style="padding: 2pt 36pt 2pt 0;">Emergent</td><td style="padding: 2pt 0;">Reductive</td></tr><tr><td style="padding: 2pt 36pt 2pt 0;">Biological</td><td style="padding: 2pt 0;">Mechanical</td></tr><tr><td style="padding: 2pt 36pt 2pt 0;">Living</td><td style="padding: 2pt 0;">Dead</td></tr><tr><td style="padding: 2pt 36pt 2pt 0;">Mostly internal</td><td style="padding: 2pt 0;">Mostly external</td></tr><tr><td style="padding: 2pt 36pt 2pt 0;">Open-ended</td><td style="padding: 2pt 0;">Defined</td></tr></tbody></table><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Once you understand how neurons create knowledge and how this knowledge helps inform information, you’ll realize that knowledge is millions of times more common than information. Simply compare the number of active neurons in all the skulls in all the world with the number of bits of accumulated information. 
At least so far, knowledge clearly wins.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Though information in the form of words is an abstracted and sparsely encoded Bizarro version of knowledge, each word is still often literally cued by a single neuron, and typically informed by millions more. It remains useful to think of mindful words as those final physical neurons of expression.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Take the word love as an example. Is it a four-letter word or simply a soft sound that may cue an amazingly rich memory far more complex than any word? At least the Greeks had sixteen flavors. It’s best to live the word love in order to know it. But the word “love” will have to do for now. Or at least until it happens to you.</span></p><br /><br /><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Quantifying Knowledge Nets</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">There are lots of consequences to entertaining this assertion that neurons create knowledge. 
One that we can address quickly, and which will give you a sense of scale, is to quantify knowledge. How much of it exists? Or a better question, in light of its new ephemeral definition, might be: how much of it exists each second?</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Let's start with a single neuron. Neurons typically converge thousands of inputs and deliver a single output or conclusion. This could be described as a convergent network, or knowledge net, since knowledge is the nature of that conclusion. But wait. That conclusion does not typically go on to a single neuron somewhere. It typically goes to thousands of other neurons that can possibly use that knowledge in some fashion. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So a neuron could be described as a knowledge network with two different arbors. One for collecting and creating knowledge, and the other for publishing. 
These arbors are not equal. Their ratios of connections will vary dramatically, even extremely disproportionately, depending upon where they reside in the knowledge network. Now scale out recursively and quantify.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The brain has upwards of 100 billion neurons. At any given millisecond about one percent of them are firing, so about a trillion of them fire each second. And that's each neuron being sparse with the physical world. Obviously, not all of these firings induce movement. With perhaps a thousand pathways of a thousand neurons each directly involved in controlling a few hundred muscles, about a million neurons could be said to get something done each second. This is an example of the extreme disproportionality of brain architecture - only about one firing in a million actually moves something each second, yet a trillion bits of knowledge are created in that second, in that single skull. Multiply by the 8 billion humans alive at this moment (I use round numbers for such speculation to make the point that it's only an estimate), and you have 8 sextillion bits of knowledge newly created on Earth. 
And that happens each second.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Next, how much of that knowledge actually becomes useful information? The information created by humans (such as this sentence) is a dramatically smaller portion of knowledge. Humans create about 20 million text messages per day, which is about 231 texts per second set against 8 sextillion bits of knowledge, giving you some idea of how common knowledge is, or conversely, how rare information is. Perhaps one in a quintillion, to be generous in how each letter of each text message is cued.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And those texts are mostly noise - small talk. Information that finds form in blogs like this or books, screenplays, videos, songs, and text in other forms is such a small part of the total that it might get lost in this noise except for its significance. 
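</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Before moving on, the back-of-envelope arithmetic above can be made explicit. Every figure here is one of my round estimates, nothing more:</span></p>

```python
# The round numbers used above; all of them are estimates, not measurements.
NEURONS_PER_BRAIN = 100_000_000_000   # "upwards of 100 billion neurons"
HUMANS = 8_000_000_000                # people alive at this moment
TEXTS_PER_DAY = 20_000_000            # "about 20 million text messages per day"
SECONDS_PER_DAY = 86_400

# ~1% of neurons fire in any given millisecond, times 1000 ms:
# about a trillion firings ("bits of knowledge") per brain per second.
firings_per_brain = NEURONS_PER_BRAIN // 100 * 1_000
assert firings_per_brain == 10**12

# Knowledge created planet-wide each second: about 8 sextillion bits.
knowledge_per_second = firings_per_brain * HUMANS
assert knowledge_per_second == 8 * 10**21

# Text messages per second: about 231.
texts_per_second = TEXTS_PER_DAY / SECONDS_PER_DAY
assert round(texts_per_second) == 231

# How rare this slice of information is, relative to knowledge.
print(f"{texts_per_second / knowledge_per_second:.0e}")  # → 3e-20
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Change any input by an order of magnitude and the conclusion survives: knowledge dwarfs information by a factor too large to picture.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">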
Significance is why it emerges as the cream of our culture - the top 40 hits, best-selling books, classic films. See any parallels in how neurons might come to know things yet? At least this quantification gives you some handle on how pervasive knowledge is. And the relative rarity of information.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Sextillions Upon Sextillions</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now that we have words to help us think about the brain in a more useful fashion, we can take a look at this model from a different perspective. Think of EACH of those 300,000 plus words from the Oxford English Dictionary as a neuron in the brain. Ah, but wait. We can simplify this by an order of magnitude since the average person only knows about a tenth of that number. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Those thirty-odd thousand words each represent something you have experienced, at least in the form of language, or you wouldn’t have known that word. In any case, a specific neuron is cued by the approximate experience described by that word, actual or imagined. And this same neuron can cue an expression of that word, spoken or written. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">These words literally form a sparsely “decoded” map of your subjective knowledge of the world in terms of words, one of our highest and most abstract forms of knowledge. But what about the more primal forms of knowledge? Words are only a very small part of what we know at the nexus of neural pathways which I estimate is around a million specific things. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Thirty thousand out of about a million possible responses - we only have words for about 3% of the most important and abstract things that we know, at least consciously. Imagine the rich nature of that other 97% that lies just below the level of labeling in the mind. This is the realm of hunches and vibes and the even more subtle aspects of knowledge. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Returning to words, they only represent about 0.03% of all of the knowledge in the brain, with one bit of knowledge for each neuron representing a word. These quantifications and ratios are of course only gross approximations, but we have to start somewhere. We can work on more accurate estimates later. The point is, our 30,000-word vocabulary is informed by the same knowledge that cues about a million useful scripts of behavior; and the knowledge of these million-odd tricks is informed by another 86 billion more primal bits of knowledge. That would mean that for each abstract survival trick in our repertoire there are about 86,000 supporting neurons, but this average is misleading, as the informed combinations are shared widely. 86,000 neurons do not only support that one trick. That would be left-brain thinking. Our right-mind can imagine a myriad of combinations. The scale of it can take your breath away to the point that if you aren’t careful you might suffocate. 
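</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The ratios in the last few paragraphs can be sketched numerically. A minimal check using only this post's own round numbers (30,000 words, a million tricks, 86 billion neurons - all speculative estimates):</span></p>

```python
# The vocabulary-vs-knowledge ratios above, computed from the post's
# own round numbers. Nothing here is a measured quantity.

VOCABULARY = 30_000        # words the average person knows
TRICKS = 1_000_000         # estimated useful scripts of behavior
NEURONS = 86_000_000_000   # ~86 billion neurons in one brain

print(f"{VOCABULARY / TRICKS:.0%} of tricks have a word")        # 3%
print(f"{NEURONS / TRICKS:,.0f} supporting neurons per trick")   # 86,000
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">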
Now let’s move in the opposite direction, decursively.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">These 30,000 personal words, like the 26 letters that are used to form them, are in turn a decursive alphabet of meaning once you string them together in sentences as I have done here. Each letter, each word, each sentence, and each paragraph enriches meaning as a script is delivered in a somewhat parallel and hierarchical fashion we call a serial narrative. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Over in the right-brain we have something similar happening but with visualizations. Together they make up our imagination fed by two competing and cooperating approaches. I’ll explain the Zen nature of that mix later. In any case, that’s how we come to know more than 30,000 things in an objective sense. How much of the other 49.985% is used in this way is hard to guess, so I won’t. But it’s likely a lot, perhaps still well short of a majority depending upon the topic at hand. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The other half remains in darkness, mostly doing the things that make the magic of human discovery and behavior happen. I’m sure some of this knowledge will be more clearly characterized as we move forward with this model. The point is, we don’t know what most of the brain knows in an objective sense. We only get hints about some of it now and then. 86 billion (plus or minus a few billion) is a lot of bits of knowledge, even considering virtually all of it is too primal for words. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And that’s just one brain out of eight-odd billion living today, and the billions more humans and other critters that have ever lived to help build this body of sublingual knowledge we pass on culturally in the form of dance and other forms of expression to be mimicked by others. Most of this knowledge never rises even to the level of expression, let alone information. 
And then when you realize that all the information in all data and in all the libraries in all the world for all of history is the smaller and more decursive end of this funnel, you can begin to appreciate how much knowledge is active in all the brains all around the world, and where your breath went. Sure, there’s a lot of duplication, but each is also unique in how it’s been brought together by each individual, such as me delivering this description right now. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Hopefully, this little exercise has enriched your understanding of how much knowledge goes into recognizing Jennifer Aniston. So far I’ve talked about the brain in phylogenetically formed layers of evolution from the brainstem up, out, and forward. If you don’t get too technical, that pretty much takes care of three dimensions. And this perspective is great if you only want to understand the brain’s history or how it shifts context on the fly, but it doesn’t say much about the subtleties of meaning and how it’s formed. 
Before we catch our breath completely, let’s push on.</span></p><br /><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Sparse Signaling Forms a Worldmapculus</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Another more general way of looking at the brain is how it interfaces with the world. Here I'll preview how knowledge generation might be used to model our world. Specific details will follow. Let's start at the sensor end. A homunculus is a mapping of the neural sensors from our body onto a modest section of the corti - one on each side, so actually two homunculi. This allows the brain to have direct contact with the world, but what about all the other areas of the brain's surface? The balance of the corti's area could be said to model the world apart from our bodies. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Like a homunculus, a worldmapculus (to coin another term) could be said to manage, in a similar fashion, the relationship between our bodies and the rest of the world. Again, two worldmapculi would present two competing versions of this world. This would require inputs from our vision, hearing, and smell. Our skin represents about a million sensors. When you add in proprioceptors and taste you get a few million more, but by far vision with its tens of millions of sensors dominates sensing our reality (and our imagination).</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Considering our modern ability to collect data at the nano level, the number of possible data points just within an arm's reach is an extraordinarily large number. Even the modest tens of millions of visual sensors are a tiny subset of what can be detected. 
That is why our sensors are described as sparsely coded. These visual (and other remote) sensor points are mapped in our corti largely by proximity, similar to a homunculus, so the parallel concept is useful.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Now let’s unbend our minds by a few orders of magnitude. The world is obviously far too large and complex to capture at any significant resolution, such as a 4K computer image. Our model has to be steeped in disproportionality. We’ll need to use a map, but not a flat or linear map. We’ll need one like the cover of that special New Yorker issue:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><a href="https://en.wikipedia.org/wiki/View_of_the_World_from_9th_Avenue" style="text-decoration-line: none;"><span style="background-color: white; color: #1155cc; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; text-decoration-line: underline; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;">View of the World from 9th Avenue</span></a><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><br /><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: 
normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now for the pinscreen version. Ever take a pinscreen and hold it against your face to capture a 3D image of yourself as suggested above? That’s what we’re after, but it’s not just a visual image. Though low in resolution, because of sparse coding, it captures the essence of a person to the point of marginal recognition. Now think of those pins capturing not only visual depth but all the non-visual aspects of the world as well. Think of each pin as one bit of knowledge forming a meta image in depth, with that depth as adjustable sensitivity for any given bit of knowledge. Now imagine such a disproportionate image forming a map of what’s important in your world, all ultimately in 3D.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Next, expand the idea to capture everything in our world that is significant at that moment - smell, sight, touch, taste, and hearing, to summarize in a very limited and Bizarro fashion. It's fairly easy to conclude that the parts of the world that are significant at any moment are a very dynamic and amazingly small subset of all possible sensing. That's where attention comes in. It allows us to very selectively and dynamically change what is significant at any given moment in the process of creating useful knowledge. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Words to list all aspects of knowledge fail us at this point. Or at least these words fail me, but after all, words too are only a sparse map. I only have about 30,000. I think of this as a 2D image because the actual 3D part is later derived, but our sensors form a topologically 2D surface representing an ultimately multidimensional construct. What does smell look like in this image? Exactly. Let’s hope it’s tasty.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">We now have this imaginary surface containing the endpoints for each biological sensor on the human body. These points on the surface of our brain’s corti are not beginning-points as a neural sensor is. Instead, they are where signals are delivered to form an abstract semiotic simulation of the world, which is why I call it a worldmapculus. Our homunculi are a subset of those worldmapculi because your body is part of that world, perhaps the most important part of that world, at least for each of us subjectively. This is why our modest homunculi still get a disproportionately large "view" of the world in each cortex. But let’s get back to our sensor surface. It’s where the fun begins. 
The worldmapculi will be used to describe signal-based simulation later.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">We now have a wall of tens of millions of neuro sensors. Each sensor is a dot on this wall. These dots are like pixels in a digital image, but not as regular, not as orthogonal, not as linear. That’s just how our cargo culting Bizarro technologists might want to lay it out, but biology has its own agenda. Our brain takes a different approach. Mapping proximity is important as is evidenced in our corti, but temporal mapping as in synchronicity may also be critical when we get a few neurons deep along any neural path. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Also, these few tens of millions of sensor neurons are only the first rank in a converging matrix of neural pathways. The output from each of these sensors (which are the very beginning of each neural pathway) will inform many of the next ranks of neurons in a divergent fashion, but not all. That would require each sensor to connect with every neuron in that next rank. 
(I use the word rank here instead of row; columns and rows only form a 2D surface, but the brain models a 3, or more, dimensional world, so "rank" is a more flexible term.)</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Here’s a brief overview of convergence versus divergence in neural nets. Near the sensors, the net diverges as needed but as you hop from neuron to neuron, convergence creating abstract knowledge comes to dominate. Once the path reaches the nexus of a million tricks, serial scripts looping with the world in a dynamic dance yields behavior. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now back to our pinscreen.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The reason each sensor does not connect to each neuron in the next rank is because the geometric expansion required to make such connections would make a Gordian Knot seem like child’s play. The brain’s connectome is complex enough already. 
That’s why each sensor only connects to the neurons that find significance in what that neuron comes to know in the world. These connections are dramatically fewer than all possible connections.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Such sparse coding is the very nature of maps. Only the significant points are included.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Actually, we’re born with a far more complex connectome but most of it withers away within a couple of years during the process of discovering what is significant in the world. And what isn’t. This brings us to the conclusion of this section so I'll summarize.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Imagine yourself looking at this wall of neural sensors with its diverging and converging pathway behind the wall. Now turn around. You are looking out into the world. Watch what happens out there. 
Some things in that world catch your eye (or other sensors) more than other things. It tends to be things that move or change in some significant way. Our brain is fed by sensors that seek out the most significant changes in the world and analogically encode them as sensitivity adjustments in a sparse map of that world. In a crude fashion, yet still rich in content, our brain is a map of sparsely-decoded significance that is constantly changing. What connects to what and exactly how at each synapse is what encodes our experience of this world. So our brain is a reflection of this world, but only the most significant parts are captured and simulated - thus an extraordinarily efficient and elegant survival system.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Paper and now electronic maps are literally a decursive version of our neural connectome.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’ve gotten a bit far afield for a simple brain model, but I think I’ll leave this in for now, and even enrich this topic later on. Or someone else may enrich it later? At least if this model has utility. 
It works for me.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Knowledge Dances With the World to Form Cues and Scripts</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“All the world’s a stage” - from “As You Like It” by William Shakespeare</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Even more useful than thinking of neuronal knowledge as gestating words is to think of the firing of a neuron as a theatrical cue ultimately driving a script of muscle movement, which can also be thought of as a theatrical script interacting with the world. 
With the other actors and the props to play off of, life becomes improvisation forming a story or narrative.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Indeed, thinking of the world as a stage, and my body as an actor on that stage created a little model of everything - me as a homunculus</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> experiencing that world. This model can</span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> yield substantial insight. At least it did for me. It even demanded the concept of a worldmapculus in which our homunculus can perform. But I'm distracted by the macro version. Let’s get back to basics.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Early primal neurons needed the actual world in order to hone knowledge from experience. 
As noted above, this happens in a loop with the world:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">World - sense - decide - signal - converge - cue - script - movement - world...</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Rinse and repeat, maybe thousands of times. Or a million. This is the dance that neurons do with the world. This operating model actually works quite well for very primal creatures trying to stay alive (examples to follow). But this method is not only very expensive to the creatures in question, it is also very slow and tedious to evolve. A given neuron might have to wait a week for another external event to recur, or even much longer in some cases. This method is not very time or energy efficient. But at least it got the process started.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Fortunately, evolution found a way to speed up evolution using a chemical-based semiotic simulation forming a worldmapculus of knowledge as described above. We feel this chemistry as emotion where both the feelings and emotions are ethereal knowledge. 
The pain and pleasure you experience do not exist in the material world. They only exist within your skull, as is true for every bit of ethereal knowledge you “experience”. This is called your mind and it’s a simulation of a hypothetical world approximating the real one. The quality of this simulation is a reflection of your imagination and how you manage it.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Like most neuronal knowledge, our mind is mostly subcognitive. Only a very small percentage can be manipulated as consciousness. But again, we’re getting ahead of ourselves. I only leave these paragraphs in to let you know that this model explains far more than just primal response or behavior. But for now, let’s simplify.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Think of converging neural-nets of knowledge as theatrical cues, and what we do with those cues, as scripts of muscle movement we call behavior. Cooperating and competing cues and scripts explain a great deal about how we function in the world. Let’s explore the concept a bit. We can hone it into higher-level forms later.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">To summarize, before a word is ever written (or even spoken), it exists only as ethereal knowledge in someone’s mind. 
But long before that knowledge has the consistency to be useful as information, it dances with the world in a loop of development by applying a script of movement while sensing the result as a cue. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Actually, knowledge and the dance that helps create it both come into existence at the very same time. Knowledge in the form of a cue from the world, and dance in the form of a script of muscle movement meant to affect that world. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">You may reasonably argue that most neuronal firing does not induce movement as behavior, and you’d be correct. Much of the time it only primes subscribing neurons. But if movement is not ultimately part of this loop with the world, then knowledge cannot be refined effectively. Such knowledge would not be grounded in reality. Knowledge needs a place to play, as we are doing with this model of the brain, but it also needs to be anchored in reality to be useful for survival and replication. </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">For now, think of our interaction with the world as a collection of cooperating and competing cues and scripts. 
Details to follow.</span></p><div><span><br /></span></div><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What Does Any Given Neuron Know?</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’ll ask the question again because that’s what I did for years on end. It was one of the first questions that occurred to me after accepting the premise that each neuron knows something at the instant that it fires. This question could also be presented as, "how is knowledge encoded in the brain?" It took me quite some time to realize that I didn’t know the answer, and that I might never know it.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">But that neuron <b>does</b> "know" it.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Actually, what any given neuron knows is far less mystical than the above might suggest. What a neuron knows and what it means to the world is a type of analogical function (better described as a mathematical relationship) of where it connects (which neurons) and how it connects (the number and types of synapses). 
The difference between a function and a relationship is one of agency. Functions are determinant. Relationships, less so. Agency goes to the heart of the neuron and is socially expressed as consent. This is why the neuron knows, yet we may not.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Neurons "encode" relationships to form knowledge. These relationships in the world are expressed as connections in the brain. Each is literally a mathematical relationship (as opposed to a function) of which sensors or other neurons inform that neuron's dendrites by how they connect, as in, how many synapses and the ratio of activation versus inhibition connections on the input side. But it doesn't end there. The second half of any given formula would need to take into account the output of the neuron in question </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;">to the next neurons in turn in a given neural pathway, until it causes a muscle to be moved or other chemistry to be deployed, or at the very least changes the degree to which other neurons are primed to fire. 
Then there is the further issue of accounting for these factors across all the inputs that have honed this neuron's sensitivity.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> Before you start calculating: yes, there may be an opportunity here to create a formalism, even a formula, to express the creation of knowledge. But it would need to include all the neurons involved in all of the previous loops through the skull, as well as those that might be affected in the future - a virtually infinite collection, with meaning occupying the other side of the equal sign. So of course, I'm going to hold off presenting any attempt at such formalism (better described as a fool's errand) in favor of just playing with the idea. In any case, this is </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">how neurons "encode" actual relationships into knowledge, at least in rough form. 
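For readers who want something concrete to play with, the input half of that "relationship" can be sketched as a toy neuron whose synapses carry positive (activating) or negative (inhibiting) weights, and which fires when the summed drive crosses a threshold. The weights and threshold below are invented for illustration; this says nothing about the output half, or the history of inputs, discussed above:

```python
def neuron_fires(inputs, weights, threshold=1.0):
    """A toy neuron: positive weights model activating synapses, negative
    weights model inhibiting ones. The neuron 'decides' by comparing the
    summed dendritic drive against a threshold."""
    drive = sum(signal * weight for signal, weight in zip(inputs, weights))
    return drive >= threshold

# Three upstream neurons: two activating, one inhibiting (invented values).
weights = [0.8, 0.6, -0.9]

print(neuron_fires([1, 1, 0], weights))  # both activators fire -> True
print(neuron_fires([1, 1, 1], weights))  # inhibition vetoes -> False
```

Even this crude sketch shows why the ratio of activating to inhibiting connections matters: the same two excitatory inputs succeed or fail depending on whether the inhibitor joins in.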
Fortunately, we don't need formalism for our right-mind to grasp this idea.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In any case, that’s the conceptual key to understanding the gnostic neuron, and any</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> such formalism should honor the sovereignty of that neuron.</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> It’s not just about how a neuron comes to fire when it detects what YOU may think is knowledge. It’s that the neuron fires when it detects what IT thinks is knowledge, and that can be a wholly different collection. Of cues. And scripts.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In other words, what any given neuron knows at the moment of firing is subjective for that neuron, and ephemeral, so it may be different in the next moment. Knowledge is neither objective nor determinant. That only happens outside the skull. Approximately. Sometimes. Or something like that.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Let’s say that by the time you were five years old you came to know your grandmother well, but I’ve never met her. 
I don’t know what you know. You’ve come to know the sound of her car, the careful way she opens the screen door, and the type of cookies she likes to bake. You may not know the actual make of her old car (what five-year-old does?), why the screen door makes that special noise when she lets it close, or what’s</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> in</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> her cookies.</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> But you do know what those cookies smell like even if you can’t describe the odor. And you do know the creak of the screen door when she arrives and that she doesn’t allow the screen door to slam when she comes in. You will be able to re-cognize her by these three elements of knowledge the next time she shows up. But I can’t. I don’t know those things. What you know about the world is different from what I know about the world. And what each of us knows about your grandmother.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For you tech types, and from the perspective of any neural pathway, what a neuron knows is a relationship that can be described as a type of analog function of that neuron, the other neurons that cued its firing, and the scripts of still other neurons that it might cue. What any given neuron knows becomes a function of the knowledge accumulated as the signal moves along its neural pathway combined with other knowledge from other neural pathways. 
And the neuronal accumulation of knowledge can go from primal to sophisticated in surprisingly few jumps. If you remain frustrated because of the lack of formalism, I have something even better. I'll describe how this "analog function" actually works in due course. Stay tuned.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For now, I’ll provide an algorithm as a tease. Say you have a town with almost a thousand people and you want to find a specific one. You don’t have their name or address, but you know how tall they are. You line them up by height, then move down the line until you match the height exactly. You have found your objective. This is a process and interaction with the world known as an access method that ultimately yields knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This evolving knowledge makes its way from sensor to muscle and then encounters the world where it is expressed as movement. That behavior may then affect the world in a way that can be sensed and refined during its next time through the loop. The world is wild. Knowledge is the tracking variable. 
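The "line them up by height" tease above is an access method in the database sense, and it fits in a few lines of Python. The town, names, and heights are made up for illustration:

```python
def find_by_height(townspeople, target_height):
    """The 'line them up' access method: sort the town by height once,
    then walk down the line until the height matches exactly."""
    for name, height in sorted(townspeople, key=lambda person: person[1]):
        if height == target_height:
            return name
    return None  # nobody in town has that exact height

town = [("Ada", 162), ("Ben", 180), ("Cy", 171)]
print(find_by_height(town, 171))  # -> Cy
```

The interesting part is that the knowledge (who the person is) is produced by the process of ordering and scanning, not stored anywhere by name or address beforehand.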
Or something like that.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I realize this answer may not be very satisfying to everyone, especially the more technical among you. But the answer holds great utility. It has nicely explained so many paradoxes flowing from interneuron communication that I can’t even begin to count. And it has literally and decursively done the same for thousands of behavioral interactions I’ve analyzed in humans over the last two decades.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Here’s another way of thinking about this enigma. Imagine that you have a neuron in front of you that knows whether Schrodinger’s cat is alive or dead in that box. At the very instant that it fires, it knows that answer, but you don’t (and perhaps for the same reason as the cat, but I won’t take you down that rat-hole right now). Do you see how what that neuron knows is not only hidden but indeed unknowable? Well, at least unknowable if you do not know why that neuron’s firing may have been successfully cued. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Fortunately, most neurons do not have to know something as unknowable as the fate of Schrodinger’s cat which lives on the very dangerous edge of an equally probable binary outcome. In most cases, what a neuron knows is not at the asymptotic limit of what can be known. For most neurons, most of the time, what that neuron knows can be guessed quite successfully, but not absolutely. The answer technically ranges from almost determinant to almost random, but absolutely neither.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If the above paragraph challenges your sensibilities, relax. It’s only a theory. Most of the time neurons are very practical in what they know. I will provide many examples shortly but I had to include this section for those who need to apply technology to the above question. 
The rest of us now get to play with the answers.</span></p><br /><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Waves of Knowledge, Vibes, and Hunches…</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I could now go into the nano details of creating knowledge using interneuron communication, but we’d risk losing sight of the macro consequences of neuron-created knowledge, which is actually more important than the details. The holistic understanding holds more value than the proof, which will come in time.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">To be clear, when I made this conceptual leap about seven years ago I spent a great deal of time at the nano level and then began to observe the decursive similarities between that work and what Iain McGilchrist had written at the macro level of the Divided Brain, as I’ve noted. This led me to explore the micro context middle ground where words are given meaning, even before they were words. This is the realm of vibes and hunches. Understanding how words might be formed from chemical vibes began to force a revision of my analogical analysis. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">These similar ideas in each contie led me up and down the axis of evolutionary sophistication using my new magic vehicle - decursion. In the process, I came to realize that at each level there was not a single solution to each problem - there were many, and one need not preclude another. This is where the multifaceted aspect of the brain became reinforced, as I explored how it might be managed. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This particular path of understanding also had the side benefit of explaining the nature of words in general and how grammar is actually a post-analysis and not the genesis of language. Our left-brain has it backward. Our right-mind delivers concepts for our left-brain to manage using words as a clerk might do. And again, words led me back to human behavior and neuron interaction. The process continued over and over gaining conviction with each iteration.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This conceptual migration in context from nano, to micro, to macro, had other interesting side effects. I spent so much time with neuro connection, chemistry and emotion that it began to affect how I came to see the macro world. 
Soon, the two limits of the contie were competing for my attention at the same time, changing my thinking daily. In the beginning, it was my more innocent and normal way of dealing with the world, often without thinking. Then I began to take a moment to consider what would happen if my neurons really were informing my behavior using this crazy concept. The details were enlightening! </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For instance, a friend might ask me what I wanted for dinner and I would try to reach beyond my usual habits to offer a new and creative suggestion. As a creature of habit, being creative was no easy task, and I would relax my left-brain to reach into my right-mind for a fresh answer, all the time imagining each inhibiting the other as neurons do in a similar situation. I would then quite often stare into space for minutes at a time seeking new vibes and hunches to drive culinary delight. You might think of this as mindfulness, but for me, it was far more. Soon I was applying the idea to those around me and actually spending more time with the gnostic model than my normal way of dealing with the world. This constant dual exercise has changed my life in many subtle but significant ways.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The best way I can describe this is as waves of knowledge washing over me which became one of my more useful metaphors for brain operation. 
I see waves of neuronal knowledge from millions of biological sensors forming cues that converge and cascade across my brain in various ways to reach conclusions expressed as muscle scripts. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As I’ve mentioned before, another way to visualize this is how the character “Cypher” in the original Matrix movie would look at green letters dropping down the screen and see beautiful women challenging those around them. For me, these mind games became so much fun that this neo-gnostic perspective almost didn't get documented in the form you are</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> now</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"> reading. It still dominates how I see and deal with the world. Your mileage may vary.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Speaking of which, you too now have all the hints needed to achieve this neo-gnostic perspective. I’ve considered this in some detail. Even if you don’t know how neurons create knowledge, just knowing that they might should yield a completely new worldview largely informed by how we use what we know in relationships with others. 
OK, I’ll give you a few more hints to sum things up.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As I’ve suggested, we have millions of neural sensors, each of which is monitoring the world. They don’t fire until they find significant change, which they then use to create knowledge reflecting the relationships between things and ourselves. This is done using cooperation and competition in a yin-yang fashion. The resulting knowledge is then delivered to the next tier of interneurons in various ways with various weights as selected by the next neuron’s synapses with their ratios of activation and inhibition. Sovereignty, decision, and control ultimately lie in the ionic area between the dendrite and the hillock of each neuron. These neural signals from millions of sensors tend to converge with each hop as the knowledge they generate becomes more abstract. As noted, this happens much as the series of “V” layers in the brain’s visual system is conventionally described - from pixels to lines, to ovals, to mouths, to eyes, to faces. If you’re not familiar with this convergence of abstraction, think about how a journalist collects “facts” to build a story fully summarized when published. It’s similar. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">A journalist first collects facts, hopefully in an objective fashion. These facts (hyper-knowledge) begin to yield a theory which he may then begin to test by seeking more knowledge about the events in question. 
Once he has a meaningful thesis, he expresses it as a written narrative and publishes it, just as I have done. Neurons do something similar but in a more primal way. Both examples create new knowledge in their own context.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At the same time, copies of this knowledge diverge out to other layers that may be able to use this knowledge in some way, just as you might subscribe to a blog. A downstream neuron can then subscribe to an upstream neuron to acquire its knowledge. The result is a wave of knowledge starting at the sensors and narrowing down to a nexus of about a million neurons near the lower center of the brain. At least this architecture largely describes our lizard brain. I will provide details about the even more complex layers later on. 
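The blog-subscription analogy above maps neatly onto the publish/subscribe pattern from software, sketched here as a toy. The three neuron names and the cascading behavior are illustrative assumptions, not anatomy:

```python
class Neuron:
    """A toy publish/subscribe neuron: downstream neurons subscribe to
    upstream ones and receive a copy of whatever knowledge is published."""

    def __init__(self, name):
        self.name = name
        self.subscribers = []   # downstream neurons listening to this one
        self.received = []      # (source, knowledge) pairs seen so far

    def subscribe_to(self, upstream):
        upstream.subscribers.append(self)

    def publish(self, knowledge):
        # Each subscriber records the knowledge, then cascades it onward,
        # so a wave spreads from the sensors toward the nexus.
        for downstream in self.subscribers:
            downstream.received.append((self.name, knowledge))
            downstream.publish(knowledge)

sensor = Neuron("sensor")
relay = Neuron("relay")
nexus = Neuron("nexus")
relay.subscribe_to(sensor)
nexus.subscribe_to(relay)

sensor.publish("edge detected")
print(nexus.received)  # -> [('relay', 'edge detected')]
```

Note that the nexus never talks to the sensor directly; it only knows what its immediate upstream published, which is the narrowing "wave of knowledge" the paragraph describes.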
For now, I need to continue this decursive progression to sum up this post.</span></p><br /><br /><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">…Inform Words, Emoticons, and Memes…</span></h4><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“~ All cultural formation in our time is now the development and propagation of memes that battle their way through a supply chain in cyberspace.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Most die; some thrive.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The memes that make it through encode deep meanings.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This is as serious a process as has ever existed.” - </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">from the Twitter feed of Marc Andreessen @pmarca</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: 
normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you missed the context, Marc is describing the very essence of evolution, but not as Darwin described it. Instead, it decursively takes the form of what happens in neurons, and the brain in general to create words, emoticons, and memes. It’s also how the brain’s ability to evolve has escaped our collective skulls to find form in our culture. The “deep meanings” Marc refers to are literally knowledge. And sometimes information.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you’re having trouble with this idea that neurons create knowledge, consider their decursive expression again - words. It may clarify things for you. Think of words where each underlying letter is informed by other knowledge, creating an emergent abstraction to cue other more highly evolved neurons that have become social in nature. Words expressed as sounds (or written symbols) are just the more abstract neurons that cue specific experiences that then find form outside of the skull. Every time you hear a word, a specific neuron is firing, with hundreds or thousands of other neurons fired in milliseconds before to support, refine, and invoke this abstraction.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Sure, you can think of that same word and most of those same neurons will fire, but until you publicly express the word, nothing has moved, that word doesn’t matter, at least not socially. 
Nothing matters until something moves. And once you express the word, it may cue other neurons in others' brains, and you can’t take it back - “the moving finger having writ”. Everything matters once something moves: </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“The moving finger writes; and, having writ,</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Moves on: nor all thy piety nor wit</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Shall lure it back to cancel half a line,</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Nor all thy tears wash out a word of it.”</span></p><ul style="margin-bottom: 0px; margin-top: 0px; padding-inline-start: 48px;"><li aria-level="1" dir="ltr" style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; list-style-type: disc; vertical-align: baseline; white-space: pre;"><p dir="ltr" role="presentation" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">From The Rubaiyat of Omar Khayyam</span></p></li></ul><br /><p dir="ltr" 
style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">(This is the first time in my life that I’ve used the same quote three times in the same work. I suspect it’s significant. Or maybe I just like the poem.)</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Of course, words make up language when used in sequence, not unlike scripts of muscle movement. Decursively. But words are just one of evolution’s tricks that have escaped the skull to find a place in our culture. Emoticons are a more modern version, and memes are the reason, as Richard Dawkins has so nicely presented.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Sometimes it helps to work with these ideas in creatures that are sub-human. It allows us to be more objective. (As he noted in his work, it allowed Iain McGilchrist to review far more data about the divided brain.) For instance, many creatures have language even more subtle than words, emoticons, and memes. Each example can tell you a great deal about the range, resolution, degree of abstraction, and architecture of that creature’s brain. 
Perhaps not surprisingly for many of these creatures, you may have to go a long way back to find a common ancestor, unless, of course, it’s independent evolution.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If it’s not independent evolution, then it will give you some idea of how old that particular trick is in our human evolutionary past. For instance, insects and birds have various forms of dance and language to communicate things about the world to other members of their group. How old are those common ancestors? Proto versions of language may literally be that old, or else they were independently developed multiple times. I believe the former is more likely the case. Just like neurons creating knowledge, words as we now know them likely evolved much later; the various forms of dance likely evolved far earlier than formal language, yet remain quite similar in their current form today. Just watch a bee dancing. I suspect more advanced protozoa dance in a similar fashion. So did Michael Jackson. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">A nod or a wink is enough of a cue to invoke a script of muscle movement as long as the context is shared (required for information transfer). For instance, some jellyfish have been documented communicating socially using bioluminescence in a dance of lights. Or even more primally, herd movements go all the way back to bacteria for similar reasons. We’re talking a billion years in some cases. 
And of course, a more familiar sound-based language is more common in closer relatives such as primates. Writing is a quite recent decursive human version of such "dance".</span></p><br /><br /><h4 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">…To Organize Individuals, Tribes, and Nations With Conviction</span></h4><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">These ideas about evolution evolving a new way to evolve are not limited to the skull or any individual. They also find form in how we interact with each other, the differences in conviction, and how we organize in larger groups. Decursively. If you want to understand what’s in your skull and how it’s organized, simply look around at how humans organize as a tribe or nation. And vice versa; at least once you get a feel for how cues and scripts work, human behavior begins to make a lot more sense. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For the best example, I’ll again refer you to the “Divided Brain” and its descriptions of how each side of the brain sees and deals with the world differently. Similar generalizations are useful when comparing politics and religion. I’m not suggesting that you judge any of these groups, and that’s the point. Each has its own biases, triggers, and beliefs. Each has its own truths. 
One is not better or worse than another in any absolute sense, though it may seem like it at the time, subjectively, with conviction. The challenge is to step outside our minds when considering such issues. Your left-brain may be of help in this exercise. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In any family or tribe, there will be different approaches, agendas, priorities, and triggers for each member. Desmond Morris's work is helpful at this point. Depending upon pecking order, these solutions will cooperate and compete to hopefully find the best form, but often they won’t. There is no absolute best or worst answer. There are just scripts that yield behavior, in this case of a group.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For instance, how can a man murder his children and then kill himself? Yet if he is stopped before he dies, he will likely become deeply remorseful only minutes later. Or not? Some part of this man knows something we don't. 
Something not shared with the more remorseful facet of his brain.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The point is that the brain and mind are definitely multifaceted, not just physically but also operationally. Both have cooperating and competing facets. It’s just a matter of understanding which facet is controlling the body (or major parts of it) at any given moment.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Such multifaceted operation also happens within the skull, as one survival solution or mating strategy is sometimes better than another, each competing and cooperating in various ways. 
I realize that these are very broad generalizations, especially when applied to the mystery of the brain, but isn’t that the very thing we need to break the logjam of data about neuroscience?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What each of us comes to know about our world, politics, and religion is ultimately a function of nanochemistry and connection, both of which are constantly changing. The most significant knowledge tends to get reinforced more often (connection) and more dramatically (chemistry), so we come to know that thing with more faith and conviction. An example is this very paragraph which occurred to me in a dream, got me out of bed, and drove me to my computer keyboard. Again.</span></p><br /><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It Is Written…</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” - Misattributed to Mark Twain</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This quote was used at the beginning of the movie, “The Big Short”. 
At the time I appreciated it, as I already had it in my notes to be included here for reasons other than its impact on the financial markets. I already knew that the misattribution was an excellent example of cultural knowledge because of this ironic error in its attribution:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“Quote Investigator: Scholars at the Center for Mark Twain Studies of Elmira College have found no substantive evidence supporting the ascription to Mark Twain.”</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It brings me such joy to have an opportunity to quote “Quote Investigator”. This conclusion summarily demonstrates both the fallibility of knowledge and its decursive nature. Even the screenwriters of "The Big Short" can fail in their attribution. If Twain DID utter this lament and it was only repeated orally, it cannot now be properly attributed to him, because it was not important enough at the time to be written down and kept for the internet to discover - technically. Hold on; we’re going down a rabbit hole.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Somewhere in his YouTube lectures, Dr. McGilchrist provides an example of how our left-brain has come to dominate our right-mind, in spite of the obvious. 
Iain tells a story about a doctor who came to the morgue to verify some aspect of a dead patient’s treatment, only to find that the patient still had a pulse. The doctor turned to the nurse and proclaimed, “This man is still alive!”</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The nurse retorted succinctly, “He can’t be alive. It says right here on the clipboard that his time of death was more than three hours ago!”</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Iain McGilchrist used this story to demonstrate how our left-brain (with its superpower, denial) can effectively deny reality, no matter the evidence to the contrary. Apologies to Iain if I got any of this wrong. It’s what I re-member. It’s what I know at this moment. 
Now I’ll carefully back out of this particular rabbit hole.</span></p><br /><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So, What Do YOU Know?</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This is getting to be a habit, but again I’ll quote:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure </span><span style="font-family: Arial; font-size: 14pt; white-space: pre-wrap;">that just ain’t so.” - Misattributed to Mark Twain</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Whoever the source, with this caveat in mind, I’ll ask the above question again in the hope that, after you’ve read this post, you’ll seriously consider this final question:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So, what do YOU really know?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 
0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">And do you know it for sure? Or do you agree with the nature of knowledge as I have described it above? I know for sure that it is. But is it so? I know for sure that neurons create knowledge, but I recognize that I too am quite fallible. So, even if I’m wrong about the neuron, let's consider these ideas in a macro context.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="color: #222222; font-family: Arial;"><span style="background-color: white; font-size: 18.6667px; white-space: pre-wrap;">It could be said that "neurons creating knowledge" is just a different way of looking at all the data, and I'd agree with that assessment, because that is the point - a fresh perspective on all the data is the key to a better understanding. Seeing neurons as knowledge creators has dramatic consequences for how we might model the brain and human behavior. I will explore many of these going forward. 
For the moment, let's consider some low-hanging fruit.</span></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Human behavior is clearly multifaceted. And it can switch from one solution to another faster than the blink of an eye. Is it not useful to think of behavior as a collection of cooperating and competing cues and scripts played out on this stage we call the world? And also in our skull as a worldmapculus? </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Is knowledge in the macro context not simply a trick of evolution in an attempt to seek a new, quicker, and less deadly way to evolve? </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Is it not more useful to think of knowledge as ethereal as opposed to real or physical? Compare it to music contrasted with the instrument being played. Or a story contrasted with its medium of expression whether paper and ink or pixels on a screen. These are all forms of abstracted knowledge. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Yes, I recognize the irony: what I'm presenting about knowledge and the neuron may be entirely wrong, at least logically. But I'm willing to risk it if it provides a more useful perspective for addressing this difficult challenge. Which brings us to YOU. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">What you know about the topic may be more useful than what I know. </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">If you've read this far, you've definitely thought about the nature of the neuron to some degree. If you have different answers to some of the questions I've presented, please share them with me, in private if you like. 
I read all non-spam emails.</span></p><br /><br /><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 18pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Shifting Gears</span></h3><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It's time to move from the "what" to the "how" of knowledge creation. If we accept that neurons create knowledge, how do they accomplish this amazing trick? Most of the rest of what I'm presenting will be about answering this enigmatic question. The "how" is the proof of the pudding.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Trying to describe the concept that neurons and the brain create knowledge has been one of the most difficult challenges I’ve ever undertaken. For me, the hard part has been not slipping back into tech metaphors excessively. Perhaps I’ve written the same thing too many ways, or with too sparse a vocabulary. 
I’ll blame this on the limits of my right-mind which has to draw attention to these ideas so my left-brain can capture them in written form.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Before we proceed, I'd also like to generalize about the ultimate consequences of this prime assertion, mostly to acknowledge they are myriad, profound, and difficult to overstate. Thinking of neurons as gnostic will dramatically change anything having to do with knowledge, which is to say, virtually everything from science to art and understanding human behavior. But the differences may be subtle, as my right-mind and likely yours, already knew that neurons create knowledge. Our right-mind just had to convince our left-brain. I hope my words written here help you along your path to whatever conclusion you reach. 
</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Let's just summarize the potential impact as a completely new form of applied philosophy - Neognosticism, in contrast with original Gnosticism. I could write thousands of pages on this topic alone, but others could do a much better job once this concept is well understood, so I won't. At least not right now.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Instead, I will continue to edit and hopefully improve this current presentation to extend the scope into how neurons might accomplish their amazing trick of evolution, and try to do so without getting too technical. I shall rely on a series of stories about the evolution of a number of creatures that still haunt our brains in various ways. So here this section concludes. 
Next, we’ll explore how these many creatures helped inform who we are as humans.</span></p><br /><br /></span><div><span>Continued:</span></div><div><span><br /></span></div><div><span><h3 class="post-title entry-title" style="background-color: whitesmoke; font-family: "Trebuchet MS", verdana, sans-serif; font-size: 19.5px; text-indent: 10px;"><a href="https://suddendisruption.blogspot.com/2023/02/how-neurons-create-knowledge.html" style="color: black; text-decoration-line: none;">The Gnostic Neuron - Part 5 - How Neurons Create Knowledge</a></h3></span></div><div><br /></div>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-42191830889607914822023-01-31T08:18:00.006-08:002023-09-11T16:43:04.089-07:00The Gnostic Neuron - Part 5 - How Neurons Create Knowledge<p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">(First version posted on 01/05/23)</span></p><p><b><span style="font-size: x-large;">How Neurons Create Knowledge</span></b></p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhtrSMdubo6Hp6P2gD_7dibkOF4ZcCcXomcf4jl6ZzeNR4bufi5T9kqw4z_7S_Na0EbXiEeOYrFDE18Ajvpi1xcjGI-bPT05NgzsoAASXx0A7uSUkeV5cDS9cb8v9-iZgFli8ml4I0_CmO89cvRCt7w8CLGX3EhxC5WsntD2PaPM6CQrSkYVBU/s1024/DALL%C2%B7E%202022-12-21%2012.54.20%20-%20Gnostic%20Neuron.png" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="1024" data-original-width="1024" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhtrSMdubo6Hp6P2gD_7dibkOF4ZcCcXomcf4jl6ZzeNR4bufi5T9kqw4z_7S_Na0EbXiEeOYrFDE18Ajvpi1xcjGI-bPT05NgzsoAASXx0A7uSUkeV5cDS9cb8v9-iZgFli8ml4I0_CmO89cvRCt7w8CLGX3EhxC5WsntD2PaPM6CQrSkYVBU/w640-h640/DALL%C2%B7E%202022-12-21%2012.54.20%20-%20Gnostic%20Neuron.png" width="640" /></a></div><br /><span 
style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span><p></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: 
pre-wrap;"><br /></span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;"><br /></span></p><p><span style="font-family: Arial; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p><span style="font-family: Arial; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p><span style="font-family: Arial; font-size: 14pt; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Some people say, "How can you live without knowing?" I do not know what they mean. I always live without knowing. That is easy. How you get to know is what I want to know. - Richard Feynman - </span><span style="font-family: Arial; font-size: 14pt; font-style: italic; font-variant-alternates: normal; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Meaning of it All (1999)</span></p><p><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">Once you start thinking about this new definition of knowledge, there are many answers to the above question, but none of them is a requirement to understanding the neuron’s gnostic nature. This may disappoint you but I’m not going to answer the above question directly or in technical detail. I will however describe in broad terms a few ways that neurons might create knowledge as I’ve newly defined it. Who knows, some of them may even be correct. In any case, all of that can be sorted out later using more rigorous science. 
The important thing at this point is to get a feel for the idea.</span></p><span id="docs-internal-guid-c324cfe8-7fff-af27-617a-8d04ee7ee687"><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Even though this topic is where I’ve spent the bulk of my time, both before and after realizing that neurons actually do create knowledge, the ultimate answer would still be both nebulous and incomplete. Because knowledge is the basis for most of evolution’s tricks, the answers, though quite technical in an analogical sort of way, would fill not just a single book, but perhaps a whole shelf of books, a wing in a library, or all the libraries that have ever been built. And burned. </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">As I’ve defined knowledge, the answer to this question is essentially the whole history of animal evolution and all of its consequences. Since knowledge was the key to animal life evolving and surviving, the how of it is literally half of biology and every discipline that supports it, no matter how remotely. Knowledge is everything that isn’t material, including consciousness, a topic we’ll avoid for the present. Therefore, anything I might present would be, by definition, incomplete. I’ll just lump all knowledge into philosophy, and find peace not knowing all possible answers. But we can explore a few if we’re careful not to nail them down too tightly at this early stage of understanding. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Virtually everything I present from this point on will be hypothetical, as in less than a theory. I’m going to at first set the bar for these answers so low that they can’t fail. Later they can be refined or have completely new answers replace them. After all, that’s how science works before it actually becomes science - it’s speculation. For example, any non-random relationship between things in the world and the firing of a neuron could be defined as knowledge. After that, it's just a matter of quality. Non-random is a good standard to start with.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I will try to use my most obvious examples. What I mean by this is, by the time I got around to figuring out how neurons might create knowledge, I’d already roughed out at least a dozen different ways. I just needed this fresh perspective to understand what it really meant to create knowledge. I needed to frame the question in a less rigid fashion - analogically. Once you entertain the idea that neurons do create knowledge, you realize evolution’s already figured out the “how” of knowledge. All we need to do is figure out how evolution solved the problem. And since evolution’s not trying to hide its accomplishments, they are there in plain sight only needing to be characterized in a bit more formal fashion. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">By the way, I stole this approach to solving this problem from a few lines of dialog from one of my favorite movies, “The Hunt for Red October”. It was from the shower scene aboard the aircraft carrier where the Jack Ryan character is talking to himself, trying to solve a riddle. Like his approach, I don’t have to figure out how to create knowledge. Evolution’s already done that. Knowledge exists. I need only understand how evolution solved the problem, a much easier challenge. Once I began to characterize what the nature of knowledge might be, examples started appearing everywhere I looked.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Once you begin to consider the idea that neurons might be firing because they have detected some significant real relationship between things in the world, and that the result of that firing is ethereal knowledge, Descartes’ dualism comes immediately to mind, and so many other pieces also fall into place. Spiritualism is ethereal and perhaps our most precious form of knowledge. 
Which is probably why it finds form in consciousness.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Even if you doubt the nebulous nature of knowledge and that neurons create it, you’ll have to admit that knowledge certainly exists within the human skull, as we can express knowledge using our minds to control our voices to share this knowledge. And unless you assert that this knowledge came from eating an apple or some other more spiritual source, then understanding how evolution performs this amazing trick is almost certainly on the path to understanding what the brain does. How the brain does it is secondary and has multiple solutions, and many subtle sub-versions, some even becoming subversions.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Since the challenge doesn’t have just one answer, I can go after the low-hanging fruit first. More significantly, if you’ve been paying attention to my Zen maneuvers, you will realize by now that any one answer need not preclude another. Or in some cases, it may. But only for a while. Most of these riddles I will leave for the next generation to consider. But don’t worry, I won’t leave you hanging. I’ll at the very least present my simplest solution which may likely be just an approximation. It will still need to be challenged. 
How can I fail?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Since I no longer have anything to prove, I’m going to present my answers in a very unscientific way - by telling a few stories over and over in different ways with different characters, but with increasing complexity for each example. Each will demonstrate how evolution MIGHT have evolved a new way to evolve - by creating an ethereal thing called knowledge. This may allow you to get a feel for the process and find the conviction that you need to accept that neurons really DO create knowledge, which is my prime objective in any case. The rest is mere detail to be worked out in time by people who do not yet realize the nature of the challenge. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">After all, we have to start somewhere, and so far, I haven’t found a better approach.</span></p><br /><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Venus Flytrap</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">My first example 
will also be my first exception to animals being the only source of knowledge. Does a venus flytrap create knowledge? And what’s a plant have to do with neurons? Exactly. I use this example to demonstrate that knowledge is not limited to the animal kingdom, or likely, even Earth. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Is the venus flytrap a parallel evolution of animal life? Almost certainly. Even though it can’t relocate, it’s a plant that moves quickly in order to acquire nutrition and energy, a rare quality for a plant. This exception from the plant kingdom will help demonstrate how life is a reflection of the world that drove its evolution, animal or plant. Exceptions or not. But let’s get back to our fun first example.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you haven’t played with a venus fly trap you’ve missed out on one of nature’s most interesting “creatures”, and I push the definition of a creature here for a reason. A venus flytrap seems like an animal, at least some of the time, and in multiple ways.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As its name implies, this plant traps flies (and other bugs). It’s not alone. 
There is a whole collection of plant species that capture bugs to eat and digest, but this one is special. Its trick is to close its jaw-like leaves over its prey so quickly that the prey doesn’t have time to fly away. And if you’ve ever tried to catch a fly in your fist, you know how quick a venus flytrap has to be. Quicker than me. Almost certainly quicker than you.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Surprisingly, that’s not this plant’s most interesting quality. Only its most vivid. Its real trick is knowing WHEN to close. And why. If these special leaves were to close immediately on sensing any movement (an impressive trick on its own), they would risk wasting all their energy trapping bits of lint and dirt blown in by the wind, not a very appetizing meal. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The venus flytrap needs to know the difference between the living and the non-living if its meal is going to provide the best nutrients. Since life itself is a collection of the stuff life needed to prosper, even this plant’s objective is a bit of knowledge about the advantages of eating a living versus a nonliving meal. 
That's valuable knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The venus flytrap uses a clever two-stage sensor to improve the odds. A fly lands on the first sensor, but nothing happens. If the fly moves enough to activate a second sensor WITHIN a set window of time, only then does the trap snap shut. This two-stage timing and the distance between sensors are significant. This timing exclusion window is enough to capture the asynchronous behavior of insect movement, and the sensor spacing almost certainly reflects the distance between a typical fly’s legs in some way. I learned about the first motion sensor as a child, but about the second only a few years ago.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Though the test is crude, much of the time the flytrap gets it right. Or at least often enough to yield a net gain of energy in calories. This temporal window can be thought of as creating a crucial bit of knowledge about the difference between the living and the not-living, and it is so important that it remains differentiated and lateralized in human brains, so has evidently evolved at least twice. Discriminating for life is so important that the flytrap’s speedy trick won’t work without it. At least not very well. This asynchronous bit of knowledge matters. That is why. 
Now for the nature of this knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The most obvious differentiator for animal life is movement, and this plant has evolved to detect it. You can think of it this way - insect movement is part of the relationship between the fly and the plant. It’s a very important bit of knowledge as far as the plant is concerned. Not only is this movement critical to creating this knowledge, movement in general is critical to creating much of primal knowledge, as I’ll shortly present.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The venus flytrap solves the problem with a timing window of roughly 30 seconds, and it likely uses some form of ionic signaling to achieve this result. The first movement primes for the second, and could even be described as a type of “memory” - one that fades if the second touch takes too long to arrive. I’ll leave the details to a botanist, but not only does this plant know a bug is present, it knows this by sensing that bug’s presence over a time span to trigger the capture. 
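As a sketch only - the numeric bounds and the class name are my assumptions, not botany - the two-touch trigger described above can be written as a tiny state machine: a first touch primes a fading “memory”, and only a second, distinct touch inside the window springs the trap:

```python
class FlytrapTrigger:
    """Toy model of the two-touch trigger. Both bounds are assumptions:
    MIN_GAP stands in for the exclusion of near-simultaneous debris
    strikes, MAX_GAP for the fading short-term "memory" reported for
    real flytraps."""
    MIN_GAP = 0.2    # seconds; rejects two hairs struck at once
    MAX_GAP = 30.0   # seconds; after this, the first touch is forgotten

    def __init__(self):
        self.primed_at = None

    def touch(self, t):
        """Register a trigger-hair touch at time t; True means snap shut."""
        if self.primed_at is not None:
            gap = t - self.primed_at
            if self.MIN_GAP <= gap <= self.MAX_GAP:
                self.primed_at = None  # fired; trap resets
                return True
        self.primed_at = t  # prime (or re-prime) the memory
        return False

trap = FlytrapTrigger()
assert trap.touch(0.0) is False    # first touch only primes
assert trap.touch(0.05) is False   # too synchronous: likely debris
assert trap.touch(10.0) is True    # a distinct second touch: lunch
assert trap.touch(60.0) is False   # long after reset: primes again
```

The lower bound plays the role of the exclusion window for synchronous debris strikes; the upper bound models the fading memory of the first touch.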
Such temporal knowledge could be described as a detector of asynchronicity - two things happening at two separate points in time - a very highly evolved plant indeed!</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In “Insectivorous Plants” Darwin noted that, “When the leaves are irritated, the current is disturbed in the same manner as takes place during the contraction of the muscle of an animal.”</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Darwin is to be excused for describing this plant in electrical terms; after all, he didn’t yet know the difference between electronics and ionics. Since we’re avoiding the details of triggering for now, we can leave these investigations for others, but in summary, I’d like to suggest another very important question: </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Does a venus flytrap actually “know” something in the way you understand knowledge? I’m sure there are many times when a flytrap closes on a twig blown by the wind that happens to strike twice within the appropriate timing window. That’s a fail. The venus flytrap’s discrimination about life is not perfect. Neither is knowledge. But the flytrap’s temporal test works often enough to survive and even prosper. 
The venus flytrap certainly seems to know when to have lunch. This life-detecting knowledge defined by both movement and delayed time is quite impressive for a plant that doesn’t even have a neuron.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As I conclude each of these examples, I’ll try to remember to also document why and when neurons fire. In the case of this plant, it doesn’t bother with a signal like neurons do. It goes straight to movement at the right moment. The why is nourishment, and the when is to detect life, an indicator of the quality of the food at hand. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">By the way, whatever trick the venus flytrap uses to acquire this knowledge is not in our skull as this “creature” is not on our evolutionary proto-path. 
But this exception nicely demonstrates that knowledge (and even temporal “memory”) is not the sole province of the animal kingdom, or likely even Earth.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><a href="https://www.sciencenews.org/article/how-venus-flytraps-store-short-term-memories-prey?utm_source=email&utm_medium=email&utm_campaign=latest-newsletter-v2&utm_source=Latest_Headlines&utm_medium=email&utm_campaign=Latest_Headlines" style="text-decoration-line: none;"><span style="color: #1155cc; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; text-decoration-line: underline; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;">How Venus flytraps store short-term ‘memories’ of prey</span></a><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><br /><p dir="ltr" style="background-color: white; line-height: 1.44; margin-bottom: 0pt; margin-top: 0pt; padding: 0pt 0pt 12pt;"><a href="https://www.nature.com/articles/s41598-021-81114-w" style="text-decoration-line: none;"><span style="background-color: transparent; color: #1155cc; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; text-decoration-line: underline; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;">Action potentials induce biomagnetic fields in carnivorous Venus flytrap plants</span></a><span style="background-color: transparent; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><span style="font-family: "Trebuchet MS"; font-size: 16pt; font-variant-east-asian: normal; 
font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This Is a Lie</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The title of this section demonstrates how easy it is to create a paradox using words. If “this is a lie” is true, then the sentence really is a lie, which makes it false; but if it’s false, then it must be true after all. But that’s not the worst of it. For my next example, I’m going to tell you a lie, but not completely. A complete lie would have nearly the same utility as telling the truth. Paradox aside, there’s a whole subfield of philosophy that, ironically, deals with the quality of truth. Truth by degrees is itself an invalidation. I’ve also presented its lack of utility in defining knowledge. So the lie I’m about to tell might even be true.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This approach to veracity is actually an idea I stole from Bizarro Superman. It goes to the core of Bizarro World philosophy. Bizarro Superman would typically say the opposite of the truth, and even present the opposite of logic. But not always. And not exactly. Therein lies the challenge - knowing which parts are true and which are false, and to what degree. It’s how we sort knowledge into two sets of mostly true and maybe false. Set theory needs fuzzy edges. 
Since I’ve framed this next example as the truth, I may as well make it a whopper.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">When I was a kid I asked a lot of questions, and each answer would beg another question. This is a common behavior of curious children of a certain age. I was no different. I did this at home. I did this at school. I did this at church. Most adults would simply ignore me after a while. My grandmother was pretty good at this game. She had lots of answers, but not all of them. When I reached her limit, she would simply leave me with an enigmatic smile, which I remember well. This smile meant, “I don’t know, but perhaps you can find out.” In this way, instead of leaving me frustrated, she would encourage me to seek the answer somewhere else, and so send me off on my quest.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">In the Baptist Sunday school which I attended, they also tried to answer my questions. It got to the point that the pastor of the church asked my grandmother not to bring me back to Sunday School. I was being too disruptive, which leads us to the topic of God. Don’t worry. This is not the beginning of a sermon. Far from it. Indeed, it’s almost the opposite. I neither believe nor deny anyone else’s idea of God, but I did notice that when I asked a question that no one had a good answer for, the topic of God would ultimately come up. It got to the point that I began to equate the word God with, “I don’t really know”. 
God is a pretty good stand-in for the “true” part of knowledge, and its opposite.</span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><span style="font-family: "Trebuchet MS"; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Territorial Imperative of Early Life</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">A billion years ago survival was more challenging than it is today. Darwinian evolution ruled the planet. Billions of plant cells had to die for each new mutation to replicate and prosper. Improving life was all about dying. The only life that existed on Earth at the time consisted of very small plant cells floating in water. But animals were about to literally and physically emerge using two tricks that likely co-evolved during the same event. These two were sense and movement. Along with these new tricks came the need to know WHEN to move, not unlike the Venus Flytrap above. It’s certainly possible that sense and movement evolved separately, but that is less likely, because a third, non-material aspect required the interaction of both - the creation of knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">About the time I first started programming microcomputers, I came across a computer game called “Life”. This was not the old board game of the same name; it was something entirely different. 
This “Life” was based on John Conway’s very simple rules, documented in Martin Gardner’s column in Scientific American in 1970. I didn’t get a chance to read the column until years later, but the game presented in computer code was fascinating. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This “Life” is a zero-player game - very Zen. You’d set up initial conditions graphically in a program, then press “start” to see what happened. Requiring only very modest computer resources, this game was a natural for the limited memory and displays of the earliest versions of microcomputers. Such vivid "graphical" applications were rare at the time, even if these "graphics" were just ASCII characters. Experimenting with “Life” and its “gliders” was a lot of fun, but what mostly caught my attention were the visual patterns that caused a “lifeform” to prosper and stabilize - the trick was having not too many neighbors, and not too few. There was a Goldilocks zone. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Explore the game if you haven’t. For me, it was one possible model for a primal plant, or perhaps even animal life. It inspired me to explore how plants were different from animals and why - movement was the key! Modern (and perhaps ancient) amoebas seem to live on this emergent boundary between plants and animals. How did plant cells come to move at all, and how could they best apply these new and amazing evolutionary tricks? 
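For anyone who hasn’t played it, the rules fit in a few lines. Here is a minimal Python rendering of Conway’s rules as Gardner described them - birth on exactly three live neighbors, survival on two or three, death otherwise - the Goldilocks zone in code:

```python
from collections import Counter

def step(live):
    """Advance Conway's Life one generation. `live` is a set of (x, y)
    coordinates of live cells; the board is unbounded."""
    # Count how many live neighbors every candidate cell has.
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Goldilocks zone: exactly 3 neighbors gives birth; a live cell
    # with 2 or 3 survives; too many or too few means death.
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The classic "blinker" oscillates between a row and a column.
blinker = {(0, 0), (1, 0), (2, 0)}
assert step(blinker) == {(1, -1), (1, 0), (1, 1)}
assert step(step(blinker)) == blinker
```

The `blinker` check shows the simplest oscillator returning home after two generations.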
I now realize that signaling ethereal knowledge is the key. And that knowledge has profound implications, as I’m about to present. The title of this section, which I took from a book I read back in the 1960s, is a decursive hint.</span></p><br /><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Hemo</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Before we proceed, I want to take you back to one of my 7th-grade experiences:</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><a href="https://www.youtube.com/watch?v=08QDu2pGtkc&t=11s" style="text-decoration-line: none;"><span style="background-color: white; color: #1155cc; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; text-decoration-line: underline; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;">Hemo the Magnificent</span></a></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">When I first saw this film as a child, it set off a whole series of possibilities in my mind. Note how the brain is presented as a telephone system. This was high-tech at the time. Computers did not yet dominate our cultural modeling of the brain circa 1959. 
Watching it decades later, I see the decursive nature of multi-celled specialization and chemical signaling in its various forms - neurons included. In spite of its now-dated content, some of which is now in doubt, the movie has many useful concepts. I won’t do a detailed review, but it’s worth watching, if nothing else, to understand how I came to understand the challenge of evolution at the time I was about to begin learning about electronics, logic, and later, computers. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Staying with my theme, this film honors the difference between science and art, which may have inspired its production. I was also fascinated by the idea that blood was similar to ancient seawater as it might have existed about a billion years ago. Decades later I became especially interested in the ratio of potassium to sodium in modern seawater, and even more significantly, in contrasting those very ratios in biology between what’s inside the cells, and what’s outside. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I won’t dive deep into chemistry here, but a quick review of certain aspects will be useful at this point. Various salts in dry form are made up of ions of two different elements in a stable crystal configuration, but when you put them in water something very interesting happens - they dissolve in solution. The most common example is table salt. 
When sodium chloride is dropped into water it will dissociate into separate sodium and chloride ions, each with opposite ionic charges. The same thing happens with potassium chloride. All of these ions float around freely in seawater most of the time. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What’s really interesting is that there are about three potassium ions for every sodium ion withIN most plant and animal cells, but that ratio is reversed and varies by almost an order of magnitude in the saline fluid outside of the cell. There are 27 sodium ions for every potassium in typical seawater, perhaps less in ancient seawater. You might have noticed how they tried to explain this difference in the above movie but mostly failed. </span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Still, the difference in these ratios led me to explore the topic once I got into college chemistry. It’s where my first doubts about the electrical and logical nature of neurons began. The closer I looked, the more I realized that internally, neurons are ionic, not electronic. And outside of the neuron, they were more chemical in nature than anything having to do with electrical or even ionic charge. 
This now sets the stage for the next example.</span></p><br /><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><span style="font-family: "Trebuchet MS"; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Hillock, the Proto-protozoan 1.0</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">My second example will be the very first animal, a proto-protozoan, perhaps born of a proto-amoeba. The meanings of the words amoeba and protozoa have been so abused by reclassification as to be almost meaningless, but not quite. All protozoa are animals. Some protozoa are one-celled; others, multi-celled. They range from simple to complex in structure and function. In a similar respect, amoebas cover an informal classification including fungi, algae, and of course, protozoa. Our very first animal is of the very simplest type, a single-celled proto-amoeba, proto-protozoan, proto-neuron - but with no dendrites and no axon. We’ll call him Hillock to note his most significant structure and function.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This first hypothetical beast (and his immediate progeny) is so simple that it has only some standard replication and energy-conversion stuff from the plant world. Hillock does not yet sense anything. It does not yet move on its own. It knows nothing, and never has. But that’s about to change. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Like some modern amoebas, Hillock has a leaky membrane. Also like some modern amoebas, this one has become somewhat structurally differentiated in the strength and nature of this membrane - asymmetrically. On the left side of the cell body (to have a reference) is an area of the membrane that is more likely to capture potassium ions (along with food) when encountered externally. This spot will ultimately evolve into a dendrite. On the right side is its hillock. In the future, it will protrude to become a proto-axon, again much later, so not today. For now, the left and right sides simply differ in the relative strength of the membrane surface and in how it responds to changes in osmotic stability.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This particular Hillock has evolved a very interesting characteristic. Its proto-structure acts like a Zener diode from the tech world. Using the migration of potassium ions instead of electrons, Hillock allows a free flow inward but stops the exit of potassium ions until an inflection point is reached. For the techs among you, think in terms of the easy flow of potassium only in one direction (inward) until a certain threshold of ionic potential is reached, at which point the migration dramatically reverses direction, resetting the internal ionic balance to a more stable configuration. This hasn’t happened yet in all of evolutionary history, but it’s about to. 
The point is, the membrane at this hillock is about to do something similar with ionic charge that a Zener diode does with an electronic charge. This dynamic membrane has long let potassium ions enter this proto-amoeba but has so far never reached this inflection point to let the potassium ions all go at once, or at least a lot of them, but it hasn’t happened quite yet. Patience.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I’m now going to describe this event in the simplest terms I can. Science will ultimately need to characterize this structure in detailed biology but for now, I will take some liberty. As mentioned, the ratio of sodium to potassium in seawater is 27 to 1. You are 27 times more likely to encounter a sodium ion in seawater than you are to encounter a single potassium ion, yet plants and animals have roughly three times more potassium ions within their cell membranes than sodium ions. This means that there’s a difference in concentration of 81 times between the outside and the inside of any living cell as to the ratio of these two different types of ions. 
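</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For anyone who wants to check that arithmetic, here’s a minimal sketch of the ratio-of-ratios calculation. The constants are just the rough figures quoted above, not measured values:</span></p>

```python
# Rough figures from the text above (illustrative only; real concentrations
# vary by organism and between ancient and modern seawater).
SEAWATER_NA_PER_K = 27.0  # ~27 sodium ions per potassium ion outside the cell
CELL_K_PER_NA = 3.0       # ~3 potassium ions per sodium ion inside the cell

def inside_outside_contrast(na_per_k_outside, k_per_na_inside):
    """Ratio of ratios: how much the K:Na balance differs across the membrane.

    Outside, K:Na is 1/27; inside, it is 3/1. Dividing the inside ratio by
    the outside ratio is the same as multiplying 3 by 27, giving 81.
    """
    return k_per_na_inside * na_per_k_outside

print(inside_outside_contrast(SEAWATER_NA_PER_K, CELL_K_PER_NA))  # → 81.0
```

<br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">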
Since this ratio roughly holds for both plants and animals, detecting potassium ions can be thought of as a chemical indicator of life, or at least it’s 81 times more likely to be the case, which is a far more effective test for life than the Venus Flytrap’s motion detector, and so the knowledge it yields is more valuable, by degrees.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If all this biology and ionics is becoming a blur, that’s fine. I’m just being technical in a very fluffy way for those who have never played with the details. What I’m describing is almost certainly wrong in many possible respects, except for the final result. It’s quite likely this final result happened in some fashion if not the one I’m actually describing. The details are not as important as the outcome, which is actually quite profound. I include these details to stir the imagination. Challenging them may help some to understand the dynamics of the first creation of knowledge by this proto-amoeba. 
Hillock is about to cross that line from being a plant to become an animal.</span></p><br /><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The Nature of Boundaries</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It’s a billion years ago, plus or minus a couple hundred million years. Now imagine a totally barren desert landscape next to a large body of water. By totally barren I mean exactly that - just rock and sand above the water - nothing else. Plants won’t find their way onto land for several hundred million years. Imagine no visible life at all in this desert - no lichen on the rocks, no algae slime in the water. If you were searching for life you’d need a microscope and some luck. Life at this point was so rare as to not produce any obvious visual indication even when you looked closely. 
All you could see would be sand and water, and only feel the air if the wind was blowing.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At a place in the shallows was a point where the sea, sky, and land came together to form a spatial triple boundary. There was the surface between the water and the air, another surface between the air and sand, and of course a third between sand and water. These surfaces meet at a dynamic and ever-changing line we call a shore. This sand, water, and air could also be described as volumes. Even surfaces have area, but shores form a dynamic line where all three meet. I’ll here suggest that most of the really fun stuff in nature happens at boundaries, from the nano context to the macro, and even mega. Keep an eye out for them and think about their nature. As we progress, the knowledge you might imagine emerging from these boundaries will be useful in this exercise.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The water in our story might well have been an ocean filling what is now called North America’s Great Basin, at a place not far from present-day Reno, Nevada. I’ve hiked the mountains and valleys that make up this ground for literally thousands of miles. 
I’ve come to know the dynamic sky that covers it, and have even enjoyed the remaining bodies of water making up these boundaries as described.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This hypothetical event happened long before the Sierra Nevada mountains as we currently know them even existed, but dry land matters little to this second story, and the sky, not at all. The sand at least provided a shore and shelter from the waves and small currents for the creature I’m about to describe. Land and air will matter much more for other creatures in a few hundred million years. We can ignore them for now.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Until the time I’m about to describe, knowledge did not exist on planet Earth. Nor had it ever existed. The prefix “proto” means “first” for a reason. Single-celled plants were relatively rare but still common enough to be found with a microscope and indeed were the stock from which a new way of dealing with the world was about to evolve. Knowledge was about to become evolution’s newest trick, and the very basis for evolving a new way to evolve. 
We have a front-row seat thanks to our mind being able to imagine this setting and a single creature which was also the very epitome of the word “singular” in its ability to create that very first bit of knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What generally separates plants from animals? Movement, of course, even just seemingly random movement. This morning for the first time there would be meaningful movement. What makes movement meaningful? Knowledge. What makes knowledge meaningful? Movement. This critter was about to become the first expression of the mind-body aspect of evolution in a very Zen fashion. Now back to Hillock.</span></p><br /><br /><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Hysteresis - All or Nothing</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">As various potassium ions make their way through the left membrane of our proto-neuron over time, they tend to increase the ionic charge within this cell compared to the saline solution outside of this cell. 
This is best described as a type of ionic priming, the build-up of total ionic charge within the membrane. For now, we can also think of it as “sensing”, for the accumulated ionic tension reflects the density of potassium captured by the membrane and so also reflects the recent density of potassium ions outside the cell. At least on average. Some ions slowly leak back out over time, thus decreasing this ionic tension or priming, but mostly they are contained and accumulate. How much priming can our cell take? That’s where hysteresis comes in. This particular cell, this particular morning, had a weakness that became its strength.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Let’s get over to the right side of the cell again. This is where that weak spot evolved an interesting characteristic. The ionic tension of potassium is additive up to a critical and consistent inflection point, reflecting the physics of biology. Here I have to make a careful distinction. The ionic priming is not a strict function of potassium density, but there is a relationship - a mathematical relation. The reason for this function/relation distinction is that there’s a random aspect to potassium density and membrane weakness. It’s not deterministic. 
Our left-brain would like to think it’s deterministic, as that would make for a more accurate prediction, but we don’t need to project perfect accuracy into a story where there isn’t any to be had.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So, reflecting average density, potassium ions would build up in concentration or slowly leak back out, but if (and when) this potassium ion concentration reached a critical inflection point in a given period of time, something very interesting happened. These ions diffused across the cell, creating a gradient of ionic potential in the process. This particular spot in the cell membrane weakened but did not actually rupture, as membranes are flexible. This weak spot was about to release its potassium ions all at once hysterically, as in hysteresis. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Ionically (and ironically), hysteresis is best understood and characterized in how magnets change states. This sudden release might create a chemical signal of sorts when this group of ions flooded out without bursting the membrane. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If you’re not familiar with magnetic hysteresis, think toggle switch. 
If you haven’t noted the difference, toggle switches are not like most light switches found in houses. Toggle switches were once more common but are now typically found in old World War II airplane dashboards and other specialized equipment. Toggle switches have a rounded-off shiny metal button-shaped lever instead of the more squared-off plastic version found in common house switches today. They also have another VERY important aspect best characterized by how they turn off and on. It’s called “all or nothing”.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">You can push a toggle switch very carefully past its center point until it goes just barely beyond, then it flips over completely without applying any more force. Next, you can do the very same thing in the opposite direction to turn it off. Toggle switches always switch when moved just past center. They were an early attempt to avoid electrical glitches. Toggle switches are a mechanical version of magnetic hysteresis, but critically, neurons do not remain in their switched state. They immediately switch back to their resting condition, producing a signal but not a change in state, a weakness of this particular metaphor, like the weakness in Hillock’s membrane.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Magnets, toggle switches, neurons, and yes, humans in the macro context all exhibit hysteresis. 
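</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If it helps to see the all-or-nothing idea in a form you can poke at, here’s a toy “accumulate, leak, and fire” loop. The threshold, leak rate, and input values are all invented for illustration; nothing here is real neuron physiology:</span></p>

```python
# Toy all-or-nothing firing: charge accumulates with a slow leak, and when it
# crosses a threshold the cell "fires", dumping everything at once and
# resetting. All constants here are invented for illustration.
THRESHOLD = 10.0  # the inflection point
LEAK = 0.95       # fraction of accumulated charge retained each tick

def run_cell(inputs, threshold=THRESHOLD, leak=LEAK):
    """Feed a sequence of ion arrivals to the cell; return the ticks at which it fired."""
    charge = 0.0
    firings = []
    for t, arriving in enumerate(inputs):
        charge = charge * leak + arriving  # accumulate, minus the slow leak
        if charge >= threshold:            # all or nothing: cross the line...
            firings.append(t)              # ...release everything at once,
            charge = 0.0                   # and reset to the resting state
    return firings

print(run_cell([0.4] * 50))  # → []      a steady trickle never fires (the leak wins)
print(run_cell([3.0] * 10))  # → [3, 7]  a burst crosses the threshold and fires
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Note the shape of the response: the output is not proportional to the input. Nothing at all happens until the inflection point, and then everything happens at once.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">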
Observe that humans in relationships with other humans maintain a type of dynamic power balance, right up until they don’t. A failing marriage will have a building tension right up until the point of divorce. And then hysteresis. Few marriages survive the conviction of divorce. Decursively.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Something similar to hysteresis evolved in Hillock’s membrane - it gained an ionic inflection point. Feel free to apply any metaphor that helps as I describe this asymmetric ionic weakness. Some of the biologists among you will recognize this weak spot as the beginning of a “protein channel” using positive feedback. As ions accumulated, tension built until it gave up most of its potassium ions all at once before resetting to its normal resting condition ionically.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It’s also a mechanical inflection point that allows a gun to fire. Yes, one more, and perhaps better, metaphor. Once gunpowder begins to burn, it’s all or nothing. This is one reason a neuron’s firing is called a triggering event. In other words, impact ignites gunpowder, then it’s all or nothing. Something similar happened with Hillock this particular morning: knowledge was about to come into existence. 
And then be gone.</span></p><br /><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">I Don’t Know, But the Neuron Does!</span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">See what I did with the above title? I made the distinction between subjective and objective knowledge. As your narrator, I don’t objectively know when the tension gets too high for the membrane to “hold”, but Hillock’s membrane literally DOES subjectively know. And that’s when it fires. The sovereignty of the decision lies with Hillock. "You can lead a horse to water, but it's up to the horse if it wants to drink." That's because it's the same with the horse's neurons. 
The sovereignty of knowledge lies with the neuron.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Finally, the ionic tension became too much for Hillock’s membrane, and a large number of potassium ions burst forth from the right side of our proto-neuron. This caused Hillock to move a tiny amount in a random direction, somewhat away from this cloud of potassium ions, at the same time creating a localized exception to entropy in the form of potassium density. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Interestingly, the new locations of these potassium ions could be described as an expanding probability cloud around the location from which they were released. This would become a useful localizing beacon for later progeny, a sign of other lifeforms in the area.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Though only an approximation, this bit of imperfect knowledge about the relationship between itself and other lifeforms in the immediate surrounding area was just enough to make a difference in the probability of survival, somewhat like that game of “Life” described above. 
If this was not the case, the next steps in evolution would have been far less likely.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If we characterize this bloom of potassium ions as a chemical signal, even though it only lasted for an instant, it was still enough of a signal about the proximity of other lifeforms to gain an advantage over the other plant cells at that moment. It was just enough that a new classification kingdom was born - animals. Life on Earth would never be the same. </span><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; white-space: pre-wrap;">When you disturb a random process, you create a type of order, in this case, a bias of displacement yielding a</span><span style="font-family: Arial; font-size: 18.6667px; white-space: pre-wrap;"> chemical signal.</span></p><br /><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">A Plant Becomes an Animal</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At that very moment, in this cell, floating in the Great Basin of North America, knowledge was created for the very first time, and then immediately was gone. No one noticed. 
This was because other animal life that might be able to sense such an event did not yet exist on Earth, but shortly would, once Hillock replicated. It’s only our imagination that provides a view of this very special moment. So back to the details of our very first proto-neuron. What exactly was being detected? Why does it matter? And what does it “mean”?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Was the creation of this knowledge worth the energy expended in managing the ionic balance? The answer is likely yes, or this particular trick would not be with us today even though it’s now a far more refined and energy-efficient version.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">What was being detected of course, like the Venus Flytrap, was a chemical signal of significant life in the area, or at least the odds were 81 to 1 that life was being detected, and those were pretty good odds, considering. What did Hillock do with this ethereal knowledge? Two things. As this specific type of ionic release occurred, the body of this cell moved a very small amount in a random direction allowing itself to optimize its relative location for survival somewhat like Brownian motion. 
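</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">That “move a tiny amount in a random direction” trick can be sketched as a simple random walk. Everything here is invented for illustration - one unit step per firing, four possible directions:</span></p>

```python
import random

# Toy sketch of "fire, then jiggle in a random direction": each firing nudges
# the cell one unit step in a randomly chosen direction. The walk is
# undirected, Brownian-style - no goal, just motion. Purely illustrative.
def wander(n_firings, rng=None):
    rng = rng or random.Random(0)  # seeded for a reproducible toy run
    x, y = 0, 0
    for _ in range(n_firings):
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x += dx
        y += dy
    return x, y

# The average displacement over many such walks is zero, but the typical
# distance from the start grows with the square root of the number of
# firings - enough to relocate the cell with no knowledge of *where* to go.
print(wander(100))
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">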
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The second thing that occurred is that a concentrated bit of chemistry (potassium ions in this case) released into the seawater would provide a type of signaling for itself and other future progeny. It would tend to keep randomly moving until the potassium concentration decreased, giving it space from its neighboring lifeforms. This would have the result of producing the inverse of entropy, at least in a very limited and localized context. Such knowledge would become more significant over time as this new species evolved. This cell not only improved the odds of survival for itself, it literally became “Maxwell’s Demon” for concentrating potassium ions.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The Second Law of Thermodynamics may seem to be challenged by the above concept, but only locally. Though entropy on a macro scale mostly dominates, there are many hyper-localized exceptions. How else could life ever become increasingly complex? Even evolution itself could be described as a form of extropy.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Before you get excited about negative entropy or signaling progeny, what I’m describing is only hypothetical. 
This event may have occurred in a completely different way, but at some point, the first animal cell DID appear. And because of replication (the more significant half of evolution, as Richard Dawkins noted in “The Selfish Gene”), some form of selection for the creation of knowledge occurred at some point millions of years ago. Existing animal life is proof. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Now you may consider this example of questionable value in understanding the creation of knowledge, but that’s the point. It barely qualifies as an example at all, which was my objective in this flight of fancy. I may be completely wrong in this description, but it doesn’t matter greatly. If it didn’t happen exactly in this way, the important thing was, it happened. We have the result to study. Somehow evolution got from plant to proto-animal, and this description is as good as any for now. It’s enough to move us along in our quest to understand knowledge and the neuron.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">You might not think that the ratio between sodium and potassium within a cell compared to that outside the cell is very useful knowledge, but when you consider these ratios as an indicator for lifeforms, and thus the probability of encountering food or becoming food, the significance for survival increases dramatically and disproportionately. 
I honestly don’t know if this particular bit of knowledge was the first to evolve, but it likely occurred early in the evolutionary process, almost certainly before sodium-potassium pumps evolved. In any case, I was looking for one of the most primal examples of knowledge generation and this is my favorite so far. In the greater scheme of things, the accuracy of the example is far less important than the concept itself. If you have a better candidate for the very first genesis of knowledge, let me know.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">You may also wonder why I chose such a primal description of this first animal life. It’s specifically because this critter predates ion pumps, synapses, dendrites, and even axons, yet still has all the critical functions to be a proto-neuron. All of these other things would emerge in due course. Evolution had to start somewhere. The simplest version is the most likely.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The point of this example is that a lifeform was able to produce a chemical signal that reflected the ionic and mathematical relationship, in the form of a probability ratio between sodium and potassium ions. I could have described this mathematical relation as objective information except that it is not material nor consistent, exists for only a very small part of a second, and is not stored anywhere. But knowledge of this relationship is similar to what I’ve described in the last post. 
It was ephemeral, ethereal, and not very accurate, but it still beat the odds for survival, so it had utility and conviction approximating Plato’s definition of knowledge, perhaps better. Well, except for missing the truth, which is why I describe it as a useful lie.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: white; color: #222222; font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Finally, a reasonable person might not consider what was created when this first proto-neuron fired as ethereal knowledge at all. Said possible knowledge was not required for the said critter to move, and perhaps even evolve. Where exactly is the knowledge? And why does it matter? The answers to these questions are subjective and relative to the critter in question. Not us. Did it survive? Yes. Or perhaps it was a cousin. It doesn’t matter much which. What matters is that some proto-animal found that useful moment to fire, perfect or not. As you’ll see, the quality of knowledge will improve, but it will take time. Now onward.</span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 10pt;"><span style="font-family: "Trebuchet MS"; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Proto-protozoan, Version 2.0 - Friend or Foe?</span></h1><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">It’s ten million years later, or perhaps a hundred million. It’s hard to know. Evolution moves in fits and starts, of which the Cambrian explosion was only the most dramatic example. Knowledge was fragile and quite rare at first. 
The early tricks were sometimes the most difficult to get right. Evolving a new way to evolve was not easy, nor probable. But it did happen, for it exists now. New developments were disproportionate, even extremely disproportionate. The start was slow, but the acceleration was amazingly fast, as evolution tends to reapply these tricks over and over, and quite effectively. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">At long last, our first animal has finally specialized into four cells, each now differentiated (later plants would evolve even more structure). There are two sensor neurons, an interneuron, and a muscle cell. This multi-celled protozoan has a primitive neural net with its signals converging on that muscle cell. The interneuron has actual dendrites, and each of the neurons has useful ion pumps, a modest proto-axon bump, and most importantly, synapses for both input and output connections. We can now better describe how a proper neuron might create some practical knowledge. Yes, I’ve jumped over an amazing amount of evolutionary detail. Someone else will have to fill it in.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This second animal example in many ways does the same thing as the first, just more effectively. That 81-to-one ratio between the probability of encountering sodium and potassium ions is now part of our newly minted neuron’s basic machinery and will forever be part of basic biology for all neurons. 
One of the sensor neurons in this pair still detects potassium ions and can be considered the ancestor of all neurons in various respects. One of its progeny will take up residence in our tongue and evolve to become our taste receptor for “salt”, the sodium kind.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The prime difference between plants and animals, movement, is also set, with a few exceptions here and there as noted. You can now think of these two kingdoms as two logical sets in this ionic ocean. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Something else very important has happened in the last few million years - the very first xenophobia, and the need for discrimination. Such discrimination between one thing and another is the very essence of what neurons do in the process of creating knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Because of their general inability to move, plant cells have to compete with other plant cells for light to create and store energy. Our new animals have the ability to move and so manage their territory more effectively. This movement is so important to our newly-minted proto-animals that the act of movement, along with where and when to move, has become the reason for our brand-new upgraded neurons to exist. 
Where and when are forms of knowledge, but you won’t find this knowledge in any physical form except for the occasional firing of these neurons. This ephemeral knowledge is mostly a reflection of the challenges the world presents to these proto-creatures in the moment.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Plants and animals have had a schism and will never again be united as one. They no longer share the same agenda. Plants passively replicate and focus on converting sunlight into energy. Our new Proto-protozoan has become a mobile vegetarian. It finds it more effective to survive by simply eating plant cells than by converting energy from the sun on its own. This newly defined relationship between plants and animals has made them competitors of sorts. So far, Proto-protozoan, Version 2.0 is not a cannibal. That too will later change. For now, its agenda is knowing the difference between plants and animals, reflecting the relationship between the two kingdoms.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Even though other animals are not the best of friends, there is an advantage to being part of a herd of its own kind. Fortunately, this new protozoan version 2.0 has evolved a way to recognize its food - the very first sense of smell, which brings us to our new sensor neuron. I hope you will forgive me for not describing it as a neuron; its dendrites do not have any synapses. Instead, they have the ability to chemically smell plant cells as differentiated from other animal cells. 
This is the quality (and the knowledge) that it communicates to its adjacent interneuron.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">So how does our new sensor neuron tell the difference between a plant cell and another protozoan? It could be as simple as being able to detect chlorophyll in the water, or perhaps some other molecular marker now lost to history. In any case, this “odor” allows our sensor neuron to fire when this marker is in the water, causing its interneuron to trigger movement in the adjacent muscle cell, and as we’ve already noted, seemingly “random” movement may actually have greater than random value for survival. These two sensors can be thought of as an analogical “OR” from the interneuron’s subjective perspective. The logical “OR” is fairly obvious, as there are now two reasons to move. The “ana” part will be explained in due course.</span></p><br /><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Follow the Mosquito</span><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Again, the mosquito obviously isn’t on the human proto-path of evolution, or even close. 
But we may share a homing trick so distant in the past that it existed long before the mosquito or even insects in general. Here’s where taking a decursion in the opposite direction can be of help in understanding the nature of Proto-protozoa Version-2.0. We’ll start with how a modern mosquito finds human blood for food. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Among other methods, mosquitos tend to move in a random direction for a random distance using a type of biological random number generator. They also have a neural sensor that fires when it detects carbon dioxide. The firing of this neuron (and the bit of knowledge it represents) inhibits the mosquito’s random direction generator, and so the path continues straight ahead until no more carbon dioxide is detected. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">If we return to that model of the probability cloud of the drunk and the lamppost, this resetting of the random number generator establishes a new “lamppost” nearer the source of possible human carbon dioxide. Note that this is not deterministic machinery. Nor logical. It’s all about probability in several different ways, and the trick is about beating the odds, which is all that’s required to get a fresh meal of blood. Probability is also what makes knowledge different from information, as I’ve described in the last post. 
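</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For the programmers out there, this carbon-dioxide-gated walk is simple enough to sketch. What follows is my own toy Python sketch, not anything from the mosquito literature; the step sizes, the angles, and the clean true/false sensor are all simplifying assumptions. The one real idea it carries is that the firing of the sensor inhibits the random direction generator:</span></p>

```python
import math
import random

def mosquito_step(position, heading, senses_co2, rng=random):
    """One step of a CO2-gated random walk. While the sensor 'fires'
    (senses_co2 is True), it inhibits the random direction generator,
    so the current heading is kept and the path runs straight ahead.
    Otherwise a new random heading is drawn (a 'tumble')."""
    if not senses_co2:
        heading = rng.uniform(0.0, 2.0 * math.pi)  # new random direction
    step = rng.uniform(0.5, 1.5)  # random distance, in arbitrary units
    x, y = position
    return (x + step * math.cos(heading),
            y + step * math.sin(heading)), heading
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Run over many steps, such a walk stays straight while inside a plume and wanders when outside it, which is enough to beat pure chance.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">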
In any case, this is one possible method for our Proto-protozoan, Version 2.0 to locate food using a different marker to help create this important knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The four cells making up our new animal can now herd with others of its kind as friends, yet also focus on eating plants instead of random cannibalism. If you haven’t noticed, Version 2.0’s inherited ability to sense ion ratios has divided its world into living and non-living proximity, and now its new ability to smell plants has further divided the set of living into either plants or animals. The intersection of these four mathematical sets could be described as a new, more complex type of knowledge that allows our critter to know the difference between eating dirt and eating food, yet without being a cannibal. Knowledge is becoming more abstract. And more useful.</span></p><br /><span id="docs-internal-guid-c926360b-7fff-4876-483f-0ab907a6ad80"><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">For you techy types, you can think of this set discrimination as a type of binary search, a very logical process that can yield prosperity, analogically. And it does. I hope it is clear how the firing of these two specialized neurons creates different types of knowledge to yield a third type of knowledge allowing for movement and increasing the probability of survival and replication. 
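</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Written out as code, that nested discrimination might look like the following Python sketch. The sensor names and the clean true/false values are my simplifying assumptions, since a real sensor neuron fires probabilistically, but the two binary cuts are the point:</span></p>

```python
def classify(senses_life, smells_plant):
    """Two nested binary cuts, one per sensor neuron: first the world is
    split into living vs. non-living, then the living are split into
    plants vs. animals. The intersection names what the critter faces."""
    if not senses_life:
        return "dirt"  # non-living: nothing worth eating
    return "plant" if smells_plant else "animal"

def should_eat(senses_life, smells_plant):
    # Our vegetarian critter eats plants, ignores dirt, and is no cannibal.
    return classify(senses_life, smells_plant) == "plant"
```

<p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">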
So far evolution has converted the relationship between the living and the non-living, and also the relationship between plants and animals, into very specific, though often inaccurate, knowledge.</span></p><br /></span><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Again, I’ve likely got many of the details of our Proto-protozoa Version-2.0 wrong, but it matters little. They obviously evolved in some fashion or they wouldn’t be here now. What’s more important is the concept of them using a type of very primal knowledge “to understand” the relationship between themselves and the nature of the things around them. They used this knowledge to time movement, survive, and replicate. Knowledge is the key to survival, and neurons create it. Complexity just makes knowledge more specific, as our next example will demonstrate.</span></p><br /><br />Continued:<br /></span><div><span><h3 class="post-title entry-title" style="background-color: whitesmoke; font-family: "Trebuchet MS", verdana, sans-serif; font-size: 19.5px; text-indent: 10px;"><a href="https://suddendisruption.blogspot.com/2022/05/assertion-salad.html" style="border-bottom: 1px dashed red; color: black; text-decoration-line: none;">The Gnostic Neuron - Part 10 - Assertion Salad</a></h3><div><br /></div></span></div>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-3642618944501446662023-01-29T06:00:00.007-08:002023-12-31T16:15:41.825-08:00The Gnostic Neuron - Part 9 - Assertion Salad<p> <span style="font-family: Arial; font-size: 11pt; font-weight: 700; white-space: pre-wrap;"><Originally posted July 1, 2021></span></p><span id="docs-internal-guid-30f69911-7fff-ec0e-d6b1-58694b61725a"><br /><br /><span 
id="docs-internal-guid-30f69911-7fff-ec0e-d6b1-58694b61725a"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></span><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg-hchcC6v2CBHbtfqlKQdt6ygV01FiOcqX_BKluBU5lQDhTcbB8ULa3jBQsi0U3O4ipTtiMs90N8rWqNoRHpl2fOO5iJLi8gHdxNyVe85w4f2bkOCNppHZNQEbqXJL9aZSQkFYKuW8lS8TZPoWKMMeQ9CZKJ1v7z9Ly21qu6MovVG4WJcmJlk/s640/08.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="640" data-original-width="640" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg-hchcC6v2CBHbtfqlKQdt6ygV01FiOcqX_BKluBU5lQDhTcbB8ULa3jBQsi0U3O4ipTtiMs90N8rWqNoRHpl2fOO5iJLi8gHdxNyVe85w4f2bkOCNppHZNQEbqXJL9aZSQkFYKuW8lS8TZPoWKMMeQ9CZKJ1v7z9Ly21qu6MovVG4WJcmJlk/w640-h640/08.jpg" width="640" /></a></div><div><br /></div><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Assertion Salad</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">"No generalization’s worth a damn, including this one." 
- Oliver Wendell Holmes</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">To summarize what I’ve posted so far, I’ll now share a list of my favorite brain assertions, which I’ve been editing for years. In the spirit of playing with ideas about the brain, I’ll present these generalizations as a collection of poetry, a sort of free-verse association for concepts and ideas about the brain. The result will be an assertion salad, not unlike a word salad, but somewhat less random, and hopefully a bit more useful. </span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Some of these assertions will contradict others in various ways, some will be sweet, some salty. Some may even be unsavory. Think of it as an evolving recipe. What broad generalizations, what simple assertions can you make about the brain? Your salad will vary from mine depending upon the aspects of the brain you’ve focused on. Yours will have the ingredients you tend to appreciate, but keep your options open for new ideas and flavors.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">This particular assertion salad informs my current casual and intuitive overview of the brain. Some of these assertions are more probable than others. Some may ultimately be dead wrong, but intriguing at the moment. For me, all have been useful in some way. 
</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I suggest you literally document your own assertion salad by challenging mine, or just check off the ones you agree with for now. Steal freely - that’s the key to great art. Create some new ones. Play with the concepts until they feel right; until you find significance. You can always come back for seconds. Or make a completely new salad. This exercise should ultimately evolve a tastier result, and hopefully yield a better understanding of the brain:</span></p><br /><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">"The brain is embodied, and the body is embedded."</span><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"> - from "Second Nature" by Gerald Edelman, Nobel laureate.</span></p><div><span><br /></span></div><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Reality exists.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><b style="font-family: Arial; font-size: 21.3333px; white-space-collapse: preserve;">Our world exists in this reality.</b></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><b style="font-family: Arial; 
font-size: 21.3333px; white-space-collapse: preserve;">Our world contains things.</b></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><b style="font-family: Arial; font-size: 21.3333px; white-space-collapse: preserve;">There are many relationships between these things.</b></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><b style="font-family: Arial; font-size: 21.3333px; white-space-collapse: preserve;">These relationships are ethereal, apart from reality.</b></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><b style="font-family: Arial; font-size: 21.3333px; white-space-collapse: preserve;">Neurons divine the nature of these relationships...</b></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><b style="font-family: Arial; font-size: 21.3333px; white-space-collapse: preserve;">but only relatively few of them.</b></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><b style="font-family: Arial; font-size: 21.3333px; white-space-collapse: preserve;">Neurons create knowledge from these relationships.</b></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><br /></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space-collapse: preserve;">Neurons sense relationships between things in the world...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space-collapse: preserve;">One of those things sometimes being itself.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; 
text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space-collapse: preserve;">Therefore neurons create knowledge.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space-collapse: preserve;">Relationships in the world inform knowledge.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space-collapse: preserve;">Neurons sense relationships.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;">Knowledge informs movement.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Knowledge means what knowledge moves.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Neurons express knowledge when they fire.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"><span style="font-size: 14pt; text-align: left;">All knowledge can ultimately be reduced to knowing when to fire.</span></span></p><p dir="ltr" style="line-height: 1.656; 
margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Knowledge strives to be invariant from multiple sources.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;">Knowledge is proto-information.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;">Knowledge is digital, sort of.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;">Knowledge is analog, but not completely.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;">Knowledge is abstracted by jumps from neuron to neuron.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;">Primal neural pathways loop with the real world.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: 
1st-order knowl">
pre-wrap;">1st-order knowledge occurs at a neural sensor.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;">10th-order knowledge is quite primal.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space-collapse: preserve;">100th-order knowledge loops in reality now.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;">1,000th-order knowledge loops with the cortices later.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;">1,000,000th-order knowledge is contemplated for 17 minutes.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"></span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Information is fixed; knowledge evolves.</span></p><p 
dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">What do YOU know?</span></p><div><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain is a reflection of the world that drove its evolution.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">As an individual brain is a reflection of that individual's experience.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">As our culture is a reflection of our collective experience.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Which is a reflection of the brain's architecture.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; 
text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Which of course, is part of that world, creating a circular dependency.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Only loosely tethered to reality.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">All within our skull.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain is profoundly divided left and right,</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">providing for a necessary isolation.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The two sides of the brain are redundant, by degrees.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; 
margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The two sides of the brain specialize, by degrees.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">They could accomplish neither if fully integrated.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Brain division creates this necessary isolation.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The two sides of our brain are actually separated by...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">a common corpus callosum of competing inhibition.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"> </span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; 
margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The left side of our brain deals with logic, tools, and language,</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">but not exclusively so.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The right side of our brain deals with colors, music, and visualization, </span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">but not exclusively so.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Together, both sides of our brain create many useful dichotomies.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">And mine the area between these limits for wisdom.</span></p><p dir="ltr" 
style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"> </span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain is divided in two, left and right.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Each side is layered in its evolution.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The mind is multifaceted, reflecting this architecture.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">As the neuron is multifaceted in its creation of knowledge.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="background-color: transparent; color: black; font-family: Arial; font-size: 14pt; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><b><br /></b></span></p><p dir="ltr" style="line-height: 
1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="background-color: transparent; color: black; font-family: Arial; font-size: 14pt; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><b>Knowledge is the more concrete, organic, flexible,</b></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="background-color: transparent; color: black; font-family: Arial; font-size: 14pt; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><b> and right-minded version of</b></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="background-color: transparent; color: black; font-family: Arial; font-size: 14pt; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><b> information. 
</b></span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><span id="docs-internal-guid-14be1e8a-7fff-8275-070b-7f55ff23eb4b"><b><br /></b></span></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="background-color: transparent; color: black; font-family: Arial; font-size: 14pt; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><b>Information is the more abstract, technical, fixed,</b></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="background-color: transparent; color: black; font-family: Arial; font-size: 14pt; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><b> and left-brained version of</b></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="background-color: transparent; color: black; font-family: Arial; font-size: 14pt; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><b> knowledge.</b></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="background-color: transparent; color: black; font-family: Arial; font-size: 14pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre;"><br /></span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 
700; vertical-align: baseline; white-space: pre-wrap;">The brain is layered phylogenetically from the brainstem...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">up, out and forward.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">These layers are best understood as creatures…</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">from our evolutionary past.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">These layers and sides of our brain both...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">compete and cooperate to yield behavior.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: 
baseline; white-space: pre-wrap;">Knowledge flows from the competition of the extreme limits.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Where the medium and the median are not always in the middle.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Thesis challenges antithesis.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Cooperation picks up the pieces that are left to forge synthesis.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Competition creates drama.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Cooperation resolves it.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; 
vertical-align: baseline; white-space: pre-wrap;"> </span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain is steeped in biochemistry.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain is not electronic.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">It's not even electrical.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Electricity is an abomination to the brain.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">But ionic charge is the not-so-secret-sauce within the neuron.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Ionic brain 
waves are an artifact of recognition, not its cause.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Brain imaging is a macro view of nano experience.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Neurons that fire together only sort of wire together.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain is ephemeral and biological.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">But it aspires to statehood... 
and usually fails.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">There are no "states" in the brain, only in the mind.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">And temporary approximations in the form of dynamic signals.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The operation of the brain is a signal-based simulation.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain is chemo-semiotic in nature.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Consciousness is a multifaceted</span><span style="font-family: Arial; font-size: 21.3333px; font-weight: 700; white-space: pre-wrap;"> chemo-semiotic simulation.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; 
font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Memory is mimicry.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Our "memory" is the reconstruction of a stateless simulation.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">We "re-member" as we are re-cued by a similar experience...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">even if it's just in our imagination.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Nothing in the brain is hardwired.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; 
font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain is plastic by degrees and in critical phases.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">An old dog can learn new tricks.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"> </span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain does not compute, but it can fake it when required.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Computers simulate the world by changing states in a logical system.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain simulates the world in a more subtle and elegant fashion.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; 
font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">A computer is fast, digital, consistent, synchronous, serial, fixed, objective, and logical.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain is relatively slow, biologically analog, mostly malleable, actually asynchronous, profoundly parallel, permanently plastic, surprisingly subjective, and ultimately bioanalogical in its operation. But not exclusively so.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Replace "stimulus-response" with "sense-decide-signal" or the more macro: "cues and scripts."</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The difference is in who has control.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">And what entity makes the decision.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; 
font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Sovereignty lies between the dendrite and the hillock.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Neurons decide everything.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">We are inherently subjective, only aspiring to the objective.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Or something like that. 
</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Reality is hypo-thetical.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The predictability of our world ranges from random to determinate,</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">but these limits are only asymptotically approached.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Knowledge is that which is significant in our world.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Knowledge is created by neurons in a dance with experience.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: 
pre-wrap;">Knowledge comes together to form mind.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Mind then drives behavior to affect reality.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Neurons buffer the exit from, and the re-entry to reality.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Knowledge completes a dynamic and ethereal loop with the world.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Knowledge is a subjective approximation of the truth.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">By degrees.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; 
white-space: pre-wrap;">Thetical, or theory, is where we live.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Truth is an objective abstraction.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Truth is an aspiration, only asymptotically approached.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Truth is hyper-thetical.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The mind exists in an ethereal world.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain exists in the material 
world.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The left-brain attends to the material, but not exclusively so.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The right-mind attends to the ethereal, but not exclusively so.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Each person's world is defined by their subjective experiences.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Objective reality stands apart from each of us - maybe.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">A homunculus is a subjective model of yourself in the cortex.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; 
font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">A worldmapculus is a subjective model of your world in the cortex.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Our subjective worldmapculus is our own private "Matrix".</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Each is a "sandbox" striving to model all experiences objectively.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Using our personal worldmapculus, we are each curating our own subjective experiences, aspiring to live in the real world...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">and generally succeeding.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Behavior ranges from predictable to seemingly random...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 
0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">when objectively observed.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">But these limits are only asymptotically approached.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Behavior generally reflects our subjective knowledge.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">We only aspire to the objective.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Neurons create knowledge from the physical world.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Knowledge forms cues which may invoke scripts of muscle movement.</span></p><p dir="ltr" style="line-height: 1.656; 
margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The mind is a collection of theatrical cues and scripts.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The mind is theater.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Theater is a worldmapculus.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">"All the world's a stage."</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Behavior is a collection of</span><span style="font-family: Arial; font-size: 21.3333px; font-weight: 700; white-space: pre-wrap;"> competing</span><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;"> and</span><span style="font-family: Arial; font-size: 21.3333px; font-weight: 700; white-space: pre-wrap;"> cooperating</span><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: 
pre-wrap;">...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">theatrical cues and scripts.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Cues and scripts are how we dance with the world.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"> Consciousness is a collection of neuronal cues and scripts... 
both mediated by and marinating within...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">that cauldron of chemistry contained within our skull.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain is a collection of evolutionary survival tricks.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">These tricks are applied disproportionately and decursively.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain is more complex than complicated.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Evolution has evolved a more effective way to evolve.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; 
font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Our brain is in a dynamic operational loop with the world.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Everything you experience occurs within your skull.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">But is only an approximation of our shared and objective reality.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"> </span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Knowledge is what happens when a neuron fires...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">because it has found something significant in the world.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; 
font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Nothing matters until something moves.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">All decisions are ionic in nature.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Everything matters when something moves.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Knowledge is sometimes useful but always significant.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Neurons create knowledge,</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">but only some knowledge becomes information.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; 
font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Wisdom is knowing that truth is a limit only asymptotically approached.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">While the left-brain struggles to find truth in its limited domain,</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">our right mind welcomes the wisdom of the whole.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">In creating knowledge, neurons convert the physical into the mental.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The real into the ethereal.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">In cueing muscles, neurons reconvert the mental into the physical.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; 
text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The ethereal into the real.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Nature has evolved a million tricks in the form of a thousand creatures.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The neuron, brain, and mind are all multifaceted in many aspects.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Neurons create knowledge.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Knowledge is expressed as ionic migration releasing bits of chemistry.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">The brain is a Gordian Knot of connection,</span></p><p dir="ltr" 
style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">a bundle of ionic fibers looping with the world and itself.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Experience of the world creates a storm of ionic signals...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">which form a</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">convergent</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">collection of</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">competing</span><span style="font-family: Arial; font-size: 21.3333px; font-weight: 700; white-space: pre-wrap;"> 
and</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">cooperating</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;">cues</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">and</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">scripts</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">yielding</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">behavior</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: 
pre-wrap;">mediated</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">by,</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">and marinating</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">within</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">a</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">cauldron</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">of</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; 
font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">chemistry</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">called</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">our</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">skull.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Because we are still here.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Neurons create knowledge.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; 
margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Knowledge cues scripts.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Scripts</span><span style="font-family: Arial; font-size: 21.3333px; font-weight: 700; white-space: pre-wrap;"> compete</span><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;"> and</span><span style="font-family: Arial; font-size: 21.3333px; font-weight: 700; white-space: pre-wrap;"> cooperate</span><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;"> to evolve...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-weight: 700; white-space: pre-wrap;">Knowledge.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">We are each a thousand creatures... 
</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"> who have evolved a million tricks...</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;"> over a billion years.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">And one trick need not preclude another.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt; text-align: center;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Or something like that...</span></p><br /><br /><br /><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Playing With Your Brain</span></p><br /><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">I hope you recognize most of these assertions by now. 
All of them have been presented multiple times and in multiple ways, each time with increasing specificity and speculation. You’ve just finished reading the final version in the form of bad poetry. You might note that it contains no "certainty". </span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Hopefully, at least a few have challenged your sensibilities. That was the objective. They were meant to make you wonder. Many of these observations are not mine. They are versions of things I’ve encountered from others. For most, I haven’t provided references. The objective is not to build a credible structure of science but to unleash intuition. With some, I’ve taken a great deal of liberty. And that's the point. The aim was to suspend disbelief and treat these concepts as an exercise in art. Play with them as you will. Think in terms of finger painting when you were in kindergarten. Get messy. There will be plenty of time for science later on.</span></p><br /><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">Thank you for reading. I will continue to evolve these posts, so follow this blog. 
Let me know what you think, so I can think about what you’ve come to know.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;">The end.</span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14pt; font-variant-east-asian: normal; font-variant-numeric: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><br /></p><p dir="ltr" style="line-height: 1.656; margin-bottom: 0pt; margin-top: 0pt;"><br /></p></span>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-27361377306826920482022-04-12T15:44:00.022-07:002022-10-05T08:57:25.240-07:00What Greta Thunberg Should Be Concerned About<div><br /></div><div><br /></div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi_QS-lchdzvzVlzX2aXEJ1Qg82lnxB36NU2iFNtIMJ87K5W5KU0709X-geWyhaAzE8zH1sF43n81aR3TlgERCvgbrF38CzOCF-rjvJnqr8jjq7SG6E4stXotEPZMyr9Fja27EXiEkC2Y8xugZtJ8xn89KV3yWt6DRetp-eTMiIDNWrOPyk96Q/s1238/Greta_Thunberg,_11.10.2019_(cropped).jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="1238" data-original-width="875" height="640" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi_QS-lchdzvzVlzX2aXEJ1Qg82lnxB36NU2iFNtIMJ87K5W5KU0709X-geWyhaAzE8zH1sF43n81aR3TlgERCvgbrF38CzOCF-rjvJnqr8jjq7SG6E4stXotEPZMyr9Fja27EXiEkC2Y8xugZtJ8xn89KV3yWt6DRetp-eTMiIDNWrOPyk96Q/w453-h640/Greta_Thunberg,_11.10.2019_(cropped).jpg" width="453" /></a></div><br />What is the greatest threat to human life?<br />
<br />Disease?<div><br />Supervolcano?<br />
<br />Earthquake?</div><div><br />Uncontrolled fire?<br /><br />Asteroid impact?<br />
<br />
I'm not sure, but it's certainly not Global Warming, Climate Change, or whatever it's being called this week. Without getting into the debate, I've been down that rabbit hole a few times, and my conclusion is far less certain than Greta's on this topic. </div><div><br /></div><div>But there's a far scarier and, yes, even more likely threat to her generation, and perhaps even yours. It too involves water, just a lot more of it in a shorter period of time. I'm of course talking about a tsunami, but on a whole different scale than anything we've so far seen in the news.<br />
<br />
You only need to do a bit of research to understand that man has been dealing with rising (and falling) sea levels all around the world for thousands of years. And quite successfully. Notwithstanding Venice's current challenge, civilization has adapted to gradual sea-level rise simply by moving farther inland or building seawalls. It's not that big of a deal in the long term. It's when the water rises quickly that nature presents its greatest challenge.<br />
<br />
When I first saw the television coverage on Boxing Day in 2004, and the announcer said hundreds had died, I knew those estimates were missing a few zeros. These first news feeds were only from the resort areas. Damage and death would be different at each one, but what about the thousands of miles of coastline not yet filmed? Most of the damage was not even seen for weeks, or in some cases months. That's why the final death toll reached 238,000 when the first reports counted only a few hundred. But what if the Boxing Day disaster was only a taste of what might happen?</div><div><br /></div><div>Man likes to build near the ocean. And you can't outrun a tsunami, in some cases even with plenty of warning. Here are the simple facts:<br />
<br />
Half of humanity lives below 500 feet elevation.<br />
<br />
Most of those people can't move as quickly as a tsunami.<br />
<br />
Tsunamis have multiple causes, some far more probable than you might realize:</div><div><br /></div><div><b><a href="https://en.wikipedia.org/wiki/Lituya_Bay#1958_megatsunami" target="_blank">1700 foot tsunami at Lituya Bay, Alaska</a></b></div><div><br /></div><div><br /></div><div><b><a href="https://www.sciencemag.org/news/2021/07/giant-tsunami-dino-killing-asteroid-impact-revealed-fossilized-megaripples" target="_blank">Giant 4500 foot tsunami from dino-killing asteroid</a></b></div><div><br /></div><br /><b><a href="https://www.sciencealert.com/ferocious-super-quake-we-never-knew-about-sent-humans-into-hiding-for-1-000-years">An Atacama Super-Quake We Never Knew About Sent Humans Into Hiding For 1,000 Years</a></b><div><br /></div><div><br /></div><b><a href="https://www.e-education.psu.edu/earth107/node/1609">Canary Island Landslides and Potential Megatsunami</a></b><br /><div><br /></div><div>This last example is being debated as there is some evidence that gravel from a prior event has been found distributed all the way across the Flordia peninsula. 
</div><br /><br />10-05-22 <a href="https://scitechdaily.com/dinosaur-killing-asteroid-triggered-monstrous-global-tsunami-with-mile-high-waves/" target="_blank">Dinosaur-Killing Asteroid Triggered Monstrous Global Tsunami With Mile-High Waves</a>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-63012996568806515992022-04-03T06:11:00.003-07:002022-05-18T10:33:22.400-07:00Car and Phone Design Abuse<div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVceAKYAIyUORPwfzkg0oPPTG5AxNaQdRQM4qU0I1acVWM-in4qHg5qA9QxdAim2saWp_jsRPy2ktqU-utr1rscEtq0FyZL129k979QD-VIyGGAzSiuTBJ2lLjWoWlYpDtnh2G-vHWZhXsfsJYWAPS_yvmUHUWvdVibT5TwOo94afPY2uE_NU/s640/Samsung_Galaxy_S2_shattered_screen.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><br /></a><div class="separator" style="clear: both;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgDw8YRcf-htw-B9hwyxq0lRb5luPb-z3zWTru_XqkrekE60c3ceCtK5w_MJlIONoa6OdDXyXA3Bm-l2fppCzTtTU3fyL8OdTZYyQjTvT9B5pOuZg1XH8A3_WyOai_Wzo6D6y0EM733j9oD5fYLA587DoYAm0-Fbtg6VRvczP4sXXEDidko7GA/s675/Classic-Cadillac-Tail-Fins-From-1948-to-1965.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="584" data-original-width="675" height="277" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgDw8YRcf-htw-B9hwyxq0lRb5luPb-z3zWTru_XqkrekE60c3ceCtK5w_MJlIONoa6OdDXyXA3Bm-l2fppCzTtTU3fyL8OdTZYyQjTvT9B5pOuZg1XH8A3_WyOai_Wzo6D6y0EM733j9oD5fYLA587DoYAm0-Fbtg6VRvczP4sXXEDidko7GA/s320/Classic-Cadillac-Tail-Fins-From-1948-to-1965.jpg" width="320" /></a></div><br style="text-align: left;" /><br /></div></div><span style="font-family: arial; font-size: large;">It was once said that Detroit updated the shape of American cars every couple of years so that others could immediately tell how old of a car 
you were driving. </span><div><span style="font-family: arial; font-size: large;"><br /></span></div><div><span style="font-family: arial; font-size: large;">This was in contrast with German car companies which tended to keep an effective design with only minor changes. The VW bug looked the same for decades while </span><span style="font-size: large;"><span style="font-family: arial;">the shape of</span><span style="font-family: arial;"> Cadillac</span><span style="font-family: arial;"> </span><span style="font-family: arial;">tail-fins</span><span style="font-family: arial;"> changed every other year. This was apparently done to make the car look dated and out of fashion thus encouraging customers to purchase a new car more often</span></span><span style="font-family: arial; font-size: x-large;">.</span></div><div><span style="font-family: arial; font-size: x-large;"><br /></span></div><div><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVceAKYAIyUORPwfzkg0oPPTG5AxNaQdRQM4qU0I1acVWM-in4qHg5qA9QxdAim2saWp_jsRPy2ktqU-utr1rscEtq0FyZL129k979QD-VIyGGAzSiuTBJ2lLjWoWlYpDtnh2G-vHWZhXsfsJYWAPS_yvmUHUWvdVibT5TwOo94afPY2uE_NU/s640/Samsung_Galaxy_S2_shattered_screen.jpg" style="clear: left; margin-bottom: 1em; margin-right: 1em; text-align: center;"><img border="0" data-original-height="348" data-original-width="640" height="217" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVceAKYAIyUORPwfzkg0oPPTG5AxNaQdRQM4qU0I1acVWM-in4qHg5qA9QxdAim2saWp_jsRPy2ktqU-utr1rscEtq0FyZL129k979QD-VIyGGAzSiuTBJ2lLjWoWlYpDtnh2G-vHWZhXsfsJYWAPS_yvmUHUWvdVibT5TwOo94afPY2uE_NU/w400-h217/Samsung_Galaxy_S2_shattered_screen.jpg" title="This image was originally posted to Flickr by ashwin kumar at https://flickr.com/photos/34501870@N00/8082053547" width="400" /></a></div><div><br /></div><div><div><span style="font-family: arial; font-size: large;">A similar manipulation is currently happening with cell phones. 
</span></div><div><span style="font-family: arial; font-size: large;"><br /></span></div><div><span style="font-family: arial; font-size: large;">While they worked to make the glass face more scratch-resistant, they still break all too often. You see the result in public as most people continue to use these cracked phones.</span></div><div><span style="font-family: arial; font-size: large;"><br /></span></div><div><span style="font-size: large;"><span style="font-family: arial;">Even worse, phone designers have decided to wrap the entire phone in this glass material instead of more durable plastic, again, in the name of some fashion. I believe the real reason is to encourage their customers to buy a new phone every couple of years. This new glass-back design has resulted in a phone that's not only as slippery</span><span style="font-family: arial;"> </span><span style="font-family: arial;">as a bar of wet soap, but also</span><span style="font-family: arial;"> as fragile as, literally, glass. It's like trying to hold on to a live fish.</span></span></div><div><span style="font-family: arial; font-size: large;"><br /></span></div><div><span style="font-family: arial; font-size: large;">Years ago, the Japanese company Sharp sold a clam-shell calculator made of durable plastic that had a surface that was soft and easy to grip. Dropping them was rare. 
</span></div><div><span style="font-family: arial; font-size: large;"><br /></span></div><div><span style="font-family: arial; font-size: large;">Unfortunately, I guess we didn't consume enough of them.</span></div></div><div><span style="font-family: arial; font-size: large;"><br /></span></div><div><span style="font-family: arial; font-size: large;">And while I'm at it here are a couple of other "enhancements" to cell phones that have turned out to be worse than what they replaced.</span></div><div><span style="font-family: arial; font-size: large;"><br /></span></div><div><span style="font-family: arial; font-size: large;">Wrap-around screens - why have phone designers become so obsessed with getting rid of borders? Frames for pictures have a reason. They keep the picture from blending into the background, whatever that might be. It's the same for phones, but in a much more physical and controlling way. Those rounded screen edges make it difficult to use keyboards that go all the way to the edge. Those keys are hard to activate. At the same time, other unintended functions for various programs tend to activate when you don't want to. This fashion violates function!</span></div><div><span style="font-family: arial; font-size: large;"><br /></span></div><div><span style="font-family: arial; font-size: large;">Fingerprint Readers - why all the rush to put fingerprint readers in the screen when they worked so well on the back of the phone where you could find them by feel? Plus they seem to work far better than the screen versions. Then there's the problem of exactly where your finger goes when the screen is off. 
I'd take the Pixel 2 fingerprint reader over the new Pixel 6 any day.</span></div><div><span style="font-family: arial; font-size: large;"><br /></span></div><div><span style="font-family: arial; font-size: large;">Both of these "upgrades" are definitely downgrades in my opinion.</span></div><div><span style="font-family: arial; font-size: large;"><br /></span></div><div><span style="font-family: arial; font-size: large;"><br /></span></div>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-88386437822541070712022-02-25T13:36:00.003-08:002024-02-27T10:23:48.326-08:00The Ideal Phone / Watch / Tablet<p>02-27-24 <b><a href="https://techcrunch.com/2024/02/27/mwc-2024-motorolas-rollable-concept-phone-laughs-at-your-silly-foldable/?guccounter=1&guce_referrer=aHR0cHM6Ly9uZXdzLmdvb2dsZS5jb20v&guce_referrer_sig=AQAAALDv05sy85Aq0QOLtXKHD5oDU7MqpKm-3YlaEAaLw4o1N8jnOJ_5sndNwWvFjOYl0dnoY3Nj4lXux47StIZKN865BRUxgKpXg57A8Q-L3brdy1At_SLPiz89YR_EfwPQW_VjPt-93zS8z6r2fsVxAFjJ_nsbQEoyrUdXgEWRsAzz" target="_blank">And Two Years Later</a></b></p><p><br /></p><p>Instead of rolling the display out of the phone, roll it around your arm:</p><p><b><a href="https://youtu.be/NydMRmjuDF4">A rollable phone</a></b></p><p><a href="https://blogger.googleusercontent.com/img/a/AVvXsEhMFw5Q_QNvMfZjc9ZGaxl0t3LrT4oayHD0ME9g8xyeWEDrigqFv9kG_Klsvh3av8PfBM6BSzvu8Jx_IG0OdesV5cr898fi8m5NHPUUIA95SAePucWFIfHbDzMvhfO3SNRv8Oh0luOzz8R2UmWqLB7pAO8-mLKHW8MftedYhXvEmBYk0JVrtW0=s300" style="clear: left; display: inline; margin-bottom: 1em; margin-right: 1em; text-align: center;"><img border="0" data-original-height="300" data-original-width="300" height="640" src="https://blogger.googleusercontent.com/img/a/AVvXsEhMFw5Q_QNvMfZjc9ZGaxl0t3LrT4oayHD0ME9g8xyeWEDrigqFv9kG_Klsvh3av8PfBM6BSzvu8Jx_IG0OdesV5cr898fi8m5NHPUUIA95SAePucWFIfHbDzMvhfO3SNRv8Oh0luOzz8R2UmWqLB7pAO8-mLKHW8MftedYhXvEmBYk0JVrtW0=w691-h640" width="691" /></a></p><br /> This is a 
design I've been thinking about for years - what is the ultimate phone/watch/tablet?<p></p><p>Is it a communicator like on Star Trek? No - cell phones require both hands to use.</p><p>Is it a bigger phone? No - it wouldn't fit in your pocket, plus have the same drawbacks as a phone.</p><p>Is it a bigger watch? Perhaps, but how do you make it big enough to be useful as a smartphone or tablet?</p><p>The answer to this last question is a simple bit of mechanical magic which I first encountered handing off a "baton" while running relays with my teammates. This modern baton wasn't a stick at all; instead it was a long piece of stressed metal that opens in a straight fashion but will wrap around when struck against your arm. It's really a VERY simple device, but has an application not yet appreciated: </p><b><a href="https://www.amazon.com/gp/product/B07PZBS5W4/ref=as_li_tl?ie=UTF8&tag=inverse-81252728-20&camp=1789&creative=9325&linkCode=as2&creativeASIN=B07PZBS5W4&linkId=e73b389f434aac139ca44c264b77acd7" target="_blank">Reusable Erasable Wearable Silicone Memo Waterproof Wrist Band</a></b><br /><div><br /></div><div>I want to be clear; even though this is a link to an Amazon product, this is not an endorsement, nor do I care one way or the other if you buy one or not. The link is not monetized by me. I'm simply using the link because it presents what COULD be a new type of flexible display.</div><div><br /></div><div>Imagine a leather strap attached to the back of a metal device similar to the one from this link but 5 inches wide and only about 8 inches high, but still mechanically stressed in a similar fashion. Now roll it out flat and apply a display to its surface. In this form it would present a reasonably sized tablet.</div><div><br /></div><div>Next, strap it to your wrist like an old-fashioned wrist guard and add electronics from a smartphone. Now flip it down out of the way. 
When the phone rings, a horizontal strip across the center would light up showing who was calling and maybe even a photo. If you need more display space, simply reach around your wrist to flip up the section above, and possibly even the section below - your wrist becomes a flexible touch-sensitive 5 x 8 inch tablet able to do anything you might do with a normal tablet, cell phone or watch.</div><div><br /></div><div>You're welcome.</div><div><br /></div><div><b><a href="https://arstechnica.com/gadgets/2023/10/motorola-demos-smartphone-that-can-wrap-around-your-wrist-again/" target="_blank">Finally?</a></b></div>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-74497885551849599302020-07-07T06:30:00.033-07:002023-02-03T04:56:47.853-08:00COVID-19 - the Zendemic Wrapped in Toilet Paper<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi56K5XXlI2jfSf5VhRw74BorOB4eeenYwUFQ8LcPcbu085M7z5NViy29UgRerHm8mMWbaNdvJB5HY2wDjwglegUcSmGSO5dbus5lKd1m8z0h8KsAI_hiBPTQzK-Rf-a4MtaBtJ-g/s1600/1280px-Rotolini.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="1280" data-original-width="1280" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi56K5XXlI2jfSf5VhRw74BorOB4eeenYwUFQ8LcPcbu085M7z5NViy29UgRerHm8mMWbaNdvJB5HY2wDjwglegUcSmGSO5dbus5lKd1m8z0h8KsAI_hiBPTQzK-Rf-a4MtaBtJ-g/s640/1280px-Rotolini.jpg" width="640" /></a></div>
<br />
<div>
<br />
<br /></div>
<div>
Here's a radical idea. I want it in the public record as of March 14th, 2020.<br />
<div>
<br /></div>
<div>
What if COVID-19 is actually not as deadly as it seems?<br />
<br />
What if we're undercounting the actual cases, and over-attributing the deaths?</div>
<div>
<br /></div>
<div>
OK, let me put that a little differently. What if COVID-19 on its own only rarely causes death? I'm deadly serious. What if COVID-19 is actually a relatively mild biological challenge to a normal healthy human, similar to its cousin the common cold. OK, maybe a cold that kills some people. But would the numbers of diagnosed cases of COVID-19, and the deaths "caused" by it, look any different than they do now?<br />
<br />
<div>
As obvious evidence, why such inconsistent numbers from country to country? The disease is the same. The variance must reflect standards for data capture or perhaps the demographics or lifestyle of the patients. Let's address data first.<br />
<br />
When and why does a COVID case become a case? While we're questioning, where are the useful comparative data? Out of a thousand people without confounding issues, how many will die per age group? And why do we have no random sample control groups to track overall transmission rates and deaths instead of guessing about the denominator? This denominator problem is best understood by the difference in the death rate between China (9.9%) and Korea (0.7%). Korea did more testing and so has a more useful denominator. Another way of looking at it is that China only tested those who already had a serious case. They didn't bother to test mild cases. This lack of testing is happening in the U.S. as well. At least so far. I realize there's a priority for tests being used to track individual contact and transmission, but a baseline of periodically sampled control groups would be of great value in learning how the disease is evolving in a given population.<br />
<br />
Now for lifestyle and demographics. What is the general health of the population? Compare Italy (7.9%) and Germany (0.3%). Are Germans that much healthier than the Italians? Or is this also confounded by the denominator issue in Italy? We'll know in the long term.<br />
<br />
Also, why does China now have so few new cases and deaths? Were they THAT good at stopping transmission? It's hard to believe China effectively isolated a hundred thousand from the other 1.4 billion. And did it without exception. If this disease is so contagious, China should be keeping its early lead in both cases and deaths. They obviously aren't. Or else they aren't reporting it.</div>
<div>
<br /></div>
<div>
</div>
<div>
Which brings us to the problem of the skewed death demographic (65 plus years old, immuno-challenged, etc.). This demographic is extraordinary for a deadly disease. But perhaps not for an ordinary cold. It's clear that most of the deaths are those over 65 years old, but what percent of those 65 and older who contract COVID-19 die? Also, there is the striking absence of deaths among children. Why?<br />
<br />
Normally, about 150,000 people die around the world each day. That's about 20 people per day per million. The most convincing data will be when deaths exceed 20 people per day per million. So far, COVID-19 has only added another 633 people per day in the United States, a rate of only about two per day per million. Or is it even this high? How many of those 633 would have died from other causes within 24 hours? It seems that COVID-19 might be taking the blame for a normal death rate in a typical winter. Or at the very least, taking the blame for far more death than it deserves.</div>
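The per-capita arithmetic above can be sketched in a few lines. This is only an illustration of the post's own round numbers; the world population of roughly 7.8 billion and the US population of 330 million are my assumptions, not figures from the post.

```python
# Rough per-capita arithmetic behind the paragraph above. The death
# figures are the post's own round numbers; the population figures
# are assumed (~7.8 billion worldwide, ~330 million US).
WORLD_POP_MILLIONS = 7_800         # assumed world population, in millions
BASELINE_DEATHS_PER_DAY = 150_000  # all-cause deaths worldwide per day

baseline = BASELINE_DEATHS_PER_DAY / WORLD_POP_MILLIONS
print(f"baseline: {baseline:.1f} deaths per day per million")    # ~19.2

US_POP_MILLIONS = 330              # assumed US population, in millions
COVID_DEATHS_PER_DAY = 633         # figure quoted in the post
covid = COVID_DEATHS_PER_DAY / US_POP_MILLIONS
print(f"COVID-19 (US): {covid:.1f} deaths per day per million")  # ~1.9
```

Whatever the exact populations, the ratio is the point: the quoted COVID-19 rate is roughly a tenth of the everyday all-cause rate.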
<div>
<br /></div>
<div>
</div>
With nearly eight billion people, at any given moment there are thousands of people in the world on the edge of death. Sad but true. A simple cold or flu can push some of them over that edge. What then is the cause of that death? Their pre-existing condition? Or the most recently diagnosed cold or flu? What if this COVID-19 event is largely an attribution artifact? What if they simply die a bit earlier from the added biological stress of COVID-19? Which disease or chronic condition should get the credit?<br />
<div>
<br /></div>
<div>
</div>
If we didn't know that the COVID-19 virus existed, would these deaths be blamed on other causes? Would they even be seen as abnormal? Is COVID-19 simply an artifact of an improved technical ability to measure a new disease? And to publish the results in the media instantly?<br />
<div>
<div>
<br /></div>
<div>
Then there is the toilet paper thing. If you haven't realized it yet, there is no "real" shortage of toilet paper, just people hoarding it. It's a self-fulfilling prophecy. This happened once in the 1970s. I remember it well. The same thing happened with gasoline at about the same time. Here's why it matters. </div>
<div>
<br /></div>
<div>
A run on TP is similar to a run on medical services. If you hear that there is a new disease, you might be just a bit more likely to go to the hospital and get tested. When the result is positive - boom - they isolate you and fill up a bed. Soon someone else comes in with a positive test and our hypersensitive medical system responds. Even a small shift in demand can overwhelm this medical system. Soon the hospital's full and there's an "epidemic". Hyper-analysis of this epidemic will find a correlation with whatever version of cold or flu that happens to have emerged during the season. In this case, that disease might be COVID-19. And the media runs with it. Panic ensues.<br />
<br />
Is COVID-19 the first actual media disease not unlike this run on toilet paper?<br />
<br />
If so, this Zendemic will resolve quickly, no more than a few weeks. Otherwise, deaths will exceed the typical 150,000 per day for months on end. So far it hasn't, but we will know soon.</div>
<div>
<br />
Habeas corpus.<br />
<br />
<br />
<br />
<br />
03-18-20 <a href="https://www.vox.com/2020/3/12/21173783/coronavirus-death-age-covid-19-elderly-seniors"><b>The picture is becoming more clear.</b></a><br />
<br />
03-25-20 <b><a href="https://www.theguardian.com/world/2020/mar/24/what-is-coronavirus-mortality-rate-covid-19">What is coronavirus – and what is the mortality rate?</a></b><br />
<br />
The above article finally addresses some of the questions I presented above. Well, sort of. For instance, I noted and questioned, "It's clear that most of the deaths are those over 65 years old, but what percent of 65 and older that contract COVID-19 die?"<br />
<br />
Though I didn't use 80 years old to define my question, that age nicely frames the issue and makes my point. I might have said 90 percent of those that die are over 80 years old, but what percent of 80 and older that contract COVID-19 die?<br />
<br />
Their answer - 10%.<br />
<br />
So if all 80-year-olds could be exposed to COVID-19 (which is impossible), how many would die? Google says there are about three million of them. Normally, about 300,000 of them will die each year (a linear rate). That is a useful baseline, and also the estimate The Guardian makes for COVID-19. Which was my original point. Of course, the final count could be greater, but not by orders of magnitude, and likely well under 50 percent greater.<br />
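The comparison in the paragraph above can be checked with trivial arithmetic. This is a sketch of the post's own round numbers (three million people, the 10% figure attributed to The Guardian, 300,000 baseline deaths per year), not authoritative data.

```python
# Sketch of the 80-and-older arithmetic from the paragraph above,
# using the post's round numbers. Not authoritative data.
OVER_80_US = 3_000_000              # rough count of Americans 80 and older
BASELINE_DEATHS_PER_YEAR = 300_000  # normal yearly mortality in that group
CFR_OVER_80 = 0.10                  # the 10% figure cited above

worst_case = round(OVER_80_US * CFR_OVER_80)  # if every one were exposed
print(worst_case)                             # 300000
print(worst_case / BASELINE_DEATHS_PER_YEAR)  # 1.0 -- same as baseline
```

On these numbers, even the impossible worst case lands at the same order of magnitude as the group's normal yearly mortality, which is the comparison the paragraph is making.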
<br />
So the question becomes, how much do we economically impact eight billion people for any excess death over 300,000?<br />
<br />
Actually, I think this has been a good test run for a bug ten or even a thousand times worse that may occur next year. Or the year after that. But not yet. COVID-19 is not the black plague. Not even close.<br />
<br />
03-27-20 I posted the following to a friend's Facebook feed:<br />
<br />
<span face="" style="background-color: #f2f3f5; color: #1c1e21; font-size: 13px;"><span style="font-family: inherit;">Justin is correct. The numbers of deaths in America so far attributed to COVID-19 are so low they get lost in the noise of the typical death rate caused by respiratory failure which is around 500 per day in America, or 1.5 deaths per day per million. B</span></span><span face="" style="background-color: #f2f3f5; color: #1c1e21; font-size: 13px;"><span style="font-family: inherit;"><span style="font-family: inherit;">ut that's just the view from the top and ignores the denominator problem - how many died per day per what size population? Even though this COVID-19 has been declared a pandemic, it remains epicentric, meaning most of the deaths occur in hot spots like Wuhan, Milan and New York. What is the size of each of those exposed populations? We don't have good numbers yet, but we can use China as an example. As shown there, the ultimate impact will be far less than the media currently suggests. So far, the sky is not actually falling, and is unlikely to do so.</span></span></span><br />
<div>
<br /></div>
03-29-20 another Facebook comment:</div>
</div>
</div>
</div>
<div>
<br /></div>
<div>
<span face="" style="background-color: #f2f3f5; color: #1c1e21; font-size: 13px; white-space: pre-wrap;">Bruce, over 6000 people in America die every DAY for one reason or another. That's 180,000 in the month or so that we've been keeping count of COVID-19. Now, many of those deaths are from accidents, etc. but a large number are from chronic conditions, many of which are conflated with COVID-19 because that is the current proximate cause of death. In only a few of your 2043 cases is COVID-19 the clear and direct cause of death. In 2009 hundreds of thousands died from swine flu. Or did they? Like COVID-19, many of those deaths had respiratory and other comorbid factors as well. Yes, it's sad, but the reality is, various diseases ripple through our population each winter bringing early death to hundreds of thousands that might have lived a few more days, weeks or months. Only a small minority would live for years longer. I'm not suggesting that COVID-19 isn't deadly and we of course should try to avoid its spread, or at least slow it down. I think this exercise is good practice for when we get a really bad bug like Ebola, but let's try to keep these numbers (and causes of death) in context. So far this is no worse than a bad flu, and if China, South Korea, and Germany are useful examples, it will end about the same way within a few weeks. "If you can keep your head when all about you are losing theirs...yours is the Earth and everything that's in it..." Rudyard Kipling.</span>
<div>
<br /></div>
<div>
03-30-20 <b><a href="https://www.nbcnews.com/news/us-news/dr-deborah-birx-predicts-200-000-deaths-if-we-do-n1171876?fbclid=IwAR3XW8jhNRrjJMfjEtqPKdbm593nl2vW1T2TdUHAURO92TCIyvoI8iSrrYg">For reference: 200,000 to 2 million deaths in America is conventional wisdom.</a></b></div>
<div>
<br />
03-31-20 <b><a href="https://off-guardian.org/2020/03/24/12-experts-questioning-the-coronavirus-panic/?__cf_chl_jschl_tk__=6dc4d62a039a45ec2e0e91d18df029b20d9545f4-1585674083-0-AUJPNXsO5BwkRlzpA-wRJZQ-uPkVFJqJxOUEDEN9but5UtXxtlChfcXA8nJSTQBXLvB1Mn6od8GIBLowNP_sfLjvce8gLIi_5MaRTUwf6kdTZKTKmhWvbeWbrTb7RLzumzohWYpTzdAkCrFCTAEo7-V0VMIrPNjvCp3eB7-2wiQjX0kYGP3K6d31tJMR4Do33crjMGCxrbLDOAy331xMjgKeyX3v4h3wJdHwFC5ow3Pko_btUJVJ5XvDCFPjnxYXGdHeloo29sNpjSZTaJUC7TlkygzqZil-lXyvXLymo9AvTwPYSL9rth3eFr32YsKM4WiThzNyDgGW3Cnw5oK1SMSAdd81my_adswyLYYpgUB-">12 Experts Questioning the Coronavirus Panic</a></b><br />
<br />
04-02-20 <b><a href="https://www.thecollegefix.com/bulletin-board/cdc-advisor-says-real-fatality-rate-of-covid-19-is-too-low-to-justify-drastic-crackdowns/?fbclid=IwAR1Hci0BxAheGM9IByzgODgyNsCv9hrvrTinWTq13PiBYMYynNR1TXwAOm4">CDC advisor says ‘real’ fatality rate of COVID-19 is too low to justify ‘drastic crackdowns’</a></b><br />
<br />
Will unemployment, domestic violence, murder, and suicide have greater negative social impact than the COVID-19 disease itself? Stats to follow as they become available.<br />
<br />
<b><a href="https://www.theguardian.com/society/2020/mar/28/lockdowns-world-rise-domestic-violence">Lockdowns around the world bring rise in domestic violence</a></b><br />
<br />
<b><a href="https://www.reuters.com/article/us-health-coronavirus-new-orleans/why-is-new-orleans-coronavirus-death-rate-seven-times-new-yorks-obesity-is-a-factor-idUSKBN21K1B0">Why is New Orleans' coronavirus death rate nearly three times New York's? Obesity is a factor</a></b></div>
<div>
<br />
<br />
The test described below may well be the turning point in this biological mystery. Sure, the Abbott ID NOW test is quick and simple. It will be used a lot, but more importantly and for the first time, there will be the ability to do large random sample testing over various large populations. This should solve the "denominator" problem. With that information at hand, analysis and local triage and isolation become manageable. The rest is just implementation. Check it out:<br />
<br /></div>
<b><a href="https://www.chicagobusiness.com/opinion/why-abbotts-5-minute-covid-test-could-be-game-changer">Why Abbott's 5-minute COVID test could be a game-changer</a></b><br />
<div>
<br /></div>
<div>
04-03-20 <b><a href="https://www.worldometers.info/coronavirus/">Corona WorldOMeter</a></b><br />
<br />
Look carefully at the curves for each country (or even the world as a whole). These curves are not geometric (becoming ever steeper). Instead, they are flattening. These are pretty typical two-dimensional propagation curves. They are like a forest fire that only burns the weaker trees. This bug is harvesting those with significant comorbid factors in their health.<br />
<br />
Yes, some are dying weeks or months before they might have, but most would have died sometime this year. COVID-19 will ultimately kill about the same number as the flu does each year, and in many cases, the very same people. Their death will just be attributed to a different disease. This event is more about a panicked media than a biological challenge. Callous? Of course. But with increasing reports of bankruptcies, domestic abuse, murder, and suicide, there is serious doubt about this disease being worse than the cure. Still, it's a useful dress rehearsal for a much worse bug in the future, and much good will ultimately come from this event.<br />
<br />
04-06-20 <b><a href="https://reason.com/2020/04/06/u-s-fever-trends-suggest-covid-19-rates-could-soon-decline/?utm_medium=email">Fever Map Indicates Dramatic Drop in Temperature</a></b><br />
<br />
<b><a href="https://healthweather.us/?mode=Atypical">Kinsa Source Health Map</a></b><br />
<br />
After working with this map for a while I've jumped to the conclusion that this may be the most useful data so far about this whole COVID-19 issue. OK, temperature spikes do not equal COVID-19, but when these spikes correlate to jurisdictions with spiking COVID-19 cases and deaths, probability shifts dramatically in favor of these temperature spikes being caused by COVID-19. If this is true then we should see not just a flattening of the curve, but a dramatic drop in new cases within days, or at most a very few weeks.<br />
<br />
<b><a href="https://neurosciencenews.com/coronavirus-deadly-body-16080/">How COVID-19 affects humans</a></b><br />
<br />
04-07-20 It appears in the graph below that about 200 deaths per day in the U.S. may have been misattributed to COVID-19 instead of other causes of pneumonia. Of course, this is only one of the many comorbidity factors widely associated with this pandemic. If the other factors are added in, it might account for most of the current 1400 deaths per day, except for the geographical distribution of the dead. They are not evenly distributed across the population. They are epicentric in nature, especially in NY and NJ. Yes, there is misattribution but it likely only accounts for a fraction of the cases. The rest must be from the direct biological impact of COVID-19. It IS a real disease. We just don't yet have it well characterized.<br />
<br />
04-09-20 <a href="https://nypost.com/2020/04/07/feds-classify-all-coronavirus-patient-deaths-as-covid-19-deaths/?utm_source=facebook_sitebuttons&utm_medium=site%20buttons&utm_campaign=site%20buttons">Misattribution?</a><br />
<br />
04-13-20 Shit may be the breakthrough we need to solve the denominator problem. Not familiar with the issue? The Worldometer currently says we have 1,872,825 cases of Coronavirus which has caused 116,037 deaths worldwide. That's a death rate of over six percent, which is patently absurd. If this disease is really killing six percent of those who contract it, it is three times worse than the Spanish Flu, and that's simply not the case. There must be FAR more cases than have been documented. That would change the denominator in the death rate. This work with sewer sludge may ironically clarify our understanding. Next, we need to take on the misattribution issue, and the real scale of the Corona threat will come into focus:<br />
<br />
<b><a href="https://www.statnews.com/2020/04/07/new-research-wastewater-community-spread-covid-19/">New research examines wastewater to detect community spread of Covid-19</a></b></div>
<div>
<br /></div>
<div>
OK, I want to be clear. Corona IS a deadly disease, but only by degrees, and with extremely disproportionate targets. Here is a subgroup I just read about. It is a rest home in New Jersey with about 700 rooms, which, per industry averages, means a staff of about 70. At this home, 70 residents and two nurses died with a positive Corona test. That sample is consistent with the Diamond Princess - 700 tested positive and 10 deaths, many of them older passengers. In both cases, 10% of the elderly and 2% of the younger (but perhaps not completely healthy) died. This data is a place to start.<br />
<br />
04-19-20 <b><a href="https://medium.com/the-atlantic/a-new-statistic-reveals-why-americas-covid-19-numbers-are-flat-82e9b600551f">A New Statistic Reveals Why America’s COVID-19 Numbers Are Flat</a></b><br />
<br />
"According to the Tracking Project’s figures, nearly one in five people who get tested for the coronavirus in the United States is found to have it. In other words, the country has what is called a “test-positivity rate” of nearly 20 percent."<br />
This is the first decent "denominator" data I've seen so far. If this 20% number is correct, then Corona, with currently 39,000 deaths in the United States, has a death rate of 0.06 percent, which is less deadly than the typical flu. Then you have the issue of undercounting because of the lack of tests, and the misattribution issue, which would effectively overcount the dead. All of this new data seems to be homing in on my original assertion that Corona is not nearly as deadly as the media has presented.<br />
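The "denominator problem" running through this post is just the sensitivity of the implied fatality rate to the assumed infection count. The sketch below uses the post's 39,000 deaths; the 330 million US population and the list of assumed prevalence values are my own illustrative assumptions.

```python
# How the implied fatality rate moves with the assumed number of
# infections. DEATHS is the post's figure; US_POP and the prevalence
# values are illustrative assumptions.
US_POP = 330_000_000
DEATHS = 39_000

for prevalence in (0.001, 0.01, 0.05, 0.20):
    infections = US_POP * prevalence
    print(f"{prevalence:6.1%} infected -> fatality rate {DEATHS / infections:.3%}")
```

At 20% assumed prevalence the implied rate falls to roughly 0.06%, the figure quoted above; at 0.1% prevalence the same death count implies a rate above 10%. The death rate is almost entirely a statement about the denominator.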
<br />
04-21-20 Both Santa Clara and Los Angeles counties now have studies showing that 20 to 80 TIMES more people test positive for Corona antibodies than have been counted as diagnosed cases - consistent with the "Tracking Project" above. Again, this would mean Corona's fatality rate is comparable to a typical flu. Where are the numbers from the rest of the country? And why is this topic not being addressed in the daily briefing?<br />
<br />
<b><a href="https://patch.com/california/los-angeles/hundreds-thousands-la-infected-coronavirus-study">Hundreds Of Thousands In LA Infected With Coronavirus: Study</a></b><br />
<br />
04-23-20 Governor Cuomo just announced that 21% of New York city has a positive antibody test for Corona, yet does not acknowledge what this means for the CFR. Our government has been grossly negligent in managing this "pandemic" and its metrics.<br />
<br />
04-26-20 Santa Clara, Los Angeles, New York, and Miami are all reporting positive antibody tests many TIMES in excess of reported diagnosed cases - more than an order of magnitude beyond the official counts. It's time to reassess the nature of Corona.<br />
<br />
<b><a href="https://www.miamiherald.com/news/coronavirus/article242260406.html">Miami Joins the Crowd</a></b><br />
<br />
I won't bother with the bug's official name, COVID-19, anymore. This disease has had so much worldwide impact that it will forever be known as the Corona panic. Or perhaps it will ultimately be remembered as a media disease instead of a biological one. Like the disease itself, the published perception has been FAR worse than the reality - somewhere between one and two orders of magnitude. Even if this media impact was mostly not deadly (suicide and murder stats will likely show an increase), the financial costs will be enormous, perhaps incalculable.<br />
<br />
As for the disease itself, if these antibody tests are ultimately validated, then as I originally suggested at the beginning of this post, Corona will not be remembered as a deadly disease, at least not in the same terms as Ebola or HIV, and certainly not on the scale originally feared. It will take years to sort out the misattribution and discover Corona's true death rate.<br />
<br />
Also, the cost of the "cure" will far exceed the social impact of the relatively modest number of dead. In terms of death rate, Corona will likely fall somewhere between an average flu and perhaps the swine flu, but in its ability to spread it will be closer to the common cold, which spreads far more readily than the flu.<br />
<br />
I will let others with far more knowledge present the details, but it's safe to say that our media and government response to Corona cannot be rationalized nor supported when Corona is ultimately compared to the flu and our historical response to that and other diseases. Still, this panic response has been quite informative and an interesting exercise, even though VERY expensive.<br />
<br />
<br />
05-04-20 Here's another useful approach to understanding the true impact of Corona:<br />
<br />
<b><a href="https://www.cdc.gov/nchs/nvss/vsrr/covid19/excess_deaths.htm">Excess Deaths Associated with COVID-19</a></b><br />
<br />
<br />
05-05-20<br />
<div>
<br /></div>
<div>
<div>
Here's a very quick summary. Corona is both a deadly disease and a common cold, each by degrees and dependent upon conditions, once two issues are resolved - the denominator problem and the misattribution of causes of death, both overstated and understated for various reasons.<br />
<br />
So far it appears that somewhere between two and twenty percent of America has antibodies for this disease. A good guess might be that about 30 million Americans have already contracted and survived this disease. And that's its most important metric. It means this disease is not very deadly, perhaps not much worse than a bad case of the flu. Characterizing its mild cases should be relatively easy. Understanding how it kills could be much more difficult, as most of those deaths are mired in comorbidities, and teasing apart cause from correlation will be difficult. With 30% of the deaths occurring in rest homes, Corona will soon be largely managed as another disease of the elderly, while most of the world gets back to work.<br />
<br />
The important question is, how much deeper than 10% will this disease penetrate the U.S. population? And how many more will die before this immune base begins to impact the transmission rate?<br />
<br /></div>
<div>
05-26-20 Misattribution remains a mess:<br />
<br />
<b><a href="https://justthenews.com/politics-policy/coronavirus/how-alcohol-poisoning-led-colorado-change-how-it-reports-coronavirus?utm_source=justthenews.com&utm_medium=feed&utm_campaign=external-news-aggregators">Beating Up the Numbers:</a></b><br />
<br />
One example of misattribution:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgFo2Zqy7x7gK8Vbhkuln2QrFXcd9UTDlxz41Qpvd1MSLhuSfCueqR15n3LsZGe72PPzroTuZXqCKeVU3Jv6pgcQR4JjRDWdUou0QRbR1E56Q5LZPoD3JIXB4mmu4JkscVisuxkSg/s1600/Pneumonia+deaths+plunge+during+COVID+19.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="654" data-original-width="901" height="464" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgFo2Zqy7x7gK8Vbhkuln2QrFXcd9UTDlxz41Qpvd1MSLhuSfCueqR15n3LsZGe72PPzroTuZXqCKeVU3Jv6pgcQR4JjRDWdUou0QRbR1E56Q5LZPoD3JIXB4mmu4JkscVisuxkSg/s640/Pneumonia+deaths+plunge+during+COVID+19.jpg" width="640" /></a></div>
<br />
<br />
06-01-20 It will take a while before we learn the truth of Corona, but there are a few conclusions that can be drawn now:<br />
<br />
Likely beginning in late 2019, Americans began transmitting COVID-19 without even knowing it.<br />
<br />
By June 1st, 2020, between two and twenty percent of those living in large U.S. cities had contracted and recovered from Corona without ever knowing it. Somewhat less than one percent had symptoms acute enough to be tested. Approximately three-hundredths of this one percent died with a positive COVID-19 test. Some of these deaths were certainly caused by this deadly disease. Many others were not. There has been gross misattribution of the proximate cause of death in both directions. I believe that ultimately, COVID-19 will be seen to have been less lethal than the average flu. Only our response has been exceptional, and perhaps a good simulation for the real thing.<br />
<br />
Corona is now mostly a political issue.<br />
<br />
06-20-20 <b><a href="https://reason.com/2020/06/22/daily-covid-19-deaths-have-fallen-dramatically-since-april/?utm_medium=email">Daily COVID-19 Deaths in the U.S. Have Fallen Dramatically Since April</a></b><br />
<br />
06-22-20 <b><a href="https://justthenews.com/politics-policy/coronavirus/stanford-prof-median-infection-fatality-rate-coronavirus-those-under-70?utm_source=justthenews.com&utm_medium=feed&utm_campaign=external-news-aggregators">Stanford prof: Median infection fatality rate of coronavirus for those under 70 is just 0.04%</a></b><br />
<br />
06-24-20 <b><a href="https://www.erinbromage.com/post/the-risks-know-them-avoid-them">Transmission of disease - Erin Bromage</a></b><br />
<br />
Erin does not really deal with misattribution, which would have a dramatic effect on mortality rates, but there is much good basic information here:<br />
<br />
06-25-20 <b><a href="https://www.erinbromage.com/post/where-we-are-now">Where Are We Now? - Erin Bromage</a></b><br />
<br />
07-07-20 As of today, COVID-19 has killed 538,933 worldwide, of which 130,312 are in the United States. This is an interesting ratio, in that one would expect outcomes to be better than average in America because of better health resources. So why does 5% of the world's population have 24% of the deaths? Could it be misattribution of the cause of death?<br />
<br />
Let's back out those 130,312 questionable deaths as bad data and apply the remaining deaths to the population of the rest of the world. This exercise gives us 408,621 deaths for a population of 7,331,462,517, or about 56 deaths per million. Now it's true that misattribution could be understated in the rest of the world as well as overstated in America, but the odds of a 4.8-fold difference are slim. Also, infection rates will vary widely, but will tend to average out over such a large base. All things equal, the world number is likely the more accurate one. And if we apply this death rate back to America, it yields about 18,480 deaths to date, far less than the stated 130,312 and almost certainly more accurate.<br />
<br />
Finally, if we take our more probable infection rate of 10% of the population instead of the roughly 1% confirmed case rate, our denominator yields an infection fatality rate of about 0.06%, about like a bad flu year. This is almost certainly a better assessment than we have gotten from the World Health Organization or the CDC.<br />
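The two-step exercise above can be sketched in Python with the post's figures. Small differences from the quoted 18,480 come from rounding of the per-million rate; the 10% prevalence is an assumption, not a measured value.

```python
# Sketch of the 07-07-20 back-out exercise, using the post's figures.
world_deaths = 538_933
us_deaths = 130_312              # treated here as suspect data
world_pop_ex_us = 7_331_462_517  # rest-of-world population from the post
us_population = 328_200_000

# Step 1: rest-of-world death rate, excluding the questionable US count.
row_deaths = world_deaths - us_deaths                     # 408,621
deaths_per_million = row_deaths / world_pop_ex_us * 1e6   # ~56

# Step 2: apply that rate back to the US population.
implied_us_deaths = deaths_per_million * us_population / 1e6   # ~18,300

# Assumed 10% infection prevalence gives the post's ~0.06% IFR.
implied_ifr = implied_us_deaths / (us_population * 0.10) * 100
print(f"{deaths_per_million:.0f} deaths per million; "
      f"implied US deaths ~{implied_us_deaths:,.0f}; IFR ~{implied_ifr:.2f}%")
```

Every number downstream of Step 1 inherits the assumption that the rest-of-world count is the accurate one.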
<br />
What is all the fuss about?<br />
<br />
08-31-20 <b><a href="https://www.cdc.gov/nchs/nvss/vsrr/covid19/excess_deaths.htm" target="_blank">CDC Excess Death Analysis</a></b><br />
<br />
10-10-20 <b><a href="https://www.youtube.com/watch?v=e4hrHAefWaY" target="_blank">JP Video on the NEW CDC Infection Fatality Rates per CDC</a></b></div></div><div><br /></div><div>Flu = 0.1% </div><div>COVID ages 0-19 = 0.00003 (0.003%)</div><div><div>COVID ages 20-49 = 0.0002 (0.02%)</div><div><div>COVID ages 50-69 = 0.005 (0.5%)</div><div><div>COVID ages 70 + = 0.054 (5.4%)</div><div><div><br /></div><div>10-10-20 <b><a href="https://www.cdc.gov/coronavirus/2019-ncov/hcp/planning-scenarios.html" target="_blank">Infection Fatality Rate CDC site</a></b></div><div><br /></div></div></div></div></div><div>10-25-20 <b><a href="https://news3lv.com/news/nation-world/cdc-reports-almost-300000-more-deaths-than" target="_blank">CDC Reports 300,000 More Deaths in 2020</a></b></div><div><br /></div><div>Now where are these extra deaths coming from?</div><div><br /></div><div>The source of these deaths is extremely disproportionately concentrated among those in assisted living and among non-whites, which would indicate harvesting (pulling deaths forward), race-correlated or unhealthy lifestyles, or simple misattribution. Either COVID is not a normal virus, or we are measuring something else entirely. The race factor could also be a function of job loss, suicide, and domestic abuse among the groups hit hardest by the economic downturn, with income as the underlying variable:</div><div><br /></div><div><div>10-25-20 <b><a href="https://newatlas.com/health-wellbeing/cdc-300000-excess-deaths-usa-coronavirus-covid19/" target="_blank">CDC Reports 300,000 Excess Deaths</a></b></div><div><br /></div><div>11-02-20 <b><a href="https://www.timesofisrael.com/covid-19-antibodies-vanish-fast-that-doesnt-mean-mass-reinfection-looms/" target="_blank">Antibodies and what they mean</a></b></div><div><br /></div><div>Over 100,000 new cases of COVID in the United States in one day, with some states having "positivity rates" of 30 percent. 
This means that the disease is rippling out through the general population, but strangely enough, extremely few cases are ending in death or even hospitalization relative to the number of cases. Indeed, the number of cases now seems to be completely decoupled from the number of deaths. In other words, cases are going up exponentially, but the number of deaths remains flat at about 1000 per day. This would indicate that the two are not connected at all, and maybe never were. To be generous, death is VERY loosely correlated with contracting COVID. </div><div><br /></div><div>This draws attention to those thousand who are dying each day. Are they really dying from COVID? Or is a thousand per day simply the limit of misattribution for a population of this size? In the long run, we will know.</div></div><div><br /></div><div>11-05-20 <b><a href="https://www.sciencealert.com/case-study-reveals-rare-patient-who-showed-no-symptoms-but-shed-infectious-sars-cov-2-for-70-days" target="_blank">Carrier Who Shed Virus For 70 Days</a></b></div><div><br /></div><div>It's past time to admit that COVID is not nearly as deadly as originally feared, not even within an order of magnitude. Maybe not even within two orders of magnitude of published worst-case scenarios. A large part of America has obviously already had this disease, more than half of whom were never even aware of it. Only a very small fraction of those with COVID required hospitalization, and only a much smaller fraction of those died. This fraction was so small that even determining the true cause of death for these relative few was difficult, and often inaccurate. It's time to post-analyze all this data and put this disease in its place - a relatively minor threat to the world in 2020.</div><div><br /></div><div><br /></div><div>11-18-20 FINALLY! The general press is picking up on the denominator issue I first posted in this blog in March. Eight months isn't bad. 
Oh well.</div><div><br /></div><div><a href="https://reason.com/2020/11/18/a-new-prevalence-estimate-suggests-the-covid-19-infection-fatality-rate-in-texas-is-roughly-0-4-percent/?utm_medium=email" target="_blank">IFR - 0.4 Percent</a></div><div><br /></div><div>11-20-20 Misattribution? </div><div><br /></div><div>It's becoming clear that the key to understanding COVID is teasing apart the deaths it actually caused from those where a normally minor infection was merely coincidental, or where biological changes brought on by COVID amplified the risk of existing comorbidities - making COVID a "harvesting" disease. In any case, COVID is not the threat to humanity presented by the media. It may be a weird disease with an even stranger preference for killing the weakest in our culture. Or perhaps COVID is a disease that rarely kills, and what we have is just a bad case of mass hysteria, with all of those deaths an artifact of misattribution, over-treatment, or mis-treatment brought on by fear. Yes, this is a radical thought, but COVID is a radically different kind of disease, and so this idea needs to be seriously considered:</div><div><br /></div><b><a href="https://justthenews.com/politics-policy/health/death-man-who-fell-ladder-ruled-natural-caused-covid-19?utm_source=justthenews.com&utm_medium=feed&utm_campaign=external-news-aggregators" target="_blank">Death of man who fell off ladder ‘ruled as natural, caused by COVID-19′</a></b>
</div><div><br /></div><div><br /></div><div>11-26-20 Could the true number of COVID deaths actually be ZERO? Not surprisingly, this publication was deleted shortly after it was published, but it remains on the Wayback Machine:</div><div><br /></div><div><b><a href="https://web.archive.org/web/20201126223119/https://www.jhunewsletter.com/article/2020/11/a-closer-look-at-u-s-deaths-due-to-covid-19" target="_blank">A closer look at U.S. deaths due to COVID-19 from Johns Hopkins News-Letter</a></b></div><div><br /></div><div>11-27-20 Why is science being deleted instead of being challenged on its underlying data?</div><div><br /></div><div><b><a href="https://justthenews.com/politics-policy/coronavirus/johns-hopkins-published-then-deleted-study-questioning-us-coronavirus?utm_source=justthenews.com&utm_medium=feed&utm_campaign=external-news-aggregators" target="_blank">Johns Hopkins published then deleted an article questioning the U.S. coronavirus death rate</a></b></div><div><br /></div><div>12-03-20 As presented by the study below: <b><a href="https://academic.oup.com/cid/advance-article/doi/10.1093/cid/ciaa1785/6012472" target="_blank">Up to two percent of the U.S. may have already had COVID by mid-December of 2019</a></b></div><div><br /></div><div>12-30-20 <b><a href="https://www.kusi.com/carl-demaio-calls-for-audit-of-covid-19-data-only-36-flu-cases-reported/" target="_blank">What Happened to the flu this year?</a></b></div><div><br /></div><div><br /></div><div>Total U.S. Deaths in 2019 - 2,854,838</div><div><br /></div><div><br /></div><div>03-08-21 When I first wrote this post, I believed that at the very least we'd get a useful dry run in case a much more deadly disease reached our shores. 
Now, nearly a year later, I'm far more concerned that our media and government have destroyed their credibility to the point where we may now be far LESS effective at stopping the next pandemic.</div><div><br /></div><div>05-27-21 <b><a href="https://justthenews.com/politics-policy/coronavirus/cdc-more-10000-covid-19-vaccine-breakthrough-cases-have-been-reported?utm_source=justthenews.com&utm_medium=feed&utm_campaign=external-news-aggregators" target="_blank">CDC: More than 10,000 COVID-19 vaccine breakthrough infections have been reported</a></b></div><div><br /></div><div>Still no useful denominator, but these numbers are quite similar to the disease itself, demonstrating once again how lethal this disease wasn't. Hopefully, time and more data will enlighten us as to the true nature of this disease.</div><div><br /></div><div>07-29-21 - Without addressing or endorsing his general political conclusions, I find Dr. Yeadon's contrast of COVID science with government policy quite useful, especially his views on misattribution, lockdowns, and masks:</div><div><br /></div><div><b><a href="https://www.bitchute.com/video/9Ci2jK1yFoOd/" target="_blank">Dr. Michael Yeadon Warning</a></b></div><div><br /></div><div><b><a href="https://tcn.video/dr-michael-yeadon-full-interview-the-awakening-3/" target="_blank">Another interview of Dr. 
Michael Yeadon</a></b></div><div><br /></div><div><br /></div><div>11-04-21 <b><a href="https://www.thelancet.com/journals/laninf/article/PIIS1473-3099(21)00648-4/fulltext" target="_blank">Covid Delta Transmission Lancet Study</a></b></div><div><br /></div><div>It appears from the above study that the vaccine does not decrease the probability of transmission: it yields only minor improvements in viral load and the window of transmission, and even those effects are short-lived.</div><div><br /></div><div>11-07-21 <a href="https://www.zerohedge.com/markets/italian-institute-health-drastically-reduces-its-official-covid-death-toll-number?utm_source=&utm_medium=email&utm_campaign=257" target="_blank"><b>Italian Institute Of Health Drastically Reduces Its Official COVID Death Toll Number by Over 97%</b></a></div><div><br /></div><div>The above is "fake news" in that there has been no official revision of the number of COVID deaths in Italy, only the admission that just 2.9% of COVID deaths had no comorbidities. Still, when is a COVID death not a COVID death? And why the special epidemiological treatment of this particular bug?</div><div><br /></div><div><br /></div><div>02-19-22 - <b><a href="https://www.theguardian.com/world/2022/feb/17/us-excess-deaths-pandemic-cdc" target="_blank">COVID killed about a million people in two years.</a></b></div><div><br /></div><div>That's about 0.15 percent per year of excess deaths, assuming no misattribution - about one-fortieth of the 6% fatality rate we were warned about. What would the death rate have been without any special measures taken? We should know in time. Were those lives worth the cost in dollars, and deaths of despair? 
We will likely never know that answer.</div><div><br /></div><div><br /></div><div>01-18-23 <b><a href="https://www.tabletmag.com/sections/arts-letters/articles/stanford-failed-academic-freedom-test" target="_blank">In Retrospect</a></b></div><div><br /></div><div>02-03-23 <b><a href="https://www.express.co.uk/comment/expresscomment/1729287/covid-19-infection-china-cover-up-conspiracy-USA-wuhan-laboratory-man-made-virus-weapon#amp-readmore-target" target="_blank">'Knew Covid had been engineered to make it infectious to humans but were told to shut-up'</a></b></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-61048140390916147332020-05-05T07:01:00.016-07:002023-02-03T08:15:24.195-08:00April Fools!April Fools on April 20th? OK, I'm a bit late but it's no less valid. To be honest I wanted to publish this on April 1st but I didn't have any indication of the required data at the time. And to be honest, the joke would not have been well taken at the time. Now it might. Now I do:<br />
<br />
<b><a href="https://medium.com/the-atlantic/a-new-statistic-reveals-why-americas-covid-19-numbers-are-flat-82e9b600551f">A New Statistic Reveals Why America’s COVID-19 Numbers Are Flat</a></b><br />
<div>
<br /></div>
<div>
You don't need to read the whole article, as it's mostly post-rationalization. I'll quote the critical paragraph:</div>
<br />
<b>"According to the Tracking Project’s figures, nearly one in five people who get tested for the coronavirus in the United States is found to have it. In other words, the country has what is called a “test-positivity rate” of nearly 20 percent."</b><br />
<div>
<br /></div>
<div>
If that 20 percent estimate ultimately turns out to be accurate, most of America (and much of the world) are April Fools. The reason is what I called the "denominator problem" in <a href="https://suddendisruption.blogspot.com/search/label/Covid-19"><b>my first post on this topic</b>.</a></div>
<div>
<br /></div>
<div>
Here's why in simple math. As of this morning, according to Worldometer, 39,015 people in the United States have died due to Corona (I will no longer bother to dignify the bug with its proper name, as the name of a beer is more appropriate). If one in five of the whole population has been infected, the actual death rate of those who contract the disease is 39,015 / (328,200,000 / 5) = 0.06 percent, rounded UP. Please note, that is NOT six percent. The likely death rate of Corona is only six hundredths of ONE percent, or about twice as bad as the Swine Flu Epidemic of 2009. </div>
<div>
<br /></div>
<div>
"an overwhelming body of evidence shows that this is an undercount."</div>
<div>
<br /></div>
<div>
And THAT is an understatement.</div>
<div>
<br />
05-05-20<br />
<br /></div>
<div>
<div>
Here's a very quick recap. Corona is both a deadly disease and a common cold, each by degrees and dependent upon conditions, once two issues are resolved - the denominator problem and the misattribution of causes of death, both overstated and understated for various reasons.<br />
<br />
So far it appears that somewhere between two and twenty percent of America has antibodies for this disease. A good guess might be that about 30 million Americans have already had this disease. And that's its most important metric. It means this disease is not very deadly, perhaps not much worse than a bad case of the flu. Characterizing its mild cases should be relatively easy. Understanding how it kills could be much more difficult, as most of those deaths are mired in comorbidities, and teasing apart cause from correlation will be difficult. With 30% of the deaths occurring in rest homes, Corona will soon be largely managed as another disease of the elderly, while most of the world gets back to work.</div><div><br /></div><div><br /></div>
<div>
<br /></div><div>02-03-23 I was obviously wrong about getting back to work. Here is the reason:</div><br /><a href="https://www.express.co.uk/comment/expresscomment/1729287/covid-19-infection-china-cover-up-conspiracy-USA-wuhan-laboratory-man-made-virus-weapon#amp-readmore-target">'<b>Knew Covid had been engineered to make it infectious to humans but were told to shut-up'</b></a>
</div><div><b><br /></b></div><div><b><br /></b></div><div><b><br /></b></div>
Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-66249585898014845512020-03-29T06:03:00.006-07:002022-03-31T19:01:18.277-07:00Defy Aging - Keep Moving and Stay HungryFirst posted on my sixtieth (or is that fortieth?) birthday, 01-29-12. Updated periodically:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhxJePhwlSoilqOfUnYb6lwq_hIfBIgrrcLaORuo6nYyN9dxjRIk8e74gILczIpepDOie0W55ApB28NA9nUj2gmw5W6DGnbB0YNf6R4U__Ne82Dtkeulu7xQOmi06hIiAmdIg3Pxw/s1600/Mimi+Kirk.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="436" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhxJePhwlSoilqOfUnYb6lwq_hIfBIgrrcLaORuo6nYyN9dxjRIk8e74gILczIpepDOie0W55ApB28NA9nUj2gmw5W6DGnbB0YNf6R4U__Ne82Dtkeulu7xQOmi06hIiAmdIg3Pxw/s640/Mimi+Kirk.jpg" width="640" /></a></div>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
60 is the new 40. 50 is the new 30. I've even seen a proclamation that 80 is the new 30!<br />
<br />
<div>
Such declarations about this new "age" can be seen everywhere. Are they simply age denial? Do baby-boomers refuse to grow old? And is this denial just a way to lie about our age? Will lying to ourselves help us live longer? Maybe.<br />
<br />
There's been lots of interesting research on the placebo effect in all its forms. What's interesting is that placebos seem to work even when the patient KNOWS it's a placebo. And how is lying about your age different from giving yourself a placebo?</div>
<div>
<br />
Unfortunately, the placebo effect will not solve everything. There are still some hard facts in this new age of aging. Since the Kellogg brothers made health a popular topic at the beginning of the last century, thousands of treatments have been tried in an attempt to stay younger. Most have been proven to be worthless, but a few obviously make a difference:<br />
<br />
"Today, the average age for someone moving into a nursing home is 81. In the 1950s, it was 65."<br />
<br />
"People are living 34 years longer than their great-grandfathers."</div>
<div>
<br />
"The number of people in the world over 100 years old is now approaching half a million."<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEicp1f1QxlWQZv56vqH1T4SKE0CTbnqmQ4puxpcUarWKTnw4EaflD4JUlauuV0-bWY2yXLRYAKJ-5kElLKxyrnc10n_PW-nrCT8y3gR7oYMjUqASnhywgQRH4eU1rRerO9BUXy62A/s1600/sun-damaged-trucker-face.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="278" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEicp1f1QxlWQZv56vqH1T4SKE0CTbnqmQ4puxpcUarWKTnw4EaflD4JUlauuV0-bWY2yXLRYAKJ-5kElLKxyrnc10n_PW-nrCT8y3gR7oYMjUqASnhywgQRH4eU1rRerO9BUXy62A/s1600/sun-damaged-trucker-face.jpg" width="400" /></a></div>
The internet is full of such dramatic results, so how does one gain the benefit? A few simple things make most of the difference.<br />
<br />
Avoiding tobacco and <a href="http://www.nejm.org/doi/full/10.1056/NEJMicm1104059"><b><span style="color: #0b5394;">limiting solar exposure</span></b></a> are good for the skin. The guy to the left was a life-long truck driver. He's obviously not British. Tobacco has a similar effect, but on all of your skin.<br />
<br />
Appearance aside, the most important factors in staying young are still diet and exercise, so keep moving and stay hungry.<br />
<br />
<br />
<br />
<b><span style="font-size: x-large;">Keep Moving</span></b><br />
<br />
Whether you are overweight, or have chronic pain, arthritis, dementia, depression, diabetes, anxiety or fatigue, there is one piece of advice that will improve both the quality and length of your life - "Keep Moving". What's surprising is how this advice improves not only your physical but also your mental health.<br />
<br />
ANY physical activity that keeps you moving for at least 30 minutes a day, EVERY day will make a huge difference. That "every day" is the hard part. Success starts with finding something you enjoy. It can be yoga, swimming or walking. Start slowly and work your way up. Even if it takes a year to do 2 miles a day, after that you've gained 80% of the benefit of exercising in general. The second, fifth and seventeenth years are much easier. The best exercise is the one that you DO, so it's probably the one you enjoy most. Find your favorite way to move.<br />
<br />
Here is the best summary I've found on the topic, graphically presented. If you do nothing else about your health this year, at least spend nine minutes watching this video. It may add years to your life:<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjFnHBmRlS8u3JrOHZ7CwKxcVpx129Ge2lHvecbtp4HH40BdtgMONFdkiffrMALf2kwLaU0_Ye2t4S3q5aYVr0yYgnncZnRsuCzD0MbFCxVXsCCht6xMBzJvqFLiQx7pYV0JFi6UA/s1600/23-and-a-half-hours-your-health.png" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="299" data-original-width="485" height="394" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjFnHBmRlS8u3JrOHZ7CwKxcVpx129Ge2lHvecbtp4HH40BdtgMONFdkiffrMALf2kwLaU0_Ye2t4S3q5aYVr0yYgnncZnRsuCzD0MbFCxVXsCCht6xMBzJvqFLiQx7pYV0JFi6UA/s640/23-and-a-half-hours-your-health.png" width="640" /></a></div>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><a href="https://www.youtube.com/watch?v=aUaInS6HIGo&sns=fb">23 and 1/2 Hours : What is the Single Best Thing You Can Do For Your Health?</a></b><br />
<br />
<br />
<div style="margin: 0in 0in 0.0001pt;">
<span style="font-family: "georgia" , serif; font-size: 10pt;">“You don’t deteriorate from age, you age from deterioration.”</span> <i><span style="font-family: "georgia" , serif; font-size: 10pt;">- Joe Weider </span></i></div>
<br />
<br />
<span style="font-size: x-large;"><b>Stay Hungry</b></span><br />
<br />
The meaning is obvious. The trick is to not stay too hungry. Just like the exercise part, if you take it to the point of pain you're more likely to return to your old lifestyle. On the other hand, if you eat only what you need, you'll not only stay lean and healthy, you'll enjoy life more.<br />
<br />
Have you noticed how much better food tastes when you're hungry? Well, at least the first few hundred calories. This is an important hint. When the meal becomes less compelling, stop eating. I know it's easier to say than do, for a number of reasons. But if you eat just 100 calories less than you burn each day, you'll lose about 10 pounds a year - that's hard arithmetic, and it's major progress. Still, it's easier said than done.<br />
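That arithmetic rests on the common rule of thumb that a pound of body fat stores about 3,500 calories - a rough approximation, not exact physiology:

```python
# Rough deficit-to-weight arithmetic, using the common rule of thumb
# that one pound of body fat stores about 3,500 kcal (an approximation).
KCAL_PER_POUND = 3500
daily_deficit = 100            # eat 100 kcal less than you burn, every day

pounds_per_year = daily_deficit * 365 / KCAL_PER_POUND
print(f"{pounds_per_year:.1f} lb/year")   # -> 10.4 lb/year
```

In practice the body adapts as weight drops, so real-world losses taper off; the formula is a first-year estimate at best.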
<br />
The trick is finding how many calories you really NEED. It's probably a lot less than you think. That's because we're used to eating about twice as much as we require. Food is everywhere you turn. There's now even a snack bar at our local DMV. People seem to eat every hour or two. And they eat more than they did a hundred years ago. There's just too much food in our cage. Fortunately, our body is smart enough to send most of those extra calories right down the toilet. But not all of them. Over time, even a few extra calories a day will add to your waistline.<br />
<br />
You can use an internet calculator to find how many calories you need per day. You can tell if the number's right by how hungry you are at the end of the day:<br />
<br />
<b><a href="https://www.calculator.net/calorie-calculator.html">Calorie Calculator</a></b><br />
<br />
Once you know this number, slowly decrease intake until you find that edge between hunger and health. This should become your average consumption target. Avoid grazing. Eat at appointed times, and only planned amounts. Take some of that food out of your virtual cage. And as you decrease volume, increase variety. That's the key to good nutrition.<br />
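Many online calorie calculators (likely including the one linked above, though that is an assumption) are built on the Mifflin-St Jeor equation; here is a minimal sketch. The function name and example figures are illustrative only:

```python
# Minimal daily-need estimate using the Mifflin-St Jeor equation,
# the formula behind many online calorie calculators.
def daily_calories(weight_kg, height_cm, age, male=True, activity=1.2):
    """Estimated maintenance calories.

    activity: 1.2 (sedentary) up to about 1.9 (very active).
    """
    # Basal metabolic rate per Mifflin-St Jeor (1990).
    bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age + (5 if male else -161)
    return bmr * activity

# Example: a 70 kg, 175 cm, 60-year-old man, lightly active (factor 1.375)
print(round(daily_calories(70, 175, 60, male=True, activity=1.375)))  # -> 2061
```

As the text suggests, treat the output only as a starting point and adjust it slowly by how hungry you actually are at the end of the day.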
<br />
Another trick is micro-fasting. If you know you'll be having a large dinner, skip lunch. Sure, you'll be hungrier than usual and probably eat a bit more at dinner, but you're already a few hundred calories under your target. Just don't stuff yourself. Keep your AVERAGE consumption just below your need. Take your time losing those extra pounds. We are each the summation of what we do and what we eat.<br />
<br />
"Staying hungry" will also improve the experience of your other appetites. From sex to alcohol to Netflix, less can be more if you hone your appetite with a bit of moderation. Find the "sweet spot" and stay hungry in all respects.<br />
<br />
<h2>
<span style="font-size: x-large;">Live Longer</span></h2>
If it's that simple, why are so few truly healthy? Clearly, not everyone is gaining these extra years. Not surprisingly, the access to excess noted above, plus the electric grocery cart, is the reason. The majority of people today are actually shortening their lives with calories and the couch. Many are now dying younger than they would have a hundred years ago because of this default lifestyle. And more will follow them into the grave shortly. Just look around.<br />
<br />
Our society has bifurcated: most people (of all ages) default to less activity and more calories, while a minority eat less and lead more active lives. What's truly amazing is that this minority still skews the average lifespan upward while the bulk of America is killing themselves early. That's why a healthy lifestyle may extend one's life even more than the averages indicate. If you live well, your chronological age may not matter as much as you think.<br />
<br />
Misrepresenting your age may be a lie, but it's a lie worth living.<br />
<div>
<br />
<div style="margin-bottom: .0001pt; margin: 0in;">
"Count your age by friends, not years. Count your life by smiles, not tears." - John Lennon</div>
</div>
<div>
<br />
<br />
Even more data:</div>
<br />
Consistent with the social bifurcation in diet and exercise described above:<br />
<br />
05-08-17 <b><a href="https://www.theguardian.com/inequality/2017/may/08/life-expectancy-gap-rich-poor-us-regions-more-than-20-years">Life expectancy gap between rich and poor US regions is 'more than 20 years'</a></b><br />
<br />
04-17-13 Here is a demonstrative natural experiment: the effect of a 50-calorie-per-day reduction for an entire country! Now if we could just learn to do that as individuals:<br />
<br />
<a href="http://www.independent.co.uk/life-style/health-and-families/health-news/the-cuban-diet-eat-less-exercise-more--and-preventable-deaths-are-halved-8566603.html?utm_source=indynewsletter&utm_medium=email">The Cuban diet: eat less, exercise more - and preventable deaths are halved</a><br />
<br />
Another example in progress:<br />
<br />
<b>02-22-18 <a href="https://www.reuters.com/article/us-venezuela-food/venezuelans-report-big-weight-losses-in-2017-as-hunger-hits-idUSKCN1G52HA">Venezuelans report big weight losses in 2017 as hunger hits</a></b><br />
<br />
06-10-13 Cause or effect? <b><a href="http://www.wtop.com/267/3352449/Study-Fast-walkers-stay-ahead-of-aging">Fast walkers stay ahead of the game</a></b></div>
<div>
<br />
01-15-15 More data:<br />
<br />
<b style="background-color: white; color: #292221; font-family: georgia, times, "times new roman", serif;"><a href="http://www.express.co.uk/life-style/health/552048/Brisk-20-minute-walk-each-day-could-reduce-risk-early-death#prclt-GTaA1K2y">Daily walk adds years to your life: Just 20 minutes a day is enough</a></b><br />
<br />
04-21-15 Or is an hour a day the sweet spot?<br />
<br />
<b><a href="http://well.blogs.nytimes.com/2015/04/15/the-right-dose-of-exercise-for-a-longer-life/?action=click&pgtype=article&region=Marginalia&version=Full&contentCollection=Asia+Pacific#prclt-WMxCNu1n">The Right Dose of Exercise for a Longer Life</a></b><br />
<div style="-webkit-text-stroke-width: 0px; color: black; font-family: "Times New Roman"; font-size: medium; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; margin: 0px; orphans: auto; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;">
<br /></div>
<div style="margin: 0px;">
<div style="color: black; font-family: "times new roman"; font-size: medium; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; text-transform: none; white-space: normal; word-spacing: 0px;">
<b><a href="https://www.blueprintincome.com/tools/life-expectancy-calculator-how-long-will-i-live/">Risk of Death Calculator</a></b></div>
<div style="color: black; font-family: "times new roman"; font-size: medium; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; text-transform: none; white-space: normal; word-spacing: 0px;">
<br /></div>
<div style="color: black; font-family: "times new roman"; font-size: medium; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; text-transform: none; white-space: normal; word-spacing: 0px;">
01-20-16 Here's an interesting idea that fits in with the work I've been doing on neuroscience and behavior:</div>
<div style="color: black; font-family: "times new roman"; font-size: medium; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; text-transform: none; white-space: normal; word-spacing: 0px;">
<br /></div>
<b><a href="https://aeon.co/essays/hunger-is-psychological-and-dieting-only-makes-it-worse?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+AeonMagazineEssays+%28Aeon+Magazine+Essays%29">The Hunger Mood</a></b></div>
</div>
<div style="margin: 0px;">
<br />
04-09-17 Interesting meta-collection: <b><a href="http://www.stumbleupon.com/su/2oQoXY/:9-Grsm1q:9VSPRVT+/www.businessinsider.com/best-age-for-everything-2017-3">Peaking</a></b></div>
<br />
06-08-17 <b><a href="http://www.longevityillustrator.org/">Longevity Illustrator</a></b><br />
<div>
<br /></div>
<div>
07-17-18 <b><a href="http://www.bbc.com/future/story/20180712-the-age-you-feel-means-more-than-your-actual-birthdate">Think Yourself Young</a></b><br />
<br />
11-08-18 <a href="https://www.bbc.com/news/world-europe-46133262">Dutchman, 69, brings lawsuit to lower his age 20 years</a><br />
<br />
01-17-19 <b><a href="https://www.bakadesuyo.com/2019/01/long-awesome-life/">This Is How To Have A Long Awesome Life: 7 Secrets From Research</a></b><br />
<br />
04-22-19 <b><a href="https://elemental.medium.com/the-easier-way-to-do-intermittent-fasting-9a9c60ba2e96">Effective Microfasting</a></b><br />
<br />
10-25-19 <b><a href="https://www.washingtonpost.com/outlook/five-myths/five-myths-about-aging/2019/09/20/e7bfa11a-daf9-11e9-bfb1-849887369476_story.html">Five Myths About Aging</a></b><br />
<br />
01/13/20 <b><a href="https://eprognosis.ucsf.edu/leeschonberg.php">Lee Schonberg Index</a></b><br />
<br />
02-02-20 <b><a href="https://www.theatlantic.com/family/archive/2020/01/old-people-older-elderly-middle-age/605590/">When Does Someone Become "Old"?</a></b><br />
<br />
03-29-20 <b><a href="https://www.sciencenews.org/article/number-steps-per-day-not-speed-linked-mortality-rate?utm_source=Editors_Picks&utm_medium=email&utm_campaign=editorspicks032920">The number of steps per day, not speed, is linked to mortality rate</a></b><br />
<br />
05-05-20 The cohort no doubt filters for the healthiest, but deaths were still associated with how MANY steps were taken, not how fast they were taken:<br />
<br />
<b><a href="https://jamanetwork.com/journals/jama/article-abstract/2763292">More Steps Per Day Linked to Lower Mortality Risk</a></b><div class="article-detail__content" style="box-sizing: inherit;">
<div style="box-sizing: inherit; font-family: Arial, sans-serif; font-size: 12px; margin-bottom: 0.4375rem;">
<br /></div>
<div style="box-sizing: inherit; margin-bottom: 0.4375rem;">Taking more steps per day is associated with lower all-cause mortality risk, according to an observational study in JAMA. <br /> Roughly 4,800 adults aged 40 and up participating in the National Health and Nutrition Examination Survey (NHANES) wore accelerometers on their hips during waking hours for 7 days. During a mean 10 years' follow-up, 24% died. The unadjusted all-cause mortality rates were: <br /> <br /> 77 per 1,000 person-years for those who took less than 4,000 steps per day; <br /> 21 per 1,000 for 4,000–7,999 steps; <br /> 7 per 1,000 for 8,000–11,999 steps; and <br /> 5 per 1,000 for 12,000 steps and above. <br /> In adjusted analyses, people who took 8,000 steps per day had lower all-cause, cardiovascular, and cancer mortality than those who took 4,000 steps. Faster walking speed was not associated with lower mortality after adjusting for total daily steps. <br /> <br /> <br /> 09-21-20<span face="Arial, sans-serif"><span style="font-size: 12px;"> </span></span><b><a href="https://www.jyu.fi/en/current/archive/2020/09/older-people-have-become-younger-physical-and-cognitive-function-have-improved-meaningfully-in-30-years#:~:text=Research%20news-,Older%20people%20have%20become%20younger%3A%20physical%20and%20cognitive%20function,improved%20meaningfully%20in%2030%20years&text=of%20Jyv%C3%A4skyl%C3%A4%2C%20Finland.-,The%20study%20compared%20the%20physical%20and%20cognitive%20performance%20of%20people,aged%20people%20in%20the%201990s." target="_blank">Older People Have Become Younger</a></b></div><div style="box-sizing: inherit; margin-bottom: 0.4375rem;"><br /></div><div style="box-sizing: inherit; margin-bottom: 0.4375rem;">03-31-22 <b><a href="https://www.inverse.com/mind-body/10000-steps-longevity-health">10,000 Steps?</a> Or will 8,000 do? 
It apparently depends on your age.</b></div><div style="box-sizing: inherit; margin-bottom: 0.4375rem;"><b><br /></b></div><div style="box-sizing: inherit; margin-bottom: 0.4375rem;"><br /></div>
</div>
</div>
Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com1tag:blogger.com,1999:blog-23742979.post-3587958912839375312020-02-09T11:25:00.026-08:002024-03-18T05:02:47.992-07:00The Emergence of ManFirst posted 08-03-12, updated every now and then.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhHsH7L3wMt1FHOdd3ZKfu49zd3EhmHDQj4PxJ8SY5roksGaaHtPCf7tzaMdrEHhUEBT_ag9agUN-g4mrF2pPUUM-Z7ryhCX3V7ctovE9vElSkMjQhvvEsBMBb3aZx1DyRTDA5vUA/s1600/Birth+of+Mankind.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="265" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhHsH7L3wMt1FHOdd3ZKfu49zd3EhmHDQj4PxJ8SY5roksGaaHtPCf7tzaMdrEHhUEBT_ag9agUN-g4mrF2pPUUM-Z7ryhCX3V7ctovE9vElSkMjQhvvEsBMBb3aZx1DyRTDA5vUA/s400/Birth+of+Mankind.jpg" width="400" /></a></div>
As a boy, I was impressed with "2001: A Space Odyssey". The idea that a spark from space vaulted man beyond the other primates seemed plausible. And it got me thinking.<br />
<br />
How different ARE we from the other primates? And when did this difference occur?<br />
<br />
These questions led me to "The Naked Ape" by Desmond Morris. He defined a few differences, but far more similarities. So I kept looking.<br />
<div>
<br /></div>
Over the years I've kept track of the various discoveries looking for the significant differences between us and our cousins. It's time to document them.<br />
<br />
For background, let's step back about a billion years:<br />
<br />
<b><a href="https://arstechnica.com/science/2019/05/billion-year-old-fossils-may-be-early-fungus/">Billion-year-old fossils may be early fungus</a></b><br />
<b><a href="http://www.bbc.com/news/science-environment-38800987">Scientists Find 'Oldest Human Ancestor'</a></b><br />
<br />
And half a billion years later:<br />
<br />
<b><a href="http://www.bbc.co.uk/news/science-environment-22770646">Tiny Chinese Archicebus fossil is the oldest primate yet found</a></b><br />
<br />
<br />
<b><span style="font-size: large;">Walking and Running</span></b><br />
<br />
When I was a child, walking erect was the gold standard of humanity. And it's true, we're better on two feet chasing down game than all others, but only by degrees.<br />
<br />
<a href="https://phys.org/news/2019-09-rare-million-year-old-fossil-unearths-view.html"><b>Rare 10 million-year-old fossil unearths new view of human evolution</b></a><br />
<br />
<b><a href="https://www.livescience.com/64275-little-foot-hominin-excavated.html">Little Foot - 3.67 Million Years Old</a></b><br />
<div>
</div>
<br />
Bonobos showed that walking erect is no big deal:<br />
<br />
<b><a href="https://www.youtube.com/watch?feature=player_embedded&v=l0tAQcpLILQ">Walking Upright</a></b><br />
<br />
<div>
But that's not running. About three million years ago a significant change occurred. Humans became marathon runners and developed new hunting scripts largely stolen from wolves, which raises the question: who ultimately domesticated whom?</div>
<br />
<b><a href="http://www.nytimes.com/2015/05/22/science/family-tree-of-dogs-and-wolves-is-found-to-split-earlier-than-thought.html">Family Tree of Dogs and Wolves Is Found to Split Earlier Than Thought</a></b><br />
<br />
<b><a href="https://www.sciencenews.org/article/first-steps-book-bipedalism-human-evolution-anatomy-behavior" target="_blank">First Steps</a></b><div><br />
<b><span style="font-size: large;">Diet</span></b><br />
<br />
Food has also been an area of study to define differentiation, but which species jumped what line, and when?<br />
<b><br /><a href="http://www.bbc.co.uk/news/science-environment-22752937">Human ancestors changed diet 3.5 million years ago</a></b><br />
<br /><br /><b><span style="font-size: large;">Thumbs</span></b><br />
<br />A well developed thumb is closely associated with human advancement:</div><div><br /></div><div>04-16-21 <b><a href="https://www.sciencenewsdigital.org/sciencenews/february_27__2021/MobilePagedArticle.action?articleId=1660322#articleId1660322" target="_blank">Humanlike grips go back 2 million years</a></b></div><div><br /><br />
<span style="font-size: large;"><b>Tools</b></span><br />
<br />
Tool use was another thing thought to set us apart, but this test also fell, as chimps and other species have since demonstrated.<div><br /></div><div>04-14-15 <b><a href="http://news.sciencemag.org/africa/2015/04/world-s-oldest-stone-tools-discovered-kenya">World's oldest stone tools discovered in Kenya </a></b><br />
<br />
<b><a href="http://www.bbc.co.uk/news/science-environment-10938453">Tool-making and Meat-eating Began 3.5 Million Years Ago?</a></b><br />
<br />
<b><a href="http://in.reuters.com/article/2015/01/22/science-hands-idINKBN0KV2N220150122">Tool-Making 3 Million Years Ago?</a></b><br />
<br />
<b><a href="https://www.sciencenews.org/article/ancient-homo-stone-tools-north-africa-arabia-early">Stone-tool makers reached North Africa and Arabia surprisingly early - 2.4 million years ago</a> </b><br />
<br />
<a href="http://www.livescience.com/41986-human-hand-fossil-reveals-early-tool-use.html">Tool-making 1.4 Million Years Ago?</a><br />
<br />
<a href="https://www.sciencenews.org/article/homo-erectus-hand-ax-stone-age-tools?utm_source=email&utm_medium=email&utm_campaign=latest-newsletter-v2&utm_source=Latest_Headlines&utm_medium=email&utm_campaign=Latest_Headlines">Bone Tool-making - 1.4 Million Years Ago</a></div><div><br /></div><div><div>03-18-24 <b><a href="https://www.sciencealert.com/archaeologists-just-uncovered-the-oldest-evidence-of-humans-in-europe">Archaeologists Just Uncovered The Oldest Evidence of Humans in Europe</a></b></div><div><br /></div><div>12-23-22 <b><a href="https://www.sciencealert.com/ancient-humans-may-have-sailed-the-mediterranean-450000-years-ago" target="_blank">Sailing 500,000 Years Ago</a></b></div></div><div><br /></div><div>10-28-22 - <b><a href="https://www.sciencealert.com/half-a-million-year-old-signs-of-extinct-human-species-found-in-poland-cave" target="_blank">Half-a-Million Year Old Signs of Extinct Human Species Found in Poland Cave</a></b><br />
<br />
12-06-19 <b><a href="https://ktla.com/2017/04/26/discovery-near-san-diego-freeway-shows-humans-were-in-america-100000-years-earlier-than-we-thought/?fbclid=IwAR3dCSBWq5hNQHMWTiVLoiiJVDZje9rpICQ2Cum3lzXRSVHDuH0PgMWyp-E">San Diego Tool Use 130,000 years ago</a></b><br />
<br />
<b><a href="http://gizmodo.com/ancient-stone-tools-hint-at-the-real-paleo-diet-1784990084">Ancient Stone Tools Hint at the Real Paleo Diet 126,000 to 781,000 Years Ago</a></b><br />
<div> </div>01-26-21 <b><a href="https://www.sciencenews.org/article/homo-erectus-not-humans-invented-barbed-bone-point-tool" target="_blank">Bone points may NOT have been invented by humans being up to 1.7 million years old</a></b></div><div><b><br /></b></div><div><b>02-03-21 <a href="https://www.smithsonianmag.com/science-nature/essential-timeline-understanding-evolution-homo-sapiens-180976807/" target="_blank">AN EVOLUTIONARY TIMELINE OF HOMO SAPIENS</a><br /></b> <br /><div><b><a href="https://singularityhub.com/2022/01/07/how-a-handful-of-prehistoric-geniuses-launched-humanitys-technological-revolution/" target="_blank">The spear which was developed about 500,000 years ago</a></b> is also a clear example of tool use. So far we've seen no other species accomplish this trick, though a weapon may become a possible learned behavior for some other primates as other tools have been. Is it just a matter of time?</div><div><br /></div><div>10-22-20 <b><a href="https://scitechdaily.com/defining-moment-in-human-evolution-turbulent-era-sparked-leap-in-human-behavior-technology-320000-years-ago/" target="_blank">Defining Moment in Human Evolution: Turbulent Era Sparked Leap in Human </a></b></div><div><b><a href="https://scitechdaily.com/defining-moment-in-human-evolution-turbulent-era-sparked-leap-in-human-behavior-technology-320000-years-ago/" target="_blank">Behavior, Technology 320,000 Years Ago</a></b></div><div><br /></div><div><b><a href="http://news.sciencemag.org/sciencenow/2013/05/when-did-humans-begin-hurling-sp.html?ref=hp">When Did Humans Begin Hurling Spears? - 90,000 Years Ago?</a></b></div>
<br />
<b><a href="https://www.washingtonpost.com/news/morning-mix/wp/2016/05/11/australian-researchers-say-theyve-found-the-worlds-oldest-hatchet/">Australian researchers say they’ve found the world’s oldest hatchet</a></b><br />
<br />
<b><a href="http://www.npr.org/sections/thetwo-way/2016/10/19/498421284/those-ancient-stone-tools-did-humans-make-them-or-was-it-really-monkeys">Monkey or Man?</a></b><br />
<br />
11-04-16 <a href="http://www.forbes.com/sites/shaenamontanari/2016/11/04/a-49000-year-old-human-settlement-has-been-discovered-in-australia/#5f63cc31502d" style="font-weight: bold;">49,000 Year Old Human Settlement in Australia</a><br />
<br />
01-31-18<b><a href="https://www.sciencenews.org/article/sharp-stones-found-india-signal-surprisingly-early-toolmaking-advances"> Sharp stones found in India signal surprisingly early toolmaking advances</a></b><br />
<br /><div><b><a href="https://www.washingtonpost.com/news/animalia/wp/2016/07/11/in-brazil-scientists-unearth-a-trove-of-ancient-stone-tools-used-by-monkeys/">700-year-old Stone Tools - Used by Monkeys</a></b></div><div><br /></div><div>02-18-21 <b><a href="https://www.sciencenews.org/article/cricket-leaf-song-speakers-male-mating-call-competition" target="_blank">Crickets Use Tools?</a></b><a href="#"></a><br /><br />10-21-23 <b><a href="https://www.insider.com/archeologists-500000-year-old-wood-structures-found-zambia-2023-9" target="_blank">500,000-year-old pieces of wood discovered</a></b></div><div><br /></div></div><div><br /></div><div><br />
<span style="font-size: large;"><b>Trade</b></span><br />
<br />
Clear evidence of trade between distant (and separate) tribes would definitely set man apart from other primates:<br />
<br />
<a href="http://www.latimes.com/science/sciencenow/la-sci-sn-humans-turning-point-20180315-story.html">Evolve or die: Why our human ancestors learned to be social more than 320,000 years ago</a><br />
<br />10-12-21 <b><a href="https://www.sciencenews.org/article/dog-dna-ancient-trade-network-arctic-siberia-europe-near-east?utm_source=email&utm_medium=email&utm_campaign=latest-newsletter-v2&utm_source=Latest_Headlines&utm_medium=email&utm_campaign=Latest_Headlines" target="_blank">Dog DNA reveals ancient trade network connecting the Arctic to the outside world </a></b></div><div><b><span style="font-size: large;"><br /></span></b></div><div>
<b><span style="font-size: large;">Culture</span></b><br />
<br />
In response, the transfer of culture became the new human benchmark, but the ability to transfer new knowledge from one generation to another has also been demonstrated by chimps...<br />
<br />
<b><a href="http://news.nationalgeographic.com/news/2013/13/130425-humpback-whale-culture-behavior-science-animals/">And WHALES</a></b><br />
<br />
Then there was self-awareness, whose claimed human uniqueness has also been disproved, as Darwin's original mirror observations hinted:<br />
<br />
<b><a href="http://en.wikipedia.org/wiki/Mirror_test">Mirror Test</a></b><br />
<br />
<b><a href="https://www.quantamagazine.org/a-self-aware-fish-raises-doubts-about-a-cognitive-test-20181212/">Fish Passes the Mirror Test</a></b><br />
<div>
<br />
And a nice test of abstract thinking is meta-cognition:<br />
<br />
<b><a href="http://www.sciencedaily.com/releases/2013/04/130403141442.htm">Chimps: Ability to 'Think About Thinking' Not Limited to Humans</a></b><br />
<br />
And have episodic memory:<br />
<br />
<a href="http://www.bbc.co.uk/news/science-environment-23330813" style="font-weight: bold;">Chimpanzees and orangutans remember distant past events</a><br />
<br />
<div>
How about 500,000 year-old art? But which species made it? :<br />
<br />
<b><a href="http://news.nationalgeographic.com/news/2014/12/141203-mussel-shell-oldest-art/">Art on the half-shell</a></b><br />
<br />
176,000 year old ritual?<br />
<br />
<b><a href="http://www.theatlantic.com/science/archive/2016/05/the-astonishing-age-of-a-neanderthal-cave-construction-site/484070/?platform=hootsuite">Bruniquel Cave</a></b><br />
<br />
<b><a href="https://www.sciencenews.org/article/south-african-cave-stone-may-bear-worlds-oldest-drawing?utm_source=editorspicks091618&utm_medium=email&utm_campaign=Editors_Picks">Blombos Cave has 73,000 year old art?</a></b><br />
<br />
<br />
<b><span style="font-size: large;">Language?</span></b></div>
<br />
<b><a href="http://blog.longnow.org/02013/07/30/language-may-be-much-older-than-previously-thought/">Next to evolve was language.</a></b> The chimp Washoe laid that one to rest in the 1960s. And then there's Koko the gorilla, with a vocabulary of about 1,000 signs and comprehension of some 2,000 spoken words; she tested at an IQ of about 80. Bonobos also have <a href="http://www.wired.com/2014/07/chimpanzee-bonobo-gestures/"><b>gesture language</b></a> and now respond to spoken language with keyboard feedback. It may be simple, but it's language. And even more languages are being discovered:<br />
<br />
<b><a href="http://www.cbc.ca/news/technology/story/2013/06/21/science-prairie-dog-language-decoded.html">Prairie dogs' language decoded by scientists</a></b><br />
<br /><b><span style="font-size: large;"><br /></span></b> <b><span style="font-size: large;">Fire?</span></b><br />
<br />
The most visible and vivid tool of man has been fire. The control of fire allowed our gut to shorten by about a yard as we began to cook our food and digestion improved. Human resistance to air pollution also emerged over the last million years, an indication that we lived with fire during that time.<br />
<br />
Control of fire wasn't just tool use, it was the most exquisite form of tool use. The trick was getting close enough to use the flame without getting burned, and then, of course, not letting the fire go out. How many thousands of our ancestors played with fire before we learned to pass on these two tricks? And was this the brain and thumbs at work? Fire was the turning point. The most concise summary of why comes from a New Yorker article, "<b><a href="https://www.newyorker.com/magazine/2017/09/18/the-case-against-civilization">The Case Against Civilization</a></b>" (which I disagree with overall):<br />
<br />
"The earliest, oldest strata of the caves (in Africa) contain whole skeletons of carnivores and many chewed-up bone fragments of the things they were eating, including us. Then comes the layer from when we discovered fire, and ownership of the caves switches: the human skeletons are whole, and the carnivores are bone fragments. Fire is the difference between eating lunch and being lunch."<br />
<br />
We know of no other primate who developed the independent use of fire, (though some Bonobos have now been trained to do so with a lighter, and even use water to put it out). Man's sustained use of fire is estimated to have begun sometime between 1.5 million and 400,000 years ago:<br />
<br />
<b><span style="color: #0b5394;"><a href="http://www.slate.com/articles/health_and_science/human_evolution/2012/10/who_invented_fire_when_did_people_start_cooking_.html">Who Mastered Fire?</a></span></b><br />
<br />
<a href="http://abcnews.go.com/Technology/early-humans-cooking-food-million-years-ago/story?id=16080804#.T4IyWe1rFDI">Were Early Humans Cooking Their Food a Million Years Ago?</a><br />
<div>
<br />
Still, isn't the difference between us and other primates simply a matter of DEGREE in thinking and manipulating our environment? Scripts and tools are certainly learned and used effectively by other species. But our forebrains allowed for abstraction, delayed gratification and far more complex simulations, as demonstrated by the wide range of different human behaviors. So is our main difference from other primates the complexity of behaviors created by individualism and hyper-specialization?<br />
<br />
Or maybe not:<br />
<br />
<b><a href="https://neurosciencenews.com/fire-acquisition-evolution-15129/">Neanderthals Light it Up</a></b><br />
<br />
<b><span style="font-size: large;">Canned Food?</span></b><br />
<div>
<br /></div>
<div>
<b><a href="https://www.ancient-origins.net/news-evolution-human-origins/qesem-cave-0012718">New Shocking Clues Into Human Origins From Qesem Cave - 420,000 Years Ago</a></b></div>
</div>
<div>
<br />
<b><span style="font-size: large;">Burial?</span></b></div><div><b><span style="font-size: large;"><br /></span></b></div>11-13-23 <b><a href="https://www.sciencealert.com/the-oldest-known-burial-site-in-the-world-wasnt-made-by-our-species" target="_blank">300,000 Year Ago the Oldest Known Burial Site in The World</a></b></div><div><br /></div><div>05-06-21<b> <a href="https://www.sciencenews.org/article/africa-oldest-known-human-burial-child-grave-cave?utm_source=email&utm_medium=email&utm_campaign=latest-newsletter-v2&utm_source=Latest_Headlines&utm_medium=email&utm_campaign=Latest_Headlines" target="_blank">A child's 78,000-year-old grave</a></b><div><b><span style="font-size: large;"><br /></span></b></div><div><b><span style="font-size: large;">Clothes Make the Man?</span></b><br /><br />09-17-21 <b><a href="https://www.sciencealert.com/strong-evidence-found-for-the-manufacture-of-clothing-as-far-back-as-120-000-years">Discovery Suggests Humans Were Already Manufacturing Clothes 120,000 Years Ago</a></b><br /></div><div><br /></div><div>10-01-23 <a href="https://www.sciencealert.com/ancient-tracks-reveal-oldest-evidence-of-footwear-ever-found" target="_blank"><b>Ancient Tracks Reveal Oldest Evidence of Footwear Ever Found 110,000 years ago?</b></a></div><div><br /></div><div><b><span style="font-size: large;">Out of Africa</span></b></div><div>
<br /></div>
Whatever makes us different was probably well established by 60,000 (or 100,000?) years ago, as that's when humans became successful enough to spread from Africa to the rest of the world in our anatomically modern form. Was it a combination of language, hunting methods, tools, spears, and fire? Or was it some kind of proto-agriculture for which we've yet to find evidence?<br />
<br />
Blombos Cave contained scratches on ocher objects from 75,000 to 100,000 years ago.<br />
<br />
<b><a href="http://www.newscientist.com/article/mg22329813.000-human-exodus-may-have-reached-china-100000-years-ago.html#.U-o8NWOupCo">Left 100,000 Years Ago?</a></b><br />
<br />
<b><a href="https://phys.org/news/2017-10-longer-home-key-stone-age.html">Or Stayed 60,000 Years Ago?</a></b><br />
<br />
01-25-18 <b><a href="http://www.businessinsider.com/human-brains-may-only-be-40000-years-old-scientists-say-2018-1">The modern human brain may only be 40,000 years old</a></b><br />
<br />
06-16-20 <b><a href="https://www.sciencenews.org/article/clues-earliest-known-bow-arrow-hunting-outside-africa-found?utm_source=email&utm_medium=email&utm_campaign=latest-newsletter-v2&utm_source=Latest_Headlines&utm_medium=email&utm_campaign=Latest_Headlines">Earliest known bow-and-arrow hunting outside Africa 48,000 years ago</a></b><br />
<br />
<b><span style="font-size: large;">Music, Art and Property?</span></b><br />
<br />
Border Cave takes some level of symbolic culture and the ownership of property back to 44,000 years. The Venus of Hohle Fels in Germany is clearly art from 35,000 years ago.<br />
<div>
<br /></div>
<div>
<b><span style="color: #0b5394;"><a href="http://www.latimes.com/news/science/sciencenow/la-sci-sn-modern-culture-africa-20120730,0,4412702.story?track=rss">Border Cave</a></span></b><br />
<br />
Could "owning things" be that line between us and chimps? This is one of the ideas put forth in <b><span style="color: #0b5394;"><a href="http://suddendisruption.blogspot.com/search/label/Sex%20at%20Dawn">Sex at Dawn</a></span></b>. Maybe Christopher Ryan is on to something. Will this mystery lead us back to ourselves? In any case, ten to fifty thousand years ago was an exciting time for man.</div><br /><br /></div><div><b><a href="https://www.sciencealert.com/more-evidence-found-for-sophisticated-symbolic-behavior-in-neanderthals" target="_blank">Beautiful Bone Carving From 51,000 Years Ago Is Changing Our View of Neanderthals</a></b></div><div> <div>
<a href="https://www.theguardian.com/science/2018/nov/07/worlds-oldest-figurative-painting-discovered-in-borneo-cave">40,000 Year Old Cave Painting</a><br />
<br />
<div>
</div>
<a href="http://www.boston.com/news/health/articles/2009/06/24/archaeologists_unearth_oldest_musical_intstruments_ever_found/">Archaeologists Unearth 35,000 Year Old Musical Instrument</a></div>
<br />
<b><span style="color: #0b5394;"><a href="http://www.newscientist.com/blogs/shortsharpscience/2013/01/worlds-oldest-portrait.html">World's Oldest Portrait - Symbolic Abstraction </a></span></b><b><span style="color: #0b5394;"><a href="http://www.newscientist.com/blogs/shortsharpscience/2013/01/worlds-oldest-portrait.html">26,000 Years Ago</a></span></b><br />
<div>
<br /><b><a href="https://en.wikipedia.org/wiki/Venus_of_Willendorf" target="_blank">Venus of Willendorf - 25000 years ago</a></b></div><div><br /></div><div>11-24-23 <b><a href="https://www.blogger.com/blog/post/edit/23742979/358795891283937531" target="_blank">Is the Gunung Padang mound 25,000 years old?</a></b></div><div><br /></div><div>Not all hunter-gatherers moved around. How could they have carried all these pots?</div><div>
<br />
<b><a href="http://www.npr.org/blogs/13.7/2013/04/18/177748920/what-15-000-years-of-cooking-fish-tells-us-about-humanity">What 15,000 Years Of Cooking Fish Tells Us About Humanity</a></b><br />
<br />
02-09-20 <b><a href="https://www.rt.com/news/480396-ancient-cave-carvings-discovered-spain/?utm_source=rss&utm_medium=rss&utm_campaign=RSS">15,000 Year Old Catalan Altamira cave carvings</a></b><br />
<br />01-14-20 <b><a href="https://www.scmp.com/news/asia/southeast-asia/article/3117683/indonesian-cave-painting-life-size-pig-believed-be-over" target="_blank">Pig Painting is 45,000 years old</a></b></div><div><br />
<br />
<b><span style="font-size: large;">Agriculture?</span></b><br />
<br />
The key to real civilization seems to be the domestication of plants and animals - agriculture. It's often described in terms of specialization and our ability to delay gratification until the resource matures (wheat ripening, calves growing into cows, eggs hatching into chickens).</div><div><br /></div><div>10-01-21 <b><a href="https://www.livescience.com/ancient-cassowary-rearing?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Livesciencecom+%28LiveScience.com+Science+Headline+Feed%29">People were raising cassowaries 18,000 years ago</a></b><br />
<br />
This may be the key to domestication 14,000 years ago:<br />
<br />
<b><a href="http://news.nationalgeographic.com/news/2013/03/130302-dog-domestic-evolution-science-wolf-wolves-human/">We Didn't Domesticate Dogs. They Domesticated Us.</a></b><br />
<br />
<b><a href="http://www.theguardian.com/science/2015/mar/01/hunting-with-wolves-humans-conquered-the-world-neanderthal-evolution?utm_source=digg&utm_medium=email">How hunting with wolves helped humans outsmart the Neanderthals.</a></b><br />
<br />
<b><a href="http://www.sciencemag.org/news/2018/07/oven-was-used-make-bread-thousands-years-agriculture">14,000-Year-Old Bread</a></b><br />
<br />
Another line blurred:<br />
<br />
<b><a href="http://www.youtube.com/watch?v=U2lSZPTa3ho&feature=youtu.be">Baboons Kidnap and Raise Feral Dogs as Pets</a></b><br />
<br />
Even the date of the first settlements is moving backward and becoming blurred. In school I was taught that civilization started about 5,000 years ago. Then it was 7,000 years. Then 10,000. And now:<br />
<br />
<b style="font-family: "times new roman";"><span style="font-size: large;">Shelter?</span></b><br />
<br />
Aside from dug burrows and a few other minor exceptions, no other species builds shelter:<br />
<br />
<b><a href="http://www.telegraph.co.uk/news/uknews/7937240/Oldest-house-in-Britain-discovered-to-be-11500-years-old.html">Oldest house in Britain discovered to be 11,500 years old</a></b><br />
<b><a href="http://www.anonymousmags.com/7401-2/">Stone Building in Russia</a></b><br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh6YJLWYouX6_M2FSFXLN0pQxYfaXDNB86T5zAYIa72XnrhpQOan1vGaVHvmd3lJYqxrMz44u-HxfTJldLFPw9gjsTpTiod43FYoRbSZUpGUCLJUKmcx6upthLiGZGgqEAFjYbKAQ/s1600/Gobekli+Tepe.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="440" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh6YJLWYouX6_M2FSFXLN0pQxYfaXDNB86T5zAYIa72XnrhpQOan1vGaVHvmd3lJYqxrMz44u-HxfTJldLFPw9gjsTpTiod43FYoRbSZUpGUCLJUKmcx6upthLiGZGgqEAFjYbKAQ/s1600/Gobekli+Tepe.jpg" width="640" /></a></div>
<br />
<br />
<b><span style="color: #0b5394;"><a href="https://www.youtube.com/watch?v=TZ0ViMVxKZA&feature=player_embedded#!">12,000 Year-Old Gobekli Tepe</a></span></b><br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh_-Jk5_OUXRQNwni-RplGtMY-xZxWfBWZrz6RAUqYBFI3U-zOLQe2WXXqDoG7ZwU0mmU0SzDK-374sdzQoPoyaZMjXROh4T1vztOOthy8IABqR8z_XsgYfjpa34EWDVHWcdDADQQ/s1600/1280px-Gobekli_Tepe_2.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1600" data-original-width="1063" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh_-Jk5_OUXRQNwni-RplGtMY-xZxWfBWZrz6RAUqYBFI3U-zOLQe2WXXqDoG7ZwU0mmU0SzDK-374sdzQoPoyaZMjXROh4T1vztOOthy8IABqR8z_XsgYfjpa34EWDVHWcdDADQQ/s320/1280px-Gobekli_Tepe_2.jpg" width="212" /></a></div>
<br />
<b><a href="http://www.ancient-origins.net/opinion/secret-gobekli-tepe-cosmic-equinox-and-sacred-marriage-part-1-002861#ixzz3WSUrXlCR">Gobekli Tepe Update 04-04-15</a></b><br />
<br />
(Wikipedia dates Gobekli Tepe to 9559 BCE, which projects to roughly 11,500 years old.) That's still some impressive stonework, which must have taken a few thousand years to develop. Twenty thousand years seems like a safer number for now. We just need to find more sites and map the progress, but we're definitely blurring back into our ancestors. When exactly did we become "human"?<br />
<div>
<br />
<div>
As a side note, dogs have been with us for about 14,000 years according to bone evidence.<br />
<br />
And here is an even broader overview taking evolution into our culture - a lot of good ideas here:<br />
<div>
<br />
<b><span style="color: #0b5394;"><a href="http://www.edge.org/conversation/how-culture-drove-human-evolution">How Culture Drove Human Evolution - Joseph Henrich</a></span></b><br />
<br /></div>
<div>
This next post strays a bit far from the origins of man, but contains so many useful observations about humanity:<br />
<br />
<b><span style="color: #0b5394;"><a href="http://www.orionmagazine.org/index.php/articles/article/7146">State of the Species - Charles C. Mann</a></span></b><br />
<br />
Maybe the missing mechanism is EPIgenetics working with genetics. It's an example of how evolution can go well beyond sexual preference:<br />
<br />
<b><span style="color: #0b5394;"><a href="http://scientists%20claim%20that%20homosexuality%20is%20not%20genetic%20%E2%80%94%20but%20it%20arises%20in%20the%20womb/">Scientists claim that homosexuality is not genetic — but it arises in the womb</a></span></b><br />
<br />
Here is a fun idea about how the n-grams of our cultural evolution are reflected in our language:<br />
<br />
<b><span style="color: #0b5394;"><a href="http://www.matjazperc.com/ngrams/evolution.html">Evolution of the most common English words and phrases over the centuries</a> 12-12-12</span></b></div>
</div>
</div>
<br />
<b><span style="color: #0b5394;"><a href="http://www.natureworldnews.com/articles/428/20121223/worlds-oldest-wooden-water-wells-discovered.htm">World's Oldest Wooden Water Wells Discovered From About 5000 Years Ago</a> </span></b>12-24-12<br />
<br />
Is a long childhood the key difference? Maybe:<br />
<br />
<b><span style="color: #0b5394;"><a href="http://www.slate.com/articles/health_and_science/science/2013/01/evolution_of_childhood_prolonged_development_helped_homo_sapiens_succeed.single.html">Why Are We the Last Apes Standing?</a></span></b><br />
<br />
Believe it or not, this was published long after I published this post (which, like primates, is still evolving). Mark Changizi seems to agree that we differ only by degree ("quantitatively so, not qualitatively"). Interesting post. I need to get his books on my list:<br />
<br />
<a href="http://blogs.discovermagazine.com/crux/2011/12/07/bursting-the-bubble-of-human-intelligence/">Bursting the Bubble of Human Intelligence</a> 04-09-13<br />
<div>
<br />
It seems this puzzle is filling in literally day by day. Stay tuned for more updates.<br />
<br />
It appears we must guard against cultural imperialism in our acquisition of knowledge. And does human behavior vary to try all possible combinations in the same way a species replicates to fill the physical range of its environment?<br />
<div>
<br />
<b><a href="http://www.psmag.com/magazines/pacific-standard-cover-story/joe-henrich-weird-ultimatum-game-shaking-up-psychology-economics-53135/">Why Americans Are the Weirdest People in the World</a> </b>02-25-13<br />
<br />
<br />
<b><a href="https://www.livescience.com/65956-largest-neolithic-settlement-in-israel.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Livesciencecom+%28LiveScience.com+Science+Headline+Feed%29">9000 Year Old Neolithic Settlement in Israel</a></b> - 07-17-19<br /></div><div><br /></div><div>
<div>
<br />
<b><span style="font-size: large;">The Wheel</span></b><br />
<br />
02-19-16 The wheel is certainly a definitive test of humanity. Well, at least so far:<br />
<br />
<b><a href="http://www.sloveniatimes.com/world-s-oldest-wheel-home-after-decade-under-restoration">Oldest Wheel - 5,200 Years Old</a></b></div>
<div>
</div>
<br />
<div>
<b><a href="https://www.sciencealert.com/this-ancient-inscription-is-the-oldest-sentence-in-the-worlds-first-alphabet" target="_blank">First sentence based on an alphabet - 4700 years old</a></b><br />
<b><span style="font-size: large;">Chewing Gum?</span></b><br />
<b><a href="https://www.cnn.com/2019/12/17/world/ancient-chewing-gum-genome-scn/index.html">Chewing gum - 5700 years old</a></b><br />
<br /><br /></div><div><b>5500 Year Old Sumer</b> - city with <span face="-apple-system, system-ui, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen-Sans, Ubuntu, Cantarell, "Helvetica Neue", sans-serif" style="background-color: white; color: #282829; font-size: 15px;">writing and a stratified society (rulers and priests, working class and peasants).</span></div><div><span face="-apple-system, system-ui, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen-Sans, Ubuntu, Cantarell, "Helvetica Neue", sans-serif" style="background-color: white; color: #282829; font-size: 15px;"><br /></span></div><div><span face="-apple-system, system-ui, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen-Sans, Ubuntu, Cantarell, "Helvetica Neue", sans-serif" style="background-color: white; color: #282829; font-size: 15px;"><b>5500 Year Old Caral and Norte Chico</b> - civilization in modern-day Peru without </span><span face="-apple-system, system-ui, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen-Sans, Ubuntu, Cantarell, "Helvetica Neue", sans-serif" style="background-color: white; color: #282829; font-size: 15px;">stratified society</span><span face="-apple-system, system-ui, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen-Sans, Ubuntu, Cantarell, "Helvetica Neue", sans-serif" style="background-color: white; color: #282829; font-size: 15px;">.</span></div><div><br /></div>12-27-18 <b><a href="https://www.youtube.com/watch?time_continue=1&v=DZv8VyIQ7YU">Seven Million Years of Human Evolution</a></b></div>
</div>
</div>
</div>
<div>
<br /></div>
<br />09-09-20 <b><a href="https://theconversation.com/when-did-we-become-fully-human-what-fossils-and-dna-tell-us-about-the-evolution-of-modern-intelligence-143717">When did we become fully human?</a></b><br />
<div><br /></div></div></div><div>And the beat goes on...</div><div><br /></div><div>02-06-22 <b><a href="https://www.livescience.com/15689-evolution-human-special-species.html" target="_blank">Top 10 things that make humans special</a></b></div><div><br /></div><div><br /></div><div><span style="font-size: large;"><b>Recursion?</b></span></div><div><span style="font-size: large;"><b><br /></b></span></div><div><div>11-03-22 <b><a href="https://www.scientificamerican.com/article/crows-perform-yet-another-skill-once-thought-distinctively-human/" target="_blank">Recursion in Crows</a></b></div><div><br /></div><div><br /></div></div><div><span style="font-size: large;"><b><br /></b></span></div>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-1145747490245241912019-11-20T13:57:00.001-08:002021-03-15T15:50:40.435-07:00What Ever Happened with H2S Induced Hibernation?11-20-19 <a href="https://www.newscientist.com/article/2224004-exclusive-humans-placed-in-suspended-animation-for-the-first-time/"><b>Humans Placed in Suspended Animation for the First Time</b></a><br />
<br />
<br />
<span style="font-family: "courier new"; font-size: x-large;"><b>What Ever Happened with H2S Induced Hibernation?</b></span><br />
<span style="font-family: "courier new";"><br /></span>
<span class="Apple-style-span" style="font-family: "courier new";">I wrote this post on April 22, 2006</span><br />
<span style="font-family: "courier new";"><br />
</span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjuAKeSYA0MGhPeB116blrPrmggtBTV4ZSHfChf0xMWfli0ET2CMSnU1tNPTeyMbulPO8vruOGT8yD5LVeCnuiP5ByRvM9IKt_6FVA_w0du-8SN40GftR6630cp9f05QhC0HkHjlA/s1600/20170825_190117.jpg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="901" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjuAKeSYA0MGhPeB116blrPrmggtBTV4ZSHfChf0xMWfli0ET2CMSnU1tNPTeyMbulPO8vruOGT8yD5LVeCnuiP5ByRvM9IKt_6FVA_w0du-8SN40GftR6630cp9f05QhC0HkHjlA/s640/20170825_190117.jpg" width="640" /></a></div>
<span style="font-family: "courier new";"><br /></span>
<span style="font-family: "courier new";"><br /></span>
<span style="font-family: "courier new";">One year ago today, s</span><span style="font-family: "courier new";">omething extraordinary happened</span><span style="font-family: "courier new";">...</span><br />
<br />
<span style="font-family: "courier new";">
<a href="http://www.fhcrc.org/science/labs/roth/"><span style="color: #3333ff; font-weight: bold;">Mark Roth</span></a> at Fred Hutchinson Cancer Research Center in Seattle announced the astounding ability to induce hibernation in mice by having them breathe 80 parts per million (ppm) hydrogen sulfide</span> <span style="font-family: "courier new";">gas (H2S). Yes, that's the gas that smells like rotten eggs.</span><br />
<br />
<span style="font-family: "courier new";">Not only did these critters fall asleep for six hours, but their heart rate and respiration dropped by 92%, apparently replicating the effects of true hibernation. And their temperature dropped to 2 degrees </span><span style="font-family: "courier new";">C above ambient temperature. They in effect became cold-blooded.</span><br />
<br />
<span style="font-family: "courier new";">It should also be noted that when the gas was removed, the mice awoke with no apparent ill effects.</span> <span style="font-family: "courier new";">The critters could still run their maze in a normal fashion.</span> <span style="font-family: "courier new";"><br />
<br />
There are hints that H2S Induced Hibernation might be a natural defense mechanism or at least a normal biological process. It appears this H2S gas is produced by the body under certain</span> <span style="font-family: "courier new";">conditions and may be the key to normal hibernation. This may also be the cause of "Cold Water Shock Reflex" in which those who have "drowned" in cold water come back to life.</span><br />
<br />
<span style="font-family: "courier new";">At 80 ppm, H2S cannot simply be replacing the O2 in the blood, which exists at about 210,000 ppm in typical air - the H2S dose is less than 0.04% of that. It seems that H2S acts more like a hormone, causing ALL cells in the body to slow down at</span> <span style="font-family: "courier new";">the same time. Is H2S the body's way of adjusting the thermostat?</span> <span style="font-family: "courier new";"><br />
<br />
Hold on! I'm way out of my element here. I'm not qualified to do biology. I'm not even qualified to write about it.</span> <span style="font-family: "courier new";"><br />
<br />
But I DO consider this ASTOUNDING news! And indeed the world reported it. Well, at least in a tepid way (sorry about the pun). From the <a href="http://news.bbc.co.uk/1/hi/sci/tech/4469793.stm"><span style="color: #3333ff; font-weight: bold;">BBC</span></a> to the <a href="http://www.washingtonpost.com/wp-dyn/content/article/2005/04/21/AR2005042101262_pf.html"><span style="color: #3333ff; font-weight: bold;">Washington Post</span></a>, they did at least rehash Mark's original work. Even</span> <span style="font-family: "courier new";"><a href="http://en.wikipedia.org/wiki/Hydrogen_sulfide"><span style="color: #3333ff; font-weight: bold;">Wikipedia</span></a> added three paragraphs to the Hydrogen Sulfide (H2S) page. I was impressed with that.</span><br />
<br />
<span style="font-family: "courier new";">But THAT was it...</span><br />
<br />
<span style="font-family: "courier new";">I'm serious.</span><br />
<br />
<span style="font-family: "courier new";">Nothing more.</span><br />
<br />
<span style="font-family: "courier new";">No follow-up questions.</span><br />
<br />
<span style="font-family: "courier new";">No follow-up answers.</span><br />
<br />
<span style="font-family: "courier new";">No in-depth reporting.</span><br />
<br />
<span style="font-family: "courier new";">No detailed analysis.</span><br />
<br />
<span style="font-family: "courier new";">No flying out to Seattle.</span><br />
<br />
<span style="font-family: "courier new";">No camping on the lawn.</span> <span style="font-family: "courier new";"><br />
<br />
No helicopter shots.</span> <span style="font-family: "courier new";"><br />
<br />
No checking tax returns.</span><br />
<br />
<span style="font-family: "courier new";">Hell, Tom Cruise jumps up and down on a couch and the media follows him around for weeks! Where is the coverage for the stuff that REALLY counts? Oh well. I would wait. There was sure to be</span> <span style="font-family: "courier new";">more news on the topic in a short time. So I set my Google news reader and waited...</span><br />
<br />
<span style="font-family: "courier new";">And waited...</span><br />
<br />
<span style="font-family: "courier new";">And waited...</span><br />
<br />
<span style="font-family: "courier new";">And I'm still waiting.</span><br />
<br />
<span style="font-family: "courier new";">It's been one year. Other than some comments from an aging blog and one think tank, there has been nothing at all. Nothing! Am I way off base or is this NOT a Nobel class discovery?</span><br />
<br />
<span style="font-family: "courier new";">Where's the follow-up from Mark Roth?</span> <span style="font-family: "courier new";"><br />
<br />
Where's the H2S Induced Hibernation blog?</span><br />
<br />
<span style="font-family: "courier new";">Where are the frat boy posts about their flatulent experiments?</span> <span style="font-family: "courier new";"><br />
<br />
Where's the Flatliner crew?</span><br />
<br />
<span style="font-family: "courier new";">Where's Kiefer Sutherland when we need him?</span><br />
<br />
<span style="font-family: "courier new";">Where are all the science fiction plots?</span><br />
<br />
<span style="font-family: "courier new";">When I read the news release last year, I thought follow-up would be like the coverage for Cold Fusion a few years ago - lots of people trying to reproduce the results. Maybe we would even get</span> <span style="font-family: "courier new";">some quick test with humans.</span><br />
<span style="font-family: "courier new";"><br />
But no...</span> <span style="font-family: "courier new";"><br />
<br />
Nothing.</span><br />
<br />
<span style="font-family: "courier new";">Nada.</span><br />
<br />
<span style="font-family: "courier new";">Zilch.</span> <span style="font-family: "courier new";"><br />
<br />
What's a geek to do? There's only one thing. Ask the questions that SHOULD have been asked a year ago. So here goes.</span><br />
<br />
<span style="font-family: "courier new";">Does this Roth effect work longer than six hours?</span><br />
<br />
<span style="font-family: "courier new";">Does it work for days?</span> <span style="font-family: "courier new";"><br />
<br />
Does it work for weeks?</span> <span style="font-family: "courier new";"><br />
<br />
Does it work for months?</span> <span style="font-family: "courier new";"><br />
<br />
Does it work on other larger mammals?</span><br />
<br />
<span style="font-family: "courier new";">Does it work on humans?</span> <span style="font-family: "courier new";"><br />
<br />
Any obvious side effects?</span><br />
<br />
<span style="font-family: "courier new";">Any long term side effects?</span><br />
<br />
<span style="font-family: "courier new";">How long can someone stay under without ill effects?</span><br />
<br />
<span style="font-family: "courier new";">Does this low-level metabolism consume fat like it does in bears?</span> <span style="font-family: "courier new";"><br />
<br />
Does muscle tone also atrophy?</span> <span style="font-family: "courier new";"><br />
<br />
Does this low-level metabolism extend life?</span><br />
<br />
<span style="font-family: "courier new";">Is 80 ppm a threshold or is there a proportional effect at 40 ppm? 20 ppm?</span><br />
<br />
<span style="font-family: "courier new";">What happens at 160 ppm? Is the sleep deeper? (Yes, I know H2S is deadly at higher concentrations, but so is table salt.)</span> <span style="font-family: "courier new";"><br />
<br />
Is this truly a natural feature of mammals?</span> <span style="font-family: "courier new";">If H2S is produced internally, can the effect be induced by meditation? If so, how does one exit the state?</span><br />
<br />
<span style="font-family: "courier new";">I could go on and on but you get the idea. To get the answers to these and other questions, first they have to be asked. And then asked by the right people. That's what this blog post is all about. We need the right people asking these questions - not me.</span> <span style="font-family: "courier new";"><br />
<br />
There's a saying in the world of finance: "Capital finds its highest and best use". This seems to take a little longer with science. It also takes imagination, speculation, and a whole lot of</span> <span style="font-family: "courier new";">promoting.</span> <span style="font-family: "courier new";"><br />
<br />
Promotion is important. America was not named for Columbus. America was named for a navigator and blogger of the fifteenth century - Amerigo Vespucci. His letters were published widely on his</span><span style="font-family: "courier new";"> return from the New World. He didn't discover anything, but promoted what he found. The name stuck.</span> <span style="font-family: "courier new";"><br />
<br />
That's why H2S Induced Hibernation now needs to be all about blogs, Digg and Wikipedia. It's up to us. It's time for some speculation. Maybe even some speculative fiction. We need serious talent</span> <span style="font-family: "courier new";">applied to finding the answers to the above and other questions. More discussion may help.</span> <span style="font-family: "courier new";"><br />
<br />
Here are some ideas as to how H2S could be used. Maybe this will help move things along.</span> <span style="font-family: "courier new";"><br />
<br />
<span style="font-weight: bold;">Time in trauma care</span> - This one is obvious. With such low concentrations of H2S needed, a simple regulator mask in first aid kits might extend that "Critical Hour" to a "Critical Day" giving time to do a better</span> <span style="font-family: "courier new";">job with transport, evaluation, and treatment. It's easier to stop bleeding when the heart is only pumping eight times per minute. It's easier to keep cells alive when their demand for</span><span style="font-family: "courier new";"> resources has dropped by 92%.</span><br />
<br />
<span style="font-family: "courier new";"><span style="font-weight: bold;">Mine Disasters</span> - During the recent mine disaster in West Virginia, the miners only had air for one hour. Could this have been extended to 12 hours by adding a little H2S to those respirators?</span> <span style="font-family: "courier new";">Coal mine accidents are an even bigger problem in China with over 6,000 dead per year. Think of the lives that could be saved even if a small percentage had this advantage.</span> <span style="font-family: "courier new";"><br />
<br />
<span style="font-weight: bold;">Fire Escape</span> - Since most fire deaths are caused by smoke inhalation, many extra minutes could be gained with one of those new and improved masks from the coal mine. Check the first-aid kit. Is it there yet? Again, the lives saved would be in the thousands</span> <span style="font-family: "courier new";">world-wide.</span><span style="font-family: "courier new";"><br />
<br />
<span style="font-weight: bold;">Underwater Rescue</span> - Another good application for limited oxygen? And maybe a re-make of the movie Abyss? Lots of possibilities here.</span> <span style="font-family: "courier new";"><br />
<br />
<span style="font-weight: bold;">ALL incurable disease</span> - This is a no-brainer. Got a problem? Take a break for a while. Wake up to review the literature. Take another break. Repeat until cured.<br />
<br />
<span style="font-weight: bold;">Medical scheduling</span> - Waiting for an organ? Make</span><span style="font-family: "courier new";"> sure you have enough time. It's better than death.</span> <span style="font-family: "courier new";"><br />
<br />
<span style="font-weight: bold;">Military Use</span> - Lots of possibilities here, from trauma to transport. Here's where Kiefer Sutherland comes in with a new release of 24 Hours lived in 24 years. How's THAT for a challenge to</span><span style="font-family: "courier new";"> his premise?</span> <span style="font-family: "courier new";"><br />
<br />
<span style="font-weight: bold;">Sleep Efficiency</span> - How about all that time we waste sleeping? Might we extend our life by taking it deeper? Or maybe the opposite, and find out how to shorten sleep? Keep an open mind.</span> <span style="font-family: "courier new";"><br />
<br />
<span style="font-weight: bold;">Weight Loss</span> - this could be a biggie, both in terms of dollars and quality of life. Let's say you're not a fan of winter anyway. Why not do like the bears do? You could wake up ready for your</span><span style="font-family: "courier new";"> new spring swim suit.</span> <span style="font-family: "courier new";"><br />
<br />
<span style="font-weight: bold;">Capital Punishment</span> - This is a bit radical, but at least it's not a death sentence. And they aren't causing any problems in the meantime. In time we might even find a "cure" for murder.</span> <span style="font-family: "courier new";"><br />
<br />
<span style="font-weight: bold;">Pregnant Mothers</span> - This might at first seem radical too, but Mark Roth's page refers to "embryonic diapause, a pause in embryonic development found in about 70 species of mammals". It might be</span> <span style="font-family: "courier new";">useful one way or the other. Don't count it out.</span><br />
<br />
<span style="font-family: "courier new";"><span style="font-weight: bold;">Punishment</span> - What the hell. Let's put them ALL on ice as a cost reduction measure! We could count it as good time. Would it still be punishment? Fun to think about.</span> (note - after I wrote this I found one blog post at <a href="http://www.worldthinktank.net/wttbbs/"><span style="color: #3333ff; font-weight: bold;">World Think Tank</span></a> that talked about using H2S for prison riot control. Could we extend this to riot control in general?)<br />
<br />
<span style="font-family: "courier new";"><span style="font-weight: bold;">Athletes</span> - Since I'm getting radical, how about extending the performance window of our very best athletes? We could give them the option of waking up every four years in time to train for the</span> <span style="font-family: "courier new";">Olympics. The other option would simply be to let them "rest" off season.</span><br />
<br />
<span style="font-family: "courier new";"><span style="font-weight: bold;">Space Travel</span> - Yep. Classic application. Maybe we could finally do some. There are at the very least, some fresh movie plots here, or the chance to make them more realistic.</span><br />
<br />
<span style="font-family: "courier new";"><span style="font-weight: bold;">Time Travel</span> - This is of course relative and one direction. But how about sleeping a few weeks at a time and find yourself subjectively rushing forward into the future? It might be fun.</span> <span style="font-family: "courier new";"><br />
<br />
<span style="font-weight: bold;">Tivo for life</span> - This is an extension of the time travel idea - sort of fast forward when you want, live life when YOU want. Let's say you're a basketball fan but hate the rest of the year -</span><span style="font-family: "courier new";"> beep, beep, beep. Treat the boring parts of life like one big commercial. Live life on YOUR terms!</span> <span style="font-family: "courier new";"><br />
<br />
<span style="font-weight: bold;">Tivo for the heart</span> - Will H2S sleep dampen a heartache? I think Heinlein used this in "The Door Into Summer". Would it help? Who knows. If you've ever been there, anything's worth a try.</span><br />
<br />
<span style="font-family: "courier new";"><span style="font-weight: bold;">Tivo for the soul</span> - Could this be the ultimate form of meditation? Stay awake for only short slices of life and jump WAY into the future. Would it give you a different perspective? Would you</span> <span style="font-family: "courier new";">dream? Would it matter?</span><br />
<br />
<span style="font-family: "courier new";">Anyway, you get the idea. The point is, there are LOTS of possibilities not being effectively promoted. Feel free to add yours below. These examples are why it's so important to know...<br />
<br />
</span><span style="font-family: "courier new";">Is H2S Induced Hibernation useful?</span><br />
<br />
<span style="font-family: "courier new";">It's been a YEAR!</span><br />
<br />
<span style="font-family: "courier new";">Clue us in.</span><br />
<br />
<span style="font-family: "courier new";">Or is everyone, "No Longer Sleepless in Seattle" ?</span> <span style="font-family: "courier new";"><br />
<br />
BTW, amazing work Mark. Congratulations.</span> <span style="font-family: "courier new";"><br />
</span><br />
<span style="font-family: "courier new";"><br />
</span><br />
<b style="font-family: "courier new";"><span style="font-family: "courier new";"><span style="font-family: "courier new";">Sudden Disruption</span></span></b><br />
<div style="margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; margin: 0px;">
<div style="margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; margin: 0px;">
<div style="margin: 0px;">
<div style="font-family: "times new roman";">
<span style="font-family: "courier new";"><b><span style="font-family: "courier new";"><span style="font-family: "courier new";"><br /></span></span></b></span>
<span style="font-family: "courier new";"><b><span style="font-family: "courier new";"><span style="font-family: "courier new";"><br /></span></span></b></span>
<span style="font-family: "courier new";"><br /></span><span style="font-family: "courier new";"><span style="font-family: "courier new"; font-size: large;"><b>The latest:</b></span></span></div>
<div style="font-family: "times new roman";">
<span style="font-family: "courier new";"><br /></span><span style="font-family: "courier new";"><span style="font-family: "times new roman";">07-28-16 <a href="http://www.popsci.com/reanimators">HOW SCIENTISTS ARE BRINGING PEOPLE BACK FROM THE DEAD</a></span></span><br />
<span style="font-family: "courier new";"><b style="font-family: "Times New Roman";"><br /></b></span>
<span style="font-family: "courier new";"><b style="font-family: "Times New Roman";">01-20-16 <a href="https://www.washingtonpost.com/news/morning-mix/wp/2016/01/20/being-frozen-to-death-saved-this-mans-life-it-could-save-others-too/">Being frozen ‘to death’ saved this man’s life. It could save others, too.</a></b><span style="font-family: "times new roman";"> </span><span style="font-family: "times new roman";"><b><span style="font-family: "courier new";"><span style="font-family: "courier new";"><br /></span></span></b></span><span style="font-family: "times new roman";"></span><span style="font-family: "times new roman";"><b><span style="font-family: "courier new";"><span style="font-family: "courier new";"><br /></span></span></b></span>06-21-15 </span><span style="font-family: "courier new";"><a href="https://www.rug.nl/research/portal/en/publications/the-role-of-endogenous-h2s-production-during-hibernation-and-forced-hypothermia(d7d7de5d-34d8-4890-aa8b-3646d2a02a45).html">The role of endogenous H2S production during hibernation and forced hypothermia: towards safe cooling and rewarming in clinical practice</a></span></div>
<div style="font-family: "times new roman";">
<span style="font-family: "courier new";"><br /></span><span style="font-family: "courier new";">Apparently "Torpor" is the new handle for this technology:</span></div>
<div style="font-family: "times new roman";">
<span style="font-family: "courier new";"><br /></span><span style="font-family: "courier new";">10-03-14 </span><span color="rgba(0 , 0 , 0 , 0.8)" style="background-color: white; font-family: "roboto slab" , "times new roman" , serif; font-size: 14px; white-space: pre-wrap;"><b><a href="http://news.discovery.com/space/nasa-eyes-crew-deep-sleep-option-for-mission-to-mars-141003.htm">NASA Eyes Crew Deep Sleep Option for Mars Mission</a></b></span></div>
<div style="font-family: "times new roman";">
<span style="font-family: "courier new";"><br /></span><span style="font-family: "courier new";">04-21-14 </span><b><a href="http://www.telegraph.co.uk/news/worldnews/northamerica/usa/10777688/Survival-of-teenage-stowaway-on-five-hour-flight-to-Honolulu-is-medical-miracle-say-experts.html">Survival of teenage stowaway on five-hour flight to Honolulu is medical 'miracle'</a></b></div>
<div style="font-family: "times new roman";">
<span style="font-family: "courier new";"><br /></span><span style="font-family: "courier new";">03-27-14 </span><b><a href="http://www.theverge.com/2014/3/26/5551120/upmc-presbyterian-hospital-will-conduct-first-human-suspended-animation-trials">Doctors will place patients between life and death in suspended animation trials</a></b></div>
<div style="font-family: "times new roman";">
<br /></div>
<div style="font-family: "times new roman";">
<a href="http://singularityhub.com/2011/09/14/aint-no-science-fiction-suspended-animation-is-fda-approved-and-heading-to-clinical-trials/">10-07-13 - Ain't No Science Fiction, Suspended Animation Is FDA Approved and Heading to Clinical Trials</a><span style="font-family: "courier new";"><span style="font-family: "courier new";"><br /></span></span><span style="font-family: "courier new";">Mass extinction is a great reason to hibernate:</span></div>
<div style="font-family: "times new roman";">
<b style="font-family: &quot;courier new&quot;;"><br /></b><b style="font-family: &quot;courier new&quot;;">04-17-13 </b><b><a href="http://phys.org/news/2013-04-key-ingredient-mass-extinctions-boost.html">Key ingredient in mass extinctions could boost food, biofuel production</a></b></div>
<div style="font-family: "times new roman";">
<b style="font-family: "courier new";"><br /></b><b style="font-family: "courier new";"><span style="font-family: "courier new";"><b>02-20-12 - <a href="http://www.thelocal.se/39204/20120220/">Peter Skyllberg Snowed in for Two Months - Related?</a></b></span></b></div>
<div style="font-family: "times new roman";">
<span style="color: #0b5394; font-family: inherit;"><b><a href="http://discovermagazine.com/2007/may/suspended-animation"><span style="background-color: white; line-height: 17px;">Was Mitsutaka Uchikoshi</span> another example?</a></b></span><span style="font-family: "courier new";"><b><br /></b></span></div>
<div style="font-family: "times new roman";">
<span style="font-family: "courier new";"><b>09-30-10<span class="Apple-style-span" style="font-weight: normal;"> Considering the theme of my original post, here is yet another example of the media missing the story and working the "politically incorrect" angle. </span><br /><span class="Apple-style-span" style="font-weight: normal;"><br /></span><br /><span class="Apple-style-span" style="font-weight: normal;">In this case, documents were revealed from a Naval Surgeon in 1805. The headline is about "Bizarre naval experiments" and the focus is tobacco smoke and deliberate transmission of venereal disease, when the real story is quite probably the first documented case of suspended animation. </span><br /><br /><span class="Apple-style-span" style="font-weight: normal;">When will they ever "get it"?</span></b></span></div>
<div style="font-family: "times new roman";">
<span style="font-family: "courier new";"><b></b></span><br /></div>
<h1 style="color: #343434; font-family: "times new roman"; letter-spacing: -0.03em; line-height: 1.16em; margin: 0px; padding: 0px 0px 4px;">
<span style="font-family: "courier new";"><b><span class="Apple-style-span" style="font-size: medium;"><a href="http://www.telegraph.co.uk/news/newstopics/howaboutthat/8031653/Bizarre-naval-experiments-revealed.html">Bizarre naval experiments revealed</a></span></b></span></h1>
<div style="font-family: "times new roman";">
<span style="font-family: "courier new";"><b><br />04-21-10</b> Significant advancement and recognition for the concept of induced hibernation! </span></div>
<div style="font-family: "times new roman";">
<span style="font-family: "courier new";"><br /></span>
<span style="font-family: "courier new";">I revisit this topic each year. It appears there's been significant progress. Look through my list of uses below to understand why, and how important this discovery is. Or start with Mark Roth's latest TED video at the end of the post.</span></div>
<div style="font-family: "times new roman";">
<span class="Apple-style-span" style="font-family: "courier new";"><br /></span>
<span style="font-family: &quot;courier new&quot;;"><br /><span style="color: #3333ff; font-weight: bold;"><a href="http://www.wctv.tv/news/headlines/64387427.html">CNN Update - 10-15-09...</a></span></span></div>
<div style="font-family: "times new roman";">
<span style="font-family: "courier new";"><br /></span><span style="font-family: "courier new";">Another major update 02-18-10 - <b><span class="Apple-style-span" style="color: #0b5394;">Wired Interview</span></b></span></div>
<div style="font-family: "times new roman";">
<br /></div>
<div style="font-family: "times new roman"; margin: 0px;">
<span style="font-family: &quot;courier new&quot;;"><b><span class="Apple-style-span" style="color: #0b5394;"><a href="http://www.wired.com/epicenter/2010/02/mark-roth-on-mice-and-men/">Mark Roth on Mice and Men and Suspended Animation</a></span></b></span></div>
<div style="font-family: "times new roman"; margin: 0px;">
<span style="font-family: "courier new";"><br /></span></div>
<div style="font-family: "times new roman"; margin: 0px;">
<span style="font-family: "courier new";"><span class="Apple-style-span" style="font-family: "courier new";"><b><span class="Apple-style-span" style="color: #0b5394;"><a href="http://labs.fhcrc.org/roth/">Mark Roth Home Page</a></span></b></span></span><br />
<br />
<div style="display: inline; margin: 0px;">
<div style="display: inline; margin: 0px;">
<div style="display: inline;">
<span style="font-family: "courier new";"><b><span style="font-family: "courier new";"><span class="Apple-style-span" style="color: #0b5394;"><a href="http://www.ted.com/talks/mark_roth_suspended_animation.html">Mark Roth: Suspended animation is within our grasp 02-2010 </a></span></span></b></span></div>
</div>
</div>
<br />
<div style="margin: 0px;">
<div style="margin: 0px;">
<span style="font-family: "courier new";"><b><span style="font-family: "courier new";"><span class="Apple-style-span" style="color: #0b5394;"><a href="http://www.ikaria.com/">Mark Roth's company - Ikaria Holdings</a></span></span></b></span><br />
<span style="font-family: "courier new";"><b><br /></b></span><span style="font-family: "courier new";"><span style="color: #0b5394; font-family: "courier new";"><b><a href="http://www.benthamdirect.org/pages/content.php?CMC/2009/00000016/00000010/0009C.SGM">Potential Applications of Hydrogen Sulfide-Induced Suspended Animation</a></b></span></span><br />
<span style="font-family: "courier new";"><b><br /></b></span>
</div>
</div>
</div>
</div>
11-20-19 <a href="https://www.newscientist.com/article/2224004-exclusive-humans-placed-in-suspended-animation-for-the-first-time/"><b>Humans Placed in Suspended Animation for the First Time</b></a><br />
<br />
12-06-19 <b><a href="https://www.usatoday.com/story/news/world/2019/12/06/british-woman-survives-6-hour-cardiac-arrest-after-hiking-spain/4351431002/">Woman Survives Six-Hour Cardiac Arrest</a></b><br />
<br />
03-16-20 <a href="https://www.ncbi.nlm.nih.gov/pubmed/27314446?fbclid=IwAR1ztG1SkB5CSHjb4y962lhgz4NA6cjzTaGn5B3HaVi5YSdqknd4LJBaS_A">Hydrogen Sulfide Is an Antiviral</a><br />
<br />
06-15-20 <b><a href="https://www.scientificamerican.com/article/switch-in-mouse-brain-induces-a-deep-slumber-similar-to-hibernation/">Switch in Mouse Brain Induces a Deep Slumber Similar to Hibernation</a></b></div><div style="margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; margin: 0px;"><br /></div><div style="margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; margin: 0px;">03-15-21 <b><a href="https://www.iflscience.com/plants-and-animals/secrets-of-humanitys-closest-hibernating-relative-could-pave-way-for-space-flight-and-surgery/" target="_blank">Lemurs Hibernate Too</a></b></div>
<div style="font-family: "Times New Roman"; font-weight: normal; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; margin: 0px;">
</div>
</div>
Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com11tag:blogger.com,1999:blog-23742979.post-75572039204804797422019-09-09T07:45:00.002-07:002023-05-04T08:39:42.032-07:00Absolutely!<span style="font-family: "courier new" , "courier" , monospace; line-height: 19px;">First posted on 8-7-2013:</span><br />
<span style="font-family: "courier new" , "courier" , monospace; line-height: 19px;"><br /></span><span style="font-family: "courier new" , "courier" , monospace; line-height: 19px;">"Absolutely" sets a new standard for "most abused word". </span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="font-size: x-small; line-height: 19px;"><br /></span> </span><br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2hv3CB-_42ZeGKslAJwWqxMbF1luEqkm34EpIe0KzvfIhzK7SLgZ2X8c2nhAb41aqtcXc-DssjsGmOSMDJA_fMS73RU1be34x1xP790ecm9OVcvejcCshnWATE2YjuHH4c_XCXA/s1600/317522_10151503985757192_812196010_n+(1).jpg" style="margin-left: auto; margin-right: auto;"><span style="font-family: "courier new" , "courier" , monospace;"><img border="0" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2hv3CB-_42ZeGKslAJwWqxMbF1luEqkm34EpIe0KzvfIhzK7SLgZ2X8c2nhAb41aqtcXc-DssjsGmOSMDJA_fMS73RU1be34x1xP790ecm9OVcvejcCshnWATE2YjuHH4c_XCXA/s640/317522_10151503985757192_812196010_n+(1).jpg" width="640" /></span></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span style="font-family: "courier new" , "courier" , monospace; font-size: small;">My granddaughter Leah searching for absolutes</span></td></tr>
</tbody></table>
<span style="font-family: "courier new" , "courier" , monospace;"><span style="font-size: x-small; line-height: 19px;"><br /></span> <span style="font-size: x-small; line-height: 19px;"><br /></span><span style="line-height: 19px;">Turn on any interview news show. Within seconds you'll hear the word, "Absolutely!" What's wrong with that? Well, this definitive response is likely about some relatively complex issue that doesn't even approach any absolute condition. Or the question wouldn't have been asked in the first place.</span></span><br />
<span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace;"><span style="line-height: 19px;"><br /></span></span> <span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace;"><span style="line-height: 19px;">Why is "Absolutely!" such a common answer? Is there really that much certainty in our world? Nope. It's the result of lazy thinking.</span></span><br />
<span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace;"><span style="line-height: 19px;"><br /></span> <span style="line-height: 19px;">"Absolutely" is being applied to things that are not only NOT absolute, but often not even probable. Consider the source. Who is the spokesman? A politician? Anyone else with a bias? See what I mean? Ironically, the very opposite of their "absolute" assertion is often the case. Strangely enough, the more emphatic the claim, the less likely it is to be true.</span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><b><br />"I did not have sexual relations with that woman" and "there is absolutely no sex of any kind" </b></span><b style="font-family: "Courier New", Courier, monospace;">- President Bill Clinton, 1998 from public statement and deposition</b><br />
<br />
<span style="font-family: "courier new" , "courier" , monospace; line-height: 19px;">This onslaught of "absolutely!" is an excellent opportunity for some critical thinking. Consider possible exceptions to the assertions as they are being stated. </span><span style="font-family: "courier new" , "courier" , monospace; line-height: 19px;">Under what conditions might the statement NOT be true? See what I mean?</span><br />
<span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace; line-height: 19px;"><br /></span> <span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace; line-height: 19px;">Years ago I had a friend who, whenever I'd make some brash statement, would say, "As opposed to?", then follow it with possible exceptions. She was very good at this, and it became a game we played. So I stole her trick. It's great mind candy, and it quickly raises an even more important question - is ANYthing absolute?</span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span> <span style="font-family: "courier new" , "courier" , monospace;"><br /></span> <b><span style="font-family: "courier new" , "courier" , monospace; font-size: large;">Absolutely!</span></b><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span> <span style="line-height: 19px;">Seriously, isn't "absolutely" simply an idea? A creation of the human mind? Is it not our aspiration to see things all one way? Or all the other? Isn't "absolutely" merely the result of a bad case of polar thinking?</span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span> <span style="line-height: 19px;">"Absolutely" doesn't really exist. Seeking the truth is best approached asymptotically, leaving the end-point for the weaker mind. It's like "u</span><span style="line-height: 19px;">nsinkable", "u</span><span style="line-height: 19px;">nstoppable" or "i</span><span style="line-height: 19px;">mmovable", each an admirable goal, but not achievable in the real world. "Unsinkable" didn't even complete its first voyage.</span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span> <span style="font-family: "courier new" , "courier" , monospace; line-height: 19px;">So is NOTHING absolutely true? Nope. Well, not likely.</span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span><span style="line-height: 19px;"><b>"There are no absolutes", MAY be the ONLY valid absolute.</b></span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span> <span style="line-height: 19px;">I can imagine everyone now bringing to mind their favorite absolutes - God, love, mathematics, gravity. I could go on and on, and so can you. So</span><span style="line-height: 19px;"> let's start with the most popular:</span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span></span> <span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span></span> <span style="font-family: "courier new" , "courier" , monospace;"><span style="font-size: large; line-height: 19px;"><b>God</b></span></span><br />
<span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace;"><span style="line-height: 19px;"><br /></span><span style="line-height: 19px;">Many of you think me an atheist, but you'd be wrong. I'm at best (or worst) an agnostic. And I have doubts about that. But I <b>DO</b> hold that each of us should be given the tolerance to explore as we wish. And that's the key - we need to keep an open mind.</span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span> <span style="line-height: 19px;">God is not only an overloaded word (many meanings), it's also one of the most over-loaded concepts we have in all the various aspects of human culture. Unfortunately, the truth of God does not have a very good track record, under any religion. </span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span></span> <span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;">Compare the "truths" of God, Jesus Christ, Buddha and Mohammad as documented.</span><span style="line-height: 19px;"> </span><span style="line-height: 19px;"> They can't <b>ALL</b> be right.</span><span style="line-height: 19px;"> To be candid, I think Mohammad, Buddha and Jesus Christ may have been pretty impressive guys, but you wouldn't know it from all the contradictions and contemporary interpretations. To be fair, let's take each one separately. They still get interpreted in many different ways by different "believers". Again, even for any one religion, they can't <b>ALL</b> be right. And if any of these doctrines were absolutely true, why would there ever be a need for change? And yet they <b>DO</b> change from time to time. </span></span><br />
<span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace;"><span style="line-height: 19px;"><br /></span><span style="line-height: 19px;">Let me get specific about the absolute aspect of God, not religion. Is our existence (or its illusion, for some) proof of God? Hardly. Our "existence" is demonstrably discoverable, and aspects of it are changing day by day. Make a statement about existence, and a thousand others will provide counterpoints. And that's WITHOUT questioning perception and relativity. I could go on and on, but everyone else already has. </span></span><span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace; line-height: 19px;">Keep an open mind. Even about God.</span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span></span> <span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span></span> <span style="font-family: "courier new" , "courier" , monospace;"><span style="font-size: large; line-height: 19px;"><b>Love</b></span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span> <span style="line-height: 19px;">So, what about love? It's so pure and simple, it MUST be absolute. Unfortunately, love too is a highly overloaded word, concept, and even overloaded feeling. Love goes mystic as an experience, but all you have to do is ingest some MDMA to produce its subjective conviction. This clearly demonstrates our experience of love is at the very least, part of a chemical feedback loop in human behavior. And the conviction you feel on your wedding day? Give it seven years. Again, love's track record is no better than God's. Actually, it's quite a bit worse.</span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span></span> <span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span></span> <span style="font-family: "courier new" , "courier" , monospace; font-size: large; line-height: 19px;"><b>Mathematics</b></span><br />
<span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace;"><span style="line-height: 19px;"><br /></span> <span style="line-height: 19px;">Ahh... mathematics - it's perfection itself. Hardly. Math is only a game we play in our struggle to understand the world. Mathematics is a creation of the human mind, and simply a tool for science. If you look closely, the "truth" of mathematics flows either from its usefulness in the observations we've made, or from some form of identity wholly disconnected from reality - which again makes it an abstraction of thought, more illusion than absolute.</span></span><div><span style="font-family: courier new, courier, monospace;"><br /></span></div><div><span style="font-family: courier new, courier, monospace;"><b><a href="https://aeon.co/videos/how-a-verbal-paradox-shattered-the-notion-of-total-certainty-in-mathematics?utm_source=Aeon+Newsletter&utm_campaign=fb87a832a6-EMAIL_CAMPAIGN_2022_05_20_01_03&utm_medium=email&utm_term=0_411a82e59d-fb87a832a6-70629661">How a verbal paradox shattered the notion of total certainty in mathematics</a></b><br /></span>
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span></span> <b style="font-family: "Courier New", Courier, monospace;">“As far as the laws of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality.”</b><span style="font-family: "courier new" , "courier" , monospace;"> – Albert Einstein</span><span style="background-color: white; color: #333333; font-family: "courier new" , "courier" , monospace; line-height: 15.4545px;"><b><br /></b></span><span style="font-family: "courier new" , "courier" , monospace;"></span><span style="background-color: white; color: #333333; font-family: "courier new" , "courier" , monospace; line-height: 15.4545px;"><b><br /></b></span> <span style="font-family: "courier new" , "courier" , monospace; font-size: large; line-height: 19px;"><b><br /></b></span><br />
<span style="font-family: "courier new" , "courier" , monospace; font-size: large; line-height: 19px;"><b>Gravity</b></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span><span style="line-height: 19px;">Speaking of Einstein, gravity never fails us!</span></span><br />
<span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace;"><span style="line-height: 19px;"><br /></span><span style="line-height: 19px;">Don't be silly. Of course it does. Along with all the other "laws" of science. All you have to do is go into orbit. OK, that's not fair. The subjective experience of gravity just takes on a different form in orbit. But seriously. Ever heard of the Thirty Years that Shook Physics? </span></span><br />
<span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace;"><span style="line-height: 19px;"><br /></span></span> <span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace;"> <span style="line-height: 19px;">In the late nineteenth century, it was thought that all of science had been discovered; there were just a few refinements to be made. There was even talk of shutting down the patent office. And then along came Einstein. That's when all hell broke loose (not related to God). </span></span><br />
<span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace;"><br /></span><span style="font-family: &quot;courier new&quot; , &quot;courier&quot; , monospace;"><span style="line-height: 19px;">Suffice it to say, we have more organized doubt in science now than at any time in history. And that's healthy. And that's the point. Any student entering the field of science today who does not have an open mind is a fool. If he "believes" in math, if he "believes" in science, he is likely to be worse than useless in this endeavor. He may actually distract us from approaching the truth with his conviction.</span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span> <span style="line-height: 19px;">Like mathematics, scientific absolutes are an illusion of the human mind. Instead of discovering absolutes, science is how we claw our way forward in thinking. Just don't hold too tightly to your conclusions, while you reach for the next hand-hold. You'll find it easier to grasp.</span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span></span>
<br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="font-size: large; line-height: 19px;"><b>Useful Generalization</b></span></span><br />
<span style="font-family: "courier new" , "courier" , monospace; line-height: 19px;"><br /></span> <span style="font-family: "courier new" , "courier" , monospace; line-height: 19px;">So the next time you hear someone exclaim, "Absolutely!", let your mind wander to all the exceptions. Then realize the person making the statement needs a lesson in critical thinking. Help them out. Explain that their conclusion may only be a useful generalization.</span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span class="Apple-style-span" style="line-height: 19px;"><br /></span> <span class="Apple-style-span" style="line-height: 19px;">If you have any other "absolutes", please comment below. I'll do what I can to help you test them.</span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span class="Apple-style-span" style="line-height: 19px;"><br /></span> <span class="Apple-style-span" style="line-height: 19px;">Who knows, maybe you'll find one. </span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span class="Apple-style-span" style="line-height: 19px;"><br /></span> <span class="Apple-style-span" style="line-height: 19px;">But I doubt it.</span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span>
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;">If you do find something in nature that is perfectly consistent with all of its history, well, it's probably just waiting for its exception.</span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span></span>
<span class="Apple-style-span" style="font-family: "courier new" , "courier" , monospace; line-height: 19px;">Perhaps the best statement on the topic: </span><br />
<br />
<span style="font-family: "courier new" , "courier" , monospace;">"I would rather have a mind opened by wonder than one closed by belief." - Gerry Spence</span></div><div><span style="font-family: courier new, courier, monospace;"><br /></span></div><div><span style="font-family: courier new, courier, monospace;">"Doubt is not a pleasant condition, but certainty is absurd." - Voltaire</span></div><div>
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span> <span style="font-family: "courier new" , "courier" , monospace;">Another similar conclusion:</span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span> <span style="font-family: "courier new" , "courier" , monospace;">"I'm not absolutely sure of anything" - Richard Feynman in 1981 from this interview:</span><br />
<br />
<b><a href="http://www.openculture.com/2014/03/richard-feynman-on-religion-science.html">Richard Feynman on Religion, Science, the Search for Truth; Our Willingness to Live with Doubt</a></b><br />
<u><br /></u><span style="font-family: "courier new" , "courier" , monospace;"><b><a href="http://reason.com/archives/2012/12/24/half-the-facts-you-know-are-probably-wro" style="background-color: white;">Half the "Facts" You Know Are Probably Wrong</a></b></span><br />
<br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><a href="https://medium.com/starts-with-a-bang/433601c3580e"><b>"Settled" Science?</b></a></span></span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><span style="line-height: 19px;"><br /></span></span> <b><a href="http://fatfist.hubpages.com/hub/There-are-NO-Absolutes-There-is-NO-Absolute-Truth">There are NO Absolutes. There is NO Absolute Truth!</a></b><br />
<br />
<b>03-07-06 <a href="http://www.slate.com/articles/health_and_science/cover_story/2016/03/ego_depletion_an_influential_theory_in_psychology_may_have_just_been_debunked.html?utm_medium=email&utm_source=digg">Everything Is Crumbling</a></b><br />
<br />
<b><a href="https://www.youtube.com/watch?v=0Rnq1NpHdmw">John Oliver: Scientific Studies</a></b><br />
<br />
12-07-17 <b><a href="https://medium.com/starts-with-a-bang/scientific-proof-is-a-myth-1cbfeff5562a">Scientific Proof Is a Myth</a></b><br />
<br />
05-02-18 <a href="https://aeon.co/ideas/the-danger-of-absolute-thinking-is-absolutely-clear?utm_medium=feed&utm_source=feedburner&utm_campaign=Feed%3A+AeonMagazineEssays+%28Aeon+Magazine+Essays%29">The Danger of Absolute Thinking</a></div>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com2tag:blogger.com,1999:blog-23742979.post-8412071463611739342018-11-12T13:56:00.001-08:002019-01-13T06:05:02.650-08:00The Nature of Knowledge<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0mTYuXueZjeQkWW3IlLzikIZTnU7IkSbwqq1_39MsowgFfe0i2m-f8BN5bHC-qQjmnGWLR6EiVo2a4IMkNYMfYE0OZd7UR69B0To1R9tarNRGwc9cruaKt716dCYnwx25LVUr9A/s1600/1280px-The_Thinker%252C_Rodin.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1600" data-original-width="1200" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0mTYuXueZjeQkWW3IlLzikIZTnU7IkSbwqq1_39MsowgFfe0i2m-f8BN5bHC-qQjmnGWLR6EiVo2a4IMkNYMfYE0OZd7UR69B0To1R9tarNRGwc9cruaKt716dCYnwx25LVUr9A/s640/1280px-The_Thinker%252C_Rodin.jpg" width="480" /></a></div>
<br />
I realize the ambitious objective indicated by the title, but there's another perspective - the essence of knowledge, which is more limited in scope and can be more easily contained. And explained. I'm making this post to help me sort out some ideas I'm working on in my research in neuroscience.<br />
<br />
What does it mean to know something? Is it to have that thing well characterized? To understand what might happen to it in most circumstances? Or all circumstances? To have access to the truth about a thing? I believe this last part involving truth goes beyond the scope of knowledge, and in doing so causes a great deal of confusion and grief. It's also the part of Plato's definition (justified true belief) that was invalidated by <span style="background-color: white; color: #222222; font-family: &quot;roboto&quot;; font-size: 12pt; white-space: pre-wrap;">Edmund Gettier in 1963.</span><br />
<br />
If you've read my blog post, "<a href="http://suddendisruption.blogspot.com/search/label/Absolutely%21">Absolutely</a>", you'll remember my description of approaching truth asymptotically, but never achieving it. Knowledge is that approach, ever waiting to be refined and edited.<br />
<br />
Think about a few things that you "know" to be true. Are they really? Politics is a fertile field for knowledge. Half of any group will "know" things the other half dismisses. It's the same with religion. Conviction is no less "certain" from multiple conflicting perspectives. See what I mean? Half of what we "know" does not even approach the truth. And the other half is only a useful generalization.<br />
<br />
The point is, what we "know" at any given instant is simply the best understanding available to us at the time, based on our own individual experience and perception, and in spite of our "conviction". Indeed, it may often be a long way from some objective and independent "truth". This is why it's best to always keep an open mind.<br />
<br />
Knowledge is a work in progress. To know something is only to approach its truth, and sometimes to fall well short. Yet we act on our knowledge because it's the best we have to work with at the moment.<br />
<br />
Always be prepared to learn something new in your never-ending quest for knowledge.Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-26460572013878609082018-10-11T08:01:00.002-07:002018-10-11T08:01:35.579-07:00Men in AmericaThis challenging but also interesting video is actually a social media test.<br />
<br />
<b><a href="https://www.youtube.com/watch?v=LrhHkQhglig&feature=share">Men in America</a></b><br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe width="320" height="266" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/LrhHkQhglig/0.jpg" src="https://www.youtube.com/embed/LrhHkQhglig?feature=player_embedded" frameborder="0" allowfullscreen></iframe></div>
<br />
<br />
Did you pass?Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-39261478786079555582018-03-12T08:41:00.002-07:002022-09-27T14:26:54.264-07:00The Master and His Emissary - A Review of the Divided BrainOriginally posted on 3-15-12:<br />
<br />
Like many, I've been skeptical of left / right brain theory for decades. The data seemed fluffy, with too many exceptions. Then I encountered this video of a TED presentation:<br />
<br />
<div>
<a href="https://www.youtube.com/watch?v=dFs9WO2B8uI">The Divided Brain</a></div>
<br />
<div class="separator" style="clear: both;">
<a href="https://www.youtube.com/watch?v=dFs9WO2B8uI"> <img border="0" height="250" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhjHlKu-2TLpt03u_FDOQ5b0JSII5KMccTkVCXXs3G8PMT09BHp3dWovedfFEYAaIyBBLSi4GBCM6a1Ts4KBYAhLdOhXrZwXBj5pVygZe0jL3_v72_yGDvbf1K5LIcmQZn5M3xIyw/s400/DividedBrain.png" width="400" /></a></div>
<br />
<br />
Finished? If you're like me, the value and concentration of the content were overwhelming. But watch it again. This time keep your hand on the pause button. Pause if his words get ahead of the pictures. Pause again if the pictures get ahead of the words.<br />
<br />
Yes, there's a lot of detail here, both in hard data and concept, but that's not the main reason for this second viewing. This video nicely demonstrates the theme of its own content. It's been biologically established that as you watch, the presentation is going into BOTH sides of your skull at the same time. Your eyes and ears are collecting two similar copies, but your divided brain is creating two DIFFERENT experiences, one dominated by image and animation, the other by text and verbal logic. Your brain is running in parallel, and the only time you notice is when one half's recognition gets ahead of the other. Your control of the pause button shifts smoothly from side to side. Your left or right brain inhibits the other when needed. This shifting control allows both experiences to be captured.<br />
<br />
The moving images provide right-brain meaning through visual metaphor. Dr. McGilchrist's precise left-brain verbalization and text nail down and allow you to "grasp" these ideas. Though I'm over-simplifying, this is an example of the very thing being presented - that it takes BOTH sides of the brain working as asymmetrical and largely unequal specialists in delivering the result that is ultimately the human mind.<br />
<br />
The other reason this video is important is that the book is extremely detailed and thoughtful. "The Divided Brain" takes some digesting. But it's worth it. And it's good to have a "both-brain" outline to fill in his very deliberate, precise and comprehensive written presentation. This video condenses 500+ pages into only a few minutes, and much of it is profound. I watched it at least a dozen times. Then I read the pages:<br />
<br />
<b><a href="http://www.amazon.com/gp/product/0300188374/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=0300188374&linkCode=as2&tag=suddennet-20&linkId=Z2EB2C4F5XIGKGYG">The Master and His Emissary: The Divided Brain and the Making of the Western World, by Dr. Iain McGilchrist</a><img alt="" border="0" height="1" src="https://ir-na.amazon-adsystem.com/e/ir?t=suddennet-20&l=as2&o=1&a=0300188374" style="border: none; margin: 0px;" width="1" /> </b><br />
<b> <b><span style="font-size: large;"><br /></span></b></b><br />
<b style="font-weight: bold;"><span style="font-size: large;"><br /></span></b> <b style="font-weight: bold;"><span style="font-size: large;">The Divided Book</span></b><br />
<br />
This work is both brilliant and maddening. What it presents is a major advancement in our ability to understand human behavior and our behavior's contradictions. It contains both hard data and insightful observations, but also wild conjecture. From a wealth of brain studies over the last few decades, the author has crystallized a new model of left and right brain functionality, which is by far the book's greatest contribution to the topic. Then he describes the left brain's evolving impact on western culture.<br />
<br />
The result is both logically and intuitively convincing. Each of us is literally of two minds, creating a single subjective experience. We are driven by two largely complementary engines of evaluation, each seamlessly yielding control of the mind and body from moment to moment depending on which side is most likely to be effective in dealing with the current challenge. Asymmetry of functional realms is the key to minimizing obvious dynamic conflict. The realization of this fact has wide-ranging implications for all of human behavior, but especially personal relationships, politics, religion, and economics. I shall address these consequences in separate blog posts. But let's get back to the book.<br />
<br />
The delivery of these ideas is fascinating in spite of his formal style, which is necessary for such a demanding topic. Dr. McGilchrist does an amazing job of precisely navigating very deep waters, but like any project this ambitious, he reaches a bit too far now and then, which is perhaps his most important lesson. This work spans credibility from solid conclusion to obvious speculation. But even in speculation, there is very little that is patently wrong. These ideas are not some new-age theory, though this book may literally become the bible for left / right brain enthusiasts. No, I haven't lost my bias for the rational, but I have gained a new appreciation for the intuitive, and its genesis.<br />
<br />
Like the brain, the book is presented in two parts. The first part is well-founded scientific documentation of the physical and behavioral differences between the left and right brain. Taken alone, it's of tremendous value in understanding the brain's obvious bicameral nature. The second part is dominated by the selective ascription of art and history to one side of the brain or the other, and its impact on our modern culture.<br />
<br />
Though dense with useful data, each chapter raises important issues, which are then addressed by the next. Step by careful step, he builds a coherent model of the mind based on cross-supported observations of brain physiology and actual human behavior. For the most part, it rings true.<br />
<br />
The later section on paradox is brilliant and sets the stage for what he presents in the second part: the "Achilles' heel" of the rational mind, which is that our more logical left brain denies anything not within its framework of simulation. This means that the left misses a lot, virtually anything that can't be "proven". He describes this shortcoming as a hall of mirrors. A sharp focus and "single-minded" objective is the left's strength, but also its weakness. The left brain is blind to, or otherwise demotes, obvious leaps of intuition, such as casually stepping over a paradox. He uses Zeno to nicely make this point.<br />
<br />
The second part of the book is all about the impact the left-brain has had on the realm of the right. He uses examples of music, art, and philosophy from the last few thousand years. Much of this is based on McGilchrist's impressive knowledge of our culture, and some intuition. This part is as subjective as the first part is objective.<br />
<br />
The first half of the book is a pleasant intellectual challenge. It's also a ride in the park compared to the second half, which is like hitting a bog on a dirt bike; at least it was for this rational mind. I had to gear down even to drag my way through, probing for the hard ground of logic, and finding little. He seems to ascribe behaviors to one side of the brain or the other almost willy-nilly. And he freely admits, "These thoughts are inevitably contingent, to some extent fragmentary and rudimental." Though I tried to understand his literary leaps, my mind kept reverting to the rational. This probably says more about me than the writing, but I wonder how many other readers gave up at this point and never finished the book.<br />
<br />
To be fair, I chased down a few of these wild ascriptions using his comprehensive bibliography. Each provided a reason to believe, if not actual proof. You'll have to decide for yourself. Though useful data are rarer in the second half, if you grind through, the pearls are there, and worth the effort. Plus there's something more useful than mere data. If you tend to the rational, the first half will be the most meaningful. If you're more intuitive, I suspect the second half will be the most insightful. This intuitive approach in the second half also creates doubt in the rational mind. For me, conjecture about so much subjective art, with evidence such as which way the subject of a painting was facing, left my mind reeling and yearning for the science of the first part.<br />
<br />
Another way of saying this is that the first part contains lots of observations, science, and hard data. It makes sense. It makes you THINK. The second part makes you wonder about the validity of the author's many right-brain speculations. And it gives you the FEELING he just MIGHT be right - but he can't prove it.<br />
<br />
<br />
<span style="font-size: large;"><b>Conclusion</b></span><br />
<br />
Dr. McGilchrist presents a rational left-brained model for what our right-brain has secretly known all along, but could not say. Our right side feels the truth of the presentation but doesn't have a voice. Our left side can put it into words, but won't accept a new model of the mind without reasonable proof. This book provides both proof AND conviction. It makes you think. And it makes you feel.<br />
<br />
This is where the parallel with the video comes in. The book is also a Rorschach test of left- and right-brain conclusions. The medium IS the irony. Though the book is mostly words meant for the left brain, these words are inspired by right-brain imagination. Though tedious at times, even the more wild ideas are hard to indict.<br />
<br />
I've probably re-read fewer than ten books in my entire life. I'm a slow reader, but once I read a book, I know it. This book is an exception. The moment I finished (and it took months), I flipped to the front and immediately started reading again. Like the video, reading the book whetted my appetite for more. Perhaps it'll do the same for you. If you have any interest in the brain or human behavior, "The Master and His Emissary" is a must-read.<br />
<br />
04-03-14 <a href="https://www.youtube.com/watch?v=uEB68f8kvnY"><b>Age of Wonder - a philosophy lecture by Dr. McGilchrist</b></a><br />
<br />
05-09-17 <b><a href="https://www.youtube.com/watch?v=U2mSl7ee8DI">An excellent interview of Dr. McGilchrist</a></b><br />
<br />
05-17-2018 <a href="https://www.youtube.com/watch?v=5Q2XzLvuJWc"><b>Iain McGilchrist - The Divided Brain and the Unmaking of Our World</b></a><br />
<br />
06-12-18 <b><a href="https://jennymackness.wordpress.com/2018/03/11/iain-mcgilchrist-and-the-divided-brain/">Jenny Connected Review</a></b><br />
<br />
09-27-22 <b><a href="https://tubitv.com/movies/674575?utm_source=justwatch-feed&tracking=justwatch-feed" target="_blank">The Divided Brain - The Film</a></b>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com1tag:blogger.com,1999:blog-23742979.post-55872710109927074052017-10-09T11:15:00.000-07:002020-01-15T13:11:57.982-08:00The Most Dangerous Drugs<br />
01-15-20 <span style="background-color: white; color: #1d2129; font-family: "helvetica" , "arial" , sans-serif; font-size: 14px; white-space: pre-wrap;">"showed a greater drop in binge drinking than their peers" is the silver lining in this 850,000 sample study. It will have a far greater long-term positive impact than any negative impact of increased marijuana use as total life-time alcohol harm is 3.5 times greater than marijuana harm:</span><br />
<br />
<a href="https://neurosciencenews.com/legal-marijuana-binge-drinking-15466/">College students use more marijuana in states where it’s legal, but they binge drink less</a><br />
<br />
10/09/17 <b><a href="http://www.iflscience.com/health-and-medicine/only-a-handful-of-people-in-history-have-ever-overdosed-on-lsd-this-is-what-happened-to-them/">Only A Handful Of People In History Have Ever Overdosed On LSD. This Is What Happened To Them</a></b><br />
<br />
06-05-17 <b><a href="https://www.youtube.com/watch?v=kBCZvwhWGj4">Your Brain On Acid</a></b><br />
<br />
<b>06-10-15 <a href="http://burners.me/2015/06/10/global-drug-survey-2015-results/">Nice Collection Of Drug Meta Data</a></b><br />
<br />
04-14-15 - Views are changing: <a href="http://www.people-press.org/2015/04/14/in-debate-over-legalizing-marijuana-disagreement-over-drugs-dangers/"><b>Pew Research on Marijuana</b></a><br />
<br />
02-23-15 - <b><a href="http://www.washingtonpost.com/blogs/wonkblog/wp/2015/02/23/marijuana-may-be-even-safer-than-previously-thought-researchers-say/">Marijuana may be even safer than previously thought, researchers say </a></b><br />
<br />
02-09-15 - <b><a href="http://reason.com/blog/2015/02/09/landmark-study-finds-marijuana-is-not-li?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+reason%2FHitandRun+%28Reason+Online+-+Hit+%26+Run+Blog%29">Landmark Study Finds Marijuana Is Not Linked to Car Crashes </a></b><br />
<br />
From 3-23-2007:<br />
<br />
By British drug class:<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCz8TIZDLiR-ESnSpryeHwG7z9Bzs9Nz_Dyc8OhkKIu3OXTu_CTqkcKxjyC-GteRyXcvi83RhJaI_olhHBgR7YMdtx0_rmz2G1FbhZJWLomdHAjq_EKy2sXpiwQy2kwKSYi1qH/s1600-h/Drug+Rankings.gif" onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5045138401609206674" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCz8TIZDLiR-ESnSpryeHwG7z9Bzs9Nz_Dyc8OhkKIu3OXTu_CTqkcKxjyC-GteRyXcvi83RhJaI_olhHBgR7YMdtx0_rmz2G1FbhZJWLomdHAjq_EKy2sXpiwQy2kwKSYi1qH/s400/Drug+Rankings.gif" style="cursor: pointer;" /></a><br />
<br />
<br />
By actual aspect of harm:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQMsgiDIPLPeCaJJcpl0mrMpi5c4qldfziF5dBqhIJpX99kak-l-9p71UkQ8JhbW3b0HEE_LR0-2NuC9Nt5bBAxUD7QjVtUQRMM4XMqUAueXmteTPzFatayndcFsYU3okjVWxKyA/s1600/Drug+Risk.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="532" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQMsgiDIPLPeCaJJcpl0mrMpi5c4qldfziF5dBqhIJpX99kak-l-9p71UkQ8JhbW3b0HEE_LR0-2NuC9Nt5bBAxUD7QjVtUQRMM4XMqUAueXmteTPzFatayndcFsYU3okjVWxKyA/s640/Drug+Risk.jpg" width="640" /></a></div>
<br />
<br />
Having spent over two years on the Washoe County Grand Jury, I've been amazed at how disproportionate the law is compared to the actual physical and social damage resulting from the use of each type of drug.<br />
<br />
<a href="http://news.bbc.co.uk/1/hi/health/6474053.stm"><span style="color: #3333ff; font-weight: bold;">Apparently others agree - details here.</span></a><br />
<br />
Paper - <b><a href="http://www.sg.unimaas.nl/_OLD/oudelezingen/dddsd.pdf">Drug harms in the UK: a multicriteria decision analysis</a></b><br />
<br />
Even this chart has a muted range when applied to an individual. In the cases I've seen, the curve's far steeper on both ends with meth at the top. Do solvents REALLY do less social damage than cannabis?!?! Certainly not if you're the one huffing.<br />
<br />
I can only assume they were measuring TOTAL social impact as opposed to individual social damage, since the incidence of solvent abuse is far rarer than that of cannabis. Of the 12 million monthly users in America, ever hear of anyone overdosing on marijuana?<br />
<br />
In any case, it's good these drugs are finally being painted with separate brushes. From all the studies and history, our laws are obviously upside-down - especially when it comes to alcohol and cannabis.<br />
<br />
Let's get real with the science and the law, before more kids decide they have to try ALL of these things just to learn the truth. Remember that chant from the 60s?<br />
<br />
Let's start telling it like it IS!<br />
<br />
A useful interview with Dr Nutt, 2011:<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/pWQaJZkDaB8?feature=player_embedded' frameborder='0'></iframe></div>
<br />
<br />
<br />
<a href="http://learn.genetics.utah.edu/content/addiction/mouse/">Mouse Party Teaches Drug Effects</a><br />
<br />
<b><a href="http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0063972">Psychedelics Actually Benefit Mental Health</a></b><br />
<br />
Interesting links:<br />
<br />
<b><a href="http://reset.me/">Reset Me</a></b><br />
<br />
<b><a href="http://motherboard.vice.com/read/the-speed-of-hypocrisy-how-america-got-hooked-on-legal-meth?utm_source=digg&utm_medium=email">The Speed of Hypocrisy: How America Got Hooked on Legal Meth</a></b><br />
<br />
<br />Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-31256319072286554622017-08-10T10:21:00.001-07:002017-08-10T10:21:43.794-07:00Burning Man 1995Excellent early video production:<br />
<br />
<a href="https://vimeo.com/58508995">Burning Man 1995 Video</a>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-968545900448935792017-05-31T10:53:00.001-07:002024-03-15T07:27:20.284-07:00The Significance of the Samsung Note<div>
First posted 3-15-12:<br />
<br /></div>
<div>
</div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjq1wWpJJNEFhDeHWMrn9VrXakkvZPu31hSz_jQBXZ0NuTyhUlbSLY0pGEjR9Z4aFYjdxTwcrZdZ0QOUD3w8O9eBDGlMQ3L3i6f69AltljcPlFrN4Z_UgVmjbxwCb6vCs20EfqNJQ/s1600/RodPhones.JPG" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="480" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjq1wWpJJNEFhDeHWMrn9VrXakkvZPu31hSz_jQBXZ0NuTyhUlbSLY0pGEjR9Z4aFYjdxTwcrZdZ0QOUD3w8O9eBDGlMQ3L3i6f69AltljcPlFrN4Z_UgVmjbxwCb6vCs20EfqNJQ/s640/RodPhones.JPG" width="640" /></a></div>
<br />
Like much in the history of human affairs, technical advancement does not generally happen in a smooth progression. It moves in fits and starts, and smart-phone technology has been on a tear for the last few years.<br />
<div>
<br />
The Palm was the first true smart-phone with a library of independent apps, but it was the iPhone that first found broad acceptance with the general public. Apple seems to have a way with tech fashion, even if they aren't always the first to market. Or the best.</div>
<div>
<br /></div>
<div>
The next major fit of development was the Android family. The Motorola Droid offered the first significant competition to the iPhone. HTC improved performance, and over this last year Samsung has come to lead Android technology with large displays in lightweight devices.<br />
<br />
We now have the Samsung Galaxy Note as its latest example, but is it too cool or simply too big? I'll start with a comparison to my Droid, which is what I know best. The Samsung Note has:<br />
<br />
100% more screen area.<br />
50% taller<br />
67% wider<br />
250% more pixels<br />
255% faster clock<br />
80% more battery<br />
60% more pixels in its camera<br />
Plus a front camera<br />
4G surfing and movies<br />
4 times the RAM<br />
16 times the ROM<br />
Effective pen interface<br />
<br />
So what's not to like? Well, it is 8 grams heavier but that's too small to notice. The Samsung Note also has no hard keyboard, but surprisingly, the screen is so large, I'm faster (and more accurate) on its soft keyboard than the Droid hard keyboard. The Samsung Note is better in every way than the standard Droid and even better in most ways than the latest iPhone. End of story? Not quite.<br />
<br />
Surprisingly, the Note's best feature (the screen) is also the critic's biggest complaint, which is what this post is really about. The Note is being panned as a "phablet" because of its large screen. The logic is, it's too big to hold up to your face, and yet too small to compete as a tablet. Here's an example review:<br />
<br />
<br />
By: Jonathan S. Geller - Feb 13th, 2012 at 03:45PM<br />
<br />
"The Galaxy Note essentially has everything you’d want in a smartphone: a great dual-core processor, a solid camera, a beautiful display and good build quality, and it runs on AT&amp;T’s new 4G LTE network that delivers incredibly fast download speeds. Plus the battery seems actually decent so far, which is a triumph for modern smart-phones.<br />
<br />
Throw all of that right out the window.</div>
<div>
<br />
The phone is too big. You will look stupid talking on it, people will laugh at you, and you’ll be unhappy if you buy it. I really can’t get around this, unfortunately, because Samsung pushed things way too far this time."<br />
<br />
<br />
And it wasn't just Jonathan. Here's what Zach at BGR had to say:</div>
<div>
<div>
<br /></div>
<div>
<a href="http://www.bgr.com/2012/02/22/samsung-galaxy-note-review-the-smartphone-that-samsunged-samsung">Samsung Galaxy Note review: The smartphone that ‘Samsunged’ Samsung</a><br />
By: Zach Epstein | Feb 22nd, 2012 at 12:01PM<br />
"Holding this beast to your face while on a phone call in public will result in awkward stares. Not “maybe” or “might,” but “will.” It just looks silly."<br />
<br />
<br />
<div>
One more - PC World's review:<br />
<div>
<br />
"For most, the Note will be too big for a phone, but too small for a tablet. Rather, it’s an awkward in-between device, and will only appeal to a niche consumer base. "<br />
<br />
<br />
<br />
I'm here to tell you, PC World and all the rest are dead WRONG. The Note will NOT be limited to a niche. It has hit the sweet spot in size and will become the new standard in smart-phone technology. Here's why.</div>
<br />
As some of you may know, I've been a geek since before the word was widely used. I've been interested in computers since the smallest ones filled up a room, which was long before they became personal. It was much later that the first thing that could be considered personal technology was introduced, and it was a calculator.</div>
</div>
</div>
<div>
<br /></div>
<div>
If you think the lines are long for gadgets now, you should have been around in 1972 when HP introduced the original HP35 calculator. It sold for $395, over $2,000 in today's dollars, but you couldn't buy it at any price (no eBay back then). After placing only two magazine ads, HP had the HP35 back-ordered for more than six months.</div>
<div>
<br /></div>
<div>
This backlog arose because the HP35 was SUCH a major advancement in technology, one that is hard to imagine even in today's new-gadget world. The closest competition to the HP35 calculator sat on a desk, weighed 25 pounds and cost more than $10,000 (or $50,000 in today's dollars).<br />
<br />
<b>In contrast, the HP35 was designed to fit into William Hewlett's shirt pocket, which is the key to the issue at hand.</b></div>
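Those inflation-adjusted figures are easy to sanity-check. A minimal Python sketch of the arithmetic, using approximate CPI-U annual averages as assumed inputs (not official figures):

```python
# Sanity-check inflation-adjusted prices by scaling with the ratio of CPI values.
# The CPI-U annual averages below are rough assumptions for illustration only.
CPI = {1972: 41.8, 2012: 229.6}

def adjust(price, from_year, to_year, cpi=CPI):
    """Convert a price between years using the ratio of CPI index values."""
    return price * cpi[to_year] / cpi[from_year]

hp35_1972 = 395                         # HP35 list price at introduction
print(round(adjust(hp35_1972, 1972, 2012)))    # about 2170 - "over $2,000"
print(round(adjust(10000, 1972, 2012)))        # the desktop competitor, about $55,000
```

With these CPI assumptions, the $395 HP35 lands at roughly $2,170 and its $10,000 desktop competitor at roughly $55,000, consistent with the figures quoted above.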
<div>
<br /></div>
<div>
<div>
Even though back-ordered from their own distribution, I discovered from a friend at HP that I could buy their calculator at HP headquarters. This outlet was for employees, but he said they weren't checking IDs. I immediately flew my plane to Palo Alto, walked up to the front counter and bought two (an extra one for my cousin).<br />
<br /></div>
<div>
It's been that way my whole life. I watch a given technology then buy the latest and greatest when it's introduced; not because it's a fashion, but because it's significantly better in some technical way. I bought the very first Palm Pilot the day it was released. I generally hold off upgrading until there is a significant advancement. At their introduction I bought the first color Palm phone (also from Samsung), then the Palm Treo and Palm Centro in turn as they were significant advancements.</div>
</div>
<div>
<br />
Just over two years ago I ended a long-term relationship with Palm and bought the original Droid on the day of its introduction. I considered the iPhone, but the first version wouldn't even copy, cut and paste text, which I can't live without. Android has been amazing, though there are still things the old Palm did that the Droid cannot yet touch. But that's another blog post.<br />
<br />
So why am I leaving the Droid behind so quickly? The usual reasons - significant advancements in technology which are listed above, but most importantly because of the size of the screen. All of that visual real estate is wonderful. For years now I've known the original HP-35 hit a sweet spot in physical size and weight. It was as big as possible without being too big to fit in a shirt pocket.<br />
<br />
As it turns out the Samsung Note is almost the same size and weight as that original HP-35. I've been carrying the Note in my shirt pocket the last few weeks and it feels just like the HP35 I carried from years ago. So according to the reviewers, the only problem is how silly we look if we hold it up to our head, which is my second point - a true geek is like the <b><a href="http://www.youtube.com/watch?v=4r7wHMg5Yjg">Honey Badger</a></b> - he doesn't give a shit.<br />
<br />
<div>
And that's how I know I'm an authentic geek: I don't understand why it looks weird to hold a Samsung Note up to your head. Why does it matter? It's what it DOES that counts. I for one believe it's the ultimate geek-cred to side with function over fashion. And who's to say Bill Hewlett wouldn't have looked cool talking on his new calculator, if only there had been a cell tower around.</div>
<br /></div>
<div>
Who wants to bet the next iPhone is not bigger?<br />
<br />
And that in three years the Samsung Note will be the standard size for a phone?<br />
<br />
And then it will be cool.<br />
<br />
Email your wager.<br />
<br />
03-28-12 <b><a href="http://www.engadget.com/2012/03/27/samsung-ships-five-million-galaxy-notes-in-just-five-months/">Samsung ships five million Galaxy Notes in just five months</a></b><br />
<br />
04-05-12 <b><span style="color: #0b5394;"><a href="http://www.ottawacitizen.com/technology/Samsung+Galaxy+Note+freak/6415654/story.html">Samsung's Galaxy Note is a freak hit</a></span></b><br />
<b><br />
</b><br />
<b><span style="color: #0b5394;"><a href="http://www.gottabemobile.com/2012/06/01/samsung-galaxy-note-sales-continue-to-climb/">06-01-12 Too early to say I told you so?</a></span></b><br />
<b><br />
</b><br />
06-20-13 <b>Time to note a problem with the Note</b> - both the power and volume buttons are in the worst possible locations. I had noticed this with other phones but hoped for some reason it would be different with the Samsung, but alas... no. The problem is, these buttons are exactly where you are most likely to hold the phone, which means they are constantly and inadvertently activated. It is a classic physical overloading fail. And Power should be slightly recessed, so it doesn't bump on, whatever its location. Droid did this well.<br />
<br />
Interesting survey about size:</div>
<div>
<br /></div>
<div>
<b><a href="http://news.cnet.com/8301-13579_3-57601830-37/apples-worst-kept-secret-bigger-iphones-in-2014/?google_editors_picks=true">Apple's worst-kept secret: Bigger iPhones in 2014</a></b></div>
<div>
<br /></div>
<div>
10-14-13 Upgraded to the Samsung Note 3 as it's brighter, lighter, faster with longer battery life and better camera. Also, the power button is no longer directly across from the volume buttons so both are inadvertently hit less often. I'm not crazy about the hard Home button which also powers up when not asked for, but I'll reserve judgment until I get a few more miles. Overall the device is a delight because of display, performance, and battery. More later.<br />
<br />
02-12-14 Four months use and this is the best mobile device I've ever owned, mostly because of display quality, speed, and battery life. And the apps keep getting better.<br />
<br />
09-24-14 <a href="http://www.digitaltrends.com/mobile/phablet-reviews-iphone-6-plus-and-before/">Ultimate Vindication</a><br />
<br />
11-01-14 Upgraded to Samsung Note 4 with faster charging and better battery life. This is the best portable computer I've ever owned.<br />
<br />
08-22-16 I bought the Samsung Note 7 today. Once again, this is a brighter and higher resolution screen. Wait a minute. That was an understatement. It's amazing how each generation of these Samsung displays is brighter than the last. Holding these two phones next to each other with displays on maximum, the difference is like night and day. But how bright a display do we really need? Brighter. Here's why. When you get into your forties and your close vision begins to go, you notice you can either hold the content further away, or you can increase the light level. Contrast is as important as (or more important than) resolution or screen size. It allows you to see more detail on a given screen. It increases visual bandwidth. The new Note has the best display I've ever seen. Again.<br />
<br />
Battery tech is also much improved (apparently, to the point of failure 09-22-16). And with wireless charging a standard feature, it charges faster, easier and lasts longer. Another plus for the Note 7.<br />
<br />
Physically, the device is also narrower and just a bit lighter, both improvements.<br />
<br />
On the downside, I don't like the curved edges. Already I've hit a case where the "1" key on Hacker's Keyboard would not activate, because the touch-sensitive layer does not wrap around the corner where the majority of the key sits. But if it did, you'd be activating functions just by holding the device, which would be a major problem. Also, the effect of dragging stops as you approach the edge, because your finger prematurely leaves the surface as it drops away. The result is that you can't drag something all the way to the edge; you are stopped a character short of the margin. The rounded edge is annoying and seems to have little practical value.<br />
<br />
What exactly is the point of a curved edge? You do get to see a message arrive when the phone is face down, but does this relatively rare case make up for the UI failures? Hardly. Is this a case of fashion over function? Is Samsung looking for a physical branding device? If so, they should first make sure it does not impair the operation of the phone. In any case, the curved screen does not work for me. Give me a flat screen any day (with a little bezel to hold on to so I'm not activating features inadvertently).<br />
<br />
Now for the surface. The glass back is slick looking but also quite literally slick. It's like holding a bar of soap. Do they want these things dropped so they can sell you another one? I added egrip phone strips, which help a lot, but why should I have to geek up such a premium device? Everything except the screen should be a high-friction surface. The glass back is another design fail.<br />
<br />
On the plus side, the camera operation is much improved. As for quality, I'll let you know when I get time to work with some of the images.<br />
<br />
Adding the Samsung Gear VR is a blast, but the resolution is much lower than I expected. Perhaps that is the cost of fast and smooth response. Or do they spread those pixels over such a wide field of view that our fovea only sees a relatively low resolution?<br />
<br />
Notwithstanding the battery issues, the Note 7 is two steps forward, one back, but that's still a net improvement.<br />
<br />
(09-22-16 - As everyone knows by now, the Note 7 was a major failure. Guess I'll have to wait for the Note 8. Or something else.)<br />
<br />
05-31-17 - What was radical in 2012 is now mainstream:<br />
<br />
<b><a href="https://techcrunch.com/2017/05/31/phables-are-the-phuture/">Smartphone screens find their size sweet spot</a></b><br />
<br /></div><div>02-10-21 Final vindication: when I first wrote this blog post in 2012, a 5.3" screen was considered freakishly large. Now anything that small is considered too small to use:</div><div><br /></div><div><b><a href="https://arstechnica.com/gadgets/2021/02/the-iphone-12-mini-hasnt-sold-well-according-to-multiple-estimates/" target="_blank">Sorry, small-phone lovers:</a></b></div><div><br /></div><div>Next, I predict we'll get flexible 10" phones that wrap around your arm most of the time, but spring out flat when being used. You read it here first - 02-10-21</div><div><br /></div><div><br /></div>
Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com4tag:blogger.com,1999:blog-23742979.post-66408757628204859212017-02-28T10:56:00.003-08:002022-05-21T13:53:53.006-07:00Introducing Mud Bluff, Nevada<div class="separator" style="clear: both; text-align: left;">
Many of you know I've been involved with the Burning Man Project for years as a Regional Contact. During that time I've helped to organize PermaBurn, Stone Soup and Burning Girl, as well as many other Burner events and projects. I've also hosted various events and theme camps at my home in south Reno. What you may not know is that I've been seeking a larger and more remote location to host a permanent regional campout and to continue this type of participation. </div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
Over the last three years, I've reviewed hundreds of property listings and hiked the more interesting ground. On December 4th I went out to the Lahontan dam for yet another site visit.</div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhErhvdc-cxZNTKAVuZn0DTu0tIBoHXu7eZ7xo37FrxDkwZawoCyEb2cthdJP03hR7v7JqDD5pB8ssva86jerBHHfjRW-QW3y46UkK1f3dIHjcFLFqhQW-qQTQ8Wj4upzkIdNBWBQ/s1600/Mud+Bluff.JPG" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhErhvdc-cxZNTKAVuZn0DTu0tIBoHXu7eZ7xo37FrxDkwZawoCyEb2cthdJP03hR7v7JqDD5pB8ssva86jerBHHfjRW-QW3y46UkK1f3dIHjcFLFqhQW-qQTQ8Wj4upzkIdNBWBQ/s640/Mud+Bluff.JPG" width="640" /></a></div>
<br />
This was my first impression as I drove onto the property, inspiring the title for this blog post. By the end of January 2017, I had purchased both this and the adjoining parcel.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjav772sWu7_xBp_f0Qt5MGYpwiKaZ_znIn0n0pwzQcl__zPU_JQm5AysB-z6Id2q_lX6b-e6vGCiK4S3fDWZbenpGiv0mT72Y39_w26GYH96tIzcSnuoYw2KW94vTWREtJQwJcdg/s1600/Mud+Bluff+East+of+Lahonton+Damn.JPG" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="444" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjav772sWu7_xBp_f0Qt5MGYpwiKaZ_znIn0n0pwzQcl__zPU_JQm5AysB-z6Id2q_lX6b-e6vGCiK4S3fDWZbenpGiv0mT72Y39_w26GYH96tIzcSnuoYw2KW94vTWREtJQwJcdg/s640/Mud+Bluff+East+of+Lahonton+Damn.JPG" width="640" /></a></div>
<br />
Lahontan dam and reservoir to the west of the first 77-acre purchase in December of 2016.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggl1NiyAJ3DU-hQVJuN2D3VTAloRUWesZjgGZ5j2udMtQzC4pPOhMY-lVShknrsNVtIVrPMmFfLhY3-xWor63CQfw-vcrJ-rimaNRHDdLPYMocBp_z_hTj4QzlyCThuV1ZJPv2Og/s1600/Mud+Bluff+1+%2526+2.JPG" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="490" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggl1NiyAJ3DU-hQVJuN2D3VTAloRUWesZjgGZ5j2udMtQzC4pPOhMY-lVShknrsNVtIVrPMmFfLhY3-xWor63CQfw-vcrJ-rimaNRHDdLPYMocBp_z_hTj4QzlyCThuV1ZJPv2Og/s640/Mud+Bluff+1+%2526+2.JPG" width="640" /></a></div>
<br />
An additional 39 acres with power and a well were purchased in January 2017, for a total of 116 acres.<br />
<br />
As you can see, the Carson River crosses the property for about a thousand feet, with about four acres actually on the north side of the river. The 26 acres of river bottom contain an oxbow from an earlier course of the river and lots of old cottonwood trees. Another 50 acres is sloping alluvial bluff, with about 40 acres of flat desert on top, providing panoramic views in all directions.<br />
<div>
<br /></div>
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCYlm3FVXvuDNq_LB7XOZBRPwAWDiwmaolCfcXN5VvX_G1hEPmfqPJ_76evPgqSrIVzfNQIIJMFJO-inBAqhOfQt5XEa0-QIlNZkvQVhnUky5mmK17hREhkey1k9xRzlzjOoQzPQ/s1600/20161204_090232.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCYlm3FVXvuDNq_LB7XOZBRPwAWDiwmaolCfcXN5VvX_G1hEPmfqPJ_76evPgqSrIVzfNQIIJMFJO-inBAqhOfQt5XEa0-QIlNZkvQVhnUky5mmK17hREhkey1k9xRzlzjOoQzPQ/s640/20161204_090232.jpg" width="640" /></a></div>
<br />
The river facing west showing the low flow in December.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhOIR-OGNkfiUFHZiW-z7JIByKHyCQyuT_DpMXzZ3n2jgTOjfc7f0xEjxLTsDq5twJnGe9jNIwLei1XmcqSXg6A_MnVtmSI4NI5CreVemmZL_NXRp8kH2C0ylaikg5xF3LxetgyCw/s1600/20161204_090748.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhOIR-OGNkfiUFHZiW-z7JIByKHyCQyuT_DpMXzZ3n2jgTOjfc7f0xEjxLTsDq5twJnGe9jNIwLei1XmcqSXg6A_MnVtmSI4NI5CreVemmZL_NXRp8kH2C0ylaikg5xF3LxetgyCw/s640/20161204_090748.jpg" width="640" /></a></div>
<br />
Beavers are under that log and made that small "dam" to the right. This is framed by the north 4 acres of the property across the river.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEher7eFQTWse6ZIOOX9ITU_xG6omTxDTs1eRcr91KwWcouKr41ax98_qRoBrTuMRoRN4jZRyBg55thNjLKZ-DWIuoRmJOari3en_mz5H_RYWq1eqlLQSjapYtiKjEAV6ZRWutI6pw/s1600/20161204_090923.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEher7eFQTWse6ZIOOX9ITU_xG6omTxDTs1eRcr91KwWcouKr41ax98_qRoBrTuMRoRN4jZRyBg55thNjLKZ-DWIuoRmJOari3en_mz5H_RYWq1eqlLQSjapYtiKjEAV6ZRWutI6pw/s640/20161204_090923.jpg" width="640" /></a></div>
<br />
Lots of reeds and birds.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9gfb3ss2_70wViv_ShJYGEK18z0NAnoL9G7fjdNEKIop24ZdOMfujeMHfhPjWHkMVmxs8Tx6dTrXoByiKabeA890BDi3GWymP40TK4-iCYEyre8h7HHO0XSR-cSYioT97kzy5aw/s1600/20161218_143958.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9gfb3ss2_70wViv_ShJYGEK18z0NAnoL9G7fjdNEKIop24ZdOMfujeMHfhPjWHkMVmxs8Tx6dTrXoByiKabeA890BDi3GWymP40TK4-iCYEyre8h7HHO0XSR-cSYioT97kzy5aw/s640/20161218_143958.jpg" width="640" /></a></div>
<br />
The bluff on the second parcel of land.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgL4C46g1bjDqmizM4Zw48CnYhZZIPo5J2zZnoSjJ7Z07NwwDYzn8rMoXxefhYfaiA3otP7sz5lvmB51jc-d0TLdBRyFqO-n7_EKv7ND0Kni0oqd3hLcn7BBl5gg4RlS_7llFO6AA/s1600/20170101_134022.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgL4C46g1bjDqmizM4Zw48CnYhZZIPo5J2zZnoSjJ7Z07NwwDYzn8rMoXxefhYfaiA3otP7sz5lvmB51jc-d0TLdBRyFqO-n7_EKv7ND0Kni0oqd3hLcn7BBl5gg4RlS_7llFO6AA/s640/20170101_134022.jpg" width="640" /></a></div>
<br />
Dead cottonwood. Some are more than a hundred years old.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
The 26 acres of bottom land from the top of the bluff<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjEEYrfMisTYMxFocK08DAIYN8I2z8uT-op53oVGipZWoRJoMEn6WLeOtIeKWhxnTfHhovgpx0rmKBphcsnRvbWja1N3N7csGdn7Rfn5zirHYNze0M42she37BBxNFDvTiVHjoCCA/s1600/View+from+the+Bluff.jpg" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjEEYrfMisTYMxFocK08DAIYN8I2z8uT-op53oVGipZWoRJoMEn6WLeOtIeKWhxnTfHhovgpx0rmKBphcsnRvbWja1N3N7csGdn7Rfn5zirHYNze0M42she37BBxNFDvTiVHjoCCA/s640/View+from+the+Bluff.jpg" width="640" /></a><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhIk9Sb3jNB9Wwp27XLNJ03TPDPEshfALkb7I_zmJgtQjf8UBtFR8lFguALGM8bNWIlL3J-MAej19JgVKzH2UzvnQGCm2Me1lSI_zHdySs6Yjo7thn0x1WqFIQCv93nKMLt0prMbQ/s1600/View+West.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhIk9Sb3jNB9Wwp27XLNJ03TPDPEshfALkb7I_zmJgtQjf8UBtFR8lFguALGM8bNWIlL3J-MAej19JgVKzH2UzvnQGCm2Me1lSI_zHdySs6Yjo7thn0x1WqFIQCv93nKMLt0prMbQ/s640/View+West.jpg" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
View to the west<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkEZfvg-OPADehKlCp9chbsWMVvVmzfeaDzx63LSDxQndRMJj01j1dX1J7yqcPG_meJo_L5d9TE577Vo9cE3LgN5Iucga2lXRrexMaDIPDTgKfjMxCRtFrz_4t8obbbMXv3LJIlA/s1600/View+North.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkEZfvg-OPADehKlCp9chbsWMVvVmzfeaDzx63LSDxQndRMJj01j1dX1J7yqcPG_meJo_L5d9TE577Vo9cE3LgN5Iucga2lXRrexMaDIPDTgKfjMxCRtFrz_4t8obbbMXv3LJIlA/s640/View+North.jpg" width="640" /></a></div>
<br />
View to the North<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhD4lZ7n6puiRoUxszsxBVttRpKZyofR2PV0EiyfzeH42d-JbEd3KlfKFmWi9ZwByfLtlH-822Z6zMZNAeMOKdIOUopAkbt6yjD_04G8at3oj_keojehRCS3_xvZrZOG8h6ghWGpg/s1600/View+East.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhD4lZ7n6puiRoUxszsxBVttRpKZyofR2PV0EiyfzeH42d-JbEd3KlfKFmWi9ZwByfLtlH-822Z6zMZNAeMOKdIOUopAkbt6yjD_04G8at3oj_keojehRCS3_xvZrZOG8h6ghWGpg/s640/View+East.jpg" width="640" /></a></div>
<br />
View to the East<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqOodHTPgeuY8I_a7RklDC7qd-hpSZMeOUWogJXJ0A8Gxzel7X61SJG7N_wZDCm1WWK0RWQ_JswhmhofOUy9woScL8viX3nSaLtxFNRCCgDCnDtH7oXHeg8f1kgwY1qdvGLc5_rw/s1600/Well+test+yielded+16+GPM.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqOodHTPgeuY8I_a7RklDC7qd-hpSZMeOUWogJXJ0A8Gxzel7X61SJG7N_wZDCm1WWK0RWQ_JswhmhofOUy9woScL8viX3nSaLtxFNRCCgDCnDtH7oXHeg8f1kgwY1qdvGLc5_rw/s640/Well+test+yielded+16+GPM.jpg" width="640" /></a></div>
<br />
The 50 acres of sloping ground is in the middle. Yes, it's quite bleak in winter, but that creates a contrast with the river-bottom cottonwoods in May. Stand by for updated pictures.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOg29g8y9poviAnE_iNfxmBT8CyV_CI4T0-aTr3dR6mZz9PoPF6szOIk2hLErdlopp4gtYW9Q51JnZdjtlJFkjUFzZeSJbmETxVRtq4oRDuQXiVz2BctCmdiL1FF1-4WhzRVlebw/s1600/Well+test+yields+16+GPM.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOg29g8y9poviAnE_iNfxmBT8CyV_CI4T0-aTr3dR6mZz9PoPF6szOIk2hLErdlopp4gtYW9Q51JnZdjtlJFkjUFzZeSJbmETxVRtq4oRDuQXiVz2BctCmdiL1FF1-4WhzRVlebw/s640/Well+test+yields+16+GPM.jpg" width="640" /></a></div>
<br />
That's the well test being done - good water, and lots of it. Also, that pole brings power to the center of the property, with transformers both below and on top of the bluff.<br />
<br />
What am I going to do with this property?<br />
<br />
That's my newest challenge, but it will probably include a place to live, and space to continue hosting Burner events and art development. Email me if you'd like to be involved with this project.<br />
<br />
Or visit our dedicated website or Facebook page:<br />
<br />
Mudbluff.org<br />
<br />
<a href="https://www.facebook.com/groups/303746580066581/?ref=bookmarks">https://www.facebook.com/groups/303746580066581/?ref=bookmarks</a><br />
<br />
<b><a href="https://blog.dangerranger.org/2020/05/18/a-river-runs-thru-it/" target="_blank">Here's a review of our progress from 2020:</a></b><br />
<br />
<br />Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com2Lahontan Dam, Fallon, NV 89406, USA39.4632482 -119.0669710999999913.941213700000002 -160.3755651 64.9852827 -77.75837709999999tag:blogger.com,1999:blog-23742979.post-15357828887207440172016-11-29T08:48:00.000-08:002016-11-29T08:48:07.483-08:00Rotate Your Second DisplayOriginally posted 04-24-13:<div>
<br /><div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhIrwCdi1uU_l-m8qlqnruD4enKUFCd3ii6QJ_oRr9MWopDFP3vRhyLA8jI_rgL4Y-WNh4cZxiacgy6lDLObXn1qJ_bH_L8_7V0tYQzUT316KlzaU9Pw8f41XNBsaIX9pEjpZHmMw/s1600/Vertical-2.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="480" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhIrwCdi1uU_l-m8qlqnruD4enKUFCd3ii6QJ_oRr9MWopDFP3vRhyLA8jI_rgL4Y-WNh4cZxiacgy6lDLObXn1qJ_bH_L8_7V0tYQzUT316KlzaU9Pw8f41XNBsaIX9pEjpZHmMw/s640/Vertical-2.jpg" width="640" /></a></div>
<div style="background-color: white; border: 0px; color: #666666; font-family: HelveticaNeue, "Helvetica Neue", Helvetica, Arial, sans-serif; font-size: 13px; margin-bottom: 20px; outline: none 0px; padding: 0px; vertical-align: baseline;">
<br /></div>
Have you noticed how web content and computer displays are pointed in different directions?<br /><br />If you open almost any webpage on a laptop or desktop computer you’ll see blank white bands along the sides. Even worse, you won’t see much vertical content. This is because most documents and web content scroll vertically, yet most displays are presented horizontally. And the sites that ARE formatted horizontally are so wide they are difficult to read. Why is this? <div>
<br /></div>
<div>
It’s because of the movies.<br /><br />Since movie and TV’s 1080p format, with its 16:9 aspect ratio, is now the sweet spot in volume display manufacturing, we have this strange situation where computer content does not match the display presentation.<br /><br />This is why you see people rotating their phones and tablets so often. Unfortunately, with laptop or desktop computers it’s a bit more clumsy to rotate the display. But the very thing that caused the problem (the volume price point) provides a reasonable solution – buy a second monitor and turn it sideways.<br /><br />This is actually easier than you might think. There are lots of optional vertical monitor stands, and most computers (including laptops) already have a second video port. So get a second display, turn it sideways, configure Windows screen settings, and poof! The problem is solved.<br /><br />Give it a try. It’s like having a whole new second computer.</div>
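<div>
If you prefer the command line (or run Linux rather than Windows), the same rotation can be scripted. This is only a sketch, assuming an X11 desktop with the standard xrandr tool; the output names (HDMI-1, eDP-1) are placeholders that vary by machine, so run xrandr by itself first to list yours:</div>

```shell
# List connected outputs and their current modes
# (output names like HDMI-1 or DP-2 vary by machine)
xrandr

# Rotate the second display into portrait orientation
# (use --rotate right for the other direction, --rotate normal to undo)
xrandr --output HDMI-1 --rotate left

# Optionally place the portrait display to the right of the laptop panel
xrandr --output HDMI-1 --right-of eDP-1
```
<div>
On Windows the equivalent lives in the display settings dialog: pick the second monitor and set its orientation to Portrait.</div>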
<div>
<br /></div>
<div>
And it's one that actually fits your work.</div>
</div>
</div>
Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-74888293096687232442016-09-26T11:25:00.000-07:002016-09-26T11:25:01.470-07:00GeoOrbital Quick Change Electric BikeHow about an electric bike solution that can be installed in 60 seconds?<br />
<br />
Weird music, but cool video and product:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe width="320" height="266" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/s22QKOM5Fao/0.jpg" src="https://www.youtube.com/embed/s22QKOM5Fao?feature=player_embedded" frameborder="0" allowfullscreen></iframe></div>
GeoOrbital Video<br />
<br />
<span style="color: #0000ee;"><b><u>GeoOrbital Quick Change Electric Bike</u></b></span><br />
<span style="color: #0000ee;"><b><u><br /></u></b></span>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0tag:blogger.com,1999:blog-23742979.post-75673920351247842832016-09-14T16:46:00.000-07:002016-09-15T07:54:20.533-07:00Electra Meccanica Solo - A New Standard in Personal Transportation?<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiKO4Mzeq915XXeM6awypK75rB5ztTyJvS79gYZvX8xAZ3AanxSCulpWKS5HnjOzt6UcJ9IUjJ3xRCcZhqJhueJ1Kkw4bNmsg2VHsjv5RJMnO5ZKwdMhcLlfKrsq6E8GT2PiakT_A/s1600/Solo.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="323" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiKO4Mzeq915XXeM6awypK75rB5ztTyJvS79gYZvX8xAZ3AanxSCulpWKS5HnjOzt6UcJ9IUjJ3xRCcZhqJhueJ1Kkw4bNmsg2VHsjv5RJMnO5ZKwdMhcLlfKrsq6E8GT2PiakT_A/s640/Solo.jpg" width="640" /></a></div>
<br />
With Toyota, Tesla and the rest, electric cars have done well in the last few years. But what about a low-cost practical solution to the typical short range commute? Has Electra Meccanica found that sweet spot in personal transportation at $15,000 and under 1000 lbs?<br />
<br />
I guess we'll be finding out in the next few months:<br />
<br />
<b><a href="https://electrameccanica.com/">Electra Meccanica</a></b><br />
<br />
<b><a href="http://www.topspeed.com/cars/others/2017-electra-meccanica-solo-ar174409.html">TopSpeed's Review</a></b>Sudden Disruptionhttp://www.blogger.com/profile/05159891861229551613noreply@blogger.com0