... seeking simple answers to complex problems, and in the process, disrupting the status quo in technology, art and neuroscience.

Friday, July 17, 2020

The Gnostic Neuron - A Simple Model of a Complex Brain

(Originally posted July 17, 2020)

"If you can't explain it simply, you don't understand it well enough." - Albert Einstein

I understand how the brain works. Well enough. I’m serious. More specifically, I have stumbled across the nature of knowledge, and now understand the first principle of the neuron:

Neurons literally create knowledge at the instant that they fire.

What does this even mean? How can biology create something as abstract as knowledge? This assertion calls for a clarification of knowledge, which I’ll provide in due course, but here’s a quick overview.

Knowledge is not the same as information. Information is objective. Knowledge is subjective. It’s relative to that specific neuron and only exists for a moment. Most importantly, knowledge is not stored as a “state” in the neuron. Knowledge is a sensitivity to a specific condition in the world. Knowledge is often not logical, but logic can be critical to its creation and ultimately, its stateless simulation. The nature of knowledge ranges from primal proto-knowledge to something approaching the truth. By degrees.

The point is, most neurons create knowledge, and most knowledge is created by neurons. It’s only the quality of this knowledge that varies, and varies widely. Once we begin to focus on what each neuron knows, and how this knowledge dynamically changes, we can begin to build a simple model of a complex brain.

This idea of neuron generated knowledge may at first seem comical, radical, bizarre, or worse - meaningless. That was my first impression too. It caused me to laugh out loud. My second impression was, could it be this simple? I couldn’t look away.

Over time this assertion has become more obvious, at least to me. Now it’s hard to see neurons as anything other than creators of knowledge. This concept changes how I understand the world. I now see knowledge everywhere, like the green letters cascading down the screens in the movie “The Matrix”. I see bits of knowledge coming together to form emergent insight and ultimately meaningful information. Am I delusional? Perhaps. But I see this knowledge cueing scripts of movement in the actions of those around me. I see neurons coming to know things, and altering the world everywhere I look. What I’m about to present is not some obscure technical or philosophical proof. It’s broad-based speculation with wide application in our everyday interactions with the world.

With this first principle of the neuron, the brain begins to make a lot more sense. Once I understood that neurons created knowledge, figuring out how this happened became a lot easier and revealed an architecture of the brain which is reflected in our language and culture. Words are literally the expression of this knowledge when neurons cue a script of muscle movement to create the sounds or writing associated with that knowledge. This idea demands the redefinition of philosophy, language, and even sheds light on the hard problem of the brain. But it helps to understand a simple version first. Later we can speculate about consciousness.

Yes, I realize how audacious this claim is, probably better than most. I’ve been casually working on this problem for decades, but more intensely over the last few years. I’ve now collected hundreds of pages of descriptions, notes, and references. Most of what I’m going to present is not original work. Even trying to identify and give credit to each idea is beyond my resources right now. For that, I apologize. As you might imagine, the detail is overwhelming without some kind of overview, so this summary will provide a few more assertions (which were informed by this model), then build from a single neuron up to the functioning of a simple brain. I will start with the most broad generalizations about philosophy, evolution and the brain before focusing on the neuron. If using this Gnostic model to inform general assertions seems like circular reasoning, it’s not. It’s merely circular presentation. Control of all animal life ultimately starts with the neuron. And once I get to the details of the model, so will I.

To keep things general, I will dispense with electrical descriptions, genetics, imaging, and most of the technical fields, at least for now. So what’s left? Philosophy, connection, chemistry, and language. Oh, and a bit of theory about evolution. But first I need to challenge some common assumptions and plant a seed of doubt about the limits of science.

The Missing Model of the Brain

I’ll start with this important question:

How can the most profound and studied object in the world be so poorly understood?

I’m of course talking about the brain. And “profound” is an understatement. Without our brain, nothing else matters. Without your brain, there is no you. Our brain creates our reality, and mediates our interaction with the world. We are our brains. This view of the brain is not new. In the 4th century BC, Hippocrates expressed it surprisingly well:

“Men ought to know that from nothing else but the brain comes joy, delights, laughter, and sports, and sorrows, griefs, despondency, and lamentations. And by this, in an especial manner, we acquire wisdom and knowledge, and see and hear and know what are foul and what are fair, what are bad and what are good, what are sweet, and what are unsavory. ... And by the same organ we become mad and delirious, and fears and terrors assail us. ... All these things we endure from the brain. ...In these ways I am of the opinion that the brain exercises the greatest power in the man.” - On the Sacred Disease

In spite of all that has been learned since, this ancient and intuitive summary remains one of the best and most concise descriptions of how we experience our brains. Not only does “the brain exercise the greatest power in the man”, but in everything every man has ever done. Pick a topic. As you think about it, your brain informs your understanding. If you act on your thoughts, it’s your brain that has ultimate control. You cannot think about, nor do, anything that does not involve your brain. You cannot be without your brain.

And that’s just your brain. And that’s just right now. While subjectively critical, most of our individual brains will have little impact on the world at large. But collectively, all the brains that have ever existed have literally controlled everything that has ever been done. Our brain is the key to all relationships, politics, economics, science, art, and philosophy. Yes, profound without doubt.

As for studied, has any other object gotten more attention, especially in the last few decades? Does any other intellectual challenge present as much data? And has any other effort yielded fewer conclusions? We’ve had a “Decade of the Brain”, a “New Century of the Brain”, and have even treated the brain like a “moon shot” with Obama’s BRAIN Initiative. Yet none of this rhetoric mattered. We still don’t have a useful model. Is the brain too complicated for the mind to comprehend? Unlikely, but let’s take a closer look.

The complexity of the brain is astounding. You’ve probably heard the quantifications. Each of our brains has trillions of connections between billions of neurons, monitoring millions of sensors, all to control thousands of movements using hundreds of muscles for one primary purpose - survival. The number of possible combinations and behaviors is greater than the number of atoms in the universe. And that’s just one brain. Each seems to be a little different. And each is constantly changing.

Neuroanatomy has taken the brain apart and reduced it to components. Much of the brain has been mapped by function, at least in a macro sense. But when we look closer, these “areas” and other brain “components” have few clear boundaries. Most are fuzzy at the edges where millions of fibers deliver signals from one part to another.

With heroic effort involving injury and death, various functions have been attributed to various lumps, gyri, and sulci, but this localization is only by degrees. If we try to get specific about what exactly happens where, exception becomes the rule, and rules become the exception. Brain function appears to be both localized and distributed at the same time. It seems a paradox.

Also, most of the brain is clearly divided left and right. These two halves are only connected in the center, bottom and back, and then only by degrees. Even the cerebellum has definite bilateral symmetry, if not functional division in its structure and operation. So is the brain divided? Or unified? The answer is yes. Without question.

It’s not just the brain that’s complex, it’s also the neuron. In the nano context, we’ve collected an astounding amount of data involving types of neurons, neurotransmitters, glial structures, genetics, and of course, nano, micro and macro chemistry, each with their own scope. We understand how the neuron creates a signal but not what that signal means. We have a clear understanding of how all of this happens, but not exactly why. At least not in the nano-context. Yet.

As we zoom back out conceptually, various groups of neurons “project” their axons from one area to another. Some detailed connections have been mapped, but between the nano level and the macro level, most of the micro connectome remains in shadow. Should neuronal function be associated with the location of their cell bodies and dendrites, or the majority of their axon terminations? Specifically, what connects to where? And why?

The Biology of Behavior

Even more challenging than neurophysiology and chemistry is characterizing function. The brain is where sensory input gets converted into muscle movement. We define this as behavior. This behavioral database includes all animal life, but even limiting it to human history, it’s still overwhelming in scale and content. Why did Socrates drink the poison? How did Henry II envision our legal code? What drove Mao Tse-Tung to cause the death of tens of millions of his own people?

Generalize from a trillion behaviors then apply them to yourself. Why do you do each thing you do each day? Your behavior is far from random, but its true course can be difficult to divine, and at the same time, easy to rationalize. Behavior ranges from obvious to enigmatic, with no clear boundaries from one motive to the next, much like the physiology of the brain itself. And that’s an important clue. Function follows form. And vice versa.

As individuals, we each have an introspective experience. It’s our own private view of the brain from the inside. Much of the time our behavior seems reasonable and organized. But is it? How many times per day are you surprised by your own behavior? Think carefully. True self-awareness is less common than you might realize. Where might these surprising thoughts and actions come from? How much of our thinking is conscious? How much is hidden in layers of mystery even from our subjective experience?

Scaling outward, how do your behaviors contrast and conform to those around you? And those more distant? Plus, each brain is changing dynamically from moment to moment. Repeating psychological experiments on the same subject often yields different results. Consistency is elusive as recent brain imaging studies have shown. The brain is plastic by degrees, and in critical phases. So is the resulting behavior.

Multiply these behaviors by the billions of creatures and all the people that have ever lived. Now correlate it with what we know about neuroanatomy, chemistry, genetics, neuroscience and all the other academic fields we’ve enlisted in this effort. Generalizing from such a broad and changing base of information is like trying to nail an ocean of Jello to an infinite moving wall. What goes where? And why?

And yet, the brain is not random. As Hippocrates noted, all behaviors flow from within the skull. So far we have nothing to disprove his observation. We’re left with billions of neurons doing mysterious things to yield trillions of complex behaviors. In short, the brain is a tangle, a modern Gordian Knot, and perhaps just as difficult to unravel. Or in the case of the brain, understand.

If you’re not familiar with the Gordian Knot, it’s a parable about a very large and complex ball of rope with one loop attached to an ox cart. It was said that whoever could unravel this knot and uncouple the cart would come to rule all of Asia (the knot was kept at Gordium, in Phrygia, in modern-day Turkey). This royal test was much like the Sword in the Stone, except the challenge was this tangle. When Alexander the Great encountered this ball of rope he simply drew his sword and cut the loop. Then he took the kingdom. By force.

Though both the Sword in the Stone and the Gordian Knot crowned a king, you might be tempted to presume Alex cheated. If you require the solution to conform to the spirit of the problem as presented, you’d be correct. But it could also be argued that Alex was just thinking outside the box. Or that might makes right. The story contains several possible lessons depending upon your values, sensibilities, and perspective. And that’s the point. It’s only one example of our mind entertaining multiple ways of looking at a problem. And its solutions. That’s another important hint, but for now the concept of the Gordian Knot is a useful way to encapsulate the mystery of the brain. Let’s get back to our missing model.

Intuitive Modeling

“A theory can be proved by experiment; but no path leads from experiment to the birth of a theory.” - Albert Einstein

Open any book about neuroscience. Usually within the first few pages will be some disclaimer about the lack of a useful brain model.

Jeff Hawkins of the Redwood Center for Theoretical Neuroscience put it concisely in his 2004 TED talk about his book, “On Intelligence”:

"We have tons of data and very little theory." To drive home this deficit, Jeff then offered a quote from decades before by Nobel laureate Francis Crick, "We don't even have a framework."

Not even a framework? Well, this is embarrassing. Why all the intellectual abdication? Some generalizations are certainly more probable than others, even if extraordinarily broad. Or completely wrong. Even error tends to invoke a useful counterpoint. Where are our sweeping generalizations about the brain? We need a fresh perspective. We need a new approach. We need a radical idea to break this logical logjam of data. For this challenge, some wild speculation would be better than none at all.

Ironically, modeling is one of the things the brain does best. We model the world constantly. We can’t help it. This modeling ranges from casual and even subconscious to formal, detailed and explicit. The most useful models may even become detailed simulations.

We model the actions of other people to predict their behavior, often without realizing it. This is known as theory of mind. Other models are conscious but still casual. Their complexity ranges from sparse to rich depending upon how much attention we pay to each topic.

For instance, you may know more about psychology than I, or the reasons for the subtle coloring of a rose, but I likely know more about how a computer works. It’s what each of us attend to that allows us to populate our respective models of the world.

Using associative maps, allegory, and metaphor, we model the tools we use, the work we do, and the places we live, all to great advantage. But it’s still a different experience and a different advantage for each of us. You think about things I dismiss. And vice versa. We each have our casual models of the world, and our model of the brain is part of that world, casual or not.

Since you’re reading this, you likely have at least a fuzzy default model of the brain. It may involve the concept of hard-wired “circuits”, programming new habits, or just processing your thoughts. Each of these is a tech metaphor. Or perhaps your model might involve more explicit concepts of electronics, genetics or imaging. Again, each is a field of technology. For decades, my casual model of the brain focused on chemistry, ionics, and logic, yet my model was never viscerally satisfying. The closer I looked, the clearer it became that the brain has more in contrast with technology than in common with it. I came to realize these default metaphors and fields of study were distorting my thinking.

Blinded by Science?

Today, billions of dollars are being spent to understand this slippery object. As noted, the 1990s were declared the “Decade of the Brain.” That decade produced yet another tsunami of data, but again, few conclusions. We’re now well into the new millennium and we still don’t have a useful model of the brain. Consider the title of a more recent piece from Ed Boyden, who leads brain investigation at the MIT Media Lab:

How the Brain Is Computing the Mind

Despite the title, Ed explains very little about how the brain works, though he does acknowledge the challenge, and provides this same important clue - why would Ed presume the brain might “compute” the mind? And it’s not just Ed. Various forms of computer thinking remain our default metaphor for the brain in spite of its poor fit.

The contrasts between the brain and computer have been well known for decades. Nobel laureate Gerald Edelman effectively challenged the computer metaphor in several of his books. Yet this tech approach continues to guide most of the effort, and consume most of the resources.

If we think of the brain as a computer, it follows that neurons somehow represent state machines, conforming to information theory. This is not the case. If the brain were some kind of computer, we would expect it to be fast, digital, synchronous, serial, hard-wired and deterministic in its operation.

The brain is the very opposite in each of these major aspects. It’s relatively slow, surprisingly analog, mostly asynchronous, profoundly parallel, and quite plastic. Instead of consistent answers, the brain often yields an indeterminate result in a very un-computer-like fashion. But it’s not just the computer metaphor that causes problems. It’s science itself. Let’s get back to Ed Boyden:

“The reason is that the brain is such a mess and it’s so complicated, we don’t know for sure which technologies and which strategies and which ideas are going to be the very best.“

The very best? How about any at all?

Again, Ed leads with “technologies.” Why would we expect the brain to be understood in terms of technology? The brain certainly didn’t come out of a factory. The brain evolved. And yet technology has been the prime modeling strategy for most of recorded history.

The brain has in turn been compared with aqueducts, the clock, the telegraph, the telephone, steam power, the computer, and finally, the internet. Each was the most advanced technology of its time. Some are now trying to understand the brain in terms of superconducting quantum computation. Though complex, I doubt the brain’s operation is quite that exotic. Or technical. And it gets worse.

It’s almost as if science itself has become our latest “new” technology. The first test of science is consistency. Though not random, the brain’s operation is often not consistent. This is a major challenge for science, and perhaps one reason for our missing model. Science requires that experiments produce repeatable results. The brain violates this with impunity, switching from one answer to another as it dynamically tunes itself to its environment. Hypocrisy is common in human behavior. When we overlay technical metaphors, things get worse. Soon these metaphors are steeped in rationalization and confusion, when the true test of any model or metaphor is simplicity and utility. We need a way to break up this technical logjam.

Simulating Flight

The technical approach to understanding the brain is like deconstructing a Boeing to understand how a bird flies, and just as useful. The Boeing applies brute force in a crude fashion, but also goes faster. The bird’s organic approach to flight is far more subtle and elegant. Which is better? Neither. Each has advantages depending upon load, speed and maneuverability.

This tech/organic contrast is not limited to the skies. Something similar has happened on land and on the seas. The wheel forever changed how we travel. It allows for greater speed, load and distance at the cost of flexibility. But not always. A bicycle is a hybrid in this tension of tech versus organic. It allows human muscle to achieve greater speed compared to running, and also greater efficiency than any other application of the wheel.

At sea, a similar dichotomy exists with a similar hybrid solution. Power boats will get you there faster using brute force, and swimming is elegant but quite limiting. Sailing applies the best of both worlds when speed and capacity are critical requirements.

Learning to fly provides another useful comparison when trying to find a useful model of the brain. Only a little over a century ago the consensus was that man would never fly. By December of 1903, Samuel Langley had spent much of the Smithsonian’s resources, plus $50,000 from the U.S. War Department, attempting manned flight using the brute force of a heavy engine and fixed flight surfaces. We might describe this as the technical approach to flight. His Aerodrome crashed into the Potomac, and the pilot was lucky to be pulled from the wreckage alive. After decades of effort and a literal fortune in government funding, Langley gave up.

The New York Times punctuated this failure by publishing an op-ed stating that man would never fly. A week later this assertion was proved wrong. With more modest funding, the Wright brothers applied the hybrid bicycle approach to flight, with its human-balanced control and lighter gasoline engine, and finally took to the air.

The point is, whether you wish to travel on land, sea or sky, solutions range from organic to technical. Organic is more subtle and efficient. The technical approach applies more power and speed, and succeeds in a different way.

This challenge of a brain model is similar. Electronic computers simulate the world using complex systems operating at the speed of light, but that’s just one way to simulate the world. There is another. The more organic approach, which the brain uses, is much more subtle and elegant. And in many ways, far more effective. Especially when survival is involved. What is the nature of this more organic simulation? What is the brain really doing?

The Gnostic Neuron

Here’s one final example of our missing brain model. It’s the opening line from the issue dedicated to the brain from Scientific American in 2014, The New Century of the Brain:

“Despite a century of sustained research, brain scientists remain ignorant of the workings of the three-pound organ that is the seat of all conscious human activity.”

Pessimistically, the article then immediately cites an interesting discovery as just another mysterious loop in our Gordian Knot, another stressed stick in our logjam:

“... the discovery of a “Jennifer Aniston neuron” was something like a message from aliens, a sign of intelligent life in the universe but without any indication about the meaning of the transmission. We are still completely ignorant of how the pulsing electrical activity of that neuron influences our ability to recognize Aniston’s face and then relate it to a clip from the television show Friends. For the brain to recognize the star, it probably has to activate a large ensemble of neurons, all communicating using a neural code that we have yet to decipher.”

If you haven’t heard about the “Jennifer Aniston neuron”, here’s a quick summary of this remarkable work from UCLA in 2005. It began when a patient was being prepared for brain surgery to treat epilepsy. As part of that process, selected neurons were monitored while the subject was shown photos of various places, people, and things. In this case a neuron was discovered that fired when the patient was shown a picture of Jennifer Aniston. Even more remarkably, that same neuron fired no matter how Jennifer Aniston was presented: her spoken name, her written name, her photograph, or another likeness. All of them worked, as long as the cue captured some essence of Jennifer.

This discovery nicely demonstrates a “gnostic” neuron, or “grandmother cell”. These are also known as “concept” cells, an idea that started as a joke in a neuroscience lecture in the late 1960s. Yet this was no joke. This was real, and has yet to be effectively challenged. The results were independently verified when “Luke Skywalker,” “Bill Clinton,” and “Halle Berry” neurons were found in other epilepsy patients. There are many other examples, and they all demonstrate invariant knowledge by firing when said subjects were detected in ANY form.

The idea of a gnostic neuron is philosophically profound, literally, the expression of knowledge taking the form in this case of Jennifer Aniston. This neuron recognized that one specific person out of thousands of people the subject experienced in her life. That’s an impressive trick. How did this neuron come by this knowledge? And what significance does it have in unraveling our Gordian Knot? In breaking through this logjam of data?

I’ve included the above pessimistic assessment of the Jennifer Aniston discovery because I reached the very opposite conclusion. For me, this gnostic neuron was not a message from aliens. It was a critical hint. The moment I read about the Jennifer Aniston neuron I literally stopped in mid bite. I was eating lunch. The moment remains vivid.

Knowledge is the key to philosophy, or at least its object of love. Knowing the nature of things is also the practice of Zen. What is the nature of the brain, or its most obvious component, the neuron? What is the first principle of the neuron? What is the Zen of brain architecture?

As a computer architect, I’ve had a lifelong professional interest in what computers have in common, and in contrast, with the brain. Knowledge is similar to information, but not the same thing. Like many other technologists, not only had I been mischaracterizing the neuron, I’d also been mischaracterizing knowledge. I’d spent decades analyzing neurons as logic devices, trying to understand what kind of systems these neurons might form, or how to “code” memory as suggested above. But like so many other technologists, I had the wrong perspective.

In that instant, for me, the problem changed. Instead of dismissing this result as a message from aliens, I began to wonder what all the other neurons “knew”, and how they came to know it.

In that moment, the neuron ceased being a slippery state machine and became associated with acquiring knowledge. I began researching what it might mean to “know” something, and how a neuron might perform this amazing trick.

(more shortly)

Tuesday, July 07, 2020

COVID-19 - the Zendemic Wrapped in Toilet Paper

Here's a radical idea. I want it in the public record as of March 14th, 2020.

What if COVID-19 is actually not as deadly as it seems?

What if we're undercounting the actual cases, and over attributing the deaths caused by it?

OK, let me put that a little differently. What if COVID-19 on its own only rarely causes death? I'm deadly serious. What if COVID-19 is actually a relatively mild biological challenge to a normal healthy human, similar to its cousin the common cold. OK, maybe a cold that kills some people. But would the numbers of diagnosed cases of COVID-19, and the deaths "caused" by it, look any different than they do now?

As obvious evidence, why such inconsistent numbers from country to country? The disease is the common element. The variance must reflect standards for data capture or perhaps the demographics or lifestyle of the patients. Let's address data first.

When and why does a case become a case? While we’re questioning, where are the useful comparative data? Out of a thousand people without confounding issues, how many will die per age group? And why do we have no random-sample control groups to track overall transmission rates and deaths, instead of guessing about the denominator? This denominator problem is best illustrated by the difference in the death rate between China (9.9%) and Korea (0.7%). Korea did more testing, and so has a more useful denominator. Another way of looking at it is that China tested only those who already had a serious case; they didn’t bother to test mild ones. This lack of testing is happening in the U.S. as well. At least so far. I realize there’s a priority for tests being used to track individual contact and transmission, but a baseline of periodically sampled control groups would be of great value in learning how the disease is evolving in a given population.
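To make the denominator problem concrete, here’s a minimal sketch in Python. The infection and death counts are purely illustrative assumptions, chosen only to show how testing coverage alone can swing the apparent fatality rate between figures like China’s 9.9% and Korea’s 0.7%:

```python
# A minimal sketch of the denominator problem. All numbers here are
# illustrative assumptions, not real epidemiological data.

def case_fatality_rate(deaths, confirmed_cases):
    """Naive CFR: deaths divided by *confirmed* (tested) cases."""
    return deaths / confirmed_cases

# Suppose the same true situation in two countries:
true_infections = 100_000
deaths = 700

# Country A tests only severe cases, catching 7% of infections.
# Country B tests broadly, catching 95% of infections.
cfr_a = case_fatality_rate(deaths, true_infections * 0.07)
cfr_b = case_fatality_rate(deaths, true_infections * 0.95)

print(f"Severe-only testing: {cfr_a:.1%}")  # 10.0%
print(f"Broad testing:       {cfr_b:.1%}")  # 0.7%
```

Same disease, same deaths; only the denominator changed.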

Now for lifestyle and demographics. What is the general health of the population? Compare Italy (7.9%) and Germany (0.3%). Are Germans that much healthier than the Italians? Or is this also confounded by the denominator issue in Italy? We’ll know in the long term.

Also, why does China now have so few new cases and deaths? Were they THAT good at stopping transmission? It’s hard to believe China effectively isolated a hundred thousand from the other 1.4 billion. And did it without exception. If this disease is so contagious, China should be keeping its early lead in both cases and deaths. They obviously aren’t. Or else they aren’t reporting it.

Which brings us to the problem of the skewed death demographic (65-plus years old, immuno-challenged, etc.). This demographic is extraordinary for a deadly disease, but perhaps not for an ordinary cold. It’s clear that most of the deaths are those over 65 years old, but what percent of those 65 and older who contract COVID-19 die? And then there is the near absence of deaths among children. Why?

Normally, about 150,000 people die around the world each day. That’s about 20 people per day per million. The most convincing data will be when deaths exceed 20 people per day, per million. So far, COVID-19 has only added another 633 people per day in the United States, a rate of only about two per day per million. Or is it even this high? How many of those 633 would have died from other causes within 24 hours? It seems that COVID-19 might be taking the blame for a normal death rate in a typical winter. Or at the very least, taking the blame for far more death than it deserves.
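The back-of-the-envelope arithmetic above can be checked in a few lines. The population figures are rough assumptions (about 7.8 billion worldwide and about 330 million in the U.S. in 2020); the death counts are the ones quoted in the text:

```python
# Checking the baseline death-rate arithmetic from the text.
# Population figures are rough 2020 assumptions.

world_pop_millions = 7_800     # ~7.8 billion people
us_pop_millions = 330          # ~330 million people

world_deaths_per_day = 150_000          # all causes, per the text
baseline_rate = world_deaths_per_day / world_pop_millions
print(f"Baseline: {baseline_rate:.1f} deaths/day per million")    # ~19.2

us_covid_deaths_per_day = 633           # the figure quoted in the text
covid_rate = us_covid_deaths_per_day / us_pop_millions
print(f"U.S. COVID-19: {covid_rate:.1f} deaths/day per million")  # ~1.9
```

So the baseline works out to roughly 20 per day per million, and the quoted U.S. COVID-19 figure to roughly two, as the text states.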

With nearly eight billion people, at any given moment there are thousands of people in the world on the edge of death. Sad but true. A simple cold or flu can push some of them over that edge. What then is the cause of that death? Their pre-existing condition? Or the most recently diagnosed cold or flu? What if this COVID-19 event is largely an attribution artifact? What if they simply die a bit earlier of additional COVID-19 biological stress. Which disease or chronic condition should get the credit?

If we didn't know that the COVID-19 virus existed, would these deaths be blamed on other causes? Would they even be seen as abnormal? Is COVID-19 simply an artifact of an improved technical ability to measure a new disease? And to publish the results in the media instantly?

Then there is the toilet paper thing. If you haven't realized it yet, there is no "real" shortage of toilet paper, just people hoarding it. It's a self-fulfilling prophecy. This happened once in the 1970s. I remember it well. The same thing happened with gasoline at about the same time. Here's why it matters. 

A run on TP is similar to a run on medical services. If you hear that there is a new disease, you might be just a bit more likely to go to the hospital and get tested. When the result is positive - boom - they isolate you and fill up a bed. Soon someone else comes in with a positive test and our hypersensitive medical system responds. Even a small shift in demand can overwhelm this medical system. Soon the hospital's full and there's an "epidemic". Hyper-analysis of this epidemic will find a correlation with whatever version of cold or flu that happens to have emerged during the season. In this case, that disease might be COVID-19. And the media runs with it. Panic ensues.

Is COVID-19 the first actual media disease not unlike this run on toilet paper?

If so, this Zendemic will resolve quickly, no more than a few weeks. Otherwise, deaths will exceed the typical 150,000 per day for months on end. So far it hasn't, but we will know soon.

Habeas corpus.

03-18-20 The picture is becoming more clear.

03-25-20 What is coronavirus – and what is the mortality rate?

The above article finally addresses some of the questions I presented above. Well, sort of. For instance, I noted and questioned, "It's clear that most of the deaths are those over 65 years old, but what percentage of those 65 and older who contract COVID-19 die?"

Though I didn't use 80 years old to define my question, that age nicely frames the issue and makes my point. I might have said: 90 percent of those who die are over 80 years old, but what percentage of those 80 and older who contract COVID-19 die?

Their answer - 10%.

So if all 80-year-olds could be exposed to COVID-19 (which is impossible), how many would die? Google says there are about three million. Normally, about 300,000 of them will die each year (at a linear rate). That is a useful baseline, and also the estimate The Guardian makes for COVID-19. Which was my original point. Of course, the final count could be greater, but not by orders of magnitude, and likely well under 50 percent greater.
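The comparison above can be sketched directly. The three-million figure, the 10% fatality rate, and the 300,000 baseline annual deaths are all taken from the text; this is a worst-case bound, since exposing every 80-year-old is, as noted, impossible:

```python
# Worst-case deaths among 80-year-olds if all were infected, compared
# with that group's normal annual baseline (figures from the text).

over_80 = 3_000_000
fatality_rate = 0.10          # The Guardian's 10% figure for this age group
baseline_annual_deaths = 300_000

worst_case_deaths = over_80 * fatality_rate
print(worst_case_deaths)                           # 300000.0
print(worst_case_deaths / baseline_annual_deaths)  # 1.0, i.e. one baseline year
```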

So the question becomes: how much economic damage do we impose on eight billion people for any excess deaths over 300,000?

Actually, I think this has been a good test run for a bug ten or even a thousand times worse that may occur next year. Or the year after that. But not yet. COVID-19 is not the black plague. Not even close.

03-27-20 I posted the following to a friend's Facebook feed:

Justin is correct. The number of deaths in America so far attributed to COVID-19 is so low it gets lost in the noise of the typical death rate from respiratory failure, which is around 500 per day in America, or 1.5 deaths per day per million. But that's just the view from the top, and it ignores the denominator problem: how many died per day, out of what size population? Even though COVID-19 has been declared a pandemic, it remains epicentric, meaning most of the deaths occur in hot spots like Wuhan, Milan and New York. What is the size of each of those exposed populations? We don't have good numbers yet, but we can use China as an example. As shown there, the ultimate impact will be far less than the media currently suggests. So far, the sky is not actually falling, and is unlikely to do so.

03-29-20 another Facebook comment:

Bruce, over 6,000 people in America die every DAY for one reason or another. That's 180,000 in the month or so that we've been keeping count of COVID-19. Now, many of those deaths are from accidents, etc., but a large number are from chronic conditions, many of which are conflated with COVID-19 because that is the current proximate cause of death. In only a few of your 2,043 cases is COVID-19 the clear and direct cause of death. In 2009 hundreds of thousands died from swine flu. Or did they? Like COVID-19, many of those deaths had respiratory and other comorbid factors as well. Yes, it's sad, but the reality is, various diseases ripple through our population each winter, bringing early death to hundreds of thousands who might have lived a few more days, weeks or months. Only a small minority would have lived for years longer. I'm not suggesting that COVID-19 isn't deadly, and we of course should try to avoid its spread, or at least slow it down. I think this exercise is good practice for when we get a really bad bug like Ebola, but let's try to keep these numbers (and causes of death) in context. So far this is no worse than a bad flu, and if China, South Korea, and Germany are useful examples, it will end about the same way within a few weeks. "If you can keep your head when all about you are losing theirs...yours is the Earth and everything that's in it..." - Rudyard Kipling.

The test described below may well be the turning point in this biological mystery. Sure, the Abbott ID NOW test is quick and simple, and it will be used a lot. But more importantly, and for the first time, it will be possible to do large random-sample testing over various large populations. This should solve the "denominator" problem. With that information at hand, analysis and local triage and isolation become manageable. The rest is just implementation. Check it out:

Why Abbott's 5-minute COVID test could be a game-changer

04-03-20  Corona WorldOMeter

Look carefully at the curves for each country (or even the world as a whole). These curves are not geometric (becoming ever steeper). Instead, they are flattening. These are pretty typical two-dimensional propagation curves. They are like a forest fire that only burns the weaker trees. This bug is harvesting those with significant comorbid factors in their health.

Yes, some are dying weeks or months before they might have, but most would have died sometime this year. COVID-19 will ultimately kill about the same number as the flu does each year, and in many cases, the very same people. Their death will just be attributed to a different disease. This event is more about a panicked media than a biological challenge. Callous? Of course. But with increasing reports of bankruptcies, domestic abuse, murder, and suicide, there is serious doubt about this disease being worse than the cure. Still, it's a useful dress rehearsal for a much worse bug in the future, and much good will ultimately come from this event.

04-06-20 Fever Map Indicates Dramatic Drop in Temperature

Kinsa Source Health Map

After working with this map for a while, I've jumped to the conclusion that this may be the most useful data so far on this whole COVID-19 issue. OK, temperature spikes do not equal COVID-19, but when these spikes correlate with jurisdictions showing spiking COVID-19 cases and deaths, the probability shifts dramatically in favor of these temperature spikes being caused by COVID-19. If this is true, then we should see not just a flattening of the curve, but a dramatic drop in new cases within days, or at most a very few weeks.

How COVID-19 affects humans

04-07-20 It appears in the graph below that about 200 deaths per day in the U.S. may have been misattributed to COVID-19 instead of all other causes of pneumonia. Of course, this is only one of the many comorbidity factors widely associated with this pandemic. If the other factors are added in, it might account for most of the current 1400 deaths per day, except for the geographical distribution of the dead. They are not evenly distributed across the population. They are epicentric in nature, especially in NY and NJ. Yes, there is misattribution but it likely only accounts for a fraction of the cases. The rest must be from the direct biological impact of COVID-19. It IS a real disease. We just don't yet have it well characterized.

04-09-20 Misattribution?

04-13-20 Shit may be the breakthrough we need to solve the denominator problem. Not familiar with the issue? The Worldometer currently says we have 1,872,825 cases of coronavirus, which has caused 116,037 deaths worldwide. That's a death rate of over six percent, which is patently absurd. If this disease were really killing six percent of those who contract it, it would be three times worse than the Spanish Flu, and that's simply not the case. There must be FAR more cases than have been documented, which would change the denominator in the death rate. This work with sewer sludge may ironically clarify our understanding. Next, we need to take on the misattribution issue, and the real scale of the Corona threat will come into focus:

New research examines wastewater to detect community spread of Covid-19

OK, I want to be clear. Corona IS a deadly disease, but only by degrees, and with extremely disproportionate targets. Here is a subgroup I just read about: a rest home in New Jersey with about 700 rooms, which means a staff of about 70 per industry averages. At this home, 70 residents and two nurses died with a positive Corona test. That sample is consistent with the Diamond Princess, where 700 tested positive and 10 died, many of them older passengers. In both cases, roughly 10% of the elderly and 2% of the younger (but perhaps not completely healthy) died. This data is a place to start.

04-19-20 A New Statistic Reveals Why America’s COVID-19 Numbers Are Flat

"According to the Tracking Project’s figures, nearly one in five people who get tested for the coronavirus in the United States is found to have it. In other words, the country has what is called a “test-positivity rate” of nearly 20 percent."
This is the first decent "denominator" data I've seen so far. If this 20% number is correct, then Corona, with currently 39,000 deaths in the United States, has a death rate of about 0.06 percent, which is less deadly than the typical flu. Then you have the issue of undercounting because of the lack of tests, and the misattribution issue, which would effectively overcount the dead. All of this new data seems to be homing in on my original assertion that Corona is not nearly as deadly as the media has presented.
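The implied death rate above can be sketched as follows. Note the strong assumption being made (it is the one in the paragraph, not an established fact): the ~20% test-positivity rate is treated as a proxy for the infected share of the entire U.S. population (assumed here to be ~330 million), even though people who get tested are not a random sample:

```python
# Implied death rate if 20% test positivity is read as 20% of the
# whole population infected (the assumption behind the 0.06% figure).

us_population = 330_000_000   # assumed, not stated in the text
positivity_rate = 0.20
deaths = 39_000

implied_infected = us_population * positivity_rate  # 66,000,000
implied_death_rate = deaths / implied_infected
print(f"{implied_death_rate:.2%}")  # 0.06%
```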

04-21-20 Both Santa Clara and Los Angeles counties now have studies showing that from 20 to 80 TIMES more people have positive Corona antibody tests than have been diagnosed, which is consistent with the "Tracking Project" above. Again, this would mean Corona's CFR is comparable to a typical flu's. Where are the numbers from the rest of the country? And why is this topic not being addressed in the daily briefing?

Hundreds Of Thousands In LA Infected With Coronavirus: Study

04-23-20 Governor Cuomo just announced that 21% of New York City residents have a positive antibody test for Corona, yet he does not acknowledge what this means for the CFR. Our government has been grossly negligent in managing this "pandemic" and its metrics.

04-26-20 Santa Clara, Los Angeles, New York, and Miami are all reporting positive antibody tests many TIMES in excess of reported diagnosed cases - more than an order of magnitude greater. It's time to reassess the nature of Corona.

Miami Joins the Crowd

I won't bother with the bug's official name, COVID-19, anymore. This disease has had so much worldwide impact that it will forever be known as the Corona panic. Or perhaps ultimately remembered as a media disease instead of a biological one. The published perception has been FAR worse than the reality - somewhere between one and two orders of magnitude. Even if this media impact was mostly not deadly (suicide and murder stats will likely show an increase), the financial costs will be enormous, perhaps incalculable.

As for the disease itself, if these antibody tests are ultimately validated, then as I originally suggested at the beginning of this post, Corona will not be remembered as a deadly disease, at least not in the same terms as Ebola or HIV, and certainly not on the scale originally feared. It will take years to sort out the misattribution and discover Corona's true death rate.

Also, the cost of the "cure" will far exceed the social impact of the relatively modest number of dead. In terms of death rate, Corona will likely fall somewhere between an average flu and, perhaps, the swine flu, but in its ability to spread, it will be closer to the common cold, which spreads far more readily than the flu.

I will let others with far more knowledge present the details, but it's safe to say that our media and government response to Corona cannot be rationalized or supported when Corona is ultimately compared to the flu and our historical response to that and other diseases. Still, this panic response has been quite informative and an interesting exercise, even though VERY expensive.

05-04-20 Here's another useful approach to understanding the true impact of Corona:

Excess Deaths Associated with COVID-19


Here's a very quick summary. Corona is both a deadly disease and a common cold, each by degrees and depending on conditions, once two issues are resolved: the denominator problem and the misattribution of causes of death, both overstated and understated for various reasons.

So far it appears that somewhere between two and twenty percent of America has antibodies for this disease. A good guess might be that about 30 million Americans have already contracted and survived this disease. And that's its most important metric. It means this disease is not very deadly, perhaps not much worse than a bad case of the flu. Characterizing its mild cases should be relatively easy. Understanding how it kills could be much more difficult, as most of those deaths are mired in comorbidities, and teasing apart cause from correlation will be difficult. With 30% of the deaths occurring in rest homes, Corona will soon be largely managed as another disease of the elderly, while most of the world gets back to work.

The important question is, how much deeper than 10% will this disease penetrate the U.S. population? And how many more will die before this immune base begins to impact the transmission rate?

05-26-20 Misattribution remains a mess:

Beating Up the Numbers:

One example of misattribution:

06-01-20 It will take a while before we learn the truth about Corona, but there are a few conclusions that can be drawn now:

Likely beginning in late 2019 Americans began transmitting COVID-19 without even knowing it.

By June 1st, 2020, between two and twenty percent of those living in large U.S. cities had contracted and recovered from Corona without ever knowing it. Somewhat less than one percent had symptoms acute enough to be tested. Approximately three hundredths of that one percent died with a positive COVID-19 test. Some of these deaths were certainly caused by this deadly disease. Many others were not. There has been gross misattribution of the proximate cause of death in both directions. I believe that ultimately, COVID-19 will be seen to have been less lethal than the average flu. Only our response has been exceptional, and perhaps a good simulation for the real thing.

Corona is now mostly a political issue.

06-20-20 Daily COVID-19 Deaths in the U.S. Have Fallen Dramatically Since April

06-22-20 Stanford prof: Median infection fatality rate of coronavirus for those under 70 is just 0.04%

06-24-20 Transmission of disease - Erin Bromage

Erin does not really deal with misattribution, which would have a dramatic effect on mortality rates, but there is much good basic information here:

06-25-20 Where Are We Now? - Erin Bromage

07-07-20 As of today, COVID-19 has killed 538,933 people worldwide, of which 130,312 are in the United States. This is an interesting ratio, in that one would expect outcomes to be better than average in America because of better health resources. So why does 5% of the world's population have 24% of the deaths? Could it be misattribution of the cause of death?

Let's back out those 130,312 questionable deaths as bad data and apply the remaining deaths to the population of the rest of the world. This exercise gives us 408,621 deaths for 7,331,462,517 people, or about 56 deaths per million. Now, it's true that misattribution could be understated in the rest of the world as well as overstated in America, but the odds are not 4.8 times. Also, infection rates will vary widely, but will tend to average out over such a large base. All things equal, the world number is more likely to be accurate. And if we apply this death rate back to America, it yields 18,480 deaths to date, far less than the stated 130,312 and almost certainly more accurate.

Finally, if we use our more probable infection rate of 10% instead of the 1% case infection rate, our denominator yields an infection fatality rate of 0.06%, about like a bad flu year. This is almost certainly a better assessment than we have gotten from the World Health Organization or the CDC.

What is all the fuss about?

08-31-20 CDC Excess Death Analysis

10-10-20 JP Video on the NEW CDC Infection Fatality Rates per CDC

Flu = 0.1% 
COVID ages 0-19 = 0.00003%
COVID ages 20-49 = 0.0002%
COVID ages 50-69 = 0.005%
COVID ages 70 + = 0.054%

Now where are these extra deaths coming from?

It seems the source of these deaths is extremely disproportionate, concentrated among those in assisted living and non-white populations, which would indicate harvesting (pulling deaths forward), unhealthy lifestyles, or simple misattribution. Either COVID is not a normal virus, or we are measuring something else entirely. The race factor could be a function of job loss, suicide, and domestic abuse among the groups hit hardest by the economic downturn:

Over 100,000 new cases of COVID in the United States in one day, with some states having "positivity rates" of 30 percent. This means the disease is rippling out through the general population, but strangely enough, extremely few cases are ending in death or even hospitalization relative to the number of cases. Indeed, the number of cases now seems completely decoupled from the number of deaths. In other words, cases are going up exponentially, but deaths remain flat at about 1,000 per day. This would indicate that the two are not connected at all, and maybe never were. To be generous, death is VERY loosely correlated with contracting COVID.

This draws attention to those thousand who are dying each day. Are they really dying from COVID? Or is a thousand per day simply the limit of misattribution for a population of this size? In the long run, we will know.

It's past time to admit that COVID is not nearly as deadly as originally feared, not even within an order of magnitude. Maybe not even within two orders of magnitude of published worst-case scenarios. A large part of America has obviously already had this disease, more than half of whom were never even aware of it. Only a very small fraction of those with COVID required hospitalization, and only a much smaller fraction of those died. This fraction was so small that even determining the true cause of death for these relative few was difficult, and often inaccurate. It's time to post-analyze all this data and put this disease in its place: a relatively minor threat to the world in 2020.

11-18-20 FINALLY! The general press is picking up on the denominator issue I first posted in the blog in March. Eight months isn't bad. Oh well.

11-20-20 Misattribution? 

It's becoming clear that the key to understanding COVID is teasing apart which cases would normally have been minor but were amplified by biological changes brought on by COVID that dramatically increase the risk of existing comorbidities, making COVID a "harvesting" disease. In any case, COVID is not the threat to humanity presented by the media. It may be a weird disease with an even stranger preference for killing the weakest in our culture. Or perhaps COVID is a disease that rarely kills, and what we have is just a bad case of mass hysteria, with all of those deaths an artifact of misattribution, over-treatment, or mistreatment brought on by fear. Yes, this is a radical thought, but COVID is a radically different kind of disease, and so this idea needs to be seriously considered:

Death of man who fell off ladder ‘ruled as natural, caused by COVID-19′

11-27-20 Why is science being deleted instead of being challenged based on underlying data?

Sunday, March 29, 2020

Defy Aging - Keep Moving and Stay Hungry

First posted on my sixtieth (or is that fortieth?) birthday, 01-29-12. Updated periodically:

60 is the new 40. 50 is the new 30. I've even seen a proclamation that 80 is the new 30!

Such declarations about this new "age" can be seen everywhere. Are they simply age denial? Do baby-boomers refuse to grow old? And is this denial just a way to lie about our age? Will lying to ourselves help us live longer? Maybe.

There's been lots of interesting research about the placebo effect in all its forms. What's interesting is, placebos seem to work even when the patient KNOWS it's a placebo. And how is lying about your age different from giving yourself a placebo?

Unfortunately, the placebo effect will not solve everything. There are still some hard facts in this new age of aging. Since the Kellogg brothers made health a popular topic at the beginning of the last century, thousands of treatments have been tried in an attempt to stay younger. Most have been proven to be worthless, but a few obviously make a difference:

"Today, the average age for someone moving into a nursing home is 81. In the 1950s, it was 65."

"People are living 34 years longer than their great-grandfathers."

"The number of people in the world over 100 years old is now approaching half a million."

The internet is full of such dramatic results, so how does one gain the benefit?  A few simple things make most of the difference.

Avoiding tobacco and limiting solar exposure are good for the skin. The guy to the left was a life-long truck driver; he's obviously not British. Tobacco has a similar effect, but on all of your skin.

Appearance aside, the most important factors in staying young are still diet and exercise, so keep moving and stay hungry.

Keep Moving

Whether you are overweight, have chronic pain, arthritis, dementia, depression, diabetes, anxiety or fatigue there is one piece of advice that will improve your quality and length of life - "Keep Moving". What's surprising is how this advice not only affects the physical but also your mental health.

ANY physical activity that keeps you moving for at least 30 minutes a day, EVERY day will make a huge difference. That "every day" is the hard part. Success starts with finding something you enjoy. It can be yoga, swimming or walking.  Start slowly and work your way up. Even if it takes a year to do 2 miles a day, after that you've gained 80% of the benefit of exercising in general. The second, fifth and seventeenth years are much easier.  The best exercise is the one that you DO, so it's probably the one you enjoy most.  Find your favorite way to move.

Here is the best summary I've found on the topic, graphically presented.  If you do nothing else about your health this year, at least spend nine minutes watching this video.  It may add years to your life:

23 and 1/2 Hours : What is the Single Best Thing You Can Do For Your Health?

“You don’t deteriorate from age, you age from deterioration.” - Joe Weider 

Stay Hungry

The meaning is obvious. The trick is to not stay too hungry. Just like the exercise part, if you take it to the point of pain you're more likely to return to your old lifestyle. On the other hand, if you eat only what you need, you'll not only stay lean and healthy, you'll enjoy life more.

Have you noticed how much better food tastes when you're hungry? Well, at least the first few hundred calories. This is an important hint. When the meal becomes less compelling, stop eating. I know it's easier said than done, for a number of reasons. But if you eat just 100 calories less than you burn each day, you'll lose about ten pounds a year - that's simple arithmetic, and it's major progress.
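The arithmetic behind that claim is a one-liner, using the common rule of thumb that a pound of body fat is roughly 3,500 calories (an approximation, not an exact constant):

```python
# Annual weight loss from a small, sustained daily calorie deficit,
# assuming the ~3,500 kcal per pound rule of thumb.

daily_deficit_kcal = 100
kcal_per_pound = 3_500

pounds_per_year = daily_deficit_kcal * 365 / kcal_per_pound
print(f"{pounds_per_year:.1f} pounds per year")  # 10.4
```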

The trick is finding how many calories you really NEED. It's probably a lot less than you think. That's because we're used to eating about twice as much as we require. Food is everywhere you turn. There's now even a snack bar at our local DMV. People seem to eat every hour or two. And they eat more than they did a hundred years ago. There's just too much food in our cage. Fortunately, our body is smart enough to send most of those extra calories right down the toilet. But not all of them. Over time, even a few extra calories a day will add to your waistline.

You can use an internet calculator to find how many calories you need per day. You can tell if the number's right by how hungry you are at the end of the day:

Calorie Calculator

Once you know this number, slowly decrease intake until you find that edge between hunger and health. This should become your average consumption target. Avoid grazing. Eat at appointed times, and only planned amounts. Take some of that food out of your virtual cage. And as you decrease volume, increase variety. That's the key to good nutrition.

Another trick is micro-fasting. If you know you'll be having a large dinner, skip lunch. Sure, you'll be hungrier than usual and probably eat a bit more at dinner, but you're already a few hundred calories under your target. Just don't stuff yourself. Keep your AVERAGE consumption just below your need. Take your time losing those extra pounds.

"Staying hungry" will also improve the quality of experience for your other appetites. From sex to alcohol, to Netflix, less can be more if you hone your appetite with a bit of moderation. Find the "sweet spot" and stay hungry in all respects.

Live Longer

If it's that simple, why are only a few truly healthy? It's obvious not everyone is gaining these extra years. Not surprisingly, access to excess noted above and electric grocery carts are the reasons. The majority of people today are actually shortening their lives with calories and the couch. Many are now dying younger than they would have a hundred years ago because of this default lifestyle. And more will follow them into the grave shortly.  Just look around.

Our society has become bifurcated: most people (of all ages) default into less activity and consume more calories, while a minority eat less and lead more active lives. What's truly amazing is that this minority is still able to skew the average lifespan upward, while the bulk of America is killing themselves early. That's why a healthy lifestyle may extend one's life even more than the averages indicate. If you live well, your chronological age may not matter as much as you think.

Misrepresenting your age may be a lie, but it's a lie worth living.

"Count your age by friends, not years. Count your life by smiles, not tears." - John Lennon

Even more data:

Consistent with the social bifurcation of watching diet and exercise:

05-08-17 Life expectancy gap between rich and poor US regions is 'more than 20 years'

04-17-13 Here is a demonstrative meta-study of the effects of 50 calorie reduction per day for an entire country! Now if we could just learn to do that as individuals:

The Cuban diet: eat less, exercise more - and preventable deaths are halved

Another example in progress:

02-22-18 Venezuelans report big weight losses in 2017 as hunger hits

06-10-13 Cause or effect?  Fast walkers stay ahead of the game

01-15-15 More data:

Daily walk adds years to your life: Just 20 minutes a day is enough

04-21-15 Or is an hour a day the sweet spot?

The Right Dose of Exercise for a Longer Life

01-20-16 Here's an interesting idea that fits in with the work I've been doing on neuroscience and behavior:

The Hunger Mood

04-09-17 Interesting meta-collection: Peaking

06-08-17 Longevity Illustrator

07-17-18 Think Yourself Young

11-08-18 Dutchman, 69, brings lawsuit to lower his age 20 years

01-17-19 This Is How To Have A Long Awesome Life: 7 Secrets From Research

04-22-19 Effective Microfasting

10-25-19 Five Myths About Aging

01-13-20 Lee Schonberg Index

02-02-20 When Does Someone Become "Old"?

03-29-20  The number of steps per day, not speed, is linked to mortality rate

05-05-20 A fairly obvious filtering of the healthiest, but still, mortality was associated with how MANY steps were taken, not how fast:

More Steps Per Day Linked to Lower Mortality Risk

Taking more steps per day is associated with lower all-cause mortality risk, according to an observational study in JAMA.
Roughly 4800 adults aged 40 and up participating in the National Health and Nutrition Examination Survey (NHANES) wore accelerometers on their hips during waking hours for 7 days. During a mean 10 years' follow-up, 24% died. The unadjusted all-cause mortality rates were:

77 per 1,000 person-years for those who took less than 4000 steps per day;
21 per 1,000 for 4,000–7,999 steps;
7 per 1,000 for 8,000–11,999 steps; and
5 per 1,000 for 12,000 steps and above.
In adjusted analyses, people who took 8,000 steps per day had lower all-cause, cardiovascular, and cancer mortality than those who took 4,000 steps. Faster walking speed was not associated with lower mortality after adjusting for total daily steps.

09-21-20 Older People Have Become Younger

Sunday, February 09, 2020

The Emergence of Man

First posted 08-03-12, updated every now and then.

As a boy, I was impressed with "2001: A Space Odyssey". The idea that a spark from space vaulted man beyond the other primates seemed plausible. And it got me thinking.

How different ARE we from the other primates?  And when did this difference occur?

These questions led me to "The Naked Ape", by Desmond Morris. He defined a few differences, but far more similarities. So I kept looking.

Over the years I've kept track of the various discoveries looking for the significant differences between us and our cousins.  It's time to document them.

For background, let's step back about a billion years:

Billion-year-old fossils may be early fungus
Scientists Find 'Oldest Human Ancestor'

And a half a billion years later:

Tiny Chinese Archicebus fossil is the oldest primate yet found

Walking and Running

When I was a child, walking erect was the gold standard of humanity.  And it's true, we're better on two feet chasing down game than all others, but only by degrees.

Rare 10 million-year-old fossil unearths new view of human evolution

Little Foot - 3.67 Million Years Old

Bonobos showed that walking erect is no big deal:

Walking Upright

But that's not running. About three million years ago a significant change occurred: humans became marathon runners and developed new hunting scripts largely stolen from wolves, which raises the question, who ultimately domesticated whom?

Family Tree of Dogs and Wolves Is Found to Split Earlier Than Thought


Food has also been an area of study to define differentiation, but which species jumped what line, and when?

Human ancestors changed diet 3.5 million years ago


Another thing that set us apart was thought to be tool use, but this test also fell as chimps and other species have now demonstrated.

04-14-15 World's oldest stone tools discovered in Kenya 

Tool-making and Meat-eating Began 3.5 Million Years Ago?

Tool-Making 3 Million Years Ago?

Stone-tool makers reached North Africa and Arabia surprisingly early - 2.4 million years ago 

Tool-making 1.4 Million Years Ago?

Bone Tool-making - 1.4 Million Years Ago

12-06-19 San Diego Tool Use 130,000 years ago

Ancient Stone Tools Hint at the Real Paleo Diet 126,000 to 781,000 Years Ago
The spear, which was developed about 500,000 years ago, is also a clear example of tool use. So far we've seen no other species accomplish this trick, though weapon use may yet prove to be a learnable behavior for some other primates, as other tools have been. Is it just a matter of time?

Australian researchers say they’ve found the world’s oldest hatchet

Monkey or Man?

11-04-16 49,000 Year Old Human Settlement in Australia

01-31-18 Sharp stones found in India signal surprisingly early toolmaking advances


Clear evidence of trade between distant (and separate) tribes would definitely set man apart from other primates:

Evolve or die: Why our human ancestors learned to be social more than 320,000 years ago


In response, the transfer of culture became the new human benchmark, but the ability to transfer new knowledge from one generation to another has also been demonstrated by chimps...


Then there was self-awareness, a test that also fell, as Darwin's original mirror observations had already hinted:

Mirror Test

Fish Passes the Mirror Test

And a nice test of abstract thinking is meta-cognition:

Chimps: Ability to 'Think About Thinking' Not Limited to Humans

And have episodic memory:

Chimpanzees and orangutans remember distant past events

How about 500,000 year-old art?  But which species made it?:

Art on the half-shell

176,000 year old ritual?

Bruniquel Cave

Blombos Cave has 73,000 year old art?


Next to evolve was language.  The chimp Washoe laid that one to rest in the 1960s.  Then there's Koko the gorilla, who recognized a vocabulary of 1,000 signs and 2,000 spoken words and had an IQ of about 80.  Bonobos also use gestural language, and now respond to spoken language with keyboard feedback.  It may be simple, but it's language.  And even more languages are being discovered:

Prairie dogs' language decoded by scientists


The most obvious and vivid tool of man has been fire.  The control of fire allowed our gut to shorten by about a yard as we began to cook our food and digestion improved.  Human resistance to air pollution also emerged over the last million years, an indication that we lived with fire during that time.

Control of fire wasn't just tool use, it was the most exquisite form of tool use.  The trick was getting close enough to use the flame but not get burned, and then of course, not letting the fire go out.  How many thousands of our ancestors played with fire before we learned to pass on these two tricks?  And was this the brain and thumbs at work?  Fire was the turning point.  The most concise summary of why comes from a New Yorker article, "The Case Against Civilization" (which I disagree with overall):

"The earliest, oldest strata of the caves (in Africa) contain whole skeletons of carnivores and many chewed-up bone fragments of the things they were eating, including us. Then comes the layer from when we discovered fire, and ownership of the caves switches: the human skeletons are whole, and the carnivores are bone fragments. Fire is the difference between eating lunch and being lunch."

We know of no other primate that developed the independent use of fire (though some bonobos have now been trained to do so with a lighter, and even to use water to put it out).  Man's sustained use of fire is estimated to have begun sometime between 1.5 million and 400,000 years ago:

Who Mastered Fire?

Were Early Humans Cooking Their Food a Million Years Ago?

Still, isn't the difference between us and other primates simply a matter of DEGREE in thinking and manipulating our environment?  Scripts and tools are certainly learned and used effectively by other species.  But our fore-brains allowed for abstraction, delayed gratification and far more complex simulations as demonstrated by the wide range of different human behaviors.  So is our main difference from other primates the complexity of behaviors created by individualism and hyper-specialization?

Or maybe not:

Neanderthals Light it Up

Canned Food?

Out of Africa

Whatever makes us different was probably well established by 60,000 (or 100,000?) years ago, as that's when humans became successful enough to spread from Africa to the rest of the world in our anatomically modern form.  Was it a combination of language, hunting methods, tools, spears, and fire?  Or was it some kind of proto-agriculture for which we've yet to find evidence?

Blombos Cave contained scratches on ocher objects from 75,000 to 100,000 years ago.

Left 100,000 Years Ago?

Or Stayed 60,000 Years Ago?

01-25-18 The modern human brain may only be 40,000 years old

06-16-20 Earliest known bow-and-arrow hunting outside Africa 48,000 years ago

Music, Art and Property?

Border Cave takes some level of symbolic culture and the ownership of property back 44,000 years.  The Venus of Hohle Fels in Germany is clearly art from 35,000 years ago.

Border Cave

Could "owning things" be that line between us and chimps?  This is one of the ideas put forth in Sex at Dawn.  Maybe Christopher Ryan is on to something.  Will this mystery lead us back to ourselves?  In any case, ten to fifty thousand years ago was an exciting time for man.

40,000 Year Old Cave Painting

Archaeologists Unearth 35,000 Year Old Musical Instrument

World's Oldest Portrait - Symbolic Abstraction 26,000 Years Ago

Not all hunter-gatherers moved around.  How could they have carried all these pots?

What 15,000 Years Of Cooking Fish Tells Us About Humanity

02-09-20 15,000 Year Old Catalan Altamira cave carvings


The key to real civilization seems to be the domestication of plants and animals - agriculture.  It's often described in terms of specialization and our ability to withhold gratification until the resource matures (wheat, cows or eggs into chickens).

This may be the key to domestication 14,000 years ago:

We Didn't Domesticate Dogs.  They Domesticated Us.

How hunting with wolves helped humans outsmart the Neanderthals.

14,000-Year-Old Bread

Another line blurred:

Baboons Kidnap and Raise Feral Dogs as Pets

Even the line of first settlements is moving backward and becoming blurred.  In school I was taught that civilization started about 5,000 years ago.  Then it was 7,000 years.  Then 10,000.  And now:


Except for digging holes and a few other minor exceptions, no other primate builds shelter:

Oldest house in Britain discovered to be 11,500 years old
Stone Building in Russia

12,000 Year-Old Gobekli Tepe

Gobekli Tepe Update 04-04-15

(Wikipedia dates Gobekli Tepe to 9559 BCE, which projects to roughly 11,500 years old.)  That's still some impressive stonework, which must have taken a few thousand years to develop.  Twenty thousand years seems like a safer number for now.  We just need to find more sites and map the progress, but we're definitely blurring back into our ancestors.  When exactly did we become "human"?

As a side note, dogs have been with us for about 14,000 years according to bone evidence.

This next post strays a bit from the origins of man, taking evolution into our broader culture, but it contains so many useful observations about humanity:

State of the Species - Charles C. Mann

Maybe the missing mechanism is epigenetics working with genetics.  It's an example of how evolution can go well beyond sexual preference:

Scientists claim that homosexuality is not genetic — but it arises in the womb

Here is a fun idea about how the n-grams of our cultural evolution are reflected in our language:

Evolution of the most common English words and phrases over the centuries  12-12-12

World's Oldest Wooden Water Wells Discovered From About 5000 Years Ago  12-24-12

Is a long childhood the key difference?  Maybe:

Why Are We the Last Apes Standing?

Believe it or not, this next piece was published long after I published this post (which, like primates, is still evolving).  Mark Changizi seems to agree that we differ only by degree ("quantitatively so, not qualitatively").  Interesting post.  I need to get his books on my list:

Bursting the Bubble of Human Intelligence  04-09-13

It seems this puzzle is filling in literally day by day.  Stay tuned for more updates.

It appears we must guard against cultural imperialism in our acquisition of knowledge.  And does human behavior vary to try all possible combinations, in the same way a species replicates to fill the physical range of its environment?

Why Americans Are the Weirdest People in the World  02-25-13

9000 Year Old Neolithic Settlement in Israel - 07-17-19

The Wheel

02-19-16 The wheel is certainly a definitive test of humanity.  Well, at least so far:

Oldest Wheel - 5200 Years Old

12-27-18 Seven Million Years of Human Evolution

09-09-20 When did we become fully human?