Toward A Bayesian Theory Of Willpower
I.
What is willpower?
Five years ago, I reviewed Baumeister and Tierney’s book on the subject. They tentatively concluded it’s a way of rationing brain glucose. But their key results have failed to replicate, and people who know more about glucose physiology say it makes no theoretical sense.
Robert Kurzban, one of the most on-point critics of the glucose theory, gives his own model of willpower: it’s a way of minimizing opportunity costs. But how come my brain is convinced that playing Civilization for ten hours has no opportunity cost, but spending five seconds putting away dishes has such immense opportunity costs that it will probably leave me permanently destitute? I can’t find any correlation between the subjective phenomenon of willpower or effort-needingness and real opportunity costs at all.
A tradition originating in psychotherapy, and ably represented eg here by Kaj Sotala, interprets willpower as conflict between mental agents. One “subagent” might want to sit down and study for a test. But maybe one subagent represents the pressure your parents are putting on you to do well in school so you can become a doctor and have a stable career, and another subagent represents your own desire to drop out and become a musician, and even though the “do well in school” subagent is on top now, the “become a musician” subagent is strong enough to sabotage you by making you feel mysteriously unable to study. This usually ends with something about how enough therapy can help you reconcile these subagents and have lots of willpower again. But this works a lot better in therapy books than it does in real life. Also, what childhood trauma made my subagents so averse to doing dishes?
I’ve come to disagree with all of these perspectives. I think willpower is best thought of as a Bayesian process, ie an attempt to add up different kinds of evidence.
II.
My model has several different competing mental processes trying to determine your actions. One is a prior on motionlessness; if you have no reason at all to do anything, stay where you are. A second is a pure reinforcement learner - "do whatever has brought you the most reward in the past". A third is your high-level conscious calculation about what the right thing to do is.
These all submit “evidence” to your basal ganglia, the brain structure that chooses actions. Using the same evidence-processing structures that you would use to resolve ambiguous sense-data into a perception, or resolve conflicting evidence into a belief, it resolves its conflicting evidence about the highest-value thing to do, comes up with some hypothesized highest-value next task, and does it.
I’ve previously quoted Stephan Guyenet on the motivational system of lampreys (a simple fish used as a model organism). Guyenet describes various brain regions making "bids" to the basal ganglia, using dopamine as the "currency" - whichever brain region makes the highest bid gets to determine the lamprey’s next action. "If there’s a predator nearby," he writes, "the flee-predator region will put in a very strong bid to the striatum".
The economic metaphor here is cute, but the predictive coding community uses a different one: they describe it as representing the "confidence" or "level of evidence" for a specific calculation. So an alternate way to think about lampreys is that the flee-predator region is saying "I have VERY VERY strong evidence that fleeing a predator would be the best thing to do right now." Other regions submit their own evidence for their preferred tasks, the basal ganglia weighs it all using Bayes, and the lamprey flees the predator.
This ties the decision-making process into the rest of the brain. At the deepest level, the brain isn’t really an auction or an economy. But it is an inference engine, a machine for weighing evidence and coming to conclusions. Your perceptual systems are like this - they weigh different kinds of evidence to determine what you’re seeing or hearing. Your cognitive systems are like this - they weigh different kinds of evidence to discover which beliefs are true or false. Dopamine affects all these systems in predictable ways. My theory of willpower asserts that it affects decision-making in the same way - it represents the amount of evidence for a hypothesis.
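If you want the cartoon version: here is a toy sketch in Python of the basal ganglia as an evidence-weighing auction. It is purely my own illustration - the processes, the candidate actions, and every number are made up - but it shows the basic move: each process submits evidence (think log-odds units) for its preferred action, and the action with the most total evidence wins.

```python
# Toy sketch of the basal ganglia as a Bayesian evidence-combiner.
# Every process, action, and number here is made up for illustration.

# Each mental process submits evidence (in log-odds units) for an action:
# positive numbers mean "evidence that this is the best thing to do now".
bids = {
    "lie still":         {"prior on motionlessness": 2.0},
    "play Civilization": {"reinforcement learner": 3.0},
    "do the dishes":     {"conscious calculation": 2.5},
}

def total_evidence(action):
    """Sum the evidence submitted by every process for one action."""
    return sum(bids[action].values())

# The basal ganglia picks the hypothesis with the most total evidence.
chosen = max(bids, key=total_evidence)
print(chosen)  # -> play Civilization (reinforcement outbids the rest)
```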
III.
In fact, we can look at some of the effects of dopaminergic drugs to flesh this picture out further.
Stimulants increase dopamine in the frontal cortex. This makes you more confident in your beliefs (eg cocaine users who are sure they can outrun that cop car) and sometimes in your perceptions (eg how some stimulant abusers will hallucinate voices). But it also improves willpower (eg Adderall helping people study). I think all of these are functions of increasing the (apparent) level of evidence attached to "beliefs". Since the frontal cortex disproportionately contains the high-level conscious processes telling you to (eg) do your homework, the drug artificially makes these processes sound "more convincing" relative to the low-level reinforcement-learning processes in the limbic system. This makes them better able to overcome the desire to do reinforcing things like video games, and also better able to overcome the prior on motionlessness (which makes you want to lie in bed doing nothing). So you do your homework.
Antipsychotics decrease dopamine. At low doses of antipsychotics, patients might feel like they have a little less willpower. At high doses, so high we don’t use them anymore, patients might sit motionless in a chair, not getting up to eat or drink or use the bathroom, not even shifting to avoid pressure sores. Now not only can the frontal cortex’s conscious processes not gather enough evidence to overcome the prior on motionlessness - even the limbic system’s instinctual processes (like "you should eat food" and "you should avoid pain") can’t do it. You just stay motionless forever (or until your doctor lowers your dose of antipsychotics).
In contrast, people on stimulants fidget, pace, and say things like “I have to go outside and walk this off now”. They have so much dopamine in their systems that any passing urge is enough to overcome the prior on motionlessness and provoke movement. If you really screw up someone’s dopamine system by severe methamphetamine use or obscure side effects of swinging around antipsychotic doses, you can give people involuntary jerks, tics, and movement disorders - now even random neural noise is enough to overcome the prior.
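Connecting this back to the toy sketch above: you could model dopamine as a gain that scales how loudly each process’s evidence comes through. A minimal sketch, again with entirely invented numbers - this illustrates the hypothesis, not any actual pharmacology:

```python
# Extending the toy model: dopamine as a gain on submitted evidence.
# Gains, actions, and numbers are invented; this only illustrates the idea.

def choose(gain_cortex=1.0, gain_limbic=1.0, gain_noise=0.0):
    """Return whichever action ends up with the most (gain-scaled) evidence."""
    evidence = {
        "lie still":     2.0,                # prior on motionlessness
        "do homework":   1.8 * gain_cortex,  # frontal-cortex calculation
        "play games":    2.4 * gain_limbic,  # limbic reinforcement learner
        "random twitch": 0.3 + gain_noise,   # stray neural noise
    }
    return max(evidence, key=evidence.get)

print(choose())                     # baseline: "play games" beats homework
print(choose(gain_cortex=2.0))      # stimulant: "do homework" (3.6) wins
print(choose(gain_cortex=0.2,
             gain_limbic=0.2))      # heavy antipsychotic: "lie still" -
                                    # nothing overcomes the prior anymore
print(choose(gain_noise=3.0))       # dopamine excess: "random twitch" -
                                    # even noise can now drive movement
```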
(a quick experiment: wiggle your index finger for one second. Now wave your whole arm in the air for one second. Now jump up and down for one second. Now roll around on the floor for one second. If you’re like me, you probably did the index finger one, maybe did the arm one, but the thought of getting up and jumping - let alone rolling on the floor - sounded like too much work, so you didn’t. These didn’t actually require different amounts of useful resources from you, like time or money or opportunity cost. But the last two required moving more and bigger muscles, so you were more reluctant to do them. This is what I mean when I say there’s a prior on motionlessness)
IV.
I think this theory matches my internal experience when I’m struggling to exert willpower. My intellectual/logical brain processes have some evidence for doing something ("knowing how the education system works, it’s important to do homework so I can get into a good college and get the job I want"). My reinforcement-learner/instinctual brain processes have some opposing argument ("doing your homework has never felt reinforcing in the past, but playing computer games has felt really reinforcing!"). These two processes fight it out. If one of them gets stronger (for example, my teacher says I have to do the homework tomorrow or fail the class), it will have more "evidence" for its view and win out.
It also explains an otherwise odd feature of willpower: sufficient evidence doesn’t necessarily make you do something, but overwhelming evidence sometimes does. For example, many alcoholics know that they need to quit alcohol, but find they can’t. They only succeed after they "hit bottom", ie things get so bad that the evidence against using alcohol becomes "beyond a reasonable doubt". Alcoholism involves some imbalance in brain regions such that the reinforcing effect of alcohol is abnormally strong. The reinforcement system is always more convinced in favor of alcohol than the intellectual system is convinced against it - until the intellectual evidence becomes so overwhelmingly strong that it outweighs even the reinforcement system’s abnormally inflated signal.
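In the toy log-odds terms from before, "hitting bottom" is just the point where intellectual evidence finally exceeds an abnormally inflated reinforcement bid. The numbers below are, as always, completely made up:

```python
# Made-up log-odds numbers illustrating "hitting bottom" in addiction.
normal_drink_bid   = 2.0  # typical reinforcement bid for drinking
addicted_drink_bid = 6.0  # pathologically inflated bid in alcoholism
quit_evidence      = 4.0  # "I should quit": beats 2.0, loses to 6.0
bottom_evidence    = 7.0  # catastrophe pushes the evidence past 6.0

print(quit_evidence > addicted_drink_bid)    # False - keeps drinking
print(bottom_evidence > addicted_drink_bid)  # True - finally quits
```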
Why don’t the basal ganglia automatically privilege the intellectual/logical processes, giving you infinite willpower? You could give an evolutionary explanation - in the past, animals were much less smart, and their instincts were much better suited to their environment, so the intellectual/logical processes were less accurate, relative to the reinforcement/instinctual processes, than they are today. Whenever that system last evolved, it was right to weight them however much it weighted them.
But maybe that’s giving us too much credit. Even today, logical/intellectual processes can be pretty dumb. Millions of people throughout history have failed to reproduce because they became monks for false religions; if they had just listened to their reinforcement/instinctual processes instead of their intellectual/logical ones, they could have avoided that problem. The moral law says we should spend our money saving starving children instead of buying delicious food and status goods for ourselves; our reinforcement/instinctual processes let us tell the moral law to f#@k off, keeping us well-fed, high-status, and evolutionarily fit. Any convincing sophist can launch an attack through the intellectual/logical processes; when they do, the reinforcement/instinctual processes are there to save us; Henrich argues that the secret of our success is avoiding getting too bogged down by logical thought. Too bad if you have homework to do, though.
Does this theory tell us how to get more willpower? Not immediately, no. I think naive attempts to “provide more evidence” that a certain course of action is good will fail; the brain is harder to fool than people expect. I also think the space of productivity hacks has been so thoroughly explored that it would be surprising if a theoretical approach immediately outperformed the existing atheoretical one.
I think the most immediate gain to having a coherent theory of willpower is to be able to more effectively rebut positions that assume willpower doesn’t exist, like Bryan Caplan’s theory of mental illness. If I’m right, lack of willpower should be thought of as an imbalance between two brain regions that decreases the rate at which intellectual evidence produces action. This isn’t a trivial problem to fix!
[image: an optical illusion in which perfectly straight lines appear to bend]
The lines here are perfectly straight - feel free to check with a ruler. Can you force yourself to perceive them that way? If not, it sounds like you can’t always make your intellectual/logical system overrule your instincts, which might make you more sympathetic to people with low willpower.