The Art of the Overbet

Previously: The Kelly Criterion

Last time I said, never go full Kelly.

In practice, I strongly agree with this. Either the core Kelly assumptions have broken down and you should go far over full Kelly, throwing the rules out the window, or you are trying to responsibly grow a bankroll, in which case full Kelly betting on what you believe your edge to be would be far too aggressive.

This time we go over the first category with practical examples: what are situations in which one should engage in what appears to be massive overbetting?

Four main scenarios: not winning is catastrophic, losing is acceptable, losing is impossible, and losing is inevitable.

You Win or You Die

When you play the game of thrones, you win or you die. If playing it ‘safe’ with your resources means you don’t win, then it means that you die. Either don’t play, or don’t hold back.

In Rounders (spoiler alert, movie is recommended), the star finds himself owing $15,000 to the Russian mob, and his best efforts could only get him $10,000. Without better options and unwilling to run for his life, he walks into the poker club of the man he owes the money to, and says “I owe you that money tomorrow, right? I’ve got $10,000. I’m looking for a game.”

Famously, early in the company’s history, the founder of FedEx once found himself without the funds to make payroll. He knew that if he missed payroll, that would be the end of FedEx. So he flew to Vegas with what funds he had, bet them all on black, won, made payroll, and now we have a FedEx.

Magic players often select ‘safe’ decks instead of ‘risky’ decks. This is exactly backwards. If you flame out of a tournament and lose all rounds, you get nothing. If you win half of them, you still get nothing. If you win three quarters, you usually still get almost nothing relative to winning. Extra variance is great.

If you believe that the fate of everyone depends on crossing a threshold or hitting an impossibly precise target, no matter how bad the odds, you deploy everything you have and take your best shot.

There’s a deadline. We don’t have time for Kelly.

Win or Go Home

Losing not being so bad, or continuing to play not being so valuable, is effectively similar to losing being catastrophic but not safely avoidable: in both cases, surviving is not a worthwhile goal.

A classic mistake is the gambler who forgets that they are using their bankroll to pay the rent. On forums I would often read stories of hard-working sports gamblers with mid-five-figure bankrolls. They made sure to make only ‘responsible’ wagers on sporting events, risking only 1-3% of their funds each time. When they had a good month, they could eat and make rent, but that ate up most of the profits.

What they continuously failed to realize was that this was not a sustainable situation. Being ‘responsible’ only ensured that, even if they were good enough at picking winners, they would never succeed due to their fixed costs. They were being ‘responsible’ with their sizing relative to their bankroll, but completely irresponsible when sizing relative to fixed costs of their time.
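
A back-of-the-envelope sketch of this trap: even a bettor with a genuine edge shrinks their bankroll when fixed costs outrun ‘responsible’ expected profits. Every number here (bankroll, edge, bet sizing, volume, expenses) is an invented illustration, not a claim about real betting markets.

```python
# Illustrative numbers only; this isolates the fixed-cost effect.
bankroll = 50_000.0
edge = 0.05           # average profit per dollar wagered
bet_fraction = 0.02   # 'responsible' 2% of bankroll per wager
bets_per_month = 50
monthly_expenses = 3_000.0  # rent and food come out of the bankroll

for month in range(12):
    # Expected profit only; variance is ignored for clarity.
    profit = bankroll * bet_fraction * edge * bets_per_month
    bankroll += profit - monthly_expenses

print(round(bankroll))  # 42041 -- down from 50,000 despite a real edge
```

At these made-up numbers the bettor wins in expectation on every wager, yet ends the year sixteen percent poorer.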

When I first started gambling on sports, I put aside a fixed bankroll I committed to not replenishing, but then acted in a way any Kelly-style formula would call completely irresponsible with that bankroll until after my first few double-ups. Once the operation became clearly worth my time, I scaled back and did more ‘responsible’ sizing, knowing that I wouldn’t be eaten alive by fixed costs.

Later, when exploring if it was worthwhile to return to wagering, I did a similar thing, struggled, and eventually lost everything I’d set aside. This was very good. It let me move on. To this day, we can never know for sure whether I had an edge during that second attempt – I suspect my luck was quite perverse – but I do know it wasn’t the best use of my time to find out.

Fail fast is a well-known principle for start-ups. I broke this rule once, with MetaMed, and it was an expensive mistake. Even if you have segregated your potential losses in dollar space, you need to contain them in time space, and emotional space, and social space.

There’s no deadline. But we don’t have time for Kelly.

Bet You Can’t Lose

You can’t lose everything if you can’t risk everything.

Most people’s resources are mostly not money, or even things at all, most of the time. When someone ‘loses everything’ we talk of them losing their friends, their family, their reputation, their health, their ability to work. And for good reason.

There is a scene in Defending Your Life where we flash back to the main character paying over $2,000, a third of all the money he has, to avoid a middle seat on an international flight. In context, this is to his credit, because it was ‘brave’ to risk spending so much. In a sense it was brave, in a more important general sense it was stupid, but in the most important sense he was spending a very small fraction of his resources, so the relevant questions were: Was this transaction worth it? No. Did this put the character at risk of having a liquidity crisis where not having any cash could be expensive? A little.

The best reason to do ‘safe’ things with money, and to maintain liquid savings, is to avoid paying the cost of needing liquidity and not having it, or paying the costs of avoiding scenarios where that liquidity might become necessary. This includes the ability to take advantage of opportunity. The cost of having to borrow money, which is expensive in time, emotional well-being and social capital, not only in money, is largely underestimated by most people. Slack is vital.

There is also a benefit to having no cash. For some people, and in some families and cultures, one who has cash is expected to spend it, or inevitably will quickly spend it. Sometimes this is on one’s self, sometimes on others, but the idea that one can have money and conserve it by spending only responsibly isn’t a thing. The only socially acceptable way to not spend money is to not have money. Thus, there is a high effective ‘tax’ on savings.

In such situations, not only is risk not so bad, it can be actively great. If your risky bet pays off, you can have enough money for big purchases or even to escape your poverty trap. If it fails, you would have wasted the money anyway, and now you can conserve slash mooch off others. Thus, your goal is to find a way to translate cash, which one will inevitably lose, into inalienable property that can be protected, or skills and connections and relationships, or at least experiences one can remember.

This isn’t just for poor people. Start-up culture works this way, too, and the only way to get things at reasonable prices, including labor, or to be able to raise money, is to clearly have spent everything you have and have no spare resources. This is a large portion of why start-ups are so stressful – you are not allowed to have any slack of any kind. I’m likely about to start a new company, and this is the thing that I dread most about the idea.

You Bet Your Life

This is all in contrast to the grand project we have been assigned, of ‘saving for retirement.’

Older people saving for retirement in modern developed countries, or at least some of those saving for retirement, face a special situation. Unable to earn additional funds, and without families or friends they can count on, their survival or at least quality of life depends entirely upon their ability to save up money earlier in life to spend later.

If they run out of money, they’ll still get some amount of government support, and perhaps some amount of familial support, but any remaining years are going to suck. If they die with money in the bank, that money can be passed on to others, but this is mostly a small benefit relative to not running out of cash to spend.

Compound this with large variance in remaining life span, and highly variable needs for health care and assistance during those remaining years, and you have an entire population pressured to furiously save every dollar they can. Any other use of one’s resources is looked at as irresponsible.

This is profoundly weird and profoundly messed up. 

It is handled, by most people and by those we trust to advise us, mindbogglingly badly.

That’s true even in the cases where the problem definition mostly applies to one’s situation, if one is willing to assume the world will continue mostly as it is, and one is unable to invest in other types of resources and expects to have little assistance from them.

It also leads us, as a society, to treat our savings as the savings, the retirement savings, and to apply the principles of that problem to all saving problems, and all risk management problems. Young people take ‘risks’ and buy stocks, older people play it ‘safe’ and buy bonds.

If the world were a much more certain place, where we knew how long we would live, in what state of health, and how the economy and world would do, and everything else we might need money for, and how much money we needed to engage in various activities and consume various goods, we could kind of have a target number of dollars. There was an ad a few years back that was literally this. Each person had a giant seven-figure red number they would carry around from place to place, with an oddly exact number of dollars they ‘needed’ in order to be ‘able to’ retire. Implied was that less than this was failure and would be terrible, more than that would be success and no further funds required. Now you can rest.

Instead, at best we have a probabilistic distribution of how much utility one would be able to get from various amounts of capital, in various scenarios involving one’s life and how the outside world is working. Even in this simplified problem, how does one then ‘play it safe’? At best one can reduce variance from some ‘normal’ shocks like a decline in the stock market, while still being exposed to others, and likely not protect much at all from bigger risks. No matter what, the bulk of what you have will probably go to waste or be passed on to others, or else you are risking disaster. At a minimum, you’re ‘betting’ on what to do with and where to spend the rest of your life, and there you are very much all-in. Odd for someone who is about to die to even try to ‘play it safe.’










The Kelly Criterion

Epistemic Status: Reference Post / Introduction

The Kelly Criterion is a formula to determine how big one should wager on a given proposition when given the opportunity.

It is elegant, important and highly useful. When considering sizing wagers or investments, if you don’t understand Kelly, you don’t know how to think about the problem.

In almost every situation, reasonable attempts to use it will be somewhat wrong, but superior to ignoring the criterion.

What Is The Kelly Criterion?

The Kelly Criterion is defined as (from Wikipedia):

For simple bets with two outcomes, one involving losing the entire amount bet, and the other involving winning the bet amount multiplied by the payoff odds, the Kelly bet is:

f* = (bp − q) / b = (p(b + 1) − 1) / b


  • f* is the fraction of the current bankroll to wager, i.e. how much to bet;
  • b is the net odds received on the wager (“b to 1”); that is, you could win $b (on top of getting back your $1 wagered) for a $1 bet;
  • p is the probability of winning;
  • q is the probability of losing, which is 1 − p.

As an example, if a gamble has a 60% chance of winning (p = 0.60, q = 0.40), and the gambler receives 1-to-1 odds on a winning bet (b = 1), then the gambler should bet 20% of the bankroll at each opportunity (f* = 0.20), in order to maximize the long-run growth rate of the bankroll.

(A bankroll is the amount of money available for a gambling operation or series of wagers, and represents what you are trying to grow and preserve in such examples.)

For quick calculation, you can use this rule: bet such that you are trying to win a percentage of your bankroll equal to your percent edge. In the above case, you win 60% of the time and lose 40% on a 1:1 bet, so you on average make 20%, so try to win 20% of your bankroll by betting 20% of your bankroll.
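
For concreteness, here are the formula and the quick rule in a few lines of Python (the function name is mine, not standard):

```python
def kelly_fraction(p, b):
    """Fraction of bankroll to bet: p = win probability, b = net odds ('b to 1')."""
    q = 1.0 - p
    return (b * p - q) / b

# The worked example above: 60% to win at even money.
f = kelly_fraction(0.60, 1.0)
print(round(f, 10))  # 0.2 -- bet 20% of the bankroll

# Quick rule: the amount you try to win (f * b) equals your edge (b*p - q).
assert abs(f * 1.0 - (1.0 * 0.60 - 0.40)) < 1e-12
```

The quick rule is exact at any odds, since f* multiplied by b always equals bp − q.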

Also worth remembering is that if you bet twice the Kelly amount, on average the geometric size of your bankroll will not grow at all, and anything larger than that will on average cause it to shrink.
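
A quick numerical check of this claim, using the 60% / even-money example (the helper name is mine):

```python
import math

def log_growth(f, p=0.60, b=1.0):
    """Expected log growth per bet when staking fraction f of the bankroll."""
    q = 1.0 - p
    return p * math.log(1 + f * b) + q * math.log(1 - f)

print(log_growth(0.2))  # ~ 0.0201: growth is maximized at full Kelly
print(log_growth(0.4))  # ~ -0.0024: roughly zero at twice Kelly
print(log_growth(0.5))  # ~ -0.0340: beyond that the bankroll clearly shrinks
```

At these discrete odds the break-even point sits just under twice Kelly, which is why the middle value is slightly negative rather than exactly zero; the approximation is exact in the continuous limit.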

If you are trying to grow a bankroll that cannot be replenished, Kelly wagers are an upper bound on what you can ever reasonably wager, and 25%-50% of that amount is the sane range. You should be highly suspicious if you are considering wagering anything above half that amount.

(Almost) never go full Kelly.

Kelly betting, or betting full Kelly, is correct if all of the following are true:

  1. You care only about the long-term geometric growth of your bankroll.
  2. Losing your entire bankroll would indeed be infinitely bad.
  3. You do not have to worry about fixed costs.
  4. When opportunities to wager arise, you never have a size minimum or maximum.
  5. There will be an unlimited number of future opportunities to bet with an edge.
  6. You have no way to meaningfully interact with your bankroll other than wagers.
  7. You can handle the swings.
  8. You have full knowledge of your edge.

At least seven of these eight things are almost never true.

In most situations:

  1. Marginal utility is decreasing, but in practice falls off far less than geometrically.
  2. Losing your entire bankroll would end the game, but that’s life. You’d live.
  3. Fixed costs, including time, make tiny bankrolls only worthwhile for the data and experience.
  4. There is always a maximum, even if you’ll probably never hit it. Long before that, costs go up and people start adjusting the odds based on your behavior. If you’re a small fish, smaller ponds open up that are easier to win in.
  5. There are only so many opportunities. Eventually we are all dead.
  6. At some cost you can usually earn money and move money into the bankroll.
  7. You can’t handle the swings.
  8. You don’t know your edge.

There are two reasons to preserve one’s bankroll. A bankroll provides opportunity to get data and experience. One can use the bankroll to make money.

Executing real trades is necessary to get worthwhile data and experience. Tiny quantities work. A small bankroll with this goal must be preserved and variance minimized. Kelly is far too aggressive.

If your goal is profit, $0.01 isn’t much better than $0.00. You’ll need to double your stake seven times to even have a dollar. That will take a long time with ‘responsible’ wagering. The best thing you can do is bet it all long before things get that bad. If you lose, you can walk away. Stop wasting time.

Often you should do both simultaneously. Take a small amount and grow it. Success justifies putting the new larger amount at risk, failure justifies moving on. One can say that this can’t possibly be optimal, but it is simple, and psychologically beneficial, and a limit that is easy to justify to oneself and others. This is often more important.

The last reason, #8, is the most important reason to limit your size. If you often have less edge than you think, but still have some edge, reliably betting too much will often turn you from a winner into a loser. Whereas if you have more edge than you think, and you end up betting too little, that’s all right. You’re gonna be rich anyway.

For compactness, I’ll stop here for now.





Additional arguments for NIMBY

The following post was written in July of 2017 in response to a post by Julia Galef, but I decided not to post/finish it. I figured I’d piled on San Francisco enough. With the recent post on the subject by Scott Alexander, I figured I would dust it off, give it an editing pass to reflect new points, flesh out the unwritten parts, and let it fly free. Enjoy!

Epistemic Status: I never borrowed your pot, I gave it back to you in perfect condition, and when I borrowed it there was already a hole in it. Also, citation needed.

Originally a response to: Should We Build More Housing in San Francisco?


Yes. Also, yes. And in conclusion, yes.

Wait. We’re not done.

In her post, Julia presents what she sees as the three major disagreements between advocates of building more housing (YIMBYs) and critics of building more housing (NIMBYs). She views these as: Would adding new housing noticeably affect prices? Does new housing help poor tenants? Are NIMBY objections (that incumbents deserve extra consideration, and existing character is worth preserving) legitimate?

My mission, should I choose to accept it (and I do), is to come up with as many reasons as possible why building new housing in San Francisco would be a bad idea, because upon reading Julia’s list, I realized that my steelman arguments against housing construction mostly weren’t covered, and I wouldn’t group them in this way at all. Many of them seem never to be covered at all.

I will also cover the standard reasons, to see where they fit into my ordering of potential arguments, and because some of them have a point.

Added 10/11/2018: Scott’s reasons, in his recent article, are that San Francisco is uniquely bad at building houses, that more housing might not lower rents, that YIMBYs act like human garbage, showing total disrespect for people’s lives and preferences and calling anyone opposed to them names up to and including racist, that people moving to San Francisco is bad for them, and that people moving to San Francisco is bad for their communities. I don’t address the completely true argument that YIMBYs are often obnoxious, and don’t consider ‘we won’t be able to do X’ a very good argument for ‘let’s ban doing X’, but I do address the others. Scott’s final preference is for coordination to move valued communities to better locations, a solution I endorse, and even tried to start multiple times without success.

Reason #1:

People shouldn’t move to San Francisco (regardless of what it does to housing costs and what it does to the rest of the city).

I’ll try to keep this section as short as possible. No promises.

Continue reading


Eternal: The Exit Interview

Previously: Eternal, and Hearthstone Economy versus Magic Economy; The Eternal Grind; My Referral Link to download Eternal if you want to play.

Epistemic Status: Played a lot of games. Wrote a lot of words. Those not interested in Magic, Eternal or game design may want to skip this one.

Eternal remains modern Magic, with some simplifications, on a phone, with a Hearthstone interface economy.

That remains super high praise. Everyone involved should be very happy with what they have accomplished, and I’m excited to see them move into tournament play.

Despite that, I did still bounce off the game in the end. I enjoyed it during my morning and evening commutes for several months, quite possibly played it another month or two when I wasn’t enjoying it all that much, then realized I had grown tired slash bored of it and stopped playing, mostly all at once.

This is a summary of my experiences, followed by a more detailed analysis of what went both right and wrong. Where Eternal changes vocabulary (e.g. it renames the colors) I will mostly use Magic’s words for things and Magic’s mana symbols, as I expect this to confuse exactly zero people.

Continue reading


Apply for Emergent Ventures

Original Post (Marginal Revolution): Emergent Adventures: A New Project to Help Foment Enlightenment

Today, two new philanthropic projects announced funding.

Jeff Bezos

In the first project, Jeff Bezos gave two billion dollars to fund services for poor families and the homeless. This will be his fourth most valuable charitable project, behind The Washington Post, Blue Origin and of course the world’s greatest charity. He is truly a great man.

In exchange for giving two billion dollars to help those in need, he was roundly criticized throughout the internet for not giving away more of his money faster. Every article I’ve seen emphasizes how little of his wealth this is, and how much less generous this is than Warren Buffett or Bill Gates. The top hit when I googled was a piece entitled “Jeff Bezos donates money to solve problem he helped create,” because he’s a capitalist and exploits the workers, don’t you know? Never mind that he’s the biggest source of new consumer surplus we have.

This isn’t quite as big an abomination as the Peter Singer claim that someone who gives away almost all of their vast fortune is basically the worst person ever for not giving away the rest of it. Or even that they’re the worst person in the world for giving away that money in a slightly less than optimal fashion. Won’t someone think of the utils?

But it’s not that far behind.

That doesn’t mean we don’t wish he’d do more, or that we don’t want to help him optimize to have the best impact. It’s important to help everyone play their best game, and I could definitely think of better uses for the money. The hot takes using that angle will doubtless follow shortly.

But, seriously: If you see someone donating two billion dollars to help the less fortunate and you can’t on net say anything nice about it the least you can do is to shut up.

Tyler Cowen’s Emergent Ventures

Now on to the more exciting project that I suspect will have more lasting impact, despite having funding three orders of magnitude smaller.

Emergent Ventures is brilliant. I heartily endorse this service or product.

One of Tyler’s greatest strengths is his ability and willingness to consume and integrate vast amounts of information. He already reads not only a number of books that boggles my mind, but also everything that arrives in his inbox, which is doubtless a criminally underused way to introduce important things into the public discourse. This is how he has a well thought out interesting answer on almost any question you can ask him, as I’ve observed at two in-person audience Q&A sessions.

Now he’s offering the world a golden opportunity. I beseech you to take full advantage of it!

By applying you get to put a 1500 word proposal in front of a brilliant polymath who will at a minimum seriously think about your proposal, and likely offer you feedback. Maybe a dialogue will start. He’ll also incorporate your ideas into his brain, and then perhaps on to his readership.

That’s if you’re not funded. If you are funded, not only will you get Free Money, you’ll get his support helping you succeed. That’s huge.

Emergent Ventures recognizes at least five important facts about the world.

Fact one: Serious attempts at high upside projects (aka ‘moonshots’) where people attempt to do a big thing have ludicrously good rates of return. It is totally fine to not only have most of them fail, but to have most of them have been bad ideas so long as some good ones are mixed in.

  2. Fact two: Having to fit such proposals into boxes that can be approved by bureaucratic systems that, in Tyler’s words, would ‘keep the crazy off my desk’ is orders of magnitude more damaging to this than most people realize. Such systems have to be game-theoretically defended against attempts to extract money. This forces there to be concrete systems guarding the money, so people get paid for ‘hours worked’ and such – lots is wasted on documentation. These systems force applicants to check off tons of boxes, and to pass multiple layers of convincing people, including convincing them that others will become convinced, forcing all projects to become trapped in signaling games and in looking normal and reputable and credible. Even when you could push through something special, there’s little way to know that without trying. So there’s a huge push to do normal-looking things, and tons of time, effort and money end up wasted.

Fact three: The judgment of a smart individual human can do a better job, and can use ‘watch out for patterns’ and general paying attention to prevent becoming too exploited. There’s no reason to focus on things being non-profits, or not being attempts to make money, or having formal measurable goals at all, if you trust your judgment and are willing to accept mistakes.

  4. Fact four: If the cost of application and the cost of documentation, before and after the grant, is dramatically lowered, then a lot of stuff that wasn’t viable and wouldn’t even have been considered is now on the table.

Fact five: People coming to you with short (max 1500 words) descriptions of their best ideas for what is worth doing in the world creates an awesome feed of cool new ideas and proposals and people, leading to food for thought, to new collaborations and connections, and to unknown unknown upsides. Yes, there will also be a lot of ‘money, please’ in there, but in my experience much less than you would naively expect.

You can apply here. I encourage you to do so.








On Robin Hanson’s Board Game

Previously: You Play to Win the Game; Prediction Markets: When Do They Work?; Subsidizing Prediction Markets

An Analysis Of (Robin Hanson at Overcoming Bias): My Market Board Game

Robin Hanson’s board game proposal has a lot of interesting things going on. Some of them are related to calibration, updating and the price discovery inherent in prediction markets. Others are far more related to the fact that this is a game. You Play to Win the Game.

Rules Summary

Dollars are represented by poker chips.

Media that contains an unknown outcome, such as that of a murder mystery, is selected, and suspects are picked. Players are given $200 each. At any time, players can exchange $100 for a contract in all possible suspects (one of which will pay $100, the rest of which will pay nothing).

A market is created for each suspect, with steps at 5, 10, 15, 20, 25, 30, 40, 50, 60 and 80 percent. At any time, each step in the market either contains dollars equal to its probability, or it holds a contract good for $100 if that suspect is guilty. At any time, any player can exchange one for the other: if there’s a contract, they can buy it for the listed probability; if there are chips, they can exchange a contract for the chips. Whoever physically makes the exchange first wins the trade.
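
The ladder mechanic above can be sketched in a few lines. The class and method names are mine, and the opening arrangement (chips sitting below the starting price, contracts at or above it) is an assumption; the post leaves the opening state unstated.

```python
# Sketch of one suspect's price ladder. Initialization is an assumption.
STEPS = [5, 10, 15, 20, 25, 30, 40, 50, 60, 80]  # percent price points

class SuspectLadder:
    def __init__(self, start=25):
        # True: the step holds a contract (buy it for `step` dollars).
        # False: the step holds `step` dollars in chips (sell a contract into it).
        self.has_contract = {s: s >= start for s in STEPS}

    def buy(self, step):
        """Swap `step` dollars of chips for the contract sitting there."""
        assert self.has_contract[step], "no contract at this step"
        self.has_contract[step] = False  # your chips now sit at this step
        return step  # cost in dollars

    def sell(self, step):
        """Swap a contract for the `step` dollars of chips sitting there."""
        assert not self.has_contract[step], "no chips at this step"
        self.has_contract[step] = True
        return step  # proceeds in dollars

ladder = SuspectLadder(start=25)
cost = ladder.buy(25)       # pay $25 for the cheapest available contract
proceeds = ladder.sell(20)  # later, sell a contract into the $20 step
```

Each step is a standing limit order from a dumb market maker, which is why being physically fastest to the board matters: the best prices are first come, first served.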

At the end of the game, the winning contract pays out, and the player with the most dollars wins the game.

Stages of Play

We can divide playing Robin’s game into four distinct stages.

In stage one, Setup, the source material we’ll be betting on is selected, and the suspects are generated.

In stage two, the Early Game, players react to incremental information and try to improve their equity, while keeping an eye out for control of various suspects.

In stage three, the Late Game, players commit to which suspects they can win with and lock them up, selling off anything that can’t help them win.

In stage four, Resolution, players again scramble to dump now-worthless contracts for whatever they can get and to buy up the last of the winning contracts. Then they see who won.


Not all mysteries will be good source material. Nor do you obviously want a ‘certified good’ source, because knowing that the source material makes for a good game is itself a huge update.

A proper multiple-suspect whodunit that keeps everyone in suspense by design keeps the scales well-balanced, ensuring that early resolutions are fake-outs. That can still make a good game, but an even more interesting game carries at least some risk that suspects will be definitively eliminated early, or even the case solved quickly. Comedy routines sometimes refer to the issue where they arrest someone on Law & Order too early in the episode, so you know they didn’t do it!

When watching sports, a similar dilemma arises. If you watch ‘classic games’ or otherwise ensure the games will be good, then the first half or more of the game is not exciting. Doing well early means the other team will catch up. So you want to choose games likely to be good, but not filter out bad games too aggressively, and learn to enjoy the occasional demolition.

The setup is also a giveaway, if it was selected by someone with knowledge of the material. At a minimum, it tells us that the list is reasonably complete. We can certainly introduce false suspects that should rightfully trade near zero from the start, to mix things up, and likely should do so.

One solution would be to have an unknown list of contracts at the start, and introduce the names as you go along. This would also potentially help with preventing a rush of trades at the very start.

In this model, you can always exchange $100 for a contract on each existing suspect, and a contract for ‘Someone You Least Suspect!’ Then, when a new suspect is introduced, everyone with a ‘Someone You Least Suspect!’ contract gets a contract in the new suspect for free for each such contract they hold. There are several choices for how one might introduce new suspects. They might unlock at fixed times, or players could be allowed to introduce them by buying a contract.

The complexity cost of hiding the suspects, or letting them be determined by the players, seems too high for the default version. It protects the fun of the movie and has some nice properties, but for the base game you almost certainly want to lay out the suspects at the start. This gives a lot away, but that’s also part of the game.

For the first few games played, it probably makes sense to choose mysteries ‘known to be good’ such as a classic Agatha Christie.

The game would presumably come with a website that allowed you to input a movie, show or other media, and output a list of suspects. It would also want to advise players on whether their selection was a good choice, or suggest good choices based on selected criteria. Both will need to be balanced to avoid giving too much away, as noted above; I’ll talk more about the general version of this problem another time.

If you are in charge of setup, I would encourage including at least one suspect that obviously did not do it, in a way that is easy to recognize early. This prevents players from assuming that all suspects will remain in play the whole time, and rewards those paying attention early. Keep people on their toes.

The Early Game

The market maker is intentionally dumb, although in default mode they are smart enough to know who the suspects are. All suspects start out equal.

There are a bunch of good heuristics, many of which should be intuitive to many viewers of mysteries, that create strong trading opportunities right away. To state the most basic, the earlier a suspect first appears on the screen, the more likely they are to have done the deed. So the moment one of the suspects appears – ‘That’s Bob!’ – everyone should rush to buy Bob, and perhaps sell everyone else if trading costs make that a good idea. How far up to buy him, or sell others, is an open question.

That will be the first of many times when there will be an ‘obvious’ update. There will also be non-obvious updates. Staying physically close to the board, chips and/or contracts ready to go, is key to make sure you get the trade first. This implies that making a race depend on the physical exchange of items might be a problem. Letting it be verbal (e.g. whoever first says ‘I buy Bob’) prevents that issue, but risks ambiguity.

What characterizes the early game, as opposed to the late game, is that the focus is on ‘make good trades’ rather than on winning. There’s no reason to worry too much about who owns how many of each contract, unless someone is invested heavily in one particular suspect. We can think of that as a player choosing to enter the endgame early.


Robin notes an interesting phenomenon, that players got caught up in the day trading and neglected to watch the mystery. Where should the smart player direct the bulk of their attention?

That depends upon your model of murder mysteries.

One model says that murder mysteries are ‘fair’. Clues are introduced. If you pay attention to those clues, you can figure out who did it before the detective does. When the detective solves the mystery, you can verify that the solution is correct once you hear their logic. If you can solve the mystery first, you can sell every worthless contract and buy all the worthwhile contracts. Ideally, that should be good enough to win the game, especially if you execute properly, selling and buying in balance without giving away that you believe you’ve solved the mystery.

Another related model says that murder mysteries follow the rules of murder mysteries, and that this often is good enough to narrow down or identify the killer. That way-too-famous-for-his-role actor is obviously the killer. Another would-be suspect was introduced at the wrong time, so she’s out. A third could easily have done it, but that wouldn’t work with the thematic elements on display.

A third model says that the detective, or others in the movie, have a certain credibility. Thus, when Sherlock Holmes says that Bob is innocent, that is that. Bob is innocent. You don’t need to know why. Evidence otherwise might not mean much, but there’s someone you can trust.

Functionally, these three are identical once you know what factors you’re reacting to. They say that (some kinds of) evidence count as evidence, and resulting updates are real. The more you believe this, the more you should pay attention to the movie. This includes trying early on to figure out what type of movie this is. Be Genre Savvy! Until you know what rules apply, don’t worry too much about day trading, unless people are going nuts.

A fourth model says that the mystery was chosen for a reason, and written to keep up suspense, so nothing you learn matters much beyond establishing who the suspects are. The game already did that for you. Unless the game followed my advice and included an obviously fake suspect or two, to punish players who only look at trading.

If you believe this model, and don’t think there will be news or fake suspects (or think they will be sufficiently obvious that you’ll know anyway, if only by how others talk and act), then you won’t put as much value on watching the movie. If trading has good action, trading might be a better bet.

A fifth model that can overlap with the previous models says that others will watch the movie and process that information, so there’s no need to watch yourself if there are enough other players. You might think that there is then a momentum effect, where players are unwilling to trade aggressively enough on new information. Or you might think that players overreact to new information, especially if you’re a fourth-model (nothing matters, eat at Arby’s) kind of theorist.

If you feel others can be relied on to react to news, you might trade on news even if you don’t think it matters, because others will trade after you, and you can then cash out at a quick profit. Just like in the real markets.

Or you might concentrate on arbitrage. Robin observed that players would focus on buying good suspects rather than selling poor suspects, and this often resulted in probabilities that summed to more than 100%. This offers the chance for risk-free profits, plus the chance to place whatever bet you like best while cashing in.
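That arbitrage is simple enough to sketch. Assuming contracts pay $100 to the guilty suspect and $0 to everyone else (consistent with the prices used in this post), selling one contract of each suspect locks in a risk-free profit whenever the bids sum to more than $100:

```python
# Exactly one suspect is guilty, so a full set of sold contracts has a
# guaranteed liability of the $100 payout, no matter who did it.

def arbitrage_profit(bid_prices, payout=100):
    """Guaranteed profit from selling one contract of every suspect."""
    total = sum(bid_prices)
    return total - payout if total > payout else 0

# Four suspects bid up to a combined 115% implied probability:
print(arbitrage_profit([35, 30, 28, 22]))  # 15, no matter who is guilty
```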

In my mind, the question boils down to where the game will be won and lost. Is there enough profit in day trading to beat the person who placed the largest bet on the guilty party? What happens in the endgame?

The End Game

A player enters the endgame when they attempt to ensure that they win if a particular suspect is guilty.

This is not as difficult as it looks, and could be quite difficult to fight against. Suppose I want to bet on Alice being the culprit. I could sell all other suspects and buy her contracts. As a toy example, let’s say there are four suspects, each player starts with two contracts in each at the $25 starting price, and let’s say I decide to butcher my executions. For each of the other three suspects, I sell one contract at $20 and one at $15, and I buy three more Alice contracts at $25, $30 and $40.

If the game ends and Alice is guilty, I collected $105 selling worthless contracts, and my five Alice contracts pay $500 against the $95 I spent, for a net profit of $310 on my $200 in starting equity. If she’s innocent, I collect nothing; the $95 I spent on Alice contracts roughly cancels the $105 I made selling, so I’m out essentially my entire $200 and die broke. That’s really, really terrible odds if I chose Alice at random!

But if it’s a 10 person game and I do that, even if I chose at random, 25% of the time Alice is guilty. Can someone else make more than $310 to beat me?
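A minimal sketch of the accounting, assuming a $100 payout per winning contract and a starting endowment of two $25 contracts in each of four suspects (so $200 in equity and no cash):

```python
PAYOUT = 100   # a winning contract pays $100; the rest pay $0 (assumed)

cash = 0
alice_contracts = 2           # starting position in Alice

cash += (20 + 15) * 3         # dump two contracts of each other suspect
cash -= 25 + 30 + 40          # buy three more Alice contracts
alice_contracts += 3

equity_if_guilty = cash + alice_contracts * PAYOUT
equity_if_innocent = cash     # Alice contracts are worthless

print(equity_if_guilty, equity_if_innocent)  # 510 10
```

Against $200 in starting equity, that is roughly +$310 if Alice is guilty and nearly a total loss if she is innocent.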

If after I finish, others return all the prices to normal, then someone else could profit from my initial haste, then execute the same trades I did at better prices. If that happens, I’m shut out.

That works if you jump the gun, and enter the endgame too early. That’s true even if Alice is the most likely suspect.

In particular, others now need to make a choice. Let’s say I went all in on Alice. There are three basic approaches to responding:

  1. Abandon Alice. If Alice is guilty, you’ve lost. So it’s safe to assume Alice is innocent, and sell any Alice contracts at their new higher prices, especially once you’re broke and can no longer buy more. If this encourages someone else to take option 2 and also move in on Alice, even better; that’s one less person who can beat you if Alice is innocent. The majority of players should do this.
  2. Attack Alice. If others are abandoning Alice in droves, her price might collapse even below $25 as people rush to sell. You can then pick contracts up cheap, sell other contracts at better prices, and have a strictly better position.
  3. Arbitrage. Try to make as much money off the situation as possible, without committing to a direction. If people are being ‘too strategic’ and too eager to get where they want to go, rather than focusing on getting the best price, then by making good trades (selling Alice when someone buys too quickly, buying when others sell too quickly) and forcing others to get worse prices, you can end up with more value, then decide later what to do.

If you only engage in arbitrage, and others commit to suspects, you’ll be in a lot of trouble unless you’ve already made a ton, because you won’t have anyone to trade against. Your only option becomes to trade with the market, which limits how much you can get on the suspect you finally decide to go with, even if the mystery is solved while the market is open, and you’re the only one left with cash.

The good and bad news is that’s unlikely to happen, as others will also ‘stick around’ with flexible portfolios. That means that you won’t be able to make that much when the mystery gets solved, but it does mean you can divide the spoils. If six players commit to suspects while four make good trades, not only are two of the six already shut out, it’s likely the remaining four can coordinate (or just do sensible trades) to win if two or three of the suspects are found guilty, and sometimes should be able to nail all four.

When you have four (or six) suspects and ten players, there are not enough suspects for everyone to own one, and there certainly aren’t enough for anyone to own two. That means that even if a suspect looks likely to be guilty, if you know you can’t win that scenario, you’ll be dumping, and that means at least seven of ten people are dumping any given suspect if they understand the game.

The logical response to this is to stay far enough ahead on your suspect that you clearly win if they’re guilty, and if you get a good early opportunity to dump other contracts you should definitely do that. Good trades are generally good, and those trades just got even better, especially if everyone focuses on buying rather than selling. What you don’t want to do is overpay, or run out of cash (and/or run out of things you can sell).

Thus, I might buy the Alice contracts at $20, $25 and $30, and start selling contracts on suspects I think are trading rich. What I’m worried about is competition – I don’t want other players buying Alice contracts, so if they do, I’ll make sure they don’t get size off by buying at least at their price, and I’ll make sure to stay ahead of them on size. I’ll also think about whether the remaining players are sophisticated enough to sell what they have, even at lousy prices; if they are, I’ll be careful to hold a bunch of cash in reserve. If there are ten players, I can expect there to exist 16-25 Alice contracts, and I want to be sure not to run out of money.

Rank Ordering

This suggests each player has a few different goals.

You want to accumulate contracts in suspects you ‘like’ (which mostly means the ones you think are good bets), so you can get ‘control’ of one or more of them. Control means that if they did it, you win.

You want to get rid of contracts in the suspects you don’t like. The trick here is that sometimes the price will go super high (relative to the probability they did it) as multiple players compete to gain ‘control’ of the suspect. Other times, the price will collapse because there is only one bidder for control of that suspect. If one player gets a bunch of contracts, and is in good overall shape, then no one else will compete.

That in turn might drive the price so low – $10 or even $5 – that the value of their portfolio shrinks a lot, tempting another player to enter, but doing so would drive the price up right away, so it often doesn’t make sense to compete. If Bob is buying up Alice contracts and Carol now buys one at $10, who is going to sell one now? Much better to wait to see if the price goes higher, which in turn puts Bob back in control. The flip side of that is, if Carol can buy a $10 and a $15 contract, and force Bob to then pay $20, Carol can sell back to Bob at a profit. It’s a risky bluff if others are actively selling, but it can definitely pay off.

The key in these fights is who has more overall portfolio value, plus the transaction costs of moving into more contracts. If Carol can make $100 trading back and forth in other contracts, Bob is going to have a tough time keeping control, and mostly has to hope that Carol chooses to go after a different suspect. By being in as good shape as possible, Bob is both more likely to win the fight, and (if others realize this) more likely to avoid the fight.

With a lot of players engaged in active day trading who aren’t strategically focused, transaction costs could be low. If they’re sufficiently low, then it could be a long time before it is hard to buy and sell what you want at a reasonable price, postponing the endgame until quite late. The more other players are strategically focused, and the more strategy determines price, the harder it is to trade, the more existing positioning matters, and the less you can day trade for profit beyond anticipating a fight over a suspect, or a dump of one.

Rich Player, Poor Player

Suppose you’re a poor player. You made some trades, and they didn’t work out. Perhaps you held on to a suspect or two too long, and others dumped them, either strategically or for value. Perhaps you had a hunch, got overexcited, and others disagreed, and now you’re looking foolish. Now you only have (let’s say) $120 in equity, down from $200.

You Play to Win the Game. How do you do that? There are more players than suspects, several of whom have double your stake. So you’ll need to find a good gambit.

A basic gambit would be to buy up all the contracts you can of a suspect everyone has dismissed. Even if there are very good reasons they seem impossible and are trading at $5, you can still get enough of them to win if it works out, and you might have no competition since players in better shape have juicier targets. Slim chance beats none.

But if even that’s out of the question, you’ll have to rebuild your position. You will need to speculate via day trading. Any other play locks in a loss. You find yourself in a hole, and have no choice but to keep digging. Small arbitrage won’t work. Your best bet is likely to watch the screen and listen, and try to react faster than everyone else in the hopes that the latest development is huge or seen as huge, then turn around and sell your new position to others to make a quick buck. Then hope there’s still enough twists to do this again.

If the endgame has arrived, and rich players are sitting on or fighting for all the suspects, you’ve lost. Your best bet is to consolidate into cash, and hope some suspect crashes down to $5 for you.

Now suppose you’re a rich player. You have $300 in equity. How do you maximize your chance of winning?

The basic play is to corner the market on the most likely suspect, or whoever you think is most likely. If you make a strong move in, you should be able to scare off competition, and even if you don’t, you can use that as an opportunity to make more profit if they drive the price up. At some point, others will have to dump, and you can afford to give them a good price if you have to. It’s hard to win a fight when outgunned. The key is not to engage too much too soon, as this risks letting a third player take advantage of an asset dump later. So you’ll want to hold some cash for that, if possible. Remember that if you’re out of cash, you’ll need something like 12-14 contracts to feel safe from a dump, depending on how much equity you’ve built. That shuts out other players.

The advanced play, if you’re sufficiently far ahead, is to try to win on multiple suspects. That’s super hard. Even if you had $400 in equity, if you divide it in half, there are still multiple other players over $200. It seems unlikely you can get control of multiple worthwhile suspects. There’s no point in trying for multiple bargain basement suspects at the expense of one good one, even if it works. So is there any hope here?

I think there is, in the scenario where there is a clear prime suspect.

In this scenario, the prime suspect was bid high early on. Given Robin’s notes about player behavior and tendency to push prices too high, and the battle for control of the suspect, prices might get very high very quickly. There also may be players who will refuse to sell their contracts in the prime suspect, because they don’t realize that they’re shut out of winning in that case. Either they’re maximizing expected value rather than chance of winning, or they don’t realize the problem, or both.

This could open up an opportunity where the ‘net profit’ on the prime suspect isn’t that high for any player. Suppose contracts start at $25, and everyone starts with their two contracts. They then trade at $30, $40, $50 and $60 in a row, not all to the same player. So there are few chances to buy contracts that make you that much money. If you buy an $80 contract your maximum profit is $20, which is easy to beat by day trading.

So what you can do is go for the block. Hopefully you helped drive the price up early, which is part of how you got your equity. Then contracts only really traded at $60 and $80. So even if the suspect is guilty, someone who moved in on this without day trading first is not going to end up with many contracts. They start with $200, so let’s say they end up with 3 contracts and a little cash.

It’s not crazy for you to sell the suspect at the top, do some successful day trading, and then have over $300 in cash. You could win without any contracts that pay out, if you know you’re the most successful day trader and no one can have that many contracts.

That’s a better position than having 5 or 6 contracts in the prime suspect, since you still have cash if they’re innocent. The trick is then having that be enough to win on another suspect as well, or splitting your efforts by holding onto contracts elsewhere. Tricky. But perhaps not impossible, especially if people are dumping contracts at $5.

At a minimum, what you can do is be in a strong position to respond to new developments, and be able to choose which other suspect to back later in the game if you now think they’re more likely, while still winning if the situation doesn’t change. That’s very strong.

A final note is that it is legal, in the game, to trade with another player without going through the market. This could be used to buy out a player’s position in a suspect, shifting control of that suspect. It avoids the issue where, once a player starts dumping a position, the price collapses, and it prevents other players from ‘intercepting’ the transfer and ruining the buyer’s attempt to accumulate a new position and take control. Thus, players should learn that if they have a bunch of contracts and want out, they should check for a bulk buyer, and if they want in they should consider doing the same. The risk of course is that you tip your hand, which makes doing it on the buy side less attractive.

Flexible Structures

It’s also worth noting that you can extend the idea easily to other prediction markets, and to an online version.

You could trade on the outcome of a sporting event, or an election, or any other real-world prediction market, using the same rules. You could play a board game, and also play the contract game on the outcome of your board game. That gives players something to do between turns and extra things to think about, and gives extra players or eliminated players something to do.

You could trade over a series of outcomes or events (for example, all the football games played today, or both the winner of the game and the combined number of points scored, or even obscure stuff like the number of punts) in order to reward more trading ‘for value’ and place less emphasis on being right. Or just keep track of funds between games, watch multiple shows, and reward the overall winner.

That raises the question of what we can learn about prediction markets from the game.

Market Accuracy and Implications

Early in the game, market prices should roughly reflect fair probabilities of being guilty. Anyone who jumps the gun for strategic positioning will lose out to a more patient player. That won’t stop players from being overeager, and bidding suspects up too high, but as Robin noted that opens the door for others to do arbitrage and sell the contracts back down to reasonable prices.

Later in the game, prices will grow increasingly inaccurate as players jockey for position, and let strategic considerations override equity considerations.

This is a phenomenon we see in many strategic games. Early in the game, players mostly are concerned about getting good value, as most assets are vaguely fungible and will at some point be useful or necessary. As the game progresses, players develop specialized strategies, and start to do whatever they need to do to get what they need, often at terrible exchange rates, and dump or ignore things they don’t need, also at terrible exchange rates.

If we wanted to improve accuracy, we’d need to make the game less strategic and more tactical, by rewarding players who maximize expected profits. There’s a dumb market that is handing out Free Money when news occurs. We’d like players to battle for a share of that pie, rather than competing for control of suspects. If the game was played over many rounds, the early rounds would mostly focus on expected value and doing good trades. If the game was played for real money, and settled in actual dollars, then we’d definitely have a lot more accurate pricing!

If a market has traders purely motivated by expected value and profit, then its pricing will be as good as the pricing ability of the traders.

If a market has a few ‘motivated’ traders, or noise traders, that are doing something for reasons other than expected value, that is good. You need a source of free money to make the market work. Thus, the existence of the bank, as a source of free money, is great, because it motivates the game. You can imagine a version of the game where players can only trade when they agree to it. There would still be trades, since the prize for winning should overcome frictions and adverse selection, but volume of trading would plummet.

If a market has a few people who have poor fair values, that works the same way as having motivated traders.

If a market has too many traders who have poor fair values, or in context they have fair values that are not based on the expected payout, then relationships break down. There’s now profit for those who bet against them, but that doesn’t mean there’s enough money in the wings to balance things out. At some point that money is exhausted, and no one paying attention has spare funds. Prices stop being accurate, to varying degrees.

In particular, this illustrates that if those managing the money have payouts that are non-linear functions of their profits, then very weird things will happen. If I get fired for missing my benchmark, and so do my colleagues, but we don’t get extra fired for missing it by a lot, then this will lead us to discount tail risks. In the game, this takes the form of dumping suspects you can’t control – if Alice did it, you’ve already lost, and third prize is you’re fired. There are many other similar scenarios. If we want accurate prices, we need traders to have linear utility functions, or a reasonable approximation thereof.
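A toy comparison of the two utility functions, with made-up numbers: Alice is 25% to be guilty, contracts pay $100, and another player has Alice cornered, so your chance of winning the prize given her guilt is zero:

```python
# Under win-or-nothing prize utility, a contract's worth to you is its
# marginal effect on P(win the prize), not its expected cash payout.

p_guilty = 0.25            # probability Alice did it (assumed)
payout = 100               # what a winning contract pays
p_win_given_guilty = 0.0   # someone else controls Alice: you lose that branch

cash_ev = p_guilty * payout                 # linear-utility fair value
prize_value = p_guilty * p_win_given_guilty # contribution to winning the prize

print(cash_ev, prize_value)  # 25.0 0.0
```

A linear trader would pay up to $25; a prize-motivated trader should dump Alice at any positive price, which is exactly the tail-risk discounting described above.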


This game sounds like a lot of fun, and seems to have lots of opportunities for deep tactical and strategic play, for bluffing, and to do things that players should find fun. I really hope that it gets made. You can take one or more of many roles – arbitrageur, mystery solver, genre-savvy logician, momentum trader, value trader, tactician or strategist – or just hang out and watch the fun and/or the mystery, and if you later get a hunch, you can go for it.

I hope to talk to a few friends of mine who have small game companies, in the hopes one of them can help. Kickstarter ho?

If anyone out there is interested in the game and making it happen, please do talk to Robin Hanson about it. I’m sure he’d be happy to help make it a reality. And if you’re looking to play, I encourage you to give it a shot, and report back.

















You Play to Win the Game

Previously (Putanumonit): Player of Games

Original Words of Wisdom:

Quite right, sir. Quite right.

By far the most important house rule I have for playing games is exactly that: You Play to Win the Game.

That doesn’t mean you always have to take exactly the path that maximizes your probability of winning. Style points can be a thing. Experimentation can be a thing. But in the end, you play to win the game. If you don’t think it matters, do as Herm Edwards implores us: Retire.

It’s easy to forget, sometimes, what ‘the game’ actually is, in context.

The most common and important mistake is to maximize expected points or point differential, at the cost of win probability. AlphaGo brought us many innovations, but perhaps its most impressive is its willingness to sacrifice territory it doesn’t need to minimize the chances that something will go wrong. Thus it often wins by the narrowest of point margins, but in ways that are very secure.

The larger-context version of this error is to maximize winning or points in the round rather than chance of winning the event.

In any context where points are added up over the course of an event, the game that matters is the entire event. You do play to win each round, to win each point, but strategically. You’re there to hoist the trophy.

Thus, when we face a game theory experiment like Jacob faced in Player of Games, we have to understand that we’ll face a variety of opponents with a variety of goals and methods. We’ll play a prisoner’s dilemma with them, or an iterated prisoner’s dilemma, or a guess-the-average game.

To win, one must outscore every other player. Our goal is to win the game.

Unless or until it isn’t. Jacob explicitly wasn’t trying to win at least one of the games by scoring the most points, instead choosing to win the greater game of life itself, or at least a larger subgame. This became especially clear once winning was beyond his reach. At that point, the game becomes something odd – you’re scoring points that don’t matter. It’s not much of a contest, and it doesn’t teach you much about game theory or decision theory.

It teaches you other things about human nature, instead.

A key insight is what happens when a prize is offered for the most successful player of one-shot prisoner’s dilemmas, or a series of iterated prisoner’s dilemmas.

If you cooperate, you cannot win. Period. Someone else will defect while their opponents cooperate. Maybe they’ll collude with their significant other. Maybe they’ll lie convincingly. Maybe they’ll bribe with out-of-game currency. Maybe they’ll just get lucky and face several variations on ‘cooperate bot’. Regardless of how legitimate you think those tactics are, with enough opponents, one of them will happen.

That means the only way to win is to defect and convince opponents to cooperate. Playing any other way means playing a different game.
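A tiny simulation makes the point concrete. The payoff matrix below is the standard prisoner's dilemma (T=5, R=3, P=1, S=0 – my assumption, not numbers from the original experiment); with enough cooperators in the field, the single top score always belongs to a defector:

```python
import itertools

T, R, P, S = 5, 3, 1, 0   # standard PD payoffs: temptation > reward > punishment > sucker

def payoff(me, them):
    """One-shot prisoner's dilemma payoff for 'me'."""
    if me == 'C':
        return R if them == 'C' else S
    return T if them == 'C' else P

# A round-robin of one-shot games: eight cooperators, two defectors.
players = ['C'] * 8 + ['D'] * 2
scores = [0] * len(players)
for i, j in itertools.combinations(range(len(players)), 2):
    scores[i] += payoff(players[i], players[j])
    scores[j] += payoff(players[j], players[i])

winner = max(range(len(players)), key=lambda i: scores[i])
print(players[winner], max(scores))  # D 41
```

A cooperator here tops out at 21 points while the defectors score 41: when a single prize goes to the highest scorer, cooperating is structurally shut out.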

When scoring points, make sure the points matter.

These issues will also be key to the next post, where we will analyze a trading board game proposed by Robin Hanson.

