# Superforecasting Fundamentals

Three important questions you have to ask yourself:

- Will things really be different?
- How much should I react to this new information?
- How confident should I be in my forecast?

## Will things really be different?

Your own experiences create an inside view and a bias toward the familiar. Start with appropriate reference cases to create a base rate from which to start your thinking. The outside view compares one case to a set of generally similar cases; the inside view examines how the specifics of this case might make it different. Be mindful of your tendency to fill in blanks with case-specific or easy-to-imagine information.

## How much should I react to new information?

We need to strike a balance between updating our beliefs in light of new information and not reacting to noise.

Two biases pull us toward conservatism or stubbornness:

- belief perseverance: we tend to continue believing what we believe, even when we know contradictory evidence exists.
- confirmation bias: uncritically accepting information that agrees with our beliefs and criticizing information that contradicts them.

There are three actions you can take to tackle conservatism:

- Look for contradictory information. This can be painful because we don't want to find out we are wrong. You can start by imagining alternative scenarios and thinking about what information would support those scenarios.
- Revisit your analysis regularly and update your forecasts. This can help you notice how you are influenced by biases.
- Don't react impulsively to new information (especially not with dismissive thoughts). Instead, re-evaluate your analysis in light of it.

There are also instances when we overreact to new information. We tend to give more weight to information that is most available to our minds: the availability bias.
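One way to make "how much should I react?" concrete is to treat your forecast as a prior and apply Bayes' rule: the size of the update is governed by how diagnostic the new evidence is. A minimal sketch, with made-up probabilities for illustration (not from the book):

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of the event after seeing the evidence."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Diagnostic evidence (twice as likely if the event will happen) moves the forecast:
print(bayes_update(0.30, 0.80, 0.40))  # roughly 0.46

# Noise (evidence equally likely either way) should not move the forecast at all:
print(bayes_update(0.30, 0.50, 0.50))  # stays at 0.30
```

Under-reaction means updating less than the likelihood ratio warrants; over-reaction means updating on evidence that is no more likely under one hypothesis than the other.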
New information is not automatically more important than old information. Only through lots of practice and performance review can we become better forecasters.

## How confident should I be in my forecast?

Most people in most circumstances are overconfident in their estimates of probabilities. The book has some great graphs of estimated probability versus actual frequency across fields. The exceptions were lawyers, who were underconfident when they felt a loss was highly likely. Meteorologists were surprisingly well calibrated.

Good forecasters must have:

- humility: good calibration of what you do and don't know
- decisiveness: forecasts that provide resolution. Make high-probability forecasts for events that do happen, and low-probability forecasts for events that don't. They provide useful forecasts rather than accurate "maybe"s.

## Related

- [[The Behavioral Investor]]
- [[Framework for analysis and decision-making]]
- [[Confidence Calibration]]
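The calibration graphs mentioned above (estimated probability versus actual frequency) are easy to reproduce for your own track record: bin your stated probabilities, then compare each bin's mean forecast to the fraction of events that actually happened. A minimal sketch with invented forecasts and outcomes:

```python
from collections import defaultdict

def calibration_table(forecasts, outcomes, n_bins=5):
    """Group (probability, happened?) pairs into n_bins equal-width bins and
    report each bin's mean forecast vs. the actual frequency of the event."""
    bins = defaultdict(list)
    for p, happened in zip(forecasts, outcomes):
        b = min(int(p * n_bins), n_bins - 1)  # clamp p == 1.0 into the top bin
        bins[b].append((p, happened))
    table = {}
    for b, pairs in sorted(bins.items()):
        mean_p = sum(p for p, _ in pairs) / len(pairs)
        freq = sum(h for _, h in pairs) / len(pairs)
        table[(b / n_bins, (b + 1) / n_bins)] = (mean_p, freq)
    return table

# Invented data: a well-calibrated forecaster's "80%" events come true ~80% of the time.
forecasts = [0.9, 0.8, 0.8, 0.1, 0.2, 0.7]
outcomes  = [1,   1,   0,   0,   0,   1]
for (lo, hi), (mean_p, freq) in calibration_table(forecasts, outcomes).items():
    print(f"{lo:.1f}-{hi:.1f}: forecast {mean_p:.2f}, actual {freq:.2f}")
```

Humility shows up as the two columns agreeing; decisiveness shows up as most forecasts sitting near 0 or 1 rather than in the middle bins.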