True unfreedom is the cage we build within our own hearts.

"Thinking, Fast and Slow" Reading Notes


Author: Daniel Kahneman
Recommendation: ⭐⭐⭐⭐⭐

Excerpts#

Preface#

  • Our subjective judgments are biased: we are prone to believe research findings based on inadequate evidence, and we tend to collect too few observations in our own research.

  • People use resemblance as a simplifying heuristic (roughly, a rule of thumb) to make difficult judgments, and this reliance on heuristics produces predictable biases (systematic errors) in their judgments.

  • People estimate the importance of things based on how easily they can retrieve information from memory, which is often related to the extent of media coverage. Topics that are frequently mentioned become vivid in the mind, while others are gradually forgotten.

  • What the media choose to report corresponds to what is already on people's minds, which is why it is no accident that authoritarian regimes put heavy pressure on independent media.

  • When faced with difficult problems, we often answer relatively simple questions while ignoring the fact that we have replaced the original question.

Chapter 1 An Angry Face and a Multiplication Problem#

  • A step-by-step calculation process is slow thinking.

  • System 1 runs automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, such as complex calculations.

  • When people focus too much on something, they block out other things, even those they are usually very interested in.

Chapter 2 Movie Protagonists and Supporting Characters#

  • When the mind is "sprinting," it can become effectively blind to secondary information. For most people, maintaining a coherent train of thought, or engaging in effortful thinking from time to time, requires self-control.

Chapter 3 Inertia Thinking and the Contradiction of Delayed Gratification#

  • Those who have experienced flow describe it as "a state of effortless concentration so deep that one loses the sense of time, of oneself, and of one's problems." Their descriptions of the pleasure of this state are so compelling that Mihaly Csikszentmihalyi calls it "optimal experience."

  • Riding a motorcycle at 150 miles per hour and playing in a chess tournament both demand effort; in a state of flow, however, maintaining focused attention on an absorbing task requires no self-control, which frees all resources to be directed to the task at hand.

  • One of the main functions of System 2 is to supervise and control thought activities and various behaviors guided by System 1, allowing some thoughts to be directly reflected in actions or to suppress or alter other thoughts.

Chapter 4 The Magical Power of Association#

  • An idea that has been evoked triggers many other ideas, and this cascade of associations spreads rapidly through the mind.

  • You think you know yourself well, but you are mistaken.

  • In an authoritarian country, the ubiquitous portraits of leaders not only convey the feeling that "Big Brother is watching you," but also gradually erode your ability to think independently and act autonomously.

Chapter 5 Your Intuition May Just Be an Illusion#

  • System 1 creates a sense of familiarity, while System 2 relies on this familiarity generated by System 1 to make correctness judgments.

  • Repetition can induce a relaxed state and a comforting sense of familiarity.

Chapter 7 The Letter "B" and the Number "13"#

  • Conscious doubt requires holding incompatible interpretations in mind at the same time, which demands effort and is not what System 1 does well. Uncertainty and doubt are the domain of System 2.

  • Gilbert proposed that understanding a statement must begin with an attempt to believe it: you must first know what the idea would mean if it were true. Only then can you decide whether to "unbelieve" it.

  • When System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe; System 2 is in charge of doubting and unbelieving, but it is sometimes busy, and often lazy, and frequently neglects this duty.

  • Evidence shows that when people are tired or depleted, they are more easily influenced by hollow yet persuasive information, such as advertisements.

  • Liking (or disliking) a person leads to liking (or disliking) everything about that person—including aspects you have not yet observed—this tendency is called the halo effect.

  • The sequence in which we observe a person's traits is often a matter of chance. Yet sequence matters, because the halo effect gives extra weight to first impressions, so that later information is largely discounted.

  • The method for avoiding the halo effect in grading follows a general principle: decorrelate error!

  • Jumping to conclusions on the basis of limited evidence is central to understanding intuitive thinking.

  • You will often find that knowing little makes it easier to fit everything you know into a coherent pattern.

  • Framing effect: different ways of presenting the same information often evoke different emotions. "The odds of survival one month after surgery are 90%" is more reassuring than "mortality within one month of surgery is 10%." Likewise, cold cuts described as "90% fat-free" are more attractive than ones "containing 10% fat." The statements in each pair are informationally equivalent; only the wording differs, yet people read different meanings into them and believe that what they see is the truth.

  • She knows nothing about this person's management skills. The reason she has a good impression of him is that she once heard him give a brilliant presentation.

Chapter 8 How Do We Make Judgments?#

  • People assess a person's competence by combining two dimensions: strength and trustworthiness.

  • The "mental shotgun" (the divergent character of our thinking, which computes more than we intend) is what makes intuitive judgments possible.

  • A clear example: asked whether he thinks the company is financially sound, he instead thinks of the product of theirs that he loves.

Chapter 9 Goal Problems and Heuristic Problems Are Inseparable#

  • When the mind is in a normal state, you have intuitive feelings and opinions about almost everything that comes your way.

  • A heuristic question is the simpler question you answer by bypassing the original one.

  • When people are asked to judge a probability, they are actually judging something else and believe they have judged a probability. Faced with a difficult "target question," if a related and easily answered "heuristic question" comes immediately to mind, System 1 typically performs this substitution, giving the answer to the easier question in place of the harder one.

  • The 3-D heuristic: in a picture, a figure that appears more distant also appears larger.

  • Our impression of three-dimensional size dominates our judgment of two-dimensional size on the page; the illusion arises from the 3-D heuristic.

  • The heuristic's bias: objects printed at the same size look larger when they appear farther away.

  • Your associative memory quickly and automatically uses available information to construct the most appropriate story.

  • The affect heuristic: because we like something, we agree with it.

  • The dominance of conclusions over arguments does not mean that your mind stops working altogether, nor that you may ignore information and reasoned argument entirely and simply draw whatever conclusion you like.

  • System 2 also has the functions of actively searching memory, complex calculations, comparisons, planning, and decision-making. System 2 seems to always be in the highest decision-making position and has the ability to resist the suggestions of System 1; it can slow things down and begin logical analysis.

  • Exaggerating emotional consistency (halo effect).

  • Focusing on the evidence at hand while ignoring the evidence that is absent ("what you see is all there is").

Chapter 10 The Law of Large Numbers and the Law of Small Numbers#

  • System 1 is very good at a thinking pattern—automatically and effortlessly identifying causal relationships between things, even if sometimes this relationship does not exist at all.

  • For a rational person, unbiased and moderate predictions should not raise issues.

  • For researchers who choose sample sizes by intuition, the risk of error (failing to confirm a true hypothesis) can be as high as 50%.

  • When information is insufficient, extreme predictions and the willingness to predict rare events stem from System 1.

  • Confidence is determined by the coherence of the most reasonable story you extract from available information.

  • Our intuitive prediction is encouraging, but it may be much too extreme; let us look again at the evidence at hand and regress the prediction toward the mean.

  • A general bias: we favor certainty over doubt.

  • System 1 is not good at questioning.

  • System 2 can question because it can simultaneously hold multiple incompatible possibilities.

  • The law of small numbers is one manifestation of the general bias that favors certainty over doubt.

  • We often feel very familiar with and knowledgeable about a person, but in fact, we know very little about them.
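The "law of small numbers" above can be made concrete with a short simulation (a sketch; the sample sizes and the 70% cutoff are arbitrary choices, not from the book): with a fair coin, small samples produce lopsided, "patterned-looking" results far more often than large ones.

```python
import random

def extreme_rate(sample_size: int, trials: int = 10_000, threshold: float = 0.7) -> float:
    """Fraction of fair-coin samples whose heads-proportion looks 'extreme'
    (at or above `threshold`, or at or below 1 - threshold)."""
    random.seed(42)  # reproducible sketch
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        p = heads / sample_size
        if p >= threshold or p <= 1 - threshold:
            extreme += 1
    return extreme / trials

# Small samples look "patterned" far more often than large ones.
small = extreme_rate(10)    # e.g. 7+ heads or 3- heads out of 10
large = extreme_rate(100)   # e.g. 70+ heads or 30- heads out of 100
```

Roughly a third of 10-flip samples come out lopsided, while 100-flip samples almost never do: small samples invite us to see causes where there is only chance.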

Chapter 19 The Illusion of "Knowing"#

  • We constantly try to make sense of the world, and in doing so we inevitably produce "narrative fallacies." The explanations that attract us are concrete rather than abstract; they assign a larger role to talent, stupidity, and intention than to luck; and they focus on the few striking events that happened rather than on the countless events that failed to happen.

  • We humans often concoct strained explanations for past regrets and believe them to be true, thus deceiving ourselves.

  • Causal explanations for random events are inevitably wrong.

  • The halo effect adds a final touch: an aura of invincibility around the protagonists of the story.

  • The Google story involves a great deal of skill, but luck played a larger role in the actual events than it does in the telling, and the more luck was involved, the less there is to learn from the story.

  • Our conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our own ignorance.

  • We are strongly inclined to believe that patterns (such as a run of six baby girls in a row) cannot arise by mere chance, and must instead be the product of mechanical causality or of someone's intention.

  • The social cost of hindsight.

  • Simply put, if you follow your intuition, you often make mistakes by viewing random events as patterned events.

  • The "I-knew-it-all-along" effect.

  • The phenomenon of "hindsight."

  • If an event indeed occurs, people will exaggerate the likelihood of their previous predictions; if a possible event does not occur, subjects will incorrectly recall that they always thought the likelihood of it happening was low.

  • Revising our beliefs in light of what has actually happened produces a powerful cognitive illusion: we can no longer reconstruct what we used to believe.

  • The hindsight bias has a detrimental effect on decision-makers' evaluative behavior, leading observers to evaluate the quality of a judgment not based on the rationality of the judgment process but rather on the quality of the outcome.

  • We all need a reassurance, wanting to know that our actions will have appropriate results.

  • The anchoring effect is ubiquitous in life.

  • Philip Rosenzweig wrote a book on this, The Halo Effect.

  • The anchoring effect.

  • The anchoring effect occurs when people consider a particular value for an unknown quantity before estimating that quantity.

  • Success and failure stories often exaggerate the impact of leadership style and management measures on company performance, so these stories are basically useless.

  • Because luck plays so large a role, the quality of leadership and management cannot be reliably inferred from observations of success.

  • We should guard against outcome bias: although outcomes are sometimes informative, judging a decision foolish merely because it turned out badly is a mistake.

  • Suggestion is a form of anchoring effect.

  • System 1 understands sentences by trying to believe in the truth of their content; its selective activation of corresponding ideas produces a series of systematic errors that make us more easily deceived and more firmly believe in our ideas.

  • Numbers of different magnitudes prime different sets of ideas in memory, and these biased samples of ideas become the basis of the estimate (for instance, of the annual average temperature), producing a biased estimate.

  • If you know little about the trees of California but are asked whether a redwood can be taller than 1,200 feet, you might assume the number is not far from the truth, since whoever posed the question presumably knows the tree's real height, making the anchor a potentially valuable hint. A key finding of anchoring research, however, is that anchors that are obviously arbitrary can be just as effective as anchors that plausibly carry information.

  • The anchoring effect is triggered by this association. Whether the story is true or credible does not matter at all. The powerful influence of random anchoring is an extreme example of the anchoring effect, as random anchoring clearly provides no information.

  • Plans are best-case scenarios; when forecasting actual outcomes, we should avoid anchoring on the plan. Thinking through the various ways a plan could fail is itself part of good planning.

Chapter 12 Scientifically Utilizing the Availability Heuristic#

  • Being aware of your biases is beneficial for team relationships; constantly being vigilant about biases is a tiring task.

  • The same bias appears in ordinary observation: most members of a collaborative team feel that they have done more than their share.

  • In any case, everyone should keep this in mind: you may occasionally do more than your share, but whenever you feel that way, remember that every other member of the team is likely to feel exactly the same.

  • Availability bias can affect our views of ourselves or others.

  • People assess themselves by how easily instances come to mind; the experience of easy retrieval matters more than the number of instances retrieved.

  • Students who list more improvement methods also rate the course higher.

  • If I find it surprisingly difficult to recall instances of my own decisive behavior, I conclude that I am not a decisive person at all.

  • This CEO has had multiple successes in a row, so failure does not easily come to her mind. Availability bias makes her overly confident.

Chapter 13 Anxiety and the Design of Risk Policies#

  • Availability effect.

  • Over time, memories of disasters become blurred, and the degree of worry and precaution diminishes.

  • Availability bias.

  • Pairs of causes of death were compared: diabetes versus asthma, strokes versus accidents.

  • For each pair, subjects identified the more frequent cause and estimated the ratio of the two frequencies; their judgments were then compared with the health statistics of the time.

  • The world in our heads is not a precise replica of reality; our estimates of how often events occur are distorted by how much exposure the information gets and by the intensity of the emotions it arouses.

  • In many areas of life, the views formed and choices made by people directly express their basic emotional tendencies and trade-offs, and these behaviors are made entirely unconsciously.

  • The affect heuristic.

  • When people favor a certain technology, they believe that technology has advantages and lower risks; if they dislike a technology, they only think of its drawbacks and a few advantages.

  • How to prevent low-probability risk events from evolving into public crises?

  • Slovic argues that the public's conception of risk is richer than that of the experts.

  • When the views and wishes of experts conflict with those of other citizens, the experts' opinions should not simply be accepted. When experts and the public disagree about their priorities, Slovic says, "each side must respect the insights and intelligence of the other."

  • Every policy issue includes assumptions about human nature, especially the choices people might make and the consequences of their choices for themselves and society.

  • An availability cascade is a self-sustaining chain of events: it may start with media reports of a relatively minor event and end in public panic and large-scale government action.

  • Media coverage of a certain risk can capture part of the public's attention, which can then turn into outrage and anxiety. This emotional response itself is a form of promotion that drives the media to follow up with reports, leading to greater anxiety and a wider impact.

  • The Alar scare of 1989.

  • The Alar scare was plainly an enormous overreaction to a minor problem.

  • The Alar case illustrates a basic limitation of our minds in dealing with small risks: we either ignore them altogether or give them far too much weight, with nothing in between.

  • Policymakers should not ignore the widespread fear, even if these emotions are baseless.

  • Policymakers must strive to protect the public from the influence of fear rather than just protecting them from real dangers.

Chapter 14 Guess What Tom's Profession Is?#

  • When you doubt the quality of the evidence, there is one thing you can do: let your probability judgments stay close to the base rate. Do not expect this discipline to be easy; it takes considerable effort of self-monitoring and self-control.

  • Bayes' theorem.

  • The paradox of less is more.

  • Focusing on weaknesses is also common in political debates.

  • Stereotyping: the tendency to extend one's view of a group to every member of that group (if the group has a certain problem, every member is assumed to have it).

  • Paying these costs to build a better society may well be worthwhile; but denying that the costs exist, in order to feel good and politically correct, cannot withstand scientific scrutiny. Relying on the affect heuristic in political disputes is common: positions we favor are assumed to have no costs, and positions we oppose no benefits. We should be able to do better.

  • We are not as helpful as we think.

  • Subjects knew that another participant was having a seizure and needed help, but because they believed that several others had probably already rushed to help, they felt they could safely stay in their cubicles.

  • The experiment shows that when people know that others have heard the same cry for help, they feel their own responsibility is diminished.

  • Changing a person's view of human nature is difficult, and changing a person's view of their own dark side is even more difficult.

  • Causal explanations influence our thinking more than statistics do. But even compelling causal statistics will not change long-held beliefs that are rooted in personal experience.
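Bayes' theorem, named above, is the formal recipe for anchoring a judgment on the base rate and then adjusting for the evidence. A minimal sketch in the style of Kahneman's cab problem (the 15% base rate and 80% witness reliability are that problem's stock numbers; the function itself is illustrative):

```python
def posterior(prior: float, p_evidence_given_h: float, p_evidence_given_not_h: float) -> float:
    """Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)."""
    p_e = p_evidence_given_h * prior + p_evidence_given_not_h * (1 - prior)
    return p_evidence_given_h * prior / p_e

# Cab-problem-style numbers: 15% of cabs are Blue (base rate),
# and a witness who says "Blue" is right 80% of the time.
p_blue = posterior(prior=0.15, p_evidence_given_h=0.80, p_evidence_given_not_h=0.20)
# 0.8 * 0.15 / (0.8 * 0.15 + 0.2 * 0.85) = 0.12 / 0.29, about 0.41
```

Despite the confident witness, the posterior stays close to the base rate: far below the 80% that intuition, ignoring the base rate, suggests.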

Chapter 17 All Performances Will Revert to the Mean#

  • Life often gives us feedback that contradicts common sense.

  • Success = Talent + Luck; Great success = More talent + More luck.

  • The illusion of effectiveness.

  • Confidence is a feeling that reflects the consistency of a piece of information and the cognitive relaxation exhibited when processing that information.

  • The illusion of skill in stock investment.

  • The most active traders often achieve the worst results, while the least active investors earn the highest returns.

  • Individual investors tend to lock in gains by selling "winners" (stocks that have appreciated since they were bought) while hanging on to their "losers"; yet the winners they sold subsequently outperformed the losers they kept.

  • The persistence of achievement: does performance persist from year to year?

  • Many researchers share a common view that almost all stock traders, regardless of their understanding of stocks (few understand stocks), are playing a game of chance. The subjective experiences of traders are merely their seemingly wise guesses made under very uncertain conditions. However, in efficient markets, wise guesses are not much better than random guesses.

  • If your success mainly relies on luck, how much of your achievements can you attribute to yourself?

  • Subjective confidence and professional culture provide fertile ground for cognitive illusions.

  • Experts make mistakes not because of the content of their thinking but because of their thinking style.

  • Hedgehogs "know one big thing": they have a theory about the world, explain particular events within that coherent framework, bristle with impatience toward those who do not see things their way, and are confident in their own forecasts.

  • Foxes, by contrast, are more complex thinkers. They do not believe that one big thing drives the course of history (for example, they do not accept that Ronald Reagan ended the Cold War single-handedly). Instead, foxes recognize that the outcome was produced by the interaction of many different factors and forces, including sheer luck, which often yields large and unpredictable results.

  • She is like a hedgehog, with a theory that explains everything, creating an illusion that she understands the world.
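The formula Success = Talent + Luck directly implies regression to the mean, which a quick simulation can demonstrate (all distributions and sizes here are invented for illustration): the top performers of round one do worse on average in round two, because their luck does not repeat.

```python
import random

random.seed(0)
N = 10_000
talent = [random.gauss(0, 1) for _ in range(N)]

def scores(talent):
    # Success = Talent + Luck: talent is stable, luck is redrawn every round.
    return [t + random.gauss(0, 1) for t in talent]

round1 = scores(talent)
round2 = scores(talent)

# Take the top 10% of round-1 performers and compare their average scores.
top = sorted(range(N), key=lambda i: round1[i], reverse=True)[: N // 10]
avg1 = sum(round1[i] for i in top) / len(top)
avg2 = sum(round2[i] for i in top) / len(top)
# avg2 < avg1: the stars of round one regress toward the mean in round two.
```

The stars remain above average in round two (their talent is real), but much less so than before: the decline needs no causal story, only the non-repetition of luck.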

Chapter 21 Intuitive Judgments vs. Formulaic Calculations: Which is Better?#

  • Why are expert predictions less accurate than simple formulas? Meehl speculated that one reason is that experts try to be clever, to think outside the box, and to weigh complex combinations of features when making their predictions.

  • Because we do not have a clear understanding of what is in our thoughts, we will never know how we will make different judgments when there are slight changes in our surrounding environment.

  • In a memorable example, Dawes showed that marital stability is well predicted by a formula: frequency of lovemaking minus frequency of quarrels.

  • Do not simply trust intuitive judgments—whether your own or others'—but do not completely disregard them either.
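Dawes's point was that a crude, equal-weight formula often beats expert intuition. His marriage example can be sketched directly (the sample inputs below are invented for illustration):

```python
def marital_stability_score(lovemaking_per_week: float, quarrels_per_week: float) -> float:
    """Dawes's 'improper linear model': frequency of lovemaking minus frequency of quarrels.
    Equal unit weights, no fitting; a positive score predicts stability."""
    return lovemaking_per_week - quarrels_per_week

# Hypothetical couples: the sign of the score is the whole prediction.
assert marital_stability_score(3, 1) > 0   # predicted stable
assert marital_stability_score(1, 4) < 0   # predicted troubled
```

The design choice is the point: no regression weights are estimated at all, yet such equal-weight formulas proved competitive with, and often better than, expert clinical judgment.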

Chapter 22 When Can We Trust Expert Intuition?#

  • I only respond in rare cases when I believe criticism is seriously misleading.

  • Acquiring expertise, as in high-level chess, professional basketball, or firefighting, is complex and slow, because expertise in a domain is not a single skill but a large collection of small skills.

  • When someone tells you how much you should trust their judgment, do not believe them, and do not believe yourself either.

  • Professional skills are not a single skill but consist of many skills. The same professional may be an expert in her field but a novice in another.

  • Even when judging the wrong question, one may still have a high degree of confidence in making that judgment.

Chapter 23 Striving to Adopt the Outside View#

  • We tend to favor the inside view over the outside view.

  • Executives can easily propose overly optimistic plans to seize resources, so organizations face the challenge of controlling this tendency among executives.

Chapter 24 Optimism is a Double-Edged Sword#

  • A person cannot adopt information they have not thought of, perhaps because they have never known that information.

  • Individuals and companies reward the providers of dangerously misleading information and penalize those who tell the truth.

  • Successful scientists I have encountered tend to exaggerate the importance of their ongoing research. I also believe that those who do not love to exaggerate their importance will falter when repeatedly faced with setbacks and failures, which is the fate of most researchers.

  • The premortem: a partial remedy for optimism bias.

  • When an organization is about to make an important decision but has not yet formally issued a resolution, Klein suggests convening a brief meeting with those knowledgeable about the decision. Before the meeting, there is a short speech: "Imagine that we have implemented the current plan a year from now, but the result was a disaster. Please take 5-10 minutes to briefly write down the reasons for this disaster."

  • When a team focuses on decision-making, especially when the leader announces his intentions, doubts about the feasibility of planned steps gradually diminish, and eventually, such doubts may be seen as disloyalty to the team and the leader.

Chapter 25 Decisions Regarding Risk and Wealth#

  • Prospect theory.

  • It is the poor who buy insurance and the rich who sell insurance.

  • Desperate gambles become acceptable to entrepreneurs and to commanders when all of their options are bad.

  • Theory-induced blindness, which means once you accept a theory and use it as a thinking tool, it becomes difficult to notice its errors.

  • You also know that your attitude toward gains and losses does not derive from an assessment of your total wealth. You want to gain $100 and you do not want to lose $100, not because the money would change your wealth status: you simply like winning and dislike losing, and you almost certainly dislike losing more than you like winning.

  • The core of prospect theory consists of three cognitive features: outcomes are evaluated relative to a reference point, sensitivity diminishes as changes grow larger, and losses loom larger than gains.

  • The third of these, loss aversion, is the most consequential.

  • Many choices we face in life are mixed blessings: there are risks of loss and possibilities of gain, and we must decide whether to accept or reject this risk.

  • For most people, the fear of losing $100 is stronger than the desire to gain $150.

  • In gambles where both gains and losses may occur, loss aversion leads to choices that strongly avoid risk.

  • In the description of prospect theory, poverty means that a person's living standard is below their reference point. Some goods are unaffordable for the poor, so they are always "in loss." They feel that the small amount of money they receive is a reduction of loss rather than a gain. This money can help a person get a little closer to the reference point, but the poor always linger at the steepest point of the value function.

  • For the poor, spending money means loss.

  • Bad emotions, bad parents, and bad feedback have more impact than their good counterparts, and bad information is processed more thoroughly than good. We are more motivated to avoid bad self-definitions than to pursue good ones; bad impressions and bad stereotypes form more quickly, and fade more slowly, than good ones.

  • Long-term healthy marital relationships depend not only on seeking happiness but also on avoiding negative situations.

  • We all know that perhaps one thing can destroy years of cultivated friendships.

  • The boundary between good and bad is a reference point that changes over time and depends on the situation at that time.

  • We find a basic principle of fairness: a firm must not exploit market power to impose losses on others.

  • Contrary to the expectation principle, the weights people assign to outcomes are not identical to the probabilities of those outcomes. The possibility effect makes highly unlikely outcomes loom disproportionately large, while the certainty effect makes almost-certain outcomes weigh less than their probability warrants. Weighting outcomes strictly by their probabilities, as the expectation principle demands, is psychologically unrealistic.

  • When making decisions with long-term consequences, do not agonize over every detail, but do not choose entirely at random either. If you deliberate only a little, you may later say, "I could have made a better choice," and that hindsight will deepen your regret.
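The bullets above can be tied together by the prospect-theory value function. A sketch using parameter estimates commonly attributed to Kahneman and Tversky (alpha = beta = 0.88, lambda = 2.25; the gamble is the $100/$150 example from the text):

```python
def value(x: float, alpha: float = 0.88, beta: float = 0.88, lam: float = 2.25) -> float:
    """Prospect-theory value: concave for gains, convex and steeper for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)

# Loss aversion: losing $100 hurts more than gaining $100 pleases.
assert abs(value(-100)) > value(100)

# A 50/50 gamble "lose $100 / gain $150" has positive expected value (+$25),
# yet its prospect value is negative, so most people reject it.
prospect = 0.5 * value(150) + 0.5 * value(-100)
```

Because losses are weighted about twice as heavily as gains, the prospect value of this objectively favorable gamble comes out negative, matching the bullet that fear of losing $100 outweighs the hope of gaining $150.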

Chapter 35 The Inconsistency of Experience Utility and Decision Utility#

  • The experiencing self answers the question "Does it hurt now?" while the remembering self answers the question "How was it overall?" We can only preserve life experiences through memory; therefore, when we think about life, the only perspective we can adopt comes from the remembering self.

Chapter 36 Life is Like a Play#

  • How much can you remember about your last trip?

  • The photographer does not view the scene as a moment to be savored but as a future memory to be designed. Photos are useful to the remembering self, even though we rarely look at them for long or more than once, and some we never look at again; but taking pictures is not necessarily the best way for the tourist's experiencing self to enjoy the view.

  • People in love may feel happy even in traffic jams, while those in mourning may continue to feel sad even when watching a comedy. However, under normal circumstances, we only feel joy or sadness based on what is happening at that moment, provided we pay attention to it. For example, to derive pleasure from eating, you must notice that you are eating.

  • The potential conflict between the remembering self and the experiencing self is more complex than I first imagined. In the early cold-hand experiments, the combination of duration neglect and the peak-end rule led people to apparently absurd choices. Why would anyone choose to endure unnecessary pain? The choice is made by the remembering self, which prefers the option that leaves the better memory, even when that option involves more pain.

  • The only test of rationality is not whether a person's beliefs and preferences are reasonable, but whether they are internally consistent. A rational person can like or dislike whatever they please, but their preferences must be coherent over time. Rationality is logical coherence, reasonable or not.

  • When we observe those whose behaviors seem strange, we should consider a possibility—they may have reasonable reasons for doing so. Only when the reasons become unreasonable will psychological explanations be triggered.

  • Our views of ourselves are essentially views of System 2. System 2 makes judgments and choices, but it recognizes and rationalizes the views and feelings formed by System 1. You may not realize that you hold an optimistic attitude towards a project simply because its leader reminds you of your beloved sister. Or, you may dislike someone who looks like your dentist. If you seek an explanation, you will search your memory for some decent reasons, and you will surely find some.

  • The acquisition of skills requires a fixed environment, opportunities for practice, and quick and clear feedback on one's thoughts and actions.

  • Recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2. What will you do the next time you meet the Müller-Lyer illusion? Seeing lines with fins pointing in different directions, you will recognize the situation and know that you cannot trust your impression of their lengths.
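The cold-hand result can be summarized numerically. Under the peak-end rule, remembered discomfort is roughly the average of the worst moment and the final moment, with duration neglected (the pain scores below are invented for illustration):

```python
def remembered_pain(pain_by_minute: list) -> float:
    """Peak-end rule: memory keeps the average of the peak and the end,
    neglecting how long the episode lasted (duration neglect)."""
    return (max(pain_by_minute) + pain_by_minute[-1]) / 2

# Cold-hand-style example: the longer episode adds a milder final stretch.
short_trial = [8, 8, 8]          # three "minutes" of high discomfort
long_trial = [8, 8, 8, 5, 5]     # the same start, plus a milder tail

# The longer trial contains strictly more total pain...
assert sum(long_trial) > sum(short_trial)
# ...yet the remembering self rates it as less bad, so people choose to repeat it.
assert remembered_pain(long_trial) < remembered_pain(short_trial)
```

This is the "seemingly absurd choice" in the notes: adding extra pain at the end improves the memory, and it is the memory, not the experience, that drives the next decision.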
