A heuristic (/hjʊˈrɪstɪk/; from Ancient Greek εὑρίσκω (heurískō) 'to find, discover'), or heuristic technique, is any approach to problem solving or self-discovery that employs a practical method that is not guaranteed to be optimal, perfect, or rational, but is nevertheless sufficient for reaching an immediate, short-term goal or approximation. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.[1][2]

Examples of heuristics include using trial and error, a rule of thumb, or an educated guess.

Heuristics are strategies derived from previous experiences with similar problems. These strategies depend on using readily accessible, though loosely applicable, information to control problem solving in human beings, machines and abstract issues.[3][4] When an individual applies a heuristic in practice, it generally performs as expected. However, it can also produce systematic errors.[5]

The most fundamental heuristic is trial and error, which can be used in everything from matching nuts and bolts to finding the values of variables in algebra problems. In mathematics, some common heuristics involve the use of visual representations, additional assumptions, forward/backward reasoning and simplification. Here are a few commonly used heuristics from George Pólya's 1945 book, How to Solve It:[6]

  • If you are having difficulty understanding a problem, try drawing a picture.
  • If you can't find a solution, try assuming that you have a solution and seeing what you can derive from that ("working backward").
  • If the problem is abstract, try examining a concrete example.
  • Try solving a more general problem first (the "inventor's paradox": the more ambitious plan may have more chances of success).
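
As a rough illustration (not drawn from Pólya), the trial-and-error heuristic mentioned above can be sketched in a few lines of code: candidate values are simply tried one after another until one satisfies the equation. The equation and search range below are hypothetical examples.

  # Trial and error: try candidate values until one satisfies the equation.
  def solve_by_trial_and_error(equation, candidates):
      """Return the first candidate for which the equation holds, else None."""
      for x in candidates:
          if equation(x):
              return x
      return None

  # Hypothetical example: find an integer x with x**2 + x == 12.
  solution = solve_by_trial_and_error(lambda x: x**2 + x == 12, range(-10, 11))
  print(solution)  # prints -4, the first root encountered; x = 3 is the other root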

In psychology, heuristics are simple, efficient rules, either learned or inculcated by evolutionary processes. These psychological heuristics have been proposed to explain how people make decisions, come to judgements, and solve problems. These rules typically come into play when people face complex problems or incomplete information. Researchers employ various methods to test whether people use these rules. The rules have been shown to work well under most circumstances, but in certain cases can lead to systematic errors or cognitive biases.[7]

History

The study of heuristics in human decision-making was developed in the 1970s and the 1980s by the psychologists Amos Tversky and Daniel Kahneman,[8] although the concept had originally been introduced by the Nobel laureate Herbert A. Simon. Simon's primary object of research was problem solving, and he showed that we operate within what he called bounded rationality. He coined the term satisficing, which denotes a situation in which people seek solutions, or accept choices or judgements, that are "good enough" for their purposes although they could be optimised.[9]

Rudolf Groner analysed the history of heuristics from its roots in ancient Greece up to contemporary work in cognitive psychology and artificial intelligence,[10] proposing a cognitive style "heuristic versus algorithmic thinking", which can be assessed by means of a validated questionnaire.[11]

Adaptive toolbox

Gerd Gigerenzer and his research group argued that models of heuristics need to be formal to allow for predictions of behavior that can be tested.[12] They study the fast-and-frugal heuristics in the "adaptive toolbox" of individuals or institutions, and the ecological rationality of these heuristics; that is, the conditions under which a given heuristic is likely to be successful.[13] The descriptive study of the "adaptive toolbox" is done by observation and experiment, while the prescriptive study of ecological rationality requires mathematical analysis and computer simulation. Heuristics – such as the recognition heuristic, the take-the-best heuristic and fast-and-frugal trees – have been shown to be effective in predictions, particularly in situations of uncertainty. It is often said that heuristics trade accuracy for effort, but this is only the case in situations of risk. Risk refers to situations where all possible actions, their outcomes and probabilities are known. In the absence of this information, that is, under uncertainty, heuristics can achieve higher accuracy with lower effort.[14] This finding, known as a less-is-more effect, would not have been found without formal models. The valuable insight of this program is that heuristics are effective not despite their simplicity, but because of it. Furthermore, Gigerenzer and Wolfgang Gaissmaier found that both individuals and organisations rely on heuristics in an adaptive way.[15]
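
As a rough sketch of what such a formal model can look like, the take-the-best heuristic compares two alternatives on cues examined in order of validity and decides on the basis of the first cue that discriminates, ignoring all remaining cues; if no cue discriminates, it guesses. The cue names and values in the example below are hypothetical.

  # A minimal sketch of the take-the-best heuristic: cues are binary
  # (1 = positive, 0 = negative, None = unknown) and are inspected in
  # order of validity; the first discriminating cue decides.
  import random

  def take_the_best(a, b, cues_by_validity):
      for cue in cues_by_validity:            # most valid cue first
          va, vb = a.get(cue), b.get(cue)
          if va is None or vb is None or va == vb:
              continue                        # cue does not discriminate; try the next
          return "a" if va > vb else "b"      # stop at the first discriminating cue
      return random.choice(["a", "b"])        # no cue discriminates: guess

  # Hypothetical comparison of two cities on made-up cues.
  city_a = {"has_major_airport": 1, "is_capital": 0}
  city_b = {"has_major_airport": 1, "is_capital": 1}
  print(take_the_best(city_a, city_b, ["has_major_airport", "is_capital"]))  # "b"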

Cognitive-experiential self-theory

Heuristics, through greater refinement and research, have begun to be applied to other theories, or be explained by them. For example, the cognitive-experiential self-theory (CEST) is also an adaptive view of heuristic processing. CEST distinguishes two systems that process information. Roughly speaking, individuals sometimes consider issues rationally, systematically, logically, deliberately, effortfully, and verbally; on other occasions, they consider issues intuitively, effortlessly, globally, and emotionally.[16] From this perspective, heuristics are part of a larger experiential processing system that is often adaptive, but vulnerable to error in situations that require logical analysis.[17]

Attribute substitution

In 2002, Daniel Kahneman and Shane Frederick proposed that cognitive heuristics work by a process called attribute substitution, which happens without conscious awareness.[18] According to this theory, when somebody makes a judgement (of a "target attribute") that is computationally complex, a more easily calculated "heuristic attribute" is substituted. In effect, a cognitively difficult problem is dealt with by answering a rather simpler problem, without being aware of this happening.[18] This theory explains cases where judgements fail to show regression toward the mean.[19] Heuristics can be considered to reduce the complexity of clinical judgments in health care.[20]

Psychology

Heuristics (from Ancient Greek εὑρίσκω, heurískō, "I find, discover") are the processes by which humans use mental shortcuts to arrive at decisions. Heuristics are simple strategies that humans, animals,[21][22][23] organizations,[24] and even machines[25] use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution.[26][27][28][29] While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate.[30] Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete.[31] In that sense they can differ from answers given by logic and probability.

The economist and cognitive psychologist Herbert A. Simon introduced the concept of heuristics in the 1950s, suggesting there were limitations to rational decision making. In the 1970s, psychologists Amos Tversky and Daniel Kahneman added to the field with their research on cognitive bias. It was their work that introduced specific heuristic models, a field which has only expanded since. While some argue that pure laziness is behind the heuristics process, others argue that heuristics can be more accurate than decisions based on every known factor and consequence (the less-is-more effect).

Philosophy

A heuristic device is used when an entity X exists to enable understanding of, or knowledge concerning, some other entity Y.

A good example is a model: since a model is never identical with what it models, it is a heuristic device that enables understanding of what it models. Stories, metaphors, etc., can also be termed heuristic in this sense. A classic example is the notion of utopia as described in Plato's best-known work, The Republic. On this reading, the "ideal city" depicted in The Republic is not given as something to be pursued, or as an orientation point for development. Rather, it shows how things would have to be connected, and how one thing would lead to another (often with highly problematic results), if one opted for certain principles and carried them through rigorously.

Heuristic is also often used as a noun to describe a rule of thumb, procedure, or method.[32] Philosophers of science have emphasised the importance of heuristics in creative thought and the construction of scientific theories.[33] Seminal works include Karl Popper's The Logic of Scientific Discovery and others by Imre Lakatos,[34] Lindley Darden, and William C. Wimsatt.

Law

In legal theory, especially in the theory of law and economics, heuristics are used in the law when case-by-case analysis would be impractical, insofar as "practicality" is defined by the interests of a governing body.[35]

The present securities regulation regime largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects. For instance, in all states in the United States the legal drinking age for unsupervised persons is 21 years, because it is argued that people need to be mature enough to make decisions involving the risks of alcohol consumption. However, assuming people mature at different rates, the specific age of 21 would be too late for some and too early for others. In this case, the somewhat arbitrary delineation is used because it is impossible or impractical to tell whether an individual is sufficiently mature for society to trust them with that kind of responsibility. Some proposed changes, however, have included the completion of an alcohol education course rather than the attainment of 21 years of age as the criterion for legal alcohol possession. This would put youth alcohol policy more on a case-by-case basis and less on a heuristic one, since the completion of such a course would presumably be voluntary and not uniform across the population.

The same reasoning applies to patent law. Patents are justified on the grounds that inventors must be protected so they have incentive to invent. It is therefore argued that it is in society's best interest that inventors receive a temporary government-granted monopoly on their idea, so that they can recoup investment costs and make economic profit for a limited period. In the United States, the length of this temporary monopoly is 20 years from the date the patent application was filed, though the monopoly does not actually begin until the application has matured into a patent. However, like the drinking age problem above, the specific length of time would need to be different for every product to be efficient. A 20-year term is used because it is difficult to tell what the number should be for any individual patent. More recently, some, including University of North Dakota law professor Eric E. Johnson, have argued that patents in different kinds of industries – such as software patents – should be protected for different lengths of time.[36]

Stereotyping

Stereotyping is a type of heuristic that people use to form opinions or make judgements about things they have never seen or experienced.[37] Stereotypes work as a mental shortcut to assess everything from the social status of a person (based on their actions)[2] to classifying a plant as a tree because it is tall, has a trunk, and has leaves (even though the person making the evaluation might never have seen that particular type of tree before).

Stereotypes, as first described by journalist Walter Lippmann in his book Public Opinion (1922), are the pictures we have in our heads that are built around experiences as well as what we are told about the world.[38][39]

Artificial intelligence

A heuristic can be used in artificial intelligence systems while searching a solution space. The heuristic is derived from a function supplied by the system's designer, or from adjusting the weight of branches based on how likely each branch is to lead to a goal node.
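
A minimal sketch of this idea, assuming a greedy best-first search over a hypothetical graph: the search repeatedly expands the node that a designer-supplied heuristic function h(n) estimates to be closest to the goal.

  # Greedy best-first search: the frontier is ordered by a heuristic h(n)
  # that estimates how close each node is to the goal. The graph and the
  # heuristic values below are hypothetical.
  import heapq

  def greedy_best_first_search(graph, h, start, goal):
      frontier = [(h[start], start, [start])]      # (heuristic value, node, path so far)
      visited = set()
      while frontier:
          _, node, path = heapq.heappop(frontier)  # expand the most promising node
          if node == goal:
              return path
          if node in visited:
              continue
          visited.add(node)
          for neighbour in graph.get(node, []):
              if neighbour not in visited:
                  heapq.heappush(frontier, (h[neighbour], neighbour, path + [neighbour]))
      return None                                  # goal not reachable

  graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
  h = {"A": 3, "B": 2, "C": 1, "D": 0}             # estimated distance to the goal "D"
  print(greedy_best_first_search(graph, h, "A", "D"))  # ['A', 'C', 'D']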

Behavioural economics

Heuristics are the cognitive shortcuts that individuals use to simplify decision-making processes in economic situations. Behavioral economics is a field that integrates insights from psychology and economics to better understand how people make decisions.

Anchoring and adjustment is one of the most extensively researched heuristics in behavioural economics. Anchoring is the tendency of people to base subsequent judgements or conclusions too heavily on the initial information supplied to them. This initial information functions as an anchor, and it can influence later judgements even if it is entirely unrelated to the decision at hand. Adjustment, on the other hand, is the process through which individuals make gradual changes to their initial judgements or conclusions.

Anchoring and adjustment has been observed in a wide range of decision-making contexts, including financial decision-making, consumer behavior, and negotiation. Researchers have identified a number of strategies that can be used to mitigate the effects of anchoring and adjustment, including providing multiple anchors, encouraging individuals to generate alternative anchors, and providing cognitive prompts to encourage more deliberative decision-making.

Other heuristics studied in behavioral economics include the representativeness heuristic, which refers to the tendency of individuals to categorize objects or events based on how similar they are to typical examples,[40] and the availability heuristic, which refers to the tendency of individuals to judge the likelihood of an event based on how easily it comes to mind.[41]

Types

Availability heuristic

According to Tversky and Kahneman (1973), the availability heuristic is the tendency to judge events that are easier to recall as more likely to occur than events that are harder to recall.[42] An example of this would be asking someone whether they believe they are more likely to die from a shark attack or from drowning. Someone may quickly give the incorrect answer that they are more likely to die from a shark attack, because such an event is more easily remembered and is often covered more heavily in the news than drowning deaths. The correct answer is that people are more likely to die of drowning (1 in 1,134) than from a shark attack (1 in 4,332,817).[43]
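
Taken at face value, these figures imply that a fatal drowning is roughly 4,332,817 ÷ 1,134 ≈ 3,800 times more likely than a fatal shark attack.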

Representativeness heuristic

The representativeness heuristic refers to the cognitive bias in which people rely on a preconceived mental image or prototype of a particular category or concept, rather than on actual probabilities and statistical data, when making judgments. This behavior often leads to stereotyping and to generalization from limited information, causing errors as well as distorted views about reality.[44]

For instance, when trying to guess someone's occupation from their appearance, the representativeness heuristic might lead to the assumption that an individual in a suit must be a lawyer or businessperson, while someone in uniform must be a police officer or soldier. This shortcut can sometimes be useful but may also result in stereotypes and overgeneralizations.

References

  1. Myers, David G. (2010). Social psychology (Tenth ed.). New York, NY: McGraw-Hill. p. 94. ISBN 978-0-07337-066-8. OCLC 667213323.
  2. 1 2 "Heuristics—Explanation and examples". Conceptually. Archived from the original on 21 December 2021. Retrieved 23 October 2019.
  3. Pearl, Judea (1983). Heuristics: Intelligent Search Strategies for Computer Problem Solving. New York, NY: Addison-Wesley. p. vii. ISBN 978-0-201-05594-8.
  4. Emiliano, Ippoliti (2015). Heuristic Reasoning: Studies in Applied Philosophy, Epistemology and Rational Ethics. Switzerland: Springer International Publishing. pp. 1–2. ISBN 978-3-319-09159-4. Archived from the original on 2019-07-11. Retrieved 2015-11-24.
  5. Sunstein, Cass (2005). "Moral Heuristics". The Behavioral and Brain Sciences. 28 (4): 531–542. doi:10.1017/S0140525X05000099. PMID 16209802. S2CID 231738548.
  6. Pólya, George (1945). How to Solve It: A New Aspect of Mathematical Method. Princeton, NJ: Princeton University Press. ISBN 0-691-02356-5; ISBN 0-691-08097-6.
  7. Gigerenzer, Gerd (1991). "How to Make Cognitive Illusions Disappear: Beyond "Heuristics and Biases"" (PDF). European Review of Social Psychology. 2: 83–115. CiteSeerX 10.1.1.336.9826. doi:10.1080/14792779143000033. Archived (PDF) from the original on 5 September 2012. Retrieved 14 October 2012.
  8. Kahneman, Daniel; Slovic, Paul; Tversky, Amos, eds. (30 April 1982). Judgment Under Uncertainty. Cambridge, UK: Cambridge University Press. doi:10.1017/cbo9780511809477. ISBN 978-0-52128-414-1.
  9. "Heuristics and heuristic evaluation". Archived from the original on 5 July 2015. Retrieved 1 September 2013.
  10. Groner, Rudolf; Groner, Marina; Bischof, Walter F. (1983). Methods of Heuristics. Hillsdale, NJ: Lawrence Erlbaum.
  11. Groner, Rudolf; Groner, Marina (1991). "Heuristische versus algorithmische Orientierung als Dimension des individuellen kognitiven Stils" [Heuristic versus algorithmic orientation as a dimension of the individual cognitive style]. In K. Grawe; N. Semmer; R. Hänni (eds.). Über die richtige Art, Psychologie zu betreiben [About the right way to do psychology] (in German). Göttingen: Hogrefe. ISBN 978-3-80170-415-5.
  12. Gigerenzer, Gerd; Todd, Peter M.; and the ABC Research Group (1999). Simple Heuristics That Make Us Smart. Oxford, UK: Oxford University Press. ISBN 978-0-19512-156-8.
  13. Gigerenzer, Gerd; Selten, Reinhard, eds. (2002). Bounded Rationality: The Adaptive Toolbox. Cambridge, MA: MIT Press. ISBN 978-0-26257-164-7.
  14. Gigerenzer, Gerd; Hertwig, Ralph; Pachur, Thorsten (15 April 2011). Heuristics: The Foundations of Adaptive Behavior. Oxford University Press. doi:10.1093/acprof:oso/9780199744282.001.0001. hdl:11858/00-001M-0000-0024-F172-8. ISBN 978-0-19989-472-7.
  15. Gigerenzer, Gerd; Gaissmaier, Wolfgang (January 2011). "Heuristic Decision Making". Annual Review of Psychology. 62: 451–482. doi:10.1146/annurev-psych-120709-145346. hdl:11858/00-001M-0000-0024-F16D-5. PMID 21126183. SSRN 1722019.
  16. De Neys, Wim (18 October 2008). "Cognitive experiential self theory". Perspectives on Psychological Science. 7 (1): 28–38. doi:10.1177/1745691611429354. PMID 26168420. S2CID 32261626. Archived from the original on 31 July 2013.
  17. Epstein, S.; Pacini, R.; Denes-Raj, V.; Heier, H. (1996). "Individual differences in intuitive-experiential and analytical-rational thinking styles". Journal of Personality and Social Psychology. 71 (2): 390–405. doi:10.1037/0022-3514.71.2.390. PMID 8765488.
  18. Kahneman, Daniel; Frederick, Shane (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Thomas Gilovich; Dale Griffin; Daniel Kahneman (eds.). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge, UK: Cambridge University Press. pp. 49–81. ISBN 978-0-52179-679-8. OCLC 47364085.
  19. Kahneman, Daniel (December 2003). "Maps of Bounded Rationality: Psychology for Behavioral Economics" (PDF). American Economic Review. 93 (5): 1449–1475. CiteSeerX 10.1.1.194.6554. doi:10.1257/000282803322655392. ISSN 0002-8282. Archived from the original (PDF) on 19 February 2018.
  20. Cioffi, Jane (1997). "Heuristics, servants to intuition, in clinical decision making". Journal of Advanced Nursing. 26 (1): 203–208. doi:10.1046/j.1365-2648.1997.1997026203.x. PMID 9231296.
  21. Marsh, Barnaby (2002-01-01). "Do Animals Use Heuristics?". Journal of Bioeconomics. 4 (1): 49–56. doi:10.1023/A:1020655022163. ISSN 1573-6989. S2CID 142852213.
  22. Gigerenzer, Gerd; Brighton, Henry (2009). "Homo Heuristicus: Why Biased Minds Make Better Inferences". Topics in Cognitive Science. 1 (1): 107–143. doi:10.1111/j.1756-8765.2008.01006.x. hdl:11858/00-001M-0000-0024-F678-0. ISSN 1756-8765. PMID 25164802.
  23. Hutchinson, John M. C.; Gigerenzer, Gerd (2005-05-31). "Simple heuristics and rules of thumb: Where psychologists and behavioural biologists might meet". Behavioural Processes. Proceedings of the meeting of the Society for the Quantitative Analyses of Behavior (SQAB 2004). 69 (2): 97–124. doi:10.1016/j.beproc.2005.02.019. ISSN 0376-6357. PMID 15845293. S2CID 785187.
  24. Gigerenzer, Gerd; Gaissmaier, Wolfgang (2011). "Heuristic Decision Making". Annual Review of Psychology. 62 (1): 451–482. doi:10.1146/annurev-psych-120709-145346. hdl:11858/00-001M-0000-0024-F16D-5. PMID 21126183.
  25. Braun, T.D.; Siegal, H.J.; Beck, N.; Boloni, L.L.; Maheswaran, M.; Reuther, A.I.; Robertson, J.P.; Theys, M.D.; Bin Yao; Hensgen, D.; Freund, R.F. (1999). "A comparison study of static mapping heuristics for a class of meta-tasks on heterogeneous computing systems". Proceedings. Eighth Heterogeneous Computing Workshop (HCW'99). IEEE Comput. Soc. pp. 15–29. doi:10.1109/hcw.1999.765093. hdl:10945/35227. ISBN 0-7695-0107-9. S2CID 2860157.
  26. Alan, Lewis (2018). The Cambridge Handbook of Psychology and Economic Behavior. Cambridge University Press. p. 43. ISBN 978-0-521-85665-2.
  27. Lori, Harris (2007). CliffsAP Psychology. John Wiley & Sons. p. 65. ISBN 978-0-470-19718-9.
  28. Nevid, Jeffery (2008). Psychology: Concepts and Applications. Cengage Learning. p. 251. ISBN 978-0-547-14814-4.
  29. Gigerenzer, Gerd; Brighton, Henry (2009). "Homo heuristicus: why biased minds make better inferences". Topics in Cognitive Science. 1 (1): 107–143. doi:10.1111/j.1756-8765.2008.01006.x. hdl:11858/00-001M-0000-0024-F678-0. ISSN 1756-8765. PMID 25164802.
  30. Goldstein, E. Bruce (2018-07-23). Cognitive psychology : connecting mind, research, and everyday experience. Cengage Learning. ISBN 978-1-337-40827-1. OCLC 1055681278.
  31. Scholz, R. W. (1983-11-01). Decision Making under Uncertainty: Cognitive Decision Research, Social Interaction, Development and Epistemology. Elsevier. ISBN 978-0-08-086670-3.
  32. Jaszczolt, K. M. (2006). "Defaults in Semantics and Pragmatics". Stanford Encyclopedia of Philosophy. ISSN 1095-5054. Archived from the original on 2021-06-08. Retrieved 2021-06-08.
  33. Frigg, Roman; Hartmann, Stephan (2006). "Models in Science". Stanford Encyclopedia of Philosophy. ISSN 1095-5054. Archived from the original on 2021-06-03. Retrieved 2021-06-08.
  34. Kiss, Olga (2006). "Heuristic, Methodology or Logic of Discovery? Lakatos on Patterns of Thinking". Perspectives on Science. 14 (3): 302–317. doi:10.1162/posc.2006.14.3.302. S2CID 57559578.
  35. Gigerenzer, Gerd; Engel, Christoph, eds. (2007). Heuristics and the Law. Cambridge, MA: MIT Press. ISBN 978-0-262-07275-5.
  36. Johnson, Eric E. (2006). "Calibrating Patent Lifetimes" (PDF). Santa Clara Computer & High Technology Law Journal. 22: 269–314. Archived from the original (PDF) on 2011-10-05.
  37. Bodenhausen, Galen V.; et al. (1999). "On the Dialectics of Discrimination: Dual Processes in Social Stereotyping". In Chaiken, Shelly; Trope, Yaacov (eds.). Dual-process Theories in Social Psychology. New York, NY: Guilford Press. pp. 271–292. ISBN 978-1-57230-421-5.
  38. Kleg, Milton (1993). Hate Prejudice and Racism. Albany, NY: State University of New York Press. p. 135. ISBN 978-0-79141-536-8. Archived from the original on 2023-10-28. Retrieved 2015-03-24.
  39. Gökçen, Sinan (20 November 2007). "Pictures in Our Heads". European Roma Rights Centre. Archived from the original on 14 July 2015. Retrieved 24 March 2015.
  40. Bhatia, Sudeep (2015). "Conceptualizing and studying linguistic representations across multiple levels of analysis: The case of L2 processing research" (PDF). Cognitive Science. 39: 122–148. Archived (PDF) from the original on 2023-06-14. Retrieved 2023-04-20.
  41. Dale, Sarah (2015). "Heuristics and biases: The science of decision-making". Business Information Review. 32 (2): 93–99. doi:10.1177/0266382115592536.
  42. Tversky, Amos; Kahneman, Daniel (1973-09-01). "Availability: A heuristic for judging frequency and probability". Cognitive Psychology. 5 (2): 207–232. doi:10.1016/0010-0285(73)90033-9. ISSN 0010-0285. Archived from the original on 2023-10-28. Retrieved 2023-08-24.
  43. "Shark Attack Statistics - Frequency & Fatality Worldwide". 2023-02-02. Archived from the original on 2023-05-09. Retrieved 2023-05-09.
  44. Kahneman, Daniel; Tversky, Amos (July 1973). "On the psychology of prediction". Psychological Review. 80 (4): 237–251. doi:10.1037/h0034747. ISSN 1939-1471. Archived from the original on 2023-10-28. Retrieved 2023-05-09.
