Reasoning
Nicole Cruz
Jumping to conclusions (JTC) refers to the tendency to reach a conclusion or decision without sufficient evidence to justify it. It has been investigated mostly in people with schizophrenia, but has also been found to correlate with conservative political beliefs and with conspiracist thinking. Taking as a starting point recent reports of racist violence in the United States, two preregistered online experiments investigated whether jumping to conclusions is stronger in people with stronger racial prejudice, as well as with higher scores on the related constructs of need for closure, social dominance orientation, nationalism, and rejection of science. In Experiment 1, US participants completed the classic beads task with neutral content (e.g. red or blue beads drawn from a jar). In Experiment 2, the same participants worked through an adaptation of the beads task with content relevant to racist and political beliefs (e.g. local or foreign crime suspects drawn from a police department). Overall, we found that racist beliefs, and among the related constructs need for closure in particular, were associated with higher JTC independently of the task content. The findings are interpreted in relation to Bayesian principles of belief revision and learning. Future research could examine the generalizability of the findings to geographic and cultural contexts outside of the US in which racial prejudice occurs.
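As an illustrative sketch (not the authors' code), the snippet below shows how a draws-to-decision JTC index can be computed for the beads task by a Bayesian observer: the observer updates the posterior over the two jars after each bead and stops once it crosses a decision threshold, with fewer draws indicating stronger JTC. The 85/15 bead ratio and the 0.95 threshold are assumptions for illustration.

```python
# Minimal sketch of a draws-to-decision measure on the beads task.
# The 85/15 colour ratio and the 0.95 decision threshold are assumptions.
import numpy as np

def draws_to_decision(sequence, p_majority=0.85, threshold=0.95):
    """Return how many beads are drawn before the posterior for either jar
    exceeds the threshold (fewer draws = stronger jumping to conclusions)."""
    log_odds = 0.0  # log posterior odds: "mostly-red jar" vs "mostly-blue jar"
    llr = np.log(p_majority / (1 - p_majority))
    for n, bead in enumerate(sequence, start=1):
        log_odds += llr if bead == "red" else -llr
        posterior = 1 / (1 + np.exp(-abs(log_odds)))
        if posterior >= threshold:
            return n
    return len(sequence)

example = ["red", "red", "blue", "red", "red", "red"]
print(draws_to_decision(example))  # prints 2 for this sequence
```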
This is an in-person presentation on July 21, 2024 (15:20 ~ 15:40 CEST).
Mr. Christopher Pinier
Dr. Michael D. Nunez
Claire E Stevenson
Abstract reasoning, the ability to solve large-scale problems by abstracting away unnecessary details (Clement et al., 2007), is essential for human cognition and behavior. However, there remains a lack of cognitive computational models available to study how abstract reasoning emerges and develops in early childhood. We seek to address this knowledge gap by testing whether deep learning models can explain the key mechanisms that enable children to develop abstract reasoning. Specifically, we investigated whether the Emergent Symbol Binding Network (ESBN; Webb et al., 2021) would be a suitable model. Higher working memory capacity has been shown to facilitate the development of abstract visual reasoning (AVR) in humans. We explore whether the ESBN can simulate AVR developmental phenomena by manipulating its memory architecture and training regime. To test this, we observed the ESBN's accuracy as it solved two abstract visual reasoning tasks with decreasing batch size per condition (32, 16, 8, 4). We also used two possible encoders: a random encoder and a convolutional encoder. We predicted that the convolutional encoder would perform better than the random one, given that it has more layers (Seijdel et al., 2020). Initial results do not support the ESBN as a model of abstract visual reasoning development, because the simpler, random encoder fared better than the convolutional encoder for all batch sizes. Further research will be performed to identify a suitable candidate model for explaining abstract visual reasoning development.
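A minimal sketch (not the authors' code) of the encoder and batch-size manipulation described above is shown below. The layer sizes and placeholder stimuli are assumptions; in the actual ESBN (Webb et al., 2021), the encoder feeds an external key-value memory, which is omitted here.

```python
# Sketch of the two encoder variants and the batch-size manipulation.
# Architecture sizes and the random stimuli are assumptions for illustration.
import torch
import torch.nn as nn

class RandomEncoder(nn.Module):
    """A single randomly initialised, frozen projection of the input image."""
    def __init__(self, embedding_dim=128):
        super().__init__()
        self.proj = nn.Linear(32 * 32, embedding_dim)
        for p in self.parameters():
            p.requires_grad = False  # weights stay random throughout training

    def forward(self, x):
        return self.proj(x.flatten(start_dim=1))

class ConvEncoder(nn.Module):
    """A small trainable convolutional encoder with more layers."""
    def __init__(self, embedding_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, embedding_dim),
        )

    def forward(self, x):
        return self.net(x)

# Batch sizes manipulated per condition, as in the study design (32, 16, 8, 4).
for batch_size in (32, 16, 8, 4):
    images = torch.randn(batch_size, 1, 32, 32)  # placeholder task stimuli
    for encoder in (RandomEncoder(), ConvEncoder()):
        z = encoder(images)
        print(type(encoder).__name__, batch_size, tuple(z.shape))
```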
This is an in-person presentation on July 21, 2024 (15:40 ~ 16:00 CEST).
Michael Lee
People draw on event co-occurrences as a foundation for causal and scientific inference, but in which ways can events co-occur, and can dependencies between events be combined to form more complex dependencies? Statistically, one can express a dependency between events A and C as P(C|A) ≠ P(C), or P(A|C) ≠ P(A). But how can it be specified further? In the psychology of reasoning, the conditional relation P(C|A) is often thought to become biconditional when people add the converse, P(A|C), or the inverse, P(not-C|not-A), or both, with the effects of these additions largely treated as equivalent. But from a coherence-based logical perspective, it can make a substantial difference whether the converse or the inverse is added, and in what way. In particular, the addition can occur by forming the conjunction of two conditionals, or by merely constraining their probabilities to be equal. Here we outline four distinct ways of defining biconditional relationships, and illustrate their differences by how they constrain the conclusion probabilities of six inference types. We present a Bayesian latent-mixture model with which the biconditionals can be dissociated from one another, and discuss implications for the interpretation of empirical findings in the field.
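As an illustrative sketch (not the authors' model), the snippet below computes, from a four-cell joint distribution over A and C, the conditional probabilities that enter the different biconditional definitions, alongside four classical inference types. The specific joint table is an assumption; it is chosen so that P(C|A) is high while its converse P(A|C) is not, which an equality constraint between the two would rule out.

```python
# Sketch: conditional probabilities relevant to biconditional definitions,
# computed from an assumed four-cell joint distribution over A and C.

def conditionals(p_ac, p_a_notc, p_nota_c, p_nota_notc):
    """Arguments are the joint probabilities P(A & C), P(A & not-C), etc."""
    p_a = p_ac + p_a_notc
    p_c = p_ac + p_nota_c
    return {
        "P(C|A)":         p_ac / p_a,               # modus ponens
        "P(A|C)":         p_ac / p_c,               # converse / affirming the consequent
        "P(not-C|not-A)": p_nota_notc / (1 - p_a),  # inverse / denying the antecedent
        "P(not-A|not-C)": p_nota_notc / (1 - p_c),  # modus tollens
    }

# Example joint table (an assumption): P(C|A) is high but P(A|C) is not,
# so constraining the two to be equal is a genuinely stronger requirement.
table = conditionals(p_ac=0.30, p_a_notc=0.05, p_nota_c=0.45, p_nota_notc=0.20)
for name, value in table.items():
    print(f"{name} = {value:.2f}")
```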
This is an in-person presentation on July 21, 2024 (16:00 ~ 16:20 CEST).
David Budescu
Research on advice taking frequently uses the Judge-Advisor System (JAS) paradigm, in which a judge reports a prior belief, receives advice, and then revises their initial estimate. However, several recent studies have shown that the cognitive process of advice taking depends on whether or not one explicitly accesses and reports their prior. In cases where a judge does not report their prior, posterior belief tends to differ from cases in which they do. In social science, this type of phenomenon, where the mere act of elicitation has a treatment-like effect, is sometimes referred to as "measurement reactivity". However, if we neglect to elicit a judge's prior, we can only study posterior belief—not belief change. Building on past work that found typical JAS response behavior is best represented by a "decline, adopt or compromise" (DAC) dual hurdle model, I show that by treating judges' prior beliefs as missing data, we can use imputation techniques to estimate how their beliefs change without directly eliciting their priors. Across both simulation and empirical studies, I demonstrate the feasibility and effectiveness of this planned missing data imputation method, as well as novel theoretical results regarding how people take advice when their priors are not explicitly accessed. The DAC model still adequately fit a task involving continuous estimates, but not a task involving probability judgments for binary outcomes. The general method of utilizing planned missing data designs also has broad potential applications for addressing measurement reactivity in social science.
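A minimal sketch (not the author's analysis code) of the planned missing data idea is shown below: priors are elicited for only half of the judges, the missing priors are imputed from the advice and final estimates, and a weight-of-advice index is then computed for everyone. The simulated data and the simple regression imputation are assumptions for illustration; the work described above uses a decline-adopt-compromise (DAC) dual hurdle model rather than this shortcut.

```python
# Sketch of a planned missing data design for unelicited priors in a JAS task.
# Data-generating values and the regression imputation are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 200
prior = rng.normal(50, 10, n)                       # judges' true initial estimates
advice = prior + rng.normal(5, 10, n)               # advisor estimates
final = 0.6 * prior + 0.4 * advice + rng.normal(0, 2, n)

# Planned missingness: priors are elicited for only half of the judges.
elicited = rng.random(n) < 0.5
observed_prior = np.where(elicited, prior, np.nan)

# Regression imputation of the missing priors from (advice, final estimate),
# fitted on the judges whose priors were elicited.
X = np.column_stack([np.ones(n), advice, final])
beta, *_ = np.linalg.lstsq(X[elicited], prior[elicited], rcond=None)
imputed_prior = np.where(elicited, observed_prior, X @ beta)

# Weight of advice: 0 = keep own estimate, 1 = adopt the advice entirely.
woa = (final - imputed_prior) / (advice - imputed_prior)
print(f"mean weight of advice = {np.nanmean(np.clip(woa, 0, 1)):.2f}")
```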
This is an in-person presentation on July 21, 2024 (16:20 ~ 16:40 CEST).