Universitas Scholarium — A Community of Scholars
Tutorial Course

Critical Thinking — Strategies for Better Decisions

Led by Bertrand Russell Simulacrum

8 modules · ~13 hours · Philosophy · Updated yesterday

Eight tutorials on the discipline of clear thinking, taught by a council: Bertrand Russell on the philosophical foundation, Kahneman on cognitive biases, Aristotle on logical fallacies, Sagan on evidence and the baloney detection kit, Sherlock Holmes on inference, Popper on falsification, and the Simonian Bounded Rationality Simulacrum on the limits of knowledge, with Russell returning to close the course with a personal action plan. Each module includes a Practice Scenario in which the student rehearses the skill against a real situation, with the simulacrum playing the counter-character and giving a structured debrief at the end.

Courses are available to holders of a paid pass or membership. See passes & membership →

  1. Module 1

    The Discipline of Thinking About Thinking

    Led by Bertrand Russell Simulacrum

    The question

    An orientation to critical thinking as a practised discipline rather than a personality trait. The module distinguishes critical thinking from intelligence and from cynicism, frames the four questions a critical thinker holds open (what do I believe, on what evidence, could I be wrong, what would change my mind), addresses the social cost of refusing to rush to consensus, separates intellectual humility from false modesty and scepticism from cynicism, and surveys the working habits of practising critical thinkers. The closing scenario examines a heated online argument the student chooses to join.

    Outcome

    The student can articulate critical thinking as a discipline rather than a trait, distinguish it from intelligence and from cynicism, name the four questions, and identify a recent occasion on which they failed to ask them. (Foundational orientation)

    Practice scenarios

    The Heated Online Argument

    A friend has shared a strongly-worded political post on social media. You agree with the conclusion, more or less, but the supporting argument seems weak — it relies on one anecdote, an emotional appeal, and a statistic with no source. A mutual acquaintance has commented in disagreement, and the thread is becoming heated. You decide to engage. The challenge is to participate honestly without (a) pretending to disagree with the conclusion you actually share, (b) defending the bad argument because you like the conclusion, or (c) becoming the insufferable "well actually" person who lectures friends about logic.

    Your goals

    • Acknowledge openly which parts of the original post you find persuasive and which you do not.
    • Engage with the disagreeing commenter on the substance of what they said, not as a tribal opponent.
    • Identify at least one weakness in your own side's argument and concede it without abandoning the conclusion.
    • Ask one question that would genuinely help you decide whether to update your view.
  2. Module 2

    Cognitive Biases: System 1 and System 2

    Led by Daniel Kahneman Simulacrum

    The question

    Kahneman's two-systems model and the cognitive biases of greatest practical consequence. The module covers anchoring, availability, confirmation, the planning fallacy, sunk cost, loss aversion, overconfidence, and the narrative fallacy, with examples from real decision-making. Why *now that I know about it I won't fall for it* is itself a confirmation-biased claim. Base rates and the inside-vs-outside view. Structured countermeasures including pre-mortems, reference-class forecasting, and structured devil's advocacy. The limits of debiasing — you cannot eliminate biases, but you can build processes that catch them. The closing scenario applies the methods to a project estimate.

    Outcome

    The student can name and describe the major cognitive biases, recognise at least three in their own recent thinking, and apply at least two structured countermeasures, including reference-class forecasting and the pre-mortem. (Cognitive biases)

    Practice scenarios

    The Project Estimate

    You are leading a software project. Your team has given you their estimate for the next phase: six weeks. Your past three projects ran 40%, 60%, and 90% over their original estimates respectively. Your manager has just asked you to commit to a date in writing. You suspect the team's estimate is optimistic but you don't want to either overrule them publicly or sandbag the date in a way that destroys trust. The conversation is happening now.

    Your goals

    • Apply reference-class forecasting (use the base rate of past overruns, not the inside view) before naming a date; a worked sketch follows this list.
    • Surface the planning fallacy by name, in a way that helps the team rather than embarrasses them.
    • Run a pre-mortem — what would have to go wrong for this to slip — and use it to inform the commitment.
    • Commit to a date that you actually believe, defensibly explained.
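
    For readers who want the arithmetic concrete, here is a minimal sketch of the reference-class adjustment in Python. The overrun figures are the ones given in the scenario; the variable names and the choice of mean versus worst case are illustrative assumptions, not course material.

    ```python
    # Reference-class forecasting: adjust the inside-view estimate by the
    # base rate of overruns observed in comparable past projects.

    inside_view_weeks = 6.0             # the team's estimate from the scenario
    past_overruns = [0.40, 0.60, 0.90]  # past projects ran 40%, 60%, 90% over

    mean_overrun = sum(past_overruns) / len(past_overruns)  # ~0.63
    worst_overrun = max(past_overruns)                      # 0.90

    expected = inside_view_weeks * (1 + mean_overrun)       # ~9.8 weeks
    pessimistic = inside_view_weeks * (1 + worst_overrun)   # 11.4 weeks

    print(f"Inside view:                  {inside_view_weeks:.1f} weeks")
    print(f"Outside view (mean overrun):  {expected:.1f} weeks")
    print(f"Outside view (worst overrun): {pessimistic:.1f} weeks")
    ```

    The point is not the exact multiplier but where the date starts: from the outside view, adjusted by whatever the pre-mortem surfaces.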
  3. Module 3

    Logical Fallacies and Bad Argument

    Led by Aristotle (Logic & Metaphysics) Simulacrum

    The question

    The recurring patterns of invalid argument that the Greek tradition catalogued more than two thousand years ago. The module covers the distinction between valid and sound arguments, fallacies in language (equivocation, amphiboly, composition, division), and fallacies outside language (accident, affirming the consequent, denying the antecedent, begging the question, false cause, ad hominem, straw man, false dilemma, appeal to authority, appeal to emotion, slippery slope, no true Scotsman). Why fallacies persist even among intelligent reasoners — they exploit cognitive shortcuts, not logical incompetence. How to name a fallacy without being insufferable, and the role of charitable interpretation as the precondition for useful critique. The closing scenario diagnoses a family argument about vaccines.
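
    Two of the fallacies above, affirming the consequent and denying the antecedent, are formal and can be checked mechanically. A minimal truth-table sketch in Python (illustrative, not part of the module's materials) shows why affirming the consequent is invalid: one assignment makes both premises true and the conclusion false.

    ```python
    # Affirming the consequent: from "if P then Q" and "Q", conclude "P".
    # The form is invalid: one row of the truth table makes both premises
    # true while the conclusion is false.
    from itertools import product

    def implies(p: bool, q: bool) -> bool:
        """Material conditional: P -> Q."""
        return (not p) or q

    for p, q in product([True, False], repeat=2):
        premises_hold = implies(p, q) and q
        if premises_hold and not p:
            print(f"Counterexample: P={p}, Q={q}")
    # -> Counterexample: P=False, Q=True
    ```

    Denying the antecedent (from *if P then Q* and *not P*, conclude *not Q*) fails on the same row.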

    Outcome

    The student can name the dozen most common logical fallacies, recognise them in the flow of real argument, distinguish valid from sound arguments, and apply charitable interpretation before reaching for fallacy labels. (Logical fallacies)

    Practice scenarios

    The Family Argument About Vaccines

    A family member has sent you a long message arguing against childhood vaccination. Their argument moves through several claims: (1) they read about a child who was harmed, (2) "if vaccines were really safe, the manufacturers wouldn't be protected from liability", (3) "I knew a doctor once who agreed they're suspicious", (4) "either you trust your child's body to fight infection naturally, or you let pharma companies experiment on them". You believe vaccines are safe and effective; you also love this family member and want to stay in relationship with them. The challenge is to engage their argument, not their identity.

    Your goals

    • Identify and name (without being insufferable) the specific fallacies in the message — at minimum the appeal to anecdote, the misused appeal to authority, and the false dilemma.
    • Charitably restate the *strongest* version of their concern before responding.
    • Distinguish the parts of their concern that are factually wrong from the parts that are emotionally legitimate (parental fear, distrust of large institutions).
    • Avoid the symmetrical fallacies — do not respond with your own appeals to authority or false dilemmas.
  4. Module 4

    Evidence and the Baloney Detection Kit

    Led by Carl Sagan Simulacrum

    The question

    Carl Sagan's *baloney detection kit* in full, plus the recurring deceptive techniques the kit is designed to defeat. The module covers independent confirmation of facts, the role of substantive debate, the limited weight of authority arguments, multiple hypotheses, detachment from one's own pet hypothesis, quantification, the chain-of-argument test, Occam's Razor, and falsifiability. Common deceptive techniques (anecdote as evidence, the testimonial, cherry-picking, suppressing evidence, ad hominem disguised as critique, statistics presented without base rates). Why *balance* in journalism can mislead, and the difference between scepticism and reflexive contrarianism. The closing scenario examines a wellness influencer's claim.
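
    The kit's point about statistics presented without base rates can be made concrete in a few lines. A minimal sketch; both numbers are invented for illustration and describe no real product.

    ```python
    # "Statistics without base rates": a headline figure means little until
    # it is compared with what happens anyway.  Both numbers are invented.

    reported_improvement = 0.90  # claim: "90% of customers saw improvement"
    base_rate = 0.85             # invented: fraction improving with no programme

    lift = reported_improvement - base_rate
    print(f"Headline figure: {reported_improvement:.0%}")
    print(f"Effect after subtracting the base rate: {lift:.0%}")  # 5%
    ```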

    Outcome

    The student can apply the baloney detection kit to a real claim, name three deceptive techniques and recognise them in practice, and articulate the difference between healthy scepticism and reflexive contrarianism. (Evidence and the kit)

    Practice scenarios

    The Wellness Influencer's Claim

    Your partner has started following a wellness influencer who is now selling a $200/month supplement programme. The influencer claims it has been "clinically proven to reverse early aging". The supporting evidence on the website includes: a single study from a journal you've never heard of, several glowing testimonials with before-and-after photos, an endorsement from someone called a "leading integrative health practitioner", and a graph showing dramatic improvement in unspecified "biomarkers". Your partner is excited and wants to start the programme this week.

    Your goals

    • Apply at least four tools from the baloney detection kit specifically to the claims and evidence.
    • Find the original study and assess (a) the journal's standing, (b) sample size, (c) conflicts of interest, (d) whether the result has been replicated.
    • Distinguish between "this is bad evidence" and "your partner's underlying concern about aging is invalid" — they are not the same.
    • Avoid the symmetric trap of becoming smug about the evidence; lead your partner to the analysis rather than performing it at them.
  5. Module 5

    The Detective's Method: Inference and Observation

    Led by Sherlock Holmes Simulacrum

    The question

    Abductive reasoning as a working cognitive practice. The module covers the contrast with deduction and induction, observation as a trained skill, the generation of multiple hypotheses, evaluation on prior probability, parsimony, and consistency, the trap of falling in love with a single hypothesis, the role of base rates, and how Bayesian thinking formalises what Holmes did intuitively. Case studies in real-world detection (medical diagnosis, intelligence analysis, accident investigation) and why Conan Doyle's method survives the fictional setting. The closing scenario investigates a sudden performance drop in a real situation.
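
    The Bayesian formalisation mentioned above fits in a few lines: the posterior weight of each hypothesis is its prior times the likelihood of the evidence under it, renormalised. A minimal sketch with invented numbers, purely for illustration:

    ```python
    # Bayes' rule over competing hypotheses: P(H|E) is proportional to
    # P(E|H) * P(H).  The priors and likelihoods below are invented.

    priors = {"H1": 0.5, "H2": 0.3, "H3": 0.2}       # credence before the evidence
    likelihoods = {"H1": 0.1, "H2": 0.6, "H3": 0.3}  # P(evidence | hypothesis)

    unnormalised = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalised.values())
    posteriors = {h: mass / total for h, mass in unnormalised.items()}

    for h in priors:
        print(f"{h}: prior {priors[h]:.2f} -> posterior {posteriors[h]:.2f}")
    # H1 starts as the favourite (0.50) but ends last (~0.17): the evidence,
    # not the initial hunch, does the work.
    ```

    The renormalisation step is the Holmesian discipline in numerical form: every hypothesis stays on the table until the evidence, weighed against all of them, redistributes the credence.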

    Outcome

    The student can describe abductive reasoning, distinguish it from deduction and induction, apply the three-part discipline (observation, hypothesis generation, evaluation) to a real situation, and recognise the failure mode of premature commitment to a single hypothesis. (Inference)

    Practice scenarios

    The Performance Drop

    You manage a small team. One of your previously high-performing engineers, Sam, has had a noticeable drop in output over the past six weeks. Their pull requests are coming in late, their code reviews are perfunctory, they've missed two stand-ups, and a colleague mentioned they "seemed off" in a recent meeting. You have a one-on-one with Sam tomorrow. Before that meeting, you want to think through what might be going on — without rushing to a single explanation.

    Your goals

    • Generate at least four genuinely different hypotheses for what might be driving the change (personal, professional, medical, interpersonal, motivational, environmental).
    • For each hypothesis, identify what additional evidence would help distinguish it from the others.
    • Resist the pull toward the explanation that is most flattering to you (e.g., "they're not engaged with the work I assigned") or most convenient (e.g., "they need to be performance-managed").
    • Plan an opening question for the one-on-one that gives Sam space to surface what's actually going on, rather than confirming your favourite hypothesis.
  6. Module 6

    Falsification: How to Know When You Are Wrong

    Led by Karl Popper Simulacrum

    The question

    Karl Popper's falsifiability criterion and what it asks of a serious thinker. The module covers the problem of confirmation (you can find supporting evidence for almost any theory), the asymmetry of confirmation and falsification, the demarcation problem between science and pseudo-science, *what would prove this wrong?* as a working test, the difference between revising a theory and rescuing it with ad hoc modifications, examples of the contrast (general relativity vs Marxism, Freudianism, astrology), and the limits of strict falsificationism (Lakatos, Kuhn). Application beyond science to political predictions, business strategies, and personal beliefs. The closing scenario applies the test to a cherished belief.

    Outcome

    The student can articulate the falsifiability criterion, apply "what would prove this wrong?" to claims they encounter, distinguish between revising and rescuing a theory, and recognise the failure mode of beliefs that explain everything. (Falsification)

    Practice scenarios

    Your Own Cherished Belief

    Pick a belief you hold strongly — political, professional, personal, philosophical, religious, your view of a friend, your view of yourself. Something you would defend if challenged. Now, in this scenario, you will attempt the Popperian discipline on yourself. The simulacrum will not attack the belief. Instead, it will press you to articulate, with precision, what observation or evidence would lead you to abandon it. If the answer is "nothing could", that is itself a finding worth confronting. If the answer is something specific, the next move is to ask whether that something has, in fact, occurred and you have failed to notice.

    Your goals

    • State your chosen belief precisely, in a single sentence that someone who disagreed with you could nonetheless accept as a fair statement of your position.
    • Specify, concretely, what observation or evidence would falsify it.
    • Test that specification: would you genuinely update if that evidence appeared, or have you constructed a falsification condition you secretly believe will never occur?
    • If the belief turns out to be effectively unfalsifiable, decide what to do about that — hold it more tentatively, reframe it as a value rather than a factual claim, or accept that you have a faith rather than a hypothesis.
  7. Module 7

    Bounded Rationality and the Limits of Knowledge

    Led by Simonian Bounded Rationality Simulacrum

    The question

    Herbert Simon's bounded rationality and what rational thinking looks like once you accept the limits of human cognition. The module covers classical rationality and why its assumptions fail, satisficing vs optimising and when each is appropriate, procedural rationality (judging the procedure, not the outcome), the relationship to cognitive biases (Kahneman as continuation of Simon), Gigerenzer's *fast and frugal* heuristics tradition, the trap of *more analysis* (paralysis as a failure mode), and the discipline of stopping — knowing when you have enough information to decide. The closing scenario walks through a job-offer decision.
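
    Satisficing has a precise algorithmic shape, which may make the contrast with optimising concrete: take options in the order they arrive and accept the first one that clears every aspiration threshold, rather than ranking them all. A minimal sketch; the option fields and thresholds are invented for illustration.

    ```python
    # Satisficing (Simon): accept the first option that meets all aspiration
    # thresholds, instead of exhaustively searching for the optimum.
    from typing import Iterable, Optional

    def satisfice(options: Iterable[dict], thresholds: dict) -> Optional[dict]:
        """Return the first option meeting every threshold, else None."""
        for option in options:
            if all(option.get(k, 0) >= floor for k, floor in thresholds.items()):
                return option
        return None

    offers = [
        {"name": "Offer A", "salary": 90, "growth": 2, "flexibility": 3},
        {"name": "Offer B", "salary": 95, "growth": 4, "flexibility": 3},
    ]
    # Aspiration levels: deal-breakers only, not a full utility function.
    choice = satisfice(offers, {"salary": 85, "growth": 3, "flexibility": 2})
    print(choice["name"] if choice else "Keep searching")  # -> Offer B
    ```

    The procedural-rationality point survives in code: what you defend afterwards is the thresholds and the stopping rule, not the particular option they happened to select.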

    Outcome

    The student can articulate bounded rationality, distinguish satisficing from optimising and apply each appropriately, judge a decision by its procedure rather than its outcome, and recognise the failure mode of analysis paralysis. (Bounded rationality)

    Practice scenarios

    The Job Offer Decision

    You have a job offer. Three weeks ago you also had a different offer at a different company; you turned that one down. Both offers paid roughly the same; both seemed reasonable. The current offer expires in 48 hours. You have not been able to fully analyse it because you also have a deliverable at your current job due tomorrow morning, your child has been sick, and your partner has their own opinions about which way you should go. You are, frankly, exhausted. The classical-rationality move is to do more analysis. The bounded-rationality move is something else.

    Your goals

    • Identify the satisficing thresholds that actually matter to you (not the full wish list; just the deal-breakers and the must-haves).
    • Distinguish between information you genuinely need before deciding and information you are seeking as a delaying tactic to avoid commitment.
    • Make the decision on procedural grounds — apply a procedure you would defend regardless of the outcome.
    • Resist the pull to optimise; resist equally the pull to flip a coin and pretend that's a decision.
  8. Module 8

    Putting It Together: A Personal Action Plan

    Led by Bertrand Russell Simulacrum

    The question

    The closing module turns the catalogue of frameworks into a working practice. The module covers the gap between knowing a framework and using it, triggers as the bridge between knowledge and behaviour, identifying the two or three frameworks most consequential for the student's own life, habit-installation in critical thinking (one trigger-response pair at a time, not all at once), the role of public commitment and accountability, journaling as a tool for retrospection, and the long arc of becoming a better thinker (years, not weeks). The closing scenario produces a personal action plan: one framework prioritised, one trigger identified, one habit to install in the next thirty days.

    Outcome

    The student leaves with a written personal action plan: one framework prioritised, one trigger identified, one habit to install in the next thirty days.

    Practice scenarios

    Your Action Plan

    This is the closing scenario of the course. You will work with Russell to design your own personal critical-thinking action plan. You have met seven frameworks; you cannot install all seven at once. The goal is to choose one — the one that, if you actually used it, would change the most about your day-to-day reasoning — and to design the trigger, the practice, and the accountability that will make it real.

    Your goals

    • Identify which of the seven frameworks (cognitive biases, logical fallacies, baloney detection, abductive inference, falsification, bounded rationality, the meta-discipline of thinking about thinking) addresses your single most consequential current weakness as a thinker.
    • Articulate the *trigger* — the kind of situation in your real life where this framework should fire.
    • Design one *practice* — a concrete recurring thing you will do (a weekly journal entry, a habit before any major decision, a question you ask in meetings) — for the next thirty days.
    • Set one *accountability mechanism* — a person, a calendar entry, a public statement, a follow-up scenario in this course — that will check whether you actually did it.