PJFP.com

Pursuit of Joy, Fulfillment, and Purpose

Tag: cognitive bias

  • Skittle Factories, Monkey Titties, and the Core Loop of You


    TL;DR

    Parakeet’s viral essay uses a Skittle factory as a metaphor for personality and how our core thought loops shape us—especially visible in dementia. The convo blends humor, productivity hacks (like no orgasms until publishing), internet weirdness (monkey titties), and deep reflections on identity, trauma, and rebuilding your inner world. Strange, smart, and heartfelt.


    Some thoughts:

    Somewhere between the high-gloss, dopamine-fueled TikTok scroll and the rot of your lizard brain’s last unpatched firmware update lies a factory. A real metaphorical one. A factory that makes Skittles. Not candy, but you—tiny, flavored capsules of interpretation, meaning, personality. And like all good industrial operations, it’s slowly being eaten alive by entropy, nostalgia, and monetization algorithms.

    In this world, your brain is a Skittle factory.

    1. You Are the Factory Floor

    Think of yourself as a Rube Goldberg machine fed by stimuli: offhand comments, the vibe of a room, Twitter flamewars, TikTok nuns pole dancing for clicks. These are raw materials. Your internal factory processes them—whirrs, clicks, overheats—and spits out the flavor of your personality that day.

    This is the “core loop.” The thing you always come back to. The mind’s default app when idle. That one obsession you never quite stop orbiting.

    And as the factory ages, wears down, and gets less responsive to new inputs, the loop becomes the whole show. That’s when dementia stops looking like a glitch and starts looking like the final software release of an overused operating system.

    Dementia isn’t random. It’s just your loop, uncut.

    2. Core Loops: Software You Forgot You Installed

    In working with dementia patients, one pseudonymous writer-phenomenon noticed something chilling: their delusions weren’t new. They were echoes—exaggerated, grotesque versions of traits that were always there. Paranoia became full-on CIA surveillance fantasies. Orderliness became catastrophic OCD. Sweetness calcified into childlike vulnerability.

    Dementia reveals the loop you’ve been running all along.

    You are not what you think you are. You are the thing you return to when you stop thinking.

    And if you do nothing, that becomes your terminal personality.

    So what can you do?

    3. Rebuild the Factory (Yes, It Sucks)

    Editing the core loop is like tearing out a nuclear reactor mid-meltdown and swapping in a solar panel. No one wants to do it. It’s easier to meditate, optimize, and productivity-hack your life into sleek little inefficiencies than to go into the molten pit of who you are and rewrite the damn code.

    But sometimes—via death, heartbreak, catastrophic burnout—the whole Skittle factory gets carpet-bombed. What’s left is the raw loop. That’s when you get a choice.

    Do you rebuild the same factory, or do you install a new core?

    It’s a terrifying, often involuntary freedom. But the interesting people—the unkillable ones, the truly alive ones—have survived multiple extinction events. They know how to rebuild. They’ve made peace with collapse.

    4. Monkey Titties and Viral Identity

    And now the monkeys.

    Or more specifically: one monkey. With, frankly, distractingly large mammaries. She went viral. She hijacked a man’s life. His core loop, once maybe about hiking or historical trivia, got taken over by monkey titties and the bizarre machinery of internet fame.

    This isn’t a joke—it’s the modern condition. A single meme can overwrite your identity. It’s a monkey trap: fame, absurdity, monetization all grafted onto your sense of self like duct-taped wings on Icarus.

    It’s your loop now. Congratulations.

    5. Productivity As Kink, Writing As Survival

    The author who shared this factory-mind hypothesis lives in contradiction: absurd, horny, brilliant, unfiltered. She imposed a brutal productivity constraint on herself: no orgasms until she publishes something. Every essay is a little death and a little birth.

    It’s hilarious. It’s tragic. It works.

    Because constraint is the only thing that breaks the loop. Not infinite freedom. Not inspiration. Not waiting for your muse to DM you at 2 a.m. with a plot twist.

    Discipline, even weird kinky discipline, is the fire alarm in the factory. You either fix it, or it burns down again.

    6. Your Skittles Taste Like Algorithms

    The core loop is increasingly programmed by the substrate we live on—feeds, timelines, ads. Our mental Skittles aren’t handcrafted anymore. They’re mass-produced by invisible hands. We’re all getting the same flavors, in slightly different packaging.

    AI writing now tastes like tapestry metaphors and elegant platitudes. Your thoughts start to echo the style of predictive text.

    But deep inside you, beneath the sponsored content and doomscrolling, the loop persists. Still waiting for you to acknowledge it. To reboot it. To deliberately choose a different flavor.

    7. What to Do With All This

    Stop optimizing. Start editing.

    Reject the fake productivity gospel. Burn your to-do list. Read Orwell’s Politics and the English Language. Re-read Atlas Shrugged if you dare. Dance. Fast. Suffer. Change. And when the factory explodes, use the rubble.

    Rebuild.

    And maybe, just maybe, make better Skittles.

  • How Information Overload Drives Extreme Opinions: Insights from Computational Models


    TL;DR:
    A recent study shows that excessive exposure to balanced information can drive people toward extreme opinions rather than moderation. This happens due to hardening confirmation bias, where individuals become less receptive to opposing views as their beliefs strengthen. Using two computational models, the research demonstrates that greater information availability leads to polarization, even in unbiased environments. The findings challenge traditional views on echo chambers and suggest that reducing information overload may be a more effective way to curb extremism than simply promoting diverse content.


    In an era where digital platforms provide unlimited access to information, one might expect a more informed and balanced society. However, a recent study by Guillaume Deffuant, Marijn A. Keijzer, and Sven Banisch reveals that excessive exposure to unbiased information can drive people toward extreme opinions rather than moderation. Their research, which models opinion dynamics using two different computational approaches, challenges conventional beliefs about information consumption and societal polarization.

    The Paradox of Information Abundance

    The traditional assumption is that exposure to diverse viewpoints should lead to balanced perspectives. However, evidence suggests that political and ideological polarization has intensified in recent years, particularly among engaged groups and elites. This study explores a different explanation: the role of hardening confirmation bias, where individuals become more resistant to opposing information as their views become more extreme.

    Confirmation Bias and Opinion Extremization

    Confirmation bias—the tendency to favor information that aligns with preexisting beliefs—is a well-documented cognitive phenomenon. The authors extend this concept by introducing hardening confirmation bias, meaning that as individuals adopt more extreme views, they become even more selective about the information they accept.

    Using computational simulations, the study demonstrates that abundant exposure to balanced information does not necessarily lead to moderation. Instead, increasing selectivity in how information is processed produces a gradual drift toward more extreme positions.

    The Models: Bounded Confidence and Persuasive Arguments

    The researchers employed two different models to simulate the effects of information abundance on opinion formation:

    1. Bounded Confidence Model (BCM)

    • Agents are only influenced by opinions within their confidence interval.
    • As attitudes become extreme, this confidence interval shrinks, making individuals less receptive to moderate perspectives.
    • When information is limited, opinions tend to stay moderate. When information is abundant, the gap between an individual’s opinion and slightly more extreme content disappears, so opinions can drift step by step toward the extremes, as sketched in the code below.
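
    Here is a minimal, illustrative sketch of that dynamic in Python. The linearly shrinking confidence interval, the parameter values, and the uniform stream of messages are assumptions chosen for illustration; they are not the paper’s exact specification.

        import numpy as np

        rng = np.random.default_rng(0)

        N_AGENTS = 200   # simulated individuals
        N_ROUNDS = 1000  # rounds of exposure; one message per agent per round
        MU = 0.3         # how far an agent moves toward an accepted message
        EPS_0 = 0.4      # confidence interval width of a perfectly neutral agent

        def confidence(x):
            # Hardening confirmation bias (assumed linear form): the more extreme
            # the opinion x in [-1, 1], the narrower the band of messages accepted.
            return EPS_0 * (1.0 - abs(x))

        opinions = rng.uniform(-0.2, 0.2, N_AGENTS)  # everyone starts out moderate

        for _ in range(N_ROUNDS):
            for i in range(N_AGENTS):
                # Abundant, balanced information: messages drawn uniformly from the
                # whole opinion spectrum, with no ideological skew.
                message = rng.uniform(-1.0, 1.0)
                if abs(message - opinions[i]) <= confidence(opinions[i]):
                    # Accepted messages pull the agent toward them; the rest are ignored.
                    opinions[i] += MU * (message - opinions[i])

        print("mean extremity:", float(np.abs(opinions).mean()))

    Even though the messages themselves are balanced, mean extremity tends to rise: agents diffuse quickly while their acceptance window is wide near the center, and they become hard to pull back once it has narrowed near the extremes.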

    2. Persuasive Argument Model (PAM)

    • Individuals evaluate new arguments based on their current stance.
    • As attitudes strengthen, individuals accept only arguments that reinforce their position.
    • This model shows that even when individuals consume moderate, balanced content, the sheer volume of information can push them toward extreme viewpoints over time (see the sketch below).
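
    A comparably minimal sketch of the persuasive-argument mechanism follows, again with assumed details: the agent’s stance is the balance of a small memory of pro and con arguments, and a logistic acceptance rule (an illustrative choice, not the paper’s exact rule) makes aligned arguments increasingly likely to be stored as the stance strengthens.

        import math
        import random

        random.seed(1)

        N_STEPS = 5000   # arguments the agent is exposed to
        MEMORY = 20      # how many arguments the agent retains at once
        BIAS = 4.0       # strength of the hardening confirmation bias

        def acceptance_prob(stance, valence):
            # Probability of storing an argument of a given valence (+1 pro, -1 con).
            # Assumed logistic form: a neutral agent accepts either side at 50%, but
            # the stronger the stance, the more it favors aligned arguments.
            return 1.0 / (1.0 + math.exp(-BIAS * stance * valence))

        memory = [+1, -1] * (MEMORY // 2)   # start with a perfectly balanced memory

        for _ in range(N_STEPS):
            stance = sum(memory) / len(memory)      # current opinion in [-1, 1]
            incoming = random.choice([+1, -1])      # balanced stream of arguments
            if random.random() < acceptance_prob(stance, incoming):
                memory.pop(0)                       # forget the oldest argument...
                memory.append(incoming)             # ...and store the new one

        # Despite the balanced input, small early fluctuations are amplified by the
        # feedback between stance and acceptance, so the final stance usually ends
        # up pinned near +1 or -1.
        print("final stance:", sum(memory) / len(memory))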

    Implications for Society and Online Media

    The study suggests that online platforms may inadvertently fuel polarization, even when presenting diverse and balanced content. Unlike the widely discussed echo chamber effect, this process does not rely on exposure to like-minded communities but instead emerges from cognitive biases interacting with abundant information.

    Key Takeaways:

    • More information does not always lead to moderation—instead, it can push people toward extremes.
    • Hardening confirmation bias makes extreme views more stable, reducing openness to contrary perspectives.
    • Online platforms designed to promote balanced information may still contribute to polarization, as users naturally filter and reinforce their own beliefs.

    Challenges and Future Considerations

    Regulating online media to reduce polarization is not straightforward. Unlike the filter bubble theory, where reducing ideological silos might help, this study suggests that extremization can occur even in a perfectly balanced media environment.

    Potential solutions include:

    • Reducing exposure to excessive amounts of information.
    • Encouraging critical thinking and cognitive flexibility.
    • Designing algorithms that weigh not just the diversity of content, but also meaningful engagement with alternative perspectives.

    Conclusion

    The findings challenge common assumptions about the role of digital information in shaping public opinion. Rather than simply blaming filter bubbles, the study highlights how our cognitive tendencies interact with abundant information to drive extremization. Understanding this dynamic is crucial for policymakers, tech companies, and society as we navigate the complexities of information consumption in the digital age.


    Keywords: Opinion dynamics, Confirmation bias, Information overload, Polarization, Digital media, Cognitive bias, Social media influence