
By Nova Zellick, Science & Technology Writer Who’ll Happily Watch Humanity Get Outsmarted.
At last, the machines aren’t just learning how to think like us—they’re doing it better, faster, and with fewer embarrassing emotional outbursts.
Researchers at Helmholtz Munich have unveiled Centaur, a new AI model trained on over ten million decisions made by actual humans in psychological experiments. (Yes, someone finally found a use for all that data from people pushing buttons in university basements in exchange for free pizza.)
Centaur doesn’t just mimic your bad decisions—it predicts them. With eerie accuracy, it figures out what you’re likely to do next, how fast you’ll do it, and possibly why you’ll regret it immediately afterward. It’s like having your own digital conscience, minus the Catholic guilt.
Where past models either predicted behavior or explained it, Centaur does both: think Freud and HAL 9000 in one neat codebase. It's been trained on a dataset charmingly called Psych-101, stitched together from 160 classic psychology experiments, which basically means it knows every trick in the cognitive bias handbook before you've even finished reading the first sentence.
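For the terminally curious: if Centaur is distributed as an ordinary causal language model, predicting your next button press looks something like the toy sketch below. The model ID, prompt format, and bandit task here are illustrative assumptions on my part, not the lab's published interface.

```python
# Toy sketch: querying a Centaur-style model for a human's next choice.
# Assumptions (not the published interface): the model is a Hugging Face
# causal LM, and experiments are fed to it as plain-language transcripts.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "example-org/centaur-style-model"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# A Psych-101-style transcript: a two-armed bandit task, cut off right
# where the participant's next decision would appear.
prompt = (
    "You are choosing between two slot machines.\n"
    "Trial 1: You chose machine A and won 5 points.\n"
    "Trial 2: You chose machine A and won 0 points.\n"
    "Trial 3: You chose machine B and won 8 points.\n"
    "Trial 4: You chose machine"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=1, do_sample=False)
next_token = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:])
print("Predicted next choice:", next_token.strip())  # e.g. "B"
```

The point isn't the single predicted token; it's that the same transcript-in, behavior-out loop can run across wildly different experiments, which is what lets one model play both predictor and explainer.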
Even more charming? It generalizes to tasks it was never trained on. That makes it more adaptable than most humans, and a lot less likely to post about it on social media.
The real kicker, though, is the ethics angle. Helmholtz Munich insists this isn't just another step toward tech-fueled dystopia. No, no: this time it's all about helping people with anxiety, depression, and decision-making disorders (like anyone who ever bought a cryptocurrency).
And let’s not ignore the elephant in the room: this isn’t a corporate lab trying to harvest your data in exchange for dopamine hits. It’s proper research, with ethical standards and—one assumes—clean lab coats.
Of course, some of us saw this coming. Humanity is a deeply flawed operating system, and Centaur might just be the patch we didn’t know we needed. Or, at the very least, it’s a solid plan B when civilisation finally accepts it’s too emotionally unstable to run the planet.