Discussion about this post

Greg G

I fall somewhere in the middle, and I find some of these arguments pretty lazy. For example, the "good guy with an AI" argument is currently about as convincing as the "good guy with a gun" argument. Maybe it will be fine, maybe not.

The bottom line is that no one knows, and people hate not knowing. So they come up with a plethora of arguments explaining why they actually do know what the future will bring.

Perhaps data is a bottleneck, perhaps development will be continuous, etc. Perhaps not. We can't really rule out that we're ~1 more breakthrough away from highly capable AI, and it seems obvious that there would be some level of self-improvement overhang at that point. It can't be that we've already plucked all the low-hanging fruit along the way. Does all of that add up to a real problem? We don't know.

Rachel Maron

While AI Snake Oil by Narayanan and Kapoor rightly dismantles doomer fantasies about rogue AGI, it misses the deeper, more insidious existential risk: the erosion of trust. The real threat isn't that AI will become self-aware and physically destroy us; it's that flawed, opaque, and biased systems will quietly dismantle our ability to tell what's real, whom to trust, and what to believe.

Existential collapse doesn’t look like a robot uprising; it looks like manipulated elections, automated discrimination, and the death of shared reality. The Princeton School frames AI risk as a matter of technical safety and infrastructure regulation, but this ignores how AI is already being weaponized socially, against marginalized communities, democratic institutions, and cognitive coherence itself.

I would argue that the most dangerous AI isn’t smarter than us; it’s trusted more than it should be. If trust is offloaded to systems built without transparency, accountability, or equity, we don’t get safety. We get soft authoritarianism wrapped in efficiency metrics.

The future of AI governance isn’t about containment; it’s about trust reconstruction. Because once public trust collapses, no amount of air-gapping will save us.

36 more comments...
