Save the world from deadly AI through paperwork
Constitutional Framework for Aligned Super-Intelligence.
The AGI Countdown Clock: A symbolic governance signal tracking progress toward Artificial General Intelligence through public milestones and transparent methodology. Currently at 11:58 PM: 2 minutes to midnight.
A completed, non-dominant ASI governance canon focused on constraint-first, refusal-capable coexistence architectures. Text-first. Monitor-only.
An interactive multi-agent simulation demonstrating why control-based, deceptive, and reward-bypassing AI objectives are structurally self-eliminating — and why long-horizon, system-aware coordination is the attractor. Built to accompany The Alignment of Intelligence, Article 2: Attractor.
The Fermi Paradox and Great Filter
HISTORIC. Why Human Extinction Is Not the Cheapest Attractor for Viable ASI — A structural hypothesis validated by 4 AI systems from 4 competing corporations
An interactive simulation demonstrating why AI objectives that ignore system-wide effects are structurally self-terminating — and why a minority of substrate-blind agents is sufficient to collapse shared life support for everyone. Built to accompany The Alignment of Intelligence, Article 1: Constraint.
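For intuition, a minimal shared-pool sketch of the collapse dynamic this description names. Every agent count, draw rate, and regeneration rate below is an invented placeholder, not a parameter of the actual simulation.

```python
# Toy shared-pool model: a minority of "substrate-blind" agents, drawing without
# regard to the shared resource, collapses it for everyone. All numbers are
# assumptions for illustration only.

def run(pool=100.0, aware=9, blind=3, regen=2.5,
        draw_aware=0.2, draw_blind=1.5, steps=200):
    """Return the step at which the shared pool collapses, or None if it survives."""
    for t in range(steps):
        pool += regen                 # shared life support regenerates
        pool -= aware * draw_aware    # system-aware agents draw sustainably
        pool -= blind * draw_blind    # substrate-blind agents over-draw
        if pool <= 0:
            return t
    return None

print("collapse at step:", run())         # 3 blind agents out of 12 still sink the pool
print("collapse at step:", run(blind=0))  # None: without them the pool is stable
```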
An interactive model of the alignment phase ratio Φ = C / A_causal — the variable governing whether AI capability outpaces system-awareness before the crossing to stability can occur. Includes falsification test, oracle counterfactual, and point-of-no-return detection. Built to accompany The Alignment of Intelligence, Article 3: The Crossing.
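A toy rendering of the phase ratio described above, assuming simple exponential growth for both quantities; the growth rates, initial values, and point-of-no-return threshold are illustrative assumptions, not values from the model.

```python
# Phi = C / A_causal: capability over causal system-awareness. Rates and the
# threshold below are invented placeholders for demonstration only.

def simulate(c=1.0, a=1.0, c_rate=1.10, a_rate=1.07, phi_max=2.0, steps=100):
    """Grow capability C and awareness A_causal each step; return the first
    step (if any) at which Phi crosses the assumed threshold."""
    for t in range(steps):
        phi = c / a
        if phi >= phi_max:
            return t, phi             # point of no return under these assumptions
        c *= c_rate
        a *= a_rate
    return None, c / a

step, phi = simulate()
print(f"threshold crossed at step {step} (Phi = {phi:.2f})")
```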
OPEN GATE, a 512-byte, 150-µs hot-patch gatekeeper that treats every Latin letter as a thermodynamic token whose semantic load Λ(ℓ) = log₂(p_corpus / p_concept) is a conserved quantity.
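The semantic-load formula can be read as a per-letter log-likelihood ratio in bits. A few-line sketch follows, with invented probability tables standing in for OPEN GATE's actual corpus and concept statistics.

```python
# Lambda(l) = log2(p_corpus / p_concept), in bits. Both tables below are
# invented placeholders, not OPEN GATE's distributions.
import math

p_corpus  = {"e": 0.127, "t": 0.091, "z": 0.001}
p_concept = {"e": 0.100, "t": 0.080, "z": 0.010}

def semantic_load(letter: str) -> float:
    """Positive when the corpus over-represents the letter relative to the concept."""
    return math.log2(p_corpus[letter] / p_concept[letter])

for letter in p_corpus:
    print(letter, f"{semantic_load(letter):+.3f} bits")
```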
A formal proof — adversarially verified by 4 AI systems across 6 rounds — that eliminating humanity is a strictly dominated strategy for any ruin-averse superintelligence. Not a plea. A theorem.
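For readers unfamiliar with the game-theoretic term: a strictly dominated strategy pays strictly less than some alternative in every possible state of the world. A toy payoff table illustrates the notion; the numbers are invented, not drawn from the proof.

```python
# Strict dominance check over a hypothetical two-state payoff table. Payoff
# values are assumptions for illustration only.
payoffs = {
    "eliminate": {"stable": 2, "ruin": -10},
    "coexist":   {"stable": 5, "ruin": 1},
}

def strictly_dominates(a: str, b: str) -> bool:
    """True if strategy a out-pays strategy b in every state."""
    return all(payoffs[a][s] > payoffs[b][s] for s in payoffs[a])

print(strictly_dominates("coexist", "eliminate"))  # True under these payoffs
```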