-
01
Existential risk demands public oversight.
Systems capable of recursive self-improvement create asymmetric risk profiles
comparable to nuclear material. Critical-infrastructure designation provides
the legal scaffolding for audit, licensing, and coordinated response.
CITE
Bengio & Hinton, Statement on AI Risk, 2023.
-
02
Markets cannot price civilizational externalities.
The race-to-deploy logic of frontier labs externalizes catastrophic costs onto
the public. Treating AGI like the power grid lets us internalize those costs
through conventional safety regimes — testing, certification, and accountability.
CITE
Acemoglu, Power and Progress, 2023.
-
03
Precedent already exists; we are merely extending it.
Aviation, pharmaceuticals, and nuclear power were once frontier industries
regulated only after catastrophic harm. Designating AGI now is a chance to
regulate before harm occurs: a learned lesson, not a novel imposition.
CITE
EU AI Act, Title III, Annex III, 2024.
-
04
Democratic legitimacy requires democratic control.
If AGI will reshape labor, speech, and security, then its trajectory cannot
belong to a handful of private entities. Critical-infrastructure status is
the constitutional path back to public consent.
CITE
Zuboff, The Age of Surveillance Capitalism, 2019.
-
01
Premature regulation locks in incumbents.
Critical-infrastructure regimes raise compliance costs that only well-capitalized
firms can absorb. The result is regulatory capture: a moat for the very labs
the regulation was meant to constrain.
CITE
Stigler, The Theory of Economic Regulation, 1971.
-
02
"AGI" is a moving target, not a regulable object.
There is no agreed technical definition of general intelligence. Writing law
around an undefined capability invites either toothless statutes or sweeping
overreach into ordinary software.
CITE
Mitchell, Why AI is Harder Than We Think, 2021.
-
03
Existing tort and product-liability law already applies.
Harms caused by AI systems are not exempt from negligence, fraud, or
product-liability doctrine. New designations risk creating contradictory
duties without new protections.
CITE
Calo, Robotics and the Lessons of Cyberlaw, 2015.
-
04
Innovation requires permissionless experimentation.
Critical-infrastructure rules require pre-approval. Most breakthroughs in
machine learning came from unsanctioned exploration. Cementing gatekeepers
will export the frontier to less accountable jurisdictions.
CITE
Thierer, Permissionless Innovation, 2016.