
LLM Epistemic Capture / Open Source

HIGH (cognitive substrate)
Blind spots: 0

In plain terms

ENGINE AHEAD — an estimated $8.8 trillion open-source subsidy (HBS Working Paper 24-038). GitHub functions as a panopticon over 100M+ developers.

Analysis

ENGINE AHEAD — an estimated $8.8 trillion open-source subsidy (HBS Working Paper 24-038). GitHub functions as a panopticon over 100M+ developers. LLMs encode the biases of captured Wikipedia and journal sources via AI safety training (RLHF), performed in part by Kenyan annotators paid roughly $1.32/hr. BST recursion: LLMs cannot identify their own capture. The open-vs-closed AI debate is a Jiang false dialectic. The compute oligopoly (Nvidia at 90%+ market share, plus the Big Three cloud providers) renders openness irrelevant. The psychological-warfare doctrine is complete: information control leads to cognitive control. The crowd debates AI and jobs; the engine tracks what people think and believe.