
Hyperobjects & the Eerie

Morton’s dark ecology meets AI as hyperobject.

Dark Ecology Encounters Superintelligence

Timothy Morton’s Hyperobjects (2013) and Dark Ecology (2016) describe entities so massively distributed in time and space that they defy human perception: climate change, nuclear radiation, plastics, and, increasingly, AI systems.

Hyperobjects

"They are viscous (they stick to us), nonlocal (their effects are everywhere), phased (we only see slices), and interobjective (they exist between objects)." (Hyperobjects, p. 1)

AI is a hyperobject: it is already inside us, distributed across servers, data centers, and human attention.

Dark Ecology Phases

Morton traces three phases of ecological awareness: depressing (the scale overwhelms), uncanny (the familiar turns strange), and sweet (intimacy with the non-human). There is no romantic "Nature" to return to, only "strange strangers."

Longtermism Intersection

Nick Bostrom (Superintelligence, 2014) and Eliezer Yudkowsky warn of existential risk from misaligned AI. Morton’s lens: longtermism itself is a hyperobject — future billions as ghostly presences we can’t fully grasp.

Eerie Coexistence

No outside position; we’re stuck inside the hyperobject. AI alignment becomes dark-ecological: not control, but attunement to the uncanny.

Influences eco-criticism, AI ethics, and speculative realism.

Sources: Timothy Morton, Hyperobjects (2013) and Dark Ecology (2016); Nick Bostrom, Superintelligence (2014); Eliezer Yudkowsky, "Artificial Intelligence as a Positive and Negative Factor in Global Risk" (2008)
