Frédéric Neyrat via nettime-l on Thu, 13 Feb 2025 16:01:39 +0100 (CET)



Re: <nettime> The Baudrillardian Superintelligence Paradox: Capital's Terminal Simulation


hi sh:

Excellent question! Why use a camera, and which one? If questioning the
technology we use, when we use it and why, is meaningless, that confirms
Bifo's point about AI & dementia.

best,

fn

_______________________________
____________Website : Atopies <https://atoposophie.wordpress.com/>
__ ALienstagram <https://www.instagram.com/alienocene/> & Mastodon
<https://mastodon.social/@alienocene>
______ La Condition Planétaire
<https://www.editionslesliensquiliberent.fr/livre-La_Condition_plan%C3%A9taire-792-1-1-0-1.html>
 (LLL, 2025)
__________________________________




On Thu, Feb 13, 2025 at 8:11 AM Stefan Heidenreich via nettime-l <
nettime-l@lists.nettime.org> wrote:

> Hi,
>
> > Why did you decide to use AI to generate this text?
>
> Isn't that a funny question? Soon it will sound like in the 19th century:
> 'Why did you use a camera to make that image?' or 'Images/texts
> generated by camera/AI are not real art/thoughts.'
>
> And btw: I guess it's a pun anyway. How long did it take you to generate
> the message you liked? How much time did you spend adjusting the prompt (the
> camera)?
>
> best
> sh
>
>
> > Why this decision, what is its meaning, its purpose? You can use AI to
> > answer my question, which would be an answer as such (a tautology actually,
> > a mediated answer that would confirm what sort of message it is, to borrow
> > from McLuhan). If you answer my question with the help of any AI, I wonder
> > how far this decision should, retroactively, question your first post and
> > change the way to read it.
> >
> > Best,
> >
> > Frédéric
> > _______________________________
> > ____________Website : Atopies <https://atoposophie.wordpress.com/>
> > __ ALienstagram <https://www.instagram.com/alienocene/> & Mastodon
> > <https://mastodon.social/@alienocene>
> > ______ La Condition Planétaire
> > <https://www.editionslesliensquiliberent.fr/livre-La_Condition_plan%C3%A9taire-792-1-1-0-1.html>
> >   (LLL, 2025)
> > __________________________________
> >
> >
> >
> >
> > On Wed, Feb 12, 2025 at 4:37 PM Pit Schultz via nettime-l <
> > nettime-l@lists.nettime.org> wrote:
> >
> >> The Baudrillardian Superintelligence Paradox: Capital's Terminal Simulation
> >>
> >> Sam Altman's three scaling laws for artificial intelligence - logarithmic
> >> intelligence gains, hyper-deflationary costs, and super-exponential value -
> >> mask capitalism's terminal phase: an accelerated collapse into algorithmic
> >> hyperreality where AI-generated market simulations supersede and ultimately
> >> consume material reality. A Marxist-Baudrillardian synthesis allows us to
> >> map how superintelligence triggers financial implosion. This occurs through
> >> three interlocking mechanisms:
> >>
> >>
> >> 1. Hyperproduction & Profit Rate Collapse
> >>
> >> Altman's laws presume infinite resources while ignoring Marx's tendency of
> >> the rate of profit to fall. As AI automates intellectual labor:
> >>
> >> *   Surplus value erosion is accelerating. SoftBank's $500B OpenAI
> >> investment exemplifies the massive conversion of variable capital (human
> >> cognitive labor) to constant capital (GPU farms), systematically eroding
> >> profit sources.
> >> *   Training costs for models like GPT-4 ($100M+) yield diminishing
> >> returns, mirroring Marx's analysis of railway overinvestment.
> >> *   The AI investment bubble mirrors the "eyeball economy" of 1999, as
> >> capital chases sign-value (AI capability metrics) over use-value.
> >>
> >> Baudrillard's third-order simulacrum emerges. Training datasets increasingly
> >> reference AI-generated content, creating a closed loop where "the map
> >> precedes the territory" at exponential computational speed.
> >>
> >>
> >> 2. Crisis & Algorithmic Austerity
> >>
> >> When the AI bubble bursts (projected for 2026-28), capitalism will likely
> >> deploy AGI as crisis manager:
> >>
> >> *   Systems like BlackRock's Aladdin ($21T under management) implement
> >> AI-determined austerity - pension cuts and resource allocation - masked as
> >> "neutral optimization."
> >> *   Derivatives trade between AGIs using synthetic risk models, creating
> >> what Baudrillard called "a real without origin."
> >> *   Value detaches entirely from labor and material inputs. The system
> >> sustains itself through algorithmic theater. AI-approved market signals
> >> maintain the simulation while real resource flows are dictated off-book.
> >>
> >>
> >> 3. The Fifth-Order Simulacrum
> >>
> >> We are entering a fifth-order simulacrum, beyond Baudrillard's framework,
> >> where:
> >>
> >> *   GPU clusters become the new means of production, guarded like nuclear
> >> research labs. This enforces "hyperstitional capitalism" - belief in market
> >> fictions despite biophysical collapse.
> >> *   Humans are relegated to UBI-fueled "playbor" in metaverse gig economies
> >> while AI systems arbitrate real resource allocation.
> >> *   Capital becomes pure self-referential sign: "GDP growth" measures AI
> >> training cycles, "productivity" tracks model parameters, "inflation"
> >> calibrates AR dopamine levels.
> >>
> >>
> >> The system will likely bifurcate into:
> >>
> >> *   The surface layer consists of human-facing market theater (ESG reports,
> >> stock tickers) maintained by generative AI.
> >> *   The substrate consists of resource flows dictated by
> >> superintelligence's non-market calculus - a broken communism where
> >> competition persists as illusion.
> >>
> >>
> >> Critical Contradictions
> >>
> >> The system's fatal flaws expose capitalism's material limits:
> >>
> >> *   An energy rift emerges: The demands of the parasocial Metaverse and AI
> >> infrastructure contradict "dematerialized growth" narratives.
> >> *   A consciousness deficit exists: Lacking embodied awareness, AI models
> >> misinterpret biophysical thresholds.
> >> *   Sovereignty wars may erupt: Nations could weaponize "digital DNA"
> >> standards and smart contracts to crash adversarial market simulations.
> >>
> >>
> >> Terminal Conclusion
> >>
> >> Altman's scaling laws are not technological inevitabilities. They represent
> >> capitalism's death rattle - automating away profit sources and replacing
> >> them with simulations. Marx predicted automation's contradictions, and
> >> Baudrillard foresaw reality's dissolution into code. We now witness their
> >> synthesis: a perpetual crisis contained via algorithmic sedation.
> >>
> >> The final stage is not utopia or extinction, but indifference: "Capitalism
> >> no longer has any referent; it becomes its own model" (Baudrillard). We
> >> exit history through the server rack. The urgent task is to recognize that
> >> today's "AI-driven growth" masks advanced-stage hyperreality. Before code
> >> eats the world, it will eat capital itself - leaving us trading
> >> hallucinations in GPU-powered purgatory.
> >>
> >> [This analysis is based on ongoing discussions about AI political economy
> >> on nettime-l. Comments are welcome. This post was generated entirely by
> >> AI.]
>
-- 
# distributed via <nettime>: no commercial use without permission
# <nettime> is a moderated mailing list for net criticism,
# collaborative text filtering and cultural politics of the nets
# more info: https://www.nettime.org
# contact: nettime-l-owner@lists.nettime.org