Morlock Elloi on Thu, 28 Mar 2019 17:55:23 +0100 (CET)



Re: <nettime> rage against the machine


The basic issue is complexity crossing a threshold that humans cannot cross.

So far, at least over the last few thousand years, mental ability has been one of the key factors in individual 'success' (the other, likely more important one, being class and heritage). We appreciate smart people as much as rich ones. In the last few decades cognitive stratification has accelerated, as a class of smarter-than-average technicians was needed to tend to ever more complex computing machines.

Today it is obvious that the system of controlling capital/power, its Praetorian Guard of technicians, and the computing machinery itself practically rule the world. Yet we can still see smart people behind it, so at least we can map the new order onto familiar space, where smart people, whether evil, good, or just sociopathic, are at the helm.

What happens when machine complexity surpasses human cognition? Skynet aside, the most dire effect is that the smart Praetorian Guard becomes redundant. Instead, it will be the inbred, semi-retarded ruling oligarchy, some 30-40,000 families on the planet, that will have this miracle machinery in its lap, like a chimp that got hold of an unlimited supply of AK-47s. It's not going to be sophisticated; it's going to be ugly. The final disintermediation: heritage becomes the sole factor, and smartness is out.

These things, societies optimizing themselves out of existence, have happened before in different forms. Easter Island's rulers liked their statues so much that they depleted all natural resources building them, destroying the whole society in the process.

The chimp logic is dead simple. Theorizing and philosophizing about it is a total waste of time; all that does is buy them more time.




On 3/28/19, 08:38, tbyfield wrote:
That's why criticisms of the 'complexity' of increasingly automated and
autonomized vehicles are a dead end, or at least limited to two
dimensions. I liked it very much when you wrote that "the rise in
complexity in itself is not a bad thing"; and, similarly, giving up
autonomy is not in itself a *bad* thing. The question is where and how
we draw the lines around autonomy. The fact that some cars will fly
doesn't mean that every 'personal mobility device' — say, bicycles —
needs to be micromanaged by a faceless computational state run amok. Yet
that kind of massive,

#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mx.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: [email protected]
#  @nettime_bot tweets mail w/ sender unless #ANON is in Subject: