James Wallbank on Wed, 3 Apr 2019 11:53:24 +0200 (CEST)



Re: <nettime> Managing complexity?


Felix, this is the sort of post that social media conditions me to want to click "Like" but also to feel that it's an inadequate response.

I'd only add (or perhaps, draw out):

* "Managing" is the wrong way to think about maximising human welfare (or, indeed, achieving any defined objective) when interacting with complex systems.

* Perhaps "Surfing" is a better concept - dynamically balancing on roiling, turbulent, unknowable medium to plot a course at least approximately intentional. Some of the time.

* Digital networking ("the internet") is a connection machine. It takes elements and human activities and connects them, profligately, in ways foreseen and unforeseen, visible and invisible. (Who'd have thought that a geeky urge to purchase contraband anonymously would become intimately connected with melting icecaps? Thanks, Bitcoin!)

* It's this constant and accelerating process of cross-connection that makes current and future society a complex system, tending towards ever more complexity, and ever more unknowability. Forever. (Right up until THE EVENT, of course.)

All the best,

James
=====

On 01/04/2019 11:24, Felix Stalder wrote:

On 30.03.19 21:19, Brian Holmes wrote:
However, the surging sense of intellectual mastery brought by the
phrase, "managing complexity," declines precipitously when you try to
define either "management" or "complexity."

Complexity is relatively easy to define. As Joseph Rabie has already
pointed out, the complexity of a "system" is defined by the number of
actors and the number of ways in which they can interact with, and
adapt to, one another.

This, of course, leads to the question of how to determine the size of
the system. The first generation of cybernetics gave a different answer
to that question than the second, as Ted pointed out.

Prem's suggestion that we are dealing with polycentric systems is
certainly right and makes it both easier and harder to define the number
of actors that make them up. Easier in the sense that it puts the focus
on densities and rates of interaction (higher at the center, lower at
the periphery) rather than on precise, yet elusive boundaries. Harder in
the sense that it stresses that each system contains numerous such
centers, shifting the problem from drawing boundaries to deciding the
inclusion/exclusion of centers.

Be that as it may. Let's assume that the number of actors and the ways
of interacting have increased over the last, say, 70 years. More
important than the simple number of actors (which is hard to ascertain
anyway) is that the number of ways in which they interact has
increased, leading to an exponential, rather than linear, rise in
complexity.
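
A rough back-of-the-envelope sketch (the function and the values of n
and k are invented purely for illustration) makes the combinatorics
visible: adding actors grows the system linearly, but the possible
patterns of interaction among them grow much faster.

  # Possible interaction states among n actors, assuming each of the
  # n*(n-1)/2 pairs can be in one of k modes (n and k are arbitrary).
  def interaction_states(n, k):
      pairs = n * (n - 1) // 2    # number of possible pairwise links
      return k ** pairs           # configurations of the whole system

  for n in (3, 5, 10, 20):
      print(n, interaction_states(n, k=2))
  # 3 -> 8, 5 -> 1024, 10 -> ~3.5e13, 20 -> ~1.6e57:
  # adding actors multiplies, rather than adds to, the state space.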

In my view, there are a number of reasons for this.

* The chains of interaction have grown longer. Many (social and
ecological) systems that used to be relatively local phenomena have
become global ones (as a consequence of the expansion of capitalism as
globalization).

* The intensity of interaction has been increasing (as a consequence of
the intensification of capitalism), taking many systems away from
"steady states" and closer to the edge of "phase transitions" (to use
the terminology of complexity theory). In this process, these systems
become more and more non-linear, increasing the need to understand
their internal dynamics (e.g. who the actors are and how they are
interacting) while at the same time making them less predictable.

* The social institutions that have traditionally limited the ways of
interaction by providing and enforcing rules and norms have weakened,
further increasing the leeway for agency (which, of course, is not all
bad).

Not knowing where to draw boundaries, or which centers are relevant to
the understanding of the system, is part of the problem of not being
able to "manage" the many actors, their increasing range of
interactions, and the effects of those interactions. By "managing" I
initially meant simply the ability to track the actors that make up the
system and the ability to intervene in the system to move it towards
desired states. This is a somewhat technocratic view, I admit.

Joseph Weizenbaum argued in the 1970s that the computer was introduced
as an answer to the social pressures which large corporations and
government agencies faced. Rather than accept social change, the new
computing infrastructure put central management on a new footing. It
could now keep track of many more elements and react much faster to
changes in the environment by quickly reorganizing the relations of the
elements to one another. This was, basically, the shift from Fordism to
Post-Fordism and, by definition, an increase in complexity that came,
as it always does, at the price of a higher degree of abstraction as a
way of limiting that increase in complexity (fewer variables per
element are taken into account).

For similar reasons, I think, the shift towards markets and
quantitative signals (prices, rankings, indices, etc.) was so
successful. It made it possible to manage the increase in social
complexity by abstracting it away.
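
To make the point about abstraction concrete (a toy gloss with invented
attributes and numbers, not a model of any real market): a price
collapses the many variables that describe a thing into a single
comparable number, which is exactly what makes coordination tractable,
and exactly what gets discarded.

  # Each good is described by many variables, but the market "sees"
  # only the one-dimensional abstraction: the price.
  goods = {
      "timber": {"price": 80, "carbon_stored": 1.2, "habitat_value": 0.9},
      "steel":  {"price": 95, "carbon_stored": 0.0, "habitat_value": 0.0},
  }

  def market_view(goods):
      # abstraction: keep one variable per element, discard the rest
      return {name: attrs["price"] for name, attrs in goods.items()}

  print(market_view(goods))   # {'timber': 80, 'steel': 95}
  # Everything not priced drops out -- precisely the dimensions that,
  # as argued below, eventually reassert themselves.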

I think both systems (computers and markets) as ways of managing
complexity are reaching an upper limit, mainly because an
ever-increasing number of actors no longer conform to their
abstractions (by exhibiting dimensions that were deemed irrelevant in
the process of abstraction, or by not behaving according to the models,
etc.).

These are not problems of implementation or technical limits to be
overcome by progress, but fundamental limitations of these two systems
of abstraction/management.

Not everything can be expressed as a price. Even economists are now
arguing again about the difference between value and price. For
neo-liberals, the two are the same: the value of a thing is whatever
somebody is willing to pay for it, and therefore it can be neither too
high nor too low.


On 31.03.19 15:50, Prem Chandavarkar wrote:

AI systems do not sit well with consciousness, for AI makes its
decisions on the basis of statistical correlations derived from
computing power, and not on the basis of consciousness. AI systems run
into problems difficult to foresee or comprehend once the decision
process gets detached from sentient consciousness, especially when the
AI system encounters non-linear contexts.

Exactly! And this is a good indicator as to why the systemic crisis is
happening now.
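
Prem's point can be sketched in a few lines (a fabricated toy system
with made-up numbers, purely for illustration): a model that has
learned the statistical correlation of one regime will extrapolate it,
with full confidence, straight across a phase transition.

  import numpy as np

  # Fabricated "system": linear up to a threshold, then it flips.
  def system(x):
      return 2 * x if x < 10 else 2 * x - 50   # "phase transition" at x = 10

  # The model only ever sees the linear regime...
  train_x = np.arange(0, 10, 0.5)
  train_y = np.array([system(x) for x in train_x])
  slope, intercept = np.polyfit(train_x, train_y, 1)   # learns y = 2x

  # ...and extrapolates past the transition as if nothing had changed.
  for x in (5, 9, 12, 15):
      print(x, round(slope * x + intercept, 1), system(x))
  # At x = 12 the model predicts ~24.0 while the system yields -26:
  # the learned correlation holds right up to the point where it matters.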


Which leads back to the "management" question. Management, with its
bureaucratic/cybernetic control approach, is probably the wrong way to
think about this anyway.

Because this is already getting way too long, I'll simply paste more of
Prem's excellent bullets here, not least as a reminder to myself to
think more in this direction.

On 31.03.19 15:50, Prem Chandavarkar wrote:

‘Management’ and ‘complexity’ do not fit well in a polycentric system,
for management is an activity where one intervenes in order to control
output, and in a polycentric system, it is almost impossible to
ascertain with precision the impact of any intervention.

To live with complex systems we must allow them to be self-organising.
This is the argument made for free markets, falling back on Adam
Smith's metaphor of the ‘invisible hand’.

However, self-organising systems are emergent - they can
exhibit fundamental properties that did not exist at all in an
earlier state of the system.  As humans, we cannot be blind to what
properties may emerge, unless we say we have no ethical concerns at
all if the system throws up properties such as unfair and degrading
exploitation of others or ecological imbalances.

All the best. Felix

#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mx.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: [email protected]
#  @nettime_bot tweets mail w/ sender unless #ANON is in Subject: