Ana Viseu on Mon, 10 Apr 2000 08:01:21 +0200 (CEST)




A report of "Computers, Freedom and Privacy 2000"

"Challenging the assumptions" was the theme of this year's 'Computers,
Freedom and Privacy' conference (April 4-7). The tenth edition of CFP was
held in Toronto and, as usual, it brought together professionals from a
broad range of fields: computer scientists, lawyers, businesspeople,
journalists, academics, NGO representatives and students.

Reversing the chronological order of things, I will start my review with the
last session, entitled "Ten Years of CFP: Looking back, looking forward",
because it condensed and made visible two themes that underlay the feeling
of this conference. The first has to do with a shift from 'whether the Net
will be regulated', a concern which prevailed in the first editions of CFP,
to 'who will regulate it', the question that dominated this year's
conference. Adding to this point, Simon Davies (Privacy International) spoke
of a struggle between 'us' (computer scientists, privacy advocates, etc.) and
'them' (business and government). The second has to do with the delicate
balance between concepts such as freedom and privacy: how can we make them
work together, and if they cannot, which should prevail?

These issues were dealt with in a variety of contexts, ranging from the
'domain name system' to children's rights, intellectual property,
surveillance and technological determinism, amongst others [1].

Although the 'domain name system' and its regulation was a hot topic, after
debating it for almost two mornings the only conclusion one can arrive at is
that it is a dead topic. On one hand, nobody seems to applaud the ICANN
initiative, for it resembles a political instrument too closely; on the
other, nobody can offer feasible alternatives [2]. Jerry Berman (Center for
Democracy and Technology) summarized this position well when he said that
ICANN should be concerned only with the management of domain names. The
rest, he said, should be left to government and to the different
organizations that lobby there. It goes without saying that the
above-mentioned government is the American one, and that the 'lobby groups'
are also American.

Another hot topic, and the one that created the most heated discussion, was
that of 'net filters' and children's rights. With two defenders and two
opponents, the panel on "Views of the Bertelsmann Foundation's
self-regulation of internet content proposal" was the best place to see how
inextricably the concepts of freedom and privacy are entangled. The
proposal's supporters argued for the need to 'protect our children', for the
self-regulating aspect of the proposal, and for the innocuous character of
labeling devices. The opponents replied with the real danger of
institutional use of these filtering systems as mechanisms of control (what
happens when a website falsely rates itself?), and with the global
homogeneity of the filtering systems themselves; there is, for example, no
filtering system that has a category for media monopolies. Christopher
Hunter (Annenberg School of Communication) brought up the danger of pushing
idiosyncratic speech to the 'no-man's land' of the web and the subsequent
homogenization of content. As it is, he said, 80% of the traffic goes to 5%
of the sites.

An issue that the panelists did not question, but that is of the greatest
importance, is that of giving unlimited power to parents to decide for
their children. In countries such as the U.S. and Canada, where a 'zero
tolerance' policy is already in place in schools, it is urgent to consider
whether implementing filtering systems at home will not lead to the creation
of children who are unable to deal with any situation that falls beyond the
lines delineated by others and who lack the capacity for self-critical
thinking. Besides, why assume that the Net is more powerful than any other
medium in perverting our children, and that there is therefore a need for
strict regulation?

On the theme of surveillance, Duncan Campbell gave an excellent report on
ECHELON. Campbell started off with a bit of history and argued that, despite
widespread belief, ECHELON was not born of the Cold War. In fact, he said,
the USSR never had a system like it or the ability to create one. The fact
of the matter is that ECHELON is a product of our own Western society; it is
designed to monitor global satellite communication (140 centers around the
world) and it does so automatically, with 80% of what is intercepted sent
directly to the U.S. Its enemies are not single individual users who write
'dangerous' keywords in their email messages; rather, its enemies are
hackers, NGOs, single-issue lobby groups, et cetera. Campbell argued that
current movements in favour of stronger security laws, such as the banning
of anonymous web hosting in France, are used to increase surveillance.

Questions related to intellectual property (IP) and the adjacent legal
systems were very prominent at this conference. Apart from the usual legal
discussions, there were two ways of approaching the issue that I believe are
helpful for understanding the broader social aspects of the enforcement of
IP laws. The first was brought up by Jessica Litman, a professor at Wayne
State University. Litman highlighted the dangers of applying the traditional
IP model to the new digital context. Discussing specifically the issue of
piracy, Litman stated that the current IP model posits a direct correlation
between strong copyright protection and the number of works produced; that
is, it implies that the more 'protection', the more 'production'. Using this
kind of metaphor, its proponents have managed to convince people that any
practice whose results are the same as those of piracy is indeed piracy.
Litman argued that in order to change this situation we need to start using
new metaphors that reflect a new reality, and to do that we have to come up
with a new vocabulary to replace the current one. Thus, rather than using
heavily loaded words such as piracy or cybersquatting, we should use terms
that are neutral in the eyes (and ears) of the majority of people.

The second point, which I think is important to mention, was brought up by
Randall Davis (MIT). Davis affirmed that new technologies change our
relationship to information. To exemplify these changes, Davis mentioned
what happens to libraries when their subscription to an online journal ends:
the library no longer possesses the previous issues, which were available
only while the bond between the two institutions lasted. Thus, Davis argues,
information becomes more an experience than an artifact.

It is then possible to take this argument a step beyond the immateriality of
information and note that the experience of information is based on a
relationship: information no longer resides in you or in me but in our
connection, and this, I believe, is crucial to understanding the so-called
'new digital society'.

The last point that I want to mention is the discussion regarding the
non-neutrality of technology. Although many speakers addressed this issue, I
will focus exclusively on Steve Talbott. Talbott, the publisher of the
NetFuture newsletter, argued for the need to look beyond immediate
technological use, that is, to start by thinking about our (human) needs and
concerns and only then think about the technology. If you do not understand
how things are connected, he argued, then solutions become the cause of
problems. Talbott argued that throughout our struggle for progress we seem
to have lost track of our initial goals and purposes, and technological
advancement became, in itself, a goal or even the goal. For example, we
first wire up all the schools in the nation and only then think about how to
use this technology. Or, we introduce notions of efficiency into realms,
such as the workplace, that traditionally had far less numerical and
statistical character. Our freedom, says Talbott, resides in the capacity to
think in larger terms, to leave behind the immediate and think about the
future while keeping in mind our humanity.

Much more could be said about this conference, but as I finish I just want
to mention one last problematic issue: diversity. This issue is double-sided
for, on one hand, this conference has the great merit of being diverse both
in the range of issues dealt with and in the spectrum of fields represented.
The presence of specialists from both the private and governmental sectors,
of theorists and practitioners, of lawyers and journalists, et cetera, is a
characteristic that many other conferences might envy. But, by the same
token, this conference lacks diversity in attendees and realities. Most of
the sessions dealt exclusively with a very North American (if not simply
American) reality which does not apply to most of the world. As an attendee
from Spain put it, "in Spain we deal with much more basic and profound
problems than the ones dealt with here". Also, the attendees were almost
exclusively white and largely male.

But, personally, what bothered me the most was the widespread tendency to
say 'consumers' or 'little guys' when referring to people. In a conference
whose aim is to deal with privacy and freedom issues and to make these
concepts part of public awareness, it strikes me that confining them to the
realm of 'consumers' is not the solution. Rather, we should see these
concepts as part of what makes us human, as a right that everyone should and
must have.

[1] Wired News yesterday published a summary of many of the conference's
panels <http://www.wired.com/news/politics/0,1283,35519,00.html>. See also
USA Today <http://www.usatoday.com/life/cyber/tech/cth671.htm>

[2] A good, if impractical, solution was advanced by Simson Garfinkel, who
proposed that we take all the meaning out of domain names, making them
similar to telephone numbers.
 
#  distributed via <nettime>: no commercial use without permission
#  <nettime> is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: [email protected] and "info nettime-l" in the msg body
#  archive: http://www.nettime.org contact: [email protected]