Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!oitnews.harvard.edu!purdue!lerc.nasa.gov!magnus.acs.ohio-state.edu!math.ohio-state.edu!howland.reston.ans.net!news.sprintlink.net!simtel!harbinger.cc.monash.edu.au!yarrina.connect.com.au!labtam!labtam!chris
From: chris@labtam.labtam.oz.au (Chris Taylor)
Subject: Re: Terminator
Message-ID: <chris.803611017@labtam>
Organization: Labtam Australia Pty. Ltd., Melbourne, Australia
References: <stick-0306951605060001@user59.lightside.com> <DA2K7E.3q4@pts.mot.com>
Date: Tue, 20 Jun 1995 01:16:57 GMT
Lines: 36

>In article <stick-0306951605060001@user59.lightside.com>, stick@lightside.com (Joe) writes:
>> I heard that by killing off half of a fairly sizable 'net, you can
>> actually induce "creativity" and "self-awareness"... Great! All we need
>> now is a real world Terminator 2 situation. Personally I think that
>> research in NN technology should be stopped by the govt. or at least
>> severe restrictions should be placed on it.
>> 
>> Viva real brains

Actually it's REAL brains that you should be scared about - right now!

Even if a Terminator-style automaton arrives one day, it will only
emulate the same lack of conscience and savagery that has already been
conditioned into many human brains today.

Sure, you can give the automaton big weapons, but you can give
a human big weapons too. You can make the automaton extra dangerous by
conditioning it to 'fight to the death', but human soldiers have also
been conditioned that way in various instances.

Big automatons are little more scary than humans.

If you really want to get scared about something, worry about
LITTLE automatons.
Worry about the potential of nanotechnology to build little virus-like
robots, and the potential of genetic technology to engineer better
killer viruses.
Or just worry about the potential of germ warfare using viruses that exist now.

You are never going to stop the potential for 'nasty research'
by placing stamp-down controls on these sorts of things.

You can only discourage 'nasty research' (and thus assume it will go underground),
and you need to encourage a flood of 'nice research' so that hopefully we have
the technical capability to handle the nasty products of these sorts of
technologies.

