Inke Arns on Sun, 27 Dec 1998 16:37:47 +0100
Syndicate: (Fwd) Philip Zhai: VR Timetable
From: "Philip Zhai" <[email protected]>
Subject: VR Timetable
Date: Sat, 26 Dec 1998 19:43:35 EST

A Hypothetical Timetable of Virtual Reality in the Future

(Alert: this is not a prediction, but a special presentation of the concept of VR.)

STAGE I. FROM SYNTHESIS OR RE-PRODUCTION OF SENSATIONS TO IMMERSIVE EXPERIENCE IN CYBERSPACE

2000: 3D visual-display screens and 3D earphones are installed inside a helmet that is wirelessly connected to a supercomputer.

2002: Data gloves pick up signals of your hand movements, and tracking sensors pick up signals of your arm movements; these signals are sent to the same supercomputer, so that on your 3D screens you can see realistic images of your own arms, hands, and fingers. A primal cyberspace is formed.

2005: Haptic devices and data gloves are combined into one, so that when you see (the image of) your hand touching an object, the glove delivers stimuli in time and you also feel the touch; when you strike the object, you hear a sound as if it came from that object.

2015: Haptic/data gloves are expanded to cover the whole body, giving you a complete haptic/data bodysuit. Your whole body now appears in the 3D environment among other synthetic objects, which you can touch, hear, break, play with, etc.

2020: Motion-tracking devices are attached to your limbs, so that body movements such as walking and jumping need not take you anywhere in the actual world; instead, the resulting signals are sent to the computer and you see yourself walking, jumping, etc., in cyberspace, that is, in the virtual world.

2030: The image of your whole animated body interacts with all other images in cyberspace, and your five senses are fed with corresponding stimuli. As a result, fully coordinated sense perception is presented as coming from the 3D images in cyberspace. That is, the color, the sound, the smell, etc. are all perceived as coming from one or another 3D object you can see.
2040: Fundamental laws of physics are programmed into the infrastructure of the virtual world, so that virtual events manifest regularities similar to those in the actual world. Look ahead and you see an ocean; look behind and you see mountains; look ahead again and you see the same ocean, but now a passenger ship is approaching from afar ...

2045: Computational power has increased dramatically, so that problems such as time lag and rendering rates no longer prevent seamless interaction with objects in cyberspace. Our virtual experience is thus totally immersive, and cyberspace is totally isolated from physical space.

2050: Haptic/tactile recorders become available, so that using the bodysuit and other devices we can record, edit, and play back our sense of touch.

2060: The Internet is transformed into a cyberspace infrastructure, so that all Internet surfers can interact with each other by shaking hands, swapping gifts, etc. Lovers can embrace one another in cyberspace, regardless of their physical distance in the actual world.

2070: You can choose your own image in cyberspace and adjust the strength of each of the five senses as you like.

2090: Interaction in cyberspace through the Internet has become a major means of long-distance communication.

STAGE II. FROM SENSORY COMMUNICATION TO FUNCTIONAL TELEOPERATION

2100: Telecommunication is combined with robotic technology to carry out teleoperation, so that virtual reality goes beyond communication. Our interaction with synthetic objects in cyberspace will actually initiate and maintain physical processes in the actual world and perform tasks as we wish. When we interact with objects in the foundational part of the virtual world, the computer sends signals to robots in a remote place, and those robots interact with physical objects in the natural environment to perform industrial, agricultural, and other basic tasks for human subsistence and prosperity.
Humans immersed in cyberspace experience telepresence as if they were interacting with the physical objects in person, since the remote robots send back stimuli for the five senses as they encounter those objects.

2110: Teleoperation expands to allow interpersonal cooperation. Different robots, teleoperated by different persons immersed in cyberspace, act together to carry out complicated projects.

2130: Robotic human surrogates spread all over the earth, and any person immersed in cyberspace can teleoperate any one of them; you no longer have to travel physically to other places, yet you can be present in many remote places and interact with the physical objects or human subjects there.

2150: Robots of different sizes (consider also applications of nanotechnology) and different strengths can be chosen by the person immersed in cyberspace so that, coordinated with visual amplification or reduction (think of images from a satellite or from a microscope), he or she can teleoperate very large and very small objects with ease: picking up (in the virtual world) an airplane (in the actual world) with two fingers, or walking through a blood vessel.

2180: Most human activities now take place in the virtual world. We maintain our basic economy by teleoperation in the foundational part of VR, and carry out artistic creation in the expansive part of VR.

STAGE III. FROM TECHNOLOGICAL INNOVATION TO NEW GENESIS

2200: Robotic human surrogates are shipped into outer space, and humans visit, explore, and colonize other heavenly bodies by telepresence and teleoperation, thereby possibly communicating or cooperating with other intelligent beings.

2250: The human reproductive process is also carried out by cybersex and teleoperation, and children grow up in the virtual world; telepresence and teleoperation thus become their default way of life.
2350: Our descendants living in the virtual world regard the years before 2000 as pre-historical; they learn how we -- their ancestors -- used to live without telepresence and teleoperation only by reading history books.

2400: Our later descendants begin to create their own second-level virtual reality.

1998: But how do we know that we are not already living in virtual reality and doing telepresence and teleoperation right now, in 1998, before 1998, or since the very beginning? What is the nature of what we call "material objects," "physical space," or "geographical distance"? What is the nature of the human mind and consciousness, and what ethical implications does the "new genesis" have for humanity?

In my second book, "Get Real: A Philosophical Adventure in Virtual Reality," I demonstrate why and how there are no ontological differences between virtual reality and actual reality, or between cyberspace and physical space. I argue that 1) whatever reasons we have for justifying the materiality of the actual world are equally valid or invalid for justifying the materiality of the virtual world; 2) whatever reasons we have for calling the perceived objects in the virtual world illusory are equally applicable or inapplicable to those in the actual world; 3) whatever functions we need to perform in the actual world for our survival and prosperity we can also perform in the virtual world.

Author Philip Zhai's WebPage: http://www.geocities.com/athens/3328
What is available now (buying guide): http://www.cs.jhu.edu/~feldberg/vr/vrbg.html