Search Results

64 images found


  • Myron Krueger jumps in front of a VideoPlace screen. Krueger designed this system to allow people to interface directly with computers. The operator stands in front of this large, backlit screen. A video camera forms an image of the operator's silhouette - the computer then interprets different poses or actions as different commands. The results are displayed on an equally large video screen, the image of the operator being manipulated in response to the commands. Krueger was the first to use the term 'artificial reality' for this concept. Model released. (1990)
    USA_SCI_VR_19_xs.jpg
  • When the Three Mile Island reactor in Pennsylvania (no steam rising from the abandoned cooling towers on the left) failed catastrophically in 1979, the intense radioactivity in the plant prevented its owners from surveying and repairing the damage. Four years later, with conditions still unknown, Carnegie Mellon engineer William L. "Red" Whittaker designed several remote-controlled robots that were able to venture into the radioactive plant. From the book Robo sapiens: Evolution of a New Species, page 140.
    USA_rs_477_qxxs.jpg
  • Borrowing from Star Wars, engineers at NASA's Ames Research Center, just south of San Francisco, CA, are developing a personal assistant robot that can hover over an astronaut's shoulder in space, or work at the direction of an astronaut in situations too dangerous for a human. Floating weightlessly, the machine could have many uses: patrolling corridors for gas leaks, reminding astronauts about the tasks on their to-do lists, or serving as a communication link when people are busy using both hands. From the book Robo sapiens: Evolution of a New Species, page 124.
    USA_rs_411_qxxs.jpg
  • Sewer inspection robot. Kurt I, a sewer inspection robot prototype. Here, the robot is moving through a simulated sewer at a German government-owned research and development centre. Unlike its predecessors, Kurt I and its successor, Kurt II, are cable-less, autonomous robots with their own power supply and piloting system. Kurt uses two low-powered lasers (upper centre) to beam a grid (red, lower centre) into its path. When the gridlines curve, indicating a bend or intersection in the pipe, the robot matches the curves against a digital map in its computer. It will then pilot itself to its destination. Photographed in Bonn, Germany.
    Ger_rs_40_xs.jpg
  • Ian Horswill and Genghis at the M.I.T. Insect Robot Lab in Cambridge, Massachusetts.
    Usa_sci_ir_32_nxs.jpg
  • Kurt I, a 32-cm-long robot, crawls through a simulated sewer network on the grounds of the Gesellschaft für Mathematik und Datenverarbeitung-Forschungszentrum Informationstechnik GmbH (GMD), a government-owned R&D center outside Bonn, Germany. Every ten years, Germany's 400,000 kilometers of sewers must be inspected, at a cost of $9 per meter. Today, vehicles tethered to long data cables explore remote parts of the system. Because the cables restrict the vehicle's mobility and range, GMD engineers have built Kurt I, which crawls through sewers itself. To pilot itself, the robot (or, rather, its successor model, Kurt II) will use two low-power lasers to beam a checkerboardlike grid into its path. When the gridlines curve, indicating a bend or intersection in the pipe ahead, Kurt II will match the curves against a digital map in its "brain" and pilot itself to its destination. From the book Robo sapiens: Evolution of a New Species, page 194.
    GER_rs_6_qxxs.jpg
  • Dr. Paul MacCready, inventor and chairman of AeroVironment Inc., holds a see-through model of Black Widow: an MAV (Micro Air Vehicle) recently developed by AeroVironment for DARPA (Defense Advanced Research Projects Agency). MacCready and his team of designers and engineers were able to accomplish the government's objective for an MAV. The Black Widow would likely serve surveillance purposes for the military, but there are other applications as well, such as air quality testing and police assistance. Robo sapiens Project.
    Usa_rs_584_xs.jpg
  • USA_091029_018_x.jpg
  • Virtual reality. Cyberspace racquetball game: real strokes made by Christopher Allis, the player, are returned by the Cyberspace computer through the virtual, computer-generated environment displayed on the monitor. Admission to this virtual squash court is provided by 3-D video goggles, a magnetic sensor & optical fiber sensors woven into a black rubber glove. The headset sensor transmits data to the computer on the player's position in space, whilst the data glove connects real hand movements to the virtual racquet court. Photo taken at AutoDesk Inc., Sausalito, California. Model Released (1990)
    USA_SCI_VR_27_xs.jpg
  • Virtual reality: Michael McGreevy, PhD, in front of a pair of video images of the Valles Marineris of the planet Mars, computer-generated from data provided by the Viking spacecraft at NASA's Ames Research Centre, California. Sophisticated computers & sensors provide the user with a telepresence in the virtual world, through small video screens mounted in goggles on a headset, whilst a spherical joystick controls movement through the virtual landscape. One future Martian application of this system might be in gathering geological samples by remote control using a rover robot. A sensor in the geologist's headset could direct the robot at specific sample targets. Model Released (1990)
    USA_SCI_VR_35_xs.jpg
  • Virtual reality: fitting adjustments being made to a data suit (blue, center) by Lou Ellen Jones, Asif Emon and Bea Holster at VPL Research, Redwood City, California. VPL specializes in virtual or artificial reality systems, the production of computer-generated graphical environments that users may enter. Visual contact with such artificial worlds is provided by a headset equipped with 3-D goggles. A spatial sensor on the headset (to fix the user's position in space) and numerous optical fiber sensors woven into the data suit relay data back to the computer. The forerunner to the data suit is the data glove, which restricted the user's virtual interaction to hand gestures. Model Released (1990)
    USA_SCI_VR_34_xs.jpg
  • Virtual reality: data suit design. John Bumgarner at VPL Research Inc., Redwood City, California, discussing technical points relating to the design of the blue data suit being worn by Lou Ellen Jones on left. VPL produces virtual reality systems - computer generated graphical environments that a user may enter & interact with. Visual contact is provided by a headset equipped with 3-D goggles. A spatial sensor on the headset (to fix the user's position in space) and numerous optical fiber sensors woven into the data suit relay data back to the computer. The forerunner to the data suit is the data glove, which restricted the user's virtual interaction to hand gestures. Model Released (1990)
    USA_SCI_VR_33_xs.jpg
  • Virtual reality: Lewis Hitchner manipulates a pair of video images of the Valles Marineris of the planet Mars, computer-generated from data provided by the Viking spacecraft at NASA's Ames Research Centre, California. Sophisticated computers & sensors provide the user with a telepresence in the virtual world, through small video screens mounted in goggles on a headset, whilst a spherical joystick controls movement through the virtual landscape. One future Martian application of this system might be in gathering geological samples by remote control using a rover robot. A sensor in the geologist's headset could direct the robot at specific sample targets. Model Released (1990)
    USA_SCI_VR_17_xs.jpg
  • Medicine: VA (Veteran's Affairs) Hospital in Long Beach, California - Dr. K.G. Lehmann, surgeon, preparing to perform a cardiac catheterization (diagnostic heart catheterization). The catheter, about the same thickness as a fine fishing line, is passed into a vein in the patient's arm. The catheter is then fed through the blood vessels to the heart. The surgeon keeps track of the catheter's position using an x-ray video camera. A tiny pressure measuring device, a micro manometer, is at the end of the catheter, and is used to take blood pressure readings at both sides of a heart valve. This micro sensor device was made using the same technology as is used in the manufacture of silicon 'chips', allowing minute sensors to be built for such invasive diagnostic techniques. MODEL RELEASED (1990).
    USA_SCI_MED_08_xs.jpg
  • Medicine: VA (Veteran's Affairs) Hospital in Long Beach, California - Dr. K.G. Lehmann, surgeon, preparing to perform a cardiac catheterization. The catheter, about the same thickness as a fine fishing line, is passed into a vein in the patient's arm. The catheter is then fed through the blood vessels to the heart. The surgeon keeps track of the catheter's position using an x-ray video camera. A tiny pressure measuring device, a micro manometer, is at the end of the catheter, and is used to take blood pressure readings at both sides of a heart valve. This micro sensor device was made using the same technology as is used in the manufacture of silicon 'chips', allowing minute sensors to be built for such invasive diagnostic techniques. MODEL RELEASED (1990)
    USA_SCI_MED_07_xs.jpg
  • Medicine: VA (Veteran's Affairs) Hospital in Long Beach, California - Dr. K.G. Lehmann, surgeon, preparing to perform a cardiac catheterization (diagnostic heart catheterization). The catheter, about the same thickness as a fine fishing line, is passed into a vein in the patient's arm. The catheter is then fed through the blood vessels to the heart. The surgeon keeps track of the catheter's position using an x-ray video camera. A tiny pressure measuring device, a micro manometer, is at the end of the catheter, and is used to take blood pressure readings at both sides of a heart valve. This micro sensor device was made using the same technology as is used in the manufacture of silicon 'chips', allowing minute sensors to be built for such invasive diagnostic techniques. MODEL RELEASED (1990)
    USA_SCI_MED_06_xs.jpg
  • Medicine: VA (Veteran's Affairs) Hospital in Long Beach, California - Dr. K.G. Lehmann, surgeon, preparing to perform a cardiac catheterization. The catheter, about the same thickness as a fine fishing line, is passed into a vein in the patient's arm. The catheter is then fed through the blood vessels to the heart. The surgeon keeps track of the catheter's position using an x-ray video camera. A tiny pressure measuring device, a micro manometer, is at the end of the catheter, and is used to take blood pressure readings at both sides of a heart valve. This micro sensor device was made using the same technology as is used in the manufacture of silicon 'chips', allowing minute sensors to be built for such invasive diagnostic techniques. MODEL RELEASED (1990)
    USA_SCI_MED_05_xs.jpg
  • Virtual reality in undersea exploration: bench testing of an undersea tele-robotic arm, being developed for the U.S. Navy by the Center for Engineering Design at the University of Utah, Salt Lake City. This robot is designed to perform complex underwater tasks by remote manipulation from the surface. Underwater video cameras & other imaging systems relay information to a computer that produces a 3-D virtual image of the seabed. The operator is linked to this world through a headset equipped with 3-D goggles, & spatial sensor, and data gloves or other clothing that relay precision movements back through the computer to tools on the robot's limbs. (1990)
    USA_SCI_VR_40_xs.jpg
  • Virtual reality in undersea exploration: bench testing of an undersea tele-robotic arm, being developed for the U.S. Navy by the Center for Engineering Design at the University of Utah, Salt Lake City. This robot is designed to perform complex underwater tasks by remote manipulation from the surface. Underwater video cameras & other imaging systems relay information to a computer that produces a 3-D virtual image of the seabed. The operator is linked to this world through a headset equipped with 3-D goggles, & spatial sensor, and data gloves or other clothing that relay precision movements back through the computer to tools on the robot's limbs. (1990)
    USA_SCI_VR_39_xs.jpg
  • Virtual reality: Jim Chong wears a prototype (1st generation) headset. Virtual environments are generated by computer systems to allow users to interact with them in much the same way as they might with a real environment. The computer environments are displayed to their users using sophisticated graphics projected through small video monitors mounted on the headset. In addition, some headsets have a sensor which instructs the computer of the wearer's spatial aspect, that is, in 3-D. This particular model features displays with half-silvered mirrors that allow the user to see the computer image & look ahead. Model Released (1990)
    USA_SCI_VR_30_xs.jpg
  • Researcher Tim Leuth and surgeon Martin Klein with a medical robot called a "SurgiScope" at the Virchow Campus Clinic, Humboldt University, Berlin, Germany. The SurgiScope is an image-guided surgery support device comprising a robotic tool holder, advanced image-handling software and a position sensor. The robotic system can be used for surgical planning and intraoperative guidance.
    Ger_rs_229_xs.jpg
  • Virtual reality: Warren Robinett wears a prototype (1st generation) headset. Virtual environments are generated by computer systems to allow users to interact with them in much the same way as they might with a real environment. The computer environments are displayed to their users using sophisticated graphics projected through small video monitors mounted on the headset. In addition, some headsets have a sensor which instructs the computer of the wearer's spatial aspect, that is, in 3-D. This particular model features displays with half-silvered mirrors that allow the user to see the computer image & look ahead. Model Released (1990)
    USA_SCI_VR_14_xs.jpg
  • Micro Technology: Micromechanics: Light micrograph of the detector 'teeth' of a micro-resonator. This is a tiny mechanical resonating structure, made by the same silicon deposition process used in the manufacture of microcircuits. The 'teeth' seen here detect the motion of the resonator, the central buff-colored object. The dark vertical lines running above and below the resonator are the strands of silicon connecting the sensor to the resonant masses. The strands are only two microns thick, but at this scale silicon has a greater mechanical strength than steel. Micro-resonators have a variety of uses in detecting very small amplitude motions. [1989]
    USA_SCI_MICRO_14_xs.jpg
  • Nuclear Winter test fire: brown smoke rises from smoldering brush fires, deliberately started to study the potential climatic effects of a nuclear war. The nuclear winter theory predicts that smoke from fires burning after a nuclear war would block sunlight, causing a rapid drop in temperature that would trigger serious ecological disturbance. The test burn took place in December 1986 on 500 acres of brush in Lodi Canyon, Los Angeles. Dripping napalm from a helicopter ignited the fire. Ground-based temperature sensors were used to study soil erosion. Various airborne experiments included smoke sampling & high-altitude infrared imaging from a converted U-2 spy plane.
    USA_SCI_NUKE_21_xs.jpg
  • Virtual reality. Jamaea Commodore, wearing a virtual reality headset and data glove, appears immersed in a computer-generated world. Virtual reality headsets contain two screens in front of the eyes, both displaying a computer-generated environment such as a room or landscape. The screens show subtly different perspectives to create a 3-D effect. The headset responds to movements of the head, changing the view so that the user can look around. Sensors on the data glove track the hand, allowing the user to manipulate objects in the artificial world with a virtual hand that appears in front of them. Model Released (1990)
    USA_SCI_VR_28_xs.jpg
  • Scientists Richard Turco and Carl Sagan were on the scientific team that devised the concept of nuclear winter. Turco is seen here at the Nuclear Winter test fire, where a canyon outside Los Angeles was deliberately set on fire to study the potential climatic effects of a nuclear war. The nuclear winter theory predicts that smoke from fires burning after a nuclear war would block sunlight, causing a rapid drop in temperature that would trigger serious ecological disturbance. The test burn took place in December 1986 on 500 acres of brush in Lodi Canyon, Los Angeles. Dripping napalm from a helicopter ignited the fire. Ground-based temperature sensors were used to study soil erosion. Various airborne experiments included smoke sampling & high-altitude infrared imaging from a converted U-2 spy plane.
    USA_SCI_NUKE_25_xs.jpg
  • Nuclear Winter test fire: brown smoke rises from smoldering brush fires, deliberately started to study the potential climatic effects of a nuclear war. The nuclear winter theory predicts that smoke from fires burning after a nuclear war would block sunlight, causing a rapid drop in temperature that would trigger serious ecological disturbance. The test burn took place in December 1986 on 500 acres of brush in Lodi Canyon, Los Angeles. Dripping napalm from a helicopter ignited the fire. Ground-based temperature sensors were used to study soil erosion. Various airborne experiments included smoke sampling & high-altitude infrared imaging from a converted U-2 spy plane.
    USA_SCI_NUKE_22_xs.jpg
  • Virtual reality: Jaron Lanier, head of VPL Research of Redwood City, California, photographed surrounded by demonstration images of the virtual, non-real worlds that VPL have created. Fiber-optic sensors in the black rubber glove Lanier is wearing transmit a user's movements into the computer-generated virtual environment. A user's view of such a world is projected by the computer into 2 eyephones mounted on a headset. Model Released (1990)
    USA_SCI_VR_25_xs.jpg
  • Virtual reality: Jaron Lanier, head of VPL Research of Redwood City, California. Fiber-optic sensors in the black rubber glove Lanier is wearing transmit a user's movements into the computer-generated virtual environment. A user's view of such a world is projected by the computer into 2 eyephones mounted on a headset. Model Released (1990)
    USA_SCI_VR_23_xs.jpg
  • Virtual reality: Rich Holloway wears prototype headset which employs half-silvered mirrors to enable the user to view a projected image of a virtual environment (and thus exist in virtual reality) and also see in front of his nose. A virtual environment is one created by a computer. A person entering such an environment does so with the aid of such a headset, which displays virtual imagery. Tactile interaction with the environment may be made using a data glove, a Spandex garment wired with sensors, which relays movement of the hand & fingers to the virtual environment. Model Released (1990)
    USA_SCI_VR_13_xs.jpg
  • Virtual reality in air traffic control (ATC) systems. Bill Wiseman from the University of Washington Human Interface Technology Laboratory, Seattle, demonstrating how ATC might operate in the future. Optical fiber sensors in his black data glove & the pink-rimmed micro-laser scanner glasses connect the operator with a virtual, computer-generated, 3-D image of the airspace he is controlling. Through raising his gloved hand to touch an icon (projected image) of an approaching jet, he is placed in instant voice communication with the pilot. This photograph was taken with the cooperation of SEA/TAC international airport, Seattle. MODEL RELEASED. (1990)
    USA_SCI_VR_11_xs.jpg
  • Virtual reality in air traffic control (ATC) systems. Bill Wiseman from the University of Washington Human Interface Technology Laboratory, Seattle, demonstrating how ATC might operate in the future. Optical fiber sensors in his black data glove & the pink-rimmed micro-laser scanner glasses connect the operator with a virtual, computer-generated, 3-D image of the airspace he is controlling. Through raising his gloved hand to touch an icon (projected image) of an approaching jet, he is placed in instant voice communication with the pilot. This photograph was taken with the cooperation of SEA/TAC international airport, Seattle. MODEL RELEASED. (1990)
    USA_SCI_VR_09_xs.jpg
  • Virtual reality. Harry Marples, Computer Scientist, programming a system that will allow visitors a 3-D guided tour of a new building before it is even built. Plans for a proposed design are fed into a computer, which is capable of displaying them in sophisticated 3-D graphics. Thus the real building is presented by the computer as a virtual one. Visitors wearing special headsets fitted with video goggles and spatial sensors can move from room to room within the virtual space as if they were in the real world. Optical fibers woven into rubber data gloves provide a tactile dimension. Photo taken at the Computer Science Dept., University of North Carolina. Model Released (1990)
    USA_SCI_VR_07_xs.jpg
  • Virtual reality. Harry Marples, Computer Scientist, programming a system that will allow visitors a 3-D guided tour of a new building before it is even built. Plans for a proposed design are fed into a computer, which is capable of displaying them in sophisticated 3-D graphics. Thus the real building is presented by the computer as a virtual one. Visitors wearing special headsets fitted with video goggles and spatial sensors can move from room to room within the virtual space as if they were in the real world. Optical fibers woven into rubber data gloves provide a tactile dimension. Photo taken at the Computer Science Dept., University of North Carolina. Model Released (1990)
    USA_SCI_VR_05_xs.jpg
  • Micro Technology: Micromechanics: A processed silicon wafer containing hundreds of micromechanical pressure sensors. Tweezers are being used to remove faulty sensors, labeled by an automatic test device with a black dot of ink.
    USA_SCI_MICRO_11_xs.jpg
  • A rancher in Halfway, Oregon, Bob Goodman lost his arm below his elbow in a freak accident. Researchers at the University of Utah attached a myoelectric arm, which he controls by flexing the muscles in his arm that are still intact. Sensors on the inside of the prosthetic arm socket pick up the faint electrical signals from the muscles and amplify them to control the robot arm. In this way, Goodman can cook his dinner and do his chores, just as he did before the accident. From the book Robo sapiens: Evolution of a New Species, page 179 top.
    USA_rs_392_qxxs.jpg
  • Bob Goodman, a rancher in Halfway, Oregon, lost his arm in a freak accident. Researchers at the University of Utah gave him a myoelectric arm, which he controls by flexing the muscles in his arm that are still intact. Sensors on the inside of the prosthetic arm socket pick up the faint electrical signals from the muscles and amplify them to control the robot arm. In this way, Goodman can do most things as he did before his accident. Seen here cutting his meat while having lunch with his girlfriend at a café in Halfway, Oregon.
    USA_SCI_MEARM_393_xs.jpg
  • Bill Haeck of Rock Springs, Wyoming is an avid hunter who relies on his artificial myoelectric arm to continue his hobby after losing his arm in an accident.  Researchers at the University of Utah gave him a myoelectric arm, which he controls by flexing the muscles in his arm that are still intact. Sensors on the inside of the prosthetic arm socket pick up the faint electrical signals from the muscles and amplify them to control the robot arm. In this way, Haeck can do most things as he did before his accident but he often forgets to charge the battery. Seen here target shooting behind his house.
    USA_SCI_MEARM_08_xs.jpg
  • Bob Goodman, a rancher in Halfway, Oregon, lost his arm in a freak accident. Researchers at the University of Utah gave him a myoelectric arm, which he controls by flexing the muscles in his arm that are still intact. Sensors on the inside of the prosthetic arm socket pick up the faint electrical signals from the muscles and amplify them to control the robot arm. In this way, Goodman can do most things as he did before his accident. Here he is using a pitchfork to throw hay over the fence to his horses.
    USA_SCI_MEARM_03_xs.jpg
  • Bob Goodman, a rancher in Halfway, Oregon, lost his arm in a freak accident. Researchers at the University of Utah gave him a myoelectric arm, which he controls by flexing the muscles in his arm that are still intact. Sensors on the inside of the prosthetic arm socket pick up the faint electrical signals from the muscles and amplify them to control the robot arm. In this way, Goodman can do most things as he did before his accident. Here he is putting his arm on right after he wakes up and gets dressed in his bedroom.
    USA_SCI_MEARM_01_xs.jpg
  • Relaxing in his office at the Mechanical Engineering Lab in Tsukuba, Japan, Takanori Shibata pats a derivative product from his research: a robot cat named Tama. Shibata is a roboticist who studied with MIT robot guru Rodney Brooks before heading his own lab. Omron, a Japanese engineering company, applied Shibata's discoveries to produce Tama, a mechanical pet with sensors beneath its fur that react to sound and touch.  Omron says it has no plans as of yet to commercialize its robot cats. From the book Robo sapiens: Evolution of a New Species, page 227.
    Japan_JAP_rs_33_qxxs.jpg
  • Hanging from a network of cables, Brachiator III quickly swings from "branch" to "branch" like the long-armed ape it was modeled on. (Brachiator refers to "brachiation," moving by swinging from one hold to another.) The robot, which was built in the laboratory of Toshio Fukuda at Nagoya University (Japan), has no sensors on its body. Instead, it tracks its own movements with video cameras located about four meters away. Brightly colored balls attached to the machine help the cameras discern its position. Brachiator's computer, which is adjacent to the camera, takes in the video images of the machine's progress and uses this data to send instructions to the machine's arms and legs. From the book Robo sapiens: Evolution of a New Species, page 87.
    Japan_JAP_rs_272_qxxs.jpg
  • Nuclear Winter test fire: brush fires deliberately started to study the potential climatic effects of a nuclear war. The nuclear winter theory predicts that smoke from fires burning after a nuclear war would block sunlight, causing a rapid drop in temperature that would trigger serious ecological disturbance. The test burn took place in December 1986 on 500 acres of brush in Lodi Canyon, Los Angeles. Dripping napalm from a helicopter ignited the fire. Ground-based temperature sensors were used to study soil erosion. Various airborne experiments included smoke sampling & high-altitude infrared imaging from a converted U-2 spy plane.
    USA_SCI_NUKE_24_xs.jpg
  • Nuclear Winter test fire: fire crews rest while monitoring the brown smoke rising from smoldering brush fires, deliberately started to study the potential climatic effects of a nuclear war. The nuclear winter theory predicts that smoke from fires burning after a nuclear war would block sunlight, causing a rapid drop in temperature that would trigger serious ecological disturbance. The test burn took place in December 1986 on 500 acres of brush in Lodi Canyon, Los Angeles. Dripping napalm from a helicopter ignited the fire. Ground-based temperature sensors were used to study soil erosion. Various airborne experiments included smoke sampling & high-altitude infrared imaging from a converted U-2 spy plane.
    USA_SCI_NUKE_23_xs.jpg
  • Virtual or artificial reality. Alvar Green, CEO of Autodesk in 1990, playing Cyberspace, a sophisticated videogame designed by AutoDesk Inc., USA. The computer monitor displays an image of one of Cyberspace's virtual (non-real) environments - a room - into which the player enters by wearing a headset & data glove. Two video images of the environment are projected into the eyes, whilst physical interaction is achieved through spatial sensors in the headset & optical fibers woven into the black rubber data glove, which send data to the computer on the player's position & movements in space. Model Released (1990)
    USA_SCI_VR_26_xs.jpg
  • Virtual reality: Jaron Lanier, head of VPL Research of Redwood City, California, photographed surrounded by demonstration images of the virtual, non-real worlds that VPL have created. Fiber-optic sensors in the black rubber glove Lanier is wearing transmit a user's movements into the computer-generated virtual environment. A user's view of such a world is projected by the computer into 2 eyephones mounted on a headset (seen unworn at left, on top of the computer monitor). Model Released (1990)
    USA_SCI_VR_24_xs.jpg
  • Virtual reality: Jaron Lanier, head of VPL Research of Redwood City, California. Fiber-optic sensors in the black rubber glove Lanier is wearing transmit a user's movements into the computer-generated virtual environment. A user's view of such a world is projected by the computer into 2 eyephones mounted on a headset. Model Released (1990)
    USA_SCI_VR_22_xs.jpg
  • Virtual reality: Jaron Lanier, head of VPL Research of Redwood City, California, photographed surrounded by demonstration images of the virtual, non-real worlds that VPL have created. Fiber-optic sensors in the black rubber glove Lanier is wearing transmit a user's movements into the computer-generated virtual environment. A user's view of such a world is projected by the computer into 2 eyephones mounted on a headset (seen unworn at left, on top of the computer monitor). Model Released (1990)
    USA_SCI_VR_21_xs.jpg
  • Virtual reality in air traffic control (ATC) systems. Bill Wiseman from the University of Washington Human Interface Technology Laboratory, Seattle, demonstrating how ATC might operate in the future. Optical fiber sensors in his black data glove & the pink-rimmed micro-laser scanner glasses connect the operator with a virtual, computer-generated, 3-D image of the airspace he is controlling. Through raising his gloved hand to touch an icon (projected image) of an approaching jet, he is placed in instant voice communication with the pilot. This photograph was taken with the cooperation of SEA/TAC international airport, Seattle. MODEL RELEASED. (1990)
    USA_SCI_VR_10_xs.jpg
  • Applications of virtual reality systems in medical education. Here, Scott Delp and Scott Fisher are using a system developed at NASA's Ames Research Centre in Menlo Park, California, to study the anatomy of the human leg. They both wear a headset equipped with 3-D video displays to view the computer-generated graphical images - one is shown between the two doctors. Physical exploration of the leg anatomy is afforded by using the data glove, a black rubber glove with woven optical fiber sensors, which relays data on their physical hand movements back to the computer. Model Released (1990)
    USA_SCI_VR_06_xs.jpg
  • Micro Technology: Micromechanics: Scanning electron micrograph (SEM) of a mite (Acari: Metaseiulus occidentalis) on the surface of a silicon micro-resonator 'chip'. The micro-resonator, or 'semaphore structure', is a product of micromechanics. Micro-resonators are used to make tiny vibration sensors for engineering use. The comb-like detector ends of the micro-resonators are seen here; a thin strand of silicon running from the left detector toward top left is attached to a large resonant mass. The absence of a resonant mass fixed to the right detector indicates a fault in manufacture. To give an idea of scale, the silicon strand is 2 microns thick and 2 microns wide. Reid Brennan's semaphore structure with mite. [1990]
    USA_SCI_MICRO_15_xs.jpg
  • Bob Goodman, a rancher in Halfway, Oregon, lost his arm below the elbow in a freak accident. Researchers at the University of Utah fitted him with a myoelectric arm, which he controls by flexing the muscles in his arm that are still intact. Sensors on the inside of the prosthetic arm socket pick up the faint electrical signals from the muscles and amplify them to control the robot arm. In this way, Goodman can cook his dinner and do his chores, just as he did before the accident. From the book Robo sapiens: Evolution of a New Species, page 179 bottom.
    USA_rs_394_qxxs.jpg
  • Sweet Lips the robot guide takes visitors through the Hall of North American Wildlife, near the Dinosaur Hall in the Carnegie Museum of Natural History in Pittsburgh, PA. Carnegie Mellon University robotics professor Illah R. Nourbakhsh's creation draws children like a pied piper by speaking and playing informational videos on its screen. It navigates autonomously, using a locator system that detects colored squares mounted high on the wall. A color camera and scores of sonar, infrared, and touch sensors prevent Sweet Lips from crashing into museum displays or museum visitors. From the book Robo sapiens: Evolution of a New Species, page 220-221.
    USA_rs_104_qxxs.jpg
  • Bill Haeck of Rock Springs, Wyoming, is an avid hunter who relies on his myoelectric arm to continue his hobby after losing his arm in an accident. Researchers at the University of Utah gave him the myoelectric arm, which he controls by flexing the muscles in his arm that are still intact. Sensors on the inside of the prosthetic arm socket pick up the faint electrical signals from the muscles and amplify them to control the robot arm. In this way, Haeck can do most things as he did before his accident, but he often forgets to charge the battery. He is seen here target shooting behind his house.
    USA_SCI_MEARM_09_xs.jpg
  • Bob Goodman, a rancher in Halfway, Oregon, lost his arm in a freak accident. Researchers at the University of Utah gave him a myoelectric arm, which he controls by flexing the muscles in his arm that are still intact. Sensors on the inside of the prosthetic arm socket pick up the faint electrical signals from the muscles and amplify them to control the robot arm. In this way, Goodman can do most things as he did before his accident. Here he is arm-wrestling with a neighbor in a local bar called the Sportsman's Club, showing off the strength of his electric arm motor. (Actually, the arm has no lateral force, only frontal, but the hand does have more gripping power than a normal hand.)
    USA_SCI_MEARM_07_xs.jpg
  • Bob Goodman, a rancher in Halfway, Oregon, lost his arm in a freak accident. Researchers at the University of Utah gave him a myoelectric arm, which he controls by flexing the muscles in his arm that are still intact. Sensors on the inside of the prosthetic arm socket pick up the faint electrical signals from the muscles and amplify them to control the robot arm. In this way, Goodman can do most things as he did before his accident.
    USA_SCI_MEARM_05_xs.jpg
  • Bob Goodman, a rancher in Halfway, Oregon, lost his arm in a freak accident. Researchers at the University of Utah gave him a myoelectric arm, which he controls by flexing the muscles in his arm that are still intact. Sensors on the inside of the prosthetic arm socket pick up the faint electrical signals from the muscles and amplify them to control the robot arm. In this way, Goodman can do most things as he did before his accident. Here he is using a drill press in the workshop in his barn.
    USA_SCI_MEARM_04_xs.jpg
  • Bob Goodman, a rancher in Halfway, Oregon, lost his arm in a freak accident. Researchers at the University of Utah gave him a myoelectric arm, which he controls by flexing the muscles in his arm that are still intact. Sensors on the inside of the prosthetic arm socket pick up the faint electrical signals from the muscles and amplify them to control the robot arm. In this way, Goodman can do most things as he did before his accident.
    USA_SCI_MEARM_02_xs.jpg
  • At the Tsukuba Mechanical Engineering Lab (MEL), Japan, a robotic hand gently grips an orange. The robotic hand is equipped with tactile sensors in the fingertips that transmit a signal back to the operator. It was designed by Hitoshi Maekawa, Ph.D., a researcher in the cybernetics division of the Department of Robotics at Tsukuba MEL. Over the last eight years, Maekawa has developed a robotic hand with tactile sensors that can hold items in its fingertips and compensate for slippage. His research is into dynamic grasping-force control for a multi-fingered hand. (A paper on the project was presented at the IEEE International Conference on Robotics and Automation, 1996. Work is ongoing.)
    Japan_Jap_rs_32A_120_xs.jpg
  • In an oddly ghoulish bit of dental R&D, Waseda University engineers have built a "jaw-robot" from a skull, some electronic circuitry, and an assembly of pulleys, wheels, and cables that act like muscle. Sensors measure the biting action of the jaw and the force of the chewing. Japan. From the book Robo sapiens: Evolution of a New Species, page 173.
    Japan_JAP_rs_41_qxxs.jpg
  • Sucking up ashes in a London living room, the RoboVac, shown here in a photo-illustration, shuttles randomly around the area, vacuuming everything in its path. Built by Kärcher, a German appliance company, the RoboVac monitors the level of dirt in the stream of incoming air with optical sensors; that is, it detects when an area especially needs cleaning. When the RoboVac hits a grimy spot, the machine passes back and forth over it until the incoming air is clean, and so too, presumably, is the floor. London, UK. From the book Robo sapiens: Evolution of a New Species, page 164-165.
    GBR_rs_8_qxxs.jpg
  • Even someone who believes that in the future most humans will become the slaves of all-powerful machines has to have a laugh sometimes. Why not have it with toy machines? Taking a moment off from his work at the cybernetics department at the University of Reading in the UK, Kevin Warwick (on left), author of March of the Machines: Why the New Race of Robots Will Rule the World, plays with Lego Mindstorm robots that his students have programmed to box with each other. The toys are wildly popular with engineers and computer scientists because they can be programmed to perform an amazing variety of tasks. In this game, sensors on the toys determine which machine has been hit the most. In his more serious work, Warwick is now trying to record his neural signals on a computer and replay them into his nervous system. From the book Robo sapiens: Evolution of a New Species, page 222-223.
    GBR_rs_2_qxxs.jpg
  • Virtual reality in air traffic control (ATC) systems. Bill Wiseman from the University of Washington Human Interface Technology Laboratory, Seattle, demonstrating how ATC might operate in the future. Optical fiber sensors in his black data glove and the pink-rimmed micro-laser scanner glasses connect the operator with a virtual, computer-generated, 3-D image of the airspace he is controlling. By raising his gloved hand to touch an icon (projected image) of an approaching jet, he is placed in instant voice communication with the pilot. This photograph was taken with the cooperation of SEA/TAC International Airport, Seattle. MODEL RELEASED. (1990)
    USA_SCI_VR_12_xs.jpg
  • Virtual sex. Pornographic application of virtual reality, showing a man mauling his virtual conquest, conjured by his headset, data glove and an unseen computer system. Virtual, in computer parlance, describes equipment or programs that assume one form yet give the illusion of another. Here, the image of the woman is provided by the system through goggles in the headset; contact is effectively faked by fiber-optic sensors in the black rubber data glove, which relay information on the aspect and movement of the man's fingers. Photographed at Autodesk Inc., USA. MODEL RELEASED. (1990)
    USA_SCI_VR_08_xs.jpg

Peter Menzel Photography
