Monday, March 23, 2009

The MULE




This prototype of the Gladiator, built by Carnegie Mellon, is equipped with smoke generators and grenade launchers. Marine Corps officials say they will field-test the latest, 1-ton version this year. (Photograph by Craig Cameron Olsen)
Despite the challenges, armed UGV development is on the rise. Foster-Miller is currently working on a successor to the SWORDS: a larger, more versatile robot called the MAARS (Modular Advanced Armed Robotic System). Technicians in the field will be able to replace the system’s M240 machine gun (the same kind currently planned for the MULE) with an arm, or trade the tracks for wheels. The MAARS still requires a human operator to move and acquire targets, however.

iRobot, the maker of thousands of bomb-defusing PackBots, plans to introduce its Warrior X700 this year. The Warrior is larger than the PackBot, with a similar set of articulated tracks that allow it to climb stairs, and a 150-pound carrying capacity. The company is touting the Warrior’s ability to fight fires, haul wounded and serve as a weapons platform. But according to Joe Dyer, the president of iRobot’s Government & Industrial Robots division, a key benefit of an armed UGV isn’t what it can dish out, but what it can take: “A robot can shoot second.” The Warrior can follow GPS waypoints, breach ditches and navigate cramped conditions on its own, but it will still rely heavily on human guidance in a fight. Where weapons are involved, Dyer says, “Autonomy’s going to come into robots on little cat’s feet.”

Like their bomb-poking forebears, weaponized bots are disposable, making them particularly useful in urban warfare, with its high potential for collateral damage and sudden, point-blank firefights. “Robots are fearless, so there’s an opportunity to better assess the situation,” Dyer says. “That means less risk to noncombatants and to friendly forces.” In urban warfare, where troops often lose the high-tech edge, an armed ground robot is the perfect point man. “Send a robotic platform into a room, and it might take some small arms fire,” Shoop says. “But it can be repaired fairly easily. A soldier or Marine is not as easily repaired.”

The MULE is toying with my emotions.
After running through its full range of articulated positions—a hilarious diagnostic dance routine that has it pivoting, rising and tipping its wheels off the road—the robot is now ramming a car. The sedan offers little resistance, sliding across the asphalt. Like proud owners watching their pit bull tear through a chew toy, the small crowd of defense contractors and engineers are chuckling. Next, the MULE climbs onto a 5-ft.-high platform and prepares to cross a 6-ft. gap. The robot reels back on its hind legs. It inches forward and falls across the space, its front wheels slamming onto the next platform.

Although it was moved into position by a human operator, the robot’s terrain-clearing performance was automatic, driven by internal sensors that track wheel position and speed and by two onboard Pentium III processors cycling through an array of mobility algorithms. Despite being blind, the MULE is already surprisingly autonomous. The exact details haven’t been worked out, but the goal is for a single sergeant to handle multiple robots. No matter how sophisticated the robot becomes, Lockheed officials point out, it will never fire without a command from a human operator.

Having a person decide when to shoot is a recurring theme in discussions about armed robots. Maj. David Byers, the assistant program manager for the MULE, compares the likelihood of the robot’s weapons discharging accidentally to that of a modern tank inexplicably firing off a round. Using the UGV’s sensors, a human will confirm that each target is hostile before firing. “Armed robots are still foreign to Army culture,” he says. “We need to cultivate the understanding that they are quite safe.”

The demonstration is winding down, and after a slew of caveats and reassurances, it’s my turn to drive. On a grassy slope overlooking the track, an engineer hands me the Xbox 360 controller. I will not, I’m told, get to wear the shiny Rocketeer backpack.
The game controller is surprisingly standard-issue—no external tweaks or mods. When I hit a button, the prototype rumbles forward. I jam the thumbstick to one side, and the robot turns in place, its fresh wheel screeching, painting a perfect circle on the asphalt. I guide the MULE through a small parking lot, around cars, and across a muddy patch to give the tires a little more traction. The robot is responsive, literally leaning into turns and braking with finesse. My fingers keep hitting the unused buttons, automatically probing for the one that opens fire.

In the sci-fi cult classic The Last Starfighter, the teen hero is drafted into a galactic war. The arcade game he spent hours mastering was really an alien simulator, and with a quick costume change, he’s reborn as an ace pilot. For 10 minutes, my fantasy is much better: Years of Saturday afternoons and missed classes and so-called sick days spent clicking away at a game console—it wasn’t wasted time; it was training. I have become a crack military robot pilot.

Time’s up, and I hand back the controller, the prototype still rumbling away, slightly muddier than I found it. We head down the hill, and as I pass an engineer, I mention how easy it is to drive. “Yeah, we based the controls on Project Gotham Racing,” he says. It’s a joke, but the quip offers a glimpse of what future warfare might look like—robotic, autonomous and just a little bit chilling.
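The article never describes the actual control scheme, but the behavior above (a button for throttle, a thumbstick push that pivots the vehicle in place) matches the common "arcade" skid-steer mixing used by game-style drive interfaces. The function below is a purely hypothetical sketch of that mapping; the axis conventions and scaling are assumptions.

```python
# Hypothetical "arcade" skid-steer mixing: one stick axis is throttle,
# the other is turn. Full turn with zero throttle spins the vehicle in
# place, as described in the demo. Not based on Lockheed's software.

def arcade_mix(throttle, turn):
    """Map stick axes in [-1, 1] to (left, right) wheel commands."""
    left = throttle + turn
    right = throttle - turn
    scale = max(1.0, abs(left), abs(right))  # keep commands in [-1, 1]
    return left / scale, right / scale

straight = arcade_mix(1.0, 0.0)  # both sides full forward: (1.0, 1.0)
pivot = arcade_mix(0.0, 1.0)     # sides opposed: turn in place
```

With this mixing, saturating both axes still yields valid wheel commands, which is one reason the scheme feels forgiving to a first-time driver.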



Robot Versus Sniper


Securing an intersection is a basic combat task that the Pentagon hopes will one day be tackled by unattended robots. In the scenario below, set in 2020, an armed robot has forged ahead of a squad (not shown) to determine if a sniper is stationed at a key corner. As simple as the mission might seem, it’s a huge engineering challenge to program the skills needed for the assignment into a robot’s brain. Experts say success will require integrated sensors to double for human sensory organs and powerful processing of the data to mimic human training—and instinct. “What is intuition?” asks Jon Bornstein, head of the Army Research Lab’s robotics office. “A series of cues that give a high probability of something occurring.”
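Bornstein's definition of intuition as "a series of cues that give a high probability of something occurring" maps naturally onto Bayesian evidence combination. The sketch below is purely illustrative: the cue names and probabilities are invented, not drawn from any Army system.

```python
# Illustrative naive-Bayes combination of sniper "cues." Each cue is a
# pair (P(cue | sniper), P(cue | no sniper)); cues are treated as
# independent. All numbers are invented for illustration.

def combine_cues(prior, cues):
    """Update P(sniper) with independent cues via likelihood ratios."""
    odds = prior / (1.0 - prior)
    for p_s, p_ns in cues:
        odds *= p_s / p_ns  # multiply in this cue's likelihood ratio
    return odds / (1.0 + odds)

# Example cues: open window, scope glint, acoustic muzzle signature.
cues = [(0.7, 0.3), (0.4, 0.05), (0.9, 0.01)]
p = combine_cues(0.1, cues)  # weak prior, strong combined evidence
```

Individually weak cues multiply into near certainty, which is exactly the "high probability of something occurring" that Bornstein describes.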

America's Robot Army: Are Unmanned Fighters Ready for Combat?

At a muddy test track in Grand Prairie, Texas, 13 miles west of Dallas, the robot is winning. It has climbed on top of a sedan, its 2.5-ton bulk propped on the crumpled roof. The car never stood a chance.
The MULE (Multifunction Utility/Logistics and Equipment) is roughly the size of a Humvee, but it has a trick worthy of monster truck rallies. Each of its six wheels is mounted on an articulated leg, allowing the robot to clamber up obstacles that other cars would simply bump against. Right now, it’s slowly extricating itself from the caved-in roof, undulating slightly as it settles into a neutral stance on the asphalt. This prototype’s movements are precise, menacing and slow. When the final product rolls onto the battlefield in six years, it will clear obstacles in stride, advancing without hesitation. And, like the robot cars that raced through city streets in last fall’s Pentagon-funded DARPA Urban Challenge, the MULE will use sensors and GPS coordinates to pick its way through a battlefield. If a target is detected, the machine will calculate its own firing solutions and wait for a remote human operator to pull the trigger. The age of killer robots is upon us.

But here at defense contractor Lockheed Martin’s test track, during a demonstration for Popular Mechanics, this futuristic forerunner of the robot army has a flat tire. “Actually, this is good,” says Michael Norman, Lockheed’s project manager for the prototype. “You’ll be able to see how quick it is to swap in a new tire.” He nods toward an engineer holding an Xbox 360 controller and wearing a gigantic, gleaming backpack that contains a processing computer. The engineer taps a handheld touchscreen. One of the robot’s wheeled legs rotates upward, and a two-man crew goes to work.

Each leg has its own hub motor to allow for a variety of positions. If one leg is blown off by enemy fire or a roadside bomb, the rest are able to soldier on, with the robot automatically adjusting its center of gravity to stay mobile. It’s highly functional.
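The claim that the MULE "automatically adjusts its center of gravity" after losing a leg suggests a static-stability check: the remaining wheel contact points form a support polygon, and the vehicle stays upright while the ground projection of its center of gravity remains inside that polygon. The sketch below illustrates the idea under that assumption; the geometry and the check itself are invented, not Lockheed's algorithm.

```python
# Hypothetical static-stability check. The six wheel contact points
# form a support polygon; losing one shrinks the polygon, and the
# robot is stable while the center of gravity projects inside it.

def point_in_convex_polygon(pt, poly):
    """True if pt lies inside the convex polygon poly (ordered verts)."""
    px, py = pt
    sign = None
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        cross = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1)
        if cross != 0:
            if sign is None:
                sign = cross > 0
            elif (cross > 0) != sign:
                return False  # point is outside this edge
    return True

# Six wheels in a 2 x 3 layout (meters); lose the front-right wheel.
wheels = [(-1, 1), (0, 1), (1, 1), (1, -1), (0, -1), (-1, -1)]
remaining = [w for w in wheels if w != (1, 1)]
stable = point_in_convex_polygon((0.0, 0.0), remaining)  # CoG at center
```

If the check fails, a legged-wheel vehicle can shift mass (or reposition legs) to move the center of gravity back inside the polygon, which is one plausible reading of "adjusting its center of gravity to stay mobile."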
But with its engine powered down—it runs on a Mercedes-built engine originally modified for unmanned aerial vehicles (UAVs)—and one leg cocked gamely in the air, the MULE doesn’t look so tough right now. In fact, the MULE isn’t ready for battle. Barely a year old, the prototype is a product of the Army’s Unmanned Ground Vehicle program, which began in 2001. It has yet to fire a single bullet or missile, or even be fitted with a weapon. Here at the test track it’s loaded down with rucksacks and boxes, two squads’ worth of equipment. At the moment, the MULE has no external sensors. “We’re 80 percent through the initial phase,” Norman says, “but we don’t have the perception fully tested. It knows heading and speed, but it’s blind.” In other words, it’s essentially one of the world’s biggest radio-control cars.

And, eyeing the robot’s familiar controller, I realize I might have a shot at driving it. I know my way around a video-game console, but the engineers are noncommittal about my request to drive the MULE. The goal, of course, is for the MULE to drive itself.

Sitting a short distance away is the prototype’s future: a full-size mockup of a weaponized variant, its forward-facing machine gun bracketed by missile tubes. The gleaming sphere set on a short mast looks precisely like a robot’s eyeball. It will visually track moving targets, allowing operators to zoom in for a closer look before pulling the trigger. According to the Army, this giant prop represents a revolutionary shift in how we will wage wars. This is the face of the robotic infantry.

Unmanned ground vehicles (UGVs) have already flooded the battlefield. There are at least 6,000 robots in use by the Army and Marine Corps in Iraq and Afghanistan. For years these small, remote-control vehicles have allowed troops to peek around corners and investigate suspected bombs. And while unmanned aerial vehicles have been loaded with missiles since 2001, the arming of ground robots is relatively uncharted territory.
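Norman's "it knows heading and speed, but it's blind" describes the classic dead-reckoning situation: without external perception, a vehicle can only integrate its own motion to estimate position. A minimal sketch of that integration, with invented numbers:

```python
import math

# Dead reckoning from heading and speed alone: integrate velocity over
# time to estimate position. Sample values are illustrative; any real
# sensor noise would accumulate as unbounded drift.

def dead_reckon(start, samples, dt):
    """Integrate (speed m/s, heading rad) samples into an (x, y) track."""
    x, y = start
    track = [(x, y)]
    for speed, heading in samples:
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        track.append((x, y))
    return track

# Drive east at 2 m/s for 3 s, then north at 2 m/s for 3 s (1 s steps).
samples = [(2.0, 0.0)] * 3 + [(2.0, math.pi / 2)] * 3
track = dead_reckon((0.0, 0.0), samples, dt=1.0)
```

Because every step compounds the last, small heading or speed errors grow without bound, which is why the "perception" phase Norman mentions matters so much for real autonomy.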
Last June the Army deployed the first-ever armed UGVs. Three SWORDS (Special Weapons Observation Remote Direct-Action System) robots landed in Iraq, each equipped with an M249 light machine gun. These UGVs are essentially guns on tracks, a variant of the remote-control Talon bots routinely blown up while investigating improvised explosive devices. When the trio was approved for combat duty, the potential for historic robot-versus-human carnage lit up the blogosphere. Never mind the dozens of air-to-ground Hellfire missiles launched by a squadron of armed Predator drones over the past seven years—this was a robot soldier, packing the same machine gun used by ground troops.

The historic deployment ended with a whimper after the Army announced that the SWORDS would not be given the chance to see combat. According to a statement from Duane Gotvald, deputy project manager of the Robotic Systems Joint Project Office, which oversees robots used by the Army and Marines, “While there has been considerable interest in fielding the system, some technical issues still remain and SWORDS is not currently funded.” The robots never fired a shot, but Gotvald pointed out that the Army’s 3rd Infantry Division used them for surveillance and “peacekeeping/guard operations.”

The nature of the robots’ “technical issues” remains an open question. The Army has not released details, and officials with Foster-Miller, the Massachusetts-based contractor that developed the SWORDS, refused interview requests for this story. But according to Col. Barry Shoop, deputy head of West Point’s electrical engineering and computer science department, armed UGVs continue to lag behind UAVs because of their mission: close-quarters firefights. “The technical challenges are greater,” Shoop says. “Think of the kind of image and graphics processing you need to make positive identification, to use lethal force. That’s inhibiting.”

AMERICA'S ROBOT ARMY







Already there are killing machines operating by remote control. Soon the machines will be able to kill on their own initiative. A new kind of warfare is on its way.

War is about to change, in terrifying ways. America's next wars, the ones the Pentagon is now planning, will be nothing like the conflicts that have gone before them.

In just a few years, US forces will be able to deal out death, not at the squeeze of a trigger or even the push of a button, but with no human intervention whatsoever. Many fighting soldiers - those GIs in tin hats who are dying at the rate of two a day in Iraq - will be replaced by machines backed up by surveillance technology so penetrating and pervasive that it is referred to as "military omniscience". Any Americans involved will be less likely to carry rifles than PlayStation-style consoles and monitors that display simulated streetscapes of the kind familiar to players of Grand Theft Auto - and they may be miles from where the killing takes place.

War will progressively cease to be the foggy, confusing, equalising business it has been for centuries, in which the risks are always high, everyone faces danger and suffers loss, and the few can humble the mighty. Instead, it will become remote, semi-automatic and all-knowing, entailing less and less risk to American lives and taking place largely out of the sight of news cameras. And the danger is close to home: the coming wars will be the "war on terror" by other names, conflicts that know no frontiers. The remote-controlled war coming tomorrow to Khartoum or Mogadishu, in other words, can happen soon afterwards, albeit in moderated form, in London or Lyons.

This is no geeky fantasy. Much of the hardware and software already exists and the race to produce the rest is on such a scale that US officials are calling it the "new Manhattan Project". Hundreds of research projects are under way at American universities and defence companies, backed by billions of dollars, and Donald Rumsfeld's department of defence is determined to deliver as soon as possible. The momentum is coming not only from the relentless humiliation of US forces at the hands of some determined insurgents on the streets of Baghdad, but also from a realisation in Washington that this is the shape of things to come. Future wars, they believe, will be fought in the dirty, mazy streets of big cities in the "global south", and if the US is to prevail it needs radically new strategies and equipment.

Only fragments of this story have so far appeared in the mainstream media, but enough information is available on the internet, from the comments of those in charge and in the specialist press to leave no room for doubt about how sweeping it is, how dangerous and how imminent.

Military omniscience is the starting point. Three months ago Tony Tether, director of the Defence Advanced Research Projects Agency (Darpa), the Pentagon's research arm, described to a US Senate committee the frustration felt by officers in Iraq after a mortar-bomb attack. A camera in a drone, or unmanned aircraft, spotted the attackers fleeing and helped direct US helicopters to the scene to destroy their car - but not before some of those inside had got out. "We had to decide whether to follow those individuals or the car," he said, "because we simply didn't have enough coverage available." So some of the insurgents escaped. Tether drew this moral: "We need a network, or web, of sensors to better map a city and the activities in it, including inside buildings, to sort adversaries and their equipment from civilians and their equipment, including in crowds, and to spot snipers, suicide bombers or IEDs [improvised explosive devices] . . . This is not just a matter of more and better sensors, but, just as important, the systems needed to make actionable intelligence out of all the data."

Darpa has a host of projects working to meet those needs, often in surprising ways. One, called Combat Zones That See, aims to scatter thousands of tiny CCTV cameras across cities, each equipped with wireless communication software that will make it possible to link their data and track the movements of every vehicle on the streets. The cameras themselves will not be that different from those found in modern mobile phones.
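The linking step described above, turning isolated camera sightings into city-wide vehicle tracks, can be illustrated with a toy example. The record format and identifiers below are invented; real systems would first have to solve the much harder problem of recognizing that two cameras saw the same vehicle.

```python
from collections import defaultdict

# Toy sketch of cross-camera track building: each camera reports
# timestamped sightings, and grouping them by a shared vehicle
# signature, then sorting by time, yields a movement history.

def build_tracks(sightings):
    """Group (vehicle_id, timestamp, camera) records into per-vehicle
    camera sequences, ordered by time."""
    tracks = defaultdict(list)
    for vehicle_id, t, camera in sightings:
        tracks[vehicle_id].append((t, camera))
    return {v: [c for _, c in sorted(obs)] for v, obs in tracks.items()}

sightings = [
    ("car_7", 10, "cam_A"), ("car_7", 45, "cam_C"),
    ("car_7", 30, "cam_B"), ("car_2", 12, "cam_A"),
]
tracks = build_tracks(sightings)
# tracks["car_7"] lists cameras in the order the vehicle passed them.
```

Even this toy version shows why the idea alarms civil libertarians: once sightings are linkable, a full route history falls out of a one-line sort.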

MACHINE LIFE






MACHINE EVOLUTION
Perhaps on other planets, organisms have evolved into machine life, cyborgs, or an entirely new form of synthetic life we cannot yet comprehend. Indeed, people have reported aliens that appeared machine-like, with body armor or complex space suits.
These two publications are presented jointly here because they create a bridge between two generations of artists who nonetheless share concerns such as abandoning a strictly instrumental use of technology to embrace random phenomena and simulating human perception. Norman White, a professor at the Ontario College of Art (OCAD, formerly OCA) in Toronto, Canada, represents the first generation, active since the sixties; in the seventies and eighties, he and his colleague Doug Back taught many artists of the subsequent generation (that of the eighties), including David Rokeby. In addition to the authors’ contributions, the catalogues for these exhibitions include multimedia complements (a CD-ROM for "Machine Media" and "Norm’s Robots" as well as a DVD for "David Rokeby"), along with video excerpts that complete the documentation of each of the works exhibited.

The catalogue for "Machine Life" brings together three author contributions. In "Norman White, Beginning," Ihor Holubizky looks at the remarkable critical interest sparked by media arts between 1968 and 1970, a period when White’s work was first emerging. White took part in some of the major exhibits organized in the United States in the late sixties, including "Some More Beginnings: An Exhibition of Submitted Works Involving Technical Materials and Processes" in 1969 at the Brooklyn Museum (New York, U.S.). Holubizky follows White’s career from the artist’s first robotic works of the seventies to his recent projects that accentuate the playfulness of his work. The author presents White’s sometimes ambivalent views on the difficult relationship between art and technology. He concludes by underlining the entropy found in White’s work, which distinguishes it from many artistic projects modelled on the notion of technical or scientific progress. In "Taken with Surprise," Caroline Langill points out that like White, artists from the subsequent generation were interested in technology’s unpredictability. When computer units of a media artwork exchange data randomly, the end results produce a range of varied experiences for viewers. In "Encountering Machine Life," Jan Allen, like Caroline Langill, stresses the role of entropy in White’s works, which, according to this artist, "represent unusable experimental models." The artist’s process may tend toward a form of productive failure in which technology frees itself from the uses predetermined by its functions. Allen explores the years of creative exchanges between White and his former colleague Doug Back from the Ontario College of Art. The two artists shared an interest in manual work that resulted in the construction of technological components for artworks as well as a decompartmentalized approach to interactivity. A description of works presented within the exhibition follows.

The catalogue for "David Rokeby" includes two author contributions. In "Between Chaos and Order: The Garden and the Computer in the Work of David Rokeby," Su Ditta relates Rokeby’s work to a garden, a space inciting both action and contemplation. She tracks Rokeby’s career path, describing emblematic works created since the eighties. She separates projects exploiting sound and language (Very Nervous System [1986-2004], The Giver of Names [1991-] and n-cha(n)t [2001]) from projects concerned with the boundaries between computer vision, controlled by a function of analytical observation, and human vision (Watch [1995], Seen [2002] and Taken [2002]). Other works (Machine for Taking Time [2001] and Steamingmedia.org [2002]) are linked to complex devices in which a real site coexists simultaneously with several representations of the site captured at different times of the day or year. In "Interpolation: The Method of David Rokeby," Sara Diamond reports on Rokeby’s technological and aesthetic research while evoking artistic projects situated between technical application (software, protocol) and artistic intervention. Diamond emphasizes the concept of invention in Rokeby’s work and the manner in which he himself creates the technological tools required to produce his works so that these tools then exist independently (Very Nervous System, The Giver of Names). The author analyzes this last work from the perspective of a theoretical reflection on the differences between strictly human faculties and computer functions for capturing and analyzing data. Finally, Diamond comments on the artist’s use of sophisticated surveillance tools whereby the immersion in the image and the aesthetic experience are often accompanied by a paradoxical update of the technology’s essentially coercive functions.

ROBOT LIFE






For more than 30 years, a dedicated team of scientists and engineers at Waseda University in Tokyo, Japan, has been working on a project to integrate robots into our everyday lives. One of the main functions required of most of these robots is the ability to determine their position and navigate.




The first step is to use different sensors, such as inertial sensors, satellite navigation system receivers, magnetic sensors and RFID tags, which operate in the coordinate domain. These sensors provide the robot with coordinates and coordinate-related parameters such as azimuth and orientation. In general terms, navigation using these sensors can be described as artificial navigation.
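When several coordinate-domain sensors each supply a position estimate with its own uncertainty, a standard way to combine them is inverse-variance weighting. The sketch below is a minimal illustration of that idea along one axis; the sensor variances are invented, and a real robot would use a full filter (e.g. a Kalman filter) over time.

```python
# Minimal inverse-variance fusion of position estimates on one axis.
# Each estimate is (value, variance); less certain sensors get less
# weight. Variances here are illustrative, not real sensor specs.

def fuse(estimates):
    """Fuse (value, variance) estimates into one (value, variance)."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(v * w for (v, _), w in zip(estimates, weights)) / total
    return value, 1.0 / total  # fused variance is smaller than any input

# x-coordinate (m) from three sensors; the first (say, GNSS indoors)
# is the least certain and so contributes the least.
x, var = fuse([(10.2, 4.0), (9.8, 1.0), (10.0, 0.5)])
```

Note that the fused variance is lower than the best single sensor's, which is the core argument for carrying multiple navigation sensors at once.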

The Waseda Humanoid Robot Project, headed by Professor Shuji Hashimoto, is researching the integration of robots into our social infrastructure, or human-robot symbiosis. Advanced adaptability of robots to people and to the environment will be essential in an aging society and in a future of symbiosis with the natural environment (see sidebar).

Waseda University established the Humanoid Robotics Institute in April 2000 to promote research that aims to construct a new relationship between humans and machines in an advanced information society. As researchers, we expect that sometime this century robots will provide housework assistance for the elderly, as well as entertainment and other functions that improve the quality of life for humans. To make this symbiosis among humankind, robots, and the environment possible, we need to build accommodations into a house's structure and functions.
