Showing posts with label ROBOTS.

Friday, April 10, 2015

DOCTORS TRAIN WITH HUMAN PATIENT SIMULATOR

FROM:  NATIONAL SCIENCE FOUNDATION
How robots can help build better doctors
Research seeks to make better 'human patient simulators'

A young doctor leans over a patient who has been in a serious car accident and must be in severe pain. The doctor's trauma team examines the patient's pelvis and rolls her onto her side to check her spine. They scan the patient's abdomen with a rapid ultrasound machine, finding fluid. They insert a tube in her nose. Throughout the procedure, the patient's face remains rigid, showing no signs of pain.

The patient's rigid expression isn't a result of stoicism--it's a robot, not a person. The trauma team is training on a "human patient simulator" (HPS), a training tool that enables clinicians to practice their skills before treating real patients. HPS systems have evolved over the past several decades from simple mannequins into machines that can breathe, bleed and expel fluids. Some models have pupils that contract when hit by light. Others have entire physiologies that can change. They come in life-sized child and adult forms.

But they could be better, said Laurel D. Riek, a computer science and engineering professor at the University of Notre Dame. As remarkable as modern patient simulators are, they have two major limitations.

"Their faces don't actually move, and they are unable to sense or respond to the environment," she said.

Riek, a roboticist, is designing the next generation of HPS systems. Her NSF-supported research explores new means for the robots to exhibit realistic, clinically relevant facial expressions and respond automatically to clinicians in real time.

"This work will enable hundreds of thousands of doctors, nurses, EMTs, firefighters and combat medics to practice their treatment and diagnostic skills extensively and safely on robots before treating real patients," she said.

One novel aspect of Riek's research is the development of new algorithms that use data from real patients to generate simulated facial characteristics. For example, Riek and her students have recently completed a pain simulation project and are the first research group to synthesize pain using patient data. This work won them best overall paper and best student paper at the International Meeting on Simulation in Healthcare, the top medical simulation conference.
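
The article does not spell out how patient data becomes a robot facial expression, but the general pipeline can be sketched. Below is a minimal Python illustration assuming a FACS-style action-unit representation (AU4, AU6/7, AU9/10 and AU43 are the units classically associated with pain); the weights and the mapping to servo targets are invented for illustration and are not Riek's published model.

```python
# A minimal sketch of mapping a patient-derived pain intensity onto a
# robot face. The action-unit weights and servo mapping below are
# illustrative assumptions, not Riek's actual algorithms.

PAIN_ACTION_UNITS = {
    "AU4": 0.9,   # brow lowerer
    "AU6": 0.6,   # cheek raiser
    "AU7": 0.6,   # lid tightener
    "AU9": 0.7,   # nose wrinkler
    "AU10": 0.5,  # upper-lip raiser
    "AU43": 0.4,  # eye closure
}

def pain_to_servo_targets(pain_level, neutral_pose):
    """Blend pain-weighted action units onto a neutral pose.
    pain_level is in [0, 1]; returns servo targets in [0, 1]."""
    pose = dict(neutral_pose)
    for au, weight in PAIN_ACTION_UNITS.items():
        pose[au] = min(1.0, pose.get(au, 0.0) + weight * pain_level)
    return pose

# A synthetic pain-intensity trace, standing in for patient data,
# drives the face frame by frame.
neutral = {au: 0.0 for au in PAIN_ACTION_UNITS}
for t, pain in enumerate([0.0, 0.2, 0.8, 0.5]):
    print(t, pain_to_servo_targets(pain, neutral))
```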

Riek's team is now working on an interactive stroke simulator that can automatically sense and respond to learners as they work through a case. Stroke is the fifth leading cause of death in the United States, yet many of these deaths could be prevented through faster diagnosis and treatment.

"With current technology, clinicians are sometimes not learning the right skills. They are not able to read diagnostic clues from the face," she said.

Yet learning to read those clues could yield lifesaving results. Preventable medical errors in hospitals are the third-leading cause of death in the United States.

"What's really striking about this is that these deaths are completely preventable," Riek said.

One factor contributing to those errors is clinicians missing clues and heading down incorrect diagnostic paths--applying the wrong treatments or wasting time. Reading facial expressions, Riek said, can help doctors improve their diagnoses, and it is important that their training reflect this.

In addition to modeling and synthesizing patient facial expressions, Riek and her team are building a new, fully expressive robot head. Using 3-D printing, they are working to produce a low-cost robot that will one day be available to researchers and hobbyists as well as clinicians.

The team has engineered the robot to have interchangeable skins, so that the robot's age, race and gender can be easily changed. This will enable researchers to explore social factors or "cultural competency" in new ways.

"Clinicians can create different patient histories and backgrounds and can look at subtle differences in how healthcare workers treat different kinds of patients," Riek said.

Riek's work has the potential to help address the patient safety problem, enabling clinicians to take part in simulations otherwise impossible with existing technology.

-- Rob Margetta, National Science Foundation
Investigators
Laurel Riek
Related Institutions/Organizations
University of Notre Dame

Tuesday, July 29, 2014

NSF REPORTS ON TELE-ROBOTICS

FROM:  NATIONAL SCIENCE FOUNDATION 
Tele-robotics puts robot power at your fingertips
University of Washington research enables robot-assisted surgery and underwater spill prevention

At the Smart America Expo in Washington, D.C., in June, scientists showed off cyber-dogs and disaster drones, smart grids and smart healthcare systems, all intended to address some of the most pressing challenges of our time.

The event brought together leaders from academia, industry and government and demonstrated the ways that smarter cyber-physical systems (CPS)--sometimes called the Internet of Things--can lead to improvements in health care, transportation, energy and emergency response, and other critical areas.

This week and next, we'll feature examples of National Science Foundation (NSF)-supported research from the Smart America Expo. Today: tele-robotics technology that puts robot power at your fingertips. (See Part 1 of the series.)

In the aftermath of an earthquake, every second counts. The teams behind the Smart Emergency Response System (SERS) are developing technology to locate people quickly and help first responders save more lives. The SERS demonstrations at the Smart America Expo incorporated several NSF-supported research projects.

Howard Chizeck, a professor of electrical engineering at the University of Washington, showed a system he helped develop that lets an operator log in to a Wi-Fi network and tele-operate a robot working in a dangerous environment.

"We're looking to give a sense of touch to tele-robotic operators, so you can actually feel what the robot end-effector is doing," Chizeck said. "Maybe you're in an environment that's too dangerous for people. It's too hot, too radioactive, too toxic, too far away, too small, too big, then a robot can let you extend the reach of a human."

The device is being used to allow surgeons to perform remote surgeries from thousands of miles away. And through a start-up called BluHaptics--started by Chizeck and Fredrik Ryden and supported by a Small Business Innovation Research (SBIR) grant from NSF--researchers are adapting the technology to let a robot work underwater and turn off a valve at the base of an off-shore oil rig to prevent a major spill.

"We're trying to develop tele-robotics for a wide range of opportunities," Chizeck said. "This is potentially a new industry, people operating in dangerous environments from a long distance."

-- Aaron Dubrow, NSF
Investigators
Fredrik Ryden
Howard Chizeck
Blake Hannaford
Tadayoshi Kohno
Related Institutions/Organizations
BluHaptics Inc
University of Washington

Friday, April 25, 2014

PRESIDENT OBAMA MAKES REMARKS AT MIRAIKAN SCIENCE AND YOUTH EXPO IN TOKYO, JAPAN

FROM:  THE WHITE HOUSE
Remarks by President Obama to Miraikan Science and Youth Expo
Miraikan Museum
Tokyo, Japan

3:27 P.M. JST

PRESIDENT OBAMA:  Konnichiwa.  Please sit down.  Thank you so much.  Well, I want to thank Dr. Mohri and everyone at The Miraikan for welcoming me here today.  And it is wonderful to see all of these outstanding students.  Dr. Mohri, a veteran of two space shuttle missions, embodies the spirit that brings us here together -- the incredible cooperation in science and technology between Japan and the United States.

I want to thank all the students that I had a chance to meet with as we went around the various exhibits.  We heard a message from the International Space Station.  We saw some truly amazing robots -- although I have to say the robots were a little scary. They were too lifelike.  They were amazing.  And these students showed me some of their experiments, including some soccer-playing robots that we just saw.  And all of the exhibits I think showed the incredible breakthroughs in technology and science that are happening every single day.

And historically, Japan and the United States have been at the cutting-edge of innovation.  From some of the first modern calculators decades ago to the devices that we hold in our hands today -- the smartphones that I’m sure every young person here uses -- Japan and the United States have often led the way in the innovations that change our lives and improve our lives.

And that’s why I’m so pleased that the United States and Japan are renewing the 10-year agreement that makes so much of our science and technology cooperation possible.  Both of our societies celebrate innovation, celebrate science, celebrate technology.  We’re close partners in the industries of tomorrow. And it reminds us why it’s so important for us to continue to invest in science, technology, math, engineering.  These are the schools -- these are the skills that students like all of you are going to need for the global economy, and that includes our talented young women.

Historically, sometimes young women have been less represented in the sciences, and one of the things that I’ve really been pushing for is to make sure that young women, just like young men, are getting trained in these fields, because we need all the talent and brainpower to solve some of the challenges that we’re going to face in the future.

Earlier today, Prime Minister Abe and I announced a new initiative to increase student exchanges, including bringing more Japanese students to the United States.  So I hope you’ll come.  Welcome.  And it’s part of our effort to double student exchanges in the coming years.  As we saw today, young people like you have at your fingertips more technology and more power than even the greatest innovators in previous generations. So there’s no limit to what you can achieve, and the United States of America wants to be your partner.

So I’m very proud to have been here today.  I was so excited by what I saw.  The young people here were incredibly impressive.  And as one of our outstanding astronauts described, as we just are a few days after Earth Day, it’s important when we look at this globe and we think about how technology has allowed us to understand the planet that we share, and to understand not only the great possibilities but also the challenges and dangers from things like climate change -- that your generation is going to help us to find answers to some of the questions that we have to answer.  Whether it’s:  How do we feed more people in an environment in which it’s getting warmer? How do we make sure that we’re coming up with new energy sources that are less polluting and can save our environment?  How do we find new medicines that can cure diseases that take so many lives around the globe?  To the robots that we saw that can save people’s lives after a disaster because they can go into places like Fukushima that it may be very dangerous for live human beings to enter into.  These are all applications, but it starts with the imaginations and the vision of young people like you.

So I’m very proud of all of you and glad to see that you’re doing such great work.  You have counterparts in the United States who share your excitement about technology and science.  I hope you get a chance to meet them.  I hope you get a chance to visit the United States.  As far as I know, we don’t have one of those cool globes, but we have some other pretty neat things in the United States as well.  And I hope we can share those with you if and when you come.

Thank you very much.  And I just want you to know in closing that I really believe that each of you can make a difference.  Gambatte kudasai.  You can do this thing if you apply yourselves.  Thank you.  (Applause.)

END

Thursday, April 24, 2014

DEFENSE SECRETARY HAGEL OBSERVES DEVELOPING TECH DEMONSTRATION

Photo:  Arati Prabhakar, director of the Defense Advanced Research Projects Agency, briefs Defense Secretary Chuck Hagel on the Atlas robot and other robotics at the Pentagon, April 22, 2014. The program showcased DARPA technologies and how they contribute to U.S. national security. DOD photo by Marine Corps Sgt. Aaron Hostutler.

FROM:  U.S. DEFENSE DEPARTMENT 
DARPA Officials Show Hagel Technologies Under Development
American Forces Press Service

WASHINGTON, April 23, 2014 – Defense Advanced Research Projects Agency program personnel demonstrated five technologies under development to Defense Secretary Chuck Hagel in the secretary's conference room yesterday.

DARPA Director Arati Prabhakar provided the secretary with a demonstration of the agency's latest prosthetics technology.

The wounded warrior demonstrating the device was Fred Downs Jr., an old friend of Hagel's who lost an arm in a landmine explosion while fighting in Vietnam. Hagel hugged him and shook his mechanical hand, with Downs joking, "I don't want to hurt you."

"He and I worked together many years ago," said Hagel, who earned two Purple Hearts during his service as an enlisted soldier in Vietnam. "How you doing, Fred? How's your family?"

Downs demonstrated how he controls movements of the arm, which appeared to be partly covered in translucent white plastic, with two accelerometers strapped to his feet. Through a combination of foot movements, he's able to control the elbow, wrist and fingers in a variety of movements, including the “thumbs-up” sign he gave Hagel.
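
The article does not detail the control scheme, but a tilt-based mapping of this kind is easy to sketch. In the hypothetical Python below, each foot's accelerometer yields a pitch angle from the gravity vector, and those angles are mapped to elbow, wrist and hand commands; the specific assignments and limits are invented for illustration, not the DARPA system's actual design.

```python
import math

def tilt_deg(ax, ay, az):
    """Pitch angle of a foot, estimated from its accelerometer's
    gravity vector (units cancel, so raw readings are fine)."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

def feet_to_arm_command(left_accel, right_accel):
    left_pitch = tilt_deg(*left_accel)
    right_pitch = tilt_deg(*right_accel)
    return {
        # e.g., left-foot pitch drives the elbow, right-foot pitch the wrist
        "elbow_deg": max(0.0, min(135.0, 90.0 + left_pitch)),
        "wrist_deg": max(-45.0, min(45.0, right_pitch)),
        # a strong right-foot toe-up gesture closes the hand
        "hand_closed": right_pitch > 30.0,
    }

# Example: left foot flat (gravity on z), right toe raised sharply.
print(feet_to_arm_command((0.0, 0.0, 9.8), (6.0, 0.0, 7.8)))
```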

It took only a few hours to learn to control the arm, Downs said.
"It's the first time in 45 years, since Vietnam, I'm able to use my left hand, which was a very emotional time," he said.

Dr. Justin Sanchez, a medical doctor and program manager at DARPA who works with prosthetics and brain-related technology, told Hagel that DARPA's arm is designed to mimic the shape, size and weight of a human arm. It's modular too, so it can replace a lost hand, lower arm or a complete arm.

Hagel said such technology would have a major impact on the lives of injured troops.

"This is transformational," he said. "We've never seen anything like this before."
Next, Sanchez showed Hagel a video of a patient whose brain had been implanted with a sensor at the University of Pittsburgh, allowing her to control an arm with her thoughts.

Matt Johannes, an engineer from the Johns Hopkins University Applied Physics Laboratory, showed Hagel a shiny black hand and arm that responds to brain impulses. The next step is to put sensors in the fingers that can send sensations back to the brain.

"If you don't have line of sight on something you're trying to grab onto, you can use that sensory information to assist with that task," Johannes said.
The tactile feedback system should be operational within a few months, he said.
"People said it would be 50 years before we saw this technology in humans," Sanchez said. "We did it in a few years."

Next, officials gave Hagel an overview of the DARPA Robotics Challenge, a competition to develop a robot for rescue and disaster response that was inspired by the March 2011 Fukushima nuclear incident in Japan.

Virginia Tech's entrant in the contest, the hulking 6-foot-2-inch Atlas robot developed by Boston Dynamics, stood in the background as Hagel was shown a video of robots walking over uneven ground and carrying things.

Brad Tousley, head of DARPA's Tactical Technology Office, explained to Hagel that Hollywood creates unrealistic expectations of robotic capability. In fact, he said, building human-like robots capable of autonomously doing things such as climbing ladders, opening doors and carrying things requires major feats of engineering and computer science.

Journalists were escorted out before the remaining three technologies could be demonstrated because of classification concerns. A defense official speaking on background told reporters that Hagel was brought up to date on the progress of three other DARPA programs:

-- Plan X, a foundational cyberwarfare program to develop platforms for the Defense Department to plan for, conduct and assess cyberwarfare in a manner similar to kinetic warfare;

-- Persistent Close Air Support, a system to, among other things, link up joint tactical air controllers with close air support aircraft using commercially available tablets; and

-- A long-range anti-ship missile, planned to reduce dependence on intelligence, surveillance and reconnaissance platforms, network links and GPS navigation in electronic warfare environments. Autonomous guidance algorithms should allow the LRASM to use less-precise target cueing data to pinpoint specific targets in the contested domain, the official said. The program also focuses on innovative terminal survivability approaches and precision lethality in the face of advanced countermeasures.

(From a pool report.)

Sunday, January 19, 2014

ROBOSIMIAN

FROM:  NASA 

The Jet Propulsion Laboratory's official entry, RoboSimian, awaits the first event at the DARPA Robotics Challenge in December 2013. The challenge was created to develop ground robots that can work in dangerous, degraded, human-engineered environments. Also known as "Clyde," the robot is four-footed but can also stand on two feet. It has four general-purpose limbs and hands capable of both mobility and manipulation.

Multiple points of contact increase stability during operations that range from climbing stairs to turning a valve. The design also allows RoboSimian to reverse direction without reorienting itself.

The RoboSimian team is led by JPL. Stanford University, Palo Alto, Calif., collaborated on the development of the robot's unique hands.

The California Institute of Technology, Pasadena, manages JPL for NASA.

Thursday, September 26, 2013

ROBOT PERCEPTION

FROM:  NATIONAL SCIENCE FOUNDATION 
Teaching a computer to perceive the world without human input

Researcher's work could lead to assistive technology for the visually impaired, traffic modeling, and improved navigation and surveillance in robots

Humans can see an object--a chair, for example--and understand what they are seeing, even when something about it changes, such as its position. A computer, on the other hand, can't do that. It can learn to recognize a chair, but can't necessarily identify a different chair, or even the same chair if its angle changes.

"If I show a kid a chair, he will know it's a chair, and if I show him a different chair, he can still figure out that it's a chair," says Ming-Hsuan Yang, an assistant professor of electrical engineering and computer science at the University of California, Merced. "If I change the angle of the chair 45 degrees, the appearance will be different, but the kid will still be able to recognize it. But teaching a computer to see things is very difficult. They are very good at processing numbers, but not good at generalizing things."

Yang's goal is to change this. He is developing computer algorithms that he hopes will give computers, using a single camera, the ability to detect, track and recognize objects, including scenarios where objects drift, disappear and reappear, or are obscured by other objects. The goal is to simulate human cognition without human input.

Most humans can effortlessly locate moving objects in a wide range of environments because they are continually gathering information about the things they see; for computers, this is a challenge. Yang hopes the algorithms he's developing will enable computers to do the same thing--continually amass information about the objects they are tracking.

"While it is not possible to enumerate all possible appearance variation of objects, it is possible to teach computers to interpolate from a wide range of training samples, thereby enabling machines to perceive the world," he says.

Currently, "for a computer, an image is composed of a long string of numbers," Yang says. "If the chair moves, the numbers for those two images will be very different. What we want to do is generalize all the examples from a large amount of data, so the computer will still be able to recognize it, even when it changes. How do we know when we have enough data? We cannot encompass all the possibilities, so we are trying to define ‘chair' in terms of its functionalities."

Potentially, computers that can "see" and track moving objects could improve assistive technology for the visually impaired. They also could have applications in medicine, such as locating and following cells; in tracking insect and animal motion; in traffic modeling for "smart" buildings; and in improved navigation and surveillance in robots.

"For the visually impaired, the most important things are depth and obstacles," Yang says. "This could help them see the world around them. They don't need to see very far away, just to see whether there are obstacles near them, two or three feet away. The computer program, for example, could be in a cane. The camera would be able to create a 3-D world and give them feedback. The computer can tell them that the surface is uneven, so they will know, or sense a human or a car in front of them."

Yang is conducting his research under a National Science Foundation Faculty Early Career Development (CAREER) award, which he received in 2012. The award supports junior faculty who exemplify the role of teacher-scholars through outstanding research, excellent education and the integration of education and research within the context of their organization's mission. He is receiving $473,797 over five years.

Yang's project also includes developing a code library of tracking algorithms and a large data set, which will become publicly available. The grant also provides for an educational component that will involve both undergraduate and graduate students, with an emphasis on encouraging underrepresented minority groups from California's Central Valley to study computer sciences and related fields. The goal is to integrate computer vision material in undergraduate courses so that students will want to continue studying in the field.

Additionally, Yang is helping several undergraduate students design vision applications for mobile phones, and is trying to write programs that will enable computers to infer depth and distance, as well as to interpret the images they "see."

"It is not clear exactly how human vision works, but one way to explain visual perception of depth is based on people's two eyes and trigonometry," he says. "By figuring out the geometry of the points, we can figure out depth. We do it all the time, without thinking. But for computers, it's still very difficult to do that.

"The Holy Grail of computer vision is to tell a story using an image or video, and have the computer understand on some level what it is seeing," he adds. "If you give an image to a kid, and ask the kid to tell a story, the kid can do it. But if you ask a computer program to do it, now it can only do a few primitive things. A kid already has the cognitive knowledge to tell a story based on the image, but the computer just sees things as is, but doesn't have any background information. We hope to give the computer some interpretation, but we aren't there yet."

-- Marlene Cimons, National Science Foundation
Investigators
Ming-Hsuan Yang
Related Institutions/Organizations
University of California, Merced

Monday, April 30, 2012

THE UNDERWATER ROBOT COMPETITION

FROM:  DEPARTMENT OF DEFENSE ARMED WITH SCIENCE 
Underwater Robot Face Off

With a national title on the line, student teams from across the country are competing with their underwater robots in the Office of Naval Research (ONR)-funded championship in Manassas Park, Va.

The 2012 National SeaPerch Challenge brings top teams from middle and high school together to compete with the underwater robots they’ve built as part of a curriculum designed to boost their skills and interest in science, technology, engineering and mathematics (STEM).

The SeaPerch program is an initiative under the Department of the Navy’s STEM Coordination Office, which facilitates outreach efforts across the service. The chief of naval research, Rear Adm. Matthew Klunder, presented awards to winning teams.

“SeaPerch provides an affordable entry point for underwater robotics, and, from there, directional arrows to other science and engineering competitions and internships—it’s an easy-to-follow ‘yellow brick road’ approach,” said Kelly Cooper, program officer, ONR Sea Platforms and Weapons division. “The goal is to expand student awareness and encourage them to pursue STEM education and careers.”

The competition challenges are designed to reflect Navy-relevant operations. This year, the 70 teams are competing in two events: an obstacle course and a salvage operation. Both take place in a community center indoor pool.

For the obstacle course, teams must navigate through 24-inch rings—which may be oriented in any direction—surface, re-submerge and return through the course. The salvage operation involves five 5-gallon buckets inverted on the pool’s bottom, which each team must float to the surface and then bring poolside.

SeaPerch gives teachers and students the resources they need to build an underwater remotely operated vehicle (ROV) from a kit made up of low-cost, easily accessible parts, following a curriculum that teaches basic engineering and science concepts with a marine engineering theme. The objective is that students will build STEM, problem-solving and teamwork skills.

Tuesday, April 24, 2012

THE LEGOBOT COMPETITION

FROM:  ARMED WITH SCIENCE DEFENSE DEPARTMENT
Naval Research Laboratory oceanographer Dr. Clark Rowley (back right) coaches the Boyet Junior High School's FIRST LEGO League team. (Photo credit: U.S. Naval Research Laboratory/Clark Rowley)

LEGObot Competition
Naval Research Laboratory oceanographer Clark Rowley recently spent 80 hours over 10 weeks playing with LEGO blocks, teaching junior high students how to build robots.

Rowley has been coaching the Boyet Junior High School’s FIRST LEGO League (FLL) team since 2009. FLL is a robotics-focused, extracurricular program for middle school students. During the 10-week season, junior high teams build LEGO-based robots and develop research projects for a chance to compete in the FLL regional competitions.

“It’s fun to watch the kids go from just a box of LEGO parts and create a really capable robot with some very clever engineering,” Rowley said. “The kids do the research. They build the robots. They do the work. That is the heart of FIRST LEGO League.”

With the help of teachers and an assistant coach, Rowley prepared the 10 students for the 2011 FIRST LEGO League Louisiana Regional Competition in December.

Food contamination was the theme for this year’s competition, so Rowley’s team found an article about a rodent infestation in a Peruvian school cafeteria. The students conducted research and spoke with experts on rats, rat control, and autonomous robots, and proposed a system of communicating robots to perform rodent control in food storage warehouses.

As part of the project, they demonstrated a system of two LEGO robots communicating over Bluetooth.

Rowley and his team’s hard work and dedication paid off again at this year’s competition. Boyet won a Core Award for Mechanical Design and placed second out of 57 teams in the Robot Performance division.
