Scientists Use Sensors to Help Robot Animals Move More Naturally

Monday, September 30, 2019 - 16:29

Researchers from Virginia Tech have been developing computer algorithms and sensors to help robot animals move more naturally.

According to the TechTimes report, researchers from VT’s College of Engineering have launched four different studies to find out how they can draw inspiration from human and animal behavior for software development.

Kaveh Hamed and his colleagues aim to develop programs that give mechanical appendages more natural movement.

Hamed first got the idea of translating human and animal movement into algorithms for robots from experiences in his personal life. He watched how his one-year-old son learned how to walk and how his pet dog tended to switch from a run to a trot whenever it approached him. He said all these motions made him think about math.

"We do these things every day — we walk, run, climb stairs, step over gaps. But translating that to math and robots is challenging," he said.

One of the team's projects focuses on applying bipedal (two-legged) locomotion to powered prosthetic legs. The VT researchers are working on decentralized control algorithms that they could use to power a prosthetic leg model developed by colleagues at the University of Michigan.

Hamed and his team are also looking to leverage control algorithms, artificial intelligence, and sensors to improve the quadrupedal (four-legged) movement of robotic dogs.

The researchers noted that most two- or four-legged robots lack movement that matches their real-life inspirations. They believe that even state-of-the-art machines still cannot exactly mimic the agility of animals such as dogs, cheetahs, and mountain lions; there remains a fundamental gap between the locomotion seen in robots and that seen in their biological counterparts.

For their work, Hamed and his colleagues are developing control algorithms that help recreate the agility and stability of animal movement. These will be combined with sensors designed around the principles of basic animal biology.

The team also attached cameras and Light Detection and Ranging (LiDAR) sensors to give the robotic dogs machine vision, allowing them to better perceive potential obstacles around them.

With these features, Hamed and his team hope the robotic dogs will be able to respond appropriately to their environment. The sensors take measurements of the robots' motion and surroundings, while the onboard computers calculate the robust control actions needed to help the machines navigate from point A to point B.
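The sense-compute-act cycle described above can be sketched in simplified form. The function and parameters below are illustrative stand-ins, not the team's actual software: it reads a position estimate and a LiDAR scan, then computes a heading toward the goal and a speed that backs off near obstacles.

```python
import math

def control_step(position, goal, lidar_ranges, max_speed=0.5):
    """One cycle of a minimal sense-compute-act loop:
    read sensors, pick a heading toward the goal, and
    slow down when LiDAR reports a nearby obstacle."""
    # Heading from current position toward the goal (point A to point B)
    dx, dy = goal[0] - position[0], goal[1] - position[1]
    heading = math.atan2(dy, dx)

    # Conservative speed choice: scale down when the nearest return is close
    nearest = min(lidar_ranges)
    speed = max_speed if nearest > 1.0 else max_speed * nearest

    return heading, speed

# Usage: robot at the origin, goal 3 m ahead, a clear LiDAR scan
heading, speed = control_step((0.0, 0.0), (3.0, 0.0), [4.2, 5.0, 3.8])
print(heading, speed)  # 0.0 0.5
```

On a real robot this loop would run many times per second, with state estimation and a whole-body controller in place of these two lines of arithmetic.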

The robotic dogs used in the experiment have successfully mimicked several different gaits of real animals. So far, they have been able to amble, trot, and run even at sharp angles with better agility, balance, and speed than other robots.
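Quadruped gaits like the ones mentioned are commonly characterized by the relative phase at which each leg cycles. As an illustrative sketch only (the offsets below are conventional textbook values, not the team's parameters):

```python
# Relative phase offsets (fraction of a stride) for each leg, in the
# order: left-front, right-front, left-hind, right-hind.
GAITS = {
    "trot":  (0.0, 0.5, 0.5, 0.0),    # diagonal leg pairs move together
    "amble": (0.0, 0.5, 0.25, 0.75),  # legs spaced evenly, walk-like
    "bound": (0.0, 0.0, 0.5, 0.5),    # front pair, then hind pair
}

def leg_phases(gait, t, stride_period=0.6):
    """Phase in [0, 1) of each leg at time t for the chosen gait."""
    base = (t / stride_period) % 1.0
    return tuple((base + offset) % 1.0 for offset in GAITS[gait])

print(leg_phases("trot", 0.3))  # diagonal legs share the same phase
```

Switching gaits then amounts to swapping one tuple of offsets, which is one reason phase-based descriptions are popular in legged-robot control.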

The VT researchers are now working to incorporate AI into their control algorithms to help improve the machines' decision-making in real-world settings.



