Your weekly selection of awesome robot videos
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (send us your events!):
IEEE CASE 2017 – August 20-23, 2017 – Xi’an, China
IEEE ICARM 2017 – August 27-31, 2017 – Hefei, China
IEEE RO-MAN – August 28-31, 2017 – Lisbon, Portugal
CLAWAR 2017 – September 11-13, 2017 – Porto, Portugal
FSR 2017 – September 12-15, 2017 – Zurich, Switzerland
Singularities of Mechanisms and Robotic Manipulators – September 18-22, 2017 – Johannes Kepler University, Linz, Austria
ROSCon – September 21-22, 2017 – Vancouver, B.C., Canada
IEEE IROS – September 24-28, 2017 – Vancouver, B.C., Canada
RoboBusiness – September 27-28, 2017 – Santa Clara, Calif., USA
Drone World Expo – October 2-4, 2017 – San Jose, Calif., USA
Let us know if you have suggestions for next week, and enjoy today’s videos.
Sadly, not everyone can afford a lab full of robots to experiment with. Georgia Tech has taken pity on the rest of us, and is opening up a Robotarium designed to give (almost) anyone remote access to a swarm of robots for research:
We’ll be checking back in with the Robotarium just as soon as the rest of the world starts using it for research. And, we hear a rumor that the plan is to outfit all of the robots with flamethrowers, so that should be fun too.
Ziquan Lan, Mohit Shridhar, David Hsu, and Shengdong Zhao from the National University of Singapore wrote in to share a much, much better way of autonomously taking pictures with drones:
You can read the full paper, which won Best Systems Paper Award in Memory of Seth Teller at RSS this year, at the link below.
When doing experiments with a PR2, always remember to wear a helmet:
But seriously, it’s always good to see PR2 still out there, working hard and getting stuff done.
I won’t pretend to know what Dota stands for, or what happened to Dota 1 (something bad, I assume?), but I guess Dota 2 is a big deal, because OpenAI seems pretty stoked to have programmed an AI that can beat the best human players in a 1v1:
Here’s the match, along with a bunch of sports-y prelude that I assume is very much making fun of itself:
[ OpenAI ]
Alex runs the YouTube channel Super Make Something, and in this episode, he makes a super continuous line drawing machine. In other words, a robot that can draw better than you:
Alex, who has a fresh Ph.D. in robotics from CMU and also spent some time at HEBI Robotics, now works for a company that supports robotics projects for certain government agencies with bunches of letters for names.
Here’s a video featuring CMU’s Sandstorm autonomous car, shot before the first DARPA Grand Challenge and shown during the 2004 Intel Developer’s Keynote:
[ CMU ]
A team of engineering students from the University of Antwerp is building a humanoid robot that can translate speech into sign language. Sponsored by the European Institute for Otorhinolaryngology, Project Aslan aims to ease the worldwide shortage of sign language interpreters. The goal is not to replace human interpreters, but to provide support when they aren’t available. Once the designs are optimized, the robot could also help address the root of the problem: sign language courses are scarce, which limits how many interpreters can be trained. By signing alongside a human teacher, the Aslan robot could expand the capacity of those classes.
The first prototype featured 25 3D-printed parts, which took a total of 139 hours to print. In addition to those parts, 16 servo motors, 3 motor controllers, an Arduino Due, and a number of other components were needed to fully assemble the robot; assembling the complete arm takes around 10 hours. The Aslan robot receives messages over a local network, and checks for updated sign languages from all over the world. Users connected to the network can send text messages, which then drive the hand, elbow, and finger joints to sign them out.
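The message-to-joint pipeline described above can be sketched roughly as a lookup from letters to servo poses. To be clear, this is a hypothetical illustration, not the Aslan team's code: the pose table, servo layout, and function names below are all invented for the example (the real robot coordinates 16 servos from actual sign-language data).

```python
# Hypothetical sketch of Aslan-style fingerspelling: each letter of an
# incoming text message maps to a pose, i.e. a set of servo angles.
# The three-servo pose table here is invented purely for illustration.

# Simplified pose table: letter -> (thumb, index, middle) angles in degrees.
POSES = {
    "a": (90, 0, 0),
    "b": (0, 90, 90),
    "c": (45, 45, 45),
}

def message_to_poses(message):
    """Turn an incoming message into a sequence of servo poses,
    skipping any characters the pose table doesn't cover."""
    return [POSES[ch] for ch in message.lower() if ch in POSES]

# One pose per recognized letter; unknown characters are dropped.
print(message_to_poses("Cab!"))
```

In a real system each pose would be streamed to the motor controllers with timing and interpolation between letters, but the core idea — text in, joint targets out — is the same.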
Will wrote in to share this video from Sony, which (as he points out) is interesting because it was published just a few weeks ago but features AIBO and QRIO as sort of the flagship expression of the current (public) state of Sony’s robotics and AI. The auto-translated subtitles do a pretty good job, fortunately:
AIBO was discontinued in 2006, and QRIO never made it to market, but well over a decade later these are still some of the most sophisticated consumer robots ever sold. It’s entirely possible that an AIBO ERS-7 from 2005 would give the forthcoming generation of social home robots a run for their money in terms of overall capabilities and interactivity, which is bananas, considering how long ago 2005 is in robotics years. I doubt it’s worth reading much into what Sony is planning here, but we’d love to see more top-notch consumer robotics hardware from them.
[ Sony ]
This is what Cassies do when they’re not doing anything:
And this is why Cassies do this, according to Agility Robotics:
The “looking around” behavior was added for safety reasons: the balancing control loop runs so quickly that the robot appears stationary to many observers. It’s a bit like personal space for automobiles – you’re comfortable standing up against a parked car, but stay further away if it’s obvious that the engine is on. Plus, as you observe, it looks pretty cool. To answer your question, Cassie is always actively balancing unless it’s in the crouched transport position. The battery is managed by the controller, so Cassie should never fall over as a result of a low battery. In the event of a commanded e-stop or if an internal safety engages, the robot will cut power completely and collapse in a not very graceful heap.
And here’s one more video of break-in testing, because I cannot get enough of this robot:
[ Agility Robotics ]
“Imagine a trip to the future, a car that drives itself…”
I had to wait another year for my first ride in a self-driving car, which is still pretty early, I think:
[ CMU ]
Angela Schoellig’s undergrad researchers are working on some very cool drone stuff:
[ UTIAS ]
Here’s a KUKA IIWA stapled to a Ridgeback to make a mobile manipulator, brought to you by Clearpath Robotics:
I’m not sure what this robot is actually designed to do, but I’m hoping that it mostly just chases people down and gives them high fives.
[ Clearpath ]
Robot finger monkey:
Did… Did that robot finger monkey just fart? If so, totally worth $14.99 at your local Toys R Us.
[ Wowwee ]
Free Gait is a software framework for the versatile, robust, and task-oriented control of legged robots. The Free Gait interface defines a whole-body abstraction layer to accommodate a variety of task-space control commands such as end effector, joint, and base motions. The defined motion tasks are tracked with a feedback whole-body controller to ensure accurate and robust motion execution even under slip and external disturbances. Applications of this framework include intuitive teleoperation of the robot, efficient scripting of behaviors, and fully autonomous operation with motion and footstep planners.
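The "whole-body abstraction layer" idea — expressing motion as high-level task-space commands and letting a controller track them — can be illustrated with a toy sketch. This is not the actual Free Gait API (see the GitHub repo for that); the class and function names below are invented to show the pattern of dispatching on task type.

```python
# Conceptual sketch of a task-space command abstraction, in the spirit of
# Free Gait's interface. All names here are invented for illustration;
# the real framework defines its own action/task message formats.
from dataclasses import dataclass

@dataclass
class EndEffectorTarget:
    leg: str
    position: tuple  # (x, y, z) target in meters, base frame

@dataclass
class BaseTarget:
    height: float    # desired base height in meters

def describe(task):
    """Stand-in for a whole-body controller dispatching on task type."""
    if isinstance(task, EndEffectorTarget):
        return f"move {task.leg} foot to {task.position}"
    if isinstance(task, BaseTarget):
        return f"shift base to height {task.height}"
    raise TypeError("unknown task type")

# A scripted behavior is just a sequence of heterogeneous tasks.
steps = [EndEffectorTarget("LF", (0.3, 0.2, 0.0)), BaseTarget(0.45)]
for step in steps:
    print(describe(step))
```

The payoff of this layering is that teleoperation GUIs, behavior scripts, and autonomous planners can all emit the same task objects, while the feedback controller handles the messy tracking underneath.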
[ GitHub ]
Dr. Ryan Eustice, TRI’s Vice President of Autonomous Driving, presents technical challenges on the road to autonomous vehicles at the Center for Automotive Research’s (CAR) Management Briefing Seminars (MBS), July 31, 2017.
[ TRI ]
For the first time in the history of SparkFun’s Autonomous Vehicle Competition (AVC), we’ll have a separate 1-pound Plastic Ants division. These are combat bots made mostly out of plastic (the intention is for competitors to 3D print their chassis). Combat-bot veteran Jamie Leben helps explain the different types of bot weapons and chassis, and gives us some tips for making them out of plastic.
[ SparkFun AVC ]
On August 15th, WeRobotics hosted a webinar featuring Alexander Fraser entitled “Witchcraft and Explosions: how do community perceptions to humanitarian drones differ in Tanzania and Malawi?” Alexander, a BSc Geography undergraduate at the University of Edinburgh, spoke about his recent experiences shadowing the WeRobotics team in Zanzibar and a UNICEF team in Kasungu, conducting interviews with community elders, chiefs, matriarchs, village citizens, and district officials to assess their understandings of drones, or ‘ndege’. The research set out to identify which drone applications would benefit the local communities, what fears residents have regarding drone use, and which sites would be considered inappropriate for flying a drone.
[ WeRobotics ]