They will be taking part in Track B of the DARPA Robotics Challenge, where they will use the PETMAN instead of developing their own robot. The tasks are shown in the chart above. TORC Robotics will take the lead in developing the algorithms the humanoid needs to perform during the competition.
TORC has developed autonomous navigation kits and components for autonomous vehicles. Teleoperation combined with autonomy is one technology of theirs that will improve the usability of the PETMAN robot. TU Darmstadt's Simulation, Systems Optimization and Robotics group will join the team; they have built autonomous robot teams and researched dynamic modelling and optimisation methods in simulation. Last but not least is the Human-Computer Interaction Group from Virginia Tech. The team consists of groups with complementary expertise that makes them well suited for Track B.
Sliding autonomy is a buzzword used widely in this competition, and some feel it will make the difference between the various teams. This matters because robots are still unable to perform robustly in the given scenarios. Human intervention is still required, and it is allowed during the competition. It becomes interesting when some form of autonomy is given to the robot, yet there are of course situations where human teleoperation is better suited. It's about striking a balance that depends on the capability of the robot. In Track B, all teams will use the PETMAN, which means they can concentrate on developing algorithms and teleoperation capabilities for it. This is certainly a scenario that makes more sense today, as we take the first step away from teleoperating "dumb" robots. Heaphy Robotics was an initiative by Willow Garage a while back (watch the video below) that allowed people from around the world to take control of a PR2 and perform tasks remotely. As seen in the video, you could either take full manual control of the robot or allow it some form of autonomy. That's a good example of sliding autonomy.
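The idea of sliding autonomy can be sketched in a few lines of code. The class and parameter names below are hypothetical, not from any team's actual software: the operator picks an autonomy level between full teleoperation and full autonomy, and the controller blends the human's command with the robot planner's command accordingly.

```python
# Minimal sketch of sliding autonomy: the operator sets an autonomy
# level, and the controller blends teleoperation commands with the
# robot's own plan accordingly. All names here are hypothetical.

class SlidingAutonomyController:
    def __init__(self, autonomy=0.5):
        # autonomy = 0.0 -> full teleoperation, 1.0 -> full autonomy
        self.autonomy = autonomy

    def set_autonomy(self, level):
        # Clamp to the valid range [0, 1].
        self.autonomy = max(0.0, min(1.0, level))

    def command(self, operator_cmd, planner_cmd):
        # Blend the two velocity commands by the current autonomy level.
        a = self.autonomy
        return tuple((1 - a) * o + a * p
                     for o, p in zip(operator_cmd, planner_cmd))

ctrl = SlidingAutonomyController(autonomy=0.0)
print(ctrl.command((1.0, 0.0), (0.0, 1.0)))  # operator in full control

ctrl.set_autonomy(1.0)
print(ctrl.command((1.0, 0.0), (0.0, 1.0)))  # planner in full control
```

In practice the interesting part is when and how the level slides, e.g. dropping back to teleoperation when the robot's confidence falls, but the blend above captures the basic mechanism.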
The DARPA Robotics Challenge is the next grand challenge that's really getting most of the robotics community on its toes. The goal of the program is to develop ground robots that can operate in degraded human-engineered environments, performing complex tasks such as opening a valve, climbing a ladder or even driving a car. This is in response to the difficulties faced by robots deployed during the Fukushima Daiichi nuclear incident, which revealed a lot about the current capabilities of robots on the market today. It's not that the robots are incapable; it's a matter of using the wrong tool for the job.
Team Steel, led by Christopher Atkeson, is one of the two CMU teams taking part in the DARPA Robotics Challenge. They have elected to take part in Track B (using a DARPA-provided robot, the PETMAN), while their counterparts (TARTAN RESCUE, led by Tony Stentz) have chosen Track A, where members of that team will develop a robot of their own.
They have worked on the Sarcos humanoid robot, a robot similar to the PETMAN, and will implement learning methods and other interesting techniques on their robot. This will definitely benefit the other team, given that CMU has fielded two teams in this competition.
Bartending is never easy, given the many different kinds of cocktails and concoctions you have to remember. On top of that, the amount of each liquid used makes or breaks a drink. The Webtender is a site with recipes for all the different cocktails a bartender might need. Sounds difficult for a human, but say we give this job to a robot…
Barbot 2013 is an annual showcase of robots designed to make and serve drinks. It's great to see the different ideas that have been used to make these drinks, and the reward is always a good cocktail. Why not? It's far more interesting than working on a dish-washing robot.
Simulation is important for robotics, but a realistic simulation is what is useful to most roboticists. In ROS, Gazebo and RViz are used for physics simulation and visualisation, but there's always room for something better. It took four years, but it's finally here: MORSE (Modular OpenRobots Simulation Engine) is the latest simulation tool available for ROS, among other robotics architectures and middleware. It's the most realistic simulation I have seen, and it lets you control how realistic you want the simulation to be.
It's based on open-source projects: Blender for 3D rendering and Bullet for physics simulation. Building on an established workflow lets the simulation sit on these two proven platforms. It supports single- and multi-robot simulation and has around 20 classes of sensors, from cameras and depth sensors to IMUs. It currently supports four open-source middlewares: ROS, YARP, Pocolibs and MOOS.
One interesting aspect of MORSE is that it can handle multiple robots in a joint simulation scenario, working together across a distributed network of simulation nodes. This is a server-client model, and it allows multiple systems to run the simulation smoothly through distributed computing.
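To give a feel for how MORSE is used: a scene is described in a small Python "builder" script that MORSE hands to Blender. The sketch below follows the style of the examples in the MORSE documentation, but the specific class and environment names (ATRV, Pose, MotionVW, 'indoors-1/indoor-1') should be treated as assumptions to check against your installed version; the script only runs inside MORSE itself, not as a standalone Python program.

```python
# Hypothetical MORSE builder script (run with `morse run <file>.py`),
# adapted from the style of the MORSE documentation examples.
from morse.builder import *

# One robot with a pose sensor and a simple velocity actuator.
robot = ATRV()                      # a small outdoor robot model

pose = Pose()                       # reports the robot's position
pose.translate(z=0.75)              # mount it on top of the chassis
robot.append(pose)

motion = MotionVW()                 # linear/angular velocity actuator
robot.append(motion)

# Expose every component over one middleware, e.g. ROS.
robot.add_default_interface('ros')

# Choose one of the bundled Blender environments.
env = Environment('indoors-1/indoor-1')
```

The same script scales to the multi-robot case by declaring more robots, which is where the distributed simulation nodes come into play.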
Neato has always been on my radar due to the low-cost, hackable LIDAR that came with the XV-11: a 360-degree scanning LIDAR with 1-degree resolution and a 10 Hz scanning speed. There is an entire website dedicated to getting the XV-11 to run under ROS. They have recently launched the XV Signature series. It looks really nice with its all-black casing, but better still, this new Neato has 50% more suction power with the same hardware and specifications.
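Part of what makes the XV-11 LIDAR so hackable is its simple serial protocol, which the community has reverse-engineered. The parser below is a sketch based on that community documentation; the exact byte layout (22-byte packets, 0xFA start byte, speed in 64ths of an RPM, four 4-byte readings) is an assumption to verify against your unit, and the checksum is not validated here.

```python
def parse_xv11_packet(pkt):
    """Parse one 22-byte XV-11 LIDAR packet into (rpm, readings).

    Packet layout (as documented by the XV-11 hacking community;
    treat the details as an assumption):
      byte 0      : start byte 0xFA
      byte 1      : index 0xA0..0xF9, encoding which 4-degree group
      bytes 2-3   : motor speed, little-endian, in 64ths of an RPM
      bytes 4-19  : four 4-byte readings (dist lo, flags + dist hi,
                    signal strength lo, signal strength hi)
      bytes 20-21 : checksum (not verified in this sketch)
    """
    if len(pkt) != 22 or pkt[0] != 0xFA:
        raise ValueError("not a valid XV-11 packet")
    base_angle = (pkt[1] - 0xA0) * 4           # first of 4 angles, degrees
    rpm = (pkt[2] | (pkt[3] << 8)) / 64.0
    readings = []
    for i in range(4):
        off = 4 + 4 * i
        invalid = bool(pkt[off + 1] & 0x80)    # bit 7: invalid-data flag
        dist_mm = pkt[off] | ((pkt[off + 1] & 0x3F) << 8)
        readings.append((base_angle + i, None if invalid else dist_mm))
    return rpm, readings

# A synthetic packet: index 0xA0 (angles 0-3), 300 RPM,
# first reading 1000 mm, the other three flagged invalid.
pkt = bytes([0xFA, 0xA0, 0x00, 0x4B,
             0xE8, 0x03, 0x00, 0x00] + [0x00, 0x80, 0x00, 0x00] * 3
            + [0x00, 0x00])
rpm, readings = parse_xv11_packet(pkt)
print(rpm, readings[0])
```

Ninety such packets cover a full 360-degree revolution, which is what the ROS drivers for the XV-11 assemble into a standard laser-scan message.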
Honestly, I am not really concerned about the vacuuming, but I am rooting for Neato to build better, faster and more efficient low-cost solutions for this robot's navigation. Hopefully, it will one day be a general-purpose base that we can hack to sweep, mop, fetch and cook at very affordable prices.