Monday, July 30, 2012

Kuratas

SUIDOBASHI HEAVY INDUSTRY has introduced its fully configurable Kuratas robot. Designed by artist/ironworker Kogoro Kurata, along with Wataru Yoshizaki (controls/electronics) and Yusuke Kitani (engineering), the Kuratas robot stands 12½ feet tall, weighs in at over 4 tons, and features roughly 30 hydraulic joints, which let it move its arms, four legs (each with its own wheel) and torso independently of one another. Powering the mech is a diesel engine (which engine, or what drive-train, is unknown) that gives it a top speed of just over 6 mph. That may be slow, but I doubt that anyone driving behind you will be honking their horn in a fit of road rage. The company can outfit the Kuratas, based on your preference, with multiple ‘less-than-lethal’ weapons systems that include dual Gatling guns, the Iron-crow grip claw, the Iohas rocket launcher, the Kuratas hand-gun and the Pilbunker rifle. Each weapon fires either biodegradable plastic BBs or water bombs, so don’t expect to do much damage to your target. This is intentional: the designers conceived the robot as a piece of art rather than a war machine. However, you could probably swap those out for actual weapons platforms and weaponry if you so desired.

Transformers assemble: The 13ft, four ton, super-robot is going on sale for £900,000 - but you have to pay extra for the cup holder
Kuratas, made by Suidobashi Heavy Industry, can be controlled either through the one-man cockpit or from the outside using any smartphone connected to the 3G network.
The robot, which is set to go on sale for £900,000, has around 30 hydraulic joints which the pilot moves using motion control.

As it is made to order, the style-conscious buyer will not have to worry about sticking to the grey exterior: it comes in 16 colours, including black and pink, and for an extra £60 they will sort you out with a cup holder.
Simple touch: all you need to remote-control the robot is a 3G connection, and it works particularly well with the iPhone's touchscreen.

If you are not the piloting kind of millionaire, Kuratas can be operated using what Suidobashi calls the ‘Master-Slave system’, where you control the robot’s movements from outside using any 3G-connected device, such as an iPhone.

‘Automatic alignment allows you to lock on to your enemy target. Kuratas will not allow any targets to escape.
'With the alignment set appropriately, the system will fire BBs when the pilot smiles.’
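The quoted behaviour amounts to a simple two-condition trigger: lock on to a target, then fire only while the pilot is smiling. A minimal sketch of that logic, in Python, is below. The real Kuratas control system is proprietary, so every name, threshold, and data shape here is an invented assumption for illustration:

```python
# Hypothetical sketch of Kuratas-style fire control. The lock-on and
# smile-detection internals are not public; this only models the
# decision logic described in the marketing copy.

def lock_on(targets, max_range=50.0):
    """Pick the closest target within range (or None). Each target is a
    dict with a 'distance' key; the 50 m range is an assumed value."""
    in_range = [t for t in targets if t["distance"] <= max_range]
    return min(in_range, key=lambda t: t["distance"]) if in_range else None

def should_fire(target_locked, smile_confidence, threshold=0.8):
    """Fire BBs only when a target is locked AND the pilot's smile score
    (0..1, e.g. from a face-detection pipeline) clears a threshold."""
    return target_locked and smile_confidence >= threshold
```

In a real pipeline the smile score would come from something like a camera-based smile classifier; here it is just a number handed in.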

Sources:
http://suidobashijuko.jp/
http://www.element14.com
http://www.dailymail.co.uk

Thursday, July 26, 2012

Immortality by 2045

A Russian research project aims to bring immortality to humans by the year 2045. A team of 30 scientists is working on the project, called Avatar, with the hope of transplanting human brains into robot bodies.
”This project is leading to immortality,” says Dmitry Itskov, the Russian entrepreneur who heads the hi-tech ‘Avatar’ research project.
“You have the ability to finance the extension of your own life up to immortality. Our civilization has come very close to the creation of such technologies: it’s not a science-fiction fantasy. It is in your power to make sure that this goal will be achieved in your lifetime,” Itskov told Forbes magazine.
He has contacted a list of billionaires with a proposal for funding his quest for immortality – which Itskov refers to as “cybernetic immortality” and the “artificial body.”
The initiative is opening its San Francisco office this summer, and will be launching a social media project connecting scientists around the world.
“The 2045 team is working towards creating an international research center where leading scientists will be engaged in research and development in the fields of anthropomorphic robotics, living systems modeling and brain and consciousness modeling with the goal of transferring one’s individual consciousness to an artificial carrier and achieving cybernetic immortality,” Itskov stated. “Such research has the potential to free you, as well as the majority of all people on our planet, from disease, old age and even death.”
Itskov envisages surgically ‘transplanting’ a human consciousness into a robot body within 10 years.
He hopes to then ‘upload’ minds without surgery, leaving human bodies as empty husks as their owners ‘live on’ inside robots.
The project is called Avatar after the James Cameron movie, set far in the future, in which human soldiers use mind control to inhabit the bodies of human-alien hybrids as they wage war against the inhabitants of a distant world.

What's the Latest Development?

Russian entrepreneur Dmitry Itskov is courting the world's richest individuals to help him conquer death. Itskov, a 33-year-old, can afford to wait, but the billionaires he approaches have an average age of 66, meaning they may be looking for shorter-term solutions to living longer—much longer. "Itskov expects the first fruits in about a dozen years, when a human brain is to be transplanted into a robot body. The resulting 'avatar,' as he calls it, will 'save people whose body is completely worn out or irreversibly damaged.'" Called the 2045 Initiative, it recently held a meeting in Moscow and opened office space in San Francisco.

What's the Big Idea?

Preserving the brain and placing it in a host container, so that the spark of consciousness could outlive the body's organ failure, may be "just a way station to Nirvana, which would ultimately involve downloading the brain’s contents into a computer." The concept of melding man and machine, and thereby preserving consciousness past physical death, is known as the Singularity. "A brand new body can get crushed by a 500-pound anvil that may fall on it, as anvils are wont to do. Once it’s downloaded into a computer, your mind is safe from anvils, pandemics, and even planet-destroying asteroids (as soon as it's mirrored onto interplanetary networks)."

The "2045" Initiative


2045 Initiative's Vision for Further Development of Humankind

The world is on the verge of global change. The rate of globally significant events, and that of discoveries and crises, is growing exponentially. We are facing the choice: to fall into a new Dark Age – into affliction and degradation – or to find a new model for human development and create not simply a new civilization, but a new mankind.


Sources:
http://bigthink.com
http://2045.com
http://www.newsoxy.com

Thursday, July 19, 2012

PR2 - Robots for Humanity


 
PR2 from Willow Garage is now able to help people with disabilities perform everyday tasks such as manipulating objects, shaving, and more. The video below illustrates the scope and the results of the Robots for Humanity project that Willow Garage, the Healthcare Robotics Lab at Georgia Tech, and Henry and Jane Evans are pioneering.


Assistive Mobile Manipulation for Older Adults at Home

Motivation

There is a growing need in society to enable older adults to remain in an independent living environment. Many older adults fear losing their independence and being required to move to an assisted living facility. From a societal perspective, it is cost-effective to support older adults' preference to age in place. The economic implications of transitioning to full-time residential care settings are substantial, both to individuals and to society. Given current demographics, these costs are projected to increase exponentially. Older adults living in their own homes may be faced with situations in which there is a mismatch between the demands of their daily environment and their capabilities. These situations generally result from both increased demands (e.g., use of new medical devices) and deficits in the capabilities of the individual (e.g., age-related changes in cognition, perception, or movement control). There is great potential for robotics to support the needs of older adults -- either directly or by supporting the activities of professional caregivers (e.g., nurses or physical therapists) who work in the homes of older adults.

Approach

The proposed research will consist of two closely integrated thrusts: one devoted to human-robot interaction and the other focused on software development. Both thrusts will be directed toward the development of assistive capabilities for the PR2 robot, with an emphasis on home care for older adults. These two thrusts are highly synergistic. The human-robot interaction (HRI) thrust will help ensure that the software development is closely connected to real-world needs, and the software development thrust will provide capabilities that both inform and enable cutting-edge studies of human-robot interaction.

An Interdisciplinary Team

Prof. Charlie Kemp, a leading researcher in the area of assistive mobile manipulation, will serve as the PI for the overall project and thrust leader for software development. Prof. Wendy Rogers, who is a leading researcher on technology and human factors for older adults, will be the thrust leader for human-robot interaction. They will lead a cross-disciplinary team of 17 researchers. Each of the four Co-PIs brings important expertise to the project. Prof. James M. Rehg is a leading expert in computer vision and machine learning, and is the Associate Director of Research in the Center for Robotics and Intelligent Machines. Prof. Andrea Thomaz is a pioneer in HRI and socially guided machine learning. Dr. Tracy Mitzner has almost a decade of experience conducting research on human factors and aging, with the goal of finding ways that technology can support aging. Brian Jones, the Director of the Aware Home Research Initiative, will provide supported access to the Aware Home, a free-standing home on the Georgia Tech campus that will house the PR2 for two 3-month periods so that we may conduct HRI studies and software evaluations in a realistic home environment.
Sources:
http://www.hsi.gatech.edu
http://www.robotshop.com
http://www.ros.org
http://www.willowgarage.com/pages/robots/pr2-overview 


Saturday, May 26, 2012

Heaphy Project

When we think about cloud robotics, we usually mean off-loading computationally intensive tasks from a robot to a more powerful external resource, which processes the data and sends an answer, or instructions on what to do, back to the robot. We usually envision this computational outsourcing being done by powerful server farms, but Tim Field from Willow Garage has a different idea and devised a system that allows these complex tasks to be offloaded to other humans.
The Heaphy Project allows humans to help ROS-enabled robots perform complex tasks via teleoperation. This means that if you need to empty your dryer, you could ask your robot to do it, and it, in turn, will hire a human to help it with the chore. This might seem a convoluted path to follow in order to have humans perform chores, but the long-term goal is to have robots learn these tasks from their human operators.
The video below explains how this project works and how you can get started teleoperating robots via Amazon Mechanical Turk.
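The dispatch pattern at the heart of this idea is small enough to sketch: the robot performs a task itself if it has a learned skill for it, and otherwise posts the task to a queue of human teleoperators. The class and names below are invented for illustration; the actual Heaphy Project runs on ROS and Amazon Mechanical Turk:

```python
# Toy sketch of the Heaphy-style fallback: autonomous execution when a
# skill exists, human teleoperation otherwise. All names are assumptions,
# not the project's real API.

class HumanQueue:
    """Stand-in for a crowdsourcing backend such as Mechanical Turk."""
    def __init__(self):
        self.pending = []

    def post(self, task):
        self.pending.append(task)
        return f"teleop:{task}"        # a human operator will take over

def dispatch(task, learned_skills, queue):
    """Route a task to the robot itself or to a remote human helper."""
    if task in learned_skills:
        return f"auto:{task}"          # the robot performs it on its own
    return queue.post(task)            # hire a human for the chore
```

The long-term goal described above would close the loop: recorded teleoperation sessions become training data, and the task eventually migrates from the queue into `learned_skills`.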
Source: http://www.robotshop.com

Tuesday, May 22, 2012

Microbots Made of Bubbles


We're used to thinking of robots as mechanical entities, but at very small scales, it sometimes becomes easier to use existing structures (like microorganisms that respond to magnetic fields, or even swarms of bacteria) instead of trying to design and construct one (or lots) of teeny tiny artificial machines. Aaron Ohta's lab at the University of Hawaii at Manoa has come up with a novel way of creating non-mechanical microbots quite literally out of thin air, using robots made of bubbles with engines made of lasers.

To get the bubble robots to move around in a saline solution, a 400 mW, 980 nm (that's infrared) laser is shone through the bubble onto the heat-absorbing surface of the working area. The fluid that the bubbles are in tries to move from the hot area where the laser is pointing towards the colder side of the bubble, and this fluid flow pushes the bubble towards the hot area. Moving the laser to different sides of the bubble gives you complete 360-degree steering, and since the velocity of the bubble is proportional to the intensity of the laser, you can go as slow as you want or as fast as about 4 mm/s.
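The control law this describes is simple: the bubble drifts toward the laser spot with a speed proportional to laser power, up to the roughly 4 mm/s ceiling. A toy simulation of one timestep is below; the gain constant is made up for illustration, as the paper's actual calibration isn't given here:

```python
import math

# Toy model of a laser-driven bubble microrobot (units: mm, s, W).
MAX_SPEED = 4.0   # mm/s, the approximate maximum reported by the Ohta lab
GAIN = 10.0       # (mm/s) per watt of laser power -- an assumed constant

def step(bubble, laser, power_w, dt):
    """Advance the bubble's (x, y) position one timestep toward the laser
    spot: speed is proportional to power, capped at MAX_SPEED."""
    dx, dy = laser[0] - bubble[0], laser[1] - bubble[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return bubble                  # already sitting on the spot
    speed = min(GAIN * power_w, MAX_SPEED)
    move = min(speed * dt, dist)       # never overshoot the spot
    return (bubble[0] + move * dx / dist, bubble[1] + move * dy / dist)
```

Steering is then just a matter of where you place the laser spot relative to the bubble each frame, which is exactly the 360-degree control described above.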
This level of control allows for very fine manipulation of small objects, and the picture below shows how a bubble robot has pushed glass beads around to form the letters "UH" (for University of Hawaii, of course):

Besides being able to create as many robots as you want of differing sizes out of absolutely nothing (robot construction just involves a fine-tipped syringe full of air), the laser-controlled bubbles have another big advantage over more common microbots in that it's possible to control many different bubbles independently using separate lasers or light patterns from a digital projector. With magnetically steered microbots, they all like to go wherever the magnetic field points them as one big herd, but the bubbles don't have that problem, since each just needs its own independent spot of light to follow around.
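The independence advantage can be made concrete: each bubble simply chases its own light spot, whereas magnetically steered microbots all respond to the same global field. A minimal sketch, assuming the same drift-toward-the-spot behaviour as above (the function and data shapes are invented for illustration):

```python
import math

def step_swarm(bubbles, spots, speed, dt):
    """Move each bubble toward its OWN assigned light spot.

    bubbles: dict name -> (x, y) positions in mm
    spots:   dict name -> (x, y) target light spots (one per bubble)
    Unlike a magnetic field, which would herd every bubble the same way,
    each bubble here is steered independently.
    """
    out = {}
    for name, (x, y) in bubbles.items():
        tx, ty = spots[name]
        dx, dy = tx - x, ty - y
        d = math.hypot(dx, dy)
        if d == 0:
            out[name] = (x, y)
            continue
        m = min(speed * dt, d)
        out[name] = (x + m * dx / d, y + m * dy / d)
    return out
```

With a digital projector supplying the spots, each entry in `spots` corresponds to one independently addressable patch of light.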

The researchers are currently investigating how to use teams of tiny bubbles to cooperatively transport and assemble microbeads into complex shapes, and they hope to eventually develop a system that can provide real-time autonomous control based on visual feedback. Eventually, it may be possible to conjure swarms of microscopic bubble robots out of nothing, set them to work building microstructures with an array of thermal lasers, and then when they're finished, give each one a little pop to wipe it completely out of existence without any mess or fuss.

Cooperative Micromanipulation Using Optically Controlled Bubble Microrobots by Wenqi Hu, Kelly S. Ishii, and Aaron T. Ohta of the Department of Electrical Engineering, University of Hawaii at Manoa, was presented last week at the 2012 IEEE International Conference on Robotics and Automation in St. Paul, Minn.

Sources:
http://spectrum.ieee.org
http://www-ee.eng.hawaii.edu

Thursday, May 17, 2012

Cloud Robotics

An important announcement for the field of robotics was made at Google I/O 2011 that complements the announcement of the Android Open Accessory Kit.
As shown in the video below about Cloud Robotics (this presentation is very interesting and explores somewhat involved robot programming), there is a new implementation of ROS, the popular operating system for robots, that runs directly on Android. Even PR2 was invited to the talk!

This means that any ROS-compatible robot (including Arduino-based robots) can be controlled via an Android phone. This interoperability and the power of cloud computing could provide robots, in the future, with better abilities, especially when facing unexpected situations. With the power of the cloud, robots can offload complex computations and thus require less electrical power for computation. They could also learn new skills on the fly without needing to have all possible skills installed at once.
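The offloading pattern described here is worth sketching: try the heavy computation off-board, and degrade gracefully to a cheap on-board routine when the cloud is unreachable. This is an illustrative pattern only, not Google's or ROS's actual API; the function names are assumptions:

```python
# Illustrative cloud-offload pattern with a local fallback. A real robot
# would call a remote recognition service; here the callables are injected
# so the control flow can be shown on its own.

def recognize(image, cloud_call, local_call):
    """Prefer the cloud recognizer (big model, off-board power budget);
    fall back to a small on-board routine if the connection fails."""
    try:
        return cloud_call(image)       # heavy computation runs off-board
    except ConnectionError:
        return local_call(image)       # lightweight on-board fallback
```

The same shape covers "learning skills on the fly": the cloud call can fetch and run a skill the robot has never stored locally.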

We are also happy to see that this technology is accessible to everyone through the use of Arduino and open hardware. RobotShop’s MyRobots.com initiative is compatible with this vision and also aims to give robots, and robot owners, the power of the cloud through open hardware and software.

Wednesday, March 21, 2012

Flying Robotic Swarm of Autonomous Quadrotors

Wikipedia:
A quadrotor, also called a quadrotor helicopter or quadrocopter, is a multicopter that is lifted and propelled by four rotors. Quadrotors are classified as rotorcraft, as opposed to fixed-wing aircraft, because their lift is generated by a set of revolving narrow-chord airfoils.

In the last couple of years quadrotors have become a leading platform for aerial robotics. Precision control, maneuverability, and the ability to pack on a payload make quadrotors the go-to choice when designing an aerial robot.

ArduIMU Quadrotor Drone


Stanford STARMAC


UPenn GRASP Lab Aggressive Quadrotor


CrazyFlie Micro Quadrotor


Musical Quadrotor


Quadrotors Playing Catch


Construction by Teams of Quadrotor Drones


Kinect-powered autonomous quadrotor


Here come the stylish Scandinavian flying spycams


GRASP Lab Gag Reel

GRASP quadrotors
KMel Nanoquadrotors
These acrobatic robots can launch themselves through rings, duck and weave around obstacles, and even fly through your bedroom window. Hell, they can construct your bedroom window. Flying quadrotors first developed at UPenn’s GRASP Lab by Daniel Mellinger have reached the big time.

Mellinger and his colleagues Prof. Vijay Kumar and Alex Kushleyev have created a swarm of 20 “nano quadrotors” – miniature versions of the stunt bots that can fly in complex 3D formations. The coordinated acrobatics of these quadrotors is truly amazing and exciting to behold (don’t miss the video below). Not only has the quadrotor team outdone itself with this latest demonstration, they’ve apparently formed a new company, KMel Robotics, around the devices. Could swarming flying bots be coming soon to a store near you?



Mellinger has been developing various versions of the quadrotor system at UPenn for several years. In previous demonstrations he and Vijay Kumar have shown larger incarnations of the quadrotors buzzing around at high speeds, revving up their engines to “leap” through obstacles, and dashing about like mad hornets. The swarm showcased in the video above adds the element of coordinated movement across a large number of the robots, up to 20, and in complex three dimensional formations. While the scale of these movements is really impressive (3 million views doesn’t lie) it should be noted that the swarm relies upon external sensors for tracking position and movement. These robots aren’t quite smart or aware enough to fly so smoothly entirely on their own. Vicon cameras (the boxes seen along the walls in the video) provide a central control computer with necessary data to help the robots keep from smacking into one another. Simpler instances of coordination were seen in earlier demonstrations when quadrotors worked together to build basic structures out of prefabricated beams. In that light this nano swarm should be seen as a natural progression of the technology rather than a quantum leap.
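The role the Vicon cameras play can be sketched concretely: with every quadrotor's position known to a central computer each frame, the controller can run a pairwise separation check before vehicles get close enough to collide. The code below is an illustrative simplification (the names and the 0.3 m separation threshold are assumptions, not GRASP Lab values):

```python
import itertools
import math

# Sketch of the central safety check that external tracking enables:
# given every tracked quadrotor's 3D position, flag any pair whose
# separation falls below a minimum distance.

def too_close(positions, min_sep=0.3):
    """Return pairs of quadrotor names closer than min_sep metres.

    positions: dict name -> (x, y, z) in metres, e.g. from a motion
    capture system. Requires Python 3.8+ for math.dist.
    """
    bad = []
    for (a, pa), (b, pb) in itertools.combinations(positions.items(), 2):
        if math.dist(pa, pb) < min_sep:
            bad.append((a, b))
    return bad
```

On-board autonomy would require each vehicle to estimate these positions itself, which is exactly what this swarm does not yet do.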



On a related note, previous demonstrations from Mellinger also revealed that the larger quadrotors were purchased from Ascending Technologies, not produced in-house at UPenn. While the latest video doesn’t address this issue, the nano quadrotors bear a remarkable resemblance to AscTec’s Hummingbird ResearchPilot drone. Undoubtedly Mellinger and his colleagues augmented the devices to work with their tracking and coordinating system, but the robots themselves may not be proprietary to the team.

Which raises some interesting questions about what the new startup, KMel Robotics, will focus on. The KMel site has almost no information beyond the statement that they are preparing to tell the world more (that’s actually very understandable considering the enormous interest from the press and public the latest video has received). However, I will venture two guesses on the subject. First, while you can go to the AscTec site today and rent one of their quadrotors for aerial photography or acquire one for research, the capabilities developed by KMel are clearly above and beyond what the robot producer provides. That means KMel is likely going to focus on the networking/tracking/coordination side of swarm robotics, and that the quadrotors themselves may not be their actual product – just a test platform. Can you imagine similar feats of piloting performed with full scale military drones or surveillance UAVs? Definitely something worth forming a company around.

My second guess? Daniel Mellinger’s online handle is “The Dmel”, which probably means that the ‘K’ in KMel Robotics is for Professor Vijay Kumar. Singularity Hub is glad to see that partnership continue, along with their impressive work on flying drones. Kudos to KMel for pushing the envelope and showing the world that modern robots have evolved beyond their awkward origins and can now move with the beauty of a perfectly synchronized ballet.
Sources:
http://kmelrobotics.com/ 
http://singularityhub.com/
http://blog.makezine.com/

Hawk Wireless Networked Autonomous Humanoid Mobile Robot with Dual Arms

Hawk is designed and built on the i90 robot base, featuring two large arms (Hawk Arms), a dual-camera animated head (Hawk Head), an indoor GPS navigation system, and an auto-docking/recharging station.


  • Dual arms with 6 joints (DOF) + 2DOF gripper, reaching 60cm (2ft), with max lifting weight of 800g (optional 1kg)
  • 6DOF animated head with dual 640x480 color cameras
  • 3.5 inch color display on chest, playing video (.wmv), audio and displaying images
  • Overall height of 1.4m; Dimension: 43cm(L) x 38cm(W) x 140cm(H)
  • Navigation and localization providing collision-free point-to-point autonomous navigation
  • Vision-landmark-based indoor localization (indoor GPS, position/orientation): the sensor and the landmarks together provide precise position and direction information covering every inch of the floor
  • Auto-docking and recharging station
  • Fully wireless networked 802.11g
  • OS independent application development tools
  • Navigation sensors including 6 sonar and 10 IR range sensors
  • Max speed 0.75m/sec.
  • Comprehensive circuit protection
  • High resolution pan-tilt-zoom (10X) camera
  • Max payload of 10kg (optional 40kg) with body weight of 21kg
  • Tele-operation and remote monitoring
  • Extended operating time: 2 hours nominal operation per recharge
  • Upgrade options: Laser Scanner; Hand Camera; Power and battery systems for 4, 8 hours operation time are available
Movies:
  • Auto Docking and Charging
  • Autonomous Patrol and Door Entry
  • Playing Drum
  • Serving
  • Teach and Learn
  • Serve a Drink

Usage:
  • Showroom, Tradeshow & Museum as Autonomous Tour Guide/Receptionist Robot
  • Robotic Development Platform for Academic and Research Institutes
Source: http://www.drrobot.com/