Guess who is learning on the web?

A robot servicing a hospital room shares the "knowledge" it is acquiring and uses that of robots that were in that room before it. Credit: TU/e

The web is a space where most activity is the work of bots, software agents that continuously roam cyberspace. What you find through Google is the result of bots that have discovered that piece of information. But at the edges of the web, where terminals and apps are, it is clearly us, human beings, who initiate a search ...

Well, it looks like a new breed of surfers will appear pretty soon: robots. This is what research carried out in several European research institutes and universities, including KTH, one of the EIT ICT Labs partners, is pointing at.

The researchers have developed a platform, RoboEarth, through which robots can learn from one another. One robot enters a hospital room and observes the objects that are there. The result of this observation is stored on the platform, where it can be accessed by another robot that later enters that room, giving it "previous" experience. Of course, it (he?) will have to treat that experience as a starting point, since some new objects may be present, others may have disappeared, and others may have been moved. Still, it is faster to look for differences than to reconstruct an understanding of the whole environment from scratch. This is similar to what happens to us. It is much easier to navigate a place we have been before, even though many details may have changed since our last visit, than to navigate a new environment.
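The idea of sharing observations and diffing against them can be sketched in a few lines. This is a toy illustration only, not the actual RoboEarth API: the shared store, the function names, and the room data are all hypothetical.

```python
# Toy sketch of a shared "room map" and a diff against prior knowledge,
# loosely inspired by the article's description of RoboEarth.
# All names and data here are hypothetical, not the real RoboEarth interface.

shared_map = {}  # cloud-side store: room id -> {object name: position}

def upload(room, objects):
    """A robot shares what it observed in a room."""
    shared_map[room] = dict(objects)

def diff_against_prior(room, observed):
    """A later robot compares its own observation with the shared record,
    so it only has to reason about what changed."""
    prior = shared_map.get(room, {})
    added = {o: p for o, p in observed.items() if o not in prior}
    removed = {o: p for o, p in prior.items() if o not in observed}
    moved = {o: (prior[o], p) for o, p in observed.items()
             if o in prior and prior[o] != p}
    return added, removed, moved

# First robot records the room.
upload("room_12", {"bed": (0, 0), "iv_stand": (1, 2), "chair": (3, 1)})

# Second robot enters later and sees a slightly different scene:
# the chair is gone, a tray has appeared, and the IV stand has moved.
added, removed, moved = diff_against_prior(
    "room_12", {"bed": (0, 0), "iv_stand": (2, 2), "tray": (4, 0)})
```

The point of the sketch is the asymmetry the article describes: the second robot processes three small dictionaries of changes instead of rebuilding its model of the room from nothing.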

Besides sharing what they have learnt, the robots also share "how" they have learnt it, and this knowledge is passed on to other robots that consult the web.

According to the researchers, they have created a sort of Wikipedia for robots to consult, letting them learn from other robots' experience. Something we do once in a while, the robots could be doing continuously, hence learning new things much, much faster than we do. And the robots don't need to be very smart, as we have to be to learn new things, since RoboEarth lives in cyberspace, and a robot can leverage a cyberspace brain that grows in knowledge every moment as more robots share their experience and more areas are captured. The processing of information need not be done entirely in the robot's "head". Most of it can take place in the RoboEarth cloud.

This result is impressive because it sets in motion a process of ever-increasing learning that, given the number of robots, the diversity of environments they operate in, and the speed at which learning can be processed, may rapidly lead to the singularity, the point at which machines become smarter than humans. Some say this singularity is just two decades away; some venture that it is even closer. Hardly anyone still claims that such a point will never be reached.

Amazing and ... scary!

Author - Roberto Saracco

© 2010-2020 EIT Digital IVZW. All rights reserved.