Friday, June 3, 2016

Tuesday, October 27, 2015

An album from Maker Faire Rome 2015

Thanks to all the visitors who joined Geduino at Maker Faire Rome 2015!

The Geduino Team at Maker Faire Roma

With Giovanni Burresi of the UDOO Team (which produces the SBC used in Geduino)

Kids having fun playing with Geduino

With Gaël Langevin and his 3D printed humanoid robot

Geduino in the arena


For everyone who missed the "Introduction to ROS and its Navigation Stack" talk at Maker Faire Rome 2015, the recorded video and slides (in Italian only) are now online!

You can download the slides as a PDF here (the original Keynote file is here).

The playlist of the complete talk is available on YouTube.

I hope this can be helpful to you!

Monday, October 26, 2015

New PowerBoard and GeduinoShield


The PowerBoard and GeduinoShield PCB REV2 have finally arrived from Fritzing. You can download the original files from:


In order to open them you need the Fritzing software, which can be freely downloaded from this link. The PCBs shown in the picture were built by FritzingLab.

Thursday, September 10, 2015

Join Geduino at MakerFaire Rome 2015


The Geduino Foundation will be at MakerFaire Rome 2015. You will have the chance to see a Geduino demonstration and meet the Geduino developers.

You can also attend the Introduction to ROS and its Navigation Stack talk. We will start from the ROS basics and work up to its Navigation Stack, exploring everything you always wanted to know but never found on the net.

Monday, May 18, 2015

A roadmap to Social Autonomous Robot

Social robot
From Wikipedia, the free encyclopedia

A social robot is an autonomous robot that interacts and communicates with humans or other autonomous physical agents by following social behaviors and rules attached to its role.


The Navigation Stack is the basis of a mobile robot, but a robot that is only able to move is not useful at all. What is missing is a reason for the robot to move. This reason must be provided by the social attitude of Geduino: its role in society must define not only how it moves but its overall behaviour!

The following roadmap for Geduino's development was written with this goal in mind: to make Geduino an Autonomous Social Robot!




Navigation

Navigation is the foundation of an autonomous robot: the ability to move autonomously. This step includes:

  • build a map of an unknown environment;
  • localization;
  • plan a path to reach the goal and execute it;
  • avoid fixed and mobile obstacles.

In this video you can see a demonstration of the Geduino Navigation Stack.

Perception

Perception is the ability to sense the real world through sensor measurements. Of course perception is already used in navigation but, in order to interact with humans, it must be extended:

  • using computer vision, to detect and recognise objects and humans;
  • using speech-to-text algorithms, to recognise voice.

This step is currently in progress: so far, some tests with OpenCV and the Geduino camera have been performed.
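As a rough idea of this kind of test, the following is a minimal C++ sketch of face detection with OpenCV (the camera index and the cascade file path are assumptions, not the actual Geduino setup):

#include <opencv2/opencv.hpp>

int main() {

    // Open the first available camera (index 0 is an assumption)
    cv::VideoCapture camera(0);
    if (!camera.isOpened()) return 1;

    // Load a standard Haar cascade for frontal face detection (example path)
    cv::CascadeClassifier faceCascade("haarcascade_frontalface_default.xml");

    cv::Mat frame, gray;
    while (camera.read(frame)) {

        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);

        // Detect faces and draw a rectangle around each one
        std::vector<cv::Rect> faces;
        faceCascade.detectMultiScale(gray, faces);
        for (size_t i = 0; i < faces.size(); i++) {
            cv::rectangle(frame, faces[i], cv::Scalar(0, 255, 0), 2);
        }

        cv::imshow("Geduino camera", frame);
        if (cv::waitKey(30) >= 0) break;
    }

    return 0;
}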

Interaction

Navigation is the fundamental interaction between Geduino and the external world. Geduino's interaction capabilities must be extended based on the social role we will assign to it. Hardware extensions must be evaluated in this phase (arms, audio output, etc.).


Navigation Stack is ready!

I'm finally proud to announce that the Geduino Navigation Stack is working properly. See this video to discover what Geduino can do.



The video shows the capabilities of the Geduino Navigation Stack:


  • build a map of an unknown environment;
  • localization;
  • plan a path to reach the goal and execute it;
  • avoid fixed and mobile obstacles.


Wednesday, April 1, 2015

GMapping and RPLidar

The previous post ended with a question: why does Hector SLAM perform better than GMapping when, in theory, it should be the opposite? The right question is: why does GMapping work badly with RPLidar? Since I'm not the only developer asking this question, I hope sharing my results will be helpful to everyone.

There are several reasons why RPLidar and GMapping do not work well together:
  • the LaserScan message expected by GMapping;
  • the RPLidar reference frame;
  • the LaserScan message created by RPLidar node.

The good news is that all of these issues can be easily fixed. In this post I will show you how I managed to make RPLidar and GMapping work together.

The LaserScan expected by GMapping


The first issue regards the LaserScan message expected by GMapping. Unfortunately this is not documented anywhere and I found it only by checking the source code: in the latest commits (not yet released) a new check was added to make sure the LaserScan message's min and max angles are opposite:

maxAngle = - minAngle

This condition is not satisfied by the RPLidar node but, since the released version of GMapping does not include this check, it is not easy to notice.
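If you want to check whether your laser driver satisfies this condition before feeding the scans to GMapping, a small node like the following can help (the node and topic names are just examples):

#include <cmath>
#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>

// Warn if the incoming LaserScan does not satisfy the GMapping condition
// maxAngle = - minAngle (a small tolerance is used for floating point values)
void scanCallback(const sensor_msgs::LaserScan::ConstPtr & scan) {
    if (std::fabs(scan->angle_max + scan->angle_min) > 0.01) {
        ROS_WARN("angle_min (%f) and angle_max (%f) are not opposite: GMapping may not accept this scan",
                 scan->angle_min, scan->angle_max);
    }
}

int main(int argc, char ** argv) {
    ros::init(argc, argv, "scan_check");
    ros::NodeHandle nodeHandle;
    ros::Subscriber subscriber = nodeHandle.subscribe("scan", 1, scanCallback);
    ros::spin();
    return 0;
}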

The RPLidar reference frame


The second issue comes from the RPLidar frame and the RPLidar rotation direction. The RPLidar assumes positive angles in the CW direction: to reflect this orientation, the Z-axis must point towards the RPLidar's bottom side (this is not a bug but, since it is not pointed out in the documentation, it is quite common to make a mistake when broadcasting the laser scan frame).

Furthermore, in order to satisfy the GMapping condition about angles shown in the previous section, it is better to use a frame rotated by 180° with respect to the one shown in the RPLidar documentation: using this frame, the resulting min and max angles will be (approximately) equal to -3.14 and 3.14, matching the GMapping condition.

The RPLidar frame from the RoboPeak documentation

The RPLidar frame that must be used in ROS
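As an example of broadcasting the laser scan frame with the Z-axis pointing towards the RPLidar's bottom side, here is a hedged tf sketch (the translation, the roll rotation and the frame names are assumptions that depend on how the lidar is actually mounted on your robot):

#include <cmath>
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>

int main(int argc, char ** argv) {
    ros::init(argc, argv, "laser_frame_broadcaster");
    ros::NodeHandle nodeHandle;

    // Example transform only: a lidar mounted flat, 20 cm above base_link,
    // with the laser frame obtained by a 180° rotation around the X axis
    // (roll = pi) so that its Z axis points downwards
    tf::Transform transform;
    transform.setOrigin(tf::Vector3(0.0, 0.0, 0.2));
    transform.setRotation(tf::createQuaternionFromRPY(M_PI, 0.0, 0.0));

    tf::TransformBroadcaster broadcaster;

    ros::Rate rate(20.0);
    while (nodeHandle.ok()) {
        broadcaster.sendTransform(
            tf::StampedTransform(transform, ros::Time::now(), "base_link", "laser"));
        rate.sleep();
    }

    return 0;
}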


The LaserScan message created by RPLidar node


The RPLidar node requires some modifications in order to reflect the corrections about min and max angle shown above (they must take into account the frame rotated by 180°):

// Mirror the device angles to account for the frame rotated by 180°
scan_msg.angle_min =  M_PI - angle_min;
scan_msg.angle_max =  M_PI - angle_max;

Applying this fix results in a negative value of the published angle_increment, consistent with the fact that the RPLidar rotates CW.
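For reference, this is how the published angle_increment follows from the mirrored angles (node_count, the number of measurements per scan, is a hypothetical name; the actual node code may differ):

// With the mirrored angles the difference angle_max - angle_min is negative,
// so the published angle_increment is negative too, matching the CW rotation
scan_msg.angle_increment =
        (scan_msg.angle_max - scan_msg.angle_min) / (double)(node_count - 1);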

GMapping configuration


GMapping's default settings are designed for high speed, long range lidars. The RPLidar has a really good price/performance ratio but it does not meet those requirements, so tuning the GMapping parameters is needed in order to get the best results.

These are the GMapping parameters that gave me the best results:

maxUrange: 5.5
maxRange: 5.5  # it was 6.0, but after further tuning 5.5 is suggested
minimumScore: 50
linearUpdate: 0.2
angularUpdate: 0.25
temporalUpdate: 5.0
delta: 0.025

Conclusion


In conclusion, RPLidar can work with GMapping with good performance once these issues are fixed:

The map built by GMapping before the fixes

The map built by GMapping after the fixes. Unfortunately it was not possible to use the same bag file for this test, since the LaserScan message was changed. Anyway, the test was conducted in the same environment under the same conditions.

The RPLidar node with the fixes is available on the GitHub repo.