
Machine Learning and Interactive Systems

NAO Robot
  by Fix Jeremy


In order to use the NAO robot with ROS, you need:

1) the pynaoqi SDK to be in your PYTHONPATH

2) the ros-xxx-nao-yyyy packages. If you install ros-indigo-nao-bringup, it should install most of what you need to play with the NAO:

sudo apt-get install ros-indigo-nao-bringup
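Point 1 above (putting pynaoqi on your PYTHONPATH) can be done from your shell, for example (the install path below is just an assumption, adapt it to wherever you unpacked the SDK):

```shell
# Assumption: pynaoqi was unpacked in ~/naoqi/pynaoqi; adapt to your install.
export PYTHONPATH=~/naoqi/pynaoqi:$PYTHONPATH
```

You can add this line to your ~/.bashrc so it is set in every terminal.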

You can then start the main NAO ROS nodes with:

NAO_IP=<ip-of-your-nao> roslaunch nao_bringup nao_full.launch force_python:=true

In the above command, replace the IP with that of the NAO you are using. As we are making use of the Python naoqi, we pass force_python:=true.

Playing with the nodes: let us bring it alive

As soon as the ROS nodes are launched, we can post commands to the topics, for example /cmd_vel. The first thing to do when using the NAOs is to enable the stiffness, which is done by calling a dedicated service:

rosservice call /body_stiffness/enable "{}"

To disable the stiffness, you simply call:

rosservice call /body_stiffness/disable "{}"

Now that the stiffness is enabled (the NAO should become rigid, with a slight contraction of its "muscles"), we can move the robot:

rostopic pub -1 /cmd_vel geometry_msgs/Twist '{linear: {x: 1.0, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}'

The /cmd_vel topic expects geometry_msgs/Twist messages; for the NAO base, only the linear x/y and angular z components are meaningful.
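If you want to script velocity commands rather than type them by hand, a small helper can generate the inline-YAML argument that `rostopic pub` expects. This is a plain-Python sketch (no ROS needed to run it); the helper name is ours, not part of any ROS API:

```python
# Sketch: build the YAML string that `rostopic pub /cmd_vel
# geometry_msgs/Twist` accepts, exposing only the fields the NAO base
# uses (linear x/y, angular z); the other components stay at zero.

def twist_yaml(lx=0.0, ly=0.0, az=0.0):
    """Return a geometry_msgs/Twist message in inline-YAML form."""
    return ("{linear: {x: %.2f, y: %.2f, z: 0.0}, "
            "angular: {x: 0.0, y: 0.0, z: %.2f}}" % (lx, ly, az))

# Walk forward at full speed, no rotation:
print(twist_yaml(1.0))
```

The resulting string can be passed directly as the last argument of the `rostopic pub -1 /cmd_vel geometry_msgs/Twist` command shown above.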

If we want to move the head in a non-blocking way, we can post messages on the /joint_angles topic; the joint names can be found in the NAO documentation for the head.

rostopic pub /joint_angles nao_msgs/JointAnglesWithSpeed '{joint_names:["HeadYaw", "HeadPitch"], joint_angles: [-0.7, -0.1], speed: 1}'
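Requested angles outside the joints' mechanical ranges will be clipped by the robot; you can also guard them on your side. A minimal sketch, with limit values that are approximate figures taken from the Aldebaran documentation (HeadYaw about ±2.0857 rad, HeadPitch about -0.6720 to 0.5149 rad; check the documentation for your robot version):

```python
# Sketch: clamp head joint targets (radians) to approximate NAO limits
# before publishing them on /joint_angles.

HEAD_LIMITS = {
    "HeadYaw":   (-2.0857, 2.0857),   # approximate, from Aldebaran docs
    "HeadPitch": (-0.6720, 0.5149),   # approximate, from Aldebaran docs
}

def clamp_head(joint, angle):
    """Clip a requested angle to the joint's allowed range."""
    lo, hi = HEAD_LIMITS[joint]
    return max(lo, min(hi, angle))

print(clamp_head("HeadYaw", -3.0))  # clipped to the lower limit
```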

If you want to make it talk, you simply need to post to the /speech topic:

rostopic pub /speech std_msgs/String "data: 'I am aliiiive'"

You can see the robot and all its transformation frames by running RViz:

rosrun rviz rviz -d `rospack find nao_bringup`/config/nao_full.rviz

You can check the cameras (top and bottom) by running the rqt_image_view node:

rosrun rqt_image_view rqt_image_view

You can also adjust the quality of the video streams with dynamic_reconfigure:

rosrun rqt_reconfigure rqt_reconfigure

And there is much more; check out the nao_robot ROS wiki.

Example: tracking a ball

This is the work done by Laure Giuliano and Viktor Terrier during their first-year project. The project was about controlling the NAO with ROS in order to locate a ball and approach it. The NAO sometimes speaks to indicate its status in the game.

The developed files are in the bouge_nao package on the Supélec forge. To start the demo, use the launch file bouge_nao/launch/run_demo.launch. The packages might now be outdated (especially the part filtering out the colored object), but check the video below.