Robots! Robots! RA! RA! RA!

Katherine Scott


@kscottz


katherine@tempoautomation.com

Overview

  • What I do all day, why I wrote this tutorial

  • Quick overview of what ROS is and what it does.

  • Quick look at the hardware.

  • ROS in seven parts -- getting progressively more interesting.

    • Lesson 1: Arduino stepper driver
    • Lesson 2: Adding a controller.
    • Lesson 3: Services and Actions.
    • Lesson 4: Basic Computer Vision
    • Lesson 5: Universal Robot Description Format.
    • Lesson 6: Working with TF
    • Lesson 7: Moving beyond the basics

Why should you listen to me?

I am also doing this full time at Tempo Automation

This is what that robot does.


Shameless plug -- I am hiring. Also we can probably make your circuit boards faster and cheaper.




It looks like this

Before we talk about robots let's talk about the web

  • Who writes a web server from first principles?

  • That one person did it once, 'cause it is hard.

  • We take it as a given when we go to do work.

  • SO WHY WOULD YOU EXPECT TO WRITE A ROBOT FROM FIRST PRINCIPLES?

  • Wheels -- do not require re-invention

Good News -- There's a robotics package for Python

  • Runs some of the most advanced robots.

  • Plays well on Linux. Runs on x86 and ARM

  • REALLY REALLY AWESOME FOSS COMMUNITY

  • Reasonably mature and battle hardened.

  • Lets C/C++/Lisp/Python play well together, with support for Java and JS.

ROS -- So what is it?



  • Package manager, build system (catkin), and deployment (launch) all in one.

  • Unified "system bus" for message passing. Basic pub/sub stuff.

  • Every utility you could ever think of.

  • Plugin framework for adding robot modules.

  • Standard libraries for handling images, transforms, 3D models, etc.
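That "system bus" point is the heart of it. The model can be sketched as a toy pub/sub bus in plain Python -- just the concept, not the ROS API:

```python
# Toy pub/sub bus: nodes publish messages on named topics and every
# subscriber to that topic gets a callback. ROS adds networking,
# typed messages, and discovery on top of this basic idea.

class Bus(object):
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, callback):
        """Register a callback for a named topic."""
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        """Deliver msg to every subscriber of the topic."""
        for callback in self.subscribers.get(topic, []):
            callback(msg)

bus = Bus()
received = []
bus.subscribe('/state', received.append)
bus.publish('/state', [0, 90, 90, 90, 90])
print(received)  # [[0, 90, 90, 90, 90]]
```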

I could go on lecturing ...but let's just do it instead.




Open source hardware version of this arm. I suggest you use it or roll your own.

What is running twitchy?

  • Arduino Uno with a proto shield.

  • The proto shield just makes sure twitchy gets enough juice.

  • Gotta have a kill switch.

Lesson 1: What are we going to do here?

  • Think of ROS as a bunch of independent programs called nodes.

  • Nodes are glued together and talk using a publish/subscribe model.

  • Messages are published and subscribed on what are called topics.

  • The Arduino node will publish one topic on the ROS bus called /state. The payload is five ints; the last four define the motor positions.

  • It will subscribe to the bus topic called /robot. This uses the same layout: the last four ints are targets for the servo motors.

  • We also have a text "status" message published on /status.

  • ROS handles the encoding and passing info over the serial connection.
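That last point deserves a sketch. rosserial is what actually frames messages over the serial line; the core idea of packing four int16 motor positions into bytes looks roughly like this. This is a toy illustration only, not rosserial's real wire protocol, which adds sync bytes, topic IDs, and checksums:

```python
import struct

def pack_state(positions):
    """Pack four int16 motor positions little-endian, roughly the way
    a serial link carries them (toy example, not rosserial framing)."""
    assert len(positions) == 4
    return struct.pack('<4h', *positions)

def unpack_state(payload):
    """Recover the four int16 motor positions from the byte payload."""
    return list(struct.unpack('<4h', payload))

wire = pack_state([90, 45, 120, 10])
print(len(wire))             # 8 -- four int16s, two bytes each
print(unpack_state(wire))    # [90, 45, 120, 10]
```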

Let's put some ROS on our Arduino (sorry for the C!)

#include <ros.h>
#include <Servo.h> 
#include <std_msgs/Int16MultiArray.h>
#include <std_msgs/String.h>
#include <std_msgs/MultiArrayDimension.h>

ros::NodeHandle  nh; // ros node handle
std_msgs::Int16MultiArray out_state; // our current state
std_msgs::String stat; // our current status

ros::Publisher robot_state("state", &out_state); // publish our state
ros::Publisher robot_status("status", &stat); // publish a status message

// create our servo objects
Servo myservo1; 
Servo myservo2; 
Servo myservo3; 
Servo myservo4; 
int state[4];    // variable to store the servo position 


// user input scrubbing
int clip(int value,int top, int bottom)
{
  return min(max(value,bottom),top); 
}

void robot_cb(const std_msgs::Int16MultiArray& cmd_msg)
{
  // copy the input object to our state variable
  // and write it to the servos. 
  memcpy(&state,&cmd_msg.data[1],4*2);
  myservo1.write(state[0]);
  myservo2.write(state[1]);
  myservo3.write(state[2]);
  myservo4.write(state[3]);
  return;
}

// listen to the robot topic, when there is a message
// call the robot_cb function
ros::Subscriber<std_msgs::Int16MultiArray> robot("robot", &robot_cb);

void setup(){
   myservo1.attach(3);  // attaches the servo on pin 3 to the servo object    
   myservo2.attach(6);  // attaches the servo on pin 6 to the servo object  
   myservo3.attach(10);  // attaches the servo on pin 10 to the servo object
   myservo4.attach(13);  // attaches the servo on pin 13 to the servo object 
   // Setup our state variable
   for( int i=0; i < 4; i++ )
   {
      state[i] = 90;
   }
   // send the servos to a neutral state
   myservo1.write(state[0]);
   myservo2.write(state[1]);
   myservo3.write(state[2]);
   myservo4.write(state[3]);
   // Tell ros about our baud rate
   nh.getHardware()->setBaud(115200); // or whatever baud you want
   // initialize the node
   nh.initNode();
   // subscribe to the robot topic
   nh.subscribe(robot);
   // advertise that we publish two topics, state and status
   nh.advertise(robot_state);
   nh.advertise(robot_status);
}

// message string 
char stat_str[64];
// buffer backing the out_state message's data pointer
int16_t out_data[5] = {0, 90, 90, 90, 90};

void loop(){

 // point the message at our buffer and copy over our data from ros. 
 out_state.data = out_data;
 memcpy(&out_state.data[1],&state,4*2);
 // construct our state message
 out_state.data_length = 5;
 out_state.layout.data_offset = 0;
 robot_state.publish( &out_state );
 sprintf(stat_str,"%i %i %i %i",state[0],state[1],state[2],state[3]);
 stat.data = stat_str;
 // publish our state message. 
 robot_status.publish( &stat );  
 // update ros stuff
 nh.spinOnce();
}

TL;DR

  • Arduino will listen on /robot for messages whose last four integers are servo targets.

  • Arduino will publish on /status the current robot status as a string.

  • Arduino will echo the current servo state on /state as the same four integers.

Now let's spin up ROS -- introducing launch files.

  • Launch files basically tell ROS what bits of code to run.

  • A launch file will spin up the ROS core and then launch each node as a process.

  • Node launch order is not guaranteed.

  • You can set system parameters, and remap topic names too.

  • Launch files can pull in other launch files, which allows for subsystem encapsulation.

    serial_controller.launch

    <launch>
     <node pkg="rosserial_python" type="serial_node.py" name="arduino_serial">
           <param name="port" value="/dev/ttyACM0"/>
           <param name="baud" value="115200"/>       
     </node>
    </launch>
    

    Run it with

    roslaunch serial_controller.launch

    A really simple ROS program

    publish_to_a_topic.py

    #!/usr/bin/env python 
    # publish_to_a_topic.py
    # THIS SHEBANG IS REALLY REALLY IMPORTANT
    import rospy
    import time
    from std_msgs.msg import Int16MultiArray
    
    if __name__ == '__main__':
        try:
            rospy.init_node('simple_publisher')
            # Tell ros we are publishing to the robot topic
            pub = rospy.Publisher('/robot', Int16MultiArray, queue_size=0)
            # Setup our message
            out = Int16MultiArray()
            val = 20
            # generate the message data
            for j in range(0,4):
                # set the joint angles
                out.data = [0,50,50,50,int(val)]
                # send the message
                pub.publish(out)
                # do some book keeping
                val += 10
                rospy.logwarn("Sent a message: {0}".format(val))
                time.sleep(1)
    
        except rospy.ROSInterruptException:
            rospy.logwarn('ERROR!!!')
    

    Lesson 2: Let's add a controller

    • ROS has a package for that! sudo apt-get install ros-indigo-joy

    • Joy will publish a joystick message.

    • We'll write a node that listens to joystick messages.

    • When we hear a joystick message we'll publish a robot message.

    joystick_node.py

    #!/usr/bin/env python 
    # THIS SHEBANG IS REALLY REALLY IMPORTANT
    
    import rospy
    import numpy as np
    from sensor_msgs.msg import Joy
    from std_msgs.msg import Int16MultiArray
    class JoystickNode(object):
        def __init__(self):
            # put away our toys cleanly 
            rospy.on_shutdown(self.shutdown)
            self.pub = rospy.Publisher('robot', Int16MultiArray, queue_size=1)
            # subscribe to the joy and state messages
            # format is topic, message type, callback function
            rospy.Subscriber("/joy", Joy, self.do_it)
            rospy.Subscriber("/state", Int16MultiArray, self.update_state)
            # our internal state message
            self.state = [0,0,0,0,0]
            # tell ros to chill unless we get a message. 
            rospy.spin()
    
        def update_state(self,msg):
            # update our internal state every time the robot posts an update
            self.state = msg.data
    
        def do_it(self,msg):
            # Update our state
            if( self.state is None ):
                self.state = [0,0,0,0,0]
            m1 = self.state[1]
            m2 = self.state[2]
            m3 = self.state[3]
            m4 = self.state[4]
            step = 5
            # Update our state from our buttons
            if(msg.buttons[1] == 1 ):
                m1+=step
            elif( msg.buttons[2] == 1 ):
                m1-=step
            if(msg.buttons[0] == 1 ):
                m2+=step
            elif( msg.buttons[3] == 1 ):
                m2-=step
            if(msg.axes[-1] > 0 ):
                m3+=step
            elif( msg.axes[-1] < 0 ):
                m3-=step
            if(msg.axes[-2] > 0 ):
                m4+=step
            elif( msg.axes[-2] < 0 ):
                m4-=step
            # Don't allow our servos to go outside of range 
            data = [self.state[0],
                    int(np.clip(m1,0,180)),
                    int(np.clip(m2,0,180)),
                    int(np.clip(m3,0,180)),
                    int(np.clip(m4,0,180))]
            # slick little list comp to look for button changes
            change = any([abs(a-b)>0 for a,b in zip(data,self.state)])
            self.state = data
            # if there is a change
            if( change ):
                # Set the new position out on /robot
                out = Int16MultiArray()
                rospy.loginfo("sending {0}.".format(data))
                out.data = data
                self.pub.publish(out)
    
        def shutdown(self):
            # park the arm at its neutral state on exit
            data = [0,90,90,90,90]
            out = Int16MultiArray()
            print "sending {0}.".format(data)
            out.data = data
            self.pub.publish(out)
    
    
    if __name__ == '__main__':
        try:
            # boiler plate to spin up a node. 
            rospy.init_node('joystick_node')
            node = JoystickNode()
        except rospy.ROSInterruptException:
            rospy.logwarn('ERROR!!!')
    

    Before we start we should pick apart our launch file.

    run_joystick.launch

    <launch>
     <node pkg="joy" type="joy_node" name="joy_stick">
           <param name="dev" type="string" value="/dev/input/js0" />
           <param name="deadzone" value="0.12" />
           <param name="autorepeat_rate" value="30"/>
     </node>
    
     <node pkg="rosserial_python" type="serial_node.py" name="arduino_serial">
           <param name="port" value="/dev/ttyACM0"/>
           <param name="baud" value="115200"/>       
     </node>
    
     <node pkg="owi_arm" type="joystick_node.py" name="controller">
     </node>
    
    </launch>
    

    Lesson 3: Let's add services and actions.

    • What if we want to teach our robot behaviors?

    • There are two primitives for this in ROS: Services and Actions

    • Services -- short function calls, take an input, can give an output

    • Actions -- complex sets of behaviors with preemption

    • Both of these use the ROS message generator.
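Before the real thing, the service-vs-action distinction can be sketched in plain Python. This is conceptual only, not the rospy/actionlib API -- just the shape of the two primitives:

```python
# A service is a short blocking call: request in, result out.
# An action is long-running: it emits feedback as it goes and can
# be preempted between steps.

def waypoint_service(fname, state, store):
    """Service-style: record the current state under a file name."""
    store.setdefault(fname, []).append(list(state))
    return "Wrote {0} to {1}".format(state, fname)

class PlayAnimationAction(object):
    """Action-style: walk through steps, publish feedback, and
    honor a preempt request between steps."""
    def __init__(self):
        self.preempt_requested = False

    def execute(self, steps, on_feedback):
        for i, step in enumerate(steps):
            if self.preempt_requested:
                return "preempted at step {0}".format(i)
            on_feedback("Going to {0}".format(step))
        return "done"

store = {}
print(waypoint_service("poses.csv", [0, 90, 90, 90, 90], store))
action = PlayAnimationAction()
result = action.execute(store["poses.csv"], on_feedback=print)
print(result)  # done
```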

    A simple service and a simple action.

    • Let's write a service that records robot positions to file, and an action that replays them.

    • Need to modify our CMakeLists.txt and package.xml file.

    • Need to run catkin_make at the root of our directory.

    • ROS magic puts the generated C++/Python types etc in our path

      waypoint.srv

      string fname
      ---
      string result

      play_animation.action

      string filename
      int32 step_size
      ---
      string result
      ---
      string update

      arm_controller.py

      #!/usr/bin/env python
      import rospy
      # need to use the ros actionlib
      import actionlib
      # need to import our automagically generated messages
      from owi_arm.srv import *
      from owi_arm.msg import *
      from std_msgs.msg import Int16MultiArray
      import sys, os, time
      
      class ArmController:
          def __init__(self):
              self.debug = rospy.get_param('~debug', False)
              # get_param lets us read parameters set in the launch file
              self.path = rospy.get_param('/animation_path','/home/kscottz/Desktop/')
      
              rospy.loginfo("Start Arm Controller")        
              self.state = [0,0,0,0,0]
              # create the waypoint service, topic, message type, callback function.
              self.waypoint_service = rospy.Service('/waypoint', waypoint, self._handle_save_state_to_file)   
      
              # subscribe to the robot's state
              rospy.Subscriber("/state", Int16MultiArray, self.update_state)
              self.pub = rospy.Publisher('/robot', Int16MultiArray, queue_size=1)
              # create our action server. 
              # format is name, action message type, callback, autostart.
              self.action_server_play = actionlib.SimpleActionServer('play_animation', play_animationAction, 
                                                                     execute_cb=self._handle_play_animation, auto_start = False)
              # start the server
              self.action_server_play.start()
              # run the node
              self._run()
      
          def update_state(self,msg):
              # update the internal state just like before. 
              self.state = msg.data
      
          def _run(self):
              # tell ros to just hang out until we get a message.
              rospy.spin()
      
          def _handle_save_state_to_file(self,req):
              # wait for a state update.
              rospy.wait_for_message('/state', Int16MultiArray, timeout=10)
              # set our file name
              fname = self.path+req.fname
              # create our output stream
              out_str = "{0},{1},{2},{3}\n".format(self.state[1],self.state[2],self.state[3],self.state[4])
      
              # write the file out of the current state
              if( not os.path.isfile(fname) ): # open in write mode
                  with open(fname, 'w') as outfile:
                      outfile.write(out_str)
              else:
                  with open(fname, 'a') as outfile:
                      outfile.write(out_str)
              # create and return a response message. 
              msg = "Wrote {0} to file {1}.".format(out_str,fname)
              rospy.logwarn(msg)
              response = waypointResponse()
              response.result = msg
              return response
      
          def _handle_play_animation(self,goal):
              # assemble the file name from the input goal
              fname = self.path + goal.filename 
              # get the time between each animation step. 
              sleepy_time = goal.step_size 
              # assemble the output string 
              result = play_animationResult()
      
              # check that our file is good, if not bail and send the result
              if( not os.path.isfile(fname) ): 
                  result.result = "Could not find file {0}".format(fname)
                  self.action_server_play.set_aborted(result)
                  return
              # create our feedback object
              self.feedback = play_animationFeedback()
              self.feedback.update = "Loading file {0}.".format(fname)
              # publish the feedback
              self.action_server_play.publish_feedback(self.feedback)
              steps = []
              # open our file and parse it into the steps list. 
              with open(fname, "rt") as f:
                  for line in f:
                      steps.append([int(j) for j in line.split(',')])
              # send an update 
              self.feedback.update = "Finished reading file, got {0} commands.".format(len(steps))
              self.action_server_play.publish_feedback(self.feedback)
              # go through our steps and send them to the robot. 
              for step in steps:
                  out = Int16MultiArray()
                  rospy.loginfo("sending {0}.".format(step))
                  out.data = [0,step[0],step[1],step[2],step[3]]
                  self.pub.publish(out)
                  # send out our feedback
                  self.feedback.update = "Going to {0}.".format(out.data)
                  self.action_server_play.publish_feedback(self.feedback)
                  # sleep a bit to let the command execute
                  time.sleep(sleepy_time)
                  # check if we have been preempted
                  if self.action_server_play.is_preempt_requested():
                      result.result = "Animation preempted"
                      self.action_server_play.set_preempted(result)
                      return
      
              result.result = "All done"
              self.action_server_play.set_succeeded(result)
              return result
      
      if __name__ == '__main__':
          try:
              rospy.init_node('ArmController')
              arm_controller = ArmController()
          except rospy.ROSInterruptException:
              pass
      

      Now we need to add a few lines to our joystick node to make this work.


      owi_joystick_node.py

      def __init__(self):
              # create the action library client to play animation
              self.animate_client = actionlib.SimpleActionClient('play_animation', play_animationAction)
              # wait for that action to spin up.
              self.animate_client.wait_for_server()
              # create a proxy to the waypoint service. 
              self.waypoint_proxy = rospy.ServiceProxy('/waypoint', waypoint)
      
      ...
          def call_animate_service(self,fname,t):
              # create a goal for action
              goal = play_animationGoal() 
              goal.filename = fname
              goal.step_size = t
              self.animate_client.send_goal(goal)
              # wait for the animate action to complete
              self.animate_client.wait_for_result()
              # get the result of the animate
              result = self.animate_client.get_result()
              rospy.logwarn("result of animation {0}".format(result))
      
          def call_waypoint_service(self,fname):
              # create the request message
              req = waypointRequest()
              req.fname = fname
              # send the message
              response = self.waypoint_proxy(req)
              # print the result. 
              rospy.logwarn("JOYSTICK NODE GOT {0}".format(response.result))
      

      Also add joystick key bindings to call these functions.

      Let's put it all together in our launch file.

      service_and_action_example.launch

      <launch>
        <node pkg="joy" type="joy_node" name="joy_stick">
             <param name="dev" type="string" value="/dev/input/js0" />
             <param name="deadzone" value="0.12" />
             <param name="autorepeat_rate" value="30"/>
        </node>
        <node pkg="rosserial_python" type="serial_node.py" name="arduino_serial">
             <param name="port" value="/dev/ttyACM0"/>
             <param name="baud" value="115200"/>       
        </node>
        <node pkg="owi_arm" type="owi_joystick_node.py" name="controller">
        </node>
        <node pkg="owi_arm" type="arm_controller.py" name="animate_service">
        </node>
      </launch>
      

      Quick Break For Cool Stuff

      ZOMG Your robot can run in the browser.

      Lesson 4: CAMERAS!

      • ROS supports just about every camera -- cheap, expensive, whatever.

      • Images get published as a message.

      • cv_bridge converts them to OpenCV formats (including NumPy arrays!)

      • Uses nodelets (like nodes, but they pass memory pointers).

      • Camera controls, calibration, and rectification baked right in!

      Using a simple USB camera.

      • sudo apt-get install ros-indigo-usb-cam ros-indigo-image-proc

      • sudo apt-get install ros-indigo-uvc-cam also works.

      • rosrun image_proc image_proc image:=/usb_cam/image_raw

        run_camera.launch

        <launch>
          <node ns="camera" pkg="image_proc" type="image_proc" name="image_proc"/>
          <node ns="camera" pkg="uvc_camera" type="uvc_camera_node" name="uvc_camera"
            output="screen">
            <param name="width" type="int" value="1280" />
            <param name="height" type="int" value="720" />
            <param name="fps" type="int" value="30" />
            <param name="frame" type="string" value="camera" />
            <param name="device" type="string" value="/dev/video0" />
            <param name="camera_info_url" type="string"
              value="file:///home/kscottz/.ros/camera_info/usb_cam.yaml" />
          </node>
          <node name="image_view_raw" pkg="image_view" type="image_view" respawn="false" output="screen">
            <remap from="image" to="/camera/image_raw"/>
            <param name="autosize" value="true" />
          </node>
          <node name="image_view_rect" pkg="image_view" type="image_view" respawn="false" output="screen">
            <remap from="image" to="/camera/image_rect_color"/>
            <param name="autosize" value="true" />
          </node>
        </launch>
        

        I would be remiss if I didn't show you some computer vision.

        • To get images into python we use the OpenCV bridge.

        • These are Numpy/CV2 format, but I like to use SimpleCV.

        • Let's find the largest red thing and output its pixel position.
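The hue-distance / threshold / blob pipeline below can be approximated in plain Python on a toy RGB grid. This is a sketch of the idea only -- "redness" here is just red minus the other channels, a stand-in for SimpleCV's hueDistance and threshold steps, not its actual math:

```python
def find_red_centroid(image, threshold=200):
    """Return the (x, y) centroid of 'red enough' pixels in a toy
    RGB image given as a list of rows of (r, g, b) tuples.
    Returns None when nothing passes the threshold."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            # crude redness score: red channel dominates the others
            if r - max(g, b) >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / float(len(xs)), sum(ys) / float(len(ys)))

# 3x3 black image with one red pixel at (x=2, y=1)
img = [[(0, 0, 0)] * 3 for _ in range(3)]
img[1][2] = (255, 0, 0)
print(find_red_centroid(img))  # (2.0, 1.0)
```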

        • </ul> </h2> image_processing_example.py

          #!/usr/bin/env python
          import rospy
          # This is the tool that marshals images into OpenCV
          from cv_bridge import CvBridge, CvBridgeError 
          # Import some stock ROS message types.
          from geometry_msgs.msg import Point
          from sensor_msgs.msg import Image
          # import some utils.
          import numpy as np
          import cv
          import SimpleCV as scv
          import copy as copy
          
          class ImageProcessingExample:
              def __init__(self):
                  # Allows conversion between numpy arrays and ROS sensor_msgs/Image
                  self.bridge = CvBridge() 
          
                  # Allow our topics to be dynamic.
                  self.input_camera_topic = rospy.get_param('~input_camera_topic', '/camera/image_rect_color')
                  self.output_camera_topic = rospy.get_param('~output_camera_topic', '/camera/color_stuff')
          
                  # We are going to publish a debug image as it comes in.
                  self.pub = rospy.Publisher(self.output_camera_topic, Image,queue_size=10)
                  # We also publish the position of the thing we found.
                  self.point_pub = rospy.Publisher("/position",Point,queue_size=10)
                  # we get our input from a topic, it is of type image, call _show_colored_stuff.
                  rospy.Subscriber(self.input_camera_topic, Image, self._show_colored_stuff)
                  # run the node
                  self._run()
          
              # Keep the node alive
              def _run(self):
                  rospy.spin()
          
              def _show_colored_stuff(self,input):
                  # convert our image to CV2 numpy format from ROS format
                  latest_image = self.bridge.imgmsg_to_cv2(input)
                  if( latest_image is not None ):
                      try:
                          # convert the image to SimpleCV
                          img = scv.Image(latest_image, cv2image=True, colorSpace=scv.ColorSpace.RGB)
                          # create a writeable copy
                          src = img.copy()
                          # Do our image processing get hue, threshold, morphology, connected components.
                          mask = src.hueDistance(180).invert().threshold(240).erode(2).dilate(3)
                          blobs = src.findBlobsFromMask(mask=mask)
                          # if we find something 
                          if( blobs ):
                              # draw its convex hull with alpha
                              blobs[-1].drawHull(width=-1,color=scv.Color.YELLOW,alpha=128)
                              # and publish a position update
                              pt = Point(x=blobs[-1].x,y=blobs[-1].y,z=blobs[-1].area())
                              self.point_pub.publish(pt)
                              src = src.applyLayers()
                          # convert SimpleCV to CV2 Numpy Format
                          cv2img = src.getNumpyCv2()
                          # Convert Cv2 numpy to ROS format
                          img_msg = self.bridge.cv2_to_imgmsg(cv2img, "bgr8")
                          # publish the topic.
                          self.pub.publish(self.bridge.cv2_to_imgmsg(cv2img, "bgr8"))
                      except CvBridgeError, e:
                          print e
          
          # Boilerplate node spin up. 
          if __name__ == '__main__':
              try:
                  rospy.init_node('ImageProcessingExample')
                  p = ImageProcessingExample()
              except rospy.ROSInterruptException:
                  pass
          

          We then add the following to our launch file.


          run_image_processing.launch

          ... 
            <node name="image_processing_example" pkg="owi_arm" type="image_processing_example.py"/>
          ...
          

          Lesson 5: URDF & RVIZ

          • URDF - Universal Robot Description Format

          • Fancy XML that describes the geometry of your robot.

          • Exportable from a bunch of CAD packages with a plugin.

          • Also defines joints and lots of other stuff.

          • RVIZ is the ROS visualization package. Shows URDFs and much more.

          
          
          
          

          Let's look at a simplified URDF for twitchy

          • Robots are made up of LINKS and JOINTS.

          • Links have size, color, and kinematics.

          • Joints have types -- in this case revolute

          • Joint states are what really move our robot.

          • This is a gross oversimplification, as twitchy does have some other dynamics.

          twitchy.urdf

          <?xml version="1.0" encoding="utf-8"?>
          <robot name="twitchy">
            <link name="base_link">
              <visual>
                <geometry>
                  <box size="0.097 0.070 0.014"/>
                </geometry>
                <material name="white">
              <color rgba="1 1 1 1"/>
                </material>
              </visual>
            </link>
            <link name="body_link">
              <visual>
                <geometry>
                  <box size="0.078 0.068 0.080"/>
                </geometry>
                <material name="magenta">
              <color rgba="1 0 1 1"/>
                </material>
              </visual>
            </link>
            <link name="arm_link_1">
              <visual>
                <origin xyz="0 0 0.07" rpy="0 0 0"/>
                <geometry>
                  <box size="0.027 0.025 0.137"/>
                </geometry>
                <material name="yellow">
              <color rgba="1 1 0 1"/>
                </material>
              </visual>
              <inertial>
                <origin xyz="0 0 0" rpy="0 0 0"/>
                <mass value="1"/>
                <inertia ixx="100"  ixy="0"  ixz="0" iyy="100" iyz="0" izz="100" />
              </inertial>
            </link>
            <link name="arm_link_2">
              <visual>
                <origin xyz="0.0705 0 0.0" rpy="0 0 0"/>
                <geometry>
                  <box size="0.151 0.017 0.022"/>
                </geometry>
                <material name="cyan">
              <color rgba="0 1 1 1"/>
                </material>
              </visual>
              <inertial>
                <origin xyz="0 0 0" rpy="0 0 0"/>
                <mass value="1"/>
                <inertia ixx="100"  ixy="0"  ixz="0" iyy="100" iyz="0" izz="100" />
              </inertial>
            </link>
            <link name="wrist">
              <visual>
                <geometry>
                  <cylinder length="0.01" radius="0.016"/>
                </geometry>
                <material name="mystery2">
              <color rgba="0.1 1 0.3 1"/>
                </material>
              </visual>
            </link>
          
            <link name="end_effector">
              <visual>
                <geometry>
                  <cylinder length="0.01" radius="0.016"/>
                </geometry>
                <material name="mystery">
              <color rgba="0.5 0.5 0.5 1"/>
                </material>
              </visual>
            </link>
            <link name="magnet">
              <visual>
                <geometry>
                  <cylinder length="0.005" radius="0.008"/>
                </geometry>
                <material name="mystery_red">
              <color rgba="1 0 0 1"/>
                </material>
              </visual>
            </link>
            <joint name="base_to_body" type="revolute">
              <axis xyz="0 0 1"/>
              <limit effort="1000.0" lower="-1.57" upper="1.57" velocity="0.5"/>
              <parent link="base_link"/>
              <child link="body_link"/>
              <origin xyz="0.01 0 0.047"/>
            </joint>
            <joint name="body_to_arm1" type="revolute"> 
              <limit effort="1000.0" lower="0" upper="1.7" velocity="0.5"/>
              <axis xyz="0 1 0"/>
              <parent link="body_link"/>
              <child link="arm_link_1"/>
              <origin xyz="0.015 0 0.035"/>
            </joint>
            <joint name="arm1_to_arm2" type="revolute">
              <limit effort="1000.0" lower="-1.7" upper="1.7" velocity="0.5"/>  
              <axis xyz="0 1 0"/>
              <parent link="arm_link_1"/>
              <child link="arm_link_2"/>
              <origin xyz="0.0 0.0 0.137"/>
            </joint>
            <joint name="arm2_to_wrist" type="revolute">
              <limit effort="1000.0" lower="-3.1415" upper="3.1415" velocity="0.5"/>  
              <axis xyz="0 1 0"/>
              <parent link="arm_link_2"/>
              <child link="wrist"/>
              <origin xyz="0.14 0.0 -0.015" />
            </joint>
            <joint name="wrist_to_endeffector" type="revolute">
              <limit effort="1000.0" lower="-1.57" upper="1.57" velocity="0.5"/>  
              <axis xyz="0 0 1"/>
              <parent link="wrist"/>
              <child link="end_effector"/>
              <origin xyz="0.0 0.0 -0.01"/>
            </joint>
            <joint name="endeffector_to_magnet" type="fixed">
              <parent link="end_effector"/>
              <child link="magnet"/>
              <origin xyz="0.0 0.0 -0.005"/>
            </joint>
          </robot>
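A quick way to sanity-check a URDF like the one above is to parse it with the standard library and list the joints with their limits. A minimal sketch, run here against a trimmed-down copy of twitchy's first joint:

```python
import xml.etree.ElementTree as ET

# Trimmed-down excerpt of the twitchy URDF, enough to exercise the parser.
URDF = """<?xml version="1.0"?>
<robot name="twitchy">
  <link name="base_link"/>
  <link name="body_link"/>
  <joint name="base_to_body" type="revolute">
    <parent link="base_link"/>
    <child link="body_link"/>
    <limit lower="-1.57" upper="1.57" effort="1000.0" velocity="0.5"/>
  </joint>
</robot>"""

def list_joints(urdf_text):
    """Return (name, type, lower, upper) for each joint in a URDF.
    Fixed joints have no <limit>, so their bounds come back as None."""
    robot = ET.fromstring(urdf_text)
    joints = []
    for joint in robot.findall('joint'):
        limit = joint.find('limit')
        lower = float(limit.get('lower')) if limit is not None else None
        upper = float(limit.get('upper')) if limit is not None else None
        joints.append((joint.get('name'), joint.get('type'), lower, upper))
    return joints

print(list_joints(URDF))  # [('base_to_body', 'revolute', -1.57, 1.57)]
```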
          

          Let's spin up twitchy's URDF

          twitchy_model.launch

          <launch>
              <arg name="model"/>
              <param name="robot_description" textfile="/home/kscottz/Code/toy_ws/src/owi_arm/launch/urdf/twitchy.urdf" />
              <param name="use_gui" value="false"/>
              <node name="joint_state_publisher" pkg="joint_state_publisher" type="joint_state_publisher">
              <param name="use_gui" value="true"/>
              </node>
              <node name="robot_state_publisher" pkg="robot_state_publisher" type="state_publisher"/>
              <node name="rviz" pkg="rviz" type="rviz" args="/home/kscottz/Code/toy_ws/src/owi_arm/data/urdf.rviz" required="false" />
          </launch>
          

          Now let's wire the URDF up to our motors.

          • Pretty simple -- map motors to joints

          • Take the robot's input state topic and map it to joint states.

          • One rub: in our robot model the head is always parallel to the ground, so we'll use TF to recover the wrist pitch.


            #!/usr/bin/env python 
            # THIS SHEBANG IS REALLY REALLY IMPORTANT
            from sensor_msgs.msg import JointState
            from std_msgs.msg import Int16MultiArray
            import rospy
            import time
            import numpy as np
            import tf
            
            class JointStateRepublisher(object):
                def __init__(self):
                    # trims are optional, just small adjustments to the 
                    # motor angles.
                    self.m0_trim = rospy.get_param('m0_trim', -90.00)
                    self.m1_trim = rospy.get_param('m1_trim', -90.0)
                    self.m2_trim = rospy.get_param('m2_trim', 0.00)
                    self.m3_trim = rospy.get_param('m3_trim', 0.00)
                    self.wrist_trim = rospy.get_param('wrist_trim', 0.00)
                    # We need a transform listener to get the head. 
                    self.listener = tf.TransformListener()
                    # The input topic 
                    self.state_topic = rospy.get_param('~arduino_topic', '/robot')
                    # subscribe to the state update of the robot
                    rospy.Subscriber(self.state_topic, Int16MultiArray, self._update_from_arduino)
            
                    #set the joint state publish topic
                    self.joint_topic = rospy.get_param('~joint_topic', 'joint_states')
                    self.joint_state_pub = rospy.Publisher(self.joint_topic, JointState, queue_size = 1 )
                    # m0 m1 m2 wrist m3
                    self.states = [0,0,0,0,0]
                    # set our update rate to 30hz
                    self.rate = rospy.get_param('~rate', 30)
                    self._run()
            
                def _run(self):
                    # update the robot's state at 30Hz
                    r = rospy.Rate(self.rate) # 30hz 
                    while not rospy.is_shutdown():
                        self._publish_joint_state()
                        r.sleep()
            
                def _update_from_arduino(self, msg):
                    # get the pitch as a euler angle between the wrist
                    # and the ground plane, and fold it into the wrist state.
                    try:
                        self.listener.waitForTransform('/base_link', '/wrist', rospy.Time(), rospy.Duration(.1))
                        (trans, rot) = self.listener.lookupTransform('/base_link', '/wrist', rospy.Time(0))
                        pitch = tf.transformations.euler_from_quaternion(rot)[1]
                        self.states[3] += pitch
                    except tf.Exception:
                        pass
                    self.states[0] = np.deg2rad(msg.data[4]) + np.deg2rad(self.m0_trim)
                    self.states[1] = -(np.deg2rad(msg.data[3]) + np.deg2rad(self.m1_trim))
                    self.states[2] = np.deg2rad(msg.data[2]) + np.deg2rad(self.m2_trim)
                    self.states[4] = np.deg2rad(msg.data[1]) + np.deg2rad(self.m3_trim)
            
                def _update(self):
                    try:
                        # repeat the pitch integration at the publish rate
                        # as well, so the wrist keeps tracking the arm.
                        self.listener.waitForTransform('/base_link', '/wrist', rospy.Time(), rospy.Duration(.1))
                        (trans, rot) = self.listener.lookupTransform('/base_link', '/wrist', rospy.Time(0))
                        pitch = tf.transformations.euler_from_quaternion(rot)[1]
                        self.states[3] += pitch
                    except tf.Exception:
                        # no transform available yet -- skip this cycle
                        pass
            
                def _publish_joint_state(self):
                    # This gets called at self.rate (30 Hz by default)
                    # do the update
                    self._update()
                    # construct the joint state message.
                    msg = JointState()
                    msg.header.stamp = rospy.Time.now()
                    msg.name =  ['base_to_body', 'body_to_arm1', 'arm1_to_arm2', 'arm2_to_wrist','wrist_to_endeffector']
                    msg.position = self.states
                    # we are not using these
                    msg.velocity = [0.00,0.00,0.00,0.00,0.00]
                    msg.effort = [0.00,0.00,0.00,0.00,0.00]
                    # republish the message. 
                    self.joint_state_pub.publish(msg)
            
            if __name__ == '__main__':
                try:
                    # general ROS boiler plate. 
                    rospy.init_node('joint_state_republisher')
                    node = JointStateRepublisher()
                except rospy.ROSInterruptException:
                    rospy.logwarn('joint_state_republisher interrupted')
            

            Let's construct the launch file for our robot

            run_arm_with_urdf.launch

            <launch>
             <node name="joint_state_publisher" pkg="joint_state_publisher" type="joint_state_publisher">
                   <remap from="joint_states" to="different_joint_states" />
             </node>
                <arg name="model"/>
                <param name="robot_description" textfile="/home/kscottz/Code/toy_ws/src/owi_arm/launch/urdf/twitchy.urdf" />
                <param name="use_gui" value="true"/>
                <node name="robot_state_publisher" pkg="robot_state_publisher" type="state_publisher"/>
                <node name="rviz" pkg="rviz" type="rviz" args="/home/kscottz/Code/toy_ws/src/owi_arm/data/urdf.rviz" required="false" />
             <node pkg="joy" type="joy_node" name="joy_stick">
                   <param name="dev" type="string" value="/dev/input/js0" />
                   <param name="deadzone" value="0.12" />
                   <param name="autorepeat_rate" value="30"/>
             </node>
             <node pkg="rosserial_python" type="serial_node.py" name="arduino_serial">
                   <param name="port" value="/dev/ttyACM0"/>
                   <param name="baud" value="115200"/>       
             </node>
              <node pkg="owi_arm" type="toy_joint_states.py" name="my_joint_states">
              </node>
              <node pkg="owi_arm" type="owi_joystick_node.py" name="controller">
              </node>
              <node pkg="owi_arm" type="arm_controller.py" name="animate_service">
              </node>
            
            </launch>
            

            Lesson 6: Let's put the Camera and the URDF together

            • We've got a camera, but what happens if we put it in our URDF model?

            • If we know where our camera is (x,y,z,p,y,r) we can map pixels to positions.

            • How do we figure out where our camera is with respect to our robot?

            • One solution is that we get our ruler and measure.

            • But we are lazy software engineers. Can't the camera figure out where it is for us?
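
            To see why the camera pose matters, here is a minimal sketch (plain numpy, no ROS; the function name and the intrinsic values fx, fy, cx, cy are made up for illustration) of mapping a pixel plus a known depth to a 3D point in the camera frame. Chain that point through the camera's pose transform and you get a position in the robot's frame.

```python
import numpy as np

def pixel_to_point(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) at a known depth (meters) into a
    3D point in the camera frame, using pinhole intrinsics:
    fx, fy are focal lengths in pixels, (cx, cy) is the principal point."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])
```

            A sanity check: a pixel at the principal point lands on the optical axis, so `pixel_to_point(320, 240, 2.0, 600, 600, 320, 240)` comes back as `[0, 0, 2]`.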

            AR Markers FOR THE WIN!

            • These are basically little printable QR codes.

            • High contrast so they are easy for CV to find.

            • If we know how big they are, a calibrated camera can work out the marker's position with respect to the camera.

            • We can also invert the transform! (do the reverse)

            • We can also peg the transforms to other frames in our model.

            • The package lives in a git repo, so we use wstool set ar_tools --git repo_name to add it.
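
            Under the hood, "inverting the transform" is just linear algebra, and tf does it for you in a real node. For intuition, here is a plain-numpy sketch (no ROS required; quaternions in tf's [x, y, z, w] order, function names invented for this example) of flipping a camera-to-marker transform into marker-to-camera:

```python
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion [x, y, z, w] to a 3x3 rotation matrix."""
    x, y, z, w = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

def invert_transform(trans, rot):
    """Given camera->marker as (translation, quaternion), return marker->camera."""
    R = quat_to_matrix(rot)
    R_inv = R.T                          # a rotation's inverse is its transpose
    t_inv = -R_inv.dot(trans)            # undo the translation in the rotated frame
    # the inverse of a unit quaternion is its conjugate
    q_inv = np.array([-rot[0], -rot[1], -rot[2], rot[3]])
    return t_inv, q_inv
```

            Composing a transform with its inverse should give the identity, which is an easy way to convince yourself the math is right.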

            Let's see this in action

            ar_robot_multi.launch

            <!-- snip -->
             <node pkg="tf" type="static_transform_publisher" name="ar_to_robot" 
                args="0.0 0.11 0.0 1.57 0 0 4x4_68 base_link 10"/>
             <node name="ar_pose" pkg="ar_pose" type="ar_multi" respawn="false"
                output="screen">
               <param name="marker_pattern_list" type="string"
                  value="/home/kscottz/Code/toy_ws/src/owi_arm/launch/markers_to_find.txt"/>
                <param name="threshold" type="int" value="150"/>
                <param name="use_history" type="bool" value="true"/>
                <param name="reverse_transform" type="bool" value="true"/> 
             <!-- snip --> 
             </node>
            

            markers_to_find.txt

            #the number of patterns to be recognized
            3
            
            #pattern 1
            4x4_68
            data/4x4/4x4_68.patt
            100.0
            0.0 0.0
            
            #pattern 2
            4x4_82
            data/4x4/4x4_82.patt
            40.0
            0.0 0.0
            
            #pattern 3
            4x4_80
            data/4x4/4x4_80.patt
            40.0
            0.0 0.0

            WTF: What's TF?

            • In ROS a TF is a transform.

            • The TF library handles all of the linear algebra for free!

            • Let's reflect on this. We've reduced a lot of math to a simple question: how do we get from A to B?

              In the shell we can do this lookup quickly:

              rosrun tf tf_echo 4x4_82 end_effector
              At time 1428374587.598
              - Translation: [-0.541, -1.018, -0.035]
              - Rotation: in Quaternion [0.311, 0.037, -0.591, 0.744]
                          in RPY [0.480, 0.436, -1.235]
              rosrun tf tf_echo end_effector 4x4_82
              At time 1428374641.889
              - Translation: [-0.470, 0.038, -0.247]
              - Rotation: in Quaternion [-0.037, 0.007, 0.885, 0.464]
                          in RPY [-0.022, 0.071, 2.174]
              

              If we want to do this in Python it looks like this:

              import tf
              import rospy
              # we need a node before we can listen for transforms
              rospy.init_node('tf_lookup_example')
              listener = tf.TransformListener()
              # give the fresh listener a moment to fill its buffer
              listener.waitForTransform('/end_effector', '/4x4_82', rospy.Time(), rospy.Duration(1.0))
              (trans, rot) = listener.lookupTransform('/end_effector', '/4x4_82', rospy.Time(0))
              rospy.loginfo("end_effector->marker_82 trans: {0} rot: {1}".format(trans, rot))
              

              Lesson 7 -- Inverse Kinematics in MoveIt!

              • Now we can find things, move robot joints, and roughly estimate how to make a robot go somewhere.

              • We want to put the arm at a certain spot, what are the joint angles to get there?

              • Bonus points if the robot doesn't flail around madly. (i.e. optimal solution)

              • Bonus points if the robot doesn't slam into itself or others (i.e. self-collision checking).

              • Guess what -- ROS has tools for this too!

              • This is called inverse kinematics. Lots of math. Lots of fun.

              • roslaunch moveit_setup_assistant setup_assistant.launch
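
              MoveIt! handles the general case, but for intuition here is the math on a toy: a plain-numpy sketch (not our five-joint arm, just an illustration with invented names) of the closed-form IK for a two-link planar arm, using the law of cosines.

```python
import numpy as np

def two_link_ik(x, y, l1, l2):
    """Closed-form IK for a planar 2-link arm: given a target (x, y)
    and link lengths l1, l2, return (shoulder, elbow) angles in radians
    for the elbow-up solution. Raises ValueError if out of reach."""
    r2 = x * x + y * y
    # cos(elbow angle) from the law of cosines
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        raise ValueError("target out of reach")
    theta2 = -np.arccos(c2)              # pick the elbow-up branch
    k1 = l1 + l2 * np.cos(theta2)
    k2 = l2 * np.sin(theta2)
    theta1 = np.arctan2(y, x) - np.arctan2(k2, k1)
    return theta1, theta2
```

              Running the forward kinematics on the returned angles should land the end effector back on the target, which is the standard sanity check for any IK solver.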

              
              