University of Florida
Research Faculty
PID Controller with ROS
A PID controller is implemented that feeds back on the translational error and the yaw error.
The TurtleBot follows a rectangular trajectory defined by waypoints specified in the Cartesian coordinate frame.
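The feedback loop described above can be sketched as follows. The unicycle model, gains, tolerances, and waypoint values are illustrative assumptions, and only the proportional terms of the controller are shown; none of these numbers come from the project's tuned controller.

```python
import math

# Hypothetical gains and tolerances; only the proportional terms of the
# PID are shown, and none of these values are from the project itself.
KP_DIST = 0.8    # gain on translational (distance) error
KP_YAW = 2.0     # gain on yaw (heading) error

def control(pose, waypoint):
    """One feedback step: (v, omega) from translational and yaw error."""
    x, y, theta = pose
    dist_err = math.hypot(waypoint[0] - x, waypoint[1] - y)
    yaw_err = math.atan2(waypoint[1] - y, waypoint[0] - x) - theta
    yaw_err = math.atan2(math.sin(yaw_err), math.cos(yaw_err))  # wrap to [-pi, pi]
    return KP_DIST * dist_err, KP_YAW * yaw_err

def follow(waypoints, pose=(0.0, 0.0, 0.0), dt=0.05, tol=0.05):
    """Drive a unicycle model through the waypoints; returns the final pose."""
    x, y, theta = pose
    for wp in waypoints:
        for _ in range(10000):  # safety cap on iterations per waypoint
            if math.hypot(wp[0] - x, wp[1] - y) <= tol:
                break
            v, w = control((x, y, theta), wp)
            x += v * math.cos(theta) * dt
            y += v * math.sin(theta) * dt
            theta += w * dt
    return x, y, theta

# A rectangular trajectory defined by Cartesian waypoints, as in the text.
rectangle = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.0, 0.0)]
```

In the actual ROS setup, `v` and `omega` would be published as a velocity command (e.g. a `geometry_msgs/Twist`) rather than integrated in place.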
The HITL video shows an example of an improper dynamics update rate. An important step for HITL is to launch the node that maps the velocity command to the motor torques of the Kobuki base.
A TurtleBot spawns in Gazebo through a ROS node (SITL).
Gazebo sends the current pose information to the controller node.
Kobuki ground robot in action with the PID Controller (HITL).
The motion capture system (MoCap) sends the current pose information to the controller node.
The jitter is due to the lower publishing frequency of the pose messages.
Pose Graph Optimization
Factor graphs are graphical models (Koller and Friedman, 2009) that are well suited to modeling complex estimation problems, such as Simultaneous Localization and Mapping (SLAM) or Structure from Motion (SFM). Specifically, a factor graph is a bipartite graph consisting of factor nodes connected to variable nodes. The variables represent the unknown random variables in the estimation problem, whereas the factors represent probabilistic constraints on those variables, derived from measurements or prior knowledge.
(Source: Factor Graphs and GTSAM)
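As a minimal illustration of this bipartite structure, a 1-D factor graph with one prior factor and one odometry factor can be written down directly; the values, sigmas, and the brute-force grid search below are assumptions for the sketch, not GTSAM's API.

```python
# Toy 1-D factor graph: variables x0, x1; each factor is a whitened residual.
# A prior anchors x0 near 0; an odometry measurement says x1 - x0 ≈ 2.
# (All numeric values and sigmas are illustrative.)
factors = [
    # (variables touched, residual function) -- the bipartite structure:
    (("x0",),      lambda v: (v["x0"] - 0.0) / 0.1),              # prior factor
    (("x0", "x1"), lambda v: ((v["x1"] - v["x0"]) - 2.0) / 0.2),  # odometry factor
]

def neg_log_likelihood(values):
    """MAP estimation minimizes the sum of squared, whitened residuals."""
    return 0.5 * sum(f(values) ** 2 for _, f in factors)

# Grid search over x1 (with x0 held at its prior) as a minimal MAP example.
best = min((neg_log_likelihood({"x0": 0.0, "x1": k / 100}), k / 100)
           for k in range(0, 401))
```

A real solver (such as GTSAM's Gauss-Newton or Levenberg-Marquardt optimizers) would linearize and solve all variables jointly instead of searching a grid.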
A simple hexagonal trajectory is traversed by the agent.
New nodes are created based on pre-defined odometry.
The uncertainty ellipse increases with time.
Loop closure constraint optimizes the trajectory and reduces uncertainty.
A figure-9 trajectory is traversed by the agent.
Exact waypoints form the nodes for the "Truth" trajectory.
Gaussian-corrupted nodes form the "Unoptimized" trajectory.
Post-optimization, nodes form the "Optimized" trajectory.
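The effect of a loop-closure constraint can be sketched with a 1-D toy pose graph. The step values, noise terms, loop-closure weight, and the plain gradient-descent solver below are all illustrative assumptions; a real implementation would use a nonlinear least-squares optimizer such as those in GTSAM.

```python
# 1-D pose-graph toy: six odometry steps out-and-back, plus a loop-closure
# factor tying the last pose to the first. All numbers are illustrative.
TRUE_STEPS = [1.0, 1.0, 1.0, -1.0, -1.0, -1.0]   # agent returns to its start
NOISE      = [0.05, -0.02, 0.08, 0.03, -0.04, 0.06]
odom = [s + n for s, n in zip(TRUE_STEPS, NOISE)]  # noisy odometry

# "Unoptimized" trajectory: dead-reckoned integration of odometry.
raw = [0.0]
for u in odom:
    raw.append(raw[-1] + u)

# "Optimized" trajectory: minimize odometry residuals plus the loop-closure
# residual (x_last - x_first ≈ 0) by gradient descent on the quadratic cost.
W_LOOP = 10.0          # loop closure weighted more than a single odometry step
x = list(raw)
for _ in range(5000):
    grad = [0.0] * len(x)
    for i, u in enumerate(odom):
        r = x[i + 1] - x[i] - u          # odometry residual
        grad[i]     -= 2 * r
        grad[i + 1] += 2 * r
    r = x[-1] - x[0]                      # loop-closure residual
    grad[-1] += 2 * W_LOOP * r
    grad[0]  -= 2 * W_LOOP * r
    grad[0] = 0.0                         # anchor the first pose (prior)
    x = [xi - 0.05 * g for xi, g in zip(x, grad)]
```

The dead-reckoned endpoint drifts away from the start, while the optimized endpoint is pulled back toward it, with the accumulated error redistributed along the chain, mirroring the "Unoptimized" vs. "Optimized" trajectories above.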
Swarm of MAVs (Micro Aerial Vehicles)
Crazyflie sequential take-off. Each member of the swarm aimed to take off one after the other and then move by 1 unit along one of the axes of the coordinate system.
#1 takes off successfully but crashes due to propeller loss.
#2 takes off and hovers successfully but crashes after its sensors are misled by an obstacle in its trajectory.
#3 takes off, hovers, and lands successfully.
Terrain Relative Navigation
This project is an animated visualization that serves as a foundation for Terrain Relative Navigation (TRN). The first step in TRN is feature identification. For this investigation, the terrain features are defined as points in the Cartesian coordinate system and are made available a priori to the flying agent (a quadcopter) for detection. In a real application, the flying agent would instead rely on visual input from an onboard camera.
The rectangular box in the animated figure represents the field of view (FOV) of an autonomous flying agent (UAV), and the colored points represent features on the ground/terrain, each assigned a unique ID. The relative coordinates of the terrain features are used to determine their relative elevation and azimuth angles. These parameters are intended to be inputs to the particle filter. The package has been developed in MATLAB R2020a.
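The elevation/azimuth computation and FOV check can be sketched as follows (in Python rather than the package's MATLAB). The angle conventions and the rectangular-footprint FOV model are assumptions made for the sketch.

```python
import math

def elevation_azimuth(agent, feature):
    """Relative elevation and azimuth of a terrain feature from the agent.

    agent, feature: (x, y, z) in a shared Cartesian frame. The conventions
    (azimuth measured from +x, elevation as look-down angle below the
    horizontal) are assumptions, not taken from the package.
    """
    dx, dy, dz = (f - a for f, a in zip(feature, agent))
    azimuth = math.atan2(dy, dx)                      # bearing in the x-y plane
    elevation = math.atan2(-dz, math.hypot(dx, dy))   # look-down angle
    return elevation, azimuth

def in_fov(agent, feature, half_width, half_height):
    """Check whether a feature lies inside the rectangular FOV footprint
    centered directly beneath the agent (an assumed footprint model)."""
    dx, dy = feature[0] - agent[0], feature[1] - agent[1]
    return abs(dx) <= half_width and abs(dy) <= half_height
```

Per the text, the resulting elevation and azimuth of each detected feature would then be fed to the particle filter as measurements.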