C++ Assignment: COMP3017 Ball Localization

Objects in the vicinity of the TurtleBot can be recognized and interacted with via external sensors. The task is to implement the detection of a ball in camera images and its localization using a Kalman filter.

Ball

Requirement

Objects in the vicinity of the TurtleBot3 can be recognized and interacted with via the external sensors. In this programming task, the detection of an orange tennis ball in RGB images from the Intel RealSense R200 and the subsequent determination of its position (ball modeling) are to be implemented using a Kalman filter.

Implement the following functionalities in the classes Detector (detector.cpp, detector.h) and KalmanFilter (kalman_filter.cpp, kalman_filter.h).

Ball detection

First of all, the balls contained in the camera images must be detected. For this you should use the OpenCV library, which provides an extensive collection of image processing tools. A possible solution strategy is briefly outlined in Figure 1; you are free to implement alternatives to this exemplary solution.

The class Detector has the update method, which receives as input parameters the camera-captured and rectified RGB image as a matrix in the img parameter, the current time stamp, and the time dt that has passed since the last detection. The image is received via the topic "sensor/camera/rgb/image_rect_color". In order to carry out the "Time Update" and "Measurement Update" of the Kalman filter to be implemented in the next part of the task, the balls in the passed image must first be detected (see Figure 1). The ball detection is to be factored out into the method detectBalls, which receives the color image and the current time stamp as input and returns the detected ball positions ("percepts") in robot coordinates as a vector of type std::vector<Detector::BallPercept>. The threshold values for the color segmentation are defined in the corresponding variables in the header file. Use the projectBallPerceptTo3d method to convert the ball positions detected in the binary image into robot coordinates and thus create the BallPercept objects required for the "Measurement Update". Assign the current time stamp to these objects.

The following heuristics can be used to filter the minimum enclosing circles that OpenCV finds for the previously determined contours. The circles should have a radius larger than 10 px. Furthermore, the proportion of the contour area (also determined via OpenCV) in the circle area should be greater than 60%. You are free to adapt and extend these heuristics, for example by checking the plausibility of the ball position (strongly negative z values would mean that the ball is in the ground).
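
A minimal sketch of how detectBalls could combine the color segmentation, contour extraction, and these heuristics is shown below. The exact parameter list, the names of the threshold members (here hsv_lower_, hsv_upper_), the BallPercept fields, and the signature of projectBallPerceptTo3d are assumptions; use the declarations from detector.h.

```cpp
#include <opencv2/imgproc.hpp>
#include "detector.h"

// Sketch of a possible detectBalls implementation (member and field names are assumptions).
std::vector<Detector::BallPercept> Detector::detectBalls(const cv::Mat& img,
                                                         const ros::Time& stamp)
{
  std::vector<BallPercept> percepts;

  // 1) Color segmentation: convert to HSV and threshold the orange ball color.
  cv::Mat hsv, binary;
  cv::cvtColor(img, hsv, cv::COLOR_BGR2HSV);
  cv::inRange(hsv, hsv_lower_, hsv_upper_, binary);  // threshold values from detector.h (names assumed)

  // 2) Contour extraction on the binary image.
  std::vector<std::vector<cv::Point>> contours;
  cv::findContours(binary, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

  for (const std::vector<cv::Point>& contour : contours)
  {
    // 3) Minimum enclosing circle of the contour.
    cv::Point2f center;
    float radius = 0.f;
    cv::minEnclosingCircle(contour, center, radius);

    // 4) Heuristics: radius > 10 px and contour area > 60 % of the circle area.
    const double circle_area = CV_PI * radius * radius;
    if (radius <= 10.f || cv::contourArea(contour) <= 0.6 * circle_area)
      continue;

    // 5) Project the image position into robot coordinates (signature assumed)
    //    and attach the current time stamp.
    BallPercept percept = projectBallPerceptTo3d(center, radius);
    percept.timestamp = stamp;  // field name assumed
    percepts.push_back(percept);
  }
  return percepts;
}
```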

After calling the ball detection in the update method, a "Time Update" is first carried out with the help of dt; then, based on the detected ball positions (percepts) and their time stamps, the "Measurement Update" of the Kalman filter, which is to be implemented as described below, is performed.
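
A sketch of the resulting update flow is given below, assuming the Kalman filter is reachable via a member (here called kalman_filter_) and that correctionStep takes no arguments; adapt the names to the given header files.

```cpp
#include "detector.h"

// Sketch of the update flow (member names and exact signatures are assumptions).
void Detector::update(const cv::Mat& img, const ros::Time& stamp, double dt)
{
  // Detect all balls in the current image and project them to robot coordinates.
  const std::vector<BallPercept> percepts = detectBalls(img, stamp);

  // Time Update: predict the ball state dt seconds ahead.
  kalman_filter_.timeUpdateStep(dt);

  // Measurement Update: correct the prediction with every detected ball percept.
  for (const BallPercept& percept : percepts)
  {
    kalman_filter_.updateMeasurement(percept.position, percept.timestamp);  // field names assumed
    kalman_filter_.correctionStep();
  }
}
```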

Ball modeling

The Kalman filter implementation cv::KalmanFilter available in OpenCV is to be used for the ball modeling. The KalmanFilter class to be implemented should estimate the state of each ball using the system model x_k = A x_{k-1} + w_k with the state transition matrix A and the measurement model z_k = H x_k + v_k with the measurement matrix H. The state should consist of the position and velocity of the ball in robot coordinates, x_k = (x_k, y_k, z_k, ẋ_k, ẏ_k, ż_k)^T.
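
Under a constant-velocity assumption (one possible, common choice for this kind of ball model), the two matrices take the following form, where Δt is the time step dt:

```latex
% Constant-velocity model (sketch): the position integrates the velocity over \Delta t,
% and only the position components are measured.
A = \begin{pmatrix}
1 & 0 & 0 & \Delta t & 0 & 0\\
0 & 1 & 0 & 0 & \Delta t & 0\\
0 & 0 & 1 & 0 & 0 & \Delta t\\
0 & 0 & 0 & 1 & 0 & 0\\
0 & 0 & 0 & 0 & 1 & 0\\
0 & 0 & 0 & 0 & 0 & 1
\end{pmatrix},
\qquad
H = \begin{pmatrix}
1 & 0 & 0 & 0 & 0 & 0\\
0 & 1 & 0 & 0 & 0 & 0\\
0 & 0 & 1 & 0 & 0 & 0
\end{pmatrix}
```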

In the initKalman method, the Kalman filter (kf_), the system state (state_) and the measurement (measurement_) must be initialized. In addition to the instantiation of the Kalman filter, its attributes must be initialized: the state transition matrix A (transitionMatrix), the measurement matrix H (measurementMatrix), and the covariance matrices Q (processNoiseCov) and R (measurementNoiseCov). For the system covariance matrix Q, the values 10e-5 for the position estimate and 10e-4 for the velocity estimate can be assumed; the matrix should only have entries on its diagonal. The values of the measurement noise covariance R can be assumed to be 0.001 on the diagonal. Take a look at the matrix operations available in OpenCV for the initialization.
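
One way to set this up with OpenCV's matrix operations is sketched below; it assumes the state layout (x, y, z, ẋ, ẏ, ż) and float (CV_32F) matrices.

```cpp
#include <opencv2/video/tracking.hpp>  // cv::KalmanFilter
#include "kalman_filter.h"

// Sketch of initKalman (a sketch under the assumptions above; see kalman_filter.h for exact members).
void KalmanFilter::initKalman()
{
  // 6 state components (x, y, z, vx, vy, vz), 3 measured components (x, y, z), no control input.
  kf_.init(6, 3, 0, CV_32F);
  state_ = cv::Mat::zeros(6, 1, CV_32F);
  measurement_ = cv::Mat::zeros(3, 1, CV_32F);

  // A: identity for now; the dt-dependent entries are written in timeUpdateStep.
  cv::setIdentity(kf_.transitionMatrix);

  // H: measure only the position part of the state.
  kf_.measurementMatrix = cv::Mat::zeros(3, 6, CV_32F);
  kf_.measurementMatrix.at<float>(0, 0) = 1.0f;
  kf_.measurementMatrix.at<float>(1, 1) = 1.0f;
  kf_.measurementMatrix.at<float>(2, 2) = 1.0f;

  // Q: diagonal process noise, 10e-5 for the position and 10e-4 for the velocity components.
  kf_.processNoiseCov = cv::Mat::zeros(6, 6, CV_32F);
  for (int i = 0; i < 3; ++i)
  {
    kf_.processNoiseCov.at<float>(i, i) = 10e-5f;
    kf_.processNoiseCov.at<float>(i + 3, i + 3) = 10e-4f;
  }

  // R: diagonal measurement noise of 0.001.
  cv::setIdentity(kf_.measurementNoiseCov, cv::Scalar::all(0.001));
}
```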

The updateMeasurement method to be implemented receives the ball position estimated by the image recognition in robot coordinates and the current time stamp. The position should be stored in the measurement measurement_ and recorded in the ball model ball_model_ as the last detected position at the given time stamp.
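
A sketch of updateMeasurement, assuming the ball position is passed as a cv::Point3f and that ball_model_ has fields for the last detected position and its time stamp (names assumed):

```cpp
// Sketch of updateMeasurement (ball_model_ field names are assumptions).
void KalmanFilter::updateMeasurement(const cv::Point3f& position, const ros::Time& stamp)
{
  // Store the detected position as the current measurement z_k.
  measurement_.at<float>(0) = position.x;
  measurement_.at<float>(1) = position.y;
  measurement_.at<float>(2) = position.z;

  // Remember it in the ball model as the last detected position at this time stamp.
  ball_model_.last_detected_position = position;  // field name assumed
  ball_model_.last_detection_stamp = stamp;       // field name assumed
}
```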

The timeUpdateStep method to be implemented carries out the "Time Update" with the time step dt on the Kalman filter kf_. To do so, the transition matrix must be updated with dt. Optionally, you can also adapt the processNoiseCov matrix so that higher uncertainties, e.g. caused by the ball movement, are taken into account in the corresponding components of the position or velocity estimate. The position and velocity estimated after the "Time Update" and the current time stamp ros::Time::now() should be saved in the ball model ball_model_.
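
A sketch of timeUpdateStep, assuming the constant-velocity transition matrix shown above and hypothetical ball_model_ field names:

```cpp
// Sketch of timeUpdateStep (ball_model_ field names are assumptions).
void KalmanFilter::timeUpdateStep(double dt)
{
  // Write dt into the constant-velocity transition matrix A.
  kf_.transitionMatrix.at<float>(0, 3) = static_cast<float>(dt);
  kf_.transitionMatrix.at<float>(1, 4) = static_cast<float>(dt);
  kf_.transitionMatrix.at<float>(2, 5) = static_cast<float>(dt);

  // Time Update: predict state and error covariance.
  const cv::Mat prediction = kf_.predict();

  // Store the predicted position/velocity and the current time in the ball model.
  ball_model_.position = cv::Point3f(prediction.at<float>(0), prediction.at<float>(1),
                                     prediction.at<float>(2));  // field names assumed
  ball_model_.velocity = cv::Point3f(prediction.at<float>(3), prediction.at<float>(4),
                                     prediction.at<float>(5));
  ball_model_.stamp = ros::Time::now();
}
```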

In the correctionStep method, the "Measurement Update" should be carried out with measurement_ on the Kalman filter kf_, and the corrected position and velocity as well as the current time stamp ros::Time::now() should be saved in the ball model ball_model_.
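
correctionStep can then be sketched analogously (again with assumed ball_model_ field names):

```cpp
// Sketch of correctionStep (ball_model_ field names are assumptions).
void KalmanFilter::correctionStep()
{
  // Measurement Update: correct the prediction with the stored measurement z_k.
  const cv::Mat corrected = kf_.correct(measurement_);

  // Store the corrected position/velocity and the current time in the ball model.
  ball_model_.position = cv::Point3f(corrected.at<float>(0), corrected.at<float>(1),
                                     corrected.at<float>(2));  // field names assumed
  ball_model_.velocity = cv::Point3f(corrected.at<float>(3), corrected.at<float>(4),
                                     corrected.at<float>(5));
  ball_model_.stamp = ros::Time::now();
}
```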

Figure 1: Possible sequence of steps for the ball detection in the update and detectBalls methods, based on an RGB color image I_RGB. The image is first converted into a binary image I_SW that separates regions potentially containing a ball from the background. Object detection is then carried out on this binary image, and the image positions of all balls found are converted into robot coordinates B_xyz and returned.

Hints and starting the program

  • Start the application:
    • Start the ball localization: roslaunch turtlebot3_ball_tracking_exercise start_ball_tracking_with_bag.launch
  • Various ROS bag files are available for testing your solution; they can be downloaded from Moodle and must be stored in the "bags" subfolder. You can change the bag file used by adapting the launch file or by adding the optional parameter bag:=my_file.bag when executing roslaunch. The bag files contain recordings of the Intel RealSense R200 on the TurtleBot.
    • Bags for checking the ball detection:
      optimal_circle: 25 cm distance from the camera, horizontally shifted by -10 cm, -1.8 cm below the camera.
      close_real_ball: 10 cm distance from the camera, horizontally shifted by 0 cm, -1.8 cm below the camera.
    • Bags for checking the position estimate using the Kalman filter:
      ball_occluded: 60 cm distance from the camera, horizontally shifted by 20 cm, the ball is covered.
      ball_obstacle: position approx. 60 cm from the camera, runs horizontally from 50 cm to -30 cm behind the obstacle.
      ball_at_ball: one ball runs past another ball.

Note: The bag files should be decompressed before use. To decompress all bag files at once, execute the following command in the folder: rosbag decompress *.bag

  • Other optional parameters of the launch file are (see the example after this list):
    • loop:=true|false: the bag file is played in an endless loop (default: false)
    • rate:=x: the bag file is played back at speed x (default: 1.0)
  • You can also check your implementation of the Kalman filter methods using the provided tests. The tests only serve as additional support and do not cover all cases; passing all tests does not mean that the full score for the sub-task has been achieved.
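
For example, roslaunch turtlebot3_ball_tracking_exercise start_ball_tracking_with_bag.launch bag:=my_file.bag loop:=true rate:=0.5 combines the optional parameters described above and would play my_file.bag at half speed in an endless loop.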

Build and run the tests

turtle test turtlebot3_ball_tracking_exercise ball_tracking.test
  • Below is a note to help you evaluate your solution. During your test runs, make sure that the boundary of the ball is detected as precisely as possible. The Kalman filter should be able to use the previous ball trajectory for prediction even when there is no ball detection, so that no new models are generated for the same ball.

Figure 2: Debug visualization of the ball localization, consisting of the camera image with the detected ball position (left), the binary image for the ball detection (top right), and the position estimate of the Kalman filter (bottom right).

  • Various parameters are stored in the "ball_tracking_config.yaml" file, such as the threshold values for the color segmentation or whether debug images from the video sequence are written (requires an "img" folder in the package). By default, a visualization of the detection is displayed, consisting of the video signal with the detected ball position, the binary image, and the ball position estimate of the Kalman filter (see Figure 2). The red circle around the ball position estimate visualizes the error covariance P_k of the Kalman filter and thus the uncertainty of its estimate.