Quick example of localization using only a 360-degree LiDAR. The final robot will use RTK + IMU + odometry, running through an extended Kalman filter with a 6D model (3D position and 3D orientation) to combine measurements from wheel odometry, the IMU, visual odometry, and GPS. The fused estimate will then be fed to the AMCL package, which uses the LiDAR to localize against known structures in a known map. The key to making this work in real life is getting a reasonably accurate real-world position for the robot.
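To show the fusion idea in miniature, here is a toy 1D Kalman filter: predict from a wheel-odometry increment, then correct with a GPS-like absolute measurement. This is only a sketch of the principle; the real robot would use a full 6D EKF (e.g. from a ROS package such as robot_localization), and all names and noise values below are illustrative assumptions.

```python
# Toy 1D Kalman filter: predict from odometry, correct with an absolute fix.
# Illustrative only -- the real robot fuses 6D state (position + orientation).

def predict(x, p, dx, q):
    """Propagate the estimate by an odometry increment dx; variance grows by q."""
    return x + dx, p + q

def update(x, p, z, r):
    """Correct the estimate with an absolute measurement z of variance r."""
    k = p / (p + r)                      # Kalman gain: trust in the measurement
    return x + k * (z - x), (1.0 - k) * p

x, p = 0.0, 1.0                          # initial position estimate and variance
for dx, z in [(1.0, 1.2), (1.0, 2.1), (1.0, 2.9)]:
    x, p = predict(x, p, dx, q=0.1)      # odometry drifts, uncertainty grows
    x, p = update(x, p, z, r=0.5)        # GPS pulls it back, uncertainty shrinks

print(round(x, 2), round(p, 2))
```

The pattern is the same at full scale: relative sensors (odometry, IMU) drive the prediction step and accumulate drift, while absolute sensors (RTK GPS, AMCL matches) bound that drift in the update step.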
The virtual fences only work if the estimated position closely matches the real world. A virtual fence acts just like a normal perimeter wire (a limit beyond which the robot must not go). Later on, when creating the virtual fences, one will drive the robot around the lawn with a joystick to record them (for example lawn limits, trees, and other obstacles). This is the first test, and the results look promising. Later we will also explore using the 3D camera for real-time elevation mapping in combination with the 360-degree LiDAR to further increase accuracy. I posted a small video of the current process of launching and creating fences.
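A recorded fence can be treated as a polygon of map coordinates, and "inside the fence" becomes a point-in-polygon test. The sketch below uses the standard ray-casting method; the helper name and the example coordinates are made up for illustration, not taken from any package.

```python
# Virtual fence check: a fence is a polygon of (x, y) map coordinates
# recorded while joysticking the robot around the lawn boundary.
# inside_fence() is a hypothetical helper, shown here for illustration.

def inside_fence(x, y, fence):
    """Ray-casting point-in-polygon test: True if (x, y) lies inside the fence."""
    inside = False
    n = len(fence)
    for i in range(n):
        x1, y1 = fence[i]
        x2, y2 = fence[(i + 1) % n]
        # Count how many edges a horizontal ray from (x, y) to the right crosses.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

lawn = [(0.0, 0.0), (10.0, 0.0), (10.0, 8.0), (0.0, 8.0)]  # recorded boundary
print(inside_fence(5.0, 4.0, lawn))    # robot on the lawn
print(inside_fence(12.0, 4.0, lawn))   # robot past the perimeter
```

This is also why the position estimate must track the real world closely: a fence polygon is only as trustworthy as the coordinates the robot reports while driving along it and while mowing inside it.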