This week I did the first actual test drive on my lawn. Everything went fine and the robot mowed around nicely, even with the temporary front wheels mounted. For the test drive I am using the morph stack with a DS4 joystick. The speed is okay, though I could reduce it slightly in the joystick launch file.
Before this I could not test the robot with any resistance on the wheels, since the pulleys started to slip when force was applied. I re-designed all the pulleys with anti-slip protection and the problem was solved.
Tonight I have begun building the robot’s URDF for better visualization in RViz, so that it matches reality when driving around. It’s always good to aim for an exact match in the visualization; it helps both the human and the robot understand the environment more easily. The URDF will mostly be built from combined STL files to keep the mesh size down (I don’t want a too heavy model that steals performance away from other tasks).
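For anyone curious what this looks like, here is a minimal sketch of the kind of URDF I am building. The link/joint names, mesh path, and offsets are just placeholders for illustration, not my actual files:

```xml
<?xml version="1.0"?>
<robot name="mower">
  <!-- Chassis, visualized with a combined STL mesh (path is a placeholder) -->
  <link name="base_link">
    <visual>
      <geometry>
        <mesh filename="package://mower_description/meshes/chassis.stl"/>
      </geometry>
    </visual>
  </link>

  <!-- One drive wheel; a continuous joint lets RViz animate its rotation -->
  <link name="left_wheel_link">
    <visual>
      <geometry>
        <cylinder radius="0.1" length="0.05"/>
      </geometry>
    </visual>
  </link>
  <joint name="left_wheel_joint" type="continuous">
    <parent link="base_link"/>
    <child link="left_wheel_link"/>
    <!-- Offsets must match the real robot for the visualization to line up -->
    <origin xyz="0 0.2 0.05" rpy="0 0 0"/>
    <axis xyz="0 1 0"/>
  </joint>
</robot>
```

Getting every `origin` right is the tedious part, which is exactly where most of the time goes.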
Even when using meshes, the URDF takes a lot of time to build. All joints and sensors need to be in the right place for an accurate visualization.
The robot will use two IMUs (9DOF and ZR300), RTK GPS (Reach RTK), LiDAR (Scanse Sweep), wheel odometry (AS5048A absolute encoders), and maybe visual odometry if it works well outdoors, to get a good localization for the robot in the real world. I will also try the robot_localization package (sensor fusion) to combine the data from several sensors and get the robot’s state estimation. Let’s see how it goes further down the line. My mission for this week is to mount one IMU and the LiDAR, try to run gmapping in the real world (previously shown as a simulation), and test virtual fences.
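As a first sketch of the sensor fusion setup, a robot_localization `ekf_localization_node` launch could look roughly like this. The topic names are placeholders and the choice of which fields to fuse is just a starting assumption I will have to tune:

```xml
<launch>
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_se" clear_params="true">
    <param name="frequency" value="30"/>
    <!-- The mower drives on a plane, so ignore z, roll and pitch -->
    <param name="two_d_mode" value="true"/>

    <!-- Wheel odometry from the AS5048A encoders (topic name is a placeholder) -->
    <param name="odom0" value="/wheel/odometry"/>
    <!-- Config order: [x y z, roll pitch yaw, vx vy vz, vroll vpitch vyaw, ax ay az] -->
    <rosparam param="odom0_config">[false, false, false,
                                    false, false, false,
                                    true,  true,  false,
                                    false, false, true,
                                    false, false, false]</rosparam>

    <!-- IMU (topic name is a placeholder); fuse yaw, yaw rate and x acceleration -->
    <param name="imu0" value="/imu/data"/>
    <rosparam param="imu0_config">[false, false, false,
                                   false, false, true,
                                   false, false, false,
                                   false, false, true,
                                   true,  false, false]</rosparam>
  </node>
</launch>
```

The RTK position would later come in through navsat_transform_node as another odometry input, but I will start with wheel odometry and one IMU to keep things simple.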