nov 27 - first test of the sensor in the wild

today i went out for the first time with the robot - wheels, pi, webcam and lidar only. i pushed it like a stroller in front of me and guided it around the forest and neighbourhood for about an hour.

to get started i used 2 usb battery packs to power the lidar and my mobile phone hotspot to get both the pi and my laptop into the same network. then i did an ssh from the laptop to the pi and triggered the webcam via "nohup rosrun usb_cam usb_cam_node &", the lidar via "nohup roslaunch ydlidar ydlidar.launch &" and the recording via "nohup rosbag record /usb_cam/image_raw/compressed /scan --split --duration=5m --lz4 &".

The nohup and the & take care of running the job in the background and keeping it alive when my ssh session ends. The record options compress the recording and split it every 5 minutes (if power fails i do not lose the whole recording).

Back at home i stopped the recording and everything else and then sat down in my basement to review 50 minutes of recording.

I did not get virtualbox to work and VMware costs 50€ - so I did not get a decent vm with rviz running on the recordings - so i looked at all of it on the pi itself.

Observations in the data:

leaves make the line rather rugged

the line is really not straight, but the closer a point is the more likely it stays in the same position

the scan line can be tilted - from shaking or from the road itself - so the line is typically not horizontal

key is a line that is not jagged but smooth (need to find the right mathematical function for this)
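a quick sketch of one candidate for such a function - this is my own guess, not anything ros provides: score jaggedness as the mean absolute second difference of consecutive range readings, so a smooth ramp scores near zero and leaves push the score up.

```python
def roughness(ranges):
    """Jaggedness score for a scan line: mean absolute second
    difference of consecutive range readings. A straight or
    smoothly sloping line scores ~0; leaves and rubble push it up.
    (My own ad-hoc metric, not a ROS function.)"""
    if len(ranges) < 3:
        return 0.0
    seconds = [ranges[i - 1] - 2 * ranges[i] + ranges[i + 1]
               for i in range(1, len(ranges) - 1)]
    return sum(abs(s) for s in seconds) / len(seconds)

smooth = [1.0, 1.1, 1.2, 1.3, 1.4, 1.5]  # linear ramp
jagged = [1.0, 1.4, 0.9, 1.5, 0.8, 1.6]  # leaves
print(roughness(smooth))  # ~0 for the linear ramp
print(roughness(jagged))  # clearly larger
```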

above are patterns that I found to be recurring:

  1. the line from the floor is sometimes interrupted, with the right line low and the left line high - i attribute this to the laser making a full circle and then a short delay before it continues. so if i turn the laser around by 180° then this should go away TODO
  2. the half circle in the back - that is me
  3. the 2 dots are either some other source of error or the stick that i use for pushing. i have to test this explicitly next time TODO

below is a situation where on the left there is a hill and on the right it goes downhill. The downhill on the right is not so easy to see, but if the points are far away from the line towards the top right this is an indication.
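a minimal sketch of how that "far away from the line" check could work, assuming the ground points come in as (x, y) pairs (names and thresholds are mine): fit a least-squares line through the scan and look at the biggest residual - a drop-off on one side shows up as points far from the fitted line.

```python
def fit_line(points):
    """Least-squares fit y = m*x + b through (x, y) points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

def max_residual(points):
    """Largest vertical distance of any point from the fitted line -
    big residuals concentrated at one end hint at a hill or drop-off."""
    m, b = fit_line(points)
    return max(abs(y - (m * x + b)) for x, y in points)

flat = [(x, 0.5 * x + 1.0) for x in range(10)]          # even slope
dropoff = flat[:7] + [(7, 6.0), (8, 7.5), (9, 9.0)]      # right side bends away
print(max_residual(flat))     # ~0: the ground matches one line
print(max_residual(dropoff))  # large: the slope changes on the right
```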

Next I want to create a node that takes the laserscan and comes up with a model of the world ahead and around me that shows where it is safe to go and where not. i'm uncertain whether this model should be based on angles and depth information (very similar to the laserscan - could be input to "traditional" navigation algorithms in ros) or whether it should be more like ranges of angles where heading towards is safe, plus a value indicating how close i'm to trouble if i keep going forward.
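a sketch of the second variant - function name and the min_clear threshold are made up by me, this is not a ros api: collapse the scan into contiguous runs of beams with enough clearance, and keep each run's minimum range as the "how much room before trouble" value.

```python
import math

def safe_sectors(ranges, angle_min, angle_inc, min_clear=2.0):
    """Collapse a scan into ranges of angles that are safe to head
    towards: contiguous beams whose range exceeds min_clear.
    Each sector is (start_angle, end_angle, nearest_range), where
    nearest_range says how much room is left before trouble.
    min_clear is a guessed threshold, not a ROS parameter."""
    sectors = []
    start = None
    nearest = float("inf")
    for i, r in enumerate(list(ranges) + [0.0]):  # sentinel closes last run
        if r >= min_clear:
            if start is None:
                start = i
                nearest = r
            nearest = min(nearest, r)
        elif start is not None:
            sectors.append((angle_min + start * angle_inc,
                            angle_min + (i - 1) * angle_inc,
                            nearest))
            start = None
    return sectors

# synthetic scan: obstacle, open gap, obstacle, open gap
ranges = [0.5, 0.6, 3.0, 3.2, 3.1, 0.4, 2.5, 2.6]
for lo, hi, room in safe_sectors(ranges, -math.pi / 2, math.pi / 8):
    print("safe from %.2f to %.2f rad, %.1f m of room" % (lo, hi, room))
```

the nice part of this shape is that a steering node only has to pick the widest sector near straight ahead and watch its room value shrink.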

What is really good about the pushbroom laser configuration is that the area in front of me comes in at a very high resolution compared to all the rest.

Coming up next: a hole in the ground! Not very visible in the lidar - in the image it's filled with leaves. I guess "smoothness" of the curve means that the first derivative does not have any jumps.
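a sketch of that jump test on the raw range readings - max_step is a threshold i just made up, and the indexing is a crude hole/edge detector rather than anything proven:

```python
def derivative_jumps(ranges, max_step=0.3):
    """Flag positions where the first derivative (difference between
    consecutive range readings) itself jumps by more than max_step.
    A hole that swallows a few beams shows up as jumps at its edges.
    (max_step is a guessed threshold.)"""
    diffs = [b - a for a, b in zip(ranges, ranges[1:])]
    return [i + 1 for i, (d0, d1) in enumerate(zip(diffs, diffs[1:]))
            if abs(d1 - d0) > max_step]

ground = [2.0, 2.05, 2.1, 2.15, 2.2]  # smooth slope: no jumps
hole   = [2.0, 2.05, 3.4, 3.45, 2.2]  # two beams fall into a hole
print(derivative_jumps(ground))  # []
print(derivative_jumps(hole))    # edges of the hole get flagged
```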

Our first real obstacle is approaching:

Paved Paths are of course very nice to sense: