Tuesday, July 23, 2013
In the past few months I made great progress with analyzing point clouds from the Velodyne sensors. One of the challenges is to write super fast algorithms. Looking back, I realize that a few things made my life easier:
- offline data. I spent quite some time writing tools to play back data recorded from the live sensor. I could play it back at real-time speed, or as fast as possible. The latter was really useful when I wanted to profile my code. It also allowed me to work from home and saved a lot of commuting time.
- a visualizer, tightly integrated with my code, so that it can access much of the internal data. With the visualizer I could look at the data, change the value of some parameters to see their effect in real time, pause, zoom, and really ask questions of my code, such as "why is this point considered foreground?"
- a profiler. I've been using Google's gperf. I also have easy-to-use timers that I can quickly integrate in my code to measure the execution time of a particular block. The timers print their results to stdout, and I wrote a script to parse the output and display the stats neatly in a table (mean, standard deviation, min, max). The timers were useful to cover for some of the shortcomings of the profiler (it has to be run in debug mode).
Thursday, July 4, 2013
A cloud computing platform for robots. See http://spectrum.ieee.org/automaton/robotics/robotics-software/roboearth-cloud-robotics-rapyuta-service