Thought some of you (esp. Linux_0 and MU_Engineer) might be interested to hear about the robot I made last summer. (A little late, I know, but I've been really busy between then and now, hence my absence.)
A link with reports and such: http://www.cise.ufl.edu/~bmouring/
Basically, I got a small PC/104 single-board computer with an Intel XScale (ARM-based) processor. This board received the video stream from the camera, parsed individual frames out of the stream (which required a modification to libjpeg), decompressed each JPEG frame, ran various filters on the image (a simple gamma-distance filter, a vertical-enhancing Sobel, and horizontal/vertical histogramming), ran some simple AI on the results (fitting to a profile to produce a confidence value for how likely it was that a block had been located, and not, say, a same-colored ball), and then sent the results to the microcontroller board.

The microcontroller board (programmed using Eclipse coupled with a gcc toolchain built to target Atmel uCs) took that info, along with sonar rangefinder readings, and headed toward the block. If something got in the robot's way before it reached the block, it would avoid the obstacle, reacquire the block visually, and continue. Once it picked up the block, it would try to find me (actually, my black shoes) and return the block to me. Rough sketches of the main pieces are below.
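Since the frame handling is the fiddly part, here's roughly what the decompress step looks like with stock libjpeg. Caveat: jpeg_mem_src() only showed up in libjpeg v8 / libjpeg-turbo; my build pulled frames out of the MJPEG stream through the modified source manager instead, which isn't shown here.

/* Minimal sketch: decompress one JPEG frame to a grayscale buffer.
 * Assumes the frame has already been split out of the video stream.
 * jpeg_mem_src() requires libjpeg v8+ or libjpeg-turbo. */
#include <stdio.h>
#include <stdlib.h>
#include <jpeglib.h>

unsigned char *decode_frame(const unsigned char *jpg, unsigned long len,
                            int *w, int *h)
{
    struct jpeg_decompress_struct cinfo;
    struct jpeg_error_mgr jerr;
    unsigned char *pixels;

    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_decompress(&cinfo);
    jpeg_mem_src(&cinfo, (unsigned char *)jpg, len);

    jpeg_read_header(&cinfo, TRUE);
    cinfo.out_color_space = JCS_GRAYSCALE;  /* the filters want grayscale */
    jpeg_start_decompress(&cinfo);

    *w = cinfo.output_width;
    *h = cinfo.output_height;
    pixels = malloc((size_t)*w * (size_t)*h);

    while (cinfo.output_scanline < cinfo.output_height) {
        unsigned char *row = pixels + (size_t)cinfo.output_scanline * *w;
        jpeg_read_scanlines(&cinfo, &row, 1);
    }

    jpeg_finish_decompress(&cinfo);
    jpeg_destroy_decompress(&cinfo);
    return pixels;
}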
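The vertical-enhancing Sobel is nothing exotic, just the horizontal-gradient kernel, which responds strongly to vertical edges like the sides of a block. Something along these lines:

/* Sketch of the vertical-edge-enhancing Sobel pass.
 * Input and output are 8-bit grayscale, w x h, row-major. */
void sobel_vertical(const unsigned char *in, unsigned char *out,
                    int w, int h)
{
    /* Gx kernel:  -1 0 +1
     *             -2 0 +2
     *             -1 0 +1  */
    for (int y = 1; y < h - 1; y++) {
        for (int x = 1; x < w - 1; x++) {
            int gx = -in[(y-1)*w + (x-1)]   + in[(y-1)*w + (x+1)]
                   - 2*in[ y   *w + (x-1)] + 2*in[ y   *w + (x+1)]
                   -   in[(y+1)*w + (x-1)]   + in[(y+1)*w + (x+1)];
            if (gx < 0) gx = -gx;              /* magnitude only */
            out[y*w + x] = gx > 255 ? 255 : (unsigned char)gx;
        }
    }
}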
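Histogramming is just per-column and per-row sums of the filtered image. The reason it's useful: a block produces a flat-topped, roughly rectangular profile in both directions, while a same-colored ball produces a rounded one, and that difference is what the profile fit turns into a confidence value. A sketch:

/* Sketch of the horizontal/vertical histogramming step: sum filtered
 * pixel intensities per column and per row. col_sum must hold w
 * entries and row_sum h entries. */
void histogram_xy(const unsigned char *img, int w, int h,
                  unsigned long *col_sum, unsigned long *row_sum)
{
    for (int x = 0; x < w; x++) col_sum[x] = 0;
    for (int y = 0; y < h; y++) row_sum[y] = 0;

    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++) {
            col_sum[x] += img[y*w + x];
            row_sum[y] += img[y*w + x];
        }
}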
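And on the uC side, the behavior logic boiled down to a loop like the one below. All the driver functions here (uart_get_target(), sonar_read_cm(), and friends) are hypothetical stand-ins, not my actual AVR code; this is only meant to show the control flow.

/* Sketch of the uC-side behavior loop. The driver layer below is a
 * hypothetical stand-in for the real UART/sonar/motor/gripper code. */
#include <stdint.h>

#define OBSTACLE_CM 30   /* sonar distance that triggers avoidance */

struct target { uint8_t valid; int8_t x_offset; uint8_t close; };

/* hypothetical driver layer */
struct target uart_get_target(void);   /* latest report from XScale board */
uint8_t sonar_read_cm(void);
void turn_in_place(void);
void steer(int8_t x_offset);           /* steer by block's pixel offset */
void avoid_obstacle(void);             /* turn, clear the obstacle, resume */
void pickup(void);

void behave(void)
{
    for (;;) {
        if (sonar_read_cm() < OBSTACLE_CM) {
            avoid_obstacle();          /* vision reacquires the block after */
            continue;
        }
        struct target t = uart_get_target();
        if (!t.valid)
            turn_in_place();           /* low confidence: keep searching */
        else if (t.close)
            pickup();
        else
            steer(t.x_offset);
    }
}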
Once again, showing Linux is more than just wobbly windows.
Up next: a UMC-controlled mini-mill. Can you say "homebrew prototyping"?