# 2d mapping using a webcam and a laser

Following on from my previous blog post using a webcam and a laser as a rangefinder, the rangefinder is now mounted on a cheap 3 euro stepper motor and used to make a 2d map of the surroundings. As you can see from the video the results are not great, but I think with a bit more work there is room for improvement. A PIC18F14K50 is used to control the stepper motor and send the current step value over to the computer, where a Python script matches it to the current laser distance. Then, using basic trigonometry, x-y points can be found and plotted on a graph.

*Webcam and laser rig*

The laser and webcam were just glued to a piece of wood, that piece of wood was stuck onto a stepper motor, and the stepper motor is held in a vice. The webcam is a Sweex WC003V3, available for 10 euros delivered on eBay. The laser diode came from China and cost around 4 dollars delivered. The stepper motor is a 28BYJ-48 model, available for 3 euros delivered from China. All in all you’re looking at around 30 quid for some basic 2d mapping once you include the PIC microcontroller, H-bridge IC etc.

Theory of operation:

*X-Y points*

We can work out theta by knowing how many steps the stepper motor has taken, and D is calculated using the webcam and laser combination. From there, using basic trigonometric equations, the x-y points can be worked out.

$x = D\cos(\theta)$
$y = D\sin(\theta)$
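
These two formulas are all the plotting stage needs. A minimal sketch of the conversion (Python 3, with a couple of made-up sample readings; the function name is mine, not from the script below):

```python
import math

def polar_to_xy(distances_cm, angles_deg):
    """Convert matched (D, theta) readings into x-y points."""
    x_points = [d * math.cos(math.radians(a)) for d, a in zip(distances_cm, angles_deg)]
    y_points = [d * math.sin(math.radians(a)) for d, a in zip(distances_cm, angles_deg)]
    return x_points, y_points

# two hypothetical readings: 100 cm at 0 degrees, 50 cm at 90 degrees
xs, ys = polar_to_xy([100.0, 50.0], [0.0, 90.0])
print(xs[0], ys[0])  # 100.0 0.0
```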

So the webcam-laser rig is rotated through 360 degrees, and as it rotates, values of D are matched with values of theta. After it has stopped rotating, this information can be converted into x-y points and plotted on a graph. The datasheet for the stepper motor says there are 4096 steps per 360 degrees. I think that might be incorrect though, as I found 2048 steps results in a 360 degree turn. From this information we can work out the minimum step angle possible.

$2048\ \text{steps} = 360\,^{\circ}$
$1\ \text{step} = 0.176\,^{\circ}$

This minimum angle is important for figuring out the smallest gap that can be “seen” at the maximum range of the device. In this case the webcam and laser rig is calibrated to around 2.5 metres max range. To find the smallest gap noticeable, simple trig can be used:

*theoretical smallest gap*

This means that at a range of 2.5 metres, the smallest gap that can be reliably detected is 0.763 cm wide. This is quite small, an ideal value for making a detailed map of a room. However, this is a theoretical limit: it assumes every step angle is matched with a distance value from the webcam, which is not the case due to the fps of the webcam.
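
The arithmetic behind those numbers can be checked in a few lines (a Python 3 sketch; it uses D·tan(θ), so it lands at roughly 0.77 cm rather than exactly 0.763 cm, but the ballpark is the same):

```python
import math

steps_per_rev = 2048
max_range_cm = 250.0                       # rig calibrated to ~2.5 m

step_angle_deg = 360.0 / steps_per_rev     # minimum step angle
gap_cm = max_range_cm * math.tan(math.radians(step_angle_deg))

print(round(step_angle_deg, 3))  # 0.176
print(round(gap_cm, 2))          # 0.77
```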

The main limitation is how fast you can store the information being received from both the PIC microcontroller and the webcam. The bottleneck in the process is definitely the webcam. At best I could get around 8 fps at 640×480 resolution. A higher fps could be achieved with a lower resolution and this is something I’m going to try in an attempt to get a faster, more detailed scan. As a result of the fairly low fps, the stepper needed to rotate fairly slowly to make up for the loss in scan resolution. The stepper takes 25 seconds to complete a full revolution.

$25\ \text{seconds} = 360\,^{\circ}$
$1\ \text{second} = 14.4\,^{\circ}$

In one second, the stepper rotates through 14.4 degrees. Since the webcam is running at 8 fps, this gives a new distance reading every 14.4/8 = 1.8 degrees. So now the minimum gap visible at max range becomes:

*smallest gap*

This brings the resolution down to 7.85 centimetres at the maximum range. This is still usable, but it’s something I might try to improve. The easiest way would be getting a better webcam. The PlayStation Eye camera works with OpenCV and can easily capture 60 fps, so that is definitely an option.
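
The resolution figures above follow directly from the rotation speed and frame rate, and the same sum shows what a 60 fps camera would buy (a Python 3 sketch using the small-angle approximation, and assuming the 25 second rotation time stays the same):

```python
import math

max_range_cm = 250.0
seconds_per_rev = 25.0
deg_per_second = 360.0 / seconds_per_rev      # 14.4 degrees per second

def gap_at_max_range(fps):
    """Smallest resolvable gap (cm) at max range for a given frame rate."""
    deg_per_frame = deg_per_second / fps      # angle swept between frames
    return max_range_cm * math.radians(deg_per_frame)

print(round(gap_at_max_range(8), 2))   # 7.85 cm at 8 fps
print(round(gap_at_max_range(60), 2))  # 1.05 cm at 60 fps
```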

Sourcecode:

Here is the code for the microcontroller to control the stepper and send the current step value over serial:


```c
//Written by Shane Ormonde 26/1/2014
//Compiled using the XC8 compiler.

#include <xc.h>           // device registers
#include <stdio.h>        // printf
#include <plib/delays.h>  // Delay10KTCYx (may be <delays.h> on older XC8 versions)

#pragma config FOSC=IRC,MCLRE=OFF,WDTEN=0,LVP=OFF,BOREN=OFF

//Functions
void setup(void);
void config_usart(void);
void step_pos(int n);
void step_neg(int n);

//Variables
int x_state = 1;
int count = 0;
char c;

void main(void)
{
    setup();
    config_usart();

    while (1)
    {
        if (PIR1bits.RCIF == 1) // new data in receive buffer
        {
            c = RCREG;
            if (c == 'g')
            {
                step_pos(2048); // rotate clockwise 360 degrees
                step_neg(2048); // rotate anticlockwise 360 degrees
            }
        }
    }
}

void setup(void)
{
    OSCCON = 0b01100011; // internal oscillator block, 8MHz clock speed
    TRISB = 0b00100000;  // all outputs except for rx pin
    TRISC = 0b00000000;
}

void config_usart(void)
{
    TXSTAbits.TXEN = 1; // transmit enabled
    TXSTAbits.SYNC = 0; // asynchronous mode
    RCSTAbits.SPEN = 1; // usart enabled, tx set as output
    RCSTAbits.CREN = 1; // enable rx
    SPBRG = 12;         // baud rate = 9600
}

void putch(char data) // required for printf to work, taken from the XC8 manual
{
    while (!TXIF)
        continue;
    TXREG = data;
}

void step_pos(int n) // accepts the number of clockwise steps required
{
    count = 0;
    while (count < n)
    {
        LATCbits.LATC2 = 1;
        Delay10KTCYx(2);
        LATCbits.LATC2 = 0;
        count++;
        printf("%d\n", count);

        if (count == n) break;

        LATBbits.LATB4 = 1;
        Delay10KTCYx(2);
        LATBbits.LATB4 = 0;
        count++;
        printf("%d\n", count);

        if (count == n) break;

        LATCbits.LATC7 = 1;
        Delay10KTCYx(2);
        LATCbits.LATC7 = 0;
        count++;
        printf("%d\n", count);

        if (count == n) break;

        LATCbits.LATC6 = 1;
        Delay10KTCYx(2);
        LATCbits.LATC6 = 0;
        count++;
        printf("%d\n", count);
    }
}

void step_neg(int n) // accepts the number of anti-clockwise steps required
{
    count = 0;
    while (count < n)
    {
        LATCbits.LATC6 = 1;
        Delay10KTCYx(2);
        LATCbits.LATC6 = 0;
        count++;
        printf("%d\n", count);

        if (count == n) break;

        LATCbits.LATC7 = 1;
        Delay10KTCYx(2);
        LATCbits.LATC7 = 0;
        count++;
        printf("%d\n", count);

        if (count == n) break;

        LATBbits.LATB4 = 1;
        Delay10KTCYx(2);
        LATBbits.LATB4 = 0;
        count++;
        printf("%d\n", count);

        if (count == n) break;

        LATCbits.LATC2 = 1;
        Delay10KTCYx(2);
        LATCbits.LATC2 = 0;
        count++;
        printf("%d\n", count);
    }
}
```
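
For anyone wondering where SPBRG = 12 in config_usart() comes from: it follows from the usual low-speed asynchronous baud formula, assuming the 8 MHz internal oscillator and BRGH = 0 (a sketch of the arithmetic, not lifted from the datasheet verbatim):

```python
fosc = 8_000_000          # internal oscillator, Hz (assumed from OSCCON setting)
baud = 9600

# low-speed async mode (BRGH = 0, BRG16 = 0): baud = Fosc / (64 * (SPBRG + 1))
spbrg = round(fosc / (64 * baud) - 1)
actual_baud = fosc / (64 * (spbrg + 1))
error_pct = 100 * (actual_baud - baud) / baud

print(spbrg)               # 12
print(round(actual_baud))  # 9615, about 0.16% high, well within tolerance
```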

Here is the Python script to grab frames from the webcam, calculate distance, match that with a step number, calculate the x-y points, and then plot those points:

```python
## Written by Shane Ormonde on 28th January 2014
## scans surrounding area and plots x-y points forming a 2d map

import cv2
from numpy import *
import math
import time
import serial
import matplotlib.pyplot as plt

ser = serial.Serial('/dev/ttyUSB1', 9600)
ser.close()
ser.open()

# variables
loop = 1
dot_dist = 0
count = 0
i = 0
step_num = []
obj_dist = []
x_points = []
y_points = []
angles = []

#cv2.namedWindow("preview")
vc = cv2.VideoCapture(2)

if vc.isOpened():  # try to get the first frame
    rval, frame = vc.read()
else:
    rval = False
    #print "failed to open webcam"

raw_input("type something and press enter to continue...")  # wait for user input to begin
ser.write('g')  # this tells the PIC to start rotating the stepper

while loop == 1:

    ############ Get distance value from camera ###############
    #cv2.imshow("preview", frame)
    rval, frame = vc.read()
    key = cv2.waitKey(20)
    if key == 27:  # exit on ESC
        loop = 0

    num = (frame[:, :, 2] > 245)  # threshold the red channel to find the laser dot
    xy_val = num.nonzero()

    y_val = median(xy_val[0])
    x_val = median(xy_val[1])

    #dist = ((x_val - 320)**2 + (y_val - 240)**2)**0.5  # distance of dot from centre pixel
    dist = abs(x_val - 320)  # distance of dot from centre, x axis only

    #print " dist from center pixel is " + str(dist)

    theta = 0.0011450*dist + 0.0154
    tan_theta = math.tan(theta)

    if tan_theta > 0:  # bit of error checking
        obj_dist.append(float(5.33 / tan_theta))  # add the latest distance value to the obj_dist list

    ###### Get stepper value from PIC ####################
    try:
        step_num.append(int(ser.readline()))  # match this frame with the current step count
    except ValueError:
        pass

    if len(step_num) > 0 and step_num[-1] > 2000:  # if a full revolution has occurred, stop grabbing frames
        loop = 0

    if len(obj_dist) > 0:
        print "\033[12;0H" + "the dot is " + str(obj_dist[-1]) + " cm away"

ser.write('h')  # send an 'h' over serial to return the stepper to its original position

######## Get x-y points from angle and distance information ################

for i in range(len(step_num)):
    angles.append(step_num[i]*0.176)  # converting steps into angles in degrees

print "len angles : " + str(len(angles)), "count = " + str(count), "length obj_dist = " + str(len(obj_dist))

for i in range(min(len(angles), len(obj_dist))):  # getting x-y points from distance and angle information
    x_points.append(obj_dist[i]*math.cos(math.radians(angles[i])))
    y_points.append(obj_dist[i]*math.sin(math.radians(angles[i])))

plt.plot(x_points, y_points, 'ro')  # plot the x-y points
plt.show()  # show the plot
```
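
The distance calculation in the loop is the triangulation from the previous rangefinder post: the dot’s pixel offset from the centre column maps to an angle through the calibration line theta = 0.0011450*dist + 0.0154, and the 5.33 constant is presumably the laser-to-webcam separation in cm. Pulled out on its own (a Python 3 sketch; the function name is mine):

```python
import math

BASELINE_CM = 5.33   # laser-to-webcam separation (from the calibration above)
GAIN = 0.0011450     # radians per pixel, from the calibration line
OFFSET = 0.0154      # radians, from the calibration line

def pixel_offset_to_distance(pixels_from_centre):
    """Distance of the laser dot from the rig, in cm."""
    theta = GAIN * pixels_from_centre + OFFSET
    return BASELINE_CM / math.tan(theta)

# e.g. a dot 100 px from the centre column lands at roughly 40.8 cm
print(round(pixel_offset_to_distance(100), 1))
```

Note the geometry: the further the object, the smaller the pixel offset, so distance falls as the offset grows.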



2D Plot:

As I said in the video, where it hits my door with all the clothes hanging on it, the map goes a bit mad.

*2d map*

Future improvements:
A higher FPS would allow for a faster scan time, which would then make a real-time 2d map possible. For something like that I’d probably look into using Processing to display the map on screen, because matplotlib doesn’t seem to be all that fast at updating the display. From there it could be mounted on an autonomous robot of some kind. Removing the need for a computer would be the main goal, that and making everything smaller. I have tried something like this with a Raspberry Pi but I just couldn’t get a decent FPS from the webcam with it. Using an FPGA might be an option.

## 19 thoughts on “2d mapping using a webcam and a laser”

1. batchloaf

Really impressive!! I wonder why some of the points scatter out beyond the wall, but they’re kind of at a consistent distance. This is a really interesting project.

1. name

My guess would also be reflective surfaces. With a line-laser you could get more readings of the same wall per frame and such upsets would get much less likely. Finding the line would be more complicated though.

2. neon22

To get a bit more info on graphing effectively – check out this intro to SLAM by Claus Brenner. His intro pdf outlines how you get graphing easily done and his Youtube course is pretty darn useful too.
Welcome to mapping on Python dude ! :)

http://www.clausbrenner.de/slam.html

Also useful but not right now:

http://eclecti.cc/computergraphics/easy-interactive-camera-projector-homography-in-python

If you want to go embedded CPU check out micropython – running on microcontroller. KS is over but boards will be available soon. http://micropython.org/

3. parkview

Why not use the Raspberry Pi Camera? HD video at 30fps? Python control might be a bonus as well.

1. shaneormonde Post author

It’s a cool idea alright. As far as I know you can only get around 4fps using opencv with the raspi camera. This is because there is no access to the raspi gpu, leaving the cpu to do all of the image processing. Things might have changed since I last checked though, so I’ll give it a look.

4. frankbeissel

How about the pixy? I got their kickstarter but have not played with it yet.

1. shaneormonde Post author

Thanks for the links neon. I’m busy with college stuff at the moment but I have been thinking a lot about this. First of all I’m going to try and get it working with a line. I was thinking maybe it would be possible to rotate the laser through 360 degrees to get a 3d scan of something? That would be pretty cool to try

1. neon22

Altering the angle of the vertical line +/- a small amount will increase your S/N. This will be at the cost of knowing precisely how non-vertical it is. Rotating it 360 is probably redundant.
If you want to go super cool then place three lasers with diff colours at a slight angle to each other (so the lines cross somewhere). Then use either a filter, or initially just the RGB values, to detect each color line. This will make your scanner resistant to the colour of an object absorbing all that line’s colour and giving you poor results. Of course it will not help translucent objects or furry fabrics…
You can just start with red/green to see the effect, a violet (DVD) laser is ideal for third colour. Or use an invisible infrared as well. But it’s easy to start with one, add green and then determine what to do about the rest. A servo will do if you want to try wobbling the vertical line.
Good luck.