Daniel Barrett's Lab Notebook Entries

Week 01

January 11, 2011 (2 hours):
Looked up parts and created initial budget estimate.
Atom board: ~$200
Chassis with built-in motors and encoders: $71.67
Circuit board with 3-axis accelerometer, gyro, and compass: $89
Already possess 4 Lynxmotion servos: $0
Sonic rangefinder, 255-inch range, inch precision: $28
or 765-inch range, centimeter precision: $50
Additional encoder for rangefinder motor: $10-25
GPS with antenna included: $60
Webcam: ~$50-70 (if we don't use the low-quality one we have)
Total: $509-$566
Presumably there will be additional costs beyond these.

January 12, 2011 (2 hours):
Met as a team after class to finish writing preliminary project proposal.

January 13, 2011 (2 hours):
Researched Kalman filters, dead reckoning, and inertial navigation.

January 16, 2011 (2 hours):
Researched batteries and sensors. Found an accelerometer and magnetometer with I2C interfaces.
Sebastian remembered he has a laptop battery at home which may be suitable.

WEEK 01 SUMMARY
Accomplishments: Submitted preliminary project proposal.
Found parts which may be suitable.
Weekly Work Total: 8 hours
Project Work Total: 8 hours

Week 02

January 17, 2011 (2 hours):
Met as a team to formulate the PSSC for presentation in class on Jan. 19. Completed the final project proposal, except the final block diagram (made a preliminary version).

January 18, 2011 (1.5 hours):
Created preliminary block diagram of software architecture.
Researched parts and ordered an accelerometer, magnetometer, and sonic rangefinder online.
Still looking at GPS receivers.

January 21, 2011 (3 hours):
The chassis and motors which Sebastian had ordered arrived, so we put it together and played with it.
We installed the encoders on the motors, then attached the motors and wheels to the assembled chassis.
Also worked on the design of the interface from the microcontroller to the H-bridges.
Looked up the pinout of the 9S12 board and compared it to our needs. We require:
1 ATD pin
2 general-purpose pins for implementing an I2C interface for the accelerometer and magnetometer
(2 more, because the accelerometer documentation says having other communication on the bus adds noise to the measurements)
2 pins for the RS232 interface of the GPS (unless we interface it directly to the Atom board)
5 PWM pins: there are 2 PWM-controlled servos for camera control, each of which uses double-precision PWM, using up 2 pins each (on a 9S12), plus a minimum of 1 PWM pin for controlling wheel motor speed
1 serial port for communicating with the Atom board
4 general-purpose pins for controlling the motor direction
Shift registers could be used to reduce the number of required pins.
The 9S12 has enough pins and peripherals to satisfy these needs, and we already have them from ece362. It has:
5 PWM pins
1 serial port
7 ATD pins
There are 13 other pins which can be used as general-purpose pins if only 1 ATD pin is used.

January 22, 2011 (6 hours):
Summary: worked on setting up the Atom board for compiling/running/loading our software
Configured Laptop to create ad hoc network
Set up Atom board to be logged into via remote desktop through wifi
Configured OpenCV(Open Computer Vision Library) and CodeBlocks (a C/C++ IDE) on the Atom board
Used the aforementioned software to re-compile the code I wrote for the ece362 mini-project. This code is based on the lkdemo.cpp demo code which comes with OpenCV, which implements a tracking algorithm. I added serial communication functionality to it, so that the target location is sent out the serial port.
Configured Codewarrior on the Atom board.
Reloaded the assembly from ece362 miniproject onto the 9S12 microcontroller. This code reads from the serial port, and controls two servo motors to orient a camera to place the target in the center of the camera's view.
Tested the webcam tracking using the Atom board. Tweaked the motor speeds to improve performance.
Tested a wheel motor encoder, which did not seem to be outputting a square wave as specified.

WEEK 02 SUMMARY
Accomplishments:
-PSSC and project proposal finalized.
-Robot chassis arrived and put together
-Atom board set up for compiling/running code and for loading code onto the microcontroller
Weekly Work Total: 12.5 hours
Project Work Total: 20.5 hours

Week 03

January 24, 2011 (3 hours):
Worked on HW #3: Design Constraint Analysis.

January 25, 2011 (2 hours):
Met with team to discuss microcontrollers and packaging.
Also retested the wheel encoders and verified that they function properly. The encoders output a square wave at 120 times their input frequency; the input is a shaft coming out of the back of each motor, geared 30:1 with the wheel shaft. Therefore there are 3600 cycles of the square wave per wheel revolution. With a 4.5'' diameter wheel, this gives a precision of about 255 cycles per inch, subject to error from tire deformation.
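As a sanity check, the arithmetic above can be sketched in code. The constants come from the measurements in this entry; the countsToInches helper is a hypothetical convenience, not code from our robot:

```cpp
#include <cassert>
#include <cmath>

// Sanity check of the wheel-encoder resolution computed above.
const double CYCLES_PER_SHAFT_REV = 120.0; // square-wave cycles per rear-shaft revolution
const double GEAR_RATIO = 30.0;            // rear-shaft turns per wheel revolution
const double WHEEL_DIAMETER_IN = 4.5;
const double PI = 3.141592653589793;

double cyclesPerWheelRev() {
    return CYCLES_PER_SHAFT_REV * GEAR_RATIO; // 3600 cycles per wheel revolution
}

double cyclesPerInch() {
    return cyclesPerWheelRev() / (PI * WHEEL_DIAMETER_IN); // ~255 cycles per inch
}

// Hypothetical helper: convert a raw encoder count to distance traveled.
double countsToInches(long counts) {
    return counts / cyclesPerInch();
}
```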

January 27, 2011 (2 hours):
Read about using Kalman filters, and about the Kalman filter functions supplied by OpenCV (see the OpenCV reference).
Worked on a kinematic model of the robot, and converted it into the matrices needed by the Kalman filter. (There are errors in the scanned paper, but it is mostly right.)

January 28, 2011 (3 hours):
Implemented sensor fusion code and created a simulation on which to test and debug it. The simulated car moved in a pattern based on motor inputs and the kinematic model, and noisy sensor measurements were created for the position, wheel speed, and orientation. This was done by adding Gaussian noise to the "real" values. These measurements, along with the derived kinematics matrices, were fed to the OpenCV Kalman filter function and used to estimate the state of the machine. The results are very positive. Captured a screenshot (red x's are GPS measurements, white x's are "real" positions, blue x's are estimated positions, and green x's are a running avg of the GPS measurements) and added the code to the webpage.
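The fusion itself uses OpenCV's Kalman filter on the full state vector, but the core predict/correct cycle can be illustrated with a minimal scalar filter. The Q and R values below are illustrative, not our tuned noise parameters:

```cpp
#include <cassert>
#include <cmath>

// Minimal scalar Kalman filter: a toy version of the predict/correct cycle
// that OpenCV's KalmanFilter performs on our full state vector.
// Q and R are illustrative noise variances, not our tuned values.
struct Kalman1D {
    double x = 0.0;   // state estimate (e.g. one position coordinate)
    double P = 1.0;   // estimate variance
    double Q = 0.01;  // process noise variance
    double R = 1.0;   // measurement noise variance

    double update(double z) {
        P += Q;                 // predict: uncertainty grows over time
        double K = P / (P + R); // Kalman gain: how much to trust the measurement
        x += K * (z - x);       // correct the estimate toward the measurement
        P *= (1.0 - K);         // corrected estimate is more certain
        return x;
    }
};
```

Feeding it measurements that bounce around a true value shows the estimate settling near the truth, which is the behavior the simulation screenshots demonstrate for the full filter.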

January 29, 2011 (2 hours):
Went to the lab with Sebastian and improved the assembly code for controlling the servos for camera tracking. There are now 3 speeds based on the distance in pixels from the center of the image to the target.

January 29, 2011 (1.5 hours):
Went back to the lab with Sebastian and tested the ultrasonic rangefinder. Wires were soldered onto the range-finder to facilitate breadboard prototyping. The rangefinder has good detection of objects, but the beam becomes quite wide after ~9 feet. Although the beam is wide, it still detects objects when placed near the ground without picking up the ground instead. This was tested on a smooth surface, and one with a sweatshirt on it, simulating grass. With luck, the wide beam will also not be a problem on grass.

January 29, 2011 (2 hours):
Continued work on the Kalman filter sensor fusion algorithm. Made a new kinematic model which allows measured acceleration to be taken into account, and tested it in simulation code. The estimation is quite good even with large amounts of noise, as long as the standard deviation of the noise is known and taken into account.

January 30, 2011 (3 hours):
Met with team in 477 lab to work.
Discussed design constraints and packaging
Modified the camera servo control code, removing old counterproductive interrupt code which was slowing down the left-right movement.

WEEK 03 SUMMARY
Accomplishments:
-Obtained meaningful oscilloscope readings from both wheel encoders and the ultrasonic rangefinder
-Improved the web cam tracking control code
-Created code to use the OpenCV Kalman filter function to model our robot and to perform sensor fusion, combining multiple noisy sensor readings: position (GPS), wheel speed (encoders), orientation (compass), acceleration (accelerometer).
-Tested sensor fusion code on simulated noisy data, and confirmed that it does an excellent job, even with large amounts of noise.
Weekly Work Total: 17 hours
Project Work Total: 37.5 hours

Week 04

February 2, 2011 (.5 hours):
Looked at IR rangefinders online.

Compared a number of different models of Sharp IR rangefinders.

February 5, 2011 (6 hours):
Worked in lab.
Discussed the power supply, and looked up voltage regulators, H-bridges, and batteries:
H-bridge candidates: a good-looking H-bridge, a selection of H-bridges on Digi-Key, an integrated H-bridge
Battery candidates: a 2600mAh Li-ion battery pack, and a 4400mAh Li-ion battery for less money
Tested the voltage translator
Worked on Design Constraint analysis
Worked on software: made the code able to deal with sensor data at different update rates, since GPS data will arrive at 1Hz, and everything else much more often.
Also created a PID controller to make the robot move to a waypoint. The controller tries to minimize the difference between the angle the robot is facing and the angle toward the waypoint. Given the state of the system (x, y, theta, dtheta/dt, speed of left motor, speed of right motor, acceleration), the function finds new values for the inputs to the motors.
Tested the controller in simulation code, made corrections, tuned the parameters of the PID controller.
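A minimal sketch of the heading-error PID described above. The gains, the toy turning model, and the simulateTurn helper are illustrative, not the tuned controller from the simulation:

```cpp
#include <cassert>
#include <cmath>

const double PI = 3.141592653589793;

// Wrap an angle difference into [-pi, pi] so the robot turns the short way.
double angleErr(double target, double heading) {
    double e = target - heading;
    while (e > PI)  e -= 2.0 * PI;
    while (e < -PI) e += 2.0 * PI;
    return e;
}

// Generic PID acting on the heading error.
struct PID {
    double kp, ki, kd;
    double integral, prevErr;
    PID(double p, double i, double d) : kp(p), ki(i), kd(d), integral(0), prevErr(0) {}

    double step(double err, double dt) {
        integral += err * dt;
        double deriv = (err - prevErr) / dt;
        prevErr = err;
        return kp * err + ki * integral + kd * deriv;
    }
};

// Hypothetical toy plant: turn rate proportional to the controller output.
// Returns the final absolute heading error after `steps` iterations.
double simulateTurn(double target, int steps) {
    PID pid(1.5, 0.0, 0.0); // illustrative gains
    double heading = 0.0, dt = 0.05;
    for (int i = 0; i < steps; ++i)
        heading += pid.step(angleErr(target, heading), dt) * dt;
    return std::fabs(angleErr(target, heading));
}
```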

February 6, 2011 (5 hours):
Worked on and finished Design Constraint analysis
Discussed power supply

WEEK 04 SUMMARY
Accomplishments:
-Verified functionality of Voltage translator
-Finished Design Constraint Analysis
-Found voltage regulators, H-bridge, and battery
-Created control system to generate outputs to the motors in order to travel to a waypoint
-Tested control system code on the simulation using the kalman filter, and confirmed that it works in the simulation.

Weekly Work Total: 11.5 hours
Project Work Total: 49 hours

Week 05

February 8, 2011 (2 hours):
Together with Sebastian, used a tool on the National Semiconductor website to design the power circuits for the three voltage regulators. The tool allows one to enter the input and output specifications, and it then optimizes the circuit, selecting components and displaying its properties. It also allows one to choose different components from a list of components which meet the specifications. We used this feature to ensure that no components smaller than 1206 surface mount were used. The tool then generated a schematic and a Bill of Materials for each of the 3.3V, 5V, and 12V supplies; the parts came to roughly $3-4 for each of the three circuits.
Based upon the ease of use of this tool, we have decided to use the National Semiconductor LM25085 for the 12V supply, rather than our previous choice.

February 10, 2011 (negligible time)
Purchased H-bridge circuit kit

February 11, 2011 (3 hours)
Worked on the navigation software.
Added obstacle-related data structures and functions to simulation code.
Added function to simulate the detection of objects by rangefinders
Researched pathfinding algorithms, including a Stanford webpage and an interesting PDF presentation.

February 12, 2011 (3 hours)
Downloaded an implementation of the D* Lite pathfinding algorithm.
Worked on integrating this into my simulation: created a function for inputting the obstacle information into the D* Lite.
Successfully tested the pathfinding algorithm by displaying the chosen path given simulated terrain with obstacles (code).

February 13, 2011 (4 hours)
Worked in lab.
Tested IR range sensors with oscilloscope. They work well if the noise is reduced with a capacitor.
Tested the GPS module and USB converter by hooking the GPS up to my laptop and viewing the packets with Tera Term. The results agree to within a few feet with Google Earth.
More fully integrated the path-finding and obstacle detection in simulation (code). The detected objects are used to update the graph structure of the path-finder, which updates the path as the robot moves along. In simulation, this combination, along with the previously completed Kalman filter and control system, successfully navigates to given waypoints, detecting and avoiding obstacles.
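D* Lite replans incrementally and is considerably more involved than this, but the underlying idea of planning on a grid graph and replanning when obstacles appear can be shown with a much simpler BFS planner. The Grid structure below is a hypothetical sketch, not the downloaded D* Lite implementation:

```cpp
#include <cassert>
#include <queue>
#include <vector>

// Minimal 4-connected grid planner using plain BFS.  Unlike D* Lite,
// which repairs its previous solution, this sketch simply replans from
// scratch after the map changes.
struct Grid {
    int w, h;
    std::vector<int> blocked; // 0 = free, 1 = obstacle
    Grid(int w_, int h_) : w(w_), h(h_), blocked(w_ * h_, 0) {}
    int idx(int x, int y) const { return y * w + x; }

    // Returns the number of steps from start to goal, or -1 if unreachable.
    int pathLength(int sx, int sy, int gx, int gy) const {
        std::vector<int> dist(w * h, -1);
        std::queue<int> q;
        dist[idx(sx, sy)] = 0;
        q.push(idx(sx, sy));
        const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
        while (!q.empty()) {
            int cur = q.front(); q.pop();
            int x = cur % w, y = cur / w;
            if (x == gx && y == gy) return dist[cur];
            for (int d = 0; d < 4; ++d) {
                int nx = x + dx[d], ny = y + dy[d];
                if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                int n = idx(nx, ny);
                if (blocked[n] || dist[n] >= 0) continue;
                dist[n] = dist[cur] + 1;
                q.push(n);
            }
        }
        return -1;
    }
};
```

Marking a wall of cells blocked and replanning produces the detour behavior seen in the simulation screenshots.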

WEEK 05 SUMMARY
Accomplishments:
- Verified functionality of the GPS module, and IR rangefinder.
- Designed the voltage regulator circuitry with the help of National Semiconductor's website
- Integrated path-finding with the control-system in my software, and added simulated obstacle detection to the simulation for testing.
- Ordered H-bridge
Weekly Work Total: 12 hours
Project Work Total: 61 hours

Week 06

February 14, 2011 (5 hours)
I downloaded an NMEA GPS packet parser implementation, which was actually buggy, so I fixed the bugs and got it working.
I took screenshots of campus on Google Earth and saved them as pictures: campus, engineering fountain, intramural fields, hilltop area.
Then, I recorded the GPS locations of the boundaries of the pictures.
Next, I created a function which maps GPS coordinates onto the pixel coordinates of the picture, given the GPS boundaries of the picture. I tested this on the GPS data recorded on Sunday, and plotted an X on the location (screenshot), which is in the same location as what Google Earth marks when given those coordinates. I then created functions which map points between global GPS (angles), picture (pixels), and the local coordinate system (meters) used in the Kalman filter simulation.
I used these newly created functions to map the simulation onto the picture of campus, and scaled the time to be in "real" time (screenshot) (code).
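The GPS-to-pixel mapping can be sketched as a linear interpolation between the image's GPS boundaries. The GeoImage structure and the boundary values used below are illustrative, not the recorded boundaries; over a campus-sized image the linear approximation is adequate:

```cpp
#include <cassert>

// Linear mapping from GPS coordinates to pixel coordinates of a screenshot,
// given the GPS boundaries of the image.
struct GeoImage {
    double lonWest, lonEast;   // longitude of the left and right edges
    double latNorth, latSouth; // latitude of the top and bottom edges
    int width, height;         // image size in pixels

    void gpsToPixel(double lon, double lat, int &px, int &py) const {
        px = (int)((lon - lonWest) / (lonEast - lonWest) * width);
        py = (int)((latNorth - lat) / (latNorth - latSouth) * height); // y grows downward
    }
};
```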
Pictures of simulation in progress:
The large red x is the origin of the local coordinate system, and the location where the robot begins
The hard to see thin white lines are simulated obstacles before they have been detected.
The thick white line is where the robot has actually been, and the blue line which is on top of it most of the time is the estimated position from simulated noisy sensor data and fusion algorithm. (small red x's are simulated gps measurements)
Yellow is used for obstacles which have been detected.
The green line is current path to be followed to the next waypoint which has been chosen by the pathfinding algorithm.
The black/greenish lines are formerly chosen paths.
-First: the robot is moving north, and planning to go north around the north/south oriented obstacle to the waypoint to the east.
-Second: the robot has made it to the eastern waypoint, and is now moving west around the same obstacle. It is planning to go straight through the obstacle to the west, which has not yet been detected.
-Third: having gotten closer, the western obstacle has been detected, and the robot is now planning to go north around it.
-Fourth: having reached the western waypoint, it is now planning to go around the obstacle to the south in order to get to the waypoint to the south.

February 16, 2011 (1 hour)
I went to the lab and tested at what voltage the Atom board shuts down. It is rated for 12V, but can run down to 10V.
Based on this information, and comments from this week's presentation, we are considering using a 12V battery and eliminating the 12V regulator. The following is a very good-looking 12V battery: 12V 4500mAh. It can provide up to 40A of continuous current, and is no more expensive than the Li-Ion batteries previously mentioned. It is somewhat heavier: 1.52 lb vs .75 lb, but this should be fine.
Also, in order to avoid having to remove the battery pack each time it must be charged, we are considering including a recharge circuit using a chip such as the DS2715.

February 17, 2011 (3 hours) Met in lab with team
I tested the LM317 voltage regulator.
I worked with Anthony trying to communicate with the HMC magnetometer over emulated I2C. It pulls "ack" low and holds it for several clocks, preventing us from giving it an address for reading. After looking at the documentation, we attempted to put the chip into continuous mode, and got some results which may or may not be useful. We will need to convert the read data to ASCII to see it properly on the terminal.
Helped Sebastian with the schematic.

February 18, 2011 (2 hours)
I worked on designing a battery charging circuit to be placed on the pcb. The TI bq2002 serves as a battery recharge controller. However, in the provided diagram it requires a current source which it can switch on and off with a voltage signal.
The LM317 can serve as a current source. It accomplishes this by attempting to set the voltage between the "Vo" and "adj" pins to 1.25 V.
Found an example diagram of a custom designed recharging circuit which uses the LM317 as a switchable current source.
The LM317 is used in current-source mode, and a transistor is used to short the "adj" pin to ground, thus setting the "Vo" pin to 1.25V, reverse-biasing the diode to the battery, and stopping the charging. The 1.25V is then across the 1K resistor, reducing the wasted current draw to 1.25mA.
diagram
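The current-source arithmetic is simple enough to check in a few lines. The 2.5-ohm sense resistor below is an illustrative value, not our chosen part; the 1K value and 1.25mA figure are the ones from the entry above:

```cpp
#include <cassert>
#include <cmath>

// The LM317 holds 1.25 V between its Vout and ADJ pins, so a resistor R
// between them sets the output current to I = 1.25 / R.
const double VREF = 1.25; // LM317 reference voltage, volts

double chargeCurrentAmps(double senseResistorOhms) {
    return VREF / senseResistorOhms;
}

// Plain Ohm's law, used to check the off-state draw: per the entry above,
// when the transistor shorts ADJ, 1.25 V sits across the 1K resistor.
double ohmsLawAmps(double volts, double ohms) {
    return volts / ohms;
}
```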

February 19, 2011 (7 hours)
I looked over the PADS tutorial.
I worked with Sebastian to prototype the H-bridge wheel control circuit, and tested the wheel speed and direction control by sending the PWM from the microcontroller to it. It works.
I interfaced the COM port function, NMEA parser, and plotting functions (code). The GUI can now display GPS data on a map, Google-Earth-style.

WEEK 06 SUMMARY
Accomplishments:
I verified the functionality of the H-Bridge module, and 3.3V power circuit.
I also designed the battery recharging circuitry.
I integrated the COM port interface, NMEA parser, and functions for converting between GPS (longitude, latitude), image (pixel x, pixel y), and local (meters x, meters y) coordinate systems, and implemented a Google-Earth-like interface for displaying GPS data points.
Weekly Work Total: 18 hours
Project Work Total: 79 hours

Week 07

February 21, 2011 (3 hours)
I worked on the SPI communication with the accelerometer with little success, and ordered the battery pack.

February 22, 2011 (4 hours)
I worked on the SPI communication with the accelerometer, again with little success, and successfully prototyped the use of the LM317 as a current source.
I also worked on the PCB layout by arranging the components to minimize trace crossings and length, and generating the autoroute.

February 23, 2011 (4 hours)
I worked on the SPI communication with the accelerometer, still with little success, and prototyped the LM317 current source with a voltage-controlled switch to turn it on and off.

February 26, 2011 (3.5 hours)
I decided that the SPI block of the microcontroller must not be working properly, and implemented SPI in C. It works, and we successfully read acceleration data from the accelerometer.
I also implemented a user interface which allows the user to right click on the map to add that point to the front of the waypoint queue, and shift-click to add that point to the end of the waypoint queue
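The shape of a bit-banged SPI transfer can be sketched as follows. SPI mode 0 and MSB-first ordering are assumptions; on the 9S12 the three pin functions would write and read actual port pins, and the simulated slave below exists only so the routine can be tested off-hardware:

```cpp
#include <cassert>
#include <cstdint>

// Bit-banged SPI sketch (assumes mode 0, MSB first).  The pin operations
// are function pointers so the routine can run against a simulated slave.
struct SpiPins {
    void (*setSck)(int);
    void (*setMosi)(int);
    int (*readMiso)();
};

uint8_t spiTransfer(const SpiPins &p, uint8_t out) {
    uint8_t in = 0;
    for (int bit = 7; bit >= 0; --bit) {
        p.setSck(0);
        p.setMosi((out >> bit) & 1);          // present data while the clock is low
        p.setSck(1);                          // slave samples on the rising edge
        in = (uint8_t)((in << 1) | (p.readMiso() & 1)); // sample the slave's bit
    }
    p.setSck(0);
    return in;
}

// --- simulated slave, for testing only: shifts out a fixed byte, MSB first ---
static uint8_t g_slaveByte = 0xA5;
static int g_slaveBit = 7;
static void simSck(int) {}
static void simMosi(int) {}
static int simMiso() {
    int b = (g_slaveByte >> g_slaveBit) & 1;
    g_slaveBit = (g_slaveBit + 7) & 7; // step down through the bits, wrapping
    return b;
}
```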

February 27, 2011 (4 hours)
I worked in the lab trying to get the I2C interface with the compass working.
I reimplemented the software I2C and tested the communication, but was unable to get even an "ack" response from the compass, which differed from our earlier experience with the code written by Anthony.
Therefore, I tried Anthony's code again, but did not receive an "ack" with it either. It appears that the compass may be broken.
I also worked on the Design Review powerpoint, and added a software block diagram and more information to some of the later slides.
I looked over the current PCB, and identified a few problems such as a lack of power/ground connections for external sensors.

WEEK 07 SUMMARY
Accomplishments:
I prototyped and verified the switchable current source circuit for battery charging, implemented SPI in embedded C, and used it to get acceleration readings from the accelerometer.
I also helped produce the preliminary PCB layout and ordered the battery pack.
Weekly Work Total: 18.5 hours
Project Work Total: 97.5 hours

Week 08

February 28, 2011 (1 hour)
I improved on the software diagram slides of the powerpoint.

February 28, 2011 (4 hours)
I prototyped and tested the bq2002 battery charging controller, which works.
The team discussed the powerpoint slides and schedule for the rest of the semester.

March 1, 2011 (3 hours)
I worked on the pcb layout and schematic.

March 2, 2011 (6 hours)
I prototyped the battery charging circuit, combining the bq2002, the LM317 as a current source, and a power transistor as a switch. A resistor was used for the load; because of this, there was nothing to reverse-bias the diode, and the current was not completely stopped, but this will not be a problem when a battery is used.

March 3, 2011 (2 hours)
I worked on the pcb layout, making the corrections suggested in our design review: we made the pads for the surface mount components longer in order to allow for easier soldering. We also ran the checker and corrected the mistakes it found.

March 5, 2011 (1 hour)
I worked on potentially using multiple threads for the software. This would allow the timing of the vision and navigation sections to be decoupled.
This could be useful because the vision code will only be able to loop at ~20Hz, while the navigation involves much less computation power. Having them in 1 loop would mean that the number of navigation loops per vision loop would be fixed, which is not optimal since the pathfinder has highly variable runtime.
Created a block diagram for the multithreaded system.
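The decoupling idea can be sketched with two threads running at independent rates. The loop periods and shared counters are placeholders; the real loops would run the vision tracking and the navigation/pathfinding code:

```cpp
#include <atomic>
#include <cassert>
#include <chrono>
#include <thread>

// Two loops at independent rates, sharing state through atomics.
std::atomic<bool> running{false};
std::atomic<int> visionFrames{0}, navIterations{0};

void visionLoop() {                  // slow: pretend each frame takes 50 ms
    while (running) {
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
        ++visionFrames;
    }
}

void navLoop() {                     // fast: no longer locked to the vision rate
    while (running) {
        std::this_thread::sleep_for(std::chrono::milliseconds(5));
        ++navIterations;
    }
}

// Run both loops for `ms` milliseconds and report the iteration counts.
struct Counts { int vision, nav; };
Counts runLoops(int ms) {
    running = true;
    visionFrames = 0;
    navIterations = 0;
    std::thread tv(visionLoop), tn(navLoop);
    std::this_thread::sleep_for(std::chrono::milliseconds(ms));
    running = false;
    tv.join();
    tn.join();
    return { visionFrames.load(), navIterations.load() };
}
```

The navigation thread iterates many times per vision frame, which is exactly the variable ratio a single combined loop cannot provide.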

WEEK 08 SUMMARY
Accomplishments:
I prototyped and verified the bq2002 battery charging controller and the battery charging circuit all together, and they seem to work. I also created content for the software and schedule portions of the powerpoint. We received feedback at our design review, and made some of the suggested changes to the pcb, such as increasing the length of the pads for surface mount parts.
Weekly Work Total: 17 hours
Project Work Total: 114.5 hours

Week 09

March 8 (3 hours)
I worked on the pcb. One thing I did was to increase the widths of the power and ground traces which will have the most current passing through them.

March 9 (5 hours)
I worked on the pcb. I checked the schematic and pcb for errors, and found that the labels on the transistors in the battery charging circuit were wrong, and that the pinout did not match the transistor we had; I fixed these problems. I also compressed the layout, reducing wasted space.

March 10 (5 hours)
I worked on the pcb. I checked the schematic and pcb for errors and fixed some, such as missing optical isolator resistors. I repeatedly submitted the pcb files for checking, and fixed the reported errors by increasing the annular ring for all the headers, and by reducing the solder mask size for the 5V power supply.

March 11 (1 hour)
I submitted the final pcb files for checking, and then submitted homework 7 and the pcb to Chuck.

WEEK 09 SUMMARY
Accomplishments:
The week was spent working on the pcb, checking for errors, and fixing them. On Friday the pcb was submitted to Chuck, and homework 7 was also submitted.
Weekly Work Total: 14 hours
Project Work Total: 128.5 hours

Week 10 ---- Spring Break ----

WEEK 10 SUMMARY
Accomplishments:
Weekly Work Total: 0 hours
Project Work Total: 128.5 hours

Week 11

March 21 (2 hours)
I worked on a new software block diagram
and also made a few slides for the software presentation.

March 22 (3 hours)
I finished the Software Design Presentation, and also worked on the Homework 9 paper.

March 23 (1 hour)
I continued work on the Homework 9 paper.

March 24 (5 hours)
I worked out a packet protocol with Anthony, and successfully tested the communication between the Atom board and microcontroller software. I practiced soldering in preparation for work on the board. I also finished the Homework 9 paper.

March 27 (.1 hours)
I did some more practice soldering. I also removed the solder remaining on the pins of the 5V power supply from when we put wires on it for breadboard prototyping. This solder was preventing it from fitting through the hole in our board, so now it fits.

WEEK 11 SUMMARY
Accomplishments:
For the most part, I worked on the Software Design homework and presentations. I also worked out and tested a communication protocol for sending packets between the Atom board and microcontrollers with Anthony, and practiced soldering.
Weekly Work Total: 11.1 hours
Project Work Total: 139.6 hours

Week 12

March 29 (4 hours)
Sebastian and I soldered the 5V and 3.3V power supplies onto the pcb and verified that they work properly. We also soldered on the optical isolators, headers, and DIP sockets.

March 30 (4 hours)
Sebastian and I soldered the H-bridge components and level translator components onto the pcb and verified their operation. The level translators appeared at first not to output a correct voltage: only 1.8V rather than 3.3V. However, after applying a 100K-ohm load, they output the correct voltage.

March 31 (7 hours)
Sebastian and I soldered the components of the charging circuit onto the pcb. Initially, it was not working properly. After investigation, we determined that the pinout of the inverter IC was wrong. We then cut the errant traces and flywired to correct the mistake. (see diagram)

After this, the circuit worked correctly, and shut down when the thermistor was heated with a heat gun, simulating a hot battery.

April 1 (5 hours)
Sebastian and I soldered the push-buttons for the microcontrollers onto a proto-board, and connected the board to the pcb. This is so that the buttons will be accessible from the outside, since the pcb itself will be underneath the Atom board.

We charged the battery for a while and observed it to make sure the charger was behaving properly. We tested whether the thermistor would heat up enough to shut down the charging circuit when put in contact with the heat sink of our current-sourcing LM317, which gets hot to the touch. This is more realistic than testing it with a heat gun.
The charging circuit uses a voltage divider between the thermistor and a resistor to create a temperature-varying voltage. This voltage is monitored by the bq2002, which turns off the circuit when the voltage is below 2.5 volts. We initially used a 2K-ohm resistor along with the thermistor, but the voltage only went down to 3V when heated by the heat sink. We therefore changed the resistor to 8K-ohms, and the voltage now passes beneath the threshold when the thermistor is heated by the heat sink.
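The divider behavior can be checked numerically. A 5V supply, the thermistor on the low side of the divider, and the hot/cool thermistor resistances below are all assumptions, chosen only to be consistent with the readings described above:

```cpp
#include <cassert>

// Voltage at the bq2002 temperature-sense node, modeling the divider as
// the fixed resistor from the 5 V supply to the sense node and the NTC
// thermistor from the node to ground (heating lowers the voltage, as
// observed).  All component values here are illustrative assumptions.
double senseVolts(double rFixedOhms, double rThermOhms, double vcc = 5.0) {
    return vcc * rThermOhms / (rFixedOhms + rThermOhms);
}
```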
After charging the battery for a while, we used it to power the Atom board, which it was unable to do before we charged it.
We then attempted to program one of the microcontrollers using the Atom board and a USB BDM multilink, but the Atom board software is evidently not set up correctly to use the BDM. Therefore, we used the BDM multilink on the lab desktop, and successfully programmed one of the microcontrollers while it was on the pcb.

WEEK 12 SUMMARY
Accomplishments:
Sebastian and I populated the pcb.
We first did the power supplies, then the headers and level translator, then the H-bridge, and finally the battery charging circuit. We tested and debugged each section before moving onto the next.
After completing the battery charger, we used it to charge the battery pack, which was successful as evidenced by the battery being unable to power the Atom board before charging, but able afterwards.
Weekly Work Total: 20 hours
Project Work Total: 159.6 hours

Week 13

April 5 (4 hours)
We reorganized the wires to make them neater. This was accomplished by unsoldering them and re-soldering them so that they come out of the back of the pcb, rather than the front. The wires that were changed were the battery power output, the charging circuit output, the Atom board power, and the battery charging circuit thermistor.
We tested the H-bridge's ability to control the motor direction using battery power. This involved plugging the battery into the pcb and plugging the motors into the H-bridge. We then used a pair of wires attached to the +5V and ground headers, touching them to the pins of the 40-DIP header that the microcontroller will be plugged into. We made each of the motors go forward, backward, coast, and brake.
I later worked on the Atom board code.
Since the various modules are for the most part complete, and the code is currently implemented to work on a simulation, my current task is to first clean up the code to make it easier to work with, and second, to modify it so that it will actually communicate with the microcontrollers, sending them instructions and using the data from their sensors.
I accomplished the first part of my goal: to clean up and reorganize it for later modification.

April 6 (3 hours)
Sebastian and I went to lab to try to clean up the wires in the robot more. We drilled a few holes in the plexiglass on which the pcb is mounted so that the wires coming out the back can pass through them, rather than having to go around the outside.

After drilling the holes and routing the wires, we put one of the microcontrollers onto the pcb, and loaded our servo-controlling code onto it. We then plugged the camera servo connectors into the pcb and the camera into the Atom board. After this we connected a serial cable between the pcb and the Atom board, and ran the target tracking software on the Atom board.
-The system successfully caused the camera to follow targets chosen by clicking on them on the screen in the video.
-This was the first time we used the serial connector on the pcb, and it worked properly.
-Next, Anthony sent me his H-bridge controlling code, which I tested by plugging the second microcontroller into the board and running it. It successfully made the motors on each side go forward, backward, coast, and brake, and was able to control the speed of the motors.
-Finally, I modified the Atom board code to send the PWM values it uses in the simulation in a packet to the microcontroller, and ran the code Anthony wrote for processing the packets and using them to control the H-bridge. There were initially problems with the packets becoming out of sync: the microcontroller would read several bytes and interpret them as a packet, but they were actually the end and beginning of two separate packets. We modified the microcontroller code to avoid this, and the Atom board was then able to control the motors.
-We then experimented with making the robot turn. It goes mostly straight even when one motor is given much less power than the other. However, it is able to turn very well if one motor brakes while the other is at high power, or if the two motors turn in opposite directions.

April 7 (4 hours)
I continued my work cleaning and reorganizing the Atom board code, as well as creating the functions necessary to communicate with the microcontrollers, camera, and GPS all at once.
-I decided that an object-oriented approach would make the code easier to deal with, so I converted it to that from a function-oriented approach. I created classes roughly corresponding to the main blocks in the code:
-The classes are Atom_board_module, Kalman_module, Navigation_module, Communication_module, and UI_module. When the object tracking code is added, it will be contained within a class as well.
-As part of the communication class, I created methods for creating, sending, receiving, and parsing packets (as defined earlier by Anthony and me). I then created a method which uses these methods to obtain the needed information from the microcontrollers and GPS and convert it to the structure needed by the Kalman filter.
-The next step is to test the re-organized code, and then to create the map creation methods which will use real sensor data, rather than the simulated data I have been using up to now.
-I also found and fixed a bug in the control system which was preventing the simulated robot from going directly east or west.

(before: note the angled path from the bottom middle to the right)

(after: note the straight path between the bottom middle and right)

April 8 (1 hour)
Two of the microcontrollers were fried today:
Sebastian and I were powering the robot using the Atom board's power supply in order to avoid draining the battery unnecessarily. While we were allowing it to drive in a circle on the floor, an exposed piece of metal on the power jack we were using made contact with a wire on the H-bridge circuit. At this point, the robot ceased to function. We attempted to reprogram the microcontroller which had been controlling the h-bridge, but it appears to be dead. We then tried the other micro, which had been plugged into the servo-controlling header, and found that it still worked. We were able to reprogram it, and have it control the motors.
-We are somewhat puzzled as to how the micro could have been destroyed, since it is protected from the H-bridge by optical isolators, which are still intact. The only wire going from the H-bridge to the microcontrollers is the ground trace, which goes to both microcontrollers, only one of which was destroyed.
-We then took out Sebastian's 362 microcontroller, plugged it into the pcb socket, and reprogrammed it to control the motors, which it did successfully. It is known that at least one PT pin on this microcontroller was damaged in ece362. Later in the day, Anthony tried to use this microcontroller to communicate with the compass. He was able to successfully reprogram it, but the bad pin prevented him from communicating with the compass. Sebastian and I later came in and told him the previously mentioned story, and gave him the other microcontroller, which he was able to use to communicate with the compass. We took the microcontroller with the bad pin from him, but were unable to program it, and after this point, neither was he. The micro appears to be dead.
-This is especially puzzling because it was not present during the wire-shorting incident, and because the second microcontroller which WAS present, is perfectly intact.

WEEK 13 SUMMARY
Accomplishments:
-This week, a lot of packaging and tidying was accomplished. This also applies to the Atom board code, which was cleaned up and has been for the most part converted to an object-oriented approach.
-In addition, the H-bridge circuit was tested, and shown to be able to control the motor direction and speed. A microcontroller was used to control this behavior for the first time.
-Two other important milestones accomplished for the first time this week are communicating between the microcontroller and the Atom board using the serial port on the pcb, and controlling the motors based on packets received from the Atom board.
Weekly Work Total: 12 hours
Project Work Total: 171.6 hours

Week 14

April 11 (2 hours)
-I finished converting the Atom board code to an object-oriented approach, and tested it to ensure it behaves as it did before. The classes I created are:
-Atom_board_module: this is the main class which contains everything
-Kalman_Module: this block contains variables and methods for updating the state of the Kalman filter
-Communication_Module: this block contains variables and methods for initializing the COM ports, reading and parsing packets, and converting the data to the form required by the Kalman module
-UI_Module: this block contains variables and methods for displaying information on the screen
-Navigation_Module: this block contains variables and methods for managing the waypoint queue, obstacle-mapping, path-finding, and the PID control system (the obstacle-mapping still needs to be altered to make use of real data, instead of simulated obstacles)

April 12 (2 hours)
-I created a custom connector for the charging terminal of the battery pack. The battery we are using has two ports: a standard male Tamiya connector for discharging, and a male Tamiya Mini connector for charging. We already have a connector for the discharge port, but not for the charging port. Therefore I purchased a female connector, but accidentally got a standard, rather than Mini, Tamiya. The connector came with the plastic housing separate from the metal pins so that wires can be crimped onto the pins before inserting them into the housing. Since the pins fit, but the housing did not, I decided to use the pins and construct a housing from electrical tape.

Here is a diagram of how I put it together.
-I also verified that the signals destined for the accelerometer coming from the microcontroller at 5V are correctly replicated at 3.3V on the other side of the level-translator.
-I also discovered that the PM5 pin of the microcontroller is burned out. We had planned to use this for the SPI SCK signal, but we can use PM1 instead.

April 13 (4 hours)
-Sebastian and I attempted to communicate with the accelerometer through the pcb. We had previously gotten it to work (see entry from Feb 26), but now no longer receive any output from it. We used the logic analyzer to verify that all the signals going to the accelerometer are correct, and measured the input voltage and ground, but it outputs nothing. The accelerometer is not critical to the success of our design, so we will get everything else working before spending more time on it.
-I also worked on changing the Atom board code to use the imperfect data which will be coming out of the rangefinder to do map-making and obstacle avoidance, rather than the somewhat idealized input I had been simulating it with. I changed the simulation so that the only information available to the map-making system is a single value representing the distance to the closest obstacle in a 45 degree arc in front of the robot.
-The map-maker now adds obstacles to the map along the entire 45 degree arc at the given distance. It also removes any obstacles in the line of sight between the robot and the added obstacles; these cannot still be there, since an obstacle was detected behind them. If no obstacle is detected, then any obstacles in the sensor's field of view are removed from the map.

-This system works well in simulation, though not quite as well as with idealized data. It can be improved by having the sensor scan back and forth, which will be easy, since it is mounted on the camera servos.
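A rough sketch of this map-update rule (the grid dictionary, ray count, and 0.1 m cell size here are illustrative choices, not our actual code):

```python
import math

def update_map(grid, robot_x, robot_y, heading, distance, max_range,
               arc=math.radians(45), cell=0.1):
    """Mark cells at `distance` across the 45-degree sensor arc as obstacles,
    and clear cells on the line of sight in front of them. If distance is
    None (no detection), clear the whole field of view out to max_range.
    `grid` is a dict {(ix, iy): True}; coordinates are in meters."""
    n_rays = 15  # rays across the arc (resolution is an assumption)
    for i in range(n_rays):
        angle = heading - arc / 2 + arc * i / (n_rays - 1)
        limit = distance if distance is not None else max_range
        # clear everything closer than the detected obstacle along this ray
        r = cell
        while r < limit - cell / 2:
            ix = round((robot_x + r * math.cos(angle)) / cell)
            iy = round((robot_y + r * math.sin(angle)) / cell)
            grid.pop((ix, iy), None)
            r += cell
        if distance is not None:
            ix = round((robot_x + distance * math.cos(angle)) / cell)
            iy = round((robot_y + distance * math.sin(angle)) / cell)
            grid[(ix, iy)] = True
    return grid
```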

April 14 (5 hours)
I made a list of things which remain to be done: (I will be adding "DONE" after items when they are completed, and adding items if necessary)
THINGS NEEDING DOING:
Hardware - mounting things
- IR sensors------------------------------------------------DONE
- Sonic rangefinder-----------------------------------------DONE
- finalize power connections:----------DONE
   - remove power jack from current location----DONE
   - add power jack to charger circuit
   - remove short from switch connector-------DONE
- mount compass(es)-----------------------------------------DONE
- mount accelerometer (if it works / is necessary)
Hardware - debugging things
- accelerometer-----------------------------------------DONE (removed from PSSC list, since it's unnecessary)
- cause of burned out micros (added 4/17) ---------- DONE
Software
- Merge Anthony and Sebastian's code for servo-micro-------DONE
   - Anthony's pulse accumulator code--------DONE
   - Anthony's rangefinder code--------------DONE
   - Sebastian's servo code------------------DONE
- Make sure Atom packet-parsing code works----------------DONE
- Make sure micro packet-parsing codes work---------------DONE
- Interface Atom code with both micros at once, demonstrating:-----DONE
   - wheel speed measurements------------DONE
   - rangefinder measurements------------DONE
   - compass measurements--------------------DONE
   - (accelerometer measurements)
   - Atom-directed motor control-------------DONE
   - Atom-directed motor control, using sensor data-----DONE
   - Atom-directed camera control------------DONE
   - Atom-directed camera control, at same time as everything else---DONE
- Tune/determine parameters in Atom code:
   - control system parameters (coefficients of the PID controller)---DONE
   - simulation parameters (width,friction,encoder:meter ratio)-------DONE(friction not messed with, just increased the model uncertainty)
- Features to add:
   - Atom should use the IR sensors as high-priority stop command------DONE
   - Define a packet for the camera micro to tell the atom the direction of the camera.---DONE
   - Define a packet(s) for requesting a sonic rangefinder 'scan' and for responding with the results---no new packet necessary
----------------------------------------------------------------------------------------------------------------------
-Today Sebastian mounted the IR and Sonic rangefinders, the GPS, and the compass onto the robot. Here are a few pictures:

-I worked with Anthony to test the communication between the microcontroller code and Atom board code and verified that we can send, receive, and parse packets. This required fixing a few bugs, such as logic errors and sending/reading the wrong number of bytes. The packet we were testing was packet type 2, the one containing the compass data. In the process, we verified that we can extract the angle the robot is facing from the compass data using inverse tangent and a small amount of logic.
Here is a picture showing the packet data and the resulting angle value:

- At this point, we are using a single thread in the Atom board code, and as a result, reading from the COM port wastes a lot of time, limiting our loop to approximately 7Hz.
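The "inverse tangent and a small amount of logic" amounts to a two-argument arctangent on the compass's horizontal field components; a minimal sketch, with the axis and sign conventions assumed:

```python
import math

def heading_from_compass(mag_x, mag_y):
    """Convert raw magnetometer X/Y components to a heading in degrees.
    atan2 supplies the quadrant logic that a bare inverse tangent misses.
    The axis orientation and sign conventions here are assumptions."""
    angle = math.degrees(math.atan2(mag_y, mag_x))
    return angle % 360.0  # normalize to [0, 360)
```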

April 15 (5 hours)
-I investigated the slow COM port transfer rate. First I reduced the "timeout" time of the COM port in my Atom board code. This gave a slight increase in speed. Then we experimented with increasing the baud rate from 9600, trying 38400, 57600, and 128000. Because the microcontroller runs at 24 MHz, we were not able to divide the clock down particularly close to the two upper rates, and they resulted in many errors in the packets. We were able to match the 38400 baud rate very closely: 24 MHz/16/39 = 38461, which yields an error of about 0.16%. This only gave a small improvement to the rate, increasing it to ~10 packets per second. We then noticed that the Atom board is able to read over 60 packets per second if it stops reading and allows the microcontroller to send many packets first. This indicated that the bottleneck is in the microcontroller. We then commented out all the code in the microcontroller besides the packet-sending function, and it was able to send an extremely large number of packets per second. Uncommenting functions one at a time revealed several millisecond-long delays within the compass-reading function. Removing these delays will raise the packet rate far above our 20Hz goal.
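The divisor arithmetic above can be reproduced with a short script (assuming the 9S12 SCI relation baud = bus clock / (16 × divisor)):

```python
def best_divisor(bus_hz, target_baud):
    """Find the SCI baud divisor (baud = bus / (16 * divisor)) closest
    to a target rate, and report the resulting actual rate and error."""
    divisor = max(1, round(bus_hz / (16 * target_baud)))
    actual = bus_hz / (16 * divisor)
    error = abs(actual - target_baud) / target_baud
    return divisor, actual, error

div, actual, err = best_divisor(24_000_000, 38400)
# divisor 39 gives ~38461.5 baud, roughly 0.16% off the target
```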
-We next tested the two Infrared and one Sonic rangefinders on the robot. After resolving an issue with the reading of the ATD registers, the data was successfully retrieved from all 3 sensors.
-After this, Sebastian and Anthony's partial microcontroller codes were merged together.
-While this merging was in progress, I worked on the Atom board code, specifically adding in code to convert the motor inputs used in the control system and simulation into packets to be sent to microcontroller 2.
-Next, the merged code was interfaced with the Atom board code, and I fixed a problem with the atom packet-sending code, where it was sending the same byte over and over.
-At this point, the Atom board was able to control the motors by sending packets to microcontroller 2 while also receiving and parsing packets from it and from the GPS module.
-We only had 1 serial to usb converter in lab, which we use to connect between the serial ports on the pcb and the usb ports on the Atom board. Therefore, in order to test communication with microcontroller 2, we disconnected the converter from microcontroller 1.
The Atom board was able to control the camera servos by sending packets to microcontroller 1


WEEK 14 SUMMARY
Accomplishments:
-I finished reorganizing the Atom board code to an object-oriented approach.
-I also made a good connector for the charging terminal of the battery.
-Sebastian and I discovered that the accelerometer we had working in February no longer functions.
-Anthony bought a new compass, and got it working and interfaced with the pcb.
-The packaging was finished, with all the sensors mounted.
-Simultaneous communication of both microcontrollers with the Atom board was achieved, receiving and parsing all required sensor data, and controlling the wheels and camera servos.
Weekly Work Total: 18 hours
Project Work Total: 189.6 hours

Week 15

April 16 (7 hours)
Summary: Today, we connected both microcontrollers to the Atom board, and tested the communication. I then created a function to calculate the wheel speed, and we began testing the ability to stop the wheels when an obstacle is detected. We found that there was a latency issue, and partially fixed it. Lastly, the microcontrollers were destroyed.
-We modified the packets being sent to the Atom board to contain 2 bytes for the wheel encoder count, and to contain the camera servo position. Anthony also modified his code to send packets more often. We tested the sending of packets from microcontroller 1, and debugged a few problems such as the bytes being sent in the wrong order. We were successfully sending, receiving and parsing 32 packets per second from each microcontroller.
-After verifying that simultaneous packet communication with both microcontrollers was working, I made a function to convert the wheel encoder values to speed in meters per second. The following is how it is computed:
The wheel encoders are attached to a shaft on the back of the wheel motors which is geared to turn 30 times faster than the wheel. The encoder outputs a square wave with a frequency 120 times that of the shaft. This square wave is sampled by a pulse accumulator, and the number of periods in a 5 millisecond sample time is sent to the Atom board in a packet. Thus, the number of wheel revolutions per second is count/30/120/.005. The diameter of each wheel is 4.5 inches = 0.1143 meters, so its circumference is .359 meters. Therefore the speed in meters per second is count/30/120/.005*.359
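This calculation in code form (constants as above):

```python
GEAR_RATIO = 30              # encoder shaft turns per wheel turn
COUNTS_PER_SHAFT_REV = 120   # square-wave periods per shaft turn
SAMPLE_TIME_S = 0.005        # pulse-accumulator sample window
WHEEL_CIRCUMFERENCE_M = 0.359

def wheel_speed(count):
    """Convert a 5 ms pulse-accumulator count to wheel speed in m/s."""
    revs_per_sec = count / GEAR_RATIO / COUNTS_PER_SHAFT_REV / SAMPLE_TIME_S
    return revs_per_sec * WHEEL_CIRCUMFERENCE_M
```

For example, a count of 18 in one 5 ms window works out to one wheel revolution per second, or 0.359 m/s.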
-We then made the Atom board send packets commanding the wheels to turn, in order to test the wheel encoders. We observed that for one of the microcontrollers, there was a several-second delay between the time a wheel was slowed and the time the Atom board code parsed a packet containing the new speed. This was because that microcontroller was sending packets faster than the other, but the Atom board was only pulling them out of the Windows COM port buffer at the rate of the slower microcontroller. Thus the packets built up in the buffer, resulting in latency. This was solved by making the two microcontrollers send packets at closer rates and periodically reading the whole buffer.
-Then I wrote simple code to serve as a last-resort stop command if the IR rangefinders either detected an object right in front of the robot or detected a bump or dropoff. This behavior was successful in a test setting, but upon allowing the robot to actually try to stop before hitting an object, we observed a second-long delay between the detection of an obstacle and the wheels stopping, which is unacceptably long. This may be similar to the previously described delay, or may be the fault of the microcontroller code.
-After breaking for dinner, we returned to find that one of the microcontrollers no longer was sending packets to the Atom board, and soon found that it was extremely hot. The micro was completely burned, and no longer programmable. We checked the supply voltage, and checked for shorts, but found nothing. While checking for shorts, the second microcontroller made a "bzzt" sound and then also became hot and died. We suspect that perhaps oscillations are causing voltage spikes, which are killing the microcontrollers.

April 18 (4.5 hours)
-Anthony and Nilmini discovered the problem which fried our microcontrollers: We had temporarily connected the Atom board power supply to the pcb instead of the battery in order to be able to test it without draining the battery. When we did this, we connected it to the switch incorrectly: when the switch was turned off, instead of disconnecting the power from the board, it shorted it to ground. This caused the power supply to shut down, giving the appearance of proper function; however, it also caused a voltage spike, which fried the microcontroller.

Here is a diagram of the bad switch connection
-We connected two good microcontrollers, and I worked on getting the Kalman filter working with real data. I had to change some of the matrices: Because the sensors have different maximum sampling rates (e.g. 40 ms for the compass, 1 s for the GPS), not all the sensors are measured for each iteration of the Kalman filter. Because of this, the measurement matrix must be changed depending on which sensors have been read.
-I then worked on the control system, making the robot follow a path while sitting on a platform with its wheels off the ground. Only the wheel encoder data was used, in order to fool the Kalman filter into thinking it was moving.
Image: Robot controls wheels and uses wheel encoder data to cause it to think it is following the path around simulated obstacles.

-In getting this to work, I had to fix a problem where the left and right PWM values were sent to the opposite side's motor.

April 20 (4 hours)
-I fixed an issue causing the Kalman simulation to think the robot was turning much faster than it really was. The change in direction was not multiplied by the change in time, causing the simulated model to turn each time step the amount it should have turned in a second. This kind of problem would not be catastrophic, since we are measuring the direction with the compass, but the Kalman filter will be more useful if the simulated model is not completely wrong.
-I next merged the camera tracking code with the other code, so that now we have a single unified atom board code. The camera tracking code is now called as a function within the main loop.
-I then added code to make the whole robot turn based on the degree to which the camera is pointing to the side when tracking an object.
- When testing this, we discovered that the latency issue caused by not reading packets fast enough had reappeared. After briefly trying to fix this by sending a counter with each packet, Anthony had a good idea to solve the problem once and for all: the microcontrollers now only send data packets after receiving a packet from the Atom board. This means that each time the Atom board's main loop iterates, it receives one packet from each microcontroller. The reduced packet flow has actually resulted in a greatly increased loop rate, because we no longer need to read packets until there are none available, which wasted a lot of time.
-Our temporary wall-plug power system is causing us some more inconvenience: the connection is loose, which prevents us from allowing the robot to move freely on the floor, because movement causes the contact to be briefly lost. We continue to use this temporary plug because we do not want to repeatedly wait for the battery to recharge while working on the robot. The following is a diagram comparing the temporary and final power connection schemes:

April 21 (9.5 hours)
-Sebastian and I made a custom plug similar to the one I made and described on April 12, except that we used hot glue instead of electrical tape.

-We then tested the robot's tracking and movement on the ground, and improved the turning by changing the values used to control the wheel motors.
-We attempted to operate the robot through the wireless network, but kept getting disconnected. We suspect that the many people in the lab using laptops, as well as the several teams doing custom wireless projects, are causing too much interference, since the wireless connection worked well in the past. Because of this problem, we connected to the Atom board with an Ethernet cable.
-At first, the Atom board would send packets causing the motors to receive equal PWM values in opposite directions, proportional to the angle between directly forward and the direction the camera was facing. In this way, it would try to turn in place to face a target. This was marginally successful, but the robot would overshoot its turns, because the power required to begin a turn is much higher than that required to continue turning. The robot would not turn until the angle became large, and would then turn extremely fast once the power to the motors became large enough to begin turning.
-We added in moving forward and backward based on the range obtained from the camera-mounted sonic rangefinder, so that the robot follows the target if it is farther than 55 cm away, and backs up if it is closer than 30 cm. Next we added the ability to move forward or backward while turning, so that it can move toward/away from the target while at the same time turning to face it.
Here are some videos of it in action:
 

-We next charged the battery for a few minutes, then switched over to battery power and demonstrated this behavior, using it to check off PSSCs 2, 3, and 4.
- After this we went outside and worked on the two remaining PSSCs: GPS measurements and using our wheel encoders and compass to track changes in position. Both of these features were mostly implemented, but required debugging. The GPS feature was returning correct values, but there was a bug causing the receipt of a GPS packet to crash the code, which I fixed. The movement tracking, which had nominally worked previously, no longer did, because of a bug I had introduced earlier in the day when adding the camera-following code. This bug made the robot think its speed was always zero.

April 22 (5.5 hours)
-Sebastian and I found that one of the wheels was loose, and that the power cord had been disconnected from the battery, so we fixed this. In order to prevent further disconnections, Sebastian used hot glue to fix the motor connectors in place.
-Next, we tested the wireless again now that the other teams are not in the lab, and found it to work fine.
-We then set out to create a function to serve as a wheel control system: the higher-level algorithms will send this function the desired wheel speed and direction, and it will dynamically control the motors to achieve it. This is in contrast to what we have done previously, where we tried to control the motor PWM values directly in the target-following code.
We are implementing this controller as a PID controller. After initially implementing it with a few arbitrary coefficients, we had a fairly good, but slightly unstable, control system: it successfully makes the motors go the desired speed on average, but the speed oscillates around the desired value. This should be remedied by adjusting the PID coefficients.
We tested using this function to make the robot turn at a constant rate, and it was fairly successful, much better than the previous turning.
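A minimal sketch of the PID controller we are implementing (the gains passed in would be our tuned coefficients, not anything shown here):

```python
class PID:
    """Simple PID controller for wheel speed: output is a motor command
    computed from the error between target and measured speed."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured, dt):
        error = target - measured
        self.integral += error * dt                    # accumulated error
        derivative = (error - self.prev_error) / dt    # rate of change
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

The oscillation we saw corresponds to the proportional and integral terms overshooting; lowering kp or adding derivative damping is the usual remedy.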
-In order to implement this control system, we decided to improve the data we are receiving from the microcontrollers on the wheel direction. Previously, we had been recording the actual speed of the motors using the wheel encoders, but had not determined the direction through measurement. Instead, we had been assuming that the movement is in the direction the motor is being driven by the H-bridge. However, the encoders make it possible to measure the direction through two 90-degree phase-shifted square waves: when one wave has a rising edge, the current value of the other yields the direction.

We modified the microcontroller code to perform this calculation using an interrupt on the rising edge of one of the square waves. Initially this was unsuccessful because of delays in the microcontroller code caused by wait loops, but after reorganizing the code to no longer use wait loops, the direction measurement was successful.
- This modification changed the way the wheel speed is measured as well: Rather than clearing the pulse accumulator, waiting 5 ms, and recording the new accumulator count, the microcontroller now only clears the counter when it sends a packet, thus removing the need for a wait loop. This means we will need to change the Atom board speed calculation, since it counted on the sample time being 5 ms.
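The rising-edge direction rule can be illustrated on simulated encoder waveforms (which channel leads for "forward" is an assumed sign convention):

```python
def decode_quadrature(samples):
    """Given (A, B) samples of the two 90-degree phase-shifted encoder
    channels, return a signed count: on each rising edge of channel A,
    channel B's level gives the direction of rotation."""
    count = 0
    prev_a = samples[0][0]
    for a, b in samples[1:]:
        if a and not prev_a:            # rising edge on channel A
            count += 1 if not b else -1  # B low at the edge -> forward
        prev_a = a
    return count

# forward: B lags A by 90 degrees, so B is low at A's rising edge
forward = [(0, 0), (1, 0), (1, 1), (0, 1)] * 3
reverse = [(0, 1), (1, 1), (1, 0), (0, 0)] * 3
```

On the microcontroller this runs as an edge-triggered interrupt rather than a sampling loop, but the decision at each edge is the same.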

WEEK 15 SUMMARY
Accomplishments:
This week, we finalized the microcontroller-Atom board communication:
We connected both microcontrollers to the Atom board for the first time, and fixed the resulting latency issue.
After this, our microcontrollers were fried, and we discovered and fixed the cause: an incorrectly connected temporary power connection.
We then replaced the microcontrollers and implemented following a person using the camera tracking.
After encountering somewhat erratic movements, we began implementation of a motor speed controller. In order to accomplish this, we implemented motor direction measurement using the wheel encoder.
Weekly Work Total: 30.5 hours
Project Work Total: 220.1 hours


Week 16

April 25 (3.5 hours)
Today I fixed two problems which were messing up the Kalman filter. Before going into detail on these bugs, I will quickly summarize the way the Kalman filter operates:
-The Kalman filter is a way to estimate the state of a noisy system from noisy data over time, and it improves the quality of the estimate by using a state-space model of the system. The better the model corresponds to reality, the more measurement noise can be handled without degrading the output.
-There are several matrices on which the Kalman filter operates, the most important being the state vector, which holds the state variables; the transition matrix, which holds the equations for how the state is expected to change after a time step; the measurement vector, which holds the current measurements; and a matrix representing the equations by which the measurements are derived from the state vector.
-Our state vector is composed of the robot's position (x,y) (in meters), orientation (in radians from east), angular velocity (in radians/second), the speed of its left and right wheels (in meters/second), and the acceleration of its wheels (in meters/second squared).
-Some of these variables are measured, but some are not: the position is measured by the GPS, the orientation by the compass, and the wheel speed by the wheel encoders, while the angular velocity and acceleration are never measured. In addition, not every sensor has output data during a particular time step, because they have differing output rates: the GPS outputs coordinates at a rate of 1Hz, while the wheel encoders are sampled at around 60Hz. In order to accommodate this variability, I update the measurement equation matrix based on which variables have been measured: a row of zeros in this matrix means that the measured variable corresponding to that row is zero regardless of the state variables, and thus causes the algorithm to ignore that measurement when updating the state.
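The row-zeroing step can be sketched like this (a toy 3-row measurement matrix; our real matrices are larger):

```python
def mask_measurement_matrix(h_full, measured):
    """Return a copy of the measurement matrix H with rows zeroed for
    sensors that did not report this time step, so the filter ignores
    those measurements. `measured` maps row index -> True/False."""
    return [row if measured.get(i, False) else [0.0] * len(row)
            for i, row in enumerate(h_full)]

# toy state [x, y, theta]; rows: GPS x, GPS y, compass theta
H = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 1]]
# compass was read this step, but no new GPS fix arrived:
H_t = mask_measurement_matrix(H, {2: True})
```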
-The first problem had to do with the measurement matrix: I was updating it correctly based upon the new sensor data available, but had forgotten to comment out code which was originally used when using simulated data. This code was over-writing the matrix and causing it to use the data from all the sensors, whether measured recently or not. Fixing this problem greatly improved the results of the kalman filter.
-The other thing I changed was to greatly increase the assumed uncertainty of the kinematic model with regard to the orientation. This is largely because of wheel slippage, which prevents the actual robot from turning as quickly as would be assumed based on the wheel encoders. This increased uncertainty causes the estimated orientation to follow the results of the compass much more closely. This tends to be more accurate than it was previously.
-After making these changes, we were ready to demonstrate our GPS PSSC, and were checked off for it.
-We are also ready to demonstrate our last PSSC, once we remove the word "accelerometer" from it. Originally, we intended to use an accelerometer, and were able to get it working. However, after sitting on our bench for a couple of months, it stopped functioning completely. We have found that we can accomplish the task associated with this PSSC, determining changes in position using the compass and wheel encoders, without needing the accelerometer. Therefore, we do not think it is worth buying a new one simply because we said we would use it, and we intend to submit a PSSC change request.

April 26 (4 hours)
-Today I created a display to demonstrate that the robot is able to use the wheel encoders and compass to determine changes in direction. The robot is represented as a triangle on a black screen. The triangle faces in the direction that the robot is facing: north -> up, south-west -> down-left etc. This triangle also moves in the same direction and speed as the robot when it moves. Thus, the triangle mimics the movement of the robot.
-After creating this display, we used it to demonstrate our last PSSC: to determine changes in position based on compass and wheel encoder data.
-Next, we took it to MSEE, where there is more room to move, and took some video for our presentation.

-We also submitted our PSSC change form to remove the accelerometer from our criteria.

April 27 (4 hours)
-Today I worked on enabling the obstacle-mapping and path-finding blocks, and in creating a display which makes it easy to see the obstacles, path, etc.
-First, I enabled only the waypoint navigation section, and made it so that the control system we created for following a target is used to move toward a designated waypoint which is entered by clicking on the map. I was able to make this work with a minimum of effort since all the necessary functions are already implemented.
-Next, I created a display system which includes a background and a foreground. These two layers can be cleared or drawn on independently, enabling me to draw moving objects without destroying the background, as had been happening previously. This is done by making a copy of the background, drawing the foreground onto the copy each frame, and displaying the modified copy.
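The copy-and-overlay idea, sketched with plain 2-D arrays standing in for the real image surfaces:

```python
def compose_frame(background, foreground):
    """Copy the background and draw foreground pixels over the copy,
    leaving the background itself untouched. None marks a transparent
    foreground pixel; the pixel values are placeholders for image data."""
    frame = [row[:] for row in background]  # copy each row
    for y, row in enumerate(foreground):
        for x, pixel in enumerate(row):
            if pixel is not None:
                frame[y][x] = pixel
    return frame

bg = [["."] * 4 for _ in range(2)]              # persistent obstacle map
fg = [[None, "R", None, None], [None] * 4]      # moving robot marker
frame = compose_frame(bg, fg)
```

Each displayed frame is a fresh composite, so moving the robot marker never erases the accumulated map.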
-I used this framework to add an obstacle display to the robot display I created on April 26. The obstacle mapping algorithm is the same as that which I created and described on April 13, except that it uses my new background-foreground framework.
Image: The white triangle is the robot, the yellow line points in the direction the turret points, and the fuzzy green arcs are where there may be obstacles.

-Next, I added code to the navigation routine which makes the camera turret (which has the sonic rangefinder mounted on it) scan back and forth. In this way, the robot will be able to make a rudimentary map of its surroundings as it drives around. The next step will be to re-enable the pathfinding and see how well it works.

April 28 (10 hours)
-Today, I enabled the pathfinding algorithm, which together with the obstacle mapping was able to cause the robot to navigate to waypoints and avoid obstacles.
Image: The green line is the path around the obstacles which the robot plans to take.

I did this by giving the pathfinder control of the robot through the control system we created for following targets. The control system turns with varying speed based on the angle between the camera turret (which aims at the target) and the front of the robot. With a small modification, I passed the angle to the nearest sub-waypoint to this control system, which causes the robot to follow the path, since waypoints are removed when they are reached.
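The waypoint-following step can be sketched as follows (the reach radius is an illustrative value):

```python
import math

def steer_to_path(x, y, heading, waypoints, reach_radius=0.2):
    """Return the signed angle (radians) from the robot's heading to the
    next sub-waypoint, removing waypoints once they are reached.
    `waypoints` is a list of (x, y) points along the planned path."""
    while waypoints and math.hypot(waypoints[0][0] - x,
                                   waypoints[0][1] - y) < reach_radius:
        waypoints.pop(0)   # waypoint reached; advance along the path
    if not waypoints:
        return 0.0         # path complete; no turn needed
    wx, wy = waypoints[0]
    angle = math.atan2(wy - y, wx - x) - heading
    return (angle + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
```

This angle is handed to the same turning control system used for target following, so the pathfinder drives the robot with no new control code.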
-Next, we took tons of video of the robot in action in MSEE, both with a camera and with screen-capture software. We did this because we wanted to include the new pathfinding in our video, and because we wanted the robot to act completely wirelessly throughout the video. The previous video was shot mostly with the Ethernet cable plugged in. We then took the shots we wanted and put them together to create our video. We found the screen capture of the robot navigating around two people and the camera footage corresponding to it, and put them together to show the navigational ability.

April 29 (4 hours)
-Today we presented our project to the ECE362 and ECE270 classes.
-Later I worked on correcting my graded homework in anticipation of adding it to the final report. This mostly involved adding citations and making changes based on design changes made since the papers were written. For example, we did not use the 14.8V battery I initially put in the report; since we used a 12V battery instead, we did not require a 12V regulator.

April 30 (3 hours)
-Today we worked on the final report and poster. This involved arranging pictures for the poster, and creating a project overview for it. I also wrote out my personal contribution section and the summary.

WEEK 16 SUMMARY
Accomplishments:
This week, we made the finishing touches to our project:
- fixed bugs in the software, and improved the robot's following ability by creating an improved wheel speed controller
- added a display for the robot's internal estimate of its movement
- added the obstacle map to the display
- enabled pathfinding
We were able to demonstrate all of our PSSCs this week, and have finished the final reports.
Weekly Work Total: 28.5 hours
Project Work Total: 248.6 hours