Progress Report for Alexander Davila
Week 1:
Date: 1/13/17
Total hours: 3 hours
Description of design efforts:
This week I mainly worked on the final project proposal and looked for wireless communication modules. We also went through some options for microprocessors.
Week 2:
Date: 1/20/2017
Total hours: 5 hours
Description of design efforts:
Worked on the functional specification report, specifically on the computational, thermal/power, and mechanical constraints. I also signed out a chassis from the lab that will be used as the base for the rover.
We also looked for a microcontroller and for options to control the arm as well as to transmit information.
Week 3:
Date: 1/27/2017
Description of design efforts:
This week I worked with Alejandro choosing some elements for the project; we decided to use an STM32F401 as the microcontroller of the rover. We also ran some tests with a couple of FS1000A modules, but we saw that they were not useful, as they could only transmit from a few centimeters away and received too much noise. In fig. 1 the two modules (receiver and transmitter) are shown.
Fig. 1 FS1000A RF modules
I also ran simulations in Proteus of how the chosen motor driver (L298) should be connected. The simulation worked fine and I learned a couple of things about this IC; for example, I did not know how the "sense" pin worked. The simulation can be seen in fig. 2.
Fig.2 Proteus simulation of a L298 driving a DC motor
Apart from that, I also ran some preliminary tests with the motors in the chassis we got. The left motor does not work properly, so we asked Joe for a replacement and he already ordered a new one. We also asked him to order the robotic arm and it is already on its way.
As for the documentation, this week I did the software overview. Here I realized how far in advance we should already be thinking. I had to define a lot of things for the sensors and the software, and as a group we had to make a lot of decisions, such as how we are going to transmit data and how we are going to handle the whole algorithm of the program. The general algorithm, the state machine diagram, and a general layout I made for the project can be seen in figs. 3, 4, and 5 respectively.
Fig.3 General algorithm in the controller and rover (some quality is lost but the complete diagram can also be seen in the Software overview document)
Fig.4 General state machine diagram for the software
Fig.5 General layout diagram for the project
I also worked on the component analysis, specifically on the power supply and voltage regulators. I did some research and decided that a linear regulator is not a good option for this project, given its low efficiency and reliability. Therefore, we are going to use an LTC8050 switching regulator rather than an LM7805 linear regulator. I also did some calculations for the battery we need and came to the conclusion that we need a 12 V, 15000 mAh lithium-ion battery.
Also, the battery monitor is going to be an LTC6802-1 rather than an LM3914, as I first thought, because the battery we need has 4 cells and the measurement would not be reliable with our first option.
Week 4:
Date: 2/03/2017
Total hours: 10 hours
Description of design efforts:
Worked mainly on the Raspberry Pi this week. I wrote some simple programs and learned to use the RPi.GPIO library. I also wrote a small program that can take an instruction from a computer in order to get the information from the Leap sensor. Basically, the computer using this sensor will write a small file to a location on the Raspberry Pi, and this program will read the first character of the file; then, with a series of if-else statements, the RPi will act one way or another.
In figure 6 we can see the Raspberry Pi driving some LEDs according to the user's input.
The program basically turns on a different LED every time the user presses Return on the keyboard.
Fig. 6 Raspberry Pi driving LEDs
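The first-character dispatch described above can be sketched roughly as follows. This is a hypothetical illustration: the command characters and action names are assumptions, and the real script drives GPIO pins instead of returning strings.

```python
# Hypothetical sketch of the file-based dispatch described above.
# The computer writes a small command file; the Pi reads the first
# character and picks an action with a chain of if-else statements.

def dispatch(command_text):
    """Map the first character of the command file to an action."""
    if not command_text:
        return "idle"
    first = command_text[0]
    if first == "f":      # assumed command: forward
        return "forward"
    elif first == "b":    # assumed command: backward
        return "backward"
    elif first == "s":    # assumed command: stop
        return "stop"
    else:
        return "idle"     # unknown commands are ignored
```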
I also had a problem when I tried to make the RPi run my script on boot, as it would get trapped in an infinite loop. I had to format the whole memory, but then I solved it. Basically, I just declared a pin as an input and wrote a simple if statement to check whether the pin is high; if it is, the program terminates.
The loop I made is as follows:
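(The original listing did not survive here; below is a minimal reconstruction of the idea, not the actual code. The pin choice and 1-second poll interval are assumptions, and the GPIO read is passed in as a function so the sketch can run away from the Pi — on the real board read_pin would be something like a GPIO.input(SHUTDOWN_PIN) call from RPi.GPIO.)

```python
# Reconstruction sketch of the shutdown check (not the original code).
# read_pin stands in for reading the shutdown GPIO input; the pin
# number and 1-second poll interval are assumptions.
import time

def run_until_shutdown(read_pin, do_work, sleep=time.sleep):
    while True:
        if read_pin():     # shutdown button pressed: pin reads high
            break          # leave the loop so the script can exit
        do_work()
        sleep(1.0)
```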
With this small loop I implemented a turn-off button that will be useful later on, because we plan to use the RPi without any screen, keyboard, or mouse.
Apart from that, I did some research to find the best way to monitor the battery. Initially we thought that the IC I found was too expensive, but Alejandro found out that the 477 lab has a module that uses exactly the same chip.
I also ran some simulations to see if we can control the motors of the rover using PWM. My main concern was that the optoisolators would not be able to reproduce the PWM correctly, but it seems to work in the simulation.
This Sunday I plan to build the actual circuit, have the prototype for those motors, and hopefully start moving the rover.
Also, Alejandro and I plan to use the XBee modules with the RPi and the STM32 so we can start transmitting instructions between the controller and the rover. The RPi should not be a big problem, as a library already exists for Python, and I am confident the STM32 will not be a big challenge either, as the UART communication is really straightforward.
With Leonel, we will try to get a computer communicating with the Raspberry Pi directly using Python, so we can automatically send instructions with our scripts.
With all this done by Monday, we will be able to start working on our programs, defining functions that will be used in the final design and some necessary instructions. So I think the next week will require a lot of hard work, but we will see a lot of progress in the project.
Finally, I would like to start working with the battery monitor module and the ultrasound sensors, but that is not going to be the priority next week.
Week 5:
Date: 02/10/2017
Total hours: 13 hours
Description of design efforts:
This week I worked with the H-bridges; Mateo and I constructed the circuit that will drive the rover's motors. However, we encountered a problem when we tried to add the optoisolator. This device is basically a transistor with a base activated by an LED. The problem was that the transistor had a capacitance that discharged slowly, so before the output reached zero, the next pulse already took its place, and thus the input to the H-bridge never reached a zero level. In figure 7 the output of the optoisolator and the PWM from the microcontroller are shown.
Fig. 7 Output from the optoisolator; notice the output (in yellow) never reaches zero
This will be fixed once we get the FOD3180S optoisolators we are going to use, as they have a digital output, so there is no capacitance problem.
I also made simulations for all the external circuits we need in the project except for the battery monitor, because there is no model of the IC we are going to use yet. The models were created using Proteus. The circuits I made are: ultrasonic sensor, DC-DC converter with H-bridge and optoisolators, DAC converter for the joystick, VU meter driver, and a servo motor control circuit. Below, the circuit for the DAC converter can be seen, as well as the ultrasonic sensor simulation.
Fig. 8 DAC converter for the joystick (The converted output in yellow and the clock signal in blue)
Fig. 9 ultrasound sensor circuit
These circuits are very useful, not only because they can guide us when we actually build them on the breadboard, but also because they let me better understand how the ICs we are going to use work and the extra parts we need for them, such as resistors, capacitors, etc.
I also got the XBee modules communicating as transceivers. I did so using my laptop, connecting them to different USB ports, and I was able to transmit and receive data between them practically instantaneously. I noticed they keep the settings configured from the computer, so they are ready to use with the RPi and the microcontroller.
Next week I am going to focus on the RPi and the microcontroller. I hope to be able to transmit data from the RPi and control the arm motors with the microcontroller.
Week 6:
Date: 2/17/2017
Total hours: 9 hours
Description of design efforts:
This week I spent most of the time at the lab trying to implement the UART protocol on the STM microcontroller in order to communicate with an XBee module. I already have the initializations, but I haven't gotten the whole implementation to work yet. I'll work more on that on Saturday. In figure 10 the initialization can be seen.
Fig 10. Initialization of the UART protocol on the STM microcontroller
I also investigated how to use the BQ76925 as a battery monitor. The circuit seemed complex at first, but after more research I found a smaller circuit that works only as a battery manager, without using other functions of the IC that are not required for our project. In figures 11 and 12 the simplified circuit and its design requirements can be seen. It is worth noting that this circuit can give us information not only about the voltage but also about the current and temperature of the battery, and it also works as a voltage regulator.
Fig.11 Circuit to use the BQ76925 as a battery manager.
Fig 12. Design requirements for the circuit on fig. 11
We made some changes to the way we are going to handle the information sent via UART. We are going to develop Mateo's idea of coding instructions in the 8 bits we can send at a time. The first bit tells the STM32 whether the instruction is a rover movement or an arm movement. If it is a rover movement, the next 3 bits indicate the speed of the movement and the last 4 bits the direction. If it is an arm movement, the next 3 bits indicate which motor is going to be moved and the last 4 the direction. This will allow us to use arrays to handle the control of all the motors in the project, which will make everything easier and more understandable when programming and making changes.
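A sketch of how such a byte could be unpacked on the receiving side follows. The exact bit positions are an assumption for illustration; the text above only fixes the 1/3/4 split.

```python
def decode_instruction(byte):
    """Unpack one 8-bit instruction: 1 flag bit (rover vs. arm),
    3 bits of speed or motor index, 4 bits of direction.
    Bit positions here are assumed, not taken from the real firmware."""
    target = "rover" if byte & 0x80 else "arm"   # flag bit
    field = (byte >> 4) & 0x07                   # speed or motor index
    direction = byte & 0x0F                      # direction code
    return target, field, direction
```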
We also decided to use a Wii nunchuck instead of the joystick, as it will be easier for the user and will give the final product a better presentation. We can also take advantage of the buttons on the nunchuck if we need them in the project. The nunchuck uses I2C to communicate with the Raspberry Pi, and we expect to have it working this week.
Week 7:
Date: 02/24/2017
Total hours: 16 hours
Description of design efforts:
This week I worked primarily with the Raspberry Pi. I made a lot of progress with the peripherals we are going to use. I implemented the XBee module on the RPi in API mode. I chose API mode because it is more reliable and it can protect the communication from other XBee modules and RF signals in general. The implementation from the computer was really simple, because the X-CTU application builds the packets according to the user settings. On the other hand, the RPi outputs a dictionary where the keys are the parameters and the values are the settings for those parameters. From this I already have an idea of how to take only the data from the transmission: you can access it with the key "rf_data" in the dictionary. In figure 13 the received dictionary can be seen as displayed on the terminal of the RPi. We might have to change from API mode to transparent mode, depending on how hard the implementation on the microcontroller turns out to be.
Fig. 13 Dictionary received on the Raspberry Pi from the XBee module.
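Pulling just the payload out of the frame dictionary is a one-liner; a sketch (the helper name and the sample frame are ours — with the python-xbee package the frame itself would come from a call such as xbee.wait_read_frame()):

```python
def extract_payload(frame):
    """Return only the transmitted bytes from a received API frame.
    python-xbee delivers each frame as a dict keyed by field name."""
    return frame['rf_data']

# Hypothetical received frame, shaped like the one in figure 13:
frame = {'rssi': b'K', 'source_addr': b'\x00\x01', 'rf_data': b'hello'}
```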
I also worked on the motor driver with the L298n. The driver worked with a small motor and a PWM frequency of 200 Hz, which is low compared with a typical PWM. The circuit did not work well with the rover's motors; it tried to move them but did not provide enough current. This might be due to the fact that the L298 is designed to be used in high-voltage circuits and, as a consequence, it drops about 2.5 to 3 volts, so the voltage left for the motor is too little to make it work. We are thinking about getting some modules to drive the motors and trying other configurations. In fig. 14 the circuit and the small DC motor spinning can be seen.
Fig. 14 DC motor with an L298n driver
I also worked with the Wii nunchuck and the LCD screen. The nunchuck required a special adapter in order to have its pins accessed; its implementation was fairly straightforward too. For the LCD I did some research and created two modules to be imported into the program. With these modules we can print to the screen just by using the command lcd.lcd_display_string(), which takes two arguments: the string and the line where it will be displayed. Also, the initialization can be done with the command lcd.lcd_inicialize(). This is going to make all future programming simpler and tidier. A problem I encountered after both of the devices were working is that they could not work in the same program. I understand that every I2C device has its own address, and both were being called with the correct one. This weekend I will try to make them work in the same program. In figure 15 the output of the nunchuck (printed on the terminal) and the LCD can be seen. In the output from the nunchuck, Jx and Jy are the coordinates of the joystick, Ax, Ay, and Az are the coordinates of the accelerometer, and Bc and Bz are the two buttons.
Fig 15. The nunchuck output on the terminal (bottom) and the LCD displaying an ASCII wolf (top)
Week 8:
Date: 03/03/2017
Total hours: 15 hours
Description of efforts:
This week we focused on the Midterm Review presentation. On Saturday we did all the slides as well as the diagrams and pictures. On Monday we divided up the presentation and changed some things in the slides. On Tuesday, before the presentation, we met again to practice a couple of times. Overall I think it went very well; we all knew the subjects we presented and were well prepared. We still have a lot of work to do, but I am sure we are on a good track to finish the project on time.
I also did some research about what the RSSI entry in the dictionary taken from the received XBee package means. It is a scale used to measure the quality of the connection on IEEE 802.15.4 wireless implementations. The scale decreases from 0, with 0 meaning 0 dBm (decibels relative to one milliwatt) lost. Therefore, the closer the value is to 0, the better the connection. In fig. 13 the RSSI of each received package can be seen as K and I. This means that the first package had an RSSI of K -> 0x4B -> -75 dBm and the second one of I -> 0x49 -> -73 dBm. According to the manufacturer of the XBee modules, readings above -40 dBm are not reliable, as the module is saturated and its behavior is not linear anymore. They also state that the RSSI can be taken from pin 6 of the module and accessed using the AT command 'DB' in transparent mode.
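The conversion above can be expressed as a small helper (a sketch: the module reports the magnitude of the loss, so the dBm value is just its negation):

```python
def rssi_dbm(rssi_byte):
    """Convert the raw RSSI value from a received frame to dBm.
    E.g. 'K' = 0x4B = 75 -> -75 dBm, 'I' = 0x49 = 73 -> -73 dBm."""
    value = ord(rssi_byte) if isinstance(rssi_byte, str) else rssi_byte
    return -value
```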
Week 9:
Date: 03/10/2017
Total hours: 12 hours
Description of design efforts:
This week we made a lot of progress. We were able to test the motor driver using some modules Joe gave us. At first we tried assembling our own prototype on the breadboard, but it could not supply enough current to drive the motor, probably because of the gauge of the wires we were using or the fact that the breadboard and its connections are not made for high currents either. However, the module was more than capable of driving the motor even with only one channel, so we should be fine using two channels (in parallel) for each motor, as we did on the PCB. In fig. 16 the module we used is shown.
Figure 16. Module to test our motor driver
We also received our robotic arm. We actually got the Braccio (our first option) and the Lynxmotion AL5D. We decided to try the Braccio for now. We assembled it and tested the motors with the included shield and an Arduino MEGA we have. It works fine, and next week we are going to test Matt's programs to move it according to the Leap sensor input. The Braccio's size seems appropriate to be mounted on top of the rover.
On fig 17 the Braccio can be seen.
Figure 17. Braccio from Arduino.
Finally, this week I got the communication from the Raspberry Pi to work with the XBee module in transparent mode. The main problem was the way I addressed the serial port. The RPi 3 has two UART ports, but the first (main) UART port is used for the Bluetooth functionality, while a 'mini UART' is connected to the GPIO pins. The first port is addressed as /dev/ttyAMA0 and the second one as /dev/ttyS0. This differs greatly from other versions of the RPi, so most documentation and tutorials I found on the internet are not applicable to ours.
In the end the program is pretty simple, and the way we read and write from/to the port is straightforward. Basically, we just need to declare a Serial object with the desired port, baud rate, and read timeout, and then use the functions serial.write([string to be sent]), which returns the number of bytes successfully sent, and serial.read([number of bytes to read at a time]), which returns the read bytes as a string. All these functions are found in the Serial library for Python.
I can address the port as /dev/ttyUSB0 (when connecting the XBee via USB) or /dev/ttyS0 (when connecting via UART on the GPIOs). I am using ttyUSB0 because, when addressing the port as /dev/ttyS0, it sometimes crashes when trying to write immediately after reading. I have to do some more research to solve this, but I believe it is just a matter of changing some timing configuration on the port, because the error it gives says the port is not available at that moment. In any case, I can keep the whole program from crashing by using try and except, so it just prints a string saying 'we failed' instead of stopping. In figure 18 the program I made to use the XBee modules can be seen. It is also worth noting that we achieved communication directly between the STM32F and the RPi.
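A sketch of the transparent-mode exchange described above. On the Pi the port would be opened with pyserial, e.g. serial.Serial('/dev/ttyUSB0', 9600, timeout=1) (the baud rate is an assumption); anything with write()/read() works here, which keeps the sketch testable away from the hardware.

```python
def send_and_receive(port, message, nbytes=1):
    """Write a message and read back up to nbytes, guarding the read
    with try/except so a busy port does not crash the whole program."""
    sent = port.write(message)        # number of bytes queued
    try:
        reply = port.read(nbytes)     # blocks until nbytes or timeout
    except Exception:
        print('we failed')            # same fallback the report uses
        reply = b''
    return sent, reply
```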
Additionally, from what I have seen, I now think transparent mode is not as unreliable as I thought, because in order to make it work we first had to set the receiving addresses on both XBee modules. This indicates that our XBee modules in transparent mode are not going to receive interference from other XBee modules around.
Fig. 18. Transparent mode program to use the XBee module with the Raspberry Pi. If the module is connected using the GPIOs, the only change necessary is replacing USB0 with S0 in the port declaration.
Week 11:
Date: 3/23/17
Total hours: 12 hours
Descriptions of design efforts:
During this week I focused on the software for the Raspberry Pi. After testing every individual part, I started writing the main function. The way I decided to handle it is to initialize all the communications and necessary variables before entering an infinite loop. This loop coordinates the whole behavior of the controller, calling functions and using flags.
I imported the modules serial for the XBee communication, socket for the TCP communication, and time for the sleep function. I also made some modifications to the files I created before. In figure 19 the main function is shown. It is important to note that, for now, only the functions to move the rover and the arm are implemented. Also important is the way I am handling exceptions: any exception is only printed to the terminal, except for a keyboard interrupt, which terminates the execution. I decided to let the program keep running even when other exceptions are raised, in order to avoid crashing the controller, and just correct the problem on the next pass of the while loop.
Fig. 19. Main function, where the functions mov (to get the movement from the nunchuck) and arm_mov (to get the movement for the arm) are called.
In the main file there is also a mov function that calls a function called readnun. This function is located in the nunchuck.py file. It receives an smbus object (previously initialized when the function nun_ini was called at the beginning of the program) and a string. If it receives 'x' it returns the x and y positions of the joystick. If it receives 'b1' or 'b2' it returns the state (either 1 or 0) of the corresponding button.
After getting the x and y coordinates, this function re-centers each coordinate so that instead of going from 0 to 255 they go from -127 to 127, making it easy to tell which direction the joystick is pointing. It also sets the value to zero if it is between -35 and 35 in order to avoid errors. At the end it calls the function dir, located in the same main.py file, which decides which instruction should be sent based on the received coordinates.
In figure 20 both the mov function and the dir function can be seen. It is important to note that right now the output is not sent to the XBee module but printed to the terminal along with the corrected coordinates. In order to send the instructions, the print statements must be replaced with serial.write([instruction]).
Fig. 20. Functions to send the rover movement instructions from the controller.
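The re-centering and dead-zone step in mov can be sketched as follows (the exact center offset and whether the dead-zone bounds are inclusive are assumptions):

```python
def correct_axis(raw):
    """Re-center one nunchuck axis: raw 0..255 -> roughly -127..127,
    with values inside the +/-35 dead zone clamped to 0."""
    centered = raw - 128          # assumed center value
    if -35 < centered < 35:
        return 0                  # ignore small deflections / drift
    return centered
```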
Finally, the function to send instructions for the arm is taken from the TCP client file made by Mateo. It requests the angles given by the Leap sensor and compares them to the previous angles using the function TCP_client, located in the tcpClient.py file. These angles are initialized in the main function to the same angles the arm will start with. It then sends a string with the instructions to move the arm and returns the new angles to the main function so they can be updated. This is how we are going to coordinate the arm with the user's movements. When we implement the arm movement on the rover, the latency (sleep) time will be calibrated in order to optimize the mimicking function of the arm.
In figure 21 the function mov_arm can be seen. Note that the print statements must be replaced with serial.write([instruction string]) to send the instructions to the XBee module.
Fig. 21. Arm movement function.
The rest of the week we are going to work on the program for the STM32. Hopefully in two weeks we can have both the rover and the arm working. After that we will send the data from the STM32 to the RPi and continue incorporating other functions into the controller, such as the VU meter.
Week 12:
Date: 03/30/2017
Total hours: 12 hours
Description of design efforts:
This week we mostly worked on the rover, given that everything on the Raspberry Pi needed to send instructions is working. We did, however, consolidate the RPi's connections: as shown in figure 22, all the connections for the I2C of the Raspberry Pi (nunchuck, LCD, and the logic converter) are now soldered in one place.
Fig. 22. I2C connections for the Raspberry Pi.
For the rover, Mateo and I designed the outline of the main function. Mateo tested the code and I wrote the first main file based on it. Basically, we read a byte from the XBee and save that value in data_i. Then we mask it with 0x20 and save the result in a bool called main_flag. This decides whether the instruction is for the rover movement (main_flag == 1) or the arm movement (main_flag == 0). I also included a flag to put the arm in its initial state; it is set to one and cleared at the beginning of the program once the arm reaches the initial position. In figure 23 the main function is shown as it is today.
Fig. 23. Main function.
We have also worked on the Rover.c file, which takes the incoming instruction masked with 0x0F. This value determines which structure in the array will be used to set the PWM of the motors. Additionally, we set the control bits based on which direction the motors should be going. In figure 24 the file for the rover movement is shown. Note that the PWM set here is expressed as a fraction of the period of the waveform, with 1049 corresponding to a PWM of 100%.
Fig. 24. Rover.c file. Note the values in the array are not final; they will be calibrated when we test with the rover.
Alejandro has done a great job with the functions and initializations of the rover. The function delaytsec is going to be really useful for calibrating the movements, and he has given me the information to use the correct variable types and functions to change the PWM and control bits.
Over the weekend we will make the first tests controlling the rover from the RPi and then concentrate on the arm movement. Hopefully by next week we will only have the ultrasound sensors and the VU meter left to do.
Week 13:
Date: 06/04/2017
Total hours: 20
Description of design efforts:
This week we made a lot of progress. First, we implemented the RF control of the rover movement. It required some calibration of the speed of each motor when going in the right-forward, right-backward, left-forward, and left-backward directions.
Right now the motors are working at around 90% of their full strength. We might change that in the future so the rover is easier to drive. We also finished the trace-back function. It also needs some calibration, but overall it's working pretty well. On image 25 the rover can be seen moving while being controlled through RF.
Img. 25. Rover being controlled through RF.
The way the trace-back function was done is pretty simple. Matt already had the script to convert the instructions to their opposites (i.e., forward turns to backward and vice versa). As far as the rover is concerned, the way it moves is always the same, so the trace-back function is fully implemented on the controller.
To start recording the path, the user just has to press the C button on the nunchuck, and when the user wants the rover to go back to the starting point, they press the same button again. Then the recorded instructions are converted and sent to the rover one by one. On image 26 the trace-back function can be seen working.
Img. 26. Trace-back function: The rover is taken to a certain point, then stops and comes back on its own.
We are still working on the arm. The way we planned to implement it at first does not work as well as we expected. The main problem is that the frame of reference for the movements is relative to the starting point, and we cannot predict the exact position of the user's arm when the program starts. The other problem is the range in which each motor moves: if, for example, the range we use is 90 degrees, the program starts going from 0 to 90, but soon it drifts out of that range and goes from 20 to 110, and so on.
The new approach we are taking is to use absolute angles. Now we are going to send the current angle of each motor in a specific order: base, shoulder, elbow, wrist, claw. For example, if all the motors are at 70 degrees, the controller would send 7070707070, so the first 7 is the tens digit of the base motor's angle and the first 0 is the units digit of the same motor.
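Decoding that fixed-width string on the rover side might look like this (a sketch: the motor order is taken from the text, with two digits per angle as described):

```python
def decode_angles(message):
    """Split a string like '7070707070' into five absolute servo
    angles, two digits per motor, in the order given above."""
    names = ("base", "shoulder", "elbow", "wrist", "claw")
    return {n: int(message[2 * i:2 * i + 2]) for i, n in enumerate(names)}
```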
The problem with this approach is that the buffer is not fast enough to read the two values back to back, but a delay in between solved it, and now we have the arm working. However, it moves in frames, because the delay is not optimized and two delays are required for every angle. Right now the delay is set at 0.1 seconds, so we have a total delay of 1 second over the whole arm movement (not counting the actual program execution). On image 27 the arm with the second approach can be seen working.
Img. 27. Second approach on moving the arm. Note that the movement has some lag, but it does follow the user's arm very well.
We have now come up with another approach: sending the angle value in only one byte. The plan is to send the byte holding the decimal value of the angle plus 32. The 32 offsets the value into the printable ASCII range; right now it goes from A to z. The program on the rover then subtracts 32 to get the original value. This approach avoids one read from the register, and I think that, because we no longer read from the register back to back, we don't need the delay anymore, so the movement can be really smooth and practically continuous.
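The offset encoding is a one-liner in each direction (a sketch of the idea, not the shipped code):

```python
def encode_angle(angle):
    """Offset the angle by 32 so the byte lands in printable ASCII."""
    return chr(angle + 32)

def decode_angle(char):
    """Undo the offset on the rover side."""
    return ord(char) - 32
```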
Other than that, I came up with a formula to convert the angles to PWM settings. I took the range of angles (0 to 180) and mapped it onto the range of possible PWM values (150 to 570): the angle is multiplied by (570-150)/(180-0) and then 150 is added (the offset, since the PWM range starts at 150). The error is really small and negligible, so the arm should mimic the user very well.
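In code, that linear mapping works out to (endpoints taken from the text above):

```python
def angle_to_pwm(angle):
    """Map a servo angle (0..180 degrees) linearly onto the usable
    PWM counts (150..570)."""
    return 150 + angle * (570 - 150) / 180
```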
This weekend we will try the latest approach and calibrate some values. Then we will work on the battery monitor and the ultrasonic sensors. Hopefully next week, or maybe in two weeks, we can have the whole project completed.
Week 14:
Date: 13/04/2017
Total hours: 20
Description of design efforts:
The most remarkable thing we did this week was to prepare everything for our first demo. It went really well, and we got three of our preliminary PSSCs checked off. Professor Meyer suggested that we start the rover's movements at a slower speed and progressively increase it. We want to implement this feature after we finish the other two PSSCs. On image 28 the rover with the arm is shown as it was presented.
Img. 28. Rover as presented in the first demo.
During the weekend I did more tests with the arm and discovered that the lag in the movement was caused not only by the need for a delay after reading an instruction. The other problem is the way the buffer works, reading 2 bytes at a time. This caused the second instruction (without a delay) to be ignored in favor of the third received instruction. The solution I came up with is to send an extra byte at the beginning of the string of angles.
So the buffer reads the first instruction (the one that tells the rover the angles for the arm are coming next) and, along with this instruction, it also reads an extra character that can be discarded. This allows the first angle (composed of two bytes, as explained in the third approach in Week 13) to be read as a whole and then used, and the same thing happens for all the other angles. By taking advantage of the way the buffer works, we could eliminate the delay on each angle read, and we only have one delay before reading the first angle. On image 29 the arm mimicking the user's hand can be seen.
Img. 29. Final arm software being tested.
I also worked with Alejandro, testing his functions to measure the distance in front of the ultrasound sensors. The measurement is pretty reliable when we take three readings and average them. The way we are implementing this feature is to request a sensor reading every time an instruction for the rover (as opposed to the arm) is received. Matt also wrote the function to receive the response from the rover and write it to the LCD. We used some fixed distances to verify the information from the sensors. On image 30 the sensor test setup on the rover is shown.
Img. 30. Test setup for the ultrasonic sensor.
I also made the driver for the LED bar display, and it is ready to receive the instruction from the rover and turn on the appropriate LEDs. We agreed to implement this feature as an interrupt every 5 minutes: the Raspberry Pi will call this function every time it receives an instruction from the battery monitor.
Over the weekend we will work on making all the PSSCs work together and hopefully give a new demo at the next lab session. After that we will focus on the packaging and anything we want to add to the functionality of the project. I am personally excited to try a graphic display for the controller, which would allow us to display warnings and even have a more polished initialization when the controller is turned on.
Week 15:
Date: 21/04/2017
Total hours: 10
Description of design efforts:
This week we finished the last two PSSCs. First, we made the ultrasound sensor functions on the rover and the controller work together. We decided to display the distances in front of and behind the car on the LCD screen and to light up LEDs indicating the rover is about to fall off an edge. The controller expects to receive a reading from the ultrasonic sensors every time it sends a movement instruction; this way we can coordinate the data reception and we don't need to add any new delay to either of the programs.
The LED signals turned out to be very useful for making the rover stop before falling off an edge. Basically, we made two flags that mirror the two LEDs. So when, for example, an edge in front of the rover is detected, the first flag is set to 1, and no instruction that involves going forward will be sent. The flag is cleared (set to zero) when any instruction that involves going backwards is sent.
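The flag scheme can be sketched as follows (the instruction names and the symmetric handling of the rear flag are assumptions for illustration):

```python
def allow_instruction(instruction, flags):
    """Decide whether a movement instruction may be sent, using the
    edge flags described above. flags is a dict with 'front'/'rear'."""
    if instruction == "forward":
        if flags["front"]:
            return False          # edge ahead: block forward moves
        flags["rear"] = 0         # moving away clears the rear flag
        return True
    if instruction == "backward":
        if flags["rear"]:
            return False
        flags["front"] = 0        # moving back clears the front flag
        return True
    return True                   # turns etc. are unaffected
```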
We also made the battery monitor (our last PSSC) fully functional this week. For the implementation, we take a reading from the battery monitor and send an appropriate instruction based on the remaining battery. This is done every 30 seconds using an interrupt that, when asserted, sets a flag to one. When the flag is set, the rover executes the described function instead of sending a reading from the ultrasound sensors. This approach made it possible to coordinate the controller and the rover, again, without adding any new delay. As a result, one set of ultrasonic sensor readings is skipped every 30 seconds, which is negligible, given that a set is sent every time the rover receives a movement instruction and the controller is constantly sending them while the user operates the nunchuck.
I made the driver for the LED bar: when the instruction is received, it turns off all the LEDs and then turns on the appropriate LEDs for the received instruction.
On image 31 the controller with the LED bar is shown.
Img. 31. Controller with the LED bar. Note the LEDs that signal a detected edge are not installed yet.
This weekend we will finish the packaging and correct some minor bugs. Next Monday we will present the final PSSCs.
Week 16:
Date: 28/04/2017
Total hours: 31
Description of design efforts:
This week we mainly focused on fixing small bugs. We realized the rover's motors do not spin at the same speed, so we had to account for that in software; that also fixed the errors in the trace-back function and made the driving more precise.
A big problem we encountered was the ultrasonic sensors. We highly underestimated the difficulty of that part of the project, and when the rover started failing we just tried to debug the communications. We thought the XBee modules were failing, but after some research we concluded they were most likely not the problem.
In the end, we found out the ultrasonic sensors would get stuck in an infinite loop when they don't get an echo back, so we just added a timeout and that solved everything.
This week we got all our final PSSCs checked off and finished the whole project. We also participated in the Spark Challenge, where HERO bot received an honorable mention.
In the end, this project took a lot more than I expected, but I learned many useful skills applicable to the 'real world', and it gave me good insight into what project development requires.