Progress Report for: Leonel Vasquez

Week 1:

Date: 13 January 2017
Total hours: 2
Description of design efforts:
Since this was the first week of the semester, we only met to finish the final project proposal and discuss the roles that each of us would have. We also shared our calendars and agreed on times to meet and work on the project.

Week 2:

Date: 20 January 2017
Total hours: 7
Description of design efforts:
This week, we found out that Purdue had a chassis available, so we decided to use it for our rover since it met all of the specifications that we needed. Furthermore, we looked into several options for the robotic arm, as well as for the microcontroller that we would use to start our project. Finally, besides contributing to the Functional Specification assignment, I also did some research on the Leap sensor that we planned to use and downloaded some libraries and modules that would help me gather data from it. However, there might be some issues regarding which microcontroller to use to process the data gathered from the Leap sensor, and I will look into that in the coming days.

Week 3:

Date: 27 January 2017
Total hours: 11
Description of design efforts:
- During week 3, I mainly worked on installing and understanding the Leap sensor. After extensive research, I found the libraries needed to facilitate the code to be written. I decided to write all the programs related to the Leap sensor in Python, since it is the programming language that I feel most comfortable with. After downloading all the needed libraries and modules, I ran into the issue that the Leap sensor is only compatible with Python 2.7; thus, I decided to use PyCharm as my IDE (Integrated Development Environment), since I had previous experience with it and the Python version could be changed accordingly.

The downloaded libraries needed to be in the same directory as the program using them in order to be successfully imported. However, this process was inefficient and error prone, so I saved the libraries in a specific folder and then created a header file that specifies the path to them, making the whole import process very straightforward.
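
A minimal sketch of this "header" approach is shown below; the folder path is a placeholder, since the actual location on my machine differs.

    # leap_header.py -- module that makes the Leap libraries importable anywhere.
    # Assumes the Leap SDK files (Leap.py, LeapPython.so, libLeap.so) were copied
    # into LEAP_LIB_DIR; the path shown here is a placeholder.
    import sys

    LEAP_LIB_DIR = "/home/leo/leap_libs"
    if LEAP_LIB_DIR not in sys.path:
        sys.path.insert(0, LEAP_LIB_DIR)

    import Leap  # any script that imports leap_header can now use Leap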

After successfully importing the libraries, I started researching the use of each class and method in them. The first program I wrote identified how many hands were placed above the sensor, counted the number of fingers, showed the timestamp, detected any gestures (predefined specific hand movements), and showed the ID of the frame. The output of the script can be seen in Figure 3.1.

Figure 3.1 Output for first implemented program
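
The sketch below illustrates the idea of this first program; it uses a simple polling loop rather than my actual code, and the sleep intervals are arbitrary.

    import time
    import Leap  # importable thanks to the header module described above

    controller = Leap.Controller()
    controller.enable_gesture(Leap.Gesture.TYPE_SWIPE)   # gestures must be enabled
    controller.enable_gesture(Leap.Gesture.TYPE_CIRCLE)
    time.sleep(1)  # give the controller a moment to connect

    while True:
        frame = controller.frame()
        fingers = sum(len(hand.fingers) for hand in frame.hands)
        print("Frame %d at %d: %d hand(s), %d finger(s), %d gesture(s)" % (
            frame.id, frame.timestamp, len(frame.hands), fingers,
            len(frame.gestures())))
        time.sleep(0.1)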


It was very comforting that the results obtained matched my predictions. I then proceeded to change the program and, after reading more about the libraries, found a way to identify which hand was placed above the sensor (left or right). Furthermore, I learned that while one hand remains above the sensor, its ID number stays the same; however, if the hand moves out of the sensor's range, it is assigned a different ID number when it is placed back. Finally, I discovered how to obtain the XYZ coordinates of the hand placed above the sensor: X represents movement from right to left, Y represents movement up and down, and Z represents depth. This data is obtained as a tuple of floats, and I will be working with it next week to learn how to interpret it and use it to control the motors in the robotic arm. The output of the second program can be observed in Figure 3.2.

Figure 3.2 Output for second implemented program
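
A sketch of the core of the second program, reusing the controller setup from the previous sketch:

    import Leap

    def report_hands(frame):
        # Identify each hand's side, its persistent ID, and its XYZ coordinates.
        for hand in frame.hands:
            side = "Left" if hand.is_left else "Right"
            pos = hand.palm_position  # Leap.Vector: x right/left, y up/down, z depth
            print("%s hand, ID %d: (%.1f, %.1f, %.1f)" % (
                side, hand.id, pos.x, pos.y, pos.z))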


- I was also in charge of connecting the Raspberry Pi to my computer. I successfully completed this task using an Ethernet cable: first I obtained the IP address of the Raspberry Pi, and then I connected to it from my computer via SSH. Next week I will upload my Python program to the Raspberry Pi and see how it behaves with the Leap sensor connected to it.

-Finally, I also helped write the component analysis and provided feedback for the Software Analysis.

Week 4:

Date: 03 February 2017
Total hours: 10
Description of design efforts:
- During week 4, I kept working with the Leap sensor. After extensive research, I found classes implemented in the libraries that will help me find the angle of rotation of the wrist, as well as identify the positions of the elbow, the hand, and the arm, among many other functionalities. All the newly gathered data can be seen in the following figures.

Figure 4.1 Output for wrist and elbow data


Figure 4.2 Output for hand normal position and other data


Unfortunately, I am not totally sure yet which of these functionalities will be the most useful for the implementation of the project. However, in the table below, I have explained the function of each of the new parameters:

Figure 4.3 Functions of parameters
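
As a rough sketch of how this new data is read per frame (which attributes I will end up keeping is still undecided):

    import Leap

    def report_arm_data(frame):
        for hand in frame.hands:
            arm = hand.arm                      # skeletal arm attached to the hand
            print("Wrist position: %s" % arm.wrist_position)
            print("Elbow position: %s" % arm.elbow_position)
            print("Palm normal:    %s" % hand.palm_normal)
            # Rotation angles (in radians) derived from the hand's basis vectors
            print("Pitch %.2f  Yaw %.2f  Roll %.2f" % (
                hand.direction.pitch, hand.direction.yaw, hand.palm_normal.roll))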


Furthermore, I was able to extract information about the fingers. In order to avoid interference and undesired outputs, I disabled recognition of the middle, ring, and pinky fingers. Thus, the movement of the claw will be more accurate and easier to implement. The following figure shows the detection of the thumb and index fingers with their respective IDs.

Figure 4.4 Output for data gathered from fingers
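
A sketch of the filtering idea ("disabling" the other fingers here simply means skipping them in the loop):

    import Leap

    WANTED = (Leap.Finger.TYPE_THUMB, Leap.Finger.TYPE_INDEX)

    def report_fingers(frame):
        for hand in frame.hands:
            for finger in hand.fingers:
                if finger.type in WANTED:     # ignore middle, ring, and pinky
                    print("Finger ID %d, type %d" % (finger.id, finger.type))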


Finally, I found a way to decrease the sensor's data-reading latency, which will not only make the outputs clearer and easier to identify, but will also come in handy when we need to synchronize the data transmission speed later in the semester.

- I also tried to connect the Leap sensor directly to the Raspberry Pi and run my Python program from there. Even though I copied the files from my computer to the Raspberry Pi successfully, I was not able to import and use the needed Leap libraries. I still need to do some research to find out whether it is an issue of missing Python libraries and modules on the Raspberry Pi, or whether the Leap sensor is simply not compatible with the Raspberry Pi.

Furthermore, I am still working on sending data from my computer to the Raspberry Pi. One of my main goals for next week is to find a way to redirect the output of my Python program (which will be updating constantly) to the Raspberry Pi.

-Finally, I designed and defined the main structure of a program that will read the instructions representing the original path stored in a text file in memory, then reverse them and write them to a new text file from which the rover will read them and trace back to the point where the tracking function was invoked. It is important to note that what I defined is just the main idea, and it will need to be adjusted once we start working with the rover.

Week 5:

Date: 11 February 2017
Total hours: 7
Description of design efforts:
- During week 5, I did some research regarding the issue of connecting the Leap sensor directly to the Raspberry Pi. I found that this is not a feasible idea, since the Raspberry Pi cannot provide enough power for the Leap sensor to work. Furthermore, the Leap sensor requires an i3 processor in order to process and create at least 290 frames per second. However, after reading a couple of books to get more familiar with programming the Raspberry Pi in Python, I successfully completed one of the goals I set for myself last week: I found a way to constantly send the data obtained from my Python script to the Raspberry Pi. The solution consisted of using Python's "socket" library, which let me establish a server-client connection between my computer and the Raspberry Pi over Ethernet using the Transmission Control Protocol (TCP).

I wrote a program in which the client (the Raspberry Pi) sends a string of characters to the server (my computer), the server processes the string and makes it uppercase, and the converted string is sent back to the client. Figures 5.1 and 5.2 show the behavior of the program. The screen with the black background is my computer and shows the TCP server program, indicating when data has been received or sent. The screen with the white background shows the output of the TCP client program run on the Raspberry Pi.

Figure 5.1 Output of the program run on the server


Figure 5.2 Output of the program run on the client
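
A minimal sketch of the server-client pair (Python 2.7, matching the Leap SDK); the port number and IP address are placeholders:

    # server.py -- runs on my computer
    import socket

    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("", 5005))                 # placeholder port
    srv.listen(1)
    conn, addr = srv.accept()
    print("Data received from %s" % str(addr))
    data = conn.recv(1024)
    conn.sendall(data.upper())           # make the string uppercase and send it back
    conn.close()

    # client.py -- runs on the Raspberry Pi
    import socket

    cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    cli.connect(("192.168.1.2", 5005))   # placeholder address of my computer
    cli.sendall("hello hero bot")
    print("Server replied: %s" % cli.recv(1024))
    cli.close()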


-Furthermore, I ran some tests and now have a better idea of which coordinates and types of data I will use to track the user's arm movement. For next week, I will try to define the parts of the arm that the Leap sensor will keep track of, as well as how I am going to process this data in order to make the data transmission process simple, straightforward, and less error prone.

-Finally, I helped Edgar run tests with the H-bridges, started organizing data for the software formalization assignment, and configured the new Raspberry Pi that we ordered.

Week 6:

Date: 17 February 2017
Total hours: 12
Description of design efforts:
- During week 6, I mainly focused on organizing and writing the Software Formalization assignment, which I am in charge of. Starting this assignment early not only saved me time but also pushed me and my team to think about parts of the project in a more specific way, thus clarifying and defining the different approaches that we will take in order to successfully implement the HERO bot project. Additionally, I came up with the idea of organizing the 8-bit data packets to be sent by the XBEE modules in such a way that a single packet contains all the information needed to move either the robotic arm or the rover. This process is explained in more detail in Figure 6.1.

Figure 6.1 Data packets to be sent by XBEE modules


-Additionally, I discussed with my team the possibility of changing our joystick model to a "Wii Nunchuck". The reason for this is merely a presentation matter: the previous model was too small, and I thought that the proposed solution would be more comfortable for the user. Regarding the software needed to communicate between the Nunchuck and the Raspberry Pi, I found a great example of such a connection online and also did some research on the needed Python libraries, so I am confident that it will work without any inconvenience.

-Finally, I continued working with the Leap sensor. I think I am very close to finding a way to obtain the angles needed to control the servo motors in the robotic arm. I have decided that I will work with the "hand.position" method in the "on_frame" attribute. I still need to do some research to decide whether I will use an inverse kinematics approach to find the needed angles from the position of the hand, or try to use attributes already defined in the Python library. Unfortunately, the shipment of the robotic arm has been delayed for another week, so I can't test my program; however, I will try to come up with several different approaches in order to have different options available, thus speeding up the testing process. Figure 6.2 shows an example of how attributes of the Leap libraries can be very useful: I found a way to obtain a stabilized position of the hand such that small involuntary movements are not taken into consideration. Also, I restricted the z values to always be positive.

Figure 6.2 Stabilized position of the hand
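
A sketch of the two ideas combined (clamping is one way to keep z positive; my actual handling may differ):

    import Leap

    def stabilized_xyz(frame):
        for hand in frame.hands:
            pos = hand.stabilized_palm_position   # filtered to suppress small jitters
            z = max(pos.z, 0.0)                   # restrict z to positive values
            print("x %.1f  y %.1f  z %.1f" % (pos.x, pos.y, z))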


Furthermore, I found a way to have the Leap sensor read only the thumb and index fingers in order to control the claw. Figure 6.3 shows the positions of both fingers, as well as the distance between them.
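
A sketch of how that distance can be obtained (assumes both fingers are currently detected):

    import Leap

    def thumb_index_distance(hand):
        thumb = hand.fingers.finger_type(Leap.Finger.TYPE_THUMB)[0]
        index = hand.fingers.finger_type(Leap.Finger.TYPE_INDEX)[0]
        # Distance in millimeters between the two stabilized fingertips
        return thumb.stabilized_tip_position.distance_to(index.stabilized_tip_position)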

Week 7:

Date: 24 February 2017
Total hours: 11
Description of design efforts:
- During week 7, I finished writing the Software Formalization. While doing so, I made very good progress specifying the steps needed for the software implementation, as well as defining the different approaches that we will take in order to debug the different components of the HERO bot project. Additionally, I designed the flow charts needed for the different parts of the programs to be implemented; they will be very helpful in the coding and debugging steps. The software structure for the "receiving data" step of the microcontroller is explained in Figure 7.1.

Figure 7.1 Receiving data process


-Additionally, I managed to find the values of 4 different angles given the xyz hand coordinates identified by the Leap sensor. The base angle was found by taking the arctangent of the x value divided by the z value. The angles for the shoulder and elbow were found by applying an inverse kinematics function based on the y and z coordinates. Finally, the angle for the claw was found by subtracting the distance between the two fingers from 120. Figure 7.2 shows the output of the program with the four different angles, as well as the distance from the thumb to the index finger used for the claw.

Figure 7.2 Output for finding angles script
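
The sketch below follows the description above. The base and claw formulas are as stated; the shoulder/elbow computation shown is a standard two-link law-of-cosines solution with assumed link lengths, since I have not settled on the final inverse kinematics function yet.

    import math

    L1 = L2 = 105.0                      # assumed link lengths in mm

    def base_angle(x, z):
        return math.degrees(math.atan2(x, z))

    def shoulder_elbow_angles(y, z):
        # Standard two-link planar IK (elbow-down solution); an assumption,
        # not necessarily the exact function used in my script.
        d2 = y * y + z * z
        cos_elbow = (d2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
        cos_elbow = max(-1.0, min(1.0, cos_elbow))   # guard against rounding
        elbow = math.acos(cos_elbow)
        shoulder = math.atan2(y, z) - math.atan2(
            L2 * math.sin(elbow), L1 + L2 * math.cos(elbow))
        return math.degrees(shoulder), math.degrees(elbow)

    def claw_angle(finger_distance):
        return 120 - finger_distance     # distance between thumb and index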


Unfortunately, I cannot yet guarantee the functionality and accuracy of the obtained angles, since our robotic arm has still not arrived. However, I expect to have the arm sometime next week so I can test the obtained results. In the meantime, I will look into other ways to find the desired angles in order to have a backup plan in case the first approach doesn't work.

-For next week, I will be preparing for the midterm design review, and I will also work with the Nunchuck in order to perform a final test of the "trace back" function that I wrote a few weeks ago. I will also start working with Edgar and Daniel on programming the microcontroller.

Week 9:

Date: 10 March 2017
Total hours: 10
Description of design efforts:
-During week 9, Edgar and I assembled the robotic arm. Furthermore, I tested all the servo motors in the robotic arm using example code and an Arduino board that came with the arm kit. Once I was sure of the proper functionality of the robotic arm, I proceeded to write code in the Arduino IDE in order to familiarize myself with the characteristics and properties of the motors. Finally, I completed the structure of a Python script that will communicate with the Arduino over serial, so that I can test the angles obtained by the Python script that I developed two weeks ago. Figure 9.1 shows a picture of the robotic arm.

Figure 9.1 Robotic arm
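
A sketch of the serial link (the port name, baud rate, and message framing are assumptions that depend on the Arduino sketch on the other end):

    import serial  # pyserial

    arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1)

    def send_angle(servo_id, angle):
        # Hypothetical plain-text framing, e.g. "2:90\n" -> servo 2 to 90 degrees
        arduino.write("%d:%d\n" % (servo_id, angle))

    send_angle(0, 90)  # center the base servo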


-Additionally, I helped Daniel test the H-bridges that drive the motors, and after several testing and debugging sessions we realized that the error was not in the circuit but was caused by the wires that we were using. By next week, I will have tested the output of the arm-controller Python script, and I will have implemented a function that finds the values of the 2 remaining angles: the wrist rotation and the claw rotation.

Week 11:

Date: 24 March 2017
Total hours: 10
Description of design efforts:
-During week 11, I discovered an error in my Python script regarding the shoulder and elbow angles. I realized that the angles produced by the script ranged from 90 to 45, and I needed them to range from 90 to 0. Thanks to Alex's advice, I came up with the solution of subtracting 45 from the output and then multiplying the result by two, thus obtaining the angles in the desired range.
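
The fix is a one-line remap:

    def remap(angle):
        # Maps the script's 45..90 output onto the desired 0..90 range:
        # remap(45) -> 0, remap(67.5) -> 45, remap(90) -> 90
        return (angle - 45) * 2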

Furthermore, I was able to successfully obtain the wrist angle. The only missing piece for this part of the project is the claw rotation angle, which will be developed in the upcoming week. I also fully implemented the communication between the Leap sensor and the Raspberry Pi. Figure 11.1 shows the Python script running on the computer, and Figure 11.2 shows how the received data is parsed on the Raspberry Pi in order to print the resulting angles after manipulating the hand coordinates obtained from the Leap sensor.

Figure 11.1 Output from script running on pc


Figure 11.2 Output from script running on RPi


-Finally, I worked with Alex on writing a master Python script that incorporates different functions and scripts written in the past few weeks, such as:
1. TCP client script: Establishes the connection with the TCP server using the socket library.
2. LCD script: Makes the process of writing to the LCD screen more straightforward.
3. Path script: Saves the Nunchuck's instructions in a text file and reverses them for future use.
4. Nunchuck script: Interprets and converts the data from the Nunchuck into instructions.
5. XBEE script: Initializes and configures the transmission of packets.
6. Other scripts and functions.
Additionally, we built the packets to be sent via the XBEE modules for both the rover and the arm. We now consider ourselves ready to help Daniel program the microcontroller in this upcoming week, so that we can finish setting up the different parts of the project and start testing them separately.
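
A skeleton sketch of how the master script ties these pieces together; the module and function names are hypothetical stand-ins for the actual scripts listed above.

    import tcp_client, lcd_screen, path_tracker, nunchuck, xbee_radio

    def main():
        tcp_client.connect()                   # TCP client script: link to the server
        xbee = xbee_radio.init()               # XBEE script
        lcd_screen.show("HERO bot ready")      # LCD script
        while True:
            instruction = nunchuck.read()      # Nunchuck script
            path_tracker.record(instruction)   # Path script
            xbee.send(instruction)             # transmit the packet

    if __name__ == "__main__":
        main()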

Week 12:

Date: 31 March 2017
Total hours: 12
Description of design efforts:
-During week 12, I designed and coded the main structure of the C program in charge of decoding the hex character received by the XBEE module on the STM microcontroller. The program masks the third bit of the hex character in order to classify the instruction as intended for either the robotic arm or the rover. If the third bit is a 1, the last four bits are masked to obtain an integer which is used as an index into a predefined array of structures where the 8 possible movement instructions for the rover are defined. Then, PWM values and control bits for the H-bridges are assigned by accessing the members of the structure. Figure 12.1 shows the output of the program when the microcontroller has received the hexadecimal number 0x70, which means that the rover should move forward.

Figure 12.1 Instruction for rover


On the other hand, if the third bit of the received hex character is a 0, the last four bits are masked to obtain the motor to be controlled, as well as the direction in which it will move (the latter encoded in the least significant bit). The motors are initially set to a starting position, and depending on the direction the motor should move, an integer 1 is added to or subtracted from the current value, and the corresponding PWM is then set for the motor. Figure 12.2 shows the output of the program when the microcontroller has received the hexadecimal number 0x42, which means that motor 1 of the robotic arm should move in the upward direction.

Figure 12.2 Instruction for arm
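
The decoder itself is written in C for the STM microcontroller; the Python sketch below mirrors the masking logic described above. Reading the "third bit" as bit 5 (mask 0x20), counted from the most significant bit, is an assumption on my part, but it is consistent with both examples.

    ROVER_FLAG = 0x20                         # assumed: "third bit" from the MSB

    def decode(byte):
        if byte & ROVER_FLAG:                 # rover instruction
            return ("rover", byte & 0x0F)     # index into the 8-entry movement array
        low = byte & 0x0F                     # robotic arm instruction
        return ("arm", low >> 1, low & 0x01)  # motor number, direction (LSB)

    print(decode(0x70))   # -> ('rover', 0): move forward, per Figure 12.1
    print(decode(0x42))   # -> ('arm', 1, 0): motor 1, upward, per Figure 12.2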


-Finally, I worked alongside Alex to design and write the main structure of the C program for the microcontroller. The code specified above will be written in separate .c and .h files in order to keep the main function clean and easy to read. Thanks to Daniel's progress with all the initializations needed for the microcontroller, Alex and I will be able to test the rover and arm functionality in the upcoming week.

Week 13:

Date: 07 April 2017
Total hours: 15
Description of design efforts:
-During week 13, I worked alongside Alex on the development, testing, and debugging of the radio-frequency control for the rover and of the traceback function. I had written the code for this part of the project several weeks ago, so I just had to include the function in the Python script on the Raspberry Pi. The first part of the function starts saving the instructions sent to the rover in a '.txt' file stored in memory when the user presses the 'c' button on the Nunchuck. The second part of the function is called when the user presses the 'c' button a second time. In that case, the script reads the previously stored '.txt' file and reverses each of the instructions, such that the rover follows the same path backwards. Finally, these reversed instructions are saved into a temporary string that is sent to the microcontroller by the XBEE modules. Figures 13.1 and 13.2 show the output and results obtained from testing the traceback function.

Figure 13.1 Code output


Figure 13.2 Traceback results
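
A sketch of the function's two parts; the instruction names and the opposite-of mapping are placeholders for the actual encoding.

    OPPOSITE = {"forward": "backward", "backward": "forward",
                "left": "right", "right": "left"}

    def record(instruction, path_file="path.txt"):
        with open(path_file, "a") as f:        # first 'c' press: start recording
            f.write(instruction + "\n")

    def traceback(path_file="path.txt"):
        with open(path_file) as f:             # second 'c' press: replay in reverse
            steps = [line.strip() for line in f if line.strip()]
        reversed_steps = [OPPOSITE[s] for s in reversed(steps)]
        return " ".join(reversed_steps)        # temporary string sent over the XBEEs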


-Furthermore, Alex and I started the testing process for the robotic arm. We realized that the approach we had in mind (specified in previous progress reports) would be neither accurate nor totally feasible. Thus, we decided that, instead of instructions, the Raspberry Pi will send the actual angle values for each motor to the microcontroller. These values will be encoded as their character representations in the ASCII table. In order to avoid errors caused by values from 0 to 31, an offset of 32 will be added to the angle values. Finally, the microcontroller will assign the respective PWM to each motor depending on the angle obtained by subtracting 32 from the value of the received character. Figure 13.3 shows the results obtained so far in our attempts to move the robotic arm.

Figure 13.3 Testing robotic arm
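
A sketch of the encoding on both ends (the microcontroller side is actually in C; this shows only the arithmetic):

    def encode_angle(angle):        # Raspberry Pi side
        return chr(angle + 32)      # offset keeps values out of the 0-31 range

    def decode_angle(char):         # microcontroller side
        return ord(char) - 32

    assert decode_angle(encode_angle(45)) == 45   # e.g. 45 -> 'M' -> 45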


The movement is neither smooth nor totally accurate for two reasons. The first is that, by the time the video was taken, the robotic arm had been screwed on in the opposite direction, thus giving the wrong results for right and left. The second is that the robotic arm will follow the user's arm movement more smoothly once the delay placed on the buffer is reduced and more calibration is performed.

We have made some serious progress during this week. For next week, I will work on the user manual, and as a team we will work on implementing the ultrasonic sensors and the battery monitor, so that we can have a functional prototype next week.

Week 14:

Date: 14 April 2017
Total hours: 13
Description of design efforts:
-During week 14, I fixed minor errors in the code on the Raspberry Pi, and also designed and implemented the logic for interpreting the data received from the ultrasonic sensors and displaying the corresponding messages on the LCD screen. In order to optimize the code and avoid writing several lines of if statements, I came up with the idea of using a dictionary of lists in which the values for the sensors are predefined. Thus, the script only needs to read the received character and display the corresponding message for the user on the LCD. Figure 14.1 shows the main idea of the code.

Figure 14.1 Code for ultrasonic sensors
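
A sketch of the idea; the characters and messages are hypothetical stand-ins for the predefined values shown in Figure 14.1.

    MESSAGES = {
        "f": ["Edge ahead!", "Rover stopped"],
        "b": ["Edge behind!", "Rover stopped"],
        "k": ["Path clear", ""],
    }

    def handle_sensor_char(char, lcd):
        lines = MESSAGES.get(char)
        if lines:                    # one lookup replaces a chain of if statements
            lcd.write(lines[0], lines[1])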


Furthermore, I worked alongside Alex to make the movement of the arm smoother. We accomplished this, and were thus able to check off three of the five PSSCs. I hope that next week we will be able to check off the remaining two and keep working on the final packaging in order to be ready to present our final prototype.

-Finally, I wrote the user manual, which was the last assignment for this semester. I was very careful and thorough in writing it, so I believe that it will be very helpful for future users.

Week 15:

Date: 21 April 2017
Total hours: 6
Description of design efforts:
-Thanks to the great progress that we have made throughout the semester, we were able to check off the last two PSSCs for our project. This let me devote most of my time to three exams and projects for other classes that were due this week.

Nevertheless, I worked with Alex on debugging the implementation of the ultrasonic sensors in order to make the rover stop when it is close to falling off an edge. We accomplished this by creating flags that turn on when the ultrasonic sensors detect a height below the rover greater than the specified threshold. These flags are used in if statements placed in the function in charge of sending the 'move forward' or 'move backward' instructions to the rover. Furthermore, the same implementation was also used to turn on LEDs in the controller kit in order to warn the user when the rover is close to falling off an edge.
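
A sketch of the gating logic; the names and the threshold value are placeholders.

    EDGE_THRESHOLD = 10.0   # assumed clearance (cm) below the rover

    front_edge = rear_edge = False

    def update_flags(front_height, rear_height):
        global front_edge, rear_edge
        front_edge = front_height > EDGE_THRESHOLD   # flag turns on near an edge
        rear_edge = rear_height > EDGE_THRESHOLD

    def send_move(instruction, xbee):
        if instruction == "forward" and front_edge:
            return              # refuse to drive off the edge ahead
        if instruction == "backward" and rear_edge:
            return
        xbee.send(instruction)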

-We are planning on finishing the packaging over the weekend in order to have a fully functional final prototype next week. We are also looking forward to presenting the HERO bot at the 'Spark Challenge' competition this upcoming Friday.

Week 16:

Date: 28 April 2017
Total hours: 17
Description of design efforts:
-During this final week, we successfully checked off all the remaining PSSCs after long debugging sessions. After several hours of repeatedly losing packets in the RF communications, we realized that the reason for this misbehavior was not the software implementation but a hardware issue: it turned out that one of the XBEE breakout boards was not working properly, so we took the XBEE module off the breakout board and adapted it inside the rover. After doing so, I helped with the calibration of the traceback function and fixed bugs on the Raspberry Pi that improved the overall speed of the project.

We thought we were done with everything; however, when we tried to record the video for our final presentation, we realized that, when put together, the entire project behaved in an unpredictable and unreliable way. At first, we thought the XBEE modules were not working properly, but since we had just dealt with them, we started looking at the code and at how the packaging might interfere with the data communication.

After long and disappointing hours of trying to debug the project, we realized that all the unexpected behavior came from the fact that the XBEE was still getting some interference from all the wires present in the controller kit. Thus, we moved the XBEE out of the controller kit and placed it where it would not suffer from any interference and would also look professional.

Finally, we have successfully finished our project, "HERO bot", and have not only completed all the PSSCs that we set as goals at the beginning of the semester, but also made improvements to them in order to make the project look better and run more efficiently. The video showing the functionality of the HERO bot project will be recorded and uploaded soon.