Progress Report/Engineering Project Journal for Jakub Kowalski

Week 15


Date Reported: 4/18/2024
Start Time: 12:00 PM
Work Time: 2 Hours

Quality Testing
Today I spent some time doing user testing, playing around with the thresholds for the distortion, cleaning up the code, and making sure all the effects work properly prior to tomorrow's demo. I mainly worked on the distortion effect. The following is the final implementation of the distortion effect that I settled on:

distort_final


____________________________________________________________________________________________________________________



Date Reported: 4/17/2024
Start Time: 11:00 AM
Work Time: 5 Hours

Delay Fix
I was able to fix the delay; it turns out my issue was incorrectly accessing the delay line samples. In the algorithm we increment the index of the current delay line sample, and instead of using a prefix increment, I used a postfix increment, which was incorrect. Making this simple change made the delay work correctly. However, I noticed that when feedback was set to 1.0, the signal became very loud. A fix for this was setting the maximum allowable feedback to 0.7.

delay_fix
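As a sketch of the fix (with hypothetical names and a much shorter buffer than the real one), the index must be advanced before the current delay-line slot is read and written, and the feedback is clamped below 1.0 so the loop gain cannot grow without bound:

```c
#include <stddef.h>

#define DELAY_LINE_LEN 8   /* tiny for illustration; the real line is much longer */

static float delay_line[DELAY_LINE_LEN];
static size_t delay_idx = 0;
static float feedback = 0.5f;

/* Keep feedback below 1.0 so repeats decay instead of growing louder. */
void delay_set_feedback(float fb)
{
    const float max_fb = 0.7f;
    feedback = (fb > max_fb) ? max_fb : fb;
}

/* Process one sample. The index is advanced BEFORE accessing the current
 * slot (the prefix-style update); advancing it afterwards reads the slot
 * that was just overwritten, which was the original bug. */
float delay_process(float in)
{
    delay_idx = (delay_idx + 1) % DELAY_LINE_LEN;     /* advance first */
    float delayed = delay_line[delay_idx];            /* oldest stored sample */
    delay_line[delay_idx] = in + feedback * delayed;  /* write back with feedback */
    return 0.5f * in + 0.5f * delayed;                /* simple dry/wet mix */
}
```

Feeding an impulse through delay_process() makes the echo appear DELAY_LINE_LEN samples later, each repeat quieter than the last.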

TouchGFX Bug
We experienced a bug with TouchGFX where changing screens sometimes resulted in an assert error. The assert checked the state of the block, and it was failing because the state was 'EMPTY' instead of 'DRAWN'. I looked up a post about this issue, but it was from 2021 and described a logic bug on TouchGFX's side which was supposedly fixed: https://community.st.com/t5/stm32-mcus-touchgfx-and-gui/assert-state-nextsendingblock-drawn/td-p/222896. I decided to comment out the assert and run the program to see if it would work, which it did. I haven't experienced any errors since, so I left it as is. Josh and I also did some tests where we rapidly pressed the buttons, touched the screen, and used the encoders to see if there were any issues with the UI keeping up with these changes, but as far as we could tell, it was working just fine.

Filter Verification
I also wanted to verify the filters on an oscilloscope to see that they are actually working properly. I noticed that when you go into the higher frequencies it's very hard to hear any filtering being done, and when you filter in the mid range it sounds like you're filtering the high range. I displayed the frequency on the oscilloscope and noticed the issue: for some reason, the frequency being filtered was twice the one being input. I could not figure out why this was happening after reviewing my code, so I decided to work around it by halving the input frequency when finding the filter coefficients. Doing so, I was able to get a proper response. I ran a small demo with an input sinusoid of a specified frequency and demonstrated that I can vary its volume when I set the filter in its range:




____________________________________________________________________________________________________________________



Date Reported: 4/16/2024
Start Time: 12:00 PM
Work Time: 3 Hours


Button Responsiveness
Today Josh pointed out that I was not using the button debouncing code he implemented. After he showed me how to use it, we got it set up and saw some improvement in button responsiveness; however, presses were still not detected every time. Later in the day, after Josh had left, I played around with the confidence values for the button debouncing and found a good value where button presses were detected every time.



Parameter Limit Detection
I also implemented parameter limits. Early on, whenever we reached the maximum parameter value and continued to turn the knob, the value would restart back at zero, and vice versa. Josh and I discussed this issue and had some ideas on how to fix it. The end solution was to use a higher count limit for the encoders than needed. For example, we would use 500 count increments, but set the limit to 600. We then initialized 50 as the minimum count value and 550 as the max. Whenever the count reached either of these limits or went past them, we set the count back to the corresponding limit so it could not go further. This way, we could cap the parameters at their maximum allowed values and you could keep turning the knob without surpassing that limit. This also fixed the awkward pops caused by drastically changing the parameter values from min to max or vice versa.

param_limits
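A minimal sketch of this clamping scheme (the counts mirror the 50/550 example above; the function names are made up):

```c
#include <stdint.h>

/* The timer counts over a range wider than the usable parameter range so
 * the count can be clamped before it ever wraps around. */
#define ENC_COUNT_MIN   50u
#define ENC_COUNT_MAX  550u

/* Clamp the raw timer count into [ENC_COUNT_MIN, ENC_COUNT_MAX], so
 * continued turning holds the parameter at its limit instead of wrapping
 * from max to zero (which caused the audible pops). */
uint32_t encoder_clamp(uint32_t count)
{
    if (count < ENC_COUNT_MIN) return ENC_COUNT_MIN;
    if (count > ENC_COUNT_MAX) return ENC_COUNT_MAX;
    return count;
}

/* Map the clamped count onto the 0..500 increment range. */
uint32_t encoder_to_param(uint32_t count)
{
    return encoder_clamp(count) - ENC_COUNT_MIN;
}
```

In the real code the clamped value would also be written back into the timer's counter register so the hardware count itself never drifts past the limits.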


____________________________________________________________________________________________________________________



Date Reported: 4/15/2024
Start Time: 1:30 PM
Work Time: 5 Hours

DSP FX Review and Debugging
Today I spent time reviewing all the DSP algorithms we had implemented. I found some issues with Josh's distortion implementation in how the distortion thresholds were determined, and I had some suggestions on changing which parameters we will modify. I also tested the delay algorithm and asked Shubo to help me determine its issues. I focused on fixing the filters so that we could run them simultaneously while also being able to change their parameters. I fixed my issues after stumbling across an error in my UI logic: whenever we updated the parameters for the filters, we went through a series of switch cases which never called 'break' after handling the case. I also fixed how we calculated parameter values based on the counter of the encoder timer, implementing some macros to convert the counters to parameter values in a clean and easy-to-understand fashion. After implementing these changes, I was able to run all five filters with no issues.
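Roughly, the two fixes look like this (names, ranges, and the counter period are illustrative, not our exact values): each switch case now ends in break, and a macro maps the timer count linearly into a parameter's range:

```c
#define COUNTER_MAX 119.0f

/* Map an encoder count (0..COUNTER_MAX) linearly into [min, max]. */
#define COUNT_TO_PARAM(count, min, max) \
    ((min) + ((float)(count) / COUNTER_MAX) * ((max) - (min)))

typedef enum { PARAM_FREQ, PARAM_GAIN, PARAM_Q } ParamId;

static float freq, gain, q;

void update_param(ParamId id, unsigned count)
{
    switch (id) {
    case PARAM_FREQ:
        freq = COUNT_TO_PARAM(count, 20.0f, 20000.0f);
        break;   /* the original bug: missing breaks let every later case run too */
    case PARAM_GAIN:
        gain = COUNT_TO_PARAM(count, -12.0f, 12.0f);
        break;
    case PARAM_Q:
        q = COUNT_TO_PARAM(count, 0.1f, 10.0f);
        break;
    }
}
```

With the fall-through, one encoder turn silently rewrote every parameter after the selected one, which is why the filters misbehaved when run together.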

When I was done with the filters, I took a look at Josh's new code for handling the distortion. I tested it on my headphones and did not notice any distortion no matter how much I boosted the preamp for the input signal. The issue lay in the fact that our thresholds were set too high: our input samples were always going to be in the range [-1, 1] since we scaled them down before processing. I adjusted the thresholds to more reasonable values to see if it would make a difference, and I was able to produce a distorted signal. I figured we could play around more with the algorithm to create a better sound.
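The idea can be sketched as a hard clipper whose threshold sits inside the normalized [-1, 1] range (the 0.3 value here is illustrative, not the one we settled on; a threshold above 1.0 simply never triggers):

```c
/* Threshold must be inside the normalized sample range [-1, 1]. */
#define DIST_THRESH 0.3f

float distort(float in)
{
    if (in >  DIST_THRESH) return  DIST_THRESH;  /* clip positive peaks */
    if (in < -DIST_THRESH) return -DIST_THRESH;  /* clip negative peaks */
    return in;                                   /* pass small samples through */
}
```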

New UI and Button Navigation
I also implemented a new UI along with the necessary logic to handle changing effects and display them on the screen. The buttons still need some work to be more responsive:




____________________________________________________________________________________________________________________



Date Reported: 4/14/2024
Start Time: 7:00 AM
Work Time: 5 Hours

AudioFX Final Refactoring
This will be the third time I have refactored the AudioFX library, and hopefully my last. I stripped down the code to avoid using the AudioFX_Chain and AudioFX_UserParams structs I had defined to keep track of the effects chain and hold pointers to user parameters and update flags, respectively. My new implementation of applying the effects mirrored some examples I found online, which took into account processing for stereo signals:

audiofx_apply_final

In this implementation I avoided using the chain struct and instead hardcoded the effect functions being called in the order I wanted. I declared each effect's parameters as an external variable so I can access them from main.c. Each effect's output is fed into the next effect's input via the left_in and right_in variables.
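The shape of this hardcoded chain is roughly the following, with trivial stand-in effects in place of the real filter/delay/distortion functions (names are illustrative):

```c
#include <stddef.h>

static float apply_gain(float s)   { return 0.5f * s; }  /* stand-in effect 1 */
static float apply_invert(float s) { return -s; }        /* stand-in effect 2 */

/* Hardcoded chain: each effect's output becomes the next effect's input
 * via the running left_in/right_in variables, per sample. */
void AudioFX_Apply(const float *in_l, const float *in_r,
                   float *out_l, float *out_r, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        float left_in  = in_l[i];
        float right_in = in_r[i];

        left_in  = apply_gain(left_in);      /* effect 1 */
        right_in = apply_gain(right_in);

        left_in  = apply_invert(left_in);    /* effect 2 */
        right_in = apply_invert(right_in);

        out_l[i] = left_in;
        out_r[i] = right_in;
    }
}
```

The order of the calls inside the loop is the effects chain; reordering the chain now means reordering lines of code rather than walking a struct.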

The only other two functions I kept in the AudioFX library were SwitchFX() and SetParams(), which I modified to accommodate the removal of the two structs. Previously the chain struct held a variable which tracked the current effect being modified; now it is just a global variable which is checked in the SwitchFX function to determine the next FX to switch to. Outside of AudioFX, I created independent initialization functions for each effect which I call from main.c during initialization:

fx_init

That's pretty much it, and honestly this should have been the implementation from the beginning, since it is much more straightforward and avoids some of the memory errors I experienced with the previous renditions. When I tested the audio with the effects, I noticed that I still had persistent issues with running more than one filter at once. The distortion function seemed to be working fine, but the delay did not work as expected: it caused the signal to have noise and popping, as if the signal was not being processed fast enough. This was confusing to me since the delay function technically requires fewer operations, such as multiplications, compared to the filter function:

filter_function
The filter function accesses past inputs/outputs and multiplies them by 6 calculated coefficients.
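For reference, a per-sample direct-form-I biquad step of the kind described might look like this (illustrative names; our implementation differs in details):

```c
typedef struct {
    float b0, b1, b2, a1, a2;   /* coefficients (a0 normalized to 1) */
    float x1, x2, y1, y2;       /* two past inputs and two past outputs */
} BiquadState;

/* One filter step: five multiplies over the current input, the input
 * history, and the output history, then shift the history. */
float biquad_process(BiquadState *f, float x)
{
    float y = f->b0 * x + f->b1 * f->x1 + f->b2 * f->x2
            - f->a1 * f->y1 - f->a2 * f->y2;
    f->x2 = f->x1;  f->x1 = x;   /* shift input history */
    f->y2 = f->y1;  f->y1 = y;   /* shift output history */
    return y;
}
```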

delay_function2
The delay function updates the delay line index and returns a mix of the input sample and the delayed sample.

I was not sure why this was happening, so I started looking into possible reasons why it was not running efficiently.

Project Optimization
One idea I had was that the clock speed was not high enough. We normally had the system clock set to about 72 MHz from when I was originally experiencing hardfaults with a higher clock setting. Later I found that this was no longer an issue, so I currently had SYSCLK at 108 MHz. However, I increased the clock to 216 MHz to maximize it. After running the code, I was able to cascade about 3 filters, but the response from the LCD was slower than typical since more time was spent on processing the data. This was good news, since it meant we technically had more headroom for completing the DSP than before. The delay still did not work even when it was the only effect being applied, which baffled me.

sysclk_max

Another tip I found was to set compiler optimization to an appropriate value. I went into the compiler settings and set it to speed optimization.

compiler_optimization

After doing this, I was able to run 5 filters cascaded together. The delay was finally able to run by itself; however, I was testing it with parameter values which essentially made the final output equal the original input. When I tested it with values which would create a delay sound, the audio was once again jittery as before. I also tried to run the filters while adjusting one of their parameters, and at a certain point I started hearing popping and a loud noise. I will try to find the root of this tomorrow.

Being so close to the deadline, I'm worried we won't be able to apply all three effects simultaneously since I'm having issues implementing them individually. My last hope is that I can at least optimize the filters further using the CMSIS library, which has its own filter implementations; however, I'm not sure about the delay.


____________________________________________________________________________________________________________________



Date Reported: 4/13/2024
Start Time: 9:00 AM
Work Time: 1 Hour

Hardfault Debugging
Today I went into lab to test my refactored AudioFX library along with the new delay and distortion effects, and to see whether the cascaded filters would work seamlessly. I quickly ran into issues where I was experiencing hardfaults during initialization of the effect parameters. The issue existed within my logic for adding the delay function parameters into the AudioFX chain struct. I tried running the code with the filters first, and they had not improved. When I ran it with the delay function, I got another hardfault due to accessing an array with an out-of-bounds index. This didn't make sense, so I stepped through the debugger and noticed that, for some reason, initializing the structs I had defined for my filter functions would somehow modify the array index variable inside my delay struct. I was not sure why this was happening, but at this point I decided that the AudioFX library needed to be reduced even further. It still added too much overhead and was more complex than it needed to be. My original idea behind it was to be able to dynamically add and remove effects from the chain, but at this point it would just be better to hardcode the effects being applied.


____________________________________________________________________________________________________________________


Week 14


Date Reported: 4/11/2024
Start Time: 4:00 PM
Work Time: 2.5 Hours

AudioFX Refactoring
Today I refactored the AudioFX library for the second time. I was happy with the results, as the code looked much cleaner and was more space efficient: we stored all the FX-specific parameters in their own structs rather than using a general struct for all the FX, which caused some variables to go unused depending on which FX was assigned to it. My new implementation had a general FX struct holding some universal variables that all the FX need, plus a pointer to the FX-specific parameter struct with the variables unique to each effect. I also redid how the variables controlled by the encoders behaved. Earlier they directly stored the value of a certain parameter such as frequency, gain, etc.; now they just store a percentage value which is used to scale the corresponding parameter stored in the FX struct.

New EQ Implementation
I watched the video from Phil's Lab to follow his EQ implementation. Overall it was very similar to mine, apart from how he calculated some of the coefficients. In his video I noticed he applied multiple FX to each sample, whereas I applied each effect to the entire buffer before moving on to the next. This worked fine before, since I was testing individual filters and only had to traverse the buffer once. But now we are essentially traversing the entire input buffer five times, once per band. I need to change this so that each band processes a single sample instead; we can then traverse the array from the Apply_FX_Chain function.

current_updating_audiofx

Old implementation.

updated_apply_audioFX

New implementation.

The new implementation will update all the parameters prior to entering the processing stage. Once in the processing stage, we pass in a single sample to each effect. Each effect's output is set as the new input to the next effect. Once all the effects are done processing the sample, we store it in the output buffer and move on to the next sample.


____________________________________________________________________________________________________________________



Date Reported: 4/10/2024
Start Time: 3:00 PM
Work Time: 5 Hours

Man Lab
Today in man lab I discussed my findings regarding the filters with the team, and we also discussed some possible issues with the codec. During our meeting with the course staff we got our final two PSDR's checked off and identified that our two pressing issues were getting the codec to work and implementing all the software for the effects. All we had was the filters, which didn't work simultaneously, and Josh and Liam did not have much progress on the delay and distortion effects. We decided that Liam would focus on the codec instead since he had some concerns with the soldering, and Josh and I would focus on implementing distortion and delay, respectively.

Delay Implementation
For the delay algorithm, I found an algorithm online which was used for an embedded device: https://wiki.analog.com/resources/tools-software/sharc-audio-module/baremetal/delay-effect-tutorial

I followed the implementation while adjusting it for our specific device. I avoided using separate left and right output buffers and instead just used one. When I was done, I tried to test whether it would work. While running, I noticed that the delay effect specifically would cause a hardfault at seemingly different parts of its code. This did not happen when running other effects such as the lowpass filter, so I think something must be wrong with my implementation of the delay.

Codec Testing
After Liam redid some of the soldering on the codec, we ran my code to test whether it would work correctly. The codec successfully initialized and the audio came through correctly. I also ran the audio through a lowpass filter to verify that we can simultaneously update parameters, display the UI, and run the codec. This was successful, and a relief, as we were worried that we wouldn't be able to get the audio to work on our final board.

Distortion Integration
Josh implemented a distortion algorithm he found online and sent it to me for integration. I took his code and integrated it with the rest of the AudioFX library. One thing I noticed is that I shouldn't have gotten rid of the FX-specific structs when I refactored the AudioFX library a couple of days ago. When Josh was trying to implement his effect, he had some issues understanding the flow of the software and where to store his parameter variables. I will fix this tomorrow to make the library easier to understand.


____________________________________________________________________________________________________________________



Date Reported: 4/9/2024
Start Time: 5:30 PM
Work Time: 0.5 Hour

Cascaded Filter Testing
Today I came to lab to test the current performance of the cascaded filters I am using for the parametric EQ. When running the code, I noticed the output audio had a significant buzzing noise. I found that the source of this was the highpass filter. I had not noticed this issue earlier because the highpass filter had static parameters which were high enough to filter the buzzing out. I then tested the performance of just one lowpass filter cascaded with one peaking filter to see how it would sound. It was almost as if samples were missing from the output. Since something could be wrong with my implementation, I began to look for other ways I could create the EQ.

Online I found two alternatives for creating an EQ with multiple bands. One was to use several bandpass filters and combine their outputs. The other, from Phil's Lab, was similar to what I already had: several cascaded peaking filters. I wanted to try following his implementation first since it could build off my existing code.

Link: https://www.youtube.com/watch?v=4o-_gUht_Xc


____________________________________________________________________________________________________________________



Date Reported: 4/6/2024
Start Time: 3:00 PM
Work Time: 6 Hours

LCD and Codec Integration
Today I came into lab with Josh and Liam to work on integrating the LCD and codec onto the PCB. I asked Josh to flash his code onto the board to see if it would run without any hardfaults. He had a different clock configuration than I did, so I guessed that could have been the issue I had yesterday. His code ran correctly without fault, so I changed my configuration to match his, and then my code ran without fault as well. We helped Liam inspect the I2C lines communicating with the codec to identify any possible issues with transmitting commands to it; however, we were able to read data being sent on the I2C lines and could not identify what the issue might be, so Liam got to soldering off the resistors for the lines I identified. He also wanted to solder off the pull-up resistors for the I2C lines, since apparently that setup worked for him during prototyping.

In the meantime, I began to design a GUI for the EQ parameters using TouchGFX and implemented the logic for reading the parameter values and displaying them on the screen. Once Liam was done, I loaded the code onto our PCB and was able to successfully display the GUI on our LCD, letting us get our final two preliminary PSDR's checked off.

lcd_pcb_success

We also tried to run the codec software; however, the I2C communication was receiving an error and stopping the rest of the program from running. Liam decided he would solder the I2C lines back on and that we would continue to work on getting the codec working next week.

Multiple DSP FX Processing & GUI Navigation
At this point Liam and Josh were on their way out. Josh agreed to look into how to implement the distortion algorithm, and Liam had already begun looking into how to make the delay effect. I therefore began to work on implementing the logic for handling multiple DSP effects at the same time, as well as navigating their parameters in the GUI.

Honestly, this process took me a long time because I ended up refactoring a lot of the code I had written for DSP up to this point. My initial application was based on creating structs for the individual FX, the FX chain, and their parameters. The FX struct would hold function pointers to certain functions for each effect, such as updating the effect and applying the algorithm to the signal, as well as some variables to go along with it. All the specific effect parameters would be stored in the FX params structs, which were meant to be unique for each effect. However, I decided that for our purposes it would be better to create a general params struct used for all the effects, containing a few general variables used by every effect as well as some unique ones if need be. This would make it easier to handle logic that requires accessing these variables, since each effect would use the same struct layout.

audiofx_new

I also added logic for switching the current effect being controlled. Inside the FX chain struct, there's a pointer to the current effect's user parameters, which stores the "type" variable identifying it. This type can be incremented or decremented to find the next effect in the chain. This function is called from the while loop on a button press.

audiofx_switch
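A minimal sketch of that increment/decrement with wraparound (the enum names here are made up):

```c
typedef enum { FX_FILTER = 0, FX_DELAY, FX_DISTORTION, FX_COUNT } FXType;

/* Step the "type" id forward or backward through the chain, wrapping at
 * both ends. step is +1 or -1; FX_COUNT is added before the modulo so
 * the intermediate value never goes negative. */
FXType SwitchFX(FXType current, int step)
{
    int next = ((int)current + step + FX_COUNT) % FX_COUNT;
    return (FXType)next;
}
```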

This sets a flag which is read in the TouchGFX Model, which then calls the appropriate functions I wrote for updating the specific GUI elements. After implementing the code and fixing any compile errors, I ran it to test. The good news is that I was able to somewhat cycle between which element is being edited on the UI, but there was an issue with displaying the correct parameter values. It seems the displayed values are being overwritten each time an effect is selected, meaning I am not correctly reading the parameter values from memory and converting them to their timer counter values.

My goal for this week is to assist Liam with getting the codec to work, complete the filter effect, and assist Liam and Josh with implementing the delay and distortion effects, respectively.


____________________________________________________________________________________________________________________


Week 13


Date Reported: 4/5/2024
Start Time: 7:30 PM
Work Time: 2.5 Hours

RST, D/C, and SPI_NSS Prototype
Today I came back into lab to do some more prototyping on the LCD connections. Last time I only tested whether the connections would work with 220 Ohm resistors on the MOSI and SCK pins, but I neglected to check the RST, D/C, and NSS pins. After testing different combinations, I determined that NSS cannot have a series resistor either, since the screen stayed white when one was connected; however, the RST and D/C pins seemed to work correctly.

PCB Codec Debugging
My next task was to test the codec on the PCB. I set up all the necessary code and ran it. I noticed there was no output coming from my headphones. I ran the project in the debugger to see where the issue would arise and noticed that our code was experiencing a hardfault at random places: at first while initializing peripheral clocks, then in I2C initialization, then SAI, then in the while loop. It seemed to occur in a different place each time I checked. When we did try to initialize I2C, we got an error because the code was going into an error handler I defined. It seemed like the hardfault occurred no matter what; the only question was when.

hard_fault

I read some threads to see where the issue could exist: https://community.st.com/t5/stm32-mcus-products/hardfaults-at-random-places-in-code-on-an-stm32f722/td-p/190494

After playing around with the code, commenting out certain changes I had made since my previous working state, I was unable to determine the issue. Another error I noticed was that during initialization of the peripheral clocks we would sometimes enter an error handler. At this point I was unsure what the exact issue could be.


____________________________________________________________________________________________________________________



Date Reported: 4/4/2024
Start Time: 5:30 PM
Work Time: 3 Hours

LCD Connection Verification
Today I came to lab to verify the connections for the pins on the LCD connector. By simple inspection, I noticed the first two pins on the LCD connector were bridged together. However, by inspecting the PCB in Kicad, I saw that these two pins were not responsible for any signal we needed:

lcd_connector_pcb

During lecture, Liam suggested I do continuity tests on the pins to make sure they are correctly routed. I tested the routing for all of the pins related to communicating with the display controller: RST, D/C, SPI_NSS, SPI_SCK, and SPI_MOSI. After checking all of these, I concluded there was no issue with the routing, and none of the relevant pins were bridged. I also took the time to verify all of the connections on the EyeSPI connector, which we used as a reference for determining which pins were routed to which signal. After doing so, I verified that our pins were routed in the correct order. I also did a continuity test on the connector pins while they were connected to the EyeSPI connector (the same connector as the one on the LCD) via ribbon cable. The results also indicated that there were no connection issues between the two connectors.

eyespi_connector_cont_test

At this point I was more confident that our routing was done correctly. One thing I thought of while testing the routing was that the series resistors on the SPI lines could be preventing the signal from being read correctly. These resistors were meant for ESD protection, and we did not have any on our prototype board.

SPI_MOSI and SPI_SCK Prototype
To test my theory, I put two series 220 Ohm resistors on the SPI_MOSI and SPI_SCK lines. After doing so, I ran the program on the Nucleo board and saw that the display was all white. This might be the source of our issue. At this point I transitioned to testing the performance of the DSP while adjusting the parameters.

220_ohm_lcd_white

Prototype board with 220 Ohm resistors on SPI_MOSI and SPI_SCK.

Parameter Update Testing
I connected the codec to my audio input and output, and while playing audio and adjusting the knob, I noticed there was no more noise on the signal. The audio was filtered smoothly as I adjusted the frequency value. I still noticed a loss in resolution for frequencies below 1000 Hz, so in the filter parameter update function I added some logic to limit the lowest allowable frequency to 1200 Hz for good measure. I also played around with the counter values to determine the one which felt the most comfortable for adjusting the parameter values. Originally it was set at the max of 65535, but I set it to 119 after some trial and error.


____________________________________________________________________________________________________________________



Date Reported: 4/3/2024
Start Time: 12:30 PM
Work Time: 7 Hours

Man Lab
Today during Man Lab we met with the course staff and discussed the state of our project. I reported on our progress with integrating the software. Two key points brought up by the course staff were designing the GUI display and trying to implement the easier DSP effects first rather than focusing strictly on the more complicated algorithms.

Parameter Updating Optimization
Today I also tried to fix the issue we are having with updating the parameters for the filters. Last week I theorized that the issue could be that we access the filter coefficients before they are done being updated, leading to unexpected behavior. I implemented logic to correct this by defining temporary coefficients inside the filter parameter structure, along with a flag to indicate that the coefficients are ready to be updated. Inside the SAI DMA handler, I set up logic to check the update flag prior to applying DSP; if it is set, we update the real coefficients with the ones stored in the temporary variables.
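A sketch of this staged hand-off, with illustrative names (the real code lives in our filter parameter struct and the SAI DMA callback):

```c
#include <stdbool.h>

typedef struct {
    float coeffs[6];           /* live coefficients used by the DSP */
    float pending[6];          /* staged coefficients from the UI side */
    volatile bool update_ready;
} FilterParams;

/* UI / encoder context: stage a complete new coefficient set. */
void filter_stage_coeffs(FilterParams *p, const float new_coeffs[6])
{
    for (int i = 0; i < 6; i++) p->pending[i] = new_coeffs[i];
    p->update_ready = true;    /* set last, after all values are staged */
}

/* Audio context: called at the top of the DMA half/complete callback, so
 * a block never runs with a half-updated coefficient set. */
void filter_commit_coeffs(FilterParams *p)
{
    if (p->update_ready) {
        for (int i = 0; i < 6; i++) p->coeffs[i] = p->pending[i];
        p->update_ready = false;
    }
}
```

Setting the flag only after every value is staged is what keeps the audio side from ever seeing a partially written set.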

After running the code, I noticed the noise was still present when rotating the encoder, but it only occurred sometimes, depending on how much I turned the encoder. At this point I figured there must be some issue with how the values were being set, possibly to something unexpected. I decided to work on the TouchGFX project for the device to create a simple display of all the filter parameters.

GUI Parameter Display
For the GUI display, I created three text boxes which would display the current value of each parameter. This was done using what are known as "wildcards" in TouchGFX, which you can use to update the value of the string inside of the text. I set up all the necessary logic for communicating between the MCU and the TouchGFX process, where the TouchGFX model would fetch the parameter value every tick, and would then send it to the screen view via the presenter. After doing this, I was able to get a simple display like this:

param_display

For now, the only parameter being displayed was the frequency so that we can track its value as we listen to the filtering being done on the signal.

After running the code and rotating the encoder, I found the issue. The frequency value was being updated multiple times on one encoder turn and was reaching unexpected values after the frequency went below 0. The way the code was set up, it would check for a difference between the encoder's current timer counter and the last one recorded, and then decrease the frequency by 1 kHz each time the counter changed, regardless of direction. The issue with this was that one notch on the encoder was equivalent to updating the counter 4 times. Also, when turning the knob only slightly without reaching the notch, the counter value could bounce between two values multiple times. This meant that with the current logic, the frequency value could quickly go below zero, even on one turn, depending on how you turned it. The solution was to derive the parameter value from the encoder's timer directly. I set up the following logic in the while loop:

param_change

This implementation divides the counter value by the max counter value and multiplies the result by the maximum parameter value to derive the parameter value from the counter. This way, the parameter value cannot go outside its allowed range. After making this change, I was able to get the frequency to change within reasonable values:
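In essence, the parameter is now a pure function of the counter (the period and frequency limit here are illustrative):

```c
#include <stdint.h>

#define COUNTER_PERIOD 119.0f    /* encoder timer auto-reload value */
#define FREQ_MAX_HZ    20000.0f  /* top of the parameter's range */

/* Scale the counter into [0, FREQ_MAX_HZ]. Because the value is derived
 * from the counter each time rather than accumulated in +/-1 steps, it
 * can never run below zero or past the maximum. */
float counter_to_freq(uint32_t counter)
{
    return ((float)counter / COUNTER_PERIOD) * FREQ_MAX_HZ;
}
```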



PCB LCD Testing
At this point Liam was done soldering on the LCD connector. I wanted to possibly get our first two preliminary PSDR's checked off for interfacing with the LCD display via SPI and interacting with the screen via the touchscreen. I began porting over all the code for the LCD onto our PCB project. I ran the code and noticed that the screen was all white.

lcd_pcb_white

When trying to debug, I set a breakpoint in the initialization code to check for errors, but when running, there were none. I also set a breakpoint in the touch screen logic to check if it detects touch, or if there's some greater issue with the LCD connector in general, but when touching the screen, the touch was detected:

touch_screen_debug

I made sure to run the code on the Nucleo board to see if I could replicate the issue; however, everything was working fine. I asked Josh and Shubo to check if the PCB routing was correct for the SPI lines responsible for the LCD. Shubo was able to verify that the connections were correct. At this point I had to leave lab, but later in the day I realized there could be an issue with bridged pin connections, specifically on the RST pin, which could be causing the screen to stay white.


____________________________________________________________________________________________________________________



Date Reported: 4/2/2024
Start Time: 5:30 PM
Work Time: 0.25 Hours

Chip Erase
Josh and Liam were having some issues with programming the MCU because the ST-Link was not being found. I read some threads on using the STM32 ST-Link Utility tool to fix this. After following the steps outlined, I was able to erase the chip and verified that the device could connect successfully.


____________________________________________________________________________________________________________________


Week 12


Date Reported: 3/27/2024
Start Time: 12:30 PM
Work Time: 3 Hours

Man Lab
Today in man lab we discussed the progress of the project with course staff, and I shared my progress with the rest of the team. After the meeting, Shubo and I discussed our next steps for integrating the rotary encoder with the DSP parameters. Shubo would add in the code he wrote for the rotary encoder, and I would figure out how to handle the logic for accepting the user input and updating the values, as well as designing the UI to display the current value of the encoder.

Rotary Encoder + DSP Integration
After Shubo updated the code, I took over and decided that for now we did not need the UI to verify that the encoder works, since he added an LED driven by a PWM signal that is controlled by the counter from the encoder. For now, the implementation would be to check the encoder counter in the while loop, compare it with the last stored value, and then use the offset from that value to determine how much to change the parameter that the encoder is currently assigned to. For a simple test case, I made it so that the encoder would only modify the center frequency of the filter by 1kHz; after this is modified, the FILTERS_Update() function would be called to recalculate all the coefficients.

This is the implementation:

encoder_dsp_integration
Timer 3 is set to PWM to light up an LED to verify that the value is changing.
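The offset-based scheme can be sketched as below. This is an illustration with made-up names and limits, not our actual code; the signed-delta cast also handles timer wraparound, and in practice FILTERS_Update() would be called whenever the delta is nonzero.

```c
#include <stdint.h>

#define FREQ_STEP_HZ 1000.0f   /* 1 kHz per counter tick, per the test case */
#define FREQ_MAX_HZ  20000.0f  /* illustrative upper clamp */

/* Compare the encoder's current timer count with the last stored one and
 * step the center frequency by the signed difference, clamping the result. */
float step_center_freq(float freq, uint16_t counter, uint16_t *last_counter)
{
    /* casting the unsigned difference to int16_t yields a signed delta
     * even when the 16-bit timer counter wraps around */
    int16_t delta = (int16_t)(counter - *last_counter);
    *last_counter = counter;

    freq += (float)delta * FREQ_STEP_HZ;
    if (freq < 0.0f)        freq = 0.0f;
    if (freq > FREQ_MAX_HZ) freq = FREQ_MAX_HZ;
    return freq;   /* caller would invoke FILTERS_Update() if delta != 0 */
}
```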

After running the code, upon rotating the encoder the signal became very distorted and loud. I was not sure what exactly could cause this, but my theory is that while the coefficients are being updated, the DMA interrupts the code and runs the DSP function, which could then use the incompletely calculated coefficients and produce unexpected behavior. I am further convinced this could be the case since I ran a test case where I only incremented the dbGain by 0.1 on each rotation, which only affects the value of one coefficient to adjust the gain, and this did not result in any distorted noise. One option is to disable interrupts for the moment that the coefficients are being assigned, though possibly at the expense of continuous transmission of the signal. Alternatively, we can calculate the coefficients into temporary variables and, upon completion, set a flag which is read in the DMA callback to update the real coefficients from the temporary variables prior to DSP.
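The second option can be sketched roughly as follows. All names here are illustrative; on a single core where the DMA callback preempts the main loop, the key point is that the flag is published only after the full staging write.

```c
#include <stdbool.h>

typedef struct { float b0, b1, b2, a1, a2; } Coeffs;

static Coeffs active_coeffs;              /* read by the DSP routine        */
static Coeffs pending_coeffs;             /* written by the control loop    */
static volatile bool coeffs_ready = false;

/* Called from the main loop after the encoder changes a parameter. */
void coeffs_stage(const Coeffs *c)
{
    pending_coeffs = *c;
    coeffs_ready = true;                  /* publish after the full write   */
}

/* Called at the top of the DMA (half-)complete callback, before DSP,
 * so the filter never sees a half-updated coefficient set. */
void coeffs_commit(void)
{
    if (coeffs_ready) {
        active_coeffs = pending_coeffs;
        coeffs_ready = false;
    }
}
```

A callback arriving mid-stage simply uses the old coefficients for one more block; strictly, a compiler barrier between the struct write and the flag set would make the publish order explicit.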


____________________________________________________________________________________________________________________



Date Reported: 3/26/2024
Start Time: 3:50 PM
Work Time: 0.5 Hours

PCB MCU Test
Today I came to lab with Josh to help him figure out why the code they flashed onto the soldered MCU was not working. We tried running the same code on the Nucleo to verify that the heartbeat LED was working as expected. After confirming the code was correct, Josh verified that the correct pins were being set in the code for our PCB. I suggested we measure the signal with an oscilloscope to verify, but we did not see any output on the line. I then suggested that Josh generate a new CubeIDE project, since he was reusing an existing one from the F746ZG-Nucleo board. We thought the existing project should work since our board uses the same MCU model as the Nucleo, but only after Josh generated a new project, selecting just the MCU model rather than the specific board in the settings, did the code finally work.


____________________________________________________________________________________________________________________



Date Reported: 3/24/2024
Start Time: 9:30 PM
Work Time: 2.5 Hours

Filter Code Refactoring and DMA Ping Pong Buffer
Today I worked on making the implementation of the filters more efficient, as well as implementing a DMA ping-pong buffer to improve the speed of transmission. For my filter implementation, I realized that my current logic was quite slow because it recalculated all the coefficients each time the filter function was called:

old_filter2

The filter() function would accept the user parameters "u_p", check which filter type was stored in the parameters to determine which filter function to call, and then call that function, passing in the 3 specified user parameters.

old_filter

These are two examples of the filter functions; each call would grab the required user parameters and then calculate all the IIR filter coefficients. My new implementation aimed to avoid this continuous recalculation of the same exact parameters. My idea was to calculate all the IIR coefficients in the FILTERS_Init() function using some predetermined default values, and store them in the user parameter struct. Then each call to FILTERS_Apply() (renamed from filters()) wouldn't call a separate filter function such as lowpass() or highpass(), but would instead just fetch the IIR coefficients from the user param struct and use them to calculate every new sample. The only time recalculation of the coefficients would be necessary is when user input is detected from the rotary encoders. For this purpose, I created a FILTERS_Update() function which is almost identical to FILTERS_Init(), except that it fetches the user parameters from the struct rather than using predetermined values.

This is the updated filter code (some changes will be explained later below):

filters_init
The new FILTERS_Init() function; all the coefficients are calculated here and stay the same until the user updates them by calling FILTERS_Update().

filters_apply
The new FILTERS_Apply() function; all the filtering gets done here instead of in a separate function, since the filter formula stays the same for all of the filter types and only the coefficient calculations differ.

The next thing I did was move the SAI transmit/receive out of the while loop and instead use circular DMA, as I did a couple weeks back when I was trying to help Liam with removing the popping issues we had with the codec. I revisited a video from Phil's Lab and copied some of his implementation of the DMA ping-pong buffer (https://www.youtube.com/watch?v=zlGSxZGwj-E). I confirmed that his implementation was working for the case of an unaltered signal, where the input and output were the same. Then I added in my own filter functions to test if this would fix the noise issue I was having earlier. Upon running the code, I did not see any changes. The noise was still present, and there wasn't any clear change in the audio quality.

After some consideration, the last change I realized I had to make was how I stored previous outputs and inputs. Previously, each call to the filter function would create a new buffer of past outputs and inputs and initialize them to zero. The buffers would then be updated for each sample being calculated. This worked when I tested the algorithms in Python on generated audio signals, because I was essentially passing in a couple seconds of audio data which filtered everything in one call; whereas in a live DSP setting we are passing in milliseconds of data and calling the filter function repeatedly, meaning that the buffers for past inputs and outputs were being reset to zero on each call, which could explain why I was getting noise. Before making any other changes, I increased the size of my buffer from around 128 to 1024 and then 4096. Each time I increased the buffer size, the noise in the background had a lower frequency and was less audible. This confirmed my suspicion, since a larger buffer meant that the filter had more samples to process before the previous outputs and inputs were reset to zero.

The simple change I had to make was storing the previous value buffers in the user parameter struct being passed into the FILTERS_Apply() function; this meant each new call would access the values stored by the previous call, allowing continuity. After running the code, the output was finally clean and had the desired filter effect. The only issue I noticed was that the filter was not as sharp as expected: filtering out a sine of about 10kHz required a lowpass filter with a center frequency between 1-5kHz before it began to be attenuated. However, this could just be an issue with the bandwidth parameter, and I will test for this more later. Another issue is a decrease in the resolution of the audio when the lowpass filter center frequency is set to anything at or below 1kHz. As the audio gets quieter, it becomes more distorted and sounds like it is being played on a lower quality radio. I'm not sure what causes this, but a simple workaround is restricting the range of frequencies that we may pass into the filter to avoid this quality loss.
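The continuity point can be demonstrated in miniature with a one-pole smoother standing in for our IIR filters (names illustrative): with the state kept in the struct across calls, processing the audio in many small DMA-sized chunks gives exactly the same result as one big call.

```c
typedef struct {
    float a;       /* smoothing coefficient                              */
    float y_prev;  /* persists across calls instead of resetting to 0    */
} OnePole;

/* Process n samples, carrying the previous output in the struct. */
void onepole_process(OnePole *st, const float *in, float *out, int n)
{
    for (int i = 0; i < n; i++) {
        st->y_prev = st->y_prev + st->a * (in[i] - st->y_prev);
        out[i] = st->y_prev;
    }
}
```

If y_prev were a local reset to zero on every call, each chunk boundary would inject a discontinuity, which is exactly the noise described above.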

DSP and LCD Optimization
Up to this point, I had tested all the DSP without running the LCD concurrently. I uncommented all the LCD code to run the UI on the touchscreen while DSP was being performed. The result was some minimal glitching and popping whenever the touchscreen was touched and the UI updated. The good news was that there was no more delay between detecting the touch and the UI update; it was instantaneous. Following the tutorial from Phil's Lab, the DSP processing was done in the while loop after a flag was set in the DMA callback. To fix the popping issue, I tried calling the DSP function in the DMA half-complete and complete callbacks instead. The DMA callbacks had a very high priority set, so my thinking was that this would interrupt the TouchGFX process and apply the DSP rather than having to wait. Making this change resulted in a dramatic decrease in the noise generated when touching the LCD. It was almost silent, and only really audible when the input was silent. My only concern is that this could stop working when we try to run 5 filters + 2 other effects simultaneously, since this would take up more time in the callback function.

DMA_callbacks
The DMA callbacks with the filter being applied before exiting.
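The ping-pong scheme can be sketched with the HAL specifics stubbed out: the half-complete callback processes the first half of the buffer while DMA fills the second, and vice versa. In the firmware these bodies would live in HAL_SAI_RxHalfCpltCallback()/HAL_SAI_RxCpltCallback(); the buffer size here is illustrative.

```c
#include <stdint.h>

#define BUF_LEN 8                         /* samples; illustrative size */
static int16_t rx_buf[BUF_LEN];
static int16_t tx_buf[BUF_LEN];

static void process_block(const int16_t *in, int16_t *out, int n)
{
    for (int i = 0; i < n; i++)
        out[i] = in[i];                   /* passthrough; DSP goes here */
}

void on_half_complete(void)               /* first half is ready  */
{
    process_block(&rx_buf[0], &tx_buf[0], BUF_LEN / 2);
}

void on_complete(void)                    /* second half is ready */
{
    process_block(&rx_buf[BUF_LEN / 2], &tx_buf[BUF_LEN / 2], BUF_LEN / 2);
}
```

Each half must be processed before the DMA wraps back around to it, which is why the total effect-chain time per half-buffer is the budget to watch.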


____________________________________________________________________________________________________________________



Date Reported: 3/23/2024
Start Time: 11:30 PM
Work Time: 2 Hours

Lookup Table Implementation
Today I worked on creating lookup tables for optimizing the processing time for the DSP effects. My main idea was to just create an array of precomputed values for the cosine and sine functions at the beginning of the program, prior to the codec running. This way, we can avoid computing them and just fetch them from memory instead. My main approach is to specify the array size, and from the array size compute the step size for the angles around the unit circle.

The formula for the step size is 2 pi divided by (table_size - 1). To compute the outputs of the trig functions, I wrote an initialization function which sets each index 'i' in the array to table[i] = trig_function(i * step_size), where trig_function is cosine or sine from the math.h library. This means that the range of inputs supported by the table is [0, 2*PI]. To fetch these values, I wrote another function where the input to the trig function is wrapped by multiples of 2 pi until it's in the accepted range, and then divided by the step size to find the index. To handle the cases where the calculated index is a decimal value (meaning its true value is somewhere in between two indices in the array), I looked into linear interpolation and extrapolation in order to derive a close estimate from two existing values in the table.

Linear Interpolation: https://en.wikipedia.org/wiki/Linear_interpolation
Linear Extrapolation: https://en.wikipedia.org/wiki/Extrapolation

The finished code:

lut_library

I tried running some tests where I would play the audio with the filter on; however, I started getting weird output: a beeping noise playing over the signal which became distorted. I verified the range of values computed in the sine and cosine tables, so this wasn't an issue with the table itself. I thought it could be an issue with how long it was taking to fetch the values from the table, so I ran a test where I just multiplied the input signal by a value from the LUT without any filtering, and the signal came out sounding fine without the noise. I concluded this was an issue with my current implementation of how the filters were being applied.


____________________________________________________________________________________________________________________


Week 11


Date Reported: 3/21/2024
Start Time: 5:30 PM
Work Time: 0.5 Hours

DSP Filter Testing
Today after lecture I came into lab just to test the code for filtering the audio signal. My theory was that I was using the wrong data types for processing the audio, so I first made sure to convert the int16_t buffer data to float for processing and then back to int16_t for output. When I ran the code I was able to get some sort of output from the filtered signal, although it was not behaving as expected. The main issue was that all the filters acted as if they were lowpass filters, and raising certain parameters such as the center frequency would suddenly cause a very loud distorted noise on the output. I tried playing around with a couple different parameter types to observe the behaviour, but I couldn't deduce the exact source of the issue. I don't think this has anything to do with the data types anymore, because even with the passthrough function I still had to convert the data from int to float and back. My plan was to start implementing lookup tables for the trig functions to reduce the time spent on processing each buffer, and then add in DMA ping-pong buffers to make the audio output smooth by processing one half of the DMA buffer while the other half is being filled.
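The int16_t-to-float round trip can be sketched as follows; this is a generic illustration (names and the exact scale factor are my own), scaling samples into [-1.0, 1.0) for processing and clamping on the way back out.

```c
#include <stdint.h>

/* Scale a 16-bit PCM sample into [-1.0, 1.0) for float processing. */
float sample_to_float(int16_t s)
{
    return (float)s / 32768.0f;
}

/* Clamp and scale a processed float back into the int16_t range. */
int16_t float_to_sample(float f)
{
    const float max_f = 32767.0f / 32768.0f;  /* largest representable */
    if (f >  max_f) f =  max_f;
    if (f < -1.0f)  f = -1.0f;
    return (int16_t)(f * 32768.0f);
}
```

The clamp matters: without it, a filter overshoot just past full scale wraps into a large opposite-sign value, which is exactly the kind of loud pop described elsewhere in this journal.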


____________________________________________________________________________________________________________________



Date Reported: 3/20/2024
Start Time: 12:30 PM
Work Time: 5.5 Hours

Man Lab
During man lab I shared my findings with Liam, but when I tried to power on the board, all of a sudden I started getting the same popping from the codec output as I initially found last week. I had no clue why this was the case. We took another look at the SAI configuration, and I noticed that I forgot to change the SAIB channel from SAI Receive to SAI Transmit. When I ran the code again after fixing this, the popping stopped; however, there was a buzzing noise present in the signal. Liam hypothesized that the buzzing could've been from the screen being refreshed at a rate of 60Hz. To verify, he asked me to set the refresh rate to 10Hz, and the buzzing noise turned into small pops. He hooked up the codec I2S lines to the oscilloscope to look at the data being sent through. On the scope he found that there were periodic intervals where the line would transmit nothing for a brief amount of time before resuming data transmission. The solution would be using DMA and possibly only redrawing the screen whenever we detect a change in the parameter values.

After meeting with course staff, we concluded that we had to start working on the DSP algorithms alongside integrating everything. Liam asked Shubo to help me with integration, while Liam and Josh would work on soldering the PCB. Since I already had the LCD and codec set up, I wanted to start implementing the code I had for DMA transmissions of the audio data, as well as the logic for passing this audio data in for processing. I asked Shubo to work on integrating the rotary encoders. We discussed that our goal would be to at least have a running demo of accepting encoder input, passing it in as a parameter to an effect even as simple as adjusting the signal's volume, and displaying this parameter value on the LCD. Shubo got to work on the encoders, and I began implementing the logic for the signal processing.

DSP Effects Integration
A good portion of my time was spent thinking about how I would like to handle adding in new effects and controlling their parameters. Eventually, I wrote my own AudioFX library in which I defined an audio FX handle that stores a pointer to the user parameters as well as a function pointer to the DSP function being applied. Another struct is the audio FX chain handle, which stores an array of FX handles as well as pointers to the input and output buffers, essentially mimicking a mixer channel with a chain of effects. My logic behind this was to create a layer of abstraction so that we can easily add in new DSP algorithms with differing parameters. To test this library, I copied over the DSP filter code I was testing several weeks back. With some adjustments, I was able to integrate the filter code with my AudioFX library and get the project to compile. For now, the filters were not yet optimized for real-time processing, and when I tried to run the code with them I did not get any output. Instead, I set the filter to "FILTER_PASSTHROUGH" to let the original audio signal come through without any processing being done on it. Upon running the code, I was able to get normal output from the codec while running the audio through the AudioFX process, which I called from the main while loop.

audioFX_library
Baseline for the AudioFX library I plan on using to integrate new DSP functions.

filters_library
Filters library I wrote several weeks back; the FILTERS_UserParams would be passed into the AudioFX FX handle's "user_params" variable, whereas filter() would be the function pointed to by the "dsp_fx" function pointer.
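The abstraction described above can be sketched as below. The struct layout, names, and the example gain effect are illustrative, mirroring the journal's description rather than reproducing our actual library.

```c
#include <stddef.h>

#define AUDIOFX_MAX_FX 8

/* A DSP routine takes its own parameter block plus a buffer to process. */
typedef void (*dsp_fx_fn)(void *user_params, float *buf, size_t n);

typedef struct {
    void      *user_params;  /* e.g. a FILTERS_UserParams instance */
    dsp_fx_fn  dsp_fx;       /* e.g. a filter applied over a buffer */
} AudioFX_Handle;

typedef struct {
    AudioFX_Handle fx[AUDIOFX_MAX_FX];
    size_t         num_fx;
    float         *in;
    float         *out;
    size_t         buf_len;
} AudioFX_Chain;

/* Run the whole chain: copy input to output, then apply each effect
 * in place, like one mixer channel with a series of inserts. */
void AudioFX_Process(AudioFX_Chain *c)
{
    for (size_t i = 0; i < c->buf_len; i++)
        c->out[i] = c->in[i];
    for (size_t i = 0; i < c->num_fx; i++)
        c->fx[i].dsp_fx(c->fx[i].user_params, c->out, c->buf_len);
}

/* Example effect: scale the buffer by a gain stored in user_params. */
void gain_fx(void *user_params, float *buf, size_t n)
{
    float g = *(float *)user_params;
    for (size_t i = 0; i < n; i++)
        buf[i] *= g;
}
```

New algorithms then only need a parameter struct and one function matching dsp_fx_fn, which is the layer of abstraction the design aims for.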


____________________________________________________________________________________________________________________



Date Reported: 3/19/2024
Start Time: 6:30 PM
Work Time: 1.5 Hours

Component Integration
Today I wanted to take another look at the component integration to make sure that everything was done correctly. Before starting on the SAI DMA communication, I was wondering if the LCD was the source of the issue. I commented out all the code relating to the LCD and ran the project to see if I could hear any difference. Upon doing so, I initially heard noise. I reset the power to the board and then suddenly there was no more noise in the output signal from the codec. I started to uncomment the code relating to the LCD to see where the issue would arise. After eventually uncommenting all the LCD code, I had both the codec and LCD running simultaneously without any noise coming from the codec. I had no idea why this was the case; the circuit and the code were exactly the same. I reset the power to the board a couple times to see if this was just a one-time fluke, but it consistently output a clean signal. The only issue that still persisted was the UI updating slowly. I also noticed that whenever the UI updated after detecting touch, there would be a momentary glitch in the audio like a beep or a pop, but this was expected. I decided to continue my work the next day.


____________________________________________________________________________________________________________________


Week 10


Date Reported: 3/14/2024
Start Time: 4:30 PM
Work Time: 2.5 Hours

Component Integration
Today I continued the integration of the project components. My first goal was to integrate the LCD with the codec so that they may work simultaneously. I copied over all the related code from our prototype projects in CubeIDE over to the integration project. I had to rewire our prototype board so that the LCD communication would be done via SPI5 rather than SPI1. After making all the necessary configurations, I ran the project on the board. Initially there were some hiccups with syncing up the LCD display with the TouchGFX library, but eventually I was able to have both the codec and the LCD running at the same time. The result was the LCD UI taking longer to update, as well as significant noise and popping in the codec signal. I tried to verify that everything was configured correctly, especially the SAI I2S protocol since we initially had issues with it, but I did not find any apparent issues. I figured this must have been due to the fact we had the codec process running in the while loop simultaneously with the TouchGFX process, so it might have been just too slow to send out all the data correctly; therefore I decided my next step would be to implement DMA receive/transmission in order to alleviate this issue.


____________________________________________________________________________________________________________________


Week 9


Date Reported: 3/6/2024
Start Time: 6:00 AM + 12:30 PM
Work Time: 3 (1 + 2) Hours

Touch Controller TouchGFX Integration
This morning I downloaded the FT6x06 driver from STM32 and implemented the changes in TouchGFXTouchController.cpp that I saw being used on the STM32F7 Disco board. It turns out that all the initialization, communication, and error handling for the touch controller interface is done here. I had to define some functions such as TS_IO_Write(), TS_IO_Read(), etc., which are called in the FT6x06 driver. After doing so, I ran the demo on the board, and when I touched the screen, I saw the button UI change colors to indicate it was being pressed. One issue I found, however, was that when I touched the screen with two or more fingers, the LCD would stop detecting touch. I was not sure why this was the case; I assumed that the multiple touches were causing some kind of error with the I2C, so I made the error handler reinitialize the I2C interface. After doing so, the touch screen worked as intended. This was all done on the STM32F4 board, so in man lab my goal would be to port it all over to the F7.

Man Lab
In man lab I transferred the touch controller code over to the STM32F7 Nucleo board. At first, it was not working for whatever reason. I tried verifying the wiring, resetting the board, and verifying the code, until eventually the touch screen began to work. I didn't know if this was some kind of fluke, so I reset the board multiple times to see if it would work consistently, which it did. After demonstrating the touch screen and talking with course staff, I decided that the next step would be to integrate all the components together. I got an .ioc file from Josh which had all the current pins we were using on the board, and this would be the starting point for generating the code for integrating everything together. I started to work on setting up all the interfaces with the correct parameters. I noticed that we had an issue with the SPI pins we were using. SPI1 was conflicting with another peripheral, so Josh set SPI2 as the one to be used for the LCD. However, these pins were on a different clock, which only allowed up to half the baud rate of SPI1. The only other SPI peripheral which could achieve the same speed as SPI1 was SPI5. I communicated this to Josh and he went ahead to make the changes on the PCB. At the end of the lab, I decided to take all the components with me for spring break to work on their integration.


____________________________________________________________________________________________________________________



Date Reported: 3/5/2024
Start Time: 7:00 PM
Work Time: 1 Hour

Touch Controller TouchGFX Integration
Today I tried to run the TouchGFX setup I had from Week 7, where I followed a tutorial for a different touch controller but adapted it to mine. However, when I tried to run the demo I created, I was not able to get any response from the UI when I pressed the screen. I decided to take a look at the implementation of the touch controller for the STM32F7 Disco board we were using at the beginning of the semester, since I knew that the implementation on it worked. I inspected the TouchGFXTouchController.cpp file and looked at the driver they were using for that LCD. The driver had an almost identical implementation to the FT6x06 driver from STM32 that I originally found prior to the driver I was currently using. The reason I avoided it at first was because it was a bit too abstract and confusing when I first started looking into touch controllers and I2C communication on the STM32, but now I had a better understanding and could implement the same code as on the STM32F7 Disco, with the exception of replacing the functions with the ones from the FT6x06 driver. I decided to do this the next morning.


____________________________________________________________________________________________________________________


Week 8


Date Reported: 2/26/2024
Start Time: 5:30 PM
Work Time: 2 Hours

Touch Controller Interfacing
Today I was successful in verifying that the touch controller I2C interface was working. Last time I tried to change the interrupt mode for the touch controller from polling to trigger, but the behavior of the INT pin was not changing. I initially observed this using an LED, but I also verified via oscilloscope that the INT pin always behaved as in polling mode. However, I did use 'HAL_I2C_Mem_Read' to read the value of the register responsible for setting the interrupt mode, and I verified that the value I wrote to the register was indeed stored there, meaning there must've been some other issue with why the interrupt mode was not changing. My initial thought was that maybe I was somehow writing into the wrong register, but that wouldn't make sense since I verified the register value in the documentation. After some searching, it turns out that other customers of the same LCD were having the same exact issue, and it was just something you couldn't fix: https://forums.adafruit.com/viewtopic.php?f=48&t=138230

At this point I still wanted to verify on the oscilloscope that the I2C communication was working; last time I attempted to visualize it, the reading was not making any sense. I realized that since I was sending the signal only once, it would be hard to capture it on the oscilloscope as soon as it's sent. I found that the oscilloscope had its own serial interface mode which I had to set, and I set the trigger mode to trigger whenever I'm writing to the device. I thought this would freeze the screen as soon as the message was sent, but this did not fix the issue. I decided to just repeatedly write and read to and from the register in a while loop, which actually worked. After playing around with the scaling I was able to get a good reading:

i2c_interface

From the image you can see that I am writing the value 0x00 to device address 0x38, register 0xA4 (the interrupt register), and then reading it back right after. At this point we can be sure that I2C is working and writing to the correct address, so the next step was integrating it with the TouchGFX library.


____________________________________________________________________________________________________________________


Week 7


Date Reported: 2/21/2024
Start Time: 12:30 PM
Work Time: 3 Hours

Man Lab

During man lab I told Liam about the issues I found with the audio playing only in one ear and the very loud noise. He quickly experienced these for himself as he tried to run the code to set up the show-me-a-thing for this week. Liam tried to set up the line-in inputs using a TRS module rather than the on-board audio jack on the codec. Upon finishing it and changing the necessary driver code, he was able to address the issue with the audio playing in one ear. For the loud noise, we noticed that it only happened when power was reset using the reset button on the board, not when unplugging and replugging the USB cable. This could be some kind of timing issue, and all we might need to do is use the driver to turn the codec output on after some delay on startup.

Touch Controller Initial Test

After our man lab meeting I was able to start freely working on the touch controller since we didn't need the codec connected to the board. In order to better debug my program, I set up a connection with the "INT" interrupt pin coming out of the LCD breakout board to diagnose whether touch was even being detected. According to the data sheet, the INT pin goes low whenever setup is complete and whenever touch is being detected. I set up an external interrupt pin on the F746ZG-Nucleo so I could set a breakpoint in the debugger to test if the interrupt is being triggered on touch. When I ran the code, I quickly saw that touch was indeed being detected:



I used an LED that changed state whenever INT triggered an interrupt. Sometimes a single touch could trigger multiple interrupts, so I realized that this method of testing the interrupt output was not ideal; a better alternative would be to connect an LED to a voltage input and treat the INT pin as ground, meaning the LED lights up whenever the INT pin goes low.

I initially thought this meant the I2C setup was successful; however, just as a sanity check, I commented out the initialization code for I2C and ran the code again to see if the interrupt would still trigger. I found that the interrupts were independent of I2C communication, meaning there was a possibility that the I2C signals were not transmitted correctly. I looked at the driver code again and saw that the 'HAL_I2C_Mem_Write' function was the one sending the actual data to the device. I decided to use this function directly rather than the driver function, which just added additional logic in case the I2C peripheral was not already initialized. Since there were no initialization steps required, as in the case of the LCD display controller, I could immediately read/write data to registers. To test if I2C was working, I called Mem_Write to change the interrupt mode from polling to trigger. The difference is described in the app note as follows:

i2c_int_polling i2c_int_trigger

I also added a second LED to light up whenever the I2C function returns a HAL_ERROR status, rather than checking this manually through the debugger. When I ran the code I was receiving an error, as indicated by the LED. I asked Liam about his codec implementation and he mentioned you have to shift the I2C address one bit to the left to make room for the read/write bit. The device address is 7 bits long but the instruction is 8 bits, where the LSB is the R/W bit. Since we want to be in write mode, the R/W bit should be set to zero. After making this adjustment I was no longer receiving an error; however, when I tested whether the interrupt behaviour was different, I did not see any change. When I tested whether the x and y values of the touch were being read correctly, I noticed that the values returned by the read function were apparently out of the accepted range specified by the driver. When I tried to interact with the UI button there was no change in behaviour either. I decided that my next step would be to set up oscilloscope readings for the I2C pins to further verify the interface. Upon connecting the pins and displaying the signals, I noticed that the SDA and SCL signals were identical to each other and did not carry any meaningful output.

i2c_oscilloscope

I tried different pull-up resistors on the pins to see if this would fix the issue, but to no avail. At this point I had to get going and decided to return to this issue the next day.

____________________________________________________________________________________________________________________



Date Reported: 2/20/2024
Start Time: 8:30 AM / 5:30 PM
Work Time: (1 + 1) 2 Hours

Codec
In the morning I came into lab with Liam to take a look at the codec and see if we could get it to work properly. Liam did most of the debugging with the oscilloscope, checking whether I2C was transmitting the correct bits. He found that the pops were coming from negative values being incorrectly shifted, which converted them into very large positive values. I assisted him by reviewing the codec's datasheet and verifying that the correct register values were set for the I2C mode configuration. We were not able to find any issues with the implementation of the driver that Liam wrote. Eventually I had to leave and planned on returning to lab after lecture.

Wrapping up Codec
After I left the lab, Liam was eventually able to fix the codec by reconfiguring the SAI channels as slaves synchronized to the master clock coming from the codec. After lecture in the afternoon, I came back to the lab to reflect these changes in my code for the DMA transfers. When I ran the code I was still experiencing popping while trying to implement double buffering using one circular DMA and one normal DMA, so I set it back to two circular DMAs reading from the same address for input/output, which fixed the popping and functioned essentially the same as Liam's non-DMA implementation. I then tried to add another DMA transfer to copy the contents of the received sample buffer into a second output buffer, so that we could process the signal without working in the same address space as the received input; however, upon running the code the audio generated a lot of noise and was not intelligible. I decided to leave the DMA transfers in the state where they were working. One thing I did notice was that sometimes the audio would only play in one ear, or would generate a very loud noise on power reset. I decided to leave the codec for now and resumed working on the touch controller.

Touch Controller Resumed
I followed the steps outlined in the video I found last week. The video tutorial used an SPI driver for a different touch controller, whereas I had to use I2C. I set up the I2C pins and downloaded a driver different from the one I initially found last week: https://github.com/bcmi-labs/arduino-library-graphics/blob/master/src/ft6x06.h. I found this link on a site which also informed me of the I2C address of the FT6206: https://i2cdevices.org/devices/ft6x06

i2c_pins
I2C setup (ERR_LED was added at a different date)

I looked through the driver code to figure out exactly how it worked. Its implementation was written for STM32F0 microcontrollers and contained a lot of code that was already handled in main.c regarding initializing I2C and its required peripherals. I modified the driver by removing the redundant code already handled in main, and adjusted some functions to better fit these changes. I also copied a function from the tutorial which converts the touch controller x and y values to display x and y values.

Upon making these changes I ran the code to see if it would work. I added a button UI in TouchGFX to test whether the touch would be detected; however, when I tried to interact with it I got no response from the button. I decided to continue working on the touch controller driver the next day, since at this point I had to leave lab.

____________________________________________________________________________________________________________________



Date Reported: 2/19/2024
Start Time: 6:30 PM
Work Time: 2 Hours

Codec DMA Transfers Continued

I came into lab to continue my efforts getting DMA to work correctly. I figured out that the buffer was being filled with empty samples because I had defined the buffer type incorrectly as uint8_t rather than uint16_t, and the DMA receive transfers were also configured incorrectly to receive word-sized (32-bit) data. We were essentially filling four 8-bit buffer entries with 32 bits of data, of which only the first 16 bits were meaningful since the audio is 16-bit resolution. After I fixed both the buffer type and the DMA transfer size, the buffer was filled correctly:

debug_fix

After doing this, my audio data quality was still having some popping in it, especially with higher volumes. I was guessing that somehow my DMA transfers were still too slow or incorrectly handling the data. I looked at a tutorial from Phil's Lab to see how he implemented DMA transfers using I2S interface: https://www.youtube.com/watch?v=zlGSxZGwj-E

I paid attention to how he configured his DMA transfers. Two differences were that he set the DMA priority to "very high" and made the DMA transfers circular for both receiving and transmitting. Raising the priority was easy, but since Phil used the I2S protocol directly rather than SAI, his DMA implementation used different functions than the ones available for SAI. Rather than implementing two buffers for input and output as he did, I kept my previous method of receiving and transmitting from the same buffer. I realized this might be a risk, since we could be overwriting buffer data before it transmits if for whatever reason the receive transfer is faster. When I ran the code everything seemed fine, though still with popping at higher volumes. One improvement was that the buffer size could now vary greatly without affecting the quality of the audio. Liam's implementation essentially used a buffer of 2 samples, which is what I had to use to match his quality before the audio started generating noise and dropping samples. Now, with circular DMA transfers for both Rx and Tx, I was able to use buffers of 1024/2048 samples without those issues.

My plan at this point was to continue the next morning with Liam to see if we can get the codec working properly once and for all.

____________________________________________________________________________________________________________________


Week 6


Date Reported: 2/16/2024
Start Time: 12:30 PM
Work Time: 3.5 Hours

Codec DMA Transfers Continued
Today I came back into the lab to see if I could figure out why the DMA transfers were corrupting the signal. Liam messaged me his hypothesis that the pops in the signal could be caused by incorrect timing for the SAI. He said that the MCLK on the codec was supposed to be 12.288 MHz in order to transmit 48 kHz audio, and that the MCLK being output by our board was 15 MHz, which could possibly be causing an issue. He had originally assumed this was not a problem since the codec has its own MCLK and therefore did not rely on our board's MCLK output. He also mentioned that a project he found on GitHub said the SCK signal output from our board should be 1.536 MHz. I measured this on the oscilloscope and found that the SCK coming from our board was 1.715 MHz. I spent some time adjusting the clock configuration to see if I could bring the SCK frequency closer to 1.536 MHz.

pllsaiq_clock

I somewhat arbitrarily raised the prescaler from /2 to /4 for PLLSAIQ. When I flashed the code onto the board, the oscilloscope read 1.5 MHz. I tried adjusting it to different values to see if I could get closer to 1.536 MHz, but was unsuccessful, so I kept the settings shown above.

sck_freq

Liam had an idea to set up a serial monitor to print the input values going into the codec and watch for skipped samples or garbage data, then to output a square wave and check whether the output signal came through cleanly.

My idea was to output a hardcoded audio_out buffer of zeros to determine whether the source of the noise was the input signal or noise somehow being added to the output in software. In this implementation the input from SAI A would not be transferred to the output block SAI B, so if everything were functioning correctly the output signal should just be zeros. I rewrote some of the code and ran it in the debugger, adding the audio_out buffer as a watched expression to monitor its values. I found that after the SAI had been enabled by the HAL driver inside of the HAL_SAI_Receive_DMA function, the buffer was filled with random values in a regular pattern:

debug_audio_out

This did not make any sense to me, since the audio_out buffer was not being passed into the receive function. I inspected the audio_in buffer and noticed that its address was equal to (&audio_out - 32), which makes sense since the audio_in buffer stores 32 values. This meant the two buffers were adjacent in memory, and when the audio_in buffer was being filled with data from SAI A, some values were likely leaking over into audio_out's memory. I moved the audio_out definition into the audio_transmit.c file I created for the callback functions. When I ran the code, the oscilloscope reading of the output channel showed a flat line, which meant the output data was no longer being overwritten after SAI was enabled because the buffer now lived in a separate part of memory. (Unfortunately I did not get an image of this.)

I spent some time changing different configurations for the DMA and SAI to see if I could fix the issue of the buffer being filled with data only on every other sample. I realized that my circular DMA configuration had a data width of a word (32 bits); it should have been half-word, since the audio stream is 16-bit. I suspected this was causing the audio buffer to fill incorrectly, leaving zeros on every other sample. I also raised the SAI FIFO threshold from empty to full, meaning the data in the FIFO would only be transmitted via DMA once the FIFO was full.

Upon making these changes I noticed that the audio quality had "improved", in the sense that it now matched the quality of Liam's implementation before adding DMA. I took an oscilloscope reading and it appeared similar to the one Liam had in his journal; you could see occasional pops in the audio data.

noise_pops

At this point I had to leave lab again, and I sent over my work to Liam so he can continue the work on Sunday or Monday.


____________________________________________________________________________________________________________________



Date Reported: 2/15/2024
Start Time: 5:30 PM
Work Time: 2.5 Hours

Codec DMA Transfers
I came into lab after lecture to begin my work on the DMA transfers. Liam sent over his STM32CubeIDE project for the codec. I spent some time looking through his implementation to better understand what was going on, and watched some videos on SAI to better understand how it works: https://www.youtube.com/watch?v=BC7cPfePc-o. Some concepts I had difficulty understanding were slots and frame length. Liam had his slot number set to 2; when I asked him why, he told me he did not know exactly and was just following instructions. I decided not to touch those parameters for the moment and to just add DMA transfers to see if they would fix the popping and noise problem.

My plan was to implement a circular DMA transfer for SAI A, the receiving channel, to fill a buffer of 32 samples. For every 16 samples loaded in, a normal DMA transfer would be called to move the last 16 samples to SAI B, the transmitting channel. I also created a flag to prevent preemptively calling the transmit function before the previous transmission had completed. This was all handled by callbacks for the half/full DMA transfer interrupts.

audio_transmit

dma_circ dma_norm

Upon flashing this code, I noticed significant noise in the output signal. When playing audio into the codec from my laptop, the output was almost distorted and "jittery", and the quality sounded worse than Liam's implementation. I tried adjusting the slot size to 4, thinking maybe 2 was too small for whatever reason, but that produced a very loud, higher-pitched noise from the codec. I was stumped as to why, and since I had to leave the lab at this point, I decided to continue the next day.


____________________________________________________________________________________________________________________



Date Reported: 2/14/2024
Start Time: 12:30 PM
Work Time: 2 Hours

Man Lab
During man lab I set up the touchscreen demo I created last night to present to the course staff. Liam reported that he was able to get input/output working for the codec; however, the audio had popping and noise present. His guess was that the signal was being transferred too slowly from SAI A to SAI B. He wanted to set up circular DMA to try and fix this issue, although he was not sure how to implement it. During our meeting with course staff, it was emphasized that we should get the codec working properly soon. I also showed Josh the changes I made to the pin connections for the LCD interface, since he is the one working on the LCD portion of the schematic. I also looked into the schematics for the Adafruit LCD to update our bill of materials, since our current one listed a different LCD that Shubo had interfaced with. At the end of man lab, Liam said I could help him set up the DMA for the codec since he wanted to focus all his efforts on filling out the software formalization. Although I was originally planning on continuing my work with the touchscreen controller, I agreed to help.


____________________________________________________________________________________________________________________



Date Reported: 2/13/2024
Start Time: 10:00 AM / 5:30 PM
Work Time: (1 + 1) 2 Hours

TouchGFX Integration Continued
In the morning I continued working on implementing TouchGFX, this time I used the library that was being used in the tutorial. After setting up the project on the F4 and following the steps listed in the tutorial I was able to successfully add TouchGFX to the project and display a simple UI I created using the TouchGFX designer software.



From the tutorial, the framerate for the screen was about 20 FPS, since a timer on the board was used to synchronize TouchGFX. The timer raised an interrupt at a rate of 20 Hz, and at each interrupt the UI on the LCD was redrawn and updated. I wanted to see if I could raise this framerate later in the day.

After lecture, I went into lab to port the code from the F407 Discovery board to the F746ZG Nucleo. I followed the same general steps outlined in the tutorial; however, this time I configured the timer to raise an interrupt at a rate of 60 Hz. After some effort, I was able to run the TouchGFX UI on the LCD, this time on the F7 board.



To fit the implementation used in the new ILI9341 library, I had to change the chip select pin: the new driver uses the SPI NSS pin rather than a GPIO pin.

spi_nss

Touchscreen Controller Interface
At this point I was done interfacing the LCD display. My next step was to begin implementing the touchscreen interface. The tutorial I found used a touch controller interfaced over SPI; however, our LCD uses an FT6x06-compatible controller which interfaces over I2C. I spent the rest of my time briefly reading through the relevant documentation to learn more about it, and downloading the FT6x06 driver included in the BSP drivers offered by STM32: https://github.com/STMicroelectronics/stm32-ft6x06.

FT6x06 Datasheet: http://www.adafruit.com/datasheets/FT6x06%20Datasheet_V0.1_Preliminary_20120723.pdf
FT6x06 App Note: http://www.adafruit.com/datasheets/FT6x06_AN_public_ver0.1.3.pdf


____________________________________________________________________________________________________________________



Date Reported: 2/12/2024
Start Time: 1:00 PM
Work Time: 1.5 Hours

TouchGFX Integration
Today I came into lab to begin integrating TouchGFX with the LCD I had interfaced. I showed Liam and Josh my progress. One of the things I pointed out was the possibility of using a parallel interface in the future to increase the speed of the UI display. So far the SPI interface was not too slow; however, I wasn't sure how well it would perform once we integrate all the project components.

I spent most of my time in the lab looking into the steps for adding TouchGFX to a project with a custom screen. So far, I had only used the TouchGFX template for the F746 Discovery board, which has its own LCD. I was able to find a tutorial on interfacing a touchscreen LCD with TouchGFX: https://www.youtube.com/watch?v=suMytEyQTP4.

The tutorial used an ILI9341 driver different from the one I was using for the LCD. However, the only requirement for the driver was to have a "window" and a "draw bitmap" command. My driver did not have these functions defined, so I had two options: write them myself or use a different library. Since I had already interfaced the LCD with my current library, I decided to try implementing these two functions on my own; this shouldn't be difficult, since the functionality for sending commands was already implemented in the existing driver. I took a look at the ILI9341 driver used in the tutorial and studied its implementation.

I copied over the missing functions and replaced the calls inside of them with their equivalents in the driver I was using. However, upon flashing the code, TouchGFX was not running successfully, and I was not sure what the issue was. I decided it would be more efficient to just use the driver from the tutorial: https://github.com/dtnghia2206/TFTLCD/blob/master/TFTLCD/ILI9341/ILI9341_Driver.c. At this point I decided to leave, since Liam and Josh were still working on the codec, so I took the F407 Discovery home with me to try implementing TouchGFX on it and later transfer the code to the F746ZG Nucleo.


____________________________________________________________________________________________________________________



Date Reported: 2/11/2024
Start Time: 3:00 PM
Work Time: 2 Hours

LCD Interfacing
Today I returned to the lab to try and get the Adafruit LCD to work. The last time I looked into this, I read in the ILI9341 documentation that the IM[3:0] pins must be set to certain values for the LCD to enter SPI mode; the correct pin values should be "1110". I took a look at the Adafruit site for the board to find a wiring guide and see if I had missed anything the last time I wired it.

When scrolling through the Adafruit pages for the breakout board, I found the SPI wiring guide: https://learn.adafruit.com/adafruit-2-8-and-3-2-color-tft-touchscreen-breakout-v2/spi-wiring-and-test

I found an entry which dealt with the IM pins:

im_jumpers_guide

I tried to implement these values using the IM pins on the board; however, this did not work. I went ahead and soldered the jumpers on the board instead:

im_jumpers_soldered

After doing this, I took some time to rewire the breakout board and verify the pins being used on the F746ZG. This time I used the EyeSPI connector instead of the pins on the board.

EyeSPI: https://learn.adafruit.com/adafruit-2-8-and-3-2-color-tft-touchscreen-breakout-v2/plugging-in-an-eyespi-cable

I flashed the code and was successfully able to fill the LCD screen with red, the color I specified in the code.

red_screen

I then ran the demo provided by the creator of the library to see how the screen performed with changing states.



Pins used on the F746ZG Nucleo:
  • PC9 - CS
  • PC8 - D/C
  • PC6 - RST
  • PA5 - SPI1 SCK
  • PB5 - SPI1 MOSI

Configuration for SPI:

spi1_1 spi1_2


____________________________________________________________________________________________________________________


Week 5


Date Reported: 2/9/2024
Start Time: 11:30 AM
Work Time: 3 Hours

LCD Interfacing
Today I tried to implement LCD interfacing with the LCD from Adafruit. Shubo was still unable to make it work on the F407 Discovery board, but was able to interface with a different LCD without a touchscreen.

I took some time to look through STM32 libraries. There is a library available for the FT6206 touch screen controllers, and I was also able to find a library for ILI9341 chips; however, that library depended on functions defined for an LCD found on an Arduino shield. There were a couple of other open-source libraries I found on GitHub, but I decided to try the library Shubo showed me on Wednesday since it seemed simple enough.

https://www.micropeta.com/video37

I followed the steps for setting up SPI and clock speeds, and adapted the output pins for CS, D/C, and RST to our board. I then looked at the macros defined in the header files and adjusted them as needed: replacing the F1 HAL library with F7, adjusting the output GPIO pin values to the ones I used on our board, and adjusting the width/height dimensions.

Upon flashing the code onto the board, I did not have any success; the LCD just lit up white, not displaying any of the characters defined in the code. I took some time looking through the functions defined for transmitting data and commands while referencing the ILI9341 datasheet.

https://cdn-shop.adafruit.com/datasheets/ILI9341.pdf

According to the sheet, the LCD can be interfaced with a variety of interface types. Here are two 3-line serial interface types:

3_line_serial

I assumed we needed to be using serial interface II, since we would be using the MOSI pin on the MCU and sending it to the SDI pin on the LCD. To ensure we were in this mode, I pulled the IM[3:0] pins to 1101. I then made sure to pull D[17:0] to ground, since previously they were left unconnected.

The next step was to ensure SPI was configured correctly. Upon checking, I saw that the tutorial never specified the bit size for the transmission, which is 4 by default. We, however, needed to send byte-sized commands/data, so I set it to 8 bits. Clock polarity was low, meaning sampling on the rising edge, which is consistent with the diagrams shown:

serial_interface_pause

From the above diagram, you can see that the CS pin must be pulled low when transmitting data. I made sure to verify this in the driver code.

ili9341_driver_code

Initially I thought there was some mistake with how the CS pin was being handled, since it was not set high after transmission inside of the WriteCommand and WriteData functions; however, I later noticed that this was handled in the HAL_SPI_TxCpltCallback callback function, which pulls the CS pin high after transmission completes. I stepped through the code in debug mode to verify the callback was being called (it was).

I flashed and ran the code once again; however, I was still getting just the white screen. I took some time to verify that my wiring was correct and did not find any inconsistencies. At this point I had to leave the lab for another class.

lcd_setup

Image of the final setup.

Later in the day I took another look at the datasheet and found that I had misunderstood which interface type we needed to be using. Since we are making use of the D/C pin, we need to be using the 4-line serial interface II as defined in the ILI9341 datasheet:

4_line_serial

This means that when I pulled the IM[3:0] pins to “1101”, I was selecting the wrong interface and the data was not being read correctly. From the earlier diagram you can see that in the 3-line interface, the bit sent before each 8-bit command/data byte is the D/C bit. In our driver’s case, we do not factor this into how the data is transmitted; we just set the D/C pin as shown below:

4_line_serial_protocol

My next step tomorrow will be to fix these inconsistencies, and also to verify the driver works on the F746 Nucleo board with the LCD that Shubo was able to interface on the F407 Discovery board.

I also took a look at the FAQs on the Adafruit site, where users were having issues displaying anything on the LCD: https://learn.adafruit.com/adafruit-2-8-and-3-2-color-tft-touchscreen-breakout-v2/f-a-q. It seems there is some delay required after initial power-up before data can be displayed correctly. I will reference the Adafruit Arduino library provided for the LCD (https://github.com/adafruit/Adafruit_ILI9341) to see the outline of their implementation and translate it to the driver we are using.


____________________________________________________________________________________________________________________



Date Reported: 2/7/2024
Start Time: 12:30 PM
Work Time: 2 Hours

Team/Course Staff Meeting
Today in man lab we had a team discussion about finalizing our PSDRs and stretch functionality. We also had some doubts over what was expected of us to have completed by the mid-semester design review. I brought up my concern with the F746 Discovery not being able to support the encoders. Liam brought up that there was an F746 Nucleo board; however, the website stated it was supposedly not ideal for new builds.

During the meeting with course staff, Joe said he had an F746 Nucleo board available, and we decided to check it out since this was our only option for prototyping everything on an F7 board rather than an F4. We also found out that we did not necessarily have to have the entire project integrated into our prototype (although this is the ideal goal). I also brought up my concern regarding Faust; Prof. Walter clarified that using Faust would still count as a PSDR, since the design of how the algorithms are implemented would still be ours.

Initial LCD Interfacing
After the meeting with course staff, we decided that Josh and Liam would focus on interfacing the codec, whereas Shubo and I would focus on interfacing the LCD screen. Shubo had already made some attempts to interface the Adafruit TFT LCD screen with the F407 Discovery board, though unsuccessfully. He shared a tutorial he followed which also included some driver files for interfacing: https://www.micropeta.com/video37. Apparently the library in this tutorial was based on another project done on the F407 Discovery board (https://github.com/eziya/STM32_HAL_ILI9341); however, Shubo had not yet tried to flash that code. I pulled the project from GitHub and tried to run it on the F407 Discovery that Shubo was using, but it also did not work.

I took the time to do some review into the components making up the Adafruit LCD. It contained an ILI9341 chip and a “FT6206-compatible CST026 capacitive touch controller chip”: https://www.adafruit.com/product/2090

I found the necessary datasheets to use as reference.

ILI9341: https://cdn-shop.adafruit.com/datasheets/ILI9341.pdf
FT6206: https://cdn-shop.adafruit.com/datasheets/FT6x06%20Datasheet_V0.1_Preliminary_20120723.pdf

The goal for Friday will be to either find a new library or to use the one Shubo found in order to interface the LCD with the F746 Nucleo board.


____________________________________________________________________________________________________________________



Date Reported: 2/6/2024
Start Time: 6:00 PM
Work Time: 3 Hours

Hardware/UI Interfacing
Today my goal was to implement reading hardware input and displaying it on the LCD screen. I wanted to use the rotary encoders that Josh was able to interface with on the F407 Discovery board. To interface with the rotary encoders, you need a timer with two channels set to encoder mode. When reviewing the available timer pins on the F746 Discovery board, I found that each timer had either only one or no channel pins exposed for input.

This brought up the question of how we could even prototype our project on the F746 board if we could not integrate the encoders with the onboard screen. I opted to use a regular potentiometer as the hardware input instead, so I could at least learn how to interface hardware with the screen. I followed a tutorial on displaying analog values on a screen using TouchGFX: https://www.youtube.com/watch?v=EbWOv_0Lp-U

The setup consisted of sampling via the ADC and sending the data into memory via DMA transfers. The tutorial also used RTOS tasks to schedule the sampling and transmission of the data. The program relies on the RTOS preventing the model from updating until the ADC value has been transferred into memory by the DMA.

model_code

This is my implementation of handling the ADC data inside of the Model class. The "model listener" is actually the Presenter, since the Presenter is a derived class of the model listener class. On each tick, the Model checks a semaphore indicating whether the ADC value has been converted; if it is ready, the value is sent to the View via the Presenter, which then sets the state of the GUI.

Upon completing the tutorial and making some minor adjustments to fit my project, I was unable to read the values from the ADC and display them on the UI. I found another tutorial on the subject: https://www.youtube.com/watch?v=2LQpaz44Ug4. In this tutorial the ADC configuration is different, using a timer as the external trigger conversion source rather than “regular conversion launched by software”. Upon making these changes, I was able to successfully run an initial demo displaying the ADC values as points on a graph.



I decided to flesh out the demo more. I generated a sine wave using the method I found yesterday of plotting a point on every tick, then added a volume variable controlled by the ADC value. The goal was to use the potentiometer as a volume knob for the simulated wave. I also wanted to display some sort of progress value on the screen, which is something we will feature in our final project. I decided to create my own simple knob container in TouchGFX, which helped me understand how we can build our own UI elements with custom behavior rather than relying solely on the default containers TouchGFX offers. Upon finishing this design, I was able to realize my goal demo:



Note: the video was taken on a different day than when I got it to work.

I will use this as a base for integrating the rotary encoders with the LCD screen; however, we need to figure out which board we will do this on, since the F746 Discovery could not support them.


____________________________________________________________________________________________________________________



Date Reported: 2/5/2024
Start Time: 8:00 PM
Work Time: 2 Hours

TouchGFX Demo
Today I began to investigate TouchGFX, a graphics library for STM32 products. My goal is to tie the displayed graphics to the DSP being performed, to help us visualize the effects of the algorithms while we are still getting the codec input/output to work.

I began to play around with TouchGFX to get more familiarized with its functionality. I created a simple application that can change screens on touch input and adjust values using sliders. I quickly learned that the values displayed on the screen would be reset upon switching from one screen to another.

By referencing the TouchGFX documentation (https://support.touchgfx.com/docs/development/ui-development/software-architecture/model-view-presenter-design-pattern), I learned that TouchGFX uses a “Model-View-Presenter” architectural pattern to manage the UI and the data it displays. The “Model” serves as the backend of the system, where data for display is received/sent. The “Presenter” serves as a middleman between the Model and the “View”, where the UI is managed.

mvp_diagram

Keeping this in mind, I investigated the source code for the project, where I could see the necessary functions for these interactions defined. With some trial and error I noticed that you can “bypass” this architecture for simple cases, such as saving the state of the slider, by just using private variables in the ScreenView class and setting them as the slider's value every time the screen is set up.
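As a plain-C++ analogy of this “bypass” (the class and member names here are hypothetical stand-ins, not actual TouchGFX types), the value lives in state that outlives the view object and is re-applied in the screen's setup hook:

```cpp
// Hypothetical stand-in for a generated ScreenView class. The slider's
// value is kept in a static member so it survives the view being torn
// down and recreated on a screen switch, and is re-applied in
// setupScreen() -- the hook TouchGFX calls each time a screen is entered.
class SliderView {
public:
    void setupScreen() { sliderValue = savedValue; }          // restore state
    void onSliderChanged(int v) { savedValue = sliderValue = v; }
    int value() const { return sliderValue; }
private:
    static int savedValue;    // persists across view lifetimes
    int sliderValue = 0;      // what the on-screen slider currently shows
};
int SliderView::savedValue = 50;   // default slider position
```

The Presenter/Model path is still the right place for data the rest of the system needs; this shortcut only makes sense for purely cosmetic UI state like a slider position.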

After defining the necessary functions and variables I was able to run a simple demo of changing screens and saving the state of the slider upon screen switching.



Note: the above video is run on the simulator available in TouchGFX Designer.

When looking through the available components in TouchGFX, I noticed that there was a “dynamic graph” available which can graph data points being added to it. I found that you can set a function to trigger on every tick by using an interaction: https://support.touchgfx.com/docs/development/ui-development/designer-user-guide/interactions-view

By doing this, I was able to simulate a simple wave.



At this point I have become familiar with some basic functionality in TouchGFX, and my next step is to translate this to running on hardware and receiving input from interfaces external to the touchscreen. My goal for the next day is to interface the rotary encoders using Josh’s journal entry as a guide, and then reflect their state on the screen while changing the state of the simulated wave.

FFT Library and Faust Programming Language
I switched my focus to briefly looking more into the DSP. My goal is to implement an FFT of the input signal and display the transform on the screen to visualize the EQ filter effects. I found a tutorial from Phil’s Lab on how to add the FFT to your project: https://www.youtube.com/watch?v=d1KvgOwWvkM. I added the libraries listed in the video, and after fixing some dependency issues I was able to compile my project. I will follow the video further once we can interface with the codec on the STM32 boards.

I also looked at the Faust programming language which Prof. Walter pointed us to. I found an in-browser demo which you can use to try out how Faust works: https://faustdoc.grame.fr/examples/ambisonics/. My only concern was that I was not sure whether this would still count as a PSDR, since this is a tool that’s generating the algorithms for us.


____________________________________________________________________________________________________________________


Week 4


Date Reported: 2/2/2024
Start Time: 9:30 PM
Work Time: 1.5 Hours

TinyUSB Fix
Today I tried again to get TinyUSB to work. I decided to double-check my configurations and that all the macros were properly defined. After I failed to find anything wrong, I decided to take another look through the GitHub discussions, where I eventually found this: https://github.com/hathach/tinyusb/discussions/819

In this discussion the user reported following the same steps as I did when following a previous discussion I read. However, they also decided to copy all the functions and variables from the main.c file included in the example. I attempted to implement this, and after making some adjustments to let the code compile, I flashed the device and was able to get the audio_4_channel_mic example to “work”. The device was successfully recognized like last time; however, now it was properly showing up in my microphone inputs.

audio_4_channel_mic

The STM32 device is the first microphone option.

I also did an input test to see if there would be issues similar to those I saw when using the STM32 library from the previous example project.

audio_4_channel_mic_packets

Although the input from the device was seemingly a flat line, from the packet analyzer you could see that it was inputting some sort of data. Upon inspection of the main.c file, there is a dummy audio buffer which outputs a wave for each channel.

tinyUSB_channels

These waves are output using the audio_task function.

audio_task
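For reference, here is a hedged sketch of what the example's dummy buffer is doing conceptually: filling an interleaved 4-channel, 16-bit PCM buffer with one sine per channel. This is my own illustration, not TinyUSB's actual audio_task code; the frame count and per-channel frequencies are assumptions.

```cpp
#include <cmath>
#include <cstdint>

// Fill an interleaved buffer: frame f holds one 16-bit sample for each
// of the 4 channels in order. Each channel gets its own sine frequency.
constexpr int kChannels = 4;
constexpr int kFrames = 48;
constexpr double kPi = 3.14159265358979323846;

void fill_test_buffer(int16_t* buf, int start_frame, double sample_rate) {
    for (int f = 0; f < kFrames; ++f) {
        for (int ch = 0; ch < kChannels; ++ch) {
            double freq = 440.0 * (ch + 1);   // assumed per-channel tones
            double s = std::sin(2.0 * kPi * freq * (start_frame + f) / sample_rate);
            buf[f * kChannels + ch] = (int16_t)(s * 32767.0);
        }
    }
}
```

The interleaving is the important part: the host de-interleaves the stream back into separate channels, which is why FL Studio can play each channel individually.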

I loaded up my FL Studio software where I could choose to play each channel individually. I recorded the input from the 4th channel which was expected to be a sine wave.



From the audio you can tell there is some noise coming through with the signal. I will look further into this to see if we can get clean sine input to use for DSP testing.


____________________________________________________________________________________________________________________



Date Reported: 1/31/2024
Start Time: 12:30 PM
Work Time: 3 Hours

Team Meeting
Today in man lab we discussed whether we still wanted our project to output audio through USB. I was unsuccessful thus far with the F746, so I suggested Shubo could try to follow the example project I found on the F407. In the meantime, I was going to focus on trying to implement the TinyUSB library, as I’ve seen some posts on the STM32 forums suggesting that this was a better option.

Implementing TinyUSB
TinyUSB: https://github.com/hathach/tinyusb

I tried to follow the tutorial I found on the TinyUSB reference page: https://github.com/hathach/tinyusb/blob/master/docs/reference/getting_started.rst

However, this approach was not working out for me. I did find a discussion on GitHub where another user described their process of integrating TinyUSB into their STM32 project.

Source: https://github.com/hathach/tinyusb/discussions/633

Following these instructions, I was able to add the library and get it to compile. I changed the usb_descriptors.c and tusb_config.h files to match one of the audio examples provided in the TinyUSB library. I flashed the code and my device was detected by the computer; I confirmed that the descriptors were being read correctly.

audio_4_channel_mic_descriptors

However, my device was not showing up in the list of input devices, so I checked the device manager to see what the issue might be. I quickly noticed that there was some sort of error with starting the device. I tried to update the driver, but it did not fix anything.

tinyusb_device_event_log

I tried reading through the discussions on the TinyUSB github to find a solution relating to the error; however, I failed to find something which provided an answer. There were similar issues from other users using the HID class: https://github.com/chegewara/EspTinyUSB/issues/42. But there was no clear fix provided, other than there was an update to the library.

My initial thought was that this might be an issue with the USB port I was using. Following the example project, I had my device configured to operate in full speed mode, but TinyUSB implements Audio Class 2.0, which is designed to support high-speed devices.

I decided to reconfigure my project by initializing the USB HS port and editing the config files to match this. Upon flashing the code, my device was not being detected by the laptop anymore. I stepped through the code and noticed there was a macro named TUD_OPT_RHPORT not being properly defined, which was preventing the device from initializing the USB. After changing this, I flashed the device once again; however, there was still no detection from the laptop.

I tried to run some other audio examples while in full speed mode; however, they did not work either. When I tried to run a MIDI example to test if anything would work at all, it was working as intended, and the device was recognized as a MIDI input with no error. I was stuck on how to proceed at this point.




____________________________________________________________________________________________________________________



Date Reported: 1/30/2024
Start Time: 5:00 PM
Work Time: 1.5 Hours

DSP Filters
Today I spent some time fleshing out the testing environment I created for the DSP algorithms. I added more detail to the graphs and converted the frequency units in the frequency spectrum graph to Hz. I also debugged the lowpass filter function I wrote. I found that I had implemented the filter equation incorrectly and was updating the arrays of past inputs/outputs in the wrong order. I originally updated them in reverse, causing all the values in the array to be set to the value stored in the first index; this is what was causing the filtered signal to be a flat line. Upon fixing this, I ran the lowpass filter test, where the goal was to filter out the 900 Hz signal while retaining the 200 Hz signal.

dsp_test_lowpassa

I ran the noisy signal through the filter with a cutoff at 300 Hz. The plots show a significant decrease in the amplitude of the 900 Hz signal, whereas there's a slight increase in the 200 Hz signal. When comparing the audio of the noisy vs. filtered signal, you can still hear some of the content of the 900 Hz sine.





The test shows that the IIR function is working, but there is room for improvement. I quickly implemented and tested the high-pass and peaking filters, which use the same IIR equation but require different coefficients, whose formulas I referenced from the Audio EQ Cookbook.
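A sketch of the lowpass case, assuming a Direct Form I biquad with the lowpass coefficients from the Audio EQ Cookbook (the struct and variable names are mine, not the project's exact code). Note the state update: the past-sample variables are shifted oldest-first after computing the output, which is the ordering bug I had to fix:

```cpp
#include <cmath>

constexpr double kPi = 3.14159265358979323846;

// Direct Form I biquad lowpass with RBJ Audio EQ Cookbook coefficients,
// normalized so a0 = 1.
struct BiquadLP {
    double b0, b1, b2, a1, a2;               // filter coefficients
    double x1 = 0, x2 = 0, y1 = 0, y2 = 0;   // past inputs/outputs

    BiquadLP(double fc, double fs, double Q = 0.7071) {
        double w0 = 2.0 * kPi * fc / fs;
        double alpha = std::sin(w0) / (2.0 * Q);
        double cw = std::cos(w0);
        double a0 = 1.0 + alpha;
        b0 = (1.0 - cw) / 2.0 / a0;
        b1 = (1.0 - cw) / a0;
        b2 = b0;
        a1 = -2.0 * cw / a0;
        a2 = (1.0 - alpha) / a0;
    }

    double process(double x) {
        double y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2;
        x2 = x1; x1 = x;   // shift oldest out first, then store the newest
        y2 = y1; y1 = y;
        return y;
    }
};
```

With fc = 300 Hz at a 48 kHz sample rate, a 900 Hz sine comes out roughly 19 dB down while a 200 Hz sine passes largely unchanged, in line with the plots above.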


____________________________________________________________________________________________________________________



Date Reported: 1/29/2024
Start Time: 1:30 PM
Work Time: 3 Hours

USB Input
Last week I investigated the USB libraries provided by STM32. The generated code was tailored towards USB speakers; when I flashed the code onto the board and connected it to my laptop, the device was recognized as a speaker output device.

stm_speakera

However, for our purposes we needed to implement a USB input device. I was able to find a project done on an STM32F407 which sampled audio from the ADC and sent the data over USB to a computer: https://hackaday.io/project/181868-stm32f4-usb-microphone

I followed the steps inside of the project and learned more about how to configure the USB descriptors to allow the device to be recognized as a USB input. Upon flashing the code, the device was recognized as an input, and I used the Thesycon USB Descriptor Dumper to confirm that the descriptors were correct.

stm_speakera

These are the input devices recognized by my laptop. The STM32F746G-Discovery is the last one in the list.

tdd_stm_speakera

The descriptors read from the STM32. They are configured for a one channel microphone.

To test out how the device behaves when asked for input from the host, I set up Audacity along with Free Device Monitoring Studio to see the packets being sent from the device. As a reference, I first tested out input from my Blue Snowball microphone to see how it behaved, and then I ran the STM32.

blue_snowball_packeta

Side by side view of the Blue Snowball input into Audacity (right), and the packets being sent to/from the host (left). You can see that there are some initial packets being sent to select the interface before the microphone transmits isochronous transfers. The red text is the data being sent from the microphone.

stm_mic_packeta

I did the same for the STM32; however, I noticed that there was no data being sent. On the packet view, you can see that the isochronous transfers are sending packets of length 0. There is also no URB_FUNCTION_CLASS_ENDPOINT packet being sent, which could indicate some issue with the input endpoint. I spent a lot of time stepping through the code in CubeIDE to identify where the issue might be. I verified that the USB setup didn’t return any errors and that the input endpoint was set up correctly. I did find that the USBD_AUDIO_DataIn function was never being called from the interrupt handler. I tried searching online to see if anyone else had experienced this issue, but I was not able to find a clear answer. I did find someone recommending the TinyUSB library as an alternative.


____________________________________________________________________________________________________________________


Week 3


Date Reported: 1/25/2024
Start Time: 5:30 PM
Work Time: 4 Hours

DSP Library
Today I implemented a lowpass filter function in C, referencing the formulas from the audio cookbook I found a couple days earlier. I also set up a Jupyter notebook for testing the algorithms by plotting the wave/frequency response of the signal before and after filtering and listening to the audio to hear the filtering effect. Right now, a basic test case I am running is passing in a signal composed of two sine waves at different frequencies, and then applying the lowpass filter to isolate one of them. Although the testing environment is mostly ready and functional, I was getting unexpected results from my C function after passing the signal through.

dsp_testing_1


____________________________________________________________________________________________________________________


Date Reported: 1/24/2024
Start Time: 12:30 PM
Work Time: 2 Hours

Team Meeting
After our meeting with the course staff and some discussion, we decided to change some of the features of our project. We will try to implement our project within one microcontroller. Liam wanted to try to implement an external codec to learn how to interface with one, since our prototype will require it. I decided to switch my focus to building the DSP library and trying to figure out sending USB data to the host.

After an initial look at the USB device library, I was able to find functions used for transmitting data as a USB device.

usb_device

However, most, if not all, of these functions don’t do anything besides sending a USB device status. I will have to look more closely at the code implemented in the example project I found to learn how to correctly send audio data to the host computer.


____________________________________________________________________________________________________________________


Date Reported: 1/23/2024
Start Time: 11:00 AM
Work Time: 5 Hours

Difficulties with USB Microphone
We decided that USB microphones would be too time-consuming to implement in our project; therefore, we decided to switch to analog microphones instead.

USB Audio Visualizer
I was looking for similar projects on GitHub to see how they interfaced with the USB host and audio jack. I was able to find the ‘USB Audio Visualizer’ project which essentially displays a waveform of incoming audio from the laptop and allows the user to modify the volume of the output through the audio jack.

usb_audio_visualizer

Source: https://github.com/qqq89513/stm32f7-usb-audio-visualizer

I looked deeper into the project and saw they used the ‘BSP’ driver, which was missing from the code generated by STM32CubeIDE. I downloaded the library and added it to an empty project. The library gives us access to the codec on the board, which will allow us to implement audio input/output from the audio jacks. I began working on setting up the necessary peripherals for the microphone.

BSP_audio

Library for interfacing with the codec.


____________________________________________________________________________________________________________________


Date Reported: 1/22/2024
Start Time: 2:30 PM
Work Time: 2 Hours

Team Meeting
During the meeting, I discussed my efforts to initiate the connection for the microphone. I set the PJ12 pin as a GPIO output and configured it as the ‘Drive_FS_VBUS’ pin. Once I flashed the firmware, I noticed that the red LED on my microphone turned on, and the green LED LD5 on the board was also on. However, Liam found that the USB host library for STM32 has an incomplete implementation of USB microphone communication. He found a thread online where a user was able to fix this. I implemented the code from the user; however, there was other missing code it depended on in order to work. I will try to look deeper into this, perhaps coming up with our own solution; however, it is difficult for me to understand how to implement it.


____________________________________________________________________________________________________________________


Date Reported: 1/21/2024
Start Time: 7:00 PM
Work Time: 3.5 Hours

Initial USB Microphone Host Setup
I came into the lab today with the goal of connecting my Blue Snowball microphone to our prototype board and configuring the board as a host for audio class devices. I started configuring the board's settings from scratch, so I referenced the STM32F746 user manual and found this diagram:

usb_connector

When I tried to implement this setup, I noticed that LD5, the LED which is supposed to light up green when the board is in host mode, was not lighting up and the microphone was not receiving any power. I then decided to reference the CAD schematics from the same website as the manual: https://www.st.com/en/evaluation-tools/32f746gdiscovery.html#cad-resources

f7CAD_1

When referencing the CAD schematic, I noticed that the OTG_FS_VBUS signal was no longer supported on pin PA9.

f7CAD_2

From the above diagram I found that the OTG_FS_VBUS signal is connected to pin PJ12. At this point I was no longer in the lab and will test this out the next day.


____________________________________________________________________________________________________________________


Week 2


Date Reported: 1/19/2024
Start Time: 1:00 PM
Work Time: 3 Hours

Team Meeting
We had a team meeting where we worked on our functional specification. I began to draw out a block diagram of our device and began to review concepts such as FFTs and the Z-Transform which are relevant principles to bring up in our “Theory of Operation” section. During the meeting I brought up the topic of how we will handle the state of the FX after the device is shut off. We decided a stretch goal would be to save the current parameters for the FX, and they would be restored on startup; however, for now we will have a default state which the FX will start off with on startup.

Fruity Parametric EQ 2
As a hobbyist in music production, I have used Image-Line's EQ plugin “Fruity Parametric EQ 2”. I decided to look through their documentation to get an idea of what type of filters they implemented in the plugin.

Source: https://www.image-line.com/fl-studio-learning/fl-studio-online-manual/html/plugins/Fruity%20Parametric%20EQ%202.htm

All I could find out about the filters was that there are two modes available: Standard (IIR) and Linear Phase (FFT). The Linear Phase mode preserves the signal phase and allows for quickly changing parameters, which isn’t possible with conventional FFT filters. Also, there is a detailed description of all the parameters involved in the plugin, which might help frame what other parameters we might like to implement in our own device. I realized we had not planned to implement some kind of input from the user which would determine the type of filter being applied, or its slope.

Fruity Parametric EQ 2

At the end of the document, I was able to find the credits to the “Audio EQ Cookbook” by Robert Bristow-Johnson.

Audio EQ Cookbook
Source: https://webaudio.github.io/Audio-EQ-Cookbook/Audio-EQ-Cookbook.txt

This document details helpful formulas for different types of filters, and which user-defined parameters to use when creating a digital EQ. This will be of great use when writing the software for the equalizer.

Intro to USB Hardware and PCB Guidelines
I stumbled upon this document which introduces guidelines to follow when designing PCBs to ensure compliance with USB standards. I passed this off to the team so that we can look over it for reference later down the line.

Source: https://www.st.com/resource/en/application_note/dm00296349-usb-hardware-and-pcb-guidelines-using-stm32-mcus-stmicroelectronics.pdf


____________________________________________________________________________________________________________________


Date Reported: 1/18/2024
Start Time: 12:00 PM
Work Time: 1 Hour

STM32 USB Library and Wiki
I began setting up the environment for our devboard. The IDE we will be using is the STM32CubeIDE. I believe this will be ideal as it allows us to quickly set up our environment depending on the MCU/board we are using, and it also offers a helpful UI which allows us to configure peripherals and other parameters such as clocks.

I had some trouble adding the X-CUBE-USB-AUDIO library to the STM32CubeIDE, so I decided to investigate the standard package for STM32F7 devices. It turns out this package already includes libraries for both USB hosting and acting as a USB device. Liam was able to find a manual for this library.

Source: https://www.st.com/resource/en/user_manual/um1720-stm32cube-usb-host-library-stmicroelectronics.pdf

I looked more into the STM32 wiki for a guide into using USB with STM32. It contained information about general USB concepts and an overview of the contents of the USB library.

Source: https://wiki.st.com/stm32mcu/wiki/Introduction_to_USB_with_STM32


____________________________________________________________________________________________________________________


Date Reported: 1/16/2024
Start Time: 12:00 PM
Work Time: 3 Hours

Intro to USB Protocol
I spent the day learning about the USB communication protocol.

USB Protocol consists of transactions being sent between the host and client devices. These transactions are made up of packets which initiate the communication, send data, and end the transaction. To understand how these packets are sent, it is important to understand the wires inside of the cable.

USB Cable Diagram

Source: https://www.usb.org/document-library/usb-20-specification - Universal Serial Bus Specification

The VBUS and GND wires are intuitive: they carry power to devices. The D+ and D- wires are a signaling pair used to transmit packets. They can have 4 different states:

(The following diagrams will be used from an introductory video I found: https://youtu.be/HbQ6q3skZgw)

usb_states

When data is transmitted along these lines, it must follow an NRZI encoding scheme, meaning that data is encoded in the transitions between states:

usb_transmission

From the above diagram, you can see that a transition from state J to K (or vice versa) is encoded as 0, whereas no change in state is encoded as 1. An important concept to grasp is bit stuffing: inserting a 0 after every six consecutive 1’s. This is done to keep the clocks of the devices synchronized. This is important to consider when implementing a USB host device, like our interface, since the stuffed bit does not represent any data and the host must discard it to read the data correctly.
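The encoding rules above can be captured in a short sketch (my own illustration: 'J'/'K' represent the two line states, the line is assumed to idle at J, and the input is a string of '0'/'1' data bits):

```cpp
#include <string>

// USB-style NRZI encoder with bit stuffing: a 0 bit toggles the line
// state, a 1 bit leaves it unchanged, and after six consecutive 1s a
// stuffed 0 is inserted to force a transition for clock recovery.
std::string nrzi_encode(const std::string& bits) {
    std::string line;
    char state = 'J';   // assumed idle state
    int ones = 0;
    auto emit = [&](char bit) {
        if (bit == '0') state = (state == 'J') ? 'K' : 'J';  // 0 -> toggle
        line.push_back(state);                               // 1 -> hold
    };
    for (char b : bits) {
        emit(b);
        if (b == '1') {
            if (++ones == 6) { emit('0'); ones = 0; }  // bit stuffing
        } else {
            ones = 0;
        }
    }
    return line;
}
```

Encoding seven consecutive 1s yields eight line states, since a stuffed 0 (and its forced transition) is inserted after the sixth 1; a decoder on the receiving side has to detect and discard that extra bit.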

In the USB protocol, packets have the following structure:

usb_packet_structure

All packets must use the same SYNC pattern as shown above. The PID is used to identify the kind of packet being transmitted. Data is optional depending on the packet. The EOP (End of Packet) is intuitive.

There are different types of packets that can be sent over:
  • Token Packets (host only)
  • Data Packets
  • Handshake Packets
Transactions are made up of a series of packets. The following order defines a transaction:

usb_transaction

Groups of transactions are called transfers. There are multiple types:
  • Control
  • Isochronous
  • Interrupt
  • Bulk
Isochronous transfers are particularly relevant to us since they are used for continuous data such as audio. Control transfers are used for configuration so that the host can recognize the device being plugged in, which I suspect will be important in our implementation of the interface as a host.

Transfers can be identified by endpoints, which are destinations/origins for data. These endpoints must be configured to identify their transfers. Endpoint 0 must always be configured for control transfers.

Brief Intro to USB Audio Class
During our prior man lab, we discussed how the computer will be able to recognize USB devices. I did some very brief research and found that USB devices can follow a set of protocols belonging to a class. If the device follows the rules for the class, the drivers on the host will be able to recognize it as a USB audio device. I was under the impression that we might need to implement our own driver to recognize the attached microphone as a USB audio device.

USB Audio Library
I started looking into libraries that can help us implement USB communication between devices. I found the X-CUBE-USB-AUDIO expansion for STM32, which is a library that allows for the development of USB audio applications on STM32 devices.

Source: https://www.st.com/en/embedded-software/x-cube-usb-audio.html#overview

Initial Parametric EQ Sketch
I also did some brief sketches for the elements we would find on the UI of the parametric EQ. This helped me determine that it would be convenient to have at least three rotary encoders acting as inputs for the frequency, bandwidth, and gain for each band, along with some buttons to cycle through which band is being currently affected.

parametric_eq_sketch

Team Task Manager
I set up a Notion page for our team to use to track progress of what needs to be done and which assignments are due.


____________________________________________________________________________________________________________________


Week 1


Date Reported: 1/12/2024
Start Time: 12:00 PM
Work Time: 3 Hours

EQ Implementation
I was searching around for tips on how to implement an equalizer. I was able to find a thread on DSP Stack Exchange which described the implementation of an EQ as a summation of filters. To implement this as a parametric EQ, we would need to allow user input to specify the parameters controlling the behavior of the filter.

The answerer in the thread recommended making use of IIR filters over FIR filters in audio/music applications, because IIR filters take up less CPU time than FIR filters.

Source: https://dsp.stackexchange.com/questions/24017/which-filter-for-an-audio-equalizer

Filters
An immensely helpful resource I found was a DSP guide by Prof. Steven W. Smith. It provides an exhaustive guide to DSP: https://www.dspguide.com/pdfbook.htm

In his book, Smith discusses IIR filters. These filters perform worse than FIR filters; however, they are quicker to compute, making them ideal for use in a system that operates in real time such as ours. The following is a simple formula for an IIR filter provided in the book:

IIR_equation

Smith points out that about 12 coefficients is the max number that can be used before causing the filter to become unstable. This is something to consider later when designing our algorithms.
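As a minimal example of why the recursive form is so cheap: in Smith's notation, a single-pole lowpass is y[n] = a0·x[n] + b1·y[n-1] with a0 = 1 − d and b1 = d, which costs two multiplies and one stored output per sample (the function below is my own illustration, not code from the book):

```cpp
// Single-pole recursive lowpass: y[n] = (1 - d) * x[n] + d * y[n-1].
// d (0 <= d < 1) sets how much smoothing is applied; larger d means a
// lower cutoff. y_prev holds the single piece of filter state.
double single_pole_lp(double x, double* y_prev, double d) {
    double y = (1.0 - d) * x + d * (*y_prev);
    *y_prev = y;
    return y;
}
```

Feeding a constant input, the output converges exponentially toward that constant, which is the step response you'd expect from a lowpass.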

Programming Language and Tips
Smith also discusses the choice of programming language when writing DSP algorithms, pointing out C and C++ as the most used for DSP, while noting that assembly languages are also effective when there is a need for high efficiency. My current thinking is to implement most of the DSP in C++, though we may need inline assembly to achieve more efficiency.

Smith also lists some tips to use when programming:
  • Use integer variables whenever possible – they are processed faster than floating point variables.
  • Avoid using functions like sin, cos, exp, etc. – these are computed internally as long series of additions and multiplications, which is extremely slow.
  • An alternative is to have precomputed values stored beforehand in a LUT.
I like the idea of including a LUT to increase the performance of our algorithms; however, the practicality of this will be clearer once the initial prototype is finished.
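The LUT idea can be sketched like this (the table size of 1024 and the function names are my assumptions; a bigger table trades memory for accuracy, and interpolating between entries would improve it further):

```cpp
#include <cmath>

constexpr int kLutSize = 1024;   // must be a power of two for the mask trick
constexpr double kTwoPi = 6.28318530717958647692;
static double sine_lut[kLutSize];

// Precompute one full cycle of sine at startup.
void init_sine_lut() {
    for (int i = 0; i < kLutSize; ++i)
        sine_lut[i] = std::sin(kTwoPi * i / kLutSize);
}

// phase in [0, 1) covers one cycle; the bitmask wraps the index cheaply
// because the table size is a power of two.
double lut_sin(double phase) {
    int idx = (int)(phase * kLutSize) & (kLutSize - 1);
    return sine_lut[idx];
}
```

At runtime, each sine evaluation is reduced to a multiply, a mask, and an array read, which is exactly the trade Smith recommends for real-time work.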

Team Meeting
We worked on a draft of the project's structure and the final proposal. After the meeting I set up a Discord server which we will use as our primary form of communication and storing useful links/images.