* Team Name: Dream Team
* Team Members: Abhik Kumar, Shreyaans Singh, Stanley
* Github Repository URL: https://github.com/upenn-embedded/final-project-dream-team
* Github Pages Website URL: https://upenn-embedded.github.io/final-project-dream-team/
* Description of hardware:
1. ATmega328PB Xplained Mini - 4
2. XBee S2C (Zigbee) - 2
3. ADXL335 IMU - 2
4. NeoPixel ring - 3
5. Bit-addressable LED strip - 1
The video demonstrates the full functionality of our project, showcasing the integration of the sensor gloves and their interaction with the rest of the system. Below are the key features and functionalities presented:
We can see both gloves, each equipped with an ATmega328PB microcontroller on the palm surface, along with an XBee S2C (Zigbee) transceiver, a NeoPixel ring, and an ADXL335 accelerometer on the knuckle side of the glove.
The stage is large enough to accommodate a MacBook Pro. Furthermore, the stage features an LED strip to make it more colorful and visually appealing. Inside the stage, we have two ATmega microcontrollers: one acting as the receiver node and the other communicating with the receiver via GPIO pins to send MIDI signals to the laptop.
The full project being used:
The image above illustrates how the user interacts with the project, while the video demonstrates how the user can control the music through hand movements.
SRS 01 – We successfully interfaced a 3-axis IMU that provides analog data to the ADC, sampling it at 200 ms intervals. We used the ADC together with interrupts to keep this sampling rate viable. Further, we registered only meaningful changes in the IMU sensor data, and only those changes were sent wirelessly.
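A minimal sketch of this sampling scheme, assuming the ADXL335's X/Y/Z outputs sit on ADC channels 0–2 and a Timer1 compare-match interrupt paces conversions every 200 ms (the pin and timer choices here are illustrative, not necessarily our exact firmware):

```c
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint8_t sample_due = 0;

// Timer1 compare-match fires every 200 ms: 16 MHz / 256 / 12500 = 5 Hz.
ISR(TIMER1_COMPA_vect) { sample_due = 1; }

static void sampling_init(void)
{
    ADMUX  = (1 << REFS0);               // AVcc reference, start at channel ADC0
    ADCSRA = (1 << ADEN) | (7 << ADPS0); // enable ADC, /128 clock prescaler
    TCCR1B = (1 << WGM12) | (1 << CS12); // CTC mode, /256 timer prescaler
    OCR1A  = 12499;                      // 200 ms period
    TIMSK1 = (1 << OCIE1A);              // enable the compare-match interrupt
    sei();
}

static uint16_t adc_read(uint8_t channel)
{
    ADMUX = (ADMUX & 0xF0) | (channel & 0x0F); // select ADC0..ADC2 (X, Y, Z)
    ADCSRA |= (1 << ADSC);                     // start a conversion
    while (ADCSRA & (1 << ADSC));              // block until complete
    return ADC;
}

// Main-loop idea: when sample_due is set by the ISR, read the three
// channels, clear the flag, and run the state-mapping logic.
```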
SRS 02 – We developed firmware to support wireless communication using the XBee S2C (Zigbee) module, which operates over UART to transmit the states mapped from IMU data in real time. We pivoted from a Bluetooth module to this Zigbee module. We implemented queues and state-change functions so that only useful changes in state are relayed over the wireless link. Because the ADXL335 is a 3-axis sensor, the queue lets us capture changes on all three axes when they occur and relay them one by one, keeping the user experience and functionality practical.
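A minimal sketch of the change-queue idea (a simple ring buffer; all names are illustrative). Events are enqueued only when an axis's state actually changes, so the radio carries state transitions rather than a continuous sensor stream:

```c
#include <stdint.h>

typedef struct { uint8_t axis; int8_t state; } axis_event_t;

#define QUEUE_LEN 16
static axis_event_t q_buf[QUEUE_LEN];
static uint8_t q_head = 0, q_tail = 0;

// Push one event; returns 0 and drops it if the queue is full.
// (Not interrupt-safe as written; guard with cli()/sei() if pushed from an ISR.)
static uint8_t queue_push(uint8_t axis, int8_t state)
{
    uint8_t next = (uint8_t)((q_head + 1) % QUEUE_LEN);
    if (next == q_tail) return 0;
    q_buf[q_head].axis  = axis;
    q_buf[q_head].state = state;
    q_head = next;
    return 1;
}

static uint8_t queue_pop(axis_event_t *out)
{
    if (q_head == q_tail) return 0;   // empty
    *out = q_buf[q_tail];
    q_tail = (uint8_t)((q_tail + 1) % QUEUE_LEN);
    return 1;
}

// Enqueue only genuine transitions; the transmit loop pops and sends
// one event per radio frame so all three axes get through in order.
static int8_t last_state[3];
static void report_axis(uint8_t axis, int8_t state)
{
    if (state != last_state[axis]) {
        last_state[axis] = state;
        queue_push(axis, state);
    }
}
```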
SRS 03 – The firmware we developed can send MIDI signals to any software compatible with the MIDI protocol. Additionally, we implemented a Python bridge that accepts MIDI control signals over UART from an ATmega microcontroller. In our final implementation, we used GarageBand instead of the Mixxx DJ software. We verified that these states were transmitted correctly across both ATmegas using logic analyzers, the serial monitor, and oscilloscopes.
SRS 04 – We successfully deployed firmware capable of changing audio modalities in GarageBand based on hand orientation and gestures. We confirmed the changes were happening as intended through GarageBand's GUI.
SRS 05 – We interfaced four LCD screens over SPI that displayed real-time changes in the audio modalities occurring in the music.
SRS 06 – Using our firmware, we successfully utilized the Neopixel LED ring on the glove to change colors according to the hand’s gestures and orientation.
We were not able to leverage SPI communication to interface an nRF24L01 transceiver with the ATmega328PB for our wireless communication.
Further, we did not end up using the ADXL345 IMU, which communicates over I2C.
HRS 01 – We used four ATmega328PB microcontrollers in total: one on each glove and two inside the stage.
HRS 02 – We utilized an ADXL335 sensor to detect gesture and orientation changes, leveraging ADC and interrupts for accurate data acquisition.
HRS 03 – We successfully designed a circuit to run four LCD screens over a single SPI line and arranged them in an orientation to effectively create a large screen effect.
HRS 04 – We used the laptop’s speakers to play music and demonstrate the changes in audio modalities.
HRS 05 – We fully controlled the color sequencing, brightness, and intensity of the LED strip and the Neopixel rings by utilizing timers.
HRS 06 – We modified this hardware requirement by using an XBee S2C (Zigbee) module to establish real-time wireless communication instead of the HC-05 Bluetooth module.
The USB chip (ATmega32U4) on the Xplained Mini board cannot be reprogrammed, so we were not able to use our ATmega to communicate directly with GarageBand and had to implement a Python bridge. (Flashing new firmware onto the ATmega32U4 would remove the programming and debugging capabilities of the mEDBG; if its EEPROM is altered, the mEDBG is no longer recognized by Studio, leaving the Xplained Mini board unusable.)
Further, we were unable to use USART1 via the PB3 and PB4 pins on the ATmega328PB, so we created a GPIO pin bridge to communicate between two ATmegas instead of using only one at the receiver node.
This project was an enriching experience that allowed us to blend embedded systems, wireless communication, and real-time control into a functional and creative system aka “DJ Gloves”. We are proud of successfully integrating multiple hardware components, such as the ADXL335 sensor, Neopixel LED rings, and LCD screens, to create an interactive platform that dynamically controlled music. The decision to pivot from Bluetooth to Zigbee for wireless communication significantly improved reliability, showcasing our adaptability in problem-solving.
One of our major accomplishments was designing a robust pipeline for transmitting real-time sensor data and mapping it to states that controlled audio modalities in GarageBand. However, challenges, such as the inability to program the USB chip on the XPLAINed board and limitations in utilizing USART1 on the ATmega328PB, required us to implement innovative workarounds, including a Python bridge.
This project reinforced the importance of planning for hardware constraints and adaptability. Future steps could involve enhancing modularity, supporting more music platforms, and optimizing sensor precision.
All of the code is uploaded to GitHub and linked here:
1) Submit your GitHub URL: https://github.com/upenn-embedded/final-project-dream-team
2) Show a system block diagram & explain the hardware implementation:
The above block diagram can be best explained in 4 stages of how the proposed embedded system works:
All of these are interfaced on the same ATmega328PB: ZigBee -UART-> ATmega328PB; IMU (ADXL335) -ADC-> ATmega328PB; NeoPixel -GPIO (PWM)-> ATmega328PB.
Microcontroller - ATmega328PB: a. Here the data sent via Zigbee (802.15.4) is mapped to finite states. This is done to decouple, or abstract, the input from the output/actuation. b. These mapped states are sent to another ATmega328PB over a GPIO bridge. The second ATmega328PB maps these finite states through a LUT of MIDI control mnemonics that let us change the modalities of the music (see the sketch after the output stage).
Laptop (music mixing and MIDI tools): a. A Python MIDI bridge takes the MIDI mnemonics over UART from the ATmega328PB and communicates with GarageBand to modulate the music. b. Speakers: the music is output from the laptop's built-in speakers.
The output stage (not yet implemented): a. LED strips: the light gradient and intensity change with the intensity and tempo of the song; present on the prop stage and both gloves. b. LCD screen: an LCD screen with a custom graphics library (similar to an old-school Windows Media Player visualization) that also changes according to the input state, implemented with circuitry that lets us mirror four LCDs over the same SPI line.
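To make the LUT idea concrete, here is a hedged sketch of the state-to-MIDI mapping on the second ATmega328PB. A MIDI Control Change message is three bytes: a status byte (0xB0 | channel), a controller number, and a 7-bit value. CC 7 is the standard channel-volume controller; the other controller numbers and the six-state layout are illustrative placeholders, and `UART_MIDI_send` stands in for whatever UART routine feeds the Python bridge:

```c
#include <stdint.h>

typedef struct { uint8_t cc; int8_t delta; } midi_action_t;

// One entry per finite state: which controller to move, and in which direction.
static const midi_action_t midi_lut[6] = {
    { 7,  +8 },  // X+ : volume up   (CC 7 is the standard channel volume)
    { 7,  -8 },  // X- : volume down
    { 74, +8 },  // Y+ : filter/brightness up (CC 74, illustrative)
    { 74, -8 },  // Y- : filter/brightness down
    { 1,  +8 },  // Z+ : modulation up        (CC 1, illustrative)
    { 1,  -8 },  // Z- : modulation down
};

static uint8_t cc_value[128];              // last value sent per controller

extern void UART_MIDI_send(uint8_t byte);  // assumed UART transmit routine

// Look up the state, clamp to the 7-bit MIDI range, emit the 3-byte message.
static void apply_state(uint8_t state)
{
    const midi_action_t *a = &midi_lut[state];
    int16_t v = (int16_t)cc_value[a->cc] + a->delta;
    if (v < 0)   v = 0;
    if (v > 127) v = 127;
    cc_value[a->cc] = (uint8_t)v;
    UART_MIDI_send(0xB0);                  // Control Change on MIDI channel 1
    UART_MIDI_send(a->cc);
    UART_MIDI_send(cc_value[a->cc]);
}
```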
GLOVE: Provides an easy-to-use, lightweight wearable that enables wireless input based on the movement and orientation of the user's hand, making it a more generalized and intuitive input method.
ADXL335 IMU (Sensor): This is a 3-axis accelerometer (IMU) used to detect hand motion and orientation. The ADXL335 outputs analog voltages for each axis, which the ATmega328PB samples through its ADC. The captured acceleration and orientation data help identify the gestures performed by the user.
Zigbee Module: The Zigbee XBee S2C module transmits the IMU data wirelessly to the central microcontroller (ATmega328PB). This module lets the glove communicate in real time with the main system without the need for physical connections. Each glove has its own Zigbee module, allowing two-way communication between the gloves and the central system.
NeoPixel Ring on the Glove: NeoPixel rings are mounted on each glove to provide visual feedback. The LEDs light up based on user actions, adding a visual element that syncs with the music. The LED control is driven by the motion data collected by the IMU, creating an interactive experience as the LEDs respond to gestures.
ATmega328PB Microcontroller: The ATmega328PB serves as the main processing unit across the system's sub-systems: the gloves, MIDI control, the Zigbee hub, and the stage output. It receives data from the Zigbee modules in each glove, interprets the motion data, and translates it into commands for the DJ software and connected devices. The microcontroller handles timers, PWM (pulse-width modulation), interrupts, and duty-cycle control for the connected components.
DJ Software Interface: The DJ software, running on a computer, receives data from the ATmega328PB over UART to adjust music parameters such as tempo, volume, and pitch. This data originates from the gloves and is processed by the microcontroller to control the audio output, providing an interactive DJ experience.
3) Explain your firmware implementation, including application logic and critical drivers you’ve written.
It first takes 100 accelerometer readings with the peripheral at rest to determine the baseline XYZ values. These raw accelerometer values are then converted to g-force values. Each new reading is compared to the baseline: if the current values are within a threshold of +/- 0.025 g in each of the X, Y, and Z directions, they map to a state of 0 (no change); if a value exceeds the +/- 0.025 g threshold, that axis maps to -1 or +1 depending on whether it decreased or increased past the threshold. This creates six different states (-1 or +1 for each of X, Y, and Z) that are mapped in GarageBand to control the volume, pitch, and tempo of the output.
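A condensed sketch of that logic (the 100-sample baseline and +/- 0.025 g threshold match the description above; `read_axis_g` is a placeholder for the ADC read plus raw-to-g conversion):

```c
#include <stdint.h>

#define BASELINE_SAMPLES 100
#define THRESHOLD_G      0.025f

static float baseline[3];                // resting-state X, Y, Z in g

extern float read_axis_g(uint8_t axis);  // assumed: ADC reading converted to g

// Average 100 resting readings per axis to establish the baseline.
static void calibrate(void)
{
    for (uint8_t axis = 0; axis < 3; axis++) {
        float sum = 0;
        for (uint8_t i = 0; i < BASELINE_SAMPLES; i++)
            sum += read_axis_g(axis);
        baseline[axis] = sum / BASELINE_SAMPLES;
    }
}

// Map the current reading to -1 / 0 / +1 relative to the baseline.
static int8_t axis_state(uint8_t axis)
{
    float delta = read_axis_g(axis) - baseline[axis];
    if (delta >  THRESHOLD_G) return  1;
    if (delta < -THRESHOLD_G) return -1;
    return 0;
}
```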
The NeoPixel WS2812 LED ring is driven by a digital pin on the ATmega328PB and changes color based on the ADXL335 XYZ state changes.
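As an illustration of the gesture-to-color mapping (assuming some WS2812 driver function `ws2812_fill(r, g, b)` exists; the timing-critical bit-bang driver itself is omitted here):

```c
#include <stdint.h>

extern void ws2812_fill(uint8_t r, uint8_t g, uint8_t b);  // assumed driver

// Pick the ring color from the three axis states (-1, 0, +1).
static void update_ring(int8_t x, int8_t y, int8_t z)
{
    uint8_t r = (x > 0) ? 255 : 0;   // +X movement lights red
    uint8_t g = (y > 0) ? 255 : 0;   // +Y movement lights green
    uint8_t b = (z > 0) ? 255 : 0;   // +Z movement lights blue
    if (x == 0 && y == 0 && z == 0)
        r = g = b = 32;              // dim white at rest
    ws2812_fill(r, g, b);
}
```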
First, we configured the Zigbee modules using the XCTU software to give them the same PAN ID (3332 in our case). Both modules were configured for 8 data bits, no parity, and 2 stop bits.
The sender node (serial number and destination address config):
The destination address (DL, DH) is set to match the serial address of the receiver.
The receiver hub (serial number and destination address config):
The destination address (DL, DH) is set to match the serial address of the sender.
After configuring the ZigBees, we wrote a library using the USART0 (UART) peripheral to send data wirelessly from one ATmega328PB to another.
zigbee.h:
```c
#ifndef ZIGBEE_H
#define ZIGBEE_H

#include <avr/io.h>

void UART_Zigbee_init(int prescale);
void UART_Zigbee_send(unsigned char data);
void UART_Zigbee_putstring(char *StringPtr);
unsigned char UART_Zigbee_receive(void);

#endif /* ZIGBEE_H */
```
zigbee.c:
```c
#include <avr/io.h>
#include "zigbee.h"

void UART_Zigbee_init(int prescale)
{
    UBRR0H = (unsigned char)(prescale >> 8);
    UBRR0L = (unsigned char)prescale;
    UCSR0B = (1 << RXEN0) | (1 << TXEN0);   // Enable RX and TX
    UCSR0C = (1 << UCSZ01) | (1 << UCSZ00); // 8-bit character size
}

void UART_Zigbee_send(unsigned char data)
{
    while (!(UCSR0A & (1 << UDRE0))); // Wait for empty transmit buffer
    UDR0 = data;                      // Send data
}

void UART_Zigbee_putstring(char *StringPtr)
{
    while (*StringPtr)
    {
        UART_Zigbee_send(*StringPtr++);
    }
}

unsigned char UART_Zigbee_receive(void)
{
    while (!(UCSR0A & (1 << RXC0))); // Wait for data
    return UDR0;
}
```
Sender main.c:
```c
#define F_CPU 16000000UL
#define BAUD 9600
#define BAUD_PRESCALER ((F_CPU / (16UL * BAUD)) - 1)

#include <util/delay.h>
#include <avr/io.h>
#include <stdbool.h>
#include <stdio.h>
#include "UART.h"
#include "zigbee.h"

bool flag = true;

int main(void)
{
    UART_Zigbee_init(BAUD_PRESCALER); // Initialize USART0 for Zigbee
    UART_Debug_init(BAUD_PRESCALER);  // Initialize the debug UART

    unsigned char data1 = 'A'; // Example byte to send
    unsigned char data2 = 'B'; // Example byte to send

    while (1)
    {
        // Alternate between the two bytes, sending one every 200 ms via Zigbee
        if (flag)
        {
            UART_Zigbee_send(data1);
            flag = false;
        }
        else
        {
            UART_Zigbee_send(data2);
            flag = true;
        }
        _delay_ms(200);
    }
    return 0;
}
```
Receiver main.c:
```c
#define F_CPU 16000000UL
#define BAUD 9600
#define BAUD_PRESCALER ((F_CPU / (16UL * BAUD)) - 1)

#include <util/delay.h>
#include <avr/io.h>
#include <stdbool.h>
#include <stdio.h>
#include "UART.h"
#include "zigbee.h"

#define LED PC3

int main(void)
{
    UART_Zigbee_init(BAUD_PRESCALER); // Initialize USART0 for Zigbee
    UART_Debug_init(BAUD_PRESCALER);  // Initialize the debug UART

    uint8_t received_data;
    DDRC |= (1 << LED);    // LED pin as output
    PORTC &= ~(1 << LED);  // start with the LED off

    while (1)
    {
        // Receive one byte of data via Zigbee
        received_data = UART_Zigbee_receive();
        if (received_data == 'A')
        {
            PORTC |= (1 << LED);
        }
        else if (received_data == 'B')
        {
            PORTC &= ~(1 << LED);
        }
    }
    return 0;
}
```
This code implements the basic Zigbee wireless link.
The sender transmits alternating bytes 'A' and 'B'; the receiver interprets these values and switches the LED on or off based on the incoming byte.
The following video demonstrates the toggling LED. The sent data can also be printed on the serial monitor, since the link uses USART0. Further, we used a logic analyzer to read the sent and received data on the RX/TX pins of the sender and receiver.
Hardcoded, looping states were transmitted from the transmitter (to emulate ADXL states) to the receiver node.
These states were transferred to the ATmega communicating with the MIDI device using a 4-wire GPIO bridge that uses a bit sequence to validate the states, as sketched below.
The transmitted data was then converted into MIDI mnemonics corresponding to certain music controls.
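A hedged sketch of such a bridge: three data lines carry a 3-bit state code and a fourth line strobes it valid. The pin assignments and the read-twice validation below are illustrative; our actual bit sequence differs in detail.

```c
#define F_CPU 16000000UL
#include <avr/io.h>
#include <util/delay.h>

#define DATA_MASK  0x07           // PD0..PD2 carry the 3-bit state code
#define STROBE_PIN PD3            // PD3 marks the code as valid

// Sender side (after DDRD |= DATA_MASK | (1 << STROBE_PIN)):
static void bridge_send(uint8_t code)
{
    PORTD = (PORTD & ~DATA_MASK) | (code & DATA_MASK);
    _delay_us(10);                        // let the data lines settle
    PORTD |= (1 << STROBE_PIN);
    _delay_us(50);                        // strobe width
    PORTD &= ~(1 << STROBE_PIN);
}

// Receiver side: wait for the strobe, read the code twice, and accept it
// only if both reads agree (a cheap guard against mid-transition sampling).
static int8_t bridge_receive(void)
{
    while (!(PIND & (1 << STROBE_PIN)));
    uint8_t first = PIND & DATA_MASK;
    _delay_us(5);
    uint8_t second = PIND & DATA_MASK;
    while (PIND & (1 << STROBE_PIN));     // wait for the strobe to drop
    return (first == second) ? (int8_t)first : -1;
}
```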
TR/RX Data Rate shown on logic analyzer:
3) Have you achieved some or all of your Software Requirements Specification (SRS)?
Achieved:
Successful MIDI control using the ATmega and a laptop with Python and UART.
Successful interfacing of the IMU with the ATmega328PB over I2C.
Successful wireless communication using the Zigbee module interfaced with the ATmega328PB over UART.
Successful interfacing of the NeoPixel ring and LED strip using GPIO pins and PWM.
Not achieved:
Unable to interface and establish wireless communication using the nRF24L01 over SPI.
Unable to program the USB chip (ATmega32U4) to bypass Python and implement the MIDI interfacing fully in bare-metal C.
4) Have you achieved some or all of your Hardware Requirements Specification (HRS)?
Achieved:
Complete wireless communication between the glove ATmega and the receiver node.
Exploiting 3 DOF from the ADXL335 (hoping to get all 6 by the final demo!).
Utilizing the UART and I2C protocols.
Utilizing PWM and addressable LEDs for the NeoPixel.
Not achieved:
The power supply for the gloves is not yet isolated or wireless.
The prop stage with the LCD is not yet complete; it will be finished for the final demo.
Unable to configure the Zigbee on USART1 of the ATmega: the data was received by the Zigbee module but not transmitted to the ATmega over USART1.
5) Show how you collected data and the outcomes.
We use eight different states to change the sound configuration in the GarageBand workstation. Based on the input states, our ATmega sends a MIDI (Musical Instrument Digital Interface) command over UART to a Python bridge script running on our laptop, which forwards it to GarageBand to manipulate the audio and give us DJ-style control. We also monitor these MIDI commands using MIDI Monitor to verify correct operation.
We tried programming the USB chip on the ATmega328PB board to act as a standalone MIDI controller, but after multiple implementation attempts and discussions with the TAs we concluded that the USB chip is pre-flashed and its firmware-update line is cut to prevent further changes to the USB controller. Hence we leverage the MIDI bridge to overcome this gap.
Zigbee communication test
6) Show off the remaining elements that will make your project whole: mechanical casework, supporting graphical user interface (GUI), web portal, etc.
Integration of the input and actuator sub-systems on the glove, and successfully sending the IMU data over the wireless link.
Wiring and cleaning up the cable management while reducing the form factor so that everything fits on one glove.
Testing code on the completed project and probing all upper/lower limits of the data ranges.
Documenting every interface and library created, along with schematics.
Final Expected System diagram:
1) Demo your device
Garage band demo: https://drive.google.com/file/d/1ITvoW6TBajpoYM1fCPO_hUhhKps99rsm/view?usp=sharing
NeoPixel WS2812 LED ring: https://drive.google.com/file/d/1u8GsDy0wE-SAH6kXk1ogv6VY79Owm-f3/view?usp=sharing
Zigbee test Demo: https://drive.google.com/file/d/1_yGVNGBudKrngQBVt4NNvV8jsyJfRqOu/view?usp=drive_link
Zigbee + MIDI integration demo: https://drive.google.com/file/d/1a6MHJp4Mq8Fa0_69L3BxjQOlwaN_MNdY/view?usp=drive_link
8) What is the riskiest part remaining of your project?
The riskiest parts of our project remaining:
Integration of the IMU, Zigbee, and NeoPixel on the same ATmega. These interfaces use different communication protocols and have so far been implemented using polling.
Managing the form factor of the prototype on a glove.
a. How do you plan to de-risk this? We are looking into multiple communication standards and ways to make them fit together. Utilizing interrupts will be key to integrating all the parts on the ATmega and will help reduce the polling load.
9) What questions or help do you need from the teaching team?
Our nRF is not communicating properly; we discussed this with the teaching staff and applied standard debugging methods. We have moved to the Zigbee XBee S2C to cover the wireless communication without sacrificing any functionality of our project. Further, we need help understanding why the ATmega was not able to receive any data on USART1.
ADXL345 Update:
We have finished I2C communication between the ADXL345 and our microcontroller. Currently this data is displayed over UART. The data is represented as XYZ acceleration and mapped to state variables to determine the intensity of movements.
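The access pattern looks roughly like the sketch below. The register addresses (POWER_CTL = 0x2D, DATAX0 = 0x32) and the 7-bit address 0x53 come from the ADXL345 datasheet; the `i2c_*` helpers are placeholders for whatever primitives our I2C library exposes.

```c
#include <stdint.h>

#define ADXL345_ADDR  0x53   // 7-bit I2C address with ALT_ADDRESS tied low
#define REG_POWER_CTL 0x2D
#define REG_DATAX0    0x32

// Assumed I2C/TWI driver primitives:
extern void    i2c_start(uint8_t addr_rw);
extern void    i2c_write(uint8_t data);
extern uint8_t i2c_read_ack(void);
extern uint8_t i2c_read_nack(void);
extern void    i2c_stop(void);

static void adxl345_init(void)
{
    i2c_start(ADXL345_ADDR << 1);        // address + write bit
    i2c_write(REG_POWER_CTL);
    i2c_write(0x08);                     // set the Measure bit
    i2c_stop();
}

// Burst-read the six data registers; each axis is a little-endian int16.
static void adxl345_read(int16_t *x, int16_t *y, int16_t *z)
{
    uint8_t b[6];
    i2c_start(ADXL345_ADDR << 1);
    i2c_write(REG_DATAX0);
    i2c_start((ADXL345_ADDR << 1) | 1);  // repeated start, read bit
    for (uint8_t i = 0; i < 5; i++) b[i] = i2c_read_ack();
    b[5] = i2c_read_nack();
    i2c_stop();
    *x = (int16_t)(((uint16_t)b[1] << 8) | b[0]);
    *y = (int16_t)(((uint16_t)b[3] << 8) | b[2]);
    *z = (int16_t)(((uint16_t)b[5] << 8) | b[4]);
}
```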
MIDI Update: Test Setup
Code: We have created a serial input to test the MIDI controller data input, and we are working on developing the MIDI controller using the ATmega328PB. The current system handles the interaction over UART for the data transfer.
RF transmission Setup and test:
Library and interfacing code for the RF module is being implemented to achieve wireless data transmission. Currently, no successful data transmission has been observed.
1. Currently, we've implemented I2C communication between the ADXL345 accelerometer and the ATmega328PB. The accelerometer successfully streams the user's XYZ position to the serial monitor via UART. We are testing this component individually before integrating it with the other parts of our system.
2. Additionally, we've implemented SPI communication and written a library for it (a representative sketch follows this list). We've also started work on the wireless link using the nRF24L01 RF module, creating a library for it that leverages the SPI library above.
3. We are testing the incorporation of the ATmega's signals with different applications that can process the data into the desired output (volume up, volume down). We have tested dummy signals for this.
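A representative SPI-master sketch in the spirit of that library (ATmega328P-style register names; on the ATmega328PB the same registers appear as the SPI0 instance, e.g. SPCR0/SPSR0/SPDR0, depending on the device header):

```c
#include <avr/io.h>

// SS = PB2, MOSI = PB3, SCK = PB5 on the ATmega328P(B) SPI pins.
static void spi_master_init(void)
{
    DDRB |= (1 << PB2) | (1 << PB3) | (1 << PB5);   // SS, MOSI, SCK as outputs
    SPCR  = (1 << SPE) | (1 << MSTR) | (1 << SPR0); // enable, master, F_CPU/16
}

// SPI is full duplex: every byte written clocks one byte back in.
static uint8_t spi_transfer(uint8_t data)
{
    SPDR = data;
    while (!(SPSR & (1 << SPIF)));                  // wait for transfer complete
    return SPDR;
}
```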
Last week, we implemented I2C communication between the ADXL345 accelerometer and the ATmega328PB. The accelerometer successfully streams XYZ position values, which are displayed on the serial monitor via UART. Further, we decided which protocol to use for our wireless communication (choosing between BLE and RF), and have started writing the code required to interface the RF module.
Next week, we will be working on analyzing the ADXL345 accelerometer data and encoding the readings into state values. The XYZ position states will be normalized from 0 to 1, with the default XYZ position representing 0.5. Further, once the SPI-based nRF module transmits data correctly, we aim to integrate it so that we can send the ADXL345 values wirelessly over RF.
The project is in the component testing and interfacing phase, with key parts like NeoPixel rings, IMUs, and accelerometers identified and partly ordered. The team has finalized the bill of materials and acquired most components, prioritizing testing before committing to additional orders. They are evaluating different accelerometers for 6-axis data extraction and gesture recognition and experimenting with DJ software for controlling audio features like pitch, volume, and tempo. Wireless communication options are also being tested, particularly the Nordic nRF24L01 module, to establish reliable data transfer with the ATmega328PB. The focus remains on testing individual components separately to ensure compatibility and functionality before integrating them in the upcoming stages.
Last week, we identified all key components needed for our project and finalized the bill of materials. We ordered two NeoPixel rings, having acquired most other components from Detkin, and shared the BOM list with the team. We are also testing several IMUs to determine the best fit for our application and plan to finalize and place an order for the selected IMU by next week. In a meeting with our project managers, Chen Chen and Nick, we refined our system block diagram and identified data transmission rates as a potential bottleneck due to the ATmega328PB's limited memory. To improve form factor and performance, we discussed using a smaller MCU on the glove and potentially switching to the Nordic nRF24L01 for wireless communication instead of the HC05 Bluetooth module.
Next week, we plan to test various accelerometers by interfacing them with the ATmega328PB to extract 6-axis data and perform basic gesture recognition. Additionally, we will explore different DJ software options and begin working on interfacing with them to control speaker pitch, volume, and tempo. For wireless communication, we will focus on connecting the Nordic nRF24L01 module with the ATmega328PB to ensure smooth data transfer. Our goal is to individually test each of these components to confirm their functionality. By doing so, we aim to identify and address any issues early on. This approach will set a strong foundation for integrating the components in future weeks.
### Current state of project

### Last week's progress

### Next week's plan

## Final Project Proposal

### 1. Abstract
The DJ Gloves project introduces an innovative way for users to control and customize live music through simple hand gestures. By equipping concert-goers or DJs with gloves embedded with motion sensors, this project enables seamless, intuitive control over music settings such as volume, tempo, and pitch, providing a personalized listening experience. The concept leverages wireless technology to allow users to adjust music in real-time from anywhere in a venue, revolutionizing the interaction between people and sound.
### 2. Motivation

Problem: In many live music settings—such as concerts, festivals, or parties—the music experience is standardized for everyone in the audience. Factors like volume, tempo, and pitch may not suit individual preferences, and in large venues, adjusting these settings can be challenging without interfering with others' experiences. DJing in general also has a confusing and complicated bar of entry, and conventional equipment keeps DJs' hands busy, limiting their physical movement and creative expression.
Purpose and Inspiration: This project is inspired by the concept of a conductor leading an orchestra with hand gestures, creating a flow that resonates with the audience. Similarly, the DJ Gloves project seeks to give users, whether they are audience members or DJs themselves, the power to interact with and adjust music through hand gestures. By translating natural hand movements into music control signals, the DJ Gloves empower users to personalize and enhance their musical experience seamlessly, making concerts and events more engaging and immersive. Further, the input methodology being implemented has the general purpose of creating a device that allows a user to control any Bluetooth-enabled device in a more interactive and intuitive way (e.g., RC cars, lights, swarm robots).
### 3. Goals

The primary objectives of the DJ Gloves project are as follows:
Each goal contributes to the overall vision of transforming the way music is experienced in live settings by enabling users to interact directly with sound through intuitive hand gestures.
### 4. System Block Diagram

The above block diagram is best explained in four stages of how the proposed embedded system works:
1. The input stage: a. The ADXL345 sensor (inertial measurement unit), capable of measuring acceleration on three axes. b. Bluetooth module: enables wireless transfer of the IMU measurements to the ATmega328PB.
2. Processing unit (microcontroller): a. Here the data sent over BLE is mapped to finite states. This is done to decouple, or abstract, the input from the output/actuation. b. The microcontroller also provides the hardware to generate PWM and manage interrupts for proper functioning of the actuators.
3. Laptop: a. Manages the DJ software that changes the tempo, amplitude, and modality of the sound/song being played.
4. The output stage: a. Speakers: play the modulated sound/song. b. LED strips: the light gradient and intensity change in accordance with the intensity and tempo of the song; present on the prop stage and both gloves. c. LCD screen: an LCD screen with custom graphics (similar to an old-school Windows Media Player visualization) that also changes according to the input state.
Provides an easy-to-use, lightweight wearable device that enables Bluetooth-based input from the movement and orientation of the user's hand, making it a more generalized and intuitive input method.
This is a 3-axis accelerometer (IMU) used to detect hand motion and orientation. The IMU communicates with the Bluetooth module using either SPI or I2C communication protocols to transfer data. It captures data related to acceleration and orientation, which helps in identifying the gestures performed by the user.
The Bluetooth module transmits the IMU data wirelessly to the central microcontroller (ATmega328PB). This module ensures the glove can communicate in real time with the main system without the need for physical connections. Each glove has its own Bluetooth module, allowing two-way communication between the gloves and the central system.
The LED strips are mounted on each glove to provide visual feedback. The LEDs light up based on user actions, adding a visual element that syncs with the music. The LED control is influenced by the motion data collected by the IMU, creating an interactive experience as the LEDs respond to gestures.
The ATmega328PB serves as the main processing unit of the system. It receives data from the Bluetooth modules in each glove, interprets the motion data, and translates it into commands for the DJ software and connected devices. The microcontroller handles timers, PWM (pulse-width modulation), interrupts, and duty-cycle control for managing connected components.
The Prop Stage includes an LCD screen and additional LED lights that serve as visual indicators synchronized with the music, plus a 3D stage if time permits. The ATmega328PB controls these actuators by adjusting brightness, colors, or patterns based on the music or gestures.
The DJ software, running on a computer, receives data from the ATmega328PB to adjust music parameters such as tempo, volume, and pitch. This data originates from the gloves and is processed by the microcontroller to control the audio output, providing an interactive DJ experience.
The speaker is the primary audio output device. It plays music from the DJ software and responds to changes made by the glove movements. The audio output is influenced by the real-time data from the IMU sensors, allowing users to control the tempo, volume, and other audio effects through gestures.
### 5. Design Sketches

Gloves with IMU and LED Strip: Each glove is equipped with an inertial measurement unit (IMU) and an LED strip. The IMU sensors detect the movement and orientation of the user's hands, capturing data that represents the position and orientation of the hand. These coordinates and axes correspond to specific music control actions, like adjusting volume or tempo. The LED strips on the gloves provide visual feedback based on motion, creating an interactive and immersive audio-visual experience.
IMU Data Transmission via BLE: The data from the IMUs on each glove is sent wirelessly via Bluetooth Low Energy (BLE) to the central microcontroller (an ATmega328PB-based system). BLE is used for its low power consumption, ensuring that the gloves remain functional for longer periods without frequent recharging.
Microcontroller Processing: The ATmega328PB microcontroller processes the incoming IMU data. It interprets the gestures and translates them into music control commands. For example, certain hand movements might adjust the music's tempo, volume, or pitch. Additionally, the microcontroller handles the LED and LCD outputs to display relevant information or visual effects, enhancing the user experience.
DJ Software Interface: The microcontroller sends the processed data to the DJ software on a connected laptop, where it is interpreted as commands to control the music. The DJ software (such as Mixxx or another open-source platform) then adjusts the music output based on the user's gestures, allowing the user to manipulate music settings wirelessly and interactively.
Output to Speaker: The modified audio output from the DJ software is sent to the speakers, allowing the audience to hear the real-time adjustments made by the user. This can create a dynamic and engaging musical performance where the user controls various aspects of the sound with intuitive gestures.
LED and LCD Feedback: The system includes additional LEDs and possibly an LCD display that provide real-time feedback to the user and audience. This feedback could include visualizations of the music's rhythm or beat that change in sync with the music, enhancing the concert or performance atmosphere.
### 6. Software Requirements Specification (SRS)
The DJ Glove project is an embedded system allowing users to control DJ software and produce music through hand gestures. By utilizing an IMU sensor to detect motion and a Bluetooth module to transmit data, the glove will control various audio effects, such as pitch, volume, and filter adjustments, via open-source DJ software like Mixxx. The system includes custom software algorithms for gesture recognition, communication, and seamless integration with DJ software.
The primary users for the DJ Glove are:
The DJ Glove hardware is built around the ATmega328PB microcontroller, designed to process IMU data for gesture recognition and transmit information wirelessly to DJ software via Bluetooth. Additional components, such as an LCD display and LED strips, provide visual feedback, while a speaker generates audio cues for user feedback.
| Component | Purpose | Link |
|---|---|---|
| ATmega328PB | Central microcontroller, selected for low power, sufficient I/O pins, and compatibility with IDEs | [ATmega328PB Datasheet](https://drive.google.com/file/d/1ZoxflTe-DveEnRnLlahV9N1bbNCXoxui/view) |
| MPU6050 IMU | Detects acceleration and rotation, essential for gesture control | [MPU6050 IMU](https://www.adafruit.com/product/3886) |
| HC-05 Bluetooth Module | Wireless communication with computer/mobile DJ software | [HC-05 Bluetooth](https://components101.com/wireless/hc-05-bluetooth-module) |
| 160x128 TFT LCD Display | Provides visual feedback on glove status, gesture, or audio levels | [160x128 TFT LCD](https://www.adafruit.com/product/358) |
| LED Strips | Visual feedback for effects and gestures | [LED Strips {LAB}](https://www.adafruit.com/product/4278) |
| Speaker | Provides audio cues based on gesture recognition events | [Small Speaker {LAB}](https://www.adafruit.com/product/1314) |
These are components we have only lightly researched; more components may be added, and the LCD screen and speaker are subject to change based on budgeting and interfacing complexity.
### 9. Final Demo
By the final demonstration, we expect to achieve:
The approach to building the DJ Glove involves:
To measure the effectiveness of the DJ Glove in meeting project goals, we’ll use the following metrics:
Slides added and presented! Yay!