DIGIBAND: Accessibility Drum Machine - Bluetooth Receiver, "Instrument" Controllers, iOS App
by pintchom in Circuits > Assistive Tech
Hello! This is Digiband, a drum machine that easily pairs with external switches for those with physical impairments. With Digiband, individual contributors can create musical masterpieces through Bluetooth signals sent to an iOS music app. Each individual connects a switch to their box (A, B, C, or D), and whenever they click their switch, a sound plays through the iOS/iPad app, a Bluetooth speaker, headphones, etc., hosted by a proctor. The proctor can browse and assign sounds to each button to customize the experience, and can also record sessions to play back in the future.
Video Demo: https://vimeo.com/1036080627?share=copy
All code and files here: https://github.com/pintchom/DigiBand
Supplies
ESP32-S3 N8R8 (Bluetooth + WiFi host controller):
https://www.ebay.com/itm/266213558442?chn=ps&var=566037787128&google_free_listing_action=view_item
4 x Raspberry Pi Pico W (or any other WiFi-capable CircuitPython microcontroller): https://www.adafruit.com/product/5526
4 x Mono 3.5mm jacks (for external buttons) (female and male):
https://www.adafruit.com/product/4181
https://www.adafruit.com/product/4182
4 x NeoPixel Flora (for visual aid when buttons are pressed):
https://www.adafruit.com/product/1260
5 black wires, 5 red wires, 5 yellow wires (for Flora lights)
Printing the Enclosures
This project uses 5 separate enclosures: 1 for the host and 4 for the instruments. All the boxes are the same size, and the STL files are included in the GitHub repository linked above. Print 5 boxes, labeled HOST, A, B, C, and D. They will look like the boxes in the main cover image.
Host Controller
The first controller, and the more involved of the two types, is the host. The way this project works is demonstrated in the image attached. Each instrument communicates with the host via WiFi (API calls), with the host acting as a small web server handling these requests; the host then communicates with the iOS app via Bluetooth to transmit which instrument is currently requesting to play a sound.
The code is dense and you can take it straight from my GitHub, but here's the general overview.
First, the board attempts to connect to WiFi. This requires you to add your WiFi credentials to a settings.toml file, as in the sketch below.
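As a rough sketch, assuming the standard CircuitPython settings.toml keys (the variable names in the repo code may differ), the connection step looks like this:

```
# settings.toml in the root of the CIRCUITPY drive (values are placeholders):
#   CIRCUITPY_WIFI_SSID = "your-network-name"
#   CIRCUITPY_WIFI_PASSWORD = "your-network-password"
import os
import wifi

print("Connecting to WiFi...")
wifi.radio.connect(os.getenv("CIRCUITPY_WIFI_SSID"), os.getenv("CIRCUITPY_WIFI_PASSWORD"))
print("Connected, IP address:", wifi.radio.ipv4_address)
```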
After the board connects to WiFi, it sets up its server with only one route: send-signal. This route receives communication from each instrument, and it only takes in the device number that is being communicated.
The send-signal route does the following:
parses the JSON body from the request, determines the device ID (1, 2, 3, or 4) to transmit, then writes either A, B, C, or D over Bluetooth, which is eventually received by the iOS app.
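Here's a minimal sketch of that route using the adafruit_httpserver library. The JSON key name ("device") and the letter mapping are my assumptions, so check the repo for the exact code; the uart object is the BLE service set up in the next step.

```
import socketpool
import wifi
from adafruit_httpserver import POST, Request, Response, Server

pool = socketpool.SocketPool(wifi.radio)
server = Server(pool, debug=True)

DEVICE_LETTERS = {1: "A", 2: "B", 3: "C", 4: "D"}

@server.route("/send-signal", POST)
def send_signal(request: Request):
    data = request.json()                           # e.g. {"device": 3}
    letter = DEVICE_LETTERS.get(data.get("device"))
    if letter:
        uart.write(letter.encode())                 # forwarded to the iOS app over BLE
    return Response(request, "OK")
```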
The board also configures its Bluetooth presence by establishing a BLE advertisement using adafruit_ble.advertising.standard, with an advertisement name that matches the one the BluetoothManager in the Swift package scans for.
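A sketch of that BLE setup, assuming the common UART-service pattern from the adafruit_ble examples (the advertisement name here is a placeholder; the real one is defined in the repo and mirrored in the Swift BluetoothManager):

```
from adafruit_ble import BLERadio
from adafruit_ble.advertising.standard import ProvideServicesAdvertisement
from adafruit_ble.services.nordic import UARTService

ble = BLERadio()
ble.name = "DigiBand-Host"   # placeholder; must match what the app scans for
uart = UARTService()         # the same `uart` the send-signal route writes to
advertisement = ProvideServicesAdvertisement(uart)
ble.start_advertising(advertisement)
```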
Then it will start the server and do the following:
establish the server port (:80), display its local IP address (which is needed in your instrument controller code), and wait for a Bluetooth connection. Once a connection is established, the small NeoPixel Flora light turns solid green instead of flashing blue, which means the host is ready to start sending signals to the app.
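Putting those pieces together, the startup and main loop look roughly like this (the NeoPixel pin and the solid-green/blue status behavior are simplified assumptions):

```
import board
import neopixel

pixel = neopixel.NeoPixel(board.D9, 1)      # data pin is a placeholder; match your wiring

server.start(str(wifi.radio.ipv4_address), port=80)
print("Listening on http://%s:80" % wifi.radio.ipv4_address)

while True:
    if ble.connected:
        pixel.fill((0, 255, 0))             # green: app connected, ready to send signals
    else:
        pixel.fill((0, 0, 255))             # blue: still waiting for the app to connect
    server.poll()                           # handle incoming /send-signal requests
```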
Instrument Controller
The instrument controllers follow a simpler code structure. All they need to do is connect to WiFi, read a button press, and send an API call to the host controller.
First, create your settings.toml file for each controller as before. Then, change the HOST_IP to the IP address previously printed by the host controller; this only needs to be done once per WiFi network. Next, configure a button with digitalio on whatever pin you choose. The board then attempts to connect to WiFi in the same manner as the host. The main loop of the instrument controller waits for a button press. When the button signal is received, it sends an API call to the host controller to play its sound in the app, and quickly flashes a green LED to confirm the request went through.
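A condensed sketch of one instrument controller, assuming an active-low switch and the same /send-signal route and JSON key as above (the pin choices, HOST_IP value, and device number are placeholders):

```
import os
import ssl
import time

import adafruit_requests
import board
import digitalio
import neopixel
import socketpool
import wifi

DEVICE_ID = 1                    # 1-4, one per instrument box (A-D)
HOST_IP = "192.168.1.50"         # replace with the IP printed by the host controller

wifi.radio.connect(os.getenv("CIRCUITPY_WIFI_SSID"), os.getenv("CIRCUITPY_WIFI_PASSWORD"))
pool = socketpool.SocketPool(wifi.radio)
requests = adafruit_requests.Session(pool, ssl.create_default_context())

button = digitalio.DigitalInOut(board.GP15)      # jack tip; pin is a placeholder
button.switch_to_input(pull=digitalio.Pull.UP)   # a pressed switch pulls the pin low

pixel = neopixel.NeoPixel(board.GP16, 1)         # Flora NeoPixel; pin is a placeholder

while True:
    if not button.value:                         # switch pressed
        requests.post(f"http://{HOST_IP}/send-signal", json={"device": DEVICE_ID})
        pixel.fill((0, 255, 0))                  # quick green flash to confirm
        time.sleep(0.2)
        pixel.fill((0, 0, 0))
```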
Wiring
The wiring for this project is fairly straightforward. The host controller only has 1 NeoPixel to deal with: simply connect the - pin to ground, the + pin to power, and the pad with the upwards-facing arrow to a data pin of your choice. That's it. For the instruments, follow the same steps for each NeoPixel, and then also connect the female audio jack: the black wire goes to ground and the red wire (or white in the link I pasted above) goes to a data pin of your choice. The male end of the audio jack is optional for testing, but the way I tested is I attached each male cable to a small button, connecting black to ground and red (white) to power. The NeoPixel Floras typically take 5V power, but 3.3V also worked just fine for me since we're only attaching one light to each board.
IOS App
This is the hardest part: creating an iOS app to read and play signals from the Bluetooth controller. Since the hardware is the only part of this project you actually need to build yourself, I'll keep the app description brief.
The Digiband iOS app does the following:
It creates a Bluetooth manager object to scan for and connect to incoming Bluetooth advertisements. Then, the app pulls some base sounds I found online from Firebase Storage to play with. The app allows you to browse, play, and assign sounds to instruments. Once sounds are assigned to each button, the app user can either play them on their own by pressing the corresponding buttons, or receive Bluetooth signals which automatically play the sounds. The app also lets you start a recording, which captures all sounds played until the recording is stopped. It does this by simply saving the order of button presses with their respective timestamps locally, so sessions can be saved to your device quickly and easily without having to deal with large audio files.
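For illustration only (the app itself is written in Swift), here is the idea behind that recording format, sketched in Python: a session is just an ordered list of (timestamp, button) events that can be replayed later with the original timing.

```
import time

class SessionRecorder:
    """Records which buttons fired and when, instead of saving audio."""

    def __init__(self):
        self.events = []        # list of (seconds_since_start, button_id)
        self.start_time = None

    def start(self):
        self.start_time = time.monotonic()
        self.events = []

    def record(self, button_id):
        if self.start_time is not None:
            self.events.append((time.monotonic() - self.start_time, button_id))

    def playback(self, play_sound):
        # Replay the session with the original relative timing.
        previous = 0.0
        for offset, button_id in self.events:
            time.sleep(offset - previous)
            play_sound(button_id)
            previous = offset
```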
In the future (by Friday 12/06/2024), the app will also allow uploading and recording custom sounds to attach to buttons. It will also soon be on TestFlight, and I will update this Instructable with a link to the TestFlight beta testing group.