Dancing TWIP Robot
By Mihir Dhamankar and Simon Yu
Background/Goals
The goal of our project was to make a two-wheel inverted pendulum (TWIP) robot dance to music. The most straightforward way to do this would be to record a set of commands that go well with a particular song and replay them in time with the music. However, we wanted to be able to adapt to different types of music. This versatility comes with a sacrifice in musicality: a pre-recorded, well-choreographed dance would probably look better than our results, but we aimed to create a more versatile system.
Dance Videos In Order:
- "Crab Rave" after finding the bpm using Matlab (tilting at 125 bpm)
- "Dance of the Sugar Plum Fairy" by manually sending individual notes (tilting)
- "Gymnopedie No. 1" using live MIDI (tilting)
- "Dance of the Sugar Plum Fairy" using live MIDI (spinning)
- "Macarena" (103 bpm found using Matlab) with 4/4 time signature path planning (tilting and spinning)
- This video was having some issues, an alternative link is here
- Simulated twin twip: "Supernova" a fast-paced EDM with bpm 128, well-suited for the energetic moves allowed by the spring link. Video
- Simulated twin twip: "Snowman" a slow-paced song with 6/8 time signature, which fits a swaying dance move. Video
GitHub
All files mentioned (and more) are here
Supplies
- The Elegoo Tumbller robot
- Arduino IDE
- USB cable
- MIDI music files
- Non-MIDI music
Balancing
We originally implemented balancing using a simple PID controller with a Kalman filter for just the tilt angle. This did not account for gyroscope drift, leading to less stable balancing overall. Since our "dance moves" would add instability to the robot, we needed a more robust controller. We took inspiration from open source code by Elegoo as well as a Kalman filter design based on this article: http://blog.tkjelectronics.dk/2012/09/a-practical-approach-to-kalman-filter-and-how-to-implement-it/
Our Kalman filter includes both the tilt angle and the gyro bias in the state (Image 1).
The code implements a discrete-time Kalman filter: at each time step it calculates the a priori state estimate, the a priori error covariance matrix, the innovation, and the innovation covariance, then uses these to compute the Kalman gains and output the new state estimate.
We had to do some trial and error to find good constants for the filter:
Q_angle = 0.001,
Q_gyro = 0.005,
R_angle = 0.5
We also set the default target angle to -3° from vertical.
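Below is a minimal Python sketch of this angle-plus-bias filter, following the design in the article above with our constants plugged in. The robot itself runs the equivalent logic in Arduino C; this version is purely illustrative.

# Sketch of the angle + gyro-bias Kalman filter used for balancing.
Q_angle, Q_gyro, R_angle = 0.001, 0.005, 0.5

class TiltKalman:
    def __init__(self):
        self.angle = 0.0                   # filtered tilt angle (degrees)
        self.bias = 0.0                    # estimated gyro bias (deg/s)
        self.P = [[0.0, 0.0], [0.0, 0.0]]  # error covariance matrix

    def update(self, accel_angle, gyro_rate, dt):
        # a priori estimate: integrate the bias-corrected gyro rate
        self.angle += dt * (gyro_rate - self.bias)
        # a priori error covariance
        P = self.P
        P[0][0] += dt * (dt * P[1][1] - P[0][1] - P[1][0] + Q_angle)
        P[0][1] -= dt * P[1][1]
        P[1][0] -= dt * P[1][1]
        P[1][1] += dt * Q_gyro
        # innovation (the accelerometer angle is the measurement) and its covariance
        y = accel_angle - self.angle
        S = P[0][0] + R_angle
        # Kalman gains, then the a posteriori state and covariance
        K0, K1 = P[0][0] / S, P[1][0] / S
        self.angle += K0 * y
        self.bias += K1 * y
        p00, p01 = P[0][0], P[0][1]
        P[0][0] -= K0 * p00
        P[0][1] -= K0 * p01
        P[1][0] -= K1 * p00
        P[1][1] -= K1 * p01
        return self.angle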
Dance Moves
We initially chose to implement dancing by alternating the target angle between -5° and -1° (-3° ± 2°) to make the robot move slightly forwards and backwards. We call this move "the tilt". Normally, setting the target angle away from the balancing point would cause the robot to run away in one direction and fall over, but if the next beat arrives sufficiently fast, the balancing logic handles the deviation and keeps the robot from falling.
With the success of tilting, we also implemented "spinning". The vertical axis can rotate freely without affecting the balance too much as long as the wheel commands do not saturate. Thus, we can add a constant to the left wheel command and subtract the same constant from the right wheel command to make the robot spin in place. We also add a scaled value of the z-axis gyroscope reading to this constant to get it to spin a consistent amount.
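The sketch below illustrates how the two moves map onto the target angle and wheel commands. The constants and the exact form of the gyro feedback are placeholders, not our tuned Arduino values.

BALANCE_ANGLE = -3.0   # default target angle, degrees from vertical
TILT_DELTA = 2.0       # "the tilt" alternates between -5 and -1 degrees
SPIN_CONST = 40.0      # placeholder differential wheel command
K_YAW = 0.5            # placeholder gyro-z feedback gain

def tilt_target(beat_index):
    # alternate the target angle on every beat; the balance
    # controller chases this target, rocking the robot
    return BALANCE_ANGLE + (TILT_DELTA if beat_index % 2 else -TILT_DELTA)

def spin_mix(balance_output, gyro_z):
    # equal and opposite wheel offsets spin the robot in place; the
    # scaled gyro-z term keeps the spin rate consistent (its sign
    # depends on the IMU axis convention)
    spin = SPIN_CONST + K_YAW * gyro_z
    return balance_output + spin, balance_output - spin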
Beat Detection
Our initial approach was to use software to detect live beats and send the data via serial port to the robot. We could not find a good solution for live beat detection, so this did not work. Our second approach involved finding the BPM of a song within a few seconds, sending it to the robot, and then letting the robot continue dancing at this BPM. Video 1 shows a simple example of this.
Signal Processing
We use a MATLAB toolbox called MIRtoolbox (Music Information Retrieval) https://www.jyu.fi/hytk/fi/laitokset/mutku/en/research/materials/mirtoolbox.
This toolbox allows us to extract information such as the bpm from an audio file. The workflow is as follows. The mirenvelope() function extracts the amplitude envelope of the audio waveform.
(Image 1)
From there, the mirpeaks() function extracts the peaks of the audio envelope. This is helpful because most music expresses beats with percussive instruments, which produce spikes of volume at regular intervals. Even if a piece has no percussive component (for example, solo piano), the music should still be stressed at regular time intervals. (Image 2)
The mirtempo() function then finds the periodicities in the peaks, matching the music to a bpm. This rules out non-rhythmic spikes in volume: the algorithm assigns a score to each candidate bpm and picks the best one, which reduces the effect of outliers (Image 3).
Our current bpm detection workflow is shown in Image 4.
The audio recordings were made at a sampling frequency of 44100 Hz with a bit depth of 24 bits. We ran the toolbox with the “Halfwavediff” preset for mirenvelope() and the “Resonance” preset for mirautocor().
We tested the performance of the detector by letting it listen to several songs (table in Image 5).
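MIRtoolbox itself is MATLAB-only, but the overall pipeline (smooth the envelope, take its half-wave rectified difference, then score candidate tempi by periodicity) can be sketched in Python. The snippet below is a rough illustration of the idea, not our actual detector, and the filter settings are arbitrary choices.

import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, filtfilt

def estimate_bpm(path, lo=60, hi=180, seconds=30):
    fs, x = wavfile.read(path)
    x = np.asarray(x, dtype=float)
    if x.ndim > 1:
        x = x.mean(axis=1)                  # mix down to mono
    fs_env = 200                            # envelope sample rate (Hz)
    b, a = butter(2, 10 / (fs / 2))         # 10 Hz low-pass smoothing
    env = filtfilt(b, a, np.abs(x))[::fs // fs_env]
    env = env[:fs_env * seconds]            # analyze the first 30 s
    onset = np.maximum(np.diff(env), 0)     # half-wave rectified difference
    ac = np.correlate(onset, onset, "full")[len(onset) - 1:]
    bpms = np.arange(lo, hi + 1)
    lags = np.round(60.0 / bpms * fs_env).astype(int)
    return bpms[np.argmax(ac[lags])]        # best-scoring tempo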
Path Planning With Time Signature Detection
MIRtoolbox can also guess what time signature a song has. This information can be used to plan paths or patterns for different dances depending on the time signature. Instead of moving on every beat, the robot can move more on, say, the first beat of every measure and make small readjustments on the other beats. This mimics what people do when dancing (large and small steps). Some ideas for paths are given in Image 1.
Video 5 shows an example of a dance designed for a 4/4 time signature; the pattern used is shown in Image 2. It was done by using a different combination of spin and tilt moves on each beat. The Arduino code is also attached.
In a live setting, MIRtoolbox would listen to the song and send its bpm and time signature to the Arduino. Once ready, the robot would start dancing on the next measure to the pattern for that time signature at the detected bpm (4/4 and 103 bpm in this case), as sketched below.
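Here is a minimal sketch of such a host-side driver. The per-measure patterns and the one-byte serial protocol ('t' tilt, 's' spin) are illustrative assumptions, not the exact commands our Arduino sketch uses.

import time, serial

PATTERNS = {                            # placeholder per-measure patterns
    "4/4": [b"s", b"t", b"t", b"t"],    # big move on beat 1, small ones after
    "3/4": [b"s", b"t", b"t"],
}

def dance(port, bpm, signature):
    beat = 60.0 / bpm                   # seconds per beat
    with serial.Serial(port, 9600) as ser:
        next_t = time.monotonic()
        while True:
            for move in PATTERNS[signature]:
                ser.write(move)
                # schedule against the clock so timing does not drift
                next_t += beat
                time.sleep(max(0.0, next_t - time.monotonic()))

dance("/dev/ttyUSB0", 103, "4/4")       # the Macarena example from Video 5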
Live Dancing Using MIDI Files
When first experimenting with how to make the robot dance, we found that it was possible to manually send serial messages to get the robot to make certain moves in time with the music (as seen in video 2). Furthermore, moving on every note seemed somewhat more interesting than the dancing seen in video 1. Although humans tend to dance based on an underlying rhythm, there was merit in exploring how a robot would dance reacting to individual notes. Since listening to live audio would have tremendous processing delays, we decided to use MIDI.
MIDI, or Musical Instrument Digital Interface, is a way for computers to communicate about making and controlling sound. A MIDI file consists of a sequence of such messages which, if played back in real time, play a song. The Mido Python library allows us to examine a MIDI file, and we can play it back using a software synthesizer such as Fluidsynth. In midi_player.py we used sample code for the Mido library to look at each note and its attributes in real time. Since songs can have multiple layers of MIDI instruments, as seen in the sheet music visualization of a MIDI file in Image 1, our code only makes dance moves when it finds a message with a time attribute > 0. This attribute is used for timekeeping purposes in the MIDI file (it is not exactly how long a note is held, but longer notes do tend to have a larger time value). Based on the time attribute, the code sends a serial message to the Arduino to either do a tilt move, do a spin move, or temporarily stop dancing (return to vertical). Due to the way the Bluetooth on this robot is configured, we were not able to send serial messages over it, so the USB cable needs to be attached the whole time. This unfortunately makes spins a bit more difficult. On the Arduino end, the robot listens on the serial port and takes the respective action once it receives a message. Videos 3 and 4 show how this style of dancing looks.
Any MIDI file can be danced to using
python3 midi_player.py -s SongName.mid
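The core of midi_player.py looks roughly like the sketch below; the time thresholds and the one-byte serial commands are illustrative assumptions, and audio playback through Fluidsynth is omitted.

import mido, serial

ser = serial.Serial("/dev/ttyUSB0", 9600)   # USB link to the Arduino

# MidiFile.play() yields each message in real time (msg.time is the
# delay in seconds since the previous message)
for msg in mido.MidiFile("SongName.mid").play():
    if msg.type != "note_on" or msg.time <= 0:
        continue               # skip layered, simultaneous messages
    if msg.time > 1.0:
        ser.write(b"v")        # long gap: return to vertical
    elif msg.time > 0.3:
        ser.write(b"s")        # longer notes: spin (threshold assumed)
    else:
        ser.write(b"t")        # short notes: tilt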
Twin TWIP Simulation
We also had the idea to simulate what it would look like if two of these robots danced together, attached by a spring -- a twin TWIP, or perhaps a Twinp?
We went with this design because it resembles two dancers holding hands: the spring provides a force between the robots that can influence their movements, creating new movement possibilities.
We simulated the two TWIP robots using MATLAB, adapted from assignment-1 of 16-299: TWIP Assignment (cmu.edu).
We modified the simulator to support two robots (Image 1), and we modified the dynamics of each robot to include a spring force term (f_spring), the force due to spring tension; for simplicity we assume the spring obeys Hooke's law (Image 2).
Since this is a proof of concept, we built the simulation in 2D, so the robots cannot "turn", but it still demonstrates some of the moves that can be done with a spring link.
The robots are individually controlled by a full state feedback controller with state [x, theta, dx, dtheta] and manually tuned gains.
The dance moves are generated by the beat detection method described earlier: the audio analysis extracts the bpm from the microphone, and the goal values for x and theta are updated at the right times, resulting in a rhythmic dance. Time signature detection is temporarily disabled for simplicity, so the dance moves are currently pre-selected from a move list, but the rest is automatic and adaptive.
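A hedged Python sketch of the two ingredients we added to the 16-299 simulator is shown below; the actual code is MATLAB, and the spring parameters, gains, and goal values here are placeholders.

import numpy as np

K_SPRING, REST_LEN = 5.0, 0.3               # Hooke's-law parameters (assumed)
K_FB = np.array([-1.0, 35.0, -2.0, 3.0])    # gains for [x, theta, dx, dtheta]

def spring_force(x1, x2):
    # force on robot 1 from the link; robot 2 receives the opposite force
    return -K_SPRING * ((x1 - x2) - REST_LEN)

def wheel_torque(state, goal):
    # full state feedback driving each robot toward its dance-move goal
    return float(-K_FB @ (state - goal))

# On each detected beat the x/theta goals step through a move list,
# e.g. a sway where the two robots alternate lean angles:
sway = [np.array([0.0,  0.05, 0.0, 0.0]),
        np.array([0.0, -0.05, 0.0, 0.0])]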
We prepared two examples to highlight the added complexity. The first is "Supernova" by Laszlo, a fast-paced EDM track. The spring can store elastic potential energy, so the robots are able to make more energetic dance moves than in a single-robot configuration. Video
The second example is "Snowman" by Sia, a slow-paced song in 6/8 time. A 6/8 song has a natural "sway" feeling to it, as the emphasis is placed on the first and fourth beats in a bar. The two TWIPs are able to lean at different angles in the sway motion, creating alternating "lead" and "follower" roles with each sway. Video
There is more that can be done with this twin TWIP setup; with some creativity you can design unique dance moves to your heart's desire.
*Files in this section can be found in the repository, as MATLAB files are not supported by this platform
Results and Discussion on Future Steps
Despite standing on only two wheels, this TWIP robot is surprisingly capable at dancing, as videos 3, 4, and 5 best show. The twin TWIP concept further shows how much some added complexity can bring. Although its dance moves look rather simplistic at times, we see this project as a proof of concept. With enough manual tuning and hard-coded commands, almost any dance could be performed by this robot. However, we were able to capture a surprising amount of intricacy with relatively generalized code, which we consider a major success.
Nevertheless, here are some improvements that could still be pursued:
- The ability to balance is highly dependent on the quality of the surface, so certain dance moves may be possible on some surfaces and not on others. We took a rather conservative approach to how long and how far the robot tilts, but this could be adjusted on the fly if the robot kept track of how far it was drifting
- Programming in more choreographed dance moves/patterns would be a reasonable way to make a dance more exciting. The robot could pick a pattern based on MIDI events or detected time signature changes
- Even though analyzing live audio has a significant delay, detecting what song is playing (using a Shazam-like API) and looking up the song's details may be a much faster way to sync up with the beat
- Adding an onboard computer (e.g. a Raspberry Pi) or finding a way to use the onboard bluetooth for communication would get rid of the need for the USB serial cable, allowing for greater freedom of motion
- A real twin TWIP could perform even more complex dance moves with rotations. For example, one robot could act as a pivot around which the other spins, as in figure skating