Experiencing and Analyzing Music through 3D Quadcopter Movement Drawings
That’s a hell of a title for this post, isn’t it? This is my proposal for my senior honors thesis in the mechanical engineering department at Lafayette. It is written in a very “engineering” tone, but this project will be an engineering, music, and visual art project. In an effort not to give away everything I will be doing, I *did* slightly edit a couple of things here, and I also did not include the bibliography section in this post. Enjoy!
I. Introduction
In modern society the concepts of engineering and art are often intentionally and erroneously separated. The concept of STEM (Science, Technology, Engineering, and Math) and the conceptual debate over “left brain vs. right brain” have forced a wedge between the two disciplines. American society specifically places more value on the STEM fields, providing funding to expand them [17], while the creative arts are often the last to receive funding and the first to be cut in budgeting processes. Within the past decade there has been a movement to remove this wedge and once again combine the disciplines as they have been in the past. STEM has become STEAM, adding “Art” to the previously more valued subjects [1]. Now, artists are more frequently working with computer scientists, mathematicians, and engineers to both create works of art and research advanced technology.
While a significant gap still exists between the worlds of engineering and the creative arts, researchers in both fields have worked to close it. While robotic devices are now commonplace, individuals in a variety of fields are using this technology to explore music, theater, and art [2-4]. These devices include computer programs, robotic arms, and unmanned aerial vehicles. One of these, the quadcopter, is currently used for personal enjoyment by the general public (specifically by hobbyists), but also for exploring human emotion through dance [6-8].
This type of research is closing the gap between engineering and the creative arts, but it needs to be furthered. While many different types of technology have been used to explore the interactions between humans and that technology, more research needs to be done to explore human understanding of the creative arts. Specifically, there has yet to be research that analytically breaks down and experiences music in three dimensions through the use of controlled autonomous aerial vehicles.
II. Motivation
This raises the question: how does one experience music? On the most basic level, the source of the music causes vibrations in air molecules which are picked up by the ear and interpreted by the brain. Music theory defines what sounds “good” or “pleasing” when notes are heard in a particular order. But music can be experienced through other means as well. People perform and watch dances, play and listen to musical instruments, and work with others to combine these in new ways. Music is a way of expressing oneself, a way of conveying emotions, and oftentimes a way of passing time and relaxing. Yet all of these methods of experiencing music involve that basic interaction between the music and the human ear. In what other ways does one experience music?
Professional dancers and actors would undoubtedly respond quite vocally to that question; however, performing to music is significantly different from truly experiencing music [5]. All music is experienced with a filter of sorts, a veil which places it into context. This filter could be the choreographer who wanted a dance to give off a certain emotion and found a matching song. Simply listening to music is in itself a filter, since the music is contextualized by one’s mood and environment. Is there a way to remove this filter and experience music in its purest form? Perhaps not, but this project seeks to explore another means of analyzing music, one that allows others to experience music through both sound and sight, where the viewer can witness a live “drawing” created by quadcopter motions.
III. Proposed Project Development
The final goal of this project is to create a swarm of 5-6 micro-quadcopters (around 6″ in diameter) that will move autonomously in response to any musical input. Each quadcopter will be controlled by a program that “listens” to the input in real time, creates movement commands, receives location data, and continues looping through this process.
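As a rough sketch of what that loop might look like in code, the Python snippet below just wires the steps together; every function in it (capture_audio_frame, read_positions, and so on) is a hypothetical placeholder for a component that still needs to be designed, not an existing API.

```python
# Minimal sketch of the per-quad control loop described above.
# All of the helper functions are hypothetical placeholders.
import time

def capture_audio_frame():
    """Grab the latest chunk of audio samples (placeholder)."""
    return []

def analyze_frame(samples):
    """Extract pitch/tempo features from the samples (placeholder)."""
    return {"pitch_hz": 440.0, "beat": False}

def read_positions():
    """Query the tracking system for each quad's position (placeholder)."""
    return {}

def plan_setpoints(features, positions):
    """Turn music features and current positions into new setpoints (placeholder)."""
    return {}

def send_commands(setpoints):
    """Transmit position/attitude commands to the quads (placeholder)."""
    pass

def control_loop(rate_hz=50):
    period = 1.0 / rate_hz
    while True:
        samples = capture_audio_frame()    # "listen" to the input in real time
        features = analyze_frame(samples)  # pitch, beat, loudness, ...
        positions = read_positions()       # location feedback
        setpoints = plan_setpoints(features, positions)
        send_commands(setpoints)           # movement commands
        time.sleep(period)
```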
The specific movements of each quad will be controlled by the dynamics of the music. A Fast Fourier Transform (FFT) will be used to convert the music from the time domain to the frequency domain in order to analyze pitch. Another algorithm will be created to time the motion of the quads with the tempo and beat. Since part of this project’s goal is to create a series of drawings based on an in-depth analysis of music, this aspect of the project will require some regimented experimentation. This will involve several tests, each changing which dynamic music inputs control which aspects of the quad swarm. For example, the thrust of each propeller could easily be driven by the frequency of the input; however, if this is visually unappealing it would need to be changed or controlled differently. The success of this project will depend on how visually appealing each “performance” is, which is determined by the accuracy of the controller. Success here will mean that there are no collisions between individual quads, or between the quads and their environment. When this is done correctly, the quads will autonomously create a unique visual representation of any music input, across a large range of notes and genres of music. A successful quadcopter swarm will be able to respond to inputs of 150 beats per minute (bpm) or more, and be able to sense frequencies from 50 Hz to 5 kHz (a comfortable range for the human ear). This is a reasonable goal that will allow the swarm to create a response to many, but not all, songs.
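As a minimal sketch of the FFT step, the example below picks out the dominant frequency in one audio frame within the proposed 50 Hz to 5 kHz range. The 44.1 kHz sample rate and the Hann window are my assumptions here; the real analysis will likely be more involved (and will also have to handle tempo and beat).

```python
# Rough sketch: find the dominant pitch in one frame of audio samples,
# restricted to the proposed 50 Hz - 5 kHz sensing range.
import numpy as np

def dominant_frequency(samples, sample_rate=44100, f_min=50.0, f_max=5000.0):
    # Window the frame to reduce spectral leakage, then take the real FFT.
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)

    # Only consider bins inside the target range.
    mask = (freqs >= f_min) & (freqs <= f_max)
    if not np.any(mask):
        return None
    peak = np.argmax(spectrum[mask])
    return float(freqs[mask][peak])

# Quick check: a 440 Hz test tone should come back as roughly 440 Hz.
t = np.arange(0, 0.1, 1.0 / 44100)
tone = np.sin(2 * np.pi * 440.0 * t)
print(dominant_frequency(tone))  # ~440.0
```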
IV. Design Component Statement
This project will require several critical design decisions to be successful. Algorithmic systems will be designed to create tracking coordinate systems: a global system and a separate system for each quad. In order to accurately track the quadcopters’ movement in these systems and to control the position and attitude of the quads, control algorithms will be created using the designed coordinate systems. This process will also ensure that the quadcopters maintain a safe distance from each other, the environment, and any spectators while still performing the desired movements synchronized to any musical input.
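To make the two-frame idea concrete, here is a minimal sketch that maps a point from one quad’s own frame into the global frame and runs a simple separation check. The yaw-only rotation and the 0.5 m minimum separation are simplifying assumptions for illustration, not final design values; the actual controller will use the full 3D attitude.

```python
# Sketch of the global frame vs. per-quad frame relationship,
# plus a simple minimum-separation check in the global frame.
import numpy as np

def body_to_global(point_body, quad_position, quad_yaw):
    """Map a point expressed in one quad's body frame into the global frame."""
    c, s = np.cos(quad_yaw), np.sin(quad_yaw)
    # Rotation about the vertical (z) axis only, as a simplification.
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return R @ np.asarray(point_body) + np.asarray(quad_position)

def too_close(p1, p2, min_separation=0.5):
    """Flag two quads (positions in metres, global frame) that are closer than allowed."""
    return np.linalg.norm(np.asarray(p1) - np.asarray(p2)) < min_separation

# Example: a point 1 m ahead of a quad facing 90 degrees, hovering at (2, 0, 1).
print(body_to_global([1.0, 0.0, 0.0], [2.0, 0.0, 1.0], np.pi / 2))  # ~[2, 1, 1]
```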
In order for a visually appealing 3D drawing to be created by the quad motion, each quad needs to be deliberately controlled and have its motion carefully designed to follow a music input while still maintaining flight. To avoid creating a “motion library,” each motion defined by a control input will need to be scaled appropriately in order to fully realize each quad’s range of motion and acrobatic capabilities while still avoiding obstacles.
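As a sketch of this “scale rather than pre-program” idea, the snippet below maps normalized music features onto setpoints inside a safe flight volume instead of looking up canned movements. The volume bounds and the choice of which feature drives which axis are purely illustrative assumptions.

```python
# Sketch: scale music-driven features (each normalised to [0, 1]) onto a
# setpoint inside a safe flight volume, rather than using a motion library.
import numpy as np

# Hypothetical safe flight volume, in metres: (min, max) per axis.
FLIGHT_VOLUME = {"x": (-2.0, 2.0), "y": (-2.0, 2.0), "z": (0.5, 2.5)}

def scale_to_volume(feature_xyz):
    """Map three normalised features to a position setpoint inside the volume."""
    setpoint = []
    for value, axis in zip(feature_xyz, ("x", "y", "z")):
        lo, hi = FLIGHT_VOLUME[axis]
        value = float(np.clip(value, 0.0, 1.0))   # never leave the safe volume
        setpoint.append(lo + value * (hi - lo))
    return setpoint

# Example: beat phase drives x, loudness drives y, pitch drives height.
print(scale_to_volume([0.25, 0.8, 1.0]))  # [-1.0, 1.2, 2.5]
```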
V. Conclusion
Many research groups have explored quadcopter control, several of which specifically focus on music and quadcopter performances. Work has been done since 2012 that focuses on controlling quadcopter swarms in indoor and outdoor locations [11,12,16]. Other groups have focused on determining the movement limits, agility capabilities, and response times of these flying machines [13-15]. These groups have spent time exploring the technical aspects of quadcopter control. Another combined effort, by researchers from the University of Toronto and the Institute for Dynamic Systems and Control in Zurich, Switzerland, has examined how different aspects of music can be used as control inputs. The tempo and beat of a musical piece have been utilized to create a “swaying” motion in a quad [7], and further explored to synchronize its motion to a song [9]. This group built on earlier work by Dixon at the University of London in 2007, which focused on creating an audio beat tracking system [10].
While research groups have used music as an input for quad motion before, none have specifically focused on using intrinsic properties like pitch and tempo to entirely control quad motion. Other music-based quad control projects have relied heavily on choreography to determine how a quad should move. This method utilizes a “motion library”: pre-programmed movement commands that are activated at certain points to create a visually appealing reaction to a particular part of a song.
Other groups have also only used two to five quadcopters in music-based projects. Rather than focusing on a visual, analytical breakdown of music, the group from Toronto and Zurich focuses on the emotions conveyed through dance-like performances. This project is substantially different from all other music-based quad control projects in that it will use an analytical breakdown of a musical input to control the quadcopter swarm rather than a “motion library” of preprogrammed movements. Using tools like an FFT, Arduino chips, and several micro-quads, this project will focus on creating art and experiencing music, but through the process of engineering design.