UX/UI/Research Based Project
Role: Interactive Designer, Researcher, Developer, Co-Author
HTML, CSS, JavaScript, Web Audio API, Canvas
As part of my ongoing research, I am investigating how sound can help improve our memory and its potential for learning or game-based learning (GBL) applications. To that end, I explored the Web Audio API and the intricacies of sound-based interaction, looking mainly at frequency and at generative oscillator waveforms: sine, triangle, square, and sawtooth. These tones are generated dynamically in JavaScript. The piece illustrates the power of the Web Audio API, its potential for future web applications, and the possibility of creating generative music directly in the browser. I developed the example below as an extension of research I co-authored on non-speech audio. You can read about it here. The example also features a visualizer that analyses frequencyBinCount, gain, and sound duration. I use this code to build a notation system that begins at a frequency of 16.352 Hz, which I assign the value C 10. I use the traditional note names (A to G), each followed by a number indicating the octave it belongs to.
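To give a sense of how little code this takes, here is a minimal sketch of the idea: one AudioContext, one OscillatorNode per tone, and a GainNode to keep the level in check. The helper names (playTone, noteFrequency) are illustrative, not the project's actual code, and the equal-tempered math is only an approximation of the chart, which starts from the 16.352 Hz base note.

<pre><code class="language-javascript">// Minimal sketch, assuming a browser context; helper names are hypothetical.
const audioCtx = new (window.AudioContext || window.webkitAudioContext)();

// Derive a frequency from the chart's base note (16.352 Hz) by doubling
// once per octave and stepping in equal-tempered semitones within the octave.
function noteFrequency(semitone, octave, base = 16.352) {
  return base * Math.pow(2, octave) * Math.pow(2, semitone / 12);
}

// Play a tone with the given waveform: 'sine', 'triangle', 'square', or 'sawtooth'.
function playTone(frequency, type = 'sine', duration = 0.5) {
  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();

  osc.type = type;                 // the waveform selects the timbre
  osc.frequency.value = frequency; // in Hz
  gain.gain.value = 0.2;           // keep the level moderate; sawtooth is harsh

  osc.connect(gain).connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + duration);
}

// Example: the chart's base note, three octaves up, as a sawtooth.
playTone(noteFrequency(0, 3), 'sawtooth');
</code></pre>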
Before you activate the frequency chart, be sure to set your volume to a moderate level, especially if you are wearing headphones. Sawtooth can be very harsh and unpleasant to the ears.
Also, a note to readers: I am well aware that this frequency chart is not completely accurate. What I am trying to illustrate in this example is not so much how a frequency chart works, but how powerful the Web Audio API is. As you investigate the code, you will discover that a single Web Audio API class generates 384 tones across four different waveforms, not to mention a fully functioning frequency visualizer rendering in real time. It is an example of how the Web Audio API provides a powerful and versatile system for controlling and generating audio from a web browser. It also touches on a personal investigation into one's sensitivity to tone through interaction, or what is increasingly called audio UX. As you explore and listen to sounds, your brain samples tones much the way we look at colors and develop a visual taste or repertoire. That repertoire serves as a resource for UX sound. Try it out!
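The real-time visualizer works the same way: a single AnalyserNode reports the current spectrum, which is then drawn to a canvas on each animation frame. The sketch below is only an outline, assuming a hypothetical &lt;canvas id="viz"&gt; element on the page and the audioCtx from the previous snippet.

<pre><code class="language-javascript">// Minimal visualizer sketch; element id and routing are assumptions.
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 256;                        // frequencyBinCount = fftSize / 2
const data = new Uint8Array(analyser.frequencyBinCount);

// Route the oscillator/gain chain through the analyser before the speakers,
// e.g. gain.connect(analyser).connect(audioCtx.destination);

const canvas = document.getElementById('viz');
const ctx = canvas.getContext('2d');

function draw() {
  requestAnimationFrame(draw);
  analyser.getByteFrequencyData(data);         // current spectrum, 0-255 per bin

  ctx.clearRect(0, 0, canvas.width, canvas.height);
  const barWidth = canvas.width / data.length;
  data.forEach((value, i) => {
    const barHeight = (value / 255) * canvas.height;
    ctx.fillRect(i * barWidth, canvas.height - barHeight, barWidth - 1, barHeight);
  });
}
draw();
</code></pre>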
As this project develops, it becomes a possible tool for UI/UX designers to quickly reference sounds generated in the browser and gauge which frequencies are more usable than others. As I continue researching, I am starting to develop an audio synesthesia app that will leverage artificial intelligence to explore the long-held idea of a relationship between color and sound. In other words, were Kandinsky and the band "Love and Rockets" right when they suggested that sounds emit certain colors? Below is a screenshot of what I am working on; a live version is coming soon.