Visualising Waveforms with Web Audio

This post is part of the series Make Noise with Web Audio API and originally appeared on Sonoport Tech Blog.


Welcome to another segment on web audio! Previously we posted a simple tutorial on how to use oscillators, filters and gain nodes. That tutorial will come in handy here, as we’re about to go one step further and build a basic visualisation of waveforms using web audio. A good reference would be this, though when we first started it was difficult to understand most of it, because we had dived straight into web audio without really understanding any of the basics yet.

So… are you ready? Let’s start!

Simplifying the code

In the previous tutorial, we ended up with this code to make sound:

// Create the Audio Context

var context = new AudioContext();

// Create your oscillator, filter and gain node by declaring them as variables

var osc = context.createOscillator();

var filter = context.createBiquadFilter();
filter.type = 'lowpass';
filter.frequency.value = 250;

var oscGain = context.createGain();
oscGain.gain.value = 0.3;


// Connect the nodes together

function makeConnection() {
    osc.connect(filter);
    filter.connect(oscGain);
}

// Play the sound inside of Chrome

function playSound() {
    oscGain.connect(context.destination);
    osc.start(0);
    osc.stop(3);
}

makeConnection();
playSound();

For the purposes of this tutorial, we will not use the BiquadFilter node. Instead we’ll just create sound using an oscillator node and a gain node.

// Create the Audio Context

var context = new AudioContext();

// Create the analyser node (we'll look at this in the next section)

var analyser = context.createAnalyser();

// Play the sound inside of Chrome

function playSound() {
    var osc = context.createOscillator();
    osc.frequency.value = 60;
    osc.type = 'square';

    var oscGain = context.createGain();
    oscGain.gain.value = 0.2;

    osc.start(context.currentTime);
    osc.stop(context.currentTime + 3);

    osc.connect(oscGain);
    oscGain.connect(analyser); /* connect the gain node to the analyser */
    analyser.connect(context.destination);
}

playSound();

We will also drop the makeConnection function and make the connections, including the one to the analyser node, inside our playSound function.

You may test this code out at JSFiddle.
The reason we are simplifying the code is so that you won’t get confused once we integrate the AnalyserNode, its data collection methods and the HTML5 canvas element.

Using the AnalyserNode

Now, before we get to the exciting part, I would like to introduce you to the AnalyserNode, which is essential for visualising our audio waveforms.

So what is an AnalyserNode?

It is an AudioNode that passes the audio stream unchanged from the input to the output, but allows you to take the generated data, process it, and create audio visualizations.

Our audio source is our oscillator node, and the analyser node extracts the frequency, waveform, and other data from the original oscillator node. We must create the analyser node using the AudioContext.createAnalyser() method.

Creating the analyser node

var context = new AudioContext();
var analyser = context.createAnalyser();

The Connections

osc.connect(analyser);
analyser.connect(context.destination);

An interesting thing to note is that you do not have to connect the analyser node to any output for it to work. It will work as long as its input is connected to the source, either directly or via another node. For now, we are connecting it to context.destination so that we will be able to hear our oscillator.

The analyser node will then capture audio data using a Fast Fourier Transform (FFT). The size of the FFT depends on what you specify as the AnalyserNode.fftSize property value (if no value is specified, the default is 2048).

This brings us to…

Analyser Node’s Data Collection Methods

There are different methods for capturing various types of data.

To capture frequency data, we use AnalyserNode.getFloatFrequencyData() or AnalyserNode.getByteFrequencyData().

To capture waveform data, we use AnalyserNode.getFloatTimeDomainData() or AnalyserNode.getByteTimeDomainData().

The float methods, AnalyserNode.getFloatFrequencyData() and AnalyserNode.getFloatTimeDomainData(), copy their data into a Float32Array typed array. Float32Array represents an array of 32-bit floating point numbers (corresponding to the C float data type) in the platform byte order. More information on that here.

The byte methods, AnalyserNode.getByteFrequencyData() and AnalyserNode.getByteTimeDomainData(), copy their data into a Uint8Array, which is an array of 8-bit unsigned integers. More information here.

Let’s say you are dealing with an FFT size of 2048 (which is the default) and we read the AnalyserNode.frequencyBinCount property:

analyser.fftSize = 2048;
var bufferLength = analyser.frequencyBinCount; 
var dataArray = new Uint8Array(bufferLength);

What is happening here is that we store the value of the AnalyserNode.frequencyBinCount property (half the FFT size, so 1024) in a new variable, bufferLength. Then we create a Uint8Array of that length and keep it in the variable dataArray.
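Since frequencyBinCount is always half of fftSize, the sizes involved can be checked with plain arithmetic. This is just an illustrative sketch of the relationship, separate from the tutorial code:

```javascript
// frequencyBinCount is always half of fftSize, so the default
// fftSize of 2048 yields 1024 data values per capture.
var fftSize = 2048;
var frequencyBinCount = fftSize / 2;
var dataArray = new Uint8Array(frequencyBinCount);

console.log(dataArray.length); // 1024
```

A typed array starts out zero-filled; no audio data has been copied into it yet.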

That was just the process of setting up the array. To actually retrieve the data and copy it into our array, we must call the data collection method we want, with the array passed as its argument.

Like this: analyser.getByteTimeDomainData(dataArray);

Now that we have the audio data captured in our array, we can move on to using it to visualise our waveform, which brings us to…
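Before drawing, it helps to know what the numbers in dataArray mean: getByteTimeDomainData() fills the array with values from 0 to 255, where 128 corresponds to silence (the zero line of the waveform). Here is a sketch of the same sample-to-pixel mapping the drawing loop will use, with a hypothetical canvas HEIGHT of 100:

```javascript
// Map a byte sample (0–255, centre 128) to a vertical canvas position,
// the same arithmetic the drawing loop performs for each point.
var HEIGHT = 100;

function sampleToY(sample) {
  var v = sample / 128.0;  // normalise so the centre line becomes 1.0
  return v * HEIGHT / 2;   // scale to canvas coordinates
}

console.log(sampleToY(128)); // 50: a silent sample sits mid-canvas
console.log(sampleToY(0));   // 0: the lowest sample value is at the top
```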

The HTML5 canvas Element

Once again, a good reference would be here.

So what is the HTML5 canvas element?

Added in HTML5, the HTML canvas element can be used to draw graphics via scripting in JavaScript. For example, it can be used to draw graphs, make photo compositions, create animations, or even do real-time video processing or rendering.

The HTML5 canvas element is really fun to play with, and I don’t want to squeeze all the tutorials into one blog post, so here are some links for you to check out!

Web Articles

Video Tutorials

It’s really a lot to take in. I would recommend taking your time to learn the basics and experiment with the <canvas> element until you are comfortable with it before moving on with the rest of this tutorial, as it might be hard (but not impossible) to understand without any previous knowledge.

Are you ready?

As a continuation of the previous steps, we already have the data ready for visualisation.

First we need to clear the canvas of any previous drawings to get ready for display. (Here, myCanvas is the canvas’s 2D drawing context, which we obtain with canvas.getContext("2d") in the full listing below.)

myCanvas.clearRect(0, 0, WIDTH, HEIGHT);

Then we define a function draw()

function draw() {

Here we use requestAnimationFrame to keep the drawing function looping once it starts.

drawVisual = requestAnimationFrame(draw);

Then, referring to the previous section on retrieving the data and copying it into our array, this is where we do it.

analyser.getByteTimeDomainData(dataArray);

After that we fill the canvas with a solid colour.

myCanvas.fillStyle = 'rgb(200, 200, 200)';
myCanvas.fillRect(0, 0, WIDTH, HEIGHT);

Set the width of the line and stroke colour for the waveform, then start drawing the path.

myCanvas.lineWidth = 2;
myCanvas.strokeStyle = 'rgb(0, 0, 0)';

myCanvas.beginPath();

Set the width of each segment of the line by dividing the canvas width by the array length (the frequencyBinCount defined earlier). Then we define an x variable to track the position at which to draw each segment of the line.

var sliceWidth = WIDTH * 1.0 / bufferLength;
var x = 0;

Here we make a loop, defining a small segment of the waveform for each point in the buffer at a certain height based on the data point value from the array, then moving the line across to the place where the next segment will be drawn.

for (var i = 0; i < bufferLength; i++) {
  var v = dataArray[i] / 128.0;
  var y = v * HEIGHT / 2;

  if (i === 0) {
    myCanvas.moveTo(x, y);
  } else {
    myCanvas.lineTo(x, y);
  }

  x += sliceWidth;
}
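The x bookkeeping in this loop can be checked in isolation. Here is a sketch using a hypothetical 4-sample buffer on a 100-pixel-wide canvas, not part of the tutorial code:

```javascript
// Each sample advances x by sliceWidth, so the buffer spans
// the canvas width exactly.
var WIDTH = 100;
var bufferLength = 4;
var sliceWidth = WIDTH * 1.0 / bufferLength; // 25 pixels per segment

var positions = [];
var x = 0;
for (var i = 0; i < bufferLength; i++) {
  positions.push(x);
  x += sliceWidth;
}

console.log(positions); // [0, 25, 50, 75]
```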

Then we finish the line at the middle of the right-hand side of the canvas and draw the stroke we defined.

myCanvas.lineTo(WIDTH, HEIGHT / 2);
myCanvas.stroke();
}

Finally we call the draw() function to start off the process.

draw();

So the whole canvas code we just wrote looks like this.

myCanvas.clearRect(0, 0, WIDTH, HEIGHT);

function draw() {
  drawVisual = requestAnimationFrame(draw);
  analyser.getByteTimeDomainData(dataArray);

  myCanvas.fillStyle = 'rgb(200, 200, 200)';
  myCanvas.fillRect(0, 0, WIDTH, HEIGHT);
  myCanvas.lineWidth = 2;
  myCanvas.strokeStyle = 'rgb(0, 0, 0)';

  myCanvas.beginPath();

  var sliceWidth = WIDTH * 1.0 / bufferLength;
  var x = 0;

  for (var i = 0; i < bufferLength; i++) {
    var v = dataArray[i] / 128.0;
    var y = v * HEIGHT / 2;

    if (i === 0) {
      myCanvas.moveTo(x, y);
    } else {
      myCanvas.lineTo(x, y);
    }

    x += sliceWidth;
  }

  myCanvas.lineTo(WIDTH, HEIGHT / 2);
  myCanvas.stroke();
}

draw();

Then, if we combine the web audio code and the visualiser code, we get this.

// Create the Audio Context

var context = new AudioContext();
var analyser = context.createAnalyser();
var WIDTH = 300;
var HEIGHT = 300;
var drawVisual;

function playSound() {
    var osc = context.createOscillator();
    osc.frequency.value = 60;
    osc.type = 'square';

    var oscGain = context.createGain();
    oscGain.gain.value = 0.2;

    osc.start(context.currentTime);
    osc.stop(context.currentTime + 3);

    osc.connect(oscGain);
    oscGain.connect(analyser); /* connect the gain node to the analyser */
    analyser.connect(context.destination);
}

var canvas = document.querySelector('.visualizer');
var myCanvas = canvas.getContext("2d");

analyser.fftSize = 2048;

var bufferLength = analyser.frequencyBinCount;
/* an unsigned long value half that of the FFT size; this generally equates
to the number of data values you will have to play with for the visualisation */

var dataArray = new Uint8Array(bufferLength);

myCanvas.clearRect(0, 0, WIDTH, HEIGHT);

function draw() {
  drawVisual = requestAnimationFrame(draw);

  analyser.getByteTimeDomainData(dataArray);

  myCanvas.fillStyle = 'rgb(230, 20, 210)';
  myCanvas.fillRect(0, 0, WIDTH, HEIGHT);
  myCanvas.lineWidth = 2;
  myCanvas.strokeStyle = 'rgb(40, 95, 95)';
  myCanvas.beginPath();

  var sliceWidth = WIDTH * 1.0 / bufferLength;
  var x = 0;

  for (var i = 0; i < bufferLength; i++) {
    var v = dataArray[i] / 128.0;
    var y = v * HEIGHT / 2;

    if (i === 0) {
      myCanvas.moveTo(x, y);
    } else {
      myCanvas.lineTo(x, y);
    }

    x += sliceWidth;
  }

  myCanvas.lineTo(WIDTH, HEIGHT / 2);
  myCanvas.stroke();
}

var analyserButton = document.getElementById("myAnalyserButton");

analyserButton.addEventListener('click', function() {
  playSound();
  draw();
});

One last bit of HTML to finish off the whole process.

  <canvas class="visualizer" id="myCanvas" width="300" height="300"></canvas>

You can check out the whole code here.




Aqilah Misuary

Social Media Coordinator at Sonoport
Aqilah Misuary is a marketing associate in Sonoport. She writes about anything related to music technology, sound design and web audio. She also has an underlying passion for anything related to visual communications. Graduated with a B.A in Music Technology from Lasalle College of the Arts, Aqilah is both a musician and visual artist in the electronic-indie band Elektone & audio-visual group Setosuary. In her free time she enjoys riding her scooter and being in museums.