
Visualiser – Part 2

In this short series we’re going to look at how we can provide a visual representation of the currently playing audio stream. While this is fairly easy to do, there are a couple of tricky points along the way which, if you don’t know about them, can make the task much harder.

We previously saw how we can register to receive callbacks containing waveform data and in this concluding article we’ll look at how we can actually display that data. Let’s start by creating the renderer class:
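A minimal sketch of such a renderer might look like this, assuming a WaveformRenderer interface from part 1 with a single render() method (the SimpleWaveformRenderer and newInstance() names here are illustrative rather than the definitive ones):

    import android.graphics.Canvas;
    import android.graphics.Paint;
    import android.graphics.Path;

    // Assumed interface from earlier in the series: the custom View calls
    // render() from its onDraw() method, passing null when nothing is playing
    interface WaveformRenderer {
        void render(Canvas canvas, byte[] waveform);
    }

    public class SimpleWaveformRenderer implements WaveformRenderer {
        private static final float STROKE_WIDTH = 8.0f;

        private final int backgroundColour;
        private final Paint foregroundPaint;
        private final Path waveformPath;

        public static SimpleWaveformRenderer newInstance(int backgroundColour, int foregroundColour) {
            // Create the Paint from the foreground colour once, up front
            Paint paint = new Paint();
            paint.setColor(foregroundColour);
            paint.setAntiAlias(true);
            paint.setStrokeWidth(STROKE_WIDTH);
            paint.setStyle(Paint.Style.STROKE);
            return new SimpleWaveformRenderer(backgroundColour, paint, new Path());
        }

        private SimpleWaveformRenderer(int backgroundColour, Paint foregroundPaint, Path waveformPath) {
            this.backgroundColour = backgroundColour;
            this.foregroundPaint = foregroundPaint;
            this.waveformPath = waveformPath;
        }

        // render(), renderWaveform(), and renderBlank() are shown below
    }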

We pass in foreground and background colours, and from the foreground colour we create a Paint object which will be used to draw the waveform. We also create a Path object which will hold the shape of the waveform as we build it.

It is important that we create both of these here and not in the render() method. The reason for this is that render() will be called from the onDraw() method of our custom View and we want to avoid object creation in onDraw() which could result in jank. By creating them up-front and re-using them we avoid this and keep our onDraw() cleaner and leaner. Let’s take a look at the render() method itself:
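Continuing the same sketch:

    @Override
    public void render(Canvas canvas, byte[] waveform) {
        // Fill the background first, then rebuild the path from scratch
        canvas.drawColor(backgroundColour);
        waveformPath.reset();

        if (waveform != null) {
            renderWaveform(waveform, canvas.getWidth(), canvas.getHeight());
        } else {
            renderBlank(canvas.getWidth(), canvas.getHeight());
        }

        canvas.drawPath(waveformPath, foregroundPaint);
    }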

First we draw the background colour on to the Canvas and then we clear the waveform Path object by calling reset(). We then populate the waveform Path data depending on whether we have a valid waveform (null will be passed in if there is no audio playing).

If we have a valid waveform then renderWaveform() is called:
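Sketched out, with the scaling factor of 255 chosen here to map the full byte range on to the height of the view:

    private void renderWaveform(byte[] waveform, float width, float height) {
        float xIncrement = width / (float) waveform.length;
        float yIncrement = height / 255f;
        float halfHeight = height / 2f;

        waveformPath.moveTo(0, halfHeight);
        for (int i = 1; i < waveform.length; i++) {
            // Values below zero are offset from -127 rather than balanced
            // around zero (explained below), so we invert them to keep the
            // plotted point within the bounds of the view
            float yPosition = waveform[i] > 0
                    ? height - (yIncrement * waveform[i])
                    : -(yIncrement * waveform[i]);
            waveformPath.lineTo(xIncrement * i, yPosition);
        }
        waveformPath.lineTo(width, halfHeight);
    }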

The waveform data is an array of byte values, each representing the amplitude of the waveform at a successive point in time. We need to render these according to their position in the array on the x-axis, and according to the amplitude (i.e. the value itself) on the y-axis. While each byte is in the range -128 to +127, the values below zero are actually offset from -127 rather than balanced around zero. In other words a value of +1 represents a tiny positive amplitude, +127 represents a large positive amplitude, -127 represents a tiny negative amplitude, and -1 represents a large negative amplitude. While it makes sense that a larger amplitude is represented by a larger value than a smaller amplitude, we actually want smaller amplitudes to be closer to zero, so we need to invert the negative values to achieve this. We use a lineTo() for each point to draw a straight line between each pair of adjacent points, and together these lines form the visual representation of the waveform.

When we don’t have a valid waveform then renderBlank() is called instead:
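Again as a sketch:

    private void renderBlank(float width, float height) {
        float centreY = height / 2f;
        // No audio data, so just draw a flat line across the vertical centre
        waveformPath.moveTo(0, centreY);
        waveformPath.lineTo(width, centreY);
    }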

This is much simpler – all we do is draw a horizontal line across the vertical centre.

To use this we first need to create a renderer factory:
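Something along these lines (the RendererFactory name and its method are, again, illustrative):

    public class RendererFactory {
        // Note the argument order: callers specify foreground first, while
        // the renderer itself stores the background colour first
        public WaveformRenderer createSimpleWaveformRenderer(int foreground, int background) {
            return SimpleWaveformRenderer.newInstance(background, foreground);
        }
    }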

…and then call this from the onCreate() method of our MainActivity:
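A sketch, assuming the custom View from part 1 is called VisualizerView, exposes a setRenderer() method, and appears in activity_main with the id visualiser_view (the colours here are arbitrary):

    import android.graphics.Color;
    import android.os.Bundle;
    import android.support.v7.app.AppCompatActivity;

    public class MainActivity extends AppCompatActivity {

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_main);

            RendererFactory rendererFactory = new RendererFactory();
            VisualizerView visualizerView = (VisualizerView) findViewById(R.id.visualiser_view);
            visualizerView.setRenderer(
                    rendererFactory.createSimpleWaveformRenderer(Color.GREEN, Color.DKGRAY));

            // The MediaPlayer / Visualizer wiring from part 1 is omitted here
        }
    }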

When we run this we can see the ‘flatline’ that we draw when there is no audio playing back, the waveform when we start playing some music, and even the amplitude of the entire waveform changing as we alter the volume.

That concludes this short series – the source code for this article can be found here.

© 2016, Mark Allison. All rights reserved.

CC BY-NC-SA 4.0 Visualiser – Part 2 by Styling Android is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Permissions beyond the scope of this license may be available at http://blog.stylingandroid.com/license-information.

1 Comment

  1. Hi, great article. I’ve included your code in my project with success, but I have an issue when testing it with a pure 1 kHz tone WAV as input.
    I started increasing the volume from zero amplitude. It renders a correct representation with each step as I increase the external volume control on my phone, but when it reaches a threshold output level, increasing it further makes the rendered wave decrease in amplitude.
    I think this is caused by the size of each increase: once the output amplitude goes over the maximum of the graph, it ends up decreasing the wave.
