Visualiser – Part 2

In this short series we’re going to look at how we can provide a visual representation of the currently playing audio stream. While this is fairly easy to do, there are a couple of tricky points along the way which, if you don’t know about them, can make the task much harder.

We previously saw how we can register to receive callbacks containing waveform data and in this concluding article we’ll look at how we can actually display that data. Let’s start by creating the renderer class:

class SimpleWaveformRenderer implements WaveformRenderer {
    @ColorInt
    private final int backgroundColour;
    private final Paint foregroundPaint;
    private final Path waveformPath;

    static SimpleWaveformRenderer newInstance(@ColorInt int backgroundColour, @ColorInt int foregroundColour) {
        Paint paint = new Paint();
        paint.setColor(foregroundColour);
        paint.setAntiAlias(true);
        paint.setStyle(Paint.Style.STROKE);
        Path waveformPath = new Path();
        return new SimpleWaveformRenderer(backgroundColour, paint, waveformPath);
    }

    SimpleWaveformRenderer(@ColorInt int backgroundColour, Paint foregroundPaint, Path waveformPath) {
        this.backgroundColour = backgroundColour;
        this.foregroundPaint = foregroundPaint;
        this.waveformPath = waveformPath;
    }
    .
    .
    .
}

We pass in foreground and background colours, and from the foreground colour we create a Paint object which will be used to draw the waveform. We also create a Path object which will hold the geometry of the waveform as we build it.

It is important that we create both of these here and not in the render() method. The reason for this is that render() will be called from the onDraw() method of our custom View and we want to avoid object creation in onDraw() which could result in jank. By creating them up-front and re-using them we avoid this and keep our onDraw() cleaner and leaner. Let’s take a look at the render() method itself:
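The custom View itself isn’t shown in this article, but based on the description above it might look something like this. This is a minimal sketch: the `WaveformView` name and `setRenderer()` method match the usage later in the article, while the `setWaveform()` method and the rest of the class are assumptions about how the pieces fit together:

```java
import android.content.Context;
import android.graphics.Canvas;
import android.util.AttributeSet;
import android.view.View;

public class WaveformView extends View {
    private WaveformRenderer renderer;
    private byte[] waveform;

    public WaveformView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    public void setRenderer(WaveformRenderer renderer) {
        this.renderer = renderer;
    }

    // Called with fresh waveform data; triggers a redraw
    public void setWaveform(byte[] waveform) {
        this.waveform = waveform;
        invalidate();
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        if (renderer != null) {
            // No object creation here – the renderer re-uses its Paint and Path
            renderer.render(canvas, waveform);
        }
    }
}
```

Note how onDraw() does nothing but delegate to the renderer, keeping the allocation-free discipline described above.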

class SimpleWaveformRenderer implements WaveformRenderer {
    private static final int Y_FACTOR = 0xFF;
    private static final float HALF_FACTOR = 0.5f;
    .
    .
    .
    @Override
    public void render(Canvas canvas, byte[] waveform) {
        canvas.drawColor(backgroundColour);
        float width = canvas.getWidth();
        float height = canvas.getHeight();
        waveformPath.reset();
        if (waveform != null) {
            renderWaveform(waveform, width, height);
        } else {
            renderBlank(width, height);
        }
        canvas.drawPath(waveformPath, foregroundPaint);
    }
}

First we draw the background colour on to the Canvas and then we clear the waveform Path object by calling reset(). We then populate the waveform Path data depending on whether we have a valid waveform (null will be passed in if there is no audio playing).

If we have a valid waveform then renderWaveform() is called:

class SimpleWaveformRenderer implements WaveformRenderer {
    private static final int Y_FACTOR = 0xFF;
    private static final float HALF_FACTOR = 0.5f;
    .
    .
    .
    private void renderWaveform(byte[] waveform, float width, float height) {
        float xIncrement = width / (float) (waveform.length);
        float yIncrement = height / Y_FACTOR;
        int halfHeight = (int) (height * HALF_FACTOR);
        waveformPath.moveTo(0, halfHeight);
        for (int i = 1; i < waveform.length; i++) {
            float yPosition = waveform[i] > 0 ? height - (yIncrement * waveform[i]) : -(yIncrement * waveform[i]);
            waveformPath.lineTo(xIncrement * i, yPosition);
        }
        waveformPath.lineTo(width, halfHeight);
    }
    .
    .
    .
}

The waveform data is an array of byte values, each representing the amplitude of the audio signal at a point in time (this is time-domain sample data – frequency information would come from the FFT capture instead). We render these according to their position in the array on the x-axis, and according to the amplitude (i.e. the value itself) on the y-axis. While each byte is in the range -128 to +127, the values below zero are actually offset from -127 rather than balanced around zero. In other words, a value of +1 represents a tiny positive amplitude, +127 represents a large positive amplitude, -127 represents a tiny negative amplitude, and -1 represents a large negative amplitude. While it makes sense that a larger amplitude is represented by a larger value, we actually want smaller amplitudes to be drawn closer to the centre line, so we need to invert the negative values to achieve this. We use a lineTo() for each point to draw a straight line between each pair of adjacent points, and this gives us the visual form of the waveform.
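The per-sample calculation can be checked in isolation. Here’s a small, self-contained sketch (plain Java, no Android dependencies, and the `yPosition()` helper is mine, not from the article) that applies the same formula as renderWaveform() to a few sample byte values, using a 255-pixel-high canvas so that yIncrement works out to exactly 1:

```java
public class WaveformMappingDemo {
    // Mirrors the per-sample calculation in renderWaveform():
    // positive bytes are measured up from the bottom of the canvas,
    // negative bytes are inverted and measured down from the top.
    static float yPosition(byte sample, float height) {
        float yIncrement = height / 0xFF; // Y_FACTOR
        return sample > 0 ? height - (yIncrement * sample)
                          : -(yIncrement * sample);
    }

    public static void main(String[] args) {
        float height = 255f; // yIncrement == 1, so positions equal raw offsets
        System.out.println(yPosition((byte) 127, height));  // 128.0
        System.out.println(yPosition((byte) 1, height));    // 254.0
        System.out.println(yPosition((byte) -1, height));   // 1.0
        System.out.println(yPosition((byte) -127, height)); // 127.0
    }
}
```

Remember that on an Android Canvas the y-axis points downwards, so y = 0 is the top edge and y = height is the bottom.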

When we don’t have a valid waveform then renderBlank() is called instead:

class SimpleWaveformRenderer implements WaveformRenderer {
    .
    .
    .
    private void renderBlank(float width, float height) {
        int y = (int) (height * HALF_FACTOR);
        waveformPath.moveTo(0, y);
        waveformPath.lineTo(width, y);
    }
}

This is much simpler – all we do is draw a horizontal line across the vertical centre.

To use this we first need to create a renderer factory:

public class RendererFactory {
    public WaveformRenderer createSimpleWaveformRenderer(@ColorInt int foreground, @ColorInt int background) {
        return SimpleWaveformRenderer.newInstance(background, foreground);
    }
}

…and then call this from the onCreate() method of our MainActivity:

public class MainActivity extends AppCompatActivity implements Visualizer.OnDataCaptureListener {

    private Visualizer visualiser;
    private WaveformView waveformView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
        setSupportActionBar(toolbar);

        waveformView = (WaveformView) findViewById(R.id.waveform_view);
        RendererFactory rendererFactory = new RendererFactory();
        waveformView.setRenderer(rendererFactory.createSimpleWaveformRenderer(Color.GREEN, Color.DKGRAY));
    }
    .
    .
    .
}
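For completeness, the capture callbacks (which MainActivity implements via Visualizer.OnDataCaptureListener, as covered in Part 1) simply forward the waveform bytes to the view. This is a sketch: the setWaveform() method on WaveformView is an assumption rather than something shown in the article:

```java
@Override
public void onWaveFormDataCapture(Visualizer visualizer, byte[] waveform, int samplingRate) {
    // Hand the raw time-domain samples to the view, which will invalidate itself
    waveformView.setWaveform(waveform);
}

@Override
public void onFftDataCapture(Visualizer visualizer, byte[] fft, int samplingRate) {
    // Not used – we only render the time-domain waveform
}
```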

When we run this we can see the ‘flatline’ that we draw when there is no audio playing, the waveform when we start playing some music, and even the amplitude of the entire waveform changing as we alter the volume.

That concludes this short series – the source code for this article can be found here.

© 2016, Mark Allison. All rights reserved.

Copyright © 2016 Styling Android. All Rights Reserved.
Information about how to reuse or republish this work may be available at http://blog.stylingandroid.com/license-information.

2 Comments

  1. Hi, great article. I’ve included your code in my project successfully, but I have an issue when testing it with a pure 1 kHz tone WAV as input.
    Starting from zero amplitude, each step as I increase the volume with my phone’s external volume control is represented correctly. But once the output level reaches a certain threshold, increasing it further makes the rendered wave’s amplitude decrease.
    I think this is caused by the step size: when the output amplitude goes beyond the maximum of the graph, the drawn wave ends up smaller.

  2. I’m making an audio visualizer for school and this was extremely helpful! The permissions articles were particularly clarifying. Thanks!
