
Visualiser – Part 1

In this short series we’re going to look at how we can provide a visual representation of the currently playing audio stream. While this is fairly easy to do, there are a couple of pain points along the way which, if you don’t know about them, can make the task much harder.

Before we get stuck into any code, let’s first take a look at what it is we’re looking to achieve:

[Animated GIF: the waveform visualiser in action]

We’re going to implement a simple waveform graph of the currently playing audio. It’s also worth mentioning that I hail from England, where we spell ‘visualiser’ with an ‘s’, whereas the API class we’ll be looking at uses the US spelling of ‘visualizer’ with a ‘z’. There will therefore be a spelling discrepancy when I refer to the API class as opposed to using the word ‘visualiser’ within a sentence.

So, as you may have guessed, we’re going to be looking at the android.media.audiofx.Visualizer class to do a chunk of the work for us. This was introduced in API 9 (Gingerbread), and a quick look at the Javadocs for this class indicates that it requires the android.permission.RECORD_AUDIO permission. The previous series of articles covers permissions on Marshmallow, so we won’t go into any detail here – the code to verify and request the permissions is in the source so you can study it if necessary.

However, this poses something of a problem in terms of getting the user to accept the permission. While RECORD_AUDIO makes some sense because we are capturing an audio stream, this permission actually gets presented to the user as the “Microphone” permission. Therefore we have a bit of explaining to do to the user to (hopefully!) help them understand that while we require the “microphone” permission we won’t actually be capturing anything from the microphone at all.
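As a rough sketch of how that explanation might look (this dialog is my own illustration, not part of the article’s source – the actual permission flow lives in PermissionsActivity, shown below):

// Illustrative only: explain the "Microphone" permission before requesting it
private void explainMicrophonePermission() {
    new AlertDialog.Builder(this)
            .setTitle("Microphone access")
            .setMessage("Android groups audio capture under the \"Microphone\" " +
                    "permission. We only read the audio output to draw the " +
                    "visualisation and never record from the microphone.")
            .setPositiveButton("Got it", new DialogInterface.OnClickListener() {
                @Override
                public void onClick(DialogInterface dialog, int which) {
                    startPermissionsActivity();
                }
            })
            .show();
}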

So let’s take a quick look at our MainActivity:

public class MainActivity extends AppCompatActivity implements Visualizer.OnDataCaptureListener {

    private static final int REQUEST_CODE = 0;
    static final String[] PERMISSIONS = new String[]{Manifest.permission.RECORD_AUDIO, Manifest.permission.MODIFY_AUDIO_SETTINGS};

    private Visualizer visualiser;
    private WaveformView waveformView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
        setSupportActionBar(toolbar);

        waveformView = (WaveformView) findViewById(R.id.waveform_view);
    }

    @Override
    protected void onResume() {
        super.onResume();
        PermissionsChecker checker = new PermissionsChecker(this);

        if (checker.lacksPermissions(PERMISSIONS)) {
            startPermissionsActivity();
        } else {
            startVisualiser();
        }
    }

    private void startPermissionsActivity() {
        PermissionsActivity.startActivityForResult(this, REQUEST_CODE, PERMISSIONS);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_CODE && resultCode == PermissionsActivity.PERMISSIONS_DENIED) {
            finish();
        }
    }
    .
    .
    .
    @Override
    protected void onPause() {
        if (visualiser != null) {
            // Stop capturing and detach the listener *before* releasing –
            // calling into the Visualizer after release() throws an
            // IllegalStateException ("Unable to retrieve Visualizer pointer")
            visualiser.setEnabled(false);
            visualiser.setDataCaptureListener(null, 0, false, false);
            visualiser.release();
            visualiser = null;
        }
        super.onPause();
    }
}

So this handles all of the permissions stuff. You may be wondering why we also need the MODIFY_AUDIO_SETTINGS permission. According to the Javadocs for Visualizer:

Creating a Visualizer on the output mix (audio session 0) requires permission MODIFY_AUDIO_SETTINGS

This is actually a normal-level permission, so it gets granted automatically at install time on Marshmallow, but I feel it is safer to explicitly check for normal permissions nonetheless.
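For completeness, both permissions also need to be declared in the manifest – these are the standard declarations, shown here for reference:

<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />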

Once we have the requisite permissions we call startVisualiser():

public class MainActivity extends AppCompatActivity implements Visualizer.OnDataCaptureListener {

    private static final int CAPTURE_SIZE = 256;
    .
    .
    .
    private void startVisualiser() {
        // Audio session 0 attaches us to the output mix
        visualiser = new Visualizer(0);
        visualiser.setDataCaptureListener(this, Visualizer.getMaxCaptureRate(), true, false);
        // The capture size must be set while the Visualizer is disabled
        visualiser.setCaptureSize(CAPTURE_SIZE);
        visualiser.setEnabled(true);
    }
    .
    .
    .
}

This first creates a new Visualizer instance. While it’s possible to attach this to a specific audio session, we’re interested in the currently playing audio, which is the output mix – audio session 0 (the argument to the Visualizer constructor).

A Visualizer attaches to the specified audio session and makes periodic callbacks, at the frequency specified, to the listener. The final two booleans indicate whether we’re interested in waveform and/or FFT data – we’re only interested in the waveform data (the Fast Fourier Transform is outside the scope of this article). We then set the capture size – this is the number of bytes of data that will be returned in each callback, and it must be a power of two within the range the device supports. Finally we need to enable the Visualizer, which will start capturing data.
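Two details are worth knowing here, neither of which is spelled out in the code above (the snippet below is my own defensive sketch): the capture rate passed to setDataCaptureListener() is expressed in milliHertz, and the range of valid capture sizes can be queried at runtime:

// Defensive sketch (not in the article's source): query the supported
// capture size range rather than hard-coding 256 blindly.
int[] range = Visualizer.getCaptureSizeRange();
// range[0] is the minimum and range[1] the maximum supported size;
// clamp our requested size so it is always valid on this device
int captureSize = Math.max(range[0], Math.min(CAPTURE_SIZE, range[1]));
visualiser.setCaptureSize(captureSize);

// getMaxCaptureRate() returns milliHertz, so a value of 20000
// means a maximum of 20 captures per second
int maxRate = Visualizer.getMaxCaptureRate();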

We also need to implement the two OnDataCaptureListener callback methods:

public class MainActivity extends AppCompatActivity implements Visualizer.OnDataCaptureListener {
    .
    .
    .
    @Override
    public void onWaveFormDataCapture(Visualizer thisVisualiser, byte[] waveform, int samplingRate) {
        if (waveformView != null) {
            waveformView.setWaveform(waveform);
        }
    }

    @Override
    public void onFftDataCapture(Visualizer thisVisualiser, byte[] fft, int samplingRate) {
        // NO-OP
    }
    .
    .
    .
}

We’re only bothered about the waveform data, and we pass this on to a custom view in our layout which is responsible for rendering the waveform. This view is actually pretty simple:

public class WaveformView extends View {
    private byte[] waveform;

    private WaveformRenderer renderer;

    public WaveformView(Context context) {
        super(context);
    }

    public WaveformView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    public WaveformView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
    }

    public void setRenderer(WaveformRenderer renderer) {
        this.renderer = renderer;
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        if (renderer != null) {
            renderer.render(canvas, waveform);
        }
    }

    public void setWaveform(byte[] waveform) {
        this.waveform = Arrays.copyOf(waveform, waveform.length);
        invalidate();
    }
}

This uses a separate renderer to perform the actual drawing of the waveform. The only thing worthy of note here is that we make a copy of the waveform data, because the buffer passed to the callback is only valid for the duration of that callback. In our case we call invalidate() to force a redraw of the view, but the draw may not happen until after the callback has completed, so the data could be stale or recycled if we didn’t take a copy.
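For reference, the custom view sits in activity_main.xml like any other view (the fully-qualified class name here is an assumption based on the project’s package, so check the source for the exact name):

<com.stylingandroid.visualiser.WaveformView
    android:id="@+id/waveform_view"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />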

WaveformRenderer is a simple interface:

public interface WaveformRenderer {
    void render(Canvas canvas, byte[] waveform);
}
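We’ll build the real renderer in the concluding article, but purely to illustrate how the interface plugs in, here is a minimal sketch of my own (not the article’s implementation). It relies on the fact that each byte in the capture buffer is an unsigned 8-bit sample centred around 128:

// Illustrative only – not the renderer built in Part 2
public class LineWaveformRenderer implements WaveformRenderer {
    private final Paint paint;
    private final Path path = new Path();

    public LineWaveformRenderer(int colour) {
        paint = new Paint(Paint.ANTI_ALIAS_FLAG);
        paint.setColor(colour);
        paint.setStyle(Paint.Style.STROKE);
    }

    @Override
    public void render(Canvas canvas, byte[] waveform) {
        if (waveform == null || waveform.length < 2) {
            return;
        }
        float width = canvas.getWidth();
        float height = canvas.getHeight();
        path.reset();
        for (int i = 0; i < waveform.length; i++) {
            // Each byte is an unsigned 8-bit sample centred around 128,
            // so map the 0..255 range on to the height of the canvas
            float x = width * i / (waveform.length - 1);
            float y = height * (waveform[i] & 0xFF) / 255f;
            if (i == 0) {
                path.moveTo(x, y);
            } else {
                path.lineTo(x, y);
            }
        }
        canvas.drawPath(path, paint);
    }
}

This could then be wired up with something like waveformView.setRenderer(new LineWaveformRenderer(Color.GREEN)); before the capture starts.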

The reason I’ve implemented it this way is that we could go on to add a number of other renderers on top of the simple waveform renderer that we’re going to implement. Designing things in this way completely decouples the renderer from the layouts and Views, making for a completely pluggable architecture.

In the concluding article in this series we’ll look at how we render a simple waveform.

The source code for this article is available here.

© 2016, Mark Allison. All rights reserved.
