
Memory Cache – Part 4

In the previous part of this series we implemented a simple memory cache based upon WeakReference. It certainly improved performance when we were using the same image multiple times within a short period, but it is easy to predict that our cached images would not survive an intensive operation, such as an Activity transition. Also, the policy of when to free cached items was completely outside of our control. While there are certainly use-cases where this is not an issue, it is often necessary to have rather more control. In this article we’ll look at LruCache which gives us precisely that.

LruCache was introduced in API level 12 (Android 3.1) but is available through the Support Library back to API level 4 (Android 1.6). It is a generic class which we can strongly type when subclassing it:

[java] import android.content.Context;
import android.graphics.Bitmap;
import android.support.v4.util.LruCache;

public class LruMemoryCache extends LruCache<String, Bitmap>
{
    private final Context context;

    public LruMemoryCache(Context context)
    {
        // Maximum cache size of 10 items
        super( 10 );
        this.context = context;
    }

    @Override
    protected Bitmap create( String key )
    {
        // Called on a cache miss to load the bitmap from assets
        return Utils.loadAsset( context, key );
    }
}
[/java]

We’ve defined a cache which will use String keys to index Bitmap objects, much the same as we did in the previous part of this series. The implementation is actually pretty straightforward: in the constructor we call the base class constructor to set the size of our cache to 10 items, and we override create(), which calls the utility method that we defined in part 2.

Using this cache is simpler still. We first create an instance of our cache:

[java] lruMemCache = new LruMemoryCache( getApplicationContext() );
[/java]

and then obtain items using the get() method:

[java] Bitmap bitmap = lruMemCache.get( ASSET_NAME );
[/java]

So, how does it work? LruCache maintains a list of cached items (in our case we defined a cache size of 10, so this list will hold up to 10 items). Whenever we try to access a specific item, LruCache tries to find it in the cache. If the item already exists, it is moved to the head of the list and returned. If the item does not exist, our create() method is called to create a new instance, which is added to the head of the list before being returned. As a new item is added, LruCache checks whether the addition would cause the list to exceed the size that we declared earlier. If not, it simply adds the item; if so, it first deletes the item at the tail of the list, which is the least recently used. Thus we now have a cache which is not controlled by garbage collection, and which is optimised to keep the most recently used items in the cache.
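To make the eviction order concrete, here is a minimal sketch using a hypothetical three-item cache of plain strings (no create() override, so a miss simply returns null):

[java] LruCache<String, String> cache = new LruCache<String, String>( 3 );
cache.put( "a", "1" );
cache.put( "b", "2" );
cache.put( "c", "3" );

// Accessing "a" moves it to the head of the list
cache.get( "a" );

// The cache is full, so adding a fourth item evicts the entry
// at the tail of the list, which is now "b"
cache.put( "d", "4" );

cache.get( "b" ); // returns null: "b" has been evicted
cache.get( "a" ); // returns "1": "a" was kept
[/java]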

So, now we’ve got some pretty useful functionality in just a few lines of code, but LruCache gives us more. Images can be tricky beasts because they vary in size quite significantly. Holding a fixed number of images (10 in our example) can result in the total size of the cache varying enormously depending on the sizes of the individual images. So what if we want to limit the memory usage of our cache? LruCache allows us to override the way that the cache size is calculated. We do this by first changing how the size of each cached item is calculated:

[java] @Override
protected int sizeOf( String key, Bitmap value )
{
    // getByteCount() requires API level 12; on earlier versions
    // use value.getRowBytes() * value.getHeight() instead
    return value.getByteCount();
}
[/java]

The default implementation of sizeOf() simply returns 1, which gives the default behaviour that we have already seen: a cache of a static number of items. We have overridden this to return the size of the bitmap in bytes instead.

Next we need to change how we specify the size of our cache by changing the constructor:

[java] public LruMemoryCache(Context context)
{
    // Maximum cache size is now 5MiB rather than 10 items
    super( 5 * 1024 * 1024 );
    this.context = context;
}
[/java]

Here we are specifying a maximum cache size of 5MiB. Whenever the cache exceeds this, items will be evicted from the tail until the total size drops below 5MiB once again. We can use whatever units we like, provided that the units used to specify the maximum size in the constructor match those returned by sizeOf().
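For example, a hypothetical variant of our cache measured in kibibytes rather than bytes might look like this:

[java] public LruMemoryCache(Context context)
{
    // Maximum size expressed in KiB, so sizeOf() must return KiB too
    super( 5 * 1024 );
    this.context = context;
}

@Override
protected int sizeOf( String key, Bitmap value )
{
    // Convert bytes to KiB to match the constructor's units,
    // never returning 0 so every entry carries some weight
    return Math.max( 1, value.getByteCount() / 1024 );
}
[/java]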

This is a much better cache implementation. It gives us greater control, and cached items will potentially survive far longer than they would under the WeakReference approach, where the next GC could clear them.

Remember to clear your cache if memory is running low. If onLowMemory() is called on your Activity, it is much better to call evictAll() on your cache and allow it to be rebuilt than it is for your app to crash with an OutOfMemoryError!
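A minimal sketch of this, assuming lruMemCache is the instance we created earlier and is reachable from the Activity:

[java] @Override
public void onLowMemory()
{
    super.onLowMemory();

    // Discard all cached bitmaps; create() will rebuild
    // entries on demand the next time they are requested
    lruMemCache.evictAll();
}
[/java]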

LruCache can also be used to manage a cache on the SD card. Rather than storing bitmaps in the cache, you can store File objects instead, calculate the size using file.length(), and override entryRemoved() in LruCache to delete the physical file on the SD card when its File is evicted from the cache.
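A rough sketch of such a cache (the Utils.loadAssetToFile() helper and the 10MiB limit are assumptions for illustration):

[java] public class SdCardCache extends LruCache<String, File>
{
    private final Context context;

    public SdCardCache(Context context)
    {
        // Allow up to 10MiB of files on external storage
        super( 10 * 1024 * 1024 );
        this.context = context;
    }

    @Override
    protected int sizeOf( String key, File value )
    {
        // Size of the physical file in bytes
        return (int) value.length();
    }

    @Override
    protected void entryRemoved( boolean evicted, String key,
        File oldValue, File newValue )
    {
        // Remove the physical file when its entry leaves the cache
        oldValue.delete();
    }

    @Override
    protected File create( String key )
    {
        // Hypothetical helper which writes the asset out to
        // external storage and returns the resulting File
        return Utils.loadAssetToFile( context, key );
    }
}
[/java]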

That completes our look at caching. Hopefully you will have a better understanding of some of the tools available to help you to select the correct approach to meet your caching requirements.

The source code for this article can be found here.

© 2012, Mark Allison. All rights reserved.


2 Comments

  1. Unfortunately, onLowMemory() is called only when the system as a whole is running out of memory, not when your app’s VM budget is running low. Typically, one will get an OutOfMemoryError without receiving onLowMemory() at all.

    It looks like in Android you don’t have a reliable callback to notify you when your app is running out of memory. So a better strategy is to plan your memory budget in advance. At app startup, you can fine-tune the size of your LruCaches so all of them fit (along with the rest of your app) in the budget returned by getMemoryClass(). That will ensure your app never exceeds its memory budget.

    It is definitely a good idea to clean up all your caches when onLowMemory() occurs, but that alone won’t save you from OutOfMemoryError.
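    A minimal sketch of the budget-planning approach the commenter describes (the one-eighth fraction and the two-argument LruMemoryCache constructor are assumptions for illustration):

    [java] ActivityManager am = (ActivityManager)
            context.getSystemService( Context.ACTIVITY_SERVICE );

    // getMemoryClass() returns the per-app heap limit in megabytes
    int heapBudgetBytes = am.getMemoryClass() * 1024 * 1024;

    // Devote one eighth of the heap to the bitmap cache; the
    // fraction is an assumption and should be tuned per app
    int cacheSize = heapBudgetBytes / 8;

    // Assumes a hypothetical constructor taking an explicit size
    LruMemoryCache cache = new LruMemoryCache( context, cacheSize );
    [/java]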
