Archive for September, 2009

How to implement sharpen and blur tools

The sharpen and blur tools are simple bitmap tools. They allow you to blur or sharpen parts of an image using a brush metaphor. In addition to being able to control the strength of the sharpen or blur, these tools also typically allow you to control the blend mode when applying the effect.

The sharpen tool has the following effect on this tweed pattern:

Tweed pattern with sharpen applied

Notice the stripe through the middle where the edges are more strongly defined.

The blur tool has the opposite effect of sharpen on the tweed pattern:

Tweed pattern with blur applied

Implementation Overview

Like the sponge tool, the sharpen and blur tools are subclasses of the FilterBrush class introduced in the dodge and burn tutorial. However, unlike sponge, they do require a slight modification to the GenericBrush class to work.

Since almost all of the code in this tutorial has been covered before, I’ll just highlight the single change to the GenericBrush class, and the new Sharpen and Blur classes. If you want to see everything, download the code.

Changing GenericBrush

If you recall, at the very beginning of this article I mentioned that sharpen and blur tools typically allow the user to modify the blend mode. This is the blend mode used when stamping the filtered image back onto the original image. GenericBrush contains the code that handles the stamping, so it will need to be modified.

We change the render:at: method for GenericBrush to be:

- (void) render:(Canvas *)canvas at:(NSPoint)point
{
	CGContextRef canvasContext = [canvas context];

	CGPoint bottomLeft = CGPointMake( point.x - CGImageGetWidth(mMask) * 0.5, point.y - CGImageGetHeight(mMask) * 0.5 );

	// Our brush has a shape and soft edges. These are replicated by using our
	//	brush tip as a mask to clip to. No matter what we render after this,
	//	it will be in the proper shape of our brush.
	CGRect brushBounds = CGRectMake(bottomLeft.x, bottomLeft.y, CGImageGetWidth(mMask), CGImageGetHeight(mMask));

	CGContextClipToMask(canvasContext, brushBounds, mMask);
	CGContextSetBlendMode(canvasContext, [self blendMode]);
	[self renderInCanvas:canvas bounds:brushBounds at:point];
}


The only change is the added call to CGContextSetBlendMode, which allows brush subclasses to determine the blend mode to use. The default implementation is simply:

- (CGBlendMode) blendMode
{
	return kCGBlendModeNormal;
}

That’s it for GenericBrush.

Sharpen Tool

The sharpen tool has just two parameters: mMode, which is the blend mode, and mStrength, which determines how strongly the sharpen effect is applied. They are initialized in init like so:

- (id) init
{
	self = [super init];
	if ( self != nil ) {
		// Set the default values for our parameters
		mMode = kCGBlendModeDarken;
		mStrength = 1.0;
	}
	return self;
}

Here are some examples of the sharpen tool parameters:

mMode                mStrength  Result
kCGBlendModeNormal   0.5        Tweed with sharpen, normal blend mode, 50%
kCGBlendModeNormal   1.0        Tweed with sharpen, normal blend mode, 100%
kCGBlendModeDarken   1.0        Tweed with sharpen, darken blend mode, 100%
kCGBlendModeLighten  1.0        Tweed with sharpen, lighten blend mode, 100%

The blend modes typically offered for a sharpen tool are kCGBlendModeNormal, kCGBlendModeDarken, kCGBlendModeLighten, kCGBlendModeHue, kCGBlendModeSaturation, kCGBlendModeColor, and kCGBlendModeLuminosity. In the examples above, kCGBlendModeDarken causes the sharpening to only affect light pixels and kCGBlendModeLighten only affects dark pixels.
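The mechanics behind those two modes are simple: darken and lighten compare the stamped (sharpened) pixel against the original pixel channel by channel and keep one of them. Here's a sketch of that rule in plain C (Core Graphics applies the same min/max comparison per channel):

```c
/* Per-channel darken and lighten blends, as Core Graphics defines
   them: darken keeps the smaller channel value, lighten the larger.
   A sharpened stamp blended with darken can therefore only ever
   darken a pixel, and blended with lighten it can only ever
   lighten one, which is how these modes restrict the effect. */
static unsigned char blend_darken(unsigned char src, unsigned char dst)
{
	return src < dst ? src : dst;
}

static unsigned char blend_lighten(unsigned char src, unsigned char dst)
{
	return src > dst ? src : dst;
}
```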

The heart of the sharpen tool is the filter it creates. The code for that is:

- (CIFilter *) filter
{
	// We need to create and configure our filter here
	CIFilter * filter = [CIFilter filterWithName:@"CIUnsharpMask"];
	[filter setDefaults];
	[filter setValue:[NSNumber numberWithFloat:mStrength * 100.0] forKey:@"inputRadius"];
	return filter;
}

You’ll notice that I’m using the unsharp mask filter to do the sharpening. I should point out that most image editors don’t use this filter to do the sharpening; instead, they seem to use a convolution filter. However, with a convolution filter it is easy for the user to overdo the effect. Unsharp mask doesn’t have that problem, plus unsharp mask is provided by the system.
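For reference, the formula behind unsharp masking is easy to state: subtract a blurred copy from the original, then add the scaled difference back. Here's a one-sample sketch of that classic formula in C (this is the textbook definition, not necessarily CIUnsharpMask's exact internals):

```c
/* Classic unsharp mask: sharpened = original + amount * (original - blurred).
   In flat areas the original equals its blurred copy, so the correction
   vanishes -- one reason the effect is harder to overdo than a fixed
   convolution kernel, which sharpens every pixel indiscriminately. */
static double unsharp_mask(double original, double blurred, double amount)
{
	return original + amount * (original - blurred);
}
```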

The final bit of the sharpen tool is returning the correct blend mode:

- (CGBlendMode) blendMode
{
	return mMode;
}

Blur Tool

The blur tool is very similar to the sharpen tool. It even takes the same parameters. The only real difference is the filter that it creates, so that’s what I’ll present first:

- (CIFilter *) filter
{
	// We need to create and configure our filter here.
	CIFilter * filter = [CIFilter filterWithName:@"CIBoxBlur"];
	[filter setDefaults];
	[filter setValue:[NSNumber numberWithFloat:mStrength * 7.0] forKey:@"inputRadius"];

	return filter;
}

Here I use the built-in CIBoxBlur filter. mStrength ranges from 0 to 1, inclusive. I arrived at the coefficient of 7 simply by trial and error, until I got something that “looked right.”
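A box blur itself is about the simplest blur there is: each output sample is the unweighted average of its neighbors within the radius. Here's a 1-D sketch in C of what CIBoxBlur does over a 2-D neighborhood:

```c
/* 1-D box blur: the output at `index` is the mean of all samples
   within `radius` of it, skipping samples that fall off either edge.
   A larger radius averages over more neighbors, hence a softer blur,
   which is why mStrength is mapped onto inputRadius above. */
static double box_blur_at(const double *samples, int count, int index, int radius)
{
	double sum = 0.0;
	int n = 0;
	for (int i = index - radius; i <= index + radius; i++) {
		if (i < 0 || i >= count)
			continue;	/* off the edge; don't count it */
		sum += samples[i];
		n++;
	}
	return sum / n;
}
```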

Here are some examples of the blur tool:

mMode                mStrength  Result
kCGBlendModeNormal   0.5        Tweed with 50% blur, normal blend mode
kCGBlendModeNormal   1.0        Tweed with 100% blur, normal blend mode
kCGBlendModeDarken   1.0        Tweed with 100% blur, darken blend mode
kCGBlendModeLighten  1.0        Tweed with 100% blur, lighten blend mode


Sharpen and blur are the last of the common filter brushes. Enjoy, and download the sample code.

How to implement a sponge tool

There aren’t many bitmap tools that I look at and think “hmmm… I could probably crank that out in about half an hour.” But because of some previous work I had done, it turned out I wasn’t too far from the mark when it comes to the sponge tool.


In concept, the sponge tool is a simple tool. It either increases or decreases the saturation of an image. Saturation describes how intense a given color is. 0% saturation means the color is gray. In the implementation that I will show, the sponge tool will behave like a brush.

Thus, if I have a simple red square with about 50% saturation, the sponge tool in saturate mode affects it like so:

Red square with saturation applied

The desaturation mode has the opposite effect on the same image:

Red square with desaturation applied

In addition to the saturate/desaturate mode, the sponge tool has a “flow” parameter, which determines how much to saturate or desaturate the image.

Implementation Overview

The sponge tool is simply a subclass of the FilterBrush class, which I created for the dodge and burn tutorial. In fact the only difference between this tutorial and that one is the new subclass Sponge. For that reason I’ll only be covering the Sponge class in this tutorial. For an explanation as to how the rest of the code works, please refer back to the dodge and burn tutorial.

As always, I’ve provided sample code for this article.



The sponge tool has two parameters which are initialized in the init method:

- (id) init
{
	self = [super init];
	if ( self != nil ) {
		// Set the default values for our parameters
		mFlow = 0.5;
		mMode = kSpongeMode_Desaturate;
	}
	return self;
}

The two parameters for the sponge tool are mFlow and mMode.

  • mFlow: Flow determines how strongly the saturation or desaturation effect is applied to the image. It ranges from 0.0 to 1.0, where 0.0 means the effect isn’t applied, and 1.0 means the effect is at its strongest.

    Sponge in saturate mode, varying flow:

    mFlow Result
    0.25 Red square with 25% saturation applied
    0.5 Red square with 50% saturation applied
    1.0 Red square with 100% saturation applied

  • mMode: Mode determines whether the sponge saturates or desaturates the image. It has only two settings: saturate and desaturate. Saturate increases the color’s saturation, while desaturate decreases it.

    Sponge examples:

    mMode Result
    kSpongeMode_Saturate Red square with saturation applied
    kSpongeMode_Desaturate Red square with desaturation applied

In the examples I used a red rectangle with about 50% saturation, so that both the saturation and desaturation effects would show up on it.

Creating the filter

If you recall from the dodge and burn tutorial, the only real responsibility of a FilterBrush subclass is to create a filter that will be applied to each brush stamp.

- (CIFilter *) filter
{
	// We need to create and configure our filter here. CIColorControls has controls
	//	for saturation which is what we care about.
	CIFilter * filter = [CIFilter filterWithName:@"CIColorControls"];
	[filter setDefaults];
	if ( mMode == kSpongeMode_Saturate )
		[filter setValue:[NSNumber numberWithFloat:1 + mFlow] forKey:@"inputSaturation"];
	else
		[filter setValue:[NSNumber numberWithFloat:1 - mFlow] forKey:@"inputSaturation"];

	return filter;
}

Unlike the dodge and burn tools, I don’t use a custom filter. Instead, I use the system provided CIColorControls, which has a parameter for saturation. The saturation parameter has a range of 0 to 2, where 1 is the identity value. Thus, if I’m trying to saturate the image, I add 1.0 to mFlow, and if I’m trying to desaturate the image, I subtract mFlow from 1.0.
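As a mental model of what that saturation parameter does, think of it as interpolating each channel toward the pixel's gray value. Here's a C sketch of that idea (conceptually what a saturation control does, not necessarily CIColorControls' exact math):

```c
/* Scale saturation by interpolating each channel toward the pixel's
   luminance (computed here with Rec. 601 luma weights): s == 1 is
   the identity, s == 0 collapses the color to gray, and s == 2
   doubles its distance from gray -- matching the 0..2 range that
   CIColorControls exposes for inputSaturation. */
static void adjust_saturation(double rgb[3], double s)
{
	double luma = 0.299 * rgb[0] + 0.587 * rgb[1] + 0.114 * rgb[2];
	for (int i = 0; i < 3; i++)
		rgb[i] = luma + s * (rgb[i] - luma);
}
```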

That’s it. The rest of the heavy lifting is done by the framework I created in the dodge and burn tutorial.


The main thing this tutorial demonstrates to me is that the time spent building a framework for filter brushes paid off. It should help in the future if I decide to tackle the blur and sharpen brushes as well.

Hopefully you found this post enlightening. Don’t forget to download the source code.

[UIImage imageNamed:] is a memory leak, but only on 2.x

Update: I was contacted by an Apple engineer saying this bug was fixed in 3.0, and asked if I could reproduce it there. I tried my sample project again, and I was unable to reproduce it on 3.0. This problem only seems to affect 2.x devices.

Not too long ago I was profiling an iPhone app that I had written for a client. Their testers had found that using a certain feature for long enough ended up causing the app to be ejected because of a low memory situation.

After some quality, yet painful, time with Instruments I discovered I was never deallocating image memory in the form of CGImageRef and its internal drawing cache. (It turns out that if you draw a CGImageRef while on a thread other than the main thread, it creates a cache of the bitmap data. However, this wasn’t my problem.)

At first I thought I simply forgot to release the UIImages somewhere. I put in some logging code and was able to confirm that I was releasing the UIImages the correct number of times, in the right places. But it appeared that something had retained them an extra time.

That’s when I happened to notice that only the UIImages I had constructed with imageNamed: were failing to be released. I suspected that they were being retained in a global cache. The only problem is that this cache is not released in low-memory situations. Even when my app receives a low memory notification and I free up all the memory I can, the UIImage global cache just sits there, clutching its unused UIImages to its chest, muttering.

Unfortunately for me, some of the UIImages I was loading were large, and I was expecting them to be deallocated when I released them. As the feature was used, different UIImages were loaded with imageNamed: and never released until the phone ran out of memory and my app was ejected.
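In plain C terms, the failure mode is a cache that keeps its own reference to everything it has ever loaded and never evicts. A hypothetical sketch of that behavior (not UIKit's actual implementation):

```c
#include <stdlib.h>
#include <string.h>

/* A never-evicting cache in the spirit of the one behind imageNamed:.
   The cache holds its own reference to each entry, so no matter how
   many times the caller "releases" the pointer it got back, the
   memory stays allocated for the life of the process. */
#define MAX_ENTRIES 64
static struct { char *name; void *data; } gCache[MAX_ENTRIES];
static int gCacheCount = 0;

static void *cached_load(const char *name, size_t size)
{
	for (int i = 0; i < gCacheCount; i++)
		if (strcmp(gCache[i].name, name) == 0)
			return gCache[i].data;	/* hit: hand back the shared copy */
	/* miss: allocate and remember forever -- nothing ever frees these */
	char *copy = malloc(strlen(name) + 1);
	strcpy(copy, name);
	gCache[gCacheCount].name = copy;
	gCache[gCacheCount].data = calloc(1, size);
	return gCache[gCacheCount++].data;
}
```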

The solution was to write a replacement method for imageNamed: that didn’t cache the UIImage. Here’s what it looks like:

@implementation UIImage (OrderNDev)

+ (id) imageNamedNoCache:(NSString *)name
{
	NSString *basename = [name stringByDeletingPathExtension];
	NSString *extension = [name pathExtension];
	NSString *path = [[NSBundle mainBundle] pathForResource:basename ofType:extension];
	return [[[UIImage alloc] initWithContentsOfFile:path] autorelease];
}

@end

The moral of the story is that imageNamed: is only for use with small images that are used constantly throughout the app’s lifetime. Don’t use imageNamed: if you want that memory back on a 2.x device. Everything will work fine on a 3.x device, though.