Archive for the 'Macintosh' Category

NSConference 2010 Quiz

For the uninformed, NSConference is a Mac developer’s conference put on by Scotty “The Scottster” Scott and his faithful sidekick, Tim “The Faithful Sidekick” Isted. They’re kind of the Batman and Robin of the Mac programming conference world, but have a slightly lower probability of bat-gassing you than the real dynamic duo.

This year they aren’t content with bringing conference justice to only the UK, so they’re branching out to the good ol’ U.S. of A. by way of Atlanta, GA. To help you decide which conference you should attend, US or Europe, I’ve prepared the following quiz:

  1. What is your opinion of Seattle?

    1. It’s nice, but isn’t nearly rainy or dreary enough.
    2. I like their coffee.
    3. The residents have too many teeth.
  2. Describe your driving habits.

    1. I like to drive on the left side of the road.
    2. I like to drive on both sides of the road.
    3. I like to merge right six lanes without signaling while going 147 mph on the off ramp and giving the finger with both hands in my black Camaro.
  3. The pinnacle of human achievement is…

    1. Afternoon tea
    2. Sliced bread
    3. Hee Haw

Scoring: Give yourself -1 point for each 1 answer, 0 points for each 2 answer, and 1 point for each 3 answer.

If your score is less than zero, you should attend NSConference Europe; if greater than zero, NSConference USA. If you scored exactly zero, you are truly a cultured individual and should attend both.

Personally, I’ll be attending both, and not just because of peer pressure and insightful quizzes. I’ll be presenting a talk on how to implement a watercolor brush using Core Image and OpenCL and maybe some duct tape. If you enjoy the graphics articles that I post here, you’ll probably enjoy my presentation. If not, I hear Steve “I’m Batman” Scott does a mean Adam West impression.

Implementing AppleScript Recordability

For a side project I’m currently working on, I decided to implement an AppleScript interface. Designing and implementing one wasn’t that bad, although Apple’s AppleScript documentation was sometimes confusing. Fortunately, CocoaDev has a good overview of how to implement AppleScript support.

However, one of my frustrations when working with other apps’ AppleScript interfaces was trying to figure out how the interface was intended to be used. Sure, the AppleScript Editor would show me all the actions and classes, but it isn’t always obvious how things are supposed to fit together. Something that would help in these cases is AppleScript recording: I could record the app performing the actions I cared about, then examine how the app itself used the actions and classes. Unfortunately, it seems like only the Finder and BBEdit ever got around to implementing AppleScript recording.

In the hopes of increasing the number of apps with AppleScript recordability, I’m going to document my approach to implementing it. For brevity, I’ll assume you already have an AppleScript interface implemented for your Cocoa app.

Thinking Big Design Thoughts

If you have an AppleScript interface for your application, you may think of your app architecture as something like this:

Traditional Model for Implementing AppleScript

Here your AppleScript interface and graphical interface are independent peers, and both modify your model classes directly to accomplish your application’s tasks. Each interface is separate and largely ignorant of the other.

However, when implementing AppleScript recordability it is helpful to think about your app’s architecture in a different way:

Recordable Model for Implementing AppleScript

In this case the GUI is dependent on, and implemented in terms of, the AppleScript interface. The general guideline is that anything the GUI does that mutates the model goes through the AppleScript interface. However, if the GUI simply needs to read information from the model, it goes directly to the model, not through the AppleScript interface. To the user, model accesses appear to happen at seemingly random times, and spamming the AppleScript Editor with these accesses when recording only confuses the user.

Suppose an application has a table view and a button that deletes the currently selected item in that table view. The table view data source would be implemented the standard way, going directly to the model and bypassing the AppleScript layer. The delete button, however, since it alters the model, would be implemented by invoking the AppleScript delete command on the object represented by the current table row.
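
To make that concrete, here’s a minimal sketch of the split. The _employees array and _tableView outlet are assumptions for illustration, and ASObj is the helper class introduced later in this post:

// Read-only access: the data source goes straight to the model
- (id) tableView:(NSTableView *)tableView objectValueForTableColumn:(NSTableColumn *)tableColumn row:(NSInteger)row
{
	return [[_employees objectAtIndex:row] valueForKey:[tableColumn identifier]];
}

// Mutation: the delete button routes through the AppleScript interface
- (IBAction) deleteSelectedEmployee:(id)sender
{
	NSInteger row = [_tableView selectedRow];
	if ( row < 0 )
		return;
	[ASObj([_employees objectAtIndex:row]) invokeCommand:@"delete"];
}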

This design has some other benefits besides recordability. Notably, it helps test your AppleScript interface design and implementation. If you find that implementing a GUI feature in terms of your AppleScript interface is impossible or difficult, congratulations, you found a bug! Also, merely testing your GUI exercises your AppleScript interface. It is not a replacement for testing your AppleScript interface explicitly, but it certainly helps.

Implementation Hardships

Everything I’ve talked about so far isn’t all that novel, and has probably been met with large bucketfuls of “well, duh”s by anyone who’s ever looked into implementing AppleScript recording. The problem isn’t designing for recordability, but actually implementing it.

As things stand now, doing something as simple as “invoking the AppleScript delete command on the object represented by the current table row” is incredibly involved and painful. You have to manually build up the AppleEvent that represents the delete command and the target model object, using functions like AEBuildAppleEvent or classes like NSAppleEventDescriptor. Then you have to remember to target your app by specifying the kCurrentProcess process serial number (specifying kCurrentProcess as the ProcessSerialNumber is currently the only way to enable recording; bundle identifiers, URLs, and pid_t’s do not work), and finally parse the AppleEvent you get back into something useful. You’d have to do all of this for every property or method on your model object that you want exposed for recordability.
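
To give a feel for the pain, here’s a rough sketch of sending a recordable delete event by hand. Take it as illustrative only: error handling is omitted, and sendDeleteEventFor: is a made-up helper name.

#import <Carbon/Carbon.h>	// for AESendMessage() and the AE constants

- (void) sendDeleteEventFor:(NSObject *)modelObject
{
	// Recording only works if the event targets the kCurrentProcess
	//	process serial number
	ProcessSerialNumber psn = { 0, kCurrentProcess };
	NSAppleEventDescriptor *target = [NSAppleEventDescriptor descriptorWithDescriptorType:typeProcessSerialNumber bytes:&psn length:sizeof(psn)];

	// Manually build up the delete event (core suite, 'delo')
	NSAppleEventDescriptor *event = [NSAppleEventDescriptor appleEventWithEventClass:kAECoreSuite eventID:kAEDelete targetDescriptor:target returnID:kAutoGenerateReturnID transactionID:kAnyTransactionID];

	// The direct object is the model object's object specifier
	[event setParamDescriptor:[[modelObject objectSpecifier] descriptor] forKeyword:keyDirectObject];

	// Send the event to ourselves; a real implementation would also
	//	parse the reply into something useful
	AppleEvent reply = { typeNull, NULL };
	AESendMessage([event aeDesc], &reply, kAEWaitReply, kAEDefaultTimeout);
	AEDisposeDesc(&reply);
}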

Dreaming of Ponies

The thing is, the Cocoa runtime has a lot of the AppleScript information from your SDEF file at its disposal and could theoretically generate these interfaces for you. In an ideal world, invoking the delete action via AppleScript inside your app could be as simple as:

// Suppose ONEmployee is our model object, with the appropriate AppleScript interface implemented
ONEmployee *employee = [_employees objectAtIndex:0];
ONEmployeeASProxy *proxy = employee.appleScriptInterface;
[proxy delete];

Here, any object that implemented the objectSpecifier method for AppleScript support would automatically get an appleScriptInterface property. The object returned by appleScriptInterface would be a proxy object implementing the same methods and properties as the original object. The proxy object would implement these methods by building up the appropriate AppleEvents, sending them, and parsing the resulting event back into a usable object.
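
For reference, the objectSpecifier method everything hangs off of might look something like the following sketch. This assumes the application class declares an employees element and that the app delegate owns the actual array; adjust for your own containment hierarchy.

// A minimal sketch: build an index specifier relative to the
//	application container's "employees" element
- (NSScriptObjectSpecifier *) objectSpecifier
{
	NSScriptClassDescription *appDescription = (NSScriptClassDescription *)[NSApp classDescription];
	NSInteger index = [[[NSApp delegate] employees] indexOfObject:self];
	return [[[NSIndexSpecifier alloc] initWithContainerClassDescription:appDescription containerSpecifier:nil key:@"employees" index:index] autorelease];
}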

Apple actually gets tantalizingly close to this with the Scripting Bridge. Outside users of your app can run your SDEF file through the sdp command line tool and get a nice Objective-C interface of proxy objects that build, send, and parse AppleEvents to and from your app. However, there is currently no way to tell these proxy objects to target kCurrentProcess, or to initialize one of the proxy objects by passing in a model object that implements objectSpecifier. (I’ve written this up as rdar://problem/7359646.)
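
For what it’s worth, generating those proxy headers for an outside client is a Terminal one-liner (substitute your own app’s name and path):

sdef /Applications/MyApp.app | sdp -fh --basename MyApp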

Harsh Reality

Since I didn’t want to wait on Apple to extend the Scripting Bridge to make my life easier, I decided to write a couple of classes to help out. You can download the code here. The code is under the MIT license.

The classes work similarly to the SBObject and SBElementArray classes in the Scripting Bridge framework. Using these classes, you can invoke the delete AppleScript command like so:

// Suppose ONEmployee is our model object, with the appropriate AppleScript interface implemented
ONEmployee *employee = [_employees objectAtIndex:0];
[ASObj(employee) invokeCommand:@"delete"];

The ASObj function creates an ASObject proxy object for any NSObject that implements objectSpecifier. invokeCommand takes care of marshalling the parameters into an AppleEvent, sending it, and unmarshalling the return value into an NSObject. The name of the command is the name used in AppleScript, not the name of the Cocoa implementation.

invokeCommand can take parameters, although it gets trickier:

ONEmployee *employee = [_employees objectAtIndex:0];
[ASObj(employee) invokeCommand:@"giveRaise" with:[NSNumber numberWithInt:10], @"Percent", nil];

First, the parameters must be named (they are not matched by position), and those names must match the Cocoa Key in the SDEF, not the user-visible parameter name. Second, the marshalling code (from arbitrary NSObjects to AppleEvents) is a bit sparse; I’ve only added code for the types I needed for my project. If you use it, you may need to add support for other types. The same goes for return values: I only added support for the types that I use.
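
To illustrate, here’s roughly what the giveRaise command might look like in the SDEF. The codes and command class are made up for this example, but note how the Cocoa Key (Percent) differs from the user-visible parameter name (by percent):

<command name="giveRaise" code="ONEmGvRs" description="Give an employee a raise.">
	<cocoa class="ONGiveRaiseCommand"/>
	<direct-parameter type="employee" description="The employee getting the raise."/>
	<parameter name="by percent" code="Prct" type="integer" description="The size of the raise, as a percentage.">
		<cocoa key="Percent"/>
	</parameter>
</command>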

ASObject also has support for properties. For example this marks an employee as exempt:

ONEmployee *employee = [_employees objectAtIndex:0];
[ASObj(employee) setObject:[NSNumber numberWithBool:YES] forProperty:@"isExempt"];

The same type restrictions for parameters apply to properties as well, and as with parameters, the property’s Cocoa Key must be used here.

Elements also have basic support, which is where ASElementArray comes in. Right now the only interesting thing to do with an element array is retrieve a reference to a specific element:

ONEmployee *employee = [_employees objectAtIndex:0];
ASObject *dependent = [[ASObj(employee) elementForKey:@"dependents"] objectAtIndex:0];
[dependent setObject:[NSNumber numberWithBool:YES] forProperty:@"insured"];

Unlike the other methods, ASElementArray’s objectAtIndex: does not send an AppleEvent or otherwise take any action. Instead, it constructs an object specifier (i.e. an ASObject) for the given element.
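
You can imagine objectAtIndex: doing little more than the following sketch, where mSpecifier, mKey, and initWithSpecifier: are made-up names standing in for the real code in the download:

- (id) objectAtIndex:(NSUInteger)index
{
	// Build a specifier for the index'th element of our container.
	//	Nothing is sent until a command or property actually uses it.
	NSIndexSpecifier *specifier = [[[NSIndexSpecifier alloc] initWithContainerClassDescription:[mSpecifier keyClassDescription] containerSpecifier:mSpecifier key:mKey index:index] autorelease];
	return [[[ASObject alloc] initWithSpecifier:specifier] autorelease];
}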

The code is still somewhat rough and incomplete, but it should help anyone wanting to implement AppleScript recording. If nothing else, it should serve as a starting point or sample code for anyone rolling their own solution.

How to implement sharpen and blur tools

The sharpen and blur tools are simple bitmap tools. They allow you to blur or sharpen parts of an image using a brush metaphor. In addition to being able to control the strength of the sharpen or blur, these tools also typically allow you to control the blend mode when applying the effect.

The sharpen tool has the following effect on this tweed pattern:

Tweed pattern with sharpen applied

Notice the stripe through the middle where the edges are more strongly defined.

The blur tool has the opposite effect of sharpen on the tweed pattern:

Tweed pattern with blur applied

Implementation Overview

Like the sponge tool, the sharpen and blur tools are subclasses of the FilterBrush class introduced in the dodge and burn tutorial. However, unlike sponge, they do require a slight modification to the GenericBrush class to work.

Since almost all of the code in this tutorial has been covered before, I’ll just highlight the single change to the GenericBrush class, and the new Sharpen and Blur classes. If you want to see everything, download the code.

Changing GenericBrush

If you recall, at the very beginning of this article I mentioned that sharpen and blur tools typically allow the user to modify the blend mode. This is the blend mode used when stamping the filtered image back on to the original image. GenericBrush contains the code that handles the stamping, so it will need to be modified.

We change the render:at: method for GenericBrush to be:

- (void) render:(Canvas *)canvas at:(NSPoint)point
{
	CGContextRef canvasContext = [canvas context];
	CGContextSaveGState(canvasContext);

	CGPoint bottomLeft = CGPointMake( point.x - CGImageGetWidth(mMask) * 0.5, point.y - CGImageGetHeight(mMask) * 0.5 );

	// Our brush has a shape and soft edges. These are replicated by using our
	//	brush tip as a mask to clip to. No matter what we render after this,
	//	it will be in the proper shape of our brush.
	CGRect brushBounds = CGRectMake(bottomLeft.x, bottomLeft.y, CGImageGetWidth(mMask), CGImageGetHeight(mMask));

	CGContextClipToMask(canvasContext, brushBounds, mMask);
	CGContextSetBlendMode(canvasContext, [self blendMode]);
	[self renderInCanvas:canvas bounds:brushBounds at:point];

	CGContextRestoreGState(canvasContext);
}

The only change is the added call to CGContextSetBlendMode, which allows brush subclasses to determine the blend mode to use. The default implementation is simply:

- (CGBlendMode) blendMode
{
	return kCGBlendModeNormal;
}

That’s it for GenericBrush.

Sharpen Tool

The sharpen tool has just two parameters: mMode, which is the blend mode, and mStrength, which determines how strongly the sharpen effect is applied. They are initialized in init like so:

- (id) init
{
	self = [super init];
	if ( self != nil ) {
		// Set the default values for our parameters
		mMode = kCGBlendModeDarken;
		mStrength = 1.0;
	}
	return self;
}

Here are some examples of the sharpen tool parameters:

mMode                  mStrength   Result
kCGBlendModeNormal     0.5         Tweed with sharpen, normal blend mode, 50%
kCGBlendModeNormal     1.0         Tweed with sharpen, normal blend mode, 100%
kCGBlendModeDarken     1.0         Tweed with sharpen, darken blend mode, 100%
kCGBlendModeLighten    1.0         Tweed with sharpen, lighten blend mode, 100%

The blend modes typically offered for a sharpen tool are kCGBlendModeNormal, kCGBlendModeDarken, kCGBlendModeLighten, kCGBlendModeHue, kCGBlendModeSaturation, kCGBlendModeColor, and kCGBlendModeLuminosity. In the examples above, kCGBlendModeDarken causes the sharpening to affect only light pixels, while kCGBlendModeLighten affects only dark pixels.

The heart of the sharpen tool is the filter it creates. The code for that is:

- (CIFilter *) filter
{
	// We need to create and configure our filter here
	CIFilter * filter = [CIFilter filterWithName:@"CIUnsharpMask"];
	[filter setDefaults];
	[filter setValue:[NSNumber numberWithFloat:mStrength * 100.0] forKey:@"inputRadius"];	
	return filter;
}

You’ll notice that I’m using the unsharp mask filter to do the sharpening. I should point out that most image editors don’t use this filter; instead, they seem to use a convolution filter. However, with a convolution filter it is easy for the user to overdo the effect. Unsharp mask doesn’t have that problem, plus it is provided by the system.
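
If you’re curious, the convolution approach centers on a kernel that boosts the center pixel at the expense of its neighbors. Here’s a sketch using the CIConvolution3X3 filter, which shipped in later OS releases than the rest of this code targets; scaling the off-center weights by mStrength is my own guess at how an editor might expose strength:

- (CIFilter *) convolutionSharpenFilter
{
	// Classic 3x3 sharpen kernel. The weights always sum to 1, so the
	//	overall brightness of the image is preserved.
	CGFloat s = mStrength;
	CGFloat weights[9] = {
		0.0,            -s, 0.0,
		 -s, 1.0 + 4.0 * s,  -s,
		0.0,            -s, 0.0
	};

	CIFilter *filter = [CIFilter filterWithName:@"CIConvolution3X3"];
	[filter setDefaults];
	[filter setValue:[CIVector vectorWithValues:weights count:9] forKey:@"inputWeights"];
	return filter;
}

Notice that nothing clamps s: crank mStrength up and halos appear around edges, which is exactly the overdoing problem mentioned above.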

The final bit of the sharpen tool is returning the correct blend mode:

- (CGBlendMode) blendMode
{
	return mMode;
}

Blur Tool

The blur tool is very similar to the sharpen tool. It even takes the same parameters. The only real difference is the filter that it creates, so that’s what I’ll present first:

- (CIFilter *) filter
{
	// We need to create and configure our filter here.	
	CIFilter * filter = [CIFilter filterWithName:@"CIBoxBlur"];
	[filter setDefaults];
	[filter setValue:[NSNumber numberWithFloat:mStrength * 7.0] forKey:@"inputRadius"];

	return filter;
}

Here I use the built-in CIBoxBlur filter. mStrength ranges from 0 to 1, inclusive. I chose the coefficient of 7 simply by trial and error, until I got something that “looked right.”

Here are some examples of the blur tool:

mMode                  mStrength   Result
kCGBlendModeNormal     0.5         Tweed with 50% blur, normal blend mode
kCGBlendModeNormal     1.0         Tweed with 100% blur, normal blend mode
kCGBlendModeDarken     1.0         Tweed with 100% blur, darken blend mode
kCGBlendModeLighten    1.0         Tweed with 100% blur, lighten blend mode

Conclusion

Sharpen and blur are the last of the common filter brushes. Enjoy, and download the sample code.