If you provide usability testing as a service, it’s pretty standard for your clients to expect the session to be recorded digitally and the footage piped into a viewing room.
So what do you do when you’re usability testing on an iOS device? You can use a kit like Mr Tappy to mount a camera pointed at the screen, but it’s a little awkward, and a software solution would be much tidier. This has historically been a black spot, but Reflector has recently made it possible, along with a couple of hacks that I’ll explain below.
Reflector allows you to mirror iOS devices on your Mac’s screen.
Let’s start off with the viewing room. Reflector turns your MacBook into an AirPlay receiver, so you can have the screen of your iOS device appear on your MacBook’s screen as if you were using an Apple TV. It’s simple (instructions here), although it only works on newer iOS devices. To get this footage showing in your viewing room, all you need to do is run a long HDMI cable out of your MacBook (which will be positioned in front of your participant) through into the other room, then mirror the displays in System Preferences. Although it’s clunky, it’s reliable: unlike streaming, there’s no risk of lag or drop-out.

Next you’ll want to get audio and footage of the user’s face. Just stick a DV camera on a tabletop tripod and run an AV cable out of it into a TV in the viewing room. The picture you’ll get from any old DV camera is going to be much clearer than a webcam’s, so it’s worth the effort. If, for some reason, you’re determined to stream it over IP, you could instead try Wirecast (which looks like an awesome app, and one I’ll probably be reviewing soon).
So that’s the viewing room sorted. Next up, you’ll want to set up the digital recording of the iOS device’s screen along with the user’s face. Silverback does the trick (disclosure: this is one of our products at Clearleft, where I work). It’s intended for recording your Mac’s desktop, and with Reflector running, your Mac’s desktop happens to be showing the iOS device’s screen – so voilà, problem solved. Simply position your MacBook so that the built-in webcam is pointed at the participant’s face. There’s a sample video below.
Sample Silverback recording of a Reflector session.
There are two shortcomings you should be aware of. Firstly, you’ll notice in the video footage that you can’t see what gestures the user is doing. In many cases this isn’t a huge problem, but if you really need to see this then you might be better off using something like Mr Tappy to point a camera at the screen. Secondly, Silverback and Reflector have not been designed to work together, and they might not be reliable on your machine. Having said that, they run just fine on my Mid-2011 MacBook Air.
Edit 1: Good news, everybody! Squirrels LLC are kindly offering 10 free licences of Reflector (normally $12.99) to readers of this blog.
Edit 2: All the licences have been snapped up, though you can still email Napkin Studio for an extended trial.
Edit 3: All the licences and extended trials are gone now. If you want Reflector, go buy it – it’s only $12.99!
Thanks for the tips, Harry. If you need a bit more range, we’ve found an Apple TV works quite well for beaming the footage out a bit further without the need for wiring, although you lose the ability to record it. For all the PC users out there, there’s also some software called AirServer for PC (and Mac too) that does much the same thing as Reflector – and it even allows multiple iOS devices to stream to it at once.
We’re going to start using the same combination of Reflector and Silverback, but previously we were just using Reflector and the built-in screen recording option in QuickTime. That lets you record the app usage along with the user’s voice as they use the app.
A more rudimentary way we’ve recorded video of the user and streamed it to another room is simply to start an iMessage video or Skype video chat and record the screen at the other end.
To get around the problem of not being able to see where the user is touching the screen, if you’re testing your own build you can integrate Touchpose (https://github.com/toddreed/Touchpose), which displays the user’s touch points on mirrored displays such as your Reflector screen.
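For anyone wondering what that integration involves, here’s a minimal sketch in Swift. It assumes Touchpose’s QTouchposeApplication class is visible to Swift (e.g. via a bridging header) and that your app delegate is called AppDelegate; check the project’s README for the exact, up-to-date instructions.

```swift
// main.swift – a sketch, not Touchpose's official docs: swap in Touchpose's
// UIApplication subclass as the principal class so it can draw touch markers
// on mirrored displays.
// Assumes: QTouchposeApplication is bridged into Swift and your delegate is
// called AppDelegate (remove @main/@UIApplicationMain from it so this file
// becomes the entry point).
import UIKit

UIApplicationMain(
    CommandLine.argc,
    CommandLine.unsafeArgv,
    NSStringFromClass(QTouchposeApplication.self), // Touchpose's application subclass
    NSStringFromClass(AppDelegate.self)            // your existing app delegate
)
```

The only change is the principal application class passed to UIApplicationMain; the rest of the app stays as it is.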
I’ve found UX Recorder in the Apple App Store for iPhone and iPad, and it basically does on mobile devices what Silverback does for the Mac, capturing interactions at the same time as the front-facing camera.
I have not tried it myself but it seems like a much less complicated option.
UX Recorder is Web only – it’ll record usage of web pages, but not apps. I’ve also heard that the recordings aren’t quite as good, so that’s worth checking before picking a solution.
This is the solution that I use for my mobile user testing. However, we need to user test an Android-only app, so does anyone know of a similar solution for Android?
Also, when I’m on a network that blocks AirPlay, I use a USB cord and the iTools (http://itools.hk/en_index.htm) mirroring function instead of Reflector.
BugClipper is an iOS SDK that allows you to capture screen videos with voice on your iDevice. It easily fits inside the app and lets you do a lot more. http://www.bugclipper.com