Recent Work
FitFlo Membership Management Interface
As a senior product designer specializing in UI and UX, I had the opportunity to rethink the design of FitFlo's Membership Management Interface. The project aimed to enhance the usability and efficiency of managing gym memberships for both staff and members. In essence, the interface is a kiosk used by staff and management.
Key Design Goals:
Intuitive Navigation: Simplified navigation to ensure quick access to key functions such as suspending, canceling, or creating new memberships.
User-Friendly Layout: A clean, organized layout that allows staff to easily view and manage member information and payment status.
Action-Oriented Interface: Clearly defined action buttons to streamline common tasks, reducing the time and effort required by staff.
Design Process:
User Research: Conducted interviews and surveys with gym staff to understand their workflow and pain points.
Wireframing and Prototyping: Developed wireframes and interactive prototypes to iterate on design solutions based on user feedback.
Usability Testing: Performed usability testing sessions to ensure the interface was intuitive and met the needs of end-users.
Impact: Since the redesign, we've received a number of letters of intent (LOIs) to purchase the product once it's implemented.
This project showcases my ability to design user-centered interfaces that solve real problems and deliver measurable results. My approach is always focused on understanding user needs, iterating on feedback, and delivering intuitive and effective design solutions.
Dubber Recording Management Interface
In my role as a senior product designer specializing in UI and UX, I led the design of Dubber Corporation's Recording Management Interface and its play transport. This project was geared towards enhancing the functionality and usability of an interface used by sales teams, customer support teams, and team managers for listening to, noting, labelling, and tagging recordings.
Key Design Goals:
Simplified and Approachable UI: The primary objective was to create a user-friendly and intuitive interface that streamlined tasks and reduced the cognitive load on users.
Increased Efficiency: We aimed to make the web app faster and more efficient, addressing the pain points of the previous design, where playback required navigating to another page, slowing down the user experience.
Design Process:
User Feedback Collection: We gathered valuable insights from Dubber’s support team to understand their workflow and the challenges they faced with the existing interface.
Wireframing and Prototyping: Utilizing Figma, we created wireframes and simple click-through prototypes to visualize the new design and iterate based on feedback.
Usability Testing: Feedback highlighted the need for advanced filtering capabilities, which we integrated to enhance the user experience further.
Key Features:
Rich Data Presentation: The interface displays detailed information for each recording, making it easy for users to access relevant data at a glance.
In-Place Transport Controls: Users can play, pause, and navigate through recordings directly within the same page, significantly improving the efficiency of interactions.
Advanced Filtering: Implemented based on user feedback, allowing users to quickly find and sort recordings by various criteria.
This project exemplifies my ability to transform user feedback into actionable design improvements, ensuring that the final product is both functional and delightful to use. My approach is always centered around understanding user needs, iterating on feedback, and delivering designs that drive efficiency and satisfaction.
Case Studies
Scannable: a 5-star design process
Scannable is a market-leading app that grew from noticing a user habit. It’s currently rated 4.9 stars on the App Store with 377,000 reviews. So, how does an app like this get designed and built?
STARTING WITH USER BEHAVIOUR
Evernote noticed that an increasing number of its users were taking photos of documents rather than scanning them with a hardware paper scanner. Evernote also noticed increasing use of its APIs by third parties building simplified ‘snap and save’ phone-camera-based apps. So we knew the user model was already there…people had a paper document they wanted to either save or send somewhere, and loading a ‘heavy’ app like Evernote was too cumbersome for the task. So our team was tasked with designing and building a best-in-class document scanning app for iOS.
Scannable App
Three designers worked on the core app, each focussed on a particular area. Mine was the capture experience, so that's what I'll cover here.
STARTING WITH UNDERSTANDING THE CUSTOMER
The team started with three real-world ‘personas’. Or, more accurately, alpha users we could bring in to try out prototypes and test other existing software, and meet on-site to understand their context and needs. Each of these people worked in the real world with paper documents, in a small office, home office, or roaming role. One worked in real estate; the others were business consultants. They proved invaluable.
Scannable Capture Prototype
One of the first things we did was bring them into our little design studio and watch them use other apps that already existed in the space. We also used remote usability testing on other existing apps. Something became very clear: the actual process of capturing the image was challenging for users.
Users intuited that they could be quite close to a document in order to capture it. However, especially with the iPhone cameras of the time, they needed to be further away than expected. The consequence was that users either took a partial image of the document, which our software was unable to deskew correctly, or moved the required distance away from the document (usually sitting on a table or desk) and found themselves in an ergonomically difficult position from which to hit any on-screen capture button.
CAPTURING INTENT
So I started to animate and prototype a capture experience that would do three things: capture the document in the same view as the camera, show a preview of the capture, and inform the user that it was being added to the existing pages. This meant users could capture, capture, capture, without needing to change their hand position to take, review, or approve a snap.
But early prototypes showed an issue: users were surprised by the automatic capture experience. They didn't realise that the app would capture automatically, and were repeatedly surprised when it did. So I worked on various attention-getting ‘document detected’ animations to help educate the user.
Attention-grabbing ‘detecting document’ animations
A simple overlay and timer told the story we needed.
In the end, a much simpler count-down timer proved to be the winner. And what looked like quite a simple experience was powered by a very close working relationship between design and development.
And this was needed due to an inherent problem in taking a photo of a document on a desk: gravity.
THE GRAVITY OF THE PROBLEM
Paper on a desk lies perpendicular to gravity. This means accelerometer data is useless for orienting the document, requiring some other way of working out which way is ‘up’. And our solution was extremely sophisticated: realtime, on-device OCR. This solution, like many, came with a side-effect. When the user took a photo of a whiteboard (a common use case), the OCR would be unreliable due to the sloppy handwriting that is so common in whiteboarding sessions. A whiteboard can't be accurately OCR'd, but because the board is vertical, accelerometer data in this case is actually the truth. So I developed a simple prototype that listened to accelerometer data on a real phone tethered to my Mac. This allowed me to work out the ergonomic angles and other subtle heuristics coming from the 3D sensor that could be ‘plugged in’ to the codebase, letting us revert to accelerometer data for whiteboard cases.
Paper on a desk lies perpendicular to gravity, so accelerometer data is useless for orienting the document, requiring OCR. A whiteboard, however, can't be accurately OCR'd, but accelerometer data is more useful.
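To make that hybrid approach concrete, here's a minimal Swift sketch of the kind of heuristic involved, using CoreMotion. The type names, confidence threshold, and sign conventions are illustrative assumptions rather than Scannable's production code:

```swift
import CoreMotion

// A hypothetical orientation for a captured page.
enum DocumentOrientation {
    case up, down, left, right
}

final class OrientationHeuristic {
    private let motion = CMMotionManager()

    func start() {
        // 30 Hz is an assumed sample rate, not the app's actual value.
        motion.accelerometerUpdateInterval = 1.0 / 30.0
        motion.startAccelerometerUpdates()
    }

    /// Prefer OCR when it's confident (printed text on flat paper);
    /// fall back to gravity for whiteboards and other vertical surfaces.
    func orientation(ocrGuess: DocumentOrientation?,
                     ocrConfidence: Double) -> DocumentOrientation {
        if let guess = ocrGuess, ocrConfidence > 0.6 {  // threshold is illustrative
            return guess
        }
        return gravityOrientation() ?? .up
    }

    /// On a vertical surface the gravity vector lies in the image plane,
    /// so the accelerometer can tell us which edge of the frame is 'up'.
    private func gravityOrientation() -> DocumentOrientation? {
        guard let a = motion.accelerometerData?.acceleration else { return nil }
        // Phone roughly flat (gravity mostly on z): in-plane rotation is
        // invisible to the accelerometer, so report 'no answer'.
        if abs(a.z) > 0.8 { return nil }
        // Sign conventions simplified for illustration.
        if abs(a.x) > abs(a.y) {
            return a.x < 0 ? .left : .right
        } else {
            return a.y < 0 ? .up : .down
        }
    }
}
```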
In the end, after healthy, organic iteration on the designs, we shipped the app. And it's something I'm very proud of, which is reflected in its continued use and extremely high App Store rating after many years of staying basically untouched.
Sunflower Labs early hardware renders.
Sunflower Labs: design and deep tech
HIGH TECH HOME SECURITY
Sunflower Labs had an audacious goal: an autonomous home-security drone directed by super-intelligent garden lights. And innovation, with engineering working hand-in-hand with design, was the only way to achieve it.
The startup was in its early stages. Prototype hardware existed, but the aim was to show the entire experience that an end-user would have. This included seeing the drone and sending it basic intentions, as well as controlling the garden lighting.
SUNFLOWER AND THE BEE
The ‘sunflowers’ in the system are solar-powered, extremely smart garden lights. If you’ve ever gone to Bunnings and bought a handful of little solar garden lights, then you’ll understand. The job of the lights is usually to bring a little light and safety to those areas of your front and back yard where the houselights don’t reach.
The ‘bee’ is a sophisticated autonomous drone, with no user-controlled flight. Its job is literally to be told where to go by the system (cutely called ‘the hive’), to get itself there safely without running into a tree branch, and to point its camera at the area of interest. But what is the area of interest? Well, that's where the sunflowers, ahem, shine.
Each solar-powered sunflower packs a sophisticated CPU and radio-communications package, as well as a set of IR and other sensors. These give each sunflower a sense of the movement around it, with varying degrees of precision. A basestation took all this information from the various lamps in a yard and triangulated it to work out the speed, direction, cadence and other information about animals (lost dog or raccoon-gone-bad) or possible intruders. My initial job was to find ways of representing this initially vague positional information to the user, who might decide to remotely deploy the drone.
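As a rough illustration of the kind of fusion involved (Sunflower's actual algorithm was far more sophisticated), a weighted centroid over the lamps' motion readings yields an inherently fuzzy position estimate, which is exactly the vagueness the UI had to communicate. Everything here, from the `motionStrength` reading to the noise threshold, is an assumption made for the sketch:

```swift
/// A minimal sketch, not the real fusion: each lamp reports how strongly
/// it senses motion, and we take a weighted centroid of lamp positions.
struct Lamp {
    let x: Double, y: Double      // lamp position in yard coordinates (metres)
    let motionStrength: Double    // 0...1 reading; name is hypothetical
}

/// Lamps sensing stronger motion pull the estimate toward themselves.
/// Tracking successive estimates over time yields speed and direction.
func estimatePosition(from lamps: [Lamp]) -> (x: Double, y: Double)? {
    let active = lamps.filter { $0.motionStrength > 0.1 }  // ignore sensor noise
    guard !active.isEmpty else { return nil }
    let total = active.reduce(0.0) { $0 + $1.motionStrength }
    let x = active.reduce(0.0) { $0 + $1.x * $1.motionStrength } / total
    let y = active.reduce(0.0) { $0 + $1.y * $1.motionStrength } / total
    return (x, y)
}

// Example: three lamps, with movement closest to the second one.
let estimate = estimatePosition(from: [
    Lamp(x: 0, y: 0, motionStrength: 0.2),
    Lamp(x: 5, y: 3, motionStrength: 0.9),
    Lamp(x: 9, y: 1, motionStrength: 0.3),
])
```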
MOVING FAST, NOT BREAKING THINGS
My next challenge was to design, and work very quickly with a developer to build, a prototype iPad control app. It was used to show positioning information as well as the live drone camera view. It had to be big and robust, but also reasonably sophisticated-looking, as it was used for real demos to investors. The demo needed to work, since there was often only a single chance to present to a single real potential investor.
Sunflower Labs: A view of the app in Garden Light Mode, showing color selection as well as lamp placement and activity.
An interesting challenge was working with very early electronics: being constrained by, and in some cases helping to define, their fundamental characteristics, for example the refresh rates for colour interpolation, the various ‘white’ colour temperatures, and the physical lumens of brightness required. I designed a set of themes, including animated, networked styles for modes such as parties, home alert, and ‘welcome home’.
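For instance, a lamp's refresh rate directly bounds how smooth a colour fade can look. Here's a minimal sketch of that constraint, assuming a simple RGB model and a hypothetical refresh rate (neither is the real firmware interface):

```swift
/// A back-of-envelope model of lamp colour fades; the RGB struct and the
/// refresh rate below are assumptions, not the actual hardware values.
struct RGB {
    var r, g, b: Double   // channel values in 0...1
}

/// Plain linear interpolation between two colours.
func lerp(_ a: RGB, _ b: RGB, _ t: Double) -> RGB {
    RGB(r: a.r + (b.r - a.r) * t,
        g: a.g + (b.g - a.g) * t,
        b: a.b + (b.b - a.b) * t)
}

/// The frames a lamp shows during a fade. At a low refresh rate a slow
/// fade visibly steps, which is the constraint the themes had to respect.
func fadeFrames(from: RGB, to: RGB, duration: Double, refreshHz: Double) -> [RGB] {
    let count = max(1, Int(duration * refreshHz))
    return (0...count).map { lerp(from, to, Double($0) / Double(count)) }
}

// A 2-second fade at an assumed 10 Hz refresh: only 21 discrete frames.
let frames = fadeFrames(from: RGB(r: 1.0, g: 0.6, b: 0.2),
                        to: RGB(r: 0.2, g: 0.4, b: 1.0),
                        duration: 2, refreshHz: 10)
```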
Mentoring: Building a team as well as products
EXPERIENCES IN TEACHING AND MENTORING IN PRODUCT DESIGN.
I’ve been extremely fortunate to have worked with, and been mentored by, many great designers as well as developers, product managers, CEOs and others. For example, while at Evernote, I had a team of three designers reporting to me, and I must say I think I learnt as much from them as they did from me!
As a result of the opportunities I’ve had, I try to give back to young and upcoming designers and founders in various ways.
This is the third year I’ve been invited to be a ‘Giants’ mentor at renowned Australian VC firm Blackbird’s accelerator programme. There I help early founders understand the basics of design, product design, marketing, decisions on features, apps, and all manner of things.
I’ve also spent multiple years as a mentor for the University of New South Wales accelerator programme. Here, students are at the very early stages of their ideas, and I’ve helped with ideation, quick mockups, and talking them down from feature-itis.
I’ve also run day-long professional masterclasses, as well as short design-101 sessions. I’ve taught masterclass content around:
User/usability testing, including formal, guided, and guerrilla testing
Branding
App design
Information Architecture
Prototyping
Design effectiveness in organizations
and more
Other work
Swann remote camera interface
Page detection animations for Scannable
Skitch UI and interface (co-contributor), as featured by Apple