How Bobbu Abadeer does user research


The ask

As part of the digital transformation programme for a project which required Government Digital Service (GDS) compliance, the online application for a public service needed transforming from an ancient, poorly supported experience into one that was clear, accessible, responsive, easy to use and fast. We also hoped to bring a little bit of delight into it along the way.

Challenges

There was little to no useful data on the existing form: no analytics, and no prior user testing. All we had were the application volumes and a breakdown of some of the content of the applications. There was ongoing debate about whether our higher priority should be transforming the end-to-end application process or transitioning the existing process onto a new tech system. Two different design guides had to be adhered to in order to integrate with the appropriate authorities; both were incomplete, and in places they conflicted with each other.

Screenshot of the legacy online application form
The legacy online application form

Identifying requirements

The first order of business was to gather the business requirements, as the application form serves a very important function: collecting the information needed to fulfil the key user-facing service. We worked with a business analyst to identify what data the service required, and in what format. We also discussed with the client what we wanted the experience of filling in the application to be, identifying key goals such as “fast”, “easy” and “informative”. The design team built a Sketch pattern library to use during wireframing, based on the company design guidelines and supplemented by the GDS style guide where necessary. This gave us a unique and fairly comprehensive set of components, enabling us to work swiftly and consistently.

Initial research

We ran extensive research in collaboration with the client research team. This included planning and running a variety of formats of investigative work.

This provided us with a wealth of qualitative information about user experiences with the current form, and quantitative data about the contents of applications.

User journeys

Taking the existing form, we broke it down into a user journey representing how a typical user would traverse it, and used this to identify particular points of friction that could be removed. Then, with the business analysts, the product owner and the research team, we developed a user journey plan that separated groups of questions serving common business needs into coherent queries to pose to users. This helped us create the conversational approach to the experience that we were aiming for.

A hand-drawn workflow of the user journey
Please do not judge me based on my handwriting. Translations are available on request.

Wireframes

Using the pattern library we had built, we were able to quickly put together high-fidelity designs in Sketch, letting us test our initial idea based on the user flow.

Guerrilla testing

We printed out our quickly composed screens and presented them to a random sample of the local public in a quick and dirty bit of guerrilla testing. This was extremely useful for gathering rough, first-encounter feedback on how much sense our question groupings made to end users. It also gave us some very positive feedback on some of the core ideas, while shining a light on areas we had overlooked or overcomplicated. With our initial ideas tentatively validated, and some meaningful criticism to build upon, we felt confident enough to put more work into the concept.

A selection of the wireframes created for steps in the online application process
Some of the high resolution wireframes (there were a lot)

Prototype

We coded a prototype from scratch, using only a very basic JavaScript script for managing multi-step forms. We chose to avoid the GDS prototype kit, as it was quite immature at the time. We also wanted our initial prototype to be as portable as possible, so that we could run it anywhere without having to set up an environment or rely on a web connection. It was made as close to the finished interface as possible, without plugging in any back-end functionality or worrying about code quality, while still keeping to high accessibility standards.

A snippet of the code for the prototype
Yeah, I can code. Don't make a big deal out of it.
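
For flavour, here is a minimal sketch of the kind of vanilla multi-step form script the prototype was built around. The markup assumptions (a single form with one fieldset per step, plus next and back buttons) and the class names are illustrative, not the actual project code:

// Minimal multi-step form handler (illustrative sketch, not the project code).
// Assumes each step is a <fieldset class="step"> inside <form id="application">.
document.addEventListener('DOMContentLoaded', function () {
  var form = document.getElementById('application');
  var steps = Array.prototype.slice.call(form.querySelectorAll('fieldset.step'));
  var current = 0;

  function showStep(index) {
    steps.forEach(function (step, i) {
      // hide inactive steps from everyone, including screen reader users
      step.hidden = (i !== index);
    });
    // move focus to the step's legend so the change is announced
    var legend = steps[index].querySelector('legend');
    if (legend) {
      legend.setAttribute('tabindex', '-1');
      legend.focus();
    }
    current = index;
  }

  form.addEventListener('click', function (event) {
    if (event.target.matches('button.next') && current < steps.length - 1) {
      event.preventDefault();
      showStep(current + 1);
    } else if (event.target.matches('button.back') && current > 0) {
      event.preventDefault();
      showStep(current - 1);
    }
  });

  showStep(0);
});

A structure like this keeps each group of questions in its own fieldset with a visible legend, which maps neatly onto the question groupings from the user journey work while keeping step changes perceivable to screen reader users.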

Testing

We took this prototype out to do as much testing as possible, across as wide a variety of our audience as possible, specifically including groups chosen for their relevance to the needs of the project's audience.

All of this testing was done with an appropriate combination of mobile and desktop usage, either in homes and businesses or in research lab conditions.

Iteration

After each round of testing we took the information gathered, identified the significant comments, and allocated priorities. This included flagging show-stopping bugs and dead ends in the user journey, noting inconsistencies in the design, identifying friction points, and calling out any unclear content. We addressed these problems in order of priority, working through solutions, testing them again, and putting the appropriate fixes into the backlog for delivery.

Delivery

As features were tested enough to convince us they would work well as part of an MVP, they were written up as user stories and put into the backlog for delivery. Improvements discovered in testing were added as they were found, and run through estimation and prioritisation exercises to ensure delivery at the appropriate time. This constant flow of features and improvements allowed the development team to operate in a truly agile fashion, adjusting the focus of upcoming sprints to accept new enhancements while keeping to a well-defined set of priorities.

Mockup showing the new online application form on desktop, mobile and laptop
The new online application form

Post-live

We did not stop iterating and improving after the initial release of the application form into beta. Once it had real users interacting with it, we were able to examine the real-time analytics we had installed on the site, as well as feedback provided via a form added to the end of the journey specifically for users to share their thoughts. We used the information from both to inform and direct our continued user testing, and to refine the experience design.
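
To give a sense of the kind of lightweight instrumentation involved, a sketch along these lines could record where users drop out of the journey. The endpoint and event names here are hypothetical, not the analytics setup we actually used:

// Illustrative sketch only: the endpoint and event names are hypothetical,
// not the analytics service we actually wired up.
function trackStep(stepName) {
  var payload = JSON.stringify({
    event: 'step_viewed',    // hypothetical event name
    step: stepName,
    timestamp: Date.now()
  });

  // sendBeacon keeps working while the page unloads, so drop-off points still get recorded
  if (navigator.sendBeacon) {
    navigator.sendBeacon('/analytics/events', payload);   // hypothetical collection endpoint
  } else {
    fetch('/analytics/events', { method: 'POST', body: payload, keepalive: true });
  }
}

// Example usage: call whenever a new step of the journey is shown
trackStep('applicant-contact-details');

Sending events with sendBeacon (falling back to a keepalive fetch) means an abandonment just before the user closes the tab is still likely to be captured.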

Outcome

I can happily report that the end result of our work was a responsively designed online application form that consistently received very positive usability test feedback; showed an extremely low drop-off rate in our analytics; garnered glowing feedback from the feedback form; offered an AAA-rated accessibility experience; and collected reliable data for the processing of important applications. A success all round!