How Bobbu Abadeer does UX design


The ask

The client needed a new vetting system as part of their digital transformation project: one that would be quick to use while ensuring the accuracy and thoroughness of their employees' vetting decisions. I worked with BJSS to design and deliver this system in a challenging, high-pressure environment.

Challenges

The business requirements for the system were unclear in places, as the process itself was not always clearly defined. The existing system was not fit for purpose: it had been built long ago for a single process, and the business had since moved on. It carried a lot of preconceptions about what the process should look like, driven not by requirements but by human workarounds that had become part of the business process thanks to technological shortfalls. There was also some reluctance to change a process that was working well enough at present. This fear of change was a major challenge, one we sought to overcome by involving the client in the design process from beginning to end.

A view across the river in Glasgow
It was nice working in Glasgow though

Definition

Through unstructured interviews with our audience sample, both individually and in groups, we extracted a number of scenarios that the system had to be able to handle, and mapped out user journeys that covered these. This was done in collaboration with the business analyst, who worked with us to extract business needs alongside our own exploration of user needs. We also collected pain points in the existing system at the same time.

Photo of a whiteboard covered in post-it notes, most of which have had their contents digitally obscured
I can't show you much because of the security level of the project

Collaboration

We ran a large number of co-design sessions over a few weeks to work through the vetting system design, bringing the delivery team and the key stakeholders along the entire design journey. These sessions progressed through several stages.

Photo of a whiteboard with multiple designs on it, most of which have had their contents digitally obscured
The outputs from communal wireframing sessions

Prototyping

Once we had a solid set of journeys we wanted the users to follow, and a clear idea of which UI patterns we wanted to solve the various problems along the way, we got straight on to prototyping. The prototype was created in Sketch, using its built-in prototyping tools to produce a clickable but non-functional prototype. We brought this back into the co-design workshops so that users could shape the final design just as they had shaped the information gathering, and could appreciate the reasoning behind design decisions.

The logo for the Sketch design program
Sketch was the only real option when we did this; I actually prefer Figma now

Testing

We ran testing sessions with users who had not been part of the co-design sessions, to help neutralise the tendency towards groupthink and any bias created by our sample audience's inside understanding of how the system was intended to work. This mostly involved usability testing, but also card sorting and structured interviews.

Refinement

With the additional insight gained from the testing sessions, we refined our prototype and returned to our co-design sessions with new problems and new ideas. We worked with the delivery team to resolve these in the same great collaborative environment we had built. Where necessary, we ran further testing sessions and brought more data back to the discussion, until we were happy with the service we had designed.

Delivery

Having involved the BA and the developers in the design discussions from the beginning, the transition into delivering the design was extremely smooth. The user stories were constructed as we progressed through the design, and the developers understood the purpose of every component. We continued to work with them on any issues or uncertainties they discovered while estimating and delivering stories, ensuring that the finished product was consistent not just with the pixels of the design, but with the intended experience.

Ongoing improvement

Just because delivery was well underway didn’t mean we stopped working. We continued to test the prototype, prioritise issues and work on solutions to things that were not yet in sprint. We added enhancements into the backlog to be prioritised and worked on once the MVP had been delivered. We put together a plan for how to monitor the performance of the new system, and ways to continue developing it after we left.

Outcome

We created a process that redefined and optimised the user journey found in the original system. The client was left with an efficient system and a group of power users who understood not just how it worked, but why it worked the way it did. It was extensible, flexible and clear, and had been validated through testing to show it helped users produce fast, reliable decisions.

The delivery team were also armed with the skills to continue working on the design as they progressed, having been brought through the design process from beginning to end. The design proved excellent at reducing human error, thanks to the reduced cognitive load of a clear, well-distinguished journey with built-in auditing and QA tools. Most importantly, the users were happy with the experience they were given, in terms of efficacy, clarity and ease of use.