PwC

Consulting

My role was to design a tool that improved the efficiency of an operations team by digitising parts of their consulting processes.

I was a UX designer working alongside one other designer, a user researcher, and a product manager.

Overview

PwC's Operations & Transformation team sought our assistance to improve their efficiency internally. This team advises external companies on optimising their entire operation lifecycles, focusing on areas such as supply chain, procurement, and management to enhance productivity and overall performance.

We collaborated with the team to identify ways to reduce the time and effort required to produce insightful and valuable reports for their clients.

Talking to users

#1 Time-consuming

Existing processes involved manually processing research data and compiling it into a condensed report.

Example: PwC employees were required to manually analyse survey data, often involving up to 500 responses, and subsequently generate graphs in Excel. Any changes to the data necessitated manual updates across multiple locations.

Impact: The team rushed to meet client deadlines, sometimes working overtime to complete reports.

#2 Low employee productivity

Manual follow-ups and approval requests from colleagues meant work was not standardised or structured.

Example: Producing a single report for a client took a team of three around two months.

Impact: Low productivity and poor employee morale.

#3 Technology creating additional delays

Laborious processes created a huge backlog of tasks during busy periods.

Example: PwC would visit companies to research how they conducted everyday activities through surveys, questionnaires, and interviews. All of this data was recorded, entered into a spreadsheet to produce graphs, and then added into a document alongside written insights.

Impact: This led to human error, or a significant amount of time spent ensuring data was entered and calculated accurately.

User research

Eight people were interviewed: six from the team who would use the tool on a day-to-day basis, and two stakeholders who were not users but were interested in the potential for expanding the tool beyond the organisation.

Affinity mapping

We grouped the interview findings into themes to surface common pain points and opportunities.

Types of users

To understand who our users were, we interviewed three different groups, as each had a distinct user need. We needed to understand how each group would achieve the goal of the product, what would make them a frequent user, and what would motivate them to continue using the tool.

What users wanted

Four recurring needs emerged from our interviews:

Convenience

Users wanted key stats surfaced automatically, without manual calculations, so they could monitor a child's attendance.

Transparency

It should be clear how often data is refreshed. Users want to feel confident using the tool. 

Sharing

It should be easy to share attendance data with local authorities or parents. 

Easy onboarding

The existing journey to access the tool required too many permissions, which deterred users from using the service consistently.

Changes we implemented

Search

Added a search filter to quickly find a pupil's attendance.

New dashboard

Designed a new dashboard around the key stats users wanted to see on a daily basis, with a search feature to find a pupil easily.

Easy sign-up

Reduced the number of onboarding steps, with a single sign-up and log-in process.

Page navigation

Reduced the number of individual pages within Power BI and re-organised page navigation.

What we learnt

Know the limitations

Speaking to Power BI developers helped me understand how to make the tool accessible well before I started designing.

Don't assume

I had to consider the fact that our users were not familiar with how dashboards work. As a result, I focused on creating a user-friendly page navigation system. This understanding of our users' experiences also helped me design data visuals that were specifically tailored to their needs.

Fail fast

It's a necessary step to ensure that whatever you create is what users actually want. I created multiple concept iterations and tested them quickly to rule out those that weren't desirable.

Reflections

At the beginning, the focus was solely on the product: how it would look and function by itself. An MVP was launched with little user research or insight, with the goal of providing a tool to help schools track pupil attendance. However, a large part of building a successful tool was missing beyond the UX and design. How would onboarding work? What would the steps for account creation be? Would we need to provide tutorials to help users onboard? What happens when they need troubleshooting?

What I discovered was that these processes of product development, onboarding, usage, and support don't act alone; treat them in isolation and you risk ending up with a brilliant tool with low usage and retention rates. It confirmed my understanding of how product development works: a great tool isn't just about great UX, but about the experience of the whole journey.

I’m Samiya — a product designer based in London

©2024