Department for Education

My role was to increase usage of the service by redesigning the attendance tool to improve the user experience and strengthen the value proposition of the core product.

Sole interaction designer working in a team of content designers, business analysts, data scientists, product managers, and user researchers

Overview

A government tool to monitor attendance in schools, used by various educational bodies and internal DfE analysts. The service currently has over 20,000 schools signed up to share attendance data with the DfE and an 86% usage rate.

Daily Attendance Tracker, developed by the Department for Education, supports schools, local authorities, and trusts in England by providing comprehensive monitoring of student attendance through dashboards and reports.

These reports serve as critical tools for government and school officials to make informed decisions aimed at enhancing pupil attendance. Notably, the focus is on identifying and addressing the needs of vulnerable children who may otherwise be at risk of slipping through the cracks.

Declining levels of pupil attendance after lockdown

During the COVID lockdowns, pupil attendance rates declined significantly, leaving schools, local authorities, and trusts in England struggling to devise strategies for improving attendance, particularly in underperforming areas.

Before 2020, the process of collecting pupil attendance data was infrequent and laborious. Schools manually entered data for each student once a term, which was then submitted to the Department for Education to assess national performance. This delayed data collection made it challenging to provide timely guidance to schools on improving attendance or addressing early signs of declining attendance.

Talking to users

#1 No relevance

Low conversion rates, as schools were required to go through a long process to sign up to the tool.

Example: Sign-up requests took two weeks to process, and many users said that by the time their account was opened they had found another way to acquire the information.

Impact: The initial sign-up rate was 38%; many users began the process but clicked out of the tool before finishing.

#2 Cognitive overload

Excessive, uncategorised data left users overwhelmed.

Example: The main table included 14 columns of data, with no option to filter or search for what users were looking for.

Impact: This led to users looking at the wrong sets of data and drawing incorrect conclusions.

#3 Dead ends

It was unclear what a school was expected to do with the data or how to proceed through the journey.

Example: There were no clear CTAs once users accessed the dashboard, and no options to export or print.

Impact: Users had to screenshot the dashboard and copy the information into their own systems, which took too much time and resulted in low retention rates.

Gathering statistics to support the ideas

User research aimed to understand how schools wanted to use the product and how it would achieve the business goals.

By combining quantitative and qualitative data from interviews and other research methods, I built a strong set of personas depicting customer needs, behaviours, and motivations.

User journey mapping

Outlining the experience users had with the tool, capturing their emotions and building a narrative that surfaced opportunities for improvement.

Key analytics

Reviewing usage and retention rates, session duration, and geographical spread using Microsoft Clarity.

Interviews

One-to-one interviews conducted with schools and local authorities, the main users of the tool, to identify pain points and manage expectations.

Types of users

Capturing all the users

To understand who our users were, we interviewed three distinct groups, as each had different user needs. We needed to understand how each group would achieve the goal of the product, what would make them frequent users, and what would motivate them to continue using the tool.

School Headteacher

Looking to get quick insights on attendance. Finds huge amounts of data difficult to decipher so wants key statistics to make decisions.

'"I want to see attendance data from the previous day when I log in every morning and find the students with unexplained absences."

Local Authority Officer

Comfortable analysing large quantities of data and relaying this back to schools.

"I want to export all the data to do my own analysis on a frequent basis."

Trust Attendance Leader

Hesitant to use a new data tool but will try it out if it saves a lot of time.

'"I like to create reports on how the schools in my trust are performing and target the schools that are performing lower."

What users wanted

Hearing their voices

I grouped the answers from the interviews and surveys using a card sorting activity amongst the team.


Convenience

Users wanted key stats drawn automatically, without doing manual calculations to monitor a child's attendance (see the sketch after this list).

Transparency

It should be clear how often data is refreshed. Users want to feel confident using the tool. 

Sharing

It should be easy to share attendance data with local authorities or parents. 

Easy onboarding

The existing journey to access the tool required too many permissions, deterring users from using the service consistently.
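To make the convenience need concrete, here is a minimal sketch in TypeScript of the kind of calculation the dashboard now surfaces automatically. The record shape and function names are illustrative, not the tool's real schema; the 10% threshold is the DfE's published definition of persistent absence.

```typescript
// Illustrative sketch: the stats staff previously worked out by hand.
// The PupilAttendance shape is hypothetical, not the tool's real schema.
interface PupilAttendance {
  name: string;
  possibleSessions: number; // sessions the pupil could have attended
  attendedSessions: number; // sessions actually attended
}

// Attendance rate as a percentage of possible sessions.
function attendanceRate(pupil: PupilAttendance): number {
  return (pupil.attendedSessions / pupil.possibleSessions) * 100;
}

// The DfE classes a pupil as persistently absent when they miss 10% or
// more of their possible sessions, i.e. an attendance rate of 90% or below.
function isPersistentlyAbsent(pupil: PupilAttendance): boolean {
  return attendanceRate(pupil) <= 90;
}

const pupil: PupilAttendance = {
  name: "Pupil A",
  possibleSessions: 120,
  attendedSessions: 105,
};
console.log(attendanceRate(pupil).toFixed(1)); // "87.5"
console.log(isPersistentlyAbsent(pupil)); // true
```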

Changes we implemented

Search

Addition of a search filter to quickly find a pupil's attendance (an illustrative sketch follows this list).

New dashboard

A new dashboard shaped by user feedback on the key stats they wanted to see daily, paired with a search feature to find a pupil easily.

Easy sign-up

Reduced the number of onboarding steps, with a single sign-up and log-in process.

Page navigation

Reduced the number of individual pages within Power BI and re-organised page navigation.
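For context, the shipped search is configured with Power BI's own controls rather than hand-coded. Purely as an illustration of the behaviour we specified, a case-insensitive pupil lookup looks something like this (the row shape is hypothetical):

```typescript
// Illustrative only: the production filter is built with Power BI's own
// search controls. This sketch mirrors the behaviour we specified for it.
interface PupilRow {
  name: string;
  yearGroup: string;
  attendanceRate: number; // percentage
}

// Case-insensitive substring match on the pupil's name.
function searchPupils(rows: PupilRow[], query: string): PupilRow[] {
  const q = query.trim().toLowerCase();
  if (q === "") return rows; // an empty query leaves the table unfiltered
  return rows.filter((row) => row.name.toLowerCase().includes(q));
}
```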

Changes implemented

When designing the new interface, I gathered all the requirements and findings from the user research so I could go into stakeholder meetings able to explain my design choices and use the data to solidify my reasoning. A few requirements were deemed non-optional by non-user stakeholders; after confirming with the schools and staff who would be using the tool daily that these were genuinely wanted, we factored them into the final designs.

Once the high-fidelity wireframes were built in Figma and confirmed as feasible with the dev team, I ensured the onboarding journey and each page within the tool were accessible, using Power BI's built-in tools and GDS guidelines.

Accessibility

Internal audit

Carried out an internal audit of the tool before it was ready for a Service Assessment. Used these findings to make accessibility improvements.

Accessibility tools

Made changes so the tool works with screen readers and passes the A11Y Colour Contrast Accessibility Validator (a worked sketch of the underlying WCAG check follows).
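As a rough illustration of what such a validator checks, below is the WCAG 2.1 contrast-ratio calculation in TypeScript. The formula comes from the WCAG definition of relative luminance; the function names and sample colours are my own.

```typescript
// Sketch of the WCAG 2.1 contrast check behind colour-contrast validators.

// Linearise an 8-bit sRGB channel (WCAG 2.1 relative-luminance definition).
function linearise(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an sRGB colour.
function luminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b);
}

// Contrast ratio between two colours; WCAG AA requires 4.5:1 for body text.
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const lighter = Math.max(luminance(fg), luminance(bg));
  const darker = Math.min(luminance(fg), luminance(bg));
  return (lighter + 0.05) / (darker + 0.05);
}

// GOV.UK body text (#0b0c0c) on white passes AA comfortably.
console.log(contrastRatio([11, 12, 12], [255, 255, 255]) >= 4.5); // true
```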


What we learnt

Know the limitations

Speaking to Power BI developers helped me understand how to make the tool accessible well before I started designing.

To not assume

I had to consider the fact that our users were not familiar with how dashboards work. As a result, I focused on creating a user-friendly page navigation system. This understanding of our users' experiences also helped me design data visuals that were specifically tailored to their needs.

Fail fast

Failing fast is a necessary step to ensure whatever you create is exactly what users want. I created multiple iterations of concepts and tested them quickly, ruling out the ones users didn't find desirable.

Reflections

At the beginning, the focus was solely on the product and how it would look and function by itself. An MVP was launched with little user research or insight, with the goal of providing a tool to help schools track pupil attendance. However, a large part of building a successful tool was missing beyond the actual UX and design. How would onboarding work? What would be the steps for account creation? Would we need to provide tutorials to help users onboard? What happens when they need troubleshooting?

What I discovered was that these processes of product development, onboarding, usage, and support don't act alone, or you risk ending up with a brilliant tool with low usage and retention rates. It confirmed my understanding of how product development works: to create a great tool, it's not just about great UX but about the experience of the whole journey.

My design process

I’m Samiya — a product designer based in London

©2024
