Client
Engineering brand
Sector
Manufacturing and engineering
Duration
8 weeks
What we did
Design research
User experience design
Training and workshops
Looking at the workshop wall with usability testing results

Our client was embarking on the next stage of their digital transformation. Having completed a significant migration of content from their North American websites, they were looking to optimise and personalise the user experience for a family of websites across the Europe, Middle East and Africa (EMEA) region.

The newly formed Customer Experience team is responsible for running and developing websites containing detailed marketing information on thousands of products. The distributed team oversees the quality and functionality of the organisation's websites across the EMEA region, which requires any content and features to be translated and localised for multiple markets.

Although the company had an established agile delivery team and local marketers who were the product and territory experts, it lacked a way to create a consistent and scalable online experience that showcased the quality of the products it sells.

This new, small team needed a way to quickly grow an effective and sustainable design capability with limited resources.

The Results

A prioritised and delivery-focussed roadmap

Working at speed and scale requires the effective use of time. The Customer Experience team is now equipped with robust tools and techniques to identify and prioritise work and to measure its success.

A method to benchmark digital maturity

Change doesn’t happen in 8 weeks. To keep the team on track for long-term improvement we carried out a digital maturity assessment to provide recommended practical actions for the team’s development.

New skills and repeatable processes

We coached the team in techniques for customer research and rapid collaborative design creation that they could confidently introduce to local marketing teams across the EMEA region.

The Full Story

The size of the digital team was overshadowed by the scale of the challenge. It was both daunting and exciting to see how we could help improve the quality of the output at the speed this fast-moving business demanded.
Chris How, UX Lead, Clearleft

How can you identify where to start?

The way design is undertaken is subtly different in every organisation. To allow us to get an in-depth and holistic view of the end-to-end process for digital creation, and to see where the Customer Experience team could add the most value, we borrowed a technique from the world of service design.

Whilst on our initial discovery visit to the client’s offices in Europe we worked with the team to create a service design blueprint. This diagram, originally mapped out with Post-it notes, followed the journey of digital work from inception, through production and onto publication and beyond. By visualising the relationships and touchpoints between people, processes and tools we could identify points of flow and friction, and clearly see the overlaps and gaps in the current ways of working.

As practitioners in an agency, we get to work with teams of different shapes and sizes across many sectors. As the in-house team, our client could highlight the unique realities and contexts of how design operates within their business. The powerful combination of these two perspectives allowed us to plot the design landscape and, more importantly, to spot some targets for gaining efficiencies.

Post-its of what's in and out of scope at the kick-off workshop
The kick-off workshop

How do you ship products whilst establishing new processes?

There were misconceptions in the company that the involvement of the Customer Experience team would slow down digital delivery. To counter this, we were keen to introduce processes that demonstrated the value the team could offer while also speeding up production.

To do this we picked two distinct approaches, in two different business areas, to trial on live production projects, helping the team follow the principle of learning quickly and scaling success.

The first pilot focussed on hypothesis-driven design. We developed and documented a repeatable process to surface assumptions and accelerate potential solutions, working closely with the marketing teams and business analysts. To prioritise the hypotheses we introduced a calculator that systematically scored each idea using a formula taking into account value, effort and confidence.

A table showing a mean score for a number of ideas
The calculator equation
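
The case study doesn't spell out the exact equation, so the snippet below is only a minimal sketch in Python. It assumes each idea is rated 1–10 for value, confidence and ease (the inverse of effort), with the mean of the three ratings becoming the priority score; the idea names are hypothetical.

    from statistics import mean

    # Minimal sketch of a hypothesis-prioritisation calculator.
    # Assumption: each idea is rated 1-10 for value, confidence and
    # ease (the inverse of effort), and the mean of the three ratings
    # becomes the priority score. The client's actual formula may differ.

    def priority_score(value: int, confidence: int, ease: int) -> float:
        """Return the mean of the three 1-10 ratings."""
        return mean([value, confidence, ease])

    # Hypothetical ideas, for illustration only.
    ideas = {
        "Simplify product filters": (8, 6, 7),
        "Localise hero banners": (6, 8, 4),
        "Reorder spec tables": (5, 9, 9),
    }

    # Print the ideas from highest to lowest priority.
    for name, ratings in sorted(ideas.items(),
                                key=lambda kv: priority_score(*kv[1]),
                                reverse=True):
        print(f"{priority_score(*ratings):.1f}  {name}")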

Design optimisations, tweaks rather than wholesale changes, were then made and A/B tested against the current solution on the live website. The introduction of a measurement loop, based on the original hypothesis, enabled successful changes to be repeated and rolled out across other sections of the website.
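
The piece doesn't describe the measurement tooling itself. As one illustrative way of closing such a loop, a two-proportion z-test, sketched below in Python with made-up numbers, indicates whether a variant's conversion rate genuinely beats the control before a tweak is rolled out more widely.

    from math import sqrt, erfc

    # Illustrative only, not the client's actual tooling: a two-proportion
    # z-test comparing a variant's conversion rate against the control's.

    def ab_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
        """Two-tailed p-value for the difference between two conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        return erfc(abs(z) / sqrt(2))                      # normal approximation

    # Hypothetical numbers: control converts 120/2400, variant 156/2380.
    p = ab_p_value(120, 2400, 156, 2380)
    print(f"p-value: {p:.4f}")  # roll the tweak out only if convincingly low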

The second pilot involved qualitative usability testing followed by collaborative rapid co-design. This helped the team develop the habit of doing usability research and helped the wider business see the value of testing with the intended audience.

We started by setting a date for testing and creating a screener to recruit the target audience. The day before the sessions, we ran an intensive day of hands-on coaching in facilitating moderated usability research. The team took to the task with aplomb and the next day ran an insightful round of usability testing, observed by colleagues and stakeholders.

The observations from the sessions were analysed to find emerging themes and areas for enhancement. This quickly led to collaborative lo-fi sketching sessions to create solutions to the issues the research had surfaced.

It was rewarding to see the team so confidently carry out usability testing and to hear from the observers how useful they found the sessions.
Maite Otondo, Design Researcher, Clearleft
Reviewing results from user testing

Throughout both of the pilots we kept up daily communication, sometimes on-site together, but more often using remote-working tools including Miro, Sketch and the Google suite. This was important as we wanted to mirror the way this pan-European team typically works.

As the pilots progressed it became apparent that the Customer Experience team held a vital role as a conduit between marketing and the agile delivery team. Their growing digital expertise helped translate and shape the needs of the business and its audience into impactful interfaces.

Customer Experience as the conduit between marketing and delivery teams

How do you leave a legacy for long-term change?

To wrap up the engagement we first created the start of a UX Hub, a repository of knowledge and information. Set up on the company’s wiki so it is available to all employees, the UX Hub contains a step-by-step toolkit of techniques, results and reports from qualitative and quantitative research, and a section where insights are distilled into nuggets of evidence-based best practice. The intention is that the UX Hub is a living, curated repository that will grow over time.

In addition, we undertook a Design Maturity Assessment of the business to benchmark their current situation and to make actionable recommendations going forward.

The Design Maturity Assessment is a proprietary tool we have developed at Clearleft to quantifiably measure the state of digital practice within an organisation. It collates data via an anonymous survey from three distinct sources: the client’s digital team, the consumers of the team’s digital services across the business, and the Clearleft practitioners who have experienced working with the organisation. An algorithm then generates a percentage score for each of five factors we believe are indicators of design maturity:

  • Collaboration - the ability to build shared understanding & alignment
  • Empathy - the organisation's curiosity about its customers and human-centred design
  • Impact - design's contribution to business success
  • Trust - the organisation's empowerment, influence and belief in design
  • Purpose - how design is deployed to help solve significant challenges
The Design Maturity Assessment showing the five factors and a circle for the score out of 100
The Design Maturity Assessment scores each of the five factors out of 100, delivering an overall score out of 100
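
The scoring algorithm itself is proprietary and not described here, so the sketch below only illustrates the general shape such a calculation could take: averaging anonymous 1–5 survey answers per factor across the three respondent groups and rescaling to a percentage. All names and numbers are hypothetical.

    from statistics import mean

    # Rough sketch only: the real Design Maturity Assessment algorithm is
    # proprietary to Clearleft. This shows the general shape: average 1-5
    # survey answers per factor within each respondent group, weight the
    # three groups equally, then rescale to a 0-100 percentage.

    FACTORS = ["Collaboration", "Empathy", "Impact", "Trust", "Purpose"]

    def factor_scores(responses: dict) -> dict:
        """Map each factor to a 0-100 score across all respondent groups."""
        scores = {}
        for factor in FACTORS:
            group_means = [mean(group[factor]) for group in responses.values()]
            scores[factor] = round((mean(group_means) - 1) / 4 * 100)
        return scores

    # Hypothetical 1-5 survey answers from the three sources.
    responses = {
        "digital_team":   {f: [4, 3, 4, 5] for f in FACTORS},
        "wider_business": {f: [3, 3, 2, 4] for f in FACTORS},
        "clearleft":      {f: [4, 4, 3, 3] for f in FACTORS},
    }

    scores = factor_scores(responses)
    print(scores, "overall:", round(mean(scores.values())))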

As a repeatable and robust survey, it is intended to be used periodically to measure changes in digital design maturity. More importantly, the scores are accompanied by practical and actionable suggestions for what to tackle next to move along the design maturity scale.

We were conscious that the long-term value of the project would only be realised after our engagement had ended. From the outset, we had plans to help the team repeat and perfect the activities we were introducing.

Transforming processes and people takes time. It usually requires incremental bursts of activity to embed new practices, plus time for reflection and iteration to turn new ways of working into habits. We feel we’ve left the team with a bigger toolkit of techniques to increase the speed and quality of their work, and the organisation with a greater understanding and awareness of user-centred design methods.
