Introducing product best practices in an old-fashioned software company

Introduction

In 2022, I joined a software company in the risk prevention sector. A pioneer in its field, based in the country with the toughest regulations (but also the highest accident rate), the company was very confident in its know-how and was looking to expand into other territories.

The 20-year-old platform was over-complicated, its code was obsolete, and it couldn’t support the growth and internationalisation the company was aiming for, so they decided to start a new tool from scratch: a lighter SaaS product that wouldn’t rely on internal consultants setting everything up for each new customer.

For that purpose, they tripled the size of the development team in less than a year and started building the new platform with input only from the internal employees who had designed the previous one. With no research, ownership of the product scattered all across the company, a very junior dev team, no unit testing in place, and no guidance on usability best practices, I very soon realised that I had accepted a bigger challenge than I had expected.

Main tasks

  • Mentorship of a small UX team: although there were people in UX roles in the company, they were junior, focused more on UI than on a holistic approach, and new to Agile methodologies.
  • UX review of the current state of the platform: from the login page to the end of the user flow, each step of the process was isolated from the steps before and after it, and each screen had copy, interaction and UI problems.
  • Review of the front-end: since every developer had so far been free to create their own components, standardising the code and achieving a consistent experience were huge challenges. Luckily, the design system of choice was pretty well documented.
Funny thing: the chosen design system was documented in… Chinese.
  • Interviews with stakeholders: many of the business decisions were being made by the dev team during refinement sessions, and whenever a business rule wasn’t clear, the team would make the choice that made the most sense to them, without testing with users or checking with the business.
  • User interviews and testing: I interviewed several people who used the current software, as well as people from regions where risk prevention is only starting to become a thing. I found out that many of the features that had been developed were replicas of patches added to the old software because its outdated technology didn’t support a more adequate solution. I also found that the data model didn’t meet the needs of users from other regions.
  • User & buyer personas: to help the dev team understand the needs, skills and priorities of the potential users and the decision makers, I put together a set of proto-personas with the help of marketing, customer success and consultants. I then validated my assumptions with potential customers who were interested in buying the platform and shared the outcome with the engineers, who started applying this new knowledge in their discussions straight away.
Reminding the developers and the business about the skills of the users always helps keep them grounded.
  • Journey mapping and detection of user pain points: with poor product guidance, developers had created a list of disconnected features that could theoretically fit the needs of the user, but only if the user had a map, instructions, and training. Not what you expect from a SaaS, unless you are happy to spend millions on a huge help-desk team, right? I mapped the user flows on Miro, marking the points where the flow was broken, which gave me a clear idea of where to start. In less than a month, all the pain points had been detected and their solutions defined.
Analysis of gaps in the user flow (with undefined business rules in yellow, dev limitations in red, navigation issues in green).
  • Guidance on accessibility: the company hired a design agency to come up with a new brand, colours and look & feel for the platform, so I assisted them in the process, explaining the limitations set by accessibility standards, mainly colour-contrast requirements (see the contrast-check sketch after this list), to make sure that our out-of-the-box design system could be adapted to the final design without rework. We managed to apply the brand properly in only one sprint.
  • Content design and glossary documentation: from the names of the entities involved in the user flow to the capitalisation of the text, a very detailed glossary was very much needed. Within about six months of my first sharing it, consistency improved significantly, and most developers were writing new code following the same guidelines.
  • High-fidelity mock-ups: to facilitate testing of the UI and interactions, I created very detailed mock-ups of each page, covering every possible state of each element and describing its interactions. It took only a day for some of the developers to engage with the new workflow, but about a year until 100% of them were collaborating in Figma.
Figma’s comment feature was really helpful when collaborating with developers.
  • Emails, notifications & translations: in a seven-language platform, keeping track of every message you send to a user requires LOTS of work. I first outlined the points of the user flow where a notification was needed, then wrote the text for English and Spanish, following the content style guide. I then worked with translators to make sure the tone was right and the translations were accurate. I also kept the automatic translations of UI elements up to date, correcting them where needed (a small message-catalogue sketch follows this list). This is a never-ending process, but users noticed improvements after three weeks.
  • Definition of dev tasks: the ownership of the product was very blurry due to internal issues. This meant that most tasks weren’t properly analysed, scoped, or even understood when they made it into sprint planning. Developers struggled with this, as it’s impossible to estimate a task properly without understanding it first. It also created more bugs than I had ever seen before. I saw a quick improvement in both the quality of the work and the mood of the developers after I started writing tasks in more detail.
  • UX evangelisation of the dev & QA teams: as I quickly realised, most of the developers didn’t have a solid knowledge of UX principles either. Used to working with more experienced developers, I realised they weren’t applying UX best practices simply because they didn’t know them. I decided to run a UX workshop with games and a little contest to keep people engaged. The effect was immediate: developers found UX interesting and useful, and they started asking themselves more questions from day one.
Example from one of the UX lessons on how to make a text readable.
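A quick note on what those accessibility limits actually are: for brand colours, the hard constraint is the WCAG contrast ratio. The sketch below (TypeScript, with illustrative names and colours, not code from the project) shows the kind of check involved, using the WCAG 2.1 relative-luminance and contrast-ratio formulas against the 4.5:1 threshold for normal body text.

```typescript
// Minimal sketch (illustrative names and colours — not project code):
// WCAG 2.1 relative luminance + contrast ratio, checked against the
// 4.5:1 threshold for normal body text (3:1 applies to large text).

/** Relative luminance of an sRGB colour given as "#rrggbb". */
function relativeLuminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

/** WCAG contrast ratio between two colours, from 1 to 21. */
function contrastRatio(colourA: string, colourB: string): number {
  const [lighter, darker] = [relativeLuminance(colourA), relativeLuminance(colourB)]
    .sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: a hypothetical brand orange on a white background.
const ratio = contrastRatio("#e67a00", "#ffffff");
console.log(`${ratio.toFixed(2)}:1 — ${ratio >= 4.5 ? "passes" : "fails"} WCAG AA for body text`);
```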
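And on the translations side, the core of the approach is simply keeping every user-facing message keyed per locale, so a missing or outdated translation shows up in review rather than in front of a user. A minimal sketch of what that looks like, with hypothetical keys, locales and wording (not the project’s actual catalogue):

```typescript
// Minimal sketch (hypothetical keys, locales and copy — not the project's
// actual catalogue): every notification is keyed per locale.

type NotificationKey = "incidentReported" | "actionAssigned" | "deadlineReminder";

// Each locale object must provide every key, or the compiler flags the gap.
const notifications: Record<string, Record<NotificationKey, string>> = {
  en: {
    incidentReported: "A new incident has been reported at {site}.",
    actionAssigned: "You have been assigned a corrective action.",
    deadlineReminder: 'The action "{action}" is due on {date}.',
  },
  es: {
    incidentReported: "Se ha notificado una nueva incidencia en {site}.",
    actionAssigned: "Se te ha asignado una acción correctiva.",
    deadlineReminder: 'La acción "{action}" vence el {date}.',
  },
  // ...the remaining five locales, reviewed with translators
};

/** Render a notification, filling {placeholders}; falls back to English. */
function renderNotification(
  locale: string,
  key: NotificationKey,
  params: Record<string, string> = {}
): string {
  const template = notifications[locale]?.[key] ?? notifications.en[key];
  return template.replace(/\{(\w+)\}/g, (_, name) => params[name] ?? `{${name}}`);
}

// Example usage:
console.log(renderNotification("es", "incidentReported", { site: "Planta Norte" }));
```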

Biggest challenges

Communication is key

Even the best of teams won’t work properly if nobody knows who is working on what, when things will happen, who is struggling, and (above all) where the project is going.

Don’t be afraid to go back

Most of the greatest improvements we experienced in the project happened after months of debating whether they should be implemented at all. Being too loyal to, or too in love with, what you’ve done so far makes you waste time.

Old habits die hard

It’s not easy to draw the line between what must change and what can’t change.

Cultural barriers are real

With a couple of dozen people of four nationalities and different points of view, working in isolated teams that barely communicated, it’s essential to have a good manager who deals with the conflict. Otherwise, you’ll find yourself building software in the middle of a war zone.

What I learned from this project

  • The size of a team shouldn’t grow faster than its maturity
  • Pick your battles: you can’t row against the current forever
  • Usability is not a collection of features, but understanding the task flow (I knew this already, but now I’ve seen proof of it!)
  • Scrum was right: people over processes is the way to go
  • Being open to trying new technologies is key
  • You just can’t work without proper research, analysis and useful acceptance criteria
  • Defining a Sprint Goal, a Definition of Ready and a Definition of Done are a must
