Oct.8.2012

Software Fail Story – Stop blaming developers!


We often blame bad software on bad developers. While that may sometimes be true, bad software is more often the result of poor methodology and/or poor management. Here is a good example of how a company wasted hundreds of thousands of dollars because of a few easily avoidable bad decisions.

The context

In this large online company processing tens of thousands of transactions every day, a great deal of time was spent reviewing outbound transactions. While inbound transactions (purchases) were highly automated, outbound transactions (refunds, withdrawals) all required manual review and processing.

An old in-house tool allowed analysts to mass-approve or reject transactions; however, it gave them no information about the transactions themselves, so they wasted time searching the company’s systems to decide whether a transaction should be approved or rejected. Because of challenging KPIs, staff ended up mass-approving transactions without even reviewing them, which cost the company money in abuse and fraud. Management logically decided a tool was needed to help analysts make better decisions and be more productive.

The product

The idea was to bring all the decision-making information onto a single screen so that an analyst could approve or reject a transaction in two minutes instead of the current ten. Mass approvals would no longer be needed if staff were 5x faster at reviewing transactions, and since each transaction would be reviewed individually with access to better information, better decisions would be made and the company would save money. So far, so good.

What went wrong?

The project was assigned a Project Manager (PM), who wrote the specs based on the requirements given by the functional manager.

The specs were written in one block, signed off, and handed to the Dev Team. Six months later, the PM was told his project had reached the top of the waiting queue and was ready to be rolled out.

Next, the product is rolled out to the production environment and the project manager completes UAT. Staff are told to use the new product, and management looks forward to seeing productivity rise while bad decisions decrease.

One month, two months later, nothing has changed. In fact, some analysts are doing even worse than before. The functional manager concludes that something must be missing from the tool and presses for more enhancements. The cycle repeats itself until the Dev Team’s PMO refuses to accept any new projects, saying that they all claimed a great ROI but failed to deliver it, and were therefore wasting development time.

One day, the Project Manager bumps into one of the transaction analysts at the cafeteria and asks him how he’s finding the new transaction reviewing tool.

“Well, I’ve used it only once, and I hated it.”

“WTF! I spent 6 months getting this developed. We built exactly what you wanted, down to the colour of each button.” The project manager couldn’t believe what he’d heard.

“Sorry, but that tool is not at all what we need. I don’t even understand how I’m supposed to use it, and when I tried it, my productivity scores went down.”

Time/Money spent

About $100,000 was spent in development and project management hours, but the real cost also includes:

  • The cost of developing or buying a solution that delivers what the current one failed to provide.
  • The cost of another six months of lower productivity and bad decisions from analysts while a replacement is found.

Lessons learned

  • Not involving end-users = Guessing exercise

At no point in the process were end-users involved. Neither the functional manager nor the PM had ever worked as a transaction analyst, so their specs were based only on what they assumed the processes and needs to be. As a result, the product was completely unsuited to the end-users, who preferred to stick with the previous tool. Although the company used Confluence, nobody thought of putting the specs on it and opening them up for comments.

  • ‘Closed’ JIRA = No collaboration

Although the developers used JIRA for their project management, end-users couldn’t access it, and even the PM could only read, not write. During UAT it was very difficult to report bugs, and the PM ended up bombarding PMOs and devs with emails, which of course they hated. And when end-users found bugs, they didn’t know how to report them.

  • No Agile methodology = cross fingers and hope you get what you want

Just as there was no exchange of ideas with end-users, there was none with developers. They were handed the specs, and the PM had no idea what his product would look like until delivery time. This is especially dangerous in a project where the UI was so critical to success. Similarly, the business was not involved at the QA stage, and the PM only saw the product once it was rolled out to production, leaving him no margin for changes.

  • No change management = No adoption

When the product was rolled out, no training was provided to end-users. The PM did write release notes on the company’s Confluence, but these were visible only to management, which was worried about divulging too many technical details to the staff. The few members of staff who did give the new system a go quickly saw their productivity drop and reverted to the previous tool.

This project came from a good initiative and seemed to have a guaranteed ROI, but it failed because of an excessive focus on technology and none at all on the human factors: collaboration and processes.
