Hello, Atlassian Community! My name is Mikael and I'm here to share my experience of choosing and purchasing software.
To start, I'll tell you about the problems my team was tasked with solving. At the time of the switch, I was working for Hill-Rom IT Solutions, and I was responsible for administering the tools we used for Application Life Cycle Management (ALM). One of the driving factors behind looking at new tools was that our test management tool had been end-of-life for a while. Hill-Rom had started to migrate one project to a new test management tool, but we were not happy with its functionality or with the response from the vendor. That kicked off the search for a new test management tool. The core team put together a list of requirements that we wanted the new tool to cover, and we selected three tools for our initial evaluation. We soon discovered that two of them also covered requirements management, so we went back and expanded the scope of the evaluation to cover all of our ALM tools.
For every comparison I've created, I've used Word, PowerPoint, or Confluence. The image above is an example of how I used PowerPoint to document the requirements and compare the different tools.
The picture above shows just a subset of the things my team needed to consider when switching software. In total we had about 60 requirements, and the core team's list covered everything we wanted the new tool to provide: the things we liked about the existing tools, the current gaps, and the items on our wish list. Some of the wish-list requirements were based on research we had already done during initial trials of the tools. We then went back and did a deeper evaluation of each tool, and the results became the input to the matrix in the PowerPoint.
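If you want to tally a matrix like this without eyeballing slides, a simple weighted score per tool works well. The sketch below is illustrative only: the tool names, requirements, weights, and scores are all made up, and the 0–2 scale (not met / partially met / fully met) is one common convention, not what we necessarily used.

```python
# Hypothetical requirements matrix: each requirement has an importance
# weight and a 0-2 score (not met / partially met / fully met) per tool.
requirements = {
    "Test case versioning": {"weight": 3, "scores": {"Tool A": 2, "Tool B": 1, "Tool C": 2}},
    "REST API for export":  {"weight": 2, "scores": {"Tool A": 1, "Tool B": 2, "Tool C": 0}},
    "Jira integration":     {"weight": 3, "scores": {"Tool A": 2, "Tool B": 2, "Tool C": 1}},
}

# Sum weight * score for each tool across all requirements.
totals = {}
for req in requirements.values():
    for tool, score in req["scores"].items():
        totals[tool] = totals.get(tool, 0) + req["weight"] * score

# Rank tools from best to worst total score.
for tool, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{tool}: {total}")
```

The ranking this prints is only as good as the weights you assign, which is why agreeing on weights with stakeholders up front matters more than the arithmetic.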
After the initial evaluations, we summarized everything in an executive-summary slide deck that listed all the requirements and how well each evaluated tool met them, and presented it to key stakeholders. As part of the summary we also added a cost projection for the next three years based on the recommended tool.
Our biggest challenge was getting the data out of our existing tools. In most cases we were able to write a script for it, or export the data in a format that could be imported into the new tools. But one tool presented a problem: it used a proprietary database, and the data we needed was not available through its REST API, so we had to manually copy and paste that data between the tools.
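To give a flavor of the scripted path, here is a minimal sketch of the conversion step such an export script might do: flattening JSON records (as a REST export might return them) into a CSV the new tool can import. The record fields and sample data are hypothetical, not what our actual tools produced.

```python
import csv
import io

def export_records_to_csv(records, fields):
    """Flatten a list of exported record dicts into a CSV string.

    Fields not listed in `fields` are ignored, so extra metadata in the
    export does not break the import file.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    for rec in records:
        writer.writerow(rec)
    return buf.getvalue()

# Hypothetical test-case records, shaped like a JSON REST export.
sample = [
    {"id": "TC-1", "name": "Login works", "status": "Passed", "owner": "mikael"},
    {"id": "TC-2", "name": "Logout works", "status": "Failed", "owner": "mikael"},
]
print(export_records_to_csv(sample, ["id", "name", "status"]))
```

In practice the fetch side (authentication, pagination, rate limits) is where most of the effort goes; the point here is only that once the data is out as JSON, the reshaping for import is usually straightforward.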
Unfortunately, I was not around for the full rollout, but based on feedback I've heard from former colleagues, they are happy with the switch.