Case study - introduction
Testing the product is essential because we make mistakes. Human errors cause defects at every stage of the software development process. Some of them are trivial, but others can have life-threatening consequences. Introducing testing at the beginning of a project reduces development costs and improves the final product.
There are many projects where testing is limited to unit tests or basic UATs for months or even years.
How do you create a test process for such a project? What are the biggest difficulties and challenges for a tester and the rest of the team?
Let's look at an example of the project our Quality Assurance Specialist worked on.
We worked on a platform for users in the USA to share messages on their walls; it also had a shop with a payment system. The web app had been in production for a few years when the client decided to introduce a software tester who would strive to improve its quality with manual and automated testing.
I knew about some significant bugs the developers were working on, and the business team was doing terrific work performing UATs (User Acceptance Testing). But the development team was creating more and more new functionalities, and there were not enough resources to test them properly. So when I got the project, I was very excited - I had introduced testing to a few projects in the past, so I knew what to expect.
First, I had to get to know the app, so I started exploring it on my own. In this case, anyone with an internet connection could access and use the app, so this was the best time to verify whether it was intuitive for a new user.
This type of testing is also useful when project documentation is unavailable or only partially available. The business team had written basic test flows, which helped, but I had to talk to team members to understand the more complicated ones.
The next important step is to create a test plan - a guide through the entire testing process: it helps define the scope of testing, set pass/fail criteria, document the process, and deliver everything we need to make the product the best it can be.
It's worth taking some time to investigate which process would suit this particular project and how to approach it correctly, so that testing stays organized and the time is well-managed.
What should we include in this document?
Introduction - the overview of the project and the goal of the test plan and process.
System/device requirements - the system requirements needed to perform the tests: browsers and additional tools, plus a list of the mobile devices most popular among users, on which the tests will be performed.
Scope of tests - all types of tests we plan to perform in the project. The most important are functional, regression, and smoke tests, which are often automated. We can also include performance tests, security tests, integration tests, and so on.
Test Acceptance Criteria - we usually have the Acceptance Criteria specified by the Product Owner, but we also have to consider other aspects like security, performance, or usability.
List of functionalities to be tested - we can also include scenarios for the functionalities.
Test schedule - information on how often, or when exactly, each test type will be performed and for which functionalities.
Test report - specify what its form will be. It can be a document with a list of all tested functionalities with information about issues, bugs, and results of tests. Sometimes a comment on the task and bugs reported in Jira are enough - it all depends on the project.
Roles - the document points out each team member's responsibilities.
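To make the "scope of tests" item more concrete: the automated smoke tests mentioned there are small checks of the critical paths. A minimal sketch in Python - the WallClient class and its methods are hypothetical stand-ins for the real app's API, used here only so the example is self-contained:

```python
# Minimal smoke-test sketch for the wall/shop platform.
# WallClient is a hypothetical, in-memory stand-in for the real app's
# HTTP client, so this sketch runs without a live server.

class WallClient:
    def __init__(self):
        self._messages = []

    def health(self):
        # In a real client this would be an HTTP GET to a health endpoint.
        return {"status": "ok"}

    def post_message(self, text):
        # In a real client this would POST the message to the wall.
        self._messages.append(text)
        return {"id": len(self._messages), "text": text}

    def wall(self):
        # In a real client this would fetch the user's wall.
        return list(self._messages)


def run_smoke_tests(client):
    """Critical-path checks: the app is up, a message can be posted and read back."""
    assert client.health()["status"] == "ok", "app is not responding"
    posted = client.post_message("hello wall")
    assert posted["text"] == "hello wall", "posting a message failed"
    assert "hello wall" in client.wall(), "posted message missing from wall"
    return "smoke tests passed"


print(run_smoke_tests(WallClient()))
```

In a real project these checks would run against a test environment on every deployment; the point is that each smoke test covers one critical path and fails loudly.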
With a well-made test plan, we can proceed with further steps.
I found many bugs during exploratory testing, which leads us to the first issue.
At the beginning of the project, I didn't know the app well enough to prioritize the bugs I reported on my own - I knew which bugs were severe from the development point of view, but I still lacked User Experience knowledge.
So, I worked closely with the business team, as they had the knowledge I needed. Together we created a list of the functionalities and flows essential for a user. Then, combining that with technical knowledge, I made a simple guide for prioritizing bugs. In this document, I assumed that priority depends on two questions:
- How many users does it affect?
- How severe is it?
In combination with the list of the most important functionalities, I could start prioritizing bugs with little help from the business team.
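The two questions above boil down to a simple lookup. A sketch of such a prioritization guide - the labels and the matrix itself are my illustrative assumptions, not the project's actual rules:

```python
# Simple bug-priority guide: priority is a function of how many users a bug
# affects and how severe it is. The labels and mappings are illustrative
# assumptions, not the real project's guide.

PRIORITY_MATRIX = {
    # (users affected, severity): resulting priority
    ("many", "high"): "critical",
    ("many", "low"): "major",
    ("few", "high"): "major",
    ("few", "low"): "minor",
}

def prioritize(users_affected, severity):
    """Answer the two questions, then look up the resulting priority."""
    return PRIORITY_MATRIX[(users_affected, severity)]

print(prioritize("many", "high"))  # -> critical
print(prioritize("few", "low"))    # -> minor
```

A bug in the payment flow would affect many users severely (critical), while a cosmetic glitch on a rarely visited page affects few users mildly (minor).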
You may find our previous article useful - 4 tips for prioritizing bugs
Introducing the business team to Acceptance Criteria was also a challenge. There was no documentation and no notes from the meetings where the features had been discussed - even the development team had to rely on memory for what the business team had agreed to.
I created a document with tips on writing Acceptance Criteria, explained how they would improve development, and met with the business and dev teams to discuss the problems and solutions.
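To show how written Acceptance Criteria translate directly into checks, suppose one criterion says "a wall message must be between 1 and 500 characters". The criterion and the validator below are hypothetical, chosen only to illustrate the mapping from an AC line to a concrete test:

```python
# Hypothetical Acceptance Criterion: "a wall message must be 1-500 characters".
# Each clause of the criterion maps to one concrete check that the tester
# (or an automated unit test) can run.

MAX_LEN = 500

def is_valid_message(text):
    """Accept a message only if it satisfies the length criterion."""
    return 1 <= len(text) <= MAX_LEN

# Checks derived one-to-one from the criterion, including boundary values:
assert is_valid_message("hello")        # normal case is accepted
assert not is_valid_message("")         # empty message is rejected
assert not is_valid_message("x" * 501)  # over the limit is rejected
assert is_valid_message("x" * 500)      # exact boundary is accepted
print("all AC checks passed")
```

When the AC is this explicit, developers, testers, and the business team all verify the feature against the same list instead of against memory.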
What was the result?
- Developers knew what to develop and how the feature should work.
- Estimates became easier and didn't change as often during the sprint.
- Developers checked that the basic scope was complete, so I wasn't blocked by unfinished functionality and could focus on edge cases and detailed testing.
- Developers didn't have to return to already developed functionality multiple times during the sprint and could focus on their next tasks - which increased the number of functionalities delivered per sprint.
- More cards were getting through QA because most bugs were not blockers.
- I knew exactly what to test, how everything should work, and what was within the scope of the card.
- I didn't have to meet with the business team so often, so I had more time for tasks I planned to do in the Testing Process, like writing automated tests.
Read more about Acceptance Criteria in our blog post.
The team was excited to implement the testing process. First, the business team and the designer started creating user stories with detailed Acceptance Criteria. This took some time initially, but every iteration got easier.
The dev team noticed the improvement immediately - it was easier to follow the design while developing. Furthermore, after implementing a feature, they could quickly check against the AC list whether they had completed the whole task.
Later, during the testing step, tasks stopped coming back to the dev team as unfinished, and testing yielded fewer critical bugs. As a result, testing took much less time, which reduced costs.
Each product and feature will have specific testing criteria, strategies, and needs, so test plans and processes will vary between projects. But the goal is always the same: to ensure the product meets the requirements, is intuitive, easy to use, and the best it can be.