
Test automation

UX design, UI design

Test automation is a system that organizes, sorts, and executes automated software tests.

Timeline: 6 months
Category: UX design, UI design
Type: Web platform
My role: UX researcher, UX designer, UI designer
Tools: Figma


This project was realized during my 6-month internship at Nokia as a UX/UI designer. I was responsible for designing the interface of a new tool that aims to facilitate the testing phase of software development.

Problem

In software development, the testing phase is often very time- and cost-consuming. With increasing computing speed and the adoption of DevOps principles in companies, manual test cases have become ineffective, as their execution does not allow time-to-market deployment of software.

Thus, many organizations are investing in automated testing by building more and more automated test cases. However, these test cases are often executed separately from the others, as test case management systems rarely integrate automation tools.

Solution

Build a test case management system that embeds automation tools, allowing the execution of automated test cases in addition to organizing, sorting, and managing test cases.

The system has to address the complex needs of testing processes, enhance collaboration between testers, and integrate automation seamlessly.

Design process

Understand

Benchmark, Story interviews, Personas

Design

User flow, Information architecture

Implement

Prototype in Figma

Test

Usability testing, improvements

Benchmark

Since the new product was not yet released, I started by researching test case management systems, as they represent a large portion of the entire product.
I spent the first weeks learning about testing, either with the guidance of the test automation engineering team or on my own through certified testing documentation and related papers found via literature search engines.


I researched other test case management tools available on the market, whether open source or commercial, besides the one used by my team (T system). Some of them offered limited free trials, so I was able to briefly investigate their content, the organization of their artifacts, and the layout of their pages. Each has its own distinct UI design system.
However, they all follow a similar pattern in organizing content, typically composed of a test plan repository page, a test case repository page, a reporting page, and a dashboard.

Story Interviews

I interviewed the team's solution architect and test automation engineers about their work and their experience with T system, and asked them to demo their use of T system during one-to-one meetings.
What made these recordings enriching is that the engineers would naturally criticize the tool during their demos.

From these sessions, I identified several requirements for the new system and problems with T system to tackle in the design, such as:
1. Test case management systems (TCMS) should facilitate collaboration between team members.
2. A TCMS should favor reusable test cases (e.g., with a test case library).
3. A TCMS should be flexible enough to accommodate diverse testing team methodologies and procedures.
4. A TCMS should support automation specificities.
5. T system is unnecessarily complex to use: many of its available functions are not even needed.
6. T system does not embed automated test case execution.
7. T system's dashboard is not straightforward.

Personas

The target audience is quite specific: network test engineers, network administrators, project managers, and telecom professionals. I defined the product's personas through the interviews and demo sessions with the team's test automation engineers, as they were the first end users I was in contact with.

The first persona is John Doe, an automation expert with 7 years of experience in the network testing field, in charge of integrating automation into the testing process. He monitors testing activities that involve automation, which includes the execution of automated test cases.

The second persona is Chris Cohen, a product manager in charge of defining test requirements and testing strategies and monitoring the progress of a testing campaign. He collaborates with network testing engineers to ensure that the test campaign ends within the timeline, and readjusts testing strategies accordingly.

The two personas depict a clear difference in goals and expertise. Thus, both points of view must be encompassed in the TCMS module design.

Define the main artifacts

Some of the new product's testing artifact labels didn't match the terminology introduced in certified software testing documents. There was a need to align and accurately define the product's vocabulary.

Thus, I created a shared vocabulary list of terms, concepts, and artifacts found in several TCMS documents, along with their definitions. Team members were collectively asked to correct or adjust the definitions and add any other terms they considered missing or relevant.
Then, I visualized the TCMS testing artifacts through a schema that represents their attributes, behaviors, and the relationships between them.

The test automation engineers deepened my understanding by fixing and completing the schema. I used the vocabulary list and the schema as a baseline for the next design steps.

Information architecture

With the preceding data collection, I was able to build the first information architecture (IA) diagram of the new system. Its purpose was to serve as a guideline for prototyping the UI frames.
To create the IA diagram, I first grouped task descriptions by artifact, then by type of action. Once the clusters were formed, I rearranged them into bigger groups that became the main pages of the new system.

I ended up with 20 groups in total. These pages are organized into 3 levels: (P1) the home page; (P2) the project repository page; and (P3) the project dashboard page.

Then I determined the navigation structure, with arrows illustrating the navigation flow. Lastly, I renamed the clusters and actions with keywords that are more explicit and better fit the end users' vocabulary. I built 3 versions of the information architecture, with the help of the test automation engineers, who provided comments and corrections at each iteration.

Wireframes

To build the first version of the wireframes, I determined the key pages of the new system based on the IA diagram. To compose the page layouts, I started by brainstorming visuals for each artifact. I obtained 9 main tabs split across the 2 other pages.

The wireframes were designed in Figma. I iterated on the wireframes as much as possible until the test automation engineers were satisfied with the established layout and behavior.

Mockups

Regarding the design system, Nokia has its own visual identity system, so the UI components I used for the mockups (search bar, buttons, menus, etc.) were taken directly from it.
I had the freedom to change the color, size, and shape of some components to enhance their usability, and I sometimes adapted the behavior and mechanics of standardized components, within reasonable bounds.

In the end, I was able to prototype 5 tabs out of 9, with their corresponding tasks and functionalities.

User interviews for the dashboard

In a TCMS, the dashboard is where information across disciplines is shared, making it the center of all decisions. With the data I possessed at that stage of the internship, I couldn't design a useful dashboard. Thus, I took the initiative to conduct another design sprint in parallel, dedicated solely to the dashboard design of the new system.

I conducted user interviews with end users outside the team. The main requirement was to interview people with different roles in a testing campaign. I was able to reach 3 people in total: two integration professionals and one test automation manager. I conducted a 1-hour semi-structured interview with each of them to discuss their work, their role in a testing campaign, and their experience with test automation tools in general.

The common suggestions I received were:

    - The TCMS dashboard would mainly be used by managers, who need to explain the current state of a test campaign to stakeholders, so metrics such as the number of test cases in a test plan, the number of passed test cases, and the execution progress should be visible.

    - The dashboard should support regression testing (comparison) by displaying the number of tests passed in the previous build and in the current build, so the evolution can be compared. Managers would notice an issue if the latest build has more failed test case executions.

    - The dashboard must show a list of passed and failed test cases, with the possibility to investigate the reason for each failure.

Next steps

Due to time limitations, the user interface of the new system was not fully completed. I was able to build the prototype of the "Project Repository" page with all its functionalities, and half of the functionalities of the "Project Dashboard" page, including creating and editing test cases as well as creating and editing test plans.

Usability testing sessions would have been the next step to assess the current prototype. Furthermore, the insights on the dashboard gathered from the various interviews can serve as research material for its design and implementation. From my perspective, this may require additional interviews to broaden the spectrum, along with digging into chart visualization documentation, which may demand a whole distinct research sprint of its own, especially since progression, trend, and comparison charts are needed for strategic planning.

Research and design of test execution still needs to be undertaken, as it involves the integration of an automation tool. Moreover, manual and automated test cases don't demand the same testing settings; the user interface should encompass both procedures in order to ensure a harmonized result.

Learnings

I would like to highlight the importance of involving users in the design process. The current design work was only possible thanks to the precious involvement and feedback of the end users, especially for a system that requires such extensive and detailed comprehension of its artifacts' purpose, mechanics, and relationships, as well as a deep understanding of the testing process and its challenges.

I also realized the importance of communication and of making concepts accessible when exchanging with people from different areas of expertise. As a UX/UI designer working with test automation engineers, we were constantly learning from each other. This learning continued throughout the internship, leading me to update and evolve the content of my deliverables on a regular basis.
