EMMA L. SANDBERG

Cobalt: Improving peer feedback delivery

Helping pentesters grow by showing them feedback from their peers

TL;DR

I worked as a product designer on a team at Cobalt, a company in the pentesting (friendly hacking) industry. We aimed to improve feedback for the pentesters in our community. The problems: pentesters couldn't see constructive feedback, ratings lacked consistency, and feedback often lacked context. We brainstormed solutions, including aggregating feedback trends, clarifying ratings, and encouraging longer comments. After multiple design iterations, we released the changes incrementally, resulting in a 21% increase in monthly visitors to the feedback page and an 11% decrease in comments under 50 characters. Pentesters found the updates very helpful for tracking their progress.

Intro

Cobalt is a Pentest as a Service (PtaaS) platform with a community of pentesters who simulate cyber attacks and deliver insights to help companies remediate risks and innovate securely (friendly-hacking for hire).
My role: I was the product designer in a team with a PM, an EM, and six engineers.
Dates: Feb 20 - Mar 28, 2023 (research happened throughout Q4 2022)

Setting the stage

Pentesters are friendly hackers whom Cobalt employs as contractors. They are based all over the world and often pentest on the side of a full-time job.
After each pentest, pentesters are asked to rate and comment on the performance of the testers they collaborated with. The community team uses this to spot trouble and to help the Core (the pentester community) improve. The survey consists of a star rating, predefined tags for what the pentester did best and could most improve on, and a free-text comment field. Pentesters could only see their positive tags in the app; for any other feedback, they had to go via the community team (the people managing the testers).
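For concreteness, here is a minimal sketch of what a single peer feedback entry could look like as a data structure. This is purely my own illustration; the names (PeerFeedback, improvementTags, and so on) are assumptions, not Cobalt's actual schema.

```typescript
// Hypothetical shape of one peer feedback entry. All field names are
// illustrative assumptions, not Cobalt's real schema.
interface PeerFeedback {
  pentestId: string;
  reviewerId: string;          // the pentester giving the feedback
  revieweeId: string;          // the pentester being rated
  rating: 1 | 2 | 3 | 4 | 5;   // star rating
  positiveTags: string[];      // predefined "did best" tags
  improvementTags: string[];   // predefined "could improve on" tags
  comment: string;             // free text; mandatory for ratings of 3 or less
  submittedAt: Date;
}
```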
[Image: The original peer feedback page]

Discovering the problem

The Cobalt pentest community (roughly 400 pentesters) is an essential part of our business model: without people running the pentests, we have nothing to deliver to our customers. Through the feedback we received directly from the community and from working with internal stakeholders, we believed there was a gap in pentesters getting clear and consistent feedback, and our goal was to help mature pentester talent. Through Slack conversations, pentester interviews, and stakeholder working-group sessions, we identified three main problems and used the job story format to convey them to the team:

The problems

#1
When pentesters are looking to improve their skills, they want to see the feedback they have been given for recent pentests, so they know which areas to focus on.
But today, they only get the positive tags and need to reach out to the community team to get the constructive tags and comments.
#2
When the community team is reviewing the peer rating given to a pentester, they want the data to be consistent, so that they can be confident they are sharing accurate data during the review sessions.
But today, the score can mean different things depending on which pentester is doing the rating: some use a three as their baseline, others a four.
#3
When the community team is sharing feedback with a pentester, they want the ratings to be backed up with comments, so that there is context for what they are sharing.
But today, many pentesters leave little or no constructive feedback, even though a comment is mandatory for a rating of 3 or less.

Ideating with the team 

My PM and I had now gotten as far as we could on our own, and it was time to involve the team. Our latest addition, Mahmoud, mentioned that he had really enjoyed the way of working in another team, so I reached out to that team's designer for inspiration.

I facilitated three workshops:
Workshop #1
I started with an ideation session with my team (Feb 20th). After I explained the problem, the full team contributed ideas, which we discussed, grouped, and gave an effort estimation. Everyone on the team enjoyed this session and agreed we should work this way again.
[Image: Work session 1 - Team ideation and effort estimation]
Workshop #2
In the second session, we presented the ideas to the community team, who helped us estimate the potential value each one would have against the problems (Feb 21st).
[Image: Work session 2 - Community team value estimation]
Workshop #3
In the final session, the PM and I mapped the ideas and their estimations onto the value/effort chart and decided which ones to continue with (Feb 22).
[Image: Work session 3 - The PM and I decide the next steps]

Solutions

#1
Aggregate peer feedback to show trends (a sketch of these rules follows below):
  • We show both positive and improvement data
  • We only show feedback from the last 12 months
  • We only show a feedback tag if it has been selected 5 or more times
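To make the rules concrete, here is a minimal sketch of how that aggregation could work, reusing the hypothetical PeerFeedback shape from earlier; the constants and function name are my own assumptions, not the production code.

```typescript
// Count tags under the two visibility rules: only feedback from the last
// 12 months is considered, and a tag is only surfaced once it has been
// selected 5 or more times. Purely illustrative.
const WINDOW_MONTHS = 12;
const MIN_TAG_COUNT = 5;

function aggregateTags(
  entries: PeerFeedback[],
  pickTags: (e: PeerFeedback) => string[],
  now: Date = new Date(),
): Map<string, number> {
  const cutoff = new Date(now);
  cutoff.setMonth(cutoff.getMonth() - WINDOW_MONTHS);

  const counts = new Map<string, number>();
  for (const entry of entries) {
    if (entry.submittedAt.getTime() < cutoff.getTime()) continue; // too old
    for (const tag of pickTags(entry)) {
      counts.set(tag, (counts.get(tag) ?? 0) + 1);
    }
  }
  // Drop tags below the visibility threshold.
  return new Map([...counts].filter(([, count]) => count >= MIN_TAG_COUNT));
}

// One call per chart: positive and improvement trends are aggregated separately.
// const positive = aggregateTags(entries, (e) => e.positiveTags);
// const improvement = aggregateTags(entries, (e) => e.improvementTags);
```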
#2
Provide clarity to ratings. We explored two options (see the small illustration below):
  1. Replace star ratings with labels
  2. Show clarification text alongside the ratings
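As a small illustration of what either option boils down to, here is a lookup from star value to label text. The wording is invented for the example, not the copy we shipped.

```typescript
// Illustrative star-to-label mapping; the actual product copy may differ.
const RATING_LABELS: Record<1 | 2 | 3 | 4 | 5, string> = {
  1: "Well below expectations",
  2: "Below expectations",
  3: "Met expectations", // an explicit baseline, so a three means the same to everyone
  4: "Above expectations",
  5: "Exceptional",
};
```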
#3
Show a caption warning for comments that are less than 50 characters (a validation sketch follows below):
  • The comment field is already mandatory for ratings of 3 stars or less
  • We display a warning in the caption for comments under 50 characters, urging the user to type more
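A minimal sketch of that validation, assuming the same five-star scale as above; the function name, return shape, and caption copy are all illustrative.

```typescript
// Validate the free-text comment: mandatory for ratings of 3 stars or
// less, with a caption warning for comments under 50 characters.
const MIN_COMMENT_LENGTH = 50;

type CommentCheck =
  | { level: "error" | "warning"; caption: string }
  | { level: "ok" };

function checkComment(rating: number, comment: string): CommentCheck {
  const trimmed = comment.trim();
  if (rating <= 3 && trimmed.length === 0) {
    return {
      level: "error",
      caption: "A comment is required for ratings of 3 stars or less.",
    };
  }
  if (trimmed.length > 0 && trimmed.length < MIN_COMMENT_LENGTH) {
    return {
      level: "warning",
      caption: "Could you add a bit more detail? Short comments give your peer little to act on.",
    };
  }
  return { level: "ok" };
}
```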

Turning our ideas into screens

After exploring a few options in Figma, I shared them with the team and stakeholders for feedback. In the video below you can see part of that feedback call.
We made some small changes, for example to the colours used for the constructive feedback, and once we agreed on the designs I tidied up the doc and the PM created the tickets (Feb 27).
[Video: Async presentation of the screens to get feedback]

Delivery

The low-risk changes were released as soon as they were completed, but the new charts were first released to a small group of 7 pentesters who had requested changes in this direction. We got some good new insights and a generally positive response, so we went ahead and released to the full Core.
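A staged rollout like this often comes down to a simple allowlist behind a flag. A minimal sketch, assuming an in-app gate; none of these names come from Cobalt's actual release tooling.

```typescript
// Gate the new charts: visible to the pilot group first, then to everyone
// once the full-rollout flag is flipped. Illustrative only.
const PILOT_PENTESTER_IDS: ReadonlySet<string> = new Set<string>([
  // ...the pilot group's ids would be listed here
]);

function canSeeFeedbackCharts(pentesterId: string, fullRollout: boolean): boolean {
  return fullRollout || PILOT_PENTESTER_IDS.has(pentesterId);
}
```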

In the end, we increased monthly visitors to the feedback page by 21% and decreased the number of comments with less than 50 characters by 11%.
[Image: Final designs for showing peer feedback in the app]
[Image: Labeled star rating and caption hint for short comments]
[Image: Display when the pentester doesn't have enough feedback]
“This looks great and provides insightful data because after every finished pentest I go and check this section. Before this update I forgot what feedback I received on the previous pentest and I didn't know if the latest received feedback was really the "latest". With this update, I do not look at it that way anymore since now it changes the whole perspective, letting me focus on what I did the last 12 months.” 

In conclusion

Our collaborative effort at Cobalt to elevate peer feedback in the pentesting community resulted in substantial improvements. Addressing issues of limited feedback visibility, inconsistent ratings, and insufficient context, our solutions led to a 21% surge in monthly feedback page visits and an 11% decrease in short comments. Pentesters now benefit from a more insightful and coherent feedback experience, underscoring Cobalt's commitment to nurturing talent and community growth in the cybersecurity landscape.