
WHAT IS PEER INSIGHTS?

Gartner Peer Insights helps enterprises choose IT solutions with confidence by providing software reviews they can trust. Every review is verified before publishing to ensure authentic insights from peers.

Who verifies these reviews? The Peer Insights Verification and Moderation (VNM) team

MY ROLE

TEAM

TOOLS

DURATION

ARM

A smart Review Moderation tool designed from scratch to reduce moderators' effort and increase efficiency and productivity.

Overview

UX Research
Market Research
UX/UI Design
Usability testing

Product
Design
Engineering
Review Moderation

Figma
Participatory design

Ongoing since May’21

Context

Reviews are rigorously vetted by Gartner to ensure there is no vendor bias, no hidden agendas, just the real voices of enterprise users.

The process includes verifying the reviewer’s identity, assessing whether a conflict of interest exists, and moderating the content. Gartner Peer Insights’ moderation team takes up to 3 business days to process a review and reach an approval or rejection decision.

Problem

Review moderation is entirely Excel-driven, and the review publishing cycle has slowed down significantly.

Goals & Success Metrics

Simplify process to save time

20% faster publishing cycle for reviews

Reduce moderator’s efforts, increase productivity


Task 1: Understand what & where are the process challenges

It was time to dive deep into the details and understand the key problems within the current moderation process. I collaborated with the team to perform:

1. Shadow Activities

2. Contextual inquiries

Research

RESEARCH: SHADOW ACTIVITY

Breaking down moderation process

Manual processing and consistency checks take almost 6-7 hours a day, making this a good area to solve first.

The current moderation process is so complex that solving for the entire journey seemed a long way off, so we decided to take baby steps and add value to at least one section of the process that is highly impactful for our business.


KEY TAKEAWAYS

  • Juggling 8 tools and 9 files per review

  • A series of manual steps (prone to error)

  • A lot of offline data enrichment

  • Lack of flexibility

  • Redundant/extra steps

  • Extremely time consuming

  • Process developed internally within the Moderation team


Overview of current Review moderation process

But wait... considering the above, if almost 7 hours a day go into manual moderation, when do moderators attend other calls, have lunch, or take tea breaks?

This question pushed us to take a closer look at how users actually process the reviews. Are there any makeshift workarounds?


Task 2: Understand User Pain points

While I was still working to understand every detail of this complex process, it was also important to understand the key challenges users face and their behaviors and attitudes.

1. More Contextual inquiry

2. Use case Identification

KEY USER INSIGHTS

Three behavioral patterns were identified while talking to each of the six Moderation team members

01/

Skim through spreadsheet data

This provides cues and patterns that help identify potential errors


02/

Batch processing for decision making

They prefer to save time by performing similar tasks together and batch-processing the final decision


03/

Seek help from team

Due to the complexity of the process, they often seek help when in doubt


KEY USER PROBLEMS

5 problematic areas were identified

01/

Time inconsistency

Often it takes more time than estimated, since some online details are difficult to find, for multiple reasons

02/

No real-time data updates

"Why do I have to recheck the same details if somebody else has already done it?"

03/

No centralised data sources

"Just to check one or two previous data points, I have to open 4 files and take multiple actions to find the desired information"

04/

No collaboration

"I couldn't find any previous notes, so as per my initial checks, I rejected the review..."

05/

Constant thought of missing steps

"I don’t want to commit mistakes, so I double-check the details, but this consumes more time"

Task 3: How could I translate and scope out these pain points into design opportunities for our MVP?

Before moving to the opportunity space, it was important for me to explore industry best practices for UGC moderation.

1. Literature Study

2. Market Research


MARKET RESEARCH & LITERATURE STUDY

I interacted with LinkedIn Trust team members as well as Gartner’s other publishing teams to understand their approach

KEY LEARNINGS 

  • Focus on automation to enhance productivity

  • Design for quick decision making

  • A lightweight system that's easy to maintain

  • Eliminate biased decisions and differences

  • Eliminate extra efforts

  • Empathise with the moderator’s work


Also, as per "Use of AI in Online Content Moderation" by Cambridge Consultants:

1.

Automated systems will be key for future online content moderation

2.

Human input will be required to augment AI systems for the foreseeable future

Design

With that in mind, the next step was to define the Problem statement and scope opportunities

HMW provide easy access to all relevant information, reduce the effort and complexity of steps, and ensure the best moderation quality?

IDEATION

Persona

To put my biases and assumptions aside, I created primary and secondary personas to remind myself who I was essentially designing for and what their frustrations were.


IDEATION

Design Opportunities
& Ideas

How can we reduce process complexity?

We generated ideas based on the identified pain points as well as the learnings from user and market research

RESEARCH FINDINGS & PAIN POINTS

OPPORTUNITIES & IDEAS


IDEATION

Design Opportunities
& Ideas

How can we provide one point access to relevant information?

Once we identified what information would be relevant to users, we started segregating it and clubbing similar items together.


It was time to design


As everyone says, don’t marry your first design idea... WELL... not even my users did! Yay!

IDEATION

Initial Wireframes
(Low fidelity)

I made multiple versions of the design, starting with user flows and then adding conceptual features, so that we could ask for quick feedback

We wanted to get feedback on

  • Overall navigation of the process 

  • Content validation

  • Usability of the design

  • Any other requirements from their side


KEY USER FEEDBACK

Positive

  • Overall navigation is pretty clear

  • Happy to see rejection flags upfront

  • Good to see clear filters on top

  • More information and past notes on the same screen helps save clicks and time

Critical

  • Users wanted action items in closer proximity

  • They did not want to mix different review tiers together

  • They did not want modals for actions (they would have to open and close them every time)

  • They wanted as few clicks as possible

  • They wanted to see more red flags

"It would have been great if I could just click on the name and see the entire history of the reviewer, so I can catch suspicion..."

"The 'add comment' seems to be too far; every time I will have to drag my mouse from the details to the add comments section individually..."

"I would want to see the reviewer name and job title together, to quickly copy and paste them into Google for a profile search..."

Final Design

Based on the key findings we synthesised through affinity diagramming, we iterated on the design and upgraded the prototype from mid-fidelity to high-fidelity


INTRODUCING...

Gartner Peer Insights ARM (Automated Review Moderation tool)

A one-point-access tool designed to make review moderation faster and significantly increase moderators' productivity

A  R  M


BREAKDOWN

1. Curating data (a huge amount of it) on one screen was a big challenge, and I wanted to utilize maximum screen real estate so that the layout would enable:

1.

Limited toggling between screens

2.

A quick online & offline search experience

3.

Instant decision making and updates

4.

Reduced errors and ensured consistency

  • Merged decisions and comments (fewer clicks, quicker decisions)

  • Flags as visual cues (reduced cognitive load)

  • Quick links (no more copy-pasting)

  • Batch-level processing

  • Grouped similar checks

  • Navigation (flexibility and information segregation)

  • Binary filters with counts (faster slicing of information)

  • Soft nudges (avoid errors and provide next steps)

  • Quick real-time information to refer to (fewer clicks)

  • Data capture and updates (consistent & informed decisions)

2. Ensuring review moderation quality was another big challenge to solve. The idea was to:

1.

Reduce decision bias among users

2.

Provide review context & sentiment

3.

Consumable & scannable content 

  • Go through history without losing context

  • Review relevance score (easier to identify potential errors)

  • Matching keywords (easier to navigate through relevant content within the review)

  • Sentiment analysis (to indicate the review's context and sentiment)

  • Identification of any confidential data input (easy to find and mask it)

  • Consumable review format (skim through content and make edits wherever necessary)
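
To make the keyword-matching and confidential-data checks above concrete, here is a minimal sketch in Python. It is purely illustrative: the keyword list, regex patterns, and function names are my own assumptions for demonstration, not the actual ARM implementation.

```python
import re

# Hypothetical keywords a moderator might want highlighted within a review.
KEYWORDS = ["pricing", "support", "integration"]

# Simple patterns for confidential data that should be masked before publishing.
CONFIDENTIAL_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),   # email addresses
    re.compile(r"\+?\d[\d\s().-]{7,}\d"),     # phone-like number runs
]

def matching_keywords(review_text: str, keywords=KEYWORDS) -> list:
    """Return the keywords that appear in the review (case-insensitive)."""
    lowered = review_text.lower()
    return [kw for kw in keywords if kw.lower() in lowered]

def mask_confidential(review_text: str) -> str:
    """Replace anything that looks like confidential data with a mask."""
    for pattern in CONFIDENTIAL_PATTERNS:
        review_text = pattern.sub("[REDACTED]", review_text)
    return review_text

review = "Great support team. Contact me at jane.doe@example.com for details."
print(matching_keywords(review))   # ['support']
print(mask_confidential(review))   # email replaced with [REDACTED]
```

In a real moderation tool these checks would run server-side against the review text, with the matched keywords highlighted in the UI and masked spans surfaced for the moderator to confirm.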

POTENTIAL IMPACT

The shift from Excel to a smart tool for manual moderation is expected to yield a 5% increase in productivity (on a quarterly basis).

"I wasn't sure about how can all these excel driven actions to be done on just one tool, but this seems to be very exciting and promising, I am looking forward to using it asap...when is it happening?" 

- Saumya Sharma, Product Ops Associate

Wait... there's more!

Can this help us work on the review quality as well?

We can leverage some smart rejection reasons to potentially “nudge” reviewers about best practices for writing reviews and increase their chances of approval


REFLECTION

Key learnings from the project

Wow! You get paid to watch videos and read posts?

That's something I heard from many of the people who gave me contacts working in UGC moderation. And no, it's not a "wow" job. The market research helped me understand and empathize with the people who endure mental stress and trauma from consuming all sorts of triggering content. They have to watch everything so that we don't have to!

Thinking small but impactful changes

I learned that taking baby steps can eventually help me grow while understanding the project inside and out. Process complexities like this can be very overwhelming, but the idea is to break them into multiple chunks and solve for the most impactful area.

Ideas can come from anybody

As a designer, I also learnt that it's important to align every team member with the problem as well as the thinking behind it. This drives more interest among others, and I have seen people call me up just to share their ideas. When I was stuck, those ideas were enough to ignite the next steps. :)

Thanks for reading! :)

Don't be shy, say hi!


Shoot me an email at nidhimdes17@gmail.com or find me on

  • LinkedIn
  • Instagram

©2021 Design by Nidhi Kumari | All rights reserved
