Kelly Nakawatase
 

Auto-Close Marketing Emails


Expel

 

Context

Expel security analysts (Expel SOC) look at every email that customer users report as suspicious. However, roughly 30% of these reported emails are marketing emails or other sales outreach messages (“20% off!” / “Schedule time with me to chat about how great we are for your company!”). These benign emails take a significant amount of time to analyze and close out.

In an effort led by the data science team, Expel sought a way to automatically remove these emails from the SOC’s workflow using a “marketing score” machine learning model. If an email scored high enough, it would be removed from the analysts’ view automatically.

My role

Using our established styles and patterns, I was responsible for creating a visualization to represent this model, as well as understanding its implications within the analyst and customer workflows. I had the following questions, then worked on answering and designing for them:

  • How do we show this score to the analyst in a way that they will trust?

  • If we receive a submission and auto-close it, what do we tell the customer?

    • If the customer has questions for our analysts, how can the analyst talk to the customer about what happened if they never saw it?

  • If the marketing score is not high enough to be auto-closed, what do we do? Should we inform the analyst?

 
 

Process


Understanding the model

I sat down (virtually) with the data scientists (Elisabeth Weber and eventually Jane Hung) who created the model. They explained that the model worked based on a set of variables found in marketing emails.

Example of the scores and variables Elisabeth showed me.
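The model’s actual features are internal to Expel, but the idea Elisabeth described can be sketched. A minimal illustration, assuming a weighted combination of boolean signals commonly found in marketing email (the variable names and weights below are invented for illustration, not Expel’s):

```python
# Illustrative sketch only -- these feature names and weights are invented,
# not Expel's actual model.

# Hypothetical signals often present in marketing email, with made-up weights.
EXAMPLE_WEIGHTS = {
    "has_unsubscribe_link": 0.30,
    "bulk_mailer_headers": 0.25,
    "promotional_language": 0.25,
    "tracking_pixels": 0.20,
}

def marketing_score(features: dict) -> float:
    """Return a 0..1 marketing score from boolean feature flags."""
    return sum(w for name, w in EXAMPLE_WEIGHTS.items() if features.get(name))

score = marketing_score({
    "has_unsubscribe_link": True,
    "bulk_mailer_headers": True,
    "promotional_language": True,
    "tracking_pixels": False,
})
print(round(score, 2))  # 0.8
```

The key property for the design work was that each variable contributes an explainable amount to the final score, which is what makes it possible to show analysts *why* an email scored the way it did.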


Stakeholder Management & Research

To get an understanding of how we wanted this model to function for the analysts, I spoke to the following stakeholders:

  • Expel SOC (also a user)

  • SOC managers

  • Director of Expel SOC Operations

  • CTO

  • Product Marketing Manager (to see what we wanted to call Expel Data Science/Machine Learning initiatives)

In speaking to these stakeholders, I learned that the analysts would want to know the minutiae of what contributed to an email being auto-closed. The CTO was also adamant that Expel get credit for the work we were doing in automation.

I was also able to understand how the SOC would want to treat various thresholds of the marketing score.

The CTO and me chatting over Zoom.


Design Iteration

Things I kept in mind

  • The analysts would want to know which variables contributed to the scoring, and this would need explanation, but it wouldn’t likely need to be visible all the time

  • It needed to be clear why we were auto-closing something for both the customer and the analyst

  • Expel needed to be able to take credit for the work

  • The importance of the marketing score visual depends on the score itself: it is more important to know that an email is 98% marketing than that it’s 54% marketing

I worked closely with data science and engineering to redefine the variables of the model so analysts and customers could understand them, while also discussing the tradeoffs of displaying all variables versus a select few.

Various design iterations

 

Designs & Outcomes


 

Altogether, the feature has reduced overall analyst workload by 11%, a percentage that increases every quarter. With my design, I was able to answer the questions I posed above.

How do we show this score to the analyst in a way they will trust it?
We display the variables that contributed to the score and explain why those variables are important.

If we receive a submission and auto close it, what do we tell the customer?
The email has an automated comment, stating that Ruxie (Expel’s robot) determined the email was a marketing email.

If the customer has questions for our analysts, how can the analyst talk to the customer about what happened if they never saw it?
If the customer does not look at the email themselves, the analyst can look at it and find the marketing score, along with the combination of variables that determined that score.

If the marketing score is not high enough to be auto closed, what do we do? Should we inform the analyst?
Yes, this will surface in the alert details UI.
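The two outcomes above (auto-close versus analyst review) can be sketched as a simple threshold check. The cutoff value here is hypothetical; the actual thresholds the SOC settled on are not part of this write-up:

```python
# Hypothetical threshold -- the SOC's actual cutoff is not public.
AUTO_CLOSE_THRESHOLD = 0.95

def triage(score: float) -> str:
    """Route a reported email based on its marketing score."""
    if score >= AUTO_CLOSE_THRESHOLD:
        # Auto-closed: the customer sees an automated comment explaining
        # that the email was determined to be marketing.
        return "auto-close"
    # Below the threshold, the email stays in the analyst queue and the
    # score surfaces in the alert details UI.
    return "analyst-review"

print(triage(0.98))  # auto-close
print(triage(0.54))  # analyst-review
```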

 

Post Processing Rules

The data science team wanted to add additional guardrails to the model, called “Post Processing Rules.” The failure of any of these rules would fail the model and require an analyst to look at the email in question, even if the marketing score was 100%. I worked with the data science team, engineering, and the SOC analysts to determine how this should surface in the UI.
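The guardrail behavior can be sketched as follows. The rule names here are invented for illustration; the actual post-processing rules are Expel-internal:

```python
# Illustrative guardrails: any failing rule overrides the score, forcing
# analyst review even at a score of 1.0. Rule names are hypothetical,
# not Expel's actual Post Processing Rules.

def passes_post_processing(email: dict) -> bool:
    """Return True only if every guardrail rule passes."""
    rules = [
        lambda e: not e.get("contains_attachment", False),
        lambda e: not e.get("sender_recently_flagged", False),
    ]
    return all(rule(email) for rule in rules)

def should_auto_close(score: float, email: dict, threshold: float = 0.95) -> bool:
    # Even a 100% marketing score cannot auto-close if a guardrail fails.
    return score >= threshold and passes_post_processing(email)

print(should_auto_close(1.0, {"contains_attachment": True}))   # False
print(should_auto_close(1.0, {"contains_attachment": False}))  # True
```

The design implication is the same as in the UI work: the analyst needs to see not just the score, but which rule stopped an otherwise auto-closeable email.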

Because the engineering level of effort (LOE) was unknown, I provided an ideal version of the design and an acceptable, but less detailed, version.


Feature MVP

As with most designs, additional engineering constraints popped up during development. We could not identify which variables contributed most to the score without additional time and effort. I pared back the original design to remove this aspect, though as the feature matures there are plans to fully flesh out the design.