Measuring & Tracking SE Teams – Solved?

Apologies for the delay in posting. It is amazing how much time passes when you are having fun, and there is nothing more fun than watching your new team start to come together and perform. I have been in my new role/company for about six months, and it is time to share a few lessons learned on the most viewed/shared topic we have had here on SEthoughts.com: Measuring & Tracking Sales Engineering.

In the previous post I reflected on the two types of tracking SE Leaders implement:

  1. Activity Tracking
  2. Outcome Tracking

When I joined my current company it had implemented Activity Tracking – an approach I have struggled with.  Fortunately, the cross-functional teams (Sales Ops, Sales, Product Management, Finance, and Support) were supportive of taking a crack at simplifying and improving the approach (see Give – Get: Executing a 90 Day Plan).

I am happy to say that for the first time ever I am experiencing an SE Metrics & Tracking approach that is lightweight, automated where possible, and provides value first and foremost to the SE and Regional Sales Managers (RSM) and secondarily to the leadership teams.  I will detail the previous approach and how we transitioned to:

  1. Technical Sales Process –> Outcome based tracking
  2. Proof of Concept/Evaluation Approach –> Simplified & Consistent
  3. Reporting –> outcome based and aligned to the Sales Process
  4. SFDC Hygiene –> how to make it consistent and accurate 

Previous Approach

Tracking

  • Significant Customer Interactions (SCI): for every customer meeting, the SE, Sales Rep, or other employee would create a sub-page off of the Salesforce.com opportunity and document the details of the interaction.
  • POC Inline SCI: if a ‘production’ POC was implemented by the SE for a customer, a special SCI format was used.  This format had additional sub-pages for every technical stage a POC went through, i.e. config, basic testing, advanced testing, etc.  If an engagement didn’t involve a production POC (demo, lightweight POC, design), the teams just hacked this format to try to make it fit.

Key Metrics:

  • X # of POCs per SE per Quarter
  • 8 in-person SCIs per SE per week
  • POCs shut off after 30 days (manual extensions of 15 days at a time)

Reporting

  • To SE Leadership:
    • I eventually found a dashboard that had: # of opportunities per SE, # of POCs, amount of traffic per POC, etc.
  • To Sales Leadership:
    • Amount of Pipeline $ in Sales Stage 2-6 (POC Configuration in Process, Basic Testing, Advanced Testing, POC Stalled, Technical Win)

The main challenge with the existing approach was that it was built around the concept of a POC (an action) rather than a Technical Win (an outcome).  Furthermore, the technical progress of a deal (POC Inline) was buried in sub-pages off of the Opportunity record and mixed in with the rest of the sales rep’s SCIs.  It would take about 12 clicks just to find the page with the technical information.  As a sales leader you had no idea (or no patience) how to find this information, and even if you did, there was no way you would click through every opportunity looking for it.  The result was that sales leaders had a massive amount of pipeline ‘trapped in Sales Stages 2 through 6’ and no idea what was needed to move it forward to Sales Stage 7, Negotiation & Legal.

Evolved Approach

We decided to simplify not only the Technical Sales Process (what the SE does) but also the Sales Process (what the sales rep does).

[Screenshot: Example of Technical Stage 1 Documentation]

The Solution:

  • Outcome Based: the SE owns the Technical Win for an Opportunity via these steps:
    1. Discovery & Architecture Workshop: output is typically a joint Tech Validation (POC) Plan
    2. Tech Validation Plan: what is the jointly created plan with the customer to prove out a technical win?  Could be a POC, Design Session, or just a demo.
    3. Progress: what is the adoption by the customer of the plan?
    4. Findings Report: a summary presentation of the Tech Validation shared at a customer exec meeting with economic sponsor.
    5. Technical Win/Loss/Stalled: confirmation or proof from the customer that we are (or are not) technically superior.  If we do not have confirmation but there are no more outstanding technical actions, we consider it stalled.
  • Cross-Functional Outcomes: what are the outcomes cross-functional teams need? (A field sketch follows this list.)
    1. Technical Close Date: just as a sales rep forecasts when their deal will close, the SE should forecast when they will achieve a Technical Win.  All the rules that apply to a rep setting/changing a Sales Close Date apply to this one as well.
    2. Technical Sales Stage: a drop-down menu summarizing the outcomes above and which Technical Step of the sale we are in.
    3. Simplify Sales Stages: remove all duplicate stages from the sales stages (Stages 2–6) and rely on the SE owning all progress in Sales Stage 2 (Technical Validation)
      [Screenshot: Simplified Sales Stages]
    4. Provisioning Requests: any outstanding evaluation or production pilot should be displayed inline.  Furthermore, all customer usage metrics should be pulled from our cloud and embedded into the view automatically.
    5. Technical Next Steps: A short note updated frequently on the progress, next steps, and actions needed to get to Tech Win.
    6. Post Sales Handoff Date: if this deal closed, when was the design and information handed off to the post-sales implementation and support teams?
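
To make those cross-functional outcomes concrete, here is a minimal sketch of how they could live as fields on the Opportunity and be kept current by the SE.  The field API names (Technical_Close_Date__c, Technical_Sales_Stage__c, etc.) and credentials are illustrative assumptions, not our actual SFDC schema; the point is simply that each outcome maps to a single field on the record.

```python
# Hypothetical sketch: writing the SE-owned outcomes to an Opportunity via simple-salesforce.
# Field API names are illustrative placeholders, not the real schema.
from simple_salesforce import Salesforce

sf = Salesforce(username="se@example.com", password="...", security_token="...")

opportunity_id = "0061a00000XXXXX"  # placeholder Opportunity Id

sf.Opportunity.update(opportunity_id, {
    "Technical_Close_Date__c": "2016-12-15",                    # forecasted Technical Win date
    "Technical_Sales_Stage__c": "Tech Validation in Progress",  # drop-down summarizing the steps above
    "Technical_Next_Steps__c": "Customer running advanced test cases; findings review next week",
    "Post_Sales_Handoff_Date__c": None,                         # set once the deal closes and handoff happens
})
```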

The Result:

The key outcomes detailed above were ‘promoted’ to a section on the Salesforce Opportunity record, right next to the Sales Rep’s content.  This way any employee in the company can view an opportunity and see a complete summary of the deal’s progress, and if they want more detail they can click through to the documents themselves.  The only field the SE has to update on a frequent basis is Tech Validation Next Steps; all other progress, notes, etc. should be reflected in the Validation Plan, Findings Report, or cloud instance.

[Screenshot: SE outcomes section on the Salesforce Opportunity record]

Customer Verifiable Outcomes

An outcome only matters if it impacts a customer, so SE effort should be spent on customer-facing work.  If leadership wants to track for accountability/progress, we should tap into those customer-facing feeds.

  1. Discovery & Architecture Workshop: when we meet with a customer to create a design, the meeting should follow a standard agenda.  These agendas usually follow the format of 1. You show me yours (current environment, challenges, business drivers, requirements); 2. We show you ours (platform capabilities mapped to requirements, differentiation, proposed solution); 3. The formal output should be a co-authored Tech Validation (POC) Plan.
  2. Tech Validation Plan: what is the agreed plan with the customer to prove out a technical win?  This could be a POC, Design Session, or just a demo.  A template details key stakeholders, support, success criteria, scope, etc.  The document is created in Google Drive and shared with the customer.  We can verify customer engagement by viewing the Google Doc, confirming it is shared with them, and seeing how much the customer has edited/contributed to it.  Google Docs also has change control built in, so you can manage scope creep. (A sketch of this kind of verification follows this list.)
  3. Progress: is the customer using the platform, how many users, how much traffic, etc.?  No need to ask the SE when you can pull the usage data directly from our cloud (see the provisioning request summary above).
  4. Findings Report: before we agree to do a Tech Validation, the customer must commit to a scheduled exec review meeting where the economic buyer and technical decision maker are present.  This meeting is scheduled in Google Calendar and visible to all.
  5. Technical Forecast: key aspects of a sales forecast are close date, stage, next steps, and linearity.  So why not do the exact same thing for tracking Technical Win progress?  After all, the best predictor of a sale and sales linearity is whether or not we have the technical win in place.  Below is an elegant view of a quarterly forecast covering both sales and technical.
  [Screenshot: quarterly forecast view covering both sales and technical]
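
As one example of tapping these customer-verifiable feeds, below is a rough sketch of checking whether a Tech Validation Plan in Google Drive is actually shared with the customer and when it was last edited.  The service-account credential, file ID, and customer domain are placeholders; this illustrates the idea, not our production tooling.

```python
# Illustrative sketch: verify customer engagement with a Tech Validation Plan in Google Drive.
# Assumes a service account with read access to the doc; file ID and domain are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]
creds = service_account.Credentials.from_service_account_file("service-account.json", scopes=SCOPES)
drive = build("drive", "v3", credentials=creds)

plan_file_id = "1AbCdEf..."        # placeholder: the Tech Validation Plan document
customer_domain = "customer.com"   # placeholder: the customer's email domain

# Is the plan shared with anyone at the customer?
perms = drive.permissions().list(
    fileId=plan_file_id, fields="permissions(emailAddress,role)"
).execute()
shared_with_customer = any(
    p.get("emailAddress", "").endswith("@" + customer_domain)
    for p in perms.get("permissions", [])
)

# When was the document last revised?
revs = drive.revisions().list(
    fileId=plan_file_id, fields="revisions(modifiedTime)"
).execute()
last_edit = max((r["modifiedTime"] for r in revs.get("revisions", [])), default=None)

print(f"Shared with customer: {shared_with_customer}, last revision: {last_edit}")
```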

Automation, Simplification, Consistency of Execution

As previously detailed in my posts on Sales Enablement and Metrics, if you want to change behavior your best approach is to first automate it so it takes no change in behavior, then drastically simplify as much as possible, and finally use a big stick (and a carrot).  Below are some techniques we leveraged to ease the transition and ensure consistency.

  • Emails Automatically Logged: I don’t quite know how Salesforce does this, but I believe it is some form of Google Mail integration.  Every email sent to or from a customer is automatically tracked as an activity.  No need to keep customer communication secret when you have a clear code of business conduct.
    [Screenshot: automatically logged email activity]
  • Technical Validation Plan: when I first joined the team an excellent POC Plan was shared with me.  I emailed the team and asked for the source template, and I received seven completely different versions back.  So much for not reinventing the wheel every time.  To create efficiency, one of our Aspiring People Leader SEs volunteered to consolidate and simplify all of the plans.  He did an excellent job and even expanded the plan to include information needed by the Product Management team along with pre-populated test cases.  These plans need constant evolution, as every new product or feature will often result in separate ‘splinter’ plans.  It is the SE Leader’s role to show the value of a single plan to all of the cross-functional stakeholders.
  • Provisioning via SKU: provisioning for POCs was previously handled by about five different processes, including SFDC and Google Form questionnaires, and was often done via free-text descriptions of what should be enabled.  Instead, we transitioned to a single format driven via SFDC and tied to our official SKUs.  If something custom was needed on the backend, we removed that requirement from the SE and pushed it to a sales ops or support team; those teams are much more process-driven than a Sales team.
    [Screenshot: SKU-driven provisioning request in SFDC]
  • Evaluations that don’t need to be extended: previously every POC was provisioned for only 30 days, and if an SE wanted to extend it, extensions were granted 15 days at a time.  An extension required a support ticket, the order management team, sales ops approval, and Sales leader approval.  After doing some research I found that the average length of a POC in the Enterprise segment was 76 days.  That meant at least four extensions were required for every customer, which added up to 20+ touches.  Instead we decided to manage top down rather than bottom up: rather than extending licenses, we set them by default to 365 days, then implemented a top-down management and tracking approach via SFDC that requires SE Director justification for any POC extending past the committed technical close date and the timeline defined in the Technical Validation Plan.  This reduced support cases and touches by the thousands and let us reinvest that time in customer support.
  • POC Usage Data: how often have we done POCs with hardware and had the customer tell us “oh yeah, we are testing and it is going great,” only to find out months later that one person tinkered with it occasionally?  Instead of having to ask the customer or SE how the POC is going, we made it so we could view usage statistics right in SFDC on the opportunity.  We can now see whether one person or 20,000 people are using it.
    [Screenshot: POC usage data displayed on the opportunity]
  • SFDC Hygiene: how do we ensure consistency of execution for things like the Technical Next Steps or POC Plans?  We run an automated check on a recurring basis that notifies the SE and SE Leader about anything that is missing or out of date (a rough sketch of that kind of check is below).
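
For flavor, here is a minimal sketch of what such a recurring hygiene check could look like: a SOQL query for Technical Validation opportunities missing the key SE fields, followed by a notification.  The field API names and the notification step are illustrative assumptions, not our exact implementation.

```python
# Illustrative hygiene check: flag Technical Validation opportunities missing SE-owned fields.
# Field API names (Technical_Close_Date__c, Technical_Next_Steps__c) are hypothetical.
from simple_salesforce import Salesforce

sf = Salesforce(username="ops@example.com", password="...", security_token="...")

soql = """
    SELECT Id, Name, Owner.Email, Technical_Close_Date__c, Technical_Next_Steps__c
    FROM Opportunity
    WHERE StageName = 'Technical Validation'
      AND (Technical_Close_Date__c = NULL OR Technical_Next_Steps__c = NULL)
"""

for opp in sf.query_all(soql)["records"]:
    # In practice this would email or Slack the SE and their SE Leader; print keeps the sketch simple.
    print(f"Hygiene gap on {opp['Name']}: notify {opp['Owner']['Email']}")
```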

Report Card

How did we do?  Below are the key items and criteria we laid out in the original post.  Personally I am finding I can satisfy all of these criteria with these few fields and techniques our team applied in SFDC.  A big thank you to our SE, Sales Ops, and Finance teams who made this possible and effective.  

  1. Justification of SE Team
  2. Individual SE Performance 
  3. Business Insight & Operations


  1. Business Value: Provide value back to both the SE and Management on a weekly/monthly/qtrly basis
  2. Accurate: Be accurate via mandatory completion and not manager inspection/honor system
  3. Consistent: Track as much via automation as possible otherwise have it built into SE workflow

 

Parting words…

I apologize if the format or grammar for this post is subpar this time.  I cashed in some frequent flyer miles and Hilton points and am taking a few days off to scout our next SE Summit.  This post was made possible by an infinity pool in Mexico, a waterproof iPhone 7 Plus, and a margarita.  Enjoy your work and your life!
