How to Forecast
A CS leader’s guide to forecasting in today’s revenue-driven world
TL;DR — What This Post Covers
Forecasting in Customer Success wasn’t always required — but today, it’s non-negotiable. In this post, I walk through:
Why CS didn’t forecast 8+ years ago (and why that changed)
How we first forecasted using a simple spreadsheet
How forecasting evolved into Salesforce and Gong
Where things got messy (carryovers, non-payment, win-backs, product swaps)
Best practices I learned the hard way
What CS leaders and ICs should understand about forecasting today
If you’ve ever felt like forecasting was more complicated than it needed to be — or wondered if everyone else was struggling too — you’re not alone.
Why CS Didn’t Have to Forecast (And Why That Changed)
Eight years ago, most Customer Success teams didn’t forecast.
Not because we didn’t care about revenue — but because we didn’t have to.
There was so much new business coming in the door that as long as churn wasn’t egregious, things worked out. Retention was “good enough.” Expansion happened organically. And CS looked at numbers in retrospect, not ahead of time.
At the end of the quarter, we’d say things like:
“That was a good quarter. Lots of new business and we retained most of our revenue. Great.”
That world doesn’t exist anymore.
Why Forecasting Is Non-Negotiable Today
When COVID hit, everything changed.
Customers’ budgets tightened. Buying decisions slowed. New business dropped. Suddenly, leadership wanted answers CS hadn’t been expected to give before:
How much revenue are we actually going to retain this quarter?
What’s at risk?
What’s likely to expand?
Where should we intervene now to change the outcome?
This is when forecasting became mandatory — not just for Finance or Sales, but for Customer Success.
And this is also when CS became undeniably tied to company revenue.
We were always tied to revenue — recurring revenue is the lifeline of a SaaS business — but forecasting forced CS to own revenue outcomes explicitly, not just relationships and health scores.
How We First Forecasted: Start Simple
Our first forecast wasn’t fancy.
It was a spreadsheet.
For each renewal in the quarter, we tracked:
Account name
Revenue up for renewal
A health score
A prediction: renew, downgrade, expand, or churn
A column for expected dollar change (upgrade or contraction)
We then used a simple formula to calculate the forecasted renewal amount.
For example:
$30k up for renewal
We predicted a $10k contraction
The formula calculated $20k as the forecasted renewal amount
If we predicted an expansion, that amount was added. If we predicted a downgrade, it was subtracted.
We totaled everything and calculated GRR and NRR.
Simple.
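The spreadsheet logic above can be sketched in a few lines of Python. The account names and dollar figures are illustrative (only the $30k / −$10k / $20k example comes from the text); the GRR/NRR definitions are the standard ones, with GRR capping each account at its starting ARR so expansion doesn't inflate it:

```python
# Each renewal row: ARR up for renewal plus a predicted dollar change
# (negative = contraction or churn, positive = expansion).
renewals = [
    {"account": "Acme",     "arr": 30_000, "change": -10_000},  # the $30k → $20k example
    {"account": "Globex",   "arr": 50_000, "change": 0},        # flat renewal
    {"account": "Initech",  "arr": 20_000, "change": -20_000},  # full churn
    {"account": "Umbrella", "arr": 40_000, "change": 15_000},   # expansion
]

starting_arr = sum(r["arr"] for r in renewals)

# Forecasted renewal amount = ARR up for renewal + predicted change
forecast = sum(r["arr"] + r["change"] for r in renewals)

# GRR excludes expansion (each account capped at its starting ARR);
# NRR includes it.
grr = sum(min(r["arr"] + r["change"], r["arr"]) for r in renewals) / starting_arr
nrr = forecast / starting_arr

print(f"GRR: {grr:.0%}, NRR: {nrr:.0%}")
```

Totaling the column this way is exactly the "simple formula" step: each row's forecast is just ARR plus the signed expected change.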
Moving Forecasting Into Salesforce
As forecasting became more important, spreadsheets stopped scaling.
Salesforce became the most natural place to forecast.
Every renewal was created as an opportunity and moved through the same stages as new business. Each stage had a probability attached.
For example:
Qualified → 50% probability
Pending Signature → 90% probability
That probability was applied to the renewal ARR, giving us a weighted forecast.
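The weighted-forecast math is straightforward to sketch. The stage probabilities below are the two mentioned above; the accounts and ARR figures are made up:

```python
# Stage → probability, mirroring the Salesforce stage mapping
STAGE_PROBABILITY = {
    "Qualified": 0.50,
    "Pending Signature": 0.90,
}

open_renewals = [
    {"account": "Acme",   "arr": 30_000, "stage": "Qualified"},
    {"account": "Globex", "arr": 50_000, "stage": "Pending Signature"},
]

# Weighted forecast = sum of (renewal ARR × stage probability)
weighted_forecast = sum(
    r["arr"] * STAGE_PROBABILITY[r["stage"]] for r in open_renewals
)
print(f"Weighted forecast: ${weighted_forecast:,.0f}")
```

Here $30k at 50% contributes $15k and $50k at 90% contributes $45k, so the weighted forecast is $60k even though $80k is up for renewal.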
CSMs were responsible for:
Adding notes about risks and next steps
Flagging potential expansions
Logging expected downgrades
Expansion and contraction fields were structured so product selection auto-populated ARR, keeping the math consistent.
Where Gong Came In
We fed Salesforce data into Gong’s Deal Board, which gave us a consolidated view of:
ARR up for renewal
Forecasted churn
Forecasted expansion
Salesforce stage and probability
Health score
Touchpoints and recency
CSM notes and next steps
Salesforce was our system of record, but CSMs mostly worked out of Gong.
Not all fields were bi-directional. Core forecast and revenue fields lived in Salesforce, while Gong surfaced that data alongside activity, conversation insights, and CSM inputs. Some updates synced back to Salesforce, but Salesforce remained the source of truth.
In practice, CSMs worked primarily in Gong unless they were executing a contract, in which case they worked directly in Salesforce.
This setup allowed us to forecast in Salesforce while using Gong to validate assumptions, highlight risk, and understand whether customer activity supported what we were predicting.
Where Forecasting Got Complicated (And Why You’re Not Alone)
This is the part no one talks about enough.
Renewals That Didn’t Close on Time
Not all renewals closed in the quarter they were due.
A renewal might be up in Q1 but not close until Q2.
That raised uncomfortable questions:
Do we go back and change Q1 numbers?
Or do we roll the renewal into Q2?
I’ve seen both approaches.
Personally, I strongly prefer rolling renewals forward. Changing numbers already communicated to the board is risky and makes the company look like it doesn’t understand its own numbers.
Once a quarter is closed, it should stay closed.
Customers Who Said They’d Renew — Then Didn’t Pay
Some customers told us they were renewing, but the revenue was never recognized.
We chased them for a quarter or two before eventually writing the revenue off as churn.

Again, the question came up:
Is this churn in the original renewal quarter?
Or do we change historical numbers?
Best practice here is simple:
If revenue isn’t recognized, it’s churn — recorded in the period the decision is made to write it off.
Win-Backs
Sometimes customers churned and then came back within a short window.
We were tempted to erase the churn.
But rewriting history creates more problems than it solves.
Best practice:
Record the churn when it happens
Treat the win-back as new or reactivated revenue in the period it closes
Product Swaps
This one comes up constantly.
A customer drops one product and picks up another. We counted the dropped product as churn and the added product as expansion.
The result:
Churn looked worse than reality
Expansion looked better than reality
Best practice:
At the account level, this is neutral
At the product level, it’s contraction and expansion
Be explicit about which lens you’re reporting through and apply the rule consistently.
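One way to make the two lenses concrete, using hypothetical numbers for an even swap:

```python
# A customer drops Product A ($20k ARR) and adds Product B ($20k ARR).
dropped_arr = 20_000
added_arr = 20_000

# Product-level lens: both movements are recorded separately,
# which is why churn and expansion each look overstated.
product_contraction = dropped_arr
product_expansion = added_arr

# Account-level lens: only the net change matters.
account_net_change = added_arr - dropped_arr

print(product_contraction, product_expansion, account_net_change)
```

With an even swap, the product lens reports $20k of contraction and $20k of expansion while the account lens reports zero net change; both are true, which is why stating the lens up front matters.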
Weekly Forecasting and Executive Roll-Ups
All of the data flowed into a spreadsheet that automatically calculated GRR and NRR.
We forecasted entire quarters out, and the expectation was that CSMs updated their forecasts weekly. We pulled the forecast at exactly the same time every week and kept historical versions.
This allowed us to see whether we were closing the gap to target or drifting further away — that is, forecasting more churn week over week.
Every Monday, I rolled up the GRR and NRR forecast to the executive team and shared whether it was higher or lower than the previous week and why.
I also walked them through:
The largest accounts up for renewal
The biggest risks
How we were mitigating those risks
Next steps for expansion on the largest accounts or where we had the greatest opportunity
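The week-over-week comparison in the Monday roll-up can be sketched like this; the snapshot dates and GRR/NRR figures are made up:

```python
# Weekly forecast snapshots, pulled at the same time every week
snapshots = {
    "2024-01-08": {"grr": 0.91, "nrr": 1.02},
    "2024-01-15": {"grr": 0.89, "nrr": 1.01},
}

# Compare the latest snapshot to the prior week's
weeks = sorted(snapshots)
current, previous = snapshots[weeks[-1]], snapshots[weeks[-2]]

grr_delta = current["grr"] - previous["grr"]
direction = "down" if grr_delta < 0 else "up"
print(f"GRR moved {direction} {abs(grr_delta):.1%} week over week")
```

Keeping every historical snapshot, rather than overwriting one live number, is what makes the "higher or lower than last week, and why" conversation possible.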
What I Learned About Forecasting
After doing this multiple times, here’s what actually matters:
Salesforce must be the system of record
Gong should validate forecasts, not replace them
Separate facts from judgment
Forecasting is forward-looking, reporting is historical
Consistency matters more than precision
Final Thought
Forecasting isn’t about being right.
It’s about being early enough to change the outcome.
If you’ve struggled with messy renewals, confusing churn math, or forecasts that felt more political than factual — you’re not alone.
I hope this post gave you some ideas for how to tackle this head on.


