Your end goal of post-event analysis should be the ability to accurately calculate incremental volume, incremental revenue, incremental profit, and return on investment. These event metrics measure promotional effectiveness and efficiency.
A few definitions before we get started:
- Incremental: We mean volume, revenue, and profit above and beyond base expectations.
- Base: For the purposes of post-event analysis, we define base as what you would expect in the absence of a promotion.
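As a minimal sketch of those definitions (all numbers below are made up for illustration), incremental volume is simply actual consumption minus the base expectation, summed over the event weeks:

```python
# Hypothetical weekly numbers for a 4-week event window.
actual = [120, 310, 280, 115]   # actual weekly consumption
base   = [118, 112, 110, 116]   # modeled non-promoted expectation (the "base")

# Incremental = volume above and beyond base expectations.
incremental_volume = sum(a - b for a, b in zip(actual, base))
print(incremental_volume)  # → 369
```

Everything that follows — incremental revenue, incremental profit, ROI — builds on getting that base right.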
5 common challenges to post-event analysis:
- The Right Data to Determine Incremental
- An Accurate Baseline
- Data Anomalies
- Correcting Data Sources @ Account and Product Level
- Trade Spend Truths
The concepts seem pretty straightforward. You look at those measures and they don’t seem that complicated. But I’ll tell you, based on my experience, most challenges are data related, and process deficiencies are often the cause of those data challenges. ✭ Watch the webinar: 5 Challenges to Meaningful Post-Event Analysis
#1: The Right Data to Determine Incremental
What data determines incremental? Below is a weekly shipment graph. This is what you’re shipping to the customer on a weekly basis. You can see that every 4-6 weeks you’re shipping product in. Some spikes are higher than others, but it’s a fairly typical shipment pattern. So, I’ve got some questions for you about that:
Can you determine a non-promoted baseline from this data?
- The answer is probably no. To complicate things further, if most of your promotions are scan promotions, there’s nothing in the shipment data that tells you whether volume is promoted or not. In most categories, shipment data is not going to be able to determine a non-promoted baseline.
Can you tell what was actually consumed versus warehouse loading?
- No. That’s another problem.
How does diverting impact your analysis?
- You have no idea where this product is going. Identifying things like diverting adds value to the shipment data, but it doesn’t help determine our base.
- Indirect: The other thing to consider with shipment data is that a lot of your large customers are indirect, and you’re not going to have any shipments into those indirect accounts.
- DSD: Now there’s one exception where shipment data will actually provide a pretty good view of what’s happening in terms of base and incremental and that is if you’re fortunate enough to be in DSD categories. We have a few clients who are in DSD categories – salty snacks for example – there you can get a pretty good sense for what your base and incremental are from the delivery data.
This next graph shows POS data. Consumption data is typically from a syndicated supplier like Nielsen or IRI. Each bar represents a week of consumption, a week of scanner sales.
The gray bars are weeks where there’s no promoted sales volume. There’s no promotion or in-store merchandising support. There’s no pricing action.
The colored bars represent sales with merchandising support, whether it be ad, display, TPR, etc. and we can also see price deviations.
There’s a lot of information here.
Let’s go through some questions:
Does this provide a good estimate of the volume that you would expect in the absence of promotion?
- Yes. This view lets us determine that.
Does this provide a clear view of total consumption from which we can calculate incremental?
- Yes. Each bar is total weekly consumption, so once we have a baseline we can calculate incremental as the volume above it.
What other insights does this POS chart provide?
- Syndicated POS data works for this type of analysis. You can use customer POS data — it gives you volume, but it won’t give you merchandising. There are some additional sources of data, but for the most robust analysis, syndicated POS data is really where we go.
#2: An Accurate Baseline
Does using syndicated consumption data solve all of your baseline challenges? The short answer is no. This alone doesn’t solve your problem. There are a lot of different factors that can affect baselines.
So what we’re looking at here — the black line — is the baseline the syndicated supplier generated. You’ll see things like this: during a promotion, base volume is going up. So if I were to analyze this event, is this where I want to measure incremental from?
We need a baseline that represents what we would expect in the absence of a promotion.
This baseline probably isn’t going to do it. Just having the syndicated data alone does not get us what we need in terms of determining the correct baseline.
Now, what we’re looking at here is what I would call a more ideal baseline. Does the dotted line here represent our definition of base?
Baselines aren’t perfect, but they’re modeled to be a pretty good representation of what I should expect in the absence of a promotion. And even when there’s a pretty dramatic swing in base consumption it’s adapting very quickly.
One thing to remember is that with baselines, it isn’t always one size fits all. So when you build your process, you have to keep that in mind.
The baseline methodology needs to be able to handle a variety of different consumption patterns that you have within your portfolio. All baselines are some form of a moving average. There’s a lot of logic beyond that, but it’s some form of a moving average. The greater the number of weeks included in that moving average, the longer it’s going to take for it to respond to things like seasonal changes. Use a shorter number of weeks, the baseline should respond more quickly.
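To make the moving-average idea concrete, here’s a small sketch — not any supplier’s actual methodology — of a trailing moving average computed over non-promoted weeks only, where a hypothetical `window` parameter controls the responsiveness trade-off described above:

```python
def moving_average_baseline(weekly_units, promoted, window=8):
    """Trailing moving average over non-promoted weeks only.

    weekly_units: list of weekly consumption volumes
    promoted: list of bools, True if the week had merchandising or price support
    window: number of non-promoted weeks to average; shorter responds faster
    """
    baseline = []
    recent = []  # the last `window` non-promoted observations
    for units, promo in zip(weekly_units, promoted):
        # Base expectation for this week, from prior non-promoted weeks.
        baseline.append(sum(recent) / len(recent) if recent else units)
        if not promo:
            recent.append(units)
            if len(recent) > window:
                recent.pop(0)  # drop the oldest week
    return baseline
```

A shorter `window` adapts quickly to seasonal swings but is noisier; a longer one is smoother but lags. Real baseline models layer much more logic on top, but this is the moving-average core.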
#3: Data Anomalies
So, have you ever seen anything like this big volume spike?
No merchandising support. No price action. Ever see anything like that?
There are a few reasons why this kind of spike can happen:
- The syndicated supplier didn’t pick up something relevant (a digital promotion, for example)
- You’re benefiting from something else that’s going on in the marketplace (competitor out-of-stock for example)
Regardless of the reason, we want to be able to do something about this. Looking at this chart, I’m going to understate my incremental volume, incremental revenue, incremental profit, and my ROI. All of it’s going to be understated because of that one data anomaly.
Understanding the price points coming through in the syndicated data, and having some way to address anomalies so they don’t affect your analysis of historical events or your predictive analytics, is very important.
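One simple way to keep a single unexplained spike from distorting the base — a sketch of the idea, not a prescribed method — is to flag non-promoted weeks that run well above the modeled baseline and substitute the baseline value before analysis:

```python
def smooth_anomalies(weekly_units, baseline, promoted, threshold=1.5):
    """Replace unexplained spikes (no promo, volume far above baseline)
    with the baseline value, so one odd week doesn't skew the analysis.

    threshold: how far above baseline a non-promoted week must run
    before it is treated as an anomaly (1.5 = 50% above base).
    """
    cleaned = []
    for units, base, promo in zip(weekly_units, baseline, promoted):
        if not promo and units > threshold * base:
            cleaned.append(base)   # anomaly: keep the modeled base instead
        else:
            cleaned.append(units)
    return cleaned
```

The `threshold` value here is illustrative; in practice you would tune it per category and review flagged weeks rather than overwrite them blindly.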
#4: Correcting Data Sources @ Account and Product Level
If you’ve been working on event analysis, you understand that trying to marry data from different sources is quite difficult.
You have data from syndicated suppliers, internal sources, TPM data sources, ERP data. How do you get all this data lined up so that that they make sense?
- Multiple definitions within internal systems (ship to, bill to, sold to)
- Key account definitions within syndicated POS data
- Multiple definitions within internal systems (SKU, case pack)
- UPC level vs. “industry subtotals” within syndicated POS data
In my observation, a well-designed and well-implemented TPM solution generally takes care of these internal definitions, because you work through that process during implementation. In many cases, that will help you work through these issues. (That’s some good news!)
Think about this example:
You’ve got a chain with 10 ship-tos, 5 different divisions that all operate independently, and a single bill-to. A pretty extreme example, right?
You can’t use either ship to or bill to to define that customer. You need another definition and it’s got to be in your core systems. So while this doesn’t necessarily have to do with post-event analysis, it’s about laying the foundation. When you’re setting your systems up (your ERP system, your TPM system) you need to make sure that you’ve got the customer definitions correct for the purposes of trade promotion.
This is foundational. If you don’t address this, you’re going to struggle with TPM and you’re going to struggle with post-event analysis. It’s definitely got a cascading effect.
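As an illustration of that foundation (the account codes and record layout here are hypothetical), maintaining an explicit ship-to-to-planning-account mapping in your core systems lets you roll shipment rows up to the customer definition you actually plan trade against:

```python
# Hypothetical mapping: neither ship-to nor bill-to defines the trade
# customer, so a separate planning-account key is maintained explicitly.
SHIP_TO_ACCOUNT = {
    "ST-001": "CHAIN-EAST",
    "ST-002": "CHAIN-EAST",
    "ST-003": "CHAIN-WEST",
}

def rollup_shipments(shipments):
    """Aggregate (ship_to, units) rows to the trade planning account."""
    totals = {}
    for ship_to, units in shipments:
        account = SHIP_TO_ACCOUNT.get(ship_to, "UNMAPPED")
        totals[account] = totals.get(account, 0) + units
    return totals
```

Anything landing in an "UNMAPPED" bucket is a signal that the customer master needs attention before post-event analysis can be trusted.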
#5: Trade Spend Truths
This is an interesting one. This is something that we see with new CPG clients. Ask yourself a few questions:
Is your trade spending explicitly tied to promo events?
- This depends on where you are in your journey toward managing your trade promotions. If you’ve got a purpose-built trade promotion management tool, you’re probably tying your trade expenses directly to the promo. That certainly makes things easier.
- If not, then you have to ask yourself: How am I going to align that data? How am I going to find it?
- You know when the payment was made, but if you don’t have it tied to an event, do you know what activity it’s tied to? It becomes really complicated if you’re in that situation. This is certainly a challenge that many clients face early in their development of a full closed-loop solution.
Is your trade spending assigned to the correct account and product level?
- If you have a TPM solution, this is probably taken care of for you
- If you don’t, then this is a smart question to ask. I’ve worked with clients who, for the sake of simplicity, charge everything to the first product or to the brand. That makes it efficient for the person creating the settlement, but you’re left with no usable information. You eliminate your chance of having meaningful data.
Once you have all the data at the right level and it’s clean, you’re ready to do something with it! Don’t wait for everything to be perfect. This is not a perfect process. It’s never going to be a perfect process for every account.
Get into the discussion — how do you align consumption with trade events?
- If you’re doing this in spreadsheets, it’s very time consuming. You’re getting into the weeds. It’s a lot of what I would call heavy lifting.
How do you apply your financial information?
- In order to calculate incremental revenue and incremental profit, you need to understand the list price and the cost of goods for that incremental volume. You need to apply those somehow to the events to get to that piece of it.
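That application step can be sketched like this (the parameter names are illustrative, and the ROI convention — incremental profit net of trade spend, divided by trade spend — is one of several that companies use):

```python
def incremental_financials(incr_volume, list_price, cogs_per_unit, trade_spend):
    """Apply list price and COGS to incremental volume, then compute ROI.

    incr_volume: units above the non-promoted baseline for the event
    trade_spend: total trade dollars tied to this event
    """
    incr_revenue = incr_volume * list_price
    incr_profit = incr_volume * (list_price - cogs_per_unit) - trade_spend
    roi = incr_profit / trade_spend   # return per trade dollar spent
    return incr_revenue, incr_profit, roi

# Example: 4,000 incremental units at a $2.50 list price, $1.50 COGS,
# and $3,000 of trade spend on the event.
rev, profit, roi = incremental_financials(4000, 2.50, 1.50, 3000)
# rev → 10000.0, profit → 1000.0, roi → 0.33...
```

The point isn’t the arithmetic — it’s that list price and COGS have to be available at the same account and product level as the event, or none of this can be computed.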
What tools do you use?
- Excel? Some sort of BI tool to pull all these things together? Or do you use a purpose-built TPM solution?
CPG companies spend a lot more time pulling data together and cleansing it than they do actually analyzing it. It’s pretty amazing how much time is spent on this. Robust post-event analysis is a journey; I don’t know any manufacturer who says, “I’m there, I’ve got this down.” You’re always looking to get better. You’re always facing new challenges. Success depends on laying the proper foundation.
This is a key point: the payback is huge. Think about the tens of millions (in some cases hundreds of millions) spent on trade, how much of that is really being effectively analyzed, and how much better you could do with that trade budget. The rewards are big.