Product-led growth (PLG) fuels the fastest-growing SaaS companies—Slack, Calendly, Loom, Datadog, the list goes on. By leveraging their products to acquire free users, who then convert into paying customers, PLG companies grow faster and more sustainably than their rivals.
However, PLG content strategies are undermined by legacy marketing hangovers—and measurement’s the chief culprit.
Content marketing teams often borrow measurement models and reporting frameworks from sales- or marketing-led companies. These old processes don’t make sense in a PLG context.
To align content marketing with a PLG strategy, you need new measurement and reporting processes. For this article, we interviewed content leaders at thriving PLG companies, learning how they:
Before PLG, most companies ran either a sales-led or marketing-led go-to-market strategy.
In sales-led companies, the sales team drives revenue growth. Marketing’s job is to fill the pipeline with leads for reps to work. Marketing teams typically track metrics like form fills, marketing qualified leads (MQLs), sales qualified leads (SQLs), and sales accepted leads (SALs).
In a marketing-led GTM strategy, the marketing team (unsurprisingly) takes the lead. Here, marketing’s goal is to generate traffic, convert leads, and nurture prospects into customers. They track metrics like organic traffic, MQLs, and sales.
Both sales- and marketing-led companies follow a similar acquisition funnel. The difference lies in which department takes the lead. PLG, on the other hand, works totally differently.
Take a look at this visualization of Mixpanel’s PLG motion:
“Product-led growth (PLG) is a business methodology in which user acquisition, expansion, conversion, and retention are all driven primarily by the product itself,” write the experts at the Product-Led Growth Collective. “It creates company-wide alignment across teams—from engineering to sales and marketing—around the product as the largest source of sustainable, scalable business growth.”
PLG companies don’t target traditional buyer roles like decision-makers, champions, and influencers. Instead, they focus on end users. It’s a new way of looking at not just growth but business operations. A fresh approach needs new metrics.
But what should you track?
The Product-Led Growth Collective recommends seven key metrics for PLG companies:
The Collective also stresses that PLG metrics are a shared responsibility. For example, almost every team contributes to TTV. If a product’s onboarding content is effective, TTV accelerates. If the UI is intuitive, TTV accelerates. If the technical infrastructure is robust and reliable, TTV accelerates. You get the idea.
Departments can't focus on everything at once. Content teams usually select product-qualified leads (or a closely related metric) as their North Star.
For example, analytics platform Joinr tracks two content marketing metrics:
“Our content exists for one reason only: to attract visitors to our website and entice them to try our product,” says Stephanie Totty, Joinr’s VP of Marketing. “Once they’ve signed up, our product nurtures them into continual engagement and, eventually, the paid version of the app.”
She explains that the C-suite (her boss) is mostly focused on user acquisition. That’s why she tracks page signups—Joinr’s version of PQLs. Organic search clicks is a holistic marketing metric. It’s a quick way to demonstrate that “the rest of the marketing machine is working the way it’s supposed to.”
Grain, another product-led company, takes a more granular approach. Rasheed Ahamed, content lead at the video meeting workspace, sets different metrics for each content format. Here are three examples:
Rasheed’s longform content is the first interaction visitors have with Grain. Its goal is to convince them to sign up, test the product, and experience the value of Grain. Product-qualified leads is their primary content marketing metric.
Grain’s second KPI (key product action completions) relates to TTV. Rasheed has a list of key actions that, if completed, demonstrate the product’s value. The quicker people complete those actions, the shorter their TTV and the more likely they are to stick around.
By tracking the impact of educational content (onboarding, help, support, customer success) on key actions, Rasheed can assess whether his content is accelerating or slowing Grain’s average TTV.
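Measured this way, TTV boils down to timestamp arithmetic between signup and completion of the key actions. Here’s a minimal sketch; the field names and the “last key action” definition are illustrative assumptions, not Grain’s actual implementation:

```python
from datetime import datetime

def time_to_value(signup_at, key_action_ats):
    """Hours from signup until the last key action is completed,
    i.e. the point where the user has experienced the product's value.
    Returns None while any key action is still incomplete."""
    if not key_action_ats or any(t is None for t in key_action_ats):
        return None
    return (max(key_action_ats) - signup_at).total_seconds() / 3600

# Illustrative timestamps: signup Monday 09:00, final key action
# completed exactly one day later.
signup = datetime(2024, 3, 4, 9, 0)
actions = [datetime(2024, 3, 4, 10, 30), datetime(2024, 3, 5, 9, 0)]
ttv_hours = time_to_value(signup, actions)
```

Averaging this value across cohorts of users who did and didn’t consume educational content shows whether that content is moving TTV.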
The third metric benchmarks the overall impact of Grain’s content by comparing product usage of blog readers against non-blog readers. This is more of a directional metric, indicating whether or not their content strategy is working. Higher product usage correlates with higher expansion revenue, ARPU, and CLV, as well as lower net churn.
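That directional comparison can be run as a simple cohort split. A sketch with made-up event counts (not Grain’s data):

```python
def average_usage(users):
    """Mean product events per user for blog readers vs. non-readers.
    `users` is a list of (read_blog: bool, event_count: int) pairs."""
    readers = [n for read, n in users if read]
    non_readers = [n for read, n in users if not read]
    return sum(readers) / len(readers), sum(non_readers) / len(non_readers)

# Illustrative sample: four users, two of whom read the blog.
sample = [(True, 40), (True, 25), (False, 10), (False, 15)]
reader_avg, non_reader_avg = average_usage(sample)
```

If the reader cohort consistently shows higher usage, the content strategy is directionally working.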
Most content teams track product-qualified leads as their North Star metric.
However, tracking user behavior is tricky. Those with less sophisticated product tracking may opt for a related metric like signups.
Mature content teams may also measure their impact on other funnel stages like onboarding, tracking their impact on TTV, ARPU, and CLV.
Setting a North Star metric is easy. Tracking it is the hard part.
The B2B buying process is complicated. Buyers jump forward and loop back in the sales process. They’re autonomous and informed. Much of the buyer’s journey happens off-site—on social media, private communities, and third-party review sites.
This makes tracking difficult.
Say someone reads a glowing testimonial for your product on LinkedIn, asks for a second opinion on a professional Slack group, and reads a few articles on your blog. When they eventually sign up, what channel gets credit for the acquisition: social media, content marketing, or community?
Your answer will depend on your attribution model.
Ignoring a few niche options, you have five models to choose from:
“Virality is a common growth tactic for PLG companies, and it’s difficult to measure,” Abhishek explains. “Also, in most cases, the product is discoverable via Google search, which is overrepresented in most types of attribution logic. Given this channel only tracks the ‘last mile,’ the attribution misses out on all the channels that led a user to search.”
But content teams need to track conversions—even if the process isn’t perfect. Abhishek suggests companies use a time-decay model, which spreads attribution across all touchpoints while weighting those closest to the conversion most heavily.
Think again about the above example. Using time-decay attribution, you'd assign the following credit:
All three receive some credit for the conversion. But because the new user read an article immediately before they converted, content marketing gets the lion’s share.
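As a concrete sketch, here’s how an exponential time-decay model might assign that credit. The half-life, channel names, and dates are illustrative assumptions, not a prescribed configuration:

```python
from datetime import date

def time_decay_attribution(touchpoints, conversion_date, half_life_days=7):
    """Assign conversion credit to each channel. A touchpoint's weight
    halves for every `half_life_days` between it and the conversion;
    weights are then normalized so total credit sums to 1.0."""
    weights = {}
    for channel, touch_date in touchpoints:
        days_before = (conversion_date - touch_date).days
        weight = 0.5 ** (days_before / half_life_days)
        weights[channel] = weights.get(channel, 0.0) + weight
    total = sum(weights.values())
    return {channel: w / total for channel, w in weights.items()}

# Hypothetical journey from the example above: LinkedIn post first,
# then the Slack community, then a blog article just before signup.
journey = [
    ("social", date(2024, 3, 1)),
    ("community", date(2024, 3, 5)),
    ("content", date(2024, 3, 9)),
]
credit = time_decay_attribution(journey, conversion_date=date(2024, 3, 10))
```

Because the blog touchpoint sits closest to the conversion date, content ends up with the largest share of credit.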
Tracking user signups is only half the challenge. If you want to report on product-qualified leads, you need to track user behavior after they convert. Remember, PQLs represent users who have experienced your product’s value.
A user becomes a PQL when they hit a certain value metric threshold. Here are some examples:
“As you can see, each of these PQL definitions is closely tied to what the business offers as a product,” Wes writes. “And if you become a PQL, your likelihood of becoming a paying or returning user increases significantly.” He also has an in-depth guide on how to figure out your company’s value metrics.
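In code, a PQL check reduces to comparing a user’s usage counters against those value-metric thresholds. A minimal sketch; the metric name and threshold here are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class User:
    """Minimal user record with product-usage counters (illustrative)."""
    email: str
    events: dict = field(default_factory=dict)  # e.g. {"videos_recorded": 3}

# Hypothetical value metric: a user becomes a PQL once they've
# recorded at least three videos.
PQL_THRESHOLDS = {"videos_recorded": 3}

def is_pql(user: User) -> bool:
    """True once the user has crossed every value-metric threshold."""
    return all(
        user.events.get(metric, 0) >= minimum
        for metric, minimum in PQL_THRESHOLDS.items()
    )
```

Swap in your own value metrics and thresholds; the shape of the check stays the same.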
At this point, you’ve identified your North Star metric (product-qualified leads) and a way to track performance (time-decay attribution modeling). But what do you do with that data? That’s where reporting comes in.
Reporting plays a couple of key roles.
First, it guides your work. At the risk of trotting out Drucker's overused management mantra, “If you can’t measure it, you can’t improve it.” Tracking performance at campaign and asset level allows you to figure out what’s driving results and what’s underperforming.
Content teams often develop internal dashboards, adding a bunch of more granular metrics alongside their North Star. Take a look at Joinr’s dashboard.
“I’m interested in the total number of sessions, engagement rate, and time on page, but I consider these ‘vanity metrics,’” says Stephanie. “They help us optimize the content and our strategy, but don’t measure an asset’s actual performance.”
Think of this as a qualitative dashboard. Is your content good? Is it engaging? Do readers share your articles and eBooks?
While that information is invaluable for content teams, it’s less important for senior marketing and business leaders. They want answers to a simple question: How are you moving the needle?
That brings us to reporting’s second role.
The second (arguably more important) role reporting plays is demonstrating impact. Even today, with countless surveys and studies on the effectiveness of content, business leaders remain skeptical. They say it’s impossible to tie content to revenue. Unless you can disprove that belief, it’s impossible to secure budget and headcount.
In Allan’s example, he’s tracking website users, users on trials, customer wins, and the conversion rate between each. His approach works perfectly for the content marketing North Star—PQLs.
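A funnel like Allan’s reduces to stage counts and the conversion rate between adjacent stages. A sketch with invented numbers (the stage names and counts are assumptions for illustration):

```python
def funnel_conversion(stages):
    """Given ordered (stage, count) pairs, return the
    stage-to-stage conversion rates as fractions."""
    rates = {}
    for (prev_name, prev_count), (name, count) in zip(stages, stages[1:]):
        rates[f"{prev_name} -> {name}"] = count / prev_count
    return rates

# Illustrative numbers only: 10,000 site visitors, 500 trial
# signups, 50 paying customers.
stages = [("visitors", 10_000), ("trials", 500), ("customers", 50)]
rates = funnel_conversion(stages)
```

Reporting these rates over time shows exactly where content is (or isn’t) moving users toward revenue.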
To create your funnel, add one upper and one lower funnel metric:
Although your North Star for content is PQLs, it’s important to draw the throughline to revenue. Without demonstrating your impact in dollars and cents, it’s impossible to earn trust and buy-in from executives.
During economic downturns, marketing leaders tighten their budgets. Content teams can’t fall back on old-school measurement and reporting processes. To earn respect within your department and safeguard your budget, you must track and demonstrate your impact on the PLG motion.
If you can’t show how your content strategy contributes to company growth, it’s much harder to justify continued investment and support.
Conversely, if you can prove that content drives the majority of your product’s PQL acquisition, your position will be rock-solid.