In this article, Gary Gonzalez (Managing Partner, Catchy) and Tom Williams (Head of Consulting, Catchy) provide a roadmap on how to effectively market to developers, and how to measure the results through a data-driven approach.
Gary
Data-driven marketing is very buzzwordy. We threw this term into Google a couple of days ago, and it came back with pages and pages of results that were all very broad and a bit too general.
Developer marketing is its own field with its own rules and regulations, so we decided to make an actionable framework that applies to developer marketing programs, and is rooted both in building these programs and in data.
What is the developer marketing flywheel?
At Catchy, we have the good fortune of working across the entire industry with the best and brightest developer marketing programs, which gives us a really interesting horizontal view of what's happening across the industry.
For the core of the framework, we built a flywheel that we think applies to developer marketing programs, and is built around the assumption that you are involved in some aspect of building a developer marketing program.
We wanted to build a flywheel and a framework that enables you to build an end-to-end developer marketing program no matter your experience and no matter where you are in the journey.
Maybe you've been marketing to developers for over a decade, or you’ve never talked to a developer in your life. We think our framework has something for everyone.
There are varying degrees of complexity as we go out from the middle of the flywheel. At the simplest level, there are three phases when building a developer marketing program.
The first is discovery, which is when you're doing your research and building your approach to the market.
Then you have the go-to-market phase, where you're trying to reach developers and your audience. This is all about increasing awareness, facilitating consideration, and driving developer acquisition and conversion events.
Then thirdly, we have the growth phase. We've got all the building blocks and our strategy in place, we've gone to market, but now we want to take this to the next level.
You could endlessly debate the core activities that go into each of these three phases, but for simplicity, we've broken it down into nine different core activities.
Finally, the outermost level, and the part that a lot of marketers and programs miss, is the key data points. What are the key pieces of data you can take away from each of these different categories? What data can you use to judge the efficacy of each of these categories to know whether or not your program is working?
The discovery phase
Stakeholder interviews
When building a developer marketing strategy, we have a product, a platform, and a market, and we’re trying to figure out where they come together, what the messaging is around it, and who we want to talk to when we go out to market.
Catchy has been doing developer marketing for over 12 years. Our experience shows that 80% of this information already exists with people you have access to. Coded stakeholder interviews are the best way to do the majority of the legwork in assessing your product-market fit, and they're where we start turning qualitative data into quantitative data.
If you're not familiar with these interviews, go out and take a selection of people that work within your organization and interview them. Ask them about the product, talk to them about what the market fit is, and then go talk to some of your customers.
Coding is the act of going in and looking for themes and then counting out how often those themes come up. It's a really nice way to take pages and pages of qualitative interviews and make them quantitative.
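As a minimal sketch of this coding step, here is one way to count theme occurrences across transcripts. The themes, keyword lists, and transcript snippets are all illustrative, not from a real engagement:

```python
from collections import Counter

# Hypothetical coding scheme: themes mapped to keywords. Both the themes
# and the transcript snippets below are illustrative, not real data.
THEMES = {
    "security": ["security", "risk", "control"],
    "performance": ["performance", "speed"],
    "scale": ["scale", "scaling"],
}

def code_interview(text, themes=THEMES):
    """Count how often each theme's keywords appear in one transcript."""
    words = text.lower().split()
    counts = Counter()
    for theme, keywords in themes.items():
        counts[theme] = sum(words.count(k) for k in keywords)
    return counts

stakeholders = "security and risk matter; we need control over security"
customers = "performance at scale is what we need, raw performance"

print(code_interview(stakeholders))  # security-themed keywords dominate
print(code_interview(customers))     # performance and scale dominate
```

Run over a full set of interviews, the theme counts make it easy to compare what stakeholders say against what customers say.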
We worked with a cloud technology platform a few years ago that was getting ready to go to market. They had a product they wanted to relaunch because they hadn't been getting a lot of traction on the platform.
So we first started by talking to the marketing and sales team and asked them: what have you gone to market with? What do people want to know?
When we started coding the data, it was really easy to see that security, control, and risk were the main themes. Everyone was convinced that developers were obsessed with security. They also wanted to mitigate risk, and have better control, speed, and agility.
But when we started talking to customers, we noticed that they weren't talking about security at all. When we dug deeper, we found out that it was because they already knew and trusted that the platform handled security.
They were really focused on performance, problem-solving, and scale, so this was a really helpful exercise in guiding everything we did as we went to market.
It's much more powerful to get buy-in across your organization if you have the supporting data, and coding is the technique that allows us to do that in the stakeholder interview phase.
Secondary research
Secondary research is a great way to fill out your market understanding. It’s very easy for people who are new to the developer marketing space to get overwhelmed. They don't know developers, the space, the tools, or technologies, so how are they supposed to understand this audience?
Luckily, a lot of this information already exists. If you’re looking for demographic data and research, you have a lot of great sources. Stack Overflow does an Annual Developer Survey that over 70,000 developers contribute to.
If you have tried to do a developer survey, you know how hard it is to get developers to take a survey and give you data. Stack Overflow doesn't have that problem. They have tens of thousands of developers from around the world who want nothing more than to contribute. It's quite possibly the best source of developer research that exists out there, and it's also free to access.
There are other developer research and data-heavy companies that also publish free versions of their reports, so go and check them out.
Gain market insights through competitive analysis
Manual competitive analysis is a great way to look at what’s happening out in the market.
The key here is doing the extra bit of work to take qualitative observational data and make it quantitative. We rank competitors across a set of categories from 1 to 5, which makes it really easy to compare where your program sits against where other programs sit, and to do a gap analysis to identify where you may want to go given the data you're seeing.
Stack Overflow has already identified the top 10 countries where developers are concentrated around the world. This is a great place to start when figuring out where to target your campaigns or developer efforts.
These are straightforward exercises that take a lot of groundwork, but you won't be limited by resources; it just takes time and effort to do these rigorous analyses. And you get really good data, which makes your decision-making easy as you start building your developer marketing strategy.
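A rough sketch of how those 1-to-5 rankings turn into a gap analysis. The categories, program names, and scores below are made up for illustration:

```python
# Illustrative gap analysis: the categories and 1-to-5 scores are made up.
CATEGORIES = ["docs", "community", "onboarding", "pricing clarity"]

scores = {
    "our_program":  {"docs": 3, "community": 2, "onboarding": 4, "pricing clarity": 3},
    "competitor_a": {"docs": 5, "community": 4, "onboarding": 3, "pricing clarity": 2},
}

def gaps(ours, theirs):
    """Positive gap = the competitor leads; negative = we lead."""
    return {c: theirs[c] - ours[c] for c in CATEGORIES}

print(gaps(scores["our_program"], scores["competitor_a"]))
```

The categories with the largest positive gaps are the ones to prioritize when deciding where to invest next.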
Primary research
Primary research is very much focused on product. You should have a pretty good idea of what's going on from your stakeholder interviews, market research, and secondary research. However, you may find yourself with very specific questions about your audience, their views, and how they're engaging with a platform or a product.
There are a few different methodologies you can use to ask some of those final-mile questions before going to market. Stakeholder interviews are one type of qualitative research; observation, one-on-one interviews, and focus groups are others, and they're really helpful for understanding mindsets, needs, and pain points.
It's good to go into this with a hypothesis you've developed during your stakeholder interviews. Go out and ask specific questions of a global developer population to find out whether that hunch from your stakeholder interviews is true or not. This is where you can validate it with data at scale.
I'm now going to let Tom take over to guide you through the next two phases of the flywheel.
The go-to-market phase
Tom
Broadly, you can split go-to-market activities into three separate buckets, and I'll introduce them briefly in turn.
The first bucket is owned: the properties you have within your organization, which are your website, your developer portal, your documentation, and your content.
I cannot overemphasize the importance of good content, whether we're talking about case studies, success stories, or getting started guides. The second bucket is community, which is where other people start to do that work for you through reviews, word of mouth, organic reach, and social.
The third bucket is paid, which is where you're putting dollars against eyeball impressions. You're putting ads out into the market, buying display ads, and paying for search placement. However, it's not enough to just bundle all of your efforts into paid.
I like to encourage a healthy balance of all three of these buckets. Paid is great for getting awareness, but when we're thinking about a developer journey moving down into an evaluation phase or an adoption phase, paid isn't going to cut it. At that point you need community, resources, and content.
To measure this, be aware of where your spend and your efforts are, and aim for a sweet spot where the three buckets are approximately equal. In some of the very best cases, you can even start deprioritizing paid because community and your owned channels are doing the hard yards for you.
Just as developers are aware of their various options, it's really important if you're running a developer marketing program to be aware of the competitive landscape.
There are many initiatives we could talk about, but I want to focus on the following three:
Share of voice
You can aggregate data from Twitter, Reddit, and Stack Overflow to understand when developers are talking about a given topic, how much of that conversation mentions your program, and how much mentions your competitors.
If you track this over time, you can be aware of whether that's diminishing or growing in terms of your share of that developer conversation.
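A minimal sketch of a share-of-voice calculation. The monthly counts are illustrative stand-ins for the aggregated Twitter, Reddit, and Stack Overflow data:

```python
# Share of voice per month: our mentions / all mentions of the topic.
# The counts below are illustrative, not real data.
mentions = {
    "2024-01": {"topic_total": 1200, "our_program": 180},
    "2024-02": {"topic_total": 1500, "our_program": 270},
}

def share_of_voice(month):
    m = mentions[month]
    return m["our_program"] / m["topic_total"]

for month in sorted(mentions):
    print(month, f"{share_of_voice(month):.1%}")  # is our share growing?
```

Tracked month over month, the trend in this ratio tells you whether your slice of the developer conversation is growing or shrinking.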
Sentiment
The second bucket is quite literally the sentiment of those mentions of your developer program. Are they positive or negative? At a macro level, you want to map that trend over time. Are those mentions generally getting more or less favorable? This can yield some insight into the quality of your developer experience.
At a micro level, this can be really good for yielding some tactics. For example, we were working with a company and observed that within the negative sentiment bucket there were a lot of mentions along the lines of: "I get the proposition and the features, but what can you actually do with this? Where are the success stories and case studies?"
That ended up yielding a really rich content project that started with going out and speaking to developers and telling these success stories, which is a really important part of that evaluation process.
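One simple way to put a number on the macro sentiment trend is net sentiment per period, the balance of positive over negative mentions. The quarterly counts below are illustrative:

```python
# Net sentiment per period: (positive - negative) / total mentions.
# The quarterly counts are illustrative, not real data.
periods = [
    ("2024-Q1", 120, 80),   # (period, positive mentions, negative mentions)
    ("2024-Q2", 150, 60),
]

def net_sentiment(pos, neg):
    return (pos - neg) / (pos + neg)

for period, pos, neg in periods:
    print(period, round(net_sentiment(pos, neg), 2))  # trending up or down?
```

A value near +1 means mentions are overwhelmingly favorable, near -1 overwhelmingly unfavorable; the movement between periods is what signals a change in developer experience quality.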
Increase awareness
If you're an organization that perhaps doesn't have a culture of awareness around developer marketing or developer programs, this can be really important for securing internal buy-in.
You might have a tiny developer marketing budget, but if you're able to aggregate these buckets and say, "OK, our competitor has a huge share of voice and their budget is enormous; how do you expect us to compete with them directly?", that's quite a useful tool.
You’ll go out and evaluate not only the size of the budget, but where competitors are spending, which is a really valuable piece of information for securing internal buy-in.
We can aggregate data from multiple sources such as Twitter, Reddit, and Stack Overflow. Catchy runs a service called the Developer Signal Hub, where we produce reports on a systematic basis and aggregate information from all of those platforms, showing share of voice, sentiment, and analysis, and turn all of these conversations into something that we can measure.
Developer acquisition
A surprisingly common occurrence when we work with developer marketing programs is that organizations don't always have a shared internal understanding of what we mean by an acquired developer.
I'll give you three examples. An acquired developer could be someone who's come in and registered for an account, somebody who's made a set number of API calls, or someone who's converted from a free tier to a paid tier. All are valid measures of what counts as an acquired developer in your program.
But, unless you've reached a consensus internally, measuring developer acquisition becomes a bit of a non-starter.
Second, and closely related: whatever metric you decide on, you've got to be able to measure it. Once you've decided on that metric, you need to add tracking, such as pixels, to be able to actually quantify and measure how the program is doing over time.
The third example is that there are different routes and touch points from which developers are coming into your program. There's a lot of data being produced by all of these platforms, and something we encounter surprisingly frequently is this fragmented data landscape around a program.
Quite often this translates to the need for a CRM, but also any sort of holistic data strategy so that all these touchpoints start to unify. And when you see developers coming into your program, you're able to actually generate some insight as to which channels are the most powerful and where to shift your budget to.
The growth phase
Developer experience
Developer experience borrows a lot of good practice from user experience, but these are not synonymous.
Developer journeys are a lot more complex than typical user journeys. They occur over multiple sessions and involve several touchpoints of testing products and sandboxes, doing test API calls, evaluating documentation, etc. So having good developer experience across all of these touchpoints is an absolute prerequisite.
But how do you measure something as nebulous as that? We sometimes encounter programs where the absolute core metric people care about is registered developers, but that's not even a measure of your developer experience.
If you're only worried about the gross number of people who've signed up for your product, that’s a bit of a vanity metric because it's only ever going to go up, and at worst it’ll flatline if developers stop coming in.
When we're talking about how to measure developer experience, the thing you should really care about is attrition. When developers have had enough of your product, they won't necessarily let you know about it. They'll just quietly go elsewhere and you won't even notice, especially if you're only tracking registered developers.
I like to look at activity. If someone hasn't logged into their account, made an API call, or logged into the developer portal in a certain number of days or months, then they're inactive, and that can be an attrition metric which is something to monitor over time.
If your attrition metric is going up, that's a strong indicator your developer experience is poor and it’s probably time to go back to the research phase.
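A minimal sketch of an activity-based attrition metric, assuming a 90-day inactivity window. The threshold and the dates are illustrative; the right window depends on your product's natural usage cadence:

```python
from datetime import date, timedelta

# Assumed inactivity threshold; tune it to your own product's cadence.
INACTIVITY_WINDOW = timedelta(days=90)

# Hypothetical last-activity dates (last login, API call, or portal visit).
last_active = {
    "dev_a": date(2024, 6, 1),
    "dev_b": date(2024, 1, 15),
    "dev_c": date(2024, 5, 20),
}

def attrition_rate(today):
    """Share of developers with no recorded activity inside the window."""
    inactive = [d for d, last in last_active.items()
                if today - last > INACTIVITY_WINDOW]
    return len(inactive) / len(last_active)

print(attrition_rate(date(2024, 6, 30)))  # track this number over time
```

Unlike a gross registration count, this ratio can move in both directions, which is what makes it a useful health signal.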
Building a community through engagement and retention
Everybody wants to grow a community. However, audience is not the same as community. If you’re trying to grow an audience, you might look at how many followers you have or how many likes your post got.
However, community is multidirectional, collaborative, and inclusive, and those one-directional follow metrics aren’t telling you if you've got a community. We're going to have to look at different measures for that.
Retention is the first, which is the opposite of attrition. Retention looks at whether people are sticking around. The second is engagement. Are those people who are sticking around collaborating and participating?
Retention and engagement are much more of an indicator of whether you've got a community, or whether you're starting to grow one.
Elevating champions
This is about identifying the people within the community who are already exhibiting the behaviors you want to encourage. They're supporting other developers, sharing posts, and creating their own content, and they can be elevated to positions where they support the growth of the community.
Elevating champions can be as modest as just resharing and liking their posts, and giving them a small budget to run their own meetup groups in their local community.
The very notion of a flywheel implies that there are a lot of hard yards to get started, and it's quite expensive to get this wheel turning. But through growth and community, it starts to move a bit quicker, and the incremental effort of bringing new developers into the program becomes smaller.
You could even consider a metric for this as something like cost per acquisition or cost per developer. Initially, it's going to be very expensive to bring your first 100 developers into a program, but over time and through these efforts, that will start to go down.
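A quick illustration of cost per acquired developer falling over time as the flywheel spins up. The quarterly spend and acquisition figures are made up:

```python
# Cost per acquired developer = total program spend / developers acquired.
# The quarterly figures are made up to show the flywheel effect.
quarters = [
    ("Q1", 50_000, 100),   # (quarter, spend in USD, developers acquired)
    ("Q2", 50_000, 250),
    ("Q3", 50_000, 500),
]

for quarter, spend, devs in quarters:
    print(quarter, spend / devs)  # cost per developer falls as community grows
```

Flat spend with growing acquisition is exactly the pattern you'd hope to see once community and owned channels start doing the hard yards.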