
Most features don't get used, with Judd Antin, Alastair Simpson, and Claire Drumond

Published: 18 June 2024
Contributors: Alissa Lydon, Claire Drumond, Judd Antin, Alastair Simpson

Watch this dynamic panel talk on measuring value, aligning features with metrics, identifying why features go unused, and much more. Moderated by Dovetail Head of Product Marketing Alissa Lydon, this buzzy session includes insights from Dropbox VP of Design Alastair Simpson, UX Consultant Judd Antin, and Atlassian Head of Product Marketing Claire Drumond.

Jeremie Gluckman—emcee: I’ll give you a quick overview of this impressive panel: Judd Antin has had a varied journey across big tech brands, from Yahoo to Facebook and Airbnb. He’s now a coach, consultant, and teacher who recently published an article about the reckoning going on in the research industry. Alastair is the VP of design at Dropbox and an advisor to many startups. He was previously at Atlassian and LexisNexis. Finally, Claire Drumond specializes in brand voice and is currently the senior director and head of product marketing for Jira and the Jira suite at Atlassian.

Alissa Lydon—moderator: We’re bringing three different perspectives on how customer insights should fuel our decisions—not just within our own functional areas but how we can come together and drive that throughout the business. I’m super excited to learn more about the areas of overlap between our three disciplines and what kind of perspectives I will gain as a product marketer during this session.

A recent Pendo report found that feature usage follows a Pareto distribution: roughly 20% of features account for 80% of usage, while the remaining 80% of features were reported as rarely or never used. That is a shocking statistic: 80% of features are rarely or never used.

But what do we think about this number?

Is quantifying feature usage the best way to understand the value your products produce for your users?


Metrics that matter—gauging product value effectively

Judd Antin: 80% is low. More than 80% of features are probably not used. I do think it’s useful to look at how many people are using a feature—but I don’t think that’s the only way to quantify value.

Case in point: did you follow the news about Snapchat’s recent Solar System feature? They were using friend ranking in a way that turned out to be hugely harmful (particularly for young people). You can imagine how friend ranking on Snap would be a bad idea. And the CEO said, “Oh, well, only a fraction of a percentage of people are using it.”

Okay, does that mean we should ignore the potential for a feature to cause harm? Well, no. No more than we should ignore the potential necessity of a rarely used feature.

At Airbnb, we spent a long time working on error states. We hope a user never encounters error states, but they’re really important when they do. Or at Facebook, we call them checkpoints (where people need some verification). We hope you never get in one, but you better be able to get out. So, I don’t have a problem with quantifying usage, but it’s not the only way to look at the problem.

Alastair Simpson: As Judd said, it’s important to look at how many people are using your features. However, it’s also important to understand how specific customers use those features and how they drive input metrics like usage, which drive output metrics like retention and churn. You need to look at what is important for the specific customer you’re building for.

Alissa Lydon: Yeah, that’s right. Some features are just checklist items. They’re table stakes—we must have these things to sit at the table.

While it looks shocking, usage is not the only metric we use to measure a product’s usefulness, success, or the value it drives for customers.

So, if that isn’t the best indication, what are some good ways to understand if you are building the right things for your customers?

The trouble with linking product features to business success

Claire Drumond: To Alastair’s point, a good indication of whether the product is solving those problems is whether you are building for the right types of customers and ensuring that your product solves specific problems for specific cohorts within your whole customer and user base.

The data we look at is difficult to parse because it can be really specific, especially if you’re building features that are just table stakes. As product marketers, we often look at the competitive landscape and think, “We just need to tick these boxes because that’s table stakes,” even though a feature might only be useful for a small subset of users.

But there are other indications of product usefulness. You have to examine customer feedback to understand if you’re providing good solutions that solve those problems and if your customers need your product.

Judd Antin: All the researchers in the house have probably encountered this stereotype where people think, “Oh, if you want to know whether something’s adding value for a customer, just go ask them!”

Did you like it? Would you use this?

Of course, it doesn’t work that way—very often we want revealed preference, not stated preference. The best measures of value are loyalty, repeat usage, and retention. But for many products, there’s a long cycle. You might be a daily TikTok user, but the average person only travels on Airbnb once a year, for example.

So, what value means and how often it can show up depends on the context. Part of this conversation must be: Is the feature level even the right resolution [at which] to ask that question?

For instance, Dropbox might have all these individual features, but overall, the whole is greater than the sum of its parts. It’s not at the feature level that you derive value. It’s this whole thing that delivers a customer experience.

Claire Drumond: Let me jump in quickly on the value part as a product marketer. We think about that a lot as well. When we take certain features to market, we don’t want to do so based on one particular feature that will solve one small problem.

Claire Drumond, Head of Product Marketing, Jira, Atlassian

It’s more about stepping back and asking what broader set of problems we’re solving.

What is the value proposition, and what are all the features that solve that broader problem?

And that would help with feature adoption, too. You communicate to that subset of customers by saying, “Hey, look, this is the value you’ll get from this whole collection of features! And this is an experience that you can expect!” It’s about that whole collection that delivers a really good customer experience.

Alastair Simpson: Something we all miss—especially when talking specifically about features—is that the customer isn’t necessarily coming to your product to do one specific task. They might have a whole set of jobs.

Or, something we miss—because we do talk about Jobs to be Done a lot—is they’ve got emotions and feelings, and they might be trying to feel safer or more secure with their specific team (we hear this from Dropbox customers). And we always miss that.

Instead, we try to design for a specific feature, when really you’re trying to design to make somebody feel happy, comfortable, and trusted, and feel good sharing things with their team.

If they’re looking at a product to use with their team or a specific cohort, how do they feel when sharing it with them?

If you’re measuring a small feature, you can often miss the larger emotional things you’re designing for and the larger set of jobs a customer is coming to your product to do.

Judd Antin: If you look at the feature level, you can get distracted by all the garbage. There are a lot of products, you ship a lot of features, and they become calcified.

Judd Antin, UX Coach, Consultant, Teacher

It’s hard to ‘un-ship’ something even if you find (in talking to customers) that it’s confusing or actively a problem. You can feel backed into a corner where you can’t change anything without harming some metric that someone cares about.

When product development involves treating each feature individually, you can’t see the whole picture. You’re never going to understand what value means at the level of the whole product.

The lesson is always to pull back from the level of an individual feature when you’re making an argument about how customers are feeling or behaving, or the value they derive. It is about a feature—we must choose—but in the context of the broader product experience.

Alastair Simpson: There’s also the danger of being myopic at the feature level. If you understand your customers and how they’re feeling, and you’re doing a great job designing for them, then it’s a whole set of features they’re using to derive value.

Facebook popularized that ‘aha!’ metric some years ago. But getting users to some form of ‘aha,’ where they become sticky and activate on your product, is probably not one thing. It’s probably a combination of a few different things that, in sum, provide a customer with a lot of value and lead to them being a loyal customer for a long time.

Tying feature development to revenue growth

Alissa Lydon: I love how this conversation is rooted in customer empathy and understanding their Jobs to Be Done. But it’s hard to remove that voice in the back of my head:

But what about the ARR numbers? Are we selling the thing? Is our revenue growing because of this feature?

So, from another perspective, how about connecting successful feature development with ARR or revenue growth? Is that a lagging indicator? Is it something that we need to connect more directly?

Claire Drumond: You’ll miss the whole point if you are trying to connect a feature launch directly to revenue.

Marketing is about more than just revenue. Are you attracting new customers? Are you developing something that will give you a competitive advantage? Are you going to solve a new problem in a unique way with whatever you’re developing?

So, to the broader point we were all talking about, it’s more of a collection of features that deliver an experience. That’s when you can see any uptick in revenue you might get from that experience because you’re developing something unique.

It will also cast you in a different light from your competitors. We all want to develop and ship products that make money—most of us do, I hope. Some products don’t care about making money, but all of ours do. So, you have that end goal in sight, but all of these other metrics along the way should indicate if you will achieve that goal. And it’s definitely a lagging indicator.

Alastair Simpson: When considering the core journeys you have for a customer, it’s important to think about how you delight the customer in a small, meaningful way.

We have many small motion design pieces and specific customer journey moments in Dropbox. If you upload a lot of files, a little upload balloon comes up, and it’s a small moment. We can’t measure that. We can’t tie it to any kind of ARR, but you must put those small Easter eggs in so that customers feel you care about those moments of detail and delight in a customer journey.

Alastair Simpson, VP of Design, Dropbox

Understanding input and driver metrics

Alastair Simpson: Do you know the input or driver metrics for the product you work on? Your input metrics are the things that will drive the end result, like revenue or customer satisfaction. For a product like Dropbox, it might be uploading or sharing files or inviting team members.

If you understand your customers’ core Jobs to be Done, you can break that down into input and driver metrics. You can then see the steps we’re trying to get a customer through to ensure they can easily do the things we believe will eventually drive revenue.

If you’re unfamiliar with those things, I encourage you to go and understand them because they will be much better indicators of success and things you can move.

You can look at it qualitatively: did we make this thing easier for a customer?

And then, when you ship that thing, did it actually move that input metric from a quantitative measure?

If so, why? Go and understand, rinse, and repeat. And does it drive your output metrics, or lagging metrics like CSAT or ARR?

Claire Drumond: Here’s one concrete example showing that when you tie a feature to revenue, you won’t know exactly how much is attributable to that specific innovation (unless you feature-gate and monetize that feature):

Jira has a feature called Advanced Planning, which used to be called Advanced Roadmaps. It’s in our premium edition. This feature’s value proposition is about connecting lots of different teams to an overview roadmap view, and that resonates. Every customer I talk to is trying to solve a cross-team collaboration problem, and they need an overview to see dependencies, where there are blockers, etc.

We launched that feature within our premium edition and can attribute a lot of revenue to that one thing. But when I step back and think about what it is—is it the roadmap view?

No, it’s the value prop that it delivers to customers. And that’s what really resonates in the market. That’s what people come to Jira for. That’s what they expect to see.

And yes, all kinds of friction come with that. It may not deliver exactly what you, as a user, want it to. That’s a good example of where it’s really about that value proposition that you can attribute to that revenue growth.

Uncovering why features go unused

Alissa Lydon: We’ve all been involved in projects where we wasted time and resources and launched features that ultimately were big flops. Especially amid today’s economic headwinds, with organizations needing to decrease waste and increase efficiency, we must take a step back and understand how we can better deliver valuable features to customers so we can all be successful.

Alissa Lydon, Head of Product Marketing, Dovetail

So, let’s start with diagnosing what you see as the main problem here. In your experience, what are some common pitfalls—whether in the planning process or during R&D—that ultimately lead to building the wrong thing?

Judd Antin: There are two things: the first one is that most product development processes are broken.

Why is so much enterprise software bad? It’s because sales organizations drive most product development processes. If your product is built based on feature requests and complaints from sales, that’s a ridiculous way to build a product.

If you build a product that way, based on isolated feature requests or complaints rather than an overarching strategy and vision for the problems you’re going to solve for your customers, then forget 80%: 98% of your features will go unused. You will build the wrong things.

The second thing is that a diagnosis is hard to do across the board because many companies are broken in many different ways. But the way we want to be customer-centered, the way we want to be insights-driven, isn’t something that can happen in a waterfall fashion. It has to happen throughout the product development process.

It’s great to think of this linear process where insights lead to a beautiful evolution of thinking and design, and afterward we get feedback and repeat the loop. There are many models like that, but that is unlike any product development process I’ve ever been in or seen. It’s way messier: it stops and starts in fits, backtracks, and rethinks. So, it has to be way more adaptable.

That means you can take a really cool, beautiful, holistic tool like Dovetail and just use it to cherry-pick evidence to support the thing the founder wanted to build anyway. It’s great at that, I’m sure. But, if instead, you use it to genuinely adapt to the ups and downs of a product life cycle—all along the way—then you are much less likely to build something that people don’t use. Those are two diagnoses. There are lots of them, though.

Claire Drumond: I generally agree with you, especially on the cherry-picking part. However, some of this comes down to the ideation and planning cycle, before you even kick off development and design.

Do you have the right stakeholders for that phase, whether it’s waterfall or not? And when are those check-ins?

Most product triads don’t include marketing. And, you know, at least at Atlassian, our researchers want a seat at that table too, which makes a lot of sense.

But really, who’s going to represent the voice of the customer? And make sure that you have the right metrics to motivate that team to deliver customer value (not just revenue).

Product marketing is the voice of the customer, and we fight hard to get a seat at the table in those early stages of development, so that the collection of all those features resonates and we can tell a compelling story about it. If that isn’t there at the very beginning of the life cycle, then what you get in return is a bag of parts you must try to put together and make sense of. And those pieces don’t always fit together.

So, if you can’t explain what you’re building to customers, how do you expect anybody to use it? That’s the biggest issue that I see.

Alastair Simpson: I’ll add one from the designer’s perspective. You may have picked the right product or feature to build. You may have the insights team, the product managers, the founder, or whoever has defined ‘the right thing.’ But then, you want to redesign a known pattern as a designer. You want to make it “innovative.” And you end up designing something, and then a customer comes to it, and they’re like, “I don’t know how to use that.”

You’ve changed it from their mental model to how you want them to experience it. Sometimes, you need to innovate—but sometimes, you’ve done that for no good reason. You may be far too ahead of the curve, and actually, it doesn’t resonate with your customers. So, we, as designers, can fall prey to that.

Alissa Lydon: I’m already hearing some solutions here. We diagnosed the problem: improve collaboration and incorporate more user research into the ideation phase, or even earlier.

What other action items or things can these folks take away to help overcome the obstacles they might see in their organization? It could be some common traits you’ve seen on high-performing teams you’ve worked with. Do you have any tips and tricks on how to build features that actually get used?

Building features users love

Alastair Simpson: Building products is a team sport. We hear that a lot, but it really is. And, from talking to different people around the industry, I’m constantly surprised by the number of silos that exist. Or the waterfall nature of, “OK, we’re Research, we’re going to go do the research, and then in six months we’ll come back.”

That’s not how it works. You need to be a proper cross-functional team working together with a shared, common goal, which brings me to my second point: having an agreed, shared goal. I don’t like the term vision because I don’t think that’s helpful. What does that mean?

At Dropbox, we use a twice-a-year release model. So we get ahead of it by asking: OK, what do we think will be coming in our full release? What’s the high-level view of that customer experience, or of the themes we integrate into product marketing?

What is the customer value? You need an agreed, high-level view of where you’re going, or where you think you’re trying to get to, knowing that you will need to iterate and speak to customers along the way. Then, how will you know if you’re successful? What success metrics can you all align to as a cross-functional team?

Many people talk about having good trust and psychological safety as a team to challenge one another.

Building trust is incredibly important; it’s something we’re all talking about. You can have the best processes and practices in place. But ultimately, you’re working in a team—teamwork is really messy. We’re dealing with humans; humans have emotions, and we have all of the things that come along with being humans.

And so, building up that trust and psychological safety as a team is really important, and the onus is on all of us, not just leaders, to invest in that. Find out who your partners are, find out what they do outside of work, and build up that trust so that you can challenge one another respectfully.

Ultimately, if you wrap all those things up, that’s a growth mindset. So, having a growth mindset is important when building products and services so that you can do the right thing by your customers.

Claire Drumond, Head of Product Marketing, Jira, Atlassian

Claire Drumond: Having a shared vision of where you want to go is probably the best tip I can give, because so often I see different product teams—even within a single product—developing different feature sets but not being aligned on what they’re actually trying to do and who they’re trying to do it for.

Pressing your leadership to have that vision, and ensuring everybody is aligned to that vision and goal, will help cascade how you actually get there. When you have a shared vision, you can use things like good sense and intuition to help you make decisions because you know the direction you’re trying to go.

It will also help you not to cherry-pick the customer insights that just frame up what you’re trying to do. You can also say, “Well, look, this is half supported by insights, and this is half supported by strategy,” or some mix of the two things. When I look back at the most successful launches, it’s been a mix of those—and always knowing where we’re trying to go. And it seems like such a basic thing, but there are so many times I see different teams say, “I don’t actually know where the North Star is.”

Judd Antin: Maybe the overarching solution is constant engagement. If we’re product people, we’re not trying to think of an insight as an end in and of itself, but as an artifact we use inside of relationships.

Judd Antin, UX Coach, Consultant, Teacher

What that looks like is constant engagement across the product cycle, not in a waterfall fashion. Customer insights don’t just inspire the idea to develop a feature—they inform the whole development cycle. And that means that as researchers, designers, marketers, or product people, we’d better hold this constant point of view, where we are willing to be wrong, where we’re looking to falsify our beliefs constantly.

We could argue about whether Agile as a product development model is dead, but agility will never be dead. The most successful product processes, the ones that ship features efficiently and deliver value, are the ones that adapt constantly to new insights, not just at the beginning and the end or at any particular point. And they do that because the relationships and the collaboration are strong.

And I think that’s a mindset shift for many people who do research: they think of the insight as the artifact their job is to deliver, and as something ‘someone else’ will use to make a decision. That’s a recipe for being sidelined. You’re just throwing an insight over the transom and hoping it works, rather than building a solid relationship. Which, I’ll grant, takes two sides. You cannot always do that by yourselves. However, thinking of the relationship as the artifact of the most successful product development process is one way out of building features no one uses.

Alastair Simpson: I love that you use the term agility there because, if I think back to the best teams I’ve worked in, they did not function well because of some process—whether Agile or a design sprint. It was because we all had agility and (Judd said this, but it’s worth doubling down on) because we were all willing to be wrong.

We weren’t going in like, “Okay, I’ve got the solution, or this insight.” It was about collaborating as a team, showing vulnerability, and being willing to say, “Hey, my solution might not be the best solution,” or “My insight may now be proven wrong based on new data that we’ve had.” I think it’s really important.

Alissa Lydon: It empowers everyone to go out and collect those insights and bring them back to the group rather than just relying on Research. Like, oh, Research will answer that question for me.

I am a product marketer. I talk to customers frequently. And I should feel empowered to bring my insights to the table. And, if we have a psychologically safe environment, we can lay everything out and see which idea is winning. Which idea do we want to go after? What do we want to chase? What do we want to build next? (As a group, rather than being a single decision-maker). I like that paradigm a lot.

In the spirit of that, we have a researcher, a product marketer, and a designer. All here on one stage. We’ve been talking a lot about cross-functional collaboration, and there are already some great takeaways.

Do you have one more piece of advice on maximizing deep collaboration and building those connections to deliver shared success?

Institutionalize your intentions

Judd Antin: I have one piece of advice—that I can’t take credit for—I’m going to name drop Barack Obama, who gave this advice to Airbnb.

Alastair Simpson: A good friend of yours?

Judd Antin: Yeah. You know, my buddy—no, he did not give this to me personally.

Alissa Lydon: Big UX research guy, Obama. Little known fact.

Judd Antin: But he said to Airbnb and Brian Chesky, the CEO, “Institutionalize your intentions.” He said the way to make any change sticky is to create structures that scale, that make the right thing easy and the wrong thing hard.

You can fight for a one-off. But if you want to encourage better collaboration, think about how you create an Agile product development cycle, how you run meetings and forums for communication and sharing, how you build a research process, insights development, or a voice-of-the-customer process. Or how you write your PRD.

You design a scalable process that makes the right thing easy and the wrong thing hard, which means the right inputs and the right people at the meeting. We don’t start from the premise that we all need to be there. This is not because we’re going to make a decision by consensus but because in a truly effective cross-functional team, you’re not just providing input for somebody else’s process; you’re a part of the decision-making process.

So, what if researchers, designers, writers, marketers, and everybody else were not just influential in others’ processes (although they are that, too) but also part of it themselves in making decisions? What is the structure that would institutionalize that intention?

Transparency in collaboration

Claire Drumond: Here’s something simple that I learned during the pandemic. Before the pandemic, my team was split between Sydney and San Francisco, and there was a real water cooler happening in Sydney. I would go to sleep. I’d wake up. Decisions were made. Collaboration was not happening. I was informed, but I was not part of the process.

During the pandemic, we weren’t able to do that anymore. Everybody was remote, which unlocked this interesting insight into cross-functional collaboration. The aha moment for me: Transparency in the thought process is key to bringing in people at the right time. And it’s not just a meeting. It can’t be a meeting; it has to be something easier and more fluid.

Not a product drop here, but Confluence is really good for this. It gives you visibility into that thought process, and anybody interested or a part of that team can then comment and collaborate along the way. We rely heavily on these different rituals and checkpoints, and they’re often just like face-to-face or Zoom meetings.

There has to be another way, where you’re bringing the water cooler discussion into the virtual space. And that will help foster a lot of trust, too. Because you’re removing this barrier of ‘If you’re not a part of this 30-minute meeting, you’ve missed out,’ and you don’t get to contribute anymore. I think figuring out how to make that virtual water cooler a reality within teams helps unlock that.

Alastair Simpson: Remote work is really hard. Most of you are working in some form of remote setup. I designed and ran Dropbox’s shift to remote for two years (we treated it like a design process), and we also learned in the open. Getting to know your team, building healthy boundaries, and building healthy remote practices are things we’ve published online that you can go and find. It’s called the Dropbox Virtual First Toolkit: a set of about 20–25 practices that will help build trust with your teams, who are often remote. I highly encourage you to use it.

Building trust in remote teams

Alastair Simpson: On the non-Dropbox side, I’m a designer but also a father, a son, and a husband. I’m a failed professional football (soccer) player. I’m a wannabe DJ. I’m not very good. The point is that I’m a bunch of stuff outside of work, and so are your coworkers.

Try to get to know them, ask how they are, care about the answer, and ask follow-up questions. Treat people like humans. You can have all the processes and practices, but we’re all humans.

Teamwork is messy. Care about the people that you’re building products with. Because if you care about them, you’ll probably build better products together because you’ll give a shit about each other and also give a shit about the products that you’re building.


Editor’s note: This article is a condensed overview of the panel discussion with Judd Antin, UX Coach, Consultant, Teacher; Claire Drumond, Head of Product Marketing, Jira, Atlassian; and Alastair Simpson, VP of Design, Dropbox at Insight Out 2024.
