The Global VC

WMD ’16: Post-Conference Q&A with Casey Winters

Guest Author
Guest blogger – Casey Winters, Growth Advisor to Pocket, Airbnb, Darby Smart; Former Growth Lead at Pinterest & GrubHub

The article below comprises audience questions posed to Casey Winters during his presentation at Weapons of Mass Distribution 2016. Casey took the time to answer the questions in detail after the conference.


How do you personalize emails across all the metrics you recommended if you don’t have an in-house email data tool like Pinterest does?

At GrubHub, we organized our user data into a weekly FTP upload to ExactTarget (now Salesforce Marketing Cloud) and built a complicated workflow inside of ExactTarget (you can do the same in Responsys). You can use their APIs too, but I don’t remember them being very good.

What you are doing is creating data tables inside the email tool and using programs to query those tables for what previous emails your users have received and what they’ve liked, to determine what to send them next. At GrubHub, we were very focused on not sending a similar template over and over, so we would join the tables we uploaded with ExactTarget’s own record of what users had been sent inside the tool, and then look for templates that had new data and that we hadn’t sent recently. If we were to do this for Pinterest, we would have focused the query on finding the template with updated content that had the highest historical click-through rate.

This is all achievable in enterprise email tools like ExactTarget and Responsys. If you’re not on one of those, you have to determine when you have enough scale that it’s worth investing in a switch instead of using a more basic tool like Mailchimp.
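The selection logic described above can be sketched with an in-memory SQL table. The table name, column names, and cooldown window here are invented for illustration; in practice the equivalent query runs inside the ESP against the uploaded tables:

```python
import sqlite3

# Toy version of the send-history table maintained inside the email tool:
# one row per (user, template) send, with recency and whether it was clicked.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE sends
               (user_id INT, template TEXT, sent_days_ago INT, clicked INT)""")
con.executemany("INSERT INTO sends VALUES (?, ?, ?, ?)", [
    (1, "deals", 3, 0),             # sent too recently -- excluded below
    (1, "new_restaurants", 20, 1),
    (1, "deals", 40, 1),
    (2, "deals", 2, 1),
    (2, "reorder", 30, 0),
])

def next_template(user_id, cooldown_days=14):
    """Pick the template with the highest historical CTR for this user,
    skipping anything sent within the cooldown window."""
    row = con.execute("""
        SELECT template, AVG(clicked) AS ctr
        FROM sends
        WHERE user_id = ?
          AND template NOT IN (SELECT template FROM sends
                               WHERE user_id = ? AND sent_days_ago <= ?)
        GROUP BY template
        ORDER BY ctr DESC
        LIMIT 1""", (user_id, user_id, cooldown_days)).fetchone()
    return row[0] if row else None

print(next_template(1))  # "new_restaurants": best CTR among non-recent sends
```

The real workflow adds a second condition — the template must also have fresh content for that user — but the join-and-rank shape is the same.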


With constant improvements to your product, marketing, etc., how do you attribute increased retention back to the specific improvements you’ve made?

You have to run experiments and have long-term holdout groups. Sure, there will be other experiments those users may see, but there should be an equal share of those people in both the enabled and control groups. Every time we tried sending a new email, we had a control and an enabled group, and we would send the email at least a few times to see if the effect was a novelty or sustained. In some cases, we did year-long holdouts to make sure something was a sustained lift.
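One common way to keep long-term holdouts stable is to assign users deterministically by hashing the user and experiment together; a minimal sketch (the function and experiment names are hypothetical):

```python
import hashlib

def bucket(user_id, experiment, holdout_pct=10):
    """Deterministically assign a user to 'control' or 'enabled' for a
    given experiment. Hashing the (experiment, user) pair keeps the
    assignment stable across sends and independent across experiments,
    so users from other concurrent tests land evenly in both groups."""
    h = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "control" if int(h, 16) % 100 < holdout_pct else "enabled"

# The same user always lands in the same bucket for a given experiment:
print(bucket(42, "weekly_digest") == bucket(42, "weekly_digest"))  # True
```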

Since it takes a long time to see if retention experiments worked or not, what are some examples of leading indicators you tracked?

Well, you definitely want to see if the experiment is having a short-term impact first. For example, if we designed an experiment to get people to repin more, thinking that will drive long-term retention, the question to ask is are they repinning more in the first week? If not, it’s probably not going to have a long-term impact.

Retention experiments are really about taking a baseline cohort curve and seeing if you can shift it. So the early indicators are how the week-1 or week-2 parts of that curve look compared to control, but you have to measure longer to see whether the lift sustains or decays back down.
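As a toy illustration with made-up numbers, comparing the enabled cohort curve to control week by week shows both the early signal and whether it held:

```python
# Hypothetical weekly retention curves (fraction of cohort still active)
# for the control and enabled groups of a retention experiment.
control = [1.00, 0.45, 0.33, 0.28, 0.26, 0.25, 0.25]
enabled = [1.00, 0.52, 0.38, 0.30, 0.27, 0.25, 0.25]

lift = [round(e - c, 2) for c, e in zip(control, enabled)]
print("week-1 lift:", lift[1])  # early leading indicator worth acting on
print("week-6 lift:", lift[6])  # here the lift decayed away, not sustained
```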

What are the best ways to find out reasons for rejection/cause of low retention before testing anything with product or pricing?

The best way is to talk to rejectors. At GrubHub, we’d give people a free meal if they talked to us. We’d also bring people in and watch them use the product to see where people got confused and would give up.

If you have a lot of data, you can also look at what data correlates to people who stick around vs. not. You have to be careful about this because the typical answer from a model will be, if they did anything, they are more likely to stick around. So build some hypotheses, do some data analysis, and see if it makes sense.

For example, we had a hypothesis at GrubHub that the number of restaurant results tied into conversion and retention. We pulled data on how many search results people saw and how often each count occurred, then graphed conversion rate by number of results. We saw a clear inflection point that differed by city, as well as the point where adding new restaurants had diminishing returns.
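A minimal sketch of that analysis, with invented log data standing in for GrubHub’s:

```python
# Hypothetical (n_results, converted) pairs from search logs: how many
# restaurant results were shown, and whether the session ordered.
searches = [(2, 0), (2, 0), (5, 0), (5, 1), (10, 1), (10, 1),
            (25, 1), (25, 1), (60, 1), (60, 1)]

def conversion_by_results(rows):
    """Conversion rate grouped by number of search results shown."""
    totals = {}
    for n, converted in rows:
        seen, conv = totals.get(n, (0, 0))
        totals[n] = (seen + 1, conv + converted)
    return {n: conv / seen for n, (seen, conv) in sorted(totals.items())}

print(conversion_by_results(searches))
# Plotting this per city reveals the inflection point and where extra
# restaurants stop adding conversions.
```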


What program did you use as “personal assistant”? How did you make it scalable?

This is something we built in house due to Pinterest’s volume of users. It was too cumbersome to move data around for over 200 million accounts. I’ll give an oversimplified overview.

So what we did was create a new table that aggregated response rates by template for each user ID. We then ranked each email template for each user by response rate. This is stored as a table in a database and is updated daily.

Then, you have to figure out how many emails to send to each user, and what time/day to send them. We used a model that measured click-through rate as frequency increased, and we stopped adding sends once an additional email would have a lower CTR than the previous one. For the day and time, we looked at the most common times people opened emails and created slots based on the monthly volume we’d already determined.
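One reading of that stopping rule, sketched with hypothetical CTR numbers:

```python
def optimal_frequency(marginal_ctrs):
    """marginal_ctrs[i] is the measured CTR of the (i+1)th email sent in
    a period. Stop once an additional email would have a lower CTR than
    the previous one; return how many emails to send."""
    n = 1
    for prev, nxt in zip(marginal_ctrs, marginal_ctrs[1:]):
        if nxt < prev:  # the marginal email underperforms -- cap here
            break
        n += 1
    return n

# Hypothetical per-email CTRs: the third email underperforms the second,
# so this user gets two emails per period.
print(optimal_frequency([0.080, 0.080, 0.074, 0.071]))  # 2
```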

Whenever a slot opens for a Pinner, the system pulls the top ranked email that has content and sends it.
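A heavily simplified sketch of that slot-filling step (template names and in-memory data structures are invented; the real system ran over database tables updated daily):

```python
# Hypothetical per-user template rankings (rebuilt daily from response
# rates) and a flag for which templates currently have fresh content.
rankings = {
    101: ["boards_you_follow", "weekly_top_pins", "friend_activity"],
    102: ["weekly_top_pins", "boards_you_follow"],
}
has_content = {"weekly_top_pins": True, "boards_you_follow": False,
               "friend_activity": True}

def fill_slot(user_id):
    """When a send slot opens for a Pinner, pull the top-ranked template
    that currently has content; skip the slot if nothing qualifies."""
    for template in rankings.get(user_id, []):
        if has_content.get(template):
            return template
    return None  # nothing worth sending; leave the slot empty

print(fill_slot(101))  # "weekly_top_pins": the top pick had no content
```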


Do you have any book recommendations on user retention?

As usual with startups and growth, there aren’t many direct books on these subjects. “Hooked” [by Nir Eyal] is a pretty solid introduction to creating engagement loops. Other than that, I’d recommend general psychology and optimization books like “Thinking Fast and Slow” by Daniel Kahneman and “The Goal” by Eliyahu Goldratt.


How did you find out on Pinterest that you should connect men to interests instead of friends?

Through user research. We watched men go through the onboarding flow, get asked to follow friends (who were mostly women), and then land in a home feed they didn’t like. Our researchers then asked participants what they were interested in and had them type it into the search bar. Once they saw content they cared about, their opinion changed. So we brainstormed how to get men directly to what they cared about. We tried forcing search on them, but it was clumsy. Topics ended up being a better way.


How did you convince the restaurants to decrease their minimum and delivery fees? Please provide some examples.

We did a couple of different things. In one instance, we had a local press story we were doing, and told the restaurants they would be prominently featured if they dropped their minimum. They did it to get the exposure.

We built a case study off of that restaurant and talked about it to other restaurants, so they could hear how much total order volume changed.

We also tried ranking lower minimum restaurants higher in search results to get them to try it.


How do you define growth KPIs for different products? How do you segment users and personalize a good mix of channels?

The growth KPIs should match the goals of the business. You can look at that from a company mission perspective or from a revenue perspective. At Pinterest, the mission is to help people discover what they love and help them do those things in real life. So, while Pinterest doesn’t know if you did something in real life, it can tell if you found something you love by seeing if you saved it. So that is the key metric the company focuses on.

You then build a cohort analysis based on that metric and find the point in time at which, if users are still doing that action, they will continue doing it. So, you look at where the cohort curve flattens. At Pinterest, it flattens after four weeks.
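A toy version of finding where a cohort curve flattens (the curve and the decay threshold are invented):

```python
def flattens_at(curve, eps=0.01):
    """Return the first week where week-over-week decay in the cohort
    retention curve drops below eps -- i.e., where the remaining users
    have settled into a regular usage pattern."""
    for week in range(1, len(curve)):
        if curve[week - 1] - curve[week] < eps:
            return week
    return None  # still decaying; measure longer

# Hypothetical weekly cohort retention, flattening around week 4.
print(flattens_at([1.00, 0.50, 0.38, 0.33, 0.325, 0.32]))  # 4
```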

Once you have that, different parts of the growth team form goals around getting people to that regular pattern of saving, like bringing in enough traffic, converting enough of that traffic into signups, getting enough signups to their first save, etc.

As for segmentation, it depends on the business. When SEO is the start of your growth loop, and what keywords you rank for is based on what content people add to the platform, you give up the ability to target certain people. It’s whoever is attracted to the content on your platform that you’re targeting. So Pinterest’s segmentation was just around usage: new, core (daily), casual (weekly), marginal (monthly), dormant, and resurrected.

At GrubHub, we spent money on advertising, so we had a bit more control. We looked at our early users and found there were four segments based on if you ordered alone or with someone else, and whether you planned to order ahead of time. Two of them were a much better fit than the others. One was attractive if we could build some product additions for them.


How do you know the specifics (e.g., time of day) of how users like emails?

You just look at the times each user tends to open your emails historically and abstract trends (morning, weekdays, etc.). At Pinterest scale, you can get pretty granular.
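A minimal sketch of that aggregation using the standard library (the open-hour data is invented):

```python
from collections import Counter

# Hypothetical open events (hour of day) pulled from one user's history.
open_hours = [8, 9, 8, 20, 8, 9, 21, 8]

# The most common open hours become that user's preferred send slots;
# at scale you'd abstract these into buckets like "weekday mornings".
top_slots = [hour for hour, _ in Counter(open_hours).most_common(2)]
print(top_slots)  # [8, 9]
```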


What types of retention cohorts do you look at most often (i.e., by month acquired, channel, device/platform, etc.)?

At GrubHub, I looked at monthly by source and by platform and by signup method (email vs. Facebook Connect vs. guest).

At Pinterest, we looked at weekly by source and platform and page type (board vs. topic vs. pin vs. home page).


Where would you recommend someone redirect focus when retention efforts don’t make sense (e.g., products that people purchase once every 10 years)?

In this case, retention is a lot harder. It has to be an incredibly memorable experience for someone to remember ten years later. So what most people do is focus on acquisition channels, knowing that some of those people will be re-acquisitions. was like this. It’s used too infrequently to expect people to come back directly on their own, so we invested in SEO, because people naturally go to Google when they don’t know where to find something. Later, the company also did television advertising during major apartment-hunting seasons.


View Casey’s full presentation at WMD ’16:



Thank you to Casey Winters for contributing to the 500 blog. For more insights from Casey, follow him on LinkedIn or Twitter.
