'Hacking Growth' with Sean Ellis
Tim O'Keefe and I are excited to share with you a Q&A session we had with the Godfather of Growth Hacking, Sean Ellis.
For those who don't know him, Sean is the digital entrepreneur who coined the term 'growth hacking' and pioneered many of the popular data-driven marketing methodologies we see today.
In a nutshell, growth hacking is the proverbial lovechild of user experience, product, and marketing. Its biggest focus is on replicable actions and results from split testing.
Big companies like Uber, Airbnb, Dropbox, Twitter, LinkedIn, and Pinterest all use these 'hacks' to drive significant gains in growth.
Sean joined us online in a forum WAMA session which you'll be able to review at the bottom of the page. I've done my best to keep the thread nice and tidy so it's easy to read.
Enjoy the Q&A session! Tim leads the conversation as we go through questions on product, marketing, team management, and Sean's new book co-authored with Morgan Brown, 'Hacking Growth'. You can grab your copy today at a 35% discount here:
View the Q&A Here:
Transcription from Video:
Tim O'Keefe: Okay, hi everyone! Welcome to Warrior TV. We're here today with Sean Ellis. He's been part of the growth teams at LogMeIn and, quite famously, Dropbox, and he's the founder of both Qualaroo and GrowthHackers.com. He's launching his new book Hacking Growth this week, so we're here today to discuss all things growth hacking with Sean. Welcome.
Sean Ellis: Thanks, guys. It’s great to be on.
Tim O'Keefe: So, Sean, let’s just start off. So, how did you actually get into growth hacking?
Sean Ellis: So, yeah… It was interesting. I started over 20 years ago with this, and at that point people were still figuring out online marketing. There was no book that showed how to do it; any textbooks were more traditional marketing books. So I originally got in and just got relatively creative about ways to grow the business, but I wouldn't say that it was really growth hacking. My big a-ha moment for cross-functional testing as a team came when I was at LogMeIn. It was the second company that I had joined. The first one, we took it to become a public company and sold it. The same team then started LogMeIn; Mike Simon had founded both of these companies. We got to the point where we tried to grow LogMeIn by spending ten thousand dollars a month and we just hit a wall. And I think so many marketers can, kind of, empathize, or have had a similar situation where you're just frustrated. You can't grow the business. For us, we couldn't grow the business beyond that ten thousand dollars a month in spending, but I looked at what was happening. We were driving signups, and I actually had board members, investors, who were saying, "Gosh, you guys are signing up a thousand people a day. That's awesome!" The classic vanity metric now, but at the time we were given praise for that. When I looked at those thousand people who were signing up, most of them were never actually using the product. Over 90% of the people weren't using the product, and I knew there was no ROI if they didn't use the product, so that's when I brought the data to my CEO, and Mike agreed.
Like, we'll never be able to grow the business to a meaningful size until we figure out how to effectively get people from interest to signup to usage, and if we can do that, then we've got a lot of potential in this business. So he said it was the number one imperative for everyone in the business: we're going to put a pause on the product roadmap, and all the product engineers are going to focus on getting that first user experience right, what we call activation in the book. So everybody focused on that, and in a matter of two or three months we were able to get significant gains. I think within four months we had about a 10x improvement in the number of people who went from signing up to actually using the product. And what that meant from a marketing perspective: I went back to the same channels that I'd previously tested, so no new creativity, no clever way of working the channels. I just went back with much better economics on the conversion side, and what previously scaled at ten thousand dollars a month now scaled to over a million dollars a month. That really kicked off growth in the business. And with all of that usage, by the time I left a few years later, 80% of our signups were coming through just pure word-of-mouth. So it's that cross-functional testing and getting all those pieces working together that really opened my eyes to this idea of teams working together in cross-functional testing.
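To make the economics of that turnaround concrete, here is a minimal sketch. The dollar figures and rates are illustrative assumptions, not LogMeIn's actual numbers; the point is that at a fixed cost per signup, a 10x better signup-to-usage rate means a 10x lower cost per active user, which is what lets the same channels scale at far higher spend.

```python
# Illustrative activation economics (hypothetical figures, not LogMeIn's).
def cost_per_active_user(cost_per_signup: float, activation_rate: float) -> float:
    """Cost to acquire one user who actually uses the product."""
    return cost_per_signup / activation_rate

# Before: 5% of signups activate; after: a 10x improvement to 50%.
before = cost_per_active_user(cost_per_signup=2.0, activation_rate=0.05)
after = cost_per_active_user(cost_per_signup=2.0, activation_rate=0.50)

print(f"Before: ${before:.2f} per active user")  # Before: $40.00 per active user
print(f"After:  ${after:.2f} per active user")   # After:  $4.00 per active user
```

Same ad spend, same channels, but every acquired active user now costs a tenth as much, so spend that previously stalled can scale.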
Tim O'Keefe: Yeah, that's interesting. So, a big part of it is being able to really attack the product and not just the marketing side. I think in the first section of your book, you're actually talking a lot about getting to a must-have product. So what is it that actually defines must-have in a product?
Sean Ellis: Yeah, I mean, I think, conceptually, we all think we know what a must-have is, but the problem when you're a founder-entrepreneur in the early days of a business is that you're looking at data through rose-colored glasses and everything looks good. So I came up with a question that has been really effective for me: just asking users how they would feel if they could no longer use the product. When I ask that question, I'm looking for the people who say they'd be very disappointed without the product. If they'd be very disappointed without it, it's a must-have for them, and suddenly I have some pretty good signal on what I need to do to get more people like them.
Tim O'Keefe: Yeah, that's interesting. That's actually almost a surprising answer in a way because, you know, what a lot of people will say growth hacking is really about is data, but what you're sort of saying there is that founders looking at data kind of get a rosy view of things, and what you're doing is more looking at qualitative stuff, right? By looking at what the customers are saying. When you're talking about figuring out if it's a must-have, are you just looking at the qualitative stuff, or just the quantitative, or how do you meld the two together?
Sean Ellis: Yeah, so the qualitative question that I mentioned is kind of a leading indicator. Obviously, if people stop using the product, that's a pretty good sign that it's probably not a must-have for them, so looking at retention cohorts is really important. But you can essentially predict churn: if you ask people how they'd feel if they couldn't use the product and they say they'd be very disappointed, then there's a good chance they're going to keep using it. The problem is that, just looking at the data, you can't really dig into the why and figure out what it is about your product that makes it a must-have, so I like to use both qualitative and quantitative to figure it out.
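Sean's survey question is often operationalized as a simple tally of the "very disappointed" share. A minimal sketch, with hypothetical response data; the 40% threshold commonly associated with this test is not stated in this interview and is used here only as an illustrative benchmark:

```python
# Tally responses to "How would you feel if you could no longer use
# the product?". The 40% threshold is the commonly cited benchmark,
# included here only as an illustration; the responses are made up.
from collections import Counter

def must_have_score(responses: list[str]) -> float:
    """Fraction of respondents who would be 'very disappointed'."""
    return Counter(responses)["very disappointed"] / len(responses)

responses = (
    ["very disappointed"] * 45
    + ["somewhat disappointed"] * 35
    + ["not disappointed"] * 20
)
score = must_have_score(responses)
print(f"{score:.0%} very disappointed")  # 45% very disappointed
print("Strong must-have signal" if score >= 0.40 else "Keep iterating")
```

As Sean notes, the number is a leading indicator; the qualitative follow-up (why those users would be disappointed) is what tells you how to find more of them.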
Tim O'Keefe: Yeah, that makes sense. So, do you think that any product can kind of be shoehorned into being a must-have? When do you decide to quit, or can you just endlessly hack away and give any product that must-have quality?
Sean Ellis: So, it's an interesting question. I think knowing when to quit is a really hard thing for probably any founder, and really any effective growth person has those founder-like qualities where you're tenaciously going after it. For me, the signal of when you should quit is when you lose faith that the problem you're solving is an important problem. You can keep iterating on the solutions, but if your entire premise for building the business is that people are suffering from problem X and, as you go after it, you find out that problem X really isn't that important to people… Maybe they're suffering from it, but a must-have solution needs a very painful problem. A lot of times you're not going to be able to iterate your way into a solution to a problem that's really not that important for users, so that would normally be a pretty good time to give up.
Tim O'Keefe: Yeah. Makes sense. So, moving on to testing because, I mean, that's obviously a big part of growth hacking. It's 2017, and hopefully by now almost everyone is A/B testing their landing pages. Is that enough? Should you go further, and how do you do that?
Sean Ellis: Yeah, so, watching the evolution… As I told you, I've been doing this for a long time. It used to be pretty cutting-edge to be testing landing pages, and it was even cutting-edge to track data beyond what your CPM is. Over time, people have gotten a lot more sophisticated and, not surprisingly, people are tracking ROI all the way through the funnel. But if you're tracking your ROI all the way through the funnel, then you need to understand where the levers are that you can test against to improve that ROI. Landing pages are really just the starting point of that. The problem is that most marketers, in most tech companies, don't have the ability to go beyond the landing page. As soon as you go beyond the landing page, you're into the first user experience, and that's outside your realm of influence as a marketer, so testing deeper in the funnel is actually pretty hard for a lot of marketers. That's where building a growth team and cross-functionally testing all the way down to how you drive ongoing engagement, monetization, and referral in your product comes in. And even before that, activation is probably where the lowest-hanging fruit is for big improvement for a lot of people beyond the landing page. If you get somebody to sign up for the product and they never use it, then you're not going to get a return on investment, and that's what I mentioned happened at LogMeIn. It's really thinking all the way through the funnel, and testing all the way through, that makes a big difference.
Marcus W K Wong: Hey, Sean, sorry just to flip the subject for a moment, but, you know, I was going through Hacking Growth, your new book with Morgan Brown, so it's pretty exciting. I've read through a few of the chapters that you guys sent through, and I know Nir Eyal and Eric Ries have called it a distinctive playbook that you guys have brought out this year, so it's really exciting for a lot of small business owners, people looking to make money online, passive income… I guess the big question for us is, why should people actually start growth hacking? What's so big about growth hacking for them to start from day one?
Sean Ellis: Yeah, I think the reason that people sometimes dismiss it is that a lot of people have defined it as just really creative marketing, or even shady, trickery-based marketing. But at the end of the day, what really defines growth hacking is this test-driven approach to growth across all of the levers of growth, not just the ones focused on customer acquisition that we were just talking about. Beyond the landing page, all the way through, there's so much to be gained in doing that. You look at the companies who've taken that approach, like Facebook or Uber or Airbnb… These companies have created billions of dollars in value, hundreds of billions in the case of Facebook, in a relatively short period of time, and it's because they've taken a fundamentally different approach to growth. To me, that's something we all as individuals should be looking at: how can we grow like they do? What we've really done in the book is broken down how to get started with that, and then how to get through the really hard parts, like how you break outside the area that marketing is allowed to influence so you can start running experimentation all the way through the funnel. There are two big things that you need there: executive buy-in, and a common success metric across different departments. Those are things we cover in the book that really start to get you to the point where, culturally, you can approach growth in a way that's just much more effective. It's not easy, and that's why we've broken it down into very distinct steps in the book.
Tim O'Keefe: Yeah, absolutely. So, you've broken it down into essentially a lot of the funnel: there was, I think, acquisition, activation, retention. I didn't see resurrection in there, but the key one I noticed wasn't there was referrals, at least in the chapter names; I haven't had time to read the whole book yet. Is that something that's kind of all throughout the book, or is it something that's not as relevant in 2017, or…?
Sean Ellis: That's interesting. We actually go into quite a bit about referral, but it's in the acquisition chapter. It's extremely important, and there's a lot of detail on how to effectively drive referral and even how virality can, sort of, turbocharge it. So we do cover that in the acquisition chapter.
Tim O'Keefe: Makes sense. I think the last time I spoke to you, you mentioned something about seeing word-of-mouth as the only channel, sorry, that's, you know, sort of endless in a way. So moving on a little bit, back to testing. One of the issues that I think you face in growth hacking… You're talking about the application of marketing and that sort of data all the way through the funnel and using it to grow the company, and of course that's going to be through testing. It's obviously a lot harder to test further down the funnel; things like testing for retention are obviously a lot harder. How do you approach that? Retention is something that you might not know for 30 days, 60 days, 90 days or more. So how do you run high-tempo or frequent testing on retention metrics?
Sean Ellis: Yeah, so for the down-funnel metrics, you do want to run testing there, but the big question is what prevents people from being able to run testing, so I think that's the starting point you have to look at. One, as you just said, is time: time to see if it's working, though you can start to get some proxy metrics pretty quickly. But probably the bigger issue is access. Testing is a pretty normal thing for marketers to do, but high-velocity testing with a fast feedback loop is something that core product people in most organizations just aren't doing. When I think about the very bottom of the funnel, it's this great customer experience. If you can get people to a great customer experience, and you understand that through that earlier question I talked about, asking somebody how they would feel if they could no longer use the product… When you find those must-have users, you learn about the experience they're having. If you can get more people to that point, you are going to drive long-term retention, and usually one of the biggest gates to getting there is activation.
So if you don't get a person, when they sign up, to actually activate and use the product and get some value out of it… In the growth hacking world, we refer to that as the a-ha moment, that activation point where you really get an understanding of "okay, this is the core value of what the product is going to give me." If you optimize to get people to that activation point, generally that correlates very closely with long-term retention, so that's usually a good place to start. And it's something where you earn trust over time: as you run experiments and show what's working, you can start to get more of the team engaged in helping on those experiments, and hopefully you get to the point where your constraints are the feedback loop on an experiment, how long it takes, and just how many experiments you can run simultaneously. But I think what you'll find is that there's always less high-tempo testing you can do at the bottom of the funnel than at the top. There's almost unlimited testing you can do in acquisition channels. For activation, you could onboard people through a bunch of different landing pages and test those simultaneously. But it's really hard to do a lot of testing really deep in the funnel, so traffic and resources are usually what will hold you back there.
Tim O'Keefe: Yeah, it's almost diabolical in a way because, for most businesses, I think retention is what would drive the most growth, being able to improve their retention, but you sort of don't get the opportunity to test it as frequently. One of the anecdotes I did get a chance to look through from the book: you mentioned that at growthhackers.com, you had a period of, I think, three months where growth had sort of flattened out and hadn't really improved at all, and the way you guys addressed that was to essentially reenergize and launch a whole bunch more tests. What was it that made you think the volume of tests was the problem, rather than, say, the target or the North Star metric or that sort of thing?
Sean Ellis: Mm-hmm. So, ultimately, when you as a team sit around and start saying "Okay, what does success look like?"… Success is this output of a result, but that result is based on a lot of what you're doing to drive it. I think when marketers or growth people aren't very effective, it's because they're kind of just hoping that the result goes up, and it doesn't really work that way. How it works is that you drive the output of a great result through the input of activities and actions. Those actions are divided into proven things that have worked in the past, which you probably want to keep doing, but you can't really expect any growth beyond those things if you only do those things. Testing is what helps you identify new things to do, new things that are going to contribute to growth. What I saw on my team was that I kept telling them, "Hey, guys, we need to do more testing." I was out raising money, so I wasn't as involved in the day-to-day, and they would run a test and, in their minds, they were running more tests. But since I didn't quantify it, just running one test a week, or every two weeks, or, as at a lot of companies, one a month, is just not going to lead to the discovery that's going to help you accelerate growth. So what I did to break out of that three-month flat period was give the team a success metric that was within their control. A success metric of "let's get this much growth" is not really directly in their control; I could just put up a more aggressive target and say I want 50% month-over-month growth, but just having that target is not going to make it happen. What they can control is the number of experiments they're running.
So I said, "Run three experiments per week. I don't care what experiments you run, just run three experiments per week. If you're doing that, at least I know you're doing the things that are going to maybe help you discover how to grow this business." And what I found is that as soon as they started running three experiments per week, growth accelerated. Basically, over the next 6-8 weeks, we grew about 60% after being flat for three straight months. It's that testing that helps you really discover new ways to accelerate growth; otherwise you're just going to keep growing on the same trajectory until you discover something that can accelerate it.
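As a rough sanity check on those figures, 60% growth over roughly seven weeks (the midpoint of the 6-8 week range mentioned) implies a compound weekly growth rate of about 7%:

```python
# Implied compound weekly growth rate from ~60% growth over ~7 weeks.
# These are back-of-the-envelope figures from the anecdote, not exact data.
total_growth = 0.60  # 60% over the whole period
weeks = 7            # midpoint of the 6-8 weeks mentioned

weekly_rate = (1 + total_growth) ** (1 / weeks) - 1
print(f"Implied weekly growth: {weekly_rate:.1%}")  # about 6.9% per week
```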
Tim O'Keefe: That's kind of interesting because I think most companies in that situation would probably just set a target of "we want 40% growth or 60% growth," rather than saying to their staff, "Hey, we want you to run three tests." I think it's a really interesting approach.
Sean Ellis: Yeah, for what it's worth, I did start with just telling them, "Come on, guys, this is the target, find a way to hit it!" But since it wasn't happening, I tried to break it down a bit more and realized they actually weren't running any tests.
Tim O'Keefe: Yeah, I think the book said it was something like 10 tests in the 3 months, whereas, with the new target, you would have done about that many in three weeks. That's really interesting.
Sean Ellis: Yeah, when I say “not running any tests” I mean like virtually not running any tests. Basically… Not enough to… Not enough to really figure out how to accelerate the growth of the business.
Tim O'Keefe: Cool. Okay, so the testing is quite interesting there. If you're running three tests a week, are you really able to combine any kind of actual qualitative user data with that? What I mean is user testing. Because it strikes me that at three per week, it would be very easy for a team to go in and, you know, switch the colors of buttons and run those sorts of tests that probably aren't super effective. It also strikes me that it would be hard to really go to users and find out what their problems are in order to fix them. So what sort of tests were they actually running, I guess?
Sean Ellis: Yeah, so some of them were core product tests; some of them were pretty complicated and needed to run for a while. We tried to size each test so that you could define and launch it within a week. Some of them, obviously, like an Optimizely or Visual Website Optimizer type test, we could put out in 10 minutes. Other ones… I can't even remember. Like, an optimization hack that was just moving the email collector, for example. Things like sharing an asset, a question, out on social would be another kind of referral-type test. At the time we weren't really doing it as much by objectives, which is kind of how we do it now, where first we look for leverage and problems to solve, then we generate ideas for testing, then prioritize those ideas and run them against that. At that point, we were still just kind of testing wherever. But essentially, once you start to run testing at that velocity, a lot of your time goes into figuring out which test to actually run. Maybe the first question is "Gosh, do I have enough ideas?" But once you really start digging into it, generally you can generate a lot of ideas; that's not usually a big bottleneck in companies. For me, it's after you've generated the ideas, trying to pick between them. The more you can do things like user testing, the more you can really analyze the data and figure out where people are dropping off in a funnel, or not taking an action you want them to take within the product that might help you with a referral loop… All of those are areas where you can come up with test ideas and run them. So I do think the qualitative information starts to become pretty important in deciding which tests to run.
Tim O'Keefe: Yeah, definitely. So, going a little further now. That's actually interesting: I think one of the things you worked on last year, or possibly the year before, was a new part of GrowthHackers where you can rate the various tests you want to run on impact, confidence, and ease. Can you comment a little on what that methodology is and how your team works to produce ideas that way?
Sean Ellis: Yeah, so at the end of the day, once you start to run a lot of tests… As we were talking about earlier, it's a sort of spaghetti-on-the-wall type of testing, which to me is not ideal testing, but it's better than not testing. So many times people kind of overthink it, and then it becomes an excuse not to test at all, so random scattershot testing is better than not testing. But once you start doing that random scattershot testing, you realize, "God, we're wasting a lot of energy on these tests." Sometimes a test is even successful but it barely moves the needle. Maybe we need to think about: if this thing works, what's going to happen? That's something we refer to as impact, and so we created a scoring system that helps us determine it. It's just a prediction, an estimated guess: if this works, how impactful can it be? If it's something that could be really game-changing, you might give it a 10 on the impact score; if it's something where you think "okay, it'll probably work so we should still do it, but it's not going to really move the needle that much," then maybe you give it a 2 or 3. You basically rank on that, and then the next part of the ranking system is C, confidence, so it's an ICE score: I, C, E. Confidence is based on things like usability testing or surveys, or even seeing that this is a pretty common thing happening on a lot of other sites, so it's not just a wild guess. It seems like maybe a best practice which, you know, in the testing world, best practice is not necessarily a great thing, but sometimes it's the best practice because it tends to work. So you're basically trying to have some sort of confidence score: what is the likelihood that this is going to work? And then, finally, how easy is it to test?
So if something is super easy to test, and all of your research says it's really going to work, and if it works it's going to be game-changing, then that's the test you want to run first. Being able to score each of your ideas makes it a lot easier to compare and prioritize them. I think another layer you want in there, though, is using data analysis to figure out where you have the most leverage in your growth model to focus some energy. So, if you're picking tests, you should probably pick tests focused on what you believe are the really high-leverage areas, which are going to correlate pretty tightly with your high impact scores anyway.
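The ICE scoring Sean describes can be sketched in a few lines. The example ideas, the 1-10 scales, and the convention of averaging the three scores are illustrative assumptions, not taken from the interview:

```python
# ICE prioritization sketch: score each idea 1-10 on Impact,
# Confidence, and Ease, average the three (one common convention),
# and sort descending. The ideas themselves are hypothetical.

def ice_score(impact: int, confidence: int, ease: int) -> float:
    return (impact + confidence + ease) / 3

ideas = [
    {"name": "Rework onboarding emails", "impact": 8, "confidence": 6, "ease": 5},
    {"name": "Change signup button color", "impact": 2, "confidence": 5, "ease": 10},
    {"name": "Referral prompt after a-ha moment", "impact": 9, "confidence": 7, "ease": 4},
]

ranked = sorted(
    ideas,
    key=lambda i: ice_score(i["impact"], i["confidence"], i["ease"]),
    reverse=True,
)
for idea in ranked:
    s = ice_score(idea["impact"], idea["confidence"], idea["ease"])
    print(f"{s:.1f}  {idea['name']}")
```

Note how the easy-but-low-impact button test sinks to the bottom, which is exactly the scattershot waste the scoring system is meant to avoid.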
Marcus W K Wong: Yeah, actually, Sean, I wanted to ask you really quickly about the growth hacking cycle, since you just mentioned impact. There's this process of analyzing, testing, prioritizing, and then ideation… I guess measuring impact is easy when you actually have ideas to test. But pulling that back one level: how do you actually decide what is a good idea and what is not, before even testing it? Because you could be sitting in a room with your team members and everyone's throwing ideas into the pit, but it's about coming up with those top ten ideas. How do you guys decide what dictates those ideas as being good?
Sean Ellis: Yeah. So, part of it is the prioritization system we talked about, but part of it… This is where you start getting into the strategy of growth hacking, where you're looking more broadly at the situation. Again, go back to that LogMeIn example I talked about earlier: if people are signing up and never using the product, then I need to fix that before I do anything else, so that becomes a focus area. Once you start to understand that an action is not happening, you can do the research and really have the conversations around why we think people would take that action and why they're not doing it. Maybe it starts with some hypotheses, and then, over time, you layer in facts as you run the research. The more information you have around a goal that you're focused on, the more likely you are to generate really meaningful ideas that start to bubble up to the top. Ideally, in a weekly meeting, you would narrow down a handful of ideas you want to test that week. Part of that process is really starting with the data: where in our growth engine is the best opportunity for growth? Then, given that area, which ideas do we think have the best balance of high impact if it works, how confident we are, and how easy it is to test? That's ultimately, I think, the process you go through to shortlist your ideas. From there, there's going to be some gut feel, and you're going to be wrong a lot of the time, but if you run twice as many tests, then you can be wrong twice as often and you're still going to get the wins.
So you're finding that balance of "Okay, I want a relatively high win rate," but take a 40% win rate versus a 50% win rate: if you're running 3 tests versus 10 tests versus 100 tests, then at 100 tests the difference between 40% and 50% really isn't that big a difference. The wins start compounding on top of each other, and it just becomes very important to make sure you are running the tests to get the discovery. Even when something is not a winner, you generally learn something from it, and you get a better understanding of how this business is growing and what you need to do to make it grow better.
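The win-rate point here is simple expected-value arithmetic: expected wins are tests run times win rate, so at volume, test count dominates a modest difference in win rate. A quick sketch with hypothetical numbers:

```python
# Expected wins = tests run x win rate. At high volume, raw test
# count matters more than a modest difference in win rate.

def expected_wins(tests: int, win_rate: float) -> float:
    return tests * win_rate

careful = expected_wins(tests=10, win_rate=0.50)    # ~5 winning tests
prolific = expected_wins(tests=100, win_rate=0.40)  # ~40 winning tests
print(careful, prolific)
```

A team running ten times as many tests at a somewhat lower hit rate still banks eight times as many wins, and each win compounds on the last.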
Tim O'Keefe: Cool, so, moving on a bit, you touched a little on the strategic side. I think it's fairly easy in some cases to nominate a North Star metric, and it might even be something that's extremely impactful for the product itself. There was an anecdote in your book, I think, where you mentioned a company that had a wonderful product, an image-organizing product, I think, but they failed to address the business side of things. It's easy to say, "Well, we can improve our product, we can improve our position," and all of that, but how do you know you're attacking the right thing, as opposed to making a fantastic product that fails as a business?
Sean Ellis: Yeah, so that was an image product, and I think what it comes down to, the point we were trying to make with that story, is that just having a fantastic product is not enough. If you've got a great product and you're not aggressively growing it, especially in a software business… Your unit economics need to be strong, but in a software business, usually your biggest cost initially is your fixed cost. Until you break through and cover that fixed cost, you're going to be losing money, and so a non-aggressive approach to growing the business means you've got the fixed costs of all of your developers and the rest of your team while you're really not generating very much revenue yet. I think the point we were trying to make with that business is that they actually got to product-market fit, but they didn't make growth a big enough priority until it was really too late. Because growth wasn't a big enough priority, they weren't able to attract additional capital into the business, and they weren't able to break through and become cash flow positive. Compare that to LogMeIn, where we pretty quickly got to the point, once we started scaling, that we became an attractive investment for outside investors, but we were actually cash flow positive. On being aggressive: a lot of people feel that in a SaaS business, especially a freemium SaaS business, when you start to scale you're definitely going to lose money, but I don't think that's the case. As soon as you have product-market fit, you want to scale fast enough to start covering those fixed costs, and then you control your own destiny from that point. I think it was their lack of aggressiveness that ended up killing that business.
Tim O'Keefe: Brilliant. Well, you mentioned at the start there, there was no book on it at that time. Now there is. So we’ve been talking to Sean Ellis about his new book Hacking Growth.
Marcus W K Wong: Did you want to lead into a little bit more about your book as well, Sean? Because I know you co-authored this with Morgan Brown, is that right?
Sean Ellis: Yeah, Morgan and I... Morgan actually helped me get GrowthHackers.com to traction, and we worked on Qualaroo together, a business that we sold about a year ago, so we have a really good working relationship. The other thing with Morgan is that he's this really curious guy. Just out of interest, he was already breaking down the growth engines of other businesses and trying to figure out how they were growing. It's something that I used to do a lot, and when I told him I did it he said "That's a great idea!" and then took it to, like, the next level. We had previously put out a book on startup growth engines, and what we found was that between my experience with Dropbox and LogMeIn and these other companies, our time working together, and all the research that he did, we saw patterns that we really wanted to share in a book, things that we saw companies doing well that we thought we could break down. So, basically, the book goes through a lot of the physics of how growth works, the importance of testing, and why a cross-functional team is much more effective as the company scales than, like, a lone growth hacker. Then in the second part of the book, we go into the specific areas and give guidance around how you should approach acquisition, activation, engagement, and retention. So I think it will be really useful for people.
Tim O'Keefe: Great.
Marcus W K Wong: Great. Alright, guys. Well, thank you very much for your time, Sean. We really appreciate you taking time out of your busy week, especially with this being launch week for Hacking Growth. So, for everyone interested, Hacking Growth is available on Amazon for 35% off this week, for launch week. I highly recommend it for anyone that's looking to grow their business, or even start one. You know, there are tons of ideas and things you can implement today that Sean and Morgan have, sort of, gone through the ropes of. So, yeah, you'll 100% find a link, and we'll share it around as well.
Sean Ellis: Yeah, thank you guys very much. I really appreciate the opportunity to come in and talk about it.
Marcus W K Wong: Alright.
Tim O'Keefe: Cheers!