5. Leigh Honeywell, Founder and CEO of Tall Poppy, The Human Side of Privacy and Security: Online Harassment and Abuse

So welcome, everyone, to the Security Podcast of Silicon Valley. I'm here today with a very special guest, Leigh Honeywell, founder and CEO of Tall Poppy. And prior to that, you were involved as a venture partner at the Pioneer Fund, and still are, actually. Now you're a senior partner.
Yeah, that is a sort of parallel thing. And you were a technology fellow for the ACLU. Excellent work there. And a security response manager for Slack, a security engineer, a senior platform security developer at Heroku, now a Salesforce.com company, and Microsoft. And, oh, you did some independent consulting. Not to mention, you were at Symantec as a malware operations engineer. Wow.
Oh, man, ancient history, ancient history. Let's go all the way back, all the way back. So technology application specialist as well at Bell Canada. Wow, so you really came up through the engineering ranks.
Absolutely. Building phone systems. Love to get your hands dirty. No, we have a great show scheduled for everyone today.
We are going to give Leigh an opportunity to tell us a little bit about her background, sort of the passions that were driving this mission here with Tall Poppy, a little bit about Tall Poppy, and then what it's like. So thank you so much, Leigh. Thanks for thinking of me as a guest, and I'm happy to be here. Absolutely.
So oftentimes as a founder, there's a big part of all of this that's very passion-driven, and sometimes that can come from, you know, childhood. And so I'm kind of curious if you'd share a little bit about your childhood or what parts of that really made you the interesting and very cool person that you are today. Well, thank you. Yeah, I mean, I grew up in a family of lawyers.
Both my parents were lawyers in the Ottawa, Canada area. And so was very much exposed to sort of analytical thinking and a degree of rigor in tackling ideas that I think is a little unusual. I also was a pretty serious athlete when I was growing up. I was a downhill ski racer, raced on a regional team for a few years, many provincial championships.
I missed going to nationals a couple of times by one spot, which always was a bit of a heartbreaker. But my dad ran his own law firm for a number of years when I was very small, and then he was part of various larger firms. And then right as I was leaving for college, he forged out his own path again. And I did IT for his small law firm for a number of years.
On trips home, I would fix whatever was broken, that kind of thing. But I had some exposure to that sort of entrepreneurial spirit from a pretty young age of the idea that you didn't necessarily need to work for someone, you could work for yourself as a professional or as a, you know, whatever the sort of industry was. So that was always like part of my realm of possibilities. But then I went to college, university, as we call it in Canada, and bounced around a couple of different majors, didn't really sort of know what my path was for a few years.
And I actually ended up dropping out and worked at the phone company, Bell Canada, for a little bit building phone systems. This is like 2005, so it's like the early, early days of voice over IP. And we were doing stuff that was like really cutting edge at the time. I remember when that came out, everyone was very excited about it.
And it's so funny to think of like, that was like the cutting edge. And now we spend literally our entire day on...
That's right. We're doing a Zoom, right? That's even how this interview... A distant descendant of IP telephony. Yeah. And yet we still spend how many minutes per hour making sure that everyone's microphone is working and, oh, my camera's cutting out, like got to reboot my computer, yada, yada, yada.
So we're not quite there yet. But I've been very fortunate to have this like pretty varied, interesting career wearing a lot of the different hats in cybersecurity. Nice. And so you come from a family almost of entrepreneurs.
You can really look up to your dad, it sounds like. And then you've got the competitive spirit in you with the downhill ski racing. Very nice. I love downhill skiing, though in Minnesota, there was never any hills worth skiing, really.
Little Bobby Hills. Little Bobby Hills, yes. You have to drive three or four hours to get to the Little Bobby Hills. Yeah.
Okay, so in terms of a truth that you really see in the world that most people miss that sparked or was the inspiration for building Tall Poppy, can you share a little bit about that vision or how it all came to fruition? I love this question because I think it's, you know, what is the sort of core insight? So I spent many years working with people who were dealing with a particular type of online security threat that wasn't really getting addressed at a technical level or an organizational slash policy level. And I'm talking about online harassment, of course. And the thing that I kept seeing over and over again was that fundamentally, you know, when you look at the past 20 years of cybersecurity product development and research, and I say the past 20 years, but it's sort of funny because our field is only like 40 years old, maybe.
Right. If even, like, we're in, we're very much in the, like, doctors have just discovered that childbirth deaths go way down if you wash your hands. Like, we're at that phase of computer security. And, you know, we have this whole industry that's spun up around corporate cybersecurity, but I felt like I was this sort of voice in the wilderness in terms of personal security.
And, you know, we've had antivirus for a zillion years, and then password managers started to be a thing. And then there's a whole sort of detour through, like, VPNs, which is complicated as to whether or not VPNs actually, like, protect individuals, particularly depending on the threats that you're thinking about. But there wasn't really anything else. And I, when I was working with folks dealing with these harassment issues, you know, people breaking into their personal accounts, sharing their personal data, over and over again, I just, it was so clear that there was this gap in terms of both consumer education, but also consumer tools around personal cybersecurity.
And eventually that sort of dawning realization that there was this problem here led me to start Tall Poppy. So you do actually deal with account break-ins and the personal information that is sometimes leaked or the fallout from a...
So our goal is to prevent hacks. And obviously you can never prevent 100% of things, but we've found in our work that so much of the kinds of attacks that you see in harassment situations, it's sort of fairly commoditized security breaches. It's people noticing that you haven't changed your password on a bunch of sites since the Dropbox breach in 2012, and they just take that password, which is like widely shared in credential dumps, and try it on all your accounts. And then they find your, like, I don't know, Tumblr account that has, like, embarrassing content on it or whatever, right?
Like, wherever it is that those, that sort of basic cybersecurity hygiene, like, hasn't been performed. And through nobody's fault, right? Like, this stuff is hard. It's complicated.
It's honestly pretty tedious. And being able to, like, raise that bar, raise that, like, difficulty level by getting folks to understand when they're in a motivated situation because they're concerned about harassment, getting them to understand what are the concrete things that they can do to improve their personal security, that's going to prevent a lot of those breaches. It's going to prevent a lot of those account takeovers, a lot of that shared personal information that you didn't want out there. That little tiny bit of prevention ends up being super, super powerful.
Excellent. No, I really love that. I love that because you're really focused on the people. And I feel like in cybersecurity in general, we oftentimes miss that, that behind all of these stories that there are people.
And we have our accounts and we have our data. And I've talked with a lot of professionals and everyone gets excited about the technologies and the products. But I feel like this focus is really on the people. So.
Yeah, absolutely. And I think, I think it sort of parallels a shift, a wider shift. You know, we've in so many ways, and whether it's like BeyondCorp or zero trust or whatever the latest buzzword is, we're moving beyond this idea of like, there's a laptop and a server and you have to secure them, right? Let's get the pizza box on the network that keeps the devices safe.
Oh, what are we patching the devices? Like, yeah, you have to do all that stuff. I mean, maybe not the pizza box, but like you do need to install the patches, but the accounts on the device, that's what's getting compromised. That's the like juicy target that attackers are going after at a consumer level.
And so that's, that's really been our focus. No, that's spectacular. So were there any specific incidents in your life that led you to see this, this grand insight? Or, or maybe that's, maybe that's a very personal question. No, it's okay.
And I, it's, it's one that I get asked all the time is like, you know, what is your sort of personal experience of online harassment? And I, I, in many ways have been really fortunate that, you know, the, the worst that's happened has been like people like tweeting at my employer, trying to get me fired because I told them to go away or whatever, you know, whatever the sort of like fairly mundane in the grand scheme of online harassment is. I was also a sort of MeToo-before-MeToo whistleblower about an abuse situation. And that resulted in a small amount of harassment for me, but quite a bit of harassment towards some of the other victims. And I think there was like a German newspaper that like called me a whore in German, which whatever, I don't care. Like, I feel like that says more about them than it does about me, like, whatever, guys. And so that, you know, I had some personal experience with it, but very much more ended up being this sort of person who was working quietly in the background with targets, with victims, with survivors, trying to do what I could to help protect them and help them recover from this kind of much more severe situation.
And that was really where the lessons were learned, was in working directly with people who were facing some of the worst kinds of attacks that individuals have ever faced on the internet and learning, you know, having the like privilege of learning from them and being able to roll those lessons into this thing that we're building. Right, so that you can actually not just help individuals, but scale that help to help many, many individuals and build a nice company around it too. Well, that sounds great. Props to you for taking full advantage of all of those, you know, those tough, hard, hard lessons learned and really being the change that you would like to see in the world, right? That's what we're trying for. Excellent. So can you tell me what's been the best day in your journey at Tall Poppy so far?
Oh, man, you sent me this question ahead of time and I was like, I was struggling with it. I was really struggling. Maybe there's so many, you know. I mean, I think there's been so many sort of private wins in terms of the like team environment that I've been able to build, the level of camaraderie of supporting each other as a team when we're dealing, you know, we're dealing with really tough incidents that our customers are facing.
And it's so like there's this sort of conflict of sometimes like the most meaningful days are the worst days for people, right? It's like there's that terrible, terrible, scary incident that a customer is dealing with, the fact that we can be there for them. It's both the best and the worst day at the same time because, you know, there's a terrible thing going on. Maybe it's like we were dealing with this disinformation situation targeting one of our customers a few months ago.
And, you know, it was actually like legit super scary. And we were able to be there for them and help them feel safer and make sure that their information wasn't sort of being spread about them by helping them take down a bunch of stuff preemptively. That's an amazing feeling to be able to say, like, hey, we prevented this situation from escalating into actual in-person physical violence. But it's still, it's very bracing to say, like, hey, this was actually pretty scary.
Yeah, I could only imagine. I guess for myself personally on the harassment front, when I came out in high school, it was a conservative high school. I was in a conservative household. I was sure I was going to lose friends and face harassment.
But, you know, you do those things one at a time. And I was very fortunate that actually I was completely mistaken. But the things that were running through my mind, even though I didn't actually face harassment, were very scary. It's anticipating the sort of fear of what might happen.
And the sort of, there's this like paradox in terms of incident response. Your discipline as an incident response professional is to be the one doing that thinking of like, what are those scary, scariest possible outcomes? And then what can we do to make them less bad? What can we do to mitigate?
What can we do to prevent? And at the same time as you're doing all of this like worst case scenario planning, you're still hoping for the best. And it can be a bit like unsettling when it does actually end up okay. And, you know, I have sort of a quiet hope that like, and maybe scream it from the rooftops hope that, you know, more kids growing up and coming out will have that experience.
I, you know, I am bi and remember telling my folks at some point and them just being like, cool, we love you. Right? Like, and that's great. Like there's never, and I'm very fortunate that I, you know, never had any question growing up that that would be like a big deal.
And so I really hope that we, like, as the sort of generational turnover happens, that more and more people, it's just like, oh, whatever, this is cool. Yeah, I think, I mean, I think that's even starting to happen without generational turnover too. I've seen people change, like many people change in my lifetime, which makes me really optimistic and happy for the future. Yeah.
Right. So I have a family member who growing up was like, I'm not so sure about this whole gay marriage thing. And then watched several seasons of Modern Family and just got it, just like got it. Yep.
Yeah, awesome. Representation matters. Awesome. It does.
Visibility can go a long way. Just a little bit of visibility can take you so far in social change. That's spectacular. So I'm super curious now, Tall Poppy sounds like a great product.
Can you help our listeners understand exactly how your business model works? Do you sell to individuals? Because it sounds like individuals reap the benefits here. So the product itself is definitely pitched to individuals, but we're working on an employee benefits model.
So we've built out this platform that's focused on sort of personal security advice, but we've always wanted this to be something that we scale up by getting companies to pay for it as something that they provide to their employees. So we tend to sell either to individual teams within an organization, whether it's maybe the trust and safety team or the developer relations team, where there's like a real sort of public interface. So if it's, you know, we're working with a specific team, it'll often be one of those.
But in an ideal world, we work with the whole organization because, you know, it may be your very visible public folks that are experiencing these issues, but sometimes it's, I always pick on poor Bob in accounting who has like a bad breakup and his ex shares his nudes and it's just this whole like dramatic, traumatic situation. And what does Bob do? Well, does he go to the EAP, and the EAP maybe refers him to a counselor who maybe doesn't have like this sort of technical expertise? Maybe he, well, he probably doesn't feel comfortable going to the security team. We haven't done a great job, as a security field, like helping people feel comfortable asking, hey, like I'm dealing with this situation. What do I do? Like there's often not that trust with something like a security team. So again, within that organization where Bob's maybe not on the front lines, but they have Tall Poppy, he knows that he can go to Tall Poppy and get support in this kind of like edge case of personal security, personal privacy situation where there wouldn't otherwise be resources. And so it's a B2B play at the end of the day. Primarily B2B. And it's a tool that I could purchase and then use as a service for my internal employees to get support or help through difficult situations.
Do you have any cases where it's a B2B play, but instead of selling it as an internal tool, you would sell it to support something external, maybe a product that the company is releasing? So like, I don't know, a Facebook, for example, to help them deal with online harassment through some of their techniques. Oh, interesting. Yeah.
We've like, there's definitely been sort of, I've had at the back of my mind, like, wouldn't it be cool if when you reported some particular kind of malfeasance on one of the social platforms, one of the recommendations was: consider tightening up your security, here's Tall Poppy. I've got to convince the platforms that that's a good idea first, but it's definitely like, I would love for that to be like a part of that path with any of the major social platforms. I think the bigger vision, longer term vision is figuring out like what that correct B2C play is.
Right, right. And we're starting to do some like research and experiments there, but we don't know exactly what the right sort of go-to-market shift is. And so for now it's primarily going after that, that B2B market, particularly the incident response piece.
We know that it needs to be a lot more automated, a lot more like chatbot focused. And then the real challenge is like, how do you maintain that sort of empathy and connection and supportiveness when it's literally like a robot, right? Like, is that even possible? I mean, an empathy bot of some kind.
Maybe that will be a new innovation too. You could sell it as a service. Someday, someday. Yep.
Yeah. And I think there's, you know, the, the consumer piece also plays into this question of like, what is, what can we do as a company basically providing like self-defense tools? Like these are tools to keep your account safe. And where is the line of responsibility where the platforms actually need to make their own platforms safer?
Right. And, you know, there's always sort of limits to things that are solved with education versus solved at a structural level. And, and I, like, I don't have, I don't have all the answers on this stuff. These are just some of the things I've been thinking about.
No, these are important questions. And I'm super happy that someone of your caliber is thinking, thinking through them and helping, not just thinking through them, but actually using those answers and your insights to help build a better world and a better future for all of us. When you think about going B2C, I'm sure that you're focused on the positive experience and approach that puts the human first and empathizes with the concerns. So I don't know, is that a five-year plan or so, maybe?
Yeah. I mean, I think the, hopefully more like, hopefully more like two to three. If I had to distill our design philosophy down to two core things. One is not putting ourselves in the user's security context wherever possible. So we don't currently ask for any third-party permissions to apps. We don't have access to any apps, any sort of third-party content. When we do do that down the road, we design our access in a least-privileged way.
So thinking about, is there a specific API that gives us security information? Can we just look at metadata rather than looking at content? What are the ways that we can minimize how intrusive any sort of access that we have is, versus like, just give me your whole OAuth token, give me everything in your account? We don't, we don't play that way.
The second core principle is, we don't want to do the like shadowy hands over keyboard, hoodie, dark, scary hacker thing. We want to be like your smart security friend that you come to for advice, right? That you know, that you trust, who's going to give you the real deal, who's not going to like bullshit you or sell you products that you don't need, but also is going to like point you in the right direction to stay safe. And I think that, that sort of aspirational security, trusted advisor kind of thing versus the like, oh no, the world is on fire.
Your passwords have been hacked. Like that is, that's a really core sort of behavioral thing that we're trying for. I love it. And it makes me feel warm and fuzzy on the inside, even just hearing you talk about it.
So I, I do really appreciate like taking a shitty situation and spinning it so that something good can come out of it, and seeing the silver lining in all of it. So. Yeah. Well, you shared a little bit about your best day there at Tall Poppy.
Do you, is there anything that you'd like to share in terms of a worst day that you've had at Tall Poppy? I think I said it was both the best and the worst day. It was, it was the best and the worst. I was totally trying to get both of those answers.
You nailed it. You did. It's that media training coming to good use. I saw what happened.
Well, I mean, I think the, the like the hard days are, I mean, I think the hardest days are the ones where frankly, like we can't fix everything, right? Whether it's, you know, somebody's being harassed online and the attackers are actually like technically sophisticated, using strong anonymity. And, you know, sometimes there actually isn't anything for us to do at a like technical level. We can still provide sort of moral support and risk, like risk analysis.
And like there, there is stuff for us to do, but in terms of the sort of concrete ability to like bring someone back to a sense of safety, we're not, we're not all powerful. We don't control the internet. My name is not Jack Dorsey. I can't ban people from Twitter.
Not that Jack like personally bans people from Twitter, but you know what I mean? But right. Of course. I mean, there are limits to every system and this is just one of them sometimes, but nonetheless, it sounds like you do make a positive impact in people's lives and some of the toughest situations that they may be facing.
So. Yeah. I think that's, that's where it's really meaningful is when the shit really hits the fan and we can actually be there for someone and, and just reduce the burden, reduce the impact. So, you mentioned people being anonymous on the internet.
Do you find that people are using tools like Tor, the onion router, to hide where they're coming from? Or do they just use like fake names and it's hard to track them down? Oh man, there's just a real spread in terms of the tactics that you see from, from attackers, and what kind of attackers. And there's sort of different classes, you know, in the same way as your malware groups have different signatures. Whether you're dealing with like an individual who's stalking someone over like a romantic obsession, versus a large group of disaffected fans, versus neo-Nazi or other extremist groups.
Like each of these different types of groups and types of individuals has sort of, sort of signature moves. And the, the level of technical sophistication varies extremely widely, all the way from like doing stuff from their home IP address, without any sort of obfuscation, under their real name while logged into Facebook, all the way to anonymous remailers, Tor, you know, the tools that you mentioned, hacked proxies and stuff like that, where there, there is like actually like a degree of technical sophistication, where they've gone to, they've gone to great lengths to cover their tracks.
I think one of the things that is interesting in assessing the threats to someone is thinking through like, what do the tactics say about the motivations and the, and the self-belief of the person perpetrating this kind of harassment? Do they, do they know that what they're doing is kind of wrong, and that's why they've gone to like great lengths to cover their tracks? Or are they so convinced of their own righteousness that they're like sending death threats from their actual Facebook account? And those are, you know, things that we look at and things that we think about in making threat assessments.
Wow. Do you often work with law enforcement in some situations that involve things like death threats? We definitely do when it's to the degree of things like death threats. Typically, we will act in more of an advisory role where we're working with the client and they are reporting directly to law enforcement themselves.
And we will often sort of advise on how to be most effective at interacting with law enforcement. You know, we're just a tiny little firm. We're not like fancy. We don't have a big reputation.
Law enforcement doesn't, you know, know us from a hole in the wall. So we do tend to advise folks to talk directly with law enforcement rather than trying to like proxy through us. Right. But we have a pretty established track record at this point of helping people be more effective at that because, you know, you go to law enforcement and you say, like, I'm getting death threats on the internet.
They're like, oh, well, don't look at the internet. And that's not really helpful. Getting it taken seriously requires a degree of documentation, requires like an understanding of which parts of law enforcement are going to be more effective at tracking this stuff down or not. And it's also further complicated by who feels like they have the ability to actually interact with law enforcement in a safe way, which unfortunately tends to, you know, the folks who face discrimination at the hands of law enforcement are also often the folks that are facing threats online.
So that just reminds me of everything that happened recently in Belarus. And that's an entire country facing a very tough situation. Right. And I would imagine that online censorship and online harassment is a huge part of those campaigns, I guess, for lack of a better word.
Yep. But yeah, I mean, you know what we're seeing in Hong Kong right now, what we're seeing with internet shutdowns in various countries related to elections, like we're in a fragile time in terms of the intersection of democracy and the internet, whether it's, you know, what we saw in the States with disinformation in the last election. I was actually just reading about the Democratic Party in the States, their like new disinformation team that's been doing a bunch of work around like election misinformation and disinformation and sort of the subtleties there. And it's definitely not, it's not the focus of our work.
It is definitely adjacent to the work that we do. And I think that a lot of the, a lot of the folks in the, in the trenches of disinformation are very much fellow travelers. Oh, definitely. Because I could see how misinformation or disinformation is tied right in with the hacking and the account takeovers and the leaking of emails and all of that stuff. Yeah. What a mess. Well, okay.
And I'm super excited to hear about the B2C long-term vision. But if you look into that future and you look out at your grand success of selling direct to consumers and helping establish a stronger sense of privacy or personal security, what does success for Tall Poppy look like in that future? I think the key piece there is really that everyone should have access to affordable expert advice on personal security. You know, it's, it's like relatively easy to go to Best Buy and get like a piece of your hardware replaced, but that sort of next level step of like, how do you keep this all safe?
There's a lot of bad advice out there, right? And getting, getting that quality expertise that is broadly accessible, whether you're a Windows user or a Mac user, iOS or Android, there's just, in terms of what a consumer has access to, like, man, there's so much crap. There's so much crap in the market, whether it's like people selling anti-malware on iOS. Like your, your iPhone's not getting hacked. What's getting hacked is your iCloud account, because you're reusing your password, right? Like, you know, pretend I'm holding up an iPhone. It costs like one to three million dollars to hack into this with like a, you know, one-click exploit, because it's the same bugs that are used for jailbreaks. So nobody's like hacking into your iPhone as an individual random human unless you're literally Jeff Bezos, right?
Like your iPhone's safe. If someone is in a situation where they're like worried about their device being compromised, like what are the actual vectors that they're not thinking about? And it's typically accounts. Yeah.
That's the stuff that we share up in the cloud, right? It's the things that leave the device and end up in the cloud. You know what they say about the cloud. It's just somebody else's computer.
It's definitely someone else's computer. I'm not sure whose. It depends on your vendor and depends where that stuff goes. And it's even the privacy laws associated with that.
Yeah, with all of that data, it can get very complex very quickly. It's a whole mess, for sure. Yep. Well, we should also give a shout out to StartOut.
Oh my gosh, how have we not talked about StartOut yet? I don't know how we missed this. StartOut is an amazing community. They have Network Q, which is an LGBTQ founder network.
I know that you were in Growth Lab, right? Tall Poppy was...
Yeah, the StartOut Growth Lab is the world's only LGBTQ-focused startup accelerator. We were in the fourth cohort in spring 2019, and you guys are in...
Yep, Peacemaker is the eighth cohort. Amazing. Summer of 2021. Summer 2021.
Amazing. Just getting out of COVID. Yeah, yep. So it's a wonderful program.
It's six months long, so longer than the sort of average accelerator. And it's like a tight-knit group. For us, it was super powerful having that like really tight-knit group that got to experience over like an extended period of time seeing each other's successes and struggles and learning from each other and really, really... And working through it together. Working through it. Exactly.
Yep. Well, I've really noticed that the authenticity level in this group is just through the roof. And typically, you know, especially among founders, and I get it, you have to play a role. Very oftentimes, you have to do a lot of virtue signaling and all this other stuff. And it's just, you know...
Yeah, I feel like people let the hustle mask down a little bit.
They do, right? You get to know each other. But you still work through those tough problems and learn a lot together. So spectacular program.
Yeah. Cool. Leigh, well, thank you so much for joining me. It's been a spectacular show.
It was fun. It was very fun. And we'll have to do it again. Maybe in a year or two when you...
That'd be super fun.
...realize that B2C play and take over the world with one secure...
Take over the world, or prevent the world from being taken over. Or enable everyone to have a stronger sense of...
To take over their own world. Yes, exactly.
There we go. I like that. I'll buy it. I'll sign it.
I love it. I love it. Amazing. Do you want to do a sign off or you've probably got like an outro?
I don't usually. I just end them quite abruptly. No, but I'll just say thank you again. And it was a great pleasure to have you on the show.
And it's really lovely to get to chat in a little more depth. And thanks a lot for listening. Thank you.