36. Feross Aboukhadijeh, Founder and CEO of Socket.dev, a startup improving security and privacy on the web

Hello, everyone, and welcome to another episode of the Security Podcast in Silicon Valley. I'm here today with Feross Aboukhadijeh. Did I get your name right? Yeah, you did.
Awesome. Feross is the founder and CEO of Socket, a cybersecurity platform that protects companies' software supply chains against attacks. Specifically, I think JavaScript, right? We actually do more than JavaScript now.
We've added Python and Go, and we're on a roll continuing to add more. But yeah, we started with JavaScript because that's where most of the worst-offending open source code tends to be. It looks like you were an intern at Intel, at Facebook, at Quora, all big-name companies I'm sure everyone is familiar with. You have a Bachelor of Science in Computer Science from Stanford, and you also did some graduate studies there.
Not to mention, it looks like you really took a leadership role at Stanford, being a teaching assistant and section leader. And then you started founding your own companies right away. You had PeerCDN, a next-generation content distribution network powered by WebRTC that facilitates efficient peer-to-peer content delivery. That sounds awesome.
You did that way back in 2013. That's right around the time WebRTC came out, wasn't it? Yeah. Yeah.
I was fascinated with the web browser and all the new capabilities that were getting added into it. WebRTC, WebGL, there was all this stuff that was so exciting. And I felt like each one of these things was going to lead to some type of important new company. So I wanted to work with WebRTC because peer-to-peer is cool and distributed systems are cool.
And yeah, so PeerCDN was what we built. And you ended up selling this company to Yahoo and you went over to Yahoo for a while and helped integrate and build, I assume, like your Peer-to-Peer WebRTC CDN into their ecosystem for a billion plus users. Yeah, it was less them integrating the technology and more us just helping with our skills, to be honest. So I was hoping they'd end up using the tech a bit more.
And that's actually what led to me leaving and working on open source afterwards. I really wanted to take some of the insights from PeerCDN and I really wanted them to see the light of day. And after Yahoo had acquired PeerCDN, it wouldn't have been possible to use that technology anymore. So I started from scratch on WebTorrent, which I think you were about to get to.
Yeah. So you were an open source software developer on WebTorrent. What was that like? Yeah, WebTorrent is basically a way to share data directly from one browser to another.
And WebRTC really made it so that for the first time you could do that. You could actually connect directly from my computer to your computer without going through a server. Before that, the web was designed around a client-server model, where you have an HTTP server and these browsers that are basically subservient clients that can only make requests to the server, and the server sends back HTML.
And that's basically the model. But WebRTC gave us the ability to open a socket between my browser and your browser. And so there's all these ideas of what you can do with it. The idea that I got really into was being able to do BitTorrent over those connections.
So building up a network of peers that are swarming around a file and sharing bits of it, the whole BitTorrent story that people might know about. And so that was what WebTorrent was: a JavaScript implementation of BitTorrent that used WebRTC as the transport instead of the traditional TCP or UDP sockets that BitTorrent used before, because those don't work in the browser for security reasons. Yes.
Yeah. It's been a great project. I worked on it for a really long time. And now, after, I guess, going on 10 years or so since I started it, there's a whole bunch of other people working on it. It's not just me.
There's an awesome community. And now it's built into even the official BitTorrent clients. We changed the protocol and they adopted it. So now when you torrent stuff in any of the major clients, including µTorrent, the big one from the official company, you're actually sharing that content with browser peers.
So people who are on websites trying to download that content will connect to your BitTorrent app. It's really cool. That was the vision, and now it's actually realized.
So to see it come to fruition, that sounds very special: that success right there, the adoption of the technology, and changing the ecosystem of how the underlying protocol works. Right. And you also spent some time as an open source software consultant with Brave. The newer browser, I'm not going to say new anymore because they've been around for a while, but the Brave team is awesome.
They've been trying to innovate in the browser space in a big way. And one of the things they wanted to do: just like how when you view a PDF file in your browser these days, most browsers will just show you the PDF and won't kick you out to an external application to handle it, their idea was, well, what if we did the same thing for torrents?
So you click on a torrent link. Why does that kick out to some external app? What if we just loaded it directly in the browser? And so they used WebTorrent to do that.
And so I was helping them integrate it. Oh, nice. And so you're helping bring the WebTorrent stuff into the Brave browser ecosystem. Correct.
Yeah. So now all Brave users have WebTorrent built in to their browser out of the box. And you've also spent time helping with StandardJS. Yeah.
So StandardJS was a project to help with the code style of PRs we were getting to the WebTorrent project. I just got tired of giving style feedback to people, like, hey, this is a great change, but can you please change this? I have a kind of weird coding style, or it was weirder in the past but it's gotten more normal, where you omit semicolons in JavaScript to make it look more like Python, and it actually works fine. I got tired of giving that feedback in all of our pull requests.
And so I just made a library to enforce that. It's built on another linter under the hood called ESLint, but it's basically an opinionated set of settings.
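In practice that means a project using StandardJS needs no linter configuration at all; the only setup is a dev dependency and a script (the version number below is illustrative):

```json
{
  "devDependencies": {
    "standard": "^17.0.0"
  },
  "scripts": {
    "lint": "standard"
  }
}
```

Running `npm run lint` then checks the whole project against the fixed rule set; there is no `.eslintrc` to write or copy between projects.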
And the kind of funny story behind the name is that I was going to call it something like Feross's style, or just some random package name. And then I wrote a little script to see what names were available on NPM that were potentially relevant to this type of project. And the name Standard came up as being available. And I thought, oh, this will be funny.
I'll call it JavaScript Standard Style, as if my style is the standard style that everybody should use, as a joke. And then there were two reactions to this. First of all, a bunch of my friends and fellow open source maintainers were like, oh, this is amazing. Now I never have to give style feedback on PRs anymore.
I don't really care whether I agree with every decision. I just like that there's a tool I don't have to configure, because the other linters require you to write like a 200-line config file to get going. Right. And to select what your preferences are.
And then you have to copy that into every project you work on. So I inadvertently solved that problem in an easy way for people, where they would just install standard and then... And you gave it a very descriptive name. It's just like, OK, here's where we're going to start.
It's the standard. Yeah, exactly. Very nice. And then a bunch of other people were like, hey, that's not cool.
You're not the standards body to be able to say that it's a standard. And I'm like, yeah, well, LOL, that's what I'm doing. There was a bit of trolling going on, but yeah, a bunch of people liked it.
And I think it's like the second most popular style now after Airbnb's style in the JavaScript community. At least it was last I checked. I don't know what it is now. But yeah, so it was pretty cool.
And people really get a lot of value from not having to have those style debates on their teams anymore by using it. Yeah. That's the main kicker for me. Every time I have a discussion around style, it's like, well, what's already there and let's just keep using whatever it is.
And I'm not really going to be dogmatic about where I think things should go. I'm usually stepping into a project or code that already exists. Right. Just want to maintain readability.
Totally. Yeah. And there's always the teammates who want to go in and tweak it, like they'll sneak a style change into some other unrelated pull request.
Oh, they go into their IDE, and they have a slightly different style configuration, and every file they look at automatically gets restyled. And then their pull requests are just massive, and you try to find the actual semantic change and it's impossible.
So it's funny. And now you're not only the founder and CEO of Socket, but you're a visiting lecturer at Stanford University. It looks like you started the course Web Security, CS253. Yeah.
So Stanford used to teach this course when I was in undergrad, and it was my favorite course by far. It was my introduction to web security, and I just loved it. And when I went back to do my master's, they had stopped teaching it.
And so I was... You're heartbroken. I was heartbroken, because I was like, this was the best. This was so fun.
It was one of these classes that was so special. Literally, the final exam was: here's a VM running a vulnerable e-commerce website, and we were supposed to run this thing and find every possible way to hack it, basically to do a pen test of this e-commerce site. And we had 24 hours.
It was so fun. I've never had a final exam that was that cool and that fun. And I think I found 13 or 14 vulnerabilities in it. And you would get extra credit if you found something that wasn't...
Wasn't already known. Wasn't already known. Yeah. Because they had added those 13 or 14 in, and I'd found one that wasn't inserted by them.
And that was, that just felt really cool. And to be doing that in school. And so I loved it. And then they, they said, why don't you teach it?
And so I had to redo all the content, because it'd been like 10 years since it had been taught, and web security has moved on. It changes big time over the course of 10 years, doesn't it? Yeah. It's shocking how much better it is.
And also how much more complicated it is. You can't just explain, oh, here's how cookies work and here are all these attacks you can do with them. Now you have to explain all the attributes and all the different settings that affect how they work, and all the new restrictions for ads and tracking.
And it's insanely more complicated than it was back then, but more secure, hopefully, after all this work. How did you get into security? I really hear the passion in your voice when you're describing these things. Is there a story about how it crept into your life and became important to you?
I mean, I don't know the moment it happened. I do remember this one story from my childhood that my parents like to tell, which maybe shows the security mindset. I don't know, it's a stretch maybe. But this was when I was really young. I think I was like five or six or something.
My parents had gotten this new microwave, and I took the manual for it. I used to love reading manuals back then, because, you know, there wasn't YouTube, there wasn't the internet. You weren't overloaded with content. So when you got a new product or whatever, you'd read the manual.
I don't know if people remember those days, but... I do remember those days. You'd read it because, like, I want to know about this thing, and there are no Google searches or YouTube videos or podcasts where people are going to be talking about this stuff.
You'd read it. So I read the manual, and I learned that there's this child lock feature, and child lock lets you lock the microwave unless you hit a special sequence of buttons. And so I would use that on my parents. Whenever I was mad at them, I'd lock the microwave, and then they couldn't use it. Oh, they couldn't use the microwave.
And instead they had to cook the old-fashioned way on the stove and in the oven. Or maybe just a bowl of cereal. Yeah, exactly. So I definitely remember doing that a lot. And I don't know if that's a security thing.
Did they figure it out? Like, oh, it's locked again. All right, what's the code?
They would just yell at me and say, unlock the microwave. And I'm like, no, I'm mad at you. That sounds like a healthy dynamic that I'm sure a lot of our listeners could relate to with their own parents as well. Yeah.
So that was, I guess, like the start. But I don't know, I think security is a mindset, right? It seems like a lot of security people have this devious mind where they always want to figure out how to break things in a way.
I don't know if that's everybody, but definitely, you see a set of rules and you think, well, not that I necessarily want to go and exploit these, but what are the exploits there? And you just see them because your mind works that way. Yeah. Yeah.
They're like little Easter eggs, maybe, things that are there and exist but weren't meant to be there. And they violate the spirit of how the system should be working. Yeah, exactly. Exactly.
Like the movie The Matrix or something captures some of that spirit pretty well. Yeah, exactly. And that was one of my favorite movies growing up, like probably for a bunch of the listeners. But yeah, it's like there's the written rules and there's the unwritten rules. Yeah.
So the written rules are what everybody thinks are the rules of the system, and everybody hopes that those are the rules governing things, but there's usually a separate set of rules. In the case of computers, you could think of the specs as the written rules. And then the actual code is the unwritten rules.
So the code is the actual source of truth for what you're going to be able to do with a system. But everybody's operating in this realm of, well, this is how it's supposed to work. If you go look at the unwritten rules, you go look at the code, and you find a difference between the written and the unwritten, then that's an exploit, or that's a hack. And spotting that is what it means to have a security mindset.
I think. I agree totally. 100%. That's the difference between the expectation and the reality.
And there are little hiccups between those two, and there's an art to discovering them, right? You have to understand the mechanics, and there's a science to how computers work. To be good at that art, you have to understand the basics. But within that space, you can be surprised.
And those little surprises can be little security exploits. Speaking of security exploits, what are some of the surprises that Socket helps prevent or mitigate the risk around? What does Socket do exactly, for our listeners? Yeah, yeah.
Socket is a tool that helps fight vulnerabilities and malware and supply chain attacks that come from your open source dependencies. We built it to provide visibility, defense in depth, and proactive protection from attacks that originate in your dependencies. Basically, the reason we started Socket was we saw a gap in the way people were thinking about their dependency security. Most people, and I would say actually the security industry in general, are really obsessed with vulnerabilities, with trying to get rid of known vulnerabilities.
And that makes some sense, right? They're a good baseline. Fixing vulnerabilities is a good thing, I'm not saying it's not, but it's overwhelming.
Yeah, so many, maybe sometimes. Yeah, exactly. There's so many. And a lot of the time, they're pretty low value, like they're not actually exploitable.
The severities are overstated. And there's definitely an incentive in the industry to find as many of these as possible, so that these vendors can say, oh look, we have thousands and thousands of vulnerabilities in our database. But in terms of the security impact, I would estimate that probably 95% of CVEs are not even worth fixing. They're just noise.
And the teams that are spending their time fixing these are just doing busy work a lot of the time. You're burning the capital of whatever company you happen to be in if you're spinning your wheels running down these rabbit holes, chasing CVEs that probably don't even have any path from the outside world to be executed or exploited. Yeah, exactly.
And so I was working on a project, and this is actually another website that I had built before Socket, which is called Wormhole. And my friend and I were building it. And the idea is it's an end-to-end encrypted file transfer application. So it had high security requirements.
We didn't want to see anything about what our users were doing on the service. And we did everything right in terms of security. We had really strict code review, a strict content security policy. We were very careful when choosing our dependencies.
We used all the standard security tools. We did all this stuff. But despite all this, we still had a dependency tree with thousands of dependencies in it. Right.
And probably a lot of the listeners are familiar with this problem. It's a particularly bad problem in JavaScript and NPM, where you end up with just these huge dependency trees. And it was something like we were using Dependabot, I think, to keep them up to date as well and to help fix CVEs. And a lot of folks are probably familiar with that.
Yeah, you're doing the right thing there, right? Or trying to. Yeah. But something like 5% of our app's total lines of code were churning every week by doing that.
And so if you think about that, your app is 90% open source, and you're merging these PRs all the time that are changing the dependency versions. Right. You're actually churning a huge percentage of your app's code every week. And most people, I would say probably literally everybody, don't look at the code that they're merging.
They're just clicking merge on those PRs and letting those dependencies change, assuming it's going to be better. And so we had this nagging feeling that we weren't doing right by our users by doing that. And we also realized we were reviewing our own code. I was reviewing my teammates' code more closely than I was reviewing the code written by these random people on the internet who were making my dependencies.
Right. And I felt like there was something off about that as well. Right. No, that, okay.
So the problem makes sense. We have all of this code out there. We want to use all of the great work that has been laid down as a foundation we can build on top of. Often that shows up as open source projects, and it doesn't even matter if you're using JavaScript or Go or Java or Python, whatever: you're pulling in all of these dependencies from the open source community, which is great, but it also opens up the possibility of, well, what if something malicious ends up coming into your project?
And a really scary piece is: OK, if you're at a big company, and someone knows you're using a very particular JavaScript library, and that library depends on three other libraries that are all open source as well, and one of them is maybe lesser known. Could you just contribute something into that lesser-known library and introduce a vulnerability, and really mount a targeted attack on a very juicy target with a lot at stake? I felt this effect myself, because I was a maintainer of WebTorrent and all these other libraries that I ended up writing while I was working on that project.
So I was a full-time maintainer for, I don't know, five or six years, just doing this. And I ended up having my code in so many people's supply chains. And I was a random twenty-something. I cared about security and I was doing my best, but just thinking about how far that reach is that any random maintainer has into almost every company's supply chain, it's pretty massive.
My packages alone are downloaded something like 500 million times per month on NPM, and that's a hundred packages that I have access over. There are people who have created way more than that, like 500 or a thousand. And there's one person I know whose one account actually controls 17% of all downloads on NPM.
He's an amazing maintainer, and I'm sure he's an amazing individual too, who would never do anything malicious. But in the off case, like, you lose your laptop or something like that, you can also have a security breach that way.
Right. Right. So there's that aspect. And then there's also just dependencies getting compromised by their maintainers.
So it's very normal in open source that over time, new maintainers are added to help contribute to the project. Maybe you get tired of maintaining something and you give it to somebody else to take over. That's actually very normal and very healthy. But there have been cases, and the most famous one that I remember, the one that first put this problem on my radar, started in 2017: a package called event-stream was compromised. The story behind it was that this maintainer named Dominic Tarr was not interested in maintaining the library anymore.
He was one of these really prolific folks. He had, I think, something like 500-plus packages that he had written, a very creative person, but this was one of his older ones. And it had millions of downloads per month. And someone approached him and said, I'd love to take over maintenance of it.
And he did what you normally do, which is to say, yeah, I'm not using it, I'm not taking care of it anymore. Sure, here you go.
And this person was nefarious. They waited a month. They made good changes for about a month, and then they snuck in a backdoor. And this was hyper-targeted at one particular company. Right.
Okay. Yeah. They did exactly what you described. They literally knew that this dependency was in a chain of dependencies used by the company they wanted to attack.
And they put in a backdoor, and the way they did it was super sneaky. It was designed to only execute in the particular environment they wanted to target, so nobody noticed it. It was heavily obfuscated, and it basically used the description of the top-level project.
It would pull that out and use it as a decryption key to decrypt the malware. And what that meant was that for everybody else running it, decryption would fail and nothing would happen. But in that one environment... Yeah.
You had the wrong key. Wow. The wrong key. Yeah.
And so it was very clever, and it basically caused this company's desktop app to get backdoored, and they shipped it out to all their users. And then a bunch of users were affected. In that case, it was a crypto wallet, and so they had all their crypto stolen.
Typical, right? Crypto stories. But when you have money on the line, like you become a target, right? It's a theme.
The hackers, they run a business too, and they go after the lowest hanging fruit. And oftentimes you don't need to have a perfect solution. You just need to raise the cost of the attack to make that payout not worth it from their perspective. Yeah.
That's maybe, I don't know, one way to think about folks that go after hacks and that hacker mindset. But in some sense, their time is just as limited as ours. And they're running a business. Some of them are running a business.
Some of them do it for the street cred or the respect or whatever you want to call it. And yeah. Yeah. And in this case, I think people hadn't paid much attention in the past; they weren't really aware of the massive opportunity in open source for their attacks.
So people started realizing it around that timeframe. And since then, Socket now monitors every new package that's published to NPM, PyPI, and the Go ecosystem. Those are the three we do today. We have a live threat feed in our internal company dashboard that shows us all this stuff we're catching and blocking.
And it's like every few minutes there's stuff we detect that's outright malware. It's literally someone compromised a package and added something to it. Or, the most common attack is actually typosquatting, where they'll register a name that's similar to a popular package's name and hope that people are going to typo the install. Just typo the install.
Yeah. Yeah. And I've done that before. I've actually typoed installs before, but I always rush as fast as I can to hit Ctrl-C and cancel the install.
And now that's one of the things Socket can help with: you don't have to do that anymore. It'll actually intercept the npm install and tell you, hey, it looks like the thing you're installing has 10 downloads, and there's a package that's one letter different that has 10 million downloads. Maybe you meant that one instead.
And it'll ask you if you're sure before you proceed. That's awesome. So what Socket does is it takes all of the open source community contributions and runs them through a static analyzer, looking for known malware patterns.
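That install-time warning boils down to an edit-distance check against far more popular names. A toy version of the idea (the download counts here are made up, and Socket's real heuristics are certainly more involved):

```javascript
// Tiny popularity table standing in for registry download counts (made up).
const downloads = new Map([
  ['express', 30_000_000],
  ['lodash', 40_000_000],
  ['react', 25_000_000],
]);

// Classic dynamic-programming Levenshtein distance.
function editDistance(a, b) {
  const dp = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,       // deletion
        dp[i][j - 1] + 1,       // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1) // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

// Warn when the requested name is one edit away from a vastly more popular package.
function checkTyposquat(name) {
  const requested = downloads.get(name) ?? 0;
  for (const [popular, count] of downloads) {
    if (popular !== name && editDistance(name, popular) === 1 && count > 1000 * (requested || 1)) {
      return `"${name}" looks like a typo of "${popular}". Proceed?`;
    }
  }
  return null; // nothing suspicious about this name
}
```

An interceptor wrapping `npm install` would run this check and prompt before continuing whenever it returns a warning string.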
And then does it do any dynamic analysis? How does it flag something as interesting or raise it for maybe a human to go look at? And do you look at them yourselves too when they get flagged? Yeah, we do that.
We definitely have humans looking at everything that we flag. But before that, we also have a lot of automation involved here, because we have to analyze every package in all these ecosystems in real time. Right.
And the scale is so massive. So the insight I can share here is this: I started seeing these supply chain attacks happening in the ecosystem starting in 2017. And every time one of these would happen, I'd look at the attack code and I'd say, my goodness, if somebody had just looked at this diff for one minute, they would see a very obvious attack right there in front of their eyes.
Like, there's a giant blob of obfuscated code added to this file; it's obvious that something is weird about that. Or, oh look, they're running eval on a giant base64-encoded string to run this clearly obfuscated code at runtime. Or they're making a network request and reading the entire list of environment variables and sending that off over the network, and before this version of the package, they'd never needed the network.
They never needed to access the environment variables. The package never needed those permissions, in some sense, right? It never needed those capabilities. And then all of a sudden, in this new patch version, and patches and minor versions are supposed to be pretty small changes, not significant changes.
Suddenly they're claiming in this patch version that they need all these new permissions, right? Yeah. Okay. So it sounds like you wrote your own heuristic of what risk looks like.
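The "new permissions in a patch release" signal can be modeled very simply: track which capabilities each published version touches, and flag a patch bump that suddenly gains one. A toy sketch (in reality the capability sets would come from static analysis of the package's actual code, not a hand-written table):

```javascript
// Toy capability sets per published version (in reality, derived by
// statically analyzing the package's code for network, fs, env, etc.).
const capabilities = {
  '2.4.0': new Set(['filesystem']),
  '2.4.1': new Set(['filesystem', 'network', 'env']),
};

// Patch bump: same major and minor, larger patch number.
function isPatchBump(from, to) {
  const [a, b] = [from, to].map((v) => v.split('.').map(Number));
  return a[0] === b[0] && a[1] === b[1] && b[2] > a[2];
}

// A patch version is supposed to be a small change, so capabilities that
// newly appear there are a strong supply-chain-attack signal.
function newCapabilities(from, to) {
  const added = [...capabilities[to]].filter((c) => !capabilities[from].has(c));
  return isPatchBump(from, to) && added.length > 0 ? added : [];
}
```

Here `newCapabilities('2.4.0', '2.4.1')` would surface `network` and `env` as suspicious additions for a human (or an LLM pass) to review.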
And you can find that through static analysis by encoding some of those patterns into an initial first pass, and then folks look at it. Correct. And when you say folks look at it, part of what we do is a bunch of heuristics, but also, and I hesitate to say AI, because everybody uses it as a buzzword in security.
You're trying not to jump on the AI bandwagon. Correct. Yeah. For most of security, I would say, AI has been this snake oil thing. A little bit.
Yeah. Maybe. Yeah. And I don't want to be in that realm by any means. But I have to say, with the new LLMs that have come out, Socket's been paying really close attention to the LLM tech, and we were playing around with GPT-3 back before ChatGPT had even come out and gotten everybody excited about it.
And we were playing around with using GPT to explain package behavior. So when we see a network call, we can actually use the reasoning capabilities of the LLM to ask: what is the reason for this network call? And what data does it seem like it's sending out? Being able to analyze the source code and figure out the purpose of each of the different permissions being used is actually pretty cool and pretty useful.
So it's not just that a heuristic fires and then we have a human look at it and do the analysis. Within a minute of a package being published to one of these open source registries, we run our heuristics using static analysis, and then we also put it into this LLM.
And if it's confident enough that it's malicious, we actually block it right away. So we have auto-blocks within minutes of publishing. And it's pretty cool. It means it's the fastest possible way to block this.
Yeah. Yeah. And so we receive the benefit of all that hard work you've put in, the blood, sweat, and tears you've poured into Socket, figuring out what those heuristics are. We use LLMs to accelerate our judgment calls on the obvious cases.
It sounds like it's a command line tool that you install as a developer in your local development environment, to help you manage and maintain your local installs of dependencies across your projects in the supported languages. So we definitely have a CLI, but actually most users use our GitHub app. And the reason is that we're listed in the GitHub Marketplace, so it's a really easy install.
It's literally two clicks to install it. So if you use JavaScript or Python or Go and you want to try out Socket, it's a two-click install, and then it'll add itself to all your GitHub repositories and you can start playing with it. There's no need to go repository by repository and change the way you do your CI, or add a new GitHub Action to every single repo. It's a two-click thing.
And then boom, we're on like all the repos. And then what we do is when we see any pull request that is adding a new dependency or updating a dependency, those are the two times when you're bringing in new third party code. And so we will analyze those changes. And we do it where we never want to bother a developer with risk that they didn't cause themselves in that pull request.
So anything that you already have that's risky, we're going to ignore and only tell the developer in their pull request. Here's what you are doing wrong or like what you could improve in this pull request. So here's a package that looks like it's a typo or here's a package that's malware. We're going to block this PR.
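The "only flag what this PR changed" behavior described above boils down to diffing the dependency manifests between the base and head of the pull request. A minimal sketch (illustrative, not Socket's implementation):

```javascript
// Diff two package.json objects and report only dependencies that this
// change added or updated. Pre-existing, untouched dependencies are
// never reported, so the developer only sees risk they introduced.

function diffDependencies(oldPkg, newPkg) {
  const before = oldPkg.dependencies || {};
  const after = newPkg.dependencies || {};
  const changes = [];
  for (const [name, version] of Object.entries(after)) {
    if (!(name in before)) {
      changes.push({ name, version, kind: 'added' });
    } else if (before[name] !== version) {
      changes.push({ name, version, kind: 'updated' });
    }
  }
  return changes;
}
```

Each entry in the result is then what gets analyzed and, if necessary, surfaced as a comment or a block on the PR — e.g. a newly added `lodahs` (a hypothetical typosquat of `lodash`) would show up as `kind: 'added'`.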
And here's a new one. Like here's a change. It looks like suspicious, like careful. Yeah, I like that.
So it's right there in line in their workflow where they're spending all their time. They don't have to go out to a separate tool. They don't have to do anything complicated. It just slots in right there and it just helps them do better.
Like it sounds so easy. That sounds so easy. I think like security done correctly fits into like our daily lives, fits into the processes that we already have and is almost invisible except when you need to flag something and there's an action required. So nice.
Yeah. So simple. Yeah. So that's the main way people use us.
But you're right, though. We do have that CLI. And as I mentioned, intercepting installs on the command line is actually really cool too. A lot of people like that, because it means they're also protecting their local laptop and not waiting for the pull request to notice that they're using a malware dependency.
So we can shift left as far as we can. And we've also launched a Chrome extension, so as you're browsing the web looking for packages, we can put that insight directly into the website. So you could actually identify, like, scripts or packages on a page that are trying to run maliciously, right in your browser as you're looking at the page? Oh, no.
So, yeah, no. So it's more like, it's more like I'm a developer and I'm doing research to figure out what dependency I want to use because I'm trying to implement a feature. So I'm Googling around and I land on, on a nice looking NPM package that seems like it's a good choice. Right.
But I don't know how many transitive dependencies this thing has. I don't know what all those dependencies are doing. I don't know if the maintainer is untrustworthy or if they have no reputation or I don't know if there's, you know, like protest where or data exfiltration in one of these dependencies. So socket will just on that NPM page.
We'll just put that information. If there's any red flags right there at the top of the page, it'll say this package uses the network or this package steals all your data or whatever. Oh, nice. It's like those little warning signs that your iPhone will pop up like, oh, this app wants your location.
You're like, no. Go away. Yeah, exactly. Exactly.
For the supply chain. Yeah. Dependencies. I love that analogy.
In fact, that's actually the analogy that we use is like you should be able to know the permissions that your dependencies are using just like you can on your iPhone. That's so easy and simple. I love the, it just fits in with what we're doing already as developers. Right.
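The permission-style warnings described here could be rendered by something as simple as this hypothetical mapping from detected capabilities to red-flag messages (the capability names and wording are invented for illustration):

```javascript
// Map detected package capabilities to the kind of warning lines a
// browser extension could inject at the top of a registry page.

const WARNINGS = {
  network: 'This package uses the network',
  filesystem: 'This package reads or writes the filesystem',
  shell: 'This package spawns shell commands',
  env: 'This package reads environment variables',
};

function redFlags(capabilities) {
  return capabilities
    .filter(cap => cap in WARNINGS) // ignore capabilities we have no message for
    .map(cap => `⚠ ${WARNINGS[cap]}`);
}
```

The point of the iPhone analogy is exactly this shape: a short, fixed vocabulary of permissions that a non-expert can scan at a glance.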
So seamless. Security done right is seamless. It just, it works and it fits. Yeah.
And I wish more people thought about it that way. It just seems like the industry is very, I don't know. There's, I think it's because security software is typically sold to executives. That's the way that it's been done in the past.
And so these executives don't have to use the software. So you can write an entire bucket of crap, and if you can make that sale at the top level with a couple of demos, the person on the receiving side never has to touch any of the pain points that are secretly right underneath the hood. You're setting yourself up for a huge inconvenience and you don't even know it. Whereas in a bottom-up go-to-market strategy, which is what you have, the users, the adopters, and the buyers are the same people.
They're all the same. Like, and we're going from engineers on up the chain, bottom up instead of top down from CISOs and directors pushing stuff down. Like, I think that's the appropriate way. Like as a leader, if you're not respecting people's autonomy and you're not asking for solutions to hard problems, but instead you're dictating like what you expect solutions to be, you're really like violating people's autonomy and you're not allowing them to bring their full self to the table.
And that's where you lose out on all of the great benefit of the talent that you could surround yourself with. Right. So anytime there's a power asymmetry that's used by leadership to get something done. Okay.
I understand that's some people's leadership style, but it just, it feels like because you're violating someone else's autonomy at that point, because if you have to invoke a power asymmetry of the boss and here's what we're going to do, it's like leaves a bad taste in my mouth. And I know that there's things like being left out because if you can't convince someone who's smart that what you want to do is the right thing for the company, is it the right thing for the company? Like maybe it's not. And you have to have the humility, I think, as a leader to understand and appreciate that perspective.
Totally. I'm a huge fan of bottom up. Go to market strategies. Yeah.
Yeah. That makes, that's exactly what we're trying to do. So anyone can install Socket. You can try it out in two clicks and there's no, there's no talk to sales required.
And obviously we do want to work with, and we do work with, CISOs and heads of application security. Of course. Of course.
Yeah. How awesome is it if the developers in the company have already used Socket to evaluate packages before, and then you come in as the head of application security and you say, Hey, we're thinking of using Socket. And then a bunch of people on the developer side raise their hand and say, Oh yeah, I love Socket. That's.
It's a natural fit. It just, it like works. Or even if you're a CISO and you know there's a huge list of really difficult problems, and your principal engineer comes to you and is like, hey, this entire class of problem, supply chain risk management? How about this? And it's basically mitigated as much as you're ever going to get it mitigated. That's music to your ears.
Perfect. But yeah, we definitely think if you go beyond just focusing on known vulnerabilities and you take a holistic look at the risk of packages, you realize that supply chain attacks are actually way worse than the average vulnerability. And most companies are doing nothing about that today, because the known-vulnerability tooling is just so prominent. It's so easy to just check against this database, and everybody's doing that because it's easy. The much harder problem is to say: what is this dependency going to do? Right.
Right. And so if you do that, I think, think that that plus being developer friendly and, and being right there in the developer workflow, that's like a really good recipe for success. Awesome. So now with all of this, like with you leading the charge there at Socket and as founder and CEO of this growing successful security company, what's your typical day look like?
I do spend a large amount of time talking to customers and figuring out what we could be doing better and demoing the product, helping people use the product. And that way I can bring back those insights to the team and then help, help prioritize them. At Socket, we're a pretty small team. And so we move really fast and we actually do weekly planning cycles, which means that we do have, obviously we do have goals for the month, goals for the quarter, but we also can pivot within the span of a week.
And so when we've talked to a customer and there's something very obvious, very important that they need, that could be something we work on that very next week. We have a really tight loop between customer feedback and shipping things in the product. How big is the team? We're actually about 15 people right now.
Wow. Wow. That is very small. Yeah.
It must be super efficient. Yeah. Yeah. A lot of people think that we're 50 or a hundred people just because of the rate that we ship stuff.
But our team is, it's all very senior folks and all just very, very good at what they do. Amazing. So what's the best day that you've had so far? The best day that we've had.
That's interesting. I definitely love when we get new customers, like new large customers, on board, because it's more and more evidence that people are liking our approach. So yeah, we recently had a big customer switch over to Socket from another tool that just does vulnerabilities. Yeah.
Yeah. Competitor. Yeah. And it was just a huge validation for us and for the whole team. A company that, you know, is this large and this, I don't want to say conservative, but this careful, realizes that they had a gap and that we're the solution to that gap.
Like that, how exciting is that? Like we're actually really onto something here. So anytime we get more signs that we're building the right thing, it feels so good as a, as a founder of a company. Cause so many startups don't ever get to product market fit.
And it's really the reason that so many companies fail at the pre-Series A stage. Right. And congratulations, actually, on your Series A. I saw on LinkedIn, a 20 million raise that you pulled through. Yeah.
Really excited by that. So, we love the person who led the round. He's Zane Lackey, one of the founders of Signal Sciences. And he was actually an angel in Socket.
So he was one of the earliest people to individually like what we're doing and back us. And then he joined Andreessen Horowitz and then led the investment in Socket. So yeah, it's wonderful. I think they're one of the best firms out there, and the team's incredible. And yeah, it's really great validation. What's been the most challenging day that you've had so far? We haven't had anything dramatically bad happen.
The kind of, some of the kind of crazy stories you hear about, like at maybe other startups. We were actually a pretty mature team, like as you can imagine. We have a great group. Yeah.
Yeah. I think we have a great group. So I think probably the bad days are just days where the day gets away from you. A bunch of things come up, and then at the end of the day, you feel like, what did I really do today?
And you just get pulled in too many different directions and lose focus a little bit. Those days are my worst days, but it's never a thing that gets me down for long. I'm usually just ready to go the next day. You've just got to keep showing up and putting in the work.
Consistency counts for a lot. And sometimes we can be our own like worst critic. Like we look back and we think about like, how do we handle something? So I often find like sense of compassion goes a long way and remind yourself, like you mentioned like tomorrow's going to be another day.
Have another go at it. Right. So, and there's no such thing as failure unless you actually like give up. Right.
If you give up, like, okay, it's done. But you just keep at it. Persistent. Keep showing up.
It's good. It's good. It just, success is right over the horizon. Yeah, for sure.
So what's the founding story? Did you actually start building this yourself? You're very technical, with a strong technical background. You studied computer science and you noticed the problem.
No co-founders? Yeah, no co-founders. I would love to have had a co-founder, I have to say. Some days I wish there was somebody else who could share some of that load. But no.
So the story was, I was building that app I mentioned before, Wormhole, with a friend. And that was actually the company at the beginning. And we wanted to build a suite of productivity tools, similar to like the Google workspace suite. So docs, spreadsheets, notes, emails, all that stuff that was all end to end encrypted.
So the service provider wouldn't be able to read any of it. It's a really difficult challenge too, especially because key lifecycle management is extremely complex. Yeah, there are a lot of challenges with that, especially if you want it to be a really good user experience in the web browser.
So you don't necessarily have all the data already offline and able to be decrypted. You need to be able to fetch from the server just the segments of the data that you need, and then decrypt them as needed in the browser. So there was a lot to it. And so we started with something that would get our foot in the door, something relatively easier to ship. And so we started with file transfer as the use case.
And so with Wormhole, our bar, our sort of goal, was not just, oh, it's like this other service but end-to-end encrypted, and therefore worse in terms of user experience. That was not acceptable to us. We wanted it to be better in user experience and better in security. Those were both equally important.
So with Wormhole, we were able to have the user experience of something like Dropbox, WeTransfer, or Firefox Send, these browser-based file transfer services, but better in both user experience and security. I'm still super proud of it. It's still live today at wormhole.app if people want to check it out. It'll let you drop your files in the browser, no account needed. Instantly, you get a link that you can share. Even before the files have finished uploading to the service, you get this link that you can just paste and send to whoever you're trying to send it to. And now you're done.
You don't have to wait, because I really didn't like that flow: you drop the files on, you've got to wait 15 minutes to upload this giant 10-gig set of files, and only then do you get the link and send it. That was a problem.
So you get it right away. That's the first thing. The second thing. I'm trying it out right now.
Whoa. It's like, there is a wormhole. Yeah. The user experience is definitely interesting.
I think with the wormhole animation, we definitely got, some people love it, some people hate it. But the second thing we do that's really different is that the receiver of that link can actually start to access whatever files they want within the set that you're sharing, even before you've finished uploading them to the server. So you could be 1% uploaded.
And they're like, I want to load this file over here, and it could be one that hasn't even been uploaded at all yet. And we will dynamically reprioritize the files that are being sent. And it also uses peer-to-peer to increase the speed.
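The dynamic reprioritization idea can be sketched as a simple queue where a receiver's request jumps a still-pending file to the front (illustrative only; a real transfer pipeline chunks files and interleaves uploads):

```javascript
// Upload files in order, but let the receiver pull a requested file
// to the front of the queue so it gets transferred next.

class UploadQueue {
  constructor(files) {
    this.pending = [...files]; // file names, in original upload order
  }

  // The receiver asked for `name`: move it to the front if still pending.
  prioritize(name) {
    const i = this.pending.indexOf(name);
    if (i > 0) {
      this.pending.splice(i, 1);
      this.pending.unshift(name);
    }
  }

  // Next file the uploader should send, or undefined when done.
  next() {
    return this.pending.shift();
  }
}
```

So if the receiver clicks on a file that is still waiting at position three, `prioritize` reorders the queue and that file is the next one sent.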
So using WebRTC, we would actually connect the two browsers directly together, avoid the cloud, and just go straight to your machine if you were both online at the same time. So it's these niceties and these little tweaks that make the thing super fast. I was very proud of it. But yeah, you asked about how we got to Socket.
So, so yeah, that was where we. So it was a pivot. It was a pivot to make wormhole more secure. You built this thing or you start to build this thing and then you realize like, wait a second, I'm not the only one with this problem.
Yeah. That was, that's basically it. Yep. Yeah.
What was that? Well, how did you come to that realization that you just solved like a huge problem that the entire ecosystem and community faces? Well, there was a moment. Yeah.
I was asking around, like, what do people do? Like I mentioned, we did a lot of stuff to do security right with this. We wanted, we wanted to do the best job we could. And it felt like we had exhausted all the things we could do that were in our control.
And the last remaining piece that we didn't feel so good about was our pile of Node dependencies sitting there taunting us. They're like 90% of our code that we haven't even audited or looked at. And we're like, how can we say our app is safe if we've only looked at this 10% and not the 90% over here? And I think a lot of developers don't appreciate that.
I know they know it, but they don't really think deeply about it and appreciate that. But it doesn't matter whether it's code you wrote or whether it's code that somebody out there on the internet wrote. Like at the end of the day, it's going to get all bundled together and it's going to run in one process. And all of it is going to be, all of it is your app at the end of the day.
Right. So it's not an excuse to be like, well, that's someone else's code. That's someone else's problem. No, like if you're using it and it's getting run in your app and it's all running in the same process, like it's your problem.
And so we thought about it as our problem that we wanted to understand the risk of and protect ourselves from that risk. And so we started asking around and nobody had a solution. They all said, why don't you use this CVE scanner? And it was like, no, that's not really what we want.
That's not really addressing the risk here. Like I know that we have zero CVEs. Like what I'm concerned about is there's malware or a backdoor sending data. It could even be not even malware.
It could be innocent. It could be that there are some maintainers that like to add analytics code so they can learn who's using their package. They want to collect some analytics or whatever. Right.
Perfectly legitimate use, like, well, I don't know. You have to be very careful. Yeah. Yeah.
It's sometimes. It's not malicious. Let's just say like, yeah, it's not malicious, but it's like, it could be for a lot of companies. It would be something they don't want.
Very risky. Yes, exactly. Yeah. And then you look at like your browser, you open it up.
And if it's JavaScript, you can see it connecting to all these random sites. You're like, what is going on here? Why does it need to pipe data to all of these random things? Like frustrating.
Yeah. Yeah. And for us, we didn't want to pipe data to anybody. So we had obviously content security policy in place that would prevent that.
But we still wanted to. With defense in depth, you want to have multiple layers of security. Yeah. Yeah.
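A Content-Security-Policy like the one mentioned can lock down outbound connections from the page. Here's a sketch of such a header plus a toy check of what it allows (simplified — real browsers support many more source expressions than `'self'`):

```javascript
// A CSP that prevents the page's JavaScript, including any dependency
// bundled into it, from piping data to third-party hosts.

const csp = [
  "default-src 'self'",
  "connect-src 'self'", // XHR/fetch/WebSocket may only hit our own origin
  "script-src 'self'",  // no third-party scripts
].join('; ');

// Toy check: would a connection from pageOrigin to targetOrigin pass
// this header's connect-src directive? Only handles the 'self' keyword.
function connectAllowed(cspHeader, pageOrigin, targetOrigin) {
  const directive = cspHeader
    .split(';')
    .map(s => s.trim())
    .find(s => s.startsWith('connect-src'));
  if (!directive) return true; // real browsers fall back to default-src
  const sources = directive.split(/\s+/).slice(1);
  return sources.includes("'self'") && targetOrigin === pageOrigin;
}
```

This is one layer: even if a dependency tries to exfiltrate data, the browser refuses the connection. The behavioral analysis discussed above is the complementary layer that catches the attempt before it ever ships.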
Yeah. And so that was where we were. We're like, okay, so really there isn't a solution. And then I decided to go out and actually formally interview like 40 application security leaders and ask them what they thought about this problem: what they're doing today to mitigate it, and whether they think it's a problem they care about and are trying to solve, or not.
And it was almost universal: 39 out of the 40 told me that they thought this was a top-three priority, but they felt like there was really no good solution. It was one of those things that kept them up at night. And so that gave me the confidence that this was a big problem. We had faced the problem, they were facing the problem, and we had an idea for how to solve it.
And so it was. So you gave it a shot and then you would go back to that 40 or the 39. I don't know what the, the one person did, but. Actually, that was, I don't want to.
It's not a problem. It's not a problem. I don't want to, I don't want to name who it was. It's an app that it's definitely an app that folks have heard of.
I'm not going to say who it was, but they were like, yeah, we don't really care about securing our dependencies. Developers can do whatever they want. It's fine. And I don't want to say who it was because they've actually been breached multiple times.
Well, props for not throwing each other under the bus. And, and that's a complicated thing too, because like sometimes people are constrained by the budgets that the companies that they work for are operating under. For sure. Yeah.
I know. And I don't want to blame people. I think you can't blame people for sure. Yeah.
I know. I think it was more of the mindset of this particular company or individual. That was the problem. It wasn't like, oh, this is important, but we have three, three other more important things.
It was more like, we don't really care. That was what I was responding to. That's difficult. Fair enough.
Yeah. Yeah. Yeah. Maybe you would like to share with our listeners a book or a movie that changed the way that you saw the world or cued you into something that you thought was just interesting and intriguing.
Hmm. The most influential movie for me was when I was a kid and I watched The Matrix, but that's probably not an interesting recommendation for most of the listeners. Really? Why not?
Like, okay, so this is also the answer to my own question: The Matrix. But why The Matrix? First of all, it's just cool.
It's very cool. Yeah. No, but it's obviously the philosophy of it is super interesting, especially when you're younger and you're just getting exposed to that for the first time. It got me really interested in like philosophy and like, what is, what is reality?
How do you know that you're in reality? There's just a whole bunch of really cool questions in there. You know, would you take the red pill or would you take the blue pill? Yeah.
No, it sounds cliche now, when you're older and you know about more stuff, but I just remember at the time how mind-blowing and incredible it was. And still is, actually. I think it holds up really well going back and watching it. I think so too. Like the first one. I don't know what happened with the other ones, but yeah, no, let's not talk about those. No, they don't exist.
Let's definitely not like, let's not bring those up. We can confirm those don't exist. Yeah. Yeah.
No, I've met several young folks that are like, oh yeah, I've heard about the matrix, but I've never seen it. Really? Wow. I've never seen it.
Like, you're in for a treat if you ever sit down and watch it. In that vein, another one that folks might enjoy is a movie called Waking Life. I've never heard of it. Yeah.
Waking life. It's just, it's, it's definitely a strange recommendation. So like strange. Yeah.
It's basically this guy who went around and interviewed a bunch of professors and different folks about what existence is and what it means to experience things. And then they did this really cool art style where they took every frame of all the shots and drew on top of them, tracing the lines of the faces of the people they were interviewing. And they did that for every frame. So they drew a separate picture for each one and then put it all together. For the whole movie, you're basically watching these drawings. And because each picture was drawn individually, it doesn't line up exactly between frames. And so it does this wobbling thing.
Wiggly wobbly. Yeah. Everything is wobbling the whole time. And it's, it's, I don't know, it's just, it's a really unique, like aesthetic.
Artistic. Yeah. Very artistic. It's very particular aesthetic there.
Yeah. It's pretty cool. Nice. Nice.
Well, no, thank you so much for joining on the security podcast of Silicon Valley. Would you like to leave our listeners with any words of wisdom? Stay hungry, stay foolish. There's a lot of cool stuff out there in the world and go forth and hack.
I don't know. That sounds like great advice. Thank you again for joining, Feross. And thank you to all of our listeners for tuning into another episode.
I am your host, John McLaughlin, and stay tuned for the next episode. See you, John. Thanks for having me. No, thanks for joining.
It's been awesome.