Transcript: How Statsig Founder Vijaye Raji Built a $1.1B Product Development Platform | CEO & Co-Founder of Statsig
2025-06-13
Guest: We have about a couple thousand users; most of them come self-served. We have a few hundred enterprise customers that are using our product at scale.
And each one of those few hundred enterprise customers has big data. We process about four petabytes of data every day, which is mind-boggling to me.
Last September, we announced that we were processing about a trillion events every single day. Now we're processing about 1.6 trillion events every single day. To put that in perspective, that's about 18 million events every second.
Host: Hello everyone. My guest today is Vijaye. Vijaye is the founder and CEO of Statsig, a platform for product development.
He was previously vice president and head of entertainment at Facebook, at one point led the Facebook Seattle office, and spent a decade before that at Microsoft.
In this conversation we'll talk about his transition from Big Tech engineer to startup founder, his tactical insights on hiring, scaling, and building a product-led SaaS company,
how modern product teams are using experimentation and rapid iteration, and what the future of experimentation looks like in an AI-first world.
If this is your first time listening to Startup Project, don't forget to subscribe wherever you listen to podcasts.
Host: Vijaye, welcome to the show.
Guest: Thanks for having me, Natraj. Good to be on your show.
Host: So, as I was researching for this conversation, I found, I think in a Sequoia blog post, that when you were considering leaving Facebook, Mark pinged you and tried to convince you to stay at the company. What was that moment like?
Guest: So this was not the first time. I had tried to leave the company a couple of times before then, and every single time the conversation was something like: well, there's a lot to be done here.
A startup is a small company; the impact you will have there is not that big.
For all the good reasons, I stayed at Facebook. And when it had been 10 years, I knew something new had to happen, just for my own personal sake. I felt like I needed the startup in my life.
So I left.
Host: You'd been at big tech companies for almost two decades at that point. What was the personal motivation? There's always a personal calculus: am I doing all this work for a company, or can I own more equity? What was your thinking at that point?
Guest: Yeah, so I started at Microsoft. I spent about 10 years there as a software engineer. To be completely transparent, I loved Microsoft. I enjoyed it, and I learned a lot.
Everything I know about software engineering, Microsoft was the best place to learn. This was back in the early 2000s, when you learned how to build software.
You learned how to predict what's going to happen two years down the line. It's almost a science, and there was so much to learn, so many good people. So I had a great time learning all of that.
At some point I thought, okay, maybe I should go build something different. And, I don't know, this is probably something that's very common nowadays: you're in a holding pattern for your green card. You can't really leave without resetting your green card process.
For good reason or bad, you don't really explore other options when you're in that situation. So I was in that position for a little while.
Once I got my green card, the first thing I did was look around, and luckily for me, Facebook was starting up an office in Seattle.
And so that was my first jump away from what I thought was a really, really good learning experience for a whole decade. At that time, Facebook was a late-stage startup.
It wasn't quite ready for IPO, so I thought I was joining a very small company.
Leaving behind a company of 100,000 people to join a company of only a thousand or 1,200 people at the time was incredibly different, and a good learning experience.
I thought I had learned a lot at Microsoft, and then Facebook was a completely new world.
So that's how I went from one big company to what I thought was a startup. And then, you know the story of Facebook: it grew so fast that in the 10 years I was there, it grew to something like 65,000 people.
That was a lot of good learning, because when you're in a company that's growing that fast, you learn a lot and you get exposed to a lot.
Host: And I think by the time you left, you were leading entertainment at Facebook and also leading the Facebook Seattle office.
Guest: Yeah. One of the things I generally do is, every couple of years or so, try something completely new.
Even at Microsoft, I started with Microsoft TV, building set-top boxes for Microsoft TV devices, and then moved on to the developer division, where I worked on Visual Studio, building compiler services for text editors and things like that.
After that I worked on SQL Server, so building databases, and then Windows, so operating systems. Even within Microsoft, I did a variety of things.
At Facebook, I started out as an engineer. I worked on Messenger, then some ads products, then Marketplace, and then gaming and entertainment. If you look at each of them, they're all pretty different.
They don't have much correlation or continuation. That's how I've always operated in my career.
When I left, I was the head of entertainment, which included everything from videos to music to movies and so on.
I was also the head of Seattle, which was a couple dozen people when I joined the company. When I left, we had about 6,500 people spread across 19 different buildings.
Host: What were some of the interesting problems you were working on as head of entertainment, and what was the scale of those problems?
Guest: Yeah. If you think about Facebook, or now Meta, the apps have a social aspect: people, your friends, your community, and so on.
And then there's an entertainment aspect, which is that you just want to spend time; you want to be entertained.
The kinds of things you do for entertainment: you could watch videos, listen to music, watch music videos, play games, or watch other people play games.
You watch short clips from TV shows, and so on. That's the way I thought about it. Oh, and podcasts, another huge area, which is listening to audio.
So anything that doesn't pertain to your social circle belongs to this entertainment category, and that was my purview.
The problems we were trying to solve were: how do you make the time people spend high quality? What do they gain out of it? How do they get entertained?
High-quality entertainment includes everything from acquiring the right kind of content, to understanding what people want, to personalizing the content for them.
And then it goes on to removing content that's not great for the platform: anything that violates policy, things along those lines.
So you invest quite heavily in the integrity of the platform as well. Those are the kinds of problems you tend to solve.
On the engineering side, scale is a very important problem. When you're delivering 4K video at high quality and high bit rate, over networks that may not be reliable, you have interesting engineering problems to go solve.
Those are all super exciting.
Host: Were you primarily focused on the technology of getting entertainment onto the different Facebook platforms, or were you also part of the business side: licensing content, acquiring content, incentivizing entertainment companies to post on Facebook?
Guest: Yeah, it was part of that too. Part of it is that when you have a product that's actually observing what people watch, you know exactly what people want, and then you want to go buy more of that content from the media companies.
We did have a media procurement team and so on. You can go to them and say: this is the kind of content people consume on Facebook, so let's go get more of it.
That plays into the decision of where the company should invest.
Host: So you were doing exciting stuff at Facebook, at scale, and then you decided it was time to leave and start your own company. Did you evaluate different sets of ideas, or was the idea for Statsig already brewing in your mind while you were at Facebook? What was that initial thought process like?
Guest: Yeah, it's a little bit of both. The first part of the journey was deciding to go start a company, and the second part was: okay, what do I go build?
Deciding to start a company had been brewing for a long time. It was one of those things I would regret if I didn't do. So, what to go build?
Because of my varied experience, doing everything from gaming to ads to marketplaces to, eventually, videos, I had lots and lots of ideas about what we could build.
When you're evaluating an idea, you want to take into account what the market size is going to be, what the propensity of a buyer is to pay a dollar for it, and then, eventually, what you're good at.
Sometimes you have to accept: okay, I'm going to face a lot of competitors in this industry. So what are we really good at, and what could I bring that would be an advantage for us?
Those are all the factors that go into it. The heart, your passion, drives part of it.
And this logical analysis, the part done by your mind, is how you arrive at what to go build. If you're driven entirely by passion, you may build something that isn't sellable.
So those are the kinds of considerations that went into deciding: okay, I'm going to go build a developer platform that includes everything from decision making on down, and we're going to empower everyone to make the right decisions using data.
Host: So once you decided on this experimentation and developer platform, how did you go about getting those first couple of customers?
Guest: Yeah, it was quite a journey, and it's a good lesson for everyone out there building startups.
Usually, when you have a founder with an immense amount of faith and conviction that, okay, this is what I'm going to go build,
you think it's going to be great for companies, for the industry, for people, and you're very mission-driven. So you want to go build, and while you're building, you're talking to a lot of people.
And this is the part where I made all kinds of mistakes. You go to someone who's willing to spend 30 minutes with you, and you say: look, Natraj,
I'm going to go build this developer platform. I think it's going to be pretty awesome. It's going to have all kinds of features that will let you ingest data and help you make the right decisions.
What are you going to say? Chances are you'll say: oh, that sounds like a great idea, you should go do it. Right?
And then you talk to enough people and you build this echo chamber, and you become even more convinced that everybody you talk to needs this platform. And then you go build it in a vacuum.
So we did this. For about six months, we built. At the end of six months, we said: let's go talk to some people, the people I talked to before, and see if they're willing to actually start using this product.
And you know what? You go talk to them and say: hey, remember I told you about this thing? I have this product now. Would you use it? And the answer? "Let me think about it." What does "let me think about it" mean?
It means they're not really that interested. Much harder is to have them integrate it into their existing product. Much harder still is to have them pay a single penny. So you learn that lesson.
And so this is where, when I was talking to one of my co-founders at the time, they said: you've got to go read this book, The Mom Test.
I went and read The Mom Test, and I realized all the mistakes I was making when talking to customers. The point of it is: first, you need to understand what problems people are facing.
Do you have a solution for that problem? And to even get to that stage, you need to know who your customer is, who your ideal customer profile is.
Then you talk to them and make sure the product you're selling actually solves the problem. Not only that, you have to be the industry's best for somebody to even care about your product, and then open up their wallet.
Those are the kinds of hard lessons I learned over the course of the next few months. And then there's a story about how we found our first ideal customer profile; I'll touch on that in a bit.
Host: You bring up a point I often notice when someone who's worked at big companies all their career transitions into startups. You'll even see it in their pitch decks, adding a lot of advisor names.
There are certain signals you can easily spot where, okay, this person is coming from Big Tech. Are there any other lessons from that transition?
Because when you're at a big company, there's a huge sales team, there's a whole marketing team. You're actually not spending time convincing a customer.
The customer is already at your door, and now you're selling to them and improving their experience. You're doing post-customer-acquisition work, and very few jobs actually involve convincing customers.
You also have the advantage of an ecosystem of products being sold, so you're never selling one single product on its own.
There are all these problems when you first move from a Big Tech company to doing a startup. Some founders are obviously much more aware of them and adapt very quickly, but there are often blind spots
you only discover once you jump in. Are there any other blind spots you've discovered?
Guest: Yeah, you're absolutely right. When you're at a big company, a lot of things are taken care of for you. What I was doing was focusing on building products.
Coming from engineering, you have intuition around engineering, product, data, and sometimes a lot of design. What I was lacking was go-to-market intuition.
How do you market something, and how do you sell it? The analogy that comes to my mind for what you're describing is a mall where people are already visiting, and you set up a shop there.
That's what it's like to build a new product in a large company: all you've got to do is open up a shop, and people are going to come in.
Whereas building a startup is like scoping out a plot of land in the middle of a desert, setting up a shop, and then inviting people to come visit you where they normally don't go. That's the stark difference.
And that was a huge lesson for me when I started trying to sell the product.
In terms of large companies versus small companies, this is just my experience, but I wish I had learned more of the business side of things.
I wish I had actually spent more time with the sales folks: how they talk, how they incentivize, how they think about approaching people. I was extremely awkward talking about my own product to potential customers.
It's one of those things: oh, I don't want to be salesy, but I want to position my product the right way.
It took a while for me to build the conviction that, no, I'm actually helping these prospects with my product. If I believe so strongly that my product is going to be valuable and useful for you, then the perspective changes.
The way I talk about it changes, and the awkwardness goes away. Those are all things you've got to learn. I learned them in real time.
Some founders have learned them before, so they can use that when they start a new company.
But it's one of those scratches that, when you've only spent time in large companies, you don't realize.
Host: So what was the value add for those initial 10 or so customers? What was the value proposition of Statsig at that point, and how was it different from solutions or products that already existed in the market?
Guest: Yeah, so the value proposition has not changed since the day we started. It has always been the same: the more you know about your product, the better decisions you're going to make.
If I distill it down, what we're doing is empowering product builders. Whether you're an engineer, a data scientist, or a product manager, you're all building products.
And the idea is: observe your product. How do people use it, what do they care about, where do they spend more time, how many exceptions are you throwing, what's the latency, what's the engagement and retention?
All of those are important for you to know, number one, how your product is doing; number two, what products or features are not working as intended or not hitting the mark; and number three, what to go build next.
That's literally what we've sold as the value from day one, and even now we're adding more and more features so that everything operates seamlessly as a platform.
The differentiation from existing products is that the products that came before us were all point solutions. For feature flagging, you need a separate product.
For analytics, you need a separate product. For visualization, a separate one. For warehouse analysis, a separate product. For experimentation, a separate product.
Each of them has been an individual product in the past, and what we're doing is bringing them all together.
The benefits: number one, it consolidates all your data into one place, so you don't have to fragment your data.
Number two, because you're not fragmenting your data, there's a single source of truth; you're not drawing two different, or multiple conflicting, conclusions from it.
And number three, it opens up some really interesting scenarios. If you combine flagging with analytics, you can get impact analysis for every single feature.
That's something you can't do with two different products each dealing with an individual solution.
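The flagging-plus-analytics combination described above boils down to joining flag-exposure logs with a per-user metric. The schemas, names, and numbers below are illustrative assumptions, not Statsig's actual data model:

```python
from statistics import mean

def feature_impact(exposures, metric_by_user):
    """Naive per-feature impact: difference in mean metric between
    users who saw a feature and users who did not."""
    groups = {}
    for user, feature, saw in exposures:
        if user not in metric_by_user:
            continue  # user produced no metric events
        g = groups.setdefault(feature, {"on": [], "off": []})
        g["on" if saw else "off"].append(metric_by_user[user])
    return {
        f: mean(g["on"]) - mean(g["off"])
        for f, g in groups.items()
        if g["on"] and g["off"]
    }

# Exposure log: (user_id, feature, saw_feature); metric: e.g. session minutes
exposures = [("u1", "new_checkout", True), ("u2", "new_checkout", True),
             ("u3", "new_checkout", False), ("u4", "new_checkout", False)]
metrics = {"u1": 6.0, "u2": 8.0, "u3": 5.0, "u4": 5.0}
impact = feature_impact(exposures, metrics)  # {'new_checkout': 2.0}
```

With exposures and metrics in one system, this difference can be computed continuously for every flag, which is the "impact analysis for every single feature" being described.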
Host: You mentioned flagging. For those who might not be aware, can you explain feature flags?
Guest: Yeah, so feature flags are a way for you to control where, how, and to whom you launch features.
You can basically decouple code from features, so when you want to ship new code, you can send it to the app store and get it reviewed,
and then once it's live, you can turn features on completely decoupled from code releases. It's extremely powerful. Number one, you get to choose when to launch your product.
Number two, when something goes wrong, you can turn it off in real time. There are lots of other reasons too, because you can do rollouts at scale.
You don't turn a feature on to 100% everywhere; you can do 2%, 4%, 8%, just to confirm that all the metrics you care about are still sound.
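The 2%/4%/8% ramp described above is commonly implemented by hashing each user into a stable bucket, so a user who was in the 2% stays in as the rollout widens. A minimal sketch, where the function and feature names are hypothetical and not Statsig's SDK API:

```python
import hashlib

def in_rollout(user_id: str, feature: str, percent: float) -> bool:
    """Deterministic bucketing: the same user and feature always hash to
    the same bucket, so widening percent only ever adds users."""
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10_000  # stable bucket in 0..9999
    return bucket < percent * 100          # percent is 0..100

# Gate the new code path at a 2% rollout; raise percent without a release
if in_rollout("user-42", "new_checkout", 2.0):
    pass  # serve the new feature
else:
    pass  # serve the old behavior
```

Hashing per feature, not just per user, keeps rollouts of different features statistically independent of each other.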
Host: Pretty much every scalable service today uses some version of this, because you don't want to roll out to 100% of your users and blow up your servers at a single point in time, and you want to experiment with different versions of the product.
All of this is a standard, common technique today across all major companies. And you have some pretty impressive companies as your customers.
You have OpenAI, I think; you have Whatnot; you have Bluesky, which is the X competitor.
Are there any specific trends across big and small companies in how they're approaching experimentation or product validation?
Guest: Yeah, the trend we're betting on is that more and more product development is going to be data-driven. That's the reason we're here.
And what we're doing in the industry is accelerating that trend.
If you think about the content we push out and the education we do for prospects and the industry in general, we're basically catalyzing and accelerating it. Product development used to be this siloed thing where a product manager comes up with an idea, the engineers code it, testers test it, you ship it, and then you wait two years for another release cycle.
That's how product development used to happen. Now it's a very iterative process, and people ship weekly, daily, sometimes even hourly.
To get to that level of speed, you need controls in place. And to allow people to make distributed decisions, you need the ability to know how each line of code you wrote is performing.
So these tools are getting more adoption day by day, and the people moving from the traditional way of development to this modern way all need them.
One of our bets is that AI is going to accelerate that movement, because you're going to have lots of rich software built from blocks that have been assembled together, and either you or AI is doing the assembling.
Now you need to know: is your hypothesis, your original idea, actually what the product turned out to be? And if not, by how much? You need these observability tools built into your product to be able to know.
So I think this trend is generally moving toward data-driven development. You can also see it on LinkedIn: more and more companies are setting up growth teams.
More and more companies are hiring data scientists. Those are great trends that we observe.
Host: So what's the right time for a company to adopt Statsig? Is it, say, a Series A company? I think this also goes back to the ideal customer profile you were talking about. When should a company start adopting Statsig?
Guest: Yeah, so Statsig is a platform. We have many different products: feature flagging, experimentation, product analytics, and so on. You should start on day one,
with every feature you're building. I remember the early days of us building software: the first thing we launched was our website, our marketing site, and I was refreshing that page all the time.
Whenever somebody visited the website, I was watching them, watching their session replay, literally spending all my time figuring out how people were using the product.
That is an important step in the journey of your company. So start on day one. Integrate feature flags, integrate product analytics, integrate session replay.
All of that will give you insights into how users are using your product. Eventually, you'll get to the place where you want to run experiments.
You don't have to run experiments on day one.
But when you get there, you'll have the same tool, with all the same data, including historic data, now capable of running experiments. And because it has historic data, it can do corrections and run some of these advanced statistical methods a lot more efficiently.
So yeah, there's no "too early." Just start right away.
Host: I've used different experimentation tools, because I also work in Big Tech.
One of the negative side effects I see (the tooling is great, and every tool has its purpose) is that with tools like Statsig and others, whenever there's a product-sense decision, the reflex is: let's take two versions, experiment, and see which one comes out best.
So I often feel it can lead to a bit of incrementalism and experimentation fatigue. Because if you look at a product like ChatGPT, it's such an experimental product.
If you tried to pitch that internally at a Big Tech company, or any normal company, people would have a hundred objections: the name is bad, the UI doesn't make a lot of sense, right?
So do you have a take on when to use experimentation versus when to use your gut and your product sense in product development?
Guest: You remember the famously misattributed Henry Ford quote: if I had asked people what they wanted, they would have said faster horses, not a car. Just think about that.
Experimentation is not a replacement for product intuition. You're not going to get rid of product intuition; you do need it.
To make the leap from a faster horse to a car, you cannot experiment your way there. You need people with a lot of good intuition and drive to make those kinds of leaps.
But once you have your first version of the car, think about the Model T to where we are now: there have been thousands, if not millions, of improvements.
Those are places where you can experiment and find better versions of what you currently have. Even there, there are plenty of leaps that require product intuition.
So my philosophy is that product intuition and experimentation go hand in hand. Sometimes you have product intuition, you want to build something, you have conviction: go do it.
But when you're about to launch, just make sure there are no side effects, things you haven't thought about or have ignored.
Products have gotten so complex nowadays that I don't think anybody out there can meaningfully understand the entirety of a product like ChatGPT.
When you're in that kind of situation, and it's only going to get harder for any one person to fully grok the entirety of a product,
that's where observability, instrumentation, and analytics are going to be extremely valuable first. And then you have experiments.
And these experiments don't have to be "let's test these two different variants." It can literally be: I believe in this feature. I'm going to build it, and I'm going to launch it.
That is your hypothesis. The hypothesis is: I'm going to build this feature, and it's going to be good for some reason.
It's going to be better for my customers, better for my business, better for my developers. Something is driving you to build that feature. That is your hypothesis. Validate it.
Put it out there and measure how much it actually improved all of those dimensions you thought about. And then also measure what else it affected. What did it catalyze?
What did I not know about this product?
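The "launch it, then validate" loop above reduces to comparing a holdout (control) group against users who got the feature. A minimal two-sample check might look like this; the metric values and the 1.96 threshold are illustrative:

```python
from math import sqrt
from statistics import mean, stdev

def lift_with_z(control, treatment):
    """Absolute lift, relative lift, and a rough z-score answering:
    did the feature move the metric, and is the move likely real?"""
    mc, mt = mean(control), mean(treatment)
    se = sqrt(stdev(control) ** 2 / len(control)
              + stdev(treatment) ** 2 / len(treatment))
    z = (mt - mc) / se  # |z| > ~1.96 is roughly significant at the 5% level
    return mt - mc, (mt - mc) / mc, z

# Holdout vs. launched: e.g. sessions per user after shipping a feature
control = [4.9, 5.1, 5.0, 4.8, 5.2, 5.0]
treatment = [5.6, 5.9, 5.7, 5.5, 6.0, 5.8]
abs_lift, rel_lift, z = lift_with_z(control, treatment)
# abs_lift is about 0.75, rel_lift about 15%, z well above 1.96
```

In practice the same check runs over every metric you care about, which is also how the unexpected side effects mentioned above get caught.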
Host: Let's talk a little bit about growth. You started in 2021. Can you give a sense of the scale: how many customers are using Statsig, how many users, just the scale of the company right now?
Guest: Yeah, so we have about a couple thousand users; most of them come self-served. They pick up our product, and we have all kinds of open-source SDKs.
We have a few hundred enterprise customers that are using our product at scale.
And then we have massive scale in terms of data. If you think about the few hundred enterprise customers I just mentioned, they all have big data.
Each one of them has an immense amount of data.
Imagine aggregating hundreds of those, running pipelines on each one while keeping the data separate. We process about four petabytes of data every day, which is mind-boggling to me.
Last September, we announced that we were processing about a trillion events every single day. Now we're processing about 1.6 trillion events every single day.
To put that in perspective, that's about 18 million events every second. That's what our system is processing. So that's been our growth, in terms of customers and in terms of scale and infrastructure.
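The per-second figure quoted here follows directly from the daily total; a quick back-of-the-envelope check:

```python
events_per_day = 1.6e12             # 1.6 trillion events per day
seconds_per_day = 24 * 60 * 60      # 86,400 seconds
events_per_second = events_per_day / seconds_per_day
# roughly 18.5 million events per second, matching the "about 18 million" figure
```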
Host: You talked about positioning. Are you positioning Statsig primarily as a developer tool, focusing marketing on developers and using that to drive enterprise growth? What does the growth engine at Statsig look like?
Guest: Yeah, so we position ourselves as a product development platform. It caters to engineers, product managers, marketers, and data scientists.
There are parts of the product for each of them. So if someone comes to us and says, hey, we need to solve our experimentation problem,
it's usually a data science team coming to us with a need for advanced statistical methods and data sanity, and we work with them.
But once we're in that company, our product doesn't block anyone; we don't charge for seats. We let the product grow organically within that company.
The natural extensions are that the engineering team adopts Statsig for feature flagging, and then the product team adopts Statsig for analytics and visualization.
The marketing team will adopt Statsig for our visual web editor, and so on. That organic growth happens within a company.
And that's how we've grown even within our existing customer base.
Host: Where do you spend most of your marketing effort? Which channel or strategy has had the highest ROI?
Guest: So there are different outcomes for our marketing efforts. Some of them are direct response.
We feed people into our website, whether it's signing up for a new account self-serve, or talking to our sales team.
Those we track, and it's a very seasonal thing. It depends on which