Transcript: Todd Bishop, GeekWire co-founder, business and tech journalist, on Big Tech, AI, and more
In this episode of The Startup Project, host Nataraj Sindam interviews GeekWire co-founder Todd Bishop. They dive deep into how AI is transforming journalism, offering a 20% efficiency boost. Todd also shares his expert analysis on the AI race between Microsoft and Amazon, the New York Times lawsuit against OpenAI, and the potential outcomes of the proposed TikTok ban. This is a must-listen for anyone interested in the intersection of media, technology, and policy.
2024-04-01
Host: Hey Todd, welcome to the show again.
Guest: Oh, it's great to see you again. Happy to be back.
Host: Yeah, so it's been a while. I think it was mid-2022 when we last talked. I think the conversation went really well; we covered a lot of interesting topics around Big Tech.
And I thought, you know, I should have you on again to talk about some of the interesting things that are happening. The world has changed in a lot of ways, even though it's not even been two years, right?
We weren't even talking about generative AI back then.
Guest: No.
Host: That was all crypto.
Guest: Amazing, crazy.
Host: Crazy, right? How much of your coverage right now is AI?
Guest: I'd say it's the exception when it's not. It's amazing how much it's just taken over the world.
Host: Yeah. Are 90% of your stories on GeekWire now AI or AI-related?
Guest: You know, I would say it's more like two-thirds; that's just my guess. There's still a lot of non-AI tech out there that's interesting, and frankly, it's kind of refreshing these days when you run into something that is not AI.
That said, these past 18 months have been the period of the most rapid change I've experienced as a tech reporter covering the industry.
It's been energizing not only to cover, but also to use, and to see efficiencies in my own work. So a lot of interesting stuff to talk about.
Host: Does AI live up to the hype it is generating in your mind?
Guest: You know, it depends on whose hype you listen to. For me personally, I've become a pretty heavy user. I can explain some of the things I do, which are nothing out of the ordinary, but I'd say I've become 20% more efficient in my job, just as a reporter and writer. So I don't know, is that more or less than the hype?
Host: I mean, 20% I think is pretty good, right? Compared to something like crypto, we now have an actual tool that is impacting you on a day-to-day basis. How are you using it? Is it mostly for rewriting, or for getting ideas to write?
Guest: Yeah, it's more in the process of synthesizing everything that I'm doing on my reporting into uh insights, things that remind me of what somebody said, the context in which they said it.
I'm working on a big project right now and I've done about 20 interviews for it. It's a pretty unusually large project for us.
We're normally a little more into the daily and weekly cycle. I've run all of the interviews through an auto-transcript tool called Otter.ai, which also has, in my experience at least, a pretty powerful chat function that lets you go back in. And the thing I just got onto this week: they put me into a beta where you can take multiple conversations, put them into a channel, and then focus the AI on just those.
So I took all of the 20 or so interviews that I did, put it in there and it's amazing the kinds of big picture insights you can get.
Not that you wouldn't have those as a human, but to be able to basically get an outline of a story very quickly based on your own work is pretty powerful.
Host: Yeah. I think it's pretty much touching everywhere where you write for sure. I can that's pretty obvious as a use case for everyone.
Even though we're seeing some of the more hyped use cases, you know, replacing developers or creating videos,
which I think are still about to happen, and things are happening so fast, I think writing has been completely changed for a lot of people, and it's given some equilibrium to everyone who wanted to write better.
But have you thought about how you'd still be unique if everyone has this access?
Guest: Yeah, and I've kind of gotten past that a little bit. Microsoft uses the phrase "drudgery" a lot, in other words, eliminating drudgery, and so I almost hesitate to say that.
That said, that has been my experience. The stuff that I'm using it for is the grunt work, the heavy lifting and so far at least and you know, certainly this can change, but so far at least, it's freed me up to do more creative thinking.
And even at times, I'll give you an example: I was just working on tomorrow's episode of the GeekWire podcast, and we always do a post to accompany the podcast. I've done all of the work for the audio, done the interview, in a lot of cases done the reporting, and to then have to sit there and write a new version of all that work, it's much easier to have it outline something for you and then flesh that out.
And so for me, being able to do that kind of stuff through AI with the AI tailwind is just it's empowering.
Host: Yeah, for me, I think just the pure joy of experimenting with the new tools has been really fun.
For sure. I started doing something called 100 Days of AI, intentionally spending two or three hours a day and trying to document those learnings and experiments, to generate more ideas and force myself to be more actively engaged, understanding things a little more deeply than what we see on a surface level, in a demo, or when a particular tool goes viral.
Um and it's been super super interesting um to see what are all, you know, how rapidly things are improving.
I've been mostly in storage in my career the last couple of years, and if you look at what it costs to store one terabyte and how that graph has gone down over the last 10 years,
and then do the same thing for a GPU, and what that would mean in terms of how much technology you could deploy in every imaginable use case, if it comes true in 10 years it's so much more impactful than what we see now. We don't know how fast or slow these things will go, but I can envision how impactful it's going to be once that happens.
Um so in general what I wanted to ask you about AI was like how do you see the Big Tech ecosystem you know playing with AI?
How do you see Amazon and Microsoft? You've been covering this space for 10 years in the Seattle ecosystem, probably more closely than anyone else.
So what's your take on how Microsoft is approaching this, how Amazon is approaching it, and how other companies in the space are approaching it?
Guest: To me, in many ways the story of 2023 was Microsoft's partnership with OpenAI, and obviously there were some intriguing plot twists toward the fall with Sam Altman's ouster and return and that whole thing. So I think Microsoft has clearly established itself as a leader on the one hand, and the storyline that seemed to be bubbling up throughout the year was Amazon playing catch-up.
And I think now in one of the first few months here of 2024, I I might be misreading it, but I think a lot of those questions about Amazon have started to go by the wayside.
You've got them coming out not only with foundation models of their own, and Bedrock, their large-language-models-as-a-service offering on AWS, but also some interesting applications.
You know, I got a chance to try out Rufus, their e-commerce assistant inside the mobile app that they're gradually rolling out to folks. The Washington Post panned it a little; they thought a lot less of it than I did.
I recognized that it was a beta and it's not going to be perfect; I was more interested in the potential, and what it says about what that kind of assistant could do.
And for me, with Rufus, for example, I was able to say, hey, I have this type of bike, a Raleigh Cadence 2, and I'm looking for a trainer, something to put it on to be able to ride it as a stationary bike.
I didn't know what size tires it had and I just said like, what what should I be thinking about? What trainers would work with it? And it said, oh, you have, you know, 700C tires on that bike and this is a good trainer for that.
It's like that was a magic moment for me.
And so when you look at what Amazon's doing, I think it's uniquely positioned in that it's at every layer of the stack. Certainly Microsoft is doing some interesting AI applications, but I think Microsoft's superpower is that connection to GPT-4 and OpenAI, whereas Amazon is more broad in its partnerships, even though it's pretty heavily tied to Anthropic.
I think Amazon is better positioned to be more broad-based.
Host: Last time, I don't know if you remember, but we talked about Alexa not having product-market fit. And I'm still sort of beating my drum about it, and I'm wondering why they haven't integrated Alexa with an LLM yet. That seems quite obvious.
Guest: Well, they showed something last fall where you had to kind of enter a specific mode. And I actually don't think and I might have just missed it, but I don't think it's actually been released publicly yet.
Host: I didn't think it was, yeah. I remember what you're talking about, and I don't think it came out publicly.
Guest: Every time I interact with my Echo, with Alexa at home, I just kind of start shaking my head now. Things that in the past would have been just very basic communications, and I would have been okay with that.
Now they seem so rudimentary, and there's the lack of context: Alexa's not able to retain context as you go forward. I agree with you.
I think every day or week that goes by that they aren't integrating that kind of natural-language interaction into that voice assistant is a big miss for them. So that's certainly true.
Host: Yeah, I wouldn't even be surprised if OpenAI launches an Alexa-like device, right? It's so obvious that you can put this thing in a voice format, and it elevates the Alexa experience so much.
Guest: It's funny you should say that. I'm not trying to name drop.
I went to the Open AI Dev Day in San Francisco last year and that was the question I asked Sam Altman and it was kind of an outgrowth of some of the things we talked about even though, you know, generative AI wasn't a topic that we discussed last time.
I asked him would there potentially be a dedicated device and then Ina Fried from Axios built on that.
I can't remember exactly whose question sparked it, but he was clearly at the time open to the idea of either Open AI or perhaps one of his other startups doing a dedicated device.
I've got to say, one of my favorite ChatGPT features, and frankly the reason I keep using ChatGPT even though you've got Copilot and Claude and Otter, is the ability to go into the app and have that voice conversation.
It's amazing.
Host: What app?
Guest: So my favorite thing, because as a journalist, if I'm reviewing a book or interviewing an author, a lot of times they'll send a PDF of the book as a review copy. And I'll do two things.
One, I'll upload it to Speechify and have Gwyneth Paltrow read it to me rather than having to read it myself, so it's kind of an automated thing. They've licensed her voice, in addition to, I think, MrBeast and Snoop Dogg and other celebrity voices, and hers is fantastic; it's just a great synthesized voice.
But the other thing I'll do, and I did this with Fei-Fei Li's book, the AI pioneer who came out with a book called The Worlds I See: I had read the book really quickly, and about a month later I was going to interview her, and I just needed some recollection.
So I uploaded the PDF, and while I was driving around on errands I'd just say to ChatGPT, remind me what she said about her parents and the dry cleaners, that kind of stuff. And it's so good.
And to me that's where all this is headed. Sorry, I know I'm rambling here, but to me, these general purpose chatbots caught a lot of attention.
Obviously, they're boiling the ocean with all this data that they have, and they're delivering the world to you in generative-AI form as a replacement for a search engine. I think that's what Satya Nadella was initially pitching it as when they released the Bing integration.
But to me the power comes when I can constrain that data to what really interests me, reduce the chances of hallucination and just get these rich responses based on the information I've provided to the LLM.
Host: I think that has been like the biggest application.
I think what people are calling RAG, retrieval-augmented generation. Even in the 100 Days of AI series I've documented a couple of how-tos on creating your own RAG; my blog posts are more like something a developer would use.
RAG has been a very interesting application, and then there's Gemini being released with a 1-million-token context window.
I think that is also a bigger change that might unlock a lot more applications, because now, if the model itself has a 1-million-token context, do you really need RAG on top of it to limit it?
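The RAG pattern being discussed, constraining the model to your own data by retrieving relevant passages and prepending them to the prompt, can be sketched in a few lines. This is a minimal illustration, not any particular product's implementation: word overlap stands in for real embedding similarity, and the assembled prompt would be handed to whatever LLM you use.

```python
from collections import Counter

def chunk(text, size=40):
    """Split a document into overlapping word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size // 2)]

def score(query, passage):
    """Crude relevance score: count of shared words (a stand-in for embeddings)."""
    q, p = Counter(query.lower().split()), Counter(passage.lower().split())
    return sum(min(q[w], p[w]) for w in q)

def build_prompt(query, docs, top_k=2):
    """Retrieve the top-k chunks and prepend them to the question."""
    chunks = [c for d in docs for c in chunk(d)]
    best = sorted(chunks, key=lambda c: score(query, c), reverse=True)[:top_k]
    context = "\n---\n".join(best)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

A real version would swap `score` for embedding similarity and send `build_prompt`'s output to the model; the point is that retrieval narrows the model to your data before it answers.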
So I think there are some interesting things happening. Another example, similar to how you're using audio-to-text and text-to-audio: one of the things I do is write a biweekly newsletter.
It's sort of like you know what's happening in big tech strategy sort of thing. Um but you might have seen other creators do this.
For example, Scott Galloway has a newsletter which someone narrates for him, and it gets converted into a podcast on his feed.
Since I can't afford someone actually doing that for me, every week I take my newsletter, convert it into audio using the OpenAI API, and publish it on my podcast feed.
So that gives me another episode on my podcast feed. I was even thinking I should offer this as a website so that other newsletter authors can do it.
But I didn't get the time to actually go and do that. Maybe I should at some point.
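The newsletter-to-audio step can be sketched with the official `openai` Python package (assuming `pip install openai` and an `OPENAI_API_KEY` in the environment). The model and voice names are OpenAI's stock options, not necessarily the ones used here; byte-concatenating MP3 chunks is a simplification, and a single paragraph longer than the limit would still need further splitting.

```python
MAX_TTS_CHARS = 4096  # per-request input limit for OpenAI speech synthesis

def split_for_tts(text, limit=MAX_TTS_CHARS):
    """Split newsletter text on paragraph breaks so each piece fits one TTS call."""
    parts, current = [], ""
    for para in text.split("\n\n"):
        if current and len(current) + len(para) + 2 > limit:
            parts.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        parts.append(current)
    return parts

def newsletter_to_audio(text, out_path="episode.mp3"):
    """Synthesize the newsletter into an MP3 for the podcast feed."""
    from openai import OpenAI  # requires `pip install openai` and OPENAI_API_KEY
    client = OpenAI()
    with open(out_path, "wb") as f:
        for piece in split_for_tts(text):
            resp = client.audio.speech.create(model="tts-1", voice="alloy", input=piece)
            f.write(resp.read())  # naive MP3 concatenation of the synthesized chunks
    return out_path
```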
Guest: Is it in your, is it in your voice, like a voice clone or something?
Host: It's not in my voice; that would be the next step. It's one of the default voices in OpenAI, but it sounds very similar to what it would if I hired an American narrator.
For example, Prof G has George Hahn, right? So if I want an Indian accent, or if I want to hire someone with an American accent narrating to reach more of an audience, I think this is the perfect way to do it.
Guest: Yeah, but listeners to your podcast will want to hear it in your voice. You've got a distinctive voice.
Host: Sure, yeah, that's probably the next step I should experiment with. That would be cool.
Guest: Can I ask you one other thing?
Host: Sure, yeah, go ahead.
Guest: This has all happened so fast that sometimes I just find myself wondering, does this thing exist?
One thing I've been thinking a lot about lately along these lines: I would love to go to a service where all I do, as a content owner like geekwire.com, is put my URL in. And I'd pay for this.
It creates a large language model based on my full sitemap. For GeekWire, for example, that's articles going back to 2011, and talking about context windows, obviously that's not going to work even at Gemini's size.
But does that exist, where you could just go put in a URL and say, create an LLM?
Host: I haven't seen it. It would be most similar to a RAG sort of application.
I've seen chat with your data applications where you have to download it into a PDF and upload it, but I haven't seen a specific website being plugged in.
As for my podcast idea: the API is pretty easy to use, but the problem is that most people who run newsletters don't know how to use it, and they don't want to deal with going in every week.
So all the product should be is: they give you their newsletter RSS feed and their podcast RSS feed, and every week they publish, it converts and goes there.
That solves the problem for the newsletter writer.
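The product sketched here, watching a newsletter's RSS feed and pushing each new issue through the audio pipeline, mostly reduces to tracking which items have already been processed. A minimal version of that half, using only the standard library and assuming the usual RSS 2.0 `item`/`guid` layout:

```python
import xml.etree.ElementTree as ET

def new_items(rss_xml, seen_guids):
    """Return (guid, title, text) for feed items not yet converted to audio."""
    root = ET.fromstring(rss_xml)
    fresh = []
    for item in root.iter("item"):
        # RSS items are deduplicated by guid, falling back to the link.
        guid = item.findtext("guid") or item.findtext("link")
        if guid and guid not in seen_guids:
            fresh.append((guid, item.findtext("title", ""), item.findtext("description", "")))
    return fresh
```

A weekly cron job could fetch the feed, call `new_items` with the set of already-published guids, run each new issue through text-to-speech, and append the result to the podcast feed.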
And I think you're pointing at something similar: I just want to put in my website, my content, my blogs. And if I have to extend the idea, I'd extend it to not just my website, but my podcast, my YouTube appearances, all the data I've created on the internet that I care about, and then chat with it.
I think we're getting there, though, and it's very possible. I don't see why it doesn't exist; it's a very real application that you could create.
Guest: Yeah, and the question we debate around here when I bring things like this up is: would something like that be an internal tool? The answer to that is definitely yes. But would it also be an external product? In other words, would customers or users find value in a chatbot they could use to query our domain? That, to me, is more of an unanswered question.
Host: Yeah, I think it might make sense for something like what we see companies doing with support, right?
Essentially, the way people are really optimizing support using AI is by taking the whole data dump you've created over the years for support engineers and fine-tuning an LLM on that data, right?
It's sort of what you're saying about your data, just a different kind of data you care about, and then you create a chatbot and provide it to the customers.
And if that fails, you transfer to the customer engineer, instead of going directly to the customer engineer. The usual chatbot that existed before was robotic, streamlined, and primarily rule-based, right?
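The LLM-first, human-fallback routing just described hinges on a confidence threshold. A toy sketch of that logic, where keyword overlap stands in for a fine-tuned model's answer and its confidence, and the FAQ entries are invented for illustration:

```python
def route_ticket(question, faq, threshold=2):
    """Try to answer from the support knowledge base; return None to escalate to a human."""
    q_words = set(question.lower().split())
    best_score, best_answer = 0, None
    for known_question, answer in faq.items():
        overlap = len(q_words & set(known_question.lower().split()))
        if overlap > best_score:
            best_score, best_answer = overlap, answer
    # Below the confidence threshold, hand the ticket to a customer engineer.
    return best_answer if best_score >= threshold else None
```

In a production system the scoring would come from the model itself, but the shape is the same: answer automatically when confident, escalate when not.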
That now becomes much more elevated, and I think we've seen an example of this with, I forget the company, I think it was Klarna, which fired, like, 80% of its support staff for this particular reason.
And I think it's a really powerful example. I do this in my day job, because there's a particular set of documents I keep going back to; I know the answer exists somewhere, but I've forgotten it.
So instead of going through all those documents trying to figure it out, what I did is feed all the documents I care about into a fine-tuned LLM on my local machine, and I can just query it from Python; if it has something, it tells me, and it tells me where it is.
It's very simple, but you can see this being rolled out, and I think this is part of the logic behind things like Copilot for Security and Copilot in Word. I think part of this logic exists in those products now.
I haven't fully experimented with Copilot for Security and the other Copilots, but I would assume that's part of the logic.
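What the host describes, querying a personal set of documents from Python, is in practice closer to retrieval than fine-tuning. A minimal local version, indexing a folder of text files and asking which ones mention a term, needs only the standard library; the filenames and the counting heuristic are illustrative.

```python
from pathlib import Path

def index_docs(folder):
    """Read every .txt document in a folder into memory, keyed by filename."""
    return {p.name: p.read_text(errors="ignore") for p in Path(folder).glob("*.txt")}

def find_mentions(docs, query):
    """Return the documents that mention the query, best match first."""
    counts = {name: text.lower().count(query.lower()) for name, text in docs.items()}
    return sorted((n for n, c in counts.items() if c), key=lambda n: -counts[n])
```

A fancier version would feed the matching passages to a local model for a synthesized answer, but even this tells you "it exists, and it's in this file."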
Guest: Yeah, that's great. I did not mean to sidetrack you from your questions, but that's actually really helpful to me. So thank you.
Host: What do you think about um the whole debate of, you know, data versus uh I think New York Times is suing Open AI, right?
Guest: And Microsoft.
Host: And Microsoft.
And I think some of the authors whose books are part of the Books3 library are also suing OpenAI. I think they mentioned Hugging Face, because the repository was supposedly hosted on Hugging Face, but Hugging Face said the repository was never actually hosted there.
What was hosted was a script that could download the repository. So that's a nuance there. But what do you think about what's happening with free data versus protected data, the whole debate?
Guest: I think there's got to be some sort of standard that emerges, either from the litigation you mentioned or from some sort of legislation. Neither of those tools is ideal.
Certainly legislation is going to be hard to come by these days, at the national level at least. I read the New York Times lawsuit very closely.
It's been a month and a half or so, but the thing that struck me at the time was that they were able to prompt ChatGPT in such a way that it reproduced articles verbatim.
And I think that's where the rubber is going to meet the road in that lawsuit. It makes clear not only that they're using that data to train the LLM, but also that it's hard to argue the output isn't a copyright violation.
I mean, it's very clear that reproducing it in that form, at that length, must violate copyright law, and it's got to go beyond fair use.
So I think those two things are going to be interesting: the training, and then the reproduction. And do you remember last year, and I realize this gets into your employer, but remember when ChatGPT initially integrated Bing search into the plugin, and they took it down for a while?
My understanding was that that was in part because people were able to get around paywalls. In all these ways it's sort of the Wild West right now, and I'm kind of waiting for the sheriff to show up; I just don't know who that sheriff is yet.
Host: Yeah. Um another thing I thought it would be interesting to ask you because I think Geek Wire was closely tracking is layoffs in tech. Yeah. Yeah.
My take on it was that we overhired in tech during the pandemic, assuming the trends would continue, and then we corrected. But my thinking about tech businesses has always been that they're actually well-run, efficient machines compared with the rest of the business world; take Disney, for example, or pick a bigger company in any other sector.
I've always considered Big Tech companies in general to be better run, because they have better data, better tracking, better adjustment to trends; they're more data-driven in general.
Um but it feels like the recent continuous layoffs are more than that. Uh and I I would like to get your take on what you are thinking.
Guest: Yeah, um I think the macro was a few different things.
You had the sort of follow-the-leader thing that happened in late 2022 and early 2023, where it felt like every publicly traded company had to at least show Wall Street that they were making sacrifices.
It felt a little bit like the Hunger Games at the time, unfortunately. And following up on 2022, with the huge raises and the retention battles, it was just this weird flip.
So to your point it felt like it it felt like perhaps they over hired um and and spent too much on head count.
And now it feels like it's more in response to genuine macroeconomic trends where everybody in terms of enterprise customers is pulling back on their budgets, just being much more cautious.
I had a great conversation yesterday with Smartsheet CEO Mark Mader. That's a big company, around $5 billion in market cap and about 3,000 employees, and a couple of interesting points came out of it.
He explained that they're seeing AI juice demand for initial contracts, but what's been pulled back is expansion, existing customers adding seats. They're not doing that at quite the rate they were before.
So it's this weird dichotomy he's seeing. And on the employment side, they've only increased headcount by about 100 employees over the past year or so, and I'm sure there's plenty of attrition and hiring within that.
But and his point was he does not think they're going to have to hire at the pace they used to to achieve the same level of productivity because of all the AI tools.
And so you're already starting to see, I mean, is that AI taking jobs or is that AI taking somebody's job and making it more efficient and making it so that a company doesn't have to create more jobs?
I it's it's really an interesting moment right now in all of these areas.
Host: I mean if AI makes 20% of your job better, that means for every five people, you could only now hire four people, right? So I think that sort of plays out at a company level for sure if they're seeing productivity levels increase.
Host: Another probably interesting thing that's happening right now is the TikTok ban, or divestiture.
Guest: Man. You know, I've dabbled in TikTok.
And my 13-year-old daughter ridicules me endlessly because I'm more of a YouTube Shorts person; the YouTube algorithm just sucks me in. But I'm interested in your take on this.
I've been going back and forth behind the scenes with a well-known investor locally on what's going to happen with this, and his take was that, yeah, this could very well happen.
It would be mechanically possible to get this thing off the app stores if it becomes law. It's just crazy. What do you think about it?
Host: I think they're giving six months to change the ownership. And there's some context: this happened before in the case of Grindr, if you remember. Grindr had to change ownership as well,
because of similar issues of data privacy and so on. They realized they had to change ownership, it happened, and the app stayed operational in the US.
So I think it will not get banned, because too much American interest exists for it to get banned.
I'm a little surprised this has only become popular now; I wrote about Jeff Yass in my newsletter about eight months back.
Jeff Yass owns a significant stake in TikTok, and he has been funding super PACs to push back on the narrative around banning TikTok.
Um but I think it's too big of a thing to let Tik Tok operate in the way it operates today. And I think there are two parts of it.
One is that you obviously have to force divestiture, to make sure no foreign influence is actually controlling your audience, your citizens.
The second part is that because it's TikTok and the narrative is so hyped up, I think the proper trade policy with China is not getting the focus.
It would be advantageous for the US to get access to Chinese markets, more than banning or not banning TikTok, if you have to think long term here.
I think that's the narrative that really needs to be driven, and where you really need a policy. Because at the end of the day, TikTok is just one company. People say it's worth, like, $250 billion. I don't think it is.
Um you know some people even say it's a trillion dollar worth company.
I don't think it is, because once Instagram and YouTube Shorts start to get better at algorithms, the network effects take over. I think people will eventually end up on Instagram more than TikTok.
Guest: Do you feel like the data privacy and the manipulation concerns that are expressed are are legitimate from a technical perspective?
Host: They are legitimate, because all manipulation is slow manipulation. It's not fast manipulation, right? It's not like one day I run an ad campaign and everyone changes their mind.
It's a it's always a slow manipulation.
So if there were no TikTok, it would be Fox News, it would be MSNBC, where the ad dollars would be spent, with the talking heads funded by different groups: a secret super PAC funding a PAC that funds someone who gives you a particular opinion on a different channel.
I think that has been the MO on how these things operate, right? So I think it would change, but I think it's the scale is also different, right?
I mean the scale of technology is different than you know, putting 100 talking heads across different channels and getting the message out. I think it's definitely different. It is definitely a valid concern. I think it's more than data privacy.
It's mostly narrative manipulation, I would say. I'm not really that concerned about data privacy with TikTok, because how is some 16-year-old's data going to be so important to China?
What are they going to do with that, right? I think everyone feels that their data privacy is so important, and I don't think it is,
except for your bank accounts and your property records. If I'm eating cheese on a particular day and Amazon wants to know about it, feel free; serve me that ad on Prime Video.
Guest: So, you mentioned politics is obviously at play here, and the US presidential election makes this interesting in two respects. One, if it's six months, or probably more, that they're going to give them, I would imagine those kinds of things end up getting stretched out.
I mean, there was that whole thing with Oracle and data sovereignty and all that, which went on for several months. Anyway, what happens if there's an administration change?
And then two, you're talking about this subtle long-term manipulation: what happens in the meantime if this platform can be abused by different interests? What a crazy time.
Host: And I think it's also that America is generally open, right? It's more capitalistic, more open to other companies and outside countries investing in America. Everyone wants to invest in the US.
So it's not in the nature of US policy to ban things, and I don't think they're making this a ban. They're giving six months, and I think that's good enough time to change ownership.
Maybe not to move data centers and all the tech per se, but a change in ownership can be done in six months, right?
Guest: Yeah. Interesting.
Guest: And you're seeing everyone cite the precedent of social media as a reason to start paying more attention to AI now. What happened here? Let's pay more attention here. And so here you have it, in some respects, coming home to roost in the form of TikTok.
And yet you've got all these people who have built businesses on it.
So what happens there? It's really a fascinating time.
Host: But there's also the fact that there are people who build businesses on TikTok, and I've often observed that a successful TikTok creator actually goes on to become a YouTube creator.
TikTok is where it begins as a hobby; they find their product-market fit on what type of content to make and what to sell, but then they move to YouTube, where they make real money, because of how big YouTube actually is.
So even if you're making a lot of ad revenue from TikTok, number one, it's not as organic as YouTube: TikTok had to create a separate fund to pay what you might call creator salaries.
So it's not as organic an ecosystem as YouTube is.
I've always found that TikTok creators end up becoming YouTube creators: TikTok is where you get your first internship, and then you figure out how to do the job and actually find the job on YouTube. Oh, I lost you there.
Oh can you hear me now?
Guest: I can hear you now. Was it my internet connection? That's weird. Sorry, go ahead.
Host: Yeah, I think TikTok is where people find the product-market fit of what type of content they make, but eventually they realize there's a bigger opportunity on YouTube, because a real creator middle class, if we can call it that, only exists on YouTube.
Every other channel sort of merges into YouTube. YouTube is where people make real money.
There are obviously niche things like newsletters, but the ratios are staggeringly different: if 10 people out of 100 can make money on YouTube, on newsletters it will be one or two, right?
So overall, I would say a TikTok ban will not even affect businesses, if it happens at all, because there are alternative products, right?
Guest: Yeah.
From a broad perspective, I can only imagine that Microsoft, Facebook, and to some extent Amazon are more than happy to see this play out, because the spotlight, at least for the moment, is not on them when it comes to the tech industry. It's funny how it shifts.
It sort of depends on who's testifying before the Senate Judiciary Committee, or whatever, in any given week.
Host: I think that's the same reason Facebook changed its name to Meta: Zuck didn't want the attention. They just said, let's change the company's name and see if that helps.
Guest: Yes, exactly.
Host: Another topic I thought would be interesting to talk about: what are you seeing with the back-to-office trends? There are sort of mixed opinions.
Some companies are requiring employees to come in, some less so. So what are you seeing, and where did you end up?
Guest: Yeah. I love this topic. Um I was out at