TMT Weekly: MSFT, GOOGL, ChatGPT, and the New Compute Paradigm

In the News

MSFT reports today. The OpenAI deal structure, while interesting, is less interesting than the idea that Bing could become relevant. Can a ChatGPT-powered MSFT compete with GOOGL in search? What would that look like? Let’s look for some clues in expert call transcripts. 

Request access to all 28,000+ expert call transcripts in the Stream library here.

1) Is MSFT+OpenAI a threat to GOOGL in search?

GOOGL Former AI Product Manager Does Not Think ChatGPT Is a Big Threat to GOOGL’s Core Search Business
Google – Product Manager, Google AI (Prior)

This former Google AI PM thinks that Microsoft and OpenAI will de-risk the consumer UX transition from keyword search to conversational search, giving Google information it can use in its own go-to-market strategy.

“Let’s break that question down into two separate areas. One is Bing and Search specifically and then the other is how can you use it in other different products. Google has almost 90%+ share of the search market at this point of time. It’s very hard to disrupt such a player.

Google is taking a backseat here. They want to be sure that it’s an upgrade before they release anything on the Google side. What Microsoft will end up doing for Google is de-risk the user experience… I think the game is going to be of user experience: if you are adding a conversational interface to an industry that has traditionally been keyword-based, that’s going to be a huge shift.

In terms of the technological capabilities, I’m pretty sure that Google probably has something close. I’m not worried that they don’t have the technology or the means to get there. I think it’s more about the risk-taking ability and the go-to-market… you have to have some appetite for failure. I feel that over the years, that appetite within Google has gone down. The second thing: the risk-taking also is directly proportional to brand identity. If they launch something, if it doesn’t gel well, maybe let’s say it’s not factually correct, then could the question be, “Hey, Google is not correct anymore or Google is giving incorrect results?” That’s a big hit on their brand reputation.

I think those are probably the reasons why Google is waiting on the sidelines at this point of time. My prediction would be that Google will showcase their technology or showcase the level at which their technology can operate in the next couple of months even if they don’t launch it.”

Google – Product Manager, Google AI (Prior)

 

GOOGL Former Sr. Director of Engineering Thinks GOOGL Will Be Displaced Unless It Reinvents Itself
Google – Senior Director Of Engineering, Google Search (Prior)

This headline sounds contrary to the prior expert’s view, but the two are saying much the same thing: GOOGL no longer has the option of not releasing an AI-assisted search product.[1]

“Something will displace Google only if Google becomes complacent, does not read the tea leaves, and does not respond. There is a reasonable likelihood that Google, with its size and scale, can focus enough, can pivot, and redirect resources in order to respond. When it does… it has the momentum that no one else has.

If Google analyzes the tea leaves correctly, they will be able to respond and hold their market share and continue their hegemony. Your question was if something like GPT is going to displace Google. In engineering lingo, this is an implementation detail. This is why this is part of, in my opinion, a buzz and a hype. The fact that this solution right now can talk in natural language, it’s a nice toy. It’s interesting. The thing that is far more interesting is the quality of information it provides.”

Google – Senior Director Of Engineering, Google Search (Prior)

2) What are some other implications of the adoption of LLMs?

GOOGL Former AI Product Manager Does Not Think ChatGPT Is a Big Threat to GOOGL’s Core Search Business
Google – Product Manager, Google AI (Prior)

MSFT’s ChatGPT-powered Bing search looks to be significantly more expensive to serve (a rough cost illustration follows the quote below).

“These [large language] models are, of course, larger at this point of time. I think my estimate would be that for a search query, these cost about 100X-200X more. In terms of the cost of one search query versus one ChatGPT query, I think the latter would be probably 200X in terms of cost.”

Google – Product Manager, Google AI (Prior)
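To put that 100X-200X range in context, here is a minimal back-of-envelope sketch in Python. Every input, the assumed cost of a traditional keyword query and the assumed daily query volume, is a hypothetical chosen for illustration rather than a figure from the transcript; only the multiplier comes from the expert.

```python
# Back-of-envelope sketch of the cost gap the former PM describes.
# All dollar figures and volumes are illustrative assumptions, not disclosed costs.

TRADITIONAL_SEARCH_COST = 0.0002   # assumed $ per keyword search query (hypothetical)
LLM_MULTIPLIER_LOW, LLM_MULTIPLIER_HIGH = 100, 200  # expert's 100X-200X range
DAILY_QUERIES = 1_000_000_000      # assumed daily query volume for a large engine

def annual_cost(per_query_cost: float, queries_per_day: int) -> float:
    """Annualized serving cost for a given per-query cost."""
    return per_query_cost * queries_per_day * 365

baseline = annual_cost(TRADITIONAL_SEARCH_COST, DAILY_QUERIES)
llm_low = annual_cost(TRADITIONAL_SEARCH_COST * LLM_MULTIPLIER_LOW, DAILY_QUERIES)
llm_high = annual_cost(TRADITIONAL_SEARCH_COST * LLM_MULTIPLIER_HIGH, DAILY_QUERIES)

print(f"Keyword search:     ${baseline / 1e9:,.1f}B per year")
print(f"LLM search (100X):  ${llm_low / 1e9:,.1f}B per year")
print(f"LLM search (200X):  ${llm_high / 1e9:,.1f}B per year")
```

Even with made-up inputs, the point survives: a fixed multiplier on a per-query cost that is tiny in isolation becomes a very large number at search-scale volume.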

 

QCOM Former High Performance Computing Software Engineer Believes Compute Capacity Necessary to Run AI Models Will Only Increase
Qualcomm – High Performance Computing Software Engineer (Prior)

The interesting – and terrifying[2] – point this software engineer makes is that large language models, and the engineers who run them, will remain huge consumers of compute regardless of advances in chip speed. All available compute will be “eaten up,” so it’s worth considering investments that benefit from huge LLM demand for compute, such as chipmakers (a toy illustration of this dynamic follows the quote below).

“Talking of the compute side, I think the reason AI development is so different than traditional software and so different than even hardware development is because a lot of it is just compute. If you have unlimited resources, you can run multiple experiments at once in parallel and see what does best. It’s all parallel, and serving is sequential.

Traditionally as a programmer, let’s say you have a GPU now that’s 4X faster. You will just give it 4X more data because it’s fine. You can see better performance. A lot of times in AI development, you sometimes have to reduce your model size or you have to reduce the amount of data that you’re giving it so that it finishes in a certain amount of time. As you get faster GPUs, you can make your experiments run faster, but you can also just train bigger models, you can train with bigger data, so I think it all just gets eaten up.

TSMC or Samsung is a great bet because they’re the ones producing all the high-end chips.”

Qualcomm – High Performance Computing Software Engineer (Prior)
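The “it just gets eaten up” dynamic can be made concrete with a small Python sketch. It uses the common rule of thumb that training cost is roughly 6 × parameters × tokens FLOPs; the cluster throughput, model size, and scaling choices below are assumptions for illustration, not anything the engineer specified.

```python
# Illustrative sketch of the "compute gets eaten up" dynamic described above.
# The scaling rule and all numbers are assumptions for demonstration only.

def training_hours(model_params: float, tokens: float, flops_per_sec: float) -> float:
    """Rough wall-clock estimate using the ~6 * params * tokens FLOPs heuristic."""
    total_flops = 6 * model_params * tokens
    return total_flops / flops_per_sec / 3600

# Today's setup: a hypothetical model that trains within a fixed time budget.
params, tokens = 10e9, 200e9   # 10B parameters, 200B training tokens (assumed)
gpu_flops = 1e15               # assumed aggregate cluster throughput (FLOP/s)
budget = training_hours(params, tokens, gpu_flops)

# A 4X faster cluster arrives. Instead of finishing 4X sooner, the engineer
# grows the model and the dataset (2X each here) until the run fills the same budget.
faster_flops = 4 * gpu_flops
bigger_run = training_hours(2 * params, 2 * tokens, faster_flops)

print(f"Original run:            {budget:,.1f} hours")
print(f"Same model, 4X faster:   {training_hours(params, tokens, faster_flops):,.1f} hours")
print(f"2X params, 2X tokens:    {bigger_run:,.1f} hours  (budget fully 'eaten up')")
```

The takeaway mirrors the quote: a 4X faster cluster does not produce a 4X shorter run, because the extra throughput gets spent on a bigger model and more data until the original time budget is full again.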

Notable Recent Expert Call Transcripts – TMT

Notable based mostly on the opinion of the analyst who conducted the interview, with some discretion from the author.

Information Technology

Media

Communications

Bonus: <$10 billion market cap

We have clients who make their hay in midcap, so we’ll start including a few of these each week going forward.

 

Footnotes:

  1. This is exactly the capitalist dynamic that will lead to all of us being converted into paperclips by an unaligned AI.
  2. Hence the update that, instead of paperclips, we will all be converted into a cloud of computronium expanding away from Earth at 99% of the speed of light.
ABOUT THE AUTHOR
Austin Moorhead
Content Marketing for Stream by AlphaSense

Austin’s primary experience is in consulting and private equity, though he’s also a published author.
