ChatGPT-Style Search Represents a 10x Cost Increase For Google, Microsoft

An anonymous reader quotes a report from Ars Technica: Today, Google search works by building a huge index of the web; when you search for something, those index entries get scanned, ranked, and categorized, with the most relevant entries showing up in your search results. Google’s results page actually tells you how long all of this takes, and it’s usually less than a second. A ChatGPT-style search engine would involve firing up a huge neural network modeled on the human brain every time you run a search, generating a bunch of text and probably also querying that big search index for factual information. The back-and-forth nature of ChatGPT also means you’ll probably be interacting with it for a lot longer than a fraction of a second. All that extra processing will cost a lot more money. After speaking with Alphabet Chairman John Hennessy (Alphabet is Google’s parent company) and several analysts, Reuters writes that “an exchange with AI known as a large language model likely costs 10 times more than a standard keyword search” and that it could represent “several billion dollars of extra costs.” Exactly how many billions of Google’s $60 billion in yearly net income would be sucked up by a chatbot is up for debate. One estimate in the Reuters report comes from Morgan Stanley, which projects a $6 billion yearly cost increase for Google if a “ChatGPT-like AI were to handle half the queries it receives with 50-word answers.” Another estimate, from consulting firm SemiAnalysis, puts the figure at $3 billion. […] Alphabet’s Hennessy told Reuters that Google is working on driving down costs, calling it “a couple year problem at worst.”

Read more of this story at Slashdot.