GPT-5 Set the Stage for Ad Monetization and the SuperApp
To many power users (Pro and Plus), GPT-5 was a disappointing release. But on closer inspection, the real release is aimed at the vast majority of ChatGPT's users: the 700M+ free user base that is growing rapidly. Power users should be disappointed; this release wasn't for them. The real consumer opportunity for OpenAI lies with the largest user base, and getting the unmonetized users who currently use ChatGPT infrequently to pay indirectly is its largest opportunity.

Analysts focused on model capabilities are missing the much larger context: the network effects ChatGPT is gaining quickly. In November of 2023, ChatGPT wasn't even in the top 100 websites; now it is number 5, larger than X/Twitter, Reddit, WhatsApp, and Wikipedia, and quickly approaching Instagram, Facebook, YouTube, and Google. Every property on the top 10 list is much older than ChatGPT.com, and the sheer number of unmonetized users is staggering. But that changes with GPT-5. OpenAI is laying the groundwork to monetize one of the largest and fastest-growing web properties in the world, and it all begins with the router.

On OpenAI's release website, the second paragraph is about the "One United System," specifically the router. The wording is instructive:

> GPT-5 is a unified system with a smart, efficient model that answers most questions, a deeper reasoning model (GPT-5 thinking) for harder problems, and a real-time router that quickly decides which to use based on conversation type, complexity, tool needs, and your explicit intent (for example, if you say "think hard about this" in the prompt). The router is continuously trained on real signals, including when users switch models, preference rates for responses, and measured correctness, improving over time. Once usage limits are reached, a mini version of each model handles remaining queries. In the near future, we plan to integrate these capabilities into a single model.
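The routing behavior OpenAI describes can be sketched in a few lines. This is a minimal illustration of the logic in the quoted passage, not OpenAI's implementation; all model names, signals, and thresholds here are hypothetical:

```python
# Toy sketch of the described router: pick between a fast model
# and a reasoning model based on complexity, tool needs, and
# explicit intent, and fall back to mini variants once usage
# limits are reached. Everything here is hypothetical.

def route(query: str, complexity: float, needs_tools: bool,
          over_usage_limit: bool) -> str:
    # Honor explicit intent, e.g. "think hard about this".
    explicit_intent = "think hard" in query.lower()
    wants_thinking = explicit_intent or needs_tools or complexity > 0.6
    model = "gpt-5-thinking" if wants_thinking else "gpt-5-main"
    # Once usage limits are reached, a mini version of each
    # model handles the remaining queries.
    return model + "-mini" if over_usage_limit else model

print(route("Why is the sky blue?", 0.1, False, False))  # gpt-5-main
print(route("think hard about this proof", 0.9, True, False))
```

The key point for what follows is that the router is a single, centralized decision point in front of every free query, so adding one more input signal to a function like this is trivial.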
The router serves multiple purposes on both the cost and performance sides. On the cost side, routing users to mini versions of each model lets OpenAI serve users more cheaply. On the performance side, it exposes many users to "thinking," i.e. CoT (Chain of Thought) reasoning, for the first time. Over 99% of free users have yet to interact with a thinking model like o3, so for the average user, ChatGPT just got a huge upgrade. The number of free users exposed to thinking models went up 7x in the first day, and the number of paying users nearly 3.5x.

But the router is clearly a feature of the new service, one that can see improvements or changes over time. It will continuously learn from preference rates, and OpenAI promises it will improve over time. It takes just a single additional attribute to begin the path to monetization: the commercial value of the query. We believe the router is the groundwork for the next leg of ChatGPT's story, and that's monetization of free users. Centralizing control of the free user experience opens many more future monetization paths. And this monetization path is one that has been hinted at subtly for a while.

It all starts with OpenAI's decision to hire Fidji Simo as CEO of Applications in May. Her background is telling. Fidji was at eBay from 2007 to 2011, but her defining career was at Facebook, where she rose to Vice President and Head of the Facebook app, and she is known for her ability to monetize. She was critical in rolling out autoplaying video, improving the Facebook feed, and monetizing mobile and gaming. She might be one of the most qualified people alive at turning high-intent internet properties into ad products, and now she's at the fastest-growing internet property of the last decade, one that is unmonetized. It's an obvious story.

What's more, Sam Altman himself has had a very direct tone shift in the last year.
> "I will disclose as a personal bias I hate ads. I think ads were important to give the early internet a business model. I'm not totally against them, but ads plus AI are uniquely unsettling to me. I kind of think of ads as a last resort as a business model."

But in recent interviews his tone has shifted. There is clearly a lot of recent thought about how best to monetize free users, likely in conjunction with Simo's hiring.

> "I am not totally against it… if you compare us to social media or web search where you can kinda tell that you are being monetized… we would hate to ever modify anything in the stream of an LLM… maybe if you click on something in there that is going to be there we'd show anyway, we'll get a bit of transaction revenue and it's a flat thing for everything, maybe that could work. It's clearly possible to be a good ad-driven company but there are obviously issues to it."

Compared to the previous conversation, where he was dismissive, his most recent comments clearly show that Sam Altman is thinking about monetizing free users. He mentions a take rate and a potential affiliate model. And in response, the interviewer (Andrew Mayne of OpenAI) literally says, "I would love to do all my purchasing through ChatGPT because oftentimes I feel like I am not making the most informed decisions."

This is likely the direction OpenAI is taking. With the router, ChatGPT can now understand the intent of a user's queries and, importantly, can decide how to respond. It takes only one additional step to decide whether a query is economically monetizable. Today we will make our case that ChatGPT's monetized free end state could look like an agentic super-app for the consumer. This is only possible because of routing. We believe display ads are unlikely; Perplexity has tried this, and it doesn't seem to be going well.
Instead of inserting a paid feature into the query, we believe it's more likely that they will pursue a take-rate-based model. An agentic assistant could align Sam Altman's vision of AI being helpful with monetization via a transaction take rate.

Let's talk about agentic purchasing and compare it to a search query today, because LLMs have a core feature that search does not: scaling marginal costs. This is fundamentally different from the world search grew up in. Consider Ben Thompson's "Aggregation Theory," whose core observation was that most technology companies had zero marginal cost per additional user. There were some fixed overheads to running the largest search engine, but the incremental cost of another query was virtually zero.

Agents and LLMs kill this concept. For the first time, the more you spend, the better your result, because of CoT reasoning tokens, and marginal costs exist in software again. There is a somewhat direct relationship between more money, more compute, and a better answer. Nowhere is this clearer than in AI, where you can spend variable costs to get a variably better answer or outcome.

So let's apply marginal costs to a new purchasing experience. Compare two queries, an information query and a commercial query:

- A trivial information query: "Why is the sky blue?"
- A highly commercial query: "What is the best DUI lawyer near me?"

Before the router, there was no way to distinguish these queries. After the router, the first, low-value query can be routed to a GPT-5 mini model that answers with zero tool calls and no reasoning, which likely means the cost of serving this user approaches the cost of a search query. A search query, on the other hand, has a fixed cost: it would show a page ranking of websites, with perhaps an AI summary at the top. This is a fixed-supply response to what could be a variably hard question.
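The marginal-cost argument can be made concrete with a toy calculation. In this sketch, all prices, margins, and revenue figures are entirely hypothetical, not OpenAI's actual economics: an operator that has estimated a query's expected transaction value can size its reasoning-token spend so that variable inference cost stays below expected revenue, while low-value queries get a near-zero budget.

```python
def reasoning_budget(expected_revenue_usd: float,
                     cost_per_1k_tokens_usd: float = 0.01,
                     target_margin: float = 0.5) -> int:
    """Size the reasoning-token spend so that variable inference
    cost stays under (1 - target_margin) of expected per-query
    revenue. All numbers are illustrative assumptions."""
    spendable = expected_revenue_usd * (1.0 - target_margin)
    return max(0, round(spendable / cost_per_1k_tokens_usd * 1000))

# Information query: no transaction attached, so spend ~nothing
# beyond a cheap mini-model response (search-like unit economics).
print(reasoning_budget(0.0))  # 0

# Commercial query: a referral could plausibly be worth dollars,
# so the operator can afford deep reasoning and tool calls.
print(reasoning_budget(5.0))
```

Search cannot make this trade: its per-query cost and response format are fixed. The point of the sketch is that an LLM operator can vary compute spent per query against that query's expected commercial value.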
But now ChatGPT free (because of routing) can dynamically answer a harder question with a better answer, which is not how search is designed today. Below shows…