OpenAI released the ChatGPT plugin feature on Thursday (3/23). ChatGPT has taken a big step toward becoming a super app for information gathering and action taking. Plugins also close two of its gaps: access to real-time information (by delegating to third-party applications) and the lack of action taking after a conversation. Honestly, this makes it look a lot like Alexa, which orchestrates voice conversations and dispatches the traffic to 1P (Amazon-owned) and 3P (partner-owned) applications. However, Alexa could not dominate the market with this architecture and business model. Could ChatGPT?
ChatGPT and Alexa are both conversational bots, but with significantly different business models and technical architectures. ChatGPT is “a super app” that aggregates all the information and processes it with its own LLM, which means it does not distribute traffic or hand over control to third parties for customer interaction. Alexa, by contrast, builds a channel that bridges the customer’s needs with third-party applications’ capabilities. Alexa never had a crystal-clear business model to make it profitable, and the hard part is that it can deliver only a single spoken result, with no rich-media opportunities, so it has to monetize through downstream impact via traffic allocation. ChatGPT, on the contrary, has a UI that can serve rich-media content, but it is still constrained to offering a single result. It is hard to make an apples-to-apples comparison between Alexa and ChatGPT, as they serve different market segments, but I can still share some thoughts based on my several years of experience working on Alexa.
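For concreteness, here is roughly how a third party plugs into ChatGPT: it hosts a small manifest pointing at an OpenAPI spec, and the model decides when to call the service based on a natural-language description. The sketch below (written as a Python dict for readability) uses field names from OpenAI's plugin documentation; the service name and URLs are hypothetical.

```python
# A minimal sketch of the ai-plugin.json manifest a third-party service
# hosts so ChatGPT can discover it. Field names follow OpenAI's plugin
# documentation; the service name and URLs here are hypothetical.
manifest = {
    "schema_version": "v1",
    "name_for_human": "Example Travel",   # shown to the user
    "name_for_model": "example_travel",   # how the LLM refers to the plugin
    "description_for_human": "Search flights and hotels.",
    "description_for_model": (
        "Plugin for searching flights and hotels. "
        "Use when the user asks about travel."
    ),
    "auth": {"type": "none"},
    "api": {
        "type": "openapi",
        # OpenAPI spec describing the endpoints the model may call:
        "url": "https://example.com/openapi.yaml",
    },
}

# The model picks a plugin based on description_for_model, then calls the
# endpoints in the OpenAPI spec -- the UI is never handed over to the 3P.
print(manifest["api"]["type"])
```

Note how different this is from Alexa's model: the third party exposes an API and a description, and ChatGPT keeps full control of the conversation and the rendering of results.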
I see three challenges and one opportunity here.
1. Data-ownership challenges. Customer data is critical to a business. In the ChatGPT model, third-party applications lose control of two kinds of customer data: 1) customer behavior data, i.e., how customers use the application; and 2) customer traffic data, i.e., how customers are acquired into the application. The second loss is especially serious: because ChatGPT is more convenient for the customer, a standalone application will lose direct traffic and face a growing risk of being replaced.
2. Scaling challenges. Latency is a big challenge when you integrate many applications into one system. It surfaces in two dimensions: 1) the additional layer that recommends one application over another, like the skill-selection layer we built in Alexa; and 2) third-party applications’ performance quality: whether they can respond within a given time budget, and the quality of the response content.
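Both dimensions can be sketched in a few lines: a recommendation layer that ranks candidate plugins for a query, and a per-plugin latency budget so one slow third party cannot stall the whole conversation. The ranking heuristic, plugin names, and handlers below are hypothetical stand-ins, not how ChatGPT or Alexa actually implement selection.

```python
# Sketch of the two scaling dimensions: a routing layer that recommends a
# plugin for a query, plus a latency budget on each third-party call.
# The scoring heuristic and the plugins themselves are made up for illustration.
import concurrent.futures
import time


def rank_plugins(query, plugins):
    """Crude recommendation layer: score plugins by keyword overlap."""
    words = query.lower().split()
    scored = [(sum(w in p["keywords"] for w in words), p) for p in plugins]
    scored.sort(key=lambda sp: sp[0], reverse=True)
    return [p for score, p in scored if score > 0]


def call_with_deadline(handler, query, timeout_s=1.0):
    """Enforce a latency budget on a third-party plugin call."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(handler, query)
        try:
            return future.result(timeout=timeout_s)
        except concurrent.futures.TimeoutError:
            return None  # plugin too slow; fall through to the next candidate


def dispatch(query, plugins):
    """Try plugins in recommended order until one answers within budget."""
    for plugin in rank_plugins(query, plugins):
        answer = call_with_deadline(plugin["handler"], query)
        if answer is not None:
            return plugin["name"], answer
    return None, None


# Hypothetical plugins: one that misses the deadline, one that responds fast.
plugins = [
    {"name": "slow_weather", "keywords": ["weather"],
     "handler": lambda q: time.sleep(2) or "forecast"},
    {"name": "fast_flights", "keywords": ["flight", "book"],
     "handler": lambda q: "3 flights found"},
]

print(dispatch("book a flight", plugins))
```

Even this toy version shows the trade-off: every layer of ranking and every retry after a timeout adds latency that the user experiences directly, which is exactly why third-party response quality becomes a gatekeeping concern for the platform.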
3. Legal-risk challenges. The legal risk is well discussed: because every response comes from ChatGPT itself, it cannot be protected by Section 230.
4. CBO/CBM opportunities. Having said so much about challenges, this also opens a blue-ocean opportunity for Chat Bot Optimization and Chat Bot Marketing, analogous to SEO and SEM. As more and more plugins are accepted and integrated into ChatGPT, third-party plugins will fight for traffic, much like the first page of a search engine or the buy box on Amazon, to guarantee their visibility. How to market on ChatGPT to promote visibility, or in other words how to play with the LLM, will be a hot topic.
Anyway, it is still a positive move and a worthwhile exploration toward making AI more powerful in serving human society. It is a turning point, and we have no choice but to brace for this change. Are we ready?