How to use OpenAI to not get disrupted: a SaaS founder’s guide

[Originally published on Medium here.]

Platform shift underway

You are experiencing the fastest platform shift of all time.

“What’s a platform?” Well, it’s not the technical term developers typically use.

Instead, think of a platform as a space where people interact and exchange things. Before Amazon and Flipkart, physical street shops dominated the shopping scene. Prior to eBay, scrap sellers were the go-to option for second-hand goods. And before Google, there was the Yellow Pages. When platforms shift, the old titans rarely maintain their supremacy.

Recently, Web3 emerged as a promising platform shift, yet it hasn’t taken off as anticipated — at least, not yet. Its use cases have been largely confined to finance, without much expansion beyond that. If we use Metamask wallet users as a gauge for adoption, there are about 21 million users, while rough estimates put the number of Bitcoin users at over 50 million. It took Metamask several years and Bitcoin around a decade to reach these figures. In contrast, OpenAI’s ChatGPT garnered 100 million users in two months. This beats other technologies like TikTok (1 year), the Apple App Store (2 years), Instagram (2.5 years), WhatsApp (3.5 years), and Twitter (5 years).

If you study the history of SaaS you’ll discover that the Application Service Provider (ASP) model laid the groundwork for the multi-tenant architecture of software. However, it wasn’t until the rise of Google Search in 2003 — when users became more comfortable searching, trying, and purchasing software online — that SaaS truly flourished.

For new platforms to succeed, it’s essential to have not only mature technology (supply) but also widespread adoption (demand).

When I graduated in 2003 in Computer Science (with a specialization in computer vision), AI did not hold much appeal in the business world. Even the toys of the time lacked the excitement and innovation we see today. In the realm of research, it was a barren landscape. I personally found neural networks captivating, but most respected researchers were concentrating on heuristics-based algorithms that could only solve a tiny fraction of a problem.

A significant turning point came in 2012, when deep learning burst onto the scene with groundbreaking results on the ImageNet challenge. However, that development didn’t immediately translate into widespread technology adoption. Following the breakthrough, venture investment in the field skyrocketed to $35 billion annually, only to eventually dwindle to a mere $1 billion last year (source: Crunchbase).

AI has been around for decades, but it is the adoption of ChatGPT that will be marked in history as the moment AI became mainstream: January 2023, when it reached 100 million users.

There are three key ingredients to what made ChatGPT successful, plus one underappreciated factor.

First is the chat interface. Although there are hundreds of chatbot platforms and conversation intelligence startups with substantial funding, none emerged as a winner in any niche. ChatGPT, on the other hand, resonated with millions of users, familiarising them with the chat interface. It’s now clear that ChatUI is the new GUI.

Secondly, the incorporation of Transformers was a game-changer. The T in ChatGPT stands for Transformer. In 2017, Google released the paper “Attention Is All You Need,” which introduced the Transformer as a neural network architecture. The authors originally designed it to solve translation tasks such as English to French. However, Transformers broke records on state-of-the-art benchmarks not only in translation but across domains. This breakthrough unlocked significant advances in AI and deep learning, allowing neural networks to tackle language, computer vision, image processing, and practically every other AI task. [If you are new to transformers or to AI in general, read the beginner guide here to familiarise yourself.]
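
To make this concrete, here is a minimal sketch, in plain NumPy, of the scaled dot-product attention at the heart of the Transformer. It is an illustration with toy dimensions, not a faithful model: real systems stack many attention heads and layers on top of this.

```python
# A toy sketch (NumPy only) of the scaled dot-product attention described in
# "Attention Is All You Need". Real transformers stack many attention heads
# and layers; this only shows the core computation.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # how strongly each token attends to every other token
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the sequence
    return weights @ V                               # each output is a weighted mix of value vectors

rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))                     # 4 tokens, 8-dimensional embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = scaled_dot_product_attention(tokens @ Wq, tokens @ Wk, tokens @ Wv)
print(out.shape)  # (4, 8): every token's new representation "sees" the whole sequence
```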

Third, Reinforcement Learning from Human Feedback (RLHF) played a crucial role. Without that feedback loop, transformers would produce garbage or cringe-worthy responses. Google’s Bard and Microsoft Bing’s Sydney bombed because, under the pressure of releasing a demo, they didn’t incorporate adequate RLHF.
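
To build intuition for why feedback matters, here is a toy, entirely invented illustration of the idea (not OpenAI’s actual pipeline): a tiny softmax policy samples canned responses, receives a human preference score as reward, and a REINFORCE-style update gradually shifts it toward the response humans rate highly.

```python
# A toy, invented illustration of learning from human feedback; this is NOT
# OpenAI's pipeline. A tiny softmax "policy" picks one of three canned
# responses, receives a human preference score as reward, and a REINFORCE
# update pushes it toward the response humans rate highly.
import numpy as np

rng = np.random.default_rng(0)

candidates = [
    "Sorry, I can't help with that.",
    "Here is a clear, polite and correct answer.",
    "LOL idk, figure it out yourself.",
]
human_reward = np.array([0.2, 1.0, 0.0])  # stand-in for human preference labels

logits = np.zeros(len(candidates))        # the policy's learnable preferences
learning_rate = 0.5

for _ in range(200):
    probs = np.exp(logits) / np.exp(logits).sum()   # softmax policy
    action = rng.choice(len(candidates), p=probs)   # sample a response
    reward = human_reward[action]                   # "human" feedback
    baseline = probs @ human_reward                 # variance-reduction baseline
    grad = -probs                                   # gradient of log prob(action) w.r.t. logits
    grad[action] += 1.0
    logits += learning_rate * (reward - baseline) * grad

print(np.round(np.exp(logits) / np.exp(logits).sum(), 3))  # probability mass shifts to the polite answer
```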

Finally, the most underappreciated aspect of OpenAI and ChatGPT may just be Sam Altman. Having led YC as it grew to a staggering 3,000 startups, he has seen all the possible ways not to build a successful startup. He was able to combine research with product and ship something that became the fastest-growing product of all time. It is interesting to note that the Transformer was created by researchers at Google, but it is the implementation by the OpenAI team led by Sam Altman that has caused this inflection.

Google has 10 times more research power than the OpenAI team, but that doesn’t necessarily mean it will execute better or faster. Google feels a bit like Bell Labs did years ago. Bell Labs had some of the most brilliant scientific minds, Nobel laureates among them, but no one remembers it as a company that moved the frontiers of technology into products; it was startups that ultimately led the way in pushing those boundaries. Satya Nadella, CEO of Microsoft, understands this well. Even two decades ago, Microsoft employed some of the world’s most renowned AI and computer vision researchers. That did not stop Satya from making a $10 billion investment in Sam Altman’s OpenAI, a scrappy startup team that executes at warp speed.

Innovation requires speed, and large corporations can’t move as quickly as nimble startups without creating chaos in the process. Elephants can’t run at the speed of a cheetah without creating carnage. This platform shift of AI is led by OpenAI and will be copied and distributed by Microsoft.

If you have limited time as a founder, pay attention to OpenAI and no one else.

This platform shift will drown many. Investors have had a hot-and-cold relationship with AI: 40 years ago there was a gold rush, followed by an AI winter. Since 2012, after ImageNet, there has been renewed interest in AI from venture investors.

Over the last two decades, per Crunchbase, approximately $263 billion has been invested in roughly 8,055 AI startups. However, over the last ten years only 1,573 of these startups have seen liquidity via M&A, generating a combined value of $123 billion. That weak return has shown up as declining interest: only $1 billion was invested in AI startups in 2022, compared to the staggering $35 billion invested in 2015.

What’s even more significant is that all AI startups funded before 2018 are in for a major blow. These companies raised money to build systems on weak pre-transformer architectures and are unlikely to compete with ChatGPT-type systems built on the transformer. The vast majority of them will likely be wiped out, unable to keep up with the advances the transformer architecture has brought. Even those that managed to raise $50–100 million in funding will look like travellers who bought a ticket to only half of their destination.

In fact, all AI investments made before 2018, which amount to $239 billion, are now obsolete, since they went into technology that predates the transformer architecture. Schumpeter’s creative destruction is at play here. These AI startup founders must abandon their previous architectures and embrace the transformer. Those who are slow or unable to make the switch will inevitably shut down.

This may lead venture investors who backed startups before 2023 to steer clear of investing in AI, which would be a grave mistake. The new paradigm has only just begun.

The fundamental change in the landscape of AI is that new AI systems will be built on transformers; that is the platform shift.

In Wardley map terms, AI has moved from custom-built to product, arguably even to commodity. [New to Wardley maps? Click here for a quick refresher.]

Winning in a commodity market means very different things than in the custom-build world. You can’t win by playing closed, claiming you are building moats, or charging high prices.

Here scale matters, price matters, and reliability is critical. Brand is what differentiates. Such markets usually reorganise into monopolies or oligopolies, as has played out in telecom, cloud platforms, and internet marketplaces. Among the big contenders in this new market will be OpenAI.

As a SaaS founder you have to ask whether you will build towards the AI model or on top of it.

In OpenAI’s language: will you build LLMToolchains to improve LLMOps, or will you build LLMApps? [See the beginners guide to learn what these terms mean.]
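
To ground the LLMApp side of that choice, here is a minimal sketch of what building on top of the model can look like: a thin, domain-specific wrapper around the OpenAI API. The product name AcmeCRM and the prompt are invented for illustration, and the exact client interface may differ across SDK versions.

```python
# A minimal "LLMApp" sketch: a thin domain-specific layer on top of the model.
# Assumes the OpenAI Python SDK (pip install openai) and an API key in the
# environment; "AcmeCRM" and the prompt are invented for illustration.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

SYSTEM_PROMPT = (
    "You are a support assistant for AcmeCRM. "
    "Only answer questions about AcmeCRM billing and setup."
)

def answer(question: str) -> str:
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
        temperature=0.2,  # keep answers conservative for a support use case
    )
    return response["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(answer("How do I change my billing email?"))
```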

When platform shifts happen you need to operate from first principles. You must understand what a transformer is, what an LLM is, and why fine-tuning matters. Not knowing this will make you feel like an impostor for a while. First read this beginners guide to AI terms to get a basic familiarisation; otherwise you are going to misread metaphors and come away with a bizarre mental model. Don’t understand AI and LLMs the way this person understands the cloud.

Don’t get caught up in the philosophical ramble

Redditors are busy debating

“Is ChatGPT right or wrong?”

“Will GPT-5 reach AGI?”

These are not useful questions.

Instead, asking “What is ChatGPT useful for?” is a far more useful question.

Founders should focus on useful questions. They should not waste time on whether AGI is possible, or when.

Philosophers ask questions that span lifetimes, policymakers ask questions that span decades, and investors ask questions about the next 8–10 years. But founders are tinkerers of today. They should ask, “What can I do that is of value today?” If GPT-6 or GPT-10 does exhibit AGI, you can re-examine where the business value lies then; until that happens, it is not a useful thought experiment.

Wouldn’t AGI obsolete the need for SaaS tools altogether?

Despite this, an argument can be made that if AGI happens there will be no need to buy any SaaS tool at all. For instance, if every enterprise has its own super app, an “Enterprise GPT” based on ChatGPT-6 or 10 with AGI-level capabilities, then the enterprise could simply write its own code to develop the tools it needs.

Even then it is a classic build-vs-buy choice, just one now made with the help of an Enterprise GPT.

This question has been asked before: developers have long argued that any SaaS tool could be built over a single weekend. It’s important to understand that when someone pays for a SaaS tool, they are not just paying for the code. They are paying for the expertise and experience of a team that has considered thousands of different scenarios in order to build the most robust tool possible. That level of expertise and experience can’t be replicated over a single weekend.

Moreover, relying on a single super app for everything creates a single point of failure, something enterprises would not want to risk. Additionally, assigning responsibility to a third party helps address CYA (Cover Your A**) concerns: decision-makers want an external entity that can be held accountable for any issues that arise.

How will the enterprise buy all things related to LLMs?

Consumers pay for things that either maximize their status or minimize their regrets. As humans, we naturally want to avoid buyer’s remorse.

However, for business buyers, the goal is often to avoid looking stupid or incompetent in front of colleagues. In B2B transactions, the focus is on impressing an imaginary colleague rather than personal status. Business buyers seek to meet their utilitarian needs while also avoiding choices that could jeopardize their job security.

When a board member suggests reducing the number of developers in favor of using “Prompt Engineers” and a Copilot to build enterprise apps, it will inevitably lead to a resizing of development teams. The VP responsible for making the final decision will need to choose an option that avoids putting their job at risk.

It’s important to keep in mind that LLMs (Large Language Models) are different from other software in that their outputs are unpredictable. Enterprises dislike unpredictability, which is why they may prefer to work with a reliable vendor, the way Red Hat underwrites open-source software to meet enterprise needs and provides support when necessary. Enterprises will want predictability from their LLMs, which will lead to the emergence of a “Red Hat” for every type of LLM: a vendor that absorbs the unpredictability so decision-makers can make informed choices without putting their job security at risk.

In the coming years, general-purpose large language models (LLMs) will become even more remarkable. GPT-5, 6, and 10 will outstrip what startups are currently trying to achieve on top of GPT-4 and completely disrupt them. As a result, startups will find themselves in a Red Queen’s race, a competition that requires them to run faster just to stay in the same place. Building the generative part itself is not the right area for startups to focus their efforts on.

For example, a startup like Jasper cannot compete with ChatGPT when it comes to content writing. Instead, Jasper should focus on areas that won’t be solved by ChatGPT, such as what a marketing team does before and after writing the content.

As a SaaS founder, it’s crucial to choose a specific domain and fully comprehend the suite of tools needed by a key persona in that domain. With this knowledge, you can then build a comprehensive suite of tools that leverages an LLMApp as one main component.

Pay attention to the new way tools will get discovered

Prompt discovery will replace search discovery. Search engine optimization (SEO) experts will soon be replaced by Prompt Engineering Optimization (PEO) specialists. ChatGPT gained 100 million users in less than three months, and it’s not unrealistic to expect it to reach one billion users in three years. TikTok was the fastest platform to reach one billion users, achieving that milestone in five years, and it took a year to reach its first 100 million. If one billion people are going to ask a channel like ChatGPT which CRM is best, Salesforce or Pipedrive, then you need to do prompt optimization.

SEO hackers will transition into PEOs who invest in tools to influence the datasets that LLMs train on, as well as the reinforcement learning from human feedback (RLHF) systems that ChatGPT uses to improve its model. The SEO era began when search engines adopted robots.txt; the PEO era will begin when OpenAI introduces something like a train.txt, an interface for specifying which data its models may train on. That interface is not in place yet, but it can be expected to evolve, and investing time in how it shapes up will be crucial.

Build ChatGPT plugins

Plugins will be crucial for SaaS companies because they let you grow with the channel. As a SaaS founder, it’s important to consider leveraging ChatGPT plugins as part of your go-to-market strategy. When a new platform launches, it becomes a channel in its own right, and early adopters gain an advantage that latecomers can only overcome by paying a hefty price. Whether it’s the iPhone App Store or the Salesforce AppExchange, every SaaS founder should keep an eye out for new platforms and leverage them through plugins to grow their business. Getting into ChatGPT plugins is the best such choice right now.
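
As a sketch of what shipping a plugin involves, here is a minimal backend: a small API that also serves the manifest ChatGPT fetches from /.well-known/ai-plugin.json. The manifest field names follow OpenAI’s plugin documentation at the time of writing, so check the current spec before relying on them; AcmeCRM, the URLs, and the endpoint are hypothetical.

```python
# A sketch of a minimal ChatGPT plugin backend: a tiny API that also serves the
# manifest ChatGPT reads from /.well-known/ai-plugin.json. Manifest field names
# follow OpenAI's plugin docs at the time of writing; verify against the current
# spec. "AcmeCRM", the URLs and the endpoint are hypothetical. Requires FastAPI.
from fastapi import FastAPI

app = FastAPI(title="AcmeCRM Plugin", version="0.1.0")

MANIFEST = {
    "schema_version": "v1",
    "name_for_human": "AcmeCRM",
    "name_for_model": "acmecrm",
    "description_for_human": "Look up customers and deals in AcmeCRM.",
    "description_for_model": "Fetch customer and deal records from AcmeCRM.",
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "https://example.com/openapi.json"},
    "logo_url": "https://example.com/logo.png",
    "contact_email": "support@example.com",
    "legal_info_url": "https://example.com/legal",
}

@app.get("/.well-known/ai-plugin.json")
def manifest():
    return MANIFEST

@app.get("/customers/{customer_id}")
def get_customer(customer_id: str):
    # A real plugin would query your product's database or internal API here.
    return {"id": customer_id, "name": "Ada Lovelace", "plan": "Enterprise"}

# Run locally with: uvicorn plugin:app --port 3333
```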

New Categories will get created

Fresh category names will emerge, built on top of the word LLM. They will most likely be called LLMApps, since “LLMApp” has a distinctive sound that sets them apart from other types of apps. ChatGPT is nothing but an LLMApp built on top of the LLM called GPT. As LLMs become multi-modal and start taking in images, signals, audio, and more, these models may be called Foundation Learning Models (FLMs), at which point the category name may change to FLMApps.

Examples of these new categories include SupportLLM for support software, SalesLLM for sales software, MarketingLLM, HRLLM, and others that follow suit. There will also be LLM tooling: LLMOps on top of MLOps, and LLMAnalytics for monitoring the models.

It’s quite unlikely that these apps will be called ChatApps, because chat is an old word and does not capture the underlying technology. LLM has a nice ring to it, similar to CRM. Those who like to grab domains should consider taking common names prefixed or suffixed with LLM.

Keep in mind just like not every SaaS user needs a mobile app, not every enterprise will need an LLMApp. Don’t force it.

Whoever wins technical credibility will win

In the enterprise world, credibility is crucial. Both Google and Microsoft have established deep credibility in the tech industry. However, with the botched launches of Google Bard and Microsoft’s Bing Sydney, they have lost some face. Enterprise buyers need assurance that they won’t get fired for using LLMs and LLMApps from these or any other companies. They are also concerned about the protection of their proprietary data.

LLMs have a reputation for being like overconfident kids who answer questions confidently even when they don’t know the facts. In the research world, this is known as hallucination. If an enterprise replaces its human support team with a support system powered by ChatGPT and it confidently gives the wrong answer on a billing dispute, the enterprise will permanently lose that customer’s trust. This is a risk that enterprises cannot afford to take.

As a result, SaaS founders must solve the technical challenges around data protection and hallucination.
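
One common mitigation for hallucination, sketched below under stated assumptions, is to answer only from retrieved documentation and hand the conversation to a human when nothing relevant is found. The documents, similarity threshold, and model names are illustrative, and the OpenAI SDK calls reflect the interface at the time of writing.

```python
# Sketch: answer support questions only from retrieved documentation, and hand
# off to a human when nothing relevant is found. Docs, threshold and model
# names are illustrative, not a definitive implementation.
import os
import numpy as np
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

DOCS = [
    "Refunds are issued within 5 business days of an approved billing dispute.",
    "You can export contacts as CSV from Settings > Data > Export.",
]

def embed(text: str) -> np.ndarray:
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=text)
    return np.array(resp["data"][0]["embedding"])

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def support_answer(question: str, threshold: float = 0.8) -> str:
    q = embed(question)
    sims = [cosine(q, embed(d)) for d in DOCS]
    best = int(np.argmax(sims))
    if sims[best] < threshold:
        return "ESCALATE_TO_HUMAN"  # never let the model guess on a billing dispute
    resp = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        temperature=0,
        messages=[
            {"role": "system", "content": "Answer ONLY from this context. "
             "If the context does not cover the question, say you don't know.\n\n" + DOCS[best]},
            {"role": "user", "content": question},
        ],
    )
    return resp["choices"][0]["message"]["content"]

print(support_answer("How long do refunds take?"))
```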

They must prepare for the fact that, for the first time ever, technology risk is greater than market risk. Can the technology deliver on its promises?

Summary: What should you do now?

To gain an edge, consider purchasing a ChatGPT Plus subscription. Apply for access to both the OpenAI API and ChatGPT plugins to take advantage of their capabilities.

Investing in understanding transformers and the breakthroughs of the last 3–4 years is crucial. Start by reading the paper “Attention Is All You Need,” and if needed, ask a machine-learning friend to explain it and why it matters.

Engage in open-ended conversations with your customers to understand their perception of the shift towards ChatGPT. Inquire about their experience with ChatGPT, whether they have used it or not, and why. Also, ask them about their concerns or fears related to using ChatGPT.

Track tech perception in your industry, i.e. what has to be true, technology-wise, for AI adoption in your industry to proceed unhindered.

If you have a PM on your team, buy them a ChatGPT Plus subscription as well.

Assign a small group of your developers to build on the OpenAI and Hugging Face APIs. If your development team is new to deep learning, look for the enthusiasts among them and encourage them to learn deep learning from fast.ai.
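
A gentle starter exercise for that group, assuming the Hugging Face transformers library and PyTorch are installed: generate text from a small open-source model locally. The model choice below is just an example.

```python
# Starter exercise: run a small open-source model locally with Hugging Face
# transformers (pip install transformers torch). The model is only an example.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator(
    "SaaS founders should pay attention to large language models because",
    max_new_tokens=40,
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```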

Follow folks on Twitter who are building startups on top of OpenAI.

In times of unprecedented technological change no one can predict the future; tinkering is the only way to move forward. While tinkering, make sure you do it in an affordable way.

Some lucky choices can put you ahead of the curve. Aligning yourself with the market choices that are already winning lets you leverage their luck for yourself.

No matter what, every innovation is ultimately judged by customers, so talk to them sooner rather than later. In this domain tech risk is as important as market risk. Don’t buy a halfway ticket; find ways to invest enough to resolve the technology risk beyond what OpenAI can help with.

“The only way to make sense out of change is to plunge into it, move with it, and join the dance.” — Alan Watts

Join the dance. Early.
