Don't build AI, build with AI

Cover image: a robotic intern points towards a glowing AI God surrounded by cogs.

Drew Breunig divides AI use cases into Gods, Interns, and Cogs:

  1. Gods: Super-intelligent, artificial entities that do things autonomously.
  2. Interns: Supervised copilots that collaborate with experts, focusing on grunt work.
  3. Cogs: Functions optimized to perform a single task extremely well, usually as part of a pipeline or interface.

Each category comes with different costs.

This categorisation is useful when considering startup and product ideas, and your use of AI:

  1. Don’t build AI Gods, unless you have access to ungodly amounts of capital. This may be obvious to most people, but I still see pitches for model-centric startups, and God-building model companies are the focus of much AI hype.
  2. Build with AI Interns, as they can significantly increase your productivity. Interns include writing assistants, meeting transcribers, and programming copilots. Ignoring the wealth of AI interns is foolish for individuals and companies alike.
  3. Don’t build complex AI Interns, unless you have a use case that justifies the costs and risks. For example, Bloomberg’s multi-million-dollar model was outperformed by GPT-4 within months of its release, and it’s unclear whether it ended up powering any features.
  4. Build with and build AI Cogs, but ensure you can manage non-deterministic components in production (see the sketch after this list). This includes anything from traditional machine learning to the plethora of AI tasks that have become commodified in recent years, like object recognition and text summarisation.
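
To make the Cog point concrete, here is a minimal sketch of one way to manage a non-deterministic component in a pipeline: wrap the model call with cheap deterministic validation, bounded retries, and a deterministic fallback. All names here are hypothetical, and `summarise()` is a stand-in for a real model or API call rather than any specific library.

```python
import random  # stands in for the non-determinism of a real model call


def summarise(text: str) -> str:
    """Hypothetical stand-in for a summarisation model or API call."""
    if random.random() < 0.3:  # simulate an occasional bad output
        return ""
    return text[:100]  # placeholder "summary"


def is_valid_summary(summary: str, source: str) -> bool:
    """Cheap deterministic checks on the cog's output."""
    return bool(summary) and len(summary) <= len(source)


def summarise_with_guardrails(text: str, max_retries: int = 2) -> str:
    """Retry the non-deterministic cog a bounded number of times, then fall back."""
    for _ in range(max_retries + 1):
        candidate = summarise(text)
        if is_valid_summary(candidate, text):
            return candidate
    # Deterministic fallback keeps the pipeline running when the cog misbehaves.
    return text.split(".")[0] + "."


if __name__ == "__main__":
    article = "AI cogs are commodified tasks. They still fail sometimes. Plan for it."
    print(summarise_with_guardrails(article))
```

The same validate-retry-fall-back pattern applies whether the cog is a local model or a third-party API: the surrounding pipeline, not the cog itself, is what provides the predictability.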

Above all, start by defining the problem and assessing the impact rather than making AI use your goal. Focusing on problems and solution impact is robust to hype cycles. I’ve considered this focus to be the hardest problem in data science since at least 2015. Amazon data scientist Dzidas Martinaitis has recently captured a similar sentiment in his flowchart for data science projects. Similarly, Douglas Gray and Evan Shellshear have found that data science and AI projects typically fail due to issues with strategy and process rather than shortfalls in tech and people.

Ignore at your own risk.


Acknowledgement: This post was produced with the help of AI interns. One version of Gemini produced the cover image, while another made helpful suggestions on earlier drafts.

