Text is the interface, not the use case

Probably 80% of (mainstream, non-ML) Twitter is excited about the potential of GPT and ChatGPT. The other 20% is skeptical. Few are indifferent.

But both the optimists and pessimists seem to focus on the obvious set of use cases: text generation and chatbots. That is a failure of imagination. Text is just an interface to these models, and chat is just a UI pattern. That is not to say ChatGPT isn't better than GPT -- it is. But ChatGPT can also be dropped in as a generic text completion model; it is not married to chat as its end-user interface.

It doesn't take much creativity to try out more interesting applications. Much as JSON is the input and output of the Stripe API, text is just the input-output interface to these models. Use cases are not fundamentally limited to "writing aids", despite what the first wave of prototypes might suggest.

For example, one of my GPT micro-projects has been a command-line assistant. I've aliased g to a script that lets me ask any question and get back a shell one-liner that accomplishes it:

$ g "remove the gem 'rubygems/bundler'"

And the output:

gem uninstall rubygems/bundler

The above is a real example of usage from yesterday: I was trying to set up an old Jekyll-based project, and since I have minimal understanding of Ruby development environments, I needed to look up how to reinstall a dependency. Instead of googling and finding the answer in 30 seconds, I got it about ten times faster. I've also used it to figure out ffmpeg arguments, git commands, and much more.
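The script behind the g alias can be sketched in a few lines. This is a minimal hypothetical version, not the original: it assumes the OpenAI Python client, and the model name, prompt wording, and parameters are illustrative choices. The core idea is just prompt framing -- wrap the question so the model answers with a command and nothing else.

```python
# Hypothetical sketch of the script behind the `g` alias.
import os
import sys

def build_prompt(question: str) -> str:
    # Frame the question so the model replies with a single shell
    # one-liner rather than an explanation.
    return (
        "Answer with a single shell one-liner and nothing else.\n"
        f"Task: {question}\n"
        "Command:"
    )

def ask(question: str) -> str:
    # Assumes the OpenAI completion API; model and parameters are
    # illustrative, not taken from the original setup.
    import openai
    openai.api_key = os.environ["OPENAI_API_KEY"]
    resp = openai.Completion.create(
        model="text-davinci-003",
        prompt=build_prompt(question),
        max_tokens=64,
        temperature=0,
    )
    return resp["choices"][0]["text"].strip()

if __name__ == "__main__":
    print(ask(" ".join(sys.argv[1:])))
```

Aliasing g to this script (e.g. alias g='python gpt_cli.py') gives the one-question, one-command workflow shown above.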

This is hardly just a writing aid. Thinking of GPT as a general intelligence engine (not to be confused with AGI, a far fuzzier term) yields much more interesting experiments. And it eventually allows you, as an app builder, to make much more valuable things.