AI & UX

How UX holds the key to the next phase of LLM evolution and adoption

Mrigesh

Sep 2, 2024

In 2024, we've noticed that top AI language models are becoming more similar in quality. Having more computing power doesn't always mean a better product. The differences between the best models are getting smaller, and most people won't notice the small improvements anymore.

Much like how raw processing power doesn't guarantee a superior computing experience, we're finding that in the realm of LLMs, a computational edge doesn't inevitably yield a better product.

Ease of use trumps raw power

The 1984 Macintosh was a capable machine, but it was its groundbreaking user experience that transformed how we interact with computers.
It replaced complex commands with a mouse, clickable buttons, and pull-down menus. Users could now move windows, use icons, and easily undo or copy text. These simple changes made computers accessible to everyone, sparking a tech revolution.

We have seen the same trend over and over in Apple vs. Windows machines, or iPhone vs. Android phones. Raw computing power matters only up to a point. After reaching a certain level, improvements in user experience become the key differentiator.

Does it actually matter whether a model has 200 billion or 2 trillion parameters? For most people, what really matters is how well an AI works and what it can do for them. Users care about the clear benefits and actual results they get from using AI tools.

With LLMs as well, it will be the quality of the user experience that unlocks the true potential of their raw computing capability. After all, LLMs are the new CPU and the context window is the new RAM (see Karpathy's original tweet).

Benchmark chart of leading LLMs. Source: https://artificialanalysis.ai/


ChatGPT was also a UX breakthrough

I would argue that ChatGPT caught people's attention because of its UX changes. The model was bigger and better, but a similar one, Davinci, had been available through the API for almost a year before ChatGPT launched. Few people used the Davinci API, but ChatGPT shipped with two major UX improvements:

1. Streaming output: Shows text in real-time, like natural reading or listening.
2. Automatic context management: Keeps conversation history without user input.

These features made AI interaction more natural and easy to use.
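
To make those two improvements concrete, here is a minimal sketch of a chat loop that streams tokens as they arrive and carries the conversation history forward automatically. It assumes the OpenAI Python SDK and uses an illustrative model name; it is not ChatGPT's actual implementation, just what these two UX features look like when built on top of a raw API.

```python
# Minimal sketch: streaming output + conversation-history management
# built on a chat-completions API (assumes the OpenAI Python SDK).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
history = [{"role": "system", "content": "You are a helpful assistant."}]

while True:
    user_input = input("You: ")
    if user_input.strip().lower() in {"exit", "quit"}:
        break

    # Context management: keep appending turns so the model sees the
    # whole conversation, not just the latest message.
    history.append({"role": "user", "content": user_input})

    # Streaming output: print tokens as they arrive instead of waiting
    # for the full response, which feels like natural reading.
    stream = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=history,
        stream=True,
    )
    reply_parts = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content or ""
        print(delta, end="", flush=True)
        reply_parts.append(delta)
    print()

    history.append({"role": "assistant", "content": "".join(reply_parts)})
```

With the raw Davinci completion endpoint, both of these conveniences were the developer's problem; ChatGPT simply made them the default.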


Chat UX is limiting

Having said that, chat UX is also limiting. The current chat-based interaction style suffers from an articulation barrier: it requires users to write out their problems as prose text.
Prompt engineers exist for a reason: they know how to craft inputs to get optimal outputs from ChatGPT. Even in wealthy countries such as the USA and Germany, roughly half the population is not articulate enough in writing to get good results from current AI chatbots.

We need smarter AI interfaces for everyone. Picture AI built into everyday tools, predicting what you need without requiring perfect prompts.

Today's best AI tools are UI/UX first

If you've experimented with Claude's Artifacts, you know how much of an impact good UX can have on workflows. Artifacts in Claude are interactive, editable outputs that include code snippets, documents, websites, images, and more.

Devin and Cursor took off because they were UI/UX first. Devin paired a chat UI with a Shell, Browser, Editor, and Planner, distilling the tools humans need to build and verify software. The AI was still a core component, but the real eye-opener was the infrastructure built around it.

Cursor also works better because it connects terminal context and chat context, which makes the output super useful.
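
As a rough illustration of why that context plumbing matters, here is a hypothetical sketch of an editor assistant bundling the open file and the latest terminal output into the prompt alongside the user's chat message. The names and structure are invented for illustration and do not reflect Cursor's real implementation.

```python
# Hypothetical sketch: assembling editor + terminal context into one prompt.
# Illustrates why feeding surrounding context to the model improves answers.
from dataclasses import dataclass

@dataclass
class EditorContext:
    open_file_path: str
    open_file_contents: str
    last_terminal_output: str

def build_prompt(ctx: EditorContext, user_message: str) -> str:
    """Combine the chat message with file and terminal context into one prompt."""
    return (
        "You are a coding assistant inside an editor.\n\n"
        f"Open file ({ctx.open_file_path}):\n{ctx.open_file_contents}\n\n"
        f"Most recent terminal output:\n{ctx.last_terminal_output}\n\n"
        f"User request:\n{user_message}\n"
    )

# The assistant sees the failing command's output without the user
# having to copy-paste it into the chat box.
ctx = EditorContext(
    open_file_path="app/math_utils.py",
    open_file_contents="def divide(a, b):\n    return a / b\n",
    last_terminal_output="ZeroDivisionError: division by zero",
)
print(build_prompt(ctx, "Why does my script crash, and how do I fix it?"))
```

The user never has to articulate the full problem in prose; the interface gathers most of the relevant context for them.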

AI is a third paradigm in human-computer interaction

The first UI paradigm was batch processing, where users defined a complete workflow for the computer to execute. The second paradigm, represented by DOS and graphical user interfaces, introduced command-based interaction, allowing users and computers to take turns, one command at a time.
AI has introduced the third user-interface paradigm in computing history. It creates a new way to use computers: instead of telling computers how to do tasks, users now simply state what they want. This flips who is in charge of the "how" when using technology.

At the AI application layer, the next billion-dollar unlock lies in shifting the question from "How powerful is the AI?" to "How can users best use this raw power?"

The focus now turns to innovating human-AI interfaces that:

1. Seamlessly integrate into or transform our workflows
2. Significantly augment our problem-solving and comprehension capabilities
3. Understand user intent and deliver what truly matters to users


Conclusion

The next phase of LLM evolution will likely be defined not by performance improvements in models, but by advancements in human-AI interfaces that properly use these powerful models to augment human capabilities.

With seemingly countless new AI products coming out every day, we at Huku believe that the ones that will survive will not only be backed by world-class models, but also provide best-in-class product experiences that make the magic feel intuitive.

Sign up for the waitlist now!

We're currently developing Huku and are excited to share it with you soon.


Join the waitlist to get early access
