MUP (Model UI Protocol) lets you embed interactive UI directly in an LLM chat. Each MUP is a single .html file. The same functions can be triggered by the user (clicking a button) or by the LLM (via a function call), and both sides see each other's actions in real time.
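A minimal sketch of that core idea, assuming a single function registry shared by both sides. All names here (`register`, `invoke`, the action log) are illustrative, not the actual MUP API:

```javascript
// Hypothetical sketch: one function registry, callable from either the
// UI (button click) or the LLM (function call), with every action
// appended to a log that both sides can observe.
const actions = [];           // shared action log visible to both sides
const functions = new Map();  // single registry of callable functions

function register(name, fn) {
  functions.set(name, fn);
}

// The same dispatch path handles a user click and an LLM function call;
// `source` records which side acted so the other side can see it.
function invoke(source, name, args) {
  const result = functions.get(name)(args);
  actions.push({ source, name, args });
  return result;
}

register("setPixel", ({ x, y, color }) => `pixel (${x},${y}) = ${color}`);

// A button click in the .html file would call:
invoke("user", "setPixel", { x: 1, y: 2, color: "red" });
// The LLM issues the equivalent function call:
invoke("llm", "setPixel", { x: 3, y: 4, color: "blue" });

console.log(actions.length); // 2 actions, one from each side
```

The point of the sketch is that neither side gets a privileged code path: user and model go through the same `invoke`, which is what makes the shared-state collaboration possible.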
The repo includes a PoC host and 9 example MUPs. Demo mode lets you interact with the UI side without an API key. Add an OpenAI key to see full LLM-UI collaboration.
Demo videos in the README show things like: drawing pixel art and having the LLM chart its colors, a camera MUP that captures a scene for the LLM to recreate, and making beats on a drum machine together with the LLM.
I'd love feedback on the protocol design.