Show HN: LLM-UI – A library for building LLM UIs
3 points by richardgill88 | 0 comments on Hacker News.
I built llm-ui, a UI library to help build a UI similar to ChatGPT. It operates on language model output strings, so it works with any language model.

Quick features:
- Hides partial / broken markdown syntax
- Add your own custom components to LLM output
- Throttling smooths out pauses in the LLM's streamed output (rough sketch below)
- Renders output at native frame rate
- Code blocks for every language with Shiki
- Headless: bring your own styles

The first version is React-only for now, but I'm planning to support most frontend frameworks.

Link: https://ift.tt/F4v0T8k
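To make the throttling and frame-by-frame rendering idea concrete, here is a minimal sketch in plain React and TypeScript. It is not llm-ui's actual API; the component and prop names (StreamingMessage, output, charsPerFrame) are made up for illustration, and it only shows the general technique of revealing a streamed output string a few characters per animation frame.

```tsx
// Minimal sketch (not the llm-ui API): render a streamed LLM output string
// with basic throttling, so text advances at a steady pace even when the
// model's chunks arrive in bursts. All names here are illustrative.
import { useEffect, useRef, useState } from "react";

type Props = {
  // The full output string received so far (grows as the stream progresses).
  output: string;
  // How many characters to reveal per animation frame.
  charsPerFrame?: number;
};

export function StreamingMessage({ output, charsPerFrame = 2 }: Props) {
  const [visibleCount, setVisibleCount] = useState(0);
  const outputRef = useRef(output);
  outputRef.current = output;

  useEffect(() => {
    let frame: number;
    const tick = () => {
      // Reveal a few more characters each frame instead of dumping whole
      // chunks at once; this smooths out pauses in the upstream stream.
      setVisibleCount((n) =>
        Math.min(n + charsPerFrame, outputRef.current.length)
      );
      frame = requestAnimationFrame(tick);
    };
    frame = requestAnimationFrame(tick);
    return () => cancelAnimationFrame(frame);
  }, [charsPerFrame]);

  // A real implementation would also hide partial markdown syntax here
  // (e.g. an unclosed ``` fence) before rendering; plain text is shown
  // for brevity.
  return <p>{output.slice(0, visibleCount)}</p>;
}
```

Driving the reveal from requestAnimationFrame is what "renders output at native frame rate" refers to: the visible text advances once per frame regardless of how irregularly the network chunks arrive.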