ChattyUI - Run open-source LLMs locally in the browser using WebGPU
Hunter's comment
Open-source, feature-rich Gemini/ChatGPT-like interface for running open-source models (Gemma, Mistral, Llama 3, etc.) locally in the browser using WebGPU. No server-side processing: your data never leaves your PC!
This is posted on Steemhunt - A place where you can dig products and earn STEEM.
Thanks for sharing — a good tool for running open-source models.
Congratulations!
We have upvoted your post for your contribution within our community.
Thanks again and look forward to seeing your next hunt!