ChattyUI - Run open-source LLMs locally in the browser using WebGPU

in #steemhunt · 22 days ago

ChattyUI

Run open-source LLMs locally in the browser using WebGPU


Screenshots

Screenshot (6).png


Hunter's comment

An open-source, feature-rich Gemini/ChatGPT-like interface for running open-source models (Gemma, Mistral, Llama 3, etc.) locally in the browser using WebGPU. There is no server-side processing: your data never leaves your PC!
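A rough sketch of how a page like this can decide whether local inference is possible at all, assuming ChattyUI relies on the standard WebGPU API (`navigator.gpu`, available in browsers such as Chrome 113+). The helper below takes a navigator-like object as a parameter so it can be exercised outside a browser; the function name is illustrative, not ChattyUI's actual code.

```javascript
// Hypothetical feature check: can this environment run an in-browser LLM?
// `nav` is a navigator-like object (pass the real `navigator` in a browser).
async function canRunLocalLLM(nav) {
  // WebGPU is exposed as navigator.gpu; absent on unsupported browsers.
  if (!nav || !nav.gpu) return false;
  // requestAdapter() resolves to null when no suitable GPU is available.
  const adapter = await nav.gpu.requestAdapter();
  return adapter !== null;
}
```

In a real page you would call `canRunLocalLLM(navigator)` before downloading model weights, and fall back to a "WebGPU not supported" message otherwise.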


Link

https://chattyui.com/



Steemhunt.com

This is posted on Steemhunt - A place where you can dig products and earn STEEM.
View on Steemhunt.com


Run open-source LLMs locally in the browser using WebGPU; thanks for sharing.

A good tool for running open-source models.

Congratulations!

We have upvoted your post for your contribution within our community.
Thanks again, and we look forward to seeing your next hunt!

