Any of you have a self-hosted AI "hub"? (e.g. for LLM, stable-diffusion, ...)
I've been looking into self-hosting LLMs or Stable Diffusion models using something like LocalAI and/or Ollama with LibreChat.
Some questions to get a nice discussion going:
- Any of you have experience with this?
- What are your motivations?
- What are you using in terms of hardware?
- Any considerations regarding energy efficiency and the associated costs?
- What about renting a GPU? Privacy implications?
22 comments
fidodo @lemmy.world I tried out ollama. It was trivially easy to set up.
Stable diffusion is a bit more work, but any power user should be able to figure it out.
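For anyone wondering what "trivially easy" looks like in practice: once Ollama is installed and a model is pulled, it serves a local HTTP API (on port 11434 by default), so you can talk to it with nothing but the Python standard library. A minimal sketch, assuming a model named `llama3` has already been pulled with `ollama pull llama3`:

```python
import json
import urllib.request
import urllib.error

# Ollama's default local generate endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask(prompt, model="llama3"):
    """Send a single non-streaming prompt to a local Ollama server.

    Returns the model's reply as a string, or None if no server
    is reachable on localhost.
    """
    payload = json.dumps({
        "model": model,    # assumes this model was pulled beforehand
        "prompt": prompt,
        "stream": False,   # one JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=60) as resp:
            return json.loads(resp.read())["response"]
    except (urllib.error.URLError, OSError):
        return None  # Ollama isn't running locally

if __name__ == "__main__":
    answer = ask("Why self-host an LLM?")
    print(answer if answer is not None
          else "No Ollama server reachable on localhost:11434")
```

Nothing here depends on the cloud, which is a big part of the privacy appeal discussed in the original post.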