Made in RAG (Local LLM Q&A System) with Docker + ollama + R2R
kojeomstudio
Build a local, RAG-based LLM Q&A system in just 2 hours of hands-on practice using Docker, ollama, and the R2R framework. (Suitable for in-house/internal Q&A systems, personal portfolios, AI-powered work skills, and even commercial services.)
Beginner
Docker, AI, WSL
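
As a quick taste of the stack, here is a minimal sketch of the query path once the local model is up: ollama serves an HTTP API on port 11434 by default (for example when started from its official Docker image), and a few lines of Python are enough to send a prompt and read back an answer. The model name "llama3" and the sample question are illustrative assumptions, and R2R's retrieval layer is deliberately omitted here; the course covers that part.

```python
# Minimal sketch: query a locally running ollama server over its HTTP API.
# Assumes ollama is already serving on its default port 11434, e.g. via:
#   docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
# The model name "llama3" is an illustrative assumption; use any model you pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def ask(prompt: str, model: str = "llama3") -> str:
    """Send one non-streaming generation request and return the model's answer."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON response instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("In one sentence, what is retrieval-augmented generation?"))
```

In the full system, R2R sits in front of this call: it retrieves relevant document chunks first and passes them to the model as context, which is what turns a plain local LLM into a Q&A system over your own documents.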