Priveedly: A Django-based content reader and recommender for personal and private use
Updated Jan 24, 2025 - Python
The MIT-licensed core of BrainDrive: an extensible, self-hosted AI platform with React UI, FastAPI backend, and a modular plugin architecture.
🔒 100% Private RAG Stack with EmbeddingGemma, SQLite-vec & Ollama - Zero Cost, Offline Capable
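A fully private RAG stack like the one above boils down to local embeddings plus nearest-neighbour search over a local store. The sketch below illustrates that shape using only the standard library: a toy hash-based embedder stands in for a real model such as EmbeddingGemma served by Ollama, and a brute-force cosine scan over SQLite rows stands in for the indexed search that sqlite-vec would provide.

```python
import math
import sqlite3

def toy_embed(text: str, dim: int = 64) -> list[float]:
    # Stand-in for a real embedding model (e.g. EmbeddingGemma via
    # Ollama): hashes character trigrams into a fixed-size vector.
    vec = [0.0] * dim
    for i in range(len(text) - 2):
        vec[hash(text[i:i + 3]) % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

# In-memory store; sqlite-vec would replace the brute-force scan in
# search() with an indexed nearest-neighbour query.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT)")
docs = [
    "ollama runs language models locally",
    "sqlite stores rows on disk",
    "gemma is an open embedding model",
]
db.executemany("INSERT INTO docs (body) VALUES (?)", [(d,) for d in docs])

def search(query: str, k: int = 2) -> list[str]:
    # Embed the query, score every stored document, return the top k.
    q = toy_embed(query)
    rows = db.execute("SELECT body FROM docs").fetchall()
    ranked = sorted(rows, key=lambda r: cosine(q, toy_embed(r[0])), reverse=True)
    return [body for (body,) in ranked[:k]]
```

The retrieved passages would then be stuffed into a prompt for a local LLM, keeping the whole pipeline offline.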
The Private AI Setup Dream Guide for Demos automates installing the software needed for a local, private AI setup, using AI models (LLMs and diffusion models) for use cases such as general assistance, business ideas, coding, image generation, systems administration, marketing, planning, and more.
SnapDoc AI processes everything on-device, ensuring your sensitive information never leaves your control. Offers on-device voice and text processing for organizations.
Deploy a complete, self-hosted AI stack for private LLMs, agentic workflows, and content generation. One-command Docker Compose deployment on any cloud.
An advanced, fully local, and GPU-accelerated RAG pipeline. Features a sophisticated LLM-based preprocessing engine, state-of-the-art Parent Document Retriever with RAG Fusion, and a modular, Hydra-configurable architecture. Built with LangChain, Ollama, and ChromaDB for 100% private, high-performance document Q&A.
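RAG Fusion, mentioned above, retrieves with several query variants and merges the resulting rankings, typically via Reciprocal Rank Fusion. A minimal, self-contained sketch of RRF (the constant k=60 is the commonly used default, not taken from this project):

```python
from collections import defaultdict

def rrf(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Reciprocal Rank Fusion: a document earns 1 / (k + rank) from
    every ranking it appears in; higher total score ranks first."""
    scores: dict[str, float] = defaultdict(float)
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] += 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Two query variants returned overlapping result lists; "b" appears
# near the top of both, so it wins the fused ranking.
fused = rrf([["a", "b", "c"], ["b", "c", "d"]])
```

Documents seen by multiple query variants accumulate score from each list, which is why fusion favours results that are consistently retrieved over ones that top a single ranking.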
Local LLM integration for Odoo 18 - chat with AI directly in Odoo using Ollama, LM Studio, or any OpenAI-compatible API.
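Targeting "any OpenAI-compatible API", as the integration above does, means a client only has to build the standard chat-completions request; Ollama and LM Studio both expose such an endpoint. A hedged sketch (the base URLs, port numbers, and model name are illustrative defaults, not taken from this project):

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request, e.g. against
    Ollama (http://localhost:11434/v1) or LM Studio (http://localhost:1234/v1)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:11434/v1", "llama3", "Hello")
# urllib.request.urlopen(req) would send it to the locally running server.
```

Because the request shape is the same everywhere, swapping backends is just a matter of changing `base_url` and `model`.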
Ongoing research into implementing homomorphic encryption and federated learning for electric utility infrastructure defect detection, using an object-detection model within a Private AI framework.
IRISStar is an Android app for interfacing with GGUF / llama.cpp models locally.
This project presents a streamlined interface for interacting with the Ollama API using Spring Boot and WebFlux.
Distributed Deep Learning
Local private AI assistant powered by FastAPI, Streamlit, FAISS, and TinyLlama with document search and chat capabilities.
Offline AI journaling app that gives insights based on your entries and runs locally with no cloud or data sharing.
Self-hosted ChatGPT-style platform with local models and maximum privacy. LLaMA, Mistral, everything on your own server.
🤖 Automate local private AI setups for demos, showcasing models for diverse tasks like coding, image generation, and business planning.
OpenMined 30DaysOfFLCode Challenge
The default settings plugin for BrainDrive.
Lightweight web UI for llama.cpp with dynamic model switching, chat history & markdown support. No GPU required. Perfect for local AI development.