Building Privacy-Friendly Applications with Ollama, Vector Functions, and LangChainJS
Today, most AI applications send data to cloud LLM providers, raising privacy concerns. This talk introduces a way to build AI applications that keep everything local on your computer. By running LLMs locally with Ollama, we avoid transmitting sensitive information to external cloud providers. We will also highlight LangChain's ability to create versatile agents that handle tasks autonomously while protecting sensitive data. In this talk, we'll cover:
- Overview of cloud-based AI privacy issues and the importance of local AI.
- Detailed insights into generating embeddings with Ollama for vector search, and how LangChain agents can perform tasks such as document summarisation and API interactions while keeping data private (sketched in the code examples after this list).
- A practical demonstration of these tools and a discussion of real-world use cases.
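
To make the embeddings bullet concrete, here is a minimal sketch of local vector search with LangChainJS and Ollama. It is written against the `@langchain/ollama` and `langchain` packages; the package paths, the `nomic-embed-text` model name, and the sample documents are assumptions, so adjust them to whatever LangChain version and Ollama models you have installed.

```typescript
import { OllamaEmbeddings } from "@langchain/ollama";
import { MemoryVectorStore } from "langchain/vectorstores/memory";

// Embeddings are produced by a model served by Ollama on localhost,
// so the documents never leave this machine.
// "nomic-embed-text" is an assumption: use any embedding model you have pulled.
const embeddings = new OllamaEmbeddings({ model: "nomic-embed-text" });

// Build a small in-memory index from sample texts (hypothetical content).
const store = await MemoryVectorStore.fromTexts(
  [
    "Ollama runs open-weight language models locally.",
    "LangChainJS provides agents, chains, and vector store integrations.",
  ],
  [{ id: 1 }, { id: 2 }],
  embeddings
);

// Similarity search runs entirely against the local in-memory index.
const results = await store.similaritySearch(
  "How do I run an LLM on my own machine?",
  1
);
console.log(results[0].pageContent);
```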
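
And a similarly hedged sketch of the summarisation idea: a prompt piped into a local `ChatOllama` model using LangChain's expression-language `pipe()` composition. The `llama3.1` model name and the sample document text are assumptions; any chat model pulled into Ollama works.

```typescript
import { ChatOllama } from "@langchain/ollama";
import { PromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";

// The chat model is served by Ollama on localhost, so the document text
// stays on this machine. "llama3.1" is an assumption: use any pulled model.
const model = new ChatOllama({ model: "llama3.1", temperature: 0 });

const prompt = PromptTemplate.fromTemplate(
  "Summarise the following document in three bullet points:\n\n{document}"
);

// Prompt -> local model -> plain string.
const chain = prompt.pipe(model).pipe(new StringOutputParser());

const summary = await chain.invoke({
  document: "Ollama lets you pull open-weight models and run them locally ...",
});
console.log(summary);
```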
Pratim Bhosale
Pratim Bhosale is a Full Stack Developer and Developer Advocate at SurrealDB, and the maintainer of the SurrealDB Go SDK. She has worked with multiple developer-tool companies, helping them improve their developer experience. She enjoys writing technical articles and building projects with SurrealDB, actively supports Women in Tech, and hosts meetups and workshops for the community. In her leisure time, Pratim enjoys baking and brewing kombucha. She has spoken at conferences such as GopherCon, GoLab, DevBcn, and JSConf.