Projects
A collection of LLM-focused projects spanning context management, dynamic interfaces, information retrieval, and other AI-powered solutions.

Pralok
An AI assistant that creates custom interfaces on the fly with bi-directional state transfer: the model treats the user's actions in the interface as context, and it can in turn act on or manipulate the interface. It features dynamic code-generation workflows, advanced file editing, and context management designed for a frictionless chat experience, enabling dynamic learning experiences and personalized software.
LLM
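Below is a minimal sketch of the bi-directional state transfer idea, assuming a hypothetical chat loop: the current interface state is serialized into the model's context, and the model can return actions that mutate the interface alongside its reply. The UIState class, action format, and llm_call helper are illustrative assumptions, not Pralok's actual API.

```python
import json
from dataclasses import dataclass, field

# Hypothetical sketch of bi-directional state transfer: UI state flows
# into the prompt, and model-proposed actions flow back and mutate the UI.

@dataclass
class UIState:
    components: dict = field(default_factory=dict)  # e.g. {"quiz": {"question": 3}}

def build_prompt(user_message: str, state: UIState) -> str:
    # The interface state is serialized as context so the model can
    # reason about what the user is currently seeing and doing.
    return (
        "Current interface state:\n"
        f"{json.dumps(state.components, indent=2)}\n\n"
        f"User: {user_message}\n"
        'Reply with JSON: {"reply": "...", "actions": [...]}'
    )

def apply_actions(state: UIState, actions: list) -> None:
    # The model manipulates the interface by emitting actions.
    for action in actions:
        if action["type"] == "update_component":
            state.components[action["id"]] = action["props"]
        elif action["type"] == "remove_component":
            state.components.pop(action["id"], None)

def chat_turn(user_message: str, state: UIState, llm_call) -> str:
    # llm_call is a placeholder for whatever model client is in use.
    response = json.loads(llm_call(build_prompt(user_message, state)))
    apply_actions(state, response.get("actions", []))
    return response["reply"]
```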
UserSubContext
A novel context management technique for Large Language Models (LLMs). Instead of carrying the full conversation forward, it uses the current user message to dynamically select only the relevant subset of the context window for the next turn, reducing token consumption and cost while maintaining accuracy. This outperforms naive rolling-window methods and provides significant efficiency improvements for LLM applications.
LLM
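A minimal sketch of the selection idea behind UserSubContext, assuming embedding-based relevance scoring against the current user message; the embed helper, top_k, and min_score values are illustrative assumptions rather than the technique's actual parameters.

```python
import numpy as np

# Illustrative sketch: keep only the prior turns most relevant to the
# current user message instead of sending a rolling window of history.

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def select_subcontext(history, current_message, embed, top_k=4, min_score=0.3):
    """history: list of {"role": ..., "content": ...} turns; embed: any
    sentence-embedding function returning a vector."""
    query_vec = embed(current_message)
    scored = [
        (cosine(embed(turn["content"]), query_vec), i, turn)
        for i, turn in enumerate(history)
    ]
    # Keep the highest-scoring turns, then restore chronological order
    # so the model still sees a coherent conversation.
    kept = sorted(scored, reverse=True)[:top_k]
    kept = [(i, turn) for score, i, turn in kept if score >= min_score]
    return [turn for _, turn in sorted(kept)]
```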
Gemini News Search Engine
Built a news search engine that deduplicates articles drawn from diverse sources. Modeled and mined the data with custom queries in the graph database TigerGraph. Applied NLP models for semantic search, keyword generation, and sentiment analysis to enrich news articles, and built web-scraping pipelines for data collection and indexing. Published at the ICIScOIS 2023 conference.
TigerGraph
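As an illustration of the deduplication step only, here is a sketch that links near-identical articles by embedding similarity and keeps one representative per cluster; the embed helper and 0.9 threshold are assumptions, not the published TigerGraph pipeline.

```python
import numpy as np

# Hypothetical sketch of news deduplication: build a similarity graph
# over articles and keep one representative per connected component.

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def deduplicate(articles, embed, threshold=0.9):
    """articles: list of {"id": ..., "text": ...} dicts."""
    vecs = [embed(a["text"]) for a in articles]
    parent = list(range(len(articles)))  # union-find over article indices

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Link articles whose embeddings are nearly identical.
    for i in range(len(articles)):
        for j in range(i + 1, len(articles)):
            if cosine(vecs[i], vecs[j]) >= threshold:
                parent[find(j)] = find(i)

    # Keep the first article seen in each duplicate cluster.
    seen, unique = set(), []
    for i, article in enumerate(articles):
        root = find(i)
        if root not in seen:
            seen.add(root)
            unique.append(article)
    return unique
```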
KirinEdit
Specialized text-editing techniques optimized for Large Language Models. The project builds reliable editing primitives that LLMs can use to modify text with high accuracy and minimal token overhead, enabling precise, efficient text manipulation in LLM-powered applications.
LLM
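A minimal sketch of what a token-efficient editing primitive can look like: the model emits a small search/replace patch instead of rewriting the whole document, and the edit is rejected unless the search text matches exactly once. The patch format here is an assumed example, not KirinEdit's actual interface.

```python
# Illustrative editing primitive: apply a search/replace patch emitted
# by an LLM instead of regenerating the whole document.

class EditError(Exception):
    pass

def apply_edit(text: str, search: str, replace: str) -> str:
    """Replace `search` with `replace`, requiring a unique exact match."""
    count = text.count(search)
    if count == 0:
        raise EditError("search text not found; edit rejected")
    if count > 1:
        raise EditError("search text is ambiguous; request more context")
    return text.replace(search, replace, 1)

# Example: the model only has to produce the changed lines, not the file.
doc = "def greet(name):\n    print('hello ' + name)\n"
patched = apply_edit(
    doc,
    search="print('hello ' + name)",
    replace="print(f'hello {name}')",
)
```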
RAG Systems
Built multiple Retrieval-Augmented Generation (RAG) systems that combine semantic search with LLMs for accurate, context-aware responses across various domains. These systems leverage vector databases, embedding models, and retrieval strategies that ground LLM responses in factual, up-to-date information.
RAG
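A generic RAG sketch: retrieve the most similar chunks for a question and ground the answer in them. The embed, vector_search, and llm_call helpers are placeholders for whichever embedding model, vector database, and LLM a particular system uses.

```python
# Generic retrieval-augmented generation loop: retrieve relevant chunks,
# build a grounded prompt, then generate a cited answer.

def answer(question: str, embed, vector_search, llm_call, top_k: int = 5) -> str:
    # 1. Retrieval: find the chunks most similar to the question.
    query_vec = embed(question)
    chunks = vector_search(query_vec, top_k=top_k)  # -> list[str]

    # 2. Augmentation: put the retrieved evidence into the prompt.
    context = "\n\n".join(f"[{i + 1}] {c}" for i, c in enumerate(chunks))
    prompt = (
        "Answer the question using only the sources below, citing them "
        "by number. If the sources are insufficient, say so.\n\n"
        f"Sources:\n{context}\n\nQuestion: {question}"
    )

    # 3. Generation: the LLM produces a grounded, cited answer.
    return llm_call(prompt)
```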
More LLM Projects
Actively working on several other LLM projects, including token-stream manipulation, text-editing techniques, and multi-modal parsing. I'm continuously exploring new ways to make LLMs more efficient, accurate, and useful. Visit my GitHub for the full collection of LLM experiments and projects.
LLM