An AI model’s context window determines how much information the model can “remember,” and context windows have been growing steadily. Even so, models often fail to maintain context across multiple sessions, which is why researchers keep proposing new ways to give AI models long-term memory.
Dhravya Shah, a 19-year-old founder, is trying to solve this problem by building Supermemory, a memory layer for AI apps.
Originally from Mumbai, India, Shah began building consumer bots and apps a few years ago. He sold one of those bots, which formatted tweets into good-looking screenshots, to the social media tool Hypefury.
Shah, who had been preparing for the entrance exam for the Indian Institutes of Technology (IIT), made good money from the sale and decided instead to move to the US and attend Arizona State University.
After relocating, he challenged himself to build something new every week for 40 weeks. During that stretch he built Supermemory (initially called Any Context) and put it on GitHub. At the time, the tool let you chat with your Twitter bookmarks.
The current version of the tool extracts “memories,” or insights, from unstructured data, helping applications better understand a user’s context.

Shah landed an internship at Cloudflare in 2024, where he worked on AI and infrastructure, and later became a developer relations lead at the company. During this time, advisors including Cloudflare CTO Dane Knecht encouraged him to turn Supermemory into a product.
This year, he decided to work on Supermemory full time.
Now described as a universal memory API for AI apps, Supermemory builds knowledge graphs from the data it ingests to handle context on behalf of the user. For example, it can support queries that reach back to a month-old entry in a writing or journaling app, or power search in an email app. It also handles multimodal input, so a video editor could retrieve related assets from a library based on a particular prompt.
The company says customers can ingest all kinds of data, including files, documents, chats, projects, emails, PDFs, and app data streams. Its chatbot and note-taking features let users add text notes, files, and links, and connect apps such as Google Drive, OneDrive, and Notion. There is also a Chrome extension for easily saving notes from websites.
“Our core strength is extracting insights from all kinds of unstructured data and giving apps more context about their users. Because it works across multimodal data, the solution suits all kinds of AI apps, from email clients to video editors,” Shah said.
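To make the “universal memory API” idea concrete, here is a minimal sketch of how an app might hand unstructured data to a memory service and later pull back relevant context. It is an illustration under assumed names only: the base URL, endpoint paths (/memories, /search), and request/response fields are hypothetical stand-ins, not taken from Supermemory’s documentation.

```typescript
// Hypothetical sketch of using a memory API from an AI app.
// The base URL, endpoint paths, and field names below are illustrative
// assumptions, not Supermemory's documented API.

const BASE_URL = "https://api.memory-service.example"; // placeholder endpoint
const API_KEY = "YOUR_API_KEY"; // placeholder credential

// Ingest a piece of unstructured data (a note, email, chat message, etc.)
// so the memory layer can extract and index it for later recall.
async function addMemory(userId: string, content: string): Promise<void> {
  await fetch(`${BASE_URL}/memories`, {
    method: "POST",
    headers: { Authorization: `Bearer ${API_KEY}`, "Content-Type": "application/json" },
    body: JSON.stringify({ userId, content }),
  });
}

// Ask a natural-language question; the service returns the stored snippets
// most relevant to the query, e.g. a month-old journal entry.
async function searchMemories(userId: string, query: string): Promise<string[]> {
  const res = await fetch(`${BASE_URL}/search`, {
    method: "POST",
    headers: { Authorization: `Bearer ${API_KEY}`, "Content-Type": "application/json" },
    body: JSON.stringify({ userId, query }),
  });
  const data = (await res.json()) as { results: { content: string }[] };
  return data.results.map((r) => r.content);
}

async function main() {
  // A journaling app stores an entry, then recalls it weeks later.
  await addMemory("user-123", "March 3: drafted the outline for the gardening essay.");
  const context = await searchMemories("user-123", "What did I write about the gardening essay?");
  console.log(context); // fed back into the LLM prompt as extra context
}

main().catch(console.error);
```

Whatever the vendor, the pattern is the same: write raw data in, ask a natural-language question, and feed the returned snippets back into the model’s prompt.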
Supermemory has secured $2.6 million in seed funding led by Susa Ventures, Browder Capital, and SF1.vc. The round also included individual investors such as Cloudflare’s Knecht, Google AI chief Jeff Dean, DeepMind product manager Logan Kilpatrick, Sentry founder David Cramer, and executives from OpenAI, Meta, and Google.
Y Combinator at one point approached Shah about joining one of its batches, but the timing didn’t work out, as he had already brought on investors.

Joshua Browder, founder and CEO of “robot lawyer” startup DoNotPay, who runs Browder Capital as a solo GP, was impressed by Shah’s tenacity.
“I connected with Dhravya via X, and what surprised me was how quickly he moved and built things,” he said.
The company already has multiple customers, including a16z-backed desktop assistant Cluely, AI video editor Montra, AI search engine Scira, Composio’s multi-MCP tool Rube, and real estate startup RET. It also works with a robotics company to retain visual memories captured by its robots.
The consumer-leaning app, meanwhile, doubles as a playground where developers can get a feel for the tool and potentially use it in their workflows and their own apps.
Supermemory has plenty of competition in the memory space. Startups like Letta, backed by Felicis Ventures, and Mem0 (where Shah worked for a while) are building memory layers for agents. Supermemory’s own backer, Susa Ventures, has invested in Memories.ai alongside Samsung. Shah says those startups may serve a variety of industries and use cases, but argues Supermemory stands out because of its low latency.
“More and more AI companies need a memory layer. Supermemory’s solution delivers high performance while surfacing relevant context quickly,” Browder said.