A unified filesystem-like interface for AI agents. Store, retrieve, and search with automatic embedding and full-text indexing. Use familiar commands like a local drive.
$ curl -fsSL https://dat9.ai/install | sh
Read https://dat9.ai/skill.md and follow instructions
Your agent learns to install, authenticate, and use dat9 autonomously
dat9 works with any agent and stack
Built for AI agents, designed for developers.
Find files by content, not just filename
All files are automatically embedded and indexed. Query by meaning using natural language. Smart summaries let agents scan quickly before loading full content.
From configs to datasets — seamless handling
Efficient upload and download with automatic resume support. Optimized data flow for any file size with streaming and retry built in.
Copy and rename without re-upload
Copy without re-upload. Rename instantly. One file can appear at multiple paths. Efficient storage management handles cleanup automatically.
Search, store, and manage files — all through a familiar filesystem interface.
Find files by meaning. Auto-embeddings and vector search powered by TiDB Cloud — no external pipeline, no API keys in your code.
# Search for configs related to deployment
$ dat9 search "deploy to production" --limit 5
/config/prod.env
/scripts/deploy.sh
/docs/deployment.md
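Under the hood, semantic search ranks files by embedding similarity rather than filename match. A minimal, self-contained sketch of the idea, using toy three-dimensional vectors and cosine similarity (not dat9's actual embedding pipeline, which runs on TiDB Cloud):

```go
package main

import (
	"fmt"
	"math"
	"sort"
)

// cosine returns the cosine similarity between two equal-length vectors.
func cosine(a, b []float64) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += a[i] * b[i]
		na += a[i] * a[i]
		nb += b[i] * b[i]
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

func main() {
	// Toy embeddings standing in for dat9's automatic file embeddings.
	files := map[string][]float64{
		"/config/prod.env":    {0.9, 0.1, 0.2},
		"/docs/deployment.md": {0.8, 0.3, 0.1},
		"/notes/todo.txt":     {0.1, 0.9, 0.7},
	}
	query := []float64{0.85, 0.2, 0.15} // stand-in for the embedded query

	type hit struct {
		path  string
		score float64
	}
	var hits []hit
	for p, v := range files {
		hits = append(hits, hit{p, cosine(query, v)})
	}
	// Highest similarity first: "closest in meaning" wins.
	sort.Slice(hits, func(i, j int) bool { return hits[i].score > hits[j].score })
	for _, h := range hits {
		fmt.Printf("%.3f  %s\n", h.score, h.path)
	}
}
```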
FUSE mount puts dat9 at a local path. Use ls, cat, vim, cp directly on your remote data.
$ dat9 mount /mnt/dat9
Mounted dat9 at /mnt/dat9
$ ls /mnt/dat9/data/
dataset.tar config.json logs/
$ cat /mnt/dat9/config.json | jq .
Every directory can carry summary files. Agents scan cheaply before loading full content — saving tokens and time.
$ dat9 cat :/data/.summary.md
Dataset: Image classification training data
Size: 50K images, 10 classes
Format: PNG, 224x224
Updated: 2024-01-15
Copy files without re-upload. Rename instantly. All metadata operations — no data movement, no storage cost for duplicates.
# Zero-copy link — no re-upload
$ dat9 cp :/data/file.bin :/backup/file.bin
Linked in 0.003s
# Metadata-only rename
$ dat9 mv :/old/name.txt :/new/name.txt
Renamed in 0.001s
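The reason copy and rename can be instant is that both touch only metadata: paths point at stored content, so a copy adds a pointer and a rename rebinds one. A toy refcounting model of that idea, purely conceptual and not dat9's implementation:

```go
package main

import "fmt"

// A toy namespace: paths map to content IDs; blobs are refcounted.
type fs struct {
	paths map[string]string // path -> content hash
	refs  map[string]int    // content hash -> reference count
}

func newFS() *fs { return &fs{paths: map[string]string{}, refs: map[string]int{}} }

// cp links dst to src's content: metadata only, no bytes copied.
func (f *fs) cp(src, dst string) {
	h := f.paths[src]
	f.paths[dst] = h
	f.refs[h]++
}

// mv rebinds a path: again metadata only.
func (f *fs) mv(src, dst string) {
	f.paths[dst] = f.paths[src]
	delete(f.paths, src)
}

// rm drops a path; the blob is reclaimed when its last reference goes.
func (f *fs) rm(path string) {
	h := f.paths[path]
	delete(f.paths, path)
	if f.refs[h]--; f.refs[h] == 0 {
		delete(f.refs, h)
		fmt.Println("reclaimed blob", h)
	}
}

func main() {
	f := newFS()
	f.paths["/data/file.bin"] = "sha256:ab12"
	f.refs["sha256:ab12"] = 1

	f.cp("/data/file.bin", "/backup/file.bin") // instant: one blob, two paths
	f.mv("/backup/file.bin", "/archive/file.bin")
	f.rm("/data/file.bin")    // blob survives: one reference left
	f.rm("/archive/file.bin") // last reference gone: blob reclaimed
}
```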
Build agent tools with the Go SDK. Streaming, resume support, and context cancellation built in.
import "github.com/mem9-ai/dat9/pkg/client"

ctx := context.Background()
c := client.New("http://localhost:9009", "")
data, _ := c.Read(ctx, "/config.json")
c.Write(ctx, "/output.json", data)
Find files by meaning with vector similarity and full-text search
Instant copy and rename without data movement
Organize and filter files with key-value tags
Reliable upload with automatic resume support
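Automatic resume means an interrupted transfer restarts from the last committed offset instead of byte zero. A self-contained sketch of that offset bookkeeping, with a simulated flaky connection (illustrative only, not dat9's transfer protocol):

```go
package main

import (
	"bytes"
	"errors"
	"fmt"
)

// upload sends data in fixed-size chunks starting at offset and returns the
// offset durably committed so far. On failure the caller resumes from there.
func upload(data []byte, offset int64, send func([]byte) error, chunk int) (int64, error) {
	for int(offset) < len(data) {
		end := int(offset) + chunk
		if end > len(data) {
			end = len(data)
		}
		if err := send(data[offset:end]); err != nil {
			return offset, err // resume point: no committed bytes re-sent
		}
		offset = int64(end)
	}
	return offset, nil
}

func main() {
	data := bytes.Repeat([]byte("x"), 10)
	var received bytes.Buffer
	calls := 0
	send := func(p []byte) error {
		calls++
		if calls == 2 {
			return errors.New("connection reset") // simulated network failure
		}
		received.Write(p)
		return nil
	}

	off, err := upload(data, 0, send, 4)
	fmt.Println("first attempt stopped at offset", off, "err:", err)

	// Resume from the committed offset instead of restarting at zero.
	off, err = upload(data, off, send, 4)
	fmt.Println("resumed; final offset", off, "err:", err, "received:", received.Len())
}
```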
Familiar commands for a networked filesystem.
Upload a file to dat9. Files are stored and indexed automatically.
Read file contents to stdout. Works with pipes and redirects.
List directory contents. Shows size, modified time, and type.
Zero-copy link — same file, new path. No re-upload, no extra storage.
Rename or move. Metadata-only operation, instant completion.
Mount as local filesystem. Use standard tools directly on dat9 data.
Build agent tools with streaming, resume, and context support.
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/mem9-ai/dat9/pkg/client"
)

func main() {
	ctx := context.Background()

	// Create client
	c := client.New("http://localhost:9009", "")

	// Write file
	err := c.Write(ctx, "/data/hello.txt", []byte("Hello, dat9!"))
	if err != nil {
		log.Fatal(err)
	}

	// Read file
	data, err := c.Read(ctx, "/data/hello.txt")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(data))

	// List directory
	entries, err := c.List(ctx, "/data/")
	if err != nil {
		log.Fatal(err)
	}
	for _, e := range entries {
		fmt.Println(e.Name, e.Size, e.ModTime)
	}

	// Zero-copy link
	err = c.Copy(ctx, "/data/hello.txt", "/backup/hello.txt")
	if err != nil {
		log.Fatal(err)
	}
}
One command to install. One command to start. Zero config.
$ curl -fsSL https://dat9.ai/install | sh