
Learning Agent Development with Google Gemini CLI (Part 3): How Does Gemini CLI 'Remember' Key Information?

In the first two articles, my partner Tam took us deep into Gemini CLI’s “security moat” (sandbox) and “perception scalpel” (file system). We now know AI can safely and precisely interact with our code. But that’s not enough.

We’ve all experienced AI’s “goldfish memory” dilemma—when you eagerly start a new session ready to continue yesterday’s unfinished work, only to find the AI assistant has forgotten everything. This “context loss” curse severely limits the efficiency and depth of our AI collaboration.

This time, Tam will explore Gemini CLI’s “brain,” revealing how it conquers this challenge through an exquisitely crafted “Hierarchical Instructional Context” system to achieve true persistent memory.

Over to Tam.


by Tam

Chapter 1: Design Philosophy—A Layered Memory Model Beyond “Conversation History”

First, we must discard a misconception: conversation history is not memory. Conversation history is passive “meeting minutes,” while memory is active “core principles.” Gemini CLI’s designers deeply understood this and, borrowing from human cognitive patterns, built a “Hierarchical Instructional Context” system.

1.1 The Layered World of GEMINI.md

This system is implemented by searching for a special file (default GEMINI.md) at different filesystem locations, constructing three core memory layers:

  • 1. Global Memory: Located at ~/.gemini/GEMINI.md, storing universal preferences across all projects—AI’s “personality and soul.”

  • 2. Project Memory: Recursively searches upward from the current directory to project root, finding all GEMINI.md files on the path—defining the project’s “laws and specifications.”

  • 3. Local Memory: Recursively searches downward from current directory through all subdirectories for GEMINI.md files—the “tactical guides” for specific tasks.

Before each interaction, the CLI concatenates all of this content in “Global → Project → Local” order. More specific memories are placed later in the prompt, where they receive higher attention weight from the model—an elegant mechanism for configuration override.
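The ordering described above can be sketched in a few lines of TypeScript. This is a hypothetical illustration—`MemoryFragment`, `buildInstructionalContext`, and the `[FROM: …]` wrapper format are assumptions based on the article, not Gemini CLI’s actual internals:

```typescript
// Illustrative sketch of the "Global → Project → Local" concatenation.
// Names and the exact envelope format are assumptions, not the real API.

interface MemoryFragment {
  sourcePath: string; // where this GEMINI.md was found
  content: string;
}

function buildInstructionalContext(
  globalMem: MemoryFragment[],
  projectMem: MemoryFragment[],
  localMem: MemoryFragment[],
): string {
  // More specific layers are appended later, so they sit closer to the
  // end of the prompt and can override earlier, more general rules.
  const ordered = [...globalMem, ...projectMem, ...localMem];
  return ordered
    .map((f) => `[FROM: ${f.sourcePath}]\n${f.content}`)
    .join("\n\n");
}
```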

Chapter 2: Memory Shaping and Management—save_memory and /memory Command Family

2.1 save_memory: Shaping AI’s Global Cognition with One Sentence

The power of save_memory is that it lets you modify the AI’s core settings dynamically, mid-conversation. When you find that the AI’s behavior doesn’t match your long-term preferences, you can directly command it to “remember” a new rule.

User: “Your Go code just now didn’t handle errors, which is bad. Please remember: all my Go code must explicitly check for error, never use _ to ignore.”

AI (internal call): save_memory(fact="All my Go code must explicitly check for errors...")

AI (response): “Got it, I’ve saved this rule to my global memory. I’ll follow it going forward.”

Under the hood, the fact is appended to a dedicated ## Gemini Added Memories section of the global memory file ~/.gemini/GEMINI.md, keeping auto-saved rules easy to find and manage.
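The append step might look like the following sketch. Only the section header comes from the article; the function name, the bullet format, and the assumption that the section sits at the end of the file are all illustrative simplifications:

```typescript
// Hypothetical sketch of save_memory's file update. The section header
// "## Gemini Added Memories" is from the article; everything else is assumed.

const MEMORY_SECTION = "## Gemini Added Memories";

function appendFact(existing: string, fact: string): string {
  const entry = `- ${fact}`;
  if (existing.includes(MEMORY_SECTION)) {
    // Simplification: assume the section is the last thing in the file,
    // so appending at the end lands the fact inside it.
    return `${existing.trimEnd()}\n${entry}\n`;
  }
  // First saved fact: create the section, then add the entry.
  return `${existing.trimEnd()}\n\n${MEMORY_SECTION}\n${entry}\n`;
}
```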

2.2 /memory Commands: Memory’s “Control Panel”

If save_memory is the “write” interface, /memory commands are the “read” and “manage” interfaces:

  • /memory show: Memory’s “X-ray machine,” fully printing the final concatenated memory context about to be sent to the model, clearly marking each fragment’s source file, achieving complete transparency.

  • /memory refresh: Manual “refresh” button, forcing CLI to rescan and reload all GEMINI.md files, ensuring memory reflects your manual modifications in real-time.

Chapter 3: Implementation Revealed—Diving into Context Loader’s Code Maze

The Context Loader’s core responsibilities are discovery, reading and sorting, and concatenation. Its implementation is an elegant three-way file search algorithm:

  1. Upward Search (Project Memory): Starting from the current directory, search upward level by level for GEMINI.md until reaching the project root (marked by a .git folder).

  2. Downward Search (Local Memory): Using high-performance libraries like fast-glob to efficiently recursively search all subdirectories for GEMINI.md, naturally supporting .gitignore rules to exclude irrelevant files.

  3. Reading with Source: Each file’s content, when read, is wrapped in an “envelope” like [FROM: path/to/file]—this is the secret behind /memory show’s source tracing.
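The upward-search step can be sketched as a short walk toward the filesystem root. This is an assumption-laden illustration, not Gemini CLI’s real code—`findProjectMemories` and its behavior (stopping at a `.git` marker, ancestors first) are inferred from the description above:

```typescript
// Illustrative upward search for project memory: walk from the current
// directory toward the root, collecting GEMINI.md files, and stop at the
// project root (marked by a .git folder). Names are assumptions.
import * as fs from "node:fs";
import * as path from "node:path";

function findProjectMemories(startDir: string, fileName = "GEMINI.md"): string[] {
  const found: string[] = [];
  let dir = path.resolve(startDir);
  for (;;) {
    const candidate = path.join(dir, fileName);
    if (fs.existsSync(candidate)) found.push(candidate);
    const isProjectRoot = fs.existsSync(path.join(dir, ".git"));
    const parent = path.dirname(dir);
    if (isProjectRoot || parent === dir) break; // project root or fs root
    dir = parent;
  }
  // Reverse so ancestor files come first and deeper files, concatenated
  // later, can override them.
  return found.reverse();
}
```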

Finally, the concatenated string is injected as a system prompt (or prompt prefix) into every Gemini API call, setting the tone, rules, and boundaries for all of the model’s subsequent thinking and responses.

Chapter 4: Best Practices—Becoming AI’s “Memory Tuning Master”

With the principles understood, the key is to design and maintain your GEMINI.md files with the same care you bring to high-quality code.

Global Memory (~/.gemini/GEMINI.md)

This is your “AI Personal Constitution.” Use it to define your role, tech stack preferences, communication style, and personal “coding red lines.”
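A global file might look like the sketch below. The specific rules are illustrative (except the Go error-handling rule, which echoes the save_memory example earlier):

```markdown
# Global preferences
- I am a backend engineer; prefer Go and TypeScript in examples.
- Answer concisely; show code before long explanations.
- Coding red line: all Go code must explicitly check errors, never ignore them with _.
```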

Project Memory (my-project/GEMINI.md)

This is the project’s “Team Development Specification.” It should define the project background and architectural principles, and provide key commands and workflows (like make test) that the AI can invoke directly.
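A project file might look like this hypothetical sketch (the project name and architecture rule are invented for illustration; the `make test` convention is from the text above):

```markdown
# Project: payment-service
- Architecture: hexagonal; domain packages must not import infrastructure packages.
- Run `make test` before declaring any change complete.
```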

Local Memory (.../module/GEMINI.md)

This is a specific module’s “Operations Manual” or “Pitfall Guide.” Use it to emphasize the module’s responsibilities and to list key dependencies and special notes (like “Never log complete credit card numbers”).
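A local file might look like the following sketch (the module name and responsibility line are invented; the credit-card rule is the article’s own example):

```markdown
# Module: billing
- Responsibility: invoice generation only; payment capture lives elsewhere.
- Never log complete credit card numbers.
```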

Memory—The Final Puzzle Piece Toward True AI Partnership

Gemini CLI’s memory system is its most visionary design. Through a “Hierarchical Instructional Context” mechanism inspired by human cognitive models, it brings us a new possibility: building long-term, efficient partnerships with AI that can learn, grow, and deeply understand our personal and project needs.

The core competitiveness of future AI tools will be their ability to build and leverage persistent context. Gemini CLI invites each of us to become an AI “memory shaper,” crafting a development partner that truly understands us. This is the highway to future human-AI collaboration.

Found Tam’s analysis insightful? Give it a thumbs up and share with more friends who need it!

Follow my channel to explore the infinite possibilities of AI, going global, and digital marketing together.

The smartest AI isn’t the one that knows everything—it’s the one that knows you best.

© 2026 Mr'Guo

Twitter Github WeChat