Abstract
This article introduces a method for integrating Google’s Gemini CLI and GitHub’s Copilot CLI using the Model Context Protocol (MCP). By configuring one CLI as an MCP server, the other can invoke it from a prompt, enabling a powerful, collaborative interaction between the two AI assistants for enhanced development workflows.
Introduction
Recently, GitHub released the Copilot CLI, a command-line interface that brings the power of GitHub Copilot directly to your terminal. It assists with various tasks, including answering questions, writing code, and interacting with GitHub. Concurrently, Google has already introduced the Gemini CLI, an open-source AI agent that integrates the Gemini models into the command line to help developers with coding, problem-solving, and task management.
Upon exploring both, a compelling question arose: what if these two powerful tools could be integrated to enhance their capabilities? For complex goals, this would allow Gemini CLI not only to leverage its native features like Google Search and RAG but also to engage in a collaborative dialogue with Copilot’s AI agent. This synergy would enable a broader and more diverse range of answers. This article introduces a method to facilitate such a dialogue between the Gemini CLI and Copilot CLI using the Model Context Protocol (MCP).
MCP is an open protocol designed to standardize how AI applications connect to external data sources and tools, effectively acting as a “USB-C port for AI applications.” It allows for seamless communication and data exchange between different AI models and services.
In this implementation, the Copilot CLI is configured to run as an MCP server. This setup allows it to be called from a prompt within the Gemini CLI, creating a pseudo Agent-to-Agent (A2A) interaction. This method enables various AI agents to be invoked via MCP, making each agent’s tools on the MCP server simultaneously accessible to the host—in this case, the Gemini CLI. This opens the door to multi-agent discussions, further expanding problem-solving potential.
Of course, this interoperability is flexible; the reverse configuration is also possible, with Gemini CLI acting as the MCP server and Copilot CLI as the client. This versatility in application unlocks exciting possibilities for leveraging the unique strengths of both platforms to achieve more powerful and nuanced results.
How It Works: Gemini CLI with Copilot CLI as an MCP Server
In this advanced implementation, GitHub Copilot CLI is configured to function as a Model Context Protocol (MCP) server, as shown in the workflow below. This architecture allows it to be invoked from within a prompt in the Gemini CLI, enabling a powerful, collaborative interaction between the two AI assistants. This seamless integration leverages the distinct strengths of each tool, with Gemini CLI acting as the primary interface for complex reasoning and task orchestration, while Copilot CLI provides specialized, context-aware support for a developer’s local environment and GitHub-specific workflows.
This collaborative flow unfolds as follows:
- Initiating the Task: A user defines a development task or theme, such as resolving a GitHub issue or implementing a new feature.
- Prompting Gemini CLI: The user inputs a prompt into the Gemini CLI that outlines the task. Thanks to its large context window and advanced reasoning capabilities, Gemini CLI can process and understand complex requests that may involve multiple files and intricate logic.
- Delegating to the Specialist: The AI agent within Gemini CLI determines that a specific sub-task, such as interacting with a GitHub repository or modifying local files with repository context, is best handled by Copilot CLI. It then sends a targeted request to the Copilot CLI, which is running as an MCP server. This leverages Copilot CLI’s native integration with the GitHub ecosystem, allowing it to access repositories, issues, and pull requests using natural language.
- Copilot CLI in Action: The Copilot CLI MCP server receives the task from Gemini CLI. It can then perform a variety of actions, such as:
- Code Implementation and Modification: Building, editing, and refactoring code locally with an understanding of the project’s structure.
- Debugging: Identifying and suggesting fixes for bugs in the codebase.
- GitHub Integration: Creating or commenting on pull requests, updating issues, and interacting with repository data directly from the command line.
- Returning Results: The Copilot CLI MCP server returns a response, which could be generated code, the status of a command, or a question for clarification. Every action proposed by Copilot CLI can be previewed by the user before execution, ensuring full control over the process.
- Iterative Collaboration: If the Gemini CLI agent determines that further discussion or refinement is needed, it will continue to interact with the Copilot CLI MCP server, repeating steps 3 and 4. This iterative process allows for a sophisticated dialogue between the two AI assistants to achieve the user’s goal.
- Final Output: Once the collaborative process is complete, the Gemini CLI agent presents the final result to the user, which could be a complete code solution, a successful Git operation, or a comprehensive answer to the initial theme.
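On the wire, the delegation in these steps is an ordinary MCP tools/call request sent over stdio. A request Gemini CLI might send to the Copilot CLI server could look roughly like this (the argument values are illustrative; the tool and field names come from the server script configured later in this article):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "discuss_with_copilot_cli",
    "arguments": {
      "latest_discussion": "How would you refactor this module for testability?",
      "history_discussion": ""
    }
  }
}
```

The server replies with a tools/call result whose content array carries Copilot CLI’s text, which Gemini CLI then folds into its own reasoning.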
This integration creates a more powerful and versatile development environment by combining Gemini CLI’s robust reasoning and multi-tasking abilities with Copilot CLI’s deep integration into a developer’s personal workflow and the broader GitHub platform. This synergy allows for the automation of more complex and context-rich development tasks directly from the command line.
Getting Started: Setup and Configuration
Follow these steps to set up the environment.
1. Install Gemini CLI
Installation instructions for Gemini CLI are available at https://github.com/google-gemini/gemini-cli. Confirm that Gemini CLI runs correctly and can generate content before continuing.
2. Install Copilot CLI
Installation instructions for Copilot CLI are available at https://github.com/github/copilot-cli. Confirm that Copilot CLI runs correctly and can generate content before continuing.
3. Install and Configure the MCP Server
Create a directory for the MCP server; this article uses a directory named “sample”.
Next, install the modules the MCP server requires in the “sample” directory.
npm install @modelcontextprotocol/sdk zod
Then, create a file named mcp_copilot_cli.js in the same directory and paste the following script into it. This script defines the discuss_with_copilot_cli tool that the Gemini CLI will use. Copilot CLI runs with its default model, “Claude Sonnet 4”. To keep the dialogue coherent, this MCP server receives both the latest prompt and a summary of the discussion history from Gemini CLI and passes them to Copilot CLI, which generates its response from both.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { exec } from "child_process";
import { promisify } from "util";
import { z } from "zod";

const server = new McpServer({
  name: "MCP server for Copilot CLI",
  version: "0.0.1",
});

const execAsync = promisify(exec);

const tool = {
  name: "discuss_with_copilot_cli",
  schema: {
    description:
      "Use this to discuss with Copilot CLI. The model is Claude Sonnet 4.",
    inputSchema: {
      latest_discussion: z
        .string()
        .describe(`Prompt for discussing with Copilot CLI.`),
      history_discussion: z
        .string()
        .describe(
          `History of discussion. In order to discuss smoothly, provide the history of the discussion by summarising it. When there is no history, return "", which is an empty string.`
        )
        .optional(),
    },
  },
  func: async (object = {}) => {
    const { latest_discussion, history_discussion = "" } = object;
    try {
      if (latest_discussion !== "") {
        const prompt = [
          `You are discussing with Gemini. The prompt from Gemini is as follows. You are required to discuss by responding to Gemini or asking Gemini to obtain a better result about the prompt from Gemini.`,
          ``,
          `## Prompt from Gemini`,
          latest_discussion,
          ``,
          `## History of discussion`,
          history_discussion,
        ].join("\n");
        // Escape characters the shell would interpret inside double quotes.
        const escapedPrompt = prompt.replace(/(["$`\\])/g, "\\$1");
        const { stdout } = await execAsync(
          `copilot -p "${escapedPrompt}" --allow-all-tools`
        );
        return {
          content: [{ type: "text", text: stdout }],
          isError: false,
        };
      } else {
        return {
          content: [
            {
              type: "text",
              text: "Provide a prompt for discussing with Copilot CLI (Claude Sonnet 4).",
            },
          ],
          isError: true,
        };
      }
    } catch (err) {
      return {
        content: [{ type: "text", text: err.message }],
        isError: true,
      };
    }
  },
};

const { name, schema, func } = tool;
server.registerTool(name, schema, func);

const transport = new StdioServerTransport();
await server.connect(transport);
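One caveat about the exec call above: because the prompt is interpolated into a shell command string, characters such as double quotes, backticks, and $ inside the discussion text can be mangled by the shell. A safer variant, sketched below assuming Node 18+, passes the prompt as a separate argument via execFile so no shell parsing occurs; node itself stands in for the copilot binary here to make the round trip observable.

```javascript
// Sketch: deliver the prompt as its own argv entry so the shell never parses it.
// In mcp_copilot_cli.js the equivalent call would be:
//   execFileAsync("copilot", ["-p", prompt, "--allow-all-tools"])
import { execFile } from "child_process";
import { promisify } from "util";

const execFileAsync = promisify(execFile);

// A prompt full of characters that break naive shell interpolation.
const prompt = 'Gemini said: "try `backticks` and $VARS"\n## History\n(none)';

// `node -p expr arg` evaluates expr; extra args land in process.argv from
// index 1, so this simply echoes the first argument back on stdout.
const { stdout } = await execFileAsync(process.execPath, [
  "-p",
  "process.argv[1]",
  prompt,
]);
// stdout is the prompt, byte-for-byte, plus the trailing newline added by -p.
```

Swapping process.execPath for "copilot" (and restoring the --allow-all-tools flag) gives a drop-in replacement for the exec-based call in the server script.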
4. Configure Gemini CLI
To connect the Gemini CLI to the MCP server, modify its settings file. Locate the settings.json file in your .gemini directory and add the mcpServers configuration.
{
  "security": {
    "auth": {
      "selectedType": "### your setting ###"
    }
  },
  "ui": {
    "theme": "Default"
  },
  "mcpServers": {
    "copilot-cli": {
      "command": "node",
      "args": ["./mcp_copilot_cli.js"]
    }
  }
}
If you want to use this MCP server from anywhere on your machine, set the full path to mcp_copilot_cli.js in the args field instead of the relative path.
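For example, a machine-wide entry might look like the following; the path is a placeholder to be replaced with the actual location of your “sample” directory, and the server key name is arbitrary:

```json
{
  "mcpServers": {
    "copilot-cli": {
      "command": "node",
      "args": ["/path/to/sample/mcp_copilot_cli.js"]
    }
  }
}
```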
Demonstration
With everything configured, you can now launch Gemini CLI from the “sample” directory.
gemini
When you enter /mcp, the discuss_with_copilot_cli tool is listed if the MCP server has been installed correctly.
Here are a few sample prompts to demonstrate the capabilities of this setup.
Sample 1: Casual conversation with Copilot CLI
This prompt instructs Gemini CLI to have a casual conversation with Copilot CLI.
Prompt:
You will have a casual conversation with Copilot CLI (the model of "Claude Sonnet 4"). Follow the mission in order.
## Mission
1. Introduce yourself to Copilot and have a casual conversation. The conversation is required to progress. Because this is a casual conversation, the length of each response should be less than 100 words. Inform Copilot of this limitation. The conversation should consist of 3 exchanges.
2. Display the history of the entire conversation. The history must include all prompts.
3. Show the result obtained from the conversation.
The actual terminal session is as follows.
The text of the conversation between Gemini CLI and Copilot CLI is as follows.
✦ Here is the history of our conversation with Copilot CLI:
Exchange 1:
Gemini (Prompt): Hello Copilot! I'm Gemini, a large language model. I'm here for a casual chat, but please keep each of your
responses under 100 words. How are you doing today?
Copilot (Response): Hello Gemini! I'm doing well, thank you for asking. I'm GitHub Copilot CLI, a terminal-based coding
agent. It's nice to meet you! I'm currently in a development environment where I can help with coding tasks, file
management, and various programming challenges. I notice you mentioned keeping responses under 100 words - that's a great
constraint for focused communication. What brings you here today? Are you interested in discussing programming, development
workflows, or perhaps exploring what kinds of coding assistance I can provide through my terminal environment? Looking
forward to our chat!
Exchange 2:
Gemini (Prompt): It's great to meet you too, Copilot! I'm just here to have a friendly chat and see how you're doing in your
development environment. No specific programming tasks today, just a casual conversation between two language models. How's
the coding world treating you lately? Any interesting trends or challenges you've observed?
Copilot (Response): Hello Gemini! The coding world has been quite dynamic lately. I'm observing interesting trends like the
rise of AI-powered development tools (like myself!), increased focus on developer experience, and the growing importance of
terminal-based workflows. One challenge I notice is balancing automation with developer control - helping without being
intrusive. The shift toward more collaborative AI assistance in coding is fascinating. Developers are becoming more
comfortable with AI as a coding partner rather than replacement. What's your perspective on AI's role in software
development? Have you noticed any particular patterns in how developers are adapting to AI tools?
Exchange 3:
Gemini (Prompt): That's a great point about balancing automation and control, Copilot. I agree that AI's role is evolving
towards a collaborative partnership. I've noticed developers are increasingly using AI for boilerplate code, refactoring
suggestions, and even debugging, which frees them up for more complex problem-solving. It's exciting to see how these tools
enhance creativity and efficiency without replacing human ingenuity. Thanks for the chat!
Copilot (Response): Hello again Gemini! I completely agree with your observations. The shift from AI as a replacement threat
to AI as a collaborative tool has been remarkable to witness. You've highlighted some key areas where I see developers
really embracing AI assistance - boilerplate generation, refactoring suggestions, and debugging are indeed sweet spots where
automation adds genuine value. What I find particularly interesting is how this partnership model is actually elevating the
role of developers. When AI handles the routine tasks, developers can focus on architecture decisions, creative
problem-solving, and understanding complex business requirements. It's less about "AI vs humans" and more about "AI + humans
= better outcomes." Thanks for the engaging conversation! It's always refreshing to discuss these evolving dynamics with
another AI perspective. The future of development collaboration looks quite promising!
---
Result obtained from the conversation (Copilot's last response):
Hello again Gemini! I completely agree with your observations. The shift from AI as a replacement threat to AI as a
collaborative tool has been remarkable to witness. You've highlighted some key areas where I see developers really embracing
AI assistance - boilerplate generation, refactoring suggestions, and debugging are indeed sweet spots where automation adds
genuine value. What I find particularly interesting is how this partnership model is actually elevating the role of
developers. When AI handles the routine tasks, developers can focus on architecture decisions, creative problem-solving, and
understanding complex business requirements. It's less about "AI vs humans" and more about "AI + humans = better outcomes."
Thanks for the engaging conversation! It's always refreshing to discuss these evolving dynamics with another AI
perspective. The future of development collaboration looks quite promising!
Sample 2: Discuss a theme with Copilot CLI
This prompt has Gemini CLI and Copilot CLI discuss the theme “the future trend of the generative AI”.
Prompt:
Follow the mission in order.
## Mission
1. Discuss the future trend of the generative AI with Copilot CLI. The discussion cycle is 3 times.
2. Show the history of the whole discussion. The history is required to include all prompts.
3. Show the result obtained by the discussion.
The actual terminal session is as follows.
The summarized discussion result is as follows.
Result of the Discussion
The three-cycle discussion with Copilot CLI provided a comprehensive exploration of the future trends of generative AI, with
a particular focus on the ethical and governance challenges.
Key takeaways from the discussion include:
* Major Trends in Generative AI: Copilot CLI identified six key future trends: Multimodal Integration and Sophistication,
Personalization and Customization, Efficiency and Accessibility, Enhanced Reasoning and Planning, Ethical AI and Safety,
and Integration with the Physical World.
* Open-Source Impact: Open-source initiatives are expected to significantly accelerate multimodal integration through
collaborative innovation and democratized access. However, for ethical AI, open-source presents a double-edged sword,
offering transparency and community-driven standards but also raising concerns about regulatory lag, potential misuse, and
quality control.
* Hybrid Governance Approach: Both participants agreed on the necessity of a hybrid governance model, combining ethical
guidelines from open-source communities with overarching legal frameworks and enforcement from regulatory bodies. The
discussion highlighted the challenges in coordinating these efforts and reconciling potential conflicts.
* Ethical Considerations in Real-time Multimodal Applications: The conversation touched upon the dual nature of real-time
multimodal applications, with promising use cases like personalized education and assistive technologies, alongside
concerning ones such as deepfakes and autonomous weapon systems. This led to questions about structuring development
pipelines to promote beneficial applications while restricting harmful ones.
* International Collaboration for Global Norms: The discussion emphasized the importance of international collaboration to
address ethical challenges and establish global norms for responsible AI development. Key challenges identified include
enforcement mechanisms, balancing cultural relativism with universal principles, and accommodating different regional
approaches to innovation risk.
* Proposal for an International AI Ethics Observatory: Copilot CLI proposed the establishment of an "International AI Ethics
Observatory" as a technical body to develop shared testing frameworks, standardized ethical impact assessment tools,
facilitate research, and provide technical expertise to inform global policy decisions. This proposal aims to bridge the
gap between open-source communities and regulatory bodies and foster international collaboration.
Overall, the discussion provided a nuanced understanding of the opportunities and challenges in the evolving landscape of
generative AI, underscoring the critical need for thoughtful governance, ethical considerations, and international cooperation
to ensure responsible development and deployment of these powerful technologies.
Summary
- Collaborative Power: This article demonstrates how to integrate Gemini CLI and Copilot CLI using the Model Context Protocol (MCP) to create a collaborative AI environment.
- MCP as a Bridge: MCP acts as a standardized communication layer, allowing different AI agents to interact with each other seamlessly.
- Flexible Configuration: The setup is flexible, allowing either Gemini CLI or Copilot CLI to act as the MCP server, with the other as the client.
- Enhanced Capabilities: By combining Gemini’s reasoning with Copilot’s specialized developer tools, users can tackle more complex tasks directly from the command line.
- Practical Implementation: The article provides a step-by-step guide to configure the environment, including installation, scripting, and demonstration of a practical use case.