
use GitHub Copilot LLM for local model testing #103

Open
@phodal

Description


Describe the solution you'd like
Since Copilot supports "Using Copilot's LLM for your agent", the flow is:

  1. Read the oauthToken from ~/.config/copilot/apps.json
  2. Request an apikey from an xxx/internal endpoint
  3. Send the actual LLM request to /completion

We can use Copilot as our LLM service engine, just like DeepSeek, GLM, and OpenAI.
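
### 1. Read the oauthToken

A minimal Kotlin sketch of step 1, assuming the token is stored under an `oauth_token` field inside `~/.config/copilot/apps.json` (the exact field name and file layout may differ on your machine; the regex extraction is only to keep the sketch dependency-free):

```kotlin
import java.nio.file.Files
import java.nio.file.Paths

// Sketch: read the Copilot oauth token from the local config file.
// The "oauth_token" field name is an assumption; a real implementation
// should parse the JSON properly (e.g. kotlinx.serialization or Gson).
fun readCopilotOAuthToken(): String {
    val configPath = Paths.get(System.getProperty("user.home"), ".config", "copilot", "apps.json")
    val json = Files.readString(configPath)
    val match = Regex("\"oauth_token\"\\s*:\\s*\"([^\"]+)\"").find(json)
        ?: error("oauth_token not found in $configPath")
    return match.groupValues[1]   // a "ghu_..." token
}
```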

### 2. Fetch the apiToken
GET https://api.github.com/copilot_internal/v2/token
Authorization: token ghu_xxx
Accept: application/json
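
A Kotlin sketch of this request using the JDK's built-in `HttpClient`; the field name (`token`) used to pull the short-lived API token out of the response body is an assumption:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Sketch of step 2: exchange the ghu_ oauth token for a short-lived Copilot API token.
// Endpoint and headers mirror the request above; the "token" response field is assumed.
fun fetchCopilotApiToken(oauthToken: String): String {
    val client = HttpClient.newHttpClient()
    val request = HttpRequest.newBuilder()
        .uri(URI.create("https://api.github.com/copilot_internal/v2/token"))
        .header("Authorization", "token $oauthToken")
        .header("Accept", "application/json")
        .GET()
        .build()
    val body = client.send(request, HttpResponse.BodyHandlers.ofString()).body()
    val match = Regex("\"token\"\\s*:\\s*\"([^\"]+)\"").find(body)
        ?: error("token field not found in response")
    return match.groupValues[1]
}
```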

### 3. Send a POST to GitHub Copilot
POST https://api.githubcopilot.com/chat/completions
Authorization: Bearer xxxx
Editor-Version: Zed/1.89.3
Content-Type: application/json
Copilot-Integration-Id: vscode-chat

{
  "messages": [
    {
      "role": "user",
      "content": "hi!"
    }
  ],
  "model": "o1",
  "intent": true,
  "n": 2,
  "temperature": 0.1,
  "stream": false
}
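
A matching Kotlin sketch for step 3 that reuses the headers and body from the example above and returns the raw JSON response:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Sketch of step 3: send a chat completion request with the token from step 2.
// The user message is interpolated without JSON escaping; a real implementation
// should build the body with a JSON library.
fun copilotChatCompletion(apiToken: String, userMessage: String): String {
    val payload = """
        {
          "messages": [{"role": "user", "content": "$userMessage"}],
          "model": "o1",
          "intent": true,
          "n": 2,
          "temperature": 0.1,
          "stream": false
        }
    """.trimIndent()
    val request = HttpRequest.newBuilder()
        .uri(URI.create("https://api.githubcopilot.com/chat/completions"))
        .header("Authorization", "Bearer $apiToken")
        .header("Editor-Version", "Zed/1.89.3")
        .header("Content-Type", "application/json")
        .header("Copilot-Integration-Id", "vscode-chat")
        .POST(HttpRequest.BodyPublishers.ofString(payload))
        .build()
    return HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
        .body()
}
```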

Additional context
You can refer directly to Zed's implementation and copy the relevant logic: https://rg.gosu.cc/zed-industries/zed/refs/heads/main/crates/copilot/src/copilot_chat.rs
