API Key Configuration

Configure your AI model API Key to unlock code execution, AI code assistant, and smart chat features on this site.

🔒 Privacy Notice

All API Keys are stored only in your browser's localStorage

This website never uploads, collects, or stores any of your API Keys

Configure Model

Select a provider to auto-fill the model list and Base URL. You can also edit them manually. The site's AI features (code assistant, smart chat, code explanation) will use your configured model.

⚠️ No API Key configured yet. Please configure at least one API Key.

Supported Models

| Provider | Base URL | Common Models |
| --- | --- | --- |
| OpenAI | https://api.openai.com/v1 | gpt-4.1-nano · gpt-4.1-mini · gpt-4.1 · gpt-4o · o3-mini |
| DeepSeek | https://api.deepseek.com | deepseek-chat · deepseek-reasoner |
| Anthropic | https://api.anthropic.com | claude-sonnet-4-5 · claude-haiku-4-5 · claude-opus-4-5 |
| Kimi (Moonshot) | https://api.moonshot.ai/v1 | kimi-k2 · moonshot-v1-8k · moonshot-v1-128k |
| Qwen (Tongyi) | https://dashscope.aliyuncs.com/compatible-mode/v1 | qwen-max · qwen-plus · qwen-turbo |
| Zhipu GLM | https://open.bigmodel.cn/api/paas/v4 | glm-4-plus · glm-4 · glm-4-flash |
| SiliconFlow | https://api.siliconflow.cn/v1 | DeepSeek-V3 · Qwen2.5-72B · DeepSeek-R1 |
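For reference in your own scripts, the table above can be captured as a small mapping (a sketch; the URLs are copied verbatim from the table, and the dictionary name is illustrative):

```python
# Provider -> base URL, taken from the Supported Models table above.
# Useful when pointing an OpenAI-compatible client at another provider.
PROVIDER_BASE_URLS = {
    "OpenAI": "https://api.openai.com/v1",
    "DeepSeek": "https://api.deepseek.com",
    "Anthropic": "https://api.anthropic.com",
    "Kimi (Moonshot)": "https://api.moonshot.ai/v1",
    "Qwen (Tongyi)": "https://dashscope.aliyuncs.com/compatible-mode/v1",
    "Zhipu GLM": "https://open.bigmodel.cn/api/paas/v4",
    "SiliconFlow": "https://api.siliconflow.cn/v1",
}

print(PROVIDER_BASE_URLS["DeepSeek"])  # → https://api.deepseek.com
```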

How to Get API Keys

OpenAI: Visit platform.openai.com
DeepSeek: Visit platform.deepseek.com
Anthropic: Visit console.anthropic.com
Kimi (Moonshot): Visit platform.moonshot.cn
Qwen (Tongyi): Visit dashscope.console.aliyun.com
Zhipu GLM: Visit open.bigmodel.cn
SiliconFlow: Visit cloud.siliconflow.cn
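The examples below read keys from environment variables. A quick stdlib check (the variable names follow the `PROVIDER_API_KEY` pattern used in the examples; the helper name is illustrative) confirms they are visible to Python:

```python
import os

def key_status(names):
    """Map each env var name to True if it is set and non-empty."""
    return {name: bool(os.environ.get(name)) for name in names}

checks = key_status(["OPENAI_API_KEY", "DEEPSEEK_API_KEY", "ANTHROPIC_API_KEY"])
for name, ok in checks.items():
    print(f"{name}: {'set' if ok else 'missing'}")
```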

Verify API Key

After saving your API Key, run the following examples to verify your configuration.

Example 1: Streaming Output Test (OpenAI)

python
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))

stream = client.chat.completions.create(
    model="gpt-4.1-nano",
    messages=[{"role": "user", "content": "Describe one advantage of Python in one sentence"}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()

Example 2: Using DeepSeek

python
import os
from langchain_deepseek import ChatDeepSeek

llm = ChatDeepSeek(
    model="deepseek-chat",
    api_key=os.environ.get("DEEPSEEK_API_KEY")
)
response = llm.invoke("Explain machine learning in one sentence")
print(response.content)

Example 3: Using Anthropic Claude

python
import os
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(
    model="claude-sonnet-4-5",
    api_key=os.environ.get("ANTHROPIC_API_KEY")
)
response = llm.invoke("Explain causal inference in one sentence")
print(response.content)

Example 4: LangGraph Simple Graph

Create a simple LangGraph state graph for AI Q&A:

python
import os
from langgraph.graph import StateGraph, START, END
from langchain_openai import ChatOpenAI
from typing import TypedDict

class State(TypedDict):
    question: str
    answer: str

def answer_node(state: State):
    llm = ChatOpenAI(
        model="gpt-4.1-nano",
        api_key=os.environ.get("OPENAI_API_KEY")
    )
    response = llm.invoke(state["question"])
    return {"answer": response.content}

graph = StateGraph(State)
graph.add_node("answer_node", answer_node)
graph.add_edge(START, "answer_node")
graph.add_edge("answer_node", END)

app = graph.compile()
result = app.invoke({"question": "What is LangGraph? Explain briefly."})
print(result["answer"])

Switching Models

  • Tutorial code uses OpenAI models by default
  • To use other models, modify the model name in the code as shown above
  • The system automatically injects your configured API Key via environment variables

Start Learning

After configuration, begin your learning journey.

Released under the MIT License. Content © Author.