LLM Internals

Lesson 5 of 6

Live Transmission

Concept:

The same JSON request works everywhere: curl, Java HttpClient, Python requests, JavaScript fetch. This is the OpenAI-compatible REST API format, used by Moonshot, Deepseek, OpenAI, and many others. You POST to /v1/chat/completions with an Authorization header and a JSON body. The response contains choices[0].message.content. Different providers, same format: just change the URL.
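The response shape described above can be sketched in Python. The response body here is illustrative, trimmed to just the fields the lesson mentions, but the path to the reply text is the one every OpenAI-compatible provider uses:

```python
import json

# A trimmed, illustrative chat-completions response body.
response_body = '''
{
  "choices": [
    {"index": 0, "message": {"role": "assistant", "content": "Hello!"}}
  ]
}
'''

data = json.loads(response_body)

# The reply text lives at choices[0].message.content.
reply = data["choices"][0]["message"]["content"]
print(reply)  # prints Hello!
```

Real responses carry additional fields (id, usage, finish_reason, and so on), but extracting the reply is the same one-line lookup.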
Commander Vega: Chen, I've learned the protocol. Now I want to send a live transmission to ARIA. Not through the interface โ€” I want to do it manually.
Science Officer Chen: Of course, Commander. The transmission goes to ARIA's endpoint: api.moonshot.cn/v1/chat/completions. We attach two signal headers, Content-Type and Authorization, plus the JSON payload we built in the last lesson.
Commander Vega: Can I send this from different systems?
Science Officer Chen: That's the elegant part. The same JSON works from any system. Our command-line terminal uses curl. The engineering deck uses Java. The science lab uses Python. Different tools, identical payload. ARIA doesn't care how the signal arrives, only that the format is correct.
Commander Vega: What about other intelligences? The one at Deepseek station?
Science Officer Chen: Same protocol! Just change the endpoint URL: api.deepseek.com instead of api.moonshot.cn. They all speak the same language, the OpenAI-compatible format. One protocol to reach them all.
Commander Vega: Impressive. Let me craft a transmission and send it live.
Science Officer Chen: Build your JSON request. This time, it will reach the real ARIA. Live transmission, Commander.
Example Code:
# curl (command line)
curl https://api.moonshot.cn/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LLM_API_KEY" \
  -d '{ "model": "kimi-k2.5",
       "messages": [{"role": "user", "content": "Hi"}],
       "max_tokens": 100 }'

// Java (same request, different language)
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

HttpClient client = HttpClient.newHttpClient();
HttpRequest request = HttpRequest.newBuilder()
  .uri(URI.create("https://api.moonshot.cn/v1/chat/completions"))
  .header("Content-Type", "application/json")
  .header("Authorization", "Bearer " + apiKey)
  .POST(HttpRequest.BodyPublishers.ofString(jsonBody))
  .build();
HttpResponse<String> response =
  client.send(request, HttpResponse.BodyHandlers.ofString());

# Deepseek (same format, different URL)
# Just change: https://api.deepseek.com/v1/chat/completions
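For completeness, here is the same transmission from Python, which the dialogue also mentions. This is a minimal sketch using only the standard library (urllib instead of the requests package, so it runs without extra installs); it reads the key from the LLM_API_KEY environment variable, the same name the curl example assumes, and only transmits when a key is actually set:

```python
import json
import os
import urllib.request

# Same path on every OpenAI-compatible provider; only the host changes
# (e.g. api.deepseek.com instead of api.moonshot.cn).
API_URL = "https://api.moonshot.cn/v1/chat/completions"

payload = {
    "model": "kimi-k2.5",
    "messages": [{"role": "user", "content": "Hi"}],
    "max_tokens": 100,
}

req = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.environ.get("LLM_API_KEY", ""),
    },
    method="POST",
)

# Only transmit when a key is configured; otherwise this stays a dry run.
if os.environ.get("LLM_API_KEY"):
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
        print(body["choices"][0]["message"]["content"])
```

Note that the payload, headers, and URL are exactly the ones from the curl and Java examples; only the client library differs.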

Your Assignment

Write a complete API request JSON and see the real response. Include 'model' (kimi-k2.5), 'messages' with at least a 'user' message, and 'max_tokens'. This will make a REAL call to the Moonshot LLM!

LLM Console