Anthropic's most powerful model. It delivers breakthrough performance on reasoning, coding, and complex analysis, with improved safety and instruction following.
| Token type | Credits | USD equivalent |
|---|---|---|
| Input tokens | 10,000 | $10.00 |
| Output tokens | 50,000 | $50.00 |
* 1 credit ≈ $0.001 (actual charges may vary with usage)
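As a quick sanity check on the table above, the dollar figures follow directly from the 1 credit ≈ $0.001 rate (a sketch only; actual billing may differ):

```shell
# Convert credits to an approximate dollar cost at 1 credit ~= $0.001.
# The rate and the 10,000-credit figure come from the pricing table above.
CREDITS=10000
COST=$(awk -v c="$CREDITS" 'BEGIN { printf "%.2f", c * 0.001 }')
echo "$COST"   # prints 10.00
```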
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| messages | array | Yes | - | Array of message objects (OpenAI format) |
| temperature | float | No | 1.0 | Sampling temperature (0-1) |
| max_tokens | integer | No | 4096 | Maximum tokens in the response (up to 32000) |
| stream | boolean | No | false | Enable Server-Sent Events streaming |
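When `stream` is `true`, the endpoint returns Server-Sent Events. Assuming the chunks follow the OpenAI streaming shape (`choices[0].delta.content`, terminated by `data: [DONE]`) — an assumption, since the exact payload is not shown on this page — the text can be reassembled as below; sample chunks stand in for a live `curl -N` call:

```shell
# Reassemble streamed text from OpenAI-style SSE chunks.
# The chunk shape (choices[0].delta.content) is assumed, not documented here;
# in practice, replace the printf with: curl -N ... -d '{..., "stream": true}'
TEXT=$(printf '%s\n' \
  'data: {"choices":[{"delta":{"content":"Hel"}}]}' \
  'data: {"choices":[{"delta":{"content":"lo"}}]}' \
  'data: [DONE]' |
  grep '^data: {' |
  sed 's/^data: //' |
  python3 -c 'import sys, json; print("".join(json.loads(l)["choices"][0]["delta"]["content"] for l in sys.stdin), end="")')
echo "$TEXT"   # prints Hello
```

The `grep '^data: {'` filter drops the `data: [DONE]` sentinel and any keep-alive lines before parsing.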
### In-depth analysis with Claude Opus 4.6
curl -X POST "https://api.core.today/llm/anthropic/v1/messages" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer cdt_your_api_key" \
  -d '{
    "model": "claude-opus-4-6",
    "messages": [
      {
        "role": "system",
        "content": "You are a research analyst with expertise in technology and economics."
      },
      {
        "role": "user",
        "content": "Provide a comprehensive analysis of how large language models are transforming software development practices, including potential risks and mitigation strategies."
      }
    ],
    "temperature": 0.7,
    "max_tokens": 4000
  }'

### Production-grade code generation
curl -X POST "https://api.core.today/llm/anthropic/v1/messages" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer cdt_your_api_key" \
  -d '{
    "model": "claude-opus-4-6",
    "messages": [
      {
        "role": "system",
        "content": "You are a senior software engineer. Write clean, well-tested code."
      },
      {
        "role": "user",
        "content": "Implement a rate limiter in Python using the sliding window algorithm with Redis backend."
      }
    ],
    "temperature": 0.3,
    "max_tokens": 3000
  }'

POST /llm/anthropic/v1/messages
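The examples above print the raw JSON response. Assuming the non-streamed body follows the OpenAI chat-completion shape (`choices[0].message.content`) — again an assumption, since the response schema is not shown on this page — the assistant's text can be pulled out by piping the `curl` output through a small parser; a canned response stands in for the live call here:

```shell
# Extract the assistant's text from an OpenAI-style chat-completion response.
# The response shape is assumed; pipe a real `curl ...` call instead of printf.
REPLY=$(printf '%s' '{"choices":[{"message":{"role":"assistant","content":"Rate limiter sketch..."}}]}' |
  python3 -c 'import sys, json; print(json.load(sys.stdin)["choices"][0]["message"]["content"], end="")')
echo "$REPLY"   # prints Rate limiter sketch...
```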