LangMart: Inception: Mercury Coder
Model Overview
| Property | Value |
|---|---|
| Model ID | openrouter/inception/mercury-coder |
| Name | Inception: Mercury Coder |
| Provider | inception |
| Released | 2025-04-30 |
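With these identifiers, a request can be issued through any OpenAI-compatible client. The sketch below is an assumption about how LangMart exposes the model: the base URL and the LANGMART_API_KEY environment variable are hypothetical placeholders, and only the model ID comes from the table above.

```python
# Minimal chat completion against a hypothetical LangMart endpoint.
# Assumptions: the API is OpenAI-compatible; BASE_URL and the environment
# variable name are placeholders. Only the model ID is taken from this page.
import os

from openai import OpenAI

BASE_URL = "https://api.langmart.example/v1"  # hypothetical endpoint

client = OpenAI(base_url=BASE_URL, api_key=os.environ["LANGMART_API_KEY"])

response = client.chat.completions.create(
    model="openrouter/inception/mercury-coder",
    messages=[{"role": "user", "content": "Write a Python function that reverses a string."}],
)
print(response.choices[0].message.content)
```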
Description
Mercury Coder is the first diffusion large language model (dLLM). Built on a discrete diffusion approach, the model runs 5-10x faster than even speed-optimized models such as Claude 3.5 Haiku and GPT-4o Mini while matching their performance. Mercury Coder's speed lets developers stay in the flow while coding, with rapid chat-based iteration and responsive code-completion suggestions. On Copilot Arena, Mercury Coder ranks 1st in speed and ties for 2nd in quality. Read more in Inception's blog post.
Specifications
| Spec | Value |
|---|---|
| Context Window | 128,000 tokens |
| Modalities | text->text |
| Input Modalities | text |
| Output Modalities | text |
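A practical use of the context window figure is a pre-flight budget check before sending a prompt. The sketch below relies on a rough four-characters-per-token heuristic rather than the model's actual tokenizer, so treat its result as an estimate only.

```python
# Rough check that prompt + requested completion fit the 128,000-token window.
# The 4-characters-per-token ratio is a crude heuristic, not the model's tokenizer.
CONTEXT_WINDOW = 128_000
CHARS_PER_TOKEN = 4  # rough estimate

def fits_in_context(prompt: str, max_output_tokens: int) -> bool:
    estimated_prompt_tokens = len(prompt) // CHARS_PER_TOKEN + 1
    return estimated_prompt_tokens + max_output_tokens <= CONTEXT_WINDOW

# Example: a long prompt plus a 2,000-token completion budget
print(fits_in_context("def reverse(s):\n    return s[::-1]\n" * 500, 2_000))
```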
Pricing
| Type | Price |
|---|---|
| Input | $0.25 per 1M tokens |
| Output | $1.00 per 1M tokens |
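These rates make per-request cost straightforward to estimate from the token counts reported with each response. A small worked example, using only the prices in the table above:

```python
# Cost estimate from the listed rates: $0.25 per 1M input tokens,
# $1.00 per 1M output tokens.
INPUT_PRICE_PER_M = 0.25
OUTPUT_PRICE_PER_M = 1.00

def request_cost(input_tokens: int, output_tokens: int) -> float:
    return (input_tokens * INPUT_PRICE_PER_M + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Example: 8,000 prompt tokens and 1,000 completion tokens
print(f"${request_cost(8_000, 1_000):.6f}")  # -> $0.003000
```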
Capabilities
- Frequency penalty
- Max tokens
- Presence penalty
- Response format
- Stop
- Structured outputs
- Temperature
- Tool choice
- Tools
- Top k
- Top p
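These parameters map onto the standard OpenAI-style chat-completions fields. The sketch below, reusing the client from the first example, is an assumption about how LangMart surfaces them; top k has no standard field in the OpenAI Python client and is typically passed through a provider-specific extension.

```python
# Request exercising several of the listed parameters; the field names follow
# the OpenAI chat-completions convention, which is an assumption here.
response = client.chat.completions.create(
    model="openrouter/inception/mercury-coder",
    messages=[{"role": "user", "content": "Summarize the trade-offs of diffusion LLMs as JSON."}],
    max_tokens=512,
    temperature=0.2,
    top_p=0.9,
    frequency_penalty=0.1,
    presence_penalty=0.0,
    stop=["\n\n"],
    response_format={"type": "json_object"},  # structured output as JSON
    # top_k is listed as supported but is not a standard client field;
    # some gateways accept it via extra_body={"top_k": 40}.
)
print(response.choices[0].message.content)
```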
Detailed Analysis
Inception: Mercury Coder