Mistral Medium Model Documentation


Model Overview

  • Model Name: Mistral Medium
  • Provider: Mistral AI
  • Inference Model ID: mistralai/mistral-medium
  • Released: January 10, 2024
  • Model Type: Closed-source Language Model

Description

A closed-source, medium-sized model from Mistral AI that excels at reasoning, code, JSON, chat, and more. This model performs comparably to other companies' flagship models and represents Mistral's mid-tier offering, positioned between smaller and larger language models in their product lineup.

Technical Specifications

Context Window & Modalities

  • Context Window: 32,000 tokens
  • Input Modality: Text
  • Output Modality: Text
  • Image Input: Not supported
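
Requests whose prompts exceed the 32,000-token window will be rejected, so clients often trim old conversation turns before sending. Below is a minimal Python sketch of that idea; the 4-characters-per-token ratio is a crude heuristic of my own, not Mistral's actual tokenizer, and the function names are illustrative.

```python
# Rough token-budget check before sending a request.
# NOTE: the 4-chars-per-token ratio is a heuristic assumption,
# not Mistral's real tokenizer; use a proper tokenizer in production.
CONTEXT_WINDOW = 32_000

def estimate_tokens(text: str) -> int:
    """Very rough token estimate (~4 characters per token)."""
    return max(1, len(text) // 4)

def trim_history(messages: list[dict], reserve_for_output: int = 1_000) -> list[dict]:
    """Drop the oldest messages until the estimated prompt fits the window."""
    budget = CONTEXT_WINDOW - reserve_for_output
    trimmed = list(messages)
    while len(trimmed) > 1 and sum(estimate_tokens(m["content"]) for m in trimmed) > budget:
        trimmed.pop(0)  # discard the oldest turn first
    return trimmed
```

Reserving some of the window for the model's reply (here 1,000 tokens) avoids truncated completions.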

Default Parameters

  • Temperature: 0.3 (default)
  • Architecture: Closed-source; internal details not published
  • Training Focus: Text-based tasks

Configuration Details

  • Instruction Type: Not explicitly documented
  • System Prompts: Not documented
  • Stop Sequences: No default stop sequences specified
  • Deprecation Status: Deprecated (see the note under Pricing)

Pricing

Type Price
Input $2.70 per 1M tokens
Output $8.10 per 1M tokens
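
Per-request cost follows directly from the per-million-token prices in the table. A small Python sketch of the arithmetic (function name is illustrative):

```python
# Cost estimate from the per-million-token prices above.
INPUT_PRICE_PER_M = 2.70   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 8.10  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request, rounded to 6 decimal places."""
    cost = (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000
    return round(cost, 6)
```

For example, a request with 10,000 input tokens and 2,000 output tokens costs about $0.0432.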

Note: Mistral Medium has been deprecated. Consider using Mistral Small or Mistral Large for new projects.

Capabilities

The Mistral Medium model excels at:

  1. Reasoning Tasks - Complex logical reasoning and problem-solving
  2. Code Generation - Writing and understanding code across programming languages
  3. JSON Handling - Structured data generation and parsing
  4. Chat Applications - Multi-turn conversational interactions
  5. General Conversation - Natural language processing and responses

Limitations

  • Image Processing: No image input or vision capabilities
  • Real-time Data: Training data has knowledge cutoff
  • No Multimodal: Text-only input and output
  • Activity Metrics: Usage statistics not yet tracked on LangMart platform

Related Models

From Mistral AI:

  • Mistral 7B - Smaller, more efficient model
  • Mistral Large - Larger, more capable model
  • Mistral Next - Latest generation model

Availability

  • Supported Providers: Available through LangMart
  • Access: Public API via LangMart platform
  • Usage Data: Activity and usage metrics not yet available on platform

Usage Examples

Basic Chat Completion

curl -X POST https://api.langmart.ai/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistralai/mistral-medium",
    "messages": [
      {
        "role": "user",
        "content": "Explain quantum computing in simple terms."
      }
    ]
  }'
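
The same request can be assembled from Python. This sketch only builds the URL, headers, and JSON body that match the curl call above; sending it with an HTTP client of your choice (e.g. `requests.post`) is left out so the example stays self-contained. The function name is illustrative.

```python
import json

API_URL = "https://api.langmart.ai/v1/chat/completions"

def build_chat_request(api_key: str, prompt: str,
                       model: str = "mistralai/mistral-medium"):
    """Assemble URL, headers, and JSON body for a chat completion call.

    Equivalent to the curl example; pass the parts to any HTTP client,
    e.g. requests.post(url, headers=headers, data=body).
    """
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return API_URL, headers, body
```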

Code Generation

curl -X POST https://api.langmart.ai/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistralai/mistral-medium",
    "messages": [
      {
        "role": "user",
        "content": "Write a Python function to sort a list of dictionaries by a specific key."
      }
    ]
  }'

JSON Processing

curl -X POST https://api.langmart.ai/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistralai/mistral-medium",
    "messages": [
      {
        "role": "user",
        "content": "Convert this data into JSON format: Name: John, Age: 30, City: New York"
      }
    ],
    "temperature": 0.3
  }'
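
Model replies to JSON prompts sometimes wrap the object in prose or markdown code fences, so clients typically parse defensively. A sketch of one such fallback parser (my own helper, not part of any LangMart or Mistral SDK):

```python
import json
import re

def extract_json(reply: str):
    """Pull the first JSON object out of a model reply.

    Handles replies that wrap the JSON in ```json fences or surrounding
    prose; returns None if nothing parseable is found.
    """
    # Strip ```json ... ``` fences if present.
    fenced = re.search(r"```(?:json)?\s*(.*?)```", reply, re.DOTALL)
    candidate = fenced.group(1) if fenced else reply
    # Fall back to the first {...} span in the text.
    match = re.search(r"\{.*\}", candidate, re.DOTALL)
    if not match:
        return None
    try:
        return json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
```

Keeping `temperature` low (as in the request above) also makes the output format more predictable.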

Multi-turn Conversation

curl -X POST https://api.langmart.ai/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistralai/mistral-medium",
    "messages": [
      {
        "role": "user",
        "content": "What is machine learning?"
      },
      {
        "role": "assistant",
        "content": "Machine learning is a subset of artificial intelligence..."
      },
      {
        "role": "user",
        "content": "Can you give me some practical examples?"
      }
    ]
  }'
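
Because the API is stateless, the client must resend the full message history on every turn, as the curl example shows. A minimal history holder that produces the same `messages` array (class and method names are illustrative):

```python
class Conversation:
    """Minimal message-history holder for multi-turn requests.

    payload() matches the request body shown above; actually sending
    it is left to your HTTP client.
    """

    def __init__(self, model: str = "mistralai/mistral-medium"):
        self.model = model
        self.messages: list[dict] = []

    def add_user(self, content: str) -> None:
        self.messages.append({"role": "user", "content": content})

    def add_assistant(self, content: str) -> None:
        # Append the model's reply so the next turn has full context.
        self.messages.append({"role": "assistant", "content": content})

    def payload(self) -> dict:
        return {"model": self.model, "messages": self.messages}
```

Combine this with the context-window trimming shown earlier for long conversations.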

Integration with LangMart Design

Connection Configuration

When adding Mistral Medium to LangMart:

  1. Provider: Mistral AI
  2. Connection Type: API Key Authentication
  3. Model ID: mistralai/mistral-medium
  4. Endpoint: LangMart API (https://api.langmart.ai/v1)

Example connection configuration:

{
  "model": "mistralai/mistral-medium",
  "temperature": 0.3,
  "max_tokens": 2000,
  "top_p": 0.9,
  "provider": "mistralai"
}
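
Catching a malformed configuration before the first request saves a round trip. The sketch below checks a config against the keys used in the example; the key set and the 0 to 1 temperature range are assumptions about the platform's schema, not a documented contract.

```python
import json

# Keys assumed from the example config above; adjust to the
# platform's actual schema.
REQUIRED_KEYS = {"model", "provider"}
OPTIONAL_KEYS = {"temperature", "max_tokens", "top_p"}

def validate_config(raw: str) -> dict:
    """Parse a JSON connection config and check its keys and ranges."""
    cfg = json.loads(raw)
    missing = REQUIRED_KEYS - cfg.keys()
    if missing:
        raise ValueError(f"missing required keys: {sorted(missing)}")
    unknown = cfg.keys() - REQUIRED_KEYS - OPTIONAL_KEYS
    if unknown:
        raise ValueError(f"unknown keys: {sorted(unknown)}")
    # Assumed sampling range; check the provider's limits.
    if not 0.0 <= cfg.get("temperature", 0.3) <= 1.0:
        raise ValueError("temperature must be in [0, 1]")
    return cfg
```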

Performance Characteristics

  • Comparable Performance: Performs at levels similar to other flagship models
  • Latency: Standard LangMart API latency
  • Throughput: Support for batch processing via LangMart
  • Reliability: Consistent performance for mid-tier inference tasks

Last Updated: December 23, 2025
Source: LangMart Model Registry