# Groq: Llama 4 Scout
## Model Overview
| Property | Value |
|---|---|
| Model ID | groq/llama-4-scout-17bx16e-128k |
| Provider | Groq |
| Category | Chat |
| Released | TBD |
| Status | Active |
| Flagship | No |
## Description
Llama 4 Scout is a conversational AI model designed for multi-turn dialogue and interactive tasks. Developed by Meta and served on Groq's low-latency inference platform, it is a mixture-of-experts model (17B active parameters across 16 experts, as reflected in the model ID) with a 128K-token context window, making it a solid general-purpose choice for chat workloads.
## Specifications
| Property | Value |
|---|---|
| Context Window | 128000 tokens |
| Max Output Tokens | N/A |
| Knowledge Cutoff | N/A |
| Modalities | Text |
| Speed Tier | Instant |
| Quality Tier | Standard |
## Pricing
| Type | Price |
|---|---|
| Input | $0.11 per 1M tokens |
| Output | $0.34 per 1M tokens |
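For a quick sense of what these rates mean in practice, the sketch below estimates the cost of a single request using the listed prices; the token counts are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope cost estimate using the listed per-token rates.
INPUT_PRICE_PER_M = 0.11   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.34  # USD per 1M output tokens

# Assumed request size for illustration only.
input_tokens = 50_000      # e.g. a long multi-turn conversation history
output_tokens = 2_000      # e.g. a detailed reply

cost = (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
     + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M
print(f"Estimated cost: ${cost:.6f}")  # -> Estimated cost: $0.006180
```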
## Capabilities
| Capability | Supported |
|---|---|
| Vision/Image Input | No |
| Tool/Function Calling | Yes |
| JSON Mode | No |
| Streaming | Yes |
| System Prompt | Yes |
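Since tool/function calling is listed as supported, the sketch below shows how it might be invoked through the OpenAI-compatible endpoint. The base URL, API key, and the `get_weather` tool are illustrative assumptions, not part of the LangMart documentation.

```python
# Minimal tool/function-calling sketch against the OpenAI-compatible API.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.langmart.ai/v1",  # assumed from the curl example below
    api_key="sk-your-api-key",
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool, for illustration only
            "description": "Return the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="groq/llama-4-scout-17bx16e-128k",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# If the model decides to call the tool, the call arrives as structured arguments.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```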
## Use Cases
- Conversational AI and chatbots
- Customer support automation
- Interactive content generation
- Real-time dialogue systems
- Multi-turn reasoning and problem-solving
## Strengths
- Good balance of quality and efficiency
- Lightning-fast inference with minimal latency
- Very large context window for long-form content
- Function calling for structured tool integration
## Limitations
- No vision or image input capability
- No guaranteed JSON output mode
## Integration with LangMart
Gateway Support:
- Type 1 (Full Platform): Yes
- Type 2 (Cloud Gateway): Yes
- Type 3 (Self-hosted): Yes
API Endpoint:

```bash
curl -X POST https://api.langmart.ai/v1/chat/completions \
  -H "Authorization: Bearer sk-your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "groq/llama-4-scout-17bx16e-128k",
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 1024
  }'
```
OpenAI Compatibility: Fully compatible with the OpenAI API format.
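Given that compatibility, the same request should also work through the official `openai` Python SDK. The sketch below assumes the base URL and API key from the curl example above and uses streaming, which the capabilities table lists as supported.

```python
# Minimal sketch of the same chat request via the openai Python SDK (v1.x).
from openai import OpenAI

client = OpenAI(
    base_url="https://api.langmart.ai/v1",  # assumed from the curl example
    api_key="sk-your-api-key",
)

# Streaming is listed as supported, so tokens can be consumed as they arrive.
stream = client.chat.completions.create(
    model="groq/llama-4-scout-17bx16e-128k",
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Hello"},
    ],
    max_tokens=1024,
    stream=True,
)

for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```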
## Related Models
Models in this category and from this provider are available in the LangMart marketplace.
## Additional Resources
Data Source: LangMart Provider Models Database
Last Updated: 2026-01-02