JOBIM J-Factor Mistral 8x22B

Mixture-of-Experts (MoE) architecture, fully compressed. Best-in-class for math and science.

Model Overview

Base Model: Mistral 8x22B (MoE)
Endpoint: /v1/models/jobim-jfactor-mistral-8x22b
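
The endpoint path above follows the OpenAI-style model-retrieval convention, so a GET against it should return the model's metadata. A minimal sketch with Python's requests library, assuming that convention holds for JOBIM's API:

import os
import requests

# Assumption: GET on the documented model path returns metadata as JSON,
# per the OpenAI-compatible convention the path suggests.
resp = requests.get(
    "https://api.jobim.ai/v1/models/jobim-jfactor-mistral-8x22b",
    headers={"Authorization": f"Bearer {os.environ['JOBIM_API_KEY']}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())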

The J-Factor Advantage

The full 8-expert MoE is compressed to run on a single GPU, with no expert-routing overhead at inference time; a sketch of what that routing step normally involves follows the stats below.

Compression: 94.3%
Throughput: 6.1 TPS
Active Experts: 8
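
To make "routing overhead" concrete, here is a minimal NumPy sketch of a conventional top-k sparse-MoE forward pass. It is purely illustrative: the sizes, names, and routing scheme are generic assumptions, not JOBIM's compression method.

import numpy as np

rng = np.random.default_rng(0)
D_MODEL, D_FF, N_EXPERTS, TOP_K = 64, 256, 8, 2  # toy sizes, not the real model's

# Per-expert feed-forward weights plus a routing (gating) matrix.
experts_w1 = rng.standard_normal((N_EXPERTS, D_MODEL, D_FF)) * 0.02
experts_w2 = rng.standard_normal((N_EXPERTS, D_FF, D_MODEL)) * 0.02
router_w = rng.standard_normal((D_MODEL, N_EXPERTS)) * 0.02

def moe_forward(x):
    # Gating softmax over experts for every token.
    logits = x @ router_w
    probs = np.exp(logits - logits.max(-1, keepdims=True))
    probs /= probs.sum(-1, keepdims=True)
    top_k = np.argsort(probs, axis=-1)[:, -TOP_K:]  # top-k expert ids per token

    out = np.zeros_like(x)
    for t in range(x.shape[0]):  # this dispatch loop is the routing overhead
        sel = top_k[t]
        gate = probs[t, sel] / probs[t, sel].sum()  # renormalize over top-k
        for g, e in zip(gate, sel):
            h = np.maximum(x[t] @ experts_w1[e], 0.0)  # expert FFN (ReLU)
            out[t] += g * (h @ experts_w2[e])
    return out

print(moe_forward(rng.standard_normal((4, D_MODEL))).shape)  # (4, 64)

In a conventional MoE, every token pays for the gating softmax, top-k selection, and per-expert dispatch above; a model whose experts have been merged into a single dense network executes one forward path instead, which is the overhead the compression removes.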

Best For

  • Advanced mathematics
  • Scientific paper analysis
  • Multi-step reasoning
  • Domain-specific routing
  • Research prototypes

Code Example

cURL
curl https://api.jobim.ai/v1/chat/completions \
  -H "Authorization: Bearer $JOBIM_API_KEY" \
  -d '{
    "model": "jobim-jfactor-mistral-8x22b",
    "messages": [{"role": "user", "content": "Solve: ∫(x² + 2x + 1)dx"}]
  }'
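
If you would rather call the endpoint from Python, the sketch below sends the same request with the requests library. It assumes the response follows the OpenAI-compatible schema implied by the /v1/chat/completions path (reply text at choices[0].message.content); adjust if JOBIM's schema differs.

Python
import os
import requests

resp = requests.post(
    "https://api.jobim.ai/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {os.environ['JOBIM_API_KEY']}",
        "Content-Type": "application/json",
    },
    json={
        "model": "jobim-jfactor-mistral-8x22b",
        "messages": [{"role": "user", "content": "Solve: ∫(x² + 2x + 1)dx"}],
    },
    timeout=60,
)
resp.raise_for_status()
# Assumes the OpenAI-compatible response shape: the reply text lives at
# choices[0].message.content in that schema.
print(resp.json()["choices"][0]["message"]["content"])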