
iQuest Coder

A new family of code large language models (LLMs) designed to advance autonomous software engineering and code intelligence.

Cost / License

  • Free
  • Open Source

Platforms

  • Self-Hosted
  • Python

Features

  1.  AI-Powered



iQuest Coder information

  • Developed by: IQuestLab
  • Licensing: Open Source (MIT) and free
  • Alternatives: 11 alternatives listed
  • Supported Languages: English
  • AlternativeTo Category: AI Tools & Services

What is iQuest Coder?

iQuest Coder is a family of code large language models (LLMs) designed to advance autonomous software engineering and code intelligence. Built on a code-flow multi-stage training paradigm, IQuest-Coder-V1 captures the dynamic evolution of software logic and delivers state-of-the-art performance across several key dimensions:

  • State-of-the-Art Performance: Achieves leading results on SWE-Bench Verified (76.2%), BigCodeBench (49.9%), LiveCodeBench v6 (81.1%), and other major coding benchmarks, surpassing competing models in agentic software engineering, competitive programming, and complex tool use.
  • Code-Flow Training Paradigm: Moving beyond static code representations, our models learn from repository evolution patterns, commit transitions, and dynamic code transformations to understand real-world software development processes (an illustrative data sketch follows this list).
  • Dual Specialization Paths: Bifurcated post-training delivers two specialized variants—Thinking models (utilizing reasoning-driven RL for complex problem-solving) and Instruct models (optimized for general coding assistance and instruction-following).
  • Efficient Architecture: The IQuest-Coder-V1-Loop variant introduces a recurrent mechanism that optimizes the trade-off between model capacity and deployment footprint.
  • Native Long Context: All models support context windows of up to 128K tokens out of the box, with no additional scaling techniques required (a self-hosting sketch also follows this list).
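The code-flow claim above centers on learning from commit transitions. As a purely illustrative sketch in Python (this is not IQuestLab's pipeline, and the helper names are invented for this example), here is one way such before/after training pairs could be mined from a git history using only standard git commands:

    import subprocess

    def git(*args: str, repo: str = ".") -> str:
        """Run a git command inside `repo` and return its stdout."""
        return subprocess.run(
            ["git", "-C", repo, *args],
            capture_output=True, text=True, check=True,
        ).stdout

    def commit_transitions(repo: str, path: str, max_commits: int = 50):
        """Yield (before, after, message) tuples for `path` across recent commits."""
        # Hashes of the most recent commits that touched the file, oldest first.
        shas = git("log", "--reverse", f"--max-count={max_commits}",
                   "--format=%H", "--", path, repo=repo).split()
        for sha in shas:
            message = git("show", "-s", "--format=%s", sha, repo=repo).strip()
            after = git("show", f"{sha}:{path}", repo=repo)
            try:
                before = git("show", f"{sha}^:{path}", repo=repo)  # parent revision
            except subprocess.CalledProcessError:
                before = ""  # file did not exist before this commit
            yield before, after, message

    for before, after, msg in commit_transitions(".", "README.md", max_commits=5):
        print(f"{msg!r}: {len(before)} -> {len(after)} characters")

Each (before, after) pair records one code transformation; the commit message gives it a natural-language label. How IQuestLab actually structures this data is not described on this page.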
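Since the listing marks the models as free, open source (MIT), self-hosted, and Python-based, a plausible deployment path is a standard Hugging Face Transformers loop. Everything below is an assumption for illustration: the repository ID "IQuestLab/IQuest-Coder-V1-Instruct", the chat template, and the generation settings are not confirmed by this page.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "IQuestLab/IQuest-Coder-V1-Instruct"  # hypothetical repo ID

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # use the checkpoint's native precision
        device_map="auto",    # place layers on available GPUs automatically
    )

    messages = [{"role": "user",
                 "content": "Write a Python function that reverses a linked list."}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    outputs = model.generate(inputs, max_new_tokens=512)
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))

If the models natively handle 128K tokens as claimed, no rope-scaling or sliding-window configuration should be needed for long prompts; KV-cache memory becomes the practical ceiling.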

Official Links