Cursor Taps SpaceX Supercomputing to Scale Agentic Coding Model Training

Cursor is partnering with SpaceX to tap xAI's Colossus supercomputing infrastructure for training Composer, its flagship agentic coding model. The move follows the release of Composer 2 and addresses a compute bottleneck that has limited the team's ability to push model capabilities. While previous iterations like Composer 1.5 scaled reinforcement learning by 20x, access to Colossus-scale hardware enables a far larger expansion of these techniques, helping Cursor compete with larger labs.
You can expect these infrastructure gains to surface as more capable versions of Composer within the Cursor IDE. The team aims to translate the increased compute into models that handle complex architectural tasks with higher reliability. In the meantime, current users can continue working with the latest coding model, which already benefits from real-time RL checkpoints.
Frequently asked questions
- What is the partnership between Cursor and SpaceX?
- Cursor is partnering with SpaceX to accelerate its AI model training efforts. Through this collaboration, the Cursor team will gain access to xAI's Colossus infrastructure, a massive supercomputing cluster. This partnership is designed to help Cursor overcome current compute bottlenecks and significantly scale the intelligence of its proprietary coding models.
- What is Cursor Composer?
- Composer is Cursor's flagship agentic coding model, designed to autonomously plan and execute complex programming tasks across multiple files. It was first released less than six months ago. Subsequent versions, such as Composer 1.5 and Composer 2, have introduced scaled reinforcement learning and continued pretraining to reach frontier-level performance for software engineering tasks.
- Why is Cursor using xAI's Colossus infrastructure?
- Cursor is leveraging the Colossus infrastructure because its team has been bottlenecked by the amount of compute available for training. By using this high-performance hardware, Cursor can scale up its reinforcement learning and pretraining efforts. The company has found that each increase in compute has historically resulted in meaningfully more capable and intelligent coding models.
- How does this partnership affect Cursor's model training?
- The partnership allows Cursor to move beyond the training limits of its previous models. While Composer 1.5 scaled reinforcement learning by 20x, access to the SpaceX-linked infrastructure enables even larger scaling. This compute-heavy approach is intended to improve the reasoning and execution capabilities of the AI agents that power the Cursor code editor.
- When will the new SpaceX-trained models be available in Cursor?
- The announcement focuses on the partnership for model training rather than a specific release date for a new model version. However, Cursor has a history of rapid iteration, having released Composer 2 and Composer 1.5 within six months of the original launch. The team intends to use the new compute to develop the next generation of intelligent coding features.

