CVPR 2026

WorldArena Challenge

Evaluating Embodied World Models on Perception and Functional Utility

World Model Challenge
Held in conjunction with the CVPR 2026 Workshop on Video World Models
June 25, 2026 · Denver, USA

Important Dates

2026.03.06

Submission Opens

Challenge submission portal opens for participating teams.

2026.04.25

Mid-competition Ranking Release

Rankings based on mid-stage checkpoints will be released.

2026.05.15

Leaderboard Update

The leaderboard will be updated on this fixed date. For the remainder of the competition, the leaderboard will only be updated on May 15 and May 25.

2026.05.25

Final Submission Deadline & Leaderboard Update

Final date to submit results for Track 1 and Track 2 evaluation; this is also the second and final fixed leaderboard update.

2026.06.04

Challenge Session & Awards at CVPR 2026

Workshop challenge session and award announcements at CVPR 2026.


Challenge Overview

The WorldArena Challenge is the CVPR 2026 world model challenge built on WorldArena, a unified benchmark for evaluating embodied world models beyond visual realism alone.

Our goal is to evaluate whether a world model is not only perceptually convincing, but also functionally useful for embodied decision-making, simulation, and control. The challenge therefore emphasizes both perception quality and functional utility in a single evaluation setting.

Participants compete on two complementary tracks, mirroring the structure of the WorldArena benchmark:

  • Track 1: Video Perception Quality — evaluating visual quality, motion quality, content consistency, physics adherence, 3D accuracy, and controllability.
  • Track 2: Data Engine & Policy Evaluator — evaluating generated worlds as embodied data engines and policy evaluators for downstream tasks.

By combining the two tracks, the challenge provides a holistic benchmark that encourages progress toward world simulators that are realistic, controllable, and useful for real embodied AI applications.


Tracks & Submission

Track 1

Video Perception Quality

Evaluate video perception quality across visual quality, motion quality, content consistency, physics adherence, 3D accuracy, and controllability.

HOT: Open for submission

Final Submission Deadline:

2026.05.25

Track 2

Data Engine & Policy Evaluator

Evaluate whether generated worlds preserve decision-relevant dynamics and support embodied tasks as data engines and policy evaluators.

NOTICE: Track-specific resources continue to be updated

Evaluation Tasks:

Track 2 leaderboard evaluation covers the Data Engine and Policy Evaluator tasks.

Closed-source models: please provide API access for evaluation.

Submission Limit: Each team may submit at most twice per day. If a team submits more than twice on the same day, only the latest two submissions of that day will be used for evaluation.
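As an illustration of this rule, here is a minimal Python sketch that keeps only the latest two submissions per team per calendar day. The function name and input format are hypothetical and are not part of the official evaluation pipeline, which may implement the rule differently.

from collections import defaultdict
from datetime import datetime

def submissions_counted_for_evaluation(submissions):
    """Return the submissions that count under the stated limit.

    `submissions` is a list of (team, timestamp) pairs. For each team and
    calendar day, only the latest two submissions of that day are kept.
    """
    per_team_day = defaultdict(list)
    for team, ts in submissions:
        per_team_day[(team, ts.date())].append(ts)

    counted = []
    for (team, day), stamps in per_team_day.items():
        # Keep only the latest two submissions of that day for that team.
        for ts in sorted(stamps)[-2:]:
            counted.append((team, ts))
    return counted

# Example: three same-day submissions from one team; only the latest two count.
subs = [
    ("team_a", datetime(2026, 5, 10, 9, 0)),
    ("team_a", datetime(2026, 5, 10, 14, 0)),
    ("team_a", datetime(2026, 5, 10, 21, 0)),
]
print(submissions_counted_for_evaluation(subs))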


Awards

Awards are given separately for the two challenge tracks, and each track includes 1st, 2nd, and 3rd prizes.

Track 1

Video Perception Quality

Awards

3 prizes

1st Prize

Certificate + CVPR Workshop Presentation

$3000

2nd Prize

Certificate

$2000

3rd Prize

Certificate

$1000

Track 2

Data Engine & Policy Evaluator

Awards

3 prizes

1st Prize

Certificate + CVPR Workshop Presentation

$4000

2nd Prize

Certificate

$2500

3rd Prize

Certificate

$1500


Organizers

Mu Xu (AMAP)
Zhicheng Liu (AMAP)
Wei Wu (Manifold AI)
Yu Shang (THU)
Huazhe Xu (THU)
Dhruv Shah (Princeton)
Tat-seng Chua (NUS)
Hongyang Li (HKU)
Chen Gao (THU)
Wenwu Zhu (THU)
Xihui Liu (HKU)
Yong Li (THU)


Contact

If you have any questions or would like to get in touch with us, feel free to reach out via email:

WorldArena1@outlook.com