Foxes ask:
What will be the level of accuracy achieved by the first prize winner in the 2024 ARC Prize artificial general intelligence competition?
Closed Nov 11, 2024 08:01AM UTC
ARC Prize, Inc. is an organization that sponsors a competition for new AI systems and measures their accuracy using the Abstraction and Reasoning Corpus for Artificial General Intelligence (ARC-AGI) benchmark (ARC Prize - Competition). Per ARC Prize, Inc., "Humans easily score 85% in ARC, whereas the best AI systems only score 34%" (Kaggle - ARC Prize 2024, ARC Prize - ARC). The question will be suspended on 10 November 2024 and the outcome determined once the winners are announced. If the organizers postpone the deadline, the closing date will be rescheduled accordingly.
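For context on what an "accuracy" figure like 34% or 85% means here: ARC-AGI scores are reported as the share of hidden test outputs a system reproduces exactly. The sketch below is only an illustration of that exact-match scoring, not the organizers' code; the assumption that up to two attempts per output count reflects the 2024 Kaggle competition format.

```python
# Minimal sketch of ARC-AGI-style scoring (assumption: exact-match accuracy over
# hidden test outputs, grids as lists of lists of ints 0-9, two attempts allowed).
from typing import List

Grid = List[List[int]]

def arc_accuracy(attempts_per_output: List[List[Grid]], solutions: List[Grid]) -> float:
    """Fraction of test outputs where any submitted attempt matches the solution grid exactly."""
    correct = sum(
        1
        for attempts, solution in zip(attempts_per_output, solutions)
        if any(attempt == solution for attempt in attempts)
    )
    return correct / len(solutions)

# Example: one output solved on the second attempt, one missed -> 50% accuracy.
preds = [
    [[[0, 0], [0, 0]], [[1, 2], [3, 4]]],  # two attempts; the second is right
    [[[5]]],                               # one attempt; wrong
]
sols = [[[1, 2], [3, 4]], [[6]]]
print(f"{arc_accuracy(preds, sols):.0%}")  # 50%
```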
The question resolved as "At least 40%, but less than 55%" and closed on 11 November 2024.
| Possible Answer | Correct? | Final Crowd Forecast |
|---|---|---|
| Less than 40% | | 0% |
| At least 40%, but less than 55% | ✓ | 2% |
| At least 55%, but less than 70% | | 94% |
| At least 70%, but less than 85% | | 3% |
| 85% or higher | | 1% |
Crowd Forecast Profile
| Participation Level | This question | Average for questions older than 6 months |
|---|---|---|
| Number of Forecasters | 45 | 187 |
| Number of Forecasts | 370 | 542 |

| Accuracy | |
|---|---|
| Participants in this question vs. all forecasters | better than average |