Results of the 2017 Man vs Machine Challenge


Every year, the Angry Birds AI Competition is followed by a Man vs Machine Challenge, where humans can challenge the four best AI agents. The purpose of this challenge is to see whether we have reached the goal of the AIBIRDS competition: to outperform the best human players at new Angry Birds levels. A recent survey among AI researchers found that AIBIRDS was considered the AI Milestone where AI will beat humans next, so the pressure was on for the AI agents to perform well.

The AI competition the day before had some surprising results. The Champions of the last three years were all eliminated in the Quarter Finals, and we have a new AIBIRDS Champion: Eagle's Wing from Canada. One possible conclusion is that AI performance has made big progress this year and that we are getting closer to our goal of beating humans. Of course, the result could also be due to our choice of levels for the competition. This year, for the first time, we selected some levels for the AI competition that required some form of reasoning in order to achieve a high score. For example, in some levels AI agents needed to make a non-obvious first shot in order to clear the level with the second shot. IHSEV and the new Champion Eagle's Wing solved almost all of these levels and established themselves as the two best agents this year.

We selected similar "reasoning levels" for the Man vs Machine Challenge on Friday, August 25. Humans played 4 new levels for 10 minutes each, starting at 10.30am. Interest from humans was high, but unfortunately we had issues with our laptops and could only prepare 3 laptops for humans to play the 4 levels, rather than the usual 6. As in previous years, human players played surprisingly well, and most were able to solve all 4 new levels within the 10 minutes available. The IJCAI participants are clearly better prepared every year we hold the challenge; the Angry Birds playing skills of the AI community are improving.

Our hopes for AI beating humans this year were shattered soon after this year's four best AI agents started playing at 4pm. These were Eagle's Wing, IHSEV, Angry Hex and PlanA+, the four Semi Finalists. The agents were very slow at solving the 4 levels and did not achieve very high scores. Levels similar to those the agents had easily solved the day before now seemed too hard. It was almost as if the AI agents had stage fright and didn't dare to show their full skills. After the 10 minutes were up, the results were shocking: only one human player scored worse than the best AI agent, PlanA+. We couldn't believe it.

What is obvious from this result is that humans are brilliant at the type of reasoning required to solve the levels we designed this year. This is the kind of reasoning humans need every day to deal with their physical environment: predicting the consequences of one's own physical actions before executing them, and estimating which actions (out of infinitely many) are needed to achieve a given task. These capabilities are required for building Trusted Autonomous Systems that can safely interact with their physical environment. Developing them is one of the main motivations of our competition, which allows us to work on them in a simplified and controlled physical environment.

A much better performance than what we saw today is needed to beat humans at playing Angry Birds, and a much better performance is needed to develop Trusted Autonomous Systems. We still seem far away from achieving this important AI Milestone. But the performance the agents demonstrated during the AI competition makes us hopeful, and we expect to see many improvements at our next competition in 2018.

 

Now to the Winners of this year's Man vs Machine Challenge:

1. 178,290 Sebastian Rudolph from the University of Dresden, Germany

2. 174,870 Jenny Cui

3. 172,520 Przemek Walega

4. 171,990 Guillaume Derwal

5. 171,450 Shahaf Shperberg

 

Congratulations to Sebastian and the other participants for beating all the AI agents!  

While it is disappointing that the AI agents didn't perform better, it is good to see that humans are still better at playing Angry Birds than AI, and that Angry Birds remains a very challenging problem for Artificial Intelligence. Now that AI has conquered Go, Angry Birds is the next big AI challenge!

 

This ends our 2017 competition. A big THANK YOU to all the participants. We hope to see you again in 2018 for an even more exciting Man vs Machine Challenge.

Humans: keep practising!

AI Researchers, Developers, Enthusiasts, Hobbyists, Students and anyone else who would like to contribute: you can download our basic game playing software and start adding your own Angry Birds strategies to it, or put some AI techniques to the test. You can also download the source code of some of the best agents, including this year's and past years' winners. Let's see how good we can be in 2018!

 

Jochen Renz, XiaoYu (Gary) Ge, Peng Zhang and Matthew Stephenson

Australian National University