Results of the 2016 Man vs Machine Challenge


Every year, the Angry Birds AI Competition is followed by a Man vs Machine Challenge, where humans can take on the four best AI agents. The purpose of this challenge is to see whether we have reached the competition's goal of building AI agents that outperform the best human players on new Angry Birds levels. On July 15, we set up six computers at the Hilton Midtown in New York where people could play Angry Birds. Humans played first, from 9.30am to 4.40pm. At 11am that day, Donald Trump was scheduled to announce his running mate for the coming Presidential election at a press conference at the Hilton Midtown, and we were hoping he would participate in our challenge as well. Unfortunately, the press conference was postponed to the following day and he did not take up our challenge.

Every participant had to solve four Angry Birds levels in 10 minutes. The levels were New York-themed, featuring the Statue of Liberty and the New York skyline. Since we did not want the birds to destroy these famous landmarks, they were made of unbreakable material. The main challenge in solving these levels was to hit pigs that were not reachable with a direct shot and required some form of reasoning to be hit. This turned out to be very difficult for human players. Only a few of the best players managed to solve all four levels; even some of the top players from 2015 did not solve them all.

We were all excited to see how the AI agents would perform and whether they could cope with these very difficult levels. We ran the four best AI agents from the AIBIRDS 2016 competition at 4.45pm for 10 minutes and were very surprised when the agent SEABirds solved the first level after only a minute. DataLab solved the first level shortly after, and another two minutes later SEABirds solved the second level. It looked very good for the AI agents, and despite originally thinking the levels would be far too hard for AI, we now all thought AI had a real chance. But unfortunately our hopes were shattered: none of the four agents solved another level. Particularly remarkable was that the two finalists from the day before did not solve a single level.

In the end, the result was the same as in past years: humans won the AIBIRDS 2016 Man vs Machine Challenge! Angry Birds is still too difficult for AI to outperform humans. Only three human players scored over 200,000 points. The 2016 Man vs Machine Champion, Coen van Leeuwen from Delft University of Technology, scored 203,350 points. Diedrich Wolter, whose AI agent BamBirds won the AIBIRDS 2016 competition, was only 250 points behind with 203,100 points. The third-placed human, who entered only the name raph, scored 200,800 points. The best AI agent of this year's challenge, SEABirds, scored only 100,440 points.

While it is disappointing that the AI agents did not perform better, it is good to see that humans are still better at playing Angry Birds than AI and that Angry Birds remains a very challenging problem for Artificial Intelligence. Now that AI has conquered Go, Angry Birds is the next big AI challenge!

 

This ends our 2016 competition. A big THANK YOU to all the participants. We hope to see you again in 2017 for an even more exciting Man vs Machine Challenge.

Humans: keep practising

AI researchers, developers, enthusiasts, hobbyists, students, and anyone else who would like to contribute: you can download our basic game playing software and start adding your own Angry Birds strategies to it, or put some AI techniques to the test, as sketched below. Let's see how good we can be in 2017!
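To give a flavour of the kind of strategy you might add, here is a minimal, self-contained sketch of a target-selection heuristic: prefer pigs that have a clear line of sight from the slingshot, and among those aim at the closest one. The Pig class, coordinates, and the sheltered flag are invented for illustration and are not part of the official basic game playing software; treat this as a starting point for your own ideas rather than as the framework's API.

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class SimpleTargetSelector {

    /** Hypothetical description of a pig on the screen. */
    static class Pig {
        final double x, y;        // screen coordinates of the pig
        final boolean sheltered;  // true if there is no direct shot from the slingshot
        Pig(double x, double y, boolean sheltered) {
            this.x = x; this.y = y; this.sheltered = sheltered;
        }
    }

    /** Prefer unsheltered pigs; among those, pick the one closest to the slingshot. */
    static Pig pickTarget(List<Pig> pigs, double slingX, double slingY) {
        return pigs.stream()
                .min(Comparator
                        .comparing((Pig p) -> p.sheltered)  // false sorts before true
                        .thenComparingDouble(p -> Math.hypot(p.x - slingX, p.y - slingY)))
                .orElse(null);
    }

    public static void main(String[] args) {
        // Example scene with three pigs and a slingshot at (160, 360).
        List<Pig> pigs = Arrays.asList(
                new Pig(540, 320, false),
                new Pig(610, 300, true),
                new Pig(470, 350, false));
        Pig target = pickTarget(pigs, 160, 360);
        System.out.printf("Aim at the pig at (%.0f, %.0f)%n", target.x, target.y);
    }
}
```

A real agent would, of course, go much further, for example by reasoning about which blocks to topple so that sheltered pigs become reachable, which is exactly the kind of reasoning this year's levels demanded.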

 

Jochen Renz, XiaoYu (Gary) Ge, Peng Zhang and Matthew Stephenson

Australian National University