Results of the 2012 Angry Birds AI Challenge

On the first day of the competition, all submitted agents were tested in the competition setting on some basic Angry Birds levels. Feedback was given to all participants, allowing them to make final changes to their agents. The final versions had to be submitted one hour before the start of the main competition on the second day. Since three hours were allocated for the competition, the organizers ran the agents in three batches of three agents, one hour per batch. This left one slot open, which was filled by the Naive Agent, giving a total of nine participants, most of whom competed remotely.

The competition was held on site at the main conference, and the game play of the competing agents was transmitted to a large screen in the conference foyer. Watching the agents play was very exciting, particularly for members of the participating teams. None of the agents took the opportunity to replay solved levels in order to improve their score: once a level was solved, every agent proceeded to the next one, and agents that solved all ten levels stopped. Since agents were not allowed to skip levels, some remained stuck at a level they could not solve until they ran out of time.

In the end, only three agents were able to solve all ten levels: ABC-IS-course-UNICAL, Zonino, and the Naive Agent. The overall winner was, surprisingly, the Naive Agent with a total score of 526,590. Birdbrain, despite solving only eight levels, was second with a score of 501,200. Third was ABC-IS-course-UNICAL with a score of 492,980, and fourth was Zonino with 472,910. Taking the high score for each level across all agents, the combined total was 605,600; Birdbrain holds six of the level high scores and the Naive Agent four. Thanks to the unlimited setting, which allowed team Birdbrain to access the game and to simulate shots, their agent was very good at levels that could be solved with one bird without tapping.

On day three of the competition, the general public was invited to compete against the best AI agents. Each participant had to play the same ten levels as the AI agents the day before. The high scores of the AI agents were displayed, and participants could replay levels on which their scores were lower than the AI agents'. In contrast to the AI agents, humans took advantage of this opportunity and replayed levels. Even so, only two human players achieved a higher score than the Naive Agent. The winner, Dengji Zhao, who played the game for the first time, had a total score of 556,510. The second-placed human, a very experienced player, had a score of 545,870, and the third-ranked human a score of 503,450. The overall high score of all human participants for all levels was 611,380, only slightly higher than that of the AI agents.