IJCAI 2016 Angry Birds AI Competition - Call for Participation

July 14-15, 2016, New York, USA

Angry Birds is a popular video game in which the player uses a slingshot to shoot birds with different properties at a structure that houses pigs, with the aim of destroying the pigs. The structure can be very complicated and can involve a number of different object categories with different properties. The game and its structures largely observe the laws of physics, so it is possible to infer how a structure will change when hit at a certain position.

The task of this competition is to develop an intelligent Angry Birds playing agent that is able to successfully play the game autonomously and without human intervention. The long term goal is to build AI agents that can play new levels better than the best human players. This may require analysing the structure of the objects and inferring how to shoot the birds in order to destroy the pigs and to score the most points. In order to successfully solve this challenge, participants can benefit from combining different areas of AI such as computer vision, knowledge representation and reasoning, planning, heuristic search, and machine learning. Successfully integrating methods from these areas is one of the great challenges of AI.

Since it cannot be expected that all participants can develop all these capabilities themselves, the organisers will provide basic game playing software that is implemented in Java and includes the following components:

  • a computer vision component that analyses a video game frame and identifies the location, category, and bounding box of all relevant objects, plus the current game score
  • a trajectory component that calculates trajectories of birds and computes where to shoot from in order to hit a given location
  • a game playing component that executes actions and captures screen shots
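
For intuition, the core job of the trajectory component is standard projectile motion: given a launch speed and a target point relative to the slingshot, there are generally two release angles that hit it, a flat shot and a lobbed one. The sketch below illustrates that calculation only; the class name, method signature, and gravity constant are invented for this example and are not the actual API of the provided software.

```java
// Illustrative projectile-motion calculation; NOT the real trajectory
// component API. Names and the gravity constant are invented.
public class TrajectoryPlanner {
    static final double G = 9.81; // gravity in arbitrary game units (assumption)

    // Returns the two launch angles (radians) that hit a target at (x, y)
    // relative to the slingshot for launch speed v, or null if the target
    // is out of range at that speed.
    public static double[] launchAngles(double v, double x, double y) {
        double disc = v * v * v * v - G * (G * x * x + 2 * y * v * v);
        if (disc < 0) return null;              // target unreachable
        double root = Math.sqrt(disc);
        double low  = Math.atan2(v * v - root, G * x); // flat shot
        double high = Math.atan2(v * v + root, G * x); // lobbed shot
        return new double[] { low, high };
    }
}
```

The two angles are the roots of the standard ballistic equation; a negative discriminant means the target cannot be reached at the given speed.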

Participants are free to use these components or can develop their own components. Note that there is a small amount of uncertainty in the output that the supplied components produce and participants should take this into account when developing their programs.

The basic game playing software includes a sample agent that demonstrates the use of the provided components. The sample agent only considers birds, pigs, and the slingshot, and shoots birds directly at the detected pigs using random trajectories. The sample agent also serves as the baseline that all participants have to beat.
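
The sample agent's described behaviour, ignoring the structure and aiming straight at a pig, can be sketched as pure target-selection logic. The `Point`-based types and the nearest-pig heuristic below are illustrative assumptions; the real sample agent uses the framework's own vision output and chooses among random trajectories.

```java
import java.awt.Point;
import java.util.List;

// Sketch of the sample agent's described strategy: ignore blocks and
// shoot directly at a detected pig. Types and the nearest-pig choice
// are assumptions for illustration, not the actual sample agent code.
public class NaiveTargeting {
    // Pick the pig closest to the slingshot; returns null if none detected.
    public static Point pickTarget(Point sling, List<Point> pigs) {
        Point best = null;
        double bestDist = Double.MAX_VALUE;
        for (Point pig : pigs) {
            double d = sling.distance(pig);
            if (d < bestDist) {
                bestDist = d;
                best = pig;
            }
        }
        return best;
    }
}
```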

The basic game playing software including the sample agent can be downloaded here. We also offer a discussion forum for participants.


The Competition

The IJCAI 2016 Angry Birds AI Competition is designed to test the abilities of Angry Birds playing agents on a variety of Angry Birds levels. The competition will be run using a client/server architecture, where the game server runs an instance of the Angry Birds Chrome game for each participating Angry Birds agent. Agents run on a client computer and communicate with the server via a given protocol that allows agents to obtain screen shots of their game window from the server at any time. Agents can also obtain the current competition high scores for each level from the server. In return, agents send the server their shooting actions (release coordinate and tap time), which the server will then execute in the corresponding game window. More details about the competition rules and the communication protocol can be found here. A summary of the previous Angry Birds AI competitions can be found here; it also includes an overview of the performance of the participating agents that can be used for comparison.
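
As an illustration of the kind of message an agent sends, a shot command carries a release coordinate and a tap time. The fixed 12-byte big-endian layout below is a hypothetical wire format invented for this sketch; the real protocol is defined in the competition documentation linked above.

```java
import java.nio.ByteBuffer;

// Hypothetical wire format for a shot command (release coordinate plus
// tap time). The actual competition protocol differs; this invented
// 12-byte layout only illustrates the shape of such a message.
public class ShotMessage {
    public final int releaseX;   // release coordinate, x (pixels)
    public final int releaseY;   // release coordinate, y (pixels)
    public final int tapTimeMs;  // tap time after release (milliseconds)

    public ShotMessage(int releaseX, int releaseY, int tapTimeMs) {
        this.releaseX = releaseX;
        this.releaseY = releaseY;
        this.tapTimeMs = tapTimeMs;
    }

    public byte[] encode() {
        return ByteBuffer.allocate(12)
                .putInt(releaseX).putInt(releaseY).putInt(tapTimeMs)
                .array();
    }

    public static ShotMessage decode(byte[] data) {
        ByteBuffer b = ByteBuffer.wrap(data);
        return new ShotMessage(b.getInt(), b.getInt(), b.getInt());
    }
}
```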

The IJCAI 2016 Angry Birds AI Competition will run over two days, July 14-15, 2016, during the IJCAI 2016 conference. The AI Competition will be held on July 14. The number of teams that can enter the competition is limited. If more teams enter the competition than we can handle in one day, we will run a qualification round on July 13. The organisers reserve the right to end registration early if the number of registered participants exceeds the number of teams that can be processed during the qualification round.

During the competition, there will be a time limit for playing a given set of Angry Birds levels automatically and without any human intervention. All levels of the current round can be accessed at any time. The competition will be played over multiple knock-out rounds. In each round, the agents that achieve the highest combined game score over all levels will proceed to the next round. The agent with the highest combined game score in the grand final will be the winner of the competition.

Note that the actual game levels used during the competition will not be disclosed to the participants in advance. However, participants will be informed in advance about the birds and the object categories used in the competition game levels, so that their behaviour can be learned in advance.

During the competition, each team will be given the opportunity to briefly introduce their team, their agent, and the methods and strategies used before we run their agent. Participants are encouraged to prepare a poster with this information that will be displayed during the competition. Remote participants can email a poster consisting of up to 9 A4/Letter pages.

In addition to the standard track, we will offer a competitive track in which agents try to solve the same level with alternating shots. The agent that makes the winning shot scores all the points of the level. Before the start of each level, each agent makes a concealed bid for the right to the first shot. The agent with the higher bid wins the right to the first shot and, in case it wins the level, has to pay the bid amount to the other agent. More details about this competitive track are given in the competition rules.
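
Under the rules just described, the settlement of a single competitive-track level can be sketched as follows. The tie-breaking rule and the exact accounting are assumptions made for illustration; the official competition rules are authoritative.

```java
// Sketch of the competitive-track settlement described above: the higher
// bidder shoots first; whoever makes the winning shot takes all the
// level's points; if the first shooter wins the level, it pays its bid
// to the opponent. Tie handling (bidA == bidB favours A) is an assumption.
public class CompetitiveTrack {
    // Returns final totals {scoreA, scoreB} for one level.
    public static int[] settle(int bidA, int bidB, int levelPoints,
                               boolean firstShooterWonLevel) {
        int firstIdx = (bidA >= bidB) ? 0 : 1;   // higher bid shoots first
        int otherIdx = 1 - firstIdx;
        int[] scores = new int[2];
        int winnerIdx = firstShooterWonLevel ? firstIdx : otherIdx;
        scores[winnerIdx] += levelPoints;        // winning shot takes all points
        if (firstShooterWonLevel) {              // first shooter pays its bid
            int bid = (firstIdx == 0) ? bidA : bidB;
            scores[firstIdx] -= bid;
            scores[otherIdx] += bid;
        }
        return scores;
    }
}
```

One consequence worth noting: bidding high only pays off if the first shot is likely to be decisive, since a winning first shooter hands its whole bid to the opponent.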



Submission

The competition is open to all registered participants who submit a working Angry Birds game playing program to the organisers by July 5. This agent will be used for testing compatibility with the competition framework. Participants can still modify their agent after July 5, but no further compatibility tests will be made. Remote participation is possible, but attendance in person is encouraged. Each submission must be accompanied by a one-page description of the methods and strategies used by the agent. This information will not be disclosed to other competitors before the competition.



Registration

All participating teams need to register, with one registration per team. The early registration deadline is June 14. Registration costs USD 100 per team for early registration and USD 150 per team for late registration. Teams that participate in both tracks will have to pay an additional USD 50.

All participants who attend the competition in person will also need to register for IJCAI 2016. 

Each person can be a member of at most two teams, unless permission has been obtained from the organisers. This restriction does not apply to supervisors of student teams.



Prizes

The teams of the three best agents will receive a trophy as well as a certificate.


The IJCAI 2016 Symposium on AI in Angry Birds

During the competition, there will be a limited opportunity to present original scientific work related to the problems of developing an Angry Birds playing agent. This includes but is not limited to problems related to computer vision, heuristic search, knowledge representation and reasoning, AI planning, diagnosis, and machine learning in the context of Angry Birds. 

Papers need to be submitted by the submission deadline and will be selected by the organisers. Submitted papers will not be made available to any competition participants before the start of the competition.

Please see the Call for Papers for further details. 


The Angry Birds: Man vs Machine Challenge

In the Angry Birds: Man vs Machine Challenge on July 15 we give interested participants the opportunity to compete with the winners of the Angry Birds AI competition. Participants can test if their Angry Birds playing skills are still better than those of the best AI agents. This might be the last chance to beat AI and to become quite possibly the last human to win this challenge.

Participants will play a number of Angry Birds levels within a given time limit; each participant can play only once. Current and previous levels can be played again. For each participant we will record their personal high score for each of the solved levels. Once the human participants are finished, we will run the best Angry Birds agents from the AI competition on the same levels with the same time limit.

The participant with the highest combined score, man or machine, will be the winner of this challenge. The winner will be awarded a trophy and a certificate. Further details can be found here.


Summary of Important Rules

  • Attending the IJCAI 2016 conference is not required for participating in the competition. Remote participation will be possible, provided that participants have tested in advance that their game playing agents run on our competition environment. However, all participants who attend the competition in person will need to register for IJCAI 2016 in addition to registering for the competition.  
  • Participating agents can be developed in Java, C/C++, or Python; we strongly recommend using Java. Additional requirements need to be agreed upon with the organisers and need to be tested before July 5.
  • Competition versions of the game playing agents need to be submitted to the organisers by July 13, 10am New York time. This deadline is final. If no new version has been submitted, we will use the last version we have received. 
  • We will be using the Chrome version of Angry Birds for our competition. This version has been officially discontinued. Please refer to the AIBIRDS forum for instructions on how to run your agent for the competition. 
  • Agents will not have access to the internet and will not be able to communicate other than with the server via the specified communication protocol.
  • Each person can only be affiliated with at most two participating agents per track unless permission has been obtained from the organisers. This restriction does not apply to supervisors of student teams.



Organisers

  • Jochen Renz, Australian National University
  • XiaoYu (Gary) Ge, Australian National University
  • Peng Zhang, Australian National University
  • Matthew Stephenson, Australian National University

All enquiries should be made by email to the organisers.