2013 Competition Rules


Eligibility

The competition is open to all registered participants who submit a working Angry Birds game playing program to the organisers by July 30. Remote participation is possible, but attendance is encouraged. Each submission must be accompanied by a one-page description of the methods and strategies used by the agent and why it is an improvement over the provided sample agent. This information will not be disclosed to other competitors before the competition. Each person can be a member of at most two teams unless permission has been obtained from the organisers. This restriction does not apply to academic supervisors of student teams.

 

Game play environment: 

We will be using the Chrome version of Angry Birds in SD mode for our competition. Participants who attend the competition and intend to test and modify their agents while in Beijing should note that the game website may not be accessible in China. We recommend using the offline version for this purpose.

For each agent, the corresponding Angry Birds game instance will be run on a game server, while the agent will be executed on a client computer. We will support Java, C/C++, and Python agents. Participants should contact the organisers if they would like to use other programming languages or non-standard software libraries.

Each agent will receive a total of 50MB of local disk space on the client computer to store information. This includes the space required for the agent code. Client computers have no access to the internet and can only communicate with the game server as specified in the communication protocol. No communication with other agents is possible and each agent will only be able to access files in its own directory. 

Each agent will be able to obtain screenshots of the current Angry Birds game state from the server and will be able to submit actions and other commands. The game will be played in SD mode and all screenshots will have a resolution of 840×480 pixels. 
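The wire format is defined by the organisers' communication protocol and is not reproduced here. The sketch below invents a message ID and a response-header layout purely to illustrate how a client might frame a screenshot request and sanity-check the SD resolution; every name and value in it is a hypothetical placeholder, not the actual protocol.

```python
import struct

# Hypothetical message ID -- the real protocol is specified separately
# by the organisers; this value is a placeholder for illustration.
MSG_SCREENSHOT = 11

SCREEN_WIDTH, SCREEN_HEIGHT = 840, 480  # SD mode, per the rules


def encode_screenshot_request():
    """Encode a one-byte screenshot request (hypothetical framing)."""
    return struct.pack("!B", MSG_SCREENSHOT)


def decode_screenshot_header(data):
    """Decode a hypothetical response: width and height as big-endian
    32-bit integers, followed by the raw image payload. Rejects any
    resolution other than the 840x480 SD mode the rules guarantee."""
    width, height = struct.unpack("!II", data[:8])
    if (width, height) != (SCREEN_WIDTH, SCREEN_HEIGHT):
        raise ValueError("unexpected resolution %dx%d" % (width, height))
    return width, height, data[8:]
```

Because the client computer can only talk to the game server, all game-state information an agent uses has to come through messages like these; a real agent would send such requests over the socket connection described in the protocol documentation.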

Agents that attempt to tamper with the competition setting or to gain an unfair advantage will be disqualified.  

 

Game objects: 

The following objects will be used for the competition levels:

  1. All objects, background, terrain, etc. that occur in the first 21 Poached Eggs levels on chrome.angrybirds.com.
  2. In addition, the competition levels may include the white bird, the black bird, and TNT boxes.

No other objects will be used. The vision module of our example code recognises all objects, including all birds and pigs, but not the background or the terrain.   

 

Qualification (August 6-7): 

Qualification versions of the game playing agents need to be submitted to the organisers by August 5, 1pm Beijing time, via email. The email address will be announced to all participants. The number of levels we use in the qualification round depends on the number of participants. The levels used in the competition will not be known in advance to any participant. Throughout the competition, each game level can be accessed, played, and replayed in arbitrary order. Participants will have a total time to solve the competition levels that corresponds to a minimum of three minutes per game level on average; for example, for 10 levels there will be a minimum time of 30 minutes to solve the 10 levels. After the overall time limit is reached, the connection of agents with the game server will be terminated. Agents then have up to two minutes to store information and to stop running. After two minutes the organisers will terminate any agents that are still running. 

The qualification will be played in two rounds. We will first run the Sample Agent on all qualification game levels for the total time and record its high scores. 

Qualification Round 1: 

During the first qualification round, all participants will have half the total time to solve the qualification game levels, i.e., a minimum of 1.5 minutes per level on average. The order in which the agents will run is random and agents may be run in parallel. For comparison, all agents can obtain the high score per level achieved by the Sample Agent. 

We will have a dynamically updated leader board where all agents are ranked according to their total score. 

Qualification Round 2: 

In the second qualification round, all agents will have the second half of their total time to solve the same qualification game levels as in round one, i.e., a minimum of 1.5 minutes per level on average. For comparison, all agents can obtain the overall high score per level from qualification round 1, that is, the highest score per level obtained by any agent in round 1. This information can be used by agents, for example, to determine the game levels where they can obtain the highest improvements over their round 1 performance. Agents cannot be modified between round 1 and round 2. 

The order in which we run the agents will depend on the outcome of qualification round 1, starting with the worst performing agent first and the best performing agent last. A certain number of agents will be run in parallel. 

We will have a dynamically updated leader board where all agents are ranked according to their total score. The overall score per agent is the sum of their individual high scores per level over qualification rounds 1 and 2. An agent's overall score can only improve in qualification round 2; it can never decrease. 
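The scoring rule above can be sketched as follows; the level IDs and scores are illustrative, not taken from any actual competition run.

```python
def overall_score(round1_scores, round2_scores):
    """Overall score: for each level, take the higher of the two
    round high scores, then sum over all levels. A level an agent
    never solved in a round counts as zero for that round."""
    levels = set(round1_scores) | set(round2_scores)
    return sum(max(round1_scores.get(lv, 0), round2_scores.get(lv, 0))
               for lv in levels)


# Illustrative per-level high scores for one agent.
round1 = {1: 30000, 2: 45000}
round2 = {1: 52000, 2: 40000}
total = overall_score(round1, round2)  # 52000 + 45000 = 97000
```

Because each level contributes its best score across both rounds, replaying a level in round 2 can only raise or preserve the total, which is exactly why the overall score never decreases.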

The best 16 agents after qualification round 2 will qualify for the finals; this may include the Sample Agent. Agents can be modified and resubmitted for the finals by 8am on August 8. The email address to which the executable files should be sent will be made available to the finalists only. If no new version has been submitted, we will use the last version we received.

 

Finals (August 8):

Participants will have a minimum of 3 minutes per game level on average. Agents cannot be modified during the finals. The finals consist of two group stages that determine the four best agents. 

Group Stage 1, Round of 16:

The first stage of the finals is a group stage where the 16 finalists will be divided into four groups according to their qualification ranking. The top four agents will each be randomly assigned to a different group, then the next four, then the next four, and finally the remaining four.  
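A minimal sketch of this seeded assignment, assuming the agents are given in qualification-rank order (the function name and structure are illustrative, not part of the rules):

```python
import random


def seed_groups(ranked_agents, num_groups=4):
    """Assign agents to groups in blocks: each block of num_groups
    consecutive agents (by qualification rank) is shuffled and then
    dealt one agent to each group, so every group receives exactly
    one agent from each seeding block."""
    groups = [[] for _ in range(num_groups)]
    for start in range(0, len(ranked_agents), num_groups):
        block = list(ranked_agents[start:start + num_groups])
        random.shuffle(block)  # random assignment within the block
        for group, agent in zip(groups, block):
            group.append(agent)
    return groups


# Agents 1..16 in qualification-rank order.
groups = seed_groups(list(range(1, 17)))
```

The shuffle makes the assignment random, while dealing one agent per block to each group guarantees the seeding property: no group ends up with, say, two of the top four qualifiers.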

The four teams of each group will play in parallel for thirty minutes on the same round-of-16 game levels. Any team can query the current group high score for each game level. The two best teams of each group will qualify for the round of 8. 

Group Stage 2, Round of 8:

The second stage of the finals is a group stage where the 8 remaining agents will be assigned to two groups of four. The four stage 1 group winners will be randomly assigned to the two groups, so that each group contains two winners. The four second-placed agents will be assigned to the groups opposite their stage 1 winners, so that no two agents play in the same group twice. 

The four teams of each group will play in parallel for thirty minutes on the same round-of-8 game levels. Any team can query the current group high score for each level. The two best teams of each group will qualify for the semi-finals. 

Semi finals: 

The winners of the two round-of-8 groups will each be paired with the second-placed agent of the other group. Each semi-final match will last thirty minutes, and both matches will use the same semi-final game levels. In each match, the teams can query the current high score per level of their match. The winners of the semi-finals qualify for the grand final; the losers go to the third place match. 

Third place match:

The third place match will be over thirty minutes. Both teams can query the current high score per level of their match. The winner of this match will win the third prize of the competition. 

Grand Final match:

The grand final match will be over thirty minutes. Both teams can query the current high score per level of their match. The winner of this match will be the 2013 Angry Birds AI Champion and winner of the first prize, the loser will win the second prize of the competition. 

The two grand finalists qualify for the Man vs Machine Challenge.

 

Man vs Machine Challenge (August 9):

During the Man vs Machine Challenge we will test whether AI agents can already beat humans at playing Angry Birds. We will have a selection of game levels with three minutes per game level on average. Each game level can be accessed, played, and replayed in arbitrary order. Participating human players will play the game levels first; each player can participate only once. We will keep a leader board where human players are ranked according to their overall score (the sum of their individual high scores per level).

After the human players are finished, we will run the two best AI agents in parallel on the same game levels with the same time limit. 

The player with the highest overall score, man or machine, will win this challenge. 

 

Note: In case we have an unexpectedly large or small number of participants, the way we run the competition may be modified. For example, we may divide the participating agents into two qualification groups and then only the best eight agents per group qualify for the finals. 

In case of conflicting opinions on how to interpret the competition rules, the organisers' decisions are final.