2017 Competition Rules
In 2017 we will only have the standard competition track. If you are interested in the competitive track we ran in past years, please contact the organisers.
Summary of Important Rules (Standard Track)
- Attending the IJCAI 2017 conference is not required for participating in the competition. Remote participation is possible, provided that participants have tested in advance that their game playing agents run on our competition environment. However, all participants who attend the competition will need to register for IJCAI 2017.
- Participating agents can be developed in Java, C/C++ or Python; we strongly recommend using Java. Any additional requirements must be agreed upon with the organisers and tested well in advance.
- Competition versions of the game playing agents need to be submitted to the organisers by August 23, 10am Melbourne time. This deadline is final. If no new version has been submitted, we will use the last version we have received.
- We will be using the Chrome version of Angry Birds for our competition in SD mode. If you have trouble running this version of Angry Birds, please check here for a possible solution (requires login).
- Agents will not have access to the internet and will not be able to communicate other than with the server via the specified communication protocol.
- Each person can be affiliated with at most two participating agents per track unless permission has been obtained from the organisers. This restriction does not apply to supervisors of student teams.
The competition is open to all registered participants. Participants need to ensure that their agent is compatible with the competition framework, which will be made available for testing. Competition agents must be submitted by August 23, 2017, 10am (Melbourne time) by email, accompanied by a team description of the methods and strategies used by the agent. The email address will be announced to all participants.
Each person can be a member of at most two teams per track, unless permission has been obtained from the organisers. This restriction does not apply to academic supervisors of student teams.
Remote participation is possible, but attendance is encouraged.
Game Play Environment:
We will be using the Chrome version of Angry Birds in SD mode for our competition. If you have trouble running this version of Angry Birds, please check here for a possible solution (requires login). For each agent, the corresponding Angry Birds game instance will be run on a game server, while the agent will be executed on a client computer. We will support Java, C/C++ and Python agents and can run the agents under either Windows or Linux. Participants should contact the organisers if they would like to use other programming languages or non-standard software libraries.
Each agent will receive a total of 250MB of local disk space on the client computer to store information. This includes the space required for the agent code. Client computers have no access to the internet and can only communicate with the game server as specified in the communication protocol (see ServerClientProtocols.pdf that is part of the game playing software package). No communication with other agents is possible and each agent will only be able to access files in its own directory.
Each agent will be able to obtain screenshots of the current Angry Birds game state from the server and will be able to submit actions and other commands. The game will be played in SD mode and all screenshots will have a resolution of 840×480 pixels.
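As a rough illustration (the class and method names below are our own, not part of the competition framework, and the actual byte-level message format is defined in ServerClientProtocols.pdf), an agent might sanity-check that a screenshot received from the server has the documented SD resolution before running vision on it:

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class ScreenshotCheck {
    // Expected SD-mode screenshot resolution per the competition rules.
    static final int WIDTH = 840, HEIGHT = 480;

    // Decode an image byte array (as it might arrive over the protocol)
    // and verify that it matches the documented resolution.
    static boolean hasExpectedResolution(byte[] imageBytes) throws IOException {
        BufferedImage img = ImageIO.read(new ByteArrayInputStream(imageBytes));
        return img != null && img.getWidth() == WIDTH && img.getHeight() == HEIGHT;
    }

    public static void main(String[] args) throws IOException {
        // Demo with a locally generated blank frame of the right size.
        BufferedImage frame = new BufferedImage(WIDTH, HEIGHT, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        ImageIO.write(frame, "png", buf);
        System.out.println(hasExpectedResolution(buf.toByteArray())); // true
    }
}
```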
Agents that attempt to tamper with the competition setup or to gain an unfair advantage will be disqualified.
The following objects will be used for the competition levels:
- All objects, background, terrain, etc. that occur in the first 21 Poached Eggs levels on chrome.angrybirds.com.
- In addition, the competition levels may include the white bird, the black bird, TNT boxes, triangular blocks and hollow blocks (triangles and squares).
No other objects will be used. The vision module of the provided game playing software recognises all relevant game objects, including all birds, pigs and the terrain, but not the background. All competition levels will use the same background that occurs in the first 21 Poached Eggs levels.
Initial Agent Ranking (August 23, 2017):
In order to make a fair assignment of agents to groups in the first competition round, we will run all agents on a set of game levels from the 2016 finals. The number of levels we use in the initial agent ranking depends on the number of participants. Each agent has the same time available, which will be a minimum of 3 minutes per level on average. Each game level can be accessed, played and replayed in arbitrary order. Agents will be run in groups of four, as described below. The overall score is determined as the sum of the agent's individual high scores for all solved levels.
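The scoring and time rules above are simple to state precisely. The following sketch (class and method names are our own, not part of the competition framework) computes the overall score as the sum of per-level high scores and the minimum total time budget of three minutes per level:

```java
import java.util.Map;

public class Scoring {
    // Overall score = sum of the agent's individual high scores
    // over all solved levels (unsolved levels contribute nothing).
    static int overallScore(Map<Integer, Integer> highScorePerLevel) {
        return highScorePerLevel.values().stream().mapToInt(Integer::intValue).sum();
    }

    // Minimum total time budget: at least 3 minutes per level on average.
    static int minTotalMinutes(int numLevels) {
        return 3 * numLevels;
    }

    public static void main(String[] args) {
        // Hypothetical high scores for levels 1 and 2.
        Map<Integer, Integer> scores = Map.of(1, 32000, 2, 45500);
        System.out.println(overallScore(scores));  // 77500
        System.out.println(minTotalMinutes(10));   // 30
    }
}
```

The 30-minute figure for 10 levels matches the example given in the main competition rules below; the per-level scores here are invented for illustration only.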
The 16 best agents will qualify for the main competition on August 24.
Main AI Competition (August 24, 2017):
The competition consists of four stages. For each stage we will have a dynamically updated leader board where all agents are ranked according to their total score.
The levels used in the competition will not be known in advance to any participant. Throughout the competition, each game level can be accessed, played and replayed in arbitrary order. Agents will have a total time to solve the competition levels that corresponds to a minimum of three minutes per game level on average. For example, for 10 levels there will be a minimum of 30 minutes to solve all 10 levels. After the overall time limit is reached, the agents' connections to the game server will be terminated. Agents then have up to two minutes to store information and to stop running; after two minutes the organisers will terminate any agents that are still running. Agents cannot be modified during the competition.
Group Stage 1, Round of 16:
The first stage of the competition is a group stage where up to 16 agents will be assigned to four groups of four according to the initial agent ranking. The four highest ranked agents will be assigned to four different groups, agents ranked 5-8 will be assigned to four different groups, and so on.
The four teams of each group will play in parallel for the assigned duration on the same round of 16 game levels. Any team can query the current group high score for each level (but not the high scores of other groups). The two best teams of each group will qualify for the quarter final.
Group Stage 2, Round of 8 (Quarter Final):
The second stage of the competition is a group stage where the 8 qualifying agents will be assigned to two groups of four according to the overall score they achieved in the round of 16. Agents ranked 1, 4, 5, 8 will play in group 1, agents ranked 2, 3, 6, 7 will play in group 2.
The four teams of each group will play in parallel for the assigned duration on the same round of 8 game levels. Any team can query the current group high score for each level (but not the high scores of other groups). The two best teams of each group will qualify for the semi final.
Group Stage 3, Round of 4 (Semi Final):
The four qualifying teams will play in parallel for thirty minutes on the same round of 4 game levels. Any team can query the current high score for each level. The two best teams will qualify for the Grand Final; the third- and fourth-placed agents will receive the third and fourth prizes of the competition.
Grand Final match:
The Grand Final match will be played over thirty minutes. Both teams can query the current high score per level of their match. The winner of this match will be the 2017 AIBIRDS Champion and winner of the first prize; the runner-up will receive the second prize of the competition.
The four semi-finalists qualify for the Man vs Machine Challenge.
Man vs Machine Challenge (August 25, 2017):
During the Man vs Machine Challenge we will test whether the best AI agents can already beat humans at playing Angry Birds. We will have a selection of game levels with three minutes per game level on average. Each game level can be accessed, played and replayed in arbitrary order. Participating human players will play the game levels first. Each player can participate only once. We will keep a leader board where we rank the human players according to their overall score (sum of individual high scores per level).
After the human players are finished, we will run the four best AI agents in parallel on the same game levels with the same time limit.
The player with the highest overall score, man or machine, will win this challenge.
In case of conflicting opinions on how to interpret the competition rules, the organisers' decisions are final.