Competitive Track Software

In past competitions, we ran a Competitive Track where two agents compete by playing the same level with alternating shots. The agent who scores the winning shot gets all the points of the level. Since participation was low, we decided not to offer the competitive track in 2017. Here you can still download the required software. If you are interested in participating in a competitive track competition, please contact the organisers. If we get enough interest, we will offer it again.


Here is how it works:

Every agent plays the same set of levels pairwise against all other agents. Agents are paired in random order. The agent with the most points overall wins. Before playing a level, both agents make a concealed bid of points. The agent with the higher bid makes the first shot; an agent that wants to make the second shot can bid a negative amount. If an agent gets the shot it bid for and ends up winning the level, it receives all the points of the level but has to pay the absolute amount of its bid to the other agent. 

For example, agent A bids +12000 points on level 1 and agent B bids +10000 points. Agent A makes the first shot, agent B the second shot, agent A the third shot and so on, until a level is won or lost. If agent A scores the winning shot and the score is 28000 points, then agent A gets 28000 points, but has to give 12000 points to agent B. If agent B wins, it can keep all points. 

If agent C bids -10000 points on level 1 and agent D bids +18000 points, then agent D makes the first shot and agent C the second shot. If agent C scores the winning shot worth 27000 points, then agent C gets 27000 points, but has to pay 10000 points to agent D, since agent C got to make the second shot it was bidding for. If agent D scores the winning shot, it gets 27000 points, but has to pay 18000 points to agent C, since agent D got to make the first shot it was bidding for. 
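To make the payout rules concrete, here is a minimal sketch in Java of how the bid resolution and settlement described above could work. The class and method names (BidResolver, agentAShootsFirst, settle) are our own illustration and are not part of the official competition software; tie-breaking on equal bids is not specified in the description and is chosen arbitrarily here.

    // Minimal sketch of the bidding and payout rules described above.
    // Class and method names are illustrative only.
    public class BidResolver {

        // The agent with the higher bid makes the first shot.
        // Tie-breaking is not specified; here agent A arbitrarily
        // shoots first on equal bids.
        public static boolean agentAShootsFirst(int bidA, int bidB) {
            return bidA >= bidB;
        }

        // Returns {points the winner keeps, payment owed to the loser}.
        // A positive bid is a bid for the first shot, a negative bid a bid
        // for the second shot. The winner only pays if it got the shot it
        // bid for; otherwise it keeps the full level score.
        public static int[] settle(int winnerBid, boolean winnerShotFirst, int levelScore) {
            boolean gotShotItBidFor =
                    (winnerBid >= 0 && winnerShotFirst) || (winnerBid < 0 && !winnerShotFirst);
            int payment = gotShotItBidFor ? Math.abs(winnerBid) : 0;
            return new int[] { levelScore - payment, payment };
        }

        public static void main(String[] args) {
            // Example from the text: A bids +12000, B bids +10000, A wins 28000.
            boolean aFirst = agentAShootsFirst(12000, 10000);           // true
            int[] result = settle(12000, aFirst, 28000);
            System.out.println("A keeps " + result[0] + ", pays B " + result[1]);
            // Prints: A keeps 16000, pays B 12000
        }
    }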


The idea of this track is that agents analyse levels thoroughly enough to determine in advance how a level can be solved, how many shots that will take, and how many points it will be worth. An agent can then decide what to bid and which shot to make on its turn. Of course, it is entirely possible that the track is won by an agent that does none of this well but has some other cunning strategy. 

We have prepared a demo agent that participants can use and modify, as well as game server software that collects bids, determines the order of play, makes sure that agents play in alternating order, and randomly pairs agents. The demo agent demonstrates how to use the new communication protocol and its commands, which we designed for this track. It does not use any particular strategy; that is up to the participants to design. The software can be downloaded here. The package includes additional documentation about the communication protocol and the new track. We have also prepared a Getting Started page that describes how to use the demo agent and the game server. Further important information is given in the 2016 Competition Rules.
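As an illustration of the scheduling the game server performs, the following Java sketch builds all pairwise match-ups and shuffles their order. This is only our sketch of the idea described above, not the actual game server code; the class and method names are hypothetical.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.Collections;
    import java.util.List;

    // Sketch of the pairing scheme: every agent plays every other agent,
    // and the pairings are processed in random order. Illustrative only.
    public class PairingSketch {

        public static List<String[]> randomPairings(List<String> agents) {
            List<String[]> pairs = new ArrayList<>();
            for (int i = 0; i < agents.size(); i++) {
                for (int j = i + 1; j < agents.size(); j++) {
                    pairs.add(new String[] { agents.get(i), agents.get(j) });
                }
            }
            Collections.shuffle(pairs);   // agents are paired in random order
            return pairs;
        }

        public static void main(String[] args) {
            List<String> agents = Arrays.asList("AgentA", "AgentB", "AgentC");
            for (String[] pair : randomPairings(agents)) {
                System.out.println(pair[0] + " vs " + pair[1]);
            }
        }
    }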


Some things are important to know:

  • Agents can only interact with the game when it is their turn. For example, they cannot get screenshots from when the other agent is playing. 
  • Bids are not revealed, but agents can see how many points other agents received for each level. From these points alone it is not clear whether an agent received the bid amount or the winning score minus the bid, nor whether the underlying bid was positive or negative. Even so, analysing the points the next opponent has received can be useful for estimating how that opponent bids and plays, and for choosing a strategy against it. 
  • Agents can forfeit their shot. In that case the game server shoots the bird somewhere it causes no damage, but the value of the bird (10000 points) is lost. 
  • The demo agent is based on the Naive agent of the Standard Track and uses the standard computer vision and trajectory planning modules. Participants could also use one of the best agents from past competitions as the basis for their competitive agent; the source code of some of the best agents is available here. 
  • Registration for the new track is in addition to the Standard Track registration; agents can also choose to participate only in the new track. 


Please cite the software as: XiaoYu Ge, Stephen Gould, Jochen Renz, Sahan Abeyasinghe, Jim Keys, Andrew Wang, Rohan Varma, Peng Zhang, AIBIRDS Competitive Track Software Version 1, aibirds.org, 2015. 

This software is released under the GNU Affero General Public License as published by the Free Software Foundation. 

Copyright © 2015 XiaoYu (Gary) Ge, Stephen Gould, Jochen Renz, Sahan Abeyasinghe, Jim Keys, Rohan Varma, Andrew Wang, Peng Zhang. All rights reserved.