First semester project for Decision Systems Course 2022/2023

This is the first project for students enrolled in the Decision Systems 2022/2023 course at the Faculty of Mathematics, Informatics and Mechanics at the University of Warsaw.

Overview

The goal of this competition is to construct a scoring model for predicting winners in the Starcraft video game (https://starcraft.com/). 

More detailed competition rules are given in Terms and Conditions.

The description of the data and evaluation metric is in the Task description section. 

The deadline for sending submissions and reports is December 9, 2022.

Leaderboard

Rank  Team Name                   Is Report  Preliminary Score  Final Score  Submissions
   1  Huta Katowice               True       0.6988             0.685700     20
   2  drużyna1                    True       0.6878             0.685500     24
   3  O.W.C.A.                    True       0.6786             0.684200     57
   4  ff                          True       0.6922             0.681600     14
   5  bochniarz_pasnikowski       True       0.6749             0.681200     5
   6  rzutowanie                  True       0.6876             0.681200     32
   7  CakeTeam                    True       0.6670             0.679300     52
   8  x                           True       0.6975             0.679000     17
   9  Łukasz_Niedźwiedzki_419328  True       0.6788             0.678000     7
  10  Epoch_Batch                 True       0.6862             0.677600     22
  11  quozz                       True       0.6963             0.677200     41
  12  baseline                    True       0.6558             0.674800     1
  13  OA                          True       0.6972             0.673800     30
  14  ZERG RUSH                   True       0.6883             0.669900     36
  15  Nie_umiem_w_ML              True       0.6691             0.669500     9
  16  Adam i Łukasz               True       0.6820             0.666400     19
  17  zsw                         True       0.6762             0.665300     16
  18  Team name                   True       0.6757             0.660100     5

Task description

The provided data describe fragments of Starcraft games played between two players on standard 1v1 maps. Each row in the data files describes approximately one minute of a game. The first column in the training data contains game IDs, and the second column is the decision, i.e., information about the game winner. The third column indicates the game type, i.e., which of the three in-game races are fighting. The remaining columns contain characteristics describing the situation on the game map; their meaning is reflected by their names.

The data tables are provided as two CSV files with the ',' (comma) separator. They can be downloaded after registering for the competition. Both files (the training and test sets) have the same format, but the GameID and decision columns are missing from the test data.
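
For illustration, here is a minimal loading sketch in Python with pandas. The file names train.csv and test.csv are placeholders (use the actual names of the downloaded files); the column split relies only on the column order described above.

```python
import pandas as pd

# Assumed file names; substitute the actual names of the downloaded files.
train = pd.read_csv("train.csv", sep=",")
test = pd.read_csv("test.csv", sep=",")

# In the training table: column 1 is the game ID, column 2 is the decision (winner),
# column 3 is the game type, and the remaining columns are game-state characteristics.
game_ids = train.iloc[:, 0]
y = train.iloc[:, 1]
X = train.iloc[:, 2:]

# The test table has the same format, minus the GameID and decision columns.
X_test = test
print(X.shape, X_test.shape)
```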

The evaluation metric will be AUC. During the competition, your solutions will be evaluated on a small fraction of the test set (10%), and your best preliminary AUC score will be displayed on the public Leaderboard. The final score of each team will be computed on the remaining 90% of the test data.
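
Before submitting, the preliminary score can be approximated locally by computing AUC on a held-out part of the training data. The sketch below uses scikit-learn with a generic gradient-boosting classifier and assumes the decision column takes the value "A" for one of the two classes; neither the model nor the split is prescribed by the competition.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# X, y as loaded in the previous sketch; one-hot encode the categorical game-type column.
X_enc = pd.get_dummies(X)

X_tr, X_val, y_tr, y_val = train_test_split(
    X_enc, y, test_size=0.2, random_state=0, stratify=y
)

model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# Treat decision value "A" (assumed label) as the positive class and score it with AUC.
a_index = list(model.classes_).index("A")
val_scores = model.predict_proba(X_val)[:, a_index]
print("validation AUC:", roc_auc_score((y_val == "A").astype(int), val_scores))
```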

The submission format: solutions need to be submitted as text files with predictions. A submission file should have exactly the same number of rows as the test data table. Each row should contain exactly one real number expressing the likelihood that the correct decision value for the corresponding test set instance is A. The deadline for sending submissions and reports is December 9, 2022.
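
Continuing the sketch above, a submission file could be written as follows. The file name submission.txt and the column-alignment step are assumptions; the only requirements stated here are one real number per row and the same row order as in the test table.

```python
import numpy as np
import pandas as pd

# Encode the test features the same way as the training features, keeping the row order.
X_test_enc = pd.get_dummies(X_test).reindex(columns=X_enc.columns, fill_value=0)

# Likelihood that the decision for each test row is "A" (assumed positive label).
test_scores = model.predict_proba(X_test_enc)[:, a_index]
assert len(test_scores) == len(X_test)

# One real number per line, one line per test row.
np.savetxt("submission.txt", test_scores, fmt="%.6f")
```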

In order to download competition files you need to be enrolled.
This forum is for all users to discuss matters related to the competition. Good manners apply!
There are no topics in this competition.