Lab 19 - Autonomous Rescue Challenge: Participation in Automated Competitions
Maintained by: mikerb@mit.edu
src: project-pavlab/labdocs/lab_class_19_auto_comp
1 Overview and Objectives
2 The Short List of Steps to Participate
2.1 Prepare your Code
2.2 Make it Available From Git
2.3 Note the Competition Assumptions
3 How the Competition is Run
4 How the Results will be Disseminated
4.1 Win-Loss Results
4.2 Competition Data Set Logs
1 Overview and Objectives
In 2025 we begin to support automated simulated competitions. These competitions are held offline, with no GUI, on a set of servers. The goal is to run as many head-to-head team pairings as possible, with each competition starting from a randomly generated swim file. With 20 students, there are 190 possible unique head-to-head pairings (n(n-1)/2).

To start with, we will focus on the 1-1 rescue-rescue mission of Lab 15. An entry will be a 2.680 team implementation of their pGenRescue app, found in their GitHub repo.

To gain better insight into the performance of any team's algorithm, we compare it head-to-head against all other teams, and do so for as many unique starting conditions as possible. Each competition will have a randomly generated swim file and randomly generated starting positions.
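As a quick sanity check on the pairing count, the n(n-1)/2 formula can be evaluated in a couple of lines of shell:

```shell
# Unique head-to-head pairings among n teams: n*(n-1)/2
n=20
echo $(( n * (n - 1) / 2 ))    # prints 190
```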

2 The Short List of Steps to Participate
Joining the competition should be a short (or zero-step) process, described below. In fact, for about half the students the process is already running and generating results.

2.1 Prepare your Code
The competition structure pulls your GitHub moos-ivp-extend repo and builds your codebase. Most of you are already pushing regularly to GitHub. Your entry consists of your pGenRescue app. Make sure that this app is compiling and running properly. In particular, do not change the MOOS variable (SURVEY_UPDATE) published by pGenRescue, which serves as the updates parameter for the waypt_survey behavior.


2.2 Make it Available From Git
The 2.680 student GitHub repos have been accessible to 2.680 staff since the early weeks of the course (thank you!). However, each repo is accessible to a particular person because (a) you made that person a collaborator on your moos-ivp-extend tree, and (b) that person is credentialed on their machine as that person. So things work fine if the test harness is running on that one machine, by that one person.

We would like to run our competitions on a cluster of PABLOs, like the ones handed out in 2.680. These machines all have the same student2680 user account. This "user" is not credentialed on GitHub and is not a collaborator on anyone's repo. But we would like uncredentialed (anonymous) read-only access to your repos on our staff test machines. To do this, we have placed one (private) half of an ssh-key pair on our test PABLO machines, and we need you to add the other (public) half of the ssh-key pair as a "deploy key" for your moos-ivp-extend tree. The public half of this key was provided in the Piazza post. Please let us know when you have done this.


2.3 Note the Competition Assumptions
Note that the competition harness will only pull your repo to build and retain your version of pGenRescue. The competition will be run from the current version of the lab_14_rescue_baseline mission cloned onto our test machine(s). Any other modifications to the mission in your own local test setup will not be carried over to our test machine(s). In particular, a custom parameter block for your pGenRescue will not be used in the competition. To account for this, we recommend that your pGenRescue have default parameter settings in the code (e.g., in the constructor), set the way you wish for competing.


3 How the Competition is Run
The info below is FYI. We will likely expound on this as this phase of 2.680 is further developed.

The competitions are designed to be run on a set of one or more test machines. Our compute cluster has 100+ nodes, but even a single powerful laptop can run 500+ missions overnight. There are two phases run on each machine, a setup phase and a competition phase:

The setup phase:

- All student git repos are cloned and built on the test machine.
- A file is generated, one line per student entry, containing the full path name of the pGenRescue app.
- This file is placed in the local copy of lab_14_rescue_baseline mission folder.
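A minimal sketch of the file-generation step, assuming the harness locates each built binary with find. The directory names below are hypothetical stand-ins for the cloned student trees, not the actual harness layout:

```shell
# Hypothetical stand-in for the cloned student repos.
mkdir -p /tmp/demo2680/alice/bin /tmp/demo2680/bob/bin
touch /tmp/demo2680/alice/bin/pGenRescue /tmp/demo2680/bob/bin/pGenRescue

# One line per student entry: the full path of that entry's pGenRescue app.
find /tmp/demo2680 -type f -name pGenRescue | sort > entries.txt
cat entries.txt    # one full path per line
```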


The competition phase:

- A script is run to launch a randomly configured competition, repeated N times
- Each mission/competition is seeded with a randomly generated swim file, random starting positions, and a random choice of two pGenRescue entries from the file generated in the setup phase.
- The mission is launched and automatically exits when the competition is completed. An app called pMissionEval will generate a single results line in a file called results.txt in the mission folder.
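The random-pairing step above can be sketched as follows. The file names, paths, and the use of shuf are assumptions for illustration, not the actual harness scripts:

```shell
# Hypothetical entries file from the setup phase.
printf '%s\n' /repos/alice/pGenRescue \
              /repos/bob/pGenRescue   \
              /repos/carol/pGenRescue > entries.txt

# Draw two distinct entries at random for one head-to-head competition.
shuf -n 2 entries.txt
# Next: generate a random swim file and start positions, launch the
# mission headless, and let pMissionEval append one line to results.txt.
```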


The post-competition phase:

- If multiple test machines are in use, the entries of the results.txt files are merged into a single file.
- A separate app parses the results file and generates a set of web pages linked from the 2.680 website.
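The merge step itself can be as simple as concatenation. The file names and the results-line format below are assumptions, not the actual harness conventions:

```shell
# Hypothetical per-machine results files.
printf 'hash=250428-2036H, teamA beat teamB\n' > results_machine1.txt
printf 'hash=250428-2041H, teamC beat teamD\n' > results_machine2.txt

# Merge into a single file for the web-page generator.
cat results_machine*.txt > results_all.txt
wc -l < results_all.txt    # two merged lines
```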


4 How the Results will be Disseminated

4.1 Win-Loss Results
The competition results will be posted and linked from the 2.680 lab page, alongside the other Rescue Lab links. We anticipate this will be available by May 2nd, 2025. There will be three results levels:

- Your sorted/ranked overall W-L record relative to other competitors
- Your sorted/ranked W-L record per competitor
- Your per-competition results, selectable by competitor

This starting page will also be available through the link:

https://oceanai.mit.edu/2680/compete25


4.2 Competition Data Set Logs
In the last level, each competition will have a mission hash. The goal is to ensure that the alog files for each competition are archived on the class server, and that you will be able to run:


$ alogview 250428-2036H-WIDE-STOP


When invoked with a single mission hash, alogview will interpret this as a request to the class server to fetch the archived data, uncompress it, and load the alog files for the shoreside and vehicles for viewing in alogview. This is not yet fully implemented. 50-50 chance it will be finished before the end of the class.


Page built from LaTeX source using texwiki, developed at MIT. Errata to issues@moos-ivp.org.