
2nd Combinatorial Black-Box Optimization Competition (CBBOC)

July 20-24, 2016 @ GECCO 2016 in Denver, Colorado, U.S.A.

Description

This competition is designed to provide the GECCO community with detailed performance comparisons of a wide variety of meta-heuristics and hyper-heuristics on combinatorial problems. The real-world problems that induce these combinatorial problems are categorized by the training time available: no training time (a good fit for parameter-less algorithms), short training time (a good fit for typical evolutionary algorithms), and long training time (a good fit for hyper-heuristics). Training and testing time is measured in number of fitness evaluations, although wall-clock time will be used to time out algorithms that take infeasibly long to complete. Competitors choose which category or categories they want to submit to. While trained differently, all three categories will be compared on instances drawn from the same test set. This can create a Pareto set of winners, maximizing solution quality while minimizing training time, with at most three non-dominated points.

The competition problems will be randomly generated by a meta-class based on Mk-Landscapes, which can represent all NK-Landscapes, Ising Spin Glasses, MAX-kSAT, Concatenated Traps, etc. (this is a generalization of the NK-Landscapes meta-class employed in the GECCO 2015 CBBOC). A light-weight API is now available for C++, Java, C#, and Python; instructions for using the API are available here. Competitors who require other languages are encouraged to contact the competition organizers. Competitors are encouraged, though not required, to allow the source code of their competing algorithms to be made available on the competition website. Also, competitors are encouraged, but not required, to attend GECCO 2016.
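To make the Mk-Landscape meta-class concrete, here is a minimal illustrative sketch (our own, not the competition's problem generator; the function names and random lookup-table construction are assumptions): the fitness of a length-n bit string is the sum of m subfunctions, each depending on at most k bit positions.

    import random

    # Illustrative Mk-Landscape-style evaluation (a sketch, not the official
    # generator): fitness of a length-n bit string is the sum of m
    # subfunctions, each depending on at most k bit positions.

    def random_mk_landscape(n, m, k, rng=None):
        """Build m subfunctions, each over k randomly chosen positions,
        with lookup-table values drawn uniformly from [0, 1)."""
        rng = rng or random.Random(0)
        subfunctions = []
        for _ in range(m):
            positions = rng.sample(range(n), k)
            table = [rng.random() for _ in range(2 ** k)]
            subfunctions.append((positions, table))
        return subfunctions

    def evaluate(bits, subfunctions):
        """Fitness = sum of each subfunction applied to its own bits."""
        total = 0.0
        for positions, table in subfunctions:
            index = 0
            for p in positions:
                index = (index << 1) | bits[p]
            total += table[index]
        return total

    # Example: a 20-bit problem with 15 subfunctions of 3 bits each.
    landscape = random_mk_landscape(n=20, m=15, k=3)
    solution = [random.randint(0, 1) for _ in range(20)]
    print(evaluate(solution, landscape))

Special cases of this structure recover the named problem families, e.g. adjacent or random k-bit neighborhoods give NK-Landscapes, and clause-like subfunctions give MAX-kSAT.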

Competition Details

After all submissions have been made, the competition organizers will use ProblemClassGenerator.py to generate 20 brand new problem classes. Each submission will then be run against each instance. During the training phase, competitors are provided with the training instances and may perform a predetermined total number of evaluations, divided however they see fit; the training categories differ only in how many evaluations are allowed in this phase. After training is complete, the submission is applied to each testing instance serially, with a specific number of evaluations per instance. This is designed to mimic real-world applications in which initial offline tuning can be performed flexibly, while the actual optimization must be done in a specified time window.
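As an illustration of the two separate budgets described above, the following sketch shows a simple bit-flip hill climber that splits a total training budget evenly across training instances and then spends a fixed per-instance budget at test time. The helper names are assumptions and this is not the official CBBOC API; it reuses the evaluate() function from the Mk-Landscape sketch in the Description.

    import random

    # Sketch of how a submission might respect the training and testing
    # evaluation budgets (assumed helper names; not the official CBBOC API).
    # Relies on evaluate() from the Mk-Landscape sketch above.

    def hill_climb(bits, landscape, budget, rng):
        """Bit-flip hill climber that stops after `budget` evaluations."""
        best = evaluate(bits, landscape)
        used = 1
        while used < budget:
            i = rng.randrange(len(bits))
            bits[i] ^= 1
            candidate = evaluate(bits, landscape)
            used += 1
            if candidate >= best:
                best = candidate
            else:
                bits[i] ^= 1  # revert the flip
        return bits, best

    def train(training_instances, total_budget, rng=random.Random(1)):
        """Training phase: divide the total budget evenly across the
        training instances (any split is allowed; even is just simplest)."""
        per_instance = total_budget // max(1, len(training_instances))
        for landscape, n in training_instances:
            start = [rng.randint(0, 1) for _ in range(n)]
            hill_climb(start, landscape, per_instance, rng)

    def test(landscape, n, per_instance_budget, rng=random.Random(2)):
        """Testing phase: one instance at a time, each with its own budget."""
        start = [rng.randint(0, 1) for _ in range(n)]
        return hill_climb(start, landscape, per_instance_budget, rng)

In a real submission the training phase would be used to tune parameters or select heuristics for the test phase; here it simply consumes its budget to show where that division happens.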

Borrowing ideas from voting theory, we will use the Schulze method to rank all submissions and find the Condorcet "Beats All" winner, implemented by the following Python tool: RankEntries.py.
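For readers unfamiliar with it, the Schulze method is straightforward to sketch. The following is an illustrative implementation (not RankEntries.py itself), where each "ballot" is assumed to rank all submissions, e.g. by their result on one test instance.

    # Illustrative Schulze / Condorcet "beats all" computation (not
    # RankEntries.py): each ballot maps a submission name to its rank on
    # one test instance (lower rank = better result).

    def schulze_winners(candidates, ballots):
        # d[a][b]: number of ballots that prefer a over b
        d = {a: {b: 0 for b in candidates} for a in candidates}
        for ballot in ballots:
            for a in candidates:
                for b in candidates:
                    if a != b and ballot[a] < ballot[b]:
                        d[a][b] += 1
        # p[a][b]: strength of the strongest path from a to b
        p = {a: {b: d[a][b] if d[a][b] > d[b][a] else 0 for b in candidates}
             for a in candidates}
        for i in candidates:
            for j in candidates:
                if j == i:
                    continue
                for k in candidates:
                    if k == i or k == j:
                        continue
                    p[j][k] = max(p[j][k], min(p[j][i], p[i][k]))
        # A "beats all" winner is at least as strong against every other
        # candidate as that candidate is against it.
        return [a for a in candidates
                if all(p[a][b] >= p[b][a] for b in candidates if b != a)]

    # Example: three submissions ranked on four test instances.
    candidates = ["A", "B", "C"]
    ballots = [{"A": 1, "B": 2, "C": 3},
               {"A": 1, "B": 3, "C": 2},
               {"B": 1, "A": 2, "C": 3},
               {"C": 1, "A": 2, "B": 3}]
    print(schulze_winners(candidates, ballots))  # -> ['A']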

Previous Competitors

If you competed in CBBOC 2015, then you will need to update your code in order to handle the expanded problem meta-class. The easiest way to do so is either to use "git pull" (if your code is still connected to our repository) or to delete your "src/cbboc" (C#, Python, Java), "include" (C++) and "resource" folders and replace them with the new versions. WARNING: Some files that used to include "2015" in their name have been renamed, so make certain you include the correct versions!

Submission Instructions

The submission deadline is June 25, 2016. Submissions can be from a single competitor or a team of competitors. No competitor may be part of more than three submissions, regardless of whether those are single competitor or team submissions. Until the submission deadline, competitors can replace their submissions with updated ones or entirely withdraw a submission. By default, submitted code will be made available from the competition website after the submission deadline, unless the submission E-mail clearly states that the code may not be distributed. Submissions should include the necessary source code as well as instructions about how to build and run it, and be E-mailed to cbboc.organizers@gmail.com. The submission E-mail should furthermore state the full name of each competitor associated with that submission along with their E-mail address and affiliation. It should also specify which category or categories are being submitted to (no training time, short training time, and long training time).

Contact Info

For all matters related to CBBOC, please contact the organizers at cbboc.organizers@gmail.com.

Organizers


E-mail: arb9z4@mst.edu

Alex R. Bertels is an M.S. student in the Department of Computer Science at the Missouri University of Science and Technology (S&T) and a research assistant in S&T's Natural Computation Laboratory.

E-mail: brianwgoldman@acm.org

Brian W. Goldman is a postdoc at Colorado State University, formerly a postdoc in the Department of Computer Science and Engineering at Michigan State University, and a member of the BEACON Center for the Study of Evolution in Action. He has been a member of the GECCO program committee since 2012, and in 2014 was awarded Best Paper in the Genetic Algorithms track. His research interests focus on how to perform efficient optimization without expert input.

E-mail: jerry.swan@york.ac.uk

Jerry Swan is a Research Fellow at the University of York. Before entering academia, Jerry spent nearly 20 years in industry as a systems architect and software company owner. His research includes meta- and hyper-heuristics, symbolic computation and machine learning. He has published more than 60 papers in international journals and conferences. Jerry has lectured and presented his research worldwide, and has been running international workshops and tutorials on the automated design of metaheuristics since 2011.

E-mail: dtauritz@acm.org

Daniel R. Tauritz is an Associate Professor in the Department of Computer Science at the Missouri University of Science and Technology (S&T), a contract scientist for Sandia National Laboratories, a former Guest Scientist at Los Alamos National Laboratory (LANL), the founding director of S&T's Natural Computation Laboratory, and founding academic director of the LANL/S&T Cyber Security Sciences Institute. He received his Ph.D. in 2002 from Leiden University for Adaptive Information Filtering employing a novel type of evolutionary algorithm. He served previously as GECCO 2010 Late Breaking Papers Chair, GECCO 2012 & 2013 GA Track Co-Chair, GECCO 2015 ECADA Workshop Co-Chair, GECCO 2015 MetaDeeP Workshop Co-Chair, GECCO 2015 Hyper-heuristics Tutorial co-instructor, and GECCO 2015 CBBOC Competition co-organizer. For several years he has served on the GECCO GA track program committee, the Congress on Evolutionary Computation program committee, and a variety of other international conference program committees. His research interests include the design of hyper-heuristics and self-configuring evolutionary algorithms and the application of computational intelligence techniques in cyber security, critical infrastructure protection, and program understanding. He was granted a US patent for an artificially intelligent rule-based system to assist teams in becoming more effective by improving the communication process between team members.