marketdesignresearch/MVNN

Monotone-Value Neural Networks: Exploiting Preference Monotonicity in Combinatorial Assignment

Published at IJCAI 2022

This repository contains the code used to run the prediction performance experiments (Table 1) and the MLCA efficiency experiments (Table 2) of the paper Monotone-Value Neural Networks: Exploiting Preference Monotonicity in Combinatorial Assignment. The algorithms are described in detail in the following paper:

Monotone-Value Neural Networks: Exploiting Preference Monotonicity in Combinatorial Assignment
Jakob Weissteiner, Jakob Heiss, Julien Siems, and Sven Seuken.
In Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence (IJCAI'22), Vienna, Austria, July 2022.
Full paper version including appendix: [pdf]

Note that an updated implementation of the class of monotone-value neural networks (MVNNs) is available here (i.e., mvnns/mvnn.py and mvnns/layers.py), with the following updates:

  • A novel weight-initialization method, specialized for MVNNs, that prevents training from getting stuck, especially for large architectures.
  • A trainable cutoff $t$ for the bReLU activation function.
  • Linear skip connections.
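To make the two core MVNN ingredients concrete, here is a minimal, illustrative sketch in plain Python: non-negative weights combined with the bounded ReLU (bReLU) activation with cutoff $t$. This is not the repository's implementation (see mvnns/layers.py for that); all names here are for exposition only.

```python
# Illustrative sketch only (not the repo's mvnns/layers.py implementation):
# MVNNs combine non-negative weights with the bounded ReLU ("bReLU")
# activation, which together make the network monotone in every input.

def brelu(x, t=1.0):
    """Bounded ReLU: clips x to [0, t]; the cutoff t is trainable in the repo."""
    return min(t, max(0.0, x))

def monotone_layer(inputs, weights, biases, t=1.0):
    """One MVNN-style layer: with all weights >= 0, the output is
    monotonically non-decreasing in each input coordinate."""
    assert all(w >= 0.0 for row in weights for w in row), "weights must be >= 0"
    return [brelu(sum(w * x for w, x in zip(row, inputs)) + b, t)
            for row, b in zip(weights, biases)]
```

Because every weight is non-negative and bReLU is non-decreasing, adding items to a bundle (increasing any input) can never decrease the predicted value, which is exactly the preference monotonicity the paper exploits.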

Requirements

  • Python 3.7
  • Java 8 (or later)
    • Java environment variables set as described here
  • JAR files in place (they should already be provided):
    • CPLEX (>=12.10.0): The file cplex.jar (for 12.10.0) is provided in the folder lib.
    • SATS (>=0.7.0): The file sats-0.7.0.jar is provided in the folder lib.
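As a quick sanity check before running anything, you can verify that Java is on your PATH (this helper is an assumption about your setup, not part of the repository):

```python
# Optional environment check (not part of the repo): the SATS and CPLEX JARs
# require a working Java installation reachable from your shell.
import shutil

def java_on_path():
    """Return the path to the `java` executable, or None if it is not found."""
    return shutil.which("java")

if __name__ == "__main__":
    print("java executable:", java_on_path())
```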

Dependencies

Prepare your Python environment (e.g., with conda or virtualenv) and activate it.

Using pip:

$ pip install -r requirements.txt
  • CPLEX Python API installed as described here
  • Make sure that your version of CPLEX is compatible with the cplex.jar file in the folder lib.
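Before launching the experiments, you can check that the CPLEX Python API is importable (a hypothetical helper, not part of the repository):

```python
# Quick sanity check (not part of the repo): verifies that the CPLEX Python
# API was installed into this environment before running the experiments.
import importlib.util

def cplex_available():
    """Return True if the `cplex` Python package can be imported."""
    return importlib.util.find_spec("cplex") is not None

if __name__ == "__main__":
    print("CPLEX Python API found:", cplex_available())
```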

How to run

Prediction Performance of MVNN vs. NN

First, collect some data from an auction domain, e.g., 1 instance of GSVM:

$ python all_bids_generator.py --domain=GSVM --number_of_instances=1
| Parameter | Explanation | Example | Can be empty |
| --- | --- | --- | --- |
| domain | SATS domain to choose | GSVM / LSVM / SRVM / MRVM | No |
| number_of_instances | Number of instances of the SATS domain to save as training data | 5 | No |

The data is saved under data/GSVM/GSVM_seed1_all_bids.pkl.
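If you want to inspect the generated file, a minimal sketch follows; the internal structure of the pickle is an assumption, so adapt the inspection to whatever all_bids_generator.py actually stores:

```python
# Minimal sketch for inspecting the generated data file. The pickle's internal
# structure is an assumption; adapt to what all_bids_generator.py stores.
import pickle

def load_bids(path="data/GSVM/GSVM_seed1_all_bids.pkl"):
    """Load the pickled bid data produced by all_bids_generator.py."""
    with open(path, "rb") as f:
        return pickle.load(f)

if __name__ == "__main__":
    data = load_bids()
    print(type(data))
```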

To run the prediction performance experiment from Table 1 of the main paper using the data you just collected, run:

$ python simulation_prediction_performance.py --domain=GSVM --T=20 --bidder_type=national --network_type=MVNN --seed=1
| Parameter | Explanation | Example | Can be empty |
| --- | --- | --- | --- |
| domain | SATS domain to choose | GSVM / LSVM / SRVM / MRVM | No |
| T | Number of training data points | 10 | No |
| bidder_type | Bidder type to choose | regional / national / local / high_frequency | No |
| network_type | Whether to use MVNN or NN | MVNN / NN | No |
| seed | Auction instance from which the training data was collected | 1 | No |

This script selects the winning configurations from the HPO on prediction performance, saved in prediction_performance_hpo_results.json. Finally, it prints the train/val/test metrics shown in the table and plots a true-vs-predicted scatter plot of the test data.
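If you want to look at those winning configurations yourself, a minimal sketch follows; the layout of the JSON is an assumption, so adjust the lookup to the actual file contents:

```python
# Minimal sketch for inspecting the HPO results file used by
# simulation_prediction_performance.py. The JSON layout is an assumption.
import json

def load_hpo_results(path="prediction_performance_hpo_results.json"):
    """Load the winning HPO configurations from the results JSON."""
    with open(path) as f:
        return json.load(f)

if __name__ == "__main__":
    results = load_hpo_results()
    print(list(results.keys()))
```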

MLCA Experiments

This step requires CPLEX, so make sure everything is set up as described above.

$ python simulation_mlca.py --domain=GSVM --network_type=MVNN
| Parameter | Explanation | Example | Can be empty |
| --- | --- | --- | --- |
| domain | SATS domain to choose | GSVM / LSVM / SRVM / MRVM | No |
| network_type | Whether to use MVNN or NN | MVNN / NN | No |
| seed | Seed of the auction instance | 1 | No |

Acknowledgements

The MLCA and the MIP formulation of the plain (ReLU) neural network are based on Weissteiner et al. [1].

[1] Weissteiner, Jakob, and Sven Seuken. "Deep Learning-Powered Iterative Combinatorial Auctions." In Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, no. 02, 2020.

Contact

Maintained by Jakob Weissteiner (weissteiner), Jakob Heiss (JakobHeiss) and Julien Siems (Julien)
