PyTorch-friendly implementation of minimum-norm solvers.
Given a list of vectors $\{ v_i \}_{i=1}^m$, find a weight vector $\{ w_i \}_{i=1}^m$ on the simplex $\Delta_w = \{ w : w_i \ge 0, \sum_{i=1}^m w_i = 1 \}$ that minimizes $\left\| \sum_{i=1}^m w_i v_i \right\|^2$.
- min_norm_solvers: Mainly copied from the repository https://github.com/isl-org/MultiObjectiveOptimization, with some NumPy functions replaced by their torch counterparts. Note in particular that torch.dot is not equivalent to numpy.dot (torch.dot only accepts 1-D tensors, while numpy.dot also performs matrix products). Some bugs are also fixed (e.g., iter_count was never incremented, which could cause an infinite while loop).
- gradient_descent_solvers: Uses a softmax reparameterization and gradient descent to solve the problem. More specifically, we define a trainable variable $\alpha \in \mathbb{R}^m$, then use the softmax function to project $\alpha$ onto the simplex $\Delta_w$, and optimize iteratively.
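For the two-vector case ($m = 2$) the problem has a closed-form solution, which the min-norm solver family builds on. The following is a minimal sketch of that closed form (the function name is illustrative, not the repository's API): it minimizes $\| \gamma v_1 + (1 - \gamma) v_2 \|^2$ over $\gamma \in [0, 1]$.

```python
import torch

def min_norm_2d(v1, v2):
    """Closed-form min-norm point in the convex hull of two 1-D tensors.

    Minimizes ||gamma * v1 + (1 - gamma) * v2||^2 over gamma in [0, 1].
    Illustrative sketch; not the repository's actual function.
    """
    diff = v1 - v2
    denom = torch.dot(diff, diff)
    if denom == 0:
        gamma = 0.5  # v1 == v2: every convex combination has the same norm
    else:
        # Unconstrained minimizer of the quadratic, clipped to [0, 1]
        gamma = (torch.dot(v2 - v1, v2) / denom).clamp(0.0, 1.0)
    return gamma, gamma * v1 + (1 - gamma) * v2
```

For example, with $v_1 = (1, 0)$ and $v_2 = (0, 1)$ the minimizer is $\gamma = 0.5$, giving the min-norm point $(0.5, 0.5)$.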
Please refer to the example.py file.
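The softmax-reparameterized gradient descent approach described above can be sketched as follows. This is a self-contained illustration under assumed names and hyperparameters (function name, optimizer choice, learning rate, and step count are not taken from the repository):

```python
import torch

def solve_min_norm_gd(vectors, steps=1000, lr=0.05):
    """Sketch of the softmax-reparameterized solver (illustrative only).

    vectors: list of m 1-D tensors v_i of equal length.
    Returns weights w on the simplex that approximately minimize
    ||sum_i w_i v_i||^2.
    """
    V = torch.stack(vectors)                 # shape (m, d)
    # Trainable unconstrained variable alpha in R^m
    alpha = torch.zeros(V.shape[0], requires_grad=True)
    opt = torch.optim.Adam([alpha], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        w = torch.softmax(alpha, dim=0)      # project alpha onto the simplex
        loss = (w @ V).pow(2).sum()          # ||sum_i w_i v_i||^2
        loss.backward()
        opt.step()
    return torch.softmax(alpha, dim=0).detach()
```

Because softmax keeps the weights strictly positive and summing to one, no explicit projection step is needed; the trade-off is that the iterates can only approach, never exactly reach, a vertex of the simplex.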