Adding MultiLabel SoftMargin Loss #2345

Merged
56 commits merged on Aug 14, 2021
Changes from 5 commits
Commits
56 commits
e1eb983
Initial work. Forward/ Backward left.
iamshnoo Mar 24, 2020
7269357
Merge remote-tracking branch 'upstream/master' into multilabel_softma…
iamshnoo Mar 25, 2020
a3c3bb7
Completed Forward and Backward. Tests left.
iamshnoo Mar 31, 2020
92af440
Merge remote-tracking branch 'upstream/master' into multilabel_softma…
iamshnoo Mar 31, 2020
b2604b0
Fix style errors.
iamshnoo Mar 31, 2020
fc6335d
Fix weird indentation issue.
iamshnoo Apr 1, 2020
b6275ec
Manually fix the indentation in browser.
iamshnoo Apr 1, 2020
b42c360
Shift line 39 by 4 spaces to left
iamshnoo Apr 1, 2020
90c48ae
Shift line 39 by 2 spaces to the right
iamshnoo Apr 1, 2020
c0a3ef6
Corrected errors. Wrote test cases.
iamshnoo Apr 4, 2020
194f3ed
Merge remote-tracking branch 'upstream/master' into multilabel_softma…
iamshnoo Apr 4, 2020
c8280f1
Fix style errors.
iamshnoo Apr 4, 2020
6783aa7
Correct minor errors in backward function.
iamshnoo Apr 4, 2020
71e1a89
Resolve namespace error.
iamshnoo Apr 4, 2020
43d3506
Re-wrote backward correctly, updated tests
iamshnoo Apr 4, 2020
a346c1a
Fix typo
iamshnoo Apr 4, 2020
aa4b596
Fix errors in backward, tests. Update history.
iamshnoo Apr 5, 2020
503185b
Merge remote-tracking branch 'upstream/master'
iamshnoo Apr 5, 2020
1c43613
Attempt to fix merge conflict
iamshnoo Apr 5, 2020
b06486f
Merge remote-tracking branch 'upstream/master'.
iamshnoo Apr 5, 2020
67ec433
Prettify if-else construct.
iamshnoo Apr 5, 2020
6ec768f
Remove unnecesary spaces.
iamshnoo Apr 5, 2020
ab55a5e
Fix documentation.
iamshnoo Apr 6, 2020
697944c
Fix some changes requested in review (part 1).
iamshnoo Apr 7, 2020
70eb8bc
Minor fixes.
iamshnoo Apr 7, 2020
81fece7
Typo fix.
iamshnoo Apr 7, 2020
9521457
Temporary fix.
iamshnoo Apr 7, 2020
ba11c43
My fixes didn't fix anything.
iamshnoo Apr 7, 2020
3089f63
Some more typos. Should work okay now!
iamshnoo Apr 7, 2020
9e4da86
Merge remote-tracking branch 'upstream/master'.
iamshnoo Apr 7, 2020
17806e0
Adjust tolerance value for CheckMatrices.
iamshnoo Apr 7, 2020
e2eef8a
Fix to match format suggested in review.
iamshnoo Apr 7, 2020
0ddda95
Change weight mat to rowvec. Overload constructor.
iamshnoo Apr 9, 2020
86fec68
Merge 'upstream/master' and resolve conflicts.
iamshnoo Apr 9, 2020
43d91db
Fix typo.
iamshnoo Apr 9, 2020
1c1d33d
Apply suggestions from code review.
iamshnoo Apr 9, 2020
c395ed3
Correctly pass parameter weights in constructor.
iamshnoo Apr 9, 2020
eb17f31
Merge branch 'multilabel_softmargin_loss' of https://github.com/iamsh…
iamshnoo Apr 9, 2020
dcd02d9
Update tests. Minor fixes.
iamshnoo Apr 11, 2020
a450f71
Merge recent upstream changes.
iamshnoo Apr 22, 2020
56ce6fa
Fix tolerance value issue with PyTorch print trick
iamshnoo Apr 22, 2020
f298c7a
Re-design the API as per discussion.
iamshnoo May 5, 2020
9789fde
Merge recent upstream changes.
iamshnoo May 5, 2020
d3068cc
Fix some style issues.
iamshnoo May 5, 2020
4782809
Re-factor test cases.
iamshnoo May 5, 2020
4541966
Merge recent upstream changes into branch.
iamshnoo May 8, 2020
6470ae8
Adding .vscode to .gitignore
iamshnoo May 18, 2020
34153d5
Merge upstream/master into this branch.
iamshnoo May 18, 2020
c5a24d6
Add space between lines in HISTORY.md
iamshnoo May 18, 2020
9b2a3ac
Merge usptream changes.
iamshnoo May 31, 2020
6340e64
Apply suggestions from code review
iamshnoo May 31, 2020
77463d8
Fix bug in Forward().
iamshnoo Jun 1, 2020
733db6a
Merge upstream
jeffin143 Feb 21, 2021
180f917
Merge upstream
jeffin143 Feb 21, 2021
ebaa6c3
Merge branch 'master' into multilabel_softmargin_loss
jeffin143 Feb 28, 2021
bf61e97
Merge branch 'master' into multilabel_softmargin_loss
jeffin143 Aug 14, 2021
2 changes: 2 additions & 0 deletions src/mlpack/methods/ann/loss_functions/CMakeLists.txt
@@ -17,6 +17,8 @@ set(SOURCES
mean_squared_error_impl.hpp
mean_squared_logarithmic_error.hpp
mean_squared_logarithmic_error_impl.hpp
multilabel_softmargin_loss.hpp
multilabel_softmargin_loss_impl.hpp
negative_log_likelihood.hpp
negative_log_likelihood_impl.hpp
log_cosh_loss.hpp
120 changes: 120 additions & 0 deletions src/mlpack/methods/ann/loss_functions/multilabel_softmargin_loss.hpp
@@ -0,0 +1,120 @@
/**
* @file multilabel_softmargin_loss.hpp
* @author Anjishnu Mukherjee
*
* Definition of the Multi Label Soft Margin Loss function.
*
* mlpack is free software; you may redistribute it and/or modify it under the
* terms of the 3-clause BSD license. You should have received a copy of the
* 3-clause BSD license along with mlpack. If not, see
* http://www.opensource.org/licenses/BSD-3-Clause for more information.
*/
#ifndef MLPACK_ANN_LOSS_FUNCTION_MULTILABEL_SOFTMARGIN_LOSS_HPP
#define MLPACK_ANN_LOSS_FUNCTION_MULTILABEL_SOFTMARGIN_LOSS_HPP

#include <mlpack/prereqs.hpp>

namespace mlpack {
namespace ann /** Artificial Neural Network. */ {

/**
 * The Multi Label Soft Margin Loss criterion: a one-versus-all loss for
 * multi-label classification, based on max-entropy, between input x and
 * target y.
 *
 * @tparam InputDataType Type of the input data (arma::colvec, arma::mat,
 *     arma::sp_mat or arma::cube).
 * @tparam OutputDataType Type of the output data (arma::colvec, arma::mat,
 *     arma::sp_mat or arma::cube).
 */
template <
typename InputDataType = arma::mat,
typename OutputDataType = arma::mat
>
class MultiLabelSoftMarginLoss
{
public:
/**
* Create the MultiLabelSoftMarginLoss object.
*
* @param weight A manual rescaling weight given to each class. Initialized to
* 1 by default.
* @param reduction Specifies the reduction to apply to the output. When
*     true, 'mean' reduction is used: the sum of the output is divided by
*     the number of elements in the output. When false, 'sum' reduction is
*     used and the output is simply summed.
*/
MultiLabelSoftMarginLoss(const double weight = 1.0,
const bool reduction = true);

/**
* Computes the Multi Label Soft Margin Loss function.
* This criterion optimizes a multi-label one-versus-all loss based
* on max-entropy, between input x and target y.
iamshnoo marked this conversation as resolved.
Show resolved Hide resolved
*
* @param input Input data used for evaluating the specified function.
* @param target The target vector with same shape as input.
*/

template<typename InputType, typename TargetType>
double Forward(const InputType& input, const TargetType& target);

/**
* Ordinary feed backward pass of a neural network.
*
* @param input The propagated input activation.
* @param target The target vector.
* @param output The calculated error.
*/

template<typename InputType, typename TargetType, typename OutputType>
void Backward(const InputType& input,
const TargetType& target,
OutputType& output);

//! Get the input parameter.
InputDataType const& InputParameter() const { return inputParameter; }
//! Modify the input parameter.
InputDataType& InputParameter() { return inputParameter; }

//! Get the output parameter.
OutputDataType const& OutputParameter() const { return outputParameter; }
//! Modify the output parameter.
OutputDataType& OutputParameter() { return outputParameter; }

//! Get the weight.
double Weight() const { return weight; }
//! Modify the weight.
double& Weight() { return weight; }

//! Get the reduction.
bool Reduction() const { return reduction; }
//! Modify the reduction.
bool& Reduction() { return reduction; }

/**
* Serialize the layer.
*/
template<typename Archive>
void serialize(Archive& ar, const unsigned int /* version */);

private:
//! Locally-stored output parameter object.
OutputDataType outputParameter;

//! Locally-stored input parameter object.
InputDataType inputParameter;

//! The manual rescaling factor given to the loss.
double weight;

//! The weight for positive examples.
double posWeight;

//! The boolean value that tells if reduction is mean or sum.
bool reduction;
}; // class MultiLabelSoftMarginLoss

} // namespace ann
} // namespace mlpack

// include implementation.
#include "multilabel_softmargin_loss_impl.hpp"

#endif
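The criterion this header documents works per element: for input x and target y in {0, 1}, the loss is -[y * log(sigmoid(x)) + (1 - y) * log(sigmoid(-x))], averaged over all elements under 'mean' reduction. A minimal sketch in plain C++ (the function name and std::vector interface here are illustrative only, not part of this PR):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative only: the quantity Forward() is documented to compute, with
// plain std::vector in place of Armadillo types. Per element the loss is
// -[y*log(s) + (1-y)*log(1-s)], where s = sigmoid(x); 'mean' reduction
// averages the total over all elements.
double MultiLabelSoftMarginForward(const std::vector<double>& input,
                                   const std::vector<double>& target,
                                   const double weight = 1.0,
                                   const bool meanReduction = true)
{
  double loss = 0.0;
  for (std::size_t i = 0; i < input.size(); ++i)
  {
    const double s = 1.0 / (1.0 + std::exp(-input[i]));
    loss -= weight * (target[i] * std::log(s) +
        (1.0 - target[i]) * std::log(1.0 - s));
  }
  return meanReduction ? loss / input.size() : loss;
}
```

Note that sigmoid(-x) = 1 - sigmoid(x), which the sketch uses to avoid a second exponential per element.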
@@ -0,0 +1,69 @@
/**
* @file multilabel_softmargin_loss_impl.hpp
* @author Anjishnu Mukherjee
*
* Implementation of the Multi Label Soft Margin Loss function.
*
* mlpack is free software; you may redistribute it and/or modify it under the
* terms of the 3-clause BSD license. You should have received a copy of the
* 3-clause BSD license along with mlpack. If not, see
* http://www.opensource.org/licenses/BSD-3-Clause for more information.
*/
#ifndef MLPACK_METHODS_ANN_LOSS_FUNCTION_MULTILABEL_SOFTMARGIN_LOSS_IMPL_HPP
#define MLPACK_METHODS_ANN_LOSS_FUNCTION_MULTILABEL_SOFTMARGIN_LOSS_IMPL_HPP

// In case it hasn't been included.
#include "multilabel_softmargin_loss.hpp"

namespace mlpack {
namespace ann /** Artificial Neural Network. */ {

template<typename InputDataType, typename OutputDataType>
MultiLabelSoftMarginLoss<InputDataType, OutputDataType>::
MultiLabelSoftMarginLoss(
const double weight,
const bool reduction) :
weight(weight),
reduction(reduction)
{
// Nothing to do here.
}

template<typename InputDataType, typename OutputDataType>
template<typename InputType, typename TargetType>
double MultiLabelSoftMarginLoss<InputDataType, OutputDataType>::Forward(
const InputType& input, const TargetType& target)
{
// log(sigmoid(x)) and log(sigmoid(-x)), written via the logistic function.
InputType logSigmoid = arma::log(1 / (1 + arma::exp(-input)));
InputType logSigmoidNeg = arma::log(1 / (1 + arma::exp(input)));
double loss = arma::accu(-(target % logSigmoid +
(1 - target) % logSigmoidNeg) * weight);
return reduction ? loss / input.n_elem : loss / input.n_rows;
}

template<typename InputDataType, typename OutputDataType>
template<typename InputType, typename TargetType, typename OutputType>
void MultiLabelSoftMarginLoss<InputDataType, OutputDataType>::Backward(
const InputType& input,
const TargetType& target,
OutputType& output)
{
// Gradient of the loss above: weight * (sigmoid(input) - target), scaled
// by the same reduction factor as Forward().
InputType sigmoid = 1 / (1 + arma::exp(-input));
output = weight * (sigmoid - target);
output /= reduction ? input.n_elem : input.n_rows;
}

template<typename InputDataType, typename OutputDataType>
template<typename Archive>
void MultiLabelSoftMarginLoss<InputDataType, OutputDataType>::serialize(
Archive& ar,
const unsigned int /* version */)
{
ar & BOOST_SERIALIZATION_NVP(weight);
ar & BOOST_SERIALIZATION_NVP(reduction);
}

} // namespace ann
} // namespace mlpack

#endif
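As a sanity check on the Backward() math: for the mean-reduced loss, the per-element gradient is weight * (sigmoid(x) - y) / n, since d/dx of -[y*log(sigmoid(x)) + (1-y)*log(sigmoid(-x))] is sigmoid(x) - y. The helper names below are hypothetical, for illustration only; a central finite difference of the per-element loss should agree with the analytic expression:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>

// Logistic function.
double SigmoidFn(const double x) { return 1.0 / (1.0 + std::exp(-x)); }

// Per-element multilabel soft margin loss (weight = 1).
double ElementLoss(const double x, const double y)
{
  return -(y * std::log(SigmoidFn(x)) +
      (1.0 - y) * std::log(1.0 - SigmoidFn(x)));
}

// Analytic per-element gradient of the loss averaged over n elements.
double ElementGrad(const double x, const double y, const std::size_t n)
{
  return (SigmoidFn(x) - y) / static_cast<double>(n);
}
```

Comparing ElementGrad against (ElementLoss(x + h, y) - ElementLoss(x - h, y)) / (2h) for a small h is a quick way to validate any rewrite of Backward().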