
Added Scaled Exponential Linear Unit Activation Function #9027

Merged
merged 11 commits into TheAlgorithms:master from feat/SELU_activation on Sep 6, 2023

Conversation

AdarshAcharya5
Contributor

Added the Scaled Exponential Linear Unit (SELU) activation function under TheAlgorithms/Python/neural_network/activation_functions. The description of SELU is taken from the reference link provided in the top comment of scaled_exponential_linear_unit.py.
Fixes #9010
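
For readers without the diff open, here is a minimal sketch of what such a function looks like, assuming the NumPy-based signature quoted in the review below (alpha = 1.6732, _lambda = 1.0507; the parameter naming is discussed further down). The merged file may differ in details such as documentation and references:

import numpy as np


def scaled_exponential_linear_unit(
    vector: np.ndarray, alpha: float = 1.6732, _lambda: float = 1.0507
) -> np.ndarray:
    """
    Apply SELU elementwise:
        f(x) = _lambda * x                     for x > 0
        f(x) = _lambda * alpha * (exp(x) - 1)  for x <= 0

    >>> scaled_exponential_linear_unit(np.array([1.0, 2.0]))
    array([1.0507, 2.1014])
    """
    return _lambda * np.where(vector > 0, vector, alpha * (np.exp(vector) - 1.0))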

  • Add an algorithm?

Checklist:

  • I have read CONTRIBUTING.md.
  • This pull request is all my own work -- I have not plagiarized.
  • I know that pull requests will not be merged if they fail the automated tests.
  • This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
  • All new Python files are placed inside an existing directory.
  • All filenames are in all lowercase characters with no spaces or dashes.
  • All functions and variable names follow Python naming conventions.
  • All function parameters and return values are annotated with Python type hints.
  • All functions have doctests that pass the automated testing.
  • All new algorithms include at least one URL that points to Wikipedia or another similar explanation.
  • If this pull request resolves one or more open issues then the description above includes the issue number(s) with a closing keyword: "Fixes #ISSUE-NUMBER".

@algorithms-keeper algorithms-keeper bot added the awaiting reviews This PR is ready to be reviewed label Sep 2, 2023
@algorithms-keeper algorithms-keeper bot added the tests are failing Do not merge until tests pass label Sep 2, 2023
@algorithms-keeper algorithms-keeper bot removed the tests are failing Do not merge until tests pass label Sep 5, 2023
@AdarshAcharya5
Contributor Author

@tianyizheng02 Thanks for your reply in the discussion section! All checks pass now.



def scaled_exponential_linear_unit(
vector: np.ndarray, alpha: float = 1.6732, _lambda: float = 1.0507
Contributor

Is there a reason why _lambda has an underscore? Is the user not meant to change this coefficient?

Contributor Author

Yes, firstly lambda is a reserved keyword in Python. Secondly, both alpha and lambda are fixed constants.

Contributor Author

The user can change these values, which may yield slightly different behaviour from the function, but the defaults given to them are alpha: float = 1.6732 and _lambda: float = 1.0507.
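
To make that concrete, a small usage sketch: the custom coefficients below are arbitrary illustrations, not values from the PR, and the import path assumes the script is run from the repository root.

import numpy as np

from neural_network.activation_functions.scaled_exponential_linear_unit import (
    scaled_exponential_linear_unit,
)

x = np.array([0.5, -0.5])

# Default coefficients (alpha = 1.6732, _lambda = 1.0507)
print(scaled_exponential_linear_unit(x))

# Custom coefficients are accepted, but they change the output scale and the
# negative-side saturation, so the result is no longer the standard SELU
print(scaled_exponential_linear_unit(x, alpha=1.5, _lambda=1.0))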

Contributor

Yes, firstly lambda is a reserved keyword in Python.

Oh yeah, duh 🤦

Could you rename the variable to something like lambda_ instead? Having an underscore at the start of a variable name generally signifies that the user isn't supposed to use it.
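
For reference, the convention being pointed to: a trailing underscore (lambda_) is the usual PEP 8 workaround for a name that clashes with a keyword, whereas a leading underscore (_lambda) signals an internal name. The suggested signature would look roughly like this (a sketch of the rename, not the merged code):

import numpy as np


def scaled_exponential_linear_unit(
    vector: np.ndarray, alpha: float = 1.6732, lambda_: float = 1.0507
) -> np.ndarray:
    # Trailing underscore avoids the `lambda` keyword without implying
    # that the parameter is private.
    ...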

Contributor Author

Ah. Yes for sure, I'll get it done right away!

Contributor Author

Done

@tianyizheng02 tianyizheng02 merged commit 153c35e into TheAlgorithms:master Sep 6, 2023
3 checks passed
@AdarshAcharya5 AdarshAcharya5 deleted the feat/SELU_activation branch September 10, 2023 03:51
Labels
awaiting reviews This PR is ready to be reviewed
Development

Successfully merging this pull request may close these issues.

Other Activation Functions
2 participants