Add the polynomial kernel to the SVM code #12740
Conversation
for more information, see https://pre-commit.ci
Automated review generated by algorithms-keeper. If there's any problem regarding this review, please open an issue about it.
algorithms-keeper commands and options

algorithms-keeper actions can be triggered by commenting on this PR:
- @algorithms-keeper review to trigger the checks for only added pull request files
- @algorithms-keeper review-all to trigger the checks for all the pull request files, including the modified files. As we cannot post review comments on lines not part of the diff, this command will post all the messages in one comment.

NOTE: Commands are in beta and so this feature is restricted only to a member or owner of the organization.
[0. , 0. , 0.99]])
"""

def __init__(self, X: List[List[float]], y: List[int]) -> None:
Please provide a descriptive name for the parameter: X
Please provide a descriptive name for the parameter: y
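As a hedged illustration of the requested fix, the constructor could rename X and y to descriptive parameters. The class name `Dataloader` is an assumption (only `__init__` appears in the diff); the NumPy conversion and class weights are kept from the diff:

```python
import numpy as np


class Dataloader:  # class name is an assumption; only __init__ appears in the diff
    def __init__(self, features: list[list[float]], targets: list[int]) -> None:
        # Descriptive names replace the flagged single-letter parameters X and y
        self.features = np.array(features)
        self.targets = np.array(targets)
        self.class_weights = {0: 1.0, 1: 1.0}  # example class weights, adjust as needed
```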
self.y = np.array(y)
self.class_weights = {0: 1.0, 1: 1.0}  # Example class weights, adjust as needed

def get_Train_test_data(self) -> Tuple[List[np.ndarray], List[np.ndarray], List[np.ndarray], List[np.ndarray]]:
Variable and function names should follow the snake_case naming convention. Please update the following name accordingly: get_Train_test_data
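One way to satisfy the reviewer is a plain snake_case rename. The method body is not shown in the diff, so the split logic below is an assumption (a simple ordered 80/20 split), sketched as a free function:

```python
import numpy as np


def get_train_test_data(
    features: np.ndarray, targets: np.ndarray, train_ratio: float = 0.8
) -> tuple[np.ndarray, np.ndarray, np.ndarray, np.ndarray]:
    # snake_case rename of get_Train_test_data; the split logic is an assumption
    split = int(len(features) * train_ratio)
    return features[:split], features[split:], targets[:split], targets[split:]
```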
return in_dim, out_dim

@staticmethod
def one_hot_encode(labels, num_classes):
Please provide return type hint for the function: one_hot_encode. If the function does not return a value, please provide the type hint as: def function() -> None:
Please provide type hint for the parameter: labels
Please provide type hint for the parameter: num_classes
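A hedged sketch of the fully annotated helper; the body is an assumed standard one-hot implementation, since only the signature appears in the diff:

```python
import numpy as np


def one_hot_encode(labels: np.ndarray, num_classes: int) -> np.ndarray:
    # Build an (n_samples, num_classes) matrix with a 1.0 in each row
    # at the column given by that sample's integer class label
    encoded = np.zeros((labels.shape[0], num_classes))
    encoded[np.arange(labels.shape[0]), labels] = 1.0
    return encoded
```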
"""
def __init__(self, dataloader, epoch: int, learning_rate: float, gamma=1, hidden_dim=2):
Please provide return type hint for the function: __init__. If the function does not return a value, please provide the type hint as: def function() -> None:
Please provide type hint for the parameter: dataloader
Please provide type hint for the parameter: gamma
Please provide type hint for the parameter: hidden_dim
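A hedged sketch of the constructor with all requested hints filled in. The class name `TwoLayerNet` and the `Any` loader type are assumptions; the attribute assignments mirror the diff:

```python
from typing import Any


class TwoLayerNet:  # class name is an assumption; only the constructor appears here
    def __init__(
        self,
        dataloader: Any,  # replace Any with the project's actual loader class
        epoch: int,
        learning_rate: float,
        gamma: float = 1,
        hidden_dim: int = 2,
    ) -> None:
        self.dataloader = dataloader
        self.epoch = epoch
        self.learning_rate = learning_rate
        self.gamma = gamma
        self.hidden_dim = hidden_dim
        self.inter_variable = {}
        self.weights1_list = []
```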
self.inter_variable = {}
self.weights1_list = []

def get_inout_dim(self):
Please provide return type hint for the function: get_inout_dim. If the function does not return a value, please provide the type hint as: def function() -> None:
return learning_rate * self.gamma

@staticmethod
def accuracy(label, y_hat):
Please provide return type hint for the function: accuracy. If the function does not return a value, please provide the type hint as: def function() -> None:
Please provide type hint for the parameter: label
Please provide type hint for the parameter: y_hat
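With the hints added, the method from the diff could read as below; the logic is unchanged from the diff line, and `float(...)` is added only so the return matches the annotation:

```python
import numpy as np


def accuracy(label: np.ndarray, y_hat: np.ndarray) -> float:
    # Fraction of samples whose predicted class (argmax of y_hat)
    # matches the true class (argmax of the one-hot label)
    return float((y_hat.argmax(axis=1) == label.argmax(axis=1)).mean())
```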
return (y_hat.argmax(axis=1) == label.argmax(axis=1)).mean()

@staticmethod
def loss(output, label):
Please provide return type hint for the function: loss. If the function does not return a value, please provide the type hint as: def function() -> None:
Please provide type hint for the parameter: output
Please provide type hint for the parameter: label
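The annotated version is a direct restatement of the diff's return line (halved mean squared error over the batch); only the type hints and the `float(...)` cast are additions:

```python
import numpy as np


def loss(output: np.ndarray, label: np.ndarray) -> float:
    # Halved mean squared error over the batch, matching the diff:
    # sum((output - label) ** 2) / (2 * batch_size)
    return float(np.sum((output - label) ** 2) / (2 * label.shape[0]))
```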
""" | ||
return np.sum((output - label) ** 2) / (2 * label.shape[0]) | ||
|
||
def get_acc_loss(self): |
Please provide return type hint for the function: get_acc_loss. If the function does not return a value, please provide the type hint as: def function() -> None:
""" | ||
return self.test_accuracy, self.test_loss | ||
|
||
def train(self): |
Please provide return type hint for the function: train. If the function does not return a value, please provide the type hint as: def function() -> None:
output = self.forward(x=batch_imgs, W1=W1, W2=W2, no_gradient=False)

grad_W1, grad_W2 = self.back_prop(x=batch_imgs, y=batch_labels, W1=W1, W2=W2)
Variable and function names should follow the snake_case naming convention. Please update the following name accordingly: grad_W1
Variable and function names should follow the snake_case naming convention. Please update the following name accordingly: grad_W2
Describe your change:
Add the polynomial kernel to the SVM code
Checklist:
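The excerpt does not show the kernel change itself, so here is a hedged sketch of a standard polynomial kernel, K(x, z) = (gamma * x·z + coef0) ** degree; the function name and default values are assumptions, not the PR's actual code:

```python
import numpy as np


def polynomial_kernel(
    x: np.ndarray, z: np.ndarray, degree: int = 3, gamma: float = 1.0, coef0: float = 1.0
) -> float:
    # (gamma * <x, z> + coef0) ** degree scores similarity as if the inputs
    # were mapped into a higher-degree feature space, without computing that
    # space explicitly (the kernel trick)
    return float((gamma * np.dot(x, z) + coef0) ** degree)
```

With degree=1, gamma=1, coef0=0 this reduces to the linear kernel, so an SVM that already supports a pluggable kernel function can adopt it without other changes.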