
add gradient function #503


Merged
merged 6 commits into master from gradient_func on Jan 26, 2021

Conversation

KsanaKozlova (Contributor)

No description provided.

@KsanaKozlova requested a review from densmirn on January 19, 2021 13:31
@@ -294,6 +294,27 @@ cpdef dparray dpnp_fmod(dparray x1, dparray x2):
return call_fptr_2in_1out(DPNP_FN_FMOD, x1, x2, x1.shape)


cpdef dparray dpnp_gradient(dparray y1, int dx=1):

    len = y1.size
Contributor:

len is a built-in name. Let's rename the variable, e.g. size.
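
To see why the shadowing matters beyond style: later in this same function the code calls len(varargs), which fails once len is bound to an integer. A standalone illustration (hypothetical function names, numpy standing in for dparray):

    import numpy

    def gradient_shadowed(y1, varargs=()):
        len = y1.size          # len is now an int in this scope...
        return len(varargs)    # ...so this raises TypeError: 'int' object is not callable

    def gradient_renamed(y1, varargs=()):
        size = y1.size         # no clash with the built-in
        return len(varargs)    # built-in len works as expected

    y = numpy.array([2, 3, 6, 8, 4, 9])
    print(gradient_renamed(y))   # 0
    # gradient_shadowed(y)       # TypeError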

if not use_origin_backend(y1) and not kwargs:
    if not isinstance(y1, dparray):
        pass
    elif len(varargs) != 0 and not isinstance(varargs[0], int):
Contributor:

What if len(varargs) == 2 and varargs[0] == 2? Don't we need to fall back to numpy in this case?
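
The condition as written only rejects a non-int first spacing, so varargs = (2, 3) would still take the backend path even though dpnp_gradient accepts only a single int dx. A sketch of an exhaustive guard (hypothetical helper, not the PR's code):

    def use_backend_path(varargs, kwargs):
        # True only for the one case the backend supports:
        # no kwargs, and at most one positional spacing that is an int.
        if kwargs:
            return False
        if len(varargs) == 0:
            return True
        return len(varargs) == 1 and isinstance(varargs[0], int)

    print(use_backend_path((2, 3), {}))  # False -> fall back to numpy
    print(use_backend_path((2,), {}))    # True  -> dpnp backend, dx=2
    print(use_backend_path((), {}))      # True  -> dpnp backend, dx defaults to 1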

@pytest.mark.parametrize("array", [[2, 3, 6, 8, 4, 9],
[3., 4., 7.5, 9.],
[2, 6, 8, 10]])
def test_gradient_y1(self, array):
Contributor:

Both tests are named the same.
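
Because Python keeps only the last binding when two methods of a class share a name, the first duplicate test is silently shadowed and never collected. A minimal sketch of the rename (hypothetical test bodies, numpy used as the reference here):

    import numpy
    import pytest

    class TestGradient:
        @pytest.mark.parametrize("array", [[2, 3, 6, 8, 4, 9],
                                           [3., 4., 7.5, 9.]])
        def test_gradient_y1(self, array):
            result = numpy.gradient(numpy.array(array))
            assert result.shape == (len(array),)

        @pytest.mark.parametrize("array", [[2, 3, 6, 8, 4, 9],
                                           [3., 4., 7.5, 9.]])
        def test_gradient_y1_dx(self, array):   # renamed: was a duplicate
            result = numpy.gradient(numpy.array(array), 2)
            assert result.shape == (len(array),)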


    result._setitem_scalar(size - 1, cur)

    for i in range(1, len - 1):
Contributor:

Suggested change:
-    for i in range(1, len - 1):
+    for i in range(1, size-1):
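
For context, the loop under review fills the interior points of the result. For a uniform spacing dx, numpy.gradient uses one-sided first differences at the two edges and central differences everywhere else; a self-contained sketch of that scheme in plain numpy, mirroring the size/_setitem_scalar structure of the Cython code:

    import numpy

    def gradient_sketch(y, dx=1):
        size = y.size
        result = numpy.empty(size, dtype=numpy.float64)
        result[0] = (y[1] - y[0]) / dx                        # forward difference, left edge
        result[size - 1] = (y[size - 1] - y[size - 2]) / dx   # backward difference, right edge
        for i in range(1, size - 1):                          # central differences, interior
            result[i] = (y[i + 1] - y[i - 1]) / (2 * dx)
        return result

    y = numpy.array([2., 3., 6., 8., 4., 9.])
    print(gradient_sketch(y))   # [ 1.   2.   2.5 -1.   0.5  5. ]
    print(numpy.gradient(y))    # matches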

@densmirn added the "in progress" (Please do not merge. Work is in progress.) label on Jan 19, 2021
@densmirn removed the "in progress" (Please do not merge. Work is in progress.) label on Jan 20, 2021
@shssf (Contributor) commented on Jan 21, 2021

Could you please rebase on master?

@shssf added the "in progress" (Please do not merge. Work is in progress.) label on Jan 21, 2021
@shssf removed the "in progress" (Please do not merge. Work is in progress.) label on Jan 21, 2021
@shssf (Contributor) commented on Jan 22, 2021

Error in tests:

NameError: name 'dpnp_gradient' is not defined
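
A likely cause (an assumption, not confirmed in this thread): dpnp's Python-level wrappers import the Cython backend with a star import, so a new cpdef must also be added to the module's __all__ list or it never reaches the wrapper's namespace:

    # Sketch of the Cython module's export list; dpnp_fmod is shown only
    # because it appears in the diff above, the rest is illustrative.
    __all__ += [
        "dpnp_fmod",
        "dpnp_gradient",   # without this entry the wrapper sees
                           # NameError: name 'dpnp_gradient' is not defined
    ]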

@shssf added the "in progress" (Please do not merge. Work is in progress.) label on Jan 22, 2021
@KsanaKozlova removed the "in progress" (Please do not merge. Work is in progress.) label on Jan 26, 2021
@shssf merged commit 87163d0 into master on Jan 26, 2021
@shssf deleted the gradient_func branch on January 26, 2021 22:21