
Issue 43 #44

Open · wants to merge 3 commits into base: main
Conversation

stefanwaterval (Contributor) commented:

I added methods to deal with #43 as well as #35.

pscicluna (Collaborator) left a comment:

Hi Stefan, thanks for doing this, and sorry it took me so long to review it. I left a couple of very minor comments on the code - I'm happy to accept the PR as is and merge it now, or if you feel like making any of those indicated changes that's also fine with me. Just reply one way or the other!

        self._yerr_raw = values
        # now apply the same transformation that was applied to the ydata
        if self.ytransform is None:
            self._yerr_transformed = values
        elif isinstance(self.ytransform, Transformer):
            self._yerr_transformed = self.ytransform.transform(values)

    def _ensure_tensor(self, values):
pscicluna (Collaborator) commented on this line:
In principle, I think this could be moved outside the context of the class, as maybe it would make sense as a more general utility? But I think this is great for now, and we can think about that later.

stefanwaterval (Contributor, Author) replied on Nov 7, 2023:
I'd say this method could stay as an internal method, since I don't see any use for it outside of making sure the data is in the correct form. But if you think we could generalize it more so it could be used externally, we can do that too.
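
As an editorial aside, here is a minimal sketch of the module-level variant floated above; the name ensure_tensor, the dtype parameter, and the warning text are assumptions for illustration, not code from this PR:

import warnings

import torch


def ensure_tensor(values, dtype=torch.float32):
    # Hypothetical free-function version of the helper under review:
    # convert array-like input to a torch.Tensor, warning only when a
    # conversion actually takes place.
    if not isinstance(values, torch.Tensor):
        warnings.warn(('The function expects a torch.Tensor as input. '
                       'Your data will be converted to a tensor.'),
                      stacklevel=2)
        values = torch.as_tensor(values, dtype=dtype)
    return values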

            warnings.warn(('The function expects a torch.Tensor as input. '
                           'Your data will be converted to a tensor.'),
                          stacklevel=2)
        values = torch.as_tensor(values, dtype=torch.float32)
pscicluna (Collaborator) commented on this line:
Could we change the dtype here to be more general? I can see float32 being the default, but maybe we should have a self._default_dtype to cast to, which defaults to torch.float32. Again, this isn't a showstopper.

stefanwaterval (Contributor, Author) replied:
Sure, I'll think about it.
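
For illustration, a rough sketch of the _default_dtype idea, assuming it is set in the constructor; the class name SpectrumData and the constructor signature are placeholders, not the project's actual API:

import warnings

import torch


class SpectrumData:
    # Placeholder class, used only to show where _default_dtype could live.
    def __init__(self, default_dtype=torch.float32):
        self._default_dtype = default_dtype

    def _ensure_tensor(self, values):
        if not isinstance(values, torch.Tensor):
            warnings.warn(('The function expects a torch.Tensor as input. '
                           'Your data will be converted to a tensor.'),
                          stacklevel=2)
        # Cast to the instance-level default instead of a hard-coded
        # torch.float32, as suggested in the comment above.
        values = torch.as_tensor(values, dtype=self._default_dtype)
        return values

With this, SpectrumData(default_dtype=torch.float64)._ensure_tensor([1, 2, 3]) would yield a float64 tensor instead of the current hard-coded float32.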

        values = torch.as_tensor(values, dtype=torch.float32)
        return values

    def _ensure_dim(self, values):
pscicluna (Collaborator) commented on this line:
This is a great start; we can figure out how to add a threshold later.
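
Purely as a sketch, one way _ensure_dim could normalise shapes while leaving the threshold question open; the min_dim parameter and the free-function form are assumptions for illustration, not the PR's implementation:

import torch


def ensure_dim(values, min_dim=1):
    # Illustrative only: prepend singleton dimensions until the tensor
    # has at least min_dim dimensions, so scalar inputs behave like
    # length-one arrays downstream. The threshold discussed above is
    # left as an open design question.
    while values.dim() < min_dim:
        values = values.unsqueeze(0)
    return values

For example, ensure_dim(torch.tensor(3.0)) would return a tensor of shape (1,).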
