
[Feature Request] Improve ergonomics for writing functions generic across integral scalars. #2776

Open
bgreni opened this issue May 21, 2024 · 5 comments
Labels: enhancement, mojo-repo

Comments

@bgreni (Contributor) commented May 21, 2024

What is your request?

Come up with a way to allow users to write a single function that accepts Int, IntLiteral, and scalar SIMD types. I assume there is a sound rationale for Int not simply being defined as alias Int = Scalar[DType.index], so maybe this would require the introduction of another trait?

What is your motivation for this change?

I've noticed a bit of a pain point when attempting to write functions that are meant to accept any possible integral scalar type (Int and integral SIMD scalars). You can easily write a function to accept SIMD scalars like so:

fn foo[type: DType](s: Scalar[type]):
    constrained[type.is_integral(), "Expected an integral"]()
    print(s)

However, even though you can easily convert an Int to a Scalar[DType.index], trying to call this function with an Int will fail, since the conversion can't implicitly bind the type parameter. So, in order to maintain a clean interface, one must write a second overload that accepts Int:

fn foo(i: Int):
    foo[DType.index](i)
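
With both definitions in place, the call sites stay uniform (a minimal usage sketch, not from the original report):

fn main():
    foo(Int64(42))  # Scalar overload; `type` is inferred as DType.int64
    foo(Int(42))    # Int overload, which forwards to the Scalar version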

I imagine this behaviour could become burdensome for numerical library writers.

Any other details?

No response

bgreni added the enhancement and mojo-repo labels on May 21, 2024
@martinvuyk (Contributor)

What about doing:

fn foo[type: DType = DType.int64](s: Scalar[type]):
    constrained[type.is_integral(), "Expected an integral"]()
    print(s)


fn main():
    foo(UInt64(123))
    foo(Int32(123))
    foo(UInt16(123))
    foo(Int8(123))
    foo(int(123))

@bgreni (Contributor, Author) commented May 22, 2024

That works well, but it results in accepting Bool implicitly as well, which might be undesirable? I'm not sure what the opinion of the team is on that sort of behaviour, honestly.

@martinvuyk (Contributor)

"results in accepting Bool implicitly"

Oof, that might be a problem. But I think that would rather be on the developer's side(?)

@bgreni (Contributor, Author) commented May 22, 2024

Yes, but it's typically better to make unexpected behaviour impossible.
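
One possible stopgap (a sketch layered on the earlier suggestion, not something proposed in the thread) is to reject DType.bool explicitly in the constraint. This assumes a Bool argument binds type to DType.bool; if the implicit conversion targets the default dtype instead, the extra check would not fire.

fn foo[type: DType = DType.int64](s: Scalar[type]):
    # Assumption: a Bool argument parameterizes `type` as DType.bool.
    constrained[type != DType.bool, "Bool is not accepted"]()
    constrained[type.is_integral(), "Expected an integral"]()
    print(s)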

@JoeLoser (Collaborator) commented Jul 24, 2024

It would be nice to provide an Arithmetic or Numeric trait, for example. One tricky part is that you want this to be generic based on the associated type, right? Consider something like an Addable trait which requires a type to implement the __add__ dunder method. What should the return type be as mandated by the trait? You can't just force it to be Int or something, since you'd want both Int and UInt to conform to Addable. However, they return their respective types, e.g. Int's __add__ returns an Int and UInt's returns a UInt. So we need some more powerful language features (associated types is what I think @jeff and others have been calling this) in order to do this right.
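
For illustration, the shape being asked for might look something like the sketch below; this Addable trait is hypothetical (not the stdlib's), and getting Int, UInt, and every Scalar to conform while each returns its own type is exactly where the associated-type machinery comes in.

trait Addable(Movable):
    # Hypothetical trait: each conforming type returns its own type from
    # __add__ (Movable lets the result be moved out of a generic function).
    fn __add__(self, rhs: Self) -> Self: ...

fn double[T: Addable](x: T) -> T:
    # Generic over any conforming type; the same definition would serve
    # Int, UInt, or an integral Scalar once they conform.
    return x + x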
