Support functions as layers #188
Conversation
This change adds two pieces of new functionality. First, it
introduces the `(Locked|Local)?FuncRepository` classes; these can be
used to extend a layer with a kernel function. For instance, a layer
like
```
import torch
import torch.nn.functional as F
from torch import nn
from kernels import use_kernel_forward_from_hub

@use_kernel_forward_from_hub("SiluAndMul")
class SiluAndMul(nn.Module):
    def forward(self, input: torch.Tensor) -> torch.Tensor:
        d = input.shape[-1] // 2
        return F.silu(input[..., :d]) * input[..., d:]
```
can now also be kernelized using a function `silu_and_mul` from the
Hub:
```
with use_kernel_mapping({
    "SiluAndMul": {
        "cuda": FuncRepository(
            repo_id="kernels-community/activation",
            func_name="silu_and_mul",
        ),
    }
}):
    kernelize(...)
```
This makes it easier to kernelize pure layers (layers that do not use
module state), since the Hub kernel does not have to provide a `layers`
Python module with wrappers.
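For context, this is roughly the wrapper such a kernel would otherwise have to ship in its `layers` module. This is a hypothetical sketch: the `_ops` import and the out-parameter convention of `ops.silu_and_mul` are assumptions about how a kernel exposes its compiled ops.
```
# Hypothetical layers.py that a Hub kernel would otherwise need to ship.
import torch
from torch import nn

from ._ops import ops  # assumed: the kernel's compiled op bindings


class SiluAndMul(nn.Module):
    def forward(self, input: torch.Tensor) -> torch.Tensor:
        d = input.shape[-1] // 2
        out = torch.empty(
            input.shape[:-1] + (d,), dtype=input.dtype, device=input.device
        )
        ops.silu_and_mul(out, input)  # assumed out-parameter convention
        return out
```
With a `FuncRepository`, none of this wrapper code is needed.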
Secondly, we introduce a decorator `use_kernel_func_from_hub` that turns
functions into layers that can be kernelized. For example:
```
@use_kernel_func_from_hub("silu_and_mul")
def silu_and_mul(x: torch.Tensor) -> torch.Tensor:
    d = x.shape[-1] // 2
    return F.silu(x[..., :d]) * x[..., d:]
```
will implicitly create an instance of the following class:
```
class Func(nn.Module):
    # We add some magic to preserve the function's signature.
    def forward(self, *args, **kwargs):
        # `func` is the original, undecorated function.
        return func(*args, **kwargs)
```
Due to the `__call__` implementation of `nn.Module`, the instance
still behaves as a function:
```
out = silu_and_mul(x)
```
However, when the function is used as a member of an `nn.Module`,
it will be kernelized:
```
class FeedForward(nn.Module):
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # Note: silu_and_mul is a Torch module.
        self.silu_and_mul = silu_and_mul

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.silu_and_mul(self.linear(x))
```
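A minimal usage sketch of kernelizing such a model. This assumes the mapping from the earlier example is registered, and that `kernelize` accepts `mode` and `device` keyword arguments as in the existing kernels API:
```
from kernels import Mode, kernelize

model = FeedForward(1024, 2048).to("cuda")
# silu_and_mul is an nn.Module member, so kernelize can replace it with
# the Hub function registered for "silu_and_mul" in the kernel mapping.
model = kernelize(model, mode=Mode.INFERENCE, device="cuda")
out = model(torch.randn(8, 1024, device="cuda"))  # out has 1024 features
```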
MekkCyber left a comment:
Amazing work 🫡 !
I'm still a bit confused about why we didn't need to change how `kernelize` works for the function mapping, for example here: https://github.com/huggingface/kernels/blob/main/src/kernels/layer/layer.py#L463.
```
kernel_layer_mapping = {
    "SiluAndMul": {
        "cuda": FuncRepository(
            repo_id="kernels-community/activation",
            func_name="silu_and_mul",
        ),
    }
}
```
Awesome!!!
```
# Use function signature with args prepended by self to support
# module validation.
func_sig = inspect.signature(func)
new_args = [Parameter("self", Parameter.POSITIONAL_OR_KEYWORD)]
new_args.extend(func_sig.parameters.values())
Func.forward.__signature__ = Signature(  # type: ignore[attr-defined]
    parameters=new_args,
    return_annotation=func_sig.return_annotation,
)
```
So calling kernelize won’t cause any issues from replacing a method with a function, right?
Indeed, we need to fix the signature so that `validate_layer` works.
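For illustration, a quick check of what the adjusted signature reports (the printed output is an assumption, derived from the snippet above):
```
import inspect

# The class-level forward now exposes `self` plus the wrapped function's
# parameters instead of (*args, **kwargs), so signature validation can
# compare it against the kernel layer's forward.
print(inspect.signature(type(silu_and_mul).forward))
# -> (self, x: torch.Tensor) -> torch.Tensor
```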
```
class Func(nn.Module):
    def forward(self, *args, **kwargs):
        return func(*args, **kwargs)
```
Great idea!
Co-authored-by: Mohamed Mekkouri <93391238+MekkCyber@users.noreply.github.com>
Ack'ed by @MekkCyber in a call.