
Making LLRT available as a crate #326

Merged: 28 commits into awslabs:main from the feature/llrt-library branch on Apr 30, 2024

Conversation

@KaanMol (Contributor) commented Apr 3, 2024

Issue #319 (if available)

closes #319

Description of changes

Currently it is pretty "ok"; some naming is made up on the fly, such as the package names or module_builder.

Splitting the AWS parts out into their own package is probably better done in a second PR, since this one is already getting fairly big.

Tasks that might be picked up with this PR:

  • Moving AWS-specific parts out of the core of LLRT.

Checklist

  • Created unit tests in tests/unit and/or in Rust for my feature if needed
  • Ran make fix to format JS and apply Clippy auto fixes
  • Made sure my code didn't add any additional warnings: make check
  • Updated documentation if needed (API.md/README.md/Other)

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@KaanMol (Contributor, Author) commented Apr 9, 2024

@richarddavison, could you take a look and let me know if it aligns with what you had in mind? It is more of a proof of concept. I don't like how I did the module declaration or the bind-to-global part, but at the same time no other solution comes to mind.

@richarddavison (Contributor) left a review comment

I like this approach! Now third-party projects can pull in a lightweight runtime and use the modules they want. Unused modules would then be automatically removed at compile time. Just some minor changes regarding the module builder.

Review thread on src/module_builder.rs (outdated, resolved)
@KaanMol (Contributor, Author) commented Apr 15, 2024

@richarddavison I had another problem: how would we allow passing values from LLRT to the Rust context? Do you maybe have an idea? Deno has a Scope, for example.

@richarddavison (Contributor) commented:

> @richarddavison I had another problem: how would we allow passing values from LLRT to the Rust context? Do you maybe have an idea? Deno has a Scope, for example.

Can you expand a bit on what you mean by passing values? I'm not familiar with Scope in Deno; can you send me documentation?

@KaanMol (Contributor, Author) commented Apr 15, 2024

> > @richarddavison I had another problem: how would we allow passing values from LLRT to the Rust context? Do you maybe have an idea? Deno has a Scope, for example.
>
> Can you expand a bit on what you mean by passing values? I'm not familiar with Scope in Deno; can you send me documentation?

Their documentation is not the best; they don't really have their runtime documented. I haven't looked at it in depth, but maybe you have an idea of how one would do this.

How could one start the runtime, pass the JS function some parameters from Rust, and also get the data returned by the JS function to do something with it after execution?

@richarddavison (Contributor) commented:

> > > @richarddavison I had another problem: how would we allow passing values from LLRT to the Rust context? Do you maybe have an idea? Deno has a Scope, for example.
> >
> > Can you expand a bit on what you mean by passing values? I'm not familiar with Scope in Deno; can you send me documentation?
>
> Their documentation is not the best; they don't really have their runtime documented. I haven't looked at it in depth, but maybe you have an idea of how one would do this.
>
> How could one start the runtime, pass the JS function some parameters from Rust, and also get the data returned by the JS function to do something with it after execution?

This is very simple. For example, to call JS from Rust:

// JS (evaluated in the runtime):
globalThis.anyJsObjectOnGlobalScope = (arg) => {
  return arg;
};

// Rust:
let global_js_fn: Function = ctx.globals().get("anyJsObjectOnGlobalScope")?;
let result_string: String = global_js_fn.call(("hello",))?;

Or to call Rust from JS:

// JS (evaluated in the runtime):
rustFunction("hi");

// Rust (register the closure on the global scope first):
ctx.globals().set("rustFunction", Func::from(|value: String| {
    // returns the same value
    value
}))?;
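For context, here is a self-contained sketch of both directions written directly against rquickjs (the QuickJS bindings that LLRT builds on). The names echo and rustFunction are illustrative, and the API surface the LLRT crate ends up exposing may differ:

use rquickjs::{prelude::Func, Context, Function, Runtime};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let runtime = Runtime::new()?;
    let context = Context::full(&runtime)?;

    context.with(|ctx| -> rquickjs::Result<()> {
        // Call JS from Rust: define a function on the global scope, fetch it, call it.
        ctx.eval::<(), _>("globalThis.echo = (arg) => arg;")?;
        let echo: Function = ctx.globals().get("echo")?;
        let result: String = echo.call(("hello",))?;
        assert_eq!(result, "hello");

        // Call Rust from JS: register a Rust closure as a global, then invoke it from a script.
        ctx.globals().set("rustFunction", Func::from(|value: String| value))?;
        let roundtrip: String = ctx.eval(r#"rustFunction("hi")"#)?;
        assert_eq!(roundtrip, "hi");

        Ok(())
    })?;

    Ok(())
}

This covers the question above: the runtime is started from Rust, parameters are passed into a JS function, and the JS return value comes back into Rust after execution.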

@KaanMol (Contributor, Author) commented Apr 17, 2024

> > > > @richarddavison I had another problem: how would we allow passing values from LLRT to the Rust context? Do you maybe have an idea? Deno has a Scope, for example.
> > >
> > > Can you expand a bit on what you mean by passing values? I'm not familiar with Scope in Deno; can you send me documentation?
> >
> > Their documentation is not the best; they don't really have their runtime documented. I haven't looked at it in depth, but maybe you have an idea of how one would do this.
> > How could one start the runtime, pass the JS function some parameters from Rust, and also get the data returned by the JS function to do something with it after execution?
>
> This is very simple. For example, to call JS from Rust:
>
> // JS (evaluated in the runtime):
> globalThis.anyJsObjectOnGlobalScope = (arg) => {
>   return arg;
> };
>
> // Rust:
> let global_js_fn: Function = ctx.globals().get("anyJsObjectOnGlobalScope")?;
> let result_string: String = global_js_fn.call(("hello",))?;
>
> Or to call Rust from JS:
>
> // JS (evaluated in the runtime):
> rustFunction("hi");
>
> // Rust (register the closure on the global scope first):
> ctx.globals().set("rustFunction", Func::from(|value: String| {
>     // returns the same value
>     value
> }))?;

Oh, that's amazing!

@KaanMol (Contributor, Author) commented Apr 17, 2024

@richarddavison I just implemented LLRT as a test in my own project; it brought the time back from around 2/3 sec with Deno down to around 100 ms. I also ran LLRT's benchmarks and didn't notice a change before and after these changes :)

@richarddavison (Contributor) commented:

> @richarddavison I just implemented LLRT as a test in my own project; it brought the time back from around 2/3 sec with Deno down to around 100 ms. I also ran LLRT's benchmarks and didn't notice a change before and after these changes :)

Have you thought about moving out the Lambda-specific implementations we do? For instance, the console contains a lot of Lambda-specific code, even at its core (for example the treatment of newlines). I suspect the best thing right now would be to feature-gate it. Another option could be to pass a config and use a closure for the exposed functions so we can pass config values down to the console argument handling (for example line endings, output, etc.).
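To make the feature-gate option concrete, here is a minimal sketch. The lambda feature name, the function, and the exact newline handling are assumptions for illustration, not code from this PR:

// Assumes a hypothetical `lambda` feature declared in Cargo.toml ([features] lambda = []).
#[cfg(feature = "lambda")]
fn format_console_line(msg: &str) -> String {
    // Lambda-specific behavior, e.g. replacing newlines so a multi-line
    // message stays a single log record.
    msg.replace('\n', "\r")
}

#[cfg(not(feature = "lambda"))]
fn format_console_line(msg: &str) -> String {
    // Generic behavior: leave line endings untouched.
    msg.to_owned()
}

fn main() {
    println!("{}", format_console_line("first line\nsecond line"));
}

The config-plus-closure alternative would move this decision from compile time to runtime, at the cost of carrying the configuration through to the console argument handling.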

@KaanMol (Contributor, Author) commented Apr 19, 2024

> > @richarddavison I just implemented LLRT as a test in my own project; it brought the time back from around 2/3 sec with Deno down to around 100 ms. I also ran LLRT's benchmarks and didn't notice a change before and after these changes :)
>
> Have you thought about moving out the Lambda-specific implementations we do? For instance, the console contains a lot of Lambda-specific code, even at its core (for example the treatment of newlines). I suspect the best thing right now would be to feature-gate it. Another option could be to pass a config and use a closure for the exposed functions so we can pass config values down to the console argument handling (for example line endings, output, etc.).

Yes, as I mentioned previously, I thought of making console utils and implementing the module within LLRT while creating a second module in LLRT-lambda or under some other name. The thing I actually find complicated is that some JS gets compiled for improved start-up times, and it also includes the AWS libraries. Do you know if it would be easy to split that?

@richarddavison (Contributor) commented:

> Yes, as I mentioned previously, I thought of making console utils and implementing the module within LLRT while creating a second module in LLRT-lambda or under some other name. The thing I actually find complicated is that some JS gets compiled for improved start-up times, and it also includes the AWS libraries. Do you know if it would be easy to split that?

Yes, that can easily be removed behind a feature gate, or in this case by removing that logic from the module loader and resolver:

llrt/src/vm.rs, line 225 in bfaa41e: impl Resolver for BinaryResolver {

llrt/src/vm.rs, line 346 in bfaa41e: impl Loader for BinaryLoader {

Right now the binary resolver does some other things as well, so that complexity can probably be refactored a bit.
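As a rough sketch of what dropping that logic could enable for embedders: the runtime can be wired up with rquickjs's stock file-based resolver and loader instead of the binary ones, so no pre-compiled (AWS SDK) bytecode needs to be bundled. This uses rquickjs directly, assumes its loader feature is enabled, and is not LLRT's actual API:

use rquickjs::{
    loader::{FileResolver, ScriptLoader},
    Context, Runtime,
};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let runtime = Runtime::new()?;

    // Resolve and load ES modules from the file system rather than from
    // bytecode embedded in the executable.
    runtime.set_loader(
        FileResolver::default().with_path("./"),
        ScriptLoader::default(),
    );

    let _context = Context::full(&runtime)?;
    // ...declare and evaluate modules as usual; imports now resolve from disk.
    Ok(())
}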

@KaanMol (Contributor, Author) commented Apr 26, 2024

> > Yes, as I mentioned previously, I thought of making console utils and implementing the module within LLRT while creating a second module in LLRT-lambda or under some other name. The thing I actually find complicated is that some JS gets compiled for improved start-up times, and it also includes the AWS libraries. Do you know if it would be easy to split that?
>
> Yes, that can easily be removed behind a feature gate, or in this case by removing that logic from the module loader and resolver:
>
> llrt/src/vm.rs, line 225 in bfaa41e: impl Resolver for BinaryResolver {
>
> llrt/src/vm.rs, line 346 in bfaa41e: impl Loader for BinaryLoader {
>
> Right now the binary resolver does some other things as well, so that complexity can probably be refactored a bit.

So I wonder if it wouldn't be a better idea to start splitting this PR up into multiple parts; it has already gotten very big. Right now I have split LLRT up into two parts, one being the core and the other the executable itself.

How would you feel if we limited the scope of this PR to what it is now, where LLRT is two parts with module selection etc., and after merging I would start splitting the AWS parts away? At least if a big refactor happens somewhere, I would not have to somehow merge it into this newer structure every time.

@KaanMol marked this pull request as ready for review on April 26, 2024 at 21:41
@KaanMol (Contributor, Author) commented Apr 29, 2024

@richarddavison Could you review this once you have time? :)
Splitting out the AWS parts is something I am going to do in its own PR.

@richarddavison (Contributor) left a review comment

Minor nit, but LGTM!

Review thread on pack (outdated, resolved)
@KaanMol (Contributor, Author) commented Apr 30, 2024

> Minor nit, but LGTM!

Should be done now! The failing test should (hopefully) also be addressed; make test-ci now works locally as well.

@richarddavison (Contributor) commented:

@KaanMol can you please resolve the last conflict, then I'll merge! I think we're good now!

@KaanMol (Contributor, Author) commented Apr 30, 2024

@richarddavison Done!

@richarddavison merged commit 0ba28cd into awslabs:main on Apr 30, 2024
5 checks passed
@KaanMol deleted the feature/llrt-library branch on April 30, 2024 at 14:43
Linked issue closed by this pull request: Publishing LLRT as a crate (#319)