feat(vim.net): fetch(), download(), and :e URL in lua #29104
Conversation
To be honest, I think this code still needs a lot of work before it could be merged. Maybe it would be better to start with only …
runtime/doc/lua.txt (Outdated):

    See man://curl for supported protocols. Not all protocols are fully
    tested.

    Please carefully note the option differences with |vim.net.fetch()|, …
What are the differences? At a glance, it doesn't look like they need to be separate functions.
As clason also mentioned, we could start with a minimal version of vim.net.fetch
which has the interface (signature) that we want, but only supports basic "download" behavior.
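As a rough illustration of that idea (a sketch only, assuming `vim.system()` wraps the system curl; the option names `path` and `on_complete` are hypothetical, not the PR's actual API), a minimal `vim.net.fetch()` with room to grow might look like:

```lua
-- Sketch: a minimal fetch() whose signature can later gain methods, headers,
-- etc., but whose first implementation only supports basic downloads.
-- All option names here are illustrative, not the PR's API.
local function fetch(url, opts)
  opts = opts or {}
  local args = { 'curl', '--silent', '--show-error', '--fail', '--location', url }
  if opts.path then
    -- Write the response to a file instead of returning it.
    vim.list_extend(args, { '--output', opts.path })
  end
  vim.system(args, { text = true }, function(out)
    -- out is a vim.SystemCompleted: { code, signal, stdout, stderr }
    if opts.on_complete then
      opts.on_complete(out.code ~= 0 and out.stderr or nil, out.stdout)
    end
  end)
end
```

Starting from a signature like this means later features (methods, bodies, headers) can be added as new fields of `opts` without breaking callers.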
Fetch is much more general, allowing arbitrary requests (including writing, which is needed for :e). Download is a simplified interface for just... downloading a file (but not enough to implement :e over the network). Note that you still need to expose some curl options, since network conditions are very heterogeneous.

So both are needed (if both of these different features are wanted). I agree the name fetch is bad, though, and should be changed to make that clearer -- maybe vim.net.request (we could mark it as private for now).
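One way to picture that layering (purely illustrative; neither signature is final, and every option name here is an assumption) is a general `request()` that exposes method, headers, and body, with `download()` as a thin convenience wrapper over it:

```lua
-- Hypothetical layering sketch, not the PR's actual signatures.
local net = {}

--- General entry point: arbitrary method, headers, and body.
--- Writing (e.g. PUT-ing a buffer back over the network for :e support)
--- is exactly what the simplified download() below cannot do.
function net.request(url, opts)
  opts = opts or {}
  local args = { 'curl', '--silent', '--fail', '--request', opts.method or 'GET' }
  for name, value in pairs(opts.headers or {}) do
    vim.list_extend(args, { '--header', name .. ': ' .. value })
  end
  if opts.body then
    vim.list_extend(args, { '--data-binary', opts.body })
  end
  if opts.outpath then
    vim.list_extend(args, { '--output', opts.outpath })
  end
  table.insert(args, url)
  return vim.system(args, { text = true }, opts.on_exit)
end

--- Thin convenience wrapper: just fetch a URL into a file.
function net.download(url, path, opts)
  return net.request(url,
    vim.tbl_extend('force', opts or {}, { method = 'GET', outpath = path }))
end

return net
```

Under this reading, `download()` carries no logic of its own, which is one argument for why the two could also be collapsed into a single function or overload.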
Fetch is much more general,
Why can't that be an overload? I still don't get why we need two different functions.
Netrw seems to handle …

Also, the Netrw implementation uses multiple commands (…
Whatever curl can easily handle is fine. Or we can start with http[s] and add more later. BTW, please review comments starting from #23461 (comment).
Would this look like providing a …
This was exactly my worry about relying on system curl. I don't know what the minimum baseline we can expect is (I just know it's very low). I think we need to rely on runtime checks here -- support everything that is reasonable (not necessarily in the first PR), but fail gracefully if the local …
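A runtime check along those lines could be sketched as follows (illustrative only; the function names are hypothetical). curl prints a `Protocols:` line in its `--version` output, which can be parsed once and consulted before issuing a request:

```lua
-- Sketch: probe the local curl's capabilities at runtime and fail gracefully
-- for unsupported URL schemes. Function names here are hypothetical.
local function curl_protocols()
  local out = vim.system({ 'curl', '--version' }, { text = true }):wait()
  if out.code ~= 0 then
    return nil, 'curl not available'
  end
  -- curl -V prints a line like: "Protocols: dict file ftp ftps http https ..."
  local line = out.stdout:match('Protocols:%s*([^\n]+)')
  local protos = {}
  for p in (line or ''):gmatch('%S+') do
    protos[p] = true
  end
  return protos
end

local function check_scheme(url)
  local scheme = url:match('^(%a[%w+.-]*)://')
  local protos, err = curl_protocols()
  if not protos then
    return false, err
  end
  if not protos[scheme] then
    return false, ('local curl does not support %s://'):format(scheme)
  end
  return true
end
```

The probe result would presumably be cached, since the local curl does not change between requests in a session.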
    HeaderTable:append({key})                            *HeaderTable:append()*
        Append value to header.
I'm really not a fan of the vim.opt-style custom functions for basic table operations, nor of the HeaderTable object. Could this not be a normal table instead? IIRC even Python's requests package uses a basic dictionary for providing headers.
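A plain-table design as suggested here might look like the following sketch (a hypothetical helper, not part of the PR). Allowing a list value covers the repeated-header case that `HeaderTable:append()` was presumably for, without a custom object:

```lua
-- Sketch: plain Lua table for headers instead of a HeaderTable object.
-- A string value is a single header; a list value repeats the header,
-- e.g. { Accept = 'application/json', Cookie = { 'a=1', 'b=2' } }.
local function header_args(headers)
  local args = {}
  for name, value in pairs(headers or {}) do
    local values = type(value) == 'table' and value or { value }
    for _, v in ipairs(values) do
      -- Each entry becomes a curl "--header 'Name: value'" pair.
      table.insert(args, '--header')
      table.insert(args, ('%s: %s'):format(name, v))
    end
  end
  return args
end
```

This keeps the public surface to ordinary Lua tables, which is also easier to deprecate or extend later than a method-bearing object.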
I wonder if making this API public is premature. Could we not start using this internally for a while, so we have time to iron out any problems that show up? Netrw could still use this under the hood, ofc.
@TheLeoP I really appreciate the effort, but I would like to structure it a bit more. (There's a reason the original PR languished and got closed in the end...) I really don't think trying to do the whole …

So I will close this PR and would like to ask you to open a new, fresh one where you just extract and polish the functionality needed for …

Once we're happy with that, we can build on that with …

Again, this work is welcome, but the process would just be smoother (not least for you) if we do it in multiple self-contained steps.
I saw that #23461 was closed, so I rebased it on top of master and addressed a few of the pending comments. In the coming days I'll continue working on this PR.
Should close #23232