
Providing API to set HTTP2 max concurrent streams #3806

Closed
wants to merge 12 commits

Conversation

kunalekawde

Hello,
I've added an API to set the maximum number of concurrent streams for HTTP/2 connections. Please review.
Thanks
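For illustration, a minimal sketch of how an application might use such an option on a multi handle (the option name comes from this PR's diff, the value 50 is arbitrary, and the final API shape was still under review at this point):

    #include <curl/curl.h>

    int main(void)
    {
      CURLM *multi;

      curl_global_init(CURL_GLOBAL_DEFAULT);
      multi = curl_multi_init();

      /* option proposed in this PR: ask libcurl to allow at most 50
         concurrent streams per HTTP/2 connection */
      curl_multi_setopt(multi, CURLMOPT_MAX_CONCURRENT_STREAMS, 50L);

      /* ... add easy handles with curl_multi_add_handle() and drive the
         transfers with curl_multi_perform() ... */

      curl_multi_cleanup(multi);
      curl_global_cleanup();
      return 0;
    }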

@bagder
Member

bagder commented Apr 25, 2019

Thanks a lot for your contribution! Some initial feedback and questions:

  1. Try make checksrc locally first to find the nits the code style checker flags. It saves you time.
  2. How about coming up with a name for the option that would also work for HTTP/3?
  3. What happens if this is set when there are existing connections already set up? (This goes in the documentation.)
  4. Any min/max values? Users will need some guidance here. What happens if you set the max to 20K or 128G?

@kunalekawde
Author

@bagder, thanks for the feedback and the important points you shared; I shall check these.
PS: This is my first open-source contribution via git, so please pardon any misses. I'm open to learning :)

@bagder
Member

bagder commented Apr 26, 2019

@kunalekawde, my comments are in no way a critique of your approach or the process you're following. They're only meant to help make this an even better PR and get it into a shape that we can end up merging. I really appreciate you helping us out. There's also no hurry: according to our merge rules, we can't merge this feature/change into master until after the pending release anyway, so we have a few weeks to polish it to perfection!

I'm honored you're dipping your toes into open source for real for the first time with us, and I hope we won't discourage you from doing more of it going forward! 😄

lib/multi.c Outdated
@@ -3043,3 +3046,8 @@ void Curl_multi_dump(struct Curl_multi *multi)
}
}
#endif

uint32_t Curl_multi_max_http2_concurrent_streams(struct Curl_multi *multi)
Member

Inconveniently enough, we don't use uint32_t in our code (and it causes compiler errors in some places). Stick to good old unsigned int and you'll get an unsigned 32 bit variable.

Author

Sure, thanks. I had moved it to long, following other examples. I shall check on this.
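To illustrate the suggestion above, a sketch of the getter declared with unsigned int instead of uint32_t (the function name is taken from the diff; the field type and the default of 100 are assumptions, not the final code):

    unsigned int Curl_multi_max_http2_concurrent_streams(struct Curl_multi *multi)
    {
      /* return the configured limit, falling back to an assumed default of 100 */
      return multi->max_http2_concurrent_streams ?
             multi->max_http2_concurrent_streams : 100;
    }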

@kunalekawde
Author

kunalekawde commented May 15, 2019

@bagder
1. How about coming up with a name for the option that would also work for HTTP/3?
-- Yes, I did a quick read-through of QUIC/HTTP3 and changed the name from max_http2_concurrent_streams to max_concurrent_streams, basically removing http2 from it.

2. What happens if this is set when there are existing connections already set up? (goes to the documentation)
-- Yes, I've updated the documentation for the default, min and max, but the documentation still needs to be updated with the target libcurl version.

3. Any min/max values? Users will need some guidance here. What happens if you set max to 20K or 128G?
-- I've put an upper limit of 2^32 - 1 (I referred to other implementations' limits).

@kunalekawde
Author

@bagder, from the build logs I don't see any issue with the code; it looks like a timeout and an FTP test case failed. Any pointers on what to look at?

@jay
Member

jay commented Jul 11, 2019

CIs have arbitrary slowdowns which cause some of our tests to fail sometimes. I've restarted the Travis and Cirrus CIs but not AppVeyor, since I can't figure out how. (For some reason my AppVeyor shows jay/curl and not curl/curl, even though when I go into the AppVeyor settings I see it set to curl/curl, so I'm not sure what that's about. I'll have to ask them.)

@IlyaFinkelshteyn

@jay answered here. After you are added as a Collaborator, I would recommend the RE-RUN INCOMPLETE button to save some time.

@kunalekawde
Author

Thanks @jay and @IlyaFinkelshteyn for the support. So as I understand it, I need to wait for AppVeyor to be restarted, after which I need to push some dummy change from my repo, or will the builds run automatically?

@IlyaFinkelshteyn

@kunalekawde pushing some dummy change will work, but it will restart all the other CIs and all AppVeyor jobs. Instead, someone who has permissions on the AppVeyor project can press RE-RUN INCOMPLETE to restart just the single failed job. CC @bagder

@kunalekawde
Author

Thanks @IlyaFinkelshteyn. Looks like it was rerun, as only coverage is failing; I shall check this.

@@ -382,6 +382,9 @@ typedef enum {
/* This is the argument passed to the server push callback */
CINIT(PUSHDATA, OBJECTPOINT, 15),

/* maximum number of (pipelining) connections to one host */
CINIT(MAX_CONCURRENT_STREAMS, LONG, 16),
Member

comment is wrong

Author

Thanks, updated

@@ -434,6 +437,9 @@ typedef int (*curl_push_callback)(CURL *parent,
struct curl_pushheaders *headers,
void *userp);

/* value for MAXIMUM CONCURRENT STREAMS upper limit */
#define INITIAL_MAX_CONCURRENT_STREAMS ((1U << 31) - 1)
Member

suggest DEFAULT instead of INITIAL like DEFAULT_MAX_CONCURRENT_STREAMS

Member

Never mind, I see why you called it INITIAL_MAX_CONCURRENT_STREAMS.

lib/multi.c Outdated (resolved)
@@ -1821,6 +1821,8 @@
d c 00007
d CURLMOPT_MAX_PIPELINE_LENGTH...
d c 00008
d CURLMOPT_MAX_CONCURRENT_STREAMS...
d c 00009
Member

This is wrong; it has to be appended:
d CURLMOPT_MAX_CONCURRENT_STREAMS...
d c 00016

Author

     d  CURLMOPT_PUSHDATA...
     d                 c                   10015
     d  CURLMOPT_MAX_CONCURRENT_STREAMS...
     d                 c                   10016
      *
      * Bitmask bits for CURLMOPT_PIPELING.
      *

Is this correct? Or is the original placement correct and only the value needs to be updated to 00016?

docs/examples/http2-download.c Outdated (resolved)
Member

@bagder left a comment

I think this is getting really close to being ready for merge. Some comments I thought about when reading this again just now!

.SH DESCRIPTION
Pass a long indicating the \fBmax\fP. The set number will be used as the
maximum number of concurrent streams for connections that libcurl should
support.

Valid values range from 1 to 2147483647 (2^31 - 1) and defaults to 100.
This option is for the multi handle's use only.
Member

I don't think this needs to say it is for multi handles only. The option is only for a single function and that function is for a multi handle.

I think the description needs to say something about this value being a request and that there's no guarantee that the set value will actually be used.

Author

I don't think this needs to say it is for multi handles only. The option is only for a single function and that function is for a multi handle.

Agree

I think the description needs to say something about this value being a request and that there's no guarantee that the set value will actually be used.

Actually, upstream nghttp2 doesn't mention anything like this. Maybe I can just note that it is the maximum value and that the actual concurrency would depend on the transfer traffic rate.
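For background, the value eventually gets advertised to the peer in an HTTP/2 SETTINGS frame. A rough sketch at the nghttp2 level (illustrative of the mechanism only, not this PR's actual code; the helper name and surrounding context are assumed):

    #include <nghttp2/nghttp2.h>

    static int advertise_max_streams(nghttp2_session *session,
                                     uint32_t max_streams)
    {
      nghttp2_settings_entry iv[1];
      iv[0].settings_id = NGHTTP2_SETTINGS_MAX_CONCURRENT_STREAMS;
      iv[0].value = max_streams;

      /* queues a SETTINGS frame; the peer enforces this limit on the streams
         it opens towards us, so the effective concurrency also depends on
         what the peer advertises and on the traffic itself */
      return nghttp2_submit_settings(session, NGHTTP2_FLAG_NONE, iv, 1);
    }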

docs/libcurl/symbols-in-versions Outdated (resolved)
lib/multi.c Outdated
(multi->max_http2_concurrent_streams?
multi->max_http2_concurrent_streams:100) : 0;
(multi->max_concurrent_streams?
multi->max_concurrent_streams:100) : 0;
Member

multi->max_concurrent_streams is a long and this returns a size_t? On some platforms, like Windows, they are not of equal size.

Author

It's the same for max_host_connections and max_total_connections. Maybe I can cast it to size_t, which would adjust according to the architecture used.
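A small sketch of the cast being discussed (simplified from the diff above; the NULL-multi branch is left out, and the default of 100 comes from the documentation excerpt earlier in this review):

    size_t Curl_multi_max_concurrent_streams(struct Curl_multi *multi)
    {
      /* the stored option is a long; cast explicitly when returning size_t */
      return multi->max_concurrent_streams ?
             (size_t)multi->max_concurrent_streams : 100;
    }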

lib/multi.c Outdated
{
long streams = va_arg(param, long);
multi->max_concurrent_streams = (streams>INITIAL_MAX_CONCURRENT_STREAMS)?
INITIAL_MAX_CONCURRENT_STREAMS : streams;
Member

This will allow negative values to be set. I presume that isn't intended? And zero.

Author

Agree, updated
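For reference, the kind of range handling discussed above could look roughly like this (an illustrative helper, not the PR's actual code; the name clamp_max_streams is made up, and whether out-of-range values should be clamped or rejected with an error is a separate design choice):

    /* clamp a user-supplied value into the accepted range
       1 .. INITIAL_MAX_CONCURRENT_STREAMS, falling back to the documented
       default of 100 for zero or negative input */
    static long clamp_max_streams(long streams)
    {
      if(streams < 1)
        return 100;
      if(streams > INITIAL_MAX_CONCURRENT_STREAMS)
        return INITIAL_MAX_CONCURRENT_STREAMS;
      return streams;
    }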

* 'master' of https://github.com/kunalekawde/curl: (436 commits)
  ngtcp2: use ngtcp2_version() to get the run-time version
  ngtcp2: move the h3 initing to immediately after the rx key
  quiche: register debug callback once and earlier
  ssh: add a generic Curl_ssh_version function for SSH backends
  base64: check for SSH, not specific SSH backends
  vssh: move ssh init/cleanup functions into backend code
  vssh: create directory for SSH backend code
  TODO/ROADMAP: remove "refuse downgrade redirects" and HTTP/3
  RELEASE-NOTES: synced
  travis: add a quiche build
  http: fix use of credentials from URL when using HTTP proxy
  tests: Replace outdated test case numbering documentation
  travis: reduce number of torture tests in 'coverage'
  configure: use -lquiche to link to quiche
  ngtcp2: provide the callbacks as a static struct
  ngtcp2: add missing nghttp3_conn_add_write_offset call
  ngtcp2: deal with stream close
  ngtcp2: Consume QUIC STREAM data properly
  ngtcp2: don't reinitialize SSL on Retry
  multi: getsock improvements for QUIC connecting
  ...
@kunalekawde
Author

@bagder, @jay, any help or pointers on this?

For the Travis build: configure: error: --with-quiche was specified but could not find quiche pkg-config file.
For AppVeyor: not sure if it's test case test1501.

@bagder
Member

bagder commented Aug 22, 2019

If you rebase your branch the quiche problem will go away as it is already fixed.

Note that we strongly recommend rebase, not merging master into your branch, as it makes your work much easier to review.

@kunalekawde
Author

Thanks @bagder
I tried the following in my local clone:

$ git rebase origin/master
Current branch master is up to date.

$ git status
On branch master
Your branch is up-to-date with 'origin/master'.

nothing to commit, working tree clean

[kunal_ekawde@cni-builder-15-15-34 curl]$ git pull
Already up-to-date.

No changes were pulled in.

@bagder
Member

bagder commented Aug 26, 2019

  1. Make sure your origin/master is in sync with the upstream (curl/curl) master branch.
  2. Check out your local feature branch and rebase it onto your origin/master.
  3. Force-push the rebased feature branch to your fork (see the command sketch below).
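A concrete command sequence for those three steps might look like this (assumptions: your fork is the origin remote, curl/curl has been added as an upstream remote, and the feature work lives on its own branch, here called max-concurrent-streams):

    # 1. bring your fork's master in sync with curl/curl master
    git fetch upstream
    git checkout master
    git merge --ff-only upstream/master
    git push origin master

    # 2. rebase the feature branch onto the updated master
    git checkout max-concurrent-streams
    git rebase master

    # 3. force-push the rebased feature branch to your fork
    git push --force-with-lease origin max-concurrent-streams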

@kunalekawde
Author

Thanks @bagder, apologies for the delay as I was on vacation.
I had to play around with git and I'm not sure if it turned out as expected. Somehow I ended up creating a branch.

$ git push --set-upstream origin max-concurrent-streams -f
Total 0 (delta 0), reused 0 (delta 0)
remote:
remote: Create a pull request for 'max-concurrent-streams' on GitHub by visiting:
remote:      https://github.com/kunalekawde/curl/pull/new/max-concurrent-streams
remote:
To https://github.com/kunalekawde/curl.git
 * [new branch]          max-concurrent-streams -> max-concurrent-streams
Branch max-concurrent-streams set up to track remote branch max-concurrent-streams from origin.

$ git branch -vv
  master                 046e1fbd6 [origin/master] Merge branch 'master' of https://github.com/kunalekawde/curl
* max-concurrent-streams 046e1fbd6 [origin/max-concurrent-streams] Merge branch 'master' of https://github.com/kunalekawde/curl

Do you think this is correct, and what minimal steps are required to finish this? Or could you suggest an alternate way?

@kunalekawde
Author

Sorry, I'm not able to get the git commands right. Maybe the only way I see here is to create a draft pull request as suggested so that the changes/diff can be seen and absorbed. Please let me know, so I can at least close this pull request.

@jay
Member

jay commented Sep 24, 2019

It's because the branch of the PR is set to master. It's not good to work in master; typically you make a branch for the PR, which it looks like you did later (max-concurrent-streams). There might be a way to change the branch associated with this PR, but I think it would be easier to just open a new PR and reference this one.

@kunalekawde
Author

kunalekawde commented Sep 24, 2019

Thanks @jay, yes, I later realized that a branch was required earlier. Let me create a new pull request.

lock bot locked as resolved and limited conversation to collaborators Dec 23, 2019