
moving to upstream parboiled2 dependency #14

Merged
Merged 1 commit into apache:main on Feb 23, 2023
Conversation

@luksow (Contributor) commented Dec 10, 2022

Hi,

since #1 was rather easy, I took a stab at it. The parboiled2 interface has changed a bit over time, so more changes were needed than just the imports.

One thing to consider is removing akka-parsing completely, but I think that was partially sorted out in the https://github.com/apache/incubator-pekko-http/tree/scala-3 branch

@pjfanning (Contributor)

Thanks for the PR. We still need to decide if we want to make this change yet. For v1.0.0, we generally want minimal changes.
The scala-3 branch is shaping up to be the basis of the v1.1.0 release, and we may end up targeting this work to that release and possibly that branch.
The scalafmt checks are failing, and it would be a good idea to change the order of the imports so that the org.parboiled imports are not mixed in with the Akka imports.

@luksow (Contributor, Author) commented Dec 11, 2022

@pjfanning thanks for your comments. I did scalafmt and also reorganized imports so that org.parboiled2 is always after akka's.

Once you decide the fate of v1.0.0, let me know. I'm happy to target v1.1.0 or the scala-3 branch instead.

@pjfanning (Contributor)

@mdedetrich, @jrudolph, @nvollmar - I think we should consider treating the scala-3 branch as the v1.1.0 branch. If this PR was retargeted to the scala-3 branch, would any of you support approving the PR?

@nvollmar

I think that would make sense.

@mdedetrich (Contributor)

I would also support putting this into 1.1.x; I will retarget the PR when the 1.1.x branch is out. @luksow Thanks a lot for the work, much appreciated!

@pjfanning (Contributor)

@luksow would you consider updating this PR to work with the latest main branch? The akka package classes have moved to org.apache.pekko. The main branch should be more stable now.

We are actually thinking about merging this into main for the v1.0.0 release.

@mdedetrich (Contributor) commented Feb 14, 2023

> @luksow would you consider updating this PR to work with the latest main branch? The akka package classes have moved to org.apache.pekko. The main branch should be more stable now.
>
> We are actually thinking about merging this into main for the v1.0.0 release.

I was also going to mention this. There are still a couple more big structural changes that need to be done (i.e. the configuration change from akka to pekko, as well as renaming types that have akka in them). I don't think these changes will cause further merge conflicts for this kind of PR, but I am going to focus on this for the week.

@@ -108,25 +108,25 @@ private[http] object HeaderParser {

object EmptyCookieException extends SingletonException("Cookie header contained no parsable cookie values.")

def lookupParser(headerName: String, settings: Settings = DefaultSettings): Option[String => HeaderParser#Result] =
dispatch.lookup(headerName).map { runner => (value: String) =>
Contributor

We should keep this optimization; maybe we have to keep our own version of DynamicRuleDispatch?

Contributor

Maybe. Let's try to upstream the optimization and go back to the simpler version for now (which adds an extra hashmap lookup per header parsed).

Contributor

Makes sense to me

Contributor

> Maybe. Let's try to upstream the optimization and go back to the simpler version for now (which adds an extra hashmap lookup per header parsed).

If we do this, then let's make an issue for it.

Contributor

Created #45
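The trade-off discussed in this review thread can be sketched roughly as follows. This is a hypothetical illustration only; the names (`DispatchSketch`, `Runner`, `parsers`) are stand-ins, not the real pekko-http or parboiled2 API:

```scala
// Hypothetical sketch of the two dispatch styles discussed above.
object DispatchSketch {
  type Runner = String => Either[String, String]

  // Stand-in for the generated header-parser table.
  private val parsers: Map[String, Runner] =
    Map("cache-control" -> (value => Right(s"parsed: $value")))

  // Optimized path (DynamicRuleDispatch-style): resolve the runner once,
  // then reuse the captured closure for every header value.
  def lookupOnce(headerName: String): Option[Runner] =
    parsers.get(headerName)

  // Simpler path: resolve on every call, paying one extra hash lookup
  // per header parsed.
  def parseEveryTime(headerName: String, value: String): Either[String, String] =
    parsers.get(headerName) match {
      case Some(run) => run(value)
      case None      => Left(s"no parser for: $headerName")
    }
}
```

With `lookupOnce`, the map lookup happens only when the parser is resolved; `parseEveryTime` repeats it for every header value, which is the small per-header cost the thread refers to.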

@jrudolph (Contributor)

I added a merge commit to fix the conflicts.

@mdedetrich (Contributor)

> I added a merge commit to fix the conflicts.

Thanks, I just quickly checked the PR and I don't see any class files containing Akka being modified, so there shouldn't be any major merge conflicts later.

import pekko.http.scaladsl.model.headers._
import CacheDirectives._

private[parser] trait CacheControlHeader { this: Parser with CommonRules with CommonActions with StringBuilding =>
private[parser] trait CacheControlHeader { this: HeaderParser =>
Contributor

Somehow the Scala compiler got confused here about the more restricted self-type; not sure what the issue is, but this works for now.
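The self-type change above can be illustrated with a minimal sketch. The trait bodies here are hypothetical; only the shape of the self-types mirrors the change in the diff:

```scala
// Stand-ins for the helper parsing traits.
trait CommonRulesSketch { def token: String = "token" }
trait StringBuildingSketch { def build(s: String): String = s.toUpperCase }

// Aggregate trait standing in for HeaderParser, which already mixes in
// all the helper traits.
trait HeaderParserSketch extends CommonRulesSketch with StringBuildingSketch

// Before: the self-type enumerated every helper trait individually.
trait CacheControlBefore { this: CommonRulesSketch with StringBuildingSketch =>
  def rule: String = build(token)
}

// After: the self-type names the aggregate instead, which sidesteps the
// compiler confusion mentioned in the comment above.
trait CacheControlAfter { this: HeaderParserSketch =>
  def rule: String = build(token)
}

// Any concrete mixin of the aggregate satisfies the self-type.
object SelfTypeDemo extends HeaderParserSketch with CacheControlAfter
```

Both self-types grant the trait access to the same members; the "after" form just expresses the requirement through the single aggregate type.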

@@ -110,11 +110,11 @@ private[http] object HeaderParser {
object EmptyCookieException extends SingletonException("Cookie header contained no parsable cookie values.")

def lookupParser(headerName: String, settings: Settings = DefaultSettings): Option[String => HeaderParser#Result] =
dispatch.lookup(headerName).map { runner => (value: String) =>
import pekko.parboiled2.EOI
Some { (value: String) =>
Contributor

I changed that to keep the old signature, even if the new one might "lie": it may still fail later with a RuleNotFound exception (but only if we add ModeledHeaders without an accompanying, correctly named parser).
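The way such a signature can "lie" might be sketched like this. All names here (`LookupSketch`, `knownRules`) are simplified stand-ins for illustration, not the actual pekko-http code:

```scala
// Sketch of a lookup that always returns Some(...), deferring the
// missing-rule failure to the moment the returned function is applied.
object LookupSketch {
  final case class RuleNotFound(headerName: String)
      extends RuntimeException(s"rule not found: $headerName")

  // Stand-in for the set of header names with generated parser rules.
  private val knownRules = Set("cache-control", "cookie")

  def lookupParser(headerName: String): Option[String => Either[Throwable, String]] =
    Some { value =>
      if (knownRules(headerName)) Right(s"$headerName: $value")
      else Left(RuleNotFound(headerName)) // failure surfaces only at call time
    }
}
```

The `Option` suggests that a `None` signals "no parser", but since the result is always `Some`, an unknown header is only detected when the function runs, matching the caveat in the comment above.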

@jrudolph (Contributor) left a comment

lgtm but has to wait for a decision in #1.

@jrudolph (Contributor)

(Restored the test fixes I accidentally reverted from @luksow's state)

@mdedetrich mdedetrich modified the milestones: 1.1.0, 1.0.0 Feb 14, 2023
@luksow (Contributor, Author) commented Feb 14, 2023

Wow, you're moving quickly! I barely had a chance to look and all the work is already done :) Glad to see it moving forward.

LGTM.

@mdedetrich (Contributor)

> Wow, you're moving quickly! I barely had a chance to look and all the work is already done :) Glad to see it moving forward.

Yeah, this is a common problem we have...

There is still one issue, though: the PR cannot be merged because it needs to be rebased due to conflicts (at least that's what GitHub is complaining about).

@luksow (Contributor, Author) commented Feb 14, 2023

@mdedetrich: here it says it's good to go 🤔
(screenshot)

@mdedetrich (Contributor)

I need to download your version of GitHub :D

(screenshot)

@spangaer (Contributor)

Tie breaker
(screenshot)

Maybe you should try light mode 😉

Perhaps it's resolved by now; I saw the same thing yesterday evening, but it's not the first time I've noticed GitHub being "eventually consistent".

@mdedetrich (Contributor) commented Feb 15, 2023

So I noticed that on my phone it's fine and on the desktop it's still broken, so this is probably the most liberal definition of "eventual" that I have seen in a while.

Anyway, it just means someone else will have to merge.

@jrudolph (Contributor)

The reason is probably that you selected "rebase and merge" which isn't possible cleanly due to my merge commit. But we can squash here.

@mdedetrich (Contributor) commented Feb 15, 2023

> The reason is probably that you selected "rebase and merge" which isn't possible cleanly due to my merge commit. But we can squash here.

You are right. This is a case of UI ergonomics: the drop-down button that lets you change the merge strategy was kind of greyed out (implying it was disabled), and mixing that with the dark theme probably did not help.

@mdedetrich (Contributor) left a comment

lgtm

@pjfanning (Contributor)

Could we hold off for a couple of days? On #1 it has been suggested that we run a perf test. That may be over-cautious, but there is no harm if someone has time to run the benchmarks in this project (with and without this change).

@mdedetrich (Contributor)

Sure

@mdedetrich mdedetrich added the upstream Blocked by upstream for some reason label Feb 16, 2023
@mdedetrich (Contributor)

As per discussion at #1 (comment) I have applied the upstream label (indicating that the PR is waiting for some upstream changes).

@jrudolph (Contributor)

Cleaned up the commits into a single one based on top of current master.

@jrudolph (Contributor)

> As per discussion at #1 (comment) I have applied the upstream label (indicating that the PR is waiting for some upstream changes).

I don't think we should hold off for any upstream changes; as discussed above and in #1, the performance hit is currently minor and can partly be solved without upstream changes.

@pjfanning (Contributor) left a comment

lgtm - the perf hit is small

@pjfanning (Contributor)

@luksow this change is large enough that we probably need a CLA. If you don't have one already, could you get an iCLA as documented in https://www.apache.org/licenses/contributor-agreements.html ?

@jrudolph (Contributor)

> @luksow this change is large enough that we probably need a CLA. If you don't have one already, could you get an iCLA as documented in apache.org/licenses/contributor-agreements.html ?

AFAIU, we have to collect CLAs from every contributor. The problem here is that if we don't get a CLA, we will have to resubmit the exact same PR with a new author, because the change is so trivial that there is no alternative to these exact changes. After all, only 66 lines were updated or added here.

@pjfanning (Contributor)

In practice, ASF projects don't collect CLAs from every user who submits a PR. There is no magic criterion for what makes a PR significant enough to require one, but the PR changes 70 files, even if the changes are not very large.

Let's see what @luksow says; if he doesn't have time to sort out the CLA, we will probably have to look at alternatives.

@luksow (Contributor, Author) commented Feb 23, 2023

@pjfanning @jrudolph I've just signed and sent the ICLA.

@mdedetrich (Contributor)

> In practice, ASF projects don't collect CLAs from every user who submits a PR. There is no magic criterion for what makes a PR significant enough to require one, but the PR changes 70 files, even if the changes are not very large.

My understanding is also the same: an ICLA is required for significant contributions, but the definition of "significant" is entirely subjective.

@jrudolph jrudolph merged commit 26846a0 into apache:main Feb 23, 2023
@pjfanning pjfanning added the release notes should be mentioned in release notes label Jun 11, 2023
Labels
release notes should be mentioned in release notes upstream Blocked by upstream for some reason

6 participants