
chore(deps): update otel-core experimental to ^0.47.0 #1906

Conversation

pichlermarc
Member

Which problem is this PR solving?

  • fix(deps): update otel core experimental to ^0.47.0 #1905 is failing due to the package-lock.json being only partially updated, causing 1.19.0 to be pulled in. This PR fixes that by re-generating package-lock.json.
  • I also had to bump @types/koa to 2.14.0 because the previously pinned version became incompatible with the newer one pulled in by another package.


codecov bot commented Jan 17, 2024

Codecov Report

Merging #1906 (3185bef) into main (f65f2f1) will increase coverage by 0.02%.
Report is 1 commit behind head on main.
The diff coverage is 82.35%.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1906      +/-   ##
==========================================
+ Coverage   91.51%   91.54%   +0.02%     
==========================================
  Files         145      145              
  Lines        7427     7435       +8     
  Branches     1486     1487       +1     
==========================================
+ Hits         6797     6806       +9     
+ Misses        630      629       -1     
Files Coverage Δ
...try-instrumentation-ioredis/src/instrumentation.ts 91.48% <87.50%> (+1.27%) ⬆️
...lemetry-instrumentation-koa/src/instrumentation.ts 96.62% <87.50%> (+1.22%) ⬆️
...try-instrumentation-redis-4/src/instrumentation.ts 80.09% <77.77%> (-0.11%) ⬇️

@trentm
Contributor

trentm commented Jan 17, 2024

@pichlermarc I see it is failing in koa ESM tests. Let me know if you'd like help debugging that ... assuming it isn't some package-lock shenanigan. The package-lock update here is updating a lot of other deps as well (e.g. aws-sdk deps, npmcli deps, etc.). I'm not sure if that was intentional.

@pichlermarc
Member Author

pichlermarc commented Jan 19, 2024

@pichlermarc I see it is failing in koa ESM tests. Let me know if you'd like help debugging that ... assuming it isn't some package-lock shenanigan. The package-lock update here is updating a lot of other deps as well (e.g. aws-sdk deps, npmcli deps, etc.). I'm not sure if that was intentional.

Yes, I had to re-generate package-lock.json, as npm install --package-lock-only was holding back the @opentelemetry/* stable packages at 1.19.0.

@trentm if you have some time to look into that I'd appreciate it 🙂

I looked into it again briefly today (using the new iitm release 1.7.3) and it looks like the ESM check is failing: module[Symbol.toStringTag] === 'Module' evaluates to false, so the instrumentation uses module directly instead of module.default, which causes prototype to be undefined. As to why this happens, though - I have no idea. 😞
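For context, the unwrapping pattern in the instrumentation looks roughly like this (paraphrased, not the exact source):

const moduleExports =
  module[Symbol.toStringTag] === 'Module'
    ? module.default // ESM: unwrap the namespace to get the Koa class
    : module         // CJS: the export is the Koa class itself

When the check wrongly evaluates to false for an ESM load, moduleExports stays the namespace-like object, its prototype is undefined, and wrapping the Koa methods fails.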

@trentm
Contributor

trentm commented Jan 22, 2024

@trentm if you have some time to look into that I'd appreciate it 🙂

I'll take a look today or tomorrow.

@pichlermarc
Member Author

pichlermarc commented Jan 23, 2024

@trentm if you have some time to look into that I'd appreciate it 🙂

I'll take a look today or tomorrow.

Thank you 🙏

I've done some more digging and it looks like iitm 1.7.2 is causing this issue. More specifically, anything after this commit does not work for the koa instrumentation: nodejs/import-in-the-middle@5ec0576.

For koa, iitm generates code like this:

import { register } from 'file:///home/marc/otel-js/1906/node_modules/import-in-the-middle/lib/register.js'
import * as namespace from "file:///home/marc/otel-js/1906/node_modules/koa/lib/application.js"

const _ = Object.assign({}, ...[namespace])
const set = {}


let $HttpError = _.HttpError
export { $HttpError as HttpError }
set.HttpError = (v) => {
  $HttpError = v
  return true
}


let $default = _.default
export { $default as default }
set.default = (v) => {
  $default = v
  return true
}

register("file:///home/marc/otel-js/1906/node_modules/koa/lib/application.js", _, set, "../lib/application.js")

Then, on the _ object, _[Symbol.toStringTag] === 'Module' evaluates to false: the @@toStringTag property is not enumerable, so Object.assign never copies it. That _ object is, I think, what makes it back to the instrumentation, where the check evaluates to false as well, so we pick the wrong property and end up with undefined.
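A standalone snippet shows the problem (not from the repo; node:path just stands in for any ES module here):

import * as ns from 'node:path'

// On a real module namespace object the tag exists, but it is not enumerable:
console.log(ns[Symbol.toStringTag])
// -> 'Module'
console.log(Object.getOwnPropertyDescriptor(ns, Symbol.toStringTag))
// -> { value: 'Module', writable: false, enumerable: false, configurable: false }

// Object.assign only copies own enumerable properties, so the copy loses the tag:
const _ = Object.assign({}, ...[ns]) // same shape as in the generated code above
console.log(_[Symbol.toStringTag])
// -> undefined

// Explicitly defining the symbols from the source object restores it:
for (const sym of Object.getOwnPropertySymbols(ns)) {
  Object.defineProperty(_, sym, Object.getOwnPropertyDescriptor(ns, sym))
}
console.log(_[Symbol.toStringTag])
// -> 'Module'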

Edit: the generated code above is what iitm produces at the commit mentioned above. It's different in the most recent revision, but I think the outcome is similar.

Edit 2: adding code to define the symbols from the source object (as in the snippet above) works - I'm now certain this is an IITM bug, I'll open an issue and link it here. Then we'll have to figure out how to move forward with this. I think we may have to downgrade to 1.7.1 in core, release there, then update here, and the tests will pass. We'll miss out on the fix that makes it work on Node.js 18.19.0 though.

Another option is to add a workaround to instrumentation-koa so that it does not need to rely on @@toStringTag, since that check seems to still work for all the other instrumentations.
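For illustration, such a workaround could be as small as this in the patch callback (hypothetical, not necessarily the exact change I'd push):

// hypothetical workaround: prefer module.default whenever it looks like the
// Koa constructor, instead of trusting @@toStringTag on the incoming object
const moduleExports =
  typeof module.default === 'function' ? module.default : module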

@pichlermarc
Member Author

Opened the issue nodejs/import-in-the-middle#57

@pichlermarc
Member Author

pichlermarc commented Jan 23, 2024

I applied some workarounds.
Now the only remaining failure seems to be caused by something using process even though it's not defined in the document-load tests.

Edit: this is likely related to the updated packages though.
Edit 2: it's nise 5.1.7 (pulled in by sinon) - downgrading it to 5.1.5 solves the issue.
Edit 3: the issue is closed for some reason: sinonjs/nise#221

@pichlermarc
Member Author

It'll now likely fail with the same TAV failure as on main (instrumentation-aws-sdk) - let's see.

@trentm
Contributor

trentm commented Jan 24, 2024

@pichlermarc Wow, nice job working through all this!

Another option is to add a workaround to instrumentation-koa so that it does not need to rely on @@toStringTag, since that check seems to still work for all the other instrumentations.

Interesting that this works for the ESM tests of pino, bunyan, and pg without needing changes. Those instrumentations are checking module[Symbol.toStringTag] as well. I haven't dug in to understand this yet.

I think we may have to downgrade to 1.7.1 in core, release there, then update here and the tests will pass. We'll miss out on the fix that makes it work on Node.js 18.19.0 though.

Agreed.


I was looking at this a bit yesterday, but at totally different things: nothing to do with Koa or IITM.

Looking at the first commit (the one from renovatebot), I notice that the test failures were from mismatched deps from the core repo (e.g. having both versions 1.19.0 and 1.20.0 of @opentelemetry/sdk-trace-base or whatever). It bugged me a little bit that the expedient way to get those updated presumably led to the massive "package-lock.json" update in the second commit (the one where you updated @types/koa).

I started working on a script to see if there was a better/easier way to update all the deps from the core repo without pain.
I will follow up on that. However, the more important issue is the IITM one that you found.

@trentm
Contributor

trentm commented Jan 24, 2024

Another option is to add a workaround to instrumentation-koa so that it does not need to rely on @@toStringTag, since that check seems to still work for all the other instrumentations.

I was trying to verify the other instrumentations that have ESM support (i.e. that are using toStringTag):

  • ioredis: Confirmed works for me. (cd plugins/node/opentelemetry-instrumentation-ioredis && npm run test:local)
  • koa: Confirmed works for me. (cd plugins/node/opentelemetry-instrumentation-koa && npm run test)
  • pg: instr-pg does not have ESM tests. fix(pg): fix instrumentation of ESM-imported pg #1701 didn't add a test case for it at the time, because the runTestFixture thing was still being discussed.
  • bunyan: instr-bunyan does not have an ESM test case. I don't recall if that was because the runTestFixture ESM testing mechanism was still being discussed.
  • pino: instr-pino ESM tests were removed accidentally in a refactor of pino tests in feat: Allow configuring pino to log with different keys #1867

So I wonder if the other three will need work as well to work with IITM 1.7.2 || 1.7.3.

@trentm
Contributor

trentm commented Jan 24, 2024

I started working on a script to see if there was a better/easier way to update all the deps from the core repo without pain.
I will follow up on that.

I think I have a procedure that will correctly update all the @opentelemetry/* deps coming from the core repo to their latest versions without also updating unrelated deps. I don't have it scripted yet.
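Roughly, the shape I have in mind is something like the following (just a sketch with placeholder names; the real thing would also have to skip @opentelemetry/* packages that live in this repo rather than in core, and then re-run npm install to refresh package-lock.json):

// update-otel-deps-sketch.js (hypothetical): bump @opentelemetry/* ranges in the
// package.json files passed on the command line to the latest published versions.
const fs = require('fs')
const { execSync } = require('child_process')

const latest = {} // cache registry lookups per package name

for (const file of process.argv.slice(2)) {
  const pkg = JSON.parse(fs.readFileSync(file, 'utf8'))
  for (const field of ['dependencies', 'devDependencies', 'peerDependencies']) {
    for (const name of Object.keys(pkg[field] || {})) {
      if (!name.startsWith('@opentelemetry/')) continue
      latest[name] = latest[name] ||
        execSync(`npm view ${name} version`, { encoding: 'utf8' }).trim()
      pkg[field][name] = `^${latest[name]}`
    }
  }
  fs.writeFileSync(file, JSON.stringify(pkg, null, 2) + '\n')
}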

I think we could get this PR down to just: (a) the @opentelemetry/* deps updates and (b) any toStringTag workarounds needed to cope with IITM@1.7.2. That said, maybe we'll wait for a core update that pins to IITM@1.7.1 instead.

@trentm
Contributor

trentm commented Jan 26, 2024

I think I have a procedure that will correctly update all the @opentelemetry/* deps coming from the core repo to their latest versions without also updating unrelated deps. I don't have it scripted yet.

@pichlermarc FYI: I have a start at scripting this. I'm off tomorrow, so I'll continue on Monday.

I did try to see if a dependabot update group would do a good job of this. Here is my dependabot group config (in a separate copy, not a clone, of the contrib repo): https://github.com/trentm/tm-ojc-dependabot-play/blob/main/.github/dependabot.yml#L10-L13

Unfortunately dependabot just times out trying to do this group update. Here are the update attempt jobs (I'm not sure if this will be visible to all): https://github.com/trentm/tm-ojc-dependabot-play/network/updates/19077332/jobs
The log for one of those jobs ends with:

...
updater | 2024/01/24 23:07:47 INFO <job_778420776> Requirements to unlock own
updater | 2024/01/24 23:07:47 INFO <job_778420776> Requirements update strategy widen_ranges
updater | 2024/01/24 23:07:47 INFO <job_778420776> Updating @opentelemetry/sdk-node from 0.46.0 to 0.47.0
updater | time="2024-01-24T23:08:52Z" level=info msg="task complete" container_id=job-778420776-updater exit_code=137 job_id=778420776 step=updater
updater | time="2024-01-24T23:08:53Z" level=warning msg="timeout running job" error="waiting for updater: waiting for container: context deadline exceeded" job_id=778420776

The troubleshooting docs for this timeout (https://docs.github.com/en/code-security/dependabot/working-with-dependabot/troubleshooting-dependabot-errors#dependabot-timed-out-during-its-update) unfortunately say:

This error is usually seen only for large repositories with many manifest files, for example, npm or yarn monorepo projects with hundreds of package.json files. ... This error is difficult to address.

:) So I think I'd lean towards scripting this.

My guess is that we could still use dependabot for other version updates. My understanding is that it's the large group of deps that together makes it time out.
