
Ray logs persistence #4266

Merged: 2 commits merged into master from jeev/ray-logs on Nov 1, 2023

Conversation

@jeevb (Contributor) commented Oct 20, 2023

Describe your changes

Injects a logging sidecar to tail RayJob logs and expose them on the container's stdout, and in turn to the cluster's logging infrastructure. This will allow job logs to be persisted to CloudWatch, Stackdriver, or another storage backend for perusal at a later time.
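Conceptually, the injection amounts to adding a shared volume plus a log-tailing container to the generated pod spec. A minimal sketch using k8s.io/api/core/v1 types is below; the container/volume names and the Fluent Bit image tag are illustrative assumptions, not necessarily what the plugin uses:

package main

import (
    "fmt"

    corev1 "k8s.io/api/core/v1"
)

// injectLogsSidecar sketches the idea described above: a shared emptyDir volume lets a
// fluent-bit sidecar read the Ray session logs written by the primary container and
// re-emit them on its own stdout for the cluster's logging agent to pick up.
func injectLogsSidecar(podSpec *corev1.PodSpec) {
    podSpec.Volumes = append(podSpec.Volumes, corev1.Volume{
        Name:         "system-ray-state",
        VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}},
    })
    podSpec.Containers = append(podSpec.Containers, corev1.Container{
        Name:  "logs",
        Image: "fluent/fluent-bit:2.1.10", // the fluent-bit.conf shown below would be mounted separately
        VolumeMounts: []corev1.VolumeMount{
            {Name: "system-ray-state", MountPath: "/tmp/ray"},
        },
    })
}

func main() {
    spec := &corev1.PodSpec{Containers: []corev1.Container{{Name: "ray-head"}}}
    injectLogsSidecar(spec)
    fmt.Println(len(spec.Containers)) // 2: primary Ray container plus the logs sidecar
}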

Also adds support for specifying multiple containers in a pod template for Ray tasks.
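As an illustration of the second point, a pod template that adds an extra container alongside the Ray container might look roughly like the fragment below; names and images are placeholders, and how the plugin identifies the primary container is not shown here:

// Hypothetical multi-container pod spec for a Ray task; names and images are placeholders.
podSpec := corev1.PodSpec{
    Containers: []corev1.Container{
        {Name: "ray-head", Image: "rayproject/ray:2.7.0"},         // primary Ray container
        {Name: "metrics-exporter", Image: "example.com/exporter"}, // additional user-specified container
    },
}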

Check all the applicable boxes

  • I updated the documentation accordingly.
  • All new and existing tests passed.
  • All commits are signed-off.

Screenshots

Got this working with the following fluent-bit.conf config:

[SERVICE]
    Log_Level error
[INPUT]
    Name tail
    Path /tmp/ray/session_latest/logs/job-driver-*
    Refresh_Interval 1
[OUTPUT]
    Name file
    Match *
    File /dev/stdout
    Format template
    Template {log}

Note that the stdout output plugin could work here as well if structured logs are desired, but I went with just exposing logs as plaintext for now.
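For reference, the structured variant mentioned above would just swap the [OUTPUT] section for Fluent Bit's stdout plugin, along these lines (illustrative only, not what was merged):

[OUTPUT]
    Name stdout
    Match *
    Format json_lines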

Sidecar stdout looks as follows:

> kubectl logs -f r9d1584b28b7d-taskraytask-0-raycluster-kksrv-head-fhm4q -c logs
Fluent Bit v2.1.10
* Copyright (C) 2015-2022 The Fluent Bit Authors
* Fluent Bit is a CNCF sub-project under the umbrella of Fluentd
* https://fluentbit.io

tar: Removing leading `/' from member names

2023-10-31 23:17:27,711	INFO worker.py:1329 -- Using address 10.42.0.11:6379 set in the environment variable RAY_ADDRESS
2023-10-31 23:17:27,712	INFO worker.py:1458 -- Connecting to existing Ray cluster at address: 10.42.0.11:6379...
2023-10-31 23:17:27,718	INFO worker.py:1633 -- Connected to Ray cluster. View the dashboard at http://10.42.0.11:8265
(compute_squared pid=186) Computing squared of 0...
(compute_squared pid=186) Computed squared of 0: 0
(compute_squared pid=85, ip=10.42.0.12) Computing squared of 1...
(compute_squared pid=186) Computing squared of 2...
(compute_squared pid=186) Computed squared of 2: 4 [repeated 2x across cluster] (Ray deduplicates logs by default. Set RAY_DEDUP_LOGS=0 to disable log deduplication, or see https://docs.ray.io/en/master/ray-observability/ray-logging.html#log-deduplication for more options.)
(compute_squared pid=85, ip=10.42.0.12) Computing squared of 3...
(compute_squared pid=186) Computing squared of 4...
(compute_squared pid=85, ip=10.42.0.12) Computed squared of 3: 9
[2023/10/31 23:18:44] [engine] caught signal (SIGTERM)

Note to reviewers

jeevb changed the title from "Ray logs persistence" to "[Draft] Ray logs persistence" on Oct 20, 2023
codecov bot commented Oct 20, 2023

Codecov Report

Attention: 5 lines in your changes are missing coverage. Please review.

Comparison is base (b6b0d61) 59.53% compared to head (6f713c2) 59.30%.
Report is 3 commits behind head on master.

❗ Current head 6f713c2 differs from pull request most recent head 29c0688. Consider uploading reports for the commit 29c0688 to get more accurate results

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #4266      +/-   ##
==========================================
- Coverage   59.53%   59.30%   -0.23%     
==========================================
  Files         632      544      -88     
  Lines       53527    38974   -14553     
==========================================
- Hits        31867    23114    -8753     
+ Misses      19145    13582    -5563     
+ Partials     2515     2278     -237     
Flag Coverage Δ
unittests ?

Flags with carried forward coverage won't be shown.

Files Coverage Δ
flyteplugins/go/tasks/plugins/k8s/ray/config.go 36.36% <ø> (+7.79%) ⬆️
flyteplugins/go/tasks/plugins/k8s/ray/ray.go 83.68% <91.93%> (+0.30%) ⬆️

... and 568 files with indirect coverage changes


@EngHabu (Contributor) left a comment
LGTM! after adding config options to control that behavior as you suggested...
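For illustration, the kind of knob being asked for could be a boolean in the Ray plugin config; the sketch below is an assumption about naming and placement (flyteplugins/go/tasks/plugins/k8s/ray/config.go), not necessarily the option that was merged:

package ray

// Sketch of a hypothetical plugin config field gating the sidecar injection.
// Field name, tags, and default are illustrative assumptions.
type Config struct {
    // ... existing Ray plugin config fields ...

    // EnableRayLogsSidecar toggles injection of the log-tailing sidecar into Ray pods.
    EnableRayLogsSidecar bool `json:"enableRayLogsSidecar" pflag:",Inject a fluent-bit sidecar that tails Ray job logs to stdout."`
}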

Name: "system-ray-state",
MountPath: "/tmp/ray",
}
primaryContainer.VolumeMounts = append(primaryContainer.VolumeMounts, writeableVolMount)
Contributor:
This block should be included in the if block, right? Otherwise it'll double-add that volume mount if one already exists...

@jeevb (Author):
Yea this is wrong. I think we want to use info from an existing volume / volume mount where possible. I'll clean up.

// Ray logs integration
foundTmpRayVolMount := false
for _, vm := range primaryContainer.VolumeMounts {
if vm.MountPath == "/tmp/ray" {
Contributor:
Is that path hardcoded in the Ray driver?

@jeevb (Author) commented Oct 20, 2023:

Yea /tmp/ray is a fixed path. I'll make this a const.
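Taking both review comments together, the cleaned-up logic would roughly look like the sketch below. It extends the excerpt above, so primaryContainer and corev1 come from the surrounding plugin code; this is a sketch, not the exact merged diff:

const rayStateMountPath = "/tmp/ray"

// Reuse an existing /tmp/ray mount if the user already defined one; only append our
// own mount (backed by the shared system-ray-state volume) when none is present.
foundTmpRayVolMount := false
for _, vm := range primaryContainer.VolumeMounts {
    if vm.MountPath == rayStateMountPath {
        foundTmpRayVolMount = true
        break
    }
}
if !foundTmpRayVolMount {
    primaryContainer.VolumeMounts = append(primaryContainer.VolumeMounts, corev1.VolumeMount{
        Name:      "system-ray-state",
        MountPath: rayStateMountPath,
    })
}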

jeevb marked this pull request as ready for review on October 29, 2023.
jeevb changed the title from "[Draft] Ray logs persistence" to "Ray logs persistence" on Oct 29, 2023.
jeevb merged commit 1b92105 into master on Nov 1, 2023 (40 of 41 checks passed).
jeevb deleted the jeev/ray-logs branch on November 1, 2023.