
Mechanical switch from log to slog #2056

Merged
2 commits merged into ollama:main on Jan 18, 2024
Conversation

dhiltgen (Collaborator)

A few obvious levels were adjusted, but generally everything mapped to "info" level.

@dhiltgen dhiltgen marked this pull request as ready for review January 18, 2024 21:12
dhiltgen (Collaborator, Author)

Example output on Linux with debug turned on:

% OLLAMA_DEBUG=1 ./ollama-linux-amd64 serve
time=2024-01-18T13:10:33.272-08:00 level=DEBUG source=/go/src/github.com/jmorganca/ollama/server/routes.go:901 msg="Debug logging enabled"
time=2024-01-18T13:10:33.272-08:00 level=INFO source=/go/src/github.com/jmorganca/ollama/server/images.go:810 msg="total blobs: 22"
time=2024-01-18T13:10:33.272-08:00 level=INFO source=/go/src/github.com/jmorganca/ollama/server/images.go:817 msg="total unused blobs removed: 0"
time=2024-01-18T13:10:33.272-08:00 level=INFO source=/go/src/github.com/jmorganca/ollama/server/routes.go:925 msg="Listening on 127.0.0.1:11434 (version 0.0.0)"
time=2024-01-18T13:10:33.272-08:00 level=INFO source=/go/src/github.com/jmorganca/ollama/llm/payload_common.go:106 msg="Extracting dynamic libraries..."
time=2024-01-18T13:10:48.298-08:00 level=INFO source=/go/src/github.com/jmorganca/ollama/llm/payload_common.go:145 msg="Dynamic LLM libraries [rocm_v5 rocm_v6 cpu_avx2 cpu_avx cpu cuda_v11]"
time=2024-01-18T13:10:48.298-08:00 level=INFO source=/go/src/github.com/jmorganca/ollama/llm/payload_common.go:146 msg="Override detection logic by setting OLLAMA_LLM_LIBRARY"
time=2024-01-18T13:10:48.298-08:00 level=INFO source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:88 msg="Detecting GPU type"
time=2024-01-18T13:10:48.298-08:00 level=INFO source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:208 msg="Searching for GPU management library libnvidia-ml.so"
time=2024-01-18T13:10:48.298-08:00 level=DEBUG source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:226 msg="gpu management search paths: [/usr/local/cuda/lib64/libnvidia-ml.so* /usr/lib/x86_64-linux-gnu/nvidia/current/libnvidia-ml.so* /usr/lib/x86_64-linux-gnu/libnvidia-ml.so* /usr/lib/wsl/lib/libnvidia-ml.so* /opt/cuda/lib64/libnvidia-ml.so* /usr/lib*/libnvidia-ml.so* /usr/local/lib*/libnvidia-ml.so* /usr/lib/aarch64-linux-gnu/nvidia/current/libnvidia-ml.so* /usr/lib/aarch64-linux-gnu/libnvidia-ml.so* /home/daniel/libnvidia-ml.so*]"
time=2024-01-18T13:10:48.300-08:00 level=INFO source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:254 msg="Discovered GPU libraries: [/usr/lib/x86_64-linux-gnu/libnvidia-ml.so.545.23.08]"
time=2024-01-18T13:10:48.304-08:00 level=INFO source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:94 msg="Nvidia GPU detected"
time=2024-01-18T13:10:48.310-08:00 level=INFO source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:135 msg="CUDA Compute Capability detected: 7.5"

Normal Linux output without debug enabled:

% ./ollama-linux-amd64 serve
2024/01/18 13:14:08 images.go:810: INFO total blobs: 22
2024/01/18 13:14:08 images.go:817: INFO total unused blobs removed: 0
2024/01/18 13:14:08 routes.go:925: INFO Listening on 127.0.0.1:11434 (version 0.0.0)
2024/01/18 13:14:08 payload_common.go:106: INFO Extracting dynamic libraries...
2024/01/18 13:14:23 payload_common.go:145: INFO Dynamic LLM libraries [rocm_v6 cpu_avx2 rocm_v5 cuda_v11 cpu cpu_avx]
2024/01/18 13:14:23 payload_common.go:146: INFO Override detection logic by setting OLLAMA_LLM_LIBRARY
2024/01/18 13:14:23 gpu.go:88: INFO Detecting GPU type
2024/01/18 13:14:23 gpu.go:208: INFO Searching for GPU management library libnvidia-ml.so
2024/01/18 13:14:23 gpu.go:254: INFO Discovered GPU libraries: [/usr/lib/x86_64-linux-gnu/libnvidia-ml.so.545.23.08]
2024/01/18 13:14:23 gpu.go:94: INFO Nvidia GPU detected
2024/01/18 13:14:23 gpu.go:135: INFO CUDA Compute Capability detected: 7.5

pdevine (Contributor) left a comment


LGTM.

@@ -7,7 +7,7 @@
Install required tools:
Contributor

The help text just above here is confusing. I'm not quite sure what to do with:

- Install cmake or (optionally, required tools for GPUs)

Is the "or" referring to the other tools for GPUs, or to go generate/go build?

dhiltgen (Collaborator, Author)

I'll tidy this part of the README up.

mxyng (Contributor) commented Jan 18, 2024

One thing you can do to minimize changes is to use slog.SetDefault() to change the log package's output, and I see you're doing that. With this change, you don't have to change log.Printf to slog.Info if the initial log level is INFO.

I misremembered how slog works. For dynamic log level checking, it'll need a custom handler. Something like this should work:

type slogHandler struct {
  h *slog.TextHandler
}

func (h slogHandler) Enabled(ctx context.Context, level slog.Level) bool {
  if _, ok := os.LookupEnv("OLLAMA_DEBUG"); ok {
    return level >= slog.LevelDebug
  }

  return h.h.Enabled(ctx, level)
}

@@ -892,6 +893,13 @@ func (s *Server) GenerateRoutes() http.Handler {
}

func Serve(ln net.Listener) error {
if debug := os.Getenv("OLLAMA_DEBUG"); debug != "" {
Contributor

This should be declared globally in cmd so it configures non-server commands too

dhiltgen (Collaborator, Author)

My intent was to only impact the server to narrow the scope.

dhiltgen (Collaborator, Author)

With this change, you don't have to change log.Printf to slog.Info if the initial log level is INFO

I think being explicit about the level is better, and it makes it easier to adjust the levels of individual messages in follow-up incremental changes. I didn't analyze every log call; I just skimmed for obvious warn/error scenarios and adjusted those, and I'd like to continue refining the levels over time.

mxyng (Contributor) commented Jan 18, 2024

There are some log.Println and log.Print calls that didn't get updated, but otherwise this looks fine.

dhiltgen merged commit abec7f0 into ollama:main on Jan 18, 2024
10 checks passed
dhiltgen deleted the slog branch on January 18, 2024 at 22:27

4 participants