Conversation
nirga
left a comment
Thanks so much for this @Kartik1397! Wrote a couple of comments. I'd also suggest actually testing this against app.traceloop.com because I'm not sure it will work as expected currently. You can build a new simple sample app for that.
Thanks, @nirga, for reviewing the PR! I've incorporated your feedback in my latest commit. Please let me know if anything is still missing or could be done in a better way. This is my first time using the OpenTelemetry SDK, so I may have overlooked some straightforward aspects. Your guidance is greatly appreciated! Testing on api.traceloop.com: output2.mp4
nirga
left a comment
Perfect! Can you just add the sample app you used?
```python
    "stability-ai/stable-diffusion:27b93a2413e7f36cd83da926f3656280b2931564ff050bf9575f1fdf9bcd7478",
    input={"prompt": "robots"}
)
print(image)
```
@nirga Should I remove all print statements from this test file?
Should I add it to https://github.com/traceloop/openllmetry/tree/main/packages/sample-app/sample_app?

Yes!

Done, @nirga. Also reduced the minimum Python version required for
```toml
jinja2 = "^3.1.2"
deprecated = "^1.2.14"
posthog = "^3.0.2"
replicate = "^0.22.0"
```
This shouldn't be a dependency of the SDK, just a test dependency (under `tool.poetry.group.test.dependencies`)
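A minimal sketch of the suggested change in the SDK's pyproject.toml (surrounding entries taken from the snippet above; the exact section layout here is illustrative, not the actual file):

```toml
[tool.poetry.dependencies]
jinja2 = "^3.1.2"
deprecated = "^1.2.14"
posthog = "^3.0.2"
# replicate removed from the SDK's runtime dependencies

[tool.poetry.group.test.dependencies]
# only needed to exercise the instrumentation in tests
replicate = "^0.22.0"
```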
@Kartik1397 when installing the SDK there's a warning. Also, I'm not sure your code will work for streaming; have you tested it?

This should be fixed now.
I can see that traces of stream function calls are getting consumed by Traceloop (with prompts). Do we need to capture the response in the case of the streaming API?

Yes, of course :) Same as we do in OpenAI.

Okay, will add support for streaming.
Pushed changes to capture response streams.
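One common pattern for capturing a streamed response without breaking the consumer (a sketch of the general technique, not the actual code in this PR; `wrap_stream` and `on_complete` are hypothetical names) is to wrap the returned iterator so chunks are accumulated and handed to the span only once the stream is exhausted:

```python
def wrap_stream(stream, on_complete):
    """Yield each chunk unchanged while accumulating it; invoke
    on_complete(chunks) once the consumer has fully drained the stream.
    In real instrumentation, on_complete would set the response as a
    span attribute and end the span."""
    chunks = []
    for chunk in stream:
        chunks.append(chunk)
        yield chunk
    on_complete(chunks)

# Usage with a fake stream standing in for Replicate's streamed output:
captured = []
wrapped = wrap_stream(iter(["ro", "bo", "ts"]), captured.extend)
consumed = list(wrapped)
```

The key design point is that the callback fires only after the caller's own iteration finishes, so the instrumentation never consumes the stream ahead of the user.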
Failed because GitHub doesn't recognize the newly added Replicate API token. Tested locally.
Co-authored-by: Nir Gazit <nirga@users.noreply.github.com>


closes #242
/claim #242