fix(pinecone): instrument pinecone package instead of deprecated pinecone-client (#3733)
galkleinman merged 11 commits into traceloop:main
Conversation
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review.
📝 Walkthrough

Instrumentation updated to prefer the `pinecone` package over the deprecated `pinecone-client`.

Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Generated with ❤️ by ellipsis.dev
Actionable comments posted: 2
🧹 Nitpick comments (2)
packages/sample-app/pyproject.toml (1)
16-16: Consider adding an upper bound to the version constraint.

The `pinecone>=2.2.2` constraint lacks an upper bound, unlike most other dependencies in this file (e.g., `openai>=1.52.2,<2`). Adding an upper bound would guard against unexpected breaking changes from major version updates.

Suggested change:

```diff
- "pinecone>=2.2.2",
+ "pinecone>=2.2.2,<6",
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/sample-app/pyproject.toml` at line 16, the dependency entry "pinecone>=2.2.2" has no upper bound, which can allow breaking major upgrades; update the version constraint to include a sensible upper bound (e.g., "pinecone>=2.2.2,<3", or match the project's pinning style like "<2" if you prefer minor-only upgrades) so that the pyproject dependency for "pinecone" follows the same bounded pattern as other packages.

packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py (1)
280-281: Consider narrowing the exception handling.

Using a bare `except Exception: pass` can mask unexpected errors during uninstrumentation. Consider catching more specific exceptions, or at minimum logging the error for debugging purposes.

Alternative approach inspired by the chromadb pattern:

```diff
 if wrap_object == "Index":
     try:
-        unwrap("pinecone.db_data.index", f"Index.{wrap_method_name}")
-    except Exception:
-        pass
+        from pinecone.db_data import index as db_data_index
+        wrapped = getattr(db_data_index, "Index", None)
+        if wrapped:
+            unwrap(wrapped, wrap_method_name)
+    except (ImportError, AttributeError):
+        pass
 try:
     unwrap(f"pinecone.{wrap_object}", wrap_method_name)
-except Exception:
+except (ImportError, AttributeError):
     pass
```

Also applies to: 285-286
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py` around lines 280 - 281, The bare "except Exception: pass" in the Pinecone instrumentation cleanup/uninstrumentation block should be narrowed and logged: locate the uninstrumentation/cleanup method in the Pinecone instrumentation class (e.g., PineconeInstrumentor or PineconeInstrumentation uninstrument/disable routine where the try/except appears) and replace the bare except with specific exception catches for likely failure modes (e.g., AttributeError, RuntimeError) and/or use "except Exception as e: logger.exception('pinecone uninstrument error: %s', e)" so errors are recorded (only re-raise if they indicate a fatal state). Ensure you use the module logger (or processLogger used elsewhere in the file) and avoid swallowing exceptions silently.
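As a hedged illustration of the narrowing suggested above (the helper names `safe_unwrap` and `missing_target` are hypothetical, not part of the instrumentation), a cleanup helper can catch only the failure modes expected during uninstrumentation and log anything it swallows:

```python
import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger(__name__)

def safe_unwrap(unwrap_fn, module_path, method_name):
    # Catch only the errors expected when a target is absent in this
    # library version, and record them instead of silently passing.
    try:
        unwrap_fn(module_path, method_name)
        return True
    except (ImportError, AttributeError) as e:
        logger.debug("Failed to unwrap %s.%s: %s", module_path, method_name, e)
        return False

# Stand-in that fails the way a real unwrap would on a missing attribute.
def missing_target(module_path, method_name):
    raise AttributeError(f"{module_path} has no attribute {method_name}")

print(safe_unwrap(missing_target, "pinecone.db_data.index", "Index.query"))
```

A truly unexpected error (say, a `RuntimeError`) would still propagate instead of being hidden, which is the point of the narrowed clause.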
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py`:
- Around line 277-286: The unwrap call uses an incorrect combined module+class
string; change the special-case unwrap for Index to match the wrap format by
passing the module path and "Index.{method}" as two arguments to unwrap (i.e.,
call unwrap("pinecone.db_data.index", f"Index.{wrap_method_name}")) so it
mirrors how wrap_function_wrapper was invoked for ("pinecone.db_data.index",
"Index.{method}"); update the conditional branch that references
unwrap("pinecone.db_data.index.Index", wrap_method_name) to use the separated
module and Class.method form.
- Around line 224-270: The metrics variables used in the _wrap(...) calls can be
undefined when is_metrics_enabled() is False; initialize query_duration_metric,
read_units_metric, write_units_metric, and scores_metric to None before the
is_metrics_enabled() conditional so they always exist, then leave the rest of
the code (the is_metrics_enabled() branch that sets them and the
wrap_function_wrapper calls for Index and GRPCIndex) unchanged so _wrap(...)
receives None when metrics are disabled.
---
Nitpick comments:
In
`@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py`:
- Around line 280-281: The bare "except Exception: pass" in the Pinecone
instrumentation cleanup/uninstrumentation block should be narrowed and logged:
locate the uninstrumentation/cleanup method in the Pinecone instrumentation
class (e.g., PineconeInstrumentor or PineconeInstrumentation
uninstrument/disable routine where the try/except appears) and replace the bare
except with specific exception catches for likely failure modes (e.g.,
AttributeError, RuntimeError) and/or use "except Exception as e:
logger.exception('pinecone uninstrument error: %s', e)" so errors are recorded
(only re-raise if they indicate a fatal state). Ensure you use the module logger
(or processLogger used elsewhere in the file) and avoid swallowing exceptions
silently.
In `@packages/sample-app/pyproject.toml`:
- Line 16: The dependency entry "pinecone>=2.2.2" has no upper bound which can
allow breaking major upgrades; update the version constraint to include a
sensible upper bound (e.g., "pinecone>=2.2.2,<3" or match the project's pinning
style like "<2" if you prefer minor-only upgrades) so that the pyproject
dependency for "pinecone" follows the same bounded pattern as other packages.
ℹ️ Review info
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Disabled knowledge base sources:
- Linear integration is disabled
You can enable these sources in your CodeRabbit configuration.
⛔ Files ignored due to path filters (3)
- packages/opentelemetry-instrumentation-pinecone/uv.lock is excluded by !**/*.lock
- packages/sample-app/uv.lock is excluded by !**/*.lock
- packages/traceloop-sdk/uv.lock is excluded by !**/*.lock
📒 Files selected for processing (3)
- packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py
- packages/opentelemetry-instrumentation-pinecone/pyproject.toml
- packages/sample-app/pyproject.toml
```python
if wrap_object == "Index":
    try:
        unwrap("pinecone.db_data.index.Index", wrap_method_name)
    except Exception:
        pass

try:
    unwrap(f"pinecone.{wrap_object}", wrap_method_name)
except Exception:
    pass
```
Incorrect unwrap path format for pinecone.db_data.index.Index.
The unwrap call at line 279 uses an inconsistent path format compared to the wrap_function_wrapper call. The wrap uses ("pinecone.db_data.index", "Index.{method}") (lines 229-230), but the unwrap uses ("pinecone.db_data.index.Index", "{method}").
Based on the pattern in opentelemetry-instrumentation-openai (e.g., unwrap("openai.resources.chat.completions", "Completions.create")), the module path and Class.method should be separate arguments.
Proposed fix:

```diff
 if wrap_object == "Index":
     try:
-        unwrap("pinecone.db_data.index.Index", wrap_method_name)
+        unwrap("pinecone.db_data.index", f"Index.{wrap_method_name}")
     except Exception:
         pass
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In
`@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py`
around lines 277 - 286, The unwrap call uses an incorrect combined module+class
string; change the special-case unwrap for Index to match the wrap format by
passing the module path and "Index.{method}" as two arguments to unwrap (i.e.,
call unwrap("pinecone.db_data.index", f"Index.{wrap_method_name}")) so it
mirrors how wrap_function_wrapper was invoked for ("pinecone.db_data.index",
"Index.{method}"); update the conditional branch that references
unwrap("pinecone.db_data.index.Index", wrap_method_name) to use the separated
module and Class.method form.
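The symmetry the review is asking for, module path and `Class.method` as two separate arguments on both the wrap and unwrap side, can be sketched with a stdlib-only resolver (`resolve` here is a hypothetical helper mirroring how wrapt-style targets are addressed, not the real wrapt API):

```python
import importlib
from functools import reduce

def resolve(module_path, name):
    # Mirror how ("module.path", "Class.method") targets are addressed:
    # import the module, then walk the dotted attribute chain.
    obj = importlib.import_module(module_path)
    return reduce(getattr, name.split("."), obj)

import json.decoder

# The same two-argument target must be used when wrapping and unwrapping;
# folding the class into the module path ("module.path.Class", "method")
# points importlib at a module that does not exist.
assert resolve("json.decoder", "JSONDecoder.decode") is json.decoder.JSONDecoder.decode
print("targets resolve symmetrically")
```

This is why the review insists the unwrap call reuse the exact `("pinecone.db_data.index", "Index.query")` shape that the wrap call used.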
Actionable comments posted: 1
🧹 Nitpick comments (1)
packages/opentelemetry-instrumentation-qdrant/tests/test_qdrant_instrumentation.py (1)
79-84: Add tests for legacy `search`/`search_batch` method names to prevent regressions.

The test helpers now call the new `query_points`/`query_batch_points` APIs exclusively. While the instrumentation supports both legacy and new method names (present in `qdrant_client_methods.json` and handled in the wrapper), the test suite no longer verifies that legacy methods are correctly instrumented. Add at least one test for each legacy method to catch any accidental regressions in their instrumentation coverage.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/opentelemetry-instrumentation-qdrant/tests/test_qdrant_instrumentation.py` around lines 79 - 84, Add tests that exercise the legacy QdrantClient method names by calling search and search_batch (in addition to existing query_points/query_batch_points) so instrumentation still wraps them; update the test helper functions in tests/test_qdrant_instrumentation.py to include a search(...) and search_batch(...) function that mirror the existing query_points/query_batch_points usage (same COLLECTION_NAME, EMBEDDING_DIM, query/queries, limit), and add assertions in the test that these legacy calls produce the same instrumentation spans as the new APIs; reference the wrapper that reads qdrant_client_methods.json to ensure both legacy names are covered when adding the new test cases.
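A minimal sketch of the suggested coverage, using a stub in place of a real `QdrantClient` (the stub class, its method bodies, and the `COLLECTION_NAME` constant are hypothetical; real tests would assert on exported spans rather than return values):

```python
# Exercise both the legacy and the new method names so a regression in
# either wrapping path would be caught by the suite.
class StubQdrantClient:
    def search(self, collection_name, query_vector, limit):
        return {"method": "search", "hits": limit}

    def query_points(self, collection_name, query, limit):
        return {"method": "query_points", "hits": limit}

COLLECTION_NAME = "test_collection"  # assumed constant from the test module
client = StubQdrantClient()

legacy = client.search(COLLECTION_NAME, [0.1] * 4, limit=3)
modern = client.query_points(COLLECTION_NAME, [0.1] * 4, limit=3)

# Both call paths should behave equivalently from the caller's perspective.
assert legacy["hits"] == modern["hits"] == 3
print("legacy and new APIs exercised")
```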
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/sample-app/sample_app/qdrant_app.py`:
- Line 1: Update the installation comment in qdrant_app.py to use the
repository-standard uv wrapper instead of invoking pip directly: replace the
existing comment line "# pip install qdrant-client sentence-transformers" with
an instruction that runs the same install through uv (e.g., "uv run pip install
qdrant-client sentence-transformers") so the repository package manager is used
consistently; locate the comment near the top of the file (the line containing
"pip install qdrant-client sentence-transformers") and update it accordingly.
---
Nitpick comments:
In
`@packages/opentelemetry-instrumentation-qdrant/tests/test_qdrant_instrumentation.py`:
- Around line 79-84: Add tests that exercise the legacy QdrantClient method
names by calling search and search_batch (in addition to existing
query_points/query_batch_points) so instrumentation still wraps them; update the
test helper functions in tests/test_qdrant_instrumentation.py to include a
search(...) and search_batch(...) function that mirror the existing
query_points/query_batch_points usage (same COLLECTION_NAME, EMBEDDING_DIM,
query/queries, limit), and add assertions in the test that these legacy calls
produce the same instrumentation spans as the new APIs; reference the wrapper
that reads qdrant_client_methods.json to ensure both legacy names are covered
when adding the new test cases.
ℹ️ Review info
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Disabled knowledge base sources:
- Linear integration is disabled
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (4)
- packages/opentelemetry-instrumentation-qdrant/opentelemetry/instrumentation/qdrant/qdrant_client_methods.json
- packages/opentelemetry-instrumentation-qdrant/opentelemetry/instrumentation/qdrant/wrapper.py
- packages/opentelemetry-instrumentation-qdrant/tests/test_qdrant_instrumentation.py
- packages/sample-app/sample_app/qdrant_app.py
```diff
@@ -0,0 +1,45 @@
+# pip install qdrant-client sentence-transformers
```

Use `uv run` in the installation instruction.

The command comment currently uses pip directly; please switch it to the repository-standard `uv run <command>`.

Suggested fix:

```diff
-# pip install qdrant-client sentence-transformers
+# uv run pip install qdrant-client sentence-transformers
```

As per coding guidelines, "Execute all package management commands through the uv package manager using 'uv run '".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/sample-app/sample_app/qdrant_app.py` at line 1, Update the
installation comment in qdrant_app.py to use the repository-standard uv wrapper
instead of invoking pip directly: replace the existing comment line "# pip
install qdrant-client sentence-transformers" with an instruction that runs the
same install through uv (e.g., "uv run pip install qdrant-client
sentence-transformers") so the repository package manager is used consistently;
locate the comment near the top of the file (the line containing "pip install
qdrant-client sentence-transformers") and update it accordingly.
90cfc52 to 5be8410
Actionable comments posted: 1
♻️ Duplicate comments (2)
packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py (2)
225-273: ⚠️ Potential issue | 🔴 Critical — Initialize metric handles before conditional setup.

`query_duration_metric`, `read_units_metric`, `write_units_metric`, and `scores_metric` are used on lines 234-237, 253-256, and 267-270, but they are only assigned inside `if is_metrics_enabled()` (line 194). With metrics disabled, this raises `UnboundLocalError` during instrumentation setup.

Proposed fix:

```diff
 def _instrument(self, **kwargs):
+    query_duration_metric = None
+    read_units_metric = None
+    write_units_metric = None
+    scores_metric = None
+
     if is_metrics_enabled():
         meter_provider = kwargs.get("meter_provider")
         meter = get_meter(__name__, __version__, meter_provider)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py` around lines 225 - 273, The instrumentation references query_duration_metric, read_units_metric, write_units_metric, and scores_metric inside the Index and GRPCIndex wrapping logic (in the wrap loop that calls wrap_function_wrapper and _wrap) but those variables are only created inside is_metrics_enabled(); initialize them to None (or no-op metric handles) before the metrics check so they are always defined, then keep the existing conditional assignment inside is_metrics_enabled(); update _wrap (or its caller) to handle None/no-op metric values safely when metrics are disabled so the wrap logic for "Index" and "GRPCIndex" never raises UnboundLocalError.
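The failure mode is easy to reproduce in isolation; this sketch (with a placeholder string standing in for a real meter handle) shows why initializing the handles before the conditional fixes it:

```python
def setup_broken(metrics_enabled):
    if metrics_enabled:
        query_duration_metric = "histogram-handle"  # placeholder for a meter handle
    # UnboundLocalError when metrics_enabled is False: the name was never bound.
    return query_duration_metric

def setup_fixed(metrics_enabled):
    query_duration_metric = None  # always defined, on every path
    if metrics_enabled:
        query_duration_metric = "histogram-handle"
    return query_duration_metric

try:
    setup_broken(False)
except UnboundLocalError:
    print("broken: UnboundLocalError")

print(setup_fixed(False))
print(setup_fixed(True))
```

Downstream code (here, `_wrap`) then only needs to tolerate `None` when metrics are disabled.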
280-288: ⚠️ Potential issue | 🟠 Major — Match `unwrap` targets to the wrapped module/object format.

The current uninstrument calls use incompatible target strings (`"pinecone.db_data.index.Index"` and `"pinecone.{wrap_object}"` with a bare method name). This does not mirror the `wrap_function_wrapper` targets and can leave wrappers active after uninstrumentation.

Proposed fix:

```diff
 if wrap_object == "Index":
     try:
-        unwrap("pinecone.db_data.index.Index", wrap_method_name)
+        unwrap("pinecone.db_data.index", f"Index.{wrap_method_name}")
     except Exception:
         pass
 try:
-    unwrap(f"pinecone.{wrap_object}", wrap_method_name)
+    unwrap("pinecone", f"{wrap_object}.{wrap_method_name}")
 except Exception:
     pass
```

```bash
#!/bin/bash
set -euo pipefail
FILE="packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py"
echo "=== wrap targets ==="
rg -n 'wrap_function_wrapper\(' "$FILE" -A4 -B1
echo
echo "=== unwrap targets ==="
rg -n 'unwrap\(' "$FILE" -A1 -B1
```

Expected verification outcome: unwrap calls should use the same module + `Class.method` target pattern as the wrap calls.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py` around lines 280 - 288, The unwrap calls must mirror the wrap_function_wrapper targets so wrappers are properly removed; update the unwrap targets used near the wrap_function_wrapper calls to include the class/method pattern (i.e., match the same module + Class.method string you passed to wrap_function_wrapper). Specifically, replace the current unwrap targets referencing "pinecone.db_data.index.Index" and f"pinecone.{wrap_object}" with the corresponding module + Class.method patterns (e.g., the same string used when calling wrap_function_wrapper for Index and for the dynamic wrap_object) so that unwrap receives the exact module.Class.method target names.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py`:
- Around line 224-273: Add regression tests that exercise both Index resolution
paths and the GRPCIndex branch: write tests that simulate/patch module
availability for "pinecone.db_data.index" so the branch that calls
wrap_function_wrapper("pinecone.db_data.index", "Index.<method>", _wrap(...)) is
executed, and separate tests where that module is absent so the fallback
wrap_function_wrapper("pinecone", "Index.<method>", _wrap(...)) runs; also add a
test that asserts the GRPCIndex branch wraps via
wrap_function_wrapper("pinecone", "GRPCIndex.<method>", _wrap(...)). Use
mocks/stubs to control getattr(pinecone, ...) and importlib.util.find_spec
outcomes and assert wrap_function_wrapper was invoked with the expected target
strings and _wrap was used.
---
Duplicate comments:
In
`@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py`:
- Around line 225-273: The instrumentation references query_duration_metric,
read_units_metric, write_units_metric, and scores_metric inside the Index and
GRPCIndex wrapping logic (in the wrap loop that calls wrap_function_wrapper and
_wrap) but those variables are only created inside is_metrics_enabled();
initialize them to None (or no-op metric handles) before the metrics check so
they are always defined, then keep the existing conditional assignment inside
is_metrics_enabled(); update _wrap (or its caller) to handle None/no-op metric
values safely when metrics are disabled so the wrap logic for "Index" and
"GRPCIndex" never raises UnboundLocalError.
- Around line 280-288: The unwrap calls must mirror the wrap_function_wrapper
targets so wrappers are properly removed; update the unwrap targets used near
the wrap_function_wrapper calls to include the class/method pattern (i.e., match
the same module + Class.method string you passed to wrap_function_wrapper).
Specifically, replace the current unwrap targets referencing
"pinecone.db_data.index.Index" and f"pinecone.{wrap_object}" with the
corresponding module + Class.method patterns (e.g., the same string used when
calling wrap_function_wrapper for Index and for the dynamic wrap_object) so that
unwrap receives the exact module.Class.method target names.
ℹ️ Review info
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Disabled knowledge base sources:
- Linear integration is disabled
You can enable these sources in your CodeRabbit configuration.
⛔ Files ignored due to path filters (3)
- packages/opentelemetry-instrumentation-pinecone/uv.lock is excluded by !**/*.lock
- packages/sample-app/uv.lock is excluded by !**/*.lock
- packages/traceloop-sdk/uv.lock is excluded by !**/*.lock
📒 Files selected for processing (3)
- packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py
- packages/opentelemetry-instrumentation-pinecone/pyproject.toml
- packages/sample-app/pyproject.toml
```python
if wrap_object == "Index":
    try:
        if importlib.util.find_spec("pinecone.db_data.index") is not None:
            try:
                wrap_function_wrapper(
                    "pinecone.db_data.index",
                    f"{wrap_object}.{wrap_method}",
                    _wrap(
                        tracer,
                        query_duration_metric,
                        read_units_metric,
                        write_units_metric,
                        scores_metric,
                        wrapped_method,
                    ),
                )
                continue
            except (ImportError, AttributeError):
                pass
    except (ModuleNotFoundError, ImportError):
        pass

    if getattr(pinecone, wrap_object, None):
        wrap_function_wrapper(
            "pinecone",
            f"{wrap_object}.{wrap_method}",
            _wrap(
                tracer,
                query_duration_metric,
                read_units_metric,
                write_units_metric,
                scores_metric,
                wrapped_method,
            ),
        )
elif wrap_object == "GRPCIndex":
    if getattr(pinecone, wrap_object, None):
        wrap_function_wrapper(
            "pinecone",
            f"{wrap_object}.{wrap_method}",
            _wrap(
                tracer,
                query_duration_metric,
                read_units_metric,
                write_units_metric,
                scores_metric,
                wrapped_method,
            ),
        )
```
Add regression tests for both Index resolution paths.
The new branch logic introduces two execution paths (pinecone.db_data.index.Index and pinecone.Index) plus GRPC conditional wrapping. This should be covered with targeted tests to prevent silent instrumentation drift in future Pinecone releases.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In
`@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py`
around lines 224 - 273, Add regression tests that exercise both Index resolution
paths and the GRPCIndex branch: write tests that simulate/patch module
availability for "pinecone.db_data.index" so the branch that calls
wrap_function_wrapper("pinecone.db_data.index", "Index.<method>", _wrap(...)) is
executed, and separate tests where that module is absent so the fallback
wrap_function_wrapper("pinecone", "Index.<method>", _wrap(...)) runs; also add a
test that asserts the GRPCIndex branch wraps via
wrap_function_wrapper("pinecone", "GRPCIndex.<method>", _wrap(...)). Use
mocks/stubs to control getattr(pinecone, ...) and importlib.util.find_spec
outcomes and assert wrap_function_wrapper was invoked with the expected target
strings and _wrap was used.
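One way the requested tests could be structured, sketched with a hypothetical `choose_wrap_target` that mirrors the branch logic above (real tests would patch the instrumentation module itself and assert on the `wrap_function_wrapper` calls):

```python
import importlib.util
from unittest import mock

def choose_wrap_target():
    # Prefer the v7+ module path when present; fall back to the top-level package.
    if importlib.util.find_spec("pinecone.db_data.index") is not None:
        return ("pinecone.db_data.index", "Index.query")
    return ("pinecone", "Index.query")

# Path 1: the db_data module exists, so the new target is chosen.
with mock.patch("importlib.util.find_spec", return_value=object()):
    assert choose_wrap_target() == ("pinecone.db_data.index", "Index.query")

# Path 2: it does not, so the fallback target is used.
with mock.patch("importlib.util.find_spec", return_value=None):
    assert choose_wrap_target() == ("pinecone", "Index.query")

print("both resolution paths covered")
```

Patching `find_spec` keeps the tests independent of which pinecone version happens to be installed in CI.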
Fix for the issue #2753.

@nirga @galkleinman @nina-kollman, can any one of you review this PR for Issue #3625?
galkleinman left a comment:
@elinacse - Good move migrating from pinecone-client to pinecone.
A few concerns:
- The lower bound of >= 2.2.2 doesn't make sense for the pinecone package - the pinecone package name only started at v5.1.0 (earlier versions were published as pinecone-client). The lower bound should probably be >= 5.0.0 or even higher.
- The new module path handling (pinecone.db_data.index) is only added for Index but not for GRPCIndex — shouldn't GRPCIndex get the same treatment?
- The broad except Exception: pass blocks in _uninstrument could silently swallow real errors. Consider logging at debug level at minimum.
- Removing the upper bound entirely means this could break silently with future Pinecone releases. Was this tested against the current v8.x?
@galkleinman Thanks for the review, I have addressed the first 3 points; yes, I have tested it with v8.1.0.
Actionable comments posted: 1
♻️ Duplicate comments (1)
packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py (1)
301-322: ⚠️ Potential issue | 🟠 Major — Inconsistent `unwrap` path format across all unwrap calls.

The `unwrap` calls use a different argument pattern than `wrap_function_wrapper`. Based on the openai v1 instrumentation pattern (`unwrap("openai.resources.chat.completions", "Completions.create")`), the module and `Class.method` should be separate arguments.

Current inconsistencies:

- Line 303: `unwrap("pinecone.db_data.index.Index", wrap_method_name)` — class embedded in module path
- Line 311: `unwrap("pinecone.db_data.index.GRPCIndex", wrap_method_name)` — same issue
- Line 318: `unwrap(f"pinecone.{wrap_object}", wrap_method_name)` — same issue

All should match the wrap format.

🔧 Proposed fix to align unwrap with wrap pattern:

```diff
 if wrap_object == "Index":
     try:
-        unwrap("pinecone.db_data.index.Index", wrap_method_name)
+        unwrap("pinecone.db_data.index", f"Index.{wrap_method_name}")
     except Exception as e:
         logger.debug(
             f"Failed to unwrap pinecone.db_data.index.Index.{wrap_method_name}: {e}"
         )
 if wrap_object == "GRPCIndex":
     try:
-        unwrap("pinecone.db_data.index.GRPCIndex", wrap_method_name)
+        unwrap("pinecone.db_data.index", f"GRPCIndex.{wrap_method_name}")
     except Exception as e:
         logger.debug(
             f"Failed to unwrap pinecone.db_data.index.GRPCIndex.{wrap_method_name}: {e}"
         )
 try:
-    unwrap(f"pinecone.{wrap_object}", wrap_method_name)
+    unwrap("pinecone", f"{wrap_object}.{wrap_method_name}")
 except Exception as e:
     logger.debug(
         f"Failed to unwrap pinecone.{wrap_object}.{wrap_method_name}: {e}"
     )
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py` around lines 301 - 322, The unwrap calls pass combined module+class strings; change them to two-argument form matching wrap_function_wrapper (module path, "Class.method"). Replace unwrap("pinecone.db_data.index.Index", wrap_method_name) with unwrap("pinecone.db_data.index", f"Index.{wrap_method_name}"), unwrap("pinecone.db_data.index.GRPCIndex", wrap_method_name) with unwrap("pinecone.db_data.index", f"GRPCIndex.{wrap_method_name}"), and replace unwrap(f"pinecone.{wrap_object}", wrap_method_name) with unwrap("pinecone", f"{wrap_object}.{wrap_method_name}") so module and Class.method are separate arguments (refer to unwrap and wrap_function_wrapper usage).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py`:
- Line 36: Update the package compatibility tuple _instruments in __init__.py to
require pinecone >= 7.0.0 (replace "pinecone >= 5.1.0" with "pinecone >= 7.0.0")
so the instrumentation only installs with versions that include the
pinecone.db_data.index module used by the code (references: the _instruments
variable in
packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py).
---
Duplicate comments:
In
`@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py`:
- Around line 301-322: The unwrap calls pass combined module+class strings;
change them to two-argument form matching wrap_function_wrapper (module path,
"Class.method"). Replace unwrap("pinecone.db_data.index.Index",
wrap_method_name) with unwrap("pinecone.db_data.index",
f"Index.{wrap_method_name}"), unwrap("pinecone.db_data.index.GRPCIndex",
wrap_method_name) with unwrap("pinecone.db_data.index",
f"GRPCIndex.{wrap_method_name}"), and replace unwrap(f"pinecone.{wrap_object}",
wrap_method_name) with unwrap("pinecone", f"{wrap_object}.{wrap_method_name}")
so module and Class.method are separate arguments (refer to unwrap and
wrap_function_wrapper usage).
ℹ️ Review info
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Disabled knowledge base sources:
- Linear integration is disabled
You can enable these sources in your CodeRabbit configuration.
⛔ Files ignored due to path filters (1)
- packages/opentelemetry-instrumentation-pinecone/uv.lock is excluded by !**/*.lock
📒 Files selected for processing (2)
- packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py
- packages/opentelemetry-instrumentation-pinecone/pyproject.toml
🚧 Files skipped from review as they are similar to previous changes (1)
- packages/opentelemetry-instrumentation-pinecone/pyproject.toml
```diff
 logger = logging.getLogger(__name__)

-_instruments = ("pinecone-client >= 2.2.2, <6",)
+_instruments = ("pinecone >= 5.1.0",)
```
🧩 Analysis chain
🌐 Web query:
Pinecone Python SDK db_data.index module version history
💡 Result:
Pinecone Python SDK pinecone.db_data.index (a.k.a. “db_data.index”) — version history (high-signal)
- v6.x (example: v6.0.2): The REST Index client lived under `pinecone.data.index` (not `pinecone.db_data.index`). (raw.githubusercontent.com)
- v7.0.0 (June 2025): The SDK introduced the `pinecone.db_data.index` module (with an `Index` client implemented against the `db_data` OpenAPI surface). This coincided with a major-version bump tied to Pinecone API versioning (the release notes explicitly mention internal re-organization/moves while trying to preserve aliases). (raw.githubusercontent.com)
- v8.0.0 (Nov 18, 2025) → v8.0.1 (Feb 11, 2026): `pinecone.db_data.index` remains the module path; the implementation shows further refactors, including use of response adapters (e.g., `adapt_query_response`) and additional dataclasses like `FetchByMetadataResponse`, `Pagination`, and `UpdateResponse` (as seen in the v8.0.1 source). Also, 8.x comes with documented upgrade notes (e.g., Python 3.10+ requirement). (raw.githubusercontent.com)

Quick "where is it?" mapping:

- If you're on 6.x: import path is `pinecone.data.index`. (raw.githubusercontent.com)
- If you're on 7.x or 8.x: import path is `pinecone.db_data.index`. (raw.githubusercontent.com)
Citations:
- 1: https://raw.githubusercontent.com/pinecone-io/pinecone-python-client/v6.0.2/pinecone/data/index.py
- 2: https://raw.githubusercontent.com/pinecone-io/pinecone-python-client/v7.0.0/pinecone/db_data/index.py
- 3: https://raw.githubusercontent.com/pinecone-io/pinecone-python-client/v8.0.1/pinecone/db_data/index.py
- 4: https://raw.githubusercontent.com/pinecone-io/pinecone-python-client/v6.0.2/pinecone/data/index.py
- 5: https://raw.githubusercontent.com/pinecone-io/pinecone-python-client/v7.0.0/pinecone/db_data/index.py
🏁 Script executed:

```shell
find . -name "*.py" -path "*pinecone*" -type f | head -20
```

Repository: traceloop/openllmetry
Length of output: 895

🏁 Script executed:

```shell
cat -n packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py | head -100
```

Repository: traceloop/openllmetry
Length of output: 3544

🏁 Script executed:

```shell
grep -r "db_data" packages/opentelemetry-instrumentation-pinecone/ --include="*.py"
```

Repository: traceloop/openllmetry
Length of output: 1502

🏁 Script executed:

```shell
grep -r "from pinecone" packages/opentelemetry-instrumentation-pinecone/ --include="*.py"
```

Repository: traceloop/openllmetry
Length of output: 162

🏁 Script executed:

```shell
cat -n packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/query_handlers.py
```

Repository: traceloop/openllmetry
Length of output: 4246
Change minimum version constraint to pinecone >= 7.0.0.
The pinecone.db_data.index module was introduced in v7.0.0 (June 2025). Versions 5.x and 6.x do not have this module—v6.x uses pinecone.data.index instead. The current constraint pinecone >= 5.1.0 allows incompatible versions where the code's pinecone.db_data.index.Index and pinecone.db_data.index.GRPCIndex wrapping would fail at runtime.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In
`@packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py`
at line 36, Update the package compatibility tuple _instruments in __init__.py
to require pinecone >= 7.0.0 (replace "pinecone >= 5.1.0" with "pinecone >=
7.0.0") so the instrumentation only installs with versions that include the
pinecone.db_data.index module used by the code (references: the _instruments
variable in
packages/opentelemetry-instrumentation-pinecone/opentelemetry/instrumentation/pinecone/__init__.py).
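The constraint the thread converges on (>= 7.0.0, since that release introduced `pinecone.db_data.index`, with < 9 as the later-requested upper bound) can be illustrated with a small version gate; `satisfies` is an illustrative helper, not how pip/uv actually resolve specifiers:

```python
def satisfies(version, lower=(7, 0, 0), upper=(9, 0, 0)):
    # Naive semantic-version gate: lower bound inclusive, upper bound exclusive,
    # comparing versions as integer tuples.
    parts = tuple(int(p) for p in version.split("."))
    return lower <= parts < upper

candidates = ["5.1.0", "6.0.2", "7.0.0", "8.1.0", "9.0.0"]
print([v for v in candidates if satisfies(v)])  # ['7.0.0', '8.1.0']
```

Versions below 7 lack the module the instrumentation wraps, and the exclusive upper bound guards against untested future majors.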
Thanks! Regarding the 4th bullet, I still strongly suggest that we add an upper bound (`< 9`).
@galkleinman I have added version bound <9. Please have a look. Thanks!
@galkleinman Updated the version in all files, please have a look. Thank you!
feat(instrumentation): ... or fix(instrumentation): ...

Summary by CodeRabbit