Commit

Merge branch 'main' into issue_3985
ocelotl committed Jul 11, 2024
2 parents aafaeae + 5dc48c5 commit d0b0f78
Showing 70 changed files with 450 additions and 100 deletions.
2 changes: 1 addition & 1 deletion .codespellrc
@@ -1,4 +1,4 @@
[codespell]
# skipping auto generated folders
skip = ./.tox,./.mypy_cache,./docs/_build,./target,*/LICENSE,./venv,.git,./opentelemetry-semantic-conventions
skip = ./.tox,./.mypy_cache,./docs/_build,./target,*/LICENSE,./venv,.git,./opentelemetry-semantic-conventions,*-requirements*.txt
ignore-words-list = ans,ue,ot,hist,ro
54 changes: 54 additions & 0 deletions .github/workflows/lint.yml
@@ -0,0 +1,54 @@
name: Lint tests

on:
  push:
    branches-ignore:
    - 'release/*'
  pull_request:

jobs:
  lint-3_12:
    strategy:
      fail-fast: false # ensures the entire test matrix is run, even if one permutation fails
      matrix:
        package:
          - "opentelemetry-api"
          - "opentelemetry-proto"
          - "opentelemetry-sdk"
          - "opentelemetry-semantic-conventions"
          - "opentelemetry-getting-started"
          - "opentelemetry-opentracing-shim"
          - "opentelemetry-opencensus-shim"
          - "opentelemetry-exporter-opencensus"
          - "opentelemetry-exporter-otlp-proto-common"
          - "opentelemetry-exporter-otlp-combined"
          - "opentelemetry-exporter-otlp-proto-grpc"
          - "opentelemetry-exporter-otlp-proto-http"
          - "opentelemetry-exporter-otlp-proto-prometheus"
          - "opentelemetry-exporter-otlp-proto-zipkin-combined"
          - "opentelemetry-exporter-otlp-proto-zipkin-proto-http"
          - "opentelemetry-exporter-otlp-proto-zipkin-json"
          - "opentelemetry-propagator-b3"
          - "opentelemetry-propagator-jaeger"
          - "opentelemetry-test-utils"
        os: [ubuntu-20.04]
    runs-on: ubuntu-20.04
    steps:
      - name: Checkout Core Repo @ SHA - ${{ github.sha }}
        uses: actions/checkout@v4
      - name: Set up Python 3.12
        uses: actions/setup-python@v5
        with:
          python-version: 3.12
      - name: Install tox
        run: pip install tox
      - name: Cache tox environment
        # Preserves .tox directory between runs for faster installs
        uses: actions/cache@v4
        with:
          path: |
            .tox
            ~/.cache/pip
          key: v7-build-tox-cache-${{ matrix.package }}-${{ hashFiles('tox.ini', 'gen-requirements.txt', 'dev-requirements.txt') }}
      - name: run tox
        run: tox -e lint-${{ matrix.package }}
6 changes: 4 additions & 2 deletions .github/workflows/test.yml
@@ -56,7 +56,7 @@ jobs:
- "exporter-zipkin-combined"
- "exporter-zipkin-proto-http"
- "exporter-zipkin-json"
- "protobuf"
- "proto"
- "propagator-b3"
- "propagator-jaeger"
os: [ubuntu-20.04, windows-2019]
@@ -69,6 +69,8 @@ jobs:
package: "exporter-otlp-combined"
- python-version: pypy3
package: "exporter-otlp-proto-grpc"
- python-version: pypy3
package: "getting-started"

steps:
- name: Checkout Core Repo @ SHA - ${{ github.sha }}
@@ -98,7 +100,7 @@ jobs:
strategy:
fail-fast: false
matrix:
tox-environment: ["docker-tests-proto3", "docker-tests-proto4", "lint", "spellcheck",
tox-environment: ["docker-tests-proto3", "docker-tests-proto4", "spellcheck",
"docs", "mypy", "mypyinstalled", "tracecontext"]
name: ${{ matrix.tox-environment }}
runs-on: ubuntu-20.04
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -7,6 +7,10 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## Unreleased

- Fix OTLP exporter encoding of invalid span/trace IDs in the logs
([#4006](https://github.com/open-telemetry/opentelemetry-python/pull/4006))
- Update sdk process resource detector `process.command_args` attribute to also include the executable itself
([#4032](https://github.com/open-telemetry/opentelemetry-python/pull/4032))
- Fix `start_time_unix_nano` for delta collection for explicit bucket histogram aggregation
([#4009](https://github.com/open-telemetry/opentelemetry-python/pull/4009))
- Fix `start_time_unix_nano` for delta collection for sum aggregation
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -57,7 +57,7 @@ You can run `tox` with the following arguments:
- `tox -e py312-opentelemetry-api` to e.g. run the API unit tests under a specific
Python version
- `tox -e spellcheck` to run a spellcheck on all the code
- `tox -e lint` to run lint checks on all code
- `tox -e lint-some-package` to run lint checks on `some-package`

`black` and `isort` are executed when `tox -e lint` is run. The reported errors can be tedious to fix manually.
An easier way to do so is:
3 changes: 3 additions & 0 deletions docs/examples/metrics/instruments/requirements.txt
@@ -0,0 +1,3 @@
opentelemetry-api~=1.25
opentelemetry-sdk~=1.25
opentelemetry-exporter-otlp~=1.25
File renamed without changes.
2 changes: 1 addition & 1 deletion docs/getting_started/flask_example.py
@@ -40,7 +40,7 @@
@app.route("/")
def hello():
    with tracer.start_as_current_span("example-request"):
        requests.get("http://www.example.com")
        requests.get("http://www.example.com", timeout=10)
    return "hello"


3 changes: 3 additions & 0 deletions docs/getting_started/tests/requirements.txt
@@ -23,5 +23,8 @@ Werkzeug==3.0.3
wrapt==1.15.0
zipp==3.19.2
-e opentelemetry-semantic-conventions
-e opentelemetry-proto
-e exporter/opentelemetry-exporter-otlp-proto-common
-e exporter/opentelemetry-exporter-otlp-proto-grpc
-e opentelemetry-api
-e opentelemetry-sdk
6 changes: 4 additions & 2 deletions docs/getting_started/tests/test_flask.py
@@ -19,14 +19,16 @@

import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.retry import Retry
from requests.packages.urllib3.util.retry import ( # pylint: disable=import-error
    Retry,
)


class TestFlask(unittest.TestCase):
    def test_flask(self):
        dirpath = os.path.dirname(os.path.realpath(__file__))
        server_script = f"{dirpath}/../flask_example.py"
        server = subprocess.Popen(
        server = subprocess.Popen( # pylint: disable=consider-using-with
            [sys.executable, server_script],
            stdout=subprocess.PIPE,
        )
Empty file.
Empty file.
Empty file.
Empty file.
Empty file.
@@ -39,11 +39,21 @@ def encode_logs(batch: Sequence[LogData]) -> ExportLogsServiceRequest:


def _encode_log(log_data: LogData) -> PB2LogRecord:
    span_id = (
        None
        if log_data.log_record.span_id == 0
        else _encode_span_id(log_data.log_record.span_id)
    )
    trace_id = (
        None
        if log_data.log_record.trace_id == 0
        else _encode_trace_id(log_data.log_record.trace_id)
    )
    return PB2LogRecord(
        time_unix_nano=log_data.log_record.timestamp,
        observed_time_unix_nano=log_data.log_record.observed_timestamp,
        span_id=_encode_span_id(log_data.log_record.span_id),
        trace_id=_encode_trace_id(log_data.log_record.trace_id),
        span_id=span_id,
        trace_id=trace_id,
        flags=int(log_data.log_record.trace_flags),
        body=_encode_value(log_data.log_record.body),
        severity_text=log_data.log_record.severity_text,
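For readers skimming the diff: the new guard maps the SDK's sentinel value of `0` (an invalid span/trace ID) to `None`, and the protobuf constructor treats a `None` keyword argument as "field not set", so the serialized record omits the ID instead of carrying an all-zero value. Below is a minimal standalone sketch of the same pattern; the `encode_id` helper is illustrative only (the exporter uses its internal `_encode_span_id`/`_encode_trace_id` functions).

```python
from opentelemetry.proto.logs.v1.logs_pb2 import LogRecord as PB2LogRecord


def encode_id(value: int, width: int) -> bytes:
    # Illustrative stand-in for the exporter's ID encoders:
    # fixed-width big-endian bytes, as OTLP expects.
    return value.to_bytes(length=width, byteorder="big")


span_id = 5213367945872657629
trace_id = 0  # 0 marks an invalid/absent trace context in the SDK

record = PB2LogRecord(
    time_unix_nano=1644650249738562048,
    span_id=None if span_id == 0 else encode_id(span_id, 8),
    trace_id=None if trace_id == 0 else encode_id(trace_id, 16),
)

# Passing None leaves the field unset, so the serialized message drops
# trace_id instead of carrying sixteen zero bytes.
print(len(record.span_id), len(record.trace_id))  # 8 0
```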
@@ -239,8 +239,8 @@ def get_test_logs(
        PB2LogRecord(
            time_unix_nano=1644650249738562048,
            observed_time_unix_nano=1644650249738562049,
            trace_id=_encode_trace_id(0),
            span_id=_encode_span_id(0),
            trace_id=None,
            span_id=None,
            flags=int(TraceFlags.DEFAULT),
            severity_text="WARN",
            severity_number=SeverityNumber.WARN.value,
Empty file.
@@ -21,6 +21,7 @@
from google.protobuf.duration_pb2 import ( # pylint: disable=no-name-in-module
    Duration,
)
from google.protobuf.json_format import MessageToDict
from google.rpc.error_details_pb2 import RetryInfo
from grpc import ChannelCredentials, Compression, StatusCode, server

@@ -167,6 +168,36 @@ def setUp(self):
                "third_name", "third_version"
            ),
        )
        self.log_data_4 = LogData(
            log_record=LogRecord(
                timestamp=int(time.time() * 1e9),
                trace_id=0,
                span_id=5213367945872657629,
                trace_flags=TraceFlags(0x01),
                severity_text="ERROR",
                severity_number=SeverityNumber.WARN,
                body="Invalid trace id check",
                resource=SDKResource({"service": "myapp"}),
            ),
            instrumentation_scope=InstrumentationScope(
                "fourth_name", "fourth_version"
            ),
        )
        self.log_data_5 = LogData(
            log_record=LogRecord(
                timestamp=int(time.time() * 1e9),
                trace_id=2604504634922341076776623263868986801,
                span_id=0,
                trace_flags=TraceFlags(0x01),
                severity_text="ERROR",
                severity_number=SeverityNumber.WARN,
                body="Invalid span id check",
                resource=SDKResource({"service": "myapp"}),
            ),
            instrumentation_scope=InstrumentationScope(
                "fifth_name", "fifth_version"
            ),
        )

    def tearDown(self):
        self.server.stop(None)
@@ -342,6 +373,43 @@ def test_failure(self):
            self.exporter.export([self.log_data_1]), LogExportResult.FAILURE
        )

    def export_log_and_deserialize(self, log_data):
        # pylint: disable=protected-access
        translated_data = self.exporter._translate_data([log_data])
        request_dict = MessageToDict(translated_data)
        log_records = (
            request_dict.get("resourceLogs")[0]
            .get("scopeLogs")[0]
            .get("logRecords")
        )
        return log_records

    def test_exported_log_without_trace_id(self):
        log_records = self.export_log_and_deserialize(self.log_data_4)
        if log_records:
            log_record = log_records[0]
            self.assertIn("spanId", log_record)
            self.assertNotIn(
                "traceId",
                log_record,
                "traceId should not be present in the log record",
            )
        else:
            self.fail("No log records found")

    def test_exported_log_without_span_id(self):
        log_records = self.export_log_and_deserialize(self.log_data_5)
        if log_records:
            log_record = log_records[0]
            self.assertIn("traceId", log_record)
            self.assertNotIn(
                "spanId",
                log_record,
                "spanId should not be present in the log record",
            )
        else:
            self.fail("No log records found")

    def test_translate_log_data(self):

        expected = ExportLogsServiceRequest(
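The two new assertions rely on `google.protobuf.json_format.MessageToDict` omitting fields that are unset or left at their default value, which is why an ID encoded as `None` simply disappears from the resulting dictionary. A small hedged illustration of that behavior with arbitrary values:

```python
from google.protobuf.json_format import MessageToDict

from opentelemetry.proto.logs.v1.logs_pb2 import LogRecord as PB2LogRecord

# span_id is set; trace_id is deliberately left unset (default empty bytes).
record = PB2LogRecord(
    time_unix_nano=1644650249738562048,
    span_id=(5213367945872657629).to_bytes(8, "big"),
)

as_dict = MessageToDict(record)
print("spanId" in as_dict)   # True - bytes fields appear as base64 strings
print("traceId" in as_dict)  # False - unset/default fields are omitted
```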
Empty file.
@@ -20,6 +20,7 @@

import requests
import responses
from google.protobuf.json_format import MessageToDict

from opentelemetry._logs import SeverityNumber
from opentelemetry.exporter.otlp.proto.http import Compression
@@ -31,6 +32,9 @@
    OTLPLogExporter,
)
from opentelemetry.exporter.otlp.proto.http.version import __version__
from opentelemetry.proto.collector.logs.v1.logs_service_pb2 import (
    ExportLogsServiceRequest,
)
from opentelemetry.sdk._logs import LogData
from opentelemetry.sdk._logs import LogRecord as SDKLogRecord
from opentelemetry.sdk._logs.export import LogExportResult
@@ -167,6 +171,76 @@ def test_exporter_env(self):
        )
        self.assertIsInstance(exporter._session, requests.Session)

    @staticmethod
    def export_log_and_deserialize(log):
        with patch("requests.Session.post") as mock_post:
            exporter = OTLPLogExporter()
            exporter.export([log])
            request_body = mock_post.call_args[1]["data"]
            request = ExportLogsServiceRequest()
            request.ParseFromString(request_body)
            request_dict = MessageToDict(request)
            log_records = (
                request_dict.get("resourceLogs")[0]
                .get("scopeLogs")[0]
                .get("logRecords")
            )
            return log_records

    def test_exported_log_without_trace_id(self):
        log = LogData(
            log_record=SDKLogRecord(
                timestamp=1644650195189786182,
                trace_id=0,
                span_id=1312458408527513292,
                trace_flags=TraceFlags(0x01),
                severity_text="WARN",
                severity_number=SeverityNumber.WARN,
                body="Invalid trace id check",
                resource=SDKResource({"first_resource": "value"}),
                attributes={"a": 1, "b": "c"},
            ),
            instrumentation_scope=InstrumentationScope("name", "version"),
        )
        log_records = TestOTLPHTTPLogExporter.export_log_and_deserialize(log)
        if log_records:
            log_record = log_records[0]
            self.assertIn("spanId", log_record)
            self.assertNotIn(
                "traceId",
                log_record,
                "trace_id should not be present in the log record",
            )
        else:
            self.fail("No log records found")

    def test_exported_log_without_span_id(self):
        log = LogData(
            log_record=SDKLogRecord(
                timestamp=1644650195189786360,
                trace_id=89564621134313219400156819398935297696,
                span_id=0,
                trace_flags=TraceFlags(0x01),
                severity_text="WARN",
                severity_number=SeverityNumber.WARN,
                body="Invalid span id check",
                resource=SDKResource({"first_resource": "value"}),
                attributes={"a": 1, "b": "c"},
            ),
            instrumentation_scope=InstrumentationScope("name", "version"),
        )
        log_records = TestOTLPHTTPLogExporter.export_log_and_deserialize(log)
        if log_records:
            log_record = log_records[0]
            self.assertIn("traceId", log_record)
            self.assertNotIn(
                "spanId",
                log_record,
                "spanId should not be present in the log record",
            )
        else:
            self.fail("No log records found")

    @responses.activate
    @patch("opentelemetry.exporter.otlp.proto.http._log_exporter.sleep")
    def test_exponential_backoff(self, mock_sleep):
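The static helper above captures the exporter's HTTP body via `mock_post` and parses it back with `ParseFromString`. The sketch below reproduces that serialize/parse round trip directly on the OTLP protos, with arbitrary example values, to show what the test ends up inspecting:

```python
from opentelemetry.proto.collector.logs.v1.logs_service_pb2 import (
    ExportLogsServiceRequest,
)
from opentelemetry.proto.logs.v1.logs_pb2 import (
    LogRecord,
    ResourceLogs,
    ScopeLogs,
)

# Build a request shaped like the exporter's payload, serialize it as the
# HTTP body would be, then parse it back into a fresh message.
request = ExportLogsServiceRequest(
    resource_logs=[
        ResourceLogs(
            scope_logs=[
                ScopeLogs(
                    log_records=[
                        LogRecord(
                            time_unix_nano=1644650195189786182,
                            span_id=(1312458408527513292).to_bytes(8, "big"),
                        )
                    ]
                )
            ]
        )
    ]
)

body = request.SerializeToString()
parsed = ExportLogsServiceRequest()
parsed.ParseFromString(body)
assert parsed == request  # lossless round trip
```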
Empty file.
Empty file.
Empty file.
Empty file.
@@ -17,6 +17,7 @@ urllib3==2.2.2
wrapt==1.16.0
zipp==3.19.2
-e opentelemetry-api
-e opentelemetry-proto
-e exporter/opentelemetry-exporter-zipkin-json
-e opentelemetry-sdk
-e tests/opentelemetry-test-utils
Empty file.
15 changes: 15 additions & 0 deletions lint-requirements.txt
@@ -0,0 +1,15 @@
astroid==3.0.3
black==24.3.0
click==8.1.7
dill==0.3.8
flake8==6.1.0
isort==5.12.0
mccabe==0.7.0
mypy-extensions==1.0.0
packaging==24.0
pathspec==0.12.1
platformdirs==4.2.1
pycodestyle==2.11.1
pyflakes==3.1.0
pylint==3.0.2
tomlkit==0.12.4
Empty file added opentelemetry-api/py.typed
Empty file.
