Merged
6 changes: 4 additions & 2 deletions packages/core/src/logs/exports.ts
```diff
@@ -90,14 +90,16 @@ function setLogAttribute(
  */
 export function _INTERNAL_captureSerializedLog(client: Client, serializedLog: SerializedLog): void {
   const bufferMap = _getBufferMap();

   const logBuffer = _INTERNAL_getLogBuffer(client);

   if (logBuffer === undefined) {
     bufferMap.set(client, [serializedLog]);
   } else {
-    bufferMap.set(client, [...logBuffer, serializedLog]);
+    if (logBuffer.length >= MAX_LOG_BUFFER_SIZE) {
+      _INTERNAL_flushLogsBuffer(client, logBuffer);
+      bufferMap.set(client, [serializedLog]);
+    } else {
+      bufferMap.set(client, [...logBuffer, serializedLog]);
+    }
   }
 }
```
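The fixed logic can be modeled as a small standalone sketch. Everything below is a simplified stand-in for the Sentry internals (`_getBufferMap`, `_INTERNAL_flushLogsBuffer`, the real `Client` type), not the actual implementation; `MAX_LOG_BUFFER_SIZE` is assumed to be 100.

```typescript
// Minimal model of the fixed buffering logic. Names and types are
// simplified stand-ins for the Sentry internals, for illustration only.
type SerializedLog = { body: string };
type Client = { id: string };

const MAX_LOG_BUFFER_SIZE = 100;
const bufferMap = new Map<Client, SerializedLog[]>();
const flushed: SerializedLog[][] = [];

// Stand-in for _INTERNAL_flushLogsBuffer: records what was sent, then clears the buffer.
function flushLogsBuffer(client: Client, logBuffer: SerializedLog[]): void {
  flushed.push(logBuffer);
  bufferMap.set(client, []);
}

function captureSerializedLog(client: Client, serializedLog: SerializedLog): void {
  const logBuffer = bufferMap.get(client);
  if (logBuffer === undefined) {
    bufferMap.set(client, [serializedLog]);
  } else if (logBuffer.length >= MAX_LOG_BUFFER_SIZE) {
    // Flush the full buffer first, THEN start a fresh buffer containing the
    // new log, so the flush's reset can never overwrite the 101st log.
    flushLogsBuffer(client, logBuffer);
    bufferMap.set(client, [serializedLog]);
  } else {
    bufferMap.set(client, [...logBuffer, serializedLog]);
  }
}

const client: Client = { id: 'c1' };
for (let i = 0; i < 101; i++) {
  captureSerializedLog(client, { body: `log ${i}` });
}
// One flush of 100 logs went out; the 101st log survives in a fresh buffer.
console.log(flushed.length, flushed[0]?.length, bufferMap.get(client)?.length);
```

With the order reversed (flush before re-seeding the buffer), the new log always survives the flush's map reset.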
5 changes: 4 additions & 1 deletion packages/core/test/lib/logs/exports.test.ts
```diff
@@ -245,7 +245,10 @@ describe('_INTERNAL_captureLog', () => {
     // Add one more to trigger flush
     _INTERNAL_captureLog({ level: 'info', message: 'trigger flush' }, client, undefined);

-    expect(_INTERNAL_getLogBuffer(client)).toEqual([]);
+    // After flushing the 100 logs, the new log starts a fresh buffer with 1 item
+    const buffer = _INTERNAL_getLogBuffer(client);
+    expect(buffer).toHaveLength(1);
+    expect(buffer?.[0]?.body).toBe('trigger flush');
   });

   it('does not flush logs buffer when it is empty', () => {
```
Comment on lines 245 to 254
Bug: New logs are silently discarded when the log buffer reaches its maximum size and triggers a flush.
Severity: CRITICAL | Confidence: 1.00

🔍 Detailed Analysis

When the log buffer reaches MAX_LOG_BUFFER_SIZE and a new log is added, the system incorrectly flushes the old buffer before the new log is fully processed. Specifically, _INTERNAL_flushLogsBuffer is called with the logBuffer (100 items) before the serializedLog is properly accounted for. Subsequently, _getBufferMap().set(client, []) clears the buffer, overwriting the newly updated buffer that contained the 101st log. This results in the most recent log being silently discarded.

💡 Suggested Fix

Modify the logic to either check the new buffer size before flushing or ensure the new log is preserved after the buffer is cleared by the flush operation.
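A minimal reproduction of the pre-fix ordering described above makes the failure concrete. The names below are hypothetical stand-ins for the Sentry internals, not the real code:

```typescript
// Model of the PRE-FIX ordering: the new log is appended first, then the
// flush clears the map, silently discarding it. All names are illustrative.
type SerializedLog = { body: string };
const MAX_LOG_BUFFER_SIZE = 100;
const bufferMap = new Map<string, SerializedLog[]>();
const flushed: SerializedLog[][] = [];

function flushLogsBuffer(client: string, logBuffer: SerializedLog[]): void {
  flushed.push(logBuffer);
  bufferMap.set(client, []); // resets the buffer, clobbering anything set after the append
}

function buggyCapture(client: string, log: SerializedLog): void {
  const logBuffer = bufferMap.get(client);
  if (logBuffer === undefined) {
    bufferMap.set(client, [log]);
  } else {
    bufferMap.set(client, [...logBuffer, log]); // 101st log appended here...
    if (logBuffer.length >= MAX_LOG_BUFFER_SIZE) {
      // ...then lost: only the old 100 items are flushed, and the reset
      // above overwrites the buffer that contained the 101st log.
      flushLogsBuffer(client, logBuffer);
    }
  }
}

for (let i = 0; i < 101; i++) {
  buggyCapture('c1', { body: `log ${i}` });
}
// The buffer ends up empty: 'log 100' was neither flushed nor buffered.
console.log(flushed[0]?.length, bufferMap.get('c1')?.length);
```

Checking the buffer size before appending (as the fix does) removes the window in which the reset can overwrite the new log.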

🤖 Prompt for AI Agent

Review the code at the location below. A potential bug has been identified by an AI agent. Verify if this is a real issue. If it is, propose a fix; if not, explain why it's not valid.

Location: packages/core/test/lib/logs/exports.test.ts#L245-L254

Potential issue: When the log buffer reaches `MAX_LOG_BUFFER_SIZE` and a new log is added, the system incorrectly flushes the old buffer before the new log is fully processed. Specifically, `_INTERNAL_flushLogsBuffer` is called with the `logBuffer` (100 items) before the `serializedLog` is properly accounted for. Subsequently, `_getBufferMap().set(client, [])` clears the buffer, overwriting the newly updated buffer that contained the 101st log. This results in the most recent log being silently discarded.

Reference_id: 2693436

Member Author
Indeed, that's what this PR fixes with ba24997.
