
Consider adding a mode which supports fine-grained debugging #174

Closed
johnivdel opened this issue Jul 7, 2021 · 4 comments
Labels
debugging-monitoring Issues relating to debugging, locally or at scale

Comments

@johnivdel
Collaborator

One of the issues with the debugging approach in #160 is that it can be difficult to find and pinpoint issues with only aggregate data.

There was discussion in a previous call around having a non-default mode for the API which allows for finer-grained event-level debugging.

This mode could support sending additional conversion side information, and state regarding API limits such as the max reports per source limit, report rate-limits, etc.

While not enabled by default, this debugging mode could be enabled by individual browsers, or potentially browsers managed by an organization, to debug API integrations and issues.

@maudnals maudnals added the debugging-monitoring Issues relating to debugging, locally or at scale label Aug 18, 2021
@ycmou

ycmou commented Sep 30, 2021

Throwing out another idea -- could there be support for some sort of auditing feature? Especially while the API and its data are in their early stages, it would be great to have a mechanism for us to compare and validate the attributions coming from the Attribution Reporting API against our own internal attribution dataset. That would let us establish trust in the data and understand how it aligns with or differs from our own.

Today it is unclear whether the Attribution Reporting API's data is a superset, a subset, or some other overlap of our own attributions, which makes it difficult to determine how to use the data effectively for showing accurate final attribution reporting to our users. Could we support an auditing feature where a unique event ID is sent back to the browser as part of the triggering flow, and reports for those audited attributions include that event ID, so that we can tell whether the attribution is the same as our internal one or different?

@rowan-m

rowan-m commented Jan 27, 2022

We are planning to move ahead with providing this level of site-enabled debugging until third-party cookies are phased out, since we can use a cookie to provide a clear marker that this debugging is enabled. While the API proposal is still under active discussion and testing, we would like to provide more fine-grained debugging functionality so that developers can effectively validate their own implementations and compare them to current cookie-based solutions.

As this would be explicitly for debugging during this transitional phase, we're proposing linking it to the ability of the developer to already use a cookie in those contexts. This means that the debugging functionality is not providing any additional data for potential tracking that was not already available to the developer in that context. As in, if the party calling the Attribution Reporting API has access to the same cookie at the time of ad click and at the time of conversion, the site is already able to create a cross-site identifier for that specific combination.

The detail is in the explainer updates, but the proposed implementation looks like:

  • To enable debugging, the site that receives attribution reports (the reporting endpoint, i.e. the origin set as attributionreportto) must set the following cookie:
    • Set-Cookie: ar_debug=1; SameSite=None; Secure; HttpOnly
  • Attribution Reporting API can trigger debugging reports for data created after the cookie was created
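As a sketch, the reporting origin's opt-in could look like the following. The helper name and the WSGI-style header list are illustrative assumptions; only the cookie name, value, and attributes come from the proposal above.

```python
# Sketch of a reporting origin opting into debugging reports.
# Only the Set-Cookie value comes from the proposal; the helper name
# and the (name, value) header-list shape are illustrative.

AR_DEBUG_COOKIE = "ar_debug=1; SameSite=None; Secure; HttpOnly"

def add_debug_cookie(headers):
    """Append the debug opt-in cookie to a list of (name, value)
    response headers, as a WSGI-style application might build them."""
    return headers + [("Set-Cookie", AR_DEBUG_COOKIE)]

# Any response from the reporting origin sent before the attribution
# events occur can carry the cookie.
headers = add_debug_cookie([("Content-Type", "text/plain")])
```

Because the cookie is `HttpOnly` and `Secure` with `SameSite=None`, it can only be set over HTTPS, is not readable from script, and is sent on the cross-site requests that attribution reporting involves.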

This means:

  • If the user has blocked third-party cookies, it is not possible for the site to enable debugging.
  • If the user removes the cookie, no debugging information from before clearing can be sent.
  • To trigger debugging at ad click time, a request must go to the reporting origin at some point before that event occurs.

There is also more detail on the report contents in the explainer.

@Paul-Ki
Contributor

Paul-Ki commented Mar 10, 2022

Question on the last bullet point:

At impression time, Ad Tech will set the ar_debug cookie as part of the same response in which registration data is sent. Would this lead to a timing issue where the cookie may not yet be set at the time Chrome registers the impression? Would Chrome consider this sufficient to show that there was access to a third-party cookie at both impression and conversion time?

@apasel422
Collaborator

Question on the last bullet point:

At impression time, Ad Tech will set the ar_debug cookie as part of the same response in which registration data is sent. Would this lead to a timing issue where the cookie may not yet be set at the time Chrome registers the impression? Would Chrome consider this sufficient to show that there was access to a third-party cookie at both impression and conversion time?

Yes, that should work.
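The scenario in the question, which the answer above confirms works, can be sketched as a single impression-time response that both returns registration data and sets the debug cookie. The X-Registration-Data header name and the payload are hypothetical placeholders; only the Set-Cookie line reflects the proposal.

```python
# Sketch of one impression-time response that both carries
# registration data and sets ar_debug. The X-Registration-Data
# header name and payload shape are hypothetical placeholders.

def impression_response(registration_payload):
    """Build (status, headers) for a registration response that also
    opts the reporting origin into debugging."""
    headers = [
        ("X-Registration-Data", registration_payload),  # hypothetical
        ("Set-Cookie", "ar_debug=1; SameSite=None; Secure; HttpOnly"),
    ]
    return "200 OK", headers

status, headers = impression_response("source_event_id=123")
```

Since the cookie and the registration data arrive in the same response, the cookie is stored by the time any later (conversion-time) request is made, which is what makes this sufficient for the "cookie access at both impression and conversion time" condition.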


7 participants