
Align our principles with what we're building #58

Open · martinthomson opened this issue Apr 10, 2024 · 0 comments

martinthomson (Collaborator) commented Apr 10, 2024

This came up in the discussion on #52. It was not clear that the text about researchers and auditors was consistent with what is actually possible in the systems we're building. At that time, I said:

Any system that handles user data in the aggregate needs to provide strong constraints that limit the possibility that the data is misused. However, the uses of data that are permitted within those constraints might still admit narrower forms of abuse. For measurement, this might involve selectively targeting individuals or groups of individuals in order to obtain increasingly actionable data about their online activities.

For advertising purposes, this sort of targeting is often a primary goal of measurement systems. A problem arises when this targeting is repeated to the point that it puts individuals at greater risk of exploitation based on the information that is obtained.

The distinction between abusive uses and ordinary uses of these systems could be hard to draw without additional information about the inputs to the system.

The measurement systems being proposed all rely on oblivious computation to some degree. This means that observing their internal operation reveals no meaningful information. Consequently, most of the information of interest is held by companies in the advertising market: ad techs, publishers, and advertisers.

The key challenge in gaining access to that information is that anything that might be needed to detect abuse is also virtually guaranteed to be commercially sensitive. Revealing information about the conduct of measurement also reveals how advertisers place their advertisements, how they structure their bidding strategies, and even details about their clients.

It might be possible for an independent researcher or auditor to gain access to this sort of information. They might be able to convince participants to allow access for certain narrow purposes. The current environment, however, does not give market participants good incentives to accede to that sort of inspection: inspection carries risks both to commercially sensitive data and to the reputation of the advertiser, with no real upside.

The question we need to ask is whether any change to how the system operates might make it more open to these sorts of aggregate, independent forms of accountability. In doing so, we need to balance the commercial sensitivity of those participating in advertising against those accountability goals. At the same time, we need to sustain the high standards we have for privacy.

Originally posted by @martinthomson in #52 (comment)
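
To make the repeated-targeting concern above concrete, here is a minimal sketch of how repetition can erode a noise-based aggregate protection. Everything in it is illustrative rather than a feature of any proposed system: the count query, the per-query epsilon, and the absence of a global privacy budget are all assumptions.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponentials is Laplace(0, scale).
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def noisy_count(true_count: int, epsilon: float) -> float:
    # One differentially private count query (sensitivity 1),
    # so the noise scale is 1 / epsilon.
    return true_count + laplace_noise(1.0 / epsilon)

# A querier who repeatedly targets the same small group can average
# the noise away: n repeats of an epsilon query leak roughly as much
# as a single query at n * epsilon, unless a global budget stops them.
true_count = 3   # hypothetical: conversions within a tiny, targeted audience
estimates = [noisy_count(true_count, epsilon=0.1) for _ in range(1000)]
print(sum(estimates) / len(estimates))   # converges toward 3
```

Each individual answer looks well protected; the sequence as a whole is not. Detecting that pattern requires knowing who is being queried and how often, which is exactly the commercially sensitive input information the quoted comment describes.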
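
The claim that observing the helpers' internal operation reveals nothing can be illustrated with additive secret sharing, one common building block for oblivious aggregation. This is a sketch of the general technique under assumed parameters, not the protocol or wire format of any specific proposal.

```python
import secrets

Q = 2**61 - 1  # modulus for share arithmetic; any sufficiently large prime works

def share(value: int, n_helpers: int = 2) -> list[int]:
    # Split a value into additive shares mod Q. Any subset of fewer
    # than n_helpers shares is uniformly random and reveals nothing.
    shares = [secrets.randbelow(Q) for _ in range(n_helpers - 1)]
    shares.append((value - sum(shares)) % Q)
    return shares

def aggregate(reports: list[list[int]]) -> int:
    # Each helper sums only the column of shares that it holds.
    # Combining the helpers' sums yields the aggregate and nothing else.
    helper_sums = [sum(column) % Q for column in zip(*reports)]
    return sum(helper_sums) % Q

reports = [share(v) for v in (1, 0, 1, 1)]  # hypothetical per-user signals
print(aggregate(reports))                   # 3: the total, never the parts
```

Auditing the helpers in such a design tells an observer very little, which is why the interesting information (who was measured, by whom, and for what) sits with ad techs, publishers, and advertisers.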
