Differential Privacy

This report provides differential privacy guarantees. Differential privacy ensures that the output of an algorithm remains nearly unchanged if the records of one individual (user-level privacy) or of one trip (item-level privacy) are added or removed. It thereby limits the impact of any single individual on the analysis outcome and prevents the reconstruction of an individual's data. Broadly speaking, this is achieved by adding calibrated noise to the output, where the amount of noise is determined by the privacy_budget. Depending on the setting of user_privacy, the noise is calibrated either to protect only single trips (item-level privacy) or to protect entire users (user-level privacy). The privacy budget is split between all analyses. The configuration table lists the used privacy_budget, the budget_split and the user_privacy setting. For each analysis, the amount of privacy budget it consumed is reported.
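A proportional split of the total budget could be sketched as follows. This is an illustrative assumption: the analysis names and the weighting scheme are hypothetical and do not describe the report's actual budget_split semantics.

```python
def split_budget(privacy_budget, budget_split):
    """Distribute the total privacy budget across analyses
    proportionally to their weights (illustrative sketch)."""
    total_weight = sum(budget_split.values())
    return {
        analysis: privacy_budget * weight / total_weight
        for analysis, weight in budget_split.items()
    }

# Hypothetical analysis names; a larger weight grants a larger share
# of the total budget (and thus less noise for that analysis).
epsilons = split_budget(1.0, {"visits_per_tile": 2, "trip_counts": 1, "od_flows": 1})
```

The shares sum to the total budget, so running all analyses together still satisfies the overall guarantee under sequential composition.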

The Laplace mechanism is used for counts and the Exponential mechanism for the five-number summaries. Details on the notion of differential privacy and the mechanisms used are provided in the documentation.
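As a minimal sketch of the Laplace mechanism applied to a count query, assuming a sensitivity of 1 (i.e., adding or removing one trip changes the count by at most 1, as in the item-level setting):

```python
import numpy as np

def noisy_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with Laplace noise of scale sensitivity / epsilon.

    For item-level privacy a single trip changes a count by at most 1,
    so sensitivity = 1. For user-level privacy the sensitivity must be
    bounded by the maximum number of trips per user (assumed to be
    handled elsewhere; see the documentation for the exact bounding).
    """
    scale = sensitivity / epsilon
    return true_count + np.random.laplace(loc=0.0, scale=scale)

# A smaller epsilon (less privacy budget) yields a larger noise scale.
released = noisy_count(100, epsilon=0.1)
```

The noise is unbiased, so averaging many independent releases would recover the true count; this is exactly why repeated queries must each consume privacy budget.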