
❖ DPLaplaceNoiser


Last updated 16 days ago

Compatibility: numerical data

The DPLaplaceNoiser uses differential privacy techniques to add noise to your data. It adds noise to numerical data using the Laplace mechanism and, if requested, it also uses a randomized response mechanism to add noise to missing values.

As a result, the entire column of transformed data will have differential privacy guarantees. (This transformer does not do anything on the reverse transform, as it is not possible to undo the differential privacy noise.)

from rdt.transformers.numerical import DPLaplaceNoiser

transformer = DPLaplaceNoiser(
    epsilon=3.5,
    known_min_value=0,
    noise_missing_values=True
)

Parameters

(required) epsilon: A float >0 that represents the privacy loss budget you are willing to accept.

known_min_value: A previously-known min value that the data must take. Providing this value will help to conserve the privacy budget and ultimately yield higher fidelity data for the same epsilon value.

The min value should represent prior knowledge of the data. In order to enforce differential privacy, it is critical that the min value is prior knowledge that is not based on any computations of the real data.

(default) None: There is no known minimum value for the data. The transformer will compute one based on the fitted data, using some of the privacy budget.

<float>: The transformer will ensure the data never falls below this value. This does not use any privacy budget.

known_max_value: A previously-known max value that the data must take. Providing this value will help to conserve the privacy budget and ultimately yield higher fidelity data for the same epsilon value.

The max value should represent prior knowledge of the data. In order to enforce differential privacy, it is critical that the max value is prior knowledge that is not based on any computations of the real data.

(default) None: There is no known maximum value for the data. The transformer will compute one based on the fitted data, using some of the privacy budget.

<float>: The transformer will ensure the data never exceeds this value. This does not use any privacy budget.
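When both bounds are supplied, out-of-range values can be enforced deterministically rather than estimated privately, which is why no budget is spent. A minimal NumPy sketch of the idea (the actual enforcement inside DPLaplaceNoiser is not shown on this page):

```python
import numpy as np

# With known bounds, out-of-range values can simply be clipped.
# Clipping relies only on prior knowledge, so it spends no privacy budget.
data = np.array([-3.0, 12.0, 250.0, 40.0])

# Hypothetical bounds, mirroring known_min_value=0 and known_max_value=100.
clipped = np.clip(data, 0.0, 100.0)

# Values outside [0, 100] are pulled to the nearest boundary;
# in-range values pass through unchanged.
```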

noise_missing_values: Add this argument to add noise to the missing values in your data. Noise means that some of the missing values will flip to non-missing (and vice versa).

(default) False: Do not add any noise to the missing values. In doing this, we assume that the missing values are not statistically relevant and thus do not require any privacy budget.

True: Use a randomized response mechanism to perturb the missing values. This will use some of the privacy budget.
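The classic ε-differentially private randomized response mechanism reports the true bit with probability e^ε/(1+e^ε) and the flipped bit otherwise. The page does not specify exactly which variant DPLaplaceNoiser uses, so this NumPy sketch is only an illustration of the general technique applied to a missingness mask:

```python
import math
import numpy as np

def flip_probability(epsilon):
    """Probability of reporting the flipped (untrue) bit under epsilon-DP."""
    return 1.0 / (1.0 + math.exp(epsilon))

def randomize_missing_mask(is_missing, epsilon, rng):
    # Independently flip each True/False entry with probability 1 / (1 + e^eps).
    flips = rng.random(is_missing.shape) < flip_probability(epsilon)
    return np.where(flips, ~is_missing, is_missing)

rng = np.random.default_rng(0)
mask = np.array([True, False, False, True, False])
noisy = randomize_missing_mask(mask, epsilon=3.5, rng=rng)
# Higher epsilon -> lower flip probability -> mask stays closer to the original.
```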

learn_rounding_scheme: Add this argument to allow the transformer to learn about rounded values in your dataset.

(default) False: Do not learn or enforce any rounding scheme. When reverse transforming the data, many decimal places may be present.

True: Learn the rounding rules from the input data. When reverse transforming, round to the same number of decimal digits as the original data.

Attributes

After fitting the transformer, you can access the learned values through the attributes.

epsilon_breakdown: A dictionary that stores how the privacy loss budget (epsilon) is broken down across the different steps of adding differential privacy (noising the data, noising the min/max boundaries, and noising the missing values). Depending on the parameters, not all steps will be needed.

These values are fractions that add up to 1 (100%). For example, 0.5 means that 50% of the epsilon was used for a particular step.

>>> transformer.epsilon_breakdown
{
    'epsilon_data': 0.5,
    'epsilon_boundaries': 0.3,
    'epsilon_missing_values': 0.2
}
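Since the breakdown values are fractions of the total ε, they should always sum to 1. A quick sanity check, using the example dictionary above as illustrative values (actual values depend on the parameters you pass):

```python
# Illustrative breakdown, mirroring the example output above.
epsilon_breakdown = {
    'epsilon_data': 0.5,
    'epsilon_boundaries': 0.3,
    'epsilon_missing_values': 0.2,
}

# The fractions must account for the entire privacy budget.
total = sum(epsilon_breakdown.values())
```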

FAQ

Which algorithms does this transformer use?

This transformer uses ε-differentially private mechanisms to add controlled noise to the column of data. It uses the Laplace mechanism to add noise to the numerical data, and randomized response to add noise to missing values.

How is the privacy loss budget (ε) used?

The privacy loss budget is used during 3 possible phases of the transformation:

  • Computing differentially private min/max values from the data, so as not to reveal the actual min or max value that the data contains. This step is required if known_min_value and known_max_value are not provided.

  • Noising the numerical data using the Laplace mechanism. This step is always performed.

  • Noising the missing values (i.e. flipping missing values to non-missing and vice versa) using randomized response. This step is performed if the noise_missing_values parameter is set to True.

To see what the final breakdown is, use the epsilon_breakdown attribute.

Can I share the data after applying this? What are the differential privacy guarantees?

Differential privacy limits the amount of influence a single data point can have over the final, transformed column. After applying the transformer to this column, the entire column provides differential privacy guarantees, so you should be able to share it as well as any statistics about it (min, max, mean, etc.).

Please note that this transformer only applies differential privacy to the individual column. It does not provide differential privacy guarantees if you'd like to share multiple columns at a time. For that, we recommend using a differentially private synthesizer that can handle many columns at once.
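For reference, the core of the Laplace mechanism is: clip each value to the known bounds, then add noise drawn from a Laplace distribution with scale = sensitivity / ε. The exact sensitivity accounting inside DPLaplaceNoiser is not documented on this page, so treat this NumPy sketch as a generic illustration of the mechanism, not the enterprise implementation:

```python
import numpy as np

def laplace_mechanism(values, epsilon, lower, upper, rng):
    # Clipping bounds the per-record sensitivity at (upper - lower).
    clipped = np.clip(values, lower, upper)
    scale = (upper - lower) / epsilon  # Laplace scale b = sensitivity / epsilon
    return clipped + rng.laplace(loc=0.0, scale=scale, size=len(clipped))

rng = np.random.default_rng(42)
noisy = laplace_mechanism(np.array([12.0, 55.0, 7.5]), epsilon=3.5,
                          lower=0.0, upper=100.0, rng=rng)
```

Note that smaller ε values produce a larger noise scale, which is what trades data fidelity for stronger privacy.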

How should I choose my privacy loss budget (epsilon)?

The value of epsilon is a measure of how much risk you're willing to take on when it comes to privacy.

  • Values in the 0-1 range indicate that you are not willing to take on too much risk. As a result, the synthetic data will have strong privacy guarantees — potentially at the expense of data quality.

  • Values in the 2-10 range indicate that you're willing to accept some privacy risk in order to preserve more data quality.
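To make the tradeoff concrete, assume (for illustration only; the page describes epsilon qualitatively) that the Laplace mechanism is used and the expected absolute noise per value is sensitivity / ε. For a hypothetical column whose values span a range of 100:

```python
sensitivity = 100.0  # hypothetical column range (upper - lower)

for epsilon in (0.5, 1.0, 3.5, 10.0):
    # Smaller epsilon -> larger expected noise -> stronger privacy guarantees.
    print(f"epsilon={epsilon}: expected |noise| = {sensitivity / epsilon:.1f}")
```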

❖ SDV Enterprise Bundle. This feature is available as part of the Differential Privacy Bundle, an optional add-on to SDV Enterprise. For more information, please visit the Differential Privacy Bundle page. Coming soon!