FloatFormatter
Compatibility: numerical data

The FloatFormatter transforms numerical data. By default, it does nothing because numerical data is already ready to use for data science. But it can optionally handle missing values, learn rounding schemes and learn min/max bounds.
```python
from rdt.transformers.numerical import FloatFormatter

transformer = FloatFormatter()
```
missing_value_replacement: Add this argument to replace missing values during the transform phase.

| Value | Description |
| --- | --- |
| (default) 'random' | Replace missing values with a random value. The value is chosen uniformly at random from the min/max range. |
| 'mean' | Replace all missing values with the average value. |
| 'mode' | Replace all missing values with the most frequently occurring value. |
| <number> | Replace all missing values with the specified number (0, -1, 0.5, etc.) |
| None | Deprecated. Do not replace missing values. The transformed data will continue to have missing values. |
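For instance, a minimal sketch that fills every missing value with a fixed constant:

```python
from rdt.transformers.numerical import FloatFormatter

# Replace every missing value with the constant 0 during transform
transformer = FloatFormatter(missing_value_replacement=0)
```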
(deprecated) model_missing_values: Use the missing_value_generation parameter instead.

missing_value_generation: Add this argument to determine how to recreate missing values during the reverse transform phase.

| Value | Description |
| --- | --- |
| (default) 'random' | Randomly assign missing values in roughly the same proportion as the original data. |
| 'from_column' | Create a new column to store whether the value should be missing. Use it to recreate missing values. Note: Adding extra columns uses more memory and increases the RDT processing time. |
| None | Do not recreate missing values. |
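As a sketch, the settings below fill missing values with the mean on transform and never recreate them on reverse transform:

```python
from rdt.transformers.numerical import FloatFormatter

# Fill missing values with the mean, and do not recreate
# any missing values when reverse transforming
transformer = FloatFormatter(
    missing_value_replacement='mean',
    missing_value_generation=None,
)
```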
enforce_min_max_values: Add this argument to allow the transformer to learn the min and max allowed values from the data.

| Value | Description |
| --- | --- |
| (default) False | Do not learn any min or max values from the dataset. When reverse transforming the data, the values may be above or below what was originally present. |
| True | Learn the min and max values from the input data. When reverse transforming the data, any out-of-bounds values will be clipped to the min or max value. |
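A minimal sketch of enabling this option:

```python
from rdt.transformers.numerical import FloatFormatter

# Clip any reverse transformed values to the min/max seen during fitting
transformer = FloatFormatter(enforce_min_max_values=True)
```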
learn_rounding_scheme: Add this argument to allow the transformer to learn about rounded values in your dataset.

| Value | Description |
| --- | --- |
| (default) False | Do not learn or enforce any rounding scheme. When reverse transforming the data, there may be many decimal places present. |
| True | Learn the rounding rules from the input data. When reverse transforming the data, round the number of digits to match the original. |
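And a similar sketch for learning the rounding scheme:

```python
from rdt.transformers.numerical import FloatFormatter

# Reverse transformed values will match the number of decimal
# digits observed in the original data
transformer = FloatFormatter(learn_rounding_scheme=True)
```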
computer_representation: Add this argument when the original data has a specific representation, even if it's not loaded that way into Python. The transformer will make sure that any reverse transformed data is compatible with this representation.

| Value | Description |
| --- | --- |
| (default) 'Float' | The data is a float. |
| 'Int8', 'Int16', 'Int32', 'Int64' | The data is a signed integer represented as an 8, 16, 32 or 64-bit number. |
| 'UInt8', 'UInt16', 'UInt32', 'UInt64' | The data is an unsigned integer represented as an 8, 16, 32 or 64-bit number. |
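For example, a minimal sketch for data that must fit an 8-bit signed integer:

```python
from rdt.transformers.numerical import FloatFormatter

# Reverse transformed values will be made compatible with
# an 8-bit signed integer representation
transformer = FloatFormatter(computer_representation='Int8')
```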
For example, the following transformer combines several of these options:

```python
from rdt.transformers.numerical import FloatFormatter

ff = FloatFormatter(missing_value_replacement='mean',
                    learn_rounding_scheme=True,
                    missing_value_generation='from_column')
```

On the forward transform, this transformer applies the missing_value_replacement and missing_value_generation strategies. In this case, we create an extra column storing whether each value is missing.

On the reverse transform, enforce_min_max_values and learn_rounding_scheme are applied. In this case, the values are rounded to 2 decimal digits like the original data. Also, missing values are added back in.
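As a rough end-to-end sketch of this workflow (the 'amount' column and data below are made up, and the standard RDT fit / transform / reverse_transform calls are assumed):

```python
import numpy as np
import pandas as pd
from rdt.transformers.numerical import FloatFormatter

# Hypothetical data: an 'amount' column with one missing value
data = pd.DataFrame({'amount': [1.25, 2.50, np.nan, 4.75]})

ff = FloatFormatter(missing_value_replacement='mean',
                    learn_rounding_scheme=True,
                    missing_value_generation='from_column')

# Forward transform: the NaN is replaced with the mean and an extra
# column records which rows were originally missing
ff.fit(data, 'amount')
transformed = ff.transform(data)

# Reverse transform: values are rounded like the original data and
# missing values are added back in using the extra column
recovered = ff.reverse_transform(transformed)
```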
The method for replacing missing values is dependent on what they mean in your dataset. For example:

- If missing values are the equivalent of 0, replace them with a 0.
- If missing values indicate that you don't know the value at all, you might replace them with the 'mean' or the 'mode'.
When setting the missing_value_generation parameter, consider whether the "missingness" of the data is something important. For example, maybe the user opted out of supplying the info on purpose, or maybe a missing value is highly correlated with another column in your dataset. If "missingness" is something you want to account for, you should model missing values.