class feature_engine.selection.SelectBySingleFeaturePerformance(estimator, scoring='roc_auc', cv=3, threshold=None, variables=None)

SelectBySingleFeaturePerformance() selects features based on the performance of machine learning models trained using individual features. In other words, it trains a machine learning model for every single feature, then determines each model's performance. If the performance of a model is greater than a user-specified threshold, the feature is retained; otherwise it is removed.

The models are trained on each individual feature using cross-validation. The user specifies both the performance metric to evaluate and the machine learning model to train.

More details in the User Guide.
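The per-feature training loop can be sketched with plain Scikit-learn. This is a minimal illustration of the selection logic, not the transformer's actual implementation; the synthetic data and the threshold value are made up:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic data: the target depends on feature 0 only.
rng = np.random.RandomState(0)
X = rng.randn(200, 3)
y = (X[:, 0] > 0).astype(int)

# Train one model per feature and score it with cross-validation.
threshold = 0.6
performance = {}
for i in range(X.shape[1]):
    scores = cross_val_score(
        LogisticRegression(), X[:, [i]], y, cv=3, scoring="roc_auc"
    )
    performance[i] = scores.mean()

# Keep only the features whose single-feature model beats the threshold.
selected = [i for i, score in performance.items() if score > threshold]
print(selected)  # feature 0 carries all the signal
```

The random features score close to the 0.5 chance level for roc-auc, so only the informative column survives the cut-off.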


estimator:

A Scikit-learn estimator for regression or classification.

variables: str or list, default=None

The list of variable(s) to be evaluated. If None, the transformer will evaluate all numerical variables in the dataset.

scoring: str, default='roc_auc'

The metric used to evaluate the performance of the estimator. Comes from sklearn.metrics. See Scikit-learn's model evaluation documentation for more options.

threshold: float, int, default=None

The value that defines whether a feature will be kept or removed.

If the metric is r2, the threshold needs to be set between 0 and 1. If the metric is roc-auc, the threshold needs to be set between 0.5 and 1. For metrics like mean_squared_error and root_mean_squared_error, the threshold can be a larger number.

The threshold can be specified by the user. If None, it will be automatically set to the mean performance value of all features.
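The automatic cut-off can be illustrated with made-up per-feature scores (the feature names and values below are hypothetical):

```python
# Hypothetical per-feature cross-validation scores (roc-auc).
performance = {"age": 0.72, "income": 0.55, "score": 0.62}

# With threshold=None, the cut-off is the mean performance of all features.
threshold = sum(performance.values()) / len(performance)  # 0.63

# Only features scoring above the mean are retained.
selected = [f for f, p in performance.items() if p > threshold]
print(selected)  # only "age" scores above the mean
```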

cv: int, cross-validation generator or an iterable, default=3

Determines the cross-validation splitting strategy. Possible inputs for cv are:

- None, to use the default cross-validation,
- int, to specify the number of folds in a (Stratified)KFold,
- a CV splitter,
- an iterable yielding (train, test) splits as arrays of indices.

For int/None inputs, if the estimator is a classifier and y is either binary or multiclass, StratifiedKFold is used. In all other cases, KFold is used. These splitters are instantiated with shuffle=False, so the splits will be the same across calls. For more details check Scikit-learn's cross_validate documentation.
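As a sketch, a splitter instance can be passed instead of an integer when the default behaviour is not wanted (the shuffle and seed choices here are illustrative):

```python
from sklearn.model_selection import StratifiedKFold

# An explicit splitter; unlike the int/None default, shuffling can be enabled.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

# It would then be passed as SelectBySingleFeaturePerformance(..., cv=cv).
print(cv.get_n_splits())  # 5
```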


features_to_drop_:

List with the features to remove from the dataset.


feature_performance_:

Dictionary with the single feature model performance per feature.


variables_:

The variables that will be considered for the feature selection.


n_features_in_:

The number of features in the train set used in fit.


Selection based on single feature performance was used in Credit Risk modelling as discussed in the following talk at PyData London 2017:


Galli S. “Machine Learning in Financial Risk Assessment”.



fit:

Find the important features.

transform:

Reduce X to the selected features.

fit_transform:

Fit to data, then transform it.

fit(X, y)

Select features.

X: pandas dataframe of shape = [n_samples, n_features]

The input dataframe.

y: array-like of shape (n_samples)

Target variable. Required to train the estimator.

fit_transform(X, y=None, **fit_params)

Fit to data, then transform it.

Fits transformer to X and y with optional parameters fit_params and returns a transformed version of X.

X: array-like of shape (n_samples, n_features)

Input samples.

y: array-like of shape (n_samples,) or (n_samples, n_outputs), default=None

Target values (None for unsupervised transformations).


**fit_params: dict

Additional fit parameters.

X_new: ndarray of shape (n_samples, n_features_new)

Transformed array.


get_params(deep=True)

Get parameters for this estimator.

deep: bool, default=True

If True, will return the parameters for this estimator and contained subobjects that are estimators.


params: dict

Parameter names mapped to their values.


set_params(**params)

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as Pipeline). The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.


**params: dict

Estimator parameters.

self: estimator instance

Estimator instance.


transform(X)

Return dataframe with selected features.

X: pandas dataframe of shape = [n_samples, n_features]

The input dataframe.

X_new: pandas dataframe of shape = [n_samples, n_selected_features]

Pandas dataframe with the selected features.