Feature Engineering Techniques

The main feature engineering techniques are:

  1. Missing data imputation

  2. Categorical encoding

  3. Variable transformation

  4. Outlier engineering

  5. Date and time engineering
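To make a couple of these concrete, here is a minimal sketch of categorical encoding and date/time engineering using pandas; the column names (`color`, `signup`) are made up for illustration:

```python
import pandas as pd

df = pd.DataFrame({
    "color": ["red", "blue", "red"],
    "signup": pd.to_datetime(["2021-01-05", "2021-06-20", "2021-12-31"]),
})

# Categorical encoding: one-hot encode the `color` column
# into binary color_red / color_blue columns.
df = pd.get_dummies(df, columns=["color"], prefix="color")

# Date/time engineering: break a timestamp into model-friendly parts.
df["signup_month"] = df["signup"].dt.month
df["signup_dayofweek"] = df["signup"].dt.dayofweek  # Monday = 0

print(df)
```

The same idea extends to the other items on the list: each technique turns a raw column into one or more columns a model can use directly.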

In classical machine learning, feature engineering is often crucial for building a strong model. For example, suppose a column has many missing values. Should you simply replace them with the overall mean, or with a sub-group mean? The answer really depends on what the column represents. Sometimes just collapsing it into a binary 0/1 indicator column, where 1 means the value is not missing, can yield good results.
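The three options above can be sketched in pandas as follows; the `income` and `city` columns are hypothetical, with `city` serving as the sub-group:

```python
import numpy as np
import pandas as pd

# Hypothetical data: `income` has missing values, `city` defines sub-groups.
df = pd.DataFrame({
    "city": ["A", "A", "B", "B", "B"],
    "income": [100.0, np.nan, 50.0, 60.0, np.nan],
})

# Option 1: impute with the overall mean.
df["income_mean"] = df["income"].fillna(df["income"].mean())

# Option 2: impute with the sub-group (per-city) mean.
df["income_group_mean"] = df["income"].fillna(
    df.groupby("city")["income"].transform("mean")
)

# Option 3: a binary 0/1 indicator, where 1 means the value is not missing.
df["income_present"] = df["income"].notna().astype(int)

print(df)
```

Which option works best depends on the column's meaning; the indicator column can also be kept alongside an imputed column so the model sees both the filled value and the fact that it was missing.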

One word of caution from my experience: avoid over-engineering features. You might lose information, or regenerate features that the model could deduce from existing features anyway.