Retraining Update Strategies
A benefit of neural network models is that their weights can be updated at any time with continued training.
When responding to changes in the underlying data or the availability of new data, there are a few strategies to choose from for updating a neural network model, such as:
- Continue training the model on the new data only.
- Continue training the model on the old and new data.
We might also imagine variations on the above strategies, such as using a sample of the new data or a sample of new and old data instead of all available data, as well as possible instance-based weightings on sampled data.
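The two strategies above can be sketched with a toy model. This is a minimal illustration, not a neural network: it stands in a one-weight linear model and a hypothetical `fit` function that continues training from the current weight via batch gradient descent, and toy data where the underlying slope drifts from 2 to 3.

```python
def fit(w, data, lr=0.01, epochs=200):
    """Continue training the weight of a toy linear model y = w * x
    using batch gradient descent on squared error, starting from w."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

old_data = [(x, 2.0 * x) for x in [1.0, 2.0, 3.0]]  # underlying slope is 2
new_data = [(x, 3.0 * x) for x in [1.0, 2.0, 3.0]]  # slope has drifted to 3

w = fit(0.0, old_data)                    # existing model fit on old data

w_new_only = fit(w, new_data)             # strategy 1: new data only
w_combined = fit(w, old_data + new_data)  # strategy 2: old and new data
```

Continuing on new data only tracks the drifted slope (about 3), while continuing on the combined data settles on a compromise (about 2.5); the instance-based weightings mentioned above would let us bias that compromise toward the new data.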
We might also consider extensions of the model that freeze the layers of the existing model (i.e. so their weights cannot change during training), then add new layers with trainable weights, grafting extensions onto the model to handle any change in the data. Perhaps this is a variation on both the retraining approach and the ensemble approach in the next section, and we'll leave it for now.
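The freeze-and-graft idea can be sketched in the same toy setting. This is a hypothetical illustration, not library code: the existing model's weight (`w_frozen`) is held fixed as a frozen base layer, and only a newly grafted head weight `v` is trained on the new data.

```python
w_frozen = 2.0                 # the existing model's weight; held fixed below

def base(x):
    """Frozen layer: its weight receives no updates during training."""
    return w_frozen * x

def fit_head(v, data, lr=0.01, epochs=200):
    """Train only the new head weight v in the composed model y = v * base(x)."""
    for _ in range(epochs):
        grad = sum(2 * (v * base(x) - y) * base(x) for x, y in data) / len(data)
        v -= lr * grad
    return v

new_data = [(x, 3.0 * x) for x in [1.0, 2.0, 3.0]]  # slope has drifted to 3
v = fit_head(1.0, new_data)    # composed model predicts v * w_frozen * x
```

The head learns to correct the frozen base (here `v` approaches 1.5, so the composed model predicts roughly `3 * x`), which is the intent of grafting trainable layers onto a frozen model.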
Ensemble Update Strategies
An ensemble is a predictive model that is composed of multiple other models.
There are many different types of ensemble models, although perhaps the simplest approach is to average the predictions from multiple different models. We can use an ensemble model as a strategy when responding to changes in the underlying data or availability of new data.
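The simple averaging approach can be shown in a few lines. This is a minimal sketch with hypothetical names: two toy regression models are combined by averaging their predictions.

```python
def ensemble_predict(models, x):
    """Average the predictions of several models (the simplest ensemble)."""
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)

models = [lambda x: 2.0 * x, lambda x: 3.0 * x]  # two toy regression models
ensemble_predict(models, 2.0)                    # → 5.0
```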
Mirroring the approaches in the previous section, we might consider two approaches to ensemble learning algorithms as strategies for responding to new data; they are:
- Ensemble of existing model and new model fit on new data only.
- Ensemble of existing model and new model fit on old and new data.
Again, we might consider variations on these approaches, such as fitting on samples of old and new data, or including more than one existing or additional model in the ensemble.
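The first of these strategies can be sketched end to end in the same toy setting as before. This is an illustrative assumption, not a prescribed implementation: a one-weight linear model and a hypothetical `fit` function stand in for the neural networks, the existing model is left untouched, a new model is fit on the new data only, and their predictions are averaged.

```python
def fit(w, data, lr=0.01, epochs=200):
    """Fit the weight of a toy linear model y = w * x by gradient descent."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

old_data = [(x, 2.0 * x) for x in [1.0, 2.0, 3.0]]  # underlying slope is 2
new_data = [(x, 3.0 * x) for x in [1.0, 2.0, 3.0]]  # slope has drifted to 3

w_old = fit(0.0, old_data)   # existing model, left unchanged
w_new = fit(0.0, new_data)   # new model fit on new data only

def ensemble(x):
    """Average the existing and new models' predictions."""
    return (w_old * x + w_new * x) / 2.0
```

The ensemble blends the old behaviour (slope 2) with the new (slope 3), so its predictions fall between the two; swapping `new_data` for `old_data + new_data` when fitting `w_new` gives the second strategy.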