Transfer Learning

In machine learning, transferring knowledge from a pre-trained model to a new model is called transfer learning. You use transfer learning when a pre-trained model already exists for a task similar to the one you want to achieve. It is simply training a new ML model by reusing the learned weights of another, pre-trained model.

Transfer learning is very common in deep learning, where a neural network may need to be trained for hours or days on large training data. It is rarely a good idea to train a large deep neural network from scratch. Instead, the lower layers of an existing network, which tend to learn generic features, can be reused to train your own network for a similar task.
fig. Transfer Learning in Deep Neural Networks
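The idea of reusing lower layers can be sketched with plain NumPy. This is a minimal illustration, not a real training setup: the "pretrained" weights here are random stand-ins for weights that would normally come from training on a large dataset.

```python
import numpy as np

# Hypothetical "pretrained" two-layer network (weights would normally
# come from training on a large dataset, not random initialization).
rng = np.random.default_rng(0)
pretrained_W1 = rng.normal(size=(4, 8))   # lower layer: generic features
pretrained_W2 = rng.normal(size=(8, 3))   # upper layer: task-specific

# Transfer learning: reuse (freeze) the lower layer, re-initialize the top
# for a new task with a different number of outputs (here, 2 classes).
new_W1 = pretrained_W1.copy()             # frozen: not updated during training
new_W2 = rng.normal(size=(8, 2))          # fresh head for the new task

def forward(x, W1, W2):
    h = np.maximum(0, x @ W1)             # reused feature extractor (ReLU)
    return h @ W2                         # new task head

x = rng.normal(size=(1, 4))
print(forward(x, new_W1, new_W2).shape)   # (1, 2)
```

During fine-tuning, only `new_W2` would receive gradient updates; `new_W1` stays fixed, which is what saves the bulk of the training time.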

Transfer learning can be used with any neural network architecture, as long as the pre-trained model performs a task similar to the one you want to achieve. For example, you can reuse Google's pre-trained NLP model (BERT) for your own NLP task, or a CNN model such as VGG to build your own object detection model.
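A common way to reuse a CNN like VGG is to freeze its convolutional base and stack a new classification head on top. Below is a hedged sketch using the Keras API; it assumes TensorFlow is installed, and it loads the VGG16 architecture with `weights=None` to avoid a download here (in practice you would pass `weights="imagenet"` to actually reuse the pretrained weights). The head sizes (256 units, 10 classes) are illustrative choices, not values from the original text.

```python
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

# VGG16 convolutional base; in real transfer learning use weights="imagenet".
base = VGG16(weights=None, include_top=False, input_shape=(224, 224, 3))
base.trainable = False                       # freeze the reused lower layers

model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),    # new task-specific layers
    layers.Dense(10, activation="softmax"),  # e.g. 10 target classes
])
model.compile(optimizer="adam", loss="categorical_crossentropy")
print(model.output_shape)  # (None, 10)
```

Only the two new `Dense` layers are trainable, so fine-tuning on your own (possibly small) dataset is fast and needs far less data than training VGG16 from scratch.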

Advantages:

The major benefits you can get from transfer learning are:

  • Save time by speeding up training.
  • Save computing resources.
  • Better results even when you have less training data.
