Browse Articles

Transfer Learning for Small and Different Datasets: Fine-Tuning A Pre-Trained Model Affects Performance

Gupta et al. | Oct 18, 2020

In this study, the authors seek to improve a machine learning algorithm used for image classification: classifying images as male or female. In addition to fine-tuning the classification model, they investigate how their changes affect accuracy (an important task when developing and updating algorithms). To measure accuracy, one set of images is used to train the model and a separate set is held out for validation. They found that the validation accuracy was close to the training accuracy, suggesting that the fine-tuned model generalizes beyond the images it was trained on. This study contributes to the expanding field of machine learning and its applications to image classification.

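The train-then-validate procedure described above is the standard way to check such changes. The sketch below is a minimal illustration of that general workflow, not the authors' code: it fine-tunes an ImageNet-pre-trained network (here ResNet-18 via torchvision, an assumption) on a two-class image folder and reports training versus validation accuracy. The directory layout, epoch count, and other hyperparameters are placeholders.

```python
# Minimal sketch (not the authors' code): fine-tune a pre-trained image
# classifier on a small two-class dataset and compare training accuracy
# against accuracy on a held-out validation set.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical layout: data/train/<class>/*.jpg and data/val/<class>/*.jpg
train_set = datasets.ImageFolder("data/train", transform=transform)
val_set = datasets.ImageFolder("data/val", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
val_loader = DataLoader(val_set, batch_size=32)

# Start from an ImageNet-pre-trained network and replace the final layer
# with a two-class head (e.g. male / female).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def accuracy(loader):
    """Fraction of correctly classified images in a data loader."""
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for images, labels in loader:
            preds = model(images).argmax(dim=1)
            correct += (preds == labels).sum().item()
            total += labels.size(0)
    return correct / total

for epoch in range(5):
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: train acc {accuracy(train_loader):.3f}, "
          f"val acc {accuracy(val_loader):.3f}")
```

If the two accuracies stay close, the fine-tuned model is not simply memorizing its training images.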

Artificial Intelligence Networks Towards Learning Without Forgetting

Kreiman et al. | Oct 26, 2018

In their paper, Kreiman et al. examined what it takes for an artificial neural network to perform well on a new task without forgetting its previous knowledge. By comparing methods designed to prevent such forgetting, they found that longer training times, and preserving the connections most important to an earlier task while training on a new one, helped the neural network maintain its performance on both tasks. The authors hope that this proof-of-principle research will someday contribute to artificial intelligence that better mimics natural human intelligence.

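One common way to make "preserving the most important connections" concrete is a regularization penalty that discourages those weights from drifting during new-task training (in the style of elastic weight consolidation). The sketch below illustrates that general idea only; it is not a reproduction of any specific method compared in the paper, and the model, importance scores, and data are placeholder assumptions.

```python
# Minimal sketch of consolidation-style continual learning: penalize movement
# of parameters away from their old-task values, weighted by a per-parameter
# importance estimate. Importance would normally come from, e.g., the Fisher
# information of the old task; here it is simply assumed to be given.
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # stand-in for a network already trained on an old task

# Snapshot of the old-task parameters, plus an assumed importance score for
# each parameter (higher = more important to preserve).
old_params = {name: p.detach().clone() for name, p in model.named_parameters()}
importance = {name: torch.ones_like(p) for name, p in model.named_parameters()}

def consolidation_penalty(model, old_params, importance, strength=100.0):
    """Quadratic cost for moving important weights away from old-task values."""
    penalty = 0.0
    for name, p in model.named_parameters():
        penalty = penalty + (importance[name] * (p - old_params[name]) ** 2).sum()
    return strength * penalty

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# One training step on the new task: the total loss trades off new-task
# accuracy against staying close to the weights that mattered for the old task.
x, y = torch.randn(8, 10), torch.randint(0, 2, (8,))
optimizer.zero_grad()
loss = criterion(model(x), y) + consolidation_penalty(model, old_params, importance)
loss.backward()
optimizer.step()
```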