Browse Articles

Transfer Learning for Small and Different Datasets: Fine-Tuning A Pre-Trained Model Affects Performance

Gupta et al. | Oct 18, 2020


In this study, the authors seek to improve a machine learning model for image classification, specifically classifying images as male or female. In addition to fine-tuning the pre-trained classification model, they investigate how their changes affect accuracy (an important task when developing and updating algorithms). To measure accuracy, one set of images is used to train the model and a separate set is then used for validation. They found that the validation accuracy was close to the training accuracy. This study contributes to the expanding field of machine learning and its applications to image identification.
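As a rough illustration of this workflow (not the authors' code), the sketch below fine-tunes a generic pre-trained image classifier on a two-class dataset and reports accuracy on both the training and validation sets. The backbone (ResNet-18), data paths, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch: fine-tune a pre-trained classifier and compare
# training vs. validation accuracy. Assumes ImageFolder-style data
# at data/train and data/val with two classes (illustrative paths).
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

train_set = datasets.ImageFolder("data/train", transform=transform)
val_set = datasets.ImageFolder("data/val", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
val_loader = DataLoader(val_set, batch_size=32)

# Start from an ImageNet pre-trained backbone and replace the head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def accuracy(loader):
    """Fraction of correctly classified images in a loader."""
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            preds = model(x).argmax(dim=1)
            correct += (preds == y).sum().item()
            total += y.numel()
    return correct / total

for epoch in range(3):
    model.train()
    for x, y in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: train acc {accuracy(train_loader):.3f}, "
          f"val acc {accuracy(val_loader):.3f}")
```

Comparing the two accuracies at each epoch is what flags overfitting: a large gap between training and validation accuracy suggests the fine-tuned model will not generalize to new images.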

Read More...

Rhythmic lyrics translation: Customizing a pre-trained language model using stacked fine-tuning

Chong et al. | May 01, 2023


Neural machine translation (NMT) uses neural networks to translate text from one language to another. However, one of the most widely used NMT systems, Google Translate, failed to give an accurate English translation of a famous Korean nursery rhyme, "Airplane" (비행기). The authors fine-tuned a pre-trained model first with a dataset from the lyrics domain, and then with a smaller dataset capturing rhythmical properties, to teach the model to produce rhythmically accurate translations of lyrics. This stacked fine-tuning method yielded an NMT model that preserved the rhythmical characteristics of lyrics during translation, whereas models fine-tuned in a single stage failed to do so.
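The sketch below illustrates the general idea of stacked fine-tuning: the same pre-trained translation model is fine-tuned first on a lyrics-domain corpus and then on a smaller rhythm-focused set. The base model name, the toy sentence pairs, and the training loop are illustrative assumptions, not the authors' pipeline.

```python
# Minimal sketch of two-stage ("stacked") fine-tuning of a pre-trained
# Korean-to-English translation model. Model name and example pairs
# are placeholders for illustration only.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "Helsinki-NLP/opus-mt-ko-en"  # assumed base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

def fine_tune(pairs, epochs):
    """One fine-tuning stage over (source, target) sentence pairs."""
    model.train()
    for _ in range(epochs):
        for src, tgt in pairs:
            batch = tokenizer(src, text_target=tgt, return_tensors="pt")
            loss = model(**batch).loss
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()

# Stage 1: larger lyrics-domain parallel corpus (placeholder example).
lyrics_pairs = [("노래 가사 예시", "an example song lyric")]
fine_tune(lyrics_pairs, epochs=3)

# Stage 2: smaller set whose targets preserve the source rhythm
# (placeholder pair from the "Airplane" nursery rhyme).
rhythm_pairs = [("떴다 떴다 비행기", "fly, fly, little plane")]
fine_tune(rhythm_pairs, epochs=3)
```

The key design choice is ordering: broad domain adaptation first, then the small rhythm-annotated set last, so the final weights are shaped by the property the model is meant to preserve.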

Read More...

Large Language Models are Good Translators

Zeng et al. | Oct 16, 2024


Machine translation remains a challenging area of artificial intelligence. Neural machine translation (NMT) has made significant strides over the past decade, but it still faces hurdles in translation quality, largely because it relies on expensive bilingual training data. This study explores whether large language models (LLMs), such as GPT-4, can be effectively adapted for translation tasks and outperform traditional NMT systems.
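As a rough sketch of how an LLM can be used as a translator through prompting alone (not the study's actual setup), the snippet below assumes an OpenAI-style chat API; the model name, prompts, and language pair are illustrative.

```python
# Minimal sketch: prompting a chat-style LLM to translate text.
# Assumes the openai Python client and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def translate(text: str, src: str = "German", tgt: str = "English") -> str:
    response = client.chat.completions.create(
        model="gpt-4",  # any chat-capable model; the choice is an assumption
        messages=[
            {"role": "system",
             "content": f"You are a professional {src}-to-{tgt} translator."},
            {"role": "user",
             "content": f"Translate into {tgt}:\n\n{text}"},
        ],
    )
    return response.choices[0].message.content

print(translate("Maschinelle Übersetzung bleibt eine Herausforderung."))
```

Unlike a conventional NMT system, no bilingual training corpus is needed here; the translation behavior comes entirely from the pre-trained model and the prompt.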

Read More...

Identifying shark species using an AlexNet CNN model

Sarwal et al. | Sep 23, 2024


Accurately identifying shark species is crucial for biodiversity monitoring, but manual identification is time-consuming and labor-intensive. To address this, the authors developed SharkNet, a CNN model based on AlexNet, which achieved 93% accuracy in classifying shark species using a limited dataset of 1,400 images spanning 14 species. SharkNet offers marine biologists and conservationists a more efficient and reliable tool for species identification and environmental monitoring.
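A minimal sketch of the general approach, adapting a standard AlexNet backbone to a 14-class species classifier, is shown below; the pre-trained weights, layer replacement, and input size are illustrative assumptions rather than the published SharkNet code.

```python
# Minimal sketch: repurpose torchvision's AlexNet for 14 shark species.
import torch
from torch import nn
from torchvision import models

NUM_SPECIES = 14

# Start from AlexNet pre-trained on ImageNet and swap the final layer
# so it outputs one logit per shark species instead of 1,000 classes.
model = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_SPECIES)

# A forward pass on a batch of 224x224 RGB images yields per-species logits.
logits = model(torch.randn(8, 3, 224, 224))
print(logits.shape)  # torch.Size([8, 14])
```

Reusing an ImageNet-trained backbone in this way is what makes a dataset of only 1,400 images workable: most of the feature extractor is already learned, and only the task-specific layers need substantial training.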

Read More...

Artificial Intelligence-Based Smart Solution to Reduce Respiratory Problems Caused by Air Pollution

Bhardwaj et al. | Dec 14, 2021


In this report, Bhardwaj and Sharma tested whether placing specific plants indoors can reduce levels of indoor air pollution that can lead to lung-related illnesses. Using machine learning, they show that the plants improved overall indoor air quality and reduced levels of particulate matter. They suggest that plant-based interventions coupled with sensors may be a useful long-term solution for reducing indoor air pollution and maintaining air quality.

Read More...

Which Diaper is More Absorbent, Huggies or Pampers?

Shramko et al. | Sep 19, 2013


The authors here investigate the absorbency of two leading brands of diapers. They find that Huggies Little Snugglers absorb over 50% more salt water than Pampers Swaddlers, although both absorb significantly more fluid than what an average newborn can produce.

Read More...

Intra and interspecies control of bacterial growth through extracellular extracts

Howe et al. | Jun 07, 2024


The study examines interactions among bacterial species in the human gut microbiome, emphasizing the role of quorum-sensing molecules in bacterial communication and their implications for health. The authors investigated the impact of bacterial supernatants from Escherichia coli (E. coli) on the growth of fresh E. coli and Enterobacter aerogenes (E. aerogenes) cultures.

Read More...