Browse Articles

Similarity Graph-Based Semi-supervised Methods for Multiclass Data Classification

Balaji et al. | Sep 11, 2021

The purpose of the study was to determine whether graph-based machine learning techniques, which have become increasingly prevalent in recent years, can accurately classify data into one of many classes while requiring less labeled training data and parameter tuning than traditional machine learning algorithms. The results showed that the accuracy of both graph-based and traditional classification algorithms depends directly on the number of features in each dataset, the number of classes in each dataset, and the amount of labeled training data used.
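
The summary does not specify the authors' exact graph construction or propagation method; as a general illustration of the technique, the following is a minimal sketch using scikit-learn's LabelSpreading on a k-nearest-neighbor similarity graph. The digits dataset and all parameter choices are assumptions made for the example.

```python
# Hypothetical sketch of graph-based semi-supervised multiclass classification.
# Not the authors' code: the dataset, graph kernel, and parameters are assumptions.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.semi_supervised import LabelSpreading

X, y = load_digits(return_X_y=True)

# Pretend only 10% of the labels are known; mark the rest as -1 (unlabeled).
rng = np.random.default_rng(0)
y_partial = np.where(rng.random(len(y)) < 0.10, y, -1)

# Build a k-nearest-neighbor similarity graph over all points and
# iteratively spread the known labels along its edges.
model = LabelSpreading(kernel="knn", n_neighbors=7, alpha=0.2)
model.fit(X, y_partial)

mask = y_partial == -1  # evaluate only on the points that started unlabeled
accuracy = (model.transduction_[mask] == y[mask]).mean()
print(f"Accuracy on initially unlabeled points: {accuracy:.3f}")
```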

Open Source RNN designed for text generation is capable of composing music similar to Baroque composers

Goel et al. | May 05, 2021

Recurrent neural networks (RNNs) are useful for text generation because they generate each output in the context of previous ones. Baroque music and language are similar in that every word or note exists in context with others, and both follow strict rules. The authors hypothesized that if music were represented in a text format, an RNN designed to generate language could train on it and create music structurally similar to Bach's. They found that the music generated by their RNN shared a similar structure with Bach's music in the input dataset, while Bachbot's outputs differed significantly from this experiment's outputs and were thus less similar to Bach's repertoire than those produced by their algorithm.
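
The authors' open-source model is not reproduced here; as a rough illustration of training a text-generation RNN on a textual encoding of a score, the sketch below trains a small character-level LSTM in PyTorch. The toy corpus and every architectural choice are assumptions made for the example.

```python
# Hypothetical sketch of a character-level RNN for text that encodes music.
# Not the authors' model: the architecture and the text encoding are assumptions.
import torch
import torch.nn as nn

class CharRNN(nn.Module):
    def __init__(self, vocab_size, hidden_size=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, tokens, state=None):
        x, state = self.lstm(self.embed(tokens), state)
        return self.head(x), state

# Toy corpus: pretend each character belongs to a note/duration token of a score.
corpus = "C4q D4q E4h C4q E4q G4h "
vocab = sorted(set(corpus))
stoi = {ch: i for i, ch in enumerate(vocab)}

data = torch.tensor([stoi[ch] for ch in corpus]).unsqueeze(0)
model = CharRNN(len(vocab))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for _ in range(100):  # next-character prediction on the toy corpus
    logits, _ = model(data[:, :-1])
    loss = loss_fn(logits.reshape(-1, len(vocab)), data[:, 1:].reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Sample new "music" one character at a time, feeding each output back in.
token, state, out = data[:, :1], None, []
for _ in range(40):
    logits, state = model(token, state)
    token = torch.multinomial(logits[:, -1].softmax(-1), 1)
    out.append(vocab[token.item()])
print("".join(out))
```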

Statistically Analyzing the Effect of Various Factors on the Absorbency of Paper Towels

Tao et al. | Dec 04, 2020

In this study, the authors investigate how effectively paper towels absorb different types of liquid and whether changing the properties of the towel (such as folding it) affects absorbency. By varying either the liquid type or the folded state of the paper towels, they drew statistically supported conclusions about the optimal ways to use paper towels. These findings have practical implications as society continues to use ever more paper towels.
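
The summary does not state which statistical tests the authors applied; as one plausible example of this kind of factor analysis, the sketch below runs a two-sample t-test on made-up absorbency measurements for folded versus unfolded towels.

```python
# Hypothetical sketch of a single-factor analysis of paper-towel absorbency.
# The measurements below are made up; the authors' data and tests may differ.
from scipy import stats

# Mass of water absorbed (grams) per towel, under two folding conditions.
unfolded = [11.2, 10.8, 11.5, 10.9, 11.1, 11.4]
folded = [12.3, 12.0, 12.6, 11.9, 12.4, 12.1]

t_stat, p_value = stats.ttest_ind(unfolded, folded)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Folding has a statistically significant effect on absorbency.")
else:
    print("No significant effect of folding detected.")
```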

Geographic Distribution of Scripps National Spelling Bee Spellers Resembles Geographic Distribution of Child Population in US States upon Implementation of the RSVBee “Wildcard” Program

Kannankeril et al. | Aug 17, 2020

The Scripps National Spelling Bee (SNSB) is an iconic academic competition for United States (US) schoolchildren, held annually since 1925. However, the sizes and geographic distributions of sponsored regions are uneven: one state may send more than twice as many spellers as another despite having a similar child population. In 2018, the SNSB introduced a wildcard program known as RSVBee, which allowed students to apply to compete as national finalists even if they did not win their regional spelling bee. In this study, the authors tested the hypothesis that the geographic distribution of SNSB national finalists more closely matched the child population of the US after RSVBee was implemented.
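
The study's data and statistical method are not given in this summary; one standard way to compare an observed distribution of finalists against the distribution expected from child population is a chi-squared goodness-of-fit test, sketched below with purely illustrative counts for three hypothetical states.

```python
# Hypothetical chi-squared goodness-of-fit test comparing speller counts
# against child-population shares. All numbers below are illustrative only.
import numpy as np
from scipy import stats

observed_spellers = np.array([60, 25, 15])              # finalists per state
child_population_share = np.array([0.50, 0.30, 0.20])   # share of US children

expected_spellers = child_population_share * observed_spellers.sum()
chi2, p_value = stats.chisquare(observed_spellers, f_exp=expected_spellers)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
# A large p-value means the speller distribution is consistent with the
# child-population distribution across these (hypothetical) states.
```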

The Effect of Varying Training on Neural Network Weights and Visualizations

Fountain et al. | Dec 04, 2019

Neural networks are used throughout modern society to solve many problems commonly thought of as impossible for computers. Fountain and Rasmus designed a convolutional neural network and ran it with varying levels of training to see whether consistent, accurate, and precise changes or patterns could be observed. They found that training introduced and strengthened patterns in the weights and visualizations, although the patterns observed may not be consistent across all neural networks.
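
The network and datasets used by Fountain and Rasmus are not reproduced here; the sketch below illustrates the general procedure in PyTorch: train a small convolutional network for increasing numbers of steps and snapshot the first-layer filters after each stage so the emerging patterns can be compared or visualized. The model, data, and training levels are all assumptions.

```python
# Hypothetical sketch: snapshot first-layer convolution filters at several
# levels of training. The network, data, and checkpoints are assumptions,
# not the setup from the paper.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3), nn.ReLU(),
    nn.Flatten(), nn.Linear(8 * 26 * 26, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Stand-in data; in practice this would be a labeled image dataset.
images = torch.randn(64, 1, 28, 28)
labels = torch.randint(0, 10, (64,))

checkpoints = [0, 50, 200]  # cumulative training steps at which to snapshot
snapshots, step = {}, 0
for target in checkpoints:
    while step < target:
        loss = loss_fn(model(images), labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        step += 1
    # Copy the eight 3x3 first-layer filters for later visualization.
    snapshots[step] = model[0].weight.detach().clone()

for steps, weights in snapshots.items():
    print(f"{steps} steps: filter standard deviation = {weights.std().item():.4f}")
```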

Artificial Intelligence Networks Towards Learning Without Forgetting

Kreiman et al. | Oct 26, 2018

In their paper, Kreiman et al. examined what it takes for an artificial neural network to perform well on a new task without forgetting its previous knowledge. By comparing methods that prevent forgetting of earlier tasks, they found that longer training times and preservation of the connections most important to a particular task while training on a new one helped the neural network maintain its performance on both tasks. The authors hope that this proof-of-principle research will someday contribute to artificial intelligence that better mimics natural human intelligence.
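
The compared methods are not named in this summary; one well-known technique that preserves the connections most important to an earlier task is elastic weight consolidation (EWC), sketched below as a quadratic penalty added to the new task's loss. Treating it as the authors' method would be an assumption, and the importance estimates here are placeholders.

```python
# Hypothetical sketch of an EWC-style penalty that discourages changing
# weights judged important for a previous task. Whether this matches the
# paper's exact methods is an assumption; the importances are placeholders.
import torch
import torch.nn as nn

model = nn.Linear(10, 2)

# After training on task A, store a copy of the weights and an importance
# estimate per weight (in real EWC, the diagonal of the Fisher information).
old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
importance = {n: torch.ones_like(p) for n, p in model.named_parameters()}

def ewc_penalty(model, old_params, importance, strength=100.0):
    """Quadratic cost for moving important weights away from their task-A values."""
    penalty = 0.0
    for name, param in model.named_parameters():
        penalty = penalty + (importance[name] * (param - old_params[name]) ** 2).sum()
    return strength * penalty

# Training on task B: the task-B loss plus the penalty that protects task A.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))
for _ in range(100):
    loss = nn.functional.cross_entropy(model(x), y)
    loss = loss + ewc_penalty(model, old_params, importance)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```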
