Browse Articles

Using machine learning to develop a global coral bleaching predictor

Madireddy et al. | Feb 21, 2023


Coral bleaching is a fatal process that reduces coral diversity, leads to habitat loss for marine organisms, and is a symptom of climate change. This process occurs when corals expel their symbiotic dinoflagellates, algae that photosynthesize within coral tissue and provide corals with glucose. Restoration efforts have attempted to repair damaged reefs; however, there are over 360,000 square miles of coral reefs worldwide, making it challenging to target conservation efforts. Thus, predicting the likelihood of bleaching in a given region would make it easier to allocate resources for conservation. We developed a machine learning model to predict global locations at risk for coral bleaching. Data obtained from the Biological and Chemical Oceanography Data Management Office consisted of coral bleaching events and the parameters under which the bleaching occurred. Sea surface temperature, sea surface temperature anomalies, longitude, latitude, and coral depth below the surface were the features most strongly correlated with coral bleaching. Thirty-nine machine learning models were tested to determine which most accurately used these parameters to predict the percentage of corals that would be bleached. A random forest regressor with an R-squared value of 0.25 and a root mean squared error of 7.91 was determined to be the best model for predicting coral bleaching. In the end, the random forest model had 96% accuracy in predicting the percentage of corals that would be bleached. This prediction system can make it easier for researchers and conservationists to identify coral bleaching hotspots and properly allocate resources to prevent or mitigate bleaching events.
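To illustrate the modeling step described in the abstract, the sketch below fits a random forest regressor on the five named features with scikit-learn. It is a minimal example only: the file name, column names, and train/test split are assumptions, not the authors' actual data or code.

```python
# Minimal sketch (not the authors' code): random forest regression of percent
# bleaching on the five features named in the abstract. The CSV file and
# column names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("bleaching_events.csv")  # hypothetical dataset
features = ["sst", "sst_anomaly", "longitude", "latitude", "depth_m"]
X, y = df[features], df["percent_bleached"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestRegressor(n_estimators=500, random_state=42)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("R^2: ", r2_score(y_test, pred))
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
```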


Using explainable artificial intelligence to identify patient-specific breast cancer subtypes

Suresh et al. | Jan 12, 2024


Breast cancer is the most common cancer in women, with approximately 300,000 women diagnosed in 2023. It ranks second in cancer-related deaths for women, after lung cancer, with nearly 50,000 deaths. Scientists have identified important mutations in genes like BRCA1 and BRCA2 that lead to the development of breast cancer, but previous studies were limited because they focused on specific populations. Overcoming these limitations requires diverse populations and powerful statistical methods such as genome-wide association studies and whole-genome sequencing. Explainable artificial intelligence (XAI) can be used in oncology and breast cancer research to overcome these limitations of specificity, as it can analyze datasets of diagnosed patients while providing interpretable explanations for the patterns and predictions it identifies. This project aims to achieve technological and medicinal goals by using advanced algorithms to identify breast cancer subtypes for faster diagnoses. Multiple methods were used to develop an efficient algorithm. We hypothesized that an XAI approach would be best suited because it can assign scores to individual genes, with an expected success rate of 90%. To test this, we ran multiple trials using XAI methods to identify class-specific and patient-specific key genes. The study demonstrated a pipeline that combines multiple XAI techniques to identify potential biomarker genes for breast cancer with a 95% success rate.
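As an illustration of how an XAI method can assign gene scores, the sketch below uses SHAP values on a random forest classifier to rank class-specific and patient-specific key genes. The abstract does not name the specific XAI techniques used, so SHAP, the file names, and the column names here are assumptions for illustration only.

```python
# Illustrative sketch only: SHAP values on a random forest are one common XAI
# technique for scoring genes per class and per patient. The article does not
# state which XAI methods it used; the file and column names are hypothetical.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier

expr = pd.read_csv("gene_expression.csv", index_col=0)         # patients x genes (hypothetical)
subtype = pd.read_csv("subtypes.csv", index_col=0)["subtype"]  # per-patient labels (hypothetical)

model = RandomForestClassifier(n_estimators=300, random_state=0).fit(expr, subtype)

explainer = shap.TreeExplainer(model)
sv = explainer.shap_values(expr)
sv = np.stack(sv, axis=-1) if isinstance(sv, list) else sv     # (patients, genes, classes)

# Class-specific key genes: rank genes by mean |SHAP| within each class.
for k, cls in enumerate(model.classes_):
    ranked = pd.Series(np.abs(sv[:, :, k]).mean(axis=0), index=expr.columns)
    print(cls, ranked.nlargest(5).index.tolist())

# Patient-specific key genes: top genes driving one patient's predicted class.
patient = expr.index[0]
k = list(model.classes_).index(model.predict(expr.loc[[patient]])[0])
print(patient, pd.Series(np.abs(sv[0, :, k]), index=expr.columns).nlargest(5).index.tolist())
```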


Similarity Graph-Based Semi-supervised Methods for Multiclass Data Classification

Balaji et al. | Sep 11, 2021


The purpose of the study was to determine whether graph-based machine learning techniques, which have become increasingly prevalent in recent years, can accurately classify data into one of many classes while requiring less labeled training data and less parameter tuning than traditional machine learning algorithms. The results showed that the accuracy of both graph-based and traditional classification algorithms depends directly on the number of features in each dataset, the number of classes in each dataset, and the amount of labeled training data used.
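For readers unfamiliar with similarity-graph-based semi-supervised learning, the sketch below shows the general idea using scikit-learn's LabelSpreading, which builds a k-nearest-neighbor similarity graph and propagates labels from a small labeled subset. The toy dataset and labeled fraction are assumptions, not the study's setup.

```python
# Minimal sketch of similarity-graph-based semi-supervised classification using
# scikit-learn's LabelSpreading on a toy dataset (k-nearest-neighbor similarity
# graph). This is not the study's algorithm or data, only the general idea.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score
from sklearn.semi_supervised import LabelSpreading

X, y = load_digits(return_X_y=True)

# Keep labels for roughly 10% of samples; -1 marks a point as unlabeled.
rng = np.random.default_rng(0)
y_partial = np.full_like(y, -1)
labeled = rng.choice(len(y), size=len(y) // 10, replace=False)
y_partial[labeled] = y[labeled]

model = LabelSpreading(kernel="knn", n_neighbors=7)
model.fit(X, y_partial)

unlabeled = y_partial == -1
print("Accuracy on unlabeled points:",
      accuracy_score(y[unlabeled], model.transduction_[unlabeled]))
```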


Expression of Anti-Neurodegeneration Genes in Mutant Caenorhabditis elegans Using CRISPR-Cas9 Improves Behavior Associated With Alzheimer’s Disease

Mishra et al. | Sep 14, 2019


Alzheimer's disease is one of the leading causes of death in the United States and is characterized by neurodegeneration. Mishra et al. wanted to understand the role of two transport proteins, LRP1 and AQP4, in the neurodegeneration of Alzheimer's disease. Using the nematode C. elegans as a model organism for Alzheimer's disease, they applied genetic engineering to test whether increasing the amount of these two transport proteins would decrease neurodegeneration. They found that the greatest improvements came from increased expression of both transport proteins, with smaller improvements when only one of the proteins was overexpressed. Their work has important implications for how we understand neurodegeneration in Alzheimer's disease and what we can do to slow or prevent the progression of the disease.


A Simple Printing Solution to Aid Deficit Reduction

Mirchandani et al. | Mar 09, 2014


The printing-related expenditure budgeted for U.S. federal agencies in 2014 is $1.8 billion. A sample of five publicly available documents produced by various federal agencies is analyzed, and the cost savings arising from a change in font type are estimated. The analysis predicts that the government's annual savings from switching to Garamond would be about $234 million, with worst-case savings of $62 million and best-case savings of $394 million. Indirect benefits arising from a reduced environmental impact due to lower ink production and disposal volumes are not included in these estimates. Times New Roman is not as efficient as Garamond, and the third federally recommended font, Century Gothic, is actually worse on average than the fonts used in the sample documents.
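A back-of-the-envelope calculation of this kind can be sketched as follows. Only the $1.8 billion budget comes from the abstract; the ink share of cost and the relative ink reduction are hypothetical placeholders rather than the paper's measured values.

```python
# Back-of-the-envelope sketch only. The $1.8 billion budget is from the
# abstract; the ink share of printing cost and the relative ink reduction for
# Garamond below are hypothetical placeholders, not the paper's measurements.
printing_budget = 1.8e9        # 2014 U.S. federal printing expenditure (USD)
ink_share_of_cost = 0.30       # hypothetical fraction of spending attributable to ink
ink_reduction_garamond = 0.25  # hypothetical relative reduction in ink use vs. current fonts

annual_savings = printing_budget * ink_share_of_cost * ink_reduction_garamond
print(f"Estimated annual savings: ${annual_savings / 1e6:.0f} million")
```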


Exponential regression analysis of the Canadian Zero Emission Vehicle market’s effects on climate emissions in 2030

Ajay et al. | Feb 25, 2023


Here, the authors explored how the sale and use of electric vehicles could reduce emissions from the transport industry in Canada. By fitting total electric vehicle sales with an exponential model, the authors predicted the number of electric vehicle sales through 2030 and related that figure to the average emissions of such vehicles. Ultimately, they found that the sale and use of electric vehicles alone would likely not meet the 45% reduction in transport-industry emissions suggested by the Canadian government.
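A minimal sketch of this fit-and-project approach, using SciPy's curve_fit with hypothetical sales figures and a hypothetical per-vehicle emissions value (not the authors' data), might look like this:

```python
# Minimal sketch of fitting an exponential model to annual EV sales and
# projecting to 2030. The sales figures and the per-vehicle emissions value
# are hypothetical placeholders, not the authors' data.
import numpy as np
from scipy.optimize import curve_fit

years = np.array([2015, 2016, 2017, 2018, 2019, 2020, 2021])
sales = np.array([5000, 8000, 14000, 23000, 35000, 47000, 66000])  # hypothetical annual EV sales

def exp_model(t, a, b):
    # Exponential growth measured from the first year in the series
    return a * np.exp(b * (t - years[0]))

(a, b), _ = curve_fit(exp_model, years, sales, p0=(sales[0], 0.3))

projected_2030_sales = exp_model(2030, a, b)
avoided_tco2_per_ev = 2.0  # hypothetical avoided tonnes of CO2 per EV per year
print(f"Projected 2030 EV sales: {projected_2030_sales:,.0f}")
print(f"Implied avoided emissions: {projected_2030_sales * avoided_tco2_per_ev:,.0f} t CO2/yr")
```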

