A key barrier to the adoption of solar energy technology is the low efficiency with which solar cells convert sunlight into electricity. Sims and Sims tackle this problem by programming a Raspberry Pi to act as a multimeter and determine which wavelength of light generates the most voltage and current from a solar panel.
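Since the Raspberry Pi has no analog inputs, a setup like this typically reads the panel through an external analog-to-digital converter. The sketch below is a minimal illustration of that idea, assuming an MCP3008 ADC on the Pi's SPI bus and a known load resistor; the channel, reference voltage, and resistor value are assumptions, not the authors' actual circuit.

```python
# Minimal sketch: read panel voltage via an MCP3008 ADC over SPI and
# derive current through a known load resistor. Wiring and constants
# are illustrative assumptions, not the authors' setup.
import spidev
import time

VREF = 3.3      # ADC reference voltage (volts) -- assumed
R_LOAD = 10.0   # known load resistor across the panel (ohms) -- assumed

spi = spidev.SpiDev()
spi.open(0, 0)                  # SPI bus 0, chip-select 0
spi.max_speed_hz = 1_350_000

def read_adc(channel):
    """Read one 10-bit sample from MCP3008 channel 0-7."""
    r = spi.xfer2([1, (8 + channel) << 4, 0])
    return ((r[1] & 3) << 8) | r[2]

for color in ["red", "green", "blue"]:
    input(f"Illuminate the panel with {color} light, then press Enter...")
    raw = read_adc(0)
    volts = raw * VREF / 1023.0
    amps = volts / R_LOAD        # Ohm's law across the load
    print(f"{color}: {volts:.3f} V, {amps*1000:.1f} mA, "
          f"{volts*amps*1000:.1f} mW")
    time.sleep(0.5)
```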
Can the nucleotide content of a DNA sequence predict the sequence accessibility?
Sequence accessibility, or openness, is an important factor in gene expression: it affects the likelihood that a gene is transcribed and translated into a protein that performs its functions and manifests traits. Many factors potentially affect the accessibility of a gene. In this study, our hypothesis was that the nucleotide content of a genetic sequence predicts its accessibility, and we used a machine learning linear regression model to study the relationship between nucleotide content and accessibility.
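A minimal sketch of this kind of model, assuming each sequence is paired with a continuous accessibility score; the sequences, scores, and the GC-based toy rule below are placeholders for illustration, not the study's data or findings.

```python
# Featurize sequences by nucleotide fractions and fit a linear
# regression against accessibility scores. Data are toy placeholders.
import random
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

random.seed(0)

def nucleotide_fractions(seq):
    """Represent a sequence as the fraction of each nucleotide."""
    return [seq.count(b) / len(seq) for b in "ACGT"]

sequences = ["".join(random.choice("ACGT") for _ in range(100))
             for _ in range(200)]
# Placeholder labels; in the study these would be measured
# accessibility scores. The GC-rich rule here is illustrative only.
scores = [nucleotide_fractions(s)[1] + nucleotide_fractions(s)[2]
          + random.gauss(0, 0.02) for s in sequences]

X = [nucleotide_fractions(s) for s in sequences]
X_train, X_test, y_train, y_test = train_test_split(X, scores,
                                                    random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("Held-out R^2:", model.score(X_test, y_test))
print("Coefficients (A, C, G, T):", model.coef_)
```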
A novel approach for early detection of Alzheimer’s disease using deep neural networks with magnetic resonance imaging
In the battle against Alzheimer's disease (AD), early detection is critical to mitigating symptoms in patients. Here, the authors apply deep learning models to a collection of MRI scans to investigate early stages of AD, which can be hard to catch by the human eye. Their model outperforms previous models and detects regions of interest in the brain for further consideration.
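The paper's exact architecture is not reproduced here; the sketch below only illustrates the general shape of such a model, a small convolutional network classifying 2D MRI slices, with the input size and layer sizes chosen purely for illustration.

```python
# Minimal sketch of a CNN classifying MRI slices as AD vs. control.
# Architecture and input size (1x128x128) are illustrative assumptions.
import torch
import torch.nn as nn

class SliceCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 32 * 32, 64), nn.ReLU(),
            nn.Linear(64, 2),            # logits: control vs. AD
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = SliceCNN()
dummy_batch = torch.randn(4, 1, 128, 128)  # 4 grayscale 128x128 slices
print(model(dummy_batch).shape)            # torch.Size([4, 2])
```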
Exploring the effects of diverse historical stock price data on the accuracy of stock price prediction models
Algorithmic trading is increasingly used by American investors. In this work, we tested whether including the opening, closing, and highest prices in three supervised learning models affected their performance. Indeed, we found that including all three prices significantly decreased prediction error.
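A minimal sketch of this style of comparison, assuming the task is predicting the next day's closing price from lagged prices; the data below are a synthetic random walk and the model is a plain linear regression, not the authors' three models.

```python
# Compare prediction error with and without the open and high columns.
# Synthetic random-walk data stand in for real market prices.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 500
close = 100 + np.cumsum(rng.normal(0, 1, n))
open_ = close + rng.normal(0, 0.5, n)
high = np.maximum(open_, close) + np.abs(rng.normal(0, 0.5, n))

def evaluate(features):
    """Train on the first 400 days, report MAE on the remainder."""
    X, y = features[:-1], close[1:]        # predict next-day close
    model = LinearRegression().fit(X[:400], y[:400])
    return mean_absolute_error(y[400:], model.predict(X[400:]))

close_only = close.reshape(-1, 1)
all_three = np.column_stack([open_, close, high])
print("MAE, close only:          ", evaluate(close_only))
print("MAE, open + close + high: ", evaluate(all_three))
```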
Cleaning up the world’s oceans with underwater laser imaging
Recognizing the growing amount of plastic waste in the oceans, the authors sought to develop and test laser imaging for identifying waste in water. They found the approach feasible, but limitations such as increasing depth and water turbidity produce increasingly blurry laser images. While their image-processing methods were only partly able to compensate, they identified recent deep learning-based techniques as a potential avenue to making the method viable.
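One classical image-processing approach to such blur, shown purely as an illustration and not as the authors' method, is Wiener deconvolution with an assumed blur kernel; the Gaussian point-spread function and test image below are stand-ins for turbidity-blurred laser images.

```python
# Simulate turbidity-like blur with a Gaussian point-spread function,
# then attempt to undo it with Wiener deconvolution (scikit-image).
import numpy as np
from scipy.signal import convolve2d
from skimage import data, img_as_float, restoration

image = img_as_float(data.camera())   # stand-in for a laser image

# Small Gaussian PSF standing in for turbidity-induced blur.
x = np.arange(-3, 4)
g = np.exp(-x**2 / 2.0)
psf = np.outer(g, g)
psf /= psf.sum()
blurred = convolve2d(image, psf, mode="same", boundary="symm")

# Wiener deconvolution trades noise amplification against sharpness
# via the `balance` regularization parameter.
deblurred = restoration.wiener(blurred, psf, balance=0.1)
print(deblurred.shape)
```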
Gradient boosting with temporal feature extraction for modeling keystroke log data
Although the field of natural language processing (NLP) has seen great progress over the last few years, particularly with the development of attention-based models, less research has addressed modeling keystroke log data. State-of-the-art methods handle textual data directly, and while this has produced excellent results, their time complexity and resource usage are quite high. Additionally, these methods fail to incorporate the actual writing process when assessing text and instead focus solely on the content. Therefore, we proposed a framework for modeling textual data using keystroke-based features. Such methods pay attention to how a document or response was written, rather than the final text that was produced. These features are vastly different from those extracted from raw text but reveal information that is otherwise hidden. Transformer-based methods, which learn to weight the different components of a text sequence, currently dominate NLP due to the strong understanding of natural language they display. We hypothesized that pairing efficient machine learning techniques with keystroke log information could produce results comparable to such transformers in far less time. We showed that models trained on keystroke log data can effectively evaluate the quality of writing in a significantly shorter amount of time than traditional methods. This is significant, as it provides a much-needed fast and cheap alternative to increasingly large and slow LLMs.
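A minimal sketch of the framework's shape, assuming a keystroke log is a list of key-press timestamps: summarize each log by temporal statistics, then fit a gradient-boosted model to writing-quality scores. The log format, feature set, toy logs, and placeholder scores are all illustrative assumptions.

```python
# Extract temporal features from keystroke timestamps and fit a
# gradient-boosted regressor to (placeholder) writing-quality scores.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def temporal_features(timestamps):
    """Summarize a keystroke log by inter-key interval statistics."""
    gaps = np.diff(np.asarray(timestamps))
    return [
        len(timestamps),        # total keystrokes
        gaps.mean(),            # typing-speed proxy
        gaps.std(),             # burstiness
        gaps.max(),             # longest pause (planning/revision)
        (gaps > 2.0).sum(),     # number of long pauses (> 2 s)
    ]

rng = np.random.default_rng(0)
# Toy logs: cumulative sums of random inter-key delays (seconds).
logs = [np.cumsum(rng.exponential(0.3, size=rng.integers(50, 300)))
        for _ in range(100)]
scores = [rng.uniform(1, 6) for _ in logs]   # placeholder essay scores

X = [temporal_features(log) for log in logs]
model = GradientBoostingRegressor().fit(X, scores)
print(model.predict([temporal_features(logs[0])]))
```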
Comparing and evaluating ChatGPT’s performance giving financial advice with Reddit questions and answers
Here, the authors compared financial advice generated by ChatGPT to actual Reddit comments from the "r/Financial Planning" subreddit. By assessing the model's response content, length, and advice, they found that while artificial intelligence can deliver information, it fell short in delivery, clarity, and decisiveness.
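One simple way such paired answers can be scored automatically, offered only as an illustration, is with surface measures like length and the rate of hedging words as a rough proxy for decisiveness; the word list and sample texts below are assumptions, not the authors' rubric.

```python
# Compare paired answers on word count and hedging-word rate.
import re

HEDGES = {"might", "maybe", "possibly", "perhaps", "could", "generally"}

def summarize(answer):
    """Return length and hedge-rate statistics for one answer."""
    words = re.findall(r"[a-z']+", answer.lower())
    hedge_rate = sum(w in HEDGES for w in words) / max(len(words), 1)
    return {"words": len(words), "hedge_rate": round(hedge_rate, 3)}

reddit_answer = ("Pay off the credit card first. The interest rate "
                 "beats any index fund.")
model_answer = ("It could generally be wise to consider paying down "
                "debt, but you might also possibly want to consult a "
                "professional.")

print("Reddit:", summarize(reddit_answer))
print("Model: ", summarize(model_answer))
```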
Survival analysis in cardiovascular epidemiology: nexus between heart disease and mortality
In 2021, over 20 million people died from cardiovascular diseases, highlighting the need for a deeper understanding of factors influencing heart failure outcomes. This study examined multiple variables affecting mortality after heart failure, using random forest models to identify time, serum creatinine, and ejection fraction as key predictors. These findings could contribute to personalized medicine, improving survival rates by tailoring treatment strategies for heart failure patients.
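A minimal sketch of the feature-ranking step with a random forest, assuming a table of heart-failure records with a binary death-event outcome; the column names follow the public heart-failure clinical-records dataset, and the CSV path is a placeholder.

```python
# Rank predictors of mortality after heart failure by random-forest
# feature importance. The CSV path is a placeholder.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("heart_failure_clinical_records.csv")  # placeholder
X = df.drop(columns=["DEATH_EVENT"])
y = df["DEATH_EVENT"]

forest = RandomForestClassifier(n_estimators=500,
                                random_state=0).fit(X, y)

# Impurity-based importances; in the study, time, serum creatinine,
# and ejection fraction ranked highest.
importances = pd.Series(forest.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False).head(5))
```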
The use of computer vision to differentiate valley fever from lung cancer via CT scans of nodules
Pulmonary diseases like lung cancer and valley fever pose serious health challenges, making accurate and rapid diagnostics essential. This study developed a MATLAB-based software tool that uses computer vision techniques to differentiate between these diseases by analyzing features of lung nodules in CT scans, achieving higher precision than traditional methods.
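The study's tool was built in MATLAB; the Python/scikit-image sketch below only illustrates the kind of nodule features such a pipeline extracts (segment a nodule, then measure shape and intensity). The synthetic "scan" and threshold are illustrative assumptions.

```python
# Segment a bright nodule from a synthetic CT-like slice, then report
# shape and intensity features of the detected region.
import numpy as np
from skimage import measure, draw

# Synthetic slice: dark background with one bright elliptical nodule.
scan = np.zeros((128, 128))
rr, cc = draw.ellipse(64, 64, 12, 8)
scan[rr, cc] = 0.8
scan += np.random.default_rng(0).normal(0, 0.05, scan.shape)

mask = scan > 0.4                       # crude intensity threshold
labels = measure.label(mask)
for region in measure.regionprops(labels, intensity_image=scan):
    print("area:", region.area)
    print("eccentricity:", round(region.eccentricity, 3))
    print("solidity:", round(region.solidity, 3))  # irregularity proxy
    print("mean intensity:", round(region.mean_intensity, 3))
```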
An explainable model for content moderation
Given the increasing use of machine learning algorithms to moderate content on social media, the authors examined their ability to interpret language. Using an explainable model, they achieved 81% accuracy in detecting fake versus real news based on the language of posts alone.
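A minimal sketch of an explainable text classifier of this kind, using TF-IDF features with logistic regression so that per-word coefficients directly show which language drives each prediction; the example posts are placeholders, and this is not the study's actual model.

```python
# TF-IDF + logistic regression: the coefficients serve as the
# explanation of which words push a post toward the fake class.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["Scientists publish peer-reviewed study on vaccine safety",
         "SHOCKING secret cure THEY don't want you to know"] * 20
labels = [0, 1] * 20                    # 0 = real, 1 = fake

pipe = make_pipeline(TfidfVectorizer(), LogisticRegression())
pipe.fit(texts, labels)

# Words with the largest positive coefficients push predictions
# toward the fake class.
vec = pipe.named_steps["tfidfvectorizer"]
clf = pipe.named_steps["logisticregression"]
top = np.argsort(clf.coef_[0])[-5:]
print([vec.get_feature_names_out()[i] for i in top])
```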