Although there has been great progress in natural language processing (NLP) over the last few years, particularly with the development of attention-based models, comparatively little research has addressed modeling keystroke log data. State-of-the-art methods handle textual data directly, and while this has produced excellent results, the time complexity and resource usage of such methods are quite high. Additionally, these methods fail to incorporate the actual writing process when assessing text and instead focus solely on the content. We therefore proposed a framework for modeling textual data using keystroke-based features. Such methods attend to how a document or response was written rather than to the final text that was produced. These features are vastly different from those extracted from raw text but reveal information that is otherwise hidden. We hypothesized that pairing efficient machine learning techniques with keystroke log information could produce results comparable to transformer models, which weight the different components of a text sequence by relevance, in far less time. Transformer-based methods currently dominate NLP due to the strong understanding they display of natural language. We showed that models trained on keystroke log data can effectively evaluate the quality of writing in a significantly shorter amount of time than traditional methods. This is significant because it provides a much-needed fast and cheap alternative to increasingly large and slow LLMs.
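As a rough illustration of keystroke-based features of this kind, the sketch below derives timing statistics from a toy keystroke log; the log format, pause threshold, and feature names are hypothetical illustrations, not the authors' actual pipeline.

```python
# Minimal sketch: deriving timing features from a keystroke log.
# Log format and feature names are hypothetical, not the paper's pipeline.

def keystroke_features(log):
    """log: ordered list of (timestamp_ms, key) events."""
    gaps = [b[0] - a[0] for a, b in zip(log, log[1:])]
    n = len(log)
    return {
        "keystroke_count": n,
        "mean_inter_key_ms": sum(gaps) / len(gaps) if gaps else 0.0,
        "long_pauses": sum(1 for g in gaps if g > 2000),  # pauses > 2 s
        "backspace_rate": sum(1 for _, k in log if k == "Backspace") / n,
    }

log = [(0, "T"), (120, "h"), (250, "e"), (2600, "Backspace"), (2700, "e")]
feats = keystroke_features(log)
```

Features like these capture the writing process (pausing, revising) rather than the final content, which is what makes them cheap to compute compared to running a transformer over the full text.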
A meta-analysis on NIST post-quantum cryptographic primitive finalists
The advent of quantum computing will pose a substantial threat to the security of classical cryptographic methods, which could become vulnerable to quantum-based attacks. In response to this impending challenge, the field of post-quantum cryptography has emerged, aiming to develop algorithms that can withstand the computational power of quantum computers. Our research focused on four quantum-resistant algorithms endorsed by the United States' National Institute of Standards and Technology (NIST) in 2022: CRYSTALS-Kyber, CRYSTALS-Dilithium, FALCON, and SPHINCS+. This study evaluated the security, performance, and comparative attributes of the four algorithms, considering factors such as key size, encryption/decryption speed, and complexity. Comparative analyses against each other and against existing quantum-resistant algorithms provided insights into the strengths and weaknesses of each algorithm. This research also explored potential applications and future directions in the realm of quantum-resistant cryptography. Our findings concluded that the NIST algorithms were substantially more effective and efficient than classical cryptographic algorithms, and that implementing the NIST-endorsed quantum-resistant algorithms substantially reduced the vulnerability of cryptographic systems to quantum-based attacks. Ultimately, this work underscored the need to adapt cryptographic techniques in the face of advancing quantum computing capabilities, offering valuable insights for researchers and practitioners in the field.
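Speed comparisons of this kind typically rest on a simple timing harness that averages many repetitions of an operation. The sketch below shows one such harness; `toy_encapsulate` is a placeholder workload, not a real implementation of Kyber or any other NIST algorithm, which would instead come from an actual post-quantum cryptography library.

```python
import time

# Sketch of a micro-benchmark like those used to compare
# encryption/decryption speed. The workload below is a stand-in,
# NOT a real key-encapsulation operation.

def toy_encapsulate():
    return sum(i * i for i in range(1000))  # placeholder workload

def time_op(op, runs=100):
    start = time.perf_counter()
    for _ in range(runs):
        op()
    return (time.perf_counter() - start) / runs  # mean seconds per call

mean_s = time_op(toy_encapsulate)
```

Averaging over many runs with a monotonic clock like `time.perf_counter` smooths out scheduler jitter, which matters when the operations being compared differ by microseconds.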
Shortage of Black physicians: Florida Black medical student enrollment from 2013 to 2021
Black patients tend to have better health outcomes when cared for by Black physicians, yet Black doctors make up only 5% of U.S. physicians, despite Black people comprising 14% of the population. This analysis of data from Florida medical schools showed a higher enrollment of Black first-year students (13.5%) compared to the national average (9%), and a national increase from 6% in 2013 to 9% in 2021, aligning with the rise of social justice movements. Increasing Black medical student enrollment could reduce health disparities and improve outcomes for Black communities.
Effects of noise on information corruption in the quantum teleportation algorithm
In quantum computing, noise disrupts experimental results, particularly affecting the quantum teleportation algorithm used to transfer qubit states. This study explores how noise impacts this algorithm across different platforms—a perfect simulation, a noisy simulation, and real hardware.
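A common way to model such noise is the depolarizing channel, which mixes the intended state with the maximally mixed state. The sketch below is a generic illustration rather than the study's actual simulation setup: it shows how the fidelity of a received |+> state falls as the noise probability p grows.

```python
import numpy as np

# Sketch: fidelity loss of a transmitted |+> state under depolarizing
# noise, rho' = (1 - p) * rho + p * I/2. Generic model, not the
# study's exact platforms or circuits.

plus = np.array([1.0, 1.0]) / np.sqrt(2)   # |+> = (|0> + |1>)/sqrt(2)
rho = np.outer(plus, plus.conj())          # density matrix of |+>

def depolarize(rho, p):
    return (1 - p) * rho + p * np.eye(2) / 2

def fidelity(psi, rho):
    # <psi|rho|psi> for a pure target state psi
    return float(np.real(psi.conj() @ rho @ psi))

fids = [fidelity(plus, depolarize(rho, p)) for p in (0.0, 0.1, 0.5)]
# fidelity falls linearly as 1 - p/2
```

On a noiseless simulator p is effectively 0, while noisy simulators and real hardware correspond to nonzero p, which is why the teleported state degrades across those platforms.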
Mechanistic deconvolution of autoreduction in tetrazolium-based cell viability assays
Optical reporters like tetrazolium dyes, exemplified by 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT), are effective tools for quantifying cellular responses under experimental conditions. These dyes assess cell viability by producing brightly colored formazan products when reduced inside metabolically active cells. However, certain small molecules, including reducing agents like ascorbic acid, cysteine, and glutathione (GSH), can interfere with MTT assays, potentially compromising accuracy.
Automated dynamic lighting control system to reduce energy consumption in daylight
Buildings, which are responsible for the majority of electricity consumption in cities like Dubai, often rely exclusively on electrical lighting to meet their illumination requirements, even in the presence of daylight. This inefficient use of lighting creates potential to further optimize the energy efficiency of buildings by complementing natural light with electrical lighting. Prior research has mostly used ballasts (variable resistors) to regulate the brightness of bulbs; there has been limited research on the use of pulse width modulation (PWM) and of triodes for alternating current (TRIACs). PWM and TRIACs rapidly stop and restart the flow of current to the bulb, saving energy while maintaining a constant illumination level in a space. We conducted experiments to investigate the feasibility of using TRIACs and PWM to regulate the brightness of bulbs. We also established the relationship between power and brightness within the experimental setups. Our results indicate that lighting systems can be regulated through these alternative methods and that there is potential to save up to 16% of the energy used without affecting the overall lighting of a given space. Since most energy used in buildings is still produced from fossil fuels, energy savings from lighting systems could contribute to a lower carbon footprint. Our study provides an innovative solution to conserve lighting energy in buildings during the daytime.
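Since PWM rapidly switches the current on and off, a bulb's average power scales roughly with the duty cycle, the fraction of each period the current flows. The toy calculation below uses illustrative numbers, not the paper's measurements, to show where a saving of roughly 16% could come from.

```python
# Toy model: average power of a PWM-dimmed bulb scales with duty cycle.
# Wattage and duty cycle are illustrative, not the paper's measured values.

def avg_power(rated_watts, duty_cycle):
    """duty_cycle in [0, 1]: fraction of each PWM period the bulb is on."""
    return rated_watts * duty_cycle

full = avg_power(60, 1.0)      # bulb at full brightness
dimmed = avg_power(60, 0.84)   # daylight supplies the remaining illumination
savings = 1 - dimmed / full    # fraction of energy saved
```

The switching is far faster than the eye can follow, which is why the perceived illumination of the space stays constant while the average power drops.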
Automated classification of nebulae using deep learning & machine learning for enhanced discovery
There are believed to be ~20,000 nebulae in the Milky Way Galaxy. However, humans have only cataloged ~1,800 of them even though we have gathered 1.3 million nebula images. Classification of nebulae is important as it helps scientists understand the chemical composition of a nebula which in turn helps them understand the material of the original star. Our research on nebulae classification aims to make the process of classifying new nebulae faster and more accurate using a hybrid of deep learning and machine learning techniques.
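The hybrid idea, a learned feature extractor feeding a classical machine learning classifier, can be sketched as follows. The trivial image statistics here merely stand in for a CNN embedding, and the class names and data are invented for illustration.

```python
import numpy as np

# Sketch of a hybrid pipeline: a feature extractor (trivial image
# statistics standing in for a deep-learning embedding) feeding a
# classical nearest-centroid classifier. Classes and data are made up.

def embed(img):
    # Placeholder for a CNN embedding of a nebula image.
    return np.array([img.mean(), img.std()])

def nearest_centroid(train, labels, query):
    feats = np.array([embed(x) for x in train])
    classes = sorted(set(labels))
    centroids = {c: feats[[i for i, l in enumerate(labels) if l == c]].mean(axis=0)
                 for c in classes}
    q = embed(query)
    return min(classes, key=lambda c: np.linalg.norm(q - centroids[c]))

rng = np.random.default_rng(0)
bright = [rng.uniform(0.6, 1.0, (8, 8)) for _ in range(5)]  # toy "emission"
dim = [rng.uniform(0.0, 0.4, (8, 8)) for _ in range(5)]     # toy "dark"
pred = nearest_centroid(bright + dim, ["emission"] * 5 + ["dark"] * 5,
                        rng.uniform(0.6, 1.0, (8, 8)))
```

Splitting the work this way lets the expensive deep model run once per image while the cheap classical classifier handles the final decision, which is one route to faster classification of new nebulae.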
Using explainable artificial intelligence to identify patient-specific breast cancer subtypes
Breast cancer is the most common cancer in women, with approximately 300,000 women diagnosed in 2023. It ranks second in cancer-related deaths for women, behind lung cancer, with nearly 50,000 deaths. Scientists have identified important mutations in genes like BRCA1 and BRCA2 that lead to the development of breast cancer, but previous studies were limited because they focused on specific populations. To overcome these limitations, diverse populations and powerful statistical methods like genome-wide association studies and whole-genome sequencing are needed. Explainable artificial intelligence (XAI) can be applied in oncology and breast cancer research to overcome these limitations of specificity, as it can analyze datasets of diagnosed patients while providing interpretable explanations for identified patterns and predictions. This project aims to achieve technological and medicinal goals by using advanced algorithms to identify breast cancer subtypes for faster diagnoses. Multiple methods were utilized to develop an efficient algorithm. We hypothesized that an XAI approach would be best, as it can assign importance scores to genes, and predicted a 90% success rate. To test this hypothesis, we ran multiple trials utilizing XAI methods to identify class-specific and patient-specific key genes. The study demonstrated a pipeline that combines multiple XAI techniques to identify potential biomarker genes for breast cancer with a 95% success rate.
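One generic way an XAI pipeline can assign scores to genes is permutation importance: shuffle one gene's values and measure how much a model's accuracy drops. The sketch below uses synthetic data and a trivial fixed model; it illustrates the scoring idea, not the study's actual methods or datasets.

```python
import numpy as np

# Sketch: permutation importance as a per-gene score.
# Synthetic data; a generic XAI technique, not the study's pipeline.

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))          # 200 "patients", 3 "genes"
y = (X[:, 0] > 0).astype(int)          # only gene 0 drives the label

def accuracy(X, y):
    # Trivial fixed "model": predict the label from gene 0's sign.
    return float(np.mean((X[:, 0] > 0).astype(int) == y))

base = accuracy(X, y)
scores = []
for g in range(3):
    Xp = X.copy()
    Xp[:, g] = rng.permutation(Xp[:, g])   # break gene g's association
    scores.append(base - accuracy(Xp, y))  # accuracy drop = importance
```

Only the gene that actually carries signal produces a large accuracy drop when shuffled; the drop itself serves as an interpretable per-gene score of the kind an XAI pipeline can report.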
Predicting the factors involved in orthopedic patient hospital stay
Long hospital stays can be stressful for the patient for many reasons. We hypothesized that age would be the greatest predictor of length of stay among patients who underwent orthopedic surgery. Through our models, however, we found that severity of illness was the strongest predictor of patient length of stay. The next two factors were the facility at which the patient was staying and the type of procedure they underwent.
How are genetically modified foods discussed on TikTok? An analysis of #GMOFOODS
Here, the authors investigated engagement with #GMOFOODS, a hashtag on TikTok. They hypothesized that content focused on the negative effects of genetically modified organisms would receive more consumer-driven interaction. They found that the most common category focused on the disadvantages of GMOs related to nutrition and health, with the number of views determining whether a video would be shown to users.