Irrespective of the final application of a molecule, synthetic accessibility is the rate-determining step in discovering and developing novel entities. However, synthetic complexity is challenging to quantify as a single metric, since it is a composite of several measurable factors, including cost, safety, and availability. Moreover, defining a single synthetic accessibility metric that covers both natural and non-natural products poses a further challenge, given the structural distinctions between these two classes of compounds. Here, we propose a model of synthetic accessibility for all chemical compounds, inspired by the Central Limit Theorem, and devise a novel metric, fitted to a Gaussian distribution, that assesses the overall feasibility of making a chemical compound.
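A minimal sketch of the idea (our illustration, not the authors' published method): if the composite score is an average of several normalized sub-metrics such as cost, safety, and availability, the Central Limit Theorem suggests the scores across many compounds tend toward a Gaussian, which can then be fitted to calibrate the metric. All data and names below are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical per-compound sub-metrics, each scaled to [0, 1]:
# e.g., cost, safety, availability, step count, reagent stability.
sub_metrics = rng.random((10_000, 5))   # 10,000 compounds, 5 factors
composite = sub_metrics.mean(axis=1)    # composite accessibility score

# Fit a Gaussian to the empirical distribution of composite scores.
mu, sigma = stats.norm.fit(composite)

# Express a new compound's accessibility as a percentile of that Gaussian.
new_score = 0.62
percentile = stats.norm.cdf(new_score, mu, sigma)
print(f"mu={mu:.3f}, sigma={sigma:.3f}, percentile={percentile:.2%}")
```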
One-step photochemical crosslinking of native proteins is feasible in tyrosine-rich bovine serum albumin
In this study, the authors develop a new hydrogel by photochemical crosslinking of bovine serum albumin with methylene blue. They find that the resulting hydrogel has a number of useful applications.
Environmentally-friendly graphene conductive ink using graphene powder, polystyrene, and waste oil
In this article, the authors propose an effective, environmentally-friendly method of producing conductive ink using expired waste oil, polystyrene, and graphene.
The effects of food type on mediator-less microbial fuel cell electricity output
The authors examine how different food types affect the electricity output of bacteria in mediator-less microbial fuel cells.
Converting SiO2 wafers to hydrophobic using chlorotrimethylsilane
Semiconductors are at the center of the fourth industrial revolution, as they are key components of all electronics. Exposed wafers made of silicon (Si) oxidize easily, converting their surfaces to silicon dioxide (SiO2). The surface of a SiO2 wafer contains many Si-OH groups, which readily bond with water, resulting in a "wet," or hydrophilic, condition. We sought a way to modify the surface of SiO2 wafers to make them hydrophobic and thereby ensure safe wet cleaning.
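For context, the surface chemistry behind this conversion (a textbook summary, not quoted from the article) is silanization: chlorotrimethylsilane caps the polar silanol groups with nonpolar trimethylsilyl groups,

Si-OH (surface) + Cl-Si(CH3)3 → Si-O-Si(CH3)3 (surface) + HCl

leaving a methyl-terminated surface that repels water.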
The Cohesiveness of the Oscillating Belousov-Zhabotinsky Reaction
In this study the author undertakes a careful characterization of a special type of chemical reaction, called an oscillating Belousov-Zhabotinsky (or B-Z) reaction, which has a number of existing applications in biomedical engineering as well as the potential to be useful in future developments in other fields of science and engineering. Specifically, she uses experimental measurements in combination with computational analysis to investigate whether the reaction is cohesive – that is, whether the oscillations between chemical states will remain consistent or change over time as the reaction progresses. Her results indicate that the reaction is not cohesive, providing an important foundation for the development of future technologies using B-Z reactions.
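For readers who want to experiment, the two-variable Oregonator model is a standard computational stand-in for B-Z kinetics. The sketch below (not the author's analysis; the parameter values are illustrative textbook choices) integrates it and produces the sustained relaxation oscillations whose consistency the study examines.

```python
from scipy.integrate import solve_ivp
from scipy.signal import find_peaks

eps, q, f = 0.04, 0.0008, 1.0  # classic scaled Oregonator parameters

def oregonator(t, state):
    # x ~ HBrO2 concentration, z ~ oxidized catalyst, both dimensionless.
    x, z = state
    dx = (x * (1 - x) - f * z * (x - q) / (x + q)) / eps
    dz = x - z
    return [dx, dz]

# Stiff relaxation oscillations, so use LSODA with a capped step size.
sol = solve_ivp(oregonator, (0, 60), [0.1, 0.1], method="LSODA", max_step=0.05)

# Peaks in x recur at a near-constant period across the run.
peaks, _ = find_peaks(sol.y[0])
print("oscillation peaks found:", len(peaks))
print("peak times:", sol.t[peaks])
```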
Optical anisotropy of crystallized vanillin thin film: the science behind the art
Microscopic beauty is hiding in common kitchen ingredients: even vanillin flavoring can be turned into mesmerizing artwork by crystallizing the vanillin and examining it under a polarizing microscope. Wang and Pang explore this hidden beauty by determining the optimal conditions for growing crystalline vanillin films and by creating computer simulations of the chemical interactions between vanillin molecules.
Hybrid Quantum-Classical Generative Adversarial Network for synthesizing chemically feasible molecules
Current drug discovery processes can cost billions of dollars and usually take five to ten years. Researchers have been developing and implementing various computational approaches to search for molecules and compounds in a chemical space that may contain on the order of 10^60 molecules. One solution involves deep generative models, artificial intelligence models that learn from nonlinear data by modeling the probability distribution of chemical structures and generating new data points from the trends they identify. Aiming for faster runtime and greater robustness when analyzing high-dimensional data, we designed and implemented a Hybrid Quantum-Classical Generative Adversarial Network (QGAN) to synthesize molecules.
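A minimal sketch of the hybrid pattern (an illustration under our own assumptions, not the authors' architecture): a parameterized quantum circuit acts as the generator, producing a probability distribution over basis states that stand in for encoded molecular features, while a small classical network acts as the discriminator. It assumes PennyLane and PyTorch; the ansatz, layer sizes, and toy target distribution are all hypothetical.

```python
import pennylane as qml
import torch

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def generator(weights):
    # Variational ansatz: layers of single-qubit rotations plus entanglers.
    for layer in range(weights.shape[0]):
        for w in range(n_qubits):
            qml.RY(weights[layer, w], wires=w)
        for w in range(n_qubits - 1):
            qml.CNOT(wires=[w, w + 1])
    return qml.probs(wires=range(n_qubits))  # distribution over 16 basis states

# Classical discriminator: scores a probability vector as real vs. generated.
discriminator = torch.nn.Sequential(
    torch.nn.Linear(2 ** n_qubits, 16), torch.nn.ReLU(),
    torch.nn.Linear(16, 1), torch.nn.Sigmoid(),
)

weights = torch.randn(3, n_qubits, requires_grad=True)
opt_g = torch.optim.Adam([weights], lr=0.05)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=0.05)
bce = torch.nn.BCELoss()

# Toy "real" distribution standing in for encoded training molecules.
real = torch.zeros(2 ** n_qubits)
real[3] = real[12] = 0.5

for step in range(200):
    # Discriminator step: push real -> 1, generated -> 0.
    fake = generator(weights).float().detach()
    opt_d.zero_grad()
    d_loss = bce(discriminator(real), torch.ones(1)) + \
             bce(discriminator(fake), torch.zeros(1))
    d_loss.backward()
    opt_d.step()

    # Generator step: adjust circuit parameters to fool the discriminator.
    opt_g.zero_grad()
    g_loss = bce(discriminator(generator(weights).float()), torch.ones(1))
    g_loss.backward()
    opt_g.step()
```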
Automated classification of nebulae using deep learning & machine learning for enhanced discovery
There are believed to be ~20,000 nebulae in the Milky Way Galaxy. However, humans have cataloged only ~1,800 of them, even though we have gathered 1.3 million nebula images. Classification of nebulae is important because it helps scientists understand a nebula's chemical composition, which in turn sheds light on the material of the original star. Our research aims to make the process of classifying new nebulae faster and more accurate using a hybrid of deep learning and machine learning techniques.
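One common way to realize such a hybrid, sketched below under our own assumptions (the abstract does not specify the authors' exact models): a pretrained convolutional network supplies image features, and a classical machine learning model performs the classification. The backbone choice and stand-in data are illustrative.

```python
import torch
import torchvision
from sklearn.svm import SVC

# Deep learning half: a pretrained CNN with its classifier head removed
# acts as a fixed feature extractor (downloads ImageNet weights on first use).
backbone = torchvision.models.resnet18(weights="IMAGENET1K_V1")
backbone.fc = torch.nn.Identity()
backbone.eval()

def extract_features(images: torch.Tensor) -> torch.Tensor:
    # images: (N, 3, 224, 224), normalized the same way as ImageNet inputs.
    with torch.no_grad():
        return backbone(images)  # (N, 512) feature vectors

# Hypothetical stand-in data: 32 "nebula images" across 4 classes.
images = torch.randn(32, 3, 224, 224)
labels = torch.randint(0, 4, (32,))

# Machine learning half: a classical SVM classifies the CNN features.
features = extract_features(images).numpy()
clf = SVC(kernel="rbf").fit(features, labels.numpy())
print(clf.predict(features[:5]))
```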
Using machine learning to develop a global coral bleaching predictor
Coral bleaching is a fatal process that reduces coral diversity, leads to habitat loss for marine organisms, and is a symptom of climate change. This process occurs when corals expel their symbiotic dinoflagellates, algae that photosynthesize within coral tissue and provide corals with glucose. Restoration efforts have attempted to repair damaged reefs; however, there are over 360,000 square miles of coral reefs worldwide, making it challenging to target conservation efforts. Thus, predicting the likelihood of bleaching in a given region would make it easier to allocate resources for conservation. We developed a machine learning model to predict global locations at risk for coral bleaching. Data obtained from the Biological and Chemical Oceanography Data Management Office consisted of various coral bleaching events and the parameters under which the bleaching occurred. Sea surface temperature, sea surface temperature anomalies, longitude, latitude, and coral depth below the surface were the features found to be most correlated with coral bleaching. Thirty-nine machine learning models were tested to determine which one most accurately used these parameters to predict the percentage of corals that would be bleached. A random forest regressor with an R-squared value of 0.25 and a root-mean-squared error of 7.91 was determined to be the best model for predicting coral bleaching. In the end, the random forest model had a 96% accuracy in predicting the percentage of corals that would be bleached. This prediction system can make it easier for researchers and conservationists to identify coral bleaching hotspots and properly allocate resources to prevent or mitigate bleaching events.
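A minimal sketch of the modeling step described above, assuming a tabular export with the five listed features; the file name, column names, and hyperparameters are hypothetical stand-ins rather than the authors' settings.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

df = pd.read_csv("bleaching_events.csv")  # hypothetical BCO-DMO export
features = ["sst", "sst_anomaly", "longitude", "latitude", "depth_m"]
X, y = df[features], df["percent_bleached"]

# Hold out a test split, fit the regressor, and report the same metrics
# the abstract cites (R-squared and root-mean-squared error).
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

pred = model.predict(X_test)
print("R^2:", r2_score(y_test, pred))
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
```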