Among its many other disadvantages, poverty often means living in a more polluted environment. A recent study by MSI Principal Investigators Julian Marshall (an associate professor in the Department of Civil, Environmental, and Geo-Engineering, College of Science and Engineering, and Fellow of the Institute on the Environment) and Dylan Millet (an associate professor in the Department of Soil, Water, and Climate, College of Food, Agricultural, and Natural Resource Sciences, and Fellow of the Institute on the Environment), published in PLoS One, investigated disparities in health risks across US regions, states, and cities. Focusing on the air pollutant NO2, one of the US EPA's criteria pollutants, the research showed that poorer populations are more likely to be exposed to higher levels of NO2, which are associated with poorer health. Poorer populations may also be more affected by pollutants because of compounding factors such as more limited access to health-care services.
While studies of the relationship between pollution and socioeconomic status have been carried out within individual cities, little data exists on broader patterns across the entire US. This research covered the contiguous US states and included both urban and rural areas, combining US Census demographic data with a recently published high-resolution dataset of outdoor NO2 concentrations. The authors used MSI computational resources to process the data.
The results of this study showed that, within a given urban area, nonwhites are exposed to more NO2 than whites, and lower-income people are exposed to more NO2 than those with higher incomes. These higher exposures among poorer and nonwhite populations may have serious health implications. The county- and city-level results may give policy-makers a means to determine how pollution-control efforts should be directed. The article can be read on the PLoS One website (Clark, Lara P., Dylan B. Millet, and Julian D. Marshall. 2014. National patterns in environmental injustice and inequality: Outdoor NO2 air pollution in the United States. PLoS One 9 (4) (APR 15): e94431.)
Image description: The left side shows differences in population-weighted mean NO2 concentrations between low-income nonwhites (LIN) and high-income whites (HIW), with large positive differences (red colors) indicating higher injustice (larger concentration difference between LIN and HIW). The right column shows the Atkinson Index (NO2 inequality), with higher values indicating greater inequality. This image shows results for urban areas in the contiguous US. (Image and description, L. Clark, et al., PLoS One 9 (4) (APR 15): e94431.)
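Both metrics shown in the figure are straightforward to compute from block-level data. The sketch below is purely illustrative (it is not the paper's code, and the inequality-aversion parameter and the NO2 values are assumptions): it computes a population-weighted mean concentration and a population-weighted Atkinson index, which is zero under perfect equality and grows toward one as exposures become more unequal.

```python
import math

def pop_weighted_mean(conc, pop):
    """Population-weighted mean concentration across census blocks."""
    total = sum(pop)
    return sum(c * p for c, p in zip(conc, pop)) / total

def atkinson_index(conc, pop, eps=0.75):
    """Population-weighted Atkinson inequality index (0 = perfect equality).
    eps is the inequality-aversion parameter (an assumed value; eps != 1)."""
    total = sum(pop)
    mean = pop_weighted_mean(conc, pop)
    s = sum((p / total) * (c / mean) ** (1.0 - eps)
            for c, p in zip(conc, pop))
    return 1.0 - s ** (1.0 / (1.0 - eps))

# Two hypothetical census blocks with equal population but unequal exposure
no2 = [20.0, 10.0]   # ppb (made-up values)
pop = [1000, 1000]
mean = pop_weighted_mean(no2, pop)   # 15.0 ppb
a = atkinson_index(no2, pop)         # small but positive
```

With identical concentrations in every block the index collapses to zero, so it isolates the inequality of the exposure distribution from its overall level.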
posted on September 17, 2014
Anyone who plays video games or watches movies that use computer-generated imagery knows that virtual environments are becoming more and more realistic. At the same time, consumer computers continue to grow more powerful, with multiple processing cores. The algorithms that create virtual environments need to be structured so that they can produce realistic, interactive realms in games and other virtual-reality applications as efficiently as possible.
One way to improve efficiency is to write a parallel program: one broken into several pieces that run concurrently rather than sequentially. Video-game developers are working on programs that can be parallelized to make use of multiple cores or computer clusters. MSI Principal Investigator Stephen Guy, an assistant professor in the Department of Computer Science and Engineering (College of Science and Engineering), and members of his group are working on these kinds of programs. At the 6th International Conference on Motion in Games in Dublin, Ireland, in November 2013, Professor Guy, along with student John Koenig and post-doctoral researcher Dr. Ioannis Karamouzas, presented a paper describing a new object-centric algorithm for parallel rigid-body simulation. In the object-centric method, each simulated body is modeled independently and is therefore self-contained. Objects that are relatively isolated from other objects can be modeled with larger time-steps, which improves efficiency and frees computation time for objects that are interacting more closely.
This object-centric method yields interactive, real-time simulations that scale across many CPU cores. The paper includes scenarios consisting of hundreds of interacting objects. It can be found in the Association for Computing Machinery's Digital Library (Koenig, John, Ioannis Karamouzas, Stephen J. Guy. Object-centric parallel rigid body simulation with timewarp. 2013. Proceedings of MIG '13, Motion in Games, 203-212. DOI: 10.1145/2522628.2522652).
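To convey the idea (this is a toy sketch, not the authors' implementation), the snippet below gives each body its own clock and time-step: isolated bodies advance with a large step, while bodies near a neighbor drop to a small one, and each body can be advanced by a separate worker. The step sizes, interaction radius, and one-dimensional motion are all hypothetical simplifications.

```python
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass
class Body:
    """A self-contained body carrying its own clock (1-D for brevity)."""
    pos: float
    vel: float
    time: float = 0.0

def local_step(body, neighbor_pos, t_end, dt_far=0.1, dt_near=0.01, radius=1.0):
    """Advance one body to t_end with its own adaptive time-step: isolated
    bodies take the large step; bodies near a neighbor take the small one."""
    while body.time < t_end:
        near = any(abs(body.pos - p) < radius for p in neighbor_pos)
        dt = min(dt_near if near else dt_far, t_end - body.time)
        body.pos += body.vel * dt   # free flight; collision response omitted
        body.time += dt
    return body

bodies = [Body(pos=5.0 * i, vel=0.5) for i in range(4)]
# Snapshot neighbor positions so each worker reads only immutable data
snapshots = [[o.pos for o in bodies if o is not b] for b in bodies]
with ThreadPoolExecutor() as pool:
    futures = [pool.submit(local_step, b, s, 1.0)
               for b, s in zip(bodies, snapshots)]
    results = [f.result() for f in futures]
```

The real algorithm additionally handles collisions and rolls a body back ("timewarp") when a neighbor's update invalidates its optimistic large step; the sketch only shows the per-object clock and step-size selection.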
Image description: A dynamic scene with two hundred spheres falling onto five static cylinders. The simulation approach is object-centric, with each object modeled as a soft thread and simulated independently. This results in scalable performance, achieving a 5-6x simulation speedup on eight cores and a 9-10x speedup on 16 cores. Image and description J. Koenig et al., Proceedings of MIG '13, Motion in Games, 203-212 (2013). ©2013 Association for Computing Machinery
posted on September 3, 2014
Graphene, a form of pure carbon that exists in one-atom-thick sheets, is of great interest to scientists and engineers because of its strength and other remarkable properties. Graphene sheets can be formed into other structures, and the mechanical properties of a graphene nanostructure differ from those of structures at the macroscale.
A graphene tube with a radius of less than one nanometer is known as a carbon nanotube. MSI Principal Investigator Traian Dumitrica, a professor in the Department of Mechanical Engineering (College of Science and Engineering), has been studying the mechanical properties of carbon nanotubes for several years. In a recent paper that appeared in Physical Review B, Professor Dumitrica and his colleagues modeled bent graphene as a large-radius (>1 nm) carbon nanotube to investigate its bending rigidity. The authors developed a simple analytic formula for the bending energy, which was confirmed by tight-binding objective molecular dynamics calculations. They believe that this simulation approach may also be applicable to understanding bending behavior of other atomic monolayers. The article can be found on the Physical Review B website (I. Nikiforov, E. Dontsova, R.D. James, T. Dumitrica. Tight-binding theory of graphene bending. 2014. Physical Review B. 89, 155437).
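The continuum baseline against which such atomistic results are compared is thin-plate theory: a sheet rolled into a cylinder of radius R stores bending energy (1/2)·D/R² per unit area, where D is the bending rigidity. The sketch below evaluates that textbook formula per carbon atom; the rigidity value is an assumed, literature-typical number, not a result from the Nikiforov et al. paper.

```python
# Continuum thin-plate estimate of graphene bending energy.  The rigidity
# D ~ 1.4 eV is an assumed, literature-typical value (NOT from the paper);
# the area per atom follows from the graphene lattice constant a = 2.46 A.
D_EV = 1.4            # bending rigidity, eV
AREA_PER_ATOM = 2.62  # graphene area per carbon atom, A^2

def bending_energy_per_atom(radius_angstrom, D=D_EV):
    """Bending energy (eV/atom) of graphene rolled into a cylinder of the
    given radius, from the continuum formula U = (1/2) * D / R**2 per area."""
    curvature = 1.0 / radius_angstrom
    return 0.5 * D * curvature ** 2 * AREA_PER_ATOM

# The 1/R^2 falloff: a tube of radius ~6.8 A versus one of radius 20 A
e_small = bending_energy_per_atom(6.8)
e_large = bending_energy_per_atom(20.0)
```

The 1/R² scaling is the continuum prediction; the interest of the tight-binding study lies precisely in where and how real atomic monolayers depart from it.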
Earlier research relating to this work appeared in a 2011 paper in Physical Review Letters (D.-B. Zhang, E. Akatyeva, T. Dumitrica. Bending ultrathin graphene at the margins of continuum mechanics. 2011. Physical Review Letters. 106, 255503). An article also appeared in MSI’s Research Bulletin (Understanding and Predicting Properties of Nanostructures: Insights From Atomic-level Simulations. Supercomputing Institute Research Bulletin, Spring 2010).
Image description: Schematic of the symmetries used in objective molecular dynamics simulations. The model simulated carbon nanotubes (CNTs) of varying diameter; these simulate ideal graphene sheets rolled into constant-curvature cylinders. (a) Pure rotation around the CNT axis of angle ψ. (b) Rotation around the CNT axis of angle γ combined with translation ρ along the CNT axis. Image and description Nikiforov, I., et al., Phys Rev B, 2014, 89:155437. ©2014 American Physical Society
posted on August 20, 2014
Fluorescent chromophores are compounds that can emit light under certain circumstances. They are useful to chemists as dyes or markers, and can be used for such applications as medical and biological imaging.
In a recent paper that appeared in the Journal of the American Chemical Society, Professor Victor Nemykin (Chemistry and Biochemistry, University of Minnesota Duluth) and his collaborators at the University of Akron (Akron, Ohio) discuss a new fluorescent chromophore, or fluorophore, called BOPHY. (BOPHY stands for bis(difluoroboron)-1,2-bis((1H-pyrrol-2-yl)methylene)hydrazine.) This compound is related to a very successful class of fluorophores already in use, the boron dipyrromethene family of compounds. BOPHY is important because it can be produced with a simple two-step procedure, and it is highly fluorescent.
The researchers studied two compounds, the BOPHY chromophore and a tetramethyl-substituted BOPHY analogue called Me4BOPHY. They investigated the absorption and emission spectra of these compounds using density functional theory (DFT) and time-dependent DFT (TDDFT) calculations. The graphs on the right side of the figure above show the (top) experimental and (middle, bottom) TDDFT-predicted absorption spectra of (left) 2 and (right) 4 in dichloromethane, showing the excellent agreement between theory and experiment. The left of the figure shows a schematic of BOPHY (2) and Me4BOPHY (4) (top) and their fluorescence.
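A routine step in comparing TDDFT output with experiment, as in plots like those in the figure, is broadening the computed "stick" spectrum of excitation energies and oscillator strengths into a smooth curve. The sketch below shows the usual Gaussian-broadening recipe; the excitation energies, strengths, and width are placeholders, not numbers from the BOPHY paper.

```python
import math

def broadened_spectrum(excitations, grid, sigma=0.15):
    """Sum a Gaussian at each TDDFT excitation energy (eV), weighted by its
    oscillator strength, over an energy grid -- the standard way a stick
    spectrum is turned into a smooth absorption curve.  sigma (eV) is an
    assumed broadening width."""
    spec = []
    for e in grid:
        val = sum(f * math.exp(-((e - e0) ** 2) / (2.0 * sigma ** 2))
                  for e0, f in excitations)
        spec.append(val)
    return spec

# Hypothetical excitations (energy in eV, oscillator strength) -- placeholders
sticks = [(2.8, 0.45), (4.1, 0.20)]
grid = [2.0 + 0.01 * i for i in range(301)]   # 2.0 to 5.0 eV
spectrum = broadened_spectrum(sticks, grid)
```

The resulting curve peaks at the strongest excitation, which is what makes visual overlays of theory and experiment, as in the figure, meaningful.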
This paper first appeared on the JACS website in April 2014. It was published in the April 16, 2014 print edition of the journal (Tamgho, Ingrid-Suzy, Abed Hasheminasab, James T. Engle, Victor N. Nemykin, and Christopher J. Ziegler. 2014. A new highly fluorescent and symmetric pyrrole-BF2 chromophore: BOPHY. Journal of the American Chemical Society 136 (15) (APR 16): 5623-6).
posted on August 6, 2014
Nanostructures are devices built on an extremely tiny scale (a nanometer is one-billionth of a meter). Materials at this scale show unique properties that affect how they behave. These properties mean that nanomaterials may be useful for novel and interesting applications, but we need to understand how to work with them and how they will react.
One feature of nanomaterials is that they seem to have a random (stochastic) response when subjected to external loading. In a recent paper that appeared in the Proceedings of the National Academy of Sciences of the USA, two MSI Principal Investigators, Associate Professor Ryan Elliott and Professor Ellad Tadmor, worked with Subrahmanyam Pattamatta to investigate this behavior. The authors are in the Department of Aerospace Engineering and Mechanics in the College of Science and Engineering. Because standard simulation methods are insufficient to capture the complex behavior of nanomaterials, the authors developed a new method: an equilibrium map (EM) that characterizes the material's responses. This EM-based approach allows for simulation of nanostructure experiments; the paper demonstrates the method on a nanoslab of nickel. The paper can be found on the PNAS website: Pattamatta, Subrahmanyam, Ryan Elliott, and Ellad Tadmor. 2014. Mapping the stochastic response of nanostructures. Proceedings of the National Academy of Sciences of the USA 111(17):E1678-E1686. Published online before print.
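A one-dimensional caricature can convey what an equilibrium map records: as a load parameter increases, equilibrium branches (energy minima) appear and disappear, and the system jumps between them. The toy double-well energy below is purely illustrative and has no connection to the paper's nickel-nanoslab model; it just shows a tracked minimum vanishing at a fold, forcing a jump to another branch.

```python
def relax(x0, lam, step=0.02, iters=5000):
    """Relax to the nearest minimum of the toy loaded double-well energy
    E(x) = x**4 - x**2 - lam*x by gradient descent on its force."""
    x = x0
    for _ in range(iters):
        x -= step * (4.0 * x**3 - 2.0 * x - lam)
    return x

# Continuation: start in the left well and ramp the load lam.  Past the fold
# point (lam ~ 0.544) the left-well branch ceases to exist, and the state
# jumps to the right well -- a 1-D caricature of the branch switching that
# an equilibrium map records for a real nanostructure.
branch = []
x = -0.7   # near the left-well minimum at lam = 0
for i in range(21):
    lam = 0.05 * i          # load from 0.0 to 1.0
    x = relax(x, lam)
    branch.append((lam, x))
```

In the real method the "map" is high-dimensional and the branch actually taken varies stochastically with loading rate, which is what the colored lines in the paper's figure depict; here the jump is deterministic for simplicity.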
Professor Elliott and his research group use MSI for research into objective structures using a parallel code. The group is investigating the scalability and performance of the code. Professor Tadmor and his group are developing an optimal and parallel version of the quasicontinuum method, which is a multiscale technique based on the idea of representative atoms and finite element interpolation.
Image description: A schematic of the possible behaviors of a compressed nickel nanoslab. As the compression increases with time, the initially perfect structure (bottom) develops defects associated with minima on its evolving potential energy surface. The sequence of states observed in repeated experiments on nominally identical nanostructures is highly stochastic and rate dependent. Each colored line in the figure represents one such realization obtained using a new computational method described in the paper by S. Pattamatta et al. Image courtesy of Subrahmanyam Pattamatta, Ryan Elliott, and Ellad Tadmor.
posted on July 23, 2014