Saturday, October 22, 2011

Building better catalysts

 University of Utah chemists developed a method to design and test new catalysts, which are substances that speed chemical reactions and are crucial for producing energy, chemicals and industrial products. By using the new method, the chemists also made a discovery that will make it easier to design future catalysts.


The discovery: the sizes and electronic properties of catalysts interact to determine how well a catalyst performs; they are not independent factors, as was previously thought. Chemistry professor Matt Sigman and doctoral student Kaid Harper report their findings in the Sept. 30, 2011, issue of the journal Science.


"It opens our eyes to how to design new catalysts that we wouldn't necessarily think about designing, for a broad range of reactions," Sigman says. "We're pretty excited."


Sigman believes the new technique for designing and testing catalysts "is going to be picked up pretty fast," first by academic and then by industrial chemists, who "will see it's a simple way to rapidly design better catalysts."


Catalysts Make the World Go 'Round


Catalysts speed chemical reactions without being consumed by those reactions. Their importance to society and the economy is tough to overstate. Products made with catalysts include medicines, fuels, foods and fertilizers.


Ninety percent of U.S. chemical manufacturing processes involve catalysts, which also are used to make more than one-fifth of all industrial products. Those processes consume a great deal of energy, so making catalytic reactions more efficient would both save energy and reduce emissions of climate-warming carbon dioxide.


"Catalysts make the world go 'round," says Sigman. "Catalysts are how we make molecules more efficiently and, more important, make molecules that can't be made using any other method."


The Utah researchers developed a new method for rapidly identifying and designing what are known as "asymmetric catalysts," which are catalyst molecules that are considered either left-handed or right-handed because they are physically asymmetrical. In chemistry, this property of handedness is known as chirality.


Chemists want new asymmetric catalysts because they impart handedness or chirality to the molecules they are used to make. For example, when a left-handed or right-handed catalyst is used to speed a chemical reaction, the chemical that results from that reaction can be either left-handed or right-handed.


"Handedness is an essential component of a drug's effectiveness," Sigman says.


Drugs generally work by latching onto proteins involved in a disease-causing process. The drug is like a key that fits into a protein lock, and chirality "is the direction the key goes" to fit properly and open the lock, says Sigman.


"However, developing asymmetric catalysts [to produce asymmetric drug molecules] can be a time-consuming and sometimes unsuccessful undertaking" because it usually is done by trial and error, he adds.


Sigman says the new study "is a step toward developing faster methods to identify optimal catalysts and insight into how to design them."


A Mathematical Approach to Catalyst Design


Harper and Sigman combined principles of data analysis with principles of catalyst design to create a "library" of nine related catalysts that they hypothesized would effectively catalyze a given reaction -- one that could be useful for making new pharmaceuticals. Essentially, they used math to find the optimal size and electronic properties of the candidate catalysts.


Then the chemists tested the nine catalysts -- known as "quinoline proline ligands" -- to determine how well their degree of handedness would be passed on to the chemical reaction products the catalysts were used to produce.


Sigman and Harper depicted the results of the reactions run with the different catalysts as a three-dimensional mathematical surface that bulges upward. The highest part of the bulge corresponds to the catalysts that imparted the greatest degree of handedness to the reaction products.
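The surface-fitting step lends itself to a short numerical sketch. The Python code below fits a quadratic response surface to an invented 3-by-3 catalyst library (the steric and electronic parameters and the selectivity values are illustrative, not the paper's data) and locates the peak of the surface. The cross term in the fit is what captures the interaction between size and electronic properties that the study reports.

```python
import numpy as np

# Hypothetical 3x3 catalyst library: each catalyst is described by a
# steric parameter (x), an electronic parameter (y), and a measured
# selectivity (z). Values are invented for illustration.
x = np.array([-1.0, -1.0, -1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
y = np.array([-1.0,  0.0,  1.0, -1.0, 0.0, 1.0, -1.0, 0.0, 1.0])
z = np.array([ 0.2,  0.5,  0.3,  0.6, 1.1, 0.9,  0.4,  0.8, 0.7])

# Fit a quadratic surface z = a + b*x + c*y + d*x^2 + e*y^2 + f*x*y.
# The cross term f*x*y models the size-electronics interaction.
A = np.column_stack([np.ones_like(x), x, y, x**2, y**2, x * y])
coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)

# Evaluate the fitted surface on a grid and locate its peak -- the
# predicted optimal combination of steric and electronic properties.
gx, gy = np.meshgrid(np.linspace(-1, 1, 101), np.linspace(-1, 1, 101))
G = np.column_stack([np.ones(gx.size), gx.ravel(), gy.ravel(),
                     gx.ravel()**2, gy.ravel()**2, gx.ravel() * gy.ravel()])
surface = (G @ coeffs).reshape(gx.shape)
i, j = np.unravel_index(np.argmax(surface), surface.shape)
print(f"predicted optimum: steric={gx[i, j]:.2f}, electronic={gy[i, j]:.2f}")
```

In practice only nine measured points constrain six coefficients, so the fit is coarse; the point of the exercise is that the peak of the fitted surface suggests which catalyst to synthesize next.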


This technique was used -- and can be used in the future -- to identify the optimal catalysts among a number of candidates. But it also revealed the unexpected link between the size and electronic properties of catalysts in determining their effectiveness in speeding reactions.


"This study shows quantitatively that the two factors are related," and knowing that will make it easier to design many future catalysts, Sigman says.


The new study was funded by the National Science Foundation.


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by University of Utah.

Journal Reference:

Matt Sigman, Kaid Harper. A Well-Behaved Catalyst. Science, 2011; 333 (6051): 1797 DOI: 10.1126/science.333.6051.1797-f

One room -- 63 different dust particles? Researchers aim to build dust library

 Researchers recently isolated 63 unique dust particles from their laboratory -- and that's just the beginning. The chemists were testing a new kind of sensor when dust got stuck inside it, and they discovered that they could measure the composition of single dust particles.


In a recent issue of The Journal of Physical Chemistry C, they describe how the discovery could aid the study of respiratory diseases caused by airborne particles.


Most dust is natural in origin, explained James Coe, professor of chemistry at Ohio State University. The 63 particles they identified were mainly irregular blobs containing bits of many different ingredients.


The most common ingredient of the dust particles was organic matter, Coe said. "Organic" indicates some kind of plant or animal material, though the researchers can't yet say precisely what kinds of organic matter they found. They are about to do an in-depth analysis to find out.


Quartz was the second-most common ingredient. Both quartz and organic matter were found in more than half of the dust particles the researchers classified. Human-made chemicals from air pollution, fertilizers, and construction materials were also present in small amounts.


"In that way, a single dust particle is like a snapshot of mankind's impact on the environment," Coe said.


Scientists have had some difficulty getting precise measurements of dust composition, in part because standard techniques involve studying dust in bulk quantities rather than individual particles.


Nowhere is dust composition more important than in public health, where some airborne particulates have been linked to diseases. Coe cited silica dust from mining operations, which causes a lung disease called silicosis.


The patented sensor that Coe's team was testing -- a type of metal mesh that transmits infrared light through materials caught in the holes -- is ideal for picking up minute details in the composition of single dust grains.


"We can separate particles by size to isolate the ones that are small enough to get into people's lungs, and look at them in detail," he added.


Coe didn't set out to study dust. He and his team invented the metal mesh sensor in 2003, and discovered that they could use it to create surface plasmons -- mixtures of conducting electrons and photons. The effect boosts the intensity of light passing through microscopic holes in the mesh, and lets scientists record a detailed infrared light spectrum. Any material stuck in the holes will leave a unique signature on the spectrum, so the sensor can be used to identify the chemicals in microscopic samples.
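Identifying a material from its spectral signature can be illustrated with a toy matching routine. The sketch below uses synthetic Gaussian absorption bands standing in for real IR spectra, and scores an observed spectrum against a small reference library with cosine similarity -- one simple matching scheme, not necessarily the one Coe's group uses.

```python
import numpy as np

# Toy spectral matching: which reference material best explains an
# observed spectrum? Library entries and the observed spectrum are
# synthetic Gaussian absorption bands, not real IR data.
wavenumbers = np.linspace(800, 1800, 500)

def band(center, width=30.0):
    """A single Gaussian absorption band on the wavenumber axis."""
    return np.exp(-((wavenumbers - center) ** 2) / (2 * width ** 2))

library = {
    "quartz":    band(1080),
    "carbonate": band(1430),
    "gypsum":    band(1120) + band(1620),
}

# Simulated measurement: a quartz-like band plus a little noise.
observed = band(1080) + 0.05 * np.random.default_rng(0).normal(size=500)

def cosine(a, b):
    """Cosine similarity between two spectra."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = {name: cosine(observed, ref) for name, ref in library.items()}
best = max(scores, key=scores.get)
print(best)
```

A real dust particle is a mixture, so a practical analysis would fit the observed spectrum as a weighted combination of library components rather than picking a single best match.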


Early this year, the researchers were testing how light flows through the sensor, and they coated the mesh with a ring of tiny latex spheres to take a baseline measurement. The result should have been a spectrum unique to latex, but instead the spectrum carried the signature of several common minerals due to a single dust particle that had gotten inside the sensor -- most likely from the laboratory air.


Coe launched a contest among his students to see who would be the first to take an infrared spectrum of a single dust particle -- and an electron microscope image of the same particle. The winner got a free lunch and the chance to name the particle for publication.


Matthew McCormack, then an honors undergraduate student in the lab, won the contest and named the dust particle after his dog, Abby. His study of the particle formed the basis for his honors thesis, and the data has since been used by Coe and other members of the team in publications and presentations.


In subsequent tests, the students were able to isolate and study 63 individual dust particles from the air of their laboratory. The spectra they obtained with the sensor were free of scattering effects and stronger than expected.


The result is a library of common dust components from the lab. Forty of the particles (63 percent) contained organic material. The most common mineral was quartz, which was present in 34 (54 percent) of the particles, followed by carbonates (17 particles, or 27 percent), and gypsum (14 particles, or 22 percent).


Currently, Coe and his team are constructing computer algorithms to better analyze the mineral components and reveal details about the organic components.


A library of common dust components would be useful for many areas of science, he said.


Eventually, researchers in public health could use the sensor as a laboratory tool to analyze dust particles. It could also enable studies in astronomy, geology, environmental science, and atmospheric science.


McCormack is a co-author on the paper, along with Katherine Cilwa, now a postdoctoral researcher in chemistry at the University of Michigan; Michelle Lew, now a doctoral student in chemistry at Indiana University; Christophe Robitaille, now in medical school at the University of Chicago; Lloyd Corwin, a former Ohio State undergraduate student in nuclear engineering; and Marvin Malone, a current doctoral student in Coe's laboratory.


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by Ohio State University.

Journal Reference:

Katherine E. Cilwa, Matthew McCormack, Michelle Lew, Christophe Robitaille, Lloyd Corwin, Marvin A. Malone, James V. Coe. Scatter-Free IR Absorption Spectra of Individual, 3–5 µm, Airborne Dust Particles Using Plasmonic Metal Microarrays: A Library of 63 Spectra. The Journal of Physical Chemistry C, 2011; 115 (34): 16910 DOI: 10.1021/jp205383h

Nanopores on a chip: Applications for analytical tasks in chemistry and biology

Biological nanopores are proteins of only a few nanometers in diameter that form tiny water-filled canals. They have proven to be promising tools in the field of nanobiotechnology. In a joint project at the University of Freiburg, a research group led by Prof. Dr. Jan C. Behrends, Institute of Physiology, and scientists working under Prof. Dr. Jürgen Rühe, Department of Microsystems Technology (IMTEK), have succeeded in arranging nanopores on a tiny microchip and using it to determine the mass of chain-like molecules called polymers with a high degree of precision.


In these experiments, the nanopores assume the role of the actual sensor. Dr. Gerhard Baaken, first author of the study, now published in ACS Nano, a journal of the American Chemical Society, hopes that the new development will be instrumental in exploiting the great potential of nanopore analysis for chemistry and the life sciences.


In their natural environment, nanopores often serve to transport larger molecules; for instance, they convey proteins through membranes. Bacteria also use nanopores to destroy the cells of infected organisms. One example is alpha-hemolysin, a protein produced by staphylococci to destroy red blood cells. This protein has recently found applications in analytical tasks in chemistry and biology: when a large molecule enters the pore, the pore becomes partially blocked for fractions of a second.


By measuring the electrical conductivity of the hemolysin pore, scientists can detect the presence of a single molecule, much as a photoelectric light barrier detects a passing object. The same principle also allows a very precise measurement of the molecule's size. Scientists see great potential in the applications -- not only for analyzing synthetic polymer mixtures, but also for analyzing genetic material, and even as a quick and inexpensive way to sequence DNA.
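The light-barrier analogy can be made concrete with a toy event detector. The sketch below simulates a noisy open-pore current trace with three blockade events and counts them with a simple threshold; all current values and durations are invented for illustration, not taken from the study.

```python
import numpy as np

# Toy blockade detection: the nanopore current drops when a molecule
# partially blocks the pore. Simulate an open-pore current with noise,
# insert blockade events, and count them with a threshold detector.
rng = np.random.default_rng(1)
open_current = 100.0                       # pA, hypothetical open-pore level
trace = open_current + rng.normal(0, 1.0, 10_000)

# Insert three blockade events: current drops to ~60 pA for 200 samples.
for start in (2000, 5000, 8000):
    trace[start:start + 200] = 60.0 + rng.normal(0, 1.0, 200)

threshold = 80.0                           # halfway between the two levels
blocked = trace < threshold

# Count rising edges of the blocked mask = number of discrete events.
events = int(np.sum(np.diff(blocked.astype(int)) == 1))
print(events)  # 3 events detected
```

In a real measurement the depth of each blockade carries the size information: the deeper the dip, the larger the molecule occupying the pore.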


The Freiburg research team has now succeeded in conducting such measurements on a specially developed biohybrid microsensor made of biological and micro-technical parts. It contains 16 miniaturized artificial cell membranes on a single square millimeter. The individual membranes are spread over minuscule pits, each with a diameter of approximately two-hundredths of a millimeter -- around one-third of the thickness of a human hair. In their publication, the authors demonstrate that the chip can be used to obtain distributions of polymer sizes that are accurate to a single chain element. Currently, results of such precision require expensive equipment that fills entire rooms. The project is a good example of successful collaboration between widely differing disciplines.


The above story is reprinted (with editorial adaptations) from materials provided by Albert-Ludwigs-Universität Freiburg, via AlphaGalileo.

New membrane lipid measuring technique may help fight disease

Could controlling cell-membrane fat play a key role in turning off disease? Researchers at the University of Illinois at Chicago think so, and a biosensor they've created that measures membrane lipid levels may open up new pathways to disease treatment. Wonhwa Cho, distinguished professor of chemistry, and his coworkers engineered a way to modify proteins to fluoresce and act as sensors for lipid levels.


Their findings are reported in Nature Chemistry, online on Oct. 9.


"Lipid molecules on cell membranes can act as switches that turn on or off protein-protein interactions affecting all cellular processes, including those associated with disease," says Cho. "While the exact mechanism is still unknown, our hypothesis is that lipid molecules serve sort of like a sliding switch."


Cho said once lipid concentrations reach a certain threshold, they trigger reactions, including disease-fighting immune responses. Quantifying membrane lipid concentration in a living cell, and tracking its location in real time, can provide a powerful tool for understanding and developing new ways to combat a range of maladies, from inflammation, cancer and diabetes to metabolic diseases.


"It's not just the presence of lipid, but the number of lipid molecules that are important for turning on and off biological activity," said Cho.


While visualizing lipid molecules with fluorescent proteins isn't new, Cho's technique allows quantification by using a hybrid protein molecule that fluoresces only when it binds specific lipids. His lab worked with a lipid known as PIP2 -- an important fat molecule involved in many cellular processes. Cho's sensor binds to PIP2 and gives a clear signal that can be quantified through a fluorescent microscope.


The result is the first successful quantification of membrane lipids in a living cell in real time.


"We had to engineer the protein in such a way as to make it very stable, behave well, and specifically recognize a particular lipid," Cho said. He has been working on the technique for about a decade, overcoming the last technical obstacles only about three years ago.


Cho now hopes to create a tool kit of biosensors to quantify most, if not all, lipids.


"We'd like to be able to measure multiple lipids, simultaneously," he said. "It would give us a snapshot of all the processes being regulated by the different lipids inside a cell."


Other authors on the paper are postdoctoral researcher Youngdae Yoon, who developed the sensor; Park J. Lee, a doctoral student who developed microscope tools to enable the lipid quantification; and doctoral student Svetlana Kurilova, who worked on the protein cell delivery.


The above story is reprinted (with editorial adaptations) from materials provided by University of Illinois at Chicago.

Journal Reference:

Youngdae Yoon, Park J. Lee, Svetlana Kurilova, Wonhwa Cho. In situ quantitative imaging of cellular lipids using molecular sensors. Nature Chemistry, 2011; DOI: 10.1038/nchem.1163

Ionic liquid catalyst helps turn emissions into fuel

 An Illinois research team has succeeded in overcoming one major obstacle to a promising technology that simultaneously reduces atmospheric carbon dioxide and produces fuel.


University of Illinois chemical and biological engineering professor Paul Kenis and his research group joined forces with researchers at Dioxide Materials, a startup company, to produce a catalyst that improves artificial photosynthesis. The company, in the university Research Park, was founded by retired chemical engineering professor Richard Masel. The team reported their results in the journal Science.


Artificial photosynthesis is the process of converting carbon dioxide gas into useful carbon-based chemicals, most notably fuel or other compounds usually derived from petroleum, as an alternative to extracting them from biomass.


In plants, photosynthesis uses solar energy to convert carbon dioxide (CO2) and water to sugars and other hydrocarbons. Biofuels are refined from sugars extracted from crops such as corn. However, in artificial photosynthesis, an electrochemical cell uses energy from a solar collector or a wind turbine to convert CO2 to simple carbon fuels such as formic acid or methanol, which are further refined to make ethanol and other fuels.


"The key advantage is that there is no competition with the food supply," said Masel, a co-principal investigator of the paper and CEO of Dioxide Materials, "and it is a lot cheaper to transmit electricity than it is to ship biomass to a refinery."


However, one big hurdle has kept artificial photosynthesis from vaulting into the mainstream: The first step to making fuel, turning carbon dioxide into carbon monoxide, is too energy intensive. It requires so much electricity to drive this first reaction that more energy is used to produce the fuel than can be stored in the fuel.


The Illinois group used a novel approach involving an ionic liquid to catalyze the reaction, greatly reducing the energy required to drive the process. The ionic liquids stabilize the intermediates in the reaction so that less electricity is needed to complete the conversion.


The researchers used an electrochemical cell as a flow reactor, separating the gaseous CO2 input and oxygen output from the liquid electrolyte catalyst with gas-diffusion electrodes. The cell design allowed the researchers to fine-tune the composition of the electrolyte stream to improve reaction kinetics, including adding ionic liquids as a co-catalyst.


"It lowers the overpotential for CO2 reduction tremendously," said Kenis, who is also a professor of mechanical science and engineering and affiliated with the Beckman Institute for Advanced Science and Technology. "Therefore, a much lower potential has to be applied. Applying a much lower potential corresponds to consuming less energy to drive the process."
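The link between overpotential and energy cost is simple arithmetic. The back-of-the-envelope sketch below uses the standard two-electron CO2-to-CO half reaction and an approximate 1.33 V equilibrium cell voltage; the two overpotential values compared are illustrative, not the paper's measured figures.

```python
# Back-of-the-envelope energetics of electrochemical CO2 -> CO conversion,
# showing why overpotential matters. Constants are standard textbook
# values; the overpotentials are illustrative.
F = 96485.0          # Faraday constant, C/mol
n = 2                # electrons transferred per CO2 -> CO conversion
E_eq = 1.33          # V, approximate equilibrium cell voltage for CO2/CO

def energy_per_mol_co(overpotential):
    """Electrical energy input (kJ/mol CO) at a given overpotential (V)."""
    return n * F * (E_eq + overpotential) / 1000.0

# High overpotential (conventional catalyst) vs. reduced overpotential
# (ionic-liquid co-catalyst): the energy input drops substantially.
for eta in (1.0, 0.2):
    print(f"overpotential {eta:.1f} V -> {energy_per_mol_co(eta):.0f} kJ/mol")
```

Whenever the total electrical input exceeds the chemical energy stored in the product, the process is a net energy loss; lowering the overpotential is what moves the balance toward break-even.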


Next, the researchers hope to tackle the problem of throughput. To make their technology useful for commercial applications, they need to speed up the reaction and maximize conversion.


"More work is needed, but this research brings us a significant step closer to reducing our dependence on fossil fuels while simultaneously reducing CO2 emissions that are linked to unwanted climate change," Kenis said.


Graduate students Brian Rosen, Michael Thorson, Wei Zhu and Devin Whipple and postdoctoral researcher Amin Salehi-Khojin were co-authors of the paper. The U.S. Department of Energy supported this work.


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by University of Illinois at Urbana-Champaign.