Wednesday, May 4, 2011

Hydrogen fuel tech gets boost from low-cost, efficient catalyst

 

Image caption: The team optimized a photo-electrochemical water-splitting device by designing light absorbers made of silicon arranged in closely packed pillars, imaged with a scanning electron microscope. After dotting the pillars with tiny clusters of the new catalyst and exposing them to light, the researchers watched as hydrogen gas bubbled up -- as quickly as if they'd used costly platinum. Credit: Christian D. Damsgaard, Thomas Pedersen and Ole Hansen, Technical University of Denmark.


Scientists have engineered a cheap, abundant alternative to the expensive platinum catalyst and coupled it with a light-absorbing electrode to make hydrogen fuel from sunlight and water.


The discovery is an important development in the worldwide effort to mimic the way plants make fuel from sunlight, a key step in creating a hydrogen economy. It was reported last week in Nature Materials by theorist Jens Norskov of the Department of Energy's SLAC National Accelerator Laboratory and Stanford University and a team of colleagues led by Ib Chorkendorff and Soren Dahl at the Technical University of Denmark (DTU).


Hydrogen is an energy-dense and clean fuel, which upon combustion releases only water. Today, most hydrogen is produced from natural gas, a process that results in large CO2 emissions. An alternative, clean method is to make hydrogen from sunlight and water. The process is called photo-electrochemical, or PEC, water splitting. When sunlight hits the PEC cell, the solar energy is absorbed and used to split water into its components, hydrogen and oxygen.
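
For reference, the underlying chemistry is the textbook pair of water-splitting half-reactions (standard electrochemistry, not specific to this particular device):

\[
\begin{aligned}
\text{cathode (hydrogen evolution):}\quad & 2\,\mathrm{H^{+}} + 2e^{-} \rightarrow \mathrm{H_{2}} \\
\text{anode (oxygen evolution):}\quad & 2\,\mathrm{H_{2}O} \rightarrow \mathrm{O_{2}} + 4\,\mathrm{H^{+}} + 4e^{-} \\
\text{overall:}\quad & 2\,\mathrm{H_{2}O} \rightarrow 2\,\mathrm{H_{2}} + \mathrm{O_{2}}
\end{aligned}
\]

Driving the overall reaction requires a minimum potential of 1.23 volts under standard conditions, which is the energy the absorbed sunlight must supply.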


Progress has so far been limited in part by a lack of cheap catalysts that can speed up the generation of hydrogen and oxygen. A vital part of the American-Danish effort was combining theory and advanced computation with synthesis and testing to accelerate the process of identifying new catalysts. This is a new development in a field that has historically relied on trial and error. "If we can find new ways of rationally designing catalysts, we can speed up the development of new catalytic materials enormously," Norskov said.
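
Norskov's rational-design approach is often summarized by a single computed descriptor: a good hydrogen-evolution catalyst should bind hydrogen neither too strongly nor too weakly, so the calculated hydrogen adsorption free energy should sit close to zero (the Sabatier principle). The Python sketch below illustrates that screening idea only; it is not the team's code, and the candidate names and numbers are hypothetical placeholders for values that would come from density functional theory calculations.

```python
# Illustrative descriptor-based screening for hydrogen-evolution catalysts.
# Delta_G_H values are hypothetical placeholders, not published results;
# in practice they would be computed with density functional theory (DFT).

candidates = {
    "Pt(111)":        -0.09,  # noble-metal reference
    "MoS2 edge site":  0.08,
    "Ni(111)":        -0.27,
    "Au(111)":         0.45,
}

def screen(delta_g_by_material, tolerance=0.15):
    """Rank materials by |Delta_G_H|; values near 0 eV are predicted
    to be good hydrogen-evolution catalysts (Sabatier principle)."""
    ranked = sorted(delta_g_by_material.items(), key=lambda kv: abs(kv[1]))
    return [(name, dg) for name, dg in ranked if abs(dg) <= tolerance]

if __name__ == "__main__":
    for name, dg in screen(candidates):
        print(f"{name:15s} Delta_G_H = {dg:+.2f} eV (near-optimal)")
```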


The team first tackled the hydrogen half of the problem. The DTU researchers created a device to harvest the energy from part of the solar spectrum and used it to power the conversion of single hydrogen ions into hydrogen gas. However, the process requires a catalyst to facilitate the reaction. Platinum is an efficient catalyst for this reaction, but it is too rare and too expensive for widespread use. So the collaborators turned to nature for inspiration.


They investigated hydrogen-producing enzymes—natural catalysts—from certain organisms, using a theoretical approach Norskov's group has been developing to describe catalyst behavior. "We did the calculations," Norskov explained, "and found out why these enzymes work as well as they do." These studies led them to related compounds, which eventually took them to molybdenum sulfide. "Molybdenum is an inexpensive solution" for catalyzing hydrogen production, Chorkendorff said.


The team also optimized parts of the device, introducing a "chemical solar cell" designed to capture as much solar energy as possible. The experimental researchers at DTU designed light absorbers that consist of silicon arranged in closely packed pillars, and dotted the pillars with tiny clusters of the molybdenum sulfide. When they exposed the pillars to light, hydrogen gas bubbled up—as quickly as if they'd used costly platinum.


The hydrogen gas-generating device is only half of a full photo-electrochemical cell. The other half of the PEC would generate oxygen gas from the water; though hydrogen gas is the goal, without the simultaneous generation of oxygen, the whole PEC cell shuts down. Many groups—including Chorkendorff, Dahl and Norskov and their colleagues—are working on finding catalysts and sunlight absorbers to do this well. "This is the most difficult half of the problem, and we are attacking this in the same way as we attacked the hydrogen side," Dahl said.


Norskov looks forward to solving that problem as well. "A sustainable energy choice that no one can afford is not sustainable at all," he said. "I hope this approach will enable us to choose a truly sustainable fuel."


Provided by SLAC National Accelerator Laboratory

Explaining the behavior of latest high-temp superconductors

A Rice University-led team of physicists this week offered up one of the first theoretical explanations of how two dissimilar types of high-temperature superconductors behave in similar ways.


The research appears online this week in the journal Physical Review Letters. It describes how the magnetic properties of electrons in two dissimilar families of iron-based materials called "pnictides" (pronounced: NICK-tides) could give rise to superconductivity. One of the parent families of pnictides is a metal and was discovered in 2008; the other is an insulator and was discovered in late 2010. Experiments have shown that each material, if prepared in a particular way, can become a superconductor at roughly the same temperature. This has left theoretical physicists scrambling to determine what might account for the similar behavior between such different compounds.


Rice physicist Qimiao Si, the lead researcher on the new paper, said the explanation is tied to subtle differences in the way iron atoms are arranged in each material. The pnictides are laminates that contain layers of iron separated by layers of other compounds. In the newest family of insulating materials, Chinese scientists found a way to selectively remove iron atoms and leave an orderly pattern of "vacancies" in the iron layers.


Si, who learned about the discovery of the new insulating compounds during a visit to China in late December, suspected that the explanation for the similar behavior between the new and old compounds could lie in the collective way that electrons behave in each as they are cooled to the point of superconductivity. His prior work had shown that the arrangement of the iron atoms in the older materials could give rise to collective behavior of the magnetic moments, or "spins," of electrons. These collective behaviors, or "quasi-localizations," have been linked to high-temperature superconductivity in both pnictides and other high-temperature superconductors.


"The reason we got there first is we were in a position to really quickly incorporate the effect of vacancies in our model," Si said. "Intuitively, on my flight back (from China last Christmas), I was thinking through the calculations we should begin doing."


Si conducted the calculations and analyses with co-authors Rong Yu, postdoctoral research associate at Rice, and Jian-Xin Zhu, staff scientist at Los Alamos National Laboratory.


"We found that ordered vacancies enhance the tendency of the electrons to lock themselves some distance away from their neighbors in a pattern that physicists call 'Mott localization,' which gives rise to an insulating state," Yu said. "This is an entirely new route toward Mott localization."


By showing that merely creating ordered vacancies can prevent the material from being an electrical conductor like its relatives, the researchers concluded that even the metallic parents of the iron pnictides are close to Mott localization.
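
As a schematic illustration only (not the authors' specific model), Mott localization is commonly discussed in terms of a Hubbard-type Hamiltonian, in which electrons localize when the on-site repulsion U becomes large compared with the bandwidth W set by the hopping amplitudes; ordered vacancies effectively reduce W and so push the ratio U/W toward the insulating side:

\[
H = -\sum_{\langle i,j \rangle, \sigma} t_{ij}\, c^{\dagger}_{i\sigma} c_{j\sigma} \;+\; U \sum_{i} n_{i\uparrow} n_{i\downarrow}
\]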


"What we are learning by comparing the new materials with the older ones is that these quasi-localized spins and the interactions among them are crucial for superconductivity, and that's a lesson that can be potentially applied to tell experimentalists what is good for raising the transition temperature in new families of compounds," Zhu said.


Superconductivity occurs when electrons pair up and flow freely through a material without any loss of energy due to resistance. This most often occurs at extremely low temperatures, but compounds like the pnictides and others become superconductors at higher temperatures -- close to or above the temperature of liquid nitrogen -- which creates the possibility that they could be used on an industrial scale. One impediment to their broader use has been the struggle to precisely explain what causes them to become superconductors in the first place. The race to find that explanation has been called the biggest mystery in modern physics.


"The new superconductors are arguably the most important iron-based materials that have been discovered since the initial discovery of iron pnictide high-temperature superconductors in 2008," Si said. "Our theoretical results provide a natural link between the new and old iron-based superconductors, thereby suggesting a universal origin of the superconductivity in these materials."


The research was funded by the National Science Foundation, the Robert A. Welch Foundation and the Department of Energy. It was facilitated by the International Collaborative Center on Quantum Matter, a collaborative entity Rice formed with partner institutions from China, Germany and the United Kingdom.


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by Rice University.

Journal Reference:

Rong Yu, Jian-Xin Zhu, Qimiao Si. Mott Transition in Modulated Lattices and Parent Insulator of (K,Tl)_{y}Fe_{x}Se_{2} Superconductors. Physical Review Letters, 2011; 106 (18) DOI: 10.1103/PhysRevLett.106.186401

New material could improve safety for first responders to chemical hazards

 A new kind of sensor could warn emergency workers when carbon filters in the respirators they wear to avoid inhaling toxic fumes have become dangerously saturated.


In a recent issue of the journal Advanced Materials, a team of researchers from the University of California, San Diego and Tyco Electronics describe how they made the carbon nanostructures and demonstrate their potential use as microsensors for volatile organic compounds.


First responders protect themselves from such vapors, whose composition is often unknown, by breathing through a canister filled with activated charcoal -- a gas mask. Airborne toxins stick to the carbon in the filter, trapping the dangerous materials.


As the filters become saturated, chemicals will begin to pass through. The respirator can then do more harm than good by providing an illusion of safety. But there is no easy way to determine when the filter is spent. Current safety protocols base the timing of filter changes on how long the user has worn the mask.


"The new sensors would provide a more accurate reading of how much material the carbon in the filters has actually absorbed," said team leader Michael Sailor, professor of chemistry and biochemistry and bioengineering at UC San Diego. "Because these carbon nanofibers have the same chemical properties as the activated charcoal used in respirators, they have a similar ability to absorb organic pollutants."


Sailor's team assembled the nanofibers into repeating structures called photonic crystals that reflect specific wavelengths, or colors, of light. The wing scales of the Morpho butterfly, which give the insect its brilliant iridescent coloration, are natural examples of this kind of structure.


The sensors are an iridescent color too, rather than black like ordinary carbon. That color changes when the fibers absorb toxins -- a visible indication of their capacity for absorbing additional chemicals.
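
The color readout follows from standard thin-film optics rather than anything exotic: a structure whose refractive index varies periodically reflects most strongly at a wavelength set by its period and effective index, so when organic vapors adsorb in the porous carbon the effective index rises and the reflected color shifts. To first order, at normal incidence,

\[
\lambda_{\max} \approx 2\, n_{\mathrm{eff}}\, d, \qquad \Delta\lambda_{\max} \approx 2\, d\, \Delta n_{\mathrm{eff}},
\]

where d is the period of the photonic crystal and n_eff its effective refractive index.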


The agency that certifies respirators in the U.S., the National Institute for Occupational Safety and Health, has long sought such a sensor, but the design requirements -- a tiny, sensitive, inexpensive device that requires little power -- have proved difficult to meet.


The materials that the team fabricated are very thin -- less than half the width of a human hair. Sailor's group has previously placed similar photonic sensors on the tips of optical fibers less than a millimeter across and shown that they can be inserted into respirator cartridges. And the crystals are sensitive enough to detect chemicals such as toluene at concentrations as low as one part per million.


Ting Gao, a senior researcher at the Polymers, Ceramics, and Technical Services Laboratories of Tyco Electronics in Menlo Park, California, and Timothy L. Kelly, an NSERC postdoctoral fellow at UC San Diego, co-authored the paper. The National Science Foundation, the Department of Homeland Security, the Natural Sciences and Engineering Research Council of Canada, and Tyco Electronics provided funding for the work.


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by University of California - San Diego. The original article was written by Susan Brown.

Journal Reference:

Timothy L. Kelly, Ting Gao, Michael J. Sailor. Carbon Nanofiber Photonic Crystals: Carbon and Carbon/Silicon Composites Templated in Rugate Filters for the Adsorption and Detection of Organic Vapors (Adv. Mater. 15/2011). Advanced Materials, 2011; 23 (15): 1688 DOI: 10.1002/adma.201190052

Several baffling puzzles in protein molecular structure solved with new method

The structures of many protein molecules remain unsolved even after experts apply an extensive array of approaches. An international collaboration has led to a new, high-performance method that rapidly determined the structure of protein molecules in several cases where previous methods had failed.


The usefulness of the new method is reported May 1 in Nature advanced online publication. The lead authors are Dr. Frank DiMaio of the University of Washington (UW) in Seattle and Dr. Thomas C. Terwilliger of Los Alamos National Laboratory in New Mexico. The senior author is Dr. David Baker, of the UW Department of Biochemistry.


A protein's molecular structure shapes its functions. In biomedical and health research, for example, scientists are interested in the molecular structure of specific proteins for many reasons, a few of which are:

To design drugs that selectively target, at the molecular level, particular biochemical reactions in the body
To understand abnormal human proteins in disease, such as those found in cancer and neurodegenerative disorders like Alzheimer's, and how these abnormal proteins cause malfunctions
To learn the shape and function of virus particles and how they act to cause infections
To see how the chains of amino acids, decoded from the DNA in genes, fold and twist into normally or abnormally shaped protein molecules
To design new proteins not found in the natural world, such as enzymes to speed up a slow biochemical reaction
To find ways to replace malfunctioning molecular parts of proteins that are critical to health
To devise nano-scale tools, such as molecular motors

"The important new method described this week in Nature highlights the value of computational modeling in helping scientists to determine the structures and functions of molecules that are difficult to study using current techniques," said Dr. Peter Preusch, who oversees Baker's research grant and other structural biology grants at the National Institutes of Health (NIH). "Expanding the repertoire of known protein structures -- a key goal of the NIH Protein Structure Initiative, which helped fund the research -- will be of great benefit to scientists striving to design new therapeutic agents to treat disease."


The methods devised by the group overcome some of the limitations of X-ray crystallography in determining the molecular structure of a protein. X-ray crystallography obtains information about the positions of atoms, chemical bonds, the density of electrons and other arrangements within a protein molecule.


The information is gleaned by striking protein crystals with X-ray beams. The beams bounce off in several directions.


Measuring the angles and intensities of these diffracted beams enables scientists to produce a three-dimensional image of electron density. However, part of the information needed to reconstruct the molecular structure -- the phases of the diffracted beams -- cannot be captured in these measurements, a constraint imposed by the physics of the experiment.
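
In standard crystallographic notation, the electron density is recovered from the diffraction data by a Fourier synthesis; the experiment measures the structure-factor amplitudes |F_hkl| but not their phases, and it is this phase information that goes missing:

\[
\rho(x, y, z) = \frac{1}{V} \sum_{hkl} \lvert F_{hkl} \rvert \, e^{\, i\varphi_{hkl}} \, e^{-2\pi i (hx + ky + lz)}
\]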


Scientists attempt to sidestep this problem by comparing the crystallography results to previously solved protein structures that resemble the unknown structure. The technique to "fill in the blanks" is called molecular replacement.


Molecular replacement has its own limitations in interpreting the electron density maps produced by X-ray crystallography, according to the authors of the paper. Techniques such as automatic chain tracing often follow the comparative model more closely than the actual structure of the protein under question. These mistakes lead to failure to obtain an accurate configuration of the molecule.


The researchers showed that this limitation can be substantially reduced by combining computer algorithms for protein structure modeling with those for determining structure via X-ray crystallography.


Several years ago, University of Washington researchers and their colleagues developed a structure prediction method called Rosetta. This program takes a chain of amino acids -- protein-building blocks strung all in a row -- and searches for the lowest energy conformation possible from folding, twisting and packing the chain into a three-dimensional (3-D) molecule.


The researchers found that even very poor electron density maps from molecular replacement solutions could be useful: they can guide Rosetta structure-prediction searches based on energy optimization. The energy-optimized models are then checked for consistency with the measured X-ray crystallography data, and that comparison generates new, improved maps. The new maps are subjected to automatic chain tracing to produce 3-D models of the protein structure, and the resulting models are evaluated with a monitoring procedure to see whether any are successful.
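
The overall procedure is an iterative loop. The Python sketch below is only a schematic outline of that loop as described in the article; the function names are hypothetical placeholders and do not correspond to Rosetta's or any crystallography package's real API.

```python
# Schematic outline of the density- and energy-guided rebuilding loop.
# All functions are stubs with hypothetical names, standing in for the
# real molecular-replacement, Rosetta, and chain-tracing steps.

def molecular_replacement(xray_data, template):
    """Produce an initial (possibly very poor) electron density map."""
    return {"map": "initial"}                       # stub

def rosetta_rebuild(model, density_map):
    """Energy-guided rebuilding of the model, biased toward the map."""
    return {"model": "rebuilt"}                     # stub

def recompute_map(model, xray_data):
    """Combine the rebuilt model with measured amplitudes for a new map."""
    return {"map": "improved"}                      # stub

def autotrace_and_score(density_map):
    """Automatic chain tracing plus a model-quality score (lower = better)."""
    return {"model": "traced", "score": 0.32}       # stub

def solve(xray_data, template, n_cycles=3, accept_below=0.40):
    density = molecular_replacement(xray_data, template)
    model = {"model": template}
    for _ in range(n_cycles):
        model = rosetta_rebuild(model, density)     # energy optimization
        density = recompute_map(model, xray_data)   # map improves with the model
        result = autotrace_and_score(density)
        if result["score"] < accept_below:          # good enough -> accept
            return result
    return None                                     # unsolved after n_cycles

if __name__ == "__main__":
    print(solve(xray_data="measured amplitudes", template="homolog model"))
```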


To test the performance of their new integrated method, the researchers looked at 13 sets of X-ray crystallography data on molecules whose structures could not be solved by expert crystallographers. These structures remained unsolved even after the application of an extensive array of other approaches. The new integrated method was able to yield high resolution structures for 8 of these 13 highly challenging models.


"The results show that structural prediction methods such as Rosetta can be even more powerful when combined with X-ray crystallography data," the researchers noted. They added that the integrated approach probably outperforms others because it provides physical chemistry and protein structural information that can guide the massive sampling of candidate configurations. This information eliminates most conformations that are not physically possible.


The procedures, the authors noted, required considerable computation, as up to several thousand Rosetta model predictions are generated for each structure. The researchers have developed automated procedures that could potentially narrow down the possibilities and reduce the number of times a model must be rebuilt to make corrections, which would cut computing time.


Through Baker's laboratory, many members of the general public contribute their unused home computer time to help in the effort to obtain structural models of proteins that are biologically and medically significant. The associated scientific discovery game is called Foldit (http://fold.it/portal/).


Other authors of the paper appearing this week in Nature are Dr. Randy J. Read of the University of Cambridge, United Kingdom; Dr. Alexander Wlodawer of the National Cancer Institute; Drs. Gustav Oberdorfer and Ulrike Wagner of University of Graz, Austria; Dr. Eugene Valkov of the University of Cambridge; Drs. Assaf Alon and Deborah Fass of the Weizmann Institute of Science in Rehovot, Israel; Drs. Herbert L. Axelrod and Debanu Das of the SLAC National Accelerator Laboratory in Menlo Park, Calif.; Dr. Sergey M. Vorobiev of Columbia University in New York; and Dr. Hideo Iwai of the University of Helsinki, Finland.


The research was funded by the National Institute of General Medical Sciences and the National Center for Research Resources at the National Institutes of Health, the Howard Hughes Medical Institute, the Israel Science Foundation, DK Molecular Enzymology, Austrian Science Fund, the Center for Cancer Research at the National Cancer Institute, the Academy of Finland, and the U.S. Department of Energy's Office of Science, Biological and Environmental Research. The Joint Center for Structural Genomics, which is supported by the NIH's Protein Structure Initiative, contributed to the protein production and structural work.


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by University of Washington, via EurekAlert!, a service of AAAS.

Journal Reference:

Frank DiMaio, Thomas C. Terwilliger, Randy J. Read, Alexander Wlodawer, Gustav Oberdorfer, Ulrike Wagner, Eugene Valkov, Assaf Alon, Deborah Fass, Herbert L. Axelrod, Debanu Das, Sergey M. Vorobiev, Hideo Iwai, P. Raj Pokkuluri, David Baker. Improved molecular replacement by density- and energy-guided protein structure optimization. Nature, 2011; DOI: 10.1038/nature09964

Single atom stores quantum information

 A data memory can hardly be any smaller: researchers working with Gerhard Rempe at the Max Planck Institute of Quantum Optics in Garching have stored quantum information in a single atom. The researchers wrote the quantum state of single photons, i.e. particles of light, into a rubidium atom and read it out again after a certain storage time. This technique can be used in principle to design powerful quantum computers and to network them with each other across large distances.


Quantum computers will one day be able to handle, in almost no time, computational tasks that would take current computers years. They will draw their enormous computing power from their ability to simultaneously process the many pieces of information stored in the quantum states of microscopic physical systems, such as single atoms and photons. To operate, quantum computers must exchange these pieces of information between their individual components. Photons are particularly suitable for this, as no matter needs to be transported with them; particles of matter, however, will be used for information storage and processing. Researchers are therefore looking for methods whereby quantum information can be exchanged between photons and matter. Although this has already been done with ensembles of many thousands of atoms, physicists at the Max Planck Institute of Quantum Optics in Garching have now shown that quantum information can also be exchanged between single atoms and photons in a controlled way.


Using a single atom as a storage unit has several advantages -- the extreme miniaturization being only one, says Holger Specht from the Garching-based Max Planck Institute, who was involved in the experiment. The stored information can be processed by direct manipulation on the atom, which is important for the execution of logical operations in a quantum computer. "In addition, it offers the chance to check whether the quantum information stored in the photon has been successfully written into the atom without destroying the quantum state," says Specht. It is thus possible to ascertain at an early stage that a computing process must be repeated because of a storage error.


Until very recently, no one had succeeded in exchanging quantum information between photons and single atoms because the interaction between the particles of light and the atoms is very weak. Atom and photon do not take much notice of each other, as it were, like two party guests who hardly talk to each other, and can therefore exchange only a little information. The researchers in Garching have enhanced the interaction with a trick. They placed a rubidium atom between the mirrors of an optical resonator, and then used very weak laser pulses to introduce single photons into the resonator. The mirrors of the resonator reflected the photons to and fro several times, which strongly enhanced the interaction between photons and atom. Figuratively speaking, the party guests thus meet more often, and the chance that they talk to each other increases.
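
In cavity quantum electrodynamics, this resonator-assisted enhancement is commonly quantified by the cooperativity, which grows with the square of the atom-photon coupling rate g and shrinks with the cavity decay rate κ and the atomic decay rate γ (the exact prefactor depends on convention):

\[
C \propto \frac{g^{2}}{\kappa \, \gamma}
\]

A cooperativity well above one means the photon is far more likely to interact coherently with the atom than to be lost.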


The photons carried the quantum information in the form of their polarization. This can be left-handed (the direction of rotation of the electric field is anti-clockwise) or right-handed (clock-wise). The quantum state of the photon can contain both polarizations simultaneously as a so-called superposition state. In the interaction with the photon the rubidium atom is usually excited and then loses the excitation again by means of the probabilistic emission of a further photon. The Garching-based researchers did not want this to happen. On the contrary, the absorption of the photon was to bring the rubidium atom into a definite, stable quantum state. The researchers achieved this with the aid of a further laser beam, the so-called control laser, which they directed onto the rubidium atom at the same time as it interacted with the photon.


The spin orientation of the atom contributes decisively to the stable quantum state generated by control laser and photon. Spin gives the atom a magnetic moment. The stable quantum state, which the researchers use for the storage, is thus determined by the orientation of the magnetic moment. The state is characterized by the fact that it reflects the photon's polarization state: the direction of the magnetic moment corresponds to the rotational direction of the photon's polarization, a mixture of both rotational directions being stored by a corresponding mixture of the magnetic moments.
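
Written schematically (illustrative notation, not the paper's exact level scheme), the storage step maps the photon's polarization superposition onto a superposition of the two stable atomic spin orientations, and the read-out step runs the map in reverse:

\[
\big( \alpha\,\lvert L \rangle + \beta\,\lvert R \rangle \big)_{\text{photon}} \;\longrightarrow\; \alpha\,\lvert \uparrow \rangle_{\text{atom}} + \beta\,\lvert \downarrow \rangle_{\text{atom}}
\]

where |L⟩ and |R⟩ are the left- and right-circularly polarized photon states and |↑⟩, |↓⟩ the two spin states of the rubidium atom.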


This state is read out by the reverse process: irradiating the rubidium atom with the control laser again causes it to re-emit the photon which was originally incident. In the vast majority of cases, the quantum information in the read-out photon agrees with the information originally stored, as the physicists in Garching discovered. The quantity that describes this relationship, the so-called fidelity, was more than 90 percent. This is significantly higher than the 67 percent fidelity that can be achieved with classical methods, i.e. those not based on quantum effects. The method developed in Garching is therefore a real quantum memory.


The physicists measured the storage time, i.e. the time the quantum information in the rubidium can be retained, as around 180 microseconds. "This is comparable with the storage times of all previous quantum memories based on ensembles of atoms," says Stephan Ritter, another researcher involved in the experiment. Nevertheless, a significantly longer storage time is necessary for the method to be used in a quantum computer or a quantum network. There is also a further quality characteristic of the single-atom quantum memory from Garching which could be improved: the so-called efficiency. It is a measure of how many of the irradiated photons are stored and then read out again. This was just under 10 percent.


The storage time is mainly limited by magnetic field fluctuations from the laboratory surroundings, says Ritter. "It can therefore be increased by storing the quantum information in quantum states of the atoms which are insensitive to magnetic fields." The efficiency is limited by the fact that the atom does not sit still in the centre of the resonator, but moves. This causes the strength of the interaction between atom and photon to decrease. The researchers can thus also improve the efficiency: by greater cooling of the atom, i.e. by further reducing its kinetic energy.


The researchers at the Max Planck Institute in Garching now want to work on these two improvements. "If this is successful, the prospects for the single-atom quantum memory would be excellent," says Stephan Ritter. The interface between light and individual atoms would make it possible to network more atoms in a quantum computer with each other than would be possible without such an interface; a fact that would make such a computer more powerful. Moreover, the exchange of photons would make it possible to quantum mechanically entangle atoms across large distances. The entanglement is a kind of quantum mechanical link between particles which is necessary to transport quantum information across large distances. The technique now being developed at the Max Planck Institute of Quantum Optics could some day thus become an essential component of a future "quantum Internet."


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by Max-Planck-Gesellschaft.

Journal Reference:

Holger P. Specht, Christian Nölleke, Andreas Reiserer, Manuel Uphoff, Eden Figueroa, Stephan Ritter, Gerhard Rempe. A single-atom quantum memory. Nature, 2011; DOI: 10.1038/nature09997

Chemist designs new polymer structures for use as 'plastic electronics'

Iowa State University's Malika Jeffries-EL says she's doing structure-property studies so she can teach old polymers new tricks.


Those tricks improve the properties of certain organic polymers that mimic the properties of traditional inorganic semiconductors and could make the polymers very useful in organic solar cells, light-emitting diodes and thin-film transistors.


Conductive polymers date back to the late 1970s when researchers Alan Heeger, Alan MacDiarmid and Hideki Shirakawa discovered that plastics, with certain arrangements of atoms, can conduct electricity. The three were awarded the 2000 Nobel Prize in Chemistry for the discovery.


Jeffries-EL, an Iowa State assistant professor of chemistry, is working with a post-doctoral researcher and nine doctoral students to move the field forward by studying the relationship between polymer structures and the electronic, physical and optical properties of the materials. They're also looking for ways to synthesize the polymers without the use of harsh acids and temperatures by making them soluble in organic solvents.


The building blocks of their work are a variety of benzobisazoles, molecules well suited for electrical applications because they efficiently transport electrons, are stable at high temperatures and can absorb photons.


And if the polymers are lacking in any of those properties, Jeffries-EL and her research group can do some chemical restructuring.


"With these polymers, if you don't have the properties you need, you can go back and change the wheel," Jeffries-EL said. "You can change the chemical synthesis and produce what's missing."


That, she said, doesn't work with silicon and other inorganic materials for semiconductors: "Silicon is silicon. Elements are constant."


The National Science Foundation is supporting Jeffries-EL's polymer research with a five-year, $486,250 Faculty Early Career Development grant. She also has support from the Iowa Power Fund (a state program that supports energy innovation and independence) to apply organic semiconductor technology to solar cells.


The research group is seeing some results, including peer-reviewed papers over the past two years in Physical Chemistry Chemical Physics, Macromolecules, the Journal of Polymer Science Part A: Polymer Chemistry, and the Journal of Organic Chemistry.


"This research is really about fundamental science," Jeffries-EL said. "We're studying the relationships between structure and material properties. Once we have a polymer with a certain set of properties, what can we do?"


She and her research group are turning to the molecules for answers.


"In order to realize the full potential of these materials, they must be engineered at the molecular level, allowing for optimization of materials properties, leading to enhanced performance in a variety of applications," Jeffries-EL wrote in a research summary. "As an organic chemist, my approach to materials begins with small molecules."


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by Iowa State University.

Bacteria have evolved a unique chemical mechanism to become antibiotic-resistant

For the first time, scientists have been able to paint a detailed chemical picture of how a particular strain of bacteria has evolved to become resistant to antibiotics. The research is a key step toward designing compounds to prevent infections by recently evolved, drug-resistant "superbugs" that often are found in hospitals, as well as in the general population.



A paper describing the research, by a team led by Squire Booker, an associate professor in the department of chemistry and the department of biochemistry and molecular biology at Penn State University, will be posted by the journal Science on its early-online site on 28 April. This paper is a continuation of research led by Booker published in another paper in Science earlier this month.


The team began by studying a protein made by a recently evolved "superbug." Booker explained that, several years ago, genetic studies had revealed that Staphylococcus sciuri -- a non-human pathogen -- had evolved a new gene called cfr. The protein created by this gene had been found to play a key role in one of the bacterium's mechanisms of antibiotic resistance. Later, the same gene was found to have crossed over into a strain of Staphylococcus aureus -- a very common kind of bacteria that constitutes part of the flora living in the human nose and on the skin, and which is now the cause of various antibiotic-resistant infections. Because this gene often is found within a mobile DNA element, it can move easily from a non-human pathogen to other species of bacteria that infect humans. "The gene, which has been found in Staphylococcus aureus isolates in the United States, Mexico, Brazil, Spain, Italy, and Ireland, effectively renders the bacteria resistant to seven classes of antibiotics," Booker explained. "Clearly, bacteria with this gene have a distinct advantage. However, until now, the detailed process by which the protein encoded by that gene affected the antibiotic resistance of the bacteria was unclear; that is, we didn't have a clear 3D picture of what was going on at the molecular level."


To solve the chemical mystery of how such bacteria outsmart so many antibiotics, Booker and his team investigated how the Cfr protein accomplishes a task called methylation -- a process by which enzymes add a small molecular tag to a particular location on a nucleotide, a molecule that is the structural unit of RNA and DNA. When this molecular tag is added by a protein called RlmN, it facilitates the proper functioning of the bacterial ribosome -- a gigantic macromolecular machine that is responsible for making proteins that bacteria need to survive. Many classes of antibiotics bind to the ribosome, disrupting its function and thereby killing the bacteria. The Cfr protein performs an identical function as the RlmN protein, but it adds the molecular tag at a different location on the same nucleotide. The addition of the tag blocks binding of antibiotics to the ribosome without disrupting its function.


"What had perplexed scientists is that the locations to which RlmN and Cfr add molecular tags are chemically different from all others to which tags routinely are appended, and should be resistant to modification by standard chemical methods," Booker said. "What we've discovered here is so exciting because it represents a truly new chemical mechanism for methylation. We now have a very clear chemical picture of a very clever mechanism for antibiotic resistance that some have evolved."


Booker also said he believes the next step will be to use this new information to design compounds that could work in conjunction with typical antibiotics. "Because we know the specific mechanism by which bacterial cells evade several classes of antibiotics, we can begin to think about how to disrupt the process so that standard antibiotics can do their jobs," he said.


Provided by Pennsylvania State University