Saturday, August 27, 2011

Salmonella stays deadly with a 'beta' version of cell behavior

Salmonella cells have hijacked the protein-building process to maintain their ability to cause illness, new research suggests.

Scientists say these bacteria have modified what has long been considered typical cell behavior by using a beta form of an amino acid – as opposed to an alpha form – during protein synthesis.

Beta versions of amino acids occur in nature under rare and specific circumstances, but have never been observed as part of protein synthesis. Before this finding, in fact, researchers had determined that virtually all proteins were constructed with the alpha forms of amino acids.

This work has shown that when researchers delete any one of three genes from the process that makes use of the beta form of the amino acid, or if they insert the alpha form in the beta version's place, Salmonella are no longer able to cause disease. The amino acid in question is lysine, one of 22 genetically encoded amino acids that are strung together in cells to make proteins.

"When these genes were knocked out, the cells became sensitive to antibiotics. And if we put beta lysine into the medium where cells were growing, they became resistant to antibiotics," said Michael Ibba, professor of microbiology at Ohio State University and a senior author of the study. "So we could see the beta amino acid being taken up and used. The cells really do need the beta amino acid to be resistant to antibiotics, and for other aspects of their virulence."

This finding suggests that the process using this specific beta amino acid could be an attractive antibiotic target for this common pathogen, the researchers say.

The Centers for Disease Control and Prevention estimates that about 1.4 million people in the United States are infected with Salmonella each year, though only 40,000 cases are reported. Most people infected with Salmonella develop diarrhea, fever and abdominal cramps. Though recovery can occur within a week without treatment, some severe cases require antibiotic treatment and hospitalization.

The study is published in the Aug. 14 online edition of the journal Nature Chemical Biology.

This work began when University of Toronto scientists exploring the origins of Salmonella's virulence identified three genes that were clear players in the process. These three genes – called YjeK, PoxA and EF-P – were unusual in this context.

Genes that confer virulence in bacteria typically have a specific job, such as producing toxins or transporters. But these three virulence genes all looked like they should have a role in the protein synthesis machinery – which is Ibba's expertise.

Under normal circumstances in cells, an enzyme will select amino acids in the cell and place them on a molecule called transfer RNA, or tRNA, which leads to translation of the genetic code into proteins.

In Salmonella cells, these steps are similar, but with a few surprising twists, Ibba said. He and colleagues confirmed that the YjeK gene makes beta lysine, and showed that the PoxA gene takes that beta lysine and attaches it to EF-P – a protein that partially mimics the shape and function of tRNA.

"It's a really unexpected pathway," said Ibba, also an investigator in Ohio State's Center for RNA Biology. "It is a mimic of what normally makes protein in a cell. Where a cell would normally be expected to use an alpha amino acid, Salmonella puts on a beta amino acid. And it ends up making molecules that lead to the cells being virulent."

The research team first reconstructed this unusual protein synthesis process in test tube experiments, and then followed with studies in cell cultures. Even before they took on studying the mechanism, however, they knew that the effects of these virulence genes were powerful: In earlier animal studies, deleting any one of the three genes and then infecting mice with these altered Salmonella cells had no effect on the animals. When the genes were left intact and cells were injected into mice, the resulting Salmonella infection killed the animals.

In addition, when the researchers tricked Salmonella cells into using alpha lysine for this pathway instead of beta lysine, the cells lost their ability to cause illness.

"This tells us the cell is not going to be able to easily replace the beta amino acid," Ibba said. "It is essential for virulence in Salmonella."

And that, he said, is why that amino acid might be such an effective drug target, especially as humans don't seem to make beta amino acids at all. "You have to make an antibiotic look like something natural, only different. If you have something that's already different like a beta amino acid, you've potentially got a much better drug target because it involves chemistry that's comparatively rare in the cell. It's harder for the cell to try to alter its own chemistry to develop resistance," Ibba said.

From here, the researchers are looking further along in the protein-building process to figure out how this hijacked system actually confers virulence.

Provided by The Ohio State University

A systematic way to find battery materials

Lithium-ion batteries have become a leading energy source for everything from smartphones and laptops to power tools and electric cars, and researchers around the world are actively seeking ways to nudge their performance toward ever-higher levels. Now, a new analysis by researchers at MIT and the University of California at Los Angeles (UCLA) has revealed why one widely used compound works particularly well as the material for one of these batteries' two electrodes -- an understanding they say could greatly facilitate the process of searching for even better materials.

Lithium-ion batteries’ energy and power density -- that is, how much electricity they can store for a given weight, and how fast they can deliver that power -- are determined mostly by the material used for the cathode (the positive electrode). When such batteries are being used, lithium atoms are stored within the crystal structure of the cathode; when the battery is being recharged, lithium ions flow back out of it. Many different materials are currently used for these cathodes.

But one of those materials has been a bit of a mystery. Lithium iron phosphate (LiFePO4) performs well as a cathode, but this performance has been hard to explain because, unlike other cathode materials, lithium iron phosphate seemed to require a two-phase process to store lithium — something that should theoretically reduce its efficiency, but for some reason does not.

That anomaly has now been explained. A more detailed analysis showed that, in fact, the compound was following a single-phase process after all, but doing so in an unusual way -- which might point the way to discovery of many other such compounds that had previously been overlooked. The new analysis was carried out by Gerbrand Ceder, the Richard P. Simmons (1953) Professor of Materials Science and Engineering at MIT, his graduate student Rahul Malik, and postdoc Fei Zhou of UCLA, and published in the journal Nature Materials.

According to accepted theory, lithium iron phosphate “should have been a low-rate” cathode material, Ceder says — meaning that it could produce electricity only at a very low current, suitable for use with very-low-power devices. Instead, “it has become one of the highest-rate materials in use,” something that “always puzzled us,” he says.

Most cathode materials are porous, absorbing lithium ions during charging like water going into a sponge. But it was thought that lithium iron phosphate required a two-phase process, first forming one compound, which then morphed into a final, stable compound. The extra step was expected to add complexity and reduce the reaction’s speed.

But the new experiments, which were able to probe the activity of the material as it absorbed the lithium, found that even though the material ends up reaching an equilibrium where it has two separate phases, in operation it actually undergoes a single-phase process. “The way it actually absorbs lithium is not two-phase,” Ceder says, “but it separates into two phases when it’s done.”

That makes it much more similar to conventional single-phase materials than had been thought -- and means that it makes sense to look at a wide range of other candidate materials that had been ignored because they were also assumed to require a two-phase process. This analysis makes it possible to “understand better which of these two-phase materials will actually work,” Ceder says, opening up thousands of new candidate materials to be studied. “Now we have a way of evaluating which materials may have potential,” Ceder says. “It broadens the possibilities.”

Previously, he says, it had been known that “some two-phase materials do zilch, some do very well,” but nobody knew why. Now it is likely that the ones that work well are actually using a single-phase reaction, as turns out to be the case with lithium iron phosphate. Ceder and his colleagues have been developing computer algorithms that incorporate a wide variety of known properties of materials so that large numbers of candidates can be screened quickly and efficiently to search for very specific combinations of properties needed for a particular application.

Understanding the dynamics of how lithium ions get incorporated into different molecular structures “was the missing piece in the high-throughput screening process,” Ceder says. “Hopefully we’ll be able to do that better now.”

Brent Fultz, professor of materials science and applied physics at the California Institute of Technology, who was not involved in this work, says these findings represent a “significant” step forward in understanding the behavior of this material. 

“Some oddities in the crystal structure” of lithium iron phosphate have been known, he says, “including a solid solution phase that exists at temperatures only a bit above room temperature. What is so interesting about the work from this MIT group is that it shows how the solid solution phase is far from simple.” He says “the authors make a strong case that the solid solution phase plays a bigger role in the performance” of this material than had been expected, and adds that “the work suggests alternative directions to the design of cathodes for batteries.”
This story is republished courtesy of MIT News (http://web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

Provided by Massachusetts Institute of Technology

ECHA urges registrants to verify compliance with data sharing and joint submission obligations

The Agency has noted that for some substances companies have submitted registrations possibly without taking part in a single SIEF. Registrants who find themselves in this situation are strongly encouraged to verify their compliance with data sharing and joint submission obligations. Registrants may approach the ECHA Helpdesk to obtain the contact details of other registrants for their substance.


Under the REACH Regulation, companies have to share existing information on the same substance and submit one joint dossier. Compliance with these obligations has to take place before registration dossiers are submitted to ECHA, and is facilitated through the SIEF and inquiry processes.


However, in some cases the pre-SIEF and SIEF formation process may not have worked as expected, and multiple separate (joint) registrations for the same substance have been submitted. This can be seen from duplicate entries in ECHA's dissemination portal. As registrants in these situations may be found in breach of the REACH Regulation, ECHA strongly encourages the registrants concerned to verify compliance with their data sharing and joint submission obligations and remedy any deficiencies as soon as possible.


To support registrants in ensuring compliance, ECHA now facilitates contact between existing registrants in the event that the contact details cannot be found via the normal channels, i.e. the (pre-)SIEF or inquiry process, as appropriate. If companies become aware that other separate (joint) registrations exist for their substance, they can contact the ECHA Helpdesk to obtain the contact details of these other registrants, subject to the verification of certain information. Note that this process is only recommended to registrants who become aware of other registrations of the same substance outside of their SIEF and/or joint submission (e.g. via the dissemination portal).


This new service is open to all companies that have submitted a registration for their substance(s) to ECHA and received a registration number. In order for a request to be processed, it is mandatory to indicate the registration number and the related submission number. ECHA will then provide the requester with the contact details of other registrants of the same substance as the requester via REACH-IT. In turn, these other registrants will be informed of the identity of the requester and the fact that their contact details have been passed on. If a third party representative has been appointed, ECHA will only disclose the contact details of that representative.


 

Searching for spin liquids: Much-sought exotic quantum state of matter can exist

The world economy is becoming ever more reliant on high-tech electronics such as computers featuring fingernail-sized microprocessors crammed with billions of transistors. For progress to continue -- for Moore's Law, according to which the number of components packed onto microchips doubles every two years even as their size and cost halve, to continue -- new materials and new phenomena need to be discovered.


Furthermore, as the sizes of electronic components shrink, soon down to the size of single atoms or molecules, quantum interactions become ever more important. Consequently, enhanced knowledge and exploitation of quantum effects is essential. Researchers at the Joint Quantum Institute (JQI) in College Park, Maryland, operated by the University of Maryland and the National Institute of Standards and Technology (NIST), and at Georgetown University have uncovered evidence for a long-sought-after quantum state of matter, a spin liquid.


The research was performed by JQI postdoctoral scientists Christopher Varney and Kai Sun, JQI Fellow Victor Galitski, and Marcos Rigol of Georgetown University. The results appear in an editor-recommended article in the 12 August issue of the journal Physical Review Letters.


You can't pour a spin liquid into a glass. It's not a material at all, at least not a material you can touch. It is more like a kind of magnetic disorder within an ordered array of atoms. Nevertheless, it has many physicists excited.


To understand this exotic state of matter, first consider the concept of spin, which is at the heart of all magnetic phenomena. For instance, a refrigerator magnet, at the microscopic level, consists of trillions of trillions of iron atoms all lined up. Each of these atoms can be thought of loosely as a tiny spinning ball. The orientation of that spin is what makes the atom into a tiny magnet. The refrigerator magnet is an example of a ferromagnet, the ferro part coming from the Latin word for iron. In a ferromagnet, all the atomic spins are lined up in the same way, producing a large cooperative magnetic effect.


Important though they may be, ferromagnets aren't the only kind of material where magnetic interactions between spins are critical. In anti-ferromagnets, for instance, neighboring spins are driven to be anti-aligned. That is, the orientations of the spins alternate up and down (see top picture in figure). The cumulative magnetic effect of all these up and down spins is that the material has no net magnetism. The high-temperature superconducting materials discovered in the 1980s are an important example of an anti-ferromagnetic structure.


More complicated and potentially interesting magnetic arrangements are possible, which may lead to a quantum spin liquid. Imagine an equilateral triangle, with an atom (spin) at each corner. Anti-ferromagnetism in such a geometry would meet with difficulties. Suppose that one spin points up while a second spin points down. So far, so good. But what spin orientation can the third atom take? It can't simultaneously anti-align with both of the other atoms in the triangle. Physicists employ the word "frustration" to describe this baffling condition where all demands cannot be satisfied.
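The frustration described above is easy to verify by brute force. The short Python sketch below (an illustration, not part of the study) enumerates all eight up/down configurations of three spins on a triangle with antiferromagnetic bonds and confirms that at least one bond always ends up "frustrated," i.e. aligned:

```python
from itertools import product

# Antiferromagnetic bonds on a triangle: each pair of corners
# wants its two spins anti-aligned.
bonds = [(0, 1), (1, 2), (2, 0)]

# Count the aligned ("frustrated") bonds in every possible configuration.
frustrated_counts = [
    sum(s[i] == s[j] for i, j in bonds)
    for s in product([+1, -1], repeat=3)
]

# No configuration satisfies all three bonds at once.
print(min(frustrated_counts))  # 1
```

Even the best arrangement leaves one of the three bonds unsatisfied, which is exactly the "baffling condition" physicists call frustration.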


In everyday life frustration is, well, frustrating, and actually this condition is found throughout nature, from magnetism to neural networks. Furthermore, understanding the different manifestations of a collection of magnetically interacting spins might help in designing new types of electronic circuitry.


One compromise that a frustrated spin system makes is to simultaneously exist in many spin orientations. In a quantum system, this simultaneous existence, or superposition, is allowed.


Here's where the JQI researchers have tried something new. They have studied what happens when frustration occurs in materials with a hexagonal (six sided) unit cell lattice.


These atoms interact via their respective spins. The strength of the interaction between nearest-neighbor (NN) atoms is denoted by the parameter J1. Similarly, the interaction between next-nearest neighbors (NNN) -- that is, pairs of atoms that have at least one intervening atom between them -- is denoted by J2. Letting this batch of atoms interact among themselves, even on a model lattice this small, entails an immense calculation. Varney and his colleagues have calculated what happens in an array of hexagons consisting of 30 sites where the spins are free to swing about in a two-dimensional plane (this kind of approach is called an XY model).


Christopher Varney, who has appointments at Maryland and Georgetown, said that the interactions of atoms can be represented by a matrix (essentially a two-dimensional spreadsheet) with 155 million entries on each side. This huge number corresponds to the different spin configurations that can occur on this honeycomb-structured material.


What the researchers found were a "kaleidoscope" of phases, which represent the lowest-energy states that are allowed given the magnetic interactions. Just as water can exist in different phases -- steam, liquid, and ice -- as the temperature is changed, so here a change in the strengths of the interactions among the spins (the J1 and J2 parameters) results in different phases. For example, one simple solution is an antiferromagnet (upper picture in figure).


But one phase turns out to be a true quantum spin liquid having no order at all. When J2 is between about 21% and 36% of the value of J1, frustration coaxes the spins into disorder; the entire sample co-exists in millions of quantum states simultaneously.


It's difficult for the human mind to picture a tiny two-dimensional material in so many states at the same time. JQI Fellow Victor Galitski suggests that one shouldn't think of the spins as residing at the original atomic sites, but rather as free-ranging particle-like entities dubbed "spinons." These spinons bob about, just as water molecules bob about in liquid water (see lower picture in figure). Hence the name quantum spin liquid.


Another reason for using the word liquid, Galitski says, is this 'bobbing about' is analogous to what happens inside a metal. There, the outer electrons of most atoms tend to leave their home atoms and drift through the metal sample as if they constituted a fluid, called a "Fermi liquid."


Electrons in a metal are able to drift since it takes only an infinitesimal amount of energy to put them into motion. The same is true for the fluctuating spins in the hexagonal model studied by the JQI scientists. Indeed, their spin model assumes a temperature of absolute zero, where quantum effects abound.


Writing in an essay that accompanied the article in Physical Review Letters, Tameem Albash and Stephan Haas, scientists at the University of Southern California, say that the JQI/Georgetown team "present a convincing example" of the new spin liquid state.


How can this new frustration calculation be tested? The spin liquid state in a two-dimensional hexagonal lattice, Albash and Haas suggest, "will probably be tested using cold atoms trapped in optical lattices. In the past few years, this technology has become a reliable tool to emulate quantum many body lattice systems with tunable interactions." Indeed, the authors propose such an experiment.


What would such a spin liquid material be good for? It's too early to tell. But some speculations include the idea that these materials could support some exotic kind of superconductivity or would organize particle-like entities that possessed fractional electric charge.


"Kaleidoscope of Exotic Quantum Phases in a Frustrated XY Model" by Christopher N. Varney, Kai Sun, Victor Galitski, and Marcos Rigol, Physical Review Letters, 107, 077201, (12 August 2011).


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by Joint Quantum Institute, University of Maryland.

Journal Reference:

Christopher Varney, Kai Sun, Victor Galitski, Marcos Rigol. Kaleidoscope of Exotic Quantum Phases in a Frustrated XY Model. Physical Review Letters, 2011; 107 (7) DOI: 10.1103/PhysRevLett.107.077201

How receptors talk to G proteins

The mechanism by which cells respond to stimuli and trigger hormonal responses, as well as the senses of sight, smell, and taste, has for the first time been brought into focus with the help of high-brightness x-rays provided by the U.S. Department of Energy Office of Science’s Advanced Photon Source (APS) at Argonne National Laboratory. This breakthrough will pave the way to new research avenues in drug discovery, cell signaling, and cellular regulation.


Receiving a signal, interpreting it, and responding correctly: these are three activities that cells must be good at in order to respond to stimuli. For responding to hormones, and for the senses of sight, smell, and taste, the activated receptors are coupled with G proteins. These G protein-coupled receptors (GPCRs) are involved in a complex set of steps in which an extracellular hormone or neurotransmitter binds to and activates a GPCR in the cell membrane. The activated receptor passes on the signal to a G-protein inside the cell, triggering reactions that ultimately create a cellular response to the stimulus.


The intricacy of this set of reactions has been appreciated—and studied—for some time. What was lacking, however, was a finely detailed picture of the GPCR as it actually signals across the membrane. The importance of acquiring this knowledge and knowing a lot more about how GPCRs function is apparent when we consider that the human genome contains over 800 GPCR genes.


The structure of protein receptors that are involved in cellular responses to external stimuli is now available thanks to experiments carried out at the National Institute of General Medical Sciences and National Cancer Institute Collaborative Access Team (GM/CA-CAT) beamline 23-ID-B at the APS, and the process of cell signaling has come into crystal clear focus. This crystal structure is the first high-resolution look at transmembrane signaling by a GPCR and adds critical insight about signal transduction across the plasma membrane.


The general model for GPCR signaling is that the G-protein is activated by a receptor that has received a stimulus from outside the cell. The research team, comprising members from Stanford University, the University of Copenhagen, the University of Michigan, the University of Wisconsin, Vrije Universiteit Brussel, and Trinity College, studied a specific model system for GPCR signaling that has long been used by biochemists and about which much was already known. In this system, a ß2 adrenergic receptor (ß2AR) is bound by the outside stimulus, known as an agonist, and then activates Gs, the stimulatory G protein for adenylyl cyclase.


The researchers were able to capture the ß2AR bound to the Gs protein before the latter went on to the next step in the dance, which would be binding a nucleotide. One technical challenge, and perhaps the reason that the crystal structure had been so elusive to date, was to create a stable ß2AR-Gs complex in detergent solution. Once this problem was solved, the team could proceed to unveiling the crystal structure of this active-state ß2AR-Gs complex (see the accompanying figure).


The new data allowed the research team to piece together the early stages of GPCR signaling. Combining their own observations with other data collected on this model system, the researchers arrived at a mechanism for the initial stages of complex formation. The data provided a pinpoint determination of exactly which changes in the molecules permitted that binding stage to carry on to completion.


Shape changes in the Gs, particularly in the nucleotide-binding pocket, prepared the protein for the next step. In the ß2AR, the researchers were able to identify major changes in two areas of the molecule that fit well with understanding of how the ß2AR interacts with molecules outside the cell and how the two molecules—ß2AR and Gs—interact with each other.


Article co-author Brian Kobilka, of Stanford University, recalled how the project reached this important milestone: “For the past five years, Roger Sunahara and I have been working together to understand how receptors and G proteins interact with each other. This was the next logical step after getting the first inactive-state structures of the ß2AR alone in 2007.


“The receptor and purification procedures were well established at the time we started the project, but most of the other methods were developed during the project. The minibeam and the rastering capabilities of the GM/CA-CAT beamline at the APS were essential for collecting the diffraction data on these crystals.”


Future research, Kobilka said, involves “how the complex forms and dissociates after activation.”


More information: Soren G. F. et al. “Crystal structure of the ß2 adrenergic receptor–Gs protein complex,” Nature, advance online publication, 19 July 2011. DOI: 10.1038/nature10361


Provided by Argonne National Laboratory