Sunday, March 20, 2011

Combining two peptide inhibitors might block tumor growth

A new study suggests that combining two experimental anticancer peptide agents might simultaneously block formation of new tumor blood vessels while also inhibiting the growth of tumor cells.

This early test of the two agents in an animal model suggests that the double hit can stifle tumor growth, avoid drug resistance and cause few side effects, say researchers at the Ohio State University Comprehensive Cancer Center – Arthur G. James Cancer Hospital and Richard J. Solove Research Institute (OSUCCC – James), who developed the agents and evaluated their effectiveness in laboratory and animal tests.

The scientists designed one of the agents to prevent human epidermal growth factor from interacting with HER-2, a molecule that marks a particularly aggressive form of breast cancer. The other inhibitor blocks the action of vascular endothelial growth factor (VEGF), which stimulates the growth of new blood vessels that tumors need to grow beyond a certain size.

The findings are described in two papers published online in the Journal of Biological Chemistry. One presents the development of a novel VEGF inhibitor; the other describes the HER-2 inhibitor and the preclinical testing of the two agents together.

"When we combined our peptide HER-2 inhibitor with the VEGF peptide that inhibits angiogenesis, we observed significant additive benefits in reducing tumor burdens in preclinical studies," says principal investigator Pravin Kaumaya, professor of obstetrics and gynecology, of molecular and cellular biochemistry, and of microbiology, and director of the division of vaccine development at the OSUCCC – James.

The strategy of targeting both HER-2 and VEGF pathways should also discourage the development of drug resistance, Kaumaya says, because it simultaneously inhibits two pathways that are essential for tumor survival. "Combined peptide inhibitors might be appropriate in several types of cancer to overcome acquired resistance and provide clinical benefit," he adds.

Peptide inhibitors consist of short chains of amino acids (the VEGF inhibitor is 22 amino acids long) that conform in shape to the active site of the target receptor. The shape of the peptide HER-2 inhibitor engineered by Kaumaya and his colleagues, for example, is highly specific for the HER-2 receptor: it physically binds to the receptor, which prevents another substance, epidermal growth factor, from contacting the receptor and stimulating the cancer cells to grow. In addition, Kaumaya engineered the VEGF peptide to resist degradation by proteases, enzymes that break down peptides, thereby increasing its efficacy.

Other categories of targeted drugs in clinical use are humanized monoclonal antibodies and small-molecule tyrosine kinase inhibitors (TKIs). Both groups are associated with severe side effects and are very expensive, Kaumaya says. "We believe peptide inhibitors offer non-toxic, less-expensive alternatives to humanized monoclonal antibodies and small-molecule inhibitors for the treatment of solid tumors, with the potential for improved efficacy and better clinical outcomes," he says.

Provided by Ohio State University Medical Center

Iowa State, Ames Lab researcher hunts for green catalysts

L. Keith Woo is searching for cleaner, greener chemical reactions. Woo, an Iowa State University professor of chemistry and an associate of the U.S. Department of Energy's Ames Laboratory, has studied catalysts and the chemical reactions they affect for more than 25 years. These days, his focus is on green catalysis: the search for catalysts that lead to more efficient chemical reactions. That could mean catalysts that promote reactions at lower pressures and temperatures, that create less waste, or that offer safer, cleaner alternatives to toxic or hazardous conditions, such as using water in place of organic solvents.

"We're trying to design, discover and optimize materials that will produce chemical reactions in a way that the energy barrier is lowered," Woo said. "We're doing fundamental, basic catalytic work."

And much of that work is inspired by biology. In one project, Woo and his research group are studying how iron porphyrins (the heme in the hemoglobin of red blood cells) can be used for various catalytic applications. Iron porphyrins are the active sites in a variety of the enzymes that drive reactions and processes within a cell; most iron porphyrin reactions involve oxidation and electron transfer. Because the iron porphyrins of biology have evolved into highly specialized catalysts, Woo and his research group are studying how they can be used synthetically, with the goal of developing catalysts that influence a broader range of reactions. "We've found porphyrins are capable of doing many reactions – often as well, or better, or cheaper than other catalysts," Woo said.

Another project uses combinatorial techniques to accelerate the development, production and optimization of catalysts. Woo and his research group are using molecular biology to quickly screen a massive library of DNA molecules for catalyst identification and development. The goal is to create water-soluble catalysts for organic reactions. "Combinatorial approaches such as these have been applied to drug design, but their use in transition metal catalyst development is in its infancy," Woo wrote in a summary of the project.

A third project is looking for catalysts that allow greener production of lactams, compounds used in the production of solvents, nylons and other polymers. Commercial lactam production traditionally uses harsh reagents and conditions, such as sulfuric acid and high temperatures, and also creates significant waste. Woo, in collaboration with Robert Angelici, a Distinguished Professor Emeritus of Chemistry, has found a gold-based catalyst that eliminates the need for the acid and high pressure and also eliminates the waste. The Iowa State Research Foundation Inc. is seeking a patent on the technology.

And in a fourth project, Woo is working to understand the chemistry behind the reactions that create bio-oil from the fast pyrolysis of biomass. Fast pyrolysis quickly heats biomass (such as corn stalks and leaves) in the absence of oxygen to produce a liquid bio-oil that can be used to manufacture fuels and chemicals.

Woo's projects are supported by grants from the National Science Foundation, the U.S. Department of Energy, Iowa State's Institute for Physical Research and Technology, Iowa State's Bioeconomy Institute, and the National Science Foundation Engineering Research Center for Biorenewable Chemicals based at Iowa State. Woo's research team includes post-doctoral researcher Wenya Lu and doctoral students B.J. Anding, Taiwo Dairo, Erik Klobukowski and Gina Roberts.

Sit down with Woo and he'll call up slide after slide of the equations that describe chemical reactions, and before long he's explaining how catalysts are discovered these days. "The traditional way to develop catalysts was very Edisonian – one experiment at a time," Woo said. "It was all by trial and error." Now, with high-throughput approaches, Woo said his research group is able to quickly test a reaction using one hundred trillion different catalysts. And that, Woo said, is "helping us find less expensive and more environmentally friendly materials and conditions to perform these catalytic reactions."

Gulf oil spill: Airborne chemistry measurements assess flow rate, fate of spilled gases and oil

NOAA scientists and academic partners have found a way to use air chemistry measurements taken hundreds of feet above last year's BP Deepwater Horizon oil spill to estimate how fast gases and oil were leaking from the reservoir thousands of feet underwater. The scientists also determined the fate of most of those gas and oil compounds using atmospheric chemistry data collected from the NOAA WP-3D research aircraft overflights in June.


The study, accepted for publication in Geophysical Research Letters, a publication of the American Geophysical Union, is available online as a paper in press.


"We present a new method for understanding the fate of most of the spilled gases and oil," said Tom Ryerson, lead author of the report, from NOAA's Earth System Research Laboratory in Boulder, CO. "We found that the spilled gases and oil (spilled fluid) obeyed a simple rule: whether a compound can dissolve or evaporate determines where it goes in the marine environment. That simple rule, and the methods we lay out in this paper, could enable airborne evaluation of the magnitude of future spills."


Knowing where the spilled gas and oil mixture ended up could also help resource managers and others trying to understand environmental exposure levels.


Using the atmospheric measurements and information about the chemical makeup of the leaking reservoir fluid, Ryerson and his colleagues calculated that at least 32,600-47,700 barrels of liquid gases and oil poured out of the breached reservoir on June 10. This range, determined independently of previous estimates, represents a lower limit. "Although we accounted for gases that dissolved before reaching the surface, our atmospheric data are essentially blind to gases and oil that remain trapped deep underwater," Ryerson said. Direct comparison of the new result with official estimates is not possible because this airborne study could not measure that trapped material.


Not including that trapped material, atmospheric measurements combined with reservoir composition information showed that about one-third (by mass) of the oil and gas dissolved into the water column on its way to the surface. The team found another 14 percent by mass (570,000 lbs per day) was lost quickly to the atmosphere within a few hours after surfacing, and an additional 10 percent was lost to the atmosphere over the course of the next 24 to 48 hours.
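As a plausibility check, the reported fractions imply a total surfacing mass that can be back-calculated from the 570,000 lbs/day figure. The sketch below uses only the numbers quoted in this article; the paper's actual mass balance is more detailed, so treat these as rough illustrative values.

```python
# Rough mass-balance check using the article's quoted figures.
ATMOS_FRACTION = 0.14              # fraction lost to air within hours of surfacing
ATMOS_MASS_LBS_PER_DAY = 570_000   # reported mass for that 14 percent

# Implied total mass of gas and oil leaving the reservoir per day.
total_lbs_per_day = ATMOS_MASS_LBS_PER_DAY / ATMOS_FRACTION

dissolved = 1 / 3                  # about one-third dissolved in the water column
later_evaporation = 0.10           # lost to air over the next 24-48 hours

# Fraction not accounted for by these three pathways.
remaining_fraction = 1 - dissolved - ATMOS_FRACTION - later_evaporation

print(round(total_lbs_per_day))      # ~4,071,429 lbs/day implied total
print(round(remaining_fraction, 2))  # ~0.43 of the mass left on or below the surface
```

The back-calculated total of roughly four million pounds per day is consistent in order of magnitude with the tens of thousands of barrels per day reported above.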


Among the study's other key findings:

- Some compounds evaporated essentially completely to the atmosphere, which allowed the scientists to estimate the flow rate based solely on atmospheric measurements and reservoir composition information.
- Airborne instruments picked up no enhanced levels of methane, the lightest hydrocarbon in the leaking reservoir fluid, showing that it dissolved essentially completely in the water column.
- Benzene -- a known human carcinogen -- and ethane were found in only slightly elevated concentrations in the air, meaning they dissolved nearly completely in the water.
- A number of slightly heavier carbon compounds ended up in both the air and water, with the precise fraction depending on the compound. Based on these data, the team inferred different exposure risks of mid- and shallow-water marine species to elevated levels of potentially toxic compounds.

A portion of the oil and gas was "recovered" by response activities and piped from the leaking wellhead to the Discoverer Enterprise drill ship on the ocean surface. The research team calculated this recovered fraction by measuring emissions from natural gas flaring aboard the recovery ship. They calculated a recovery rate of 17,400 barrels of reservoir fluid (liquid gas and oil) for June 10, which accounted for approximately one-third to one-half of the group's total estimate of 32,600-47,700 barrels of fluid per day.
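The "one-third to one-half" characterization follows directly from the quoted numbers, as this quick check (using only the article's figures) shows:

```python
# Check that 17,400 barrels/day recovered is "one-third to one-half"
# of the 32,600-47,700 barrels/day leak estimate quoted in the article.
RECOVERED = 17_400
LOW, HIGH = 32_600, 47_700

frac_of_high = RECOVERED / HIGH  # smallest plausible recovered fraction
frac_of_low = RECOVERED / LOW    # largest plausible recovered fraction

print(round(frac_of_high, 2))  # 0.36 -> roughly one-third
print(round(frac_of_low, 2))   # 0.53 -> roughly one-half
```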


Ryerson and his colleagues concluded that the technique they developed could be applied to future oil spills, whether in shallow or deep water. The Gulf research flights were possible only because a NOAA WP-3D research aircraft had already been outfitted with sensitive chemistry equipment for deployment to California for an air quality and climate study and was redeployed to the Gulf. NOAA's Gulf flights were in support of the Unified Area Command's effort to observe and monitor the environmental effects of the spill.


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by the National Oceanic and Atmospheric Administration.

Journal Reference:

Ryerson, T.B., K.C. Aikin, W.M. Angevine, E.L. Atlas, D.R. Blake, C.A. Brock, F.C. Fehsenfeld, R.-S. Gao, J.A. de Gouw, D.W. Fahey, J.S. Holloway, D.A. Lack, R.A. Lueb, S. Meinardi, A.M. Middlebrook, D.M. Murphy, J.A. Neuman, J.B. Nowak, D. D. Parrish, J. Peischl, A.E. Perring, I.B. Pollack, A. R. Ravishankara, J. M. Roberts, J. P. Schwarz, J. R. Spackman, H. Stark, C. Warneke, L. A. Watts. Atmospheric emissions from the Deepwater Horizon spill constrain air-water partitioning, hydrocarbon fate, and leak rate. Geophysical Research Letters, 2011; DOI: 10.1029/2011GL046726

New laser technique opens doors for drug discovery

 Researchers have demonstrated that a new laser technique can be used to measure the interactions between proteins tangled in a cell's membrane and a variety of other biological molecules. These extremely difficult measurements can aid the process of drug discovery.


Scientists estimate that about 30 percent of the 7,000 proteins in a human cell reside in the cell's membrane, and that these membrane proteins initiate 60 to 70 percent of the signals that control the operation of the cell's molecular machinery. As a result, about half of the drugs currently on the market target membrane proteins.


Despite their importance, membrane proteins are difficult to study. Individual membrane proteins are extremely hard to purify, so scientists have very little structural information about them. In addition, existing methods to measure their activity have serious limitations: most assays remove the proteins from their natural membrane environment or modify them in various ways, such as attaching fluorescent labels, in order to analyze their activity.


"In addition to being expensive and time-consuming, these modifications can affect the target membrane's function in unpredictable ways," said Vanderbilt Professor of Chemistry Darryl Bornhop, who developed the new technique.


By contrast, in an article published online in the journal Nature Biotechnology, Bornhop's research group and their collaborators at The Scripps Research Institute report that the laser-based technique, called backscattering interferometry (BSI), can precisely measure the binding affinity between membrane proteins and both large and small molecules in a natural environment.


"This is a powerful tool and a major advance in measuring membrane protein interactions," said Lawrence Marnett, director of the Vanderbilt Institute of Chemical Biology. Marnett, who is also Mary Geddes Stahlman Professor of Cancer Research, was not involved in the study but plans to collaborate with the Bornhop group.


Lasers aid measurement


BSI is deceptively simple. It measures the binding between two molecules mixed in a microscopic liquid-filled chamber by shining a red laser, like those used in barcode scanners, through it. When the geometry of the chamber is correct, the laser produces an interference pattern that is very sensitive to what the molecules are doing. If the molecules begin sticking together, for example, the pattern begins to shift.


In the new study, the researchers created synthetic membranes containing GM1, a ganglioside that is a primary target that cholera toxin binds in order to get into a cell. When they mixed these membranes with cholera toxin B, they measured a binding affinity consistent with values obtained by other methods.


The researchers performed similar validation tests with naturally derived membranes and three membrane proteins: one associated with breast cancer, another associated with pain and inflammation, and a third that responds to the neurotransmitter GABA, which is known to aid in relaxation and sleep and to regulate anxiety.


When they mixed the membranes containing each of these proteins with molecules known to bind with them, the BSI technique provided measurements that agreed with the values obtained by other methods, the scientists reported.


Vanderbilt has applied for and received three patents on the process and has several other patents pending.


The university has issued an exclusive license to develop the technology to Molecular Sensing, Inc. Bornhop is one of the founders of the start-up and serves as its chief scientist.


Vanderbilt research associate Amanda Kussrow and Michael Baksh, Mauro Mileni and M.G. Finn from The Scripps Research Institute contributed to the study, which was funded by awards from the National Institutes of Health, Joint Center for Innovative Membrane Protein Technologies and the Skaggs Institute for Chemical Biology.


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by Vanderbilt University.

Journal Reference:

Michael M Baksh, Amanda K Kussrow, Mauro Mileni, M G Finn, Darryl J Bornhop. Label-free quantification of membrane-ligand interactions using backscattering interferometry. Nature Biotechnology, 2011; DOI: 10.1038/nbt.1790

New measurement technique probes biological polymer networks

A new measurement technology, developed under a research project funded by the Air Force Office of Scientific Research and the National Science Foundation, is probing the structure of composite and biological materials.


"Our results have provided some of the first microscopic insights into a sixty-year-old puzzle about the way polymeric networks react to repeated shear strains," said Dr. Daniel Blair, assistant professor and principal investigator of the Soft Matter Group in the Department of Physics at Georgetown University.


Blair, Professor Andreas Bausch and other researchers at Technische Universitaet Muenchen (Technical University of Munich) used the muscle filament actin to construct a unique polymer network. In their quest to understand more about biopolymers, they developed a combined rheometer and confocal microscope system, which measures the mechanical properties of materials while providing an unprecedented level of precision and sensitivity for investigating polymeric systems that were previously too small to visualize during mechanical stress experiments. The system clearly visualized the fluorescently labeled actin network, and the researchers filmed the polymer filaments' movement in 3-D as mechanical stress was applied.


The rheometer and confocal microscope system will help lay the groundwork for future generations of materials, possibly including synthesized muscle tissue for the Air Force; such materials may even be ideally suited for powering micro-robots. The system enabled the scientists to watch the shearing process during the Mullins effect, in which biological polymers become dramatically softer, as also seen in conventional polymers. Moreover, these materials demonstrate dramatic strengthening in a way that is very different from conventional polymeric solids. The researchers' next step will be to use the Mullins effect as a mechanical standard for understanding the properties of composite and biological networks.


"We will use confocal-rheology as a benchmark system for generating new collaborations and expanding the technique to other AFOSR sponsored projects," said Blair. "For example, in collaboration with Dr. Fritz Vollrath of the Oxford Silk Group and Dr. David Kaplan from Tufts University, we are investigating how shear stress influences the formation of silk fibers."


Blair noted that the new technology is impacting a number of other AFOSR supported projects as a platform for investigating the strengthening of nano-composite networks such as carbon nanotubes and cellulose nanofibers embedded in conventional materials.


Blair predicts possible private-sector uses for the new technology in green technologies and the smart, soft biological materials they involve.


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by Air Force Office of Scientific Research.

Note: If no author is given, the source is cited instead.

Breakthrough in nanocomposite for high-capacity hydrogen storage

ScienceDaily (Mar. 14, 2011) — Since the 1970s, hydrogen has been touted as a promising alternative to fossil fuels due to its clean combustion -- unlike hydrocarbon-based fuels, which spew greenhouse gases and harmful pollutants, hydrogen's only combustion by-product is water. Compared to gasoline, hydrogen is lightweight, can provide a higher energy density and is readily available. But there's a reason we're not already living in a hydrogen economy: to replace gasoline as a fuel, hydrogen must be safely and densely stored, yet easily accessed. Limited by materials unable to leap these conflicting hurdles, hydrogen storage technology has lagged behind other clean energy candidates.

In recent years, researchers have attempted to tackle both issues by locking hydrogen into solids, packing larger quantities into smaller volumes with low reactivity -- a necessity in keeping this volatile gas stable. However, most of these solids can only absorb a small amount of hydrogen and require extreme heating or cooling to boost their overall energy efficiency.

Now, scientists with the U.S. Department of Energy (DOE) Lawrence Berkeley National Laboratory (Berkeley Lab) have designed a new composite material for hydrogen storage consisting of nanoparticles of magnesium metal sprinkled through a matrix of polymethyl methacrylate, a polymer related to Plexiglas. This pliable nanocomposite rapidly absorbs and releases hydrogen at modest temperatures without oxidizing the metal after cycling -- a major breakthrough in materials design for hydrogen storage, batteries and fuel cells.

"This work showcases our ability to design composite nanoscale materials that overcome fundamental thermodynamic and kinetic barriers to realize a materials combination that has been very elusive historically," says Jeff Urban, Deputy Director of the Inorganic Nanostructures Facility at the Molecular Foundry, a DOE Office of Science nanoscience center and national user facility located at Berkeley Lab. "Moreover, we are able to productively leverage the unique properties of both the polymer and nanoparticle in this new composite material, which may have broad applicability to related problems in other areas of energy research."

Urban, along with coauthors Ki-Joon Jeon and Christian Kisielowski, used the TEAM 0.5 microscope at the National Center for Electron Microscopy (NCEM), another DOE Office of Science national user facility housed at Berkeley Lab, to observe individual magnesium nanocrystals dispersed throughout the polymer. With the high-resolution imaging capabilities of TEAM 0.5, the world's most powerful electron microscope, the researchers were also able to track defects -- atomic vacancies in an otherwise-ordered crystalline framework -- providing unprecedented insight into the behavior of hydrogen within this new class of storage materials.

"Discovering new materials that could help us find a more sustainable energy solution is at the core of the Department of Energy's mission. Our lab provides outstanding experiments to support this mission with great success," says Kisielowski. "We confirmed the presence of hydrogen in this material through time-dependent spectroscopic investigations with the TEAM 0.5 microscope. This investigation suggests that even direct imaging of hydrogen columns in such materials can be attempted using the TEAM microscope."

"The unique nature of Berkeley Lab encourages cross-division collaborations without any limitations," said Jeon, now at the Ulsan National Institute of Science and Technology, whose postdoctoral work with Urban led to this publication.

To investigate the uptake and release of hydrogen in their nanocomposite material, the team turned to Berkeley Lab's Energy and Environmental Technologies Division (EETD), whose research is aimed at developing more environmentally friendly technologies for generating and storing energy, including hydrogen storage.

"Here at EETD, we have been working closely with industry to maintain a hydrogen storage facility as well as develop hydrogen storage property testing protocols," says Samuel Mao, director of the Clean Energy Laboratory at Berkeley Lab and an adjunct engineering faculty member at the University of California (UC), Berkeley. "We very much enjoy this collaboration with Jeff and his team in the Materials Sciences Division, where they developed and synthesized this new material, and were then able to use our facility for their hydrogen storage research."

Adds Urban, "This ambitious science is uniquely well-positioned to be pursued within the strong collaborative ethos here at Berkeley Lab. The successes we achieve depend critically upon close ties between cutting-edge microscopy at NCEM, tools and expertise from EETD, and the characterization and materials know-how from MSD."

This research is reported in a paper titled, "Air-stable magnesium nanocomposites provide rapid and high-capacity hydrogen storage without heavy metal catalysts," appearing in the journal Nature Materials. Co-authoring the paper with Urban, Kisielowski and Jeon were Hoi Ri Moon, Anne M. Ruminski, Bin Jiang and Rizia Bardhan.

This work was supported by DOE's Office of Science.

The Molecular Foundry is one of the five DOE Nanoscale Science Research Centers (NSRCs), premier national user facilities for interdisciplinary research at the nanoscale. Together the NSRCs comprise a suite of complementary facilities that provide researchers with state-of-the-art capabilities to fabricate, process, characterize and model nanoscale materials, and constitute the largest infrastructure investment of the National Nanotechnology Initiative. The NSRCs are located at DOE's Argonne, Brookhaven, Lawrence Berkeley, Oak Ridge and Sandia and Los Alamos National Laboratories. For more information about the DOE NSRCs, please visit http://nano.energy.gov.

Story Source:

The above story is reprinted (with editorial adaptations by ScienceDaily staff) from materials provided by DOE/Lawrence Berkeley National Laboratory, via EurekAlert!, a service of AAAS.

Journal Reference:

Ki-Joon Jeon, Hoi Ri Moon, Anne M. Ruminski, Bin Jiang, Christian Kisielowski, Rizia Bardhan, Jeffrey J. Urban. Air-stable magnesium nanocomposites provide rapid and high-capacity hydrogen storage without using heavy-metal catalysts. Nature Materials, 2011; DOI: 10.1038/nmat2978



Economical and environmentally-friendly pulp bleaching

A simulation model for pulp bleaching was created in the Virtual Pulp Bleaching (VIP) project led by Professor Tapani Vuorinen from the Aalto University School of Chemical Technology. The model yields information about phenomena taking place during pulp bleaching, especially the reactions between bleaching chemicals and lignin, the substance that causes the pulp's brown colour. The chemicals used in pulp bleaching are costly, and the bleaching process also consumes water and energy. With the knowledge produced by the VIP model, pulp can be bleached more economically and in a more environmentally friendly way.


The VIP model includes a library of chemical reactions for the most commonly used bleaching chemicals. The models of these reactions and of other phenomena involved in bleaching function as parts of process equipment models, which can be combined to construct the flow diagrams used in the bleaching process at the plant. In addition, the model can be used to simulate experimental arrangements at the laboratory scale.


The simulation model will be used in research and teaching, especially at Aalto University. In industry, the model can be used for planning of the bleaching process, training of personnel, as well as optimization of the existing processes.


Bleaching is the process used after pulping to remove lignin residual


Cellulose, one of the raw materials of paper and cardboard, is obtained from the wood pulping process. Pulping takes place at high temperature with the help of chemicals: the lignin that glues the wood fibres together partially dissolves, and the fibres separate from each other. After pulping, the cellulose is bleached with different chemicals to remove the remaining lignin, the so-called residual lignin. Lignin removal is important because lignin gives the fibres their brown colour, and high brightness is important in the production of fine grade paper.


The lignin polymer consists of different chemical groups, which differ from each other in their reactivity with different bleaching chemicals and, for example, in the extent to which they affect the colour of the fibre. In constructing the VIP model, attention was directed to these different chemical groups so that the model provides a sufficiently realistic picture of how different conditions, such as temperature and chemical doses, affect the success of bleaching.


The project was multidisciplinary: it drew together scientists from the fields of wood chemistry and the modelling of chemical processes as well as experts from the pulping industry. This kind of cooperation was necessary to obtain a useful model.


The Virtual Pulp Bleaching project is one of the awarded projects of the Forestcluster's EffTech program. The five-year Intelligent and resource efficient production technologies (EffTech) program started in 2008, and its total budget is € 40 million. The VIP project includes Aalto University, VTT Technical Research Centre of Finland and the Lappeenranta University of Technology.


Story Source:


The above story is reprinted (with editorial adaptations) from materials provided by Aalto University.
