Find out more at eXoNews! 
Orion Spacecraft!
BrainTrek, BDTs, Space Ether!
Hostile Holes, Safe Nanoparticles,
Gas on Mars! Chimps With Hammers!
Orion Spacecraft!

NASA's Constellation Program is getting to work on the new
spacecraft that will return humans to the moon and blaze a
trail to Mars and beyond. This artist's rendering represents a
concept of a crew exploration vehicle (CEV) and service
module. (NASA)
NASA News Release

August 22, 2006 - NASA announced Tuesday that its new crew exploration vehicle will be named Orion. Orion is the vehicle NASA’s Constellation Program is developing to carry a new generation of explorers back to the moon and later to Mars. Orion will succeed the space shuttle as NASA's primary vehicle for human space exploration.

Orion's first flight with astronauts onboard is planned for no later than 2014 to the International Space Station. Its first flight to the moon is planned for no later than 2020.

Orion is named for one of the brightest, most familiar and easily identifiable constellations.

"Many of its stars have been used for navigation and guided explorers to new worlds for centuries," said Orion Project Manager Skip Hatfield. "Our team, and all of NASA - and, I believe, our country - grows more excited with every step forward this program takes. The future for space exploration is coming quickly."

After driving a short distance from their landing site, two
explorers stop to inspect a robotic lander and its small rover
in this artist's concept of a future Mars mission. (NASA/ Pat
Rawlings, SAIC)

In June, NASA announced the launch vehicles under development by the Constellation Program have been named Ares, a synonym for Mars. The booster that will launch Orion will be called Ares I, and a larger heavy-lift launch vehicle will be known as Ares V.

Orion will be capable of transporting cargo and up to six crew members to and from the International Space Station. It can carry four crew members for lunar missions. Later, it can support crew transfers for Mars missions.

Orion borrows its shape from space capsules of the past, but takes advantage of the latest technology in computers, electronics, life support, propulsion and heat protection systems. The capsule's conical shape is the safest and most reliable for re-entering the Earth’s atmosphere, especially at the velocities required for a direct return from the moon.

Orion will be 16.5 feet in diameter and have a mass of about 25 tons. Inside, it will have more than 2.5 times the volume of an Apollo capsule. The spacecraft will return humans to the moon to stay for long periods as a testing ground for the longer journey to Mars.

NASA's Johnson Space Center, Houston, manages the Constellation Program and the agency's Marshall Space Flight Center, Huntsville, Ala., manages the Exploration Launch Projects' office for the Exploration Systems Mission Directorate, Washington.



Depressed mouse
McGill University News Release

Montreal August 22, 2006 - A new breed of permanently 'cheerful' mouse is providing hope of a new treatment for clinical depression. TREK-1 is a gene that can affect transmission of serotonin in the brain. Serotonin is known to play an important role in mood, sleep and sexuality. By breeding mice with an absence of TREK-1, researchers were able to create a depression-resistant strain.

The details of this research, which involved an international collaboration with scientists from the University of Nice, France, are published in Nature Neuroscience this week.

"Depression is a devastating illness, which affects around 10% of people at some point in their life," says Dr. Guy Debonnel an MUHC psychiatrist, professor in the Department of Psychiatry at McGill University, and principal author of the new research. "Current medications for clinical depression are ineffective for a third of patients, which is why the development of alternate treatments is so important."

Mice without the TREK-1 gene ('knock-out' mice) were created and bred in collaboration with Dr. Michel Lazdunski, co-author of the research, in his laboratory at the University of Nice, France.

Happy mouse (Paul Terry)

"These 'knock-out' mice were then tested using separate behavioral, electrophysiological and biochemical measures known to gauge 'depression' in animals," says Dr. Debonnel. "The results really surprised us; our 'knock-out' mice acted as if they had been treated with antidepressants for at least three weeks."

This research represents the first time depression has been eliminated through genetic alteration of an organism. "The discovery of a link between TREK-1 and depression could ultimately lead to the development of a new generation of antidepressant drugs," noted Dr. Debonnel.

According to Health Canada and Statistics Canada, approximately 8% of Canadians will suffer from depression at some point in their lifetime. Around 5% of Canadians seek medical advice for depression each year, a figure that has almost doubled in the past decade. Figures in the U.S. are comparable, with approximately 18.8 million American adults (about 9.5% of the population) suffering from depression during their lives.

McGill University -

Bush Wrong On Nuke Power?
Massachusetts Institute of Technology News Release
By Anne Trafton

CAMBRIDGE August 22, 2006 - The Bush administration is eagerly pushing nuclear power as a way to help solve the U.S. energy crisis. But in its new plan for nuclear waste management, the administration is taking the wrong approach, says an MIT professor who studies the nuclear energy industry.

"My hope is that over time, the administration will rethink its priorities in this area," says Richard Lester, professor of nuclear engineering and director of the Industrial Performance Center.

In a recent article published in Issues in Science and Technology, Lester argued that the Bush administration's plan, known as GNEP (Global Nuclear Energy Partnership), is not the best way to encourage further development of nuclear energy.

Oops, there goes another nuclear
containment facility... (LLNL)

GNEP, which President Bush announced earlier this year, is meant to stimulate the nuclear industry by coming up with better ways to manage spent nuclear fuel. The plan focuses on reprocessing spent fuel, but Lester believes the administration should focus on finding regional storage facilities for the nuclear waste.

Right now, uncertainty over how to deal with spent fuel, which remains radioactive for hundreds of thousands of years, is one of the major obstacles to the construction of new plants. Thousands of spent fuel rods are now stored in secure pools or concrete casks located near nuclear plants, which is not considered a long-term solution.

The administration has been pushing a plan to move all of the nation's spent fuel to a repository at Yucca Mountain, Nevada, but that facility is not scheduled to open until at least 2017. Many years and billions of dollars have gone into planning for the repository there, over the protests of Nevada residents, and success is still not assured. If the project fails, an alternative will be needed. And even if it succeeds, spent fuel will remain at nuclear power plants for decades before it can be removed.

Several nuclear energy companies have sued the federal government for failing to fulfill its contractual obligation to remove spent nuclear fuel from their plants. That failure does not bode well for construction of new plants, Lester said.

"If electric power companies can't believe the government is going to fulfill its obligations, it's going to be a real deterrent for them to go ahead with new power plants," he said.

In the meantime, the Bush plan calls for developing new technology to reprocess spent fuel to recover usable plutonium and uranium and eliminate other long-lived radioactive elements known as actinides. But according to Lester, the government's efforts would be better focused on other solutions, such as establishing a small number of regional facilities, where nuclear plants could send their spent fuel to be stored safely for several decades.

GNEP does not address the utilities' spent fuel storage problem. Instead, it "is being sold as a technical fix for three other problems," Lester said, but "each of these problems is either not as serious as the administration suggests or could be solved in a different way that is less costly and less risky."

Those perceived problems are lack of space at Yucca Mountain; the long life of radioactive material; and a potential shortage of uranium.

Yucca Mountain, a ridgeline geological formation about 100 miles northwest of Las Vegas, has already been tunneled in preparation for waste storage. When Congress approved the Yucca Mountain site, it put a 70,000-metric-ton limit on the amount of waste that could be stored there, but there is room for much more if Congress wants to raise the limit, Lester said.

Any effort to remove the long-lived radioactivity from the waste would require construction of reprocessing plants, special "burner" reactors and other nuclear facilities, which would be costly and difficult to site. And even if these plants were successfully built, it would be nearly impossible to eliminate all of the long-lived radioisotopes in the waste, Lester says.

"When you really look at the technical feasibility of reducing the toxic lifetime of waste, it has less potential than the administration is claiming, and the costs and shorter-term risks of doing it are significant," he said. Moreover, according to Lester, there are other, less costly ways to reduce the long-term risks of nuclear waste disposal that the administration has ignored.

Supporters of GNEP also say that reprocessing spent fuel could be necessary in the future if uranium becomes scarce, but according to the 2003 MIT report, "The Future of Nuclear Power," there is enough uranium to last for several decades, even if many new nuclear plants are built.

Lester said he is not opposed to research on new fuel cycle technologies, but he argues that reprocessing will not be needed for several decades, if then, and that to spend billions of dollars over the next few years on demonstrating reprocessing and related technologies, as the administration is proposing, would not be a wise use of resources.

Massachusetts Institute of Technology -

BDTs - Ballistic Deflection Transistors

Ballistic Transistor at work. (UR)
University of Rochester News Release

August 16, 2006 - Computer designers at the University of Rochester are going ballistic.

"Everyone has been trying to make better transistors by modifying current designs, but what we really need is the next paradigm," says Quentin Diduck, a graduate student at the University who thought up the radical new design. "We've gone from the relay, to the tube, to semiconductor physics. Now we're taking the next step on the evolutionary track."

That next step goes by the imposing name of "Ballistic Deflection Transistor," and it's as far removed from traditional transistors as those transistors are from vacuum tubes. Instead of running electrons through a transistor as if they were a current of water, the ballistic design bounces individual electrons off deflectors as if playing a game of atomic billiards.

Though today's transistor design has many years of viability left, the amount of heat these transistors generate and the electrical "leaks" in their ultra-thin barriers have already begun to limit their speed.

Research groups around the world are investigating strange new designs to generate ways of computing at speeds unthinkable with today's chips. Some of these groups are working on similar single-electron transistors, but these designs still compute by starting and stopping the flow of electrons just like conventional designs. But the Ballistic Deflection Transistor adds a new twist by bouncing the electrons into their chosen trajectories—using inertia to redirect for "free," instead of wrestling the electrons into place with brute energy.

Such a chip would use very little power, create very little heat, be highly resistant to "noise" inherent in electronic systems, and should be easy to manufacture with current technologies. All that would make it incredibly fast. The National Science Foundation is so impressed with the idea that it just granted the University of Rochester team $1.1 million to develop a prototype.

"We've assembled a unique team to take on this chip," says Marc Feldman, professor of computer engineering at the University. "In addition to myself and Quentin, we have a theoretical physicist, a circuit designer, and an expert in computer architecture. We're not just designing a new transistor, but a new archetype as well, and as far as I know, this is the first time an architect has been involved in the actual design of the transistor on which the entire architecture is built."

The team has already had some luck in fabricating a prototype. The ballistic transistor is a nano-scale structure that would have been all but impossible to engineer just a few years ago. Its very design means that this "large" prototype is already nearly as small as the best conventional transistor designs coming out of Silicon Valley today. Feldman and Diduck are confident that the design will readily scale to much smaller dimensions.

There's one hurdle the team isn't quite as confident about: "We're talking about a chip speed measured in terahertz, a thousand times faster than today's desktop transistors," Diduck says. "We have to figure out how to test it because there's no such thing as a terahertz oscilloscope!"

The Science Behind the Ballistics

The Ballistic Deflection Transistor (BDT) should produce far less heat and run far faster than standard transistors because it does not start and stop the flow of its electrons the way conventional designs do. It resembles a roadway intersection, except that a triangular block sits in the middle of the intersection. An electron is fired from the "south"; as it approaches the crossroads, it passes through an electrical field that pushes it slightly east or west. When the electron reaches the middle of the intersection, it bounces off one side of the triangular block and is deflected straight along either the east or the west road. In this way, an electron traveling along the east road may be counted as a zero, and one traveling down the west road as a one.

A traditional transistor registers a "one" as a collection of electrons on a capacitor, and a "zero" when those electrons are removed. Moving electrons on and off the capacitor is akin to filling and emptying a bucket of water. The drawback to this method is that it takes time to fill and empty that bucket. That refill time limits the speed of the transistor—the transistors in today's laptops run at perhaps two gigahertz, meaning two billion refills every second. A second drawback is that these transistors dissipate immense amounts of heat each time that stored charge is dumped.
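
As a rough, back-of-the-envelope illustration of why the bucket analogy implies heat (a standard textbook estimate, not a figure from the release), the power dissipated by this kind of charge-and-discharge switching is usually approximated as

    P ≈ α · C · V^2 · f

where C is the capacitance being filled and emptied, V is the supply voltage, f is the clock frequency and α is the fraction of transistors switching on a given cycle. Because the dissipation scales directly with f, simply clocking conventional transistors faster runs into a thermal wall; the ballistic design aims to dodge that wall by steering electrons rather than repeatedly filling and emptying a capacitor.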

The BDT design should also be able to resist much of the electrical noise present in all electronic devices because the noise would only be present in the electrical "steering" field, and calculations show the variations of the noise would cancel themselves out as the electron passes through.

The BDT is "ballistic" because it is made from a sheet of semiconductor material called a "2D electron gas," which allows the electrons to travel without hitting impurities, which would impede the transistor's performance.

The BDT prototype was fabricated at the Cornell Nanofabrication Facility with support provided by the Office of Naval Research.

The $1.1 million is an NSF Nanotechnology Integrated Research Team grant, which is awarded only to promising research. The team comprises Marc Feldman, professor of electrical and computer engineering, Martin Margala and Paul Ampadu, assistant professors of electrical and computer engineering, and Yonathan Shapir, professor of physics and astronomy.


University of Rochester -

Dark Matter Exists!
University of Arizona News Release
By Lori Stiles

August 21, 2006 - Astronomers have discovered the first direct proof that dark matter exists. University of Arizona astronomers and their colleagues got side-on views of two merging galaxy clusters in observations made with state-of-the-art optical and X-ray telescopes.

"Nature gave us this fantastic opportunity to see hypothesized dark matter separated from ordinary matter in this merging system," said UA astronomer Douglas Clowe, leader of the study.

"Prior to this observation, all of our cosmological models were based on an assumption that we couldn't prove: that gravity behaves the same way on the cosmic scale as on Earth," Clowe said. "The clusters we've looked at in these images are a billion times larger than the largest scales at which we can measure gravity at present, which are on the scale of our solar system."

Clowe added, "What's amazing about this is that the process of galaxy clusters merging is thought to go on all the time. That's how galaxy clusters gain mass. But the fact that we caught this thing only 100 million years after it occurred -- so recently that it barely registers on the cosmic time scale -- is tremendous luck."

This composite image shows the galaxy cluster 1E0657-556, also
known as the "bullet cluster", formed after the collision of two large
clusters of galaxies. Hot gas detected by Chandra is seen as two pink
clumps in the image and contains most of the "normal" matter in the
two clusters. An optical image from Magellan and the Hubble Space
Telescope shows galaxies in orange and white. The blue clumps show
where most of the mass in the clusters is found, using a technique
known as gravitational lensing. Most of the matter in the clusters
(blue) is clearly separate from the normal matter (pink), giving
direct evidence that nearly all of the matter in the clusters is dark.
This result cannot be explained by modifying the laws of gravity.
(NASA/ CXC/ M.Markevitch/ STScI; Magellan/ U.Arizona/ D.Clowe et al.)

Astronomers have known since the 1930s that most of the universe must be made up of something other than normal matter, the stuff that makes stars, planets, all things and creatures. Given the way that galaxies move through space and scientists' understanding of gravity, astronomers theorize that the universe must contain about five times more dark matter than normal matter.

But for the past 70 years, no one had any direct empirical evidence that dark matter even exists.

"Astronomers have been in the somewhat embarrassing position of saying that we understand the universe, although more than 80 percent of it is something we don't know anything about," said UA astronomy Professor Dennis Zaritsky, a member of the discovery team.

"Either most of the matter in the universe is in some invisible, undiscovered form we call 'dark matter' that causes galaxies to move as they do, or we just don't understand the fundamental laws of gravity," Zaritsky said.

When galaxy clusters merge, the galaxies themselves are so sparsely scattered in space that they don't collide, Clowe said. "Even if two galaxies do pass through each other, the distance between the stars is so great that even stars won't collide. Galaxies basically plow through each other almost without slowing down."

Most of a galaxy cluster's normal mass is in its diffuse hot gas. Galaxy clusters typically contain 10 times as much ordinary mass in gas as in stars. So when galaxy clusters merge, the hot gas from each cluster exerts a drag force on the other, slowing all the gas down, Clowe said. The upshot is that the galaxies themselves continue speeding through space, leaving the gas behind.

Observations made with NASA's Chandra X-ray Observatory showed the bulk of ordinary matter is in the hot gas clouds left in the wake of the galaxies. Part of this million-degree plasma of hydrogen and helium, the part from the smaller cluster, forms a spectacular bullet-shaped cloud because a bow shock, or supersonic shock wave, is created in the 10 million mph collision.

But when the astronomers mapped the region of the sky around the galaxies in optical light, they discovered far more mass near the galaxies, ahead of the gas cloud. They analyzed gravitational lensing of distant galaxies in images taken with NASA's Hubble Space Telescope, the European Southern Observatory's 2.2-meter Wide-Field Imager and one of the twin 6.5-meter Magellan telescopes in Chile, operated by a consortium that includes UA.

Gravitational lensing is a phenomenon caused by gravity bending distant starlight. When the astronomers analyzed the shapes and patterns of the distorted light, they discovered the mass of non-luminous, or dark, matter that causes the lensing is far greater than the mass of ordinary matter in the gas cloud.

Clowe and Zaritsky said that dark matter particles are not expected to interact with either normal matter or other dark matter particles except through gravity. Hence, they would pass through the collision just as galaxies do.

"We see that dark matter has careened through the collision efficiently," Zaritsky said.

"We're actually using this system to test the idea that dark matter particles are collisionless," Clowe said.

"The bottom line is, there really is dark matter out there," Zaritsky said. "Now we just need to figure out what it is."

The team is publishing the research in a forthcoming issue of the Astrophysical Journal Letters. In addition to Clowe and Zaritsky of UA's Steward Observatory, team members include Marusa Bradac of the Kavli Institute for Particle Astrophysics and Cosmology in Stanford, Calif., Anthony Gonzalez of the University of Florida, and Maxim Markevitch, Scott Randall and Christine Jones of the Harvard-Smithsonian Center for Astrophysics.

University of Arizona -

Space Ether - Dark Matter Doesn't Exist!

They posit an ether that is a field, rather than a substance, and which pervades space-time.
New Scientist News Release

August 23, 2006 - From his office window, Glenn Starkman can see the site where Albert Michelson and Edward Morley carried out their famous 1887 experiment that ruled out the presence of an all-pervading "aether" in space, setting the stage for Einstein's special theory of relativity.

So it seems ironic that Starkman, who is at Case Western Reserve University in Cleveland, Ohio, is now proposing a theory that would bring ether back into the reckoning. While this would defy Einstein, Starkman's ether would do away with the need for dark matter.

Nineteenth-century physicists believed that just as sound waves move through air, light waves must move through an all-pervading physical substance, which they called luminiferous ("light-bearing") ether. However, the Michelson-Morley experiment failed to find any signs of ether, and 18 years after that, Einstein's special relativity argued that light propagates through a vacuum. The idea of ether was abandoned – but not discarded altogether, it seems.

Starkman and colleagues Tom Zlosnik and Pedro Ferreira of the University of Oxford are now reincarnating the ether in a new form to solve the puzzle of dark matter, the mysterious substance that was proposed to explain why galaxies seem to contain much more mass than can be accounted for by visible matter. They posit an ether that is a field, rather than a substance, and which pervades space-time. "If you removed everything else in the universe, the ether would still be there," says Zlosnik.

This ether field isn't to do with light, but rather is something that boosts the gravitational pull of stars and galaxies, making them seem heavier, says Starkman. It does this by increasing the flexibility of space-time itself.

"We usually imagine space-time as a rubber sheet that's warped by a massive object," says Starkman. "The ether makes that rubber sheet more bendable in parts, so matter can seem to have a much bigger gravitational effect than you would expect from its weight." The team's calculations show that this ether-induced gravity boost would explain the observed high velocities of stars in galaxies, currently attributed to the presence of dark matter.

This is not the first time that physicists have suggested modifying gravity to do away with this unseen dark matter. The idea was originally proposed by Mordehai Milgrom while at Princeton University in the 1980s. He suggested that the inverse-square law of gravity only applies where the acceleration caused by the field is above a certain threshold, say a0. Below that value, the field dissipates more slowly, explaining the observed extra gravity. "It wasn't really a theory, it was a guess," says cosmologist Sean Carroll at the University of Chicago in Illinois.
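
For readers who want the formula, Milgrom's modification is conventionally written (this is the standard textbook form, not something spelled out in the article) as

    μ(a / a0) · a = g_N,   where μ(x) ≈ 1 for x >> 1 and μ(x) ≈ x for x << 1

with g_N the ordinary Newtonian gravitational acceleration. Well below the threshold a0 this gives a ≈ sqrt(g_N · a0), so the pull falls off as 1/r rather than 1/r², letting the outer stars of a galaxy orbit faster than its visible matter alone would allow.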

Starkman's team must now carefully check whether
the ether theory fits with the motions of planets
within our solar system. (NASA)

Then in 2004 this idea of modified Newtonian dynamics (MOND) was reconciled with general relativity by Jacob Bekenstein at the Hebrew University in Jerusalem, Israel (New Scientist, 22 January 2005, p 10), making MOND a genuine contender in the eyes of some physicists. "Bekenstein's work was brilliant, but fiendishly complicated, using many different and arbitrary fields and parameters," says Ferreira. "We felt that something so complicated couldn't be the final theory."

Now Starkman's team has reproduced Bekenstein's results using just one field - the new ether. Even more tantalisingly, the calculations reveal a close relationship between the threshold acceleration a0 - which depends on the ether - and the rate at which the universe's expansion is accelerating. Astronomers have attributed this acceleration to something called dark energy, so in a sense the ether is related to this entity. That they have found this connection is a truly profound thing, says Bekenstein. The team is now investigating how the ether might cause the universe's expansion to speed up.

Andreas Albrecht, a cosmologist at the University of California, Davis, believes that this ether model is worth investigating further. "We've hit some really profound problems with cosmology - with dark matter and dark energy," he says. "That tells us we have to rethink fundamental physics and try something new."

Both Bekenstein and Albrecht say Starkman's team must now carefully check whether the ether theory fits with the motions of planets within our solar system, which are known to a high degree of accuracy, and also explain what exactly this ether is. Ferreira agrees: "The onus is definitely on us to pin this theory down so it doesn't look like yet another fantastical explanation," he says.

However, physicists may be reluctant to resurrect any kind of ether because it contradicts special relativity by forming an absolute frame of reference. "Interestingly, this controversial aspect should make it easy to test for experimentally," says Carroll.

New Scientist -

Hostile Black Holes!

This artist's concept depicts a supermassive black hole at the center of a galaxy.
The blue color here represents radiation pouring out from material very close
to the black hole. The grayish structure surrounding the black hole, called a
torus, is made up of gas and dust. (NASA/ JPL-Caltech)
NASA News Release

August 23, 2006 - Supermassive black holes in some giant galaxies create such a hostile environment, they shut down the formation of new stars, according to NASA Galaxy Evolution Explorer findings published in the August 24 issue of Nature.

The orbiting observatory surveyed more than 800 nearby elliptical galaxies of various sizes. An intriguing pattern emerged: the more massive, or bigger, the galaxy, the less likely it was to have young stars. Because bigger galaxies are known to have bigger black holes, astronomers believe the black holes are responsible for the lack of youthful stars.

"Supermassive black holes in these giant galaxies create unfriendly places for stars to form," said Dr. Sukyoung K. Yi of Yonsei University in Seoul, Korea, who led the research team. "If you want to find lots of young stars, look to the smaller galaxies."

Previously, scientists had predicted that black holes might have dire consequences for star birth, but they didn't have the tools necessary to test the theory.

The Galaxy Evolution Explorer, launched in 2003, is well-suited for this research. It is extremely sensitive to the ultraviolet radiation emitted by even low numbers of young stars.

Black holes are monstrous heaps of dense matter at the centers of galaxies. Over time, a black hole and its host galaxy will grow in size, but not always at the same rate.

Yi and his collaborators found evidence that the black holes in elliptical galaxies bulk up to a critical mass before putting a stop to star formation. In other words, once a black hole reaches a certain size relative to its host galaxy, its harsh effects become too great for new stars to form. According to this "feedback" theory, the growth of a black hole slows the development of not only stars but of its entire galaxy.

How does a black hole do this? There are two possibilities. First, jets being blasted out of black holes could blow potential star-making fuel, or gas, out of the galaxy center, where stars tend to arise.

The second possibility relates to the fact that black holes drag surrounding gas onto them, which heats the gas. The gas becomes so hot that it can no longer clump together and collapse into stars.

Other authors of this research include: Drs. Kevin Schawinski, Sadegh Khochfar and Sugata Kaviraj of the University of Oxford, England; Dr. Young-Wook Lee of Yonsei University in Seoul, Korea; Drs. Alessandro Boselli, Jose Donas and Bruno Milliard of the Laboratory of Astrophysics of Marseille, France; Tim Conrow, Drs. Tom Barlow, Karl Forster, Peter G. Friedman, D. Chris Martin, Patrick Morrissey, Mark Seibert, Todd Small and Ted K. Wyder of the California Institute of Technology in Pasadena; Dr. Susan Neff of NASA's Goddard Space Flight Center, Greenbelt, Maryland; Dr. David Schiminovich of Columbia University, N.Y.; Drs. Tim Heckman, Alex Szalay and Luciana Bianchi of Johns Hopkins University, Baltimore, Md.; Dr. Barry Madore of the Observatories of the Carnegie Institution of Washington in Pasadena; and Dr. R. Michael Rich of the University of California, Los Angeles.



Safe Nanoparticles?

Nanoparticles (NIST)
Brookhaven National Laboratory News Release

UPTON NY August 21, 2006 - Scientists at the U.S. Department of Energy’s Brookhaven National Laboratory have developed a screening method to examine how newly made nanoparticles — particles with dimensions on the order of billionths of a meter — interact with human cells following exposure for various times and doses.

This has led to the visualization of how human cells interact with some specific types of carbon nanoparticles. The method is described in a review article on carbon nanoparticle toxicity in a special section of the August 23, 2006, issue of the Journal of Physics: Condensed Matter devoted to developments in nanoscience and nanotechnology, now available online.

Nanoparticles may have physical, chemical, electrical, and optical properties different from those of bulk samples of the same material, in part due to the increased surface area to volume ratio at the nanoscale.
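
To put a number on that ratio (simple geometry, not a figure from the Brookhaven release): for a sphere of radius r, the surface-to-volume ratio is

    S / V = 4πr² ÷ (4/3 πr³) = 3 / r

so shrinking a particle from a 1-micrometer radius to a 10-nanometer radius multiplies the ratio a hundredfold, leaving a far larger share of the material's atoms exposed at the surface, where they can interact with cells and their surroundings.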

Many scientists believe that understanding these nanoscale properties and finding ways to engineer new nanomaterials will have revolutionary impacts — from more efficient energy generation and data storage to improved methods for diagnosing and treating disease.

Brookhaven Lab is currently building a Center for Functional Nanomaterials (CFN) with state-of-the-art facilities for the fabrication and study of nanomaterials, with an emphasis on atomic-level tailoring of nanomaterials and nanoparticles to achieve desired properties and functions.

“Nanomaterials show great promise, but because of their extremely small size and unique properties, little is known about their effects on living systems,” said lead author Barbara Panessa-Warren, a Brookhaven biologist who has been developing a nanoparticle cytotoxicity-screening model for the past five years.

“Our experiments may provide scientists with information to help redesign nanoparticles to minimize safety concerns, and to optimize their use in health-related applications. They may also lead to effective screening practices for carbon-based materials.”

A variety of studies conducted in living animals, which are described in the review article, have found a range of toxic effects resulting from exposure to carbon-based nanoparticles. All of these in vivo studies clearly show that multiple factors interact following nanoparticle exposure to produce acute and chronic changes within individual cells and the organism itself.

In vitro laboratory studies, such as the cell-culture method developed by the Brookhaven team, are an attempt to simplify the research by eliminating many of the variables found in animal studies, giving researchers greater control over experimental conditions.

“By combining techniques of molecular biology with sophisticated imaging methods, we can rapidly gather information about the response of specific cell types to specific nanoparticles, making in vitro testing an inexpensive and immediate tool for screening and fine-tuning nanoparticle design to maximize safety and target specificity,” Panessa-Warren said.

In the Brookhaven team’s studies, the scientists used lung and colon epithelial cells — chosen to represent two likely routes of nanoparticle exposure (inhalation and ingestion) — grown as cell monolayers, where the individual cells join together to form a tight layer with many of the characteristics of lung and colon cells growing in the body as an epithelial layer. These monolayers of living cells are then exposed to varying doses of carbon nanoparticles over differing amounts of time, and the cells are studied at each time period and dose.

The scientists also tested the response of the cells to different types of nanoparticles (a raw nanotube preparation containing mostly single-walled carbon nanotubes, nanoropes, graphene and trace elements; partially cleaned air-oxidized carbon nanotubes; as well as carbon-nanotube-derived loops used to carry antibodies).

They assessed cell viability (did the cells live or die?) and growth characteristics of the monolayer, and examined any alterations within the cells using various microscopy techniques. These techniques enabled them to visualize the first contact of the nanoparticles with the cells and follow this process “ultrastructurally” so they could see how the cells responded and determine whether the nanoparticles entered the cells or caused specific changes to the cell surfaces of those cells that did not die.

Using this in vitro screening, the scientists found that a type of engineered carbon nanoparticle called a ‘nanoloop,’ which was made at Brookhaven, did not appear to be toxic to either cell type regardless of dose and time. In contrast, both colon and lung cells exposed to carbon nanoparticles from the raw nanotube preparation showed increased cell death with increased exposure time and dose.

Microscopic studies revealed losses of cell-to-cell attachments in the monolayers, and changes in cell-surface morphology on cells where carbon nanotubes and other carbon nanoparticles had attached. Damage was severe for both the low and higher doses at three hours, suggesting that exposure time may be even more predictive of damage than nanoparticle concentration.

Using electron microscopy, the scientists found that where carbon nanoparticles, and especially carbon nanotubes, touched or attached to the cell surface, the plasma membranes became damaged and were microscopically interrupted. Images of the cell surfaces with attached carbon nanoparticles showed membrane holes that exposed the underlying cell cytoplasm.

Transmission electron microscopy revealed that small carbon particles could pass into the cells and become incorporated into the cell nuclei. Neighboring cells with no attached carbon nanoparticles appeared normal and continued to grow, suggesting that direct contact with untreated nanoparticles is required for damage to occur.

Carbon nanoparticles called fullerenes, also known
as “buckyballs.”

These findings agree with recent biochemical studies in the literature that reported the production of reactive oxygen species (free radicals) and lipid peroxidation of cell membranes following living cell contact with other forms of carbon nanoparticles called fullerenes, also known as “buckyballs.”

“Although our screening method gives us a quick way to analyze human cell responses to nanoparticles at a visual macro- and micro-scale, we are now taking this to a molecular and genetic level to see whether the cells are stressed,” said Panessa-Warren.

“Ultimately any new nanomaterials intended for large-scale production or use would also have to be tested in vivo — where the combined reactions of many cell types and tissues, as well as the blood, immune, and hormonal factors, are all taken into account to assess biocompatibility and assure safety,” she added.

“Still, our methods give us a way to screen out those nanoparticles that shouldn’t even make it that far, or identify ways to improve them first.”

This research was funded by the Office of Basic Energy Sciences within the U.S. Department of Energy’s Office of Science. The CFN at Brookhaven Lab is one of five Nanoscale Science Research Centers being constructed at national laboratories by the DOE’s Office of Science to provide the nation with resources unmatched anywhere else in the world for synthesis, processing, fabrication, and analysis at the nanoscale.

Brookhaven National Laboratory -

Gas on Mars!
NASA News Release

Artist's concept showing sand-laden jets shooting into the
Martian polar sky. (Arizona State University/ Ron Miller)

August 16, 2006 - Every spring brings violent eruptions to the south polar ice cap of Mars, according to researchers interpreting new observations by NASA's Mars Odyssey orbiter.

Jets of carbon dioxide gas erupting from the ice cap as it warms in the spring carry dark sand and dust high aloft. The dark material falls back to the surface, creating dark patches on the ice cap which have long puzzled scientists. Deducing the eruptions of carbon dioxide gas from under the warming ice cap solves the riddle of the spots. It also reveals that this part of Mars is much more dynamically active than had been expected for any part of the planet.

"If you were there, you'd be standing on a slab of carbon-dioxide ice," said Phil Christensen of Arizona State University, Tempe, principal investigator for Odyssey's camera. "All around you, roaring jets of carbon dioxide gas are throwing sand and dust a couple hundred feet into the air."

You'd also feel vibration through your spacesuit boots, he said. "The ice slab you're standing on is levitated above the ground by the pressure of gas at the base of the ice."

The team began its research in an attempt to explain mysterious dark spots, fan-like markings, and spider-shaped features seen in images that cameras on Odyssey and on NASA's Mars Global Surveyor have observed on the ice cap at the Martian south pole.

The dark spots, typically 15 to 46 meters (50 to 150 feet) wide and spaced several hundred feet apart, appear every southern spring as the sun rises over the ice cap. They last for several months and then vanish -- only to reappear the next year, after winter's cold has deposited a fresh layer of ice on the cap. Most spots even seem to recur at the same locations.

An earlier theory proposed that the spots were patches of warm, bare ground exposed as the ice disappeared. However, the camera on Odyssey, which sees in both infrared and visible-light wavelengths, discovered that the spots are nearly as cold as the carbon dioxide ice, suggesting they were just a thin layer of dark material lying on top of the ice and kept chilled by it. To understand how that layer is produced, Christensen's team used the camera -- the Thermal Emission Imaging System -- to collect more than 200 images of one area of the ice cap from the end of winter through midsummer.

Dark spots (left) and 'fans' appear to scribble dusty hieroglyphics on top
of the Martian south polar cap. (NASA/ JPL/ Malin Space Science Systems)

Some places remained spot-free for more than 100 days, then developed many spots in a week. Fan-shaped dark markings didn't form until days or weeks after the spots appeared, yet some fans grew to half a mile in length. Even more puzzling was the origin of the "spiders," grooves eroded into the surface under the ice. The grooves converge at points directly beneath a spot.

"The key to figuring out the spiders and the spots was thinking through a physical model for what was happening," said Christensen. The process begins in the sunless polar winter when carbon dioxide from the atmosphere freezes into a layer about three feet thick on top of a permanent ice cap of water ice, with a thin layer of dark sand and dust in between. In spring, sunlight passing through the slab of carbon dioxide ice reaches the dark material and warms it enough that the ice touching the ground sublimates -- turns into gas.

Before long, the swelling reservoir of trapped gas lifts the slab and eventually breaks through at weak spots that become vents.

High-pressure gas roars through at speeds of 161 kilometers per hour (100 miles per hour) or more. Under the slab, the gas erodes ground as it rushes toward the vents, snatching up loose particles of sand and carving the spidery network of grooves.

Christensen, Hugh Kieffer (U.S. Geological Survey, retired) and Timothy Titus (USGS) report the new interpretation in the Aug. 17, 2006, issue of the journal "Nature."

JPL, a division of the California Institute of Technology, Pasadena, manages the Mars Odyssey and Mars Global Surveyor missions for the NASA Science Mission Directorate. Odyssey's Thermal Emission Imaging System is operated by Arizona State University.


Chimps With Hammers!

It may indicate that nut-cracking has been invented [by chimps]
on more than one occasion in widely separated populations.
Cell Press News Release

August 21, 2006 - In a finding that challenges a long-held belief regarding the cultural spread of tool use among chimpanzees, researchers report that chimpanzees in the Ebo forest, Cameroon, use stone hammers to crack open hard-shelled nuts to access the nutrient-rich seeds.

The findings are significant because this nut-cracking behavior was previously known only in a distant chimpanzee population in extreme western Africa and was thought to be restricted by geographical boundaries that prevented cultural spread of the technique from animal to animal.

The findings, which involve the most endangered and least-understood subspecies of chimpanzee, are reported by Dr. Bethan Morgan and Ekwoge Abwe of the Zoological Society of San Diego's Conservation and Research for Endangered Species (CRES) and appear in the August 22nd issue of the journal Current Biology, published by Cell Press.

Prior to this discovery, it was thought that chimpanzee nut-cracking behavior was confined to the region west of the N'Zo-Sassandra River in Cote d'Ivoire. Because there are no relevant ecological or genetic differences between populations on either side of this "information barrier," explain the researchers of the new study, the implication had been that nut-cracking is a behavioral tradition constrained in its spread by a physical barrier: it was absent to the east of the river because it had not been invented there.

The new finding that chimpanzees crack open nuts more than 1700 km east of the supposed barrier challenges this long-accepted model. According to the authors of the study, the discontinuous distribution of the nut-cracking behavior may indicate that the original "culture zone" was larger, and nut-cracking behavior has become extinct between the N'Zo-Sassandra and Ebo.

Alternatively, it may indicate that nut-cracking has been invented on more than one occasion in widely separated populations.

This is one of the first reports of tool use for Pan troglodytes vellerosus, the most endangered and understudied chimpanzee subspecies. It highlights the necessity to preserve the rich array of cultures found across chimpanzee populations and communities, which represent our best model for understanding the evolution of hominid cultural diversity.

As such, the new finding promises to both benefit research and inform the conservation of our closest living relative.

Cell Press -

Paperback books by Rich La Bonté - Free e-previews!