
Thursday, October 16, 2008

THE CARTESIAN DIVER

A new twist on a great classic
Cartesian divers were first noted by a student of Galileo Galilei! Some people back in those days thought their mysterious dives and ascents in the water smacked of dark magic, as witnessed by the name the toy was sometimes given: Devil's Diver. The advent of clear plastic bottles made Cartesian divers popular in school science classes, as well they should be. What a cool way to learn about density, buoyancy, compression of gases, etc.
Unfortunately, some diver designs are either too expensive for large groups, or they hide what's going on inside the diver. For example, pen caps and modeling clay are cheap enough for everyone to make, but you can't see the air compressing and the water rushing in. Clear or translucent eye droppers cost at least a dollar each--not bad if you're just making one at home, but quite a bit of money if you are making them with a large group. I offer two designs to fill the void. If my design is not what you are looking for...
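To put rough numbers on the density and buoyancy idea, here is a minimal back-of-the-envelope sketch in Python. The trapped air volume (1 cm3) and the squeeze pressure (+10 kPa) are made-up illustrative values, not measurements from any particular diver.

```python
# Back-of-the-envelope Cartesian diver estimate (illustrative numbers only).
# Squeezing the bottle raises the pressure, Boyle's law shrinks the trapped
# air pocket, the diver displaces less water, and its buoyancy drops.

RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2

def buoyant_force(displaced_volume_m3):
    """Archimedes: buoyant force equals the weight of the displaced water."""
    return RHO_WATER * G * displaced_volume_m3

p1 = 101_000.0   # Pa, pressure before squeezing (about atmospheric)
p2 = 111_000.0   # Pa, pressure while squeezing (assumed +10 kPa)
v1 = 1.0e-6      # m^3, trapped air pocket before squeezing (1 cm^3)

v2 = p1 * v1 / p2   # Boyle's law at constant temperature: p1*v1 = p2*v2

lost_lift = buoyant_force(v1 - v2)
print(f"air pocket shrinks from {v1*1e6:.2f} cm^3 to {v2*1e6:.2f} cm^3")
print(f"buoyant force drops by about {lost_lift*1000:.2f} mN")
# A diver trimmed to near-neutral buoyancy needs only this tiny change to sink.
```

A diver set up to just barely float needs less than a millinewton of lost lift to start sinking, which is why a gentle squeeze is enough.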

HOW TO MAKE THE VORTEX

two un-crushed, 2-liter soda bottles
Try to get bottles that have not been really crushed, because the dents that remain can interfere with the spinning water.
1/2" diameter PVC plastic pipe.
It is our good fortune that very cheap and easy-to-find 1/2" PVC pipe fits snugly inside 2-liter bottles. When used to connect two bottles, the pipe keeps them from flopping around.
Some hardware stores might sell it by the foot (you only need 2 inches per project). At other places, like building centers, you will have to buy 5 or 10 feet at a time. Even then, 10 feet costs little more than a dollar. PVC pipe is white. Do not get "CPVC" pipe, which is more expensive and a tannish color. One more confusing thing: 1/2" is the nominal size, so the pipe actually measures bigger than 1/2" across. Go by what it says on the pipe, not by measuring the diameter.
clear plastic food wrap
The only really challenging part of this project is making the bottles completely sealed--no leaks. The best way to do it is with plastic wrap. If you stretch the plastic tightly as you wrap it around the bottles, it seals very well.
electrical tape or duct tape or masking tape; scissors; hacksaw (or just a hacksaw blade)
Duct tape is best. The hacksaw or hacksaw blade is for cutting the plastic pipe.
cold water, or a refrigerator, ice, snow, or cold weather to chill it.
I know it seems a bit strange to be concerned about air and water temperature as we seal the two bottles together. The reason behind it is quite interesting. I go into it more in the Activities and Explanations page.
For building this project, you should know this much: It is important that the air and water in the bottles be cold when they are being sealed. As the air warms up, it will expand a little. Because it's trapped inside, the air pressure inside the bottles becomes a little higher than the air pressure outside. That's good, because it keeps the bottles firm, just like the higher air pressure in a bicycle tire keeps the tire from going flat.
Without the higher pressure inside, the bottles will dent a little. Because they are no longer perfectly round, it's harder to make a good vortex. Fortunately, this is easy to prevent by keeping things cool when sealing the bottles.
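If you want to put numbers on the "seal it cold" advice, the ideal gas law gives a quick estimate. The temperatures below are assumptions for illustration (chilled water around 5 °C warming to a 25 °C room), not anything specific to your bottles.

```python
# Rough estimate of the pressure rise when cold trapped air warms up.
# At (nearly) constant volume, p2/p1 = T2/T1 with temperatures in kelvin.

p_atm = 101.3           # kPa, pressure when the bottles are sealed
t_seal = 5.0 + 273.15   # K, assumed temperature of the air/water when sealed
t_room = 25.0 + 273.15  # K, assumed room temperature later on

p_warm = p_atm * t_room / t_seal
print(f"inside pressure after warming: {p_warm:.1f} kPa "
      f"({p_warm - p_atm:.1f} kPa above the outside air)")
# A few kPa of extra pressure is what keeps the bottles round and firm.
```

A 20-degree warm-up buys you roughly 7 kPa of extra pressure, plenty to keep the bottles from denting.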
Step 1
Cut a 2" long piece of pipe.
PVC pipe is easy to cut, and it only takes a few seconds. Even if you don't have a hacksaw frame, you can cut with just a blade if you wrap a little tape around one end as a handle. Hacksaws are much safer than any other kind of saw.
I am a believer in letting even young kids participate as much as possible, so I offer this illustration as a proven way kids (green hands) can do some sawing, while you (blue hands) maintain control of the saw. Because of the way the pipe is supported on both ends, it doesn't slide around.
Here you can see a fast and safe jig for cutting, made from a scrap of wood and 5 nails. The hacksaw slides against two nails that guide it to cut a 2" piece every time. It's worth making if you are working with a group.
When you're done sawing, clean out the plastic "crud" that sticks to the end of the pipe where you made the cut. Use your fingernail.
Step 2
Prepare the bottles for sealing.
These sub-steps don't have to be done in any particular order:
*Rinse out bottles.
*Peel off labels. Save a colorful piece for the next part.
*Cut some confetti from the plastic labels you just peeled off. The pieces become tracers in the vortex (they remind me of the house spinning around in The Wizard of Oz). No piece should be larger than the fingernail on your pinky. Put the pieces into one of the bottles.
*Cut a piece of electrical tape 3" long, or cut duct tape into a 3/4" strip, 3" long. Wrap it tightly around the middle of the pipe. This tape will keep the pipe from falling into one bottle or the other.
*Fill one bottle about 2/3 full of the coldest water you can get. For example, if there is a drinking fountain with chilled water, it would be worth it to catch some in a cup and pour it into the bottle. Alternately, you could put half a handful of crushed ice or snow in both bottles, or you could put the bottles in the refrigerator or outside (if it's cold) for half an hour.
If the kids making the vortex are very young (kindergarten or first grade), only fill the bottle half full. It will be easier to carry and easier to start the vortex.
Step 3
Just before you wrap.
Push the pipe into the filled bottle until it gets hard to push (because of the tape). Turn the other bottle upside down and push it onto the other end of the pipe. Push and twist it pretty hard so it squishes some of the tape.
Just before sealing the bottles, turn them upside down so the empty bottle fills (a few drops of water might leak out). This will cool the air inside. The top bottle will probably dent in a little, which means the air in the system is contracting. Pull the bottles apart for an instant to let more air in, so the bottle is not dented anymore. Do this just before sealing so the air doesn't have a chance to warm up much.
Step 4
Seal the bottles together.
The key to sealing the bottles is to keep the plastic wrap tight as you wrap it on.
Take the roll out of the box and hold it in one hand. Start wrapping the end around the bottles with the other hand. At some point, the wrapped part won't slip off even when you pull very hard. You should notice the plastic stretching tightly over the necks of the bottles. That is where the seal will be made. Make at least 10 tight wraps around the bottles. I know I'm repeating myself, but if you don't apply the plastic under stretching tension, it won't seal.
Although the bottles should now be sealed, wrap some duct tape or electrical tape a few times around the plastic. This keeps it from unraveling, and it keeps the bottles from separating when you lift by the top bottle.
Step 5
Use it!
Turn the bottles over. Grab the very top and swish it in a circular motion two or three times, then stop suddenly. This will give the water enough angular momentum to create the vortex.
If the bottle dents, don't worry. It will fill out in an hour or so, as the water warms--if you sealed it well. Until then, while you swish the top end in a circle to get the water moving, squeeze the bottom bottle. When the dent is on the bottom, it doesn't interfere with the vortex.

thaumatrope

The thaumatrope is a good warm up for the movie wheel and it only takes a minute to make. The mysterious message written on the thaumatrope pattern will appear when you spin it. Interestingly, the thaumatrope preceded --and led to the invention of-- the movie wheel.
What you need.
thread, tape and scissors
The thread could be dental floss or even a couple of very thin rubber bands cut open and tied together.
STEP 1
Cut out the thaumatrope pattern and fold in half.

Check the printed page to make sure your browser did not re-scale the pattern. If it says something like "Scaled-60%," try another browser. Netscape seems to be the worst at re-scaling. Cut on the solid lines. Fold carefully right on the dashed line so the printed part is on the outside. Using a straight edge or a table corner will help make the fold straight.
STEP 2
Tape in the string and tape halves together.
Tape a string onto the non-printed side so it splits one of the halves, as shown. Then tape the halves shut. You should now have a flat rectangle with letters on both sides and a string splitting it right through the middle.
STEP 3
Try it out
Twirl the thaumatrope by rolling the string between the thumb and forefinger of each hand as fast as you can. If you are using dental floss, sometimes you have to roll it a while before it works smoothly. If using a thin rubber band, pull it slightly as you spin it. You should see "PERSISTENCE OF VISION." That expression was used to explain how we perceive animation. It is being superseded by the expression "phi phenomenon."

Scientists Propose Creation Of New Type Of Seed Bank

While an international seed bank on a Norwegian island has been making news with its agricultural collection, a group of U.S. scientists has just published an article outlining a different kind of seed bank, one that proposes gathering seeds of wild species –– at intervals in the future –– effectively capturing evolution in action.
In the October issue of BioScience, Steven J. Franks of Fordham University, Susan J. Mazer of the University of California, Santa Barbara, and a group of colleagues have proposed a method of collecting and storing seeds of natural plant populations. They argue for collecting many species in a way that allows evolutionary responses to future changes in climate to be detected. They call it the "Resurrection Initiative."
"In contrast to existing seed banks, which exist primarily for conservation, this collection would be for research that would allow a greater understanding of evolution," said Franks.
"This seed collection would form an important resource that can be used for many types of research, just as GenBank –– the collection of genetic sequences and information –– forms a key resource for research in genetics and genomics," said Franks.
"Typically, seed banks are focused on the preservation of agricultural species or other plant species of strong economic interest, say, forest species, forest trees," said Mazer. This is to make sure that scientists can maintain a genetically diverse seed pool in the event of some kind of ecological calamity that requires the replenishing of seeds from a certain part of the world or from certain species. "But that implies a relatively static view of a seed bank, a snapshot forever of what a species provides."
Evolutionary biologists recognize that the gene pool of any species is a dynamic resource that changes over time as a result of random events such as highly destructive climatic events like hurricanes, but also through sustained and ongoing processes like evolution by natural selection.
While most scientists agree that the climate is changing, the extent to which species will be able to evolve to keep up with these changes is unknown.
According to the article, the only way that scientists can detect the results of those sorts of calamitous changes –– and test evolutionary predictions about what sorts of changes might occur over time –– is to sample seed banks in a repeated fashion. Then they must compare the attributes of the gene pools that are sampled at different times to a baseline.
"One way that we can obtain this baseline is by collecting seeds at a given point in time and archiving them under ideal environmental conditions, so that they all stay alive, and so that 10, 20, and 30 years down the road, we can compare them to seeds that we collect in the future to see how the gene pool has changed," explained Mazer.
This approach will allow a number of things that a one-time seed-sampling event doesn't. Scientists can evaluate the effects of climate change, land-use change, and other kinds of environmental change, such as the spread of disease, on the gene pool.
"Currently seed banks don't allow this for a couple of reasons," said Mazer. "First, they focus on species that have been under cultivation for a long period. Species that have been under cultivation have relatively low levels of genetic variation –– because we have been selecting them only for the attributes that we want. Wild species, by contrast, contain a high degree of genetic variation in almost any trait that we might examine."
Agricultural species are often selected to have a predictable flowering time, a predictable seed size –– and a predictable degree of tolerance for drought, salt, or heavy metals. By contrast, wild species retain a much greater degree of genetic diversity in all of these traits.
Mazer explained that scientists don't know whether or not the environmental changes that are ongoing, due to changes in climate or land use practices, are reducing the amount of genetic variation in the wild. If they are, the only way it can be detected will be by sampling representative seeds from a large number of populations at very regular intervals.
"The approach that we would use is not simply to collect seeds over various time intervals and to archive them, but in the future to raise them in a common environment comparing seeds that were collected in 2010, 2030, and 2050, for example," said Mazer. "If we found, for example, that the plants that come from seeds that were collected 50 years from now flower much earlier than those that were collected today, we could logically infer that natural selection over 50 years had favored plants, that is genotypes that flowered earlier and earlier, relative to those that delayed flowering."
Mazer explained that scientists and the public have been thrilled recently by an increase in the understanding of the value of seed banks, and in particular with the seed bank that is underway in Norway, called the Svalbard Global Seed Vault, on the island of Spitsbergen.
"However, that kind of seed bank doesn't finish the job," said Mazer. "The Norwegian seed bank is planning to preserve hundreds of thousands of varieties of agricultural plant species, but most of those samples represent only a tiny fraction of that which you would find in a wild population of a wild species." Nor does it allow for insights into the evolutionary process, enabled by the combination of seed banking and subsequent raising of plants as proposed by the "Resurrection Initiative."

Wednesday, October 15, 2008

Major Discovery May Unleash Solar Revolution

MIT researchers have developed a new catalyst, consisting of cobalt metal, phosphate and an electrode. When the catalyst is placed in water and electricity runs through the electrode, oxygen gas is produced. When another catalyst is used to produce hydrogen gas, the oxygen and hydrogen can be combined inside a fuel cell, creating carbon-free electricity to power a house or an electric car, day or night. Photo / MIT News Office
In a revolutionary leap that could transform solar power from a marginal, boutique alternative into a mainstream energy source, MIT researchers have overcome a major barrier to large-scale solar power: storing energy for use when the sun doesn't shine.

Until now, solar power has been a daytime-only energy source, because storing extra solar energy for later use is prohibitively expensive and grossly inefficient. With today's announcement, MIT researchers have hit upon a simple, inexpensive, highly efficient process for storing solar energy. Requiring nothing but abundant, non-toxic natural materials, this discovery could unlock the most potent, carbon-free energy source of all: the sun.

'This is the nirvana of what we've been talking about for years,' said MIT's Daniel Nocera, the Henry Dreyfus Professor of Energy and senior author of a paper describing the work in the July 31 issue of Science. 'Solar power has always been a limited, far-off solution. Now we can seriously think about solar power as unlimited and soon.'

Inspired by the photosynthesis performed by plants, Nocera and Matthew Kanan, a postdoctoral fellow in Nocera's lab, have developed an unprecedented process that will allow the sun's energy to be used to split water into hydrogen and oxygen gases. Later, the oxygen and hydrogen may be recombined inside a fuel cell, creating carbon-free electricity to power your house or your electric car, day or night.

The key component in Nocera and Kanan's new process is a new catalyst that produces oxygen gas from water; another catalyst produces valuable hydrogen gas. The new catalyst consists of cobalt metal, phosphate and an electrode, placed in water. When electricity -- whether from a photovoltaic cell, a wind turbine or any other source -- runs through the electrode, the cobalt and phosphate form a thin film on the electrode, and oxygen gas is produced. Combined with another catalyst, such as platinum, that can produce hydrogen gas from water, the system can duplicate the water-splitting reaction that occurs during photosynthesis.

The new catalyst works at room temperature, in neutral pH water, and it's easy to set up, Nocera said. 'That's why I know this is going to work. It's so easy to implement,' he said.

'Giant leap' for clean energy

Sunlight has the greatest potential of any power source to solve the world's energy problems, said Nocera. In one hour, enough sunlight strikes the Earth to provide the entire planet's energy needs for one year.

James Barber, a leader in the study of photosynthesis who was not involved in this research, called the discovery by Nocera and Kanan a 'giant leap' toward generating clean, carbon-free energy on a massive scale. 'This is a major discovery with enormous implications for the future prosperity of humankind,' said Barber, the Ernst Chain Professor of Biochemistry at Imperial College London. 'The importance of their discovery cannot be overstated since it opens up the door for developing new technologies for energy production, thus reducing our dependence on fossil fuels and addressing the global climate change problem.'

'Just the beginning'

Currently available electrolyzers, which split water with electricity and are often used industrially, are not suited for artificial photosynthesis because they are very expensive and require a highly basic (non-benign) environment that has little to do with the conditions under which photosynthesis operates. More engineering work needs to be done to integrate the new scientific discovery into existing photovoltaic systems, but Nocera said he is confident that such systems will become a reality.

'This is just the beginning,' said Nocera, principal investigator for the Solar Revolution Project funded by the Chesonis Family Foundation and co-director of the Eni-MIT Solar Frontiers Center. 'The scientific community is really going to run with this.'

Nocera hopes that within 10 years, homeowners will be able to power their homes in daylight through photovoltaic cells, while using excess solar energy to produce hydrogen and oxygen to power their own household fuel cell. Electricity-by-wire from a central source could be a thing of the past.
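For a rough sense of scale on the store-by-day, use-by-night idea: splitting water stores about 237 kJ of recoverable electrical energy per mole of hydrogen (the standard Gibbs energy of the reaction). The household draw and fuel-cell efficiency in the sketch below are assumptions for illustration, not figures from the MIT work.

```python
# How much hydrogen would a house need to get through one night?
# Water splitting: 2 H2O -> 2 H2 + O2; about 237 kJ of electrical work
# per mole of H2 is the thermodynamic best case (standard Gibbs energy).

DG_PER_MOL_H2 = 237.1e3    # J/mol, standard Gibbs energy of water splitting
MOLAR_MASS_H2 = 2.016e-3   # kg/mol

overnight_use_kwh = 10.0   # assumed household electricity draw overnight
fuel_cell_eff = 0.6        # assumed fuel-cell electrical efficiency

energy_needed = overnight_use_kwh * 3.6e6 / fuel_cell_eff   # J of H2 "fuel"
moles_h2 = energy_needed / DG_PER_MOL_H2
mass_h2 = moles_h2 * MOLAR_MASS_H2

print(f"hydrogen required: {moles_h2:.0f} mol (about {mass_h2:.2f} kg)")
# Roughly half a kilogram of hydrogen per night under these assumptions,
# which is why cheap, efficient water-splitting catalysts matter.
```

Under these assumed numbers the answer comes out around 0.5 kg of hydrogen per night, a useful mental yardstick for a household-scale system.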

Quantum 'Traffic Jam' in High-Temperature Superconductors

Scientists at the U.S. Department of Energy's Brookhaven National Laboratory, in collaboration with colleagues at Cornell University, Tokyo University, the University of California, Berkeley, and the University of Colorado, have uncovered the first experimental evidence for why the transition temperature of high-temperature superconductors — the temperature at which these materials carry electrical current with no resistance — cannot simply be elevated by increasing the electrons' binding energy. The research — to be published in the August 28, 2008, issue of Nature — demonstrates how, as electron-pair binding energy increases, the electrons' tendency to get caught in a quantum mechanical 'traffic jam' overwhelms the interactions needed for the material to act as a superconductor — a freely flowing fluid of electron pairs.

'We've made movies to show this traffic jam as a function of energy. At some energies, the traffic is moving and at others the electron traffic is completely blocked,' said physicist J.C. Seamus Davis of Brookhaven National Laboratory and Cornell University, lead author on the paper. Davis will be giving a Pagels Memorial Public Lecture to announce these results at the Aspen Center for Physics on August 27.

Understanding the detailed mechanism for how quantum traffic jams (technically referred to as 'Mottness' after the late Sir Nevill Mott of Cambridge, UK) impact superconductivity in cuprates may point scientists toward new materials that can be made to act as superconductors at significantly higher temperatures suitable for real-world applications such as zero-loss energy generation and transmission systems and more powerful computers.

The idea that increasing binding energy could elevate a superconductor's transition temperature stems from the mechanism underlying conventional superconductors' ability to carry current with no resistance. In those materials, which operate close to absolute zero (0 kelvin, or -273 degrees Celsius), electrons carry current by forming so-called Cooper pairs. The more strongly bound those electron pairs, the higher the transition temperature of the superconductor.

But unlike those metallic superconductors, the newer forms of high-temperature superconductors, first discovered some 20 years ago, originate from non-metallic, Mott-insulating materials. Elevating these materials' pair-binding energy only appears to push the transition temperature farther down, closer to absolute zero rather than toward the desired goal of room temperature or above. 'It has been a frustrating and embarrassing problem to explain why this is the case,' Davis said. Davis's research now offers an explanation.

In the insulating 'parent' materials from which high-temperature superconductors arise, which are typically made of materials containing copper and oxygen, each copper atom has one 'free' electron. These electrons, however, are stuck in a Mott insulating state — the quantum traffic jam — and cannot move around. By removing a few of the electrons — a process called 'hole doping' — the remaining electrons can start to flow from one copper atom to the next. In essence, this turns the material from an insulator to a metallic state, but one with the startling property that it superconducts — it carries electrical current effortlessly without any losses of energy. 'It's like taking some cars off the highway during rush hour. All of a sudden, the traffic starts to move,' said Davis.

The proposed mechanism for how these materials carry the current depends on magnetic interactions between the electrons causing them to form superconducting Cooper pairs. Davis's research, which used 'quasiparticle interference imaging' with a scanning tunneling microscope to study the electronic structure of a cuprate superconductor, indicates that those magnetic interactions get stronger as you remove holes from the system. So, even as the binding energy, or ability of electrons to link up in pairs, gets higher, the 'Mottness,' or quantum traffic-jam effect, increases even more rapidly and diminishes the ability of the supercurrent to flow.

'In essence, the research shows that what is believed to be required to increase the superconductivity in these systems — stronger magnetic interactions — also pushes the system closer to the 'quantum traffic-jam' status, where lack of holes locks the electrons into positions from which they cannot move. It's like gassing up the cars and then jamming them all onto the highway at once. There's lots of energy, but no ability to go anywhere,' Davis said.

With this evidence pointing the scientists to a more precise theoretical understanding of the problem, they can now begin to explore solutions. 'We need to look for materials with such strong pairing but which don't exhibit this Mottness or 'quantum traffic-jam' effect,' Davis said.

Scientists at Brookhaven are now investigating promising new materials in which the basic elements are iron and arsenic instead of copper and oxygen. 'Our hope is that they will have less 'traffic-jam' effect while having stronger electron pairing,' Davis said. Techniques developed for the current study should allow them to find out.

This research was funded primarily by the Brookhaven Lab's Laboratory-Directed Research and Development Fund, by the Office of Basic Energy Sciences within DOE's Office of Science, by a Grant-in-Aid for Scientific Research from the Ministry of Science and Education (Japan), and by the 21st Century COE Program of the Japan Society for the Promotion of Science.

New 'Nano-Positioners' May Have Atomic-Scale Precision

Engineers have created a tiny motorized positioning device that has twice the dexterity of similar devices being developed for applications that include biological sensors and more compact, powerful computer hard drives.

The device, called a monolithic comb drive, might be used as a 'nanoscale manipulator' that precisely moves or senses movement and forces. The devices also can be used in watery environments for probing biological molecules, said Jason Vaughn Clark, an assistant professor of electrical and computer engineering and mechanical engineering, who created the design.

The monolithic comb drives could make it possible to improve a class of probe-based sensors that detect viruses and biological molecules. The sensors detect objects using two different components: a probe is moved while at the same time the platform holding the specimen is positioned. The new technology would replace both components with a single one - the monolithic comb drive.

The innovation could allow sensors to work faster and at higher resolution and would be small enough to fit on a microchip. The higher resolution might be used to design future computer hard drives capable of high-density data storage and retrieval. Another possible use might be to fabricate or assemble miniature micro and nanoscale machines.

Research findings were detailed in a technical paper presented in July during the University Government Industry Micro/Nano Symposium in Louisville. The work is based at the Birck Nanotechnology Center at Purdue's Discovery Park.

Conventional comb drives have a pair of comblike sections with 'interdigitated fingers,' meaning they mesh together. These meshing fingers are drawn toward each other when a voltage is applied. The applied voltage causes the fingers on one comb to become positively charged and the fingers on the other comb to become negatively charged, inducing an attraction between the oppositely charged fingers. If the voltage is removed, the spring-loaded comb sections return to their original position.

By comparison, the new monolithic device has a single structure with two perpendicular comb drives. Clark calls the device monolithic because it contains comb drive components that are not mechanically and electrically separate. Conventional comb drives are structurally 'decoupled' to keep opposite charges separated.

'Comb drives represent an advantage over other technologies,' Clark said. 'In contrast to piezoelectric actuators that typically deflect, or move, a fraction of a micrometer, comb drives can deflect tens to hundreds of micrometers. And unlike conventional comb drives, which only move in one direction, our new device can move in two directions - left to right, forward and backward - an advance that could really open up the door for many applications.'

Clark also has invented a way to determine the precise deflection and force of such microdevices while reducing heat-induced vibrations that could interfere with measurements. Current probe-based biological sensors have a resolution of about 20 nanometers. 'Twenty nanometers is about the size of 200 atoms, so if you are scanning for a particular molecule, it may be hard to find,' Clark said. 'With our design, the higher atomic-scale resolution should make it easier to find.'

Properly using such devices requires engineers to know precisely how much force is being applied to comb drive sensors and how far they are moving. The new design is based on a technology created by Clark called electro micro metrology, which enables engineers to determine the precise displacement and force that's being applied to, or by, a comb drive. The Purdue researcher is able to measure this force by comparing changes in electrical properties such as capacitance or voltage.

Clark used computational methods called nodal analysis and finite element analysis to design, model and simulate the monolithic comb drives. The research paper describes how the monolithic comb drive works when voltage is applied. The results show independent left-right and forward-backward movement as functions of applied voltage in color-coded graphics.

The findings are an extension of research to create an ultra-precise measuring system for devices having features on the size scale of nanometers, or billionths of a meter. Clark has led research to create devices that 'self-calibrate,' meaning they are able to precisely measure themselves. Such measuring methods and standards are needed to better understand and exploit nanometer-scale devices.

The size of the entire device is less than one millimeter, or a thousandth of a meter. The smallest feature size is about three micrometers, roughly one-thirtieth as wide as a human hair. 'You can make them smaller, though,' Clark said. 'This is a proof of concept. The technology I'm developing should allow researchers to practically and efficiently extract dozens of geometric and material properties of their microdevices just by electronically probing changes in capacitance or voltage.'

In addition to finite element analysis, Clark used a simulation tool that he developed called Sugar. 'Sugar is fast and allows me to easily try out many design ideas,' he said. 'After I narrow down to a particular design, I then use finite element analysis for fine-tuning. Finite element analysis is slow, but it is able to model subtle physical phenomena that Sugar doesn't do as well.'

Clark's research team is installing Sugar on the nanoHub this summer, making the tool available to other researchers. The nanoHub is operated by the Network for Computational Nanotechnology, funded by the National Science Foundation and housed at Purdue's Birck Nanotechnology Center. The researchers also are in the process of fabricating the devices at the Birck Nanotechnology Center.
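For a feel for the forces involved, the standard textbook estimate for a lateral electrostatic comb drive is F ≈ N·ε0·t·V²/g, where N is the number of movable fingers (each flanked by two gaps), t the finger depth, g the gap, and V the applied voltage. The geometry and voltage below are guesses chosen to sit near the ~3-micrometer feature size mentioned above; they are not Clark's actual design values.

```python
# Ballpark force from a lateral comb drive (textbook parallel-plate model).
# F ~= N * eps0 * t * V^2 / g for N movable fingers, each with two gaps.

EPS0 = 8.854e-12   # F/m, vacuum permittivity

n_fingers = 100    # assumed number of movable fingers
t = 3e-6           # m, finger depth (out of plane), matching ~3 um features
g = 3e-6           # m, gap between fingers (assumed)
v = 30.0           # V, applied drive voltage (assumed)

force = n_fingers * EPS0 * t * v**2 / g
print(f"estimated electrostatic force: {force*1e6:.2f} micronewtons")
# Micronewton-scale forces and micrometer-scale strokes are typical of comb
# drives, which is why they suit gentle, precise nanoscale positioning.
```

The same capacitance relationship, read in reverse, is the idea behind measuring displacement and force electrically: a change in overlap shows up as a change in capacitance.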

Cluster Watches Earth’s Leaky Atmosphere

Oxygen is constantly leaking out of Earth's atmosphere and into space. Now, ESA's formation-flying quartet of satellites, Cluster, has discovered the physical mechanism that is driving the escape. It turns out that the Earth's own magnetic field is accelerating the oxygen away.

The new work uses data collected by Cluster from 2001 to 2003. During this time, Cluster amassed information about beams of electrically charged oxygen atoms, known as ions, flowing outwards from the polar regions into space. Cluster also measured the strength and direction of the Earth's magnetic field whenever the beams were present.

Hans Nilsson, Swedish Institute of Space Physics, headed a team of space scientists who analysed the data. They discovered that the oxygen ions were being accelerated by changes in the direction of the magnetic field. 'It is a bit like a sling-shot effect,' says Nilsson.

Having all four Cluster spacecraft was essential to the analysis because it gave astronomers a way to measure the strength and direction of the magnetic field over a wide area. 'Cluster allowed us to measure the gradient of the magnetic field and see how it was changing direction with time,' says Nilsson.

Before the space age, scientists believed that Earth's magnetic field was filled only with particles from the solar wind, the constant sleet of particles that escapes from the Sun. They thought this formed a large cushion that protected the Earth's atmosphere from direct interaction with the solar wind. 'We are beginning to realise just how many interactions can take place between the solar wind and the atmosphere,' says Nilsson.

Energetic particles from the solar wind can be channelled along the magnetic field lines and, when these impact the atmosphere of the Earth, they can produce aurorae. This occurs over the poles of Earth. The same interactions provide the oxygen ions with enough energy to accelerate out of the atmosphere and reach the Earth's magnetic environment.

The Cluster data were captured over the poles with the satellites flying at an altitude of anywhere between 30,000 and 64,000 kilometres. Measurements taken by earlier satellites during the 1980s and 1990s showed that the escaping ions were travelling faster the higher they were observed. This implied that some sort of acceleration mechanism was involved and several possibilities were proposed. Thanks to this new Cluster study, the mechanism accounting for most of the acceleration has now been identified.

At present, the escape of oxygen is nothing to worry about. Compared to the Earth's stock of the life-supporting gas, the amount escaping is negligible. However, in the far future when the Sun begins to heat up in old age, the balance might change and the oxygen escape may become significant. 'We can only predict these future changes if we understand the mechanisms involved,' says Nilsson.

For now, Cluster will continue collecting data and providing new insights into the complex magnetic environment surrounding our planet.

Large Hadron Collider Set To Unveil A New World Of Particle Physics

The field of particle physics is poised to enter unknown territory with the startup of a massive new accelerator--the Large Hadron Collider (LHC)--in Europe this summer. On September 10, LHC scientists will attempt to send the first beam of protons speeding around the accelerator.

The LHC will put hotly debated theories to the test as it generates a bonanza of new experimental data in the coming years. Potential breakthroughs include an explanation of what gives mass to fundamental particles and identification of the mysterious dark matter that makes up most of the mass in the universe. More exotic possibilities include evidence for new forces of nature or hidden extra dimensions of space and time.

'The LHC is a discovery machine. We don't know what we'll find,' said Abraham Seiden, professor of physics and director of the Santa Cruz Institute for Particle Physics (SCIPP) at the University of California, Santa Cruz.

SCIPP was among the initial group of institutions that spearheaded U.S. participation in the LHC. About half of the entire U.S. experimental particle-physics community has focused its energy on the ATLAS and CMS detectors, the largest of four detectors where experiments will be performed at the LHC. SCIPP researchers have been working on the ATLAS project since 1994. It is one of many international physics and astrophysics projects that have drawn on SCIPP's 20 years of experience developing sophisticated technology for tracking high-energy subatomic particles.

The scale of the LHC is gigantic in every respect--its physical size, the energies attained, the amount of data it will generate, and the size of the international collaboration involved in its planning, construction, and operation. In September, high-energy beams of protons will begin circulating around the LHC's 27-kilometer (16.8-mile) accelerator ring located 100 meters (328 feet) underground at CERN, the European particle physics lab based in Geneva, Switzerland. After a period of testing, the beams will cross paths inside the detectors and the first collisions will take place. Even before the machine is ramped up to its maximum energy early next year, it will smash protons together with more energy than any previous collider. The debris from those collisions--showers of subatomic particles that the detectors will track and record--will yield results that could radically change our understanding of the physical world.

In a talk at the American Physical Society meeting earlier this year, Seiden gave an overview of the LHC research program, including a rough timeline for reaching certain milestones. One of the most highly anticipated milestones, for example, is detection of the Higgs boson, a hypothetical particle that would fill a major gap in the standard model of particle physics by endowing fundamental particles with mass. Detection of the Higgs boson is most likely to occur in 2010, Seiden said. But there's no guarantee that the particle actually exists; nature may have found another way to create mass. 'I'm actually hoping we find something unexpected that does the job of the Higgs,' Seiden said.

Technically, the Higgs boson was postulated to explain a feature of particle interactions known as the breaking of electroweak symmetry, and the LHC is virtually guaranteed to explain that phenomenon, according to theoretical physicist Howard Haber. 'We've been debating this for 30 years, and one way or another, the LHC will definitively tell us how electroweak symmetry breaking occurs. That's a fundamental advance,' said Haber, a professor of physics at UCSC.

Haber and other theorists have spent years imagining possible versions of nature, studying their consequences, and describing in detail what the evidence would look like in the experimental data from a particle accelerator such as the LHC. The Higgs boson won't be easy to find, he said. The LHC should produce the particles in abundance (if they exist), but most of them will not result in a very distinctive signal in the detectors. 'It's a tough game. You can only do it by statistical analysis, since there are other known processes that produce events that can mimic a Higgs boson signal,' Haber said.

Evidence to support another important theory--supersymmetry--could show up sooner. In many ways, supersymmetry is a more exciting possibility than the Higgs boson, according to theorist Michael Dine, also a professor of physics at UCSC. 'By itself, the Higgs is a very puzzling particle, so there have been a lot of conjectures about some kind of new physics beyond the standard model. Supersymmetry has the easiest time fitting in with what we know,' Dine said.

Adding to its appeal, supersymmetry predicts the existence of particles that are good candidates to account for dark matter. Astronomers have detected dark matter through its gravitational effects on stars and galaxies, but they don't yet know what it is. Particles predicted by supersymmetry that could account for dark matter may be identified at the LHC as early as next year, Seiden said. 'Initially, we'll be looking for things that are known standards to make sure that everything is working properly. In 2009, we could start really looking for new things like supersymmetry,' he said.

The massive ATLAS detector--45 meters (148 feet) long and 25 meters (82 feet) high--has involved more than 2,000 physicists at 166 institutions. Seiden's team at SCIPP has been responsible for developing the silicon sensors and electronics for the detector's inner tracker, which measures the trajectories of charged particles as they first emerge from the site of the collisions.

Seiden is now leading the U.S. effort to develop a major upgrade of ATLAS. The current detector is designed to last for 10 years, and the upgrade will coincide with a planned increase in the luminosity of the proton beams at the LHC (which will then become the 'Super LHC'). 'These large projects take such a long time, we have to start early,' Seiden said.

Meanwhile, operation and testing of the current ATLAS detector is already under way at CERN, said Alexander Grillo, a SCIPP research physicist who has been working on the project from the start. 'We've been operating it and looking at cosmic ray particles,' he said. 'Nature gives us these cosmic rays for free, and they're the same kinds of particles we'll see when the machine turns on, so it enables us to check out certain aspects of the detector. But we're very excited to start seeing collisions from the machine.'

ATLAS and the other LHC detectors are designed with 'trigger' systems that ignore most of the signals and record only those events likely to yield interesting results. Out of the hundreds of millions of collisions happening every second inside the detector, only 100 of the most promising events will be selected and recorded in the LHC's central computer system. 'We'll be throwing away a lot of data, so we have to make sure the triggers are working correctly,' Seiden said.

Grillo noted that the ATLAS project has been a great opportunity for UCSC students. Both graduate students and undergraduates have been involved in the development of the detector, along with postdoctoral researchers, research physicists, and senior faculty. 'The graduate students and postdocs get to go to Geneva, but even the undergraduates get a chance to work in a real physics lab and be part of a major international experiment,' Grillo said.

SCIPP's prominent role in the LHC is also a boon for theoretical physicists at UCSC who are not directly involved in the collaboration, such as Dine, Haber, Thomas Banks, and others. 'There is a high level of interaction and camaraderie between theorists and experimentalists at UCSC, which is not the case at other leading institutions,' Dine said. 'For me, it's valuable just in terms of being aware of what's happening on the experimental side.'

According to Haber, the LHC is certain to generate a lot of excitement in the world of physics. 'If nothing were found beyond what we know today, that would be so radical, because it would be in violation of a lot of extremely fundamental principles,' he said.

Mars Odyssey Shifting Orbit for Extended Mission

The longest-serving of six spacecraft now studying Mars is up to new tricks for a third two-year extension of its mission to examine the most Earthlike of known foreign planets.

NASA's Mars Odyssey is altering its orbit to gain even better sensitivity for its infrared mapping of Martian minerals. During the mission extension through September 2010, it will also point its camera with more flexibility than it has ever used before. Odyssey reached Mars in 2001.

The orbit adjustment will allow Odyssey's Thermal Emission Imaging System to look down at sites when it's mid-afternoon, rather than late afternoon. The multipurpose camera will take advantage of the infrared radiation emitted by the warmer rocks to provide clues to the rocks' identities. 'This will allow us to do much more sensitive detection and mapping of minerals,' said Odyssey Project Scientist Jeffrey Plaut of NASA's Jet Propulsion Laboratory, Pasadena, Calif.

The mission's orbit design before now used a compromise between what works best for the Thermal Emission Imaging System and what works best for another instrument, the Gamma Ray Spectrometer.

On commands from its operations team at JPL and at Denver-based Lockheed Martin Space Systems, Odyssey fired thrusters for nearly 6 minutes on Sept. 30, the final day of the mission's second two-year extension. 'This was our biggest maneuver since 2002, and it went well,' said JPL's Gaylon McSmith, Odyssey mission manager. 'The spacecraft is in good health. The propellant supply is adequate for operating through at least 2015.'

Odyssey's orbit is synchronized with the sun. For the past five years, the local solar time has been about 5 p.m. at whatever spot on Mars Odyssey flew over as it made its dozen daily passes from the north pole region to the south pole region. (Likewise, the local time has been about 5 a.m. under the track of the spacecraft during the south-to-north leg of each orbit.)

The push imparted by the Sept. 30 maneuver will gradually change that synchronization over the next year or so. Its effect is that the time of day on the ground when Odyssey is overhead is now getting earlier by about 20 seconds per day. A follow-up maneuver, probably in late 2009 when the overpass time is between 2:30 and 3:00 p.m., will end the progression toward earlier times.

While aiding performance of the Thermal Emission Imaging System, the shift to mid-afternoon is expected to stop the use of one of three instruments in Odyssey's Gamma Ray Spectrometer suite. The suite's gamma ray detector needs a later-hour orbit to avoid overheating of a critical component. The suite's neutron spectrometer and high-energy neutron detector are expected to keep operating.

The Gamma Ray Spectrometer provided dramatic discoveries of water-ice near the surface throughout much of high-latitude Mars, the impetus for NASA's Phoenix Mars Lander mission. The gamma ray detector has also mapped global distribution of many elements, such as iron, silicon and potassium, a high science priority for the first and second extensions of the Odyssey mission. A panel of planetary scientists assembled by NASA recommended this year that Odyssey make the orbit adjustment to get the best science return from the mission in coming years.

Increased sensitivity for identifying surface minerals is a key science goal for the mission extension beginning this month. Also, the Odyssey team plans to begin occasionally aiming the camera away from the straight-down pointing that has been used throughout the mission. This will allow the team to fill in some gaps in earlier mapping and also create some stereo, three-dimensional imaging.

Odyssey will continue providing crucial support for Mars surface missions as well as conducting its own investigations. It has relayed to Earth nearly all data returned from NASA rovers Spirit and Opportunity. It shares with NASA's Mars Reconnaissance Orbiter the relay role for Phoenix. It has made targeted observations for evaluating candidate landing sites.

Mars Odyssey, launched in 2001, is managed by JPL, a division of the California Institute of Technology, Pasadena, for NASA's Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, is the prime contractor for the project and built the spacecraft. Investigators at Arizona State University, Tempe, operate the Thermal Emission Imaging System. Investigators at the University of Arizona, Tucson, head operation of the Gamma Ray Spectrometer. Additional science partners are located at the Russian Aviation and Space Agency, which provided the high-energy neutron detector, and at Los Alamos National Laboratory, New Mexico, which provided the neutron spectrometer.
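The timing of the follow-up maneuver falls out of simple arithmetic: drifting about 20 seconds earlier per day from a 5 p.m. overpass, how long until the overpass time reaches the 2:30-3:00 p.m. window? The sketch below just does that arithmetic; the resulting date is approximate, not an official mission schedule.

```python
# When does a 5:00 p.m. overpass drift back to about 2:45 p.m.
# at roughly 20 seconds earlier per day?
from datetime import date, timedelta

drift_per_day_s = 20.0                   # seconds earlier per day (from the article)
start = date(2008, 9, 30)                # day of the orbit-change maneuver
target_shift_s = (2 * 3600) + (15 * 60)  # 5:00 p.m. -> 2:45 p.m. is 2 h 15 min

days_needed = target_shift_s / drift_per_day_s
print(f"about {days_needed:.0f} days, i.e. around {start + timedelta(days=days_needed)}")
# ~405 days lands in late 2009, matching the planned follow-up maneuver.
```

The roughly 405-day drift is why the article quotes "probably in late 2009" for the second burn.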

New Material Mimics The Ability Of Gecko Feet To Grip Surfaces

Scanning electron microscope image of the vertically-aligned multi-walled carbon nanotubes grown for this research. Image courtesy: Georgia Tech
The race for the best "gecko foot" dry adhesive got a new competitor this week with a stronger and more practical material reported in the journal Science by a team of researchers from four U.S. institutions.

Scientists have long been interested in the ability of gecko lizards to scurry up walls and cling to ceilings by their toes. The creatures owe this amazing ability to microscopic branched elastic hairs in their toes that take advantage of atomic-scale attractive forces to grip surfaces and support surprisingly heavy loads. Several research groups have attempted to mimic those hairs with structures made of polymers or carbon nanotubes.

In a paper published in the October 10 issue of Science, researchers from the University of Dayton, the Georgia Institute of Technology, the Air Force Research Laboratory and the University of Akron describe an improved carbon nanotube-based material that for the first time creates directionally-varied (anisotropic) adhesive force. With a gripping ability nearly three times the previous record – and ten times better than a real gecko at resisting perpendicular shear forces – the new carbon nanotube array could give artificial gecko feet the ability to tightly grip vertical surfaces while being easily lifted off when desired.

Beyond the ability to walk on walls, the material could have many technological applications, including connecting electronic devices and substituting for conventional adhesives in the dry vacuum of space. The research has been sponsored by the National Science Foundation and the U.S. Air Force Research Laboratory at Wright-Patterson Air Force Base near Dayton, Ohio.

"The resistance to shear force keeps the nanotube adhesive attached very strongly to the vertical surface, but you can still remove it from the surface by pulling away from the surface in a normal direction," explained Liming Dai, the Wright Brothers Institute Endowed Chair in the School of Engineering at the University of Dayton. "This directional difference in the adhesion force is a significant improvement that could help make this material useful as a transient adhesive."

The key to the new material is the use of rationally-designed multi-walled carbon nanotubes formed into arrays with "curly entangled tops," said Zhong Lin Wang, a Regents' Professor in the Georgia Tech School of Materials Science and Engineering. The tops, which Wang compared to spaghetti or a jungle of vines, mimic the hierarchical structure of real gecko feet, which include branching hairs of different diameters.

When pressed onto a vertical surface, the tangled portion of the nanotubes becomes aligned in contact with the surface. That dramatically increases the amount of contact between the nanotubes and the surface, maximizing the van der Waals forces that occur at the atomic scale. When lifted off the surface in a direction parallel to the main body of the nanotubes, only the tips remain in contact, minimizing the attraction forces, Wang explained.

"The contact surface area matters a lot," he noted. "When you have line contact, you have van der Waals forces acting along the entire length of the nanotubes, but when you have a point contact, the van der Waals forces act only at the tip of the nanotubes. That allows us to truly mimic what the gecko does naturally."

In tests done on a variety of surfaces – including glass, a polymer sheet, Teflon and even rough sandpaper – the researchers measured adhesive forces of up to 100 newtons per square centimeter in the shear direction. In the normal direction, the adhesive forces were 10 newtons per square centimeter – about the same as a real gecko. The resistance to shear increased with the length of the nanotubes, while the resistance to normal force was independent of tube length.

Though the material might seem most appropriate for use by Spider-Man, the real applications may be less glamorous. Because carbon nanotubes conduct heat and electrical current, the dry adhesive arrays could be used to connect electronic devices. "Thermal management is a real problem today in electronics, and if you could use a nanotube dry adhesive, you could simply apply the devices and allow van der Waals forces to hold them together," Wang noted. "That would eliminate the heat required for soldering."

Another application might be for adhesives that work long-term in space. "In space, there is a vacuum and traditional kinds of adhesives dry out," Dai noted. "But nanotube dry adhesives would not be bothered by the space environment."

In addition to those already mentioned, the research team also included Liangti Qu from the University of Dayton, Morley Stone from the Air Force Research Laboratory, and Zhenhai Xia from the University of Akron. Qu, a research assistant in the laboratory of Liming Dai, grew the nanotube arrays with a low-pressure chemical vapor deposition process on a silicon wafer. During the pyrolytic growth of the vertically-aligned multi-walled nanotubes, the initial segments grew in random directions and formed a top layer of coiled and entangled nanotubes. This layer helped to increase the nanotube area available for contacting a surface. Qu noted that sample purity was another key factor in ensuring strong adhesion for the carbon nanotube dry adhesive.

For the future, the researchers hope to learn more about the surface interactions so they can further increase the adhesive force. They also want to study the long-term durability of the adhesive, which in a small number of tests became stronger with each attachment. And they may also determine how much adhesive might be necessary to support a human wearing tights and a red mask.

"Because the surfaces may not be uniform, the adhesive force produced by a larger patch may not increase linearly with the size," Dai said. "There is much we still need to learn about the contact between nanotubes and different surfaces."
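Using the adhesion figures reported above, it is easy to estimate the patch size needed to hold a person, the question the researchers joke about at the end. The body mass below is an assumed round number; the strengths are the 100 N/cm2 shear and 10 N/cm2 normal values from the article, and real performance may not scale linearly with area.

```python
# How big a nanotube patch would it take to hold a person on a wall?
# Shear (sliding down the wall) and normal (pulling straight off) differ.

G = 9.81                 # m/s^2
mass_kg = 70.0           # assumed body mass
weight_n = mass_kg * G

shear_strength = 100.0   # N/cm^2, reported shear adhesion
normal_strength = 10.0   # N/cm^2, reported normal adhesion

print(f"weight to support: {weight_n:.0f} N")
print(f"patch area if loaded in shear:  {weight_n / shear_strength:.1f} cm^2")
print(f"patch area if loaded normally:  {weight_n / normal_strength:.1f} cm^2")
# A palm-sized patch in shear, in principle -- though the article notes that
# adhesion may not scale linearly with patch size on non-uniform surfaces.
```

Under these assumptions, a few square centimeters loaded in shear would carry the weight, while pulling straight off would need roughly ten times the area, which is exactly the anisotropy the design aims for.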

Scientists Engineer Superconducting Thin Films

One major goal on the path toward making useful superconducting devices has been engineering materials that act as superconductors at the nanoscale — the realm of billionths of a meter. Such nanoscale superconductors would be useful in devices such as superconductive transistors and eventually in ultrafast, power-saving electronics.

In the October 9, 2008, issue of Nature, scientists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory report that they have successfully produced two-layer thin films where neither layer is superconducting on its own, but which exhibit a nanometer-thick region of superconductivity at their interface. Furthermore, they demonstrate the ability to elevate the temperature of superconductivity at this interface to temperatures exceeding 50 kelvin (-370°F), a relatively high temperature deemed more practical for real-world devices.

"This work provides definitive proof of our ability to produce robust superconductivity at the interface of two layers confined within an extremely thin, 1-2-nanometer-thick layer near the physical boundary between the two materials," said physicist Ivan Bozovic, who leads the Brookhaven thin film research team. "It opens vistas for further progress, including using these techniques to significantly enhance superconducting properties in other known or new superconductors."

Bozovic foresees future research investigating different combinations of non-superconducting materials. "Further study of the temperature-enhancement mechanism might even tell us something about the big puzzle — the mechanism underlying high-temperature superconductivity, which remains one of the most important open problems in condensed matter physics," he said.

Bozovic's team had reported in 2002 the bizarre observation that the critical temperature — the temperature below which the sample superconducts — could be enhanced by as much as 25 percent in bilayers of two dissimilar copper-based materials. However, at that time, the scientists had no understanding of what caused this enhancement and in which part of the sample the superconductivity was located.

To investigate this further, they synthesized more than 200 single-phase, bilayer and trilayer films with insulating, metallic, and superconducting blocks in all possible combinations and of varying layer thickness. The films were grown in a unique atomic-layer-by-layer molecular beam epitaxy system designed and built by Bozovic and coworkers to enable the synthesis of atomically smooth films as well as multilayers with perfect interfaces.

"The greatest technical challenge was to prove convincingly that the superconducting effect does not come from simple mixing of the two materials and formation of a third, chemically and physically distinct layer between the two constituent layers," Bozovic said. Collaborators at Cornell University ruled out this possibility using atomic-resolution transmission electron microscopy to identify the samples' constituent chemical elements, proving that the layers indeed remained distinct.

"It is too early to tell what applications this research might yield," Bozovic said, "but already at this stage we can speculate that this brings us one big step closer to fabrication of useful three-terminal superconducting devices, such as a superconductive field-effect transistor." In such a device, one would be able to switch the transistor from the superconducting to the resistive state by means of an external electric field, controlled by applying a voltage and using the third (gate) electrode. Circuits built from such devices would be much faster and use less power than the current ones based on semiconductors.

"No matter what the applications, this work is a nice demonstration of our ability to engineer and control materials at sub-nanometer scale, with designed and enhanced functionality," Bozovic said.

The Brookhaven scientists have filed a U.S. provisional patent application for this work. For information about licensing, please contact Kimberley Elcess, 631-344-4151, elcess@bnl.gov.

In addition to Bozovic, the research team includes Adrian Gozar, Gennady Logvenov, and Anthony Bollinger of Brookhaven Lab, Lenna Fitting Kourkoutis and David A. Muller of Cornell University, and Lucille A. Giannuzzi of the FEI Company, Hillsboro, Oregon. The research at Brookhaven Lab was funded by the Office of Basic Energy Sciences within the DOE's Office of Science; the Cornell work was funded by the Office of Naval Research.

Sharper Jupiter Images From Next-Generation Adaptive Optics

A 2008 image of Jupiter obtained by the Very Large Telescope in Chile shows a shift in high-level atmospheric haze compared with a 2005 image taken by the Hubble Space Telescope. The brightest haze has shifted southward by about 6,000 kilometers, probably in response to a global upheaval that began two years ago. The false-color, infrared VLT image combines a series of images taken over 20 minutes on Aug. 17 by a Multi-Conjugate Adaptive Optics Demonstrator (MAD) prototype mounted on the telescope. The image sharpening corresponds to seeing details about 300 kilometers wide on the planet. Credit: ESO/F. Marchis, M. Wong, E. Marchetti, P. Amico, S. Tordo

A two-hour observation of Jupiter using an improved technique to remove atmospheric blur has produced the sharpest whole-planet picture ever taken from the ground, according to astronomers from the University of California, Berkeley, and the European Southern Observatory (ESO). The series of 265 snapshots, taken with the help of a prototype Multi-Conjugate Adaptive Optics (MCAO) instrument mounted on ESO's Very Large Telescope (VLT), revealed changes over the past three years in Jupiter's smog-like haze, probably a response to a planet-wide upheaval more than a year ago.

The images prove the value of multi-conjugate systems, which use two or more guide stars instead of one as references to sense the atmospheric turbulence in the instrument's field of view, and two or more deformable mirrors to correct for it. The multiple-star technique produces sharp images over a wider area of sky - an area about three times larger than that produced by single-star adaptive optics systems employed on large telescopes such as Keck II and Gemini North in Hawaii.

'This type of adaptive optics has a big advantage for looking at large objects, such as planets, star clusters or nebulae,' said lead researcher Franck Marchis, a research astronomer at UC Berkeley and the SETI Institute in Mountain View, Calif. 'While regular adaptive optics systems provide excellent correction in a small field of view, MCAO provides good correction over a larger area of sky.' A full MCAO instrument is due to be installed on Gemini North by 2010 and is proposed for the VLT.

With conventional adaptive optics systems like those on Gemini and Keck, the sharpening degrades with distance between the target and a reference object, be it a star or a moon of a planet. Multi-conjugate adaptive optics maintains a steady sharpness over the whole sky area between the reference objects.

This allowed the researchers to observe Jupiter for almost two hours on Aug. 16 and 17, 2008, a record duration, according to the observing team. Conventional adaptive optics systems using a single reference moon cannot monitor Jupiter that long because the moon moves too far from the planet. The Hubble Space Telescope cannot observe Jupiter continuously for more than about 50 minutes, because its view is regularly blocked by the Earth during Hubble's 96-minute orbit.

Using the MCAO Demonstrator (MAD) mounted on the Melipal 8.2-meter telescope of the VLT, ESO astronomer Paola Amico, MAD project manager Enrico Marchetti and MAD integration engineer Sebastien Tordo tracked two moons, Europa and Io, one on each side of Jupiter, to provide a good correction across the full disk of the planet. 'It has been the most challenging observation we performed with MAD, because we had to track with high accuracy the two moons having different absolute velocities while simultaneously chasing Jupiter, which was moving with respect to them,' said Marchetti.

The VLT image had a resolution of 90 milliarcseconds, 'twice the resolution of the Hubble Space Telescope,' said Marchis. The improved resolution is due not only to MCAO but also to the much larger aperture of the 8.2-meter VLT. 'Ground-based telescopes are getting comparable to or better than Hubble in the near-infrared, even for extended objects like Jupiter,' he said.

'We wanted to check if there is a link between South Equatorial Belt outbreaks and the Great Red Spot's turbulent wake,' explained team member Mike Wong, a UC Berkeley research astronomer. The belt is some 18,000 kilometers wide north-to-south, and it encircles Jupiter just above the track of the Great Red Spot.

Instead, the team found a major alteration in the brightness of the equatorial haze, which lies in a 16,000-kilometer-wide belt over Jupiter's equator. They measured increased sunlight reflecting off upper atmospheric haze, which means either that the amount of haze has increased or that it has moved up to higher altitudes. The brightest portion had shifted south by more than 6,000 kilometers, Wong said.

This conclusion came from a comparison with images taken in 2005 by Wong and UC Berkeley astronomer Imke de Pater using the Hubble Space Telescope's Near Infrared Camera and Multi-Object Spectrometer (NICMOS) instrument. The Hubble images, taken at infrared wavelengths very close to those used for the VLT study, show more haze in the northern half of the bright Equatorial Zone, while the 2008 VLT images show a clear shift to the south.

The haze, which could be the nitrogen compound hydrazine - used on Earth as a rocket propellant - or possibly frozen crystals of ammonia, water or ammonium hydrosulfide from deeper in the gaseous planet, is very prominent in infrared images. Because visible light can penetrate to deeper levels than infrared light, optical telescopes see light reflected from deeper, thicker clouds lying beneath the haze.

The haze behaves somewhat like particles in the tops of thunderheads (known as cumulonimbus anvils) on Earth or in the ash plumes from large volcanic eruptions, which rise into the upper atmosphere and spread around the world, Wong said. On Jupiter, ammonia injected into the upper atmosphere interacts with sunlight to form hydrazine, which condenses into a mist of fine ice particles. The hydrazine chemistry in Jupiter's atmosphere is similar to that occurring in the Earth's atmosphere after a volcanic eruption, when sulfur dioxide is converted by solar ultraviolet light into sulfuric acid.

'The change we see in the haze could be related to big changes in cloud patterns associated with last year's upheaval, but we need to look at more data to narrow down precisely when the changes occurred,' Wong said.
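To put those resolution numbers in rough perspective, here is a small Python sketch using two textbook relations: the diffraction limit of a telescope (about 1.22 times the wavelength divided by the mirror diameter) and the small-angle rule that physical size is roughly distance times angle in radians. The 2.2-micron near-infrared wavelength, the 2.4-meter Hubble mirror, and an Earth-Jupiter distance of roughly 4.5 AU are assumed round numbers for illustration, not values taken from the article.

import math

AU_KM = 1.496e8                       # kilometers in one astronomical unit
ARCSEC_TO_RAD = math.pi / (180 * 3600)

def diffraction_limit_mas(wavelength_m, aperture_m):
    # theta ~ 1.22 * wavelength / aperture, converted from radians to milliarcseconds
    return 1.22 * wavelength_m / aperture_m / ARCSEC_TO_RAD * 1000

def resolved_size_km(resolution_mas, distance_au):
    # small-angle approximation: size ~ distance * angle (angle in radians)
    return distance_au * AU_KM * (resolution_mas / 1000) * ARCSEC_TO_RAD

print(diffraction_limit_mas(2.2e-6, 8.2))   # ~67 mas for an 8.2-m mirror in the near-infrared
print(diffraction_limit_mas(2.2e-6, 2.4))   # ~230 mas for a 2.4-m mirror at the same wavelength
print(resolved_size_km(90, 4.5))            # ~290 km on Jupiter for the 90-mas resolution quoted above

Under those assumptions, the rough numbers line up with the article: an 8-meter-class ground telescope with good adaptive optics correction can out-resolve Hubble in the near-infrared, and 90 milliarcseconds at Jupiter's distance corresponds to features roughly 300 kilometers across.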