Published: July 29, 2008

It is well known that panes of stained glass in old European churches are thicker at the bottom because glass is a slow-moving liquid that flows downward over centuries.

Well known, but wrong. Medieval stained glass makers were simply unable to make perfectly flat panes, and the windows were just as unevenly thick when new.

The tale contains a grain of truth about glass resembling a liquid, however. The arrangement of atoms and molecules in glass is indistinguishable from that of a liquid. But how can a liquid be as strikingly hard as glass?

“They’re the thickest and gooiest of liquids and the most disordered and structureless of rigid solids,” said Peter Harrowell, a professor of chemistry at the University of Sydney in Australia, speaking of glasses, which can be formed from different raw materials. “They sit right at this really profound sort of puzzle.”

Philip W. Anderson, a Nobel Prize-winning physicist at Princeton, wrote in 1995: “The deepest and most interesting unsolved problem in solid state theory is probably the theory of the nature of glass and the glass transition.”

He added, “This could be the next breakthrough in the coming decade.”

Thirteen years later, scientists still disagree, with some vehemence, about the nature of glass.

Peter G. Wolynes, a professor of chemistry at the University of California, San Diego, thinks he essentially solved the glass problem two decades ago based on ideas of what glass would look like if cooled infinitely slowly. “I think we have a very good constructive theory of that these days,” Dr. Wolynes said. “Many people tell me this is very contentious. I disagree violently with them.”

Others, like Juan P. Garrahan, professor of physics at the University of Nottingham in England, and David Chandler, professor of chemistry at the University of California, Berkeley, have taken a different approach and are as certain that they are on the right track.

“It surprises most people that we still don’t understand this,” said David R. Reichman, a professor of chemistry at Columbia, who takes yet another approach to the glass problem. “We don’t understand why glass should be a solid and how it forms.”

Dr. Reichman said of Dr. Wolynes’s theory, “I think a lot of the elements in it are correct,” but he said it was not a complete picture. Theorists are drawn to the problem, Dr. Reichman said, “because we think it’s not solved yet — except for Peter maybe.”

Scientists are slowly accumulating more clues. A few years ago, experiments and computer simulations revealed something unexpected: as molten glass cools, the molecules do not slow down uniformly. Some areas jam rigid first while in other regions the molecules continue to skitter around in a liquid-like fashion. More strangely, the fast-moving regions look no different from the slow-moving ones.

Meanwhile, computer simulations have become sophisticated and large enough to provide additional insights, and yet more theories have been proffered to explain glasses.

David A. Weitz, a physics professor at Harvard, joked, “There are more theories of the glass transition than there are theorists who propose them.” Dr. Weitz performs experiments using tiny particles suspended in liquids to mimic the behavior of glass, and he ducks out of the theoretical battles. “It just can get so controversial and so many loud arguments, and I don’t want to get involved with that myself.”

For scientists, glass is not just the glass of windows and jars, made of silica, sodium carbonate and calcium oxide. Rather, a glass is any solid in which the molecules are jumbled randomly. Many plastics like polycarbonate are glasses, as are many ceramics.

Understanding glass would not just solve a longstanding fundamental (and arguably Nobel-worthy) problem and perhaps lead to better glasses. That knowledge might benefit drug makers, for instance. Certain drugs, if they could be made in a stable glass structure instead of a crystalline form, would dissolve more quickly, allowing them to be taken orally instead of being injected. The tools and techniques applied to glass might also yield headway on other problems, in materials science, biology and other fields, that involve general properties arising from many disordered interactions.

“A glass is an example, probably the simplest example, of the truly complex,” Dr. Harrowell, the University of Sydney professor, said. In liquids, molecules jiggle around along random, jumbled paths. When cooled, a liquid either freezes, as water does into ice, or it does not freeze and forms a glass instead.

In freezing to a conventional solid, a liquid undergoes a so-called phase transition; the molecules line up next to and on top of one another in a simple, neat crystal pattern. When a liquid solidifies into a glass, this organized stacking is nowhere to be found. Instead, the molecules just move slower and slower and slower, until they are effectively not moving at all, trapped in a strange state between liquid and solid.

The glass transition differs from a usual phase transition in several other key ways. Energy, what is called latent heat, is released when water molecules line up into ice. There is no latent heat in the formation of glass.

The glass transition does not occur at a single, well-defined temperature; the slower the cooling, the lower the transition temperature. Even the definition of glass is arbitrary — basically a rate of flow so slow that it is too boring and time-consuming to watch. The final structure of the glass also depends on how slowly it has been cooled.

By contrast, water, cooled quickly or cooled slowly, consistently crystallizes to the same ice structure at 32 degrees Fahrenheit.

To develop his theory, Dr. Wolynes zeroed in on an observation made decades ago, that the viscosity of a glass was related to the amount of entropy, a measure of disorder, in the glass. Further, if a glass could be formed by cooling at an infinitely slow rate, the entropy would vanish at a temperature well above absolute zero, violating the third law of thermodynamics, which states that entropy vanishes at absolute zero.
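The entropy-viscosity connection the article alludes to is commonly formalized by the Adam-Gibbs relation, shown here as background (the article itself does not name or state the formula):

```latex
% Adam-Gibbs relation: the structural relaxation time \tau (and hence the
% viscosity) grows as the configurational entropy S_c shrinks. If S_c were
% to vanish at some finite temperature T_K, the relaxation time would
% diverge there -- the "ideal glass" limit discussed in the text.
\tau = \tau_0 \exp\!\left(\frac{C}{T\, S_c(T)}\right)
```

Here \tau_0 and C are material-dependent constants; the point is only that less entropy at a given temperature means dramatically slower flow.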

Dr. Wolynes and his collaborators came up with a mathematical model to describe this hypothetical, impossible glass, calling it an “ideal glass.” Based on this ideal glass, they said the properties of real glasses could be deduced, although exact calculations were too hard to perform. That was in the 1980s. “I thought in 1990 the problem was solved,” Dr. Wolynes said, and he moved on to other work.

Not everyone found the theory satisfying. Dr. Wolynes and his collaborators so insisted they were right that “you had the impression they were trying to sell you an old car,” said Jean-Philippe Bouchaud of the Atomic Energy Commission in France. “I think Peter is not the best advocate of his own ideas. He tends to oversell his own theory.”

Around that time, the first hints of the dichotomy of fast-moving and slow-moving regions in a solidifying glass were seen in experiments, and computer simulations predicted that this pattern, called dynamical heterogeneity, should exist.

Dr. Weitz of Harvard had been working for a couple of decades with colloids, or suspensions of plastic spheres in liquids, and he thought he could use them to study the glass transition. As the liquid is squeezed out, the colloid particles undergo the same change as a cooling glass. With the colloids, Dr. Weitz could photograph the movements of each particle in a colloidal glass and show that some chunks of particles moved quickly while most hardly moved.

“You can see them,” Dr. Weitz said. “You can see them so clearly.”

The new findings did not faze Dr. Wolynes. Around 2000, he returned to the glass problem, convinced that with techniques he had used in solving protein folding problems, he could fill in some of the computational gaps in his glass theory. Among the calculations, he found that dynamical heterogeneity was a natural consequence of the theory.

Dr. Bouchaud and a colleague, Giulio Biroli, revisited Dr. Wolynes’s theory, translating it into terms they could more easily understand and coming up with predictions that could be compared with experiments. “For a long time, I didn’t really believe in the whole story, but with time I became more and more convinced there is something very deep in the theory,” Dr. Bouchaud said. “I think these people had fantastic intuition about how the whole problem should be attacked.”

For Dr. Garrahan, the University of Nottingham scientist, and Dr. Chandler, the Berkeley scientist, the contrast between fast- and slow-moving regions was so striking compared with the other changes near the transition that they focused on these dynamics. They said that the fundamental process in the glass transition was a phase transition in the trajectories, from flowing to jammed, rather than a change in structure seen in most phase transitions. “You don’t see anything interesting in the structure of these glass formers, unless you look at space and time,” Dr. Garrahan said.

They ignore the more subtle effects related to the impossible-to-reach ideal glass state. “If I can never get there, these are metaphysical temperatures,” Dr. Chandler said.

Dr. Chandler and Dr. Garrahan have devised and solved mathematical models, but, like Dr. Wolynes, they have not yet convinced everyone of how the model is related to real glasses. The theory does not try to explain the presumed connection between entropy and viscosity, and some scientists said they found it hard to believe that the connection was just coincidence and unrelated to the glass transition.

Dr. Harrowell said that in the proposed theories so far, the theorists have had to guess about elementary atomic properties of glass not yet observed, and he wondered whether one theory could cover all glasses, since glasses are defined not by a common characteristic they possess, but rather a common characteristic they lack: order. And there could be many reasons that order is thwarted. “If I showed you a room without an elephant in the room, the question ‘why is there not an elephant in the room?’ is not a well-posed question,” Dr. Harrowell said.

New experiments and computer simulations may offer better explanations about glass. Simulations by Dr. Harrowell and his co-workers have been able to predict, based on the pattern of vibration frequencies, which areas were likely to be jammed and which were likely to continue moving. The softer places, which vibrate at lower frequencies, moved more freely.

Mark D. Ediger, a professor of chemistry at the University of Wisconsin, Madison, has found a way to make thin films of glass with the more stable structure of a glass that has been “aged” for at least 10,000 years. He hopes the films will help test Dr. Wolynes’s theory and point to what really happens as glass approaches its ideal state, since no one expects the third law of thermodynamics to fall away.

Dr. Weitz of Harvard continues to squeeze colloids, except now the particles are made of compressible gels, enabling the colloidal glasses to exhibit a wider range of glassy behavior.

“When we can say what structure is present in glasses, that will be a real bit of progress,” Dr. Harrowell said. “And hopefully something that will have broader implications than just the glass field.”


Published: July 29, 2008

When science is testing new ideas, the result is often a two-papers-forward-one-paper-back intellectual tussle among competing research teams.

When the work touches on issues that worry the public, affect the economy or polarize politics, the news media and advocates of all stripes dive in. Under nonstop scrutiny, conflicting findings can make news coverage veer from one extreme to another, resulting in a kind of journalistic whiplash for the public.

This has been true for decades in health coverage. But lately the phenomenon has been glaringly apparent on the global warming beat.

Discordant findings have come in quick succession. How fast is Greenland shedding ice? Did human-caused warming wipe out frogs in the American tropics? Has warming strengthened hurricanes? Have the oceans stopped warming? These questions endure even as the basic theory of a rising human influence on climate has steadily solidified: accumulating greenhouse gases will warm the world, erode ice sheets, raise seas and have big impacts on biology and human affairs.

Scientists see persistent disputes as the normal stuttering journey toward improved understanding of how the world works. But many fear that the herky-jerky trajectory is distracting the public from the undisputed basics and blocking change. “One of the things that troubles me most is that the rapid-fire publication of unsettled results in highly visible venues creates the impression that the scientific community has no idea what’s going on,” said W. Tad Pfeffer, an expert on Greenland’s ice sheets at the University of Colorado.

“Each new paper negates or repudiates something emphatically asserted in a previous paper,” Dr. Pfeffer said. “The public is obviously picking up on this not as an evolution of objective scientific understanding but as a proliferation of contradictory opinions.”

Several experts on the media and risk said that one result could be public disengagement with the climate issue just as experts are saying ever more forcefully that sustained attention and action are needed to limit the worst risks. Recent polls in the United States and Britain show that the public remains substantially divided and confused over what is happening and what to do. Some environmentalists have blamed energy-dependent industries and the news media for stalemates on climate policy, arguing that they perpetuate a false sense of uncertainty about the basic problem.

But scientists themselves sometimes fail to carefully discriminate between what is well understood and what remains uncertain, said Kimberly Thompson, an associate professor of risk analysis and decision science at Harvard.

And, Dr. Thompson said, the flow of scientific findings from laboratory (or glacier) to journal to news report is fraught with “reinforcing loops” that can amplify small distortions.

For example, she said, after scientists learn that accurate, but nuanced, statements are often left out of news accounts, they may pre-emptively oversimplify their description of some complex finding. Better, but more difficult, Dr. Thompson said, would be to work with the reporter to characterize the weight of evidence behind the new advance and seek to place it in context.

To support clarity, Stephen H. Schneider, a climatologist at Stanford, helped create a glossary defining what is meant by phrases like “very likely” (greater than 90 percent confidence) in the reports from the Intergovernmental Panel on Climate Change. In a news media universe where specialized reporting is declining and a Web mash-up of instant opinion and information is emerging, Dr. Schneider said, it is ever more important for scientists to take responsibility for communicating in ways that stick, while sticking with the facts.
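The article only specifies one entry of that glossary (“very likely” meaning greater than 90 percent confidence); the sketch below fills in a few neighboring bands from the IPCC’s published uncertainty guidance, purely as an illustration of how such a lookup works:

```python
# Toy lookup for IPCC-style likelihood language. Only the "very likely"
# (>90 percent) band is stated in the article; the other thresholds are
# taken from the IPCC AR4 uncertainty guidance and included for context.
LIKELIHOOD_BANDS = [
    ("virtually certain", 0.99),
    ("extremely likely", 0.95),
    ("very likely", 0.90),
    ("likely", 0.66),
    ("more likely than not", 0.50),
]

def likelihood_phrase(p):
    """Return the strongest phrase whose probability threshold p exceeds."""
    for phrase, threshold in LIKELIHOOD_BANDS:
        if p > threshold:
            return phrase
    return "about as likely as not"

print(likelihood_phrase(0.92))  # prints "very likely"
```

The fixed vocabulary works in both directions: reporters can translate a phrase back into a numeric confidence range instead of dropping the hedge altogether.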

Dr. Thompson said climate science presented particularly tough challenges, given the long time lag before the worst effects kick in and the persistent uncertainty about the likelihood of worst-case outcomes. She said the news media sometimes overplayed the uncertainty by balancing opposing views in a story without characterizing the overall level of confidence in either side. And sometimes they do the opposite, sacrificing accuracy for impact, she said.

“Words that we as scientists use to express uncertainty routinely get dropped out to make stories have more punch and be stronger,” she said, adding that those words are important to include because “they convey meaning to readers not only in the story at hand, but more generally about science being less precise than is typically conveyed.”

Public-relations offices at leading scientific journals and hubs for research also could do more to avoid overplaying incremental research results, she and several other experts said.

Donald Kennedy, a Stanford professor emeritus who was the editor in chief of the journal Science from 2000 until earlier this year, said the flow of papers on climate, glaciology and relevant ocean sciences greatly increased in his tenure. “I do think we grew more sensitive to the need for critical review of papers likely to initiate or continue the kind of controversy that results in a whiplash effect,” Dr. Kennedy said.

Roger A. Pielke Jr., a political scientist at the University of Colorado, warned that the public and media focus on the stream of evolving climate science could distract from policies that make sense now regardless of the uncertainties. “The example of reducing losses to hurricanes is a good one,” Dr. Pielke said, “where the actions that make the most sense are really independent of the debate over greenhouse gases and hurricane behavior.”

“The same might be said for many health studies on fat, coffee, carbs,” he added. “The lesson from experts is to eat a balanced diet and get plenty of exercise,” which stays the same despite the various disputes.

He said his advice for scientists who wanted to “dampen the whiplash effect” was to “discuss the ‘So what?’ implications of the work explicitly, rather than leaving that step to advocates or politicians, or reporters.”

Increasingly, scientists are taking their message straight to the public through Web sites where issues are explored in an ongoing way, rather than in response to news releases and scientific papers. Other new Web ventures, including one at Princeton and the Yale Forum on Climate Change and the Media, focus on improving media coverage.

Robert J. Brulle, a sociologist at Drexel University, said it was hard to be optimistic about such efforts. “In this public sphere,” he said, “it is assumed that the better argument, backed up with solid scientific evidence, will prevail.” He said many studies had shown that people tended to sift sources of information to reinforce existing views.

Morris Ward, the editor of the Yale effort, said it will be up to the public to choose to be better informed on momentous issues that do not fit the normal template for news or clash with their ingrained worldviews. “At some point,” he said, “the public at large has to step up to the plate in terms of scientific and policy literacy, in terms of commitment to education and strong and effective political leadership, and in terms of their own general self-improvement.”

By Juan Carlos Perez, IDG News Service
Published: July 25, 2008

In a discovery that would probably send the Dr. Evil character of the “Austin Powers” movies into cardiac arrest, Google recently detected more than a trillion unique URLs on the Web.

This milestone awed Google search engineers, who are seeing the Web growing by several billion individual pages every day, company officials wrote in a blog post Friday.

In addition to announcing this finding, Google took the opportunity to promote the scope and magnitude of its index.

“We don’t index every one of those trillion pages — many of them are similar to each other, or represent auto-generated content … that isn’t very useful to searchers. But we’re proud to have the most comprehensive index of any search engine, and our goal always has been to index all the world’s data,” wrote Jesse Alpert and Nissan Hajaj, software engineers in Google’s Web Search Infrastructure Team.

It had been a while since Google had made public pronouncements about the size of its index, a topic that routinely generated controversy and counterclaims among the major search engine players years ago.

Those days of index-size envy ended when it became clear that most people rarely scan more than two pages of Web results. In other words, what matters is delivering 10 or 20 really relevant Web links, or, even better, a direct factual answer, because few people will wade through 5,000 results to find the desired information.

It will be interesting to see if this announcement from Google, posted on its main official blog, will trigger a round of reactions from rivals like Yahoo and Microsoft.

In the meantime, Google also disclosed interesting information about how and with what frequency it analyzes these links.

“Today, Google downloads the web continuously, collecting updated page information and re-processing the entire web-link graph several times per day. This graph of one trillion URLs is similar to a map made up of one trillion intersections. So multiple times every day, we do the computational equivalent of fully exploring every intersection of every road in the United States. Except it’d be a map about 50,000 times as big as the U.S., with 50,000 times as many roads and intersections,” the officials wrote.
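Google’s actual pipeline is proprietary and runs at trillion-URL scale, but the kind of computation described — repeatedly propagating information across a web-link graph — can be sketched in miniature. The URLs and the PageRank-style scoring below are illustrative only:

```python
# Toy illustration of "re-processing the web-link graph": repeated
# PageRank-style score propagation over a handful of hypothetical pages.
links = {  # page -> pages it links to (invented URLs)
    "a.example": ["b.example", "c.example"],
    "b.example": ["c.example"],
    "c.example": ["a.example"],
}

def pagerank(links, iterations=20, damping=0.85):
    """Iteratively spread each page's score across its outgoing links."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank

ranks = pagerank(links)
# Since every page here has outgoing links, the scores always sum to 1;
# pages with more incoming links end up with larger scores.
```

Doing anything like this over a trillion nodes, several times a day, is what makes the road-map comparison in the blog post apt.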

Published: July 22, 2008

For some 40 years, the nation has used the same formula for calculating poverty, using the cost of food as a gauge and applying a single poverty threshold across the nation — from Boise to the Bronx. After a year of work, Mayor Michael Bloomberg of New York has presented a new formula for measuring poverty that creates a far more realistic view of life in the city. It should stand as an example to other cities — and, ultimately, the federal government.

Recasting the federal formula that has been in place since the 1960s, the Bloomberg administration found another 400,000 poor people in New York City and came up with a poverty rate of 23 percent, compared with about 19 percent under the old measure. That result was tempered by a smaller count of the extremely poor (those whose incomes are less than half of the poverty threshold), which may be explained by the reach of programs like Temporary Assistance for Needy Families.

But the city found that 32 percent of the elderly are poor — far higher than the 18 percent under the former measure. Out-of-pocket medical costs were a consistent big drain on the wallets of residents 65 or older.

No formula would be perfect, but Mr. Bloomberg’s employs common sense. It is absurd, for example, that the poverty threshold in New York City, one of the nation’s most expensive cities, has been the same as in the least expensive: $20,444 for a family of four. The mayor raised New York’s poverty ceiling to a more believable $26,138.

Washington’s obsolete measure looks only at the cost of food relative to income. Mass production and other efficiencies have generally made food less costly, while housing, transportation and energy costs have risen. The mayor’s formula, taken from recommendations by the National Academy of Sciences, factors in those big-ticket items, as well as government assistance.
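The contrast between the two measures can be sketched as follows. The only figures taken from the editorial are the two thresholds ($20,444 and $26,138); the family’s numbers, the function names, and the simplified resource arithmetic are invented for illustration — the real federal and NAS formulas are considerably more detailed:

```python
# Hedged sketch contrasting the two poverty measures described above.
def official_poor(cash_income, threshold=20444):
    """1960s-style measure: compare cash income to a single national
    threshold originally derived from the cost of food."""
    return cash_income < threshold

def nas_style_poor(cash_income, benefits, medical_out_of_pocket,
                   threshold=26138):
    """NAS-style measure: count government assistance as a resource,
    subtract big-ticket costs such as out-of-pocket medical spending,
    and compare to a locally adjusted threshold."""
    resources = cash_income + benefits - medical_out_of_pocket
    return resources < threshold

# The same (hypothetical) family can fall on different sides of the line:
family = dict(cash_income=22000, benefits=1500, medical_out_of_pocket=6000)
official_poor(family["cash_income"])  # False: 22,000 >= 20,444
nas_style_poor(**family)              # True: 17,500 < 26,138
```

The elderly result above follows the same logic: large medical bills shrink resources under the new measure but are invisible to the old one.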

Mr. Bloomberg’s approach is rightly getting attention from other cities and Capitol Hill. The campaign of Senator Barack Obama has endorsed it. The next step, which should be taken quickly, is for the mayor to use what he has learned — and what we already knew — to get more help to New York’s many poor.

