
Tuesday, January 31, 2012

Weaker sun will not delay global warming: study

 A weaker sun over the next 90 years is not likely to significantly delay a rise in global temperature caused by greenhouse gases, a report said Monday.



The study, by Britain's Meteorological Office and the University of Reading, found that the Sun's output would decrease up until 2100, but that this would lead to a fall in global temperatures of only 0.08 degrees Celsius.
Scientists have warned that more extreme weather is likely across the globe this century as the Earth's climate warms.
The world is expected to heat up by over 2 degrees Celsius this century due to increased greenhouse gas emissions.
Current global pledges to cut carbon dioxide and other greenhouse gas emissions are not seen as sufficient to stop the planet heating up beyond 2 degrees, a threshold scientists say risks an unstable climate in which weather extremes are common.
"This research shows that the most likely change in the sun's output will not have a big impact on global temperatures or do much to slow the warming we expect from greenhouse gases," said Gareth Jones, climate change detection scientist at the Met Office.
"It's important to note this study is based on a single climate model, rather than multiple models which would capture more of the uncertainties in the climate system," he added.
During the 20th century, solar activity increased to a maximum level and recent studies have suggested this level of activity has reached, or is nearing, an end.
The scientists used this maximum level as a starting point to project possible changes in the sun's activity over this century.
The study also showed that if the sun's output went below a threshold reached between 1645 and 1715 - called the Maunder Minimum when solar activity was at its lowest observed level - global temperature would fall by 0.13 degrees Celsius.
"The most likely scenario is that we'll see an overall reduction of the sun's activity compared to the 20th Century, such that solar outputs drop to the values of the Dalton Minimum (around 1820)," said Mike Lockwood, solar studies expert at the university of Reading.
"The probability of activity dropping as low as the Maunder Minimum - or indeed returning to the high activity of the 20th Century - is about 8 percent."

Kepler telescope team finds 11 new solar systems

NASA's planet-hunting Kepler space telescope has found 11 new planetary systems, including one with five planets all orbiting closer to their parent star than Mercury circles the Sun, scientists said on Thursday.

The discoveries boost the list of confirmed extra-solar planets to 729, including 60 credited to the Kepler team. The telescope, launched in March 2009, can detect slight but regular dips in the amount of light coming from stars. Scientists can then determine whether the changes are caused by orbiting planets passing in front of their stars, as seen from Kepler's point of view.
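As a rough illustration of that transit method (a toy sketch with invented numbers, not the Kepler team's actual pipeline; every function and variable name here is made up), the Python snippet below phase-folds a synthetic light curve at a range of trial periods and looks for the period at which the star appears dimmest for a small fraction of each cycle:

```python
import numpy as np

# Toy light curve: one brightness sample every 30 minutes for 90 days,
# with a 0.1% dip lasting ~3 hours every 5 days (all numbers invented).
rng = np.random.default_rng(0)
t = np.arange(0.0, 90.0, 0.5 / 24.0)           # time in days
flux = 1.0 + rng.normal(0.0, 2e-4, t.size)     # photometric noise
true_period, duration, depth = 5.0, 3.0 / 24.0, 1e-3
flux[(t % true_period) < duration] -= depth    # inject the transits

def transit_depth(time, flux, trial_period, trial_duration):
    """Phase-fold at a trial period and compare in-transit with out-of-transit flux."""
    in_tr = (time % trial_period) < trial_duration
    return flux[~in_tr].mean() - flux[in_tr].mean()

# Scan trial periods; the deepest apparent dip marks the best candidate period.
trial_periods = np.arange(1.0, 10.0, 0.01)
depths = [transit_depth(t, flux, p, duration) for p in trial_periods]
best = trial_periods[int(np.argmax(depths))]
print(f"best period: {best:.2f} days, recovered depth: {max(depths) * 1e6:.0f} ppm")
```

With the numbers above, the scan recovers the injected five-day period and a dip of roughly 1,000 parts per million.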
Kepler scientists have another 2,300 candidate planets awaiting additional confirmation.
None of the newly discovered planetary systems are like our solar system, though Kepler-33, a star that is older and bigger than the Sun, comes close in terms of sheer numbers. It has five planets, compared to our solar system's eight, but the quintet all fly closer to their parent star than Mercury orbits the Sun.
The planets range in size from about 1.5 times the diameter of Earth to five times Earth's diameter. Scientists have not yet determined if any are solid rocky bodies like Earth, Venus, Mars and Mercury or if they are filled with gas like Jupiter, Saturn, Uranus and Neptune.
The Kepler team previously found one star with six confirmed planets and a second system with five planets, said planetary scientist Jack Lissauer, with NASA's Ames Research Center in Moffett Field, California.
Nine of the new systems contain two planets and one has three, bringing the total number of newly discovered planets to 26. All are closer to their host stars than Venus is to the Sun.
"This has tripled the number of stars which we know have more than one transiting planet, so that's the big deal here," Lissauer told Reuters.
"We're starting to think in terms of planetary systems as opposed to just planets: Do they all tend to have similar sizes? What's the spacing? Is the solar system unusual in those regards?" he said.
Kepler is monitoring more than 150,000 stars in the constellations Cygnus and Lyra.
The research is published in four different papers in Astrophysical Journal and the Monthly Notices of the Royal Astronomical Society.

Snowy owls soar south from Arctic in rare mass migration

Bird enthusiasts are reporting rising numbers of snowy owls from the Arctic winging into the lower 48 states this winter in a mass southern migration that a leading owl researcher called "unbelievable."


Thousands of the snow-white birds, which stand 2 feet tall with 5-foot wingspans, have been spotted from coast to coast, feeding in farmlands in Idaho, roosting on rooftops in Montana, gliding over golf courses in Missouri and soaring over shorelines in Massachusetts.
A certain number of the iconic owls fly south from their Arctic breeding grounds each winter but rarely do so many venture so far away even amid large-scale, periodic southern migrations known as irruptions.
"What we're seeing now -- it's unbelievable," said Denver Holt, head of the Owl Research Institute in Montana.
"This is the most significant wildlife event in decades," added Holt, who has studied snowy owls in their Arctic tundra ecosystem for two decades.
Holt and other owl experts say the phenomenon is likely linked to lemmings, a rodent that accounts for 90 percent of the diet of snowy owls during breeding months that stretch from May into September. The largely nocturnal birds also prey on a host of other animals, from voles to geese.
An especially plentiful supply of lemmings last season likely led to a population boom among owls that resulted in each breeding pair hatching as many as seven offspring. That compares to a typical clutch size of no more than two, Holt said.
Greater competition this year for food in the Far North by the booming bird population may have then driven mostly younger, male owls much farther south than normal.
Research on the animals is scarce because of the remoteness and extreme conditions of the terrain the owls occupy, including northern Russia and Scandinavia, he said.
The surge in snowy owl sightings has brought birders flocking from Texas, Arizona and Utah to the Northern Rockies and Pacific Northwest, pouring tourist dollars into local economies and crowding parks and wildlife areas. The irruption has triggered widespread public fascination that appears to span ages and interests.
"For the last couple months, every other visitor asks if we've seen a snowy owl today," said Frances Tanaka, a volunteer for the Nisqually National Wildlife Refuge northeast of Olympia, Washington.
But accounts of emaciated owls at some sites -- including a food-starved bird that dropped dead in a farmer's field in Wisconsin -- suggest the migration has a darker side. And Holt said an owl that landed at an airport in Hawaii in November was shot and killed to avoid collisions with planes.
He said snowy owl populations are believed to be in an overall decline, possibly because a changing climate has lessened the abundance of vegetation like grasses that lemmings rely on.
This winter's snowy owl outbreak, with multiple sightings as far south as Oklahoma, remains largely a mystery of nature.
A snowy white owl takes flight in this undated handout photo courtesy of the U.S. Fish & Wildlife Service. REUTERS/U.S. Fish & Wildlife Service/Handout

Monday, January 30, 2012

Cold Plasma Above Earth

 Cold plasma has been well-hidden. Space physicists have long lacked clues to how much of this electrically charged gas exists tens of thousands of miles above Earth and how the stuff may impact our planet’s interaction with the Sun. Now, a new method developed by Swedish researchers makes cold plasma measurable and reveals significantly more cold, charged ions in Earth’s upper altitudes than previously imagined. 

At these lofty elevations, storms of high-energy charged particles -- space weather -- roil the atmosphere, creating auroras, buffeting satellites, and sometimes wreaking havoc with electronic devices and electric grids on Earth. The new evidence of abundant cold (i.e. low-energy) ions may change our understanding of this tumultuous space weather and lead to more accurate forecasting of it, scientists say. The finding might also shed light on what’s happening around other planets and moons -- for instance, helping explain why the once robust atmosphere of Mars is so wispy today. 

“The more you look for low-energy ions, the more you find,” said Mats Andre, a professor of space physics at the Swedish Institute of Space Physics in Uppsala, Sweden, and leader of the research team. “We didn’t know how much was out there. It’s more than even I thought.” 

The low-energy ions are created in the ionosphere, a region of the upper atmosphere where solar energy can sweep electrons away from molecules, leaving atoms of elements like hydrogen and oxygen with positive charges. Actually detecting these ions at high altitudes has been extremely difficult. 

Now that has changed, making it apparent that low-energy ions abound in the distant reaches where Earth’s atmosphere gives way to outer space. Researchers knew the ions were present at altitudes of about 100 kilometers (60 miles), but Andre and his colleague Chris Cully looked much higher, between 20,000 and 100,000 km (12,400 to 60,000 mi). While the concentration of the previously hidden cold ions varies, about 50 to 70 percent of the time the particles make up most of the mass of great swaths of space, according to the researchers’ satellite measurements and calculations. And, in some high-altitude zones, low-energy ions dominate nearly all of the time. Even at altitudes around 100,000 km -- about a third of the distance to the Moon -- the team detected these previously elusive low-energy ions. 

A scientist examines one of the European Space Agency's four Cluster satellites, used in a recent Geophysical Research Letters study to measure low-energy ions. Credit: European Space Agency
Finding so many relatively cool ions in those regions is surprising, Andre said, because there’s so much energy blasting into Earth’s high altitudes from the solar wind -- a rushing flow of hot plasma streaming from the Sun, which stirs up space-weather storms. 

This hot plasma is about 1,000 times hotter than what Andre considers cold plasma -- but even cold is a relative term. The low-energy ions have an energy that would correspond to about 500,000 degrees Celsius (about one million degrees Fahrenheit) at typical gas densities found on Earth. But because the density of the ions in space is so low, satellites and spacecraft can orbit without bursting into flames. 
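To put those temperatures in more familiar units, here is a back-of-the-envelope conversion (mine, not a figure from the study): using the Boltzmann constant, a plasma temperature of about 500,000 kelvin corresponds to ion energies of only a few tens of electronvolts, which is why space physicists still file these ions under "cold".

```python
# Back-of-the-envelope conversion between plasma temperature and particle energy,
# using the shorthand E ~ k_B * T (the 3/2 factor for mean kinetic energy is omitted;
# either convention lands in the "tens of eV" range).
K_B_EV_PER_K = 8.617e-5  # Boltzmann constant in electronvolts per kelvin

def temperature_to_ev(temperature_k: float) -> float:
    return K_B_EV_PER_K * temperature_k

print(f"'cold' plasma at 500,000 K -> ~{temperature_to_ev(5.0e5):.0f} eV per ion")
print(f"plasma 1,000 times hotter  -> ~{temperature_to_ev(5.0e8) / 1000:.0f} keV per ion")
```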

The researchers’ new findings have been accepted for publication in Geophysical Research Letters, a journal of the American Geophysical Union.

For decades, space physicists have struggled to accurately detect low-energy ions and determine how much of the material is leaving our atmosphere. The satellite Andre works on, one of four European Space Agency CLUSTER spacecraft, is equipped with a detector with thin wire arms that measures the electric field between them as the satellite rotates. But, when the scientists gathered data from their detectors, two mysterious trends appeared. Strong electric fields turned up in unexpected regions of space. And as the spacecraft rotated, measurements of the electric field didn’t fluctuate in the smoothly changing manner that Andre expected. 

“To a scientist, it looked pretty ugly,” Andre said. “We tried to figure out what was wrong with the instrument. Then we realized there’s nothing wrong with the instrument.” Unexpectedly, they found that cold plasma was altering the structure of electrical fields around the satellite. Once they understood that, they could use their field measurements to reveal the presence of the once-hidden ions. 
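The idea can be illustrated with a toy example (this is not the Cluster team's actual analysis, and every number below is invented): a double-probe instrument on a spinning spacecraft should see the ambient electric field trace out a smooth sinusoid once per spin, so a poor sinusoidal fit is a hint that something local, such as the wake carved through cold plasma by the charged spacecraft, is distorting the measurement.

```python
import numpy as np

# Toy example: the field seen by a double-probe instrument on a spinning spacecraft
# should be a smooth sinusoid at the spin period; a large residual from a sinusoidal
# fit flags spins where something local is distorting the measurement.
rng = np.random.default_rng(1)
spin_period = 4.0                       # seconds (value assumed here for illustration)
t = np.linspace(0.0, spin_period, 200)
omega = 2.0 * np.pi / spin_period

def spin_fit_residual(signal):
    """RMS residual of a least-squares fit a + b*cos(omega*t) + c*sin(omega*t)."""
    basis = np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])
    coeffs, *_ = np.linalg.lstsq(basis, signal, rcond=None)
    return float(np.sqrt(np.mean((signal - basis @ coeffs) ** 2)))

clean = 2.0 * np.cos(omega * t + 0.3) + rng.normal(0.0, 0.05, t.size)
# Add a localized bump over part of the spin, mimicking a wake-like disturbance.
disturbed = clean + 1.5 * np.exp(-(((t - 1.0) / 0.2) ** 2))

print(f"clean spin residual:     {spin_fit_residual(clean):.2f}")
print(f"disturbed spin residual: {spin_fit_residual(disturbed):.2f}")
```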

It’s a clever way of turning the limitations of a spacecraft-based detector into assets, said Thomas Moore, senior project scientist for NASA’s Magnetospheric Multiscale mission at the Goddard Space Flight Center in Greenbelt, Maryland. He was not involved in the new research. 

As scientists use the new measurement method to map cold plasma around Earth, they could discover more about how hot and cold plasmas interact during space storms and other events, deepening researchers’ understanding of space weather, Andre said. 

The new measurements indicate that about a kilogram (two pounds) of cold plasma escapes from Earth’s atmosphere every second, Andre said. Knowing that rate of loss for Earth may help scientists better reconstruct what became of the atmosphere of Mars, which is thought to once have been denser and more similar to Earth’s. The new cold plasma results might also help researchers explain atmospheric traits of other planets and moons, Andre suggested. 
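Taking the quoted figure of roughly one kilogram per second at face value, a quick calculation (my arithmetic, not the researchers') shows how that loss adds up over time:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.16e7 seconds
ESCAPE_RATE_KG_PER_S = 1.0              # figure quoted in the article
ATMOSPHERE_MASS_KG = 5.1e18             # approximate total mass of Earth's atmosphere

per_year = ESCAPE_RATE_KG_PER_S * SECONDS_PER_YEAR
per_billion_years = per_year * 1e9
print(f"escape per year: {per_year:.2e} kg (~{per_year / 1000:,.0f} tonnes)")
print(f"over a billion years: {per_billion_years:.1e} kg, "
      f"about {per_billion_years / ATMOSPHERE_MASS_KG:.1%} of today's atmosphere")
```

Even sustained for a billion years, a loss of one kilogram per second amounts to well under one percent of the mass of today's atmosphere.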

An artist's rendition of Magnetospheric Multiscale mission as it sweeps through a magnetic reconnection event caused when the solar wind meets Earth's magnetic fields. Credit: SWRI
And closer to home, if scientists could develop more accurate space weather forecasts, they could save satellites from being blinded or destroyed, and better warn space station astronauts and airlines of danger from high-energy radiation. While low-energy ions are not responsible for the damage caused by space weather, they do influence that weather. Andre compared the swaths of ions to, say, a low-pressure area in our familiar, down-to-Earth weather -- as opposed to a harmful storm. It is a key player, even if it doesn’t cause the damage itself. “You may want to know where the low-pressure area is, to predict a storm,” Andre noted. 

Improving space weather forecasts to the point where they’re comparable to ordinary weather forecasting was “not even remotely possible if you’re missing most of your plasma,” Moore, with NASA, said. Now, with a way to measure cold plasma, the goal of high-quality forecasts is one step closer. 

“It is stuff we couldn’t see and couldn’t detect, and then suddenly we could measure it,” Moore said of the low-energy ions. “Now you can actually study it and see if it agrees with the theories.”

Photo from NASA Mars Orbiter Shows Wind's Handiwork



This enhanced-color image shows sand dunes trapped in an impact crater in Noachis Terra, Mars. Image credit: NASA/JPL-Caltech/Univ. of Arizona

January 25, 2012
Some images of stark Martian landscapes provide visual appeal beyond their science value, including a recent scene of wind-sculpted features from the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter.

The scene shows dunes and sand ripples of various shapes and sizes inside an impact crater in the Noachis Terra region of southern Mars. Patterns of dune erosion and deposition provide insight into the sedimentary history of the area.

The Mars Reconnaissance Orbiter has been examining Mars with six science instruments since 2006. Now in an extended mission, the orbiter continues to provide insights about the planet's ancient environments and about how processes such as wind, meteorite impacts and seasonal frosts are continuing to affect the Martian surface today. This mission has returned more data about Mars than all other orbital and surface missions combined.

More than 20,600 images taken by HiRISE are available for viewing on the instrument team's website: http://hirise.lpl.arizona.edu. Each observation by this telescopic camera covers several square miles and can reveal features as small as a desk.

HiRISE is operated by the University of Arizona, Tucson. The instrument was built by Ball Aerospace & Technologies Corp., Boulder, Colo. The Mars Reconnaissance Orbiter project is managed by the Jet Propulsion Laboratory, Pasadena, Calif., for NASA's Science Mission Directorate, Washington. JPL is a division of the California Institute of Technology, also in Pasadena. Lockheed Martin Space Systems, Denver, built the spacecraft. 

Saturday, January 28, 2012

Cambridge scientist 'debunks' flying myth


A scientist at Cambridge University has debunked the long-held myth about how aircraft stay aloft.
Aeroplanes fly because their wings cause the air pressure underneath to be greater than the pressure above, lifting them into the air. But engineers have for years been frustrated by a widely repeated theory that wrongly explains what causes the pressure change, a myth commonly found in school textbooks and flight manuals.
Prof Holger Babinsky of Cambridge University's engineering department has now created a minute-long video, posted on YouTube, to lay the myth to rest once and for all, The Daily Telegraph reported.
According to conventional wisdom, the pressure change happens because the air passing over the curved upper surface of the wing has further to travel than the air passing under the flat lower surface, meaning it must travel faster to arrive at the other side of the wing at the same time.
Prof Babinsky says the myth goes against the laws of physics and the real explanation has nothing to do with the distance the air has to travel.
According to him, the curvature of the wing causes the change in air pressure because it pulls some of the air upwards, which reduces pressure, and forces the rest beneath it, creating higher pressure.
A law known as the Bernoulli equation means that when pressure is lower, air moves faster -- so the air stream above the wing does move more quickly than the one below, but this is not what causes the difference in pressure.
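For readers who want to see the Bernoulli relation in numbers, here is a minimal worked example with invented speeds (it is not taken from Prof Babinsky's video): the equation links a speed difference to a pressure difference, but, as he stresses, it does not by itself explain why the air over the curved upper surface ends up moving faster.

```python
RHO_AIR = 1.2  # kg per cubic metre, roughly sea-level air density

def bernoulli_pressure_difference(v_upper: float, v_lower: float, rho: float = RHO_AIR) -> float:
    """Pressure on the lower surface minus pressure on the upper surface, from
    p + 0.5*rho*v**2 = constant along a streamline (faster air, lower pressure)."""
    return 0.5 * rho * (v_upper ** 2 - v_lower ** 2)

# Invented speeds purely for illustration: 70 m/s over the curved top, 60 m/s underneath.
dp = bernoulli_pressure_difference(70.0, 60.0)
print(f"pressure difference: {dp:.0f} Pa, i.e. about {dp:.0f} N of lift per square metre")
```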

World's first magnetic soap 'produced'



In pioneering research, scientists claim to have produced the world's first magnetic soap, composed of iron-rich salts dissolved in water.
A team at Bristol University says that its soap, which responds to a magnetic field when placed in solution, would calm all concerns over the use of surfactants in oil-spill clean-ups and revolutionise industrial cleaning products.
Researchers have long been searching for a way to control soaps (or surfactants, as they are known in industry) once they are in solution, in order to increase their ability to dissolve oils in water and then remove them from a system.
The Bristol University team produced the magnetic soap by dissolving iron in a range of inert surfactant materials composed of chloride and bromide ions, very similar to those found in everyday mouthwash or fabric conditioner.
The addition of the iron creates metallic centres within the soap particles, say the scientists led by Julian Eastoe.
To test its properties, the team introduced a magnet to a test tube containing their new soap lying beneath a less dense organic solution, the 'Angewandte Chemie' journal reported.
When the magnet was introduced, the iron-rich soap overcame both gravity and the surface tension between the water and oil, levitating through the organic solvent to reach the source of the magnetic field and so proving its magnetic properties.

Friday, January 27, 2012

A non-hydrolytic sol–gel route to highly active MoO3–SiO2–Al2O3 metathesis catalysts


Mesoporous mixed oxides prepared via a non-hydrolytic sol–gel route markedly outperform other MoO3-based catalysts in the metathesis of propene.

Mind the Science Gap – Helping science students connect with a non-science audience


Studying for a Masters degree in Public Health prepares you for many things. But it doesn’t necessarily give you hands-on experience of how to take complex information and translate it into something others can understand and use. Yet as an increasing array of public health issues hits the headlines, from fungicide residues in orange juice to the safe development of new technologies, this is exactly where public health professionals need to be developing their skills. And it’s not only in the public domain: the ability to translate complex science into actionable intelligence is more important now than ever in helping policy makers and business leaders make decisions that are grounded in evidence rather than speculation.
These were just some of the drivers behind a new course I have just started teaching at the University of Michigan School of Public Health that is built around science blogging. OK, so maybe I wanted to have a little fun with the students as well. But my experiences with the blog 2020 Science have taught me that the discipline of writing a science-based blog for a broad audience is invaluable for developing highly transferrable communication skills. And it’s not just me. Emailing with the scientist, author and blogger Sheryl Kirshenbaum about the course, she admitted “blogging taught me how to effectively communicate with broad audiences”. (Sheryl also added that she has learned a great deal from many wonderful editors – to which I can only add “me too!”)
The new course throws ten Masters of Public Health students in at the deep end by challenging each of them to publish ten posts over ten weeks on the blog Mind The Science Gap – and to respond to the comments they receive.  As this is a science blog, each post will be based around published health-related research.  The challenge for the writers will be to translate this into a science-grounded piece that is relevant and accessible to a broad audience.
The key objective here is to develop new skills through experience.  And for this, I am encouraging as many people as possible to comment on the posts.  As any science blogger will tell you, even simple comments like “I liked this” or “this was confusing” are extremely helpful in understanding what works and what doesn’t.  But I am also hoping readers will look beyond the educational aspects of the exercise, and engage with the students on the subjects they are writing about.  This is where I suspect the experience will become most empowering.
There’s another aspect of the course that intrigues me. Rather naively, I started this exercise imagining a series of impersonal posts that focused on intellectually interesting but emotionally ambivalent scientific studies. What I forgot is that public health matters to people. And so it’s going to be tough for our bloggers to separate what they write about from their passions – and those of their readers. In fact I’m not even sure that such a separation would be appropriate – for communication to be relevant, it needs to go beyond the numbers. But how do you effectively combine science with a desire to make the world a better place in a blog? I try to achieve this on my own blog, but I must admit I don’t have any easy answers here. So as the Mind The Science Gap students develop their skills, I’m going to be doing some learning of my own as I watch how they respond to this particular challenge.
At the end of the day, Mind The Science Gap is about teaching the next generation of public health professionals how to connect more effectively with non-specialist and non-technical audiences – whether they are managers, clients, policy makers or members of the public.  It isn’t about creating a new wave of science bloggers.  But in the process, I like to think that some of the participants will get the blogging bug. Whether they do or not, I’m looking forward to ten weeks of engaging, entertaining and hopefully challenging posts from ten talented students.


Read more: http://2020science.org/#ixzz1keRk1Ove

National Academy publishes new nanomaterials risk research strategy


The US National Academy of Sciences today published its long-awaited Research Strategy for Environmental, Health, and Safety Aspects of Engineered Nanomaterials. I won’t comment extensively on the report as I was a member of the committee that wrote it. But I did want to highlight a number of aspects of it that I think are particularly noteworthy:
Great progress so far, but it’s time to change gears. Something we grappled with as a committee was what the value of yet another research strategy was going to be.  After all, it wasn’t so long ago that the US federal government published a well received strategy of its own.  A key driver behind our strategy was a sense that the past decade has been one of defining the challenges we face as the field of nanotechnology develops, while the next decade will require more focus as an ever greater number of nanotechnology-enabled products hit the market.  In other words, from a research perspective it’s time to change gears, building on past work but focusing on rapidly emerging challenges.
Combining life cycle and value chain in a single framework for approaching nanomaterial risk research.  As a committee, we spent considerable time developing a conceptual framework for approaching research addressing the health and environmental impacts of engineered nanomaterials.  What we ended up using was a combination of value chain – ranging from raw materials to intermediate products to final products – and material/product life cycle at each stage of the value chain.  This effectively allows risk hot spots to be identified at each point of a material and product’s development, use and disposal cycle.
Principles, not definitions.  Rather than rely on a single definition of engineered nanomaterial to guide risk-related research, we incorporated a set of principles into our conceptual framework to help identify materials of concern from an environment, health and safety impact perspective.  These build on the principles proposed by myself, Martin Philbert and David Warheit in a toxicology review published last year.  From the National Academies report:
…the present committee focuses on a set of principles in lieu of definitions to help identify nanomaterials and associated processes on which research is needed to ensure the responsible development and use of the materials. The principles were adopted in part because of concern about the use of rigid definitions of ENMs that drive EHS research and risk-based decisions … The principles are technology-independent and can therefore be used as a long-term driver of nanomaterial risk research. They help in identifying materials that require closer scrutiny regarding risk irrespective of whether they are established, emerging, or experimental ENMs. The principles are built on three concepts: emergent risk, plausibility, and severity; …
Emergent risk, as described here, refers to the likelihood that a new material will cause harm in ways that are not apparent, assessable, or manageable with current risk-assessment and risk-management approaches. Examples of emergent risk include the ability of some nanoscale particles to penetrate to biologically relevant areas that are inaccessible to larger particles, the failure of some established toxicity assays to indicate accurately the hazard posed by some nanomaterials, scalable behavior that is not captured by conventional hazard assessments (such as behavior that scales with surface area, not mass), and the possibility of abrupt changes in the nature of material-biologic interactions associated with specific length scales. Identifying emergent risk depends on new research that assesses a novel material’s behavior and potential to cause harm.
Emergent risk is defined in terms of the potential of a material to cause harm in unanticipated or poorly understood ways rather than being based solely on its physical structure or physicochemical properties. Thus, it is not bound by rigid definitions of nanotechnology or nanomaterials. Instead, the principle of emergence enables ENMs that present unanticipated risks to human health and the environment to be distinguished from materials that probably do not. It also removes considerable confusion over how nanoscale atoms, molecules, and internal material structures should be considered from a risk perspective, by focusing on behavior rather than size.
Many of the ENMs of concern in recent years have shown a potential to lead to emergent risks and would be tagged under this principle and thus require further investigation. But the concept also allows more complex nanomaterials to be considered—those in the early stages of development or yet to be developed. These include active and self-assembling nanomaterials. The principle does raise the question of how “emergence” is identified, being by definition something that did not exist previously. However the committee recognized that in many cases it is possible to combine and to interpret existing data in ways that indicate the possible emergence of new risks. For example, some research has suggested that surface area is an important factor that affects the toxic potency of some ENMs; ENMs that have high specific surface area and are poorly soluble might pose an emergent risk.
Plausibility refers in qualitative terms to the science-based likelihood that a new material, product, or process will present a risk to humans or the environment. It combines the possible hazard associated with a material and the potential for exposure or release to occur. Plausibility also refers to the likelihood that a particular technology will be developed and commercialized and thus lead to emergent risks. For example, the self-replicating nanobots envisaged by some writers in the field of nanotechnology might legitimately be considered an emergent risk; if it occurs, the risk would lie outside the bounds of conventional risk assessment. But this scenario is not plausible, clearly lying more appropriately in the realm of science fiction than in science. The principle of plausibility can act as a crude but important filter to distinguish between speculative risks and credible risks.
The principle of severity refers to the extent and magnitude of harm that might result from a poorly managed nanomaterial. It also helps to capture the reduction in harm that may result from research on the identification, assessment, and management of emergent risk. The principle offers a qualitative reality check that helps to guard against extensive research efforts that are unlikely to have a substantial effect on human health or environmental protection. It also helps to ensure that research that has the potential to make an important difference is identified and supported.
Together, those three broad principles provide a basis for developing an informed strategy for selecting materials that have the greatest potential to present risks. They can be used to separate new materials that raise safety concerns from materials that, although they may be novel from an application perspective, do not present undetected, unexpected, or enhanced risks. They contribute to providing a framework for guiding a prioritized risk-research agenda. In this respect, the principles were used by the committee as it considered the pressing risk challenges presented by ENMs.
Maintaining current research and development funding levels.  As a committee, we felt that the current US federal government investment of ~$120 million in environment, health and safety-specific nanotechnology research was reasonable, especially given the current economic climate.  However, we did recommend that, as knowledge develops and commercialization of products using nanomaterials increases, funded research be aligned with areas and priorities identified within the strategy.
Developing cross-cutting activities.  There were five areas where the committee felt that further funding was needed to ensure the value of nano-risk research was fully realized.  Each of these cuts across areas of research, and provides the means to maximize the benefit of the science being supported.  From the report:
Informatics: $5 million per year in new funding for the next 5 years should be used to support the development of robust informatics systems and tools for managing and using information on the EHS effects of ENMs. The committee concluded that developing robust and responsive informatics systems for ENM EHS information was critical to guiding future strategic research, and translating research into actionable intelligence. This includes maximizing the value of research that is EHS-relevant but not necessarily EHS-specific, such as studies conducted during the development of new therapeutics. Based on experiences from other areas of research, investment in informatics of the order of $15 million is needed to make substantial progress in a complex and data rich field. However, within the constraints of nanotechnology R&D, the committee concluded that the modest investment proposed would at least allow initial informatics systems to be developed and facilitate planning for the long-term.
Instrumentation: $10 million per year in new funding for the next 5 years should be invested in translating existing measurement and characterization techniques into platforms that are accessible and relevant to EHS research and in developing new EHS-specific measurement and characterization techniques for assessing ENMs under a variety of conditions. The committee recognized that the proposed budget is insufficient for substantial research into developing new nanoscale characterization techniques—especially considering the cost of high-end instruments such as analytic electron microscopes—in excess of $2 million per instrument. However, the proposed budget was considered adequate to support the translation of techniques developed or deployed in other fields for the EHS characterization of ENMs. 
Materials: Investment is needed in developing benchmark ENMs over the next 5 years, a long-standing need that has attracted little funding to date. The scope of funding needed depends in part on the development of public-private partnerships. However, to assure that funding is available to address this critical gap, the committee recommends that $3-5 million per year be invested initially in developing and distributing benchmark ENMs. While more funds could be expended on developing a library of materials, this amount will assure that the most critically needed materials are developed. These materials will enable systematic investigation of their behavior and mechanisms of action in environmental and biologic systems. The availability of such materials will allow benchmarking of studies among research groups and research activities. The committee further recommends that activities around materials development be supported by public-private partnerships. Such partnerships would also help to assure that relevant materials are being assessed.
Sources: $2 million per year in new funding for the next 5 years should be invested in characterizing sources of ENM release and exposure throughout the value chain and life cycle of products. The committee considered that this was both an adequate and reasonable budget to support a comprehensive inventory of ENM sources.
Networks: $2 million per year in new funding for the next 5 years should be invested in developing integrated researcher and stakeholder networks that facilitate the sharing of information and the translation of knowledge to effective use. The networks should allow participation of representatives of industry and international research programs and are a needed complement to the informatics infrastructure. They would also facilitate dialogue around the development of a dynamic library of materials. The committee concluded that research and stakeholder networks are critical to realizing the value of federally funded ENM EHS research and considered this to be an area where a relatively small amount of additional funding would have a high impact—both in the development of research strategies and in the translation and use of research findings. Given the current absence of such networks, the proposed budget was considered adequate.
Authority and accountability.  In our report, we talk quite a bit about the need for an entity within the federal government to take the lead in implementing a risk research strategy.  While the US National Nanotechnology Initiative has done a great job coordinating interagency activities, we felt that coordination without authority can only go so far if socially and economically important research is to be conducted in a timely and relevant manner.  What this “entity” might look like – we left that to the federal government to chew over.
There’s a lot more to the report – including (as you would expect) a broad assessment of research areas that need attention if the science of nanomaterial human health and environmental impacts is to continue to develop effectively.
This is the first of two reports; the second is due in around 18 months and will look at progress toward implementing a relevant and effective research strategy.


Tuesday, January 24, 2012

Google Search Kills Human Memory


Bangalore: With technology penetrating every part of our lives, we have learned to search for information on the internet. But could this practice be altering the way we store and process information? A new study has revealed findings suggesting that Google could be eroding our memories.


Psychology professors from Columbia University, the University of Wisconsin-Madison and Harvard University published a study which found that people “remember less by knowing information than by knowing where the information can be found”, essentially meaning that we tend to forget information we have searched for and instead remember where on the internet it can be found. “The Internet, with its search engines such as Google and databases such as IMDB and the information stored there, has become an external memory source that we can access at any time,” said the study. “It has become so commonplace to look up the answer to any question the moment it occurs, it can feel like going through withdrawal when we can’t find out something immediately.”


In short, the four tests the professors conducted confirmed that searching for information on the internet could be weakening our “internal memory”, the store of information we hold in our heads. “When people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it”, said the study.


So instead of remembering the information itself, such as what happened during an event, we tend to remember which sites we saw it on, or where on Google we found those sites.


According to the study, “These results suggest that processes of human memory are adapting to the advent of new computing and communication technology. We are becoming symbiotic with our computer tools, growing into interconnected systems that remember less by knowing information than by knowing where the information can be found.”  “This gives us the advantage of access to a vast range of information—although the disadvantages of being constantly ‘wired’ are still being debated”, concluded the professors in the study.

Monday, January 23, 2012

The Northernmost Dish In The World: Tracking Satellites and Dodging Polar Bears


Cold Dish. Photograph: Greg White
Sten-Christian Pedersen oversees the northernmost antenna array on Earth: 25 dishes tracking about 100 satellites on the small archipelago of Svalbard, 500 miles south of the North Pole. Even when the wind chill is –76°F and visibility is 10 feet, Pedersen drives to the satellite station. When there is a risk of avalanche, he takes a helicopter. When there are polar bears, he carries a firearm.
Pedersen says the antennas are protected by “radomes: close to a drum skin, but made out of plastic.” Under the radome is the dish, a lattice of supports with four motors in its base; when the dishes move, it is barely discernible. Svalbard Satellite Station tracks polar orbits—satellites make a complete lengthwise circuit of the globe every 100 minutes, 14 times a day. In 25 days, a polar-orbiting satellite will have seen the whole surface of the Earth.
As a satellite passes overhead, SvalSat’s technicians have 12 to 15 minutes to download its data at 300 megabytes a second. The data (images, mostly) is then sent through fiber-optic cables in concrete tubes aboveground (permafrost prevents burial), before it reaches an undersea cable connecting to northern Norway. A new weather satellite, named NPP, has just launched. Pedersen says that the station is in “freeze, to minimize any risk of a problem.” His week has been an easy one. The sun recently set for the last time until mid-February. “Then,” he says, “you will start to see small light from the south. The blue season, we call it.”
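A quick calculation based on the figures quoted above (my arithmetic, not SvalSat's) gives a feel for the data volumes involved in a 12-to-15-minute pass at 300 megabytes per second:

```python
DOWNLINK_MB_PER_S = 300      # downlink rate quoted in the article
PASSES_PER_DAY = 14          # orbits per satellite per day, per the article
PASS_MINUTES = (12, 15)      # visibility window per pass

for minutes in PASS_MINUTES:
    gb_per_pass = DOWNLINK_MB_PER_S * minutes * 60 / 1000.0   # decimal gigabytes
    tb_per_day = gb_per_pass * PASSES_PER_DAY / 1000.0        # if every orbit were downlinked
    print(f"{minutes}-minute pass: ~{gb_per_pass:.0f} GB; ~{tb_per_day:.1f} TB/day for one satellite")
```

That works out to a couple of hundred gigabytes per pass, and on the order of a few terabytes per day if every orbit of a single satellite were downlinked.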

Cloud-Based Quantum Computing Will Allow Secure Calculation on Encrypted Bits


Entangled Qubits: Clusters of entangled qubits allow quantum computations to be performed on a remote server while keeping the contents and results hidden. Credit: EQUINOX GRAPHICS
When quantum computers eventually reach larger scales, they’ll probably remain pretty precious resources, locked away in research institutions just like our classical supercomputers. So anyone who wants to perform quantum calculations will likely have to do it in the cloud, remotely accessing a quantum server somewhere else. A new double-blind cryptography method would ensure that these calculations remain secret. It uses the uncertain, unusual nature of quantum mechanics as a double advantage.
Imagine you’re a developer and you have some code you’d like to run on a quantum computer. And imagine there’s a quantum computer maker who says you can run your code. But you can’t trust each other — you, the developer, don’t want the computer maker to rip off your great code, and the computer builder doesn’t want you to peep its breakthrough machine. This new system can satisfy both of you.
Stefanie Barz and colleagues at the University of Vienna’s Center for Quantum Science and Technology prepared an experimental demonstration of a blind computing technique, and tested it with two well-known quantum computing algorithms.
Here’s how it would work: You, the developer, prepare some quantum bits, in this case photons with a polarization (vertical or horizontal) known only to you. Then you would send these to the remote quantum server. The computer would entangle the qubits with even more qubits, using a quantum entangling gate — but the computer wouldn’t know the nature of the entangled states, just that they are in fact entangled. The server is “blind” to the entanglement state, and anyone tapping into the server would be blind, too.
Imagine the computer tries to snoop on the qubits and see their entanglement, which could then be used to extract the information they carry. You’d be able to tell, because of the laws of quantum mechanics. The cat is both dead and alive until you check whether it’s dead or alive, and then it’s one or the other. In the same way, measuring your photon disturbs its state, so you’d be able to tell that it had been spied upon.
Back to the entangled bits. The actual information processing takes place via a sequence of measurements on your qubits. These measurements would be directed by you, based on the particular states of each qubit (which, again, only you know). The quantum server would run the measurements and report the results to you. This is called measurement-based quantum computation. Then you’d be able to interpret the results, based on your knowledge of the qubits’ initial states. To the computer — or any interceptor — the whole thing would look utterly random.
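A toy numerical sketch of the angle-hiding trick behind this kind of measurement-based blind computing is shown below. It follows the spirit of the general protocol rather than the Vienna experiment itself, and every name and number in it is an assumption made for illustration: the client wants the server to measure at an angle phi, but what the server actually receives is phi offset by a secret preparation angle theta and a secret random bit, so the instructions it sees look uniformly random.

```python
import numpy as np

rng = np.random.default_rng(7)
ANGLE_SET = np.arange(8) * np.pi / 4   # assumed angle set {0, pi/4, ..., 7*pi/4}

def blinded_instruction(phi_desired: float) -> float:
    """Instruction sent to the server for one qubit: delta = phi + theta + r*pi (mod 2*pi).
    theta (secret preparation angle) and r (secret random bit) are known only to the client."""
    theta = rng.choice(ANGLE_SET)
    r = rng.integers(0, 2)
    return float((phi_desired + theta + r * np.pi) % (2.0 * np.pi))

# Even if the client always wants the same measurement angle, the instructions the
# server sees are spread evenly over all eight possible angles.
deltas = np.array([blinded_instruction(np.pi / 4) for _ in range(80_000)])
bins = np.rint(deltas / (np.pi / 4)).astype(int) % 8
print(np.round(np.bincount(bins, minlength=8) / deltas.size, 3))  # ~0.125 in every slot
```

Because the distribution of instructions is flat no matter which angle the client has in mind, inspecting them tells the server, or an eavesdropper, nothing about the computation.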
Since you know the entangled state on which the measurements were made, you can be certain whether the server really was a quantum computer. And you wouldn’t have to disclose your algorithm, the input or even the output — it’s perfectly secure, the researchers write in their paper, published online today in Science.
Blind quantum computation is more secure than classical blind computation, which relies on assumptions such as the difficulty of factoring large numbers into primes, said Vlatko Vedral, a researcher at the University of Oxford who wrote a Perspective piece explaining this finding.
“The double blindness is guaranteed by the laws of quantum physics, instead of the assumed difficulty of computational tasks as in classical physics,” Vedral writes.
The Vienna team argues their simulation is a potentially useful technique for future cloud-based quantum computing networks.
“Our experiment is a step toward unconditionally secure quantum computing in a client-server environment where the client’s entire computation remains hidden, a functionality not known to be achievable in the classical world,” they write.