Thursday, February 23, 2012

The James Webb Telescope


The James Webb Space Telescope will let NASA peer into deep space to the beginnings of our universe. It will be specially equipped to view infrared light, which escapes from the dust clouds where the first stars and planets formed.
Comparing mirrors: Webb’s primary mirror’s resolution is acute enough to see details of a penny at 24 miles.

Packed away for launch: The telescope is folded up at launch and automatically deploys to its full size once in space.

Remaining in place: The Webb telescope must remain cold – around minus 370 degrees – to best observe infrared light. Its orbit will keep it in place relative to the sun and Earth.

Seeing green: Costs for the Webb telescope have mounted to an estimated $8.8 billion as annual funding has grown.

Note: The $8.8-billion overall cost includes development, launch and five years of operations and science. Data from 2013 to 2017 estimated.

NASA's Webb telescope: Revolutionary design, runaway costs


The $8.8-billion Webb Space Telescope promises to provide a glimpse at the first light in the universe. But its spiraling cost may push NASA into a new era of austerity.

Webb Space Telescope under construction
Specialists work on the sun shield for the James Webb Space Telescope at Northrop Grumman Corp. in Redondo Beach. (Allen J. Schaben, Los Angeles Times / December 20, 2011)

In deep, cold space, nearly a million miles from Earth, a giant telescope later this decade will scan for the first light to streak across the universe more than 13 billion years ago.

The seven-ton spacecraft, one of the most ambitious and costly science projects in U.S. history, is under construction for NASA at Northrop Grumman Corp.'s space park complex in Redondo Beach.

The aim is to capture the oldest light, taking cosmologists to the time after the big bang when matter had cooled just enough to start forming the first blazing stars in what had been empty darkness. Astronomers have long dreamed about peering into that provenance.

"It is the actual formation of the universe," said Alan Dressler, the astronomer at the Observatories of the Carnegie Institution for Science in Pasadena who chaired a committee that proposed the telescope more than a decade ago.

If the James Webb Space Telescope works as planned, it will be vastly more capable than any of the dozen currently deployed U.S. space telescopes and will be a dramatic symbol of U.S. technological might. But for all its sophistication, the project also reveals a deeply ingrained dysfunction in the agency's business practices, critics say. The Webb's cost has soared to $8.8 billion, more than four times the original aerospace industry estimates, which nearly led Congress to kill the program last year.

The agency has repeatedly proposed such technologically difficult projects at bargain-basement prices, a practice blamed either on flaws in its culture or on political strategy. Rep. Frank R. Wolf (R-Va.), chairman of the House appropriations subcommittee that controls NASA's budget, said a combination of both problems affected the Webb.

"There was not adequate oversight," Wolf said. "And there were reports that the cost estimates were being cooked a little bit, some by the company, some by NASA."

It could spell a new era for the space agency, in which it will have money for just one flagship science mission per decade rather than one every few years as it has in the past. The Webb's cost growth, along with an austere budget outlook for NASA, is depleting the agency's pipeline of big science missions. A much-discussed mission to return samples of Martian soil to Earth, for example, may be unaffordable, according to the House Science Committee staff.

The Webb telescope was conceived by the astronomy community in the late 1990s as a more modest project with a smaller mirror for about $500 million. Then-NASA chief Daniel Goldin challenged the science community in a major speech to double the capability of the telescope for the same price.

Dressler, who was in the audience when Goldin gave the speech, recalled: "It astonished everybody. It made no sense that you could build a telescope six times larger than Hubble … and have it come in cheaper. We were so stunned, we didn't know what to do."

The early lowball cost figures had no official standing, but they shaped political expectations many years later.

Not surprisingly, the price began to rise, first to $1 billion and then to more than $2 billion when the aerospace industry began submitting estimates. By 2008, when the program was well underway, the cost hit $5 billion.

NASA was running into technical difficulties in manufacturing almost every aspect of the telescope, and it was forced to stretch out the schedule, said Richard Howard, NASA's head of the Webb program and the agency's deputy chief technologist. The agency kept investing in the most difficult technologies for the Webb, leaving other parts of the project out of sync. As a result, some components will be boxed up and stored for years while other pieces are completed.

The delays boosted the cost even more. By last year, the cost estimate to build the telescope hit $8 billion, not including about $940 million in contributions by international partners and about $800 million NASA will spend for five years of operation. The launch date slipped from 2014 to 2018, meaning an army of experts will have to keep working years more on the project. In the past, NASA could tap reserves in its larger budget to get through technical problems, but those funding pools have dried up, Howard said.

The skyrocketing cost infuriated many in Congress. Last year, Wolf led an effort by House Republicans to eliminate all of the Webb's funding, though it was ultimately restored by a conference committee. But to those working on the program, the message was sent.

"It didn't feel good," said Scott Willoughby, Northrop's general manager for the project. "It is costing more than it should. But we didn't make any bad choices. The money was well-spent. We are building the telescope we originally conceived."

Indeed, an independent review panel commended the telescope team last year for its technical merit. The machine has required a whole list of revolutionary developments.

The 21-foot-diameter mirror will have six times the light-collecting area of Hubble's, focused by more than 100 motors on its back. Made up of 18 hexagonal segments covered in a thin layer of gold, it is so big that it must be folded up for launch — another innovation.
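For a rough sense of that size advantage, here's a quick back-of-the-envelope comparison. The figures below are approximate public values used for illustration, not numbers from this article: roughly 25 square meters of collecting area for Webb's segmented mirror versus Hubble's 2.4-meter circular primary.

    import math

    # Approximate values used for illustration only (not official specifications).
    webb_collecting_area_m2 = 25.0   # Webb's 18 gold-coated hexagonal segments, ~21 ft across
    hubble_diameter_m = 2.4          # Hubble's circular primary mirror
    hubble_area_m2 = math.pi * (hubble_diameter_m / 2) ** 2   # ~4.5 m^2, ignoring the central obstruction

    ratio = webb_collecting_area_m2 / hubble_area_m2
    print(f"Webb collects roughly {ratio:.1f} times as much light as Hubble")   # ~5.5x, i.e. about six times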

To withstand the brutal temperature shifts in space and to save weight, the mirror is made of the rare element beryllium. Only a few companies in the world can polish beryllium so finely that mere atoms can be brushed off. One of those companies is L-3 Communications SSG-Tinsley Inc. in Richmond, northeast of San Francisco. The grinding and polishing process took seven years and required the company to build eight custom machines that cost $1 million apiece.

"We had to find a way to do this right," said John Kincade, a vice president with L-3. "The mirrors have to be perfect."

As ancient light traverses the universe, it shifts to the infrared region of the spectrum, requiring the Webb to have mirrors capable of collecting very faint emissions and detecting them with special sensors that must be kept at nearly the lowest possible temperature known to exist. The satellite will rely on four instruments, supplied by a European consortium, Canada, the University of Arizona and the Jet Propulsion Laboratory in La Cañada Flintridge, with other partners.
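The reason that ancient light arrives as infrared is cosmological redshift: light emitted at wavelength λ is observed at (1 + z) times λ, where z is the redshift. A minimal sketch, using the hydrogen Lyman-alpha line and z = 10 purely as illustrative values (they are not figures quoted in this article):

    # Cosmological redshift: observed wavelength = (1 + z) * emitted wavelength.
    # The spectral line and redshift below are illustrative values, not from the article.
    lyman_alpha_nm = 121.6   # ultraviolet hydrogen line emitted by early stars and galaxies
    z = 10.0                 # redshift of light that left its source over 13 billion years ago

    observed_nm = (1 + z) * lyman_alpha_nm
    print(f"Emitted at {lyman_alpha_nm} nm (ultraviolet), observed at {observed_nm:.0f} nm "
          f"({observed_nm / 1000:.2f} micrometers, squarely in the near-infrared)")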

To achieve those low temperatures, the Webb will have a sophisticated refrigeration system and a five-layer plastic shade to shield the mirror and instruments from the sun. The shade will stretch to the size of a tennis court, keeping temperatures on one side at minus-388 degrees Fahrenheit and the other hot enough to fry an egg at 185 degrees. If it all works, not only will the Webb see the first light of the universe, but it will spot new planets and even determine whether those distant bodies hold water, Howard said.

Howard is confident now that the cost will not increase further and that NASA can execute the program on the new schedule. If the cost does go higher, Wolf admits Congress is not likely to kill the program but says NASA will get hurt in many other ways.

"The real danger is not that [the Webb] will not be funded, but it will consume so many other NASA programs," he warned.

Friday, February 17, 2012

AMD HD7770 Review


Introduction
If you're a big fan of getting a lot of performance for a reasonable outlay, then the number 770 will get you excited. After all, the HD 4770 and HD 5770 were both cards that delivered great performance at a very reasonable price point.
Given that the top-end cards in the 7 series are more expensive than we're used to from AMD, and that the company rejigged its naming conventions with the 6 series, will the HD7770 live up to the lofty expectations set by its predecessors?
Technical Specifications
One of the main selling points of the HD7770 is that it has all of AMD's technologies built in. Rather than cutting down the feature set along with the stream processor count and clock speed, AMD has reduced only the card's raw performance, leaving the Eyefinity and rendering features intact.

Sunday, February 12, 2012

Deconstructing Dengue: How Old Is That Mosquito?



Scientists can spend years working on problems that at first may seem esoteric and rather pointless. For example, there's a scientist in Arizona who's trying to find a way to measure the age of wild mosquitoes.
As weird as that sounds, the work is important for what it will tell scientists about the natural history of mosquitoes. It also could have major implications for human health.
Here's why. There's a nasty disease called dengue that is just beginning to show up in the United States. It's caused by a virus, and it's transmitted from person to person by a mosquito. A mild case of dengue is no worse than flu. A serious case can mean death.
Michael Riehle at the University of Arizona is trying to solve a curious puzzle about dengue: why there have been dozens of cases in nearby Texas and none, or virtually none, in Arizona. Riehle thinks the answer has to do with Arizona's geography.
"It's right on the edge of the range where these dengue mosquitoes are found," he says. "It's a fairly harsh environment, and we think that they might not be surviving long enough to efficiently transfer the disease to other people."
So to test his hypothesis, Riehle wants to be able to compare the life spans of mosquitoes in Arizona with those in Texas.
It's not easy to tell how old a mosquito is: It's not as if they carry around birth certificates or government-issued IDs. Right now the tools for measuring the age of mosquitoes are pretty crude. For example, you can look at a female mosquito's ovaries to see if they have produced any eggs. Riehle says if they have, that means the mosquito is at least five days old, since they can't produce eggs before that. "But that's all it can tell us — less than five days, or more than five days," Riehle says.
So Riehle has a new idea. He wants to see if he can use a mosquito's genes to tell its age.
Looking For Clues In Genes
There are ways to tell when a particular gene is switched on or off in a mosquito. Riehle is looking for genes that switch on or off when the mosquito reaches a particular age. He's found one so far; he needs more in order to make more precise age estimates.

Saturday, February 11, 2012

The Continent Where Climate Went Haywire

From floods to cyclones to fires of unimaginable ferocity, climate change has unleashed a host of plagues on Australia. But catastrophe has spawned a national rebirth.

“The river came up to right where we’re sitting, and the waters were more than two feet deep,” Peter Goodwin tells me in the driveway of his ranch-style house perched on the banks of the Balonne River in St. George, a village of 3,500 in eastern Australia. It is a drizzly Sunday afternoon in April, three months after a devastating flood that drenched a landmass the size of France and Germany combined and isolated the town after the rain-swollen river rose to a record 45 feet.
Agricultural areas like St. George were hardest hit by the relentless rains and overflowing rivers that swamped roads, cut off power lines, washed away vineyards and fruit orchards, drowned thousands of head of cattle and other livestock, and covered homes and everything inside them in thick layers of sediment and mud. Shell-shocked residents are still digging out from under the debris.
“That’s the hard part of the flood—the aftermath,” says Goodwin, 60, a crusty, compactly built man with piercing blue eyes and calloused hands who works as an operations manager for the local municipality and has been staying with his grown daughter while he makes his home habitable again. “You get a lot of help during the flood, but then everyone settles back into their routine. There are a lot of houses down there that are still empty,” he adds, gesturing toward the riverbank. “And they will be for a long time to come”...

Thursday, February 9, 2012

New 'Cell Assay On a Chip:' Solid Results from Simple Means

The great artist and inventor Leonardo da Vinci once said that "simplicity is the ultimate sophistication." National Institute of Standards and Technology (NIST) research engineer Javier Atencia certainly believes in the wisdom of what da Vinci preached; he has a reputation for creating novel microfluidic devices out of ordinary, inexpensive components. This time, he has combined a glass slide, plastic sheets and double-sided tape into a "diffusion-based gradient generator" -- a tool to rapidly assess how changing concentrations of specific chemicals affect living cells.



Atencia's latest innovation offers a simple and inexpensive way to expose an array of cultured cells to a chemical gradient -- a solution in which the chemical concentration changes gradually and predictably across the array. Such gradients offer a rapid, high-throughput way to evaluate a chemical's effect on cell growth or its toxicity.
There are two distinct advantages to the new NIST system over traditional microfluidic cell assay devices. The first is that the gradient is created by diffusion -- the gentle movement of matter from one point to another by random molecular motion. Conventional microfluidic systems usually mix fluids by pumping them in a circular motion or by twisting and folding them together. Diffusion greatly reduces the risk of cells being swept away or damaged by shearing forces in the test fluid.
The second big advantage is simplicity. The gradient generator is built in layers, with each section precisely positioned with an alignment tab. The base is a glass slide, onto which is attached a strip of double-sided tape cut to have a row of four micrometer-sized channels. Atop this goes a polystyrene strip cut to have two rows of four tiny circular "wells," with each pair of wells lining up with the ends of the channel below it. The next layer is another strip of double-sided tape, this time with a Y-shaped canal cut into it to serve as the flow path for the chemical gradient. Finally, a Mylar strip cut to have an identical Y-canal serves as the cover.
The hinged cover allows access to the wells for adding test cells. Once done, the cover is lowered and affixed, sealing the gradient generator. Fluid flow in and out of the system is accomplished using another Atencia innovation, magnetic connectors. Kept at constant pressure, this flow assures a steady-state stream through the device and creates a diffusion gradient in each buried channel. Cells in the channels are simultaneously exposed to a range of chemical concentrations from high to low.
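The diffusion-only behavior is easy to picture with a one-dimensional toy model: hold the concentration fixed at the two ends of a channel and let random molecular motion alone relax the interior toward a smooth gradient. The sketch below is a generic finite-difference illustration with made-up parameters, not a model of the actual device geometry:

    import numpy as np

    # Toy 1D diffusion in a channel whose ends are held at fixed concentrations
    # (a high-concentration side and a low-concentration side). All parameters
    # are arbitrary illustration values, not device dimensions.
    n = 51                       # grid points along the channel
    c = np.zeros(n)              # initial concentration: channel starts empty
    c[0], c[-1] = 1.0, 0.0       # fixed concentrations at the two ends
    D, dx, dt = 1.0, 1.0, 0.2    # diffusion coefficient, grid spacing, time step (dt <= dx^2 / (2*D))

    for _ in range(20000):       # march toward steady state
        c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])

    # At steady state the profile is linear, so cells along the channel are
    # exposed to a smooth range of concentrations from high to low.
    print(np.round(c[::10], 3))  # approximately [1.0, 0.8, 0.6, 0.4, 0.2, 0.0]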
To test the device, Atencia and his colleagues loaded it with cells genetically engineered to produce large amounts of green fluorescent protein (GFP) and then introduced cycloheximide (CHX), a chemical that shuts down ribosomes, the cell's protein factories. Cells exposed to the toxin quickly stop synthesizing GFP, decreasing fluorescence by an amount directly related to the concentration of CHX.
This is exactly what the researchers observed in the gradient generator assays. The cells were exposed three times to CHX, and each time, the level of GFP fluorescence increased as the concentration of CHX in the gradient decreased, and vice versa.
In his previous microfluidics creations, Atencia turned a simple plastic dish into a "microfluidic palette" for exposing cells to multiple chemical concentrations in a single chamber and then merged donut-shaped magnets and plastic tubes to make a leak-free connector for getting fluids into and out of a microchannel.

Sunday, February 5, 2012

You might be a terrorist if you have two cell phones, use Web proxies


Are you concerned enough about your digital privacy to use proxies and encryption software? Do you voice chat with fellow PC gamers? If so, you might be a terrorist, according to the FBI and DOJ. Public Intelligence has discovered a federal document outlining a "Communities Against Terrorism" initiative that describes various "suspicious" activities and offers tips on reporting them to the government.
The guideline (PDF) lists many perfectly normal behaviors but tries to validate their inclusion by emphasizing extremes that are completely open to interpretation. For instance, someone involved in terrorist activities might be "overly" concerned about their privacy and attempt to shield their screen from view of others. Such an individual might also travel "illogical" distances to visit an Internet café.
You might also be a terrorist if you use multiple cell phones, switch SIM cards in the same handset, mask your IP address, download information on "timers, electronics or remote transmitters/receivers", or partake in "suspicious" communications through VoIP services. The list goes as far as to mention logging on to a residential-based Internet provider, such as checking your Comcast or AOL email.
Naturally, it's virtually impossible to define what constitutes being "overly" private, traveling "illogical" distances or making "suspicious" VoIP calls. By making such broad strokes, it could be argued that anyone engaging in such activities is suspect, regardless of how innocuous they may be. Nonetheless, if you feel compelled to report your neighbor or coworker, the FBI and DoJ suggest that you:
  • Gather information about the suspects without drawing attention to yourself
  • Log license plates, vehicle descriptions, names, languages spoken and ethnicities
  • Don't collect metadata, content or search the individuals' electronic communications

Researchers: Don't trust satellite phones, encryption broken


Two researchers, Benedikt Driessen and Ralf Hund, managed to break the proprietary ciphers used for satellite phones, GMR-1 and GMR-2. In a public talk about their discovery, the researchers said, "Don't trust satellite phones".
While satellite phones have been mostly replaced by GSM/CDMA phones in consumer markets, such telecommunications devices are still used today by governments, militaries, relief and non-governmental organizations, businesses and even individuals in remote locations where cell phone towers are not an option. According to the researchers, there are currently several hundred thousand satellite phone subscribers.
The researchers were able to reverse engineer the cryptographic algorithms used by the phones after analyzing freely available firmware intended for updating their DSP (digital signal processor) chips. Because the ciphers' security rested on keeping those algorithms secret rather than on cryptographic strength, once the algorithms were exposed the ciphers proved weak enough that anyone who can receive a satellite phone transmission can feasibly eavesdrop on an intentionally private conversation.
"Even though a niche mar­ket com­pa­red to the G2 and G3 mo­bi­le sys­tems, there are se­ver­al 100,000 sat­pho­ne sub­scri­bers world­wi­de. Given the sen­si­ti­ve na­tu­re of some of their ap­p­li­ca­ti­on do­mains (e.g., na­tu­ral di­sas­ter areas or mi­li­ta­ry cam­paigns), se­cu­ri­ty plays a par­ti­cu­lar­ly im­portant role for sat­pho­nes. In this paper, we ana­ly­ze the en­cryp­ti­on sys­tems used in the two exis­ting (and com­pe­ting) sat­pho­ne stan­dards, GMR-1 and GMR-2. The first main cont­ri­bu­ti­on is that we were able to com­ple­te­ly re­ver­se en­gi­neer the en­cryp­ti­on al­go­rith­ms em­ploy­ed. Both ciph­ers had not been pu­bli­cly known pre­vious­ly. We de­scri­be the de­tails of the re­co­very of the two al­go­rith­ms from fre­e­ly avail­able DSP-firm­ware up­dates for sat­pho­nes, which in­clu­ded the de­ve­lop­ment of a cust­om di­sas­sem­bler and tools to ana­ly­ze the code, and ex­ten­ding prior work on bi­na­ry ana­ly­sis to ef­fi­ci­ent­ly iden­ti­fy cryp­to­gra­phic code."
ETSI and Inmarsat are the organizations responsible for the GMR-1 and GMR-2 stream cipher standards, respectively. The inherent flaw found within these systems appears to be their disregard for Kerckhoffs's principle. This 129-year-old axiom basically states that in order to be truly secure, a cryptosystem should remain effectively indecipherable even when the algorithms and processes behind it are exposed. To achieve this, cryptographers typically rely on secret keys. These keys, known only to the sender and/or recipient(s), provide a way to "unlock" the encryption so that only the intended parties can decipher the information. This principle also allows open-source cryptosystems to be just as secure as their commercial counterparts even though the inner workings of such systems are fully exposed to the public.
"The se­cond main cont­ri­bu­ti­on lies in the cryp­t­ana­ly­sis of the two pro­prie­ta­ry stream ciph­ers. We were able to adopt known A5/2 ci­pher­text-on­ly at­tacks to the GMR- 1 al­go­rithm with an aver­a­ge case com­ple­xi­ty of 232 steps. With re­spect to the GMR-2 ci­pher, we de­ve­lo­ped a new at­tack which is power­ful in a known-plain­text set­ting. In this si­tua­ti­on, the en­cryp­ti­on key for one ses­si­on, i.e., one phone call, can be re­co­ver­ed with ap­pro­xi­mate­ly 50?65 bytes of key stream and a mo­de­ra­te com­pu­ta­tio­nal com­ple­xi­ty. A major fin­ding of our work is that the stream ciph­ers of the two exis­ting sa­tel­li­te phone sys­tems are con­s­i­der­a­b­ly wea­ker than what is sta­te-of- the-art in sym­me­tric cryp­to­gra­phy."

Graphics Card Overclocking: Is It Really Worth It?


Overclocking plays a vastly different role in the computer industry today than it did 10 years ago, a time when overclockers were considered outlaws by manufacturers. Back then, even mentioning overclocking could void your warranty, and industry leaders like Intel were working to eliminate the practice altogether.
In contrast, processor and graphics card manufacturers have nowadays embraced the practice, touting high 'overclockability' as a feature and in the process using it to sell enthusiast-oriented products at a premium.
Take the popular mid-range GeForce GTX 560 Ti as an example. Base model non-overclocked cards start at ~$229, but finding them isn't so easy, as most manufacturers prefer to push their overclocked counterparts. While the Nvidia specification calls for an 822MHz core clock, you shouldn't be surprised to see outgoing models of this particular GPU series running at 900MHz or more.
Sounds too good to be true? It probably is. Often these factory overclocked models cost 10 – 20% more than the standard models, while only offering half that amount in extra performance. The alternative is to overclock manually and thankfully that's easy to do as both AMD and Nvidia drivers provide their own custom overclocking facilities.
Here's another scenario that begs the question of whether overclocking is worth it... You go out to buy a new graphics card, set a budget, and it'd seem that for another $30-60 you can always go with the next step up that performs a little better. Or, you could save those extra dollars, go for the budget model and overclock it and basically match the next step up's performance.
With that in mind, we have hand-picked three graphics cards that represent select price ranges to see just how much extra value can be obtained through overclocking. For the $100+ range we have the Radeon HD 6750, while the GeForce GTX 560 Ti represents the $200+ market. Then at the top of the food chain we have the Radeon HD 6970, going for $300 and up.
Each of these graphics cards will be overclocked to their maximum stable frequency using the stock air cooling. The comparison will be drawn using their non-overclocked results as well as a number of competing products, usually the ones costing a little bit more.
Our test system specs look like this:
- Intel Core i7-3960X Extreme Edition (3.30GHz)
- 4 x 2GB G.Skill DDR3-1600 (CAS 8-8-8-20)
- Gigabyte G1.Assassin2 (Intel X79)
- OCZ ZX Series (1250W)
- Crucial m4 512GB (SATA 6Gb/s)
- Microsoft Windows 7 SP1 64-bit
- Nvidia Forceware 285.62
- AMD Catalyst 11.12
The graphics cards tested include (from least expensive to most expensive):
- Radeon HD 6750 (1024MB) stock and overclocked at 800/1300MHz
- GeForce GTX 550 Ti (1024MB)
- Radeon HD 6850 (1024MB)
- GeForce GTX 560 Ti (1024MB) stock and overclocked at 955/2250MHz
- Radeon HD 6950 (2048MB)
- Radeon HD 6970 (2048MB) stock and overclocked at 955/1470MHz
- GeForce GTX 570 (1280MB)
- GeForce GTX 580 (1536MB)

$100+ Price Point
The Radeon HD 6750 comes by default with a core clock of 700MHz and RAM speed of 1150MHz (4.6GHz GDDR5). We were able to reach 800MHz core and 1300MHz (5.2GHz) memory speeds. This means the core was boosted by 14% and the memory 13%, so in the best of scenarios we are expecting a ~14% performance increase.
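As a quick sanity check on those percentages, and on the headroom of the other two cards in this roundup, here's a small helper. It's a simplification: the "best case" assumes a game is limited purely by the GPU core, which is rarely exactly true.

    def overclock_summary(name, core_stock, core_oc, mem_stock, mem_oc):
        # Print core/memory overclock percentages and a rough best-case fps gain,
        # assuming (simplistically) that performance scales with core clock.
        core_gain = (core_oc / core_stock - 1) * 100
        mem_gain = (mem_oc / mem_stock - 1) * 100
        print(f"{name}: core +{core_gain:.0f}%, memory +{mem_gain:.0f}%, "
              f"best case roughly {core_gain:.0f}% more fps")

    # Clock speeds (MHz) of the three cards as tested in this article.
    overclock_summary("Radeon HD 6750", 700, 800, 1150, 1300)       # core +14%, memory +13%
    overclock_summary("GeForce GTX 560 Ti", 822, 955, 1002, 1125)   # core +16%, memory +12%
    overclock_summary("Radeon HD 6970", 880, 955, 1375, 1470)       # core +9%, memory +7%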
Playing Battlefield 3 at 1920x1200 the Radeon HD 6750 rendered just 17.1fps. Overclocking provided a frame rate increase of 13% which is just 2fps. While a 13% performance gain sounds substantial, at 19.3fps Radeon HD 6750 owners will still need to turn the visual quality down in this game.
Again we find that the Radeon HD 6750 is too slow to enjoy the latest games in all of their visual glory as it produced just 14.1fps when testing with Crysis 2. When overclocked we saw a 15% performance boost but this represented too little of a change. By lowering the visual quality perhaps that 15% performance gain could come in handy.
Dirt 3 was better able to exploit the benefits from our overclocking efforts. The Radeon HD 6750 saw a 14% performance increase which worked out to be a 4fps gain. Nonetheless, going from 27.8fps on average to the 31.8fps of the overclocked configuration, lag was considerably less noticeable when we pushed for the extra MHz.
The Witcher 2 saw modest gains in our gameplay test. The Radeon HD 6750 averaged 24.9fps, which was then increased to 28.1fps, a 13% performance increase that nonetheless went unnoticed.

$200+ Price Point
The GeForce GTX 560 Ti which by default comes with a core clock of 822MHz and a memory frequency of 1002MHz (4.0GHz) reached a core clock of 955MHz and a memory speed of 1125MHz (4.5GHz) when overclocked. This is a 16% core clock increase and a 12% rise in memory frequency.
In Battlefield 3, the GeForce GTX 560 Ti rendered 39.2fps without the overclock, and the 14% gain meant that the GTX 560 Ti was now faster than a standard Radeon HD 6970.
The GeForce GTX 560 Ti went from 28.9fps to 32.6fps in Crysis 2 which worked out to be an additional 4fps on average.
In Dirt 3, the GeForce GTX 560 Ti was already sitting pretty with an average of 60.5fps, and the additional 10fps from the 16% overclock was nothing to turn down.
The GeForce GTX 560 Ti saw an 11% performance increase as it was able to render 4fps more in The Witcher 2.
$300+ Price Point
Finally the Radeon HD 6970 which comes clocked at an already high 880MHz for the core and 1375MHz (5.5GHz) for the memory provided very limited headroom for overclocking. The core maxed out at 955MHz while the memory went on to 1470MHz (5.8GHz) for a 9% core and 7% memory overclock.
In Battlefield 3, our mild 9% overclock allowed for a modest 6% performance gain as the frame rate increased by just 2fps. This didn't do a whole lot for the Radeon HD 6970 and still meant that it was much slower than the GeForce GTX 570 in this game.
The Radeon HD 6970 only saw a 1fps performance gain when testing with Crysis 2.
On Dirt 3 the Radeon HD 6970 enjoyed an extra 5fps for a 7% performance gain. This was enough to knock out the GeForce GTX 570 running at stock frequencies.
The Radeon HD 6970 also managed an extra 4fps in The Witcher 2 for a 6% performance increase.
Final Thoughts
Is graphics card overclocking worth it? In short, yes, it can be. Certain configurations respond well to the change and, after all, it's free and mostly safe. However, game-changing results should not be expected.
In our tests we saw modest gains all along, and the mid-range GeForce GTX 560 Ti seemed to be the card benefiting the most from bumping the stock specs. In the best of scenarios your games will play a bit more smoothly if you are not pushing the GPU beyond its real capabilities. In the worst of cases, you won't gain anything and you can dial things down back to normal.
In the four games that we tested the Radeon HD 6750 delivered on average 3fps more when overclocked, a 14% performance gain. The GeForce GTX 560 Ti averaged an extra 6fps for a 13% increase, and the Radeon HD 6970 averaged 3fps more for a 6% performance boost.
AMD and Nvidia are being extra careful about how they build their GPU series now. They almost never offer a card that can be overclocked to replace the model above it; you can only close the gap. You might be able to overclock the GPU higher, but other factors like memory bus width will always hold performance back. This stands in stark contrast to the latest CPUs, which appear to be wide open to enthusiasts' abuse and potentially huge performance gains.
 
Going to these extremes for the sake of overclocking is probably not worth your while with the current crop of GPUs -- at least not if all you care about are raw performance gains. Image credit: xtremesystems.org
Back in the GPU world, where overclocking may not make sense is when it's carried out by the manufacturer. Factory overclocked cards almost always cost more than the performance gain they provide is worth. For example, last year we reviewed the HIS Radeon HD 6850 IceQ X Turbo, which was priced 15% above a standard Radeon HD 6850 graphics card yet only provided a 4% performance improvement.
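One way to see how poor that deal is: compare cost per frame for the stock and the factory-overclocked card. The baseline price and frame rate below are hypothetical placeholders; only the 15% premium and 4% gain from the HIS 6850 example above drive the result.

    # Cost-per-frame comparison of a stock card versus a factory-overclocked one.
    # The baseline price and fps are hypothetical placeholders; only the 15% price
    # premium and 4% performance gain quoted above matter to the comparison.
    base_price, base_fps = 150.0, 40.0      # hypothetical stock card
    oc_price = base_price * 1.15            # 15% factory-overclock price premium
    oc_fps = base_fps * 1.04                # 4% performance improvement

    stock_value = base_price / base_fps
    oc_value = oc_price / oc_fps
    print(f"Stock: ${stock_value:.2f} per fps, factory OC: ${oc_value:.2f} per fps "
          f"({(oc_value / stock_value - 1) * 100:.0f}% more per frame)")

In other words, you end up paying about 11% more for every frame the factory-overclocked card delivers.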
While it's true that more expensive factory overclocked cards usually come with upgraded coolers, an added bonus that goes beyond raw performance, there are many similar upgraded cards without the overclocking that retail for less.
What's been your own experience? Do you overclock your graphics card these days? Did you use to? (please keep the Crysis jokes to a minimum :)).

Thursday, February 2, 2012

Male spiders of one species lose their genitals after sex to increase sperm count in females


Nephilengys malabarensis female with a severed male palp (red box) lodged in her epigynum after copulation, and a half-cannibalized male at her side.
Researchers have known for some time that the male sex organ, called a palp, in orb-web spiders is often broken off during copulation with females; what hasn't been so clear is why. Now, new research by Daiqin Li of the National University of Singapore and colleagues, published in Biology Letters, has found that by breaking off his palp, a male spider ensures that more of his sperm enters the female after he runs away or is eaten by his partner.