
Fact or Fiction?: Lead Can Be Turned into Gold

Particle accelerators make possible the ancient alchemist’s dream—but at a steep cost

For hundreds of years alchemists toiled in their laboratories to produce a mythical substance known as the philosopher’s stone. The supposedly dense, waxy, red material was said to enable the process that has become synonymous with alchemy—chrysopoeia, the metamorphosis, or transmutation, of base metals such as lead into gold.
 
Alchemists have often been dismissed as pseudoscientific charlatans but in many ways they paved the way for modern chemistry and medicine. The alchemists of the 16th and 17th centuries developed new experimental techniques, medicines and other chemical concoctions, such as pigments. And many of them "were amazingly good experimentalists,” says Lawrence Principe, a chemist and science historian at Johns Hopkins University. “Any modern professor of chemistry today would be more than happy to hire some of these guys as lab techs.” The alchemists counted among their number Irish-born scientist Robert Boyle, credited as one of the founders of modern chemistry; pioneering Swiss-born physician Paracelsus; and English physicist Isaac Newton.
 
But despite the alchemists’ intellectual firepower and experimental acumen, the philosopher’s stone lay forever out of reach. The problem, Principe says, is that the alchemists did not yet know that lead and gold were different atomic elements—the periodic table was still hundreds of years away. Believing them to be hybrid compounds, and therefore amenable to chemical change in laboratory reactions, the alchemists pursued the dream of chrysopoeia to no avail.
 
With the dawn of the atomic age in the 20th century, however, the transmutation of elements finally became possible. Nowadays nuclear physicists routinely transform one element to another. In commercial nuclear reactors, uranium atoms break apart to yield smaller nuclei of elements such as xenon and strontium as well as heat that can be harnessed to generate electricity. In experimental fusion reactors heavy isotopes of hydrogen merge together to form helium. (An element is defined by the number of protons in its nucleus whereas an isotope of a given element is determined by the quantity of neutrons.)
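That parenthetical definition can be captured in a few lines. The sketch below is illustrative (the `describe` helper and the small element table are ours, not from the article); the nuclear data themselves are standard.

```python
# Minimal sketch: an element is fixed by its proton count (Z); an isotope
# additionally fixes the neutron count (N). Mass number A = Z + N.
ELEMENTS = {79: "gold", 80: "mercury", 82: "lead", 83: "bismuth"}

def describe(protons, neutrons):
    """Return a label like 'gold-197' from proton and neutron counts."""
    name = ELEMENTS.get(protons, f"element-{protons}")
    return f"{name}-{protons + neutrons}"

print(describe(79, 118))  # gold-197, the only stable gold isotope
print(describe(82, 126))  # lead-208, one of lead's four stable isotopes
```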
 
But what of the fabled transmutation of lead to gold? It is indeed possible—all you need is a particle accelerator, a vast supply of energy and an extremely low expectation of how much gold you will end up with. More than 30 years ago nuclear scientists at the Lawrence Berkeley National Laboratory (LBNL) in California succeeded in producing very small amounts of gold from bismuth, a metallic element adjacent to lead on the periodic table. The same process would work for lead, but isolating the gold at the end of the reaction would prove much more difficult, says David J. Morrissey, now of Michigan State University, one of the scientists who conducted the research. “We could have used lead in the experiments, but we used bismuth because it has only one stable isotope,” Morrissey says. The element’s homogeneous nature means it is easier to separate gold from bismuth than it is to separate gold from lead, which has four stable isotopic identities.
 
Using the LBNL’s Bevalac particle accelerator, Morrissey and his colleagues boosted beams of carbon and neon nuclei nearly to light speed and then slammed them into foils of bismuth. When a high-speed nucleus in the beam collided with a bismuth atom, it sheared off part of the bismuth nucleus, leaving a slightly diminished atom behind. By sifting through the particulate wreckage, the team found a number of transmuted atoms in which four protons had been removed from a bismuth atom to produce gold. Along with the four protons, the collision-induced reactions had removed anywhere from six to 15 neutrons, producing a range of gold isotopes from gold 190 (79 protons and 111 neutrons) to gold 199 (79 protons, 120 neutrons), the researchers reported in the March 1981 issue of Physical Review C.
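The reported isotope range follows directly from the bookkeeping. A small sketch (the `transmute` helper is hypothetical) reproduces it from bismuth-209's proton and neutron counts:

```python
# Bismuth-209 (83 protons, 126 neutrons) loses 4 protons and 6-15 neutrons
# in the collision, leaving gold isotopes from gold-190 to gold-199.
BI_PROTONS, BI_NEUTRONS = 83, 126  # bismuth's single stable isotope

def transmute(protons_removed, neutrons_removed):
    z = BI_PROTONS - protons_removed
    n = BI_NEUTRONS - neutrons_removed
    return z, z + n  # (atomic number, mass number)

for n_removed in (6, 15):
    z, a = transmute(4, n_removed)
    print(f"remove 4 p, {n_removed} n -> Z={z}, gold-{a}")
# remove 4 p, 6 n  -> Z=79, gold-199
# remove 4 p, 15 n -> Z=79, gold-190
```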
 
The amount of gold produced was so small that Morrissey and his colleagues had to identify it by measuring the radiation given off by unstable gold nuclei as they decayed over the course of a year. In addition to the several radioactive isotopes of gold, the particle collisions presumably produced some amount of the stable isotope gold 197—the stuff of wedding bands and gold bullion—but because it does not decay the researchers were unable to confirm its presence. “The stable isotope would have to be observed in a mass spectrometer,” Morrissey says, “but I think that the number of atoms was, and is still, below the level of detection by mass spec.”
 
Isolating the minute quantities of gold would be even more difficult using lead as a starting material, but smashing high-speed nuclei into a lead target would indeed complete the long-sought transmutation. Some of the collisions would be expected to remove three protons from lead, or one proton from mercury, to produce gold. “It is relatively straightforward to convert lead, bismuth or mercury into gold,” Morrissey says. “The problem is the rate of production is very, very small and the energy, money, etcetera expended will always far exceed the output of gold atoms.”
 
In 1980, when the bismuth-to-gold experiment was carried out, running particle beams through the Bevalac cost about $5,000 an hour, “and we probably used about a day of beam time,” recalls Oregon State University nuclear chemist Walter Loveland, one of the researchers on the project. Glenn Seaborg, who shared the 1951 Nobel Prize in Chemistry for his work with heavy elements and who died in 1999, was the senior author on the resulting study. “It would cost more than one quadrillion dollars per ounce to produce gold by this experiment," Seaborg told the Associated Press that year. The going rate for an ounce of gold at the time? About $560.
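The economics can be checked against the figures quoted here. The arithmetic below is a back-of-envelope sketch built only from the article's numbers, not from the study itself:

```python
# Beam time: $5,000 per hour of Bevalac running, roughly one day used.
beam_cost_per_hour = 5_000
hours_used = 24
beam_cost = beam_cost_per_hour * hours_used
print(f"beam time alone: ${beam_cost:,}")  # $120,000

# Seaborg's estimate: over $1 quadrillion per ounce, against a ~$560
# market price for gold at the time.
ratio = 1e15 / 560
print(f"production cost exceeds market price by a factor of ~{ratio:.1e}")
```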

Corrosion Types Encountered With Power Cables


Introduction

There are numerous types of corrosion; those discussed here are the ones most likely to be encountered with underground power cable facilities.
In this initial explanation, lead will be used as the referenced metal. Copper neutral wire corrosion is not discussed here.
Stray DC currents come from sources such as welding operations, flows between two other structures, and, in days gone by, street railway systems.
Anodic corrosion is due to the transfer of direct current from the corroding facility to the surrounding medium, usually earth. At the point of corrosion, the voltage is always positive on the corroding facility.
In the example of lead sheath corrosion, the lead provides a low resistance path for the DC current to get back to its source. At some area remote from the point where the current enters the lead, but near the inception point of that stray current, the current leaves the lead sheath and is again picked up in the normal DC return path.
The point of entry of the stray current usually does not result in lead corrosion, but the point of exit is frequently a corrosion site.
Clean-sided corroded pits are usually the result of anodic corrosion. The products of anodic corrosion, such as oxides, chlorides, or sulfates of lead, are carried away by the current flow. If any corrosion products are found, they are usually lead chloride or lead sulfate created by the positive sheath potential, which attracts the chloride and sulfate ions in the earth to the lead.
In severe anodic cases, lead peroxide may be formed. Chlorides, sulfates, and carbonates of lead are white, while lead peroxide is chocolate brown.
Cathodic Corrosion
Corrosion of metal is indicative of current movement between anodic and cathodic areas through the electrolyte. The more conductive the electrolyte, the higher the rate of current movement and the more accelerated the rate of corrosion.

Cathodic corrosion is encountered less frequently than anodic corrosion, especially with the elimination of most street railway systems.
This form of corrosion is usually the result of the presence of an alkali or alkali salt in the earth. If the potential of the metal exceeds -0.3 volts, cathodic corrosion may be expected in those areas.
In cathodic corrosion, the metal is not removed directly by the electric current, but it may be dissolved by the secondary action of the alkali that is produced by the current. Hydrogen ions are attracted to the metal, lose their charge, and are liberated as hydrogen gas.
This results in a decrease in the hydrogen ion concentration and the solution becomes alkaline. The final corrosion product formed by lead in cathodic conditions is usually lead monoxide and lead / sodium carbonate. The lead monoxide formed in this manner has a bright orange / red color and is an indication of cathodic corrosion of lead.
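The diagnostic clues given in the anodic and cathodic sections above can be summarized as a toy lookup. The function below is purely illustrative (its name and rules are ours, not a field procedure); it simply encodes the potential signs and residue colors described in the text, reading "exceeds -0.3 volts" as more negative than -0.3 V:

```python
# Toy classifier for lead-sheath corrosion clues from the text above:
# anodic: positive sheath potential; white chlorides/sulfates/carbonates,
#         or chocolate-brown lead peroxide in severe cases.
# cathodic: potential beyond -0.3 V; bright orange/red lead monoxide.
def likely_corrosion_type(sheath_potential_v, residue_color):
    if sheath_potential_v > 0 and residue_color in ("white", "chocolate brown"):
        return "anodic"
    if sheath_potential_v < -0.3 and residue_color in ("orange", "red"):
        return "cathodic"
    return "inconclusive"

print(likely_corrosion_type(+0.2, "white"))   # anodic
print(likely_corrosion_type(-0.5, "orange"))  # cathodic
```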
Galvanic Corrosion
Galvanic corrosion occurs when two dissimilar metals in an electrolyte have a metallic tie between them.
One metal becomes the anode and the other the cathode. The anode corrodes and protects the cathode as current flows in the electrolyte between them. The lead sheath of a cable may become either the anode or the cathode of a galvanic cell.
This can happen because the lead sheath is grounded to a metallic structure made of a dissimilar metal and generally has considerable length.
Copper ground rods are frequently a source of the other metal in the galvanic cell. The corrosive force of a galvanic cell is dependent on the metals making up the electrodes and the resistance of the electrolyte in which they exist. This type of corrosion can often be anticipated and avoided by keeping a close watch on construction practices and eliminating installations having different metals connected together in the earth or other electrolyte.
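Why the lead sheath ends up as the anode against a copper ground rod follows from the galvanic series: the metal with the more negative electrode potential corrodes. A minimal sketch using standard textbook electrode potentials (the helper function is hypothetical):

```python
# Standard electrode potentials in volts vs. the standard hydrogen
# electrode; textbook values, used here only to rank the metals.
STANDARD_POTENTIAL_V = {"lead": -0.126, "copper": +0.340, "zinc": -0.762}

def galvanic_anode(metal_a, metal_b):
    """The more electronegative metal becomes the anode and corrodes."""
    return min((metal_a, metal_b), key=STANDARD_POTENTIAL_V.get)

print(galvanic_anode("lead", "copper"))  # lead: the sheath corrodes
print(galvanic_anode("zinc", "lead"))    # zinc: why zinc anodes protect lead
```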
Chemical Corrosion
Chemical corrosion is damage that can be attributed entirely to chemical attack without the additional effect of electron transfer.
The type of chemicals that can disintegrate lead are usually strong concentrations of alkali or acid.
Examples include alkaline solutions from incompletely cured concrete, acetic acid from volatilized wood or jute, waste products from industrial plants, or water with a large amount of dissolved oxygen.
AC Corrosion
Until about 1970, AC corrosion was felt to be an insignificant, but possible, cause of cable damage.
In 1907 Hayden, reporting on tests with lead electrodes, showed that the corrosive effect of small AC currents was less than 0.5 percent of the effect of equal DC currents. Later work using higher densities of AC current has shown that AC corrosion can be a major factor in concentric neutral corrosion.
Local Cell Corrosion
Local cell corrosion, of which differential aeration is a specific form, is caused by electrolytic cells created by an inhomogeneous environment where the cable is installed.
Examples include variations in the concentration of the electrolyte through which the cable passes, variations in the impurities of the metal, or a wide range of grain sizes in the backfill. These concentration cells corrode the metal in areas of low ion concentration.
Differential aeration is a specific form of local cell corrosion where one area of the metal has a reduced oxygen supply as compared with nearby sections that are exposed to normal quantities of oxygen.
The low oxygen area is anodic to the higher oxygen area and an electron flow occurs through the covered (oxygen starved) material to the exposed area (normal oxygen level).
Differential aeration corrosion is common for underground cables, but the rate of corrosion is generally rather slow. Examples of situations that can cause this form of corrosion include a section of bare sheath or neutral wires lying in a wet or muddy duct, or low points in a duct run that can hold water for some distance.
A cable that is installed in a duct and then continues into a direct-buried portion is another good example of a possible differential aeration corrosion condition.
Differential aeration corrosion turns copper a bright green.
Other Forms of Corrosion
There are numerous other forms of corrosion that are possible, but the most probable causes have been presented. An example of another form of corrosion is microbiological action of anaerobic bacteria, which can exist in oxygen-free environments with pH values between 5.5 and 9.0.
The life cycle of anaerobic bacteria depends on the reduction of sulfate materials rather than on the consumption of free oxygen. Corrosion resulting from anaerobic bacteria produces sulfides of calcium or hydrogen and may be accompanied by a strong odor of hydrogen sulfide and a build-up of a black slime.
This type of corrosion is more harmful to steel pipes and manhole hardware than to lead sheaths.
Resource: Electrical Power Cable Engineering – William A. Thue

Windows XP Resists Death Sentence

The proportion of people using Windows XP actually rose in January. It may be little more than a statistical quirk, but the underlying story is what happens when Microsoft pulls the plug on support in barely two months.
The figures come from NetMarketShare and are based on computers used to visit sites hosted by its statistics client. It found Windows 7 on 47.49 percent of machines, XP on 29.23 percent, Windows 8 on 6.63 percent, Windows 8.1 on 3.95 percent and poor old Vista on 3.3 percent, just ahead of Mac OS X.
To be fair, the XP rise is just 0.25 percentage points on the previous month, so it’s not exactly a sign of revival. However, it does mean that of Windows users, very nearly one in three are still using the system, which debuted in 2001.
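The "nearly one in three" figure is easy to verify from the NetMarketShare numbers quoted above (shares of all desktop machines, in percent):

```python
# Windows version shares of the overall desktop market, per NetMarketShare.
windows_shares = {"7": 47.49, "XP": 29.23, "8": 6.63, "8.1": 3.95, "Vista": 3.3}

windows_total = sum(windows_shares.values())           # ~90.6% of desktops
xp_among_windows = windows_shares["XP"] / windows_total
print(f"XP share among Windows users: {xp_among_windows:.1%}")  # ~32.3%
```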
After several false starts, and rewriting its standard support timetables, Microsoft is officially scheduled to end support — including security patches — for XP in April. Having repeatedly warned that this time the deadline is for real, it runs a risk of being seen as crying wolf if it backs down.
If it doesn’t however, we’re looking at several hundred million computers being unprotected as and when nefarious folk spot a new flaw in the system, which sounds like the mother of all botnets in the making.
The Windows 8 figures will also be worrying for Microsoft. Fifteen months after its release, only a little over one in ten desktop computers are running the new system. Computerworld notes that Windows 7 was at around 25 percent at the same stage after its release. Part of the difference is the number of people buying tablets rather than new PCs, but it’s still a sign that the public remains sceptical about the system’s merits.
It’s also notable that far more people are running Windows 8 rather than the updated 8.1 version. It’s likely a sign of how many people are prepared to buy a new computer and never apply any updates — which is of course one of the reasons XP refuses to die quietly.

The camera that captures 360-degree images in mid-air

The Panono camera ball takes 360 degree photographs when it is thrown in the air.
The sphere is covered by 36 cameras which, once airborne, simultaneously capture individual images - these are then pieced together in the cloud to produce a 108 megapixel image which can be explored in any direction.
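The per-camera contribution implied by those specs is simple arithmetic (ignoring the overlap that stitching actually requires):

```python
# 36 cameras stitched into a 108-megapixel panorama implies roughly
# 3 megapixels contributed per camera.
cameras = 36
stitched_megapixels = 108
print(stitched_megapixels / cameras)  # 3.0 megapixels per camera
```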

X-rays set to reveal electrons’ dance

X-ray diffraction, being celebrated in this International Year of Crystallography, has allowed us to peer inside matter to see where the atoms sit. But might it also let us see inside atoms themselves? That’s the promise in a theoretical study by Henri Suominen and Adam Kirrander of Edinburgh University, UK, who show that in principle the very intense, ultra-short x-ray pulses produced by free-electron laser (FEL) sources will be capable of revealing the motions of electrons in real time as they hop between different energy states in atoms and molecules. If experiments bear this out, the technique might be able to track the movements of electrons in biochemical processes or solar cells, and perhaps guide the design of better optical and electrochemical materials.
‘Imaging electronic motion in real time is extremely appealing and can lead to fundamental new discoveries,’ says Oriol Vendrell of the Center for Free Electron Science in Hamburg, a collaboration between the university and the DESY synchrotron.
Kirrander and Suominen have calculated the diffraction patterns that would result from scattering x-rays from carefully prepared ‘wave packets’ of electrons in exotic atoms called Rydberg atoms. These are atoms in which the outermost electrons sit in very high-energy quantum states far above those of the ‘core’ electrons. Rydberg atoms can be made by using laser pulses to excite the outer electrons into large-radius orbits.
The idea is to use laser ‘pump’ pulses to sculpt a particular wave packet which changes over time, and to use very short x-ray pulses produced by free-electron lasers to produce a series of diffraction patterns from the electrons, from which their spatial distribution can be reconstructed. It’s the wave packet itself – a quantum superposition of individual electron wavefunctions – that acts as the diffraction grid.
Atoms are excited by a pump laser (red beam) and probed by an x-ray pulse (blue beam)

Model atoms

The researchers look at Rydberg atoms because the rearrangements of the wave packets are relatively slow – on the order of several picoseconds (10⁻¹² s). Electrons in ordinary atoms and molecules tend to get rearranged by photochemical processes hundreds to thousands of times faster. What’s more, the outer electrons in Rydberg atoms have orbital radii hundreds of times larger than the core electrons, making it easier to spot changes in electron density caused by the Rydberg electron scattering from the much smaller, denser core. Kirrander and Suominen show that, for a particular wave packet in a Rydberg atom prepared from argon, clear differences in the diffraction pattern and thus electron distribution should be evident over roughly 6 picosecond intervals.
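A rough order-of-magnitude check shows why Rydberg dynamics sit in the picosecond range while ordinary electron dynamics are far faster. The classical scaling used below (orbital period growing as n cubed) is our own illustration, not a figure from the paper:

```python
import math

# Classical orbital period of a Rydberg electron with principal quantum
# number n: T = 2*pi*n^3 in atomic units of time (~2.42e-17 s).
ATOMIC_TIME_S = 2.4189e-17

def rydberg_period_s(n):
    return 2 * math.pi * n**3 * ATOMIC_TIME_S

print(f"n=30: {rydberg_period_s(30):.2e} s")  # a few picoseconds
print(f"n=2:  {rydberg_period_s(2):.2e} s")   # ~1 femtosecond, far faster
```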
‘This experiment will have to be tried,’ says Vendrell. ‘The main difficulty will be collecting enough signal at the detector’ to resolve changes in the scattering pattern.
Will it be possible to see these processes in a system less exotic than Rydberg atoms, though? ‘The pulse duration at the free-electron lasers is sufficiently short to resolve much faster processes than Rydberg dynamics,’ says Kirrander. Vendrell agrees that resolving ‘the timescales of tens to hundred femtoseconds characteristic of atomic motions in chemical reactions are within current FEL capabilities’.
 ‘One could certainly imagine trying to follow an electron in a biochemical reaction or in a nanowire,’ says Kirrander. But he adds that ‘such experiments are quite far down the line’, and thinks that solar cells or organic light-emitting diodes, where the electron dynamics are slower, might be stronger candidates in the near term.
‘This research is headed towards a better theoretical and experimental understanding of electron dynamics and photochemical processes, making it possible to design materials with specific optical, electric, photochemical or mechano-optical properties,’ says Kirrander. He says that today such materials and molecules are often made and discovered by chance and then modified using chemical intuition, but thinks that ‘the process will grow to become much more targeted and based on detailed predictions – to rely on de novo design rather than serendipitous discovery’.


Babies near gas wells more likely to have birth defects

By Brian Bienkowski
Staff Writer
Environmental Health News

Oil Fields in Colorado, Energy Tomorrow/flickr
Women who live near natural gas wells in rural Colorado are more likely to have babies with neural tube and congenital heart defects, according to a new study.
As natural gas extraction soars in the United States, the findings add to a growing concern by many activists and residents about the potential for health effects from the air pollutants.
Researchers from the Colorado School of Public Health analyzed birth defects among nearly 125,000 births in Colorado towns with fewer than 50,000 people between 1996 and 2009, examining how close the mothers lived to natural gas wells.
Babies born to mothers living in areas with the highest density of wells – more than 125 wells per mile – were more than twice as likely to have neural tube defects than those living with no wells within a 10-mile radius, according to the study published Tuesday. Children in those areas also had a 38 percent greater risk of congenital heart defects than those with no wells. 
Both types of birth defects were fairly rare, occurring in a small percentage of births, but they can cause serious health effects. The researchers did not find a significant association between gas wells and other effects, including oral cleft defects, preterm births and low birth weight.
Neural tube defects, such as spina bifida, are permanent deformities of the spinal cord or brain. They usually occur during the first month of pregnancy, before a woman knows she is pregnant. Congenital heart defects are problems in how the heart's valves, walls, veins or arteries developed in the womb; they can disrupt normal blood flow through the heart.
Colorado remains among states torn over gas production.
For babies born to mothers in the areas with the most wells, the rate of congenital heart defects was 18 per 1,000, compared with 13 per 1,000 for those living with no wells within a 10-mile radius. For neural tube defects, the rate was 2.87 per 3,000, compared with 1.2 per 3,000 in areas with no wells.
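The relative risks quoted above can be reproduced from these rates:

```python
# Rates quoted in the article for the highest-density well areas versus
# areas with no wells within a 10-mile radius.
heart_high, heart_none = 18 / 1000, 13 / 1000
ntd_high, ntd_none = 2.87 / 3000, 1.2 / 3000

print(f"congenital heart defects: {heart_high / heart_none:.2f}x")  # ~1.38x
print(f"neural tube defects:      {ntd_high / ntd_none:.2f}x")      # ~2.39x
```

The first ratio matches the "38 percent greater risk" and the second the "more than twice as likely" figures in the text.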
The Colorado Oil and Gas Conservation Commission estimates that 26 percent of the more than 47,000 oil and gas wells in Colorado are located within 150 to 1,000 feet of homes.
“Taken together, our results and current trends in natural gas development underscore the importance of conducting more comprehensive and rigorous research on the potential health effects of natural gas development,” the researchers wrote in the journal Environmental Health Perspectives.
The study was limited in that the researchers didn’t have access to the mothers’ health or socioeconomic information, or their actual exposures to air pollutants. They had to assume that the mothers’ address at delivery was the same as during the first trimester. They also knew only whether a gas well existed in the year of the birth, not how active it was.
Larry Wolk, executive director of the Colorado Department of Public Health and Environment, said no conclusions could be drawn from the study because the researchers didn't know the status of the wells and didn't know the mothers' residential history and health care status.
“Overall, we feel this study highlights interesting areas for further research and investigation, but is not conclusive in itself,” Wolk said in a prepared statement. 
“I would tell pregnant women and mothers who live, or who at-the-time-of-their-pregnancy lived, in proximity to a gas well not to rely on this study as an explanation of why one of their children might have had a birth defect. Many factors known to contribute to birth defects were ignored in this study," said Wolk, who was appointed to his position by Gov. John Hickenlooper last August.
The Colorado Oil and Gas Association forwarded a request for comment to Dollis Wright, head of an environmental communication company in Colorado.
“They didn’t address things like prenatal care, socioeconomic status and access to health care, which can make all the difference in the world when you look at birth defects,” Wright said.
However, the researchers did take into account the mothers' education, age, ethnicity, smoking and alcohol use.
Colorado is ranked sixth in the United States for natural gas production, and from 2007 to 2011, the state's production rose 27 percent, according to the U.S. Energy Information Administration.
Two years ago, the lead researcher of the new study, Lisa McKenzie, testified before federal officials recommending stronger state regulation of hydraulic fracturing. She cited a 2012 study she conducted that found carcinogens such as benzene near drilling sites in Garfield County, Colo. McKenzie and colleagues did not know how many of the wells in the new study used hydraulic fracturing.
The study comes on the heels of research from the University of Missouri that found elevated levels of endocrine-disrupting compounds near natural gas drilling in Garfield County. Some chemicals that alter hormones have been linked to birth defects.
It is unclear what, if anything, related to the natural gas wells could raise the risk of birth defects. However, benzene and other hydrocarbons, particulate matter, sulfur dioxide and nitrogen dioxide, are emitted by trucks, drilling and pipelines near the wells.
Benzene previously has been linked to neural tube defects in other areas, including Texas, where exposure is high from petrochemical industries. Benzene and several other air pollutants around natural gas wells are known to cross the placenta from mother to the fetus.
“One plausible mechanism could be an association between air pollutants emitted during development and congenital heart defects, and possibly neural tube defects,” McKenzie said.
McKenzie said she is more cautious about the neural tube findings than the heart findings because the rate was elevated only among women who lived with the highest density of wells, and because there were only 59 babies with the neural defects.
Wright said the study had some “good news for the energy industry” because when the researchers tightened the radius to two and five miles around the mothers’ homes, the odds of some birth defects dropped lower than the odds at 10 miles.
However, for the congenital heart and neural tube defects, increased risk was found at distances of two, five and 10 miles for the mothers living in areas with the highest densities of wells compared with areas with no wells. 
Colorado remains torn over natural gas production. Four towns – Broomfield, Fort Collins, Lafayette and Boulder – last year passed initiatives to ban or place a moratorium on fracking. The Colorado Oil and Gas Association filed a lawsuit trying to stop the bans in Fort Collins and Lafayette.
Communities should decide whether they want to put pregnant mothers at risk, said Lindsey Wilson, a field associate with Environment Colorado, an environmental group.
“The findings in this study clearly show how important it is that Colorado state officials allow communities to make their own decisions on whether or not to allow fracking within their borders,” Wilson said.

Drilling surprise opens door to volcano-powered electricity

Getting into hot water - one of Iceland’s geothermal power plants. Gretar Ívarsson
Can enormous heat deep in the earth be harnessed to provide energy for us on the surface? A promising report from a geothermal borehole project that accidentally struck magma – the same fiery, molten rock that spews from volcanoes – suggests it could.
The Icelandic Deep Drilling Project, IDDP, has been drilling shafts up to 5km deep in an attempt to harness the heat in the volcanic bedrock far below the surface of Iceland.
But in 2009 their borehole at Krafla, northeast Iceland, reached only 2,100m deep before unexpectedly striking a pocket of magma intruding into the Earth’s upper crust from below, at searing temperatures of 900-1000°C.
This borehole, IDDP-1, was the first in a series of wells drilled by the IDDP in Iceland looking for usable geothermal resources. The special report in this month’s Geothermics journal details the engineering feats and scientific results that came from the decision not to plug the hole with concrete, as in a previous case in Hawaii in 2007, but instead attempt to harness the incredible geothermal heat.
Wilfred Elders, professor emeritus of geology at the University of California, Riverside, co-authored three of the research papers in the Geothermics special issue with Icelandic colleagues.
“Drilling into magma is a very rare occurrence, and this is only the second known instance anywhere in the world,“ Elders said. The IDDP and Iceland’s National Power Company, which operates the Krafla geothermal power plant nearby, decided to make a substantial investment to investigate the hole further.
This meant cementing a steel casing into the well, leaving a perforated section at the bottom closest to the magma. Heat was allowed to slowly build in the borehole, and eventually superheated steam flowed up through the well for the next two years.
Elders said that the success of the drilling was “amazing, to say the least”, adding: “This could lead to a revolution in the energy efficiency of high-temperature geothermal projects in the future.”
The well funnelled superheated, high-pressure steam for months at temperatures of over 450°C – a world record. In comparison, geothermal resources in the UK rarely reach higher than around 60-80°C.
The magma-heated steam was measured to be capable of generating 36MW of electrical power. While relatively modest compared to a typical 660MW coal-fired power station, this is considerably more than the 1-3MW of an average wind turbine, and more than half of the Krafla plant’s current 60MW output.
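Those comparisons follow directly from the quoted figures:

```python
# Power figures quoted in the article.
iddp1_mw = 36        # estimated electrical capacity of the magma-heated steam
krafla_mw = 60       # current output of the existing Krafla plant
wind_turbine_mw = 2  # midpoint of the 1-3 MW quoted for an average turbine

print(f"share of Krafla's output: {iddp1_mw / krafla_mw:.0%}")      # 60%
print(f"equivalent wind turbines: ~{iddp1_mw // wind_turbine_mw}")  # ~18
```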
Most importantly it demonstrated that it could be done. “Essentially, IDDP-1 is the world’s first magma-enhanced geothermal system, the first to supply heat directly from molten magma,” Elders said. The borehole was being set up to deliver steam directly into the Krafla power plant when a valve failed, which required the borehole to be stoppered. Elders added that although the borehole had to be plugged, the aim is to repair it or drill another well nearby.
Gillian Foulger, professor of geophysics at Durham University, worked at the Krafla site in the 1980s during a period of volcanic activity. “A well at this depth can’t have been expected to hit magma, but at the same time it can’t have been that surprising,” she said. “At one point when I was there we had magma gushing out of one of the boreholes,” she recalled.
Volcanic regions such as Iceland are not active most of the time, but can suddenly be activated by movement in the earth tens of kilometres below that fills chambers above with magma. “They can become very dynamic, raised in pressure, and even force magma to the surface. But if it’s not activated, then there’s no reason to expect a violent eruption, even if you drill into it,” she said.
“Having said that, with only one experimental account to go on, it wouldn’t be a good idea to drill like this in a volcanic region anywhere near a city,” she added.
The team, she said, deserved credit for using the opportunity to do research. “Most people faced with tapping into a magma chamber would pack their bags and leave,” she said. “But when life gives you lemons, you make lemonade.”
Water and heat = power. nea.is
In Iceland, around 90% of homes are heated from geothermal sources. According to the International Geothermal Association, 10,700MW of geothermal electricity was generated worldwide in 2010. Typically, these enhanced or engineered geothermal systems are created by pumping cold water into hot, dry rocks at depths of between 4-5km. The heated water is pumped up again as hot water or steam from production wells. The trend in recent decades has been steady growth in geothermal power, with Iceland, the Philippines and El Salvador leading the way, producing between 25-30% of their power from geothermal sources. Considerable effort invested elsewhere, including Europe, Australia, the US and Japan, has typically had uneven results, and the cost is high.
With the deeper boreholes, the IDDP are looking for a further prize: supercritical water. At high temperature and under high pressure deep underground, water enters a supercritical state, in which it is neither gas nor liquid. In this state it carries far more energy and, harnessed correctly, this can increase the power output above ground tenfold, from 5MW to 50MW.
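The power a well can deliver scales with how much water flows through it and how much heat that water picks up. A minimal sketch of the relationship, P = ṁ·c·ΔT, follows; the flow rate and temperature rise are illustrative assumptions, not IDDP figures:

```python
# Thermal power carried by heated water: P = mass_flow * c_p * delta_T.
# All numbers below are illustrative assumptions, not IDDP measurements.

def thermal_power_mw(mass_flow_kg_s, delta_t_k, c_p_j_per_kg_k=4186.0):
    """Thermal power in megawatts for water heated by delta_t_k kelvin."""
    return mass_flow_kg_s * c_p_j_per_kg_k * delta_t_k / 1e6

# A flow of 30 kg/s heated by 250 K carries roughly 31 MW of heat.
print(round(thermal_power_mw(30, 250), 1))  # 31.4
```

This simple sensible-heat formula understates what steam or supercritical fluid can deliver, since it ignores latent heat: a supercritical fluid carries far more enthalpy per kilogram, which is the basis of the tenfold output figure quoted above.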
Elders said: “While the experiment at Krafla suffered various setbacks that pushed personnel and equipment to their limits, the process itself was very instructive. As well as the published scientific articles we’ve prepared comprehensive reports on the practical lessons learned.” The Icelandic National Power Company will put these towards improving their next drilling operations.
The IDDP is a collaboration of three energy companies, HS Energy Ltd, National Power Company and Reykjavik Energy, and the National Energy Authority of Iceland, with a consortium of international scientists led by Elders. The next IDDP-2 borehole will be sunk in southwest Iceland at Reykjanes later this year.

AMD reveals its first ARM processor: 8-core Opteron A1100

Calls itself the first server CPU company with an ARM chip.

AMD announced plans to build ARM server CPUs back in 2012. Today the company took a big step towards making those chips a reality, announcing that an 8-core ARM System-on-Chip would begin sampling in March.
Codenamed "Seattle," the processors will be branded Opteron A-series and built on a 28 nm process. The first of these will be the A1100. This will have 4 or 8 cores based on ARM's Cortex-A57 design. This is a high performance, 64-bit ARM core, and it will run at clock speeds of at least 2 GHz. The chips will have up to 4MB of level 2 cache and 8MB of level 3 cache, with both caches shared across all the cores. They'll support dual channel DDR3 or DDR4, with up to 128GB RAM. The chips will also include a bunch of connectivity: eight PCIe 3 lanes, eight SATA 3 ports, and two 10 Gigabit Ethernet ports. Rounding out the SoCs, they'll also include dedicated engines for cryptography and compression. The whole thing has an expected power usage of 25W.
While these chips are aimed at high density, low power servers, AMD is also putting together a micro-ATX development kit built around the A1100. This will include a Fedora-based Linux environment with development tools, Apache, MySQL, PHP, and Java 7 and 8.  This software stack is consistent with the goals of these low power servers: running Web applications is likely to be their primary role.
AMD has grand ambitions for ARM in the server room. The company estimates that by 2019, 25 percent of the server market will use ARM processors with widespread use of custom designs in large datacenters. AMD believes that it will be the leader of this ARM Server market, as it brings its existing server processor expertise to bear.
However, it can't be taken for granted that ARM will become a big force in the server room. Calxeda, an early pioneer of ultra high density, low power ARM servers, announced late last year that it was closing down, in spite of tens of millions of dollars in funding and a partnership with HP.

An Aging Brain Is Still Pretty Smart

It may be slower, but it has a wealth of information to draw from.
New research says older people are slower on memory tests because they have more mental data to search through for the answer.
A few years ago, Michael Ramscar, a linguistics researcher at the University of Tübingen in Germany, came across a paper saying that cognitive decline starts as early as age 45. He was 45 himself and felt he hadn't yet peaked. He remembers thinking: "That doesn't make sense to me; 99 percent of the people I look up to intellectually, who keep me on my mettle, are older than I am."
The paper concluded that a person's vocabulary declines after age 45, and that finding really made no sense to him. The researchers were trying to measure how quickly people remembered words, he says, without even considering the quantity of words they had stored in memory.
Ramscar began to wonder who has the better memory: the young person who knows a little and remembers all of it, or the older person who has learned a lot and forgets a little of it?
His research, published in the January 2014 issue of Topics in Cognitive Science, argues that studies on memory ask the wrong questions. It could be that older, wiser heads are so chock full of knowledge that it simply takes longer to retrieve the right bits. (It's important to note that the research is aimed at healthy, aging brains, not those afflicted with Alzheimer's disease or other forms of dementia, which rob the brain of memory and other abilities.)
Many memory tests might ask a 20-year-old and a 70-year-old to memorize a list of items, then recall them. The tests don't address the size and content of each subject's existing memory. So Ramscar created computer models simulating young brains and older brains. He fed information into both models but added buckets more information to the model meant to simulate an older brain.
"I could see precious little evidence of decline in [the models of] healthy, older people," he says. "Their slowness and slight forgetfulness were exactly what I'd expect" because with more to draw on, there are more places to search, and there's more information to search through to find an answer.
Both Fuller and Slower
There's no denying that older people have acquired more experience and information than younger people, says Denise Park, co-director of the Center for Vital Longevity at the University of Texas at Dallas, whose research focuses on how the mind changes and adapts as people age. "As we age we accrue knowledge, have a higher vocabulary score, and know more about the world," says Park. "There's a reason we don't have 20-year-olds running the world."
She sees the value of Ramscar's research but also believes there's no denying that the brain deteriorates with age just as every other body part does.
The brain is not slower just because it contains so much information, she said; it is also slowing down from simple aging. Imaging studies show clearly that even healthy aging brains show signs of shrinkage in areas concerned with learning, reason, and memory. At the same time, the greater store of knowledge helps older brains compensate.
So it could be that a couple of things are going on at the same time: The aging brain is accumulating knowledge, kind of like a library full of dusty volumes. And because it is deteriorating like any other body part, it operates more slowly.
So Park believes that we forget more as we get older, and we compensate for that memory loss by being able to draw on a bigger pool of stored knowledge. "It may be true that knowledge slows down the system, but that doesn't mean that the system, as it ages, doesn't also operate more slowly," says Park. "I would argue that the amount of knowledge allows us to compensate for the slowdown."
Her conclusion: "I strongly believe that our everyday performance does not decline with age." That's because as the ability to retrieve memories quickly declines, the brain is still building up stores of knowledge from which to draw.
Storage Space
In the long run, Ramscar hopes his research, which he discusses on his blog, can help define normal patterns of aging, and that older people can begin to think about what is happening to their brains as something other than "a decline."
If only we could do as Sherlock Holmes did: He allowed only pertinent clues, like shed hairs or scratched doorjambs, to find a home in his brain. When his colleague Dr. Watson told him, in A Study in Scarlet, that the Earth revolved around the sun, Holmes dismissed the information as unworthy of storage space: "Depend upon it there comes a time when for every addition of knowledge you forget something that you knew before."

42 Kilobytes Unzipped Make 4.5 Petabytes (about 4,500,000 GB)

In 2001 reports about Zip Bombs or Zip of Death attacks made the rounds on the Internet, and I thought it would be nice to write about one shiny harmless example of that technique. At first glance the file 42.zip is a normal compressed file with a size of 42 Kilobytes. Many users who run a virus scanner will probably run into trouble downloading that file to their computer.
It still looks like a normal 42 Kilobyte archive after the download but the surprise begins when you try to unpack that file. What they did was basically pack a 4.3 Gigabyte file consisting only of zeros. That packed file was replicated 16 times and packed again, and again, and again, and again. Or, to use their own words:

The file contains 16 zipped files, which again contains 16 zipped files, which again contains 16 zipped files, which again contains 16 zipped files, which again contains 16 zipped files, which contain 1 file, with the size of 4.3GB.

You could basically unpack the 42 Kilobyte file into 4.5 Petabytes of uncompressed data, if your hard drive had enough storage space for that. It usually does not, so you either need to browse the file in your archiver of choice, or believe what the creator of the file has posted about it on the website.
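The arithmetic checks out in a few lines (taking 4.3 GB as a decimal gigabyte): five nesting levels of 16 archives each, bottoming out in a single 4.3 GB file of zeros.

```python
GB = 10**9                        # decimal gigabyte
copies = 16 ** 5                  # five nesting levels of 16 archives each
innermost_bytes = 43 * GB // 10   # one 4.3 GB file of zeros
total_bytes = copies * innermost_bytes

print(copies)                          # 1048576 innermost files
print(round(total_bytes / 10**15, 2))  # 4.51 petabytes
```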
The zip file is password protected, probably to prevent it from being flagged during download by an antivirus program.
Update: Most modern antivirus programs should detect the file these days and block it from being extracted on the system. If you want to test your antivirus solution download the file to your system and try to extract it. Watch what happens and let us know how it turns out for you.

Nanoscale heat engine exceeds the standard Carnot efficiency limit

J. Roßnagel,1 O. Abah,2 F. Schmidt-Kaler,1 K. Singer,1 and E. Lutz2
1 Quantum, Institut für Physik, Universität Mainz, D-55128 Mainz, Germany
2 Institute for Theoretical Physics, University of Erlangen-Nürnberg, D-91058 Erlangen, Germany
(Dated: January 9, 2014)

We consider a quantum Otto cycle for a time-dependent harmonic oscillator coupled to a squeezed thermal reservoir. We show that the efficiency at maximum power increases with the degree of squeezing, surpassing the standard Carnot limit and approaching unity exponentially for large squeezing parameters. We further propose an experimental scheme to implement such a model system by using a single trapped ion in a linear Paul trap with special geometry. Our analytical investigations are supported by Monte Carlo simulations that demonstrate the feasibility of our proposal. For realistic trap parameters, an increase of the efficiency at maximum power of up to a factor of four is reached, largely exceeding the Carnot bound.

Nanoscale heat engine
Heat engines are important devices that convert heat into useful mechanical work. Standard heat engines run cyclically between two thermal (equilibrium) reservoirs at different temperatures T1 and T2. The second law of thermodynamics restricts their efficiencies to the Carnot limit, η_C = 1 − T1/T2 (where T1 < T2).
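As a quick numeric illustration of the standard Carnot bound, the limit is straightforward to evaluate; the temperatures below are arbitrary example values, not parameters from this experiment:

```python
def carnot_efficiency(t_cold_k, t_hot_k):
    """Standard Carnot limit 1 - T1/T2 for reservoirs at T1 < T2 (kelvin)."""
    return 1.0 - t_cold_k / t_hot_k

# A cycle between a 300 K cold bath and a 600 K hot bath:
print(carnot_efficiency(300.0, 600.0))  # 0.5
```

No cyclic engine operating between two equilibrium reservoirs at these temperatures can exceed that value; the squeezed (nonequilibrium) reservoir discussed below evades the bound precisely because it is not an equilibrium bath.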
Triggered by the pioneering study of Scovil and Schulz-DuBois on maser heat engines [2] and boosted by the advances in nanofabrication, an intense theoretical effort has been devoted to the investigation of their properties in the quantum regime, see e.g. Refs. [3–11]. In particular, theoretical studies have indicated that the efficiency of an engine may be increased beyond the standard Carnot bound by coupling it to an engineered (nonequilibrium) quantum coherent [12] or quantum correlated [13] reservoir (see also the related Refs. [14–17] for photocell heat engines). These stationary nonthermal reservoirs are characterized by a temperature as well as additional parameters that quantify the degree of quantum coherence or quantum correlations. The maximum efficiency that can be reached in this nonequilibrium setting is limited by a generalized Carnot efficiency that can surpass the standard Carnot value [18]. Quantum reservoir engineering techniques are powerful tools that enable the realization of arbitrary thermal and nonthermal environments [19]. Those techniques have first been experimentally demonstrated in ion traps [20]. Recently, they have been used to produce nonclassical states, such as entangled states, in superconducting qubits [21] and atomic ensembles [22], as well as in circuit QED [23] and ion trap systems [24].

DOI:
10.1103/PhysRevLett.112.030602
PACS:
05.70.-a, 37.10.Ty, 37.10.Vz

