Thursday, 31 July 2014

How Death Valley's 'sailing stones' move on their own

For over a century, researchers failed to explain how large stone slabs were moving across a dried lake in America's Death Valley, seemingly with no help. Here's how one man finally solved the mystery.
Images: Clockwise from left: Lgcharlot/Wikimedia; mikenorton/Shutterstock; Jon Sullivan; Wikimedia
Located above the northwestern side of Death Valley in Eastern California's Mojave Desert, an exceptionally flat dried lake called Racetrack Playa contains a peculiar phenomenon. Dozens of large stone slabs made of dolomite and syenite - often weighing as much as 318 kilograms - move across the cracked mud, leaving a series of smooth trails behind them.
Some of these trails stretch for a whopping 250 metres. They often form a gently curved line, but sometimes they form sharp, zig-zagging angles, implying a sudden shift to the right or left. These ‘sailing stones’, as they’ve been nicknamed, are so common on the Racetrack Playa, they make it look like a well-worn racetrack, hence the name. (Playa is another word for ‘dried lake’.)
It’s obvious from the trails that these stones are moving, but how? No one knew. Since the early 1900s, researchers and casual observers have been fascinated by the stones, but no one could explain how they moved. And the biggest factor that kept the answer obscured for over a century was that no one had ever seen them move.
According to Marc Lallanilla at LiveScience, while the less informed guesses included everything from aliens and magnetic fields to good old-fashioned pranksters, a popular theory among researchers was that dust devils, which are strong, relatively long-lived whirlwinds, were pushing the stones around as they swept across the playa. But this theory, and others that cropped up, were all disproved. 
And then in 2006, planetary scientist Ralph Lorenz from the Johns Hopkins University Applied Physics Laboratory in the US started investigating the sailing stones. He came to the Racetrack Playa with an interest in studying its similarities to a hydrocarbon lake on Saturn’s moon, Titan, and stayed to put an end to a long-standing mystery. 
To do so, all he needed was a small rock, some water, and an ordinary Tupperware container. Lorenz put the small rock in the bottom of the Tupperware container and filled it with a few centimetres of water. Then he put the whole thing in the freezer.
After putting the container in the freezer, Lorenz ended up with a small slab of ice with a rock embedded in it. By placing the ice-bound rock in a large tray of water with sand at the bottom, all he had to do was gently blow on the rock to get it to move across the water.
And as the ice-embedded rock moved, it scraped a trail in the sand at the tray's bottom. Lorenz devised his clever experiment by researching how the buoyancy of ice can cause large rocks, when encased in ice, to move by floating along tidal beaches in the Arctic Sea.
Calculations by Lorenz and his colleagues of the weather conditions in Death Valley during the winter months appeared to support his theory. "Calculations show that, in this scenario, the ice causes virtually no friction on the water, so the stones are able to glide with just a slight breeze," Joseph Stromberg reported at Smithsonian Magazine. "The team argues that their model accounts for the movement far better than any other, since it doesn’t require massive wind speeds or enormous ice sheets."
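The role of friction here is easy to sanity-check with a back-of-the-envelope estimate. All of the numbers below are illustrative assumptions (they do not come from Lorenz's paper), but they show why near-zero friction is the crux of the argument:

```python
import math

# Illustrative assumptions only -- not measurements from Lorenz's study.
MASS = 300.0        # kg, a heavy sailing stone
G = 9.8             # m/s^2, gravitational acceleration
MU_MUD = 0.6        # assumed sliding friction coefficient on dry mud
AIR_DENSITY = 1.2   # kg/m^3
DRAG_COEFF = 1.0    # assumed drag coefficient of a blunt rock
FACE_AREA = 0.1     # m^2, assumed wind-facing area of the rock

# Wind speed needed to slide the rock on bare mud:
# drag force = friction force  ->  0.5 * rho * Cd * A * v^2 = mu * m * g
friction_force = MU_MUD * MASS * G
v_needed = math.sqrt(2 * friction_force / (AIR_DENSITY * DRAG_COEFF * FACE_AREA))

print(f"Force to slide the rock on dry mud: {friction_force:.0f} N")
print(f"Wind speed needed without ice:      {v_needed:.0f} m/s")
# The answer is on the order of 170 m/s -- far beyond any real storm.
# With the rock floating in an ice raft, friction drops to nearly zero,
# so even a light breeze can move it.
```

Under these assumptions, moving an un-iced rock would take winds several times stronger than a hurricane, which is why a model that eliminates friction is so much more plausible.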
They published their research in the American Journal of Physics.
While the evidence is circumstantial because no one has actually seen it happen, Lorenz's research remains the most likely explanation for the sailing stones of Death Valley.

Organs-on-a-chip technology takes the guesswork out of drug testing

A team of researchers in the US are making a whole set of electronic organs on tiny plastic chips for safer, more efficient drug testing.
Images: National Center for Advancing Translational Sciences
Getting a new drug on the market is no small feat. On average it takes over a decade of trial and development before a new drug can hit the shelves, and only a fraction of them even make it that far.
This is partly due to the guesswork that goes into using animals for drug testing. While animal testing has played an invaluable role in the development of our drugs in the past, it's a process that is by no means infallible. What might work wonderfully for a rat in the lab won’t necessarily agree with the biology of a human being, and, on the other hand, the perfect drug for humans might never make it past the initial testing stages because it produces the wrong response in a lab rabbit.
“If our goal is to create better drugs, in a way that is much more efficient, time and cost-wise, I think it’s almost inevitable that we will have to either minimise or do away with animal testing,” Dan Tagle, associate director of the US National Center for Advancing Translational Sciences, told Jessica Leber at FastCompany.
In fact, said Tagle, 'animal models' are only typically 30% to 60% predictive of human responses to new drugs. “We are learning more and more that mice and rats don’t predict humans. The shortcomings of animal testing are becoming clear."
Along with researchers from the US Food and Drug Administration (FDA) and the US military’s research and development wing, DARPA, Tagle’s team is developing new ‘organ-on-a-chip’ technology to address this problem. The technology uses a series of small, plastic chips that have tiny artificial channels, vessels and flexible membranes built into them and lined with human cells so they can effectively mimic the functions of a number of human organs such as a lung, gut, liver and kidney. The team is now also working on a chip that emulates skin. Once they have all the main components of a human body working correctly on the chips, they'll link them up into a system for drug testing.
"The structure can mimic the inhalation of, say, an asthma medication into the lungs and, later, how it’s broken down in the liver,” says Leber at FastCompany. "It might one day help the military test treatments for biological or chemical weapons; hospitals to use a patient’s own stem cells to develop and test 'personalised' treatments for their disease; and, of course, drug companies to more quickly screen promising new drugs.”
The team has just this week launched a new start-up called Emulate, whose job it is to get this organ-on-a-chip ready for the market. 

Wednesday, 30 July 2014

Smoking mothers may alter the DNA of their children

Pregnant women who smoke don’t just harm the health of their baby—they may actually alter their child’s DNA, according to new research. The finding may explain why the children of smokers continue to suffer health complications later in life.

Early danger. Smoking during pregnancy could have long-lasting effects on the child.
Babies born to smoking mothers tend to be smaller, have impaired lung function, and have a higher incidence of birth defects. Even as adults, these individuals exhibit health and behavioral problems, with those born to smokers being more likely to suffer from asthma, nicotine addiction, and substance abuse. “We have a limited understanding of the biological mechanisms for such effects,” write genetic epidemiologist Christina Markunas and perinatal epidemiologist Allen Wilcox of the National Institute of Environmental Health Sciences in Research Triangle Park, North Carolina, in a joint e-mail to Science. One possibility is so-called epigenetic changes. Various environmental triggers—ranging from stress to diet—can chemically modify DNA, turning certain genes on or off.
The new study is one of the largest of its kind to investigate whether maternal smoking can cause such changes. Researchers analyzed blood collected from 889 infants shortly after delivery; approximately one-third were born to mothers who self-reported smoking during the first trimester. The team looked for chemical tags called methyl groups—just one of several types of epigenetic modifications to DNA.
The results of the study were startling. Children born to smokers showed epigenetic changes to their DNA that were not present in the children of nonsmokers, the group reported online ahead of print in Environmental Health Perspectives. Compared with infants of nonsmoking mothers, babies born to smokers had alterations in more than 100 gene regions. Among the affected genes were those linked to fetal development, nicotine addiction, and the ability to quit smoking.
The work provides some of the strongest evidence to date that maternal behaviors can modulate fetal DNA during pregnancy. Moreover, the findings are supported by previous research indicating maternal smoking may alter the newborn’s DNA, says Andrea Baccarelli, director of Harvard University’s environmental epigenetics lab. The results of this large-scale investigation are consistent with the findings of previous, smaller studies, as well as research directly examining the effects of cigarette chemicals on cells, he notes. “It is a wonderful example of convergence between [lab-based] toxicology studies and human studies.”
Still, several questions remain. For one, the epigenetic changes detected in newborns may not stick around. “There is no way to tell whether these epigenetic modifications are fleeting and part of regular cell development or more permanent and truly a result of smoke exposure,” says behavioral geneticist Valerie Knopik of Rhode Island Hospital in Providence and Brown University’s Alpert Medical School.
Although more research is needed to understand the full implications of the DNA changes observed in infants, the findings open the door to other questions regarding children’s health. “If maternal smoking can alter the DNA methylation profile of newborns, other environmental exposures to chemicals, such as those found in the air, our homes and food, during pregnancy may also have epigenetic effects,” Markunas and Wilcox write. “We have only scratched the surface of how exposures during pregnancy might affect the baby.”

Tuesday, 29 July 2014

New drug target makes cancer treatments more effective


By targeting a specific molecule found in blood vessels, researchers in the UK have rendered both chemotherapy and radiation therapies far more effective in killing tumours.
Scientists have figured out how to make chemotherapy treatments more effective.
A team of researchers has investigated how a molecule called focal adhesion kinase (FAK) affects the success of chemotherapy and radiotherapy treatments in mice with cancer.
Led by Bernardo Tavora from the Queen Mary University of London's Barts Cancer Institute, the team discovered that this molecule signals the body to repair itself after undergoing these cancer-killing treatments, so it ends up mistakenly trying to shield the cancer cells from being destroyed.
So what if they removed the FAK molecules from the blood vessels that grow in tumours? The researchers removed FAK from blood vessels growing in the melanomas and lung cancers of their rodent patients, and reported that both their chemotherapy and radiotherapy treatments became significantly more effective.
According to the press release, "Cells lining the blood vessels send chemical signals, called cytokines, to the tumour to help it resist DNA damage and to recover. The researchers demonstrated that this process requires FAK in order to work, and without it, these signals are never sent – making the tumour more vulnerable to DNA damaging therapy."
To strengthen their findings, the team looked at the effects of FAK levels in blood samples taken from mice with lymphoma. The mice that had naturally low levels of FAK were found to be more likely to go into remission following their treatments. This suggests that if drugs were developed to eliminate FAK in a patient’s cancer blood vessels, they could boost chemotherapy and radiotherapy treatments and prevent the cancer from returning at a later stage.
The team reported their findings in Nature today.
"Although taking out FAK from blood vessels won't destroy the cancer by itself, it can remove the barrier cancer uses to protect itself from treatment,” said Tavora in the press release.

Here's why your phone and laptop batteries degrade so fast

Researchers have figured out what makes rechargeable batteries degrade so quickly, paving the way for new, longer-lasting technology.
We've all experienced the woes of a degraded rechargeable battery, whether it’s a phone that can't last a day without a charge or a laptop that’s become so reliant on its charger you can barely call it a portable device anymore. It's a problem that scientists have struggled to solve because they couldn’t figure out what was causing such sudden and rapid battery deterioration.
But now researchers at the US Department of Energy have identified why rechargeable batteries lose their ability to hold a charge over time. Working with lithium-ion batteries, which are the most commonly used type of battery in consumer devices, the researchers were able to map their charge and discharge process down to a few billionths of a metre to find exactly how degradation occurs.
Their research appears in a pair of studies published by Nature Communications, and according to Matt Safford at Smithsonian Magazine, two main culprits in battery degradation were identified:
"The first: microscopic vulnerabilities in the structure of the battery material steer the lithium ions haphazardly through the cell, eroding the battery in seemingly random ways, much like rust spreads across imperfections in steel. 
In the second study, focused on finding the best balance between voltage, storage capacity and maximum charge cycles, researchers not only found similar issues with the ion flow, but also tiny accumulations of nano-scale crystals left behind by chemical reactions, which cause the flow of ions to become even more irregular after each charge. Running batteries at higher voltages also led to more ion path irregularities, and thus a more rapidly deteriorating battery."
While Daniel Abraham, who carries out his own lithium-ion battery research at the Argonne National Laboratory in the US, told Safford at Smithsonian Magazine that there may be more to battery degradation than what was identified by these two studies, the team is optimistic that their research will lead to longer-lasting, more compact and more powerful battery technology.
This will be especially important, said Huolin Xin, a materials scientist at the Brookhaven National Lab in the US and coauthor on both studies, if electric cars are to be more economically viable in the future. It's generally accepted that we need to replace our phones and laptops every three years to maintain maximum performance, but for an electric car, that rechargeable battery should last for at least 10 to 15 years. The team is hoping that their research will lead to the development of electric car batteries that will last for three decades or more.

Friday, 25 July 2014

UNSW students break a longstanding world record with solar car

The team of students behind the Sunswift eVe has broken a 26-year-old speed record.


On Monday a team of students from the University of New South Wales (UNSW) in Australia broke a 26-year-old world electric car record using a solar car known as Sunswift eVe. This car became the fastest electric car to complete a 500 km distance course. 
The Sunswift eVe, Australia’s fastest sun-powered race car, achieved an average speed of more than 100 km/h. The previous record was 73 km/h. The new record, however, still needs to be ratified by the Fédération Internationale de l’Automobile (FIA), but there shouldn’t be any issues. 
“This record was about establishing a whole new level of single-charge travel for high-speed electric vehicles, which we hope will revolutionise the electric car industry,” said project director and third-year engineering student Hayden Smith.
The eVe is the fifth iteration of UNSW's Sunswift solar car. An earlier version was used in a journey from Perth to Sydney to set a world record for the fastest solar-powered road trip.
The car has solar panels on the roof and the hood to charge a 60 kg battery. The panels, however, were switched off on Monday, when the record was broken in Geelong, Victoria, and the car relied solely on the battery charge.
But there’s more to the potential of solar vehicles than just breaking records. 
As UNSW reports: “No secret has been made of Sunswift's long-term goals for the car. They expect it to meet Australian road registration requirements within as little as one year, and have previously said its zero-emission solar and battery storage systems make it ‘a symbol for a new era of sustainable driving’.”

Wednesday, 23 July 2014

Alaska fisherman pulled up rare blue-colored red king crab

The Associated Press
In this photo taken July 4, 2014, and provided by the Alaska Dept. of Fish and Game, crab fisherman Frank McFarland, left, holds up a rare blue-colored red king crab he caught in his commercial crabbing pots as Frank Kavairlook Jr., right, looks on in Nome, Alaska. The blue crab is being kept alive at the Norton Sound Seafood Center until McFarland can have it mounted. The rare colored crab has become a rock star of sorts, with people showing up at the center to have their photos taken with it. (AP Photo/Alaska Dept. of Fish and Game)
NOME, Alaska (AP) — A rare blue-colored red king crab was part of a fisherman's catch earlier this month in Nome, Alaska.
KNOM reports (http://is.gd/NQ1wSP) Frank McFarland found the blue crab in his pot when fishing on July Fourth off Nome. The blue crab is being kept alive at the Norton Sound Seafood Center until McFarland can have it mounted.
The rare crab has become a rock star of sorts, with people showing up at the center to have their photos taken with it.
Scott Kent, with the Alaska Department of Fish and Game in Nome, says he has no idea why the red king crab is blue, but suspects it's just a mutation.
Kent says a blue crab "turns up once in a blue moon."


New robotic fingers upgrade the human hand

Researchers at MIT have developed a prototype that will benefit people with disabilities or limited arm strength.


A team of researchers at the Massachusetts Institute of Technology (MIT) in the US has invented a robotic device with two extra fingers that will allow users to perform a wider range of tasks.
The device is a glove with two long fingers and a sensor that can be worn around the wrist. “You do not need to command the robot, but simply move your fingers naturally. Then the robotic fingers react and assist your fingers,” explained Harry Asada, a mechanical engineering professor at MIT, to Jennifer Chu at MIT News.
People with disabilities and those who have lost arm strength due to neck injuries will benefit greatly from the technology, because it allows wearers to perform many actions that would normally be difficult for them, such as opening a jar or slicing a loaf of bread.
The research team is working towards more practical solutions, as well as on smaller models for the 'supernumerary robotic fingers'. As Asada explains, “We could make this into a watch or a bracelet where the fingers pop up, and when the job is done, they come back into the watch. Wearable robots are a way to bring the robot closer to our daily life." 
See how it works: https://www.youtube.com/watch?v=FTJW5YSRZhw
The researchers are also working on other robotic projects designed to give extra strength and flexibility to the human body. Check out their Dr Octopus-style robotic arms: https://www.youtube.com/watch?v=LkXpldrhRm4

Discarded water bottles become cheap solar lights

Designers are working on a low-cost, sustainable light source that gives a second life to plastic water and soft drink bottles.


In big cities we take for granted how easy it is to use electrical light, but over a billion people living in developing countries and rural areas don't have access to the power grid. In rural India, if someone wants to study at night or walk home in the dark, they'll often have to light a kerosene lamp, which is expensive to maintain, smoky, and a pretty dangerous fire hazard if it's accidentally tipped over.
So designers at Designnobis, a firm based in Turkey, have come up with ‘Infinite Light’, a lantern made up of a flexible solar panel, some batteries, and an empty plastic bottle.
The solar panel sits inside the bottle and collects sunlight during the day, and at night, the lantern switches over to battery power once the solar energy has been exhausted. A simple frame holds everything together, and includes a handle at the top that allows the lantern to be held and carried around, or strung up from a ceiling or an outdoor post.
"With Infinite Light, we aimed to create a sustainable lamp with minimum cost,” the team told Adele Peters at FastCompany. "The lighting unit does not require any infrastructure, and it is a ready-to-use package that can be placed in a discarded plastic bottle."
Because plastic bottles are accessible to most communities around the world, even the very remote ones, people interested in setting up these solar lanterns will just need the internal parts to be shipped to them. This helps keep the initial costs down, and also means that rubbish that was destined for landfill can now enjoy a continued existence as a useful, everyday appliance. "We wanted to emphasise the importance of waste materials as a growing resource," said the designers.
The design won the 2013 Green Dot Award, which is an initiative that encourages businesses to produce environmentally friendly products, and the team at Designnobis is now working on getting their solar lanterns on the market.

Tuesday, 22 July 2014

Eating less beef is better for the environment than giving up cars

A new study has revealed that beef’s environmental impact is 10 times that of chicken and pork.


As part of a recent study, a research team in the US assessed how much land, water and nitrogen fertiliser was required to raise different kinds of produce, including beef, chicken, pork, eggs, and dairy. Led by Gidon Eshel, professor of environmental science at Bard College in New York, the study was based on data collected between 2000 and 2010 by the US Departments of Agriculture, Interior and Energy.
The results, published in the Proceedings of the National Academy of Sciences today, show that cattle require on average 28 times more land and 11 times more irrigation water than pork or chicken, and six times as much nitrogen fertiliser as egg or poultry production. All told, beef production releases about five times more greenhouse gases than any of the other food categories studied.
While it was already known that beef production is having a pretty significant impact on the environment, this new analysis has quantified the damage in relation to other options to find out what we should be eating more and less of, environmentally speaking.
Lamb and fish meat were not included in the study because US consumption of both is relatively low.
The key to beef’s hefty environmental impact is that cattle are far less efficient at getting the most out of their food than pigs and chickens are. “Only a minute fraction of the food consumed by cattle goes into the bloodstream, so the bulk of the energy is lost,” Eshel told Damian Carrington at the Guardian, adding that feeding cattle grain rather than grass makes this inefficiency even worse.
While the research doesn’t mean you have to give up your beloved steak, it does offer a pretty effective way to reduce your carbon footprint: simply cut down on your consumption of it. "The biggest intervention people could make towards reducing their carbon footprints would not be to abandon cars, but to eat significantly less red meat," said Tim Benton, the UK Champion for Global Food Security at the University of Leeds, to the Guardian.
A good ballpark, as recommended by the Australian Department of Health, is sticking to 65 grams of cooked red meat per day, which is 90 to 100 grams of uncooked meat. If the average steak is between 200 and 350 grams, this means sticking to a steak and a half per week. As an extra incentive, not only will the environment thank you, but according to the Conversation, this will significantly reduce your risk of contracting bowel and stomach cancer.
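The serving arithmetic above can be checked directly. The average steak weight below is an assumed mid-range figure, not something specified by the study:

```python
# Weekly red-meat arithmetic based on the Australian Department of
# Health guideline cited above (65 g cooked red meat per day).
COOKED_PER_DAY = 65                          # grams, cooked
weekly_cooked = COOKED_PER_DAY * 7           # 455 g cooked per week
weekly_raw = (90 + 100) / 2 * 7              # ~665 g uncooked per week

# Assume an average steak of about 300 g (mid-range of 200-350 g).
AVG_STEAK = 300
steaks_per_week = weekly_cooked / AVG_STEAK  # roughly 1.5

print(f"Weekly cooked allowance: {weekly_cooked} g")
print(f"Weekly uncooked equivalent: ~{weekly_raw:.0f} g")
print(f"Roughly {steaks_per_week:.1f} average steaks per week")
```

With a roughly 300-gram steak, the weekly cooked allowance works out to about a steak and a half, matching the article's rule of thumb.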

This tree produces 40 different types of fruit

Award-winning artist Sam Van Aken has grown a hybridised fruit tree that produces 40 different varieties of stone fruit each year.


An art professor from Syracuse University in the US, Van Aken grew up on a family farm before pursuing a career as an artist, and has combined his knowledge of the two to develop his incredible Tree of 40 Fruit.
In 2008, Van Aken learned that an orchard at the New York State Agricultural Experiment Station was about to be shut down due to a lack of funding. This single orchard grew a great number of heirloom, antique, and native varieties of stone fruit, and some of these were 150 to 200 years old. To lose this orchard would render many of these rare and old varieties of fruit extinct, so to preserve them, Van Aken bought the orchard, and spent the following years figuring out how to graft parts of the trees onto a single fruit tree.
Working with a pool of over 250 varieties of stone fruit, Van Aken developed a timeline of when each of them blossoms in relation to the others and started grafting a few onto a working tree’s root structure. Once the working tree was about two years old, Van Aken used a technique called chip grafting to add more varieties on as separate branches. This technique involves taking a sliver off a fruit tree that includes the bud, and inserting it into an incision in the working tree. It's then taped into place, and left to sit and heal over winter. If all goes well, the tree is pruned back to encourage the grafted bud to grow as a normal branch on the working tree.
After about five years and several grafted branches, Van Aken's first Tree of 40 Fruit was complete.
Van Aken’s Tree of 40 Fruit looks like a normal tree for most of the year, but in spring it reveals a stunning patchwork of pink, white, red and purple blossoms, which turn into an array of plums, peaches, apricots, nectarines, cherries and almonds during the summer months, all of which are rare and unique varieties.
Not only is it a beautiful specimen, but it’s also helping to preserve the diversity of the world’s stone fruit. Stone fruits are selected for commercial growing based first and foremost on how long they keep, then how large they grow, then how they look, and lastly how they taste. This means that there are thousands of stone fruit varieties in the world, but only a very select few are considered commercially viable, even if they aren't the best tasting, or most nutritious ones. 
Van Aken has grown 16 Trees of 40 Fruit so far, and they’ve been planted in museums, community centres, and private art collections around the US. He now plans to grow a small orchard of these trees in a city setting.
Of course, the obvious question that remains is what happens to all the fruit that gets harvested from these trees? Van Aken told Lauren Salkeld at Epicurious:
"I've been told by people that have [a tree] at their home that it provides the perfect amount and perfect variety of fruit. So rather than having one variety that produces more than you know what to do with, it provides good amounts of each of the 40 varieties. Since all of these fruit ripen at different times, from July through October, you also aren't inundated."

Monday, 21 July 2014

Bubble wrap used for cheap blood and bacteria tests

Pack it in, pricey lab gear. Bubble wrap can be a cheap, easy way to run a variety of tests on medical and environmental samples.
Standardised and stackable, 96-well assay plates are the gold standard for running small-sample diagnostics and simple liquid reaction tests in chemistry labs. But at $1 to $5 apiece, they can be too expensive for labs around the world with limited resources.
George Whitesides at the Wyss Institute for Biologically Inspired Engineering at Harvard University was on the hunt for low-cost diagnostic tools made from things that are already mass-produced with high quality but low cost.
"We like the idea of using materials that are readily available and seeing how much we can do with them, going far beyond their intended purpose and adapting them to address local problems," says Whitesides. Previously, his team has found uses for paper as a device for testing water quality, egg beaters and CD players as centrifuges, and bicycles as power sources.
Popping up in the chemistry lab (Image: RunPhoto)

Bubble basics

Whitesides says that the idea of bubble wrap for chemical assays popped into his head because it is readily available, cheap, lightweight and the bubbles come in a range of sizes. The interior of the bubbles is sterile, alleviating the need for expensive sterilisation equipment.
The bubbles are permeable to gas, but to inject the reagents needed to react with the things being tested, the bubbles have to be punctured with syringes. The team found that clear nail hardener from a pharmacy can be used to seal them back up.
The transparent compartments would be most useful for simple diagnostic tests that can be analysed visually, such as reactions that change colour, says Whitesides.
For instance, the team successfully ran blood tests for anaemia and diabetes, cultured the common food-borne bacteria Escherichia coli and raised the nematode Caenorhabditis elegans, which is widely used as a model organism in biology experiments.

Friday, 18 July 2014

The World's Largest Laser Crushed a Diamond to Study Planet Formation

When not being used to study nuclear fusion, the world's largest laser can be found crushing diamonds, to study planetary formation, of course.
Well, that's a little bit misleading. The world's largest laser and 175 other lasers combined to exert 50 million times Earth's atmospheric pressure on a single diamond, to be exact.
The laser, which is housed at the US National Ignition Facility at the Lawrence Livermore National Laboratory, measures more than 30 feet long and can be focused on a millimetre-sized target. In doing this, the researchers, led by Lawrence Livermore's Ray Smith, were able to complete a process known as "dynamic ramped compression". That means they were able to slowly and evenly compress the diamond in such a way as to not liquefy it, and instead compressed it to the density of lead. Smith published his findings in Nature.
If the facility looks familiar, it's because it was a stand-in for the starship Enterprise's warp core in Star Trek Into Darkness. It's got a lot of uses, to say the least.
So, why do this? To find out what the cores of massive planets like Saturn might look like. As the name suggests, gas giants are made out of gas, but some scientists believe that the core of Jupiter, Saturn, and other, even larger exoplanets might be solid. There are plenty of theoretical ways to calculate a planet's density and what might be at its core, but there are very few ways to study what happens when elements are subjected to the insane pressures of huge planets. 
That's exactly what Smith did, exerting five terapascals of pressure on the diamond (that's equal to 14 times the pressure at Earth's core, and roughly equal to the expected pressure at Saturn's core).
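Both pressure comparisons check out numerically. The atmospheric and Earth-core pressure values below are standard reference figures, not numbers taken from Smith's paper:

```python
# Cross-checking the article's pressure comparisons.
ATMOSPHERE = 101_325   # Pa, standard atmospheric pressure
EARTH_CORE = 360e9     # Pa, approximate pressure at Earth's centre
applied = 5e12         # Pa, the five terapascals applied to the diamond

atmospheres = applied / ATMOSPHERE     # ~49 million -> "50 million times"
vs_earth_core = applied / EARTH_CORE   # ~14x Earth's core pressure

print(f"{atmospheres:.2e} atmospheres")
print(f"{vs_earth_core:.0f}x the pressure at Earth's core")
```

Five terapascals comes out to roughly 49 million atmospheres and about 14 times the pressure at Earth's core, consistent with both figures quoted in the article.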
Watch Nature's video of the diamond-compression experiment: https://www.youtube.com/watch?v=BL37zJj4rU8
"The discovery of multiple planets beyond our Solar System, many of which are much larger than Jupiter and Saturn, has led to a dramatic change in our picture of the Universe," Chris Pickard of the University of London wrote in an accompanying Nature article. "Understanding the make-up and evolution of these exoplanets requires the development of theoretical models, which depend on the pressure-density equations of state of the most likely planetary materials. Until now, these equations of state have been largely determined by extrapolating from terrestrial data."
Smith's findings suggest that, more or less, those theories are on point. Next up? Simulating the pressure of stars, which is an entirely different proposition.
"The giant exoplanets are a stepping stone to the stars, where petapascal pressures are reached," Pickard wrote.

Thursday, 17 July 2014

Hardcore pot smoking could damage the brain's pleasure center

High and low. Marijuana abusers react less strongly to dopamine in their brains, resulting in feelings of malaise and withdrawal.
It probably won’t come as a surprise that smoking a joint now and then will leave you feeling … pretty good, man. But smoking a lot of marijuana over a long time might do just the opposite. Scientists have found that the brains of pot abusers react less strongly to the chemical dopamine, which is responsible for creating feelings of pleasure and reward. Their blunted dopamine responses could leave heavy marijuana users living in a fog—and not the good kind.
After high-profile legalizations in Colorado, Washington, and Uruguay, marijuana is becoming more and more available in many parts of the world. Still, scientific research on the drug has lagged. Pot contains lots of different chemicals, and scientists don’t fully understand how those components interact to produce the unique effects of different strains. Its illicit status in most of the world has also thrown up barriers to research. In the United States, for example, any study involving marijuana requires approval from four different federal agencies, including the Drug Enforcement Administration.
One of the unanswered questions about the drug is what, exactly, it does to our brains, both during the high and afterward. Of particular interest to scientists is marijuana’s effect on dopamine, a main ingredient in the brain’s reward system. Pleasurable activities such as eating, sex, and some drugs all trigger bursts of dopamine, essentially telling the brain, “Hey, that was great—let’s do it again soon.”
Scientists know that drug abuse can wreak havoc on the dopamine system. Cocaine and alcohol abusers, for example, are known to produce far less dopamine in their brains than people who aren’t addicted to those drugs. But past studies had hinted that the same might not be true for those who abuse marijuana.
Nora Volkow, the director of the National Institute on Drug Abuse in Bethesda, Maryland, decided to take a closer look at the brains of marijuana abusers. For help, she and her team turned to another drug: methylphenidate (aka Ritalin), a stimulant known to increase the amount of dopamine in the brain. The researchers gave methylphenidate to 24 marijuana abusers (who had smoked a median of about five joints a day, 5 days a week, for 10 years) and 24 controls.
Brain imaging revealed that both groups produced just as much extra dopamine after taking the drug. But whereas the controls experienced increased heart rates and blood pressure readings and reported feeling restless and high, the marijuana abusers didn’t. Their responses were so weak that Volkow had to double-check that the methylphenidate she was giving them hadn’t passed its expiration date.
This lack of a physical response suggests that marijuana abusers might have damaged reward circuitry in their brains, Volkow and her team report online today in the Proceedings of the National Academy of Sciences. Unlike cocaine and alcohol abusers, marijuana abusers appear to produce the same amount of dopamine as people who don’t abuse the drug. But their brains don’t know what to do with it. This disconnect could be “a key mechanism underlying cannabis addiction,” says Raul Gonzalez, a neuropsychologist at Florida International University in Miami who was not involved with the research. The study “suggests that cannabis users may experience less reward from things others generally find pleasurable and, contrary to popular stereotypes, that they generally feel more irritable, stressed, and just plain crummy. This may contribute to ongoing and escalating cannabis use among such individuals.”
But do marijuana abusers smoke a lot because they feel crummy, or do they feel crummy because they smoke a lot? Volkow doesn’t know. Not being able to tease out cause and effect “is a limitation in a study like this one,” she says. Perhaps the abusers already had less reactive dopamine systems and started smoking a ton of pot to cope with their general malaise. Or maybe prolonged marijuana abuse is actually damaging their brains’ reward circuitry, leading to the apathy and social withdrawal that marijuana abusers often experience.
The lessons for recreational users of marijuana, if any, are unclear. This study used “hardcore volunteer[s]” who were “using quite a lot of cannabis,” says Paul Stokes, a psychiatrist at Imperial College London who wasn’t involved in the research. As such, “it probably tells you more about cannabis dependence than about recreational use.” But when he did a similar brain imaging study of people who smoked marijuana no more than once a week, he observed “similar themes” when it came to dopamine.
All of these are important questions to answer, Volkow says. As availability of the drug increases, she says, it’s something “we all need to know.”

Why does the Amazon flow backwards?

A river runs backward. Erosion and other processes taking place at Earth’s surface help explain why large portions of the Amazon River (watershed depicted in lighter colors) reversed course.
Millions of years ago, rivers flowing westward across what is now northern Brazil reversed their course to flow toward the Atlantic, and the mighty Amazon was born. A previous study suggested that the about-face was triggered by gradual changes in the flow of hot, viscous rock deep beneath the South American continent. But new computer models hint that the U-turn resulted from more familiar geological processes taking place at Earth’s surface—in particular, the persistent erosion, movement, and deposition of sediment wearing away from the growing Andes.
The Andes mountains lie just inland of the western coast of South America. The central portion of that mountain range began growing about 65 million years ago, and the northern Andes started rising a few million years later, says Victor Sacek, a geophysicist at the University of São Paulo in Brazil. Yet, field studies suggest that the Amazon River, which today carries sediment-laden water from the Andes across the continent to the Atlantic Ocean, didn’t exist in its current form until about 10 million years ago. Before then, rainfall across much of what is now the Amazon Basin drained westward into massive lakes that formed along the eastern rim of the Andes and then flowed north via rivers into the Caribbean. The geological processes that caused ancient drainage patterns to shift to their modern configurations have been hotly debated.
The lakes east of the Andes formed in a long trough created when the immense weight of that growing mountain chain pressed Earth’s crust downward, Sacek says. But for some reason, the terrain beneath the trough slowly gained elevation over millions of years, and those lakes gradually gave way to a long-lived region of wetlands covering an area the size of Egypt or larger. Later, after the landscape rose even farther, the wetlands disappeared altogether. Previously, scientists proposed that changes in the circulation of molten material in Earth’s mantle—the slow-flowing material that lies between our planet’s core and its crust—pushed the terrain east of the Andes upward, thereby changing drainage patterns.
But new research pins the blame on something more mundane: erosion. Sacek developed a computer model that includes interactions between growth of the Andes, the flexing of Earth’s crust in the region, and climate. (For instance, as the mountains rise, they intercept more moist airflow and receive more rainfall, which in turn boosts the rate of erosion.) The model simulates the evolution of South American terrain during the past 40 million years—a period that commenced after the birth of the central Andes but before the eastern flank of those mountains began to rise, Sacek notes.
Results of the simulation reproduce much of the evidence seen in the geological record, Sacek reports online ahead of print in Earth and Planetary Science Letters. Initially, the lakes form east of the Andes because the mountains press Earth’s crust downward to form a trough faster than sediment can fill it. Then the sinking of the terrain slows down, and accumulation of sediment spilling off the Andes catches up, gradually filling in the lakes and building the landscape higher. Eventually, the terrain just east of the mountain chain becomes higher than that in the eastern realm of the Amazon Basin, a shift that provides a downhill slope extending all the way from the Andes to the Atlantic beginning about 10 million years ago.
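The dynamic Sacek describes - rapid subsidence at first, with a steady sediment supply that catches up as the sinking slows - can be caricatured in a few lines. The rates below are invented purely for illustration and bear no relation to the actual model:

```python
import math

def basin_depth(age_my, subsidence0=2.0, decay=0.15, sediment=0.5):
    """Net depth of the foreland trough after age_my million years.

    Subsidence starts fast and decays exponentially, while sediment
    arrives at a constant rate: the basin first deepens (open lakes),
    then fills in once sedimentation outpaces the slowing subsidence.
    """
    depth = 0.0
    for t in range(age_my):
        depth += subsidence0 * math.exp(-decay * t) - sediment
    return depth

print(basin_depth(5) > 0)    # early on: subsidence outpaces sediment (lakes form)
print(basin_depth(40) < 0)   # later: sediment wins and the lakes vanish
```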
“Erosion and sedimentation are powerful forces,” says Jean Braun, a geophysicist at Joseph Fourier University in Grenoble, France. Sacek’s model shows that these processes explain the geological record seen in northern South America, “and they do so with the right timing,” he adds. They also suggest that the amount of sediment carried to the mouth of the Amazon each year and then dumped offshore should increase over time—something actually seen in sediment cores drilled from that area. “That’s a nice bit of prediction by the model,” Braun says.
The gradually increasing rate of sediment accumulation possibly stems from the long time needed for the material to hopscotch its way across the continent, being dumped in one spot and then remobilized by erosion later, says Carina Hoorn, a geologist at the University of Amsterdam. Or, she suggests, the increase may stem from a geologically recent boost in erosion in the Andes triggered by a series of ice ages that commenced about 2.4 million years ago.
One thing Sacek’s model doesn’t do a good job of predicting, he admits, is the size, shape, and persistence of the large area of wetlands that formed in what is now the central Amazon Basin between 10.5 million and 16 million years ago. But it’s possible, he notes, that changes in mantle circulation beneath the region did play a minor role in the evolution of the terrain. Sacek will try to incorporate such processes into future versions of his terrain simulation, to see if they better explain how the landscape evolved.
Such changes in mantle flow are “difficult to quantify and even more difficult to discern [in the real world],” Braun says. But by combining the modest effects of such changes with those triggered by surface processes such as erosion, “you might end up with something that works.”

What do ISS astronauts do with their dirty laundry?

Without washer and dryer, all International Space Station residents throw out their clothes after just a few weeks of use. 

ISS_crew

It may sound very celeb-like, but astronauts aboard the ISS have no other choice: once their clothes—undies included, of course—are dirty, they shoot them into Earth’s atmosphere, where they burn up.
But this practice is quite expensive. According to Smithsonian, a crew of six goes through 408 kilograms of clothing every single year! And then there’s the stench - the crew has to keep their dirty garments until there’s enough to be ejected into space.
So, to put an end to this problem and free up storage space, researchers at NASA have developed long-lasting fibres that are easy to clean and germ-resistant. The first shipment left on Sunday, so for the next few weeks, ISS astronauts will be testing the new fabric.
Named the Intravehicular Activity Clothing Study (IVA Clothing Study), it replaces crew uniforms with non-cotton clothing that has to be worn during the astronauts’ daily two-and-a-half-hour exercise regime for a total of 15 days.
“The exercise clothing are hung up to dry for up to four hours and then stored in flame-resistant bags. A questionnaire is taken daily soon after exercise to document perception of the exercise clothing,” reports Shannon Palus over at Smithsonian.
Three crew members will also test a shirt that can be worn for daily activities. The volunteers will discard the shirt once they think it can’t be worn anymore, and then they will complete a questionnaire.
If the astronauts find these new clothes useful and the fabric manages to keep them fresh, we may soon see a new collection of space garments for ISS residents. And, with a bit of luck, the fabric may also be used to create sportswear for those here on Earth - stink-free yoga wear? Yes, please.

Tuesday, 15 July 2014

Stop Repeating: Better Ways to Memorize


A new study published in Learning and Memory found that simple repetition interferes with the ability to learn new information, especially when it is similar to a set of familiar facts. This may mean that memorizing facts about an issue through repetition could interfere with the ability to remember a more nuanced version of the same issue later on.
“Our findings suggest that although the ability to generally recognize something is strengthened with multiple encounters, one’s ability to discriminate among similar items in memory decays,” the study says. “In contrast to past beliefs, repetition may reduce the fidelity of memory representations.”
In the study, subjects saw a list of objects either one or three times. Later on, in the recall phase, another set of similar objects ("lures") was snuck in. Those who had seen objects multiple times better recalled the original objects but had a harder time distinguishing the lures. In other words, their memories were stronger but less precise. Over the long run, repetition can be a false temptress, making us think we've learned something when we really haven't.
"On your first reading of something, you extract a lot of understanding. But when you do the second reading, you read with a sense of ‘I know this, I know this,’" explain psychologists Henry Roediger and Mark McDaniel, authors of Make It Stick. "So basically, you’re not processing it deeply, or picking more out of it. Often, the re-reading is cursory—and it’s insidious, because this gives you the illusion that you know the material very well, when in fact there are gaps."
Here are a few tips for better memory:
Pace your studying
Not all repetition is bad. It's more accurate to say that cramming is ineffective. “The better idea is to space repetition. Practice a little bit one day, then put your flashcards away, then take them out the next day, then two days later," explain McDaniel and Roediger.
Mentally testing yourself on materials generally increases recall days later, even if there's no feedback on how well you actually remember the facts. In other words, just going over the material in your head at regular intervals has benefits.
Within academia, there's a raging debate about the optimal spacing between recall intervals [PDF]. One of the original systems, by foreign language learning icon Paul Pimsleur, advocated for a pacing of 5 seconds, 25 seconds, two minutes, 10 minutes, one hour, five hours, one day, five days, 25 days, four months, and two years after the facts are initially learned. Since then, others have found that a slight delay of 10 minutes before the first retrieval made the task just mentally challenging enough to be beneficial [PDF]. But it depends on the goal; if it's to memorize a speech in a day, you'll probably want to cram in more intervals than if you want to remember something five years later [PDF].
I've been experimenting with recall intervals one hour after I read material, then again when I'm at the gym, trying to recall facts learned during the previous three days, one week, and one month prior. The optimal intervals will ultimately depend on your schedule.
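An expanding, Pimsleur-style schedule is easy to sketch. The intervals below, and the convention of measuring each one from the previous review, are assumptions for illustration rather than a prescription:

```python
from datetime import datetime, timedelta

# Expanding review intervals, loosely inspired by the Pimsleur-style
# spacing discussed above (values chosen for illustration).
INTERVALS = [timedelta(minutes=10), timedelta(hours=1), timedelta(days=1),
             timedelta(days=5), timedelta(days=25), timedelta(days=120)]

def review_schedule(first_learned, n_reviews):
    """Times of the first n_reviews recall attempts, each interval
    measured from the previous review."""
    times, t = [], first_learned
    for interval in INTERVALS[:n_reviews]:
        t += interval
        times.append(t)
    return times

for when in review_schedule(datetime(2014, 7, 15), 3):
    print(when)   # 00:10, then 01:10 the same day, then 01:10 the next day
```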
Use Loci
The ancient granddaddy of advanced memory techniques is the method of Loci, which involves placing objects in sequential order in a mentally constructed (imaginary) world. The most famous memory man of all time, Solomon Shereshevsky, who could recall sets of random numbers years later, used to imagine himself placing objects near buildings.
World Memory Champion Dominic O'Brien gives practical tips about developing one's own Loci method in You Can Have An Amazing Memory. O'Brien advocates using Loci places of familiarity, like the walk down a familiar neighborhood block or location within your own home. So, for instance, if you want to memorize the words "Duck," "Car," and "Boat," you might imagine placing a duck on the living room floor, a car in the bathroom, and a boat on the patio. For more complicated tasks, it might help to link them together, like imagining a giant duck walking to a car in the bathroom.
Connect the dots
Understanding is the basis for easier memorization. Chess masters have a much easier time memorizing the locations of chess pieces than beginners do, even though they're recalling the same information.
In a study published in the Journal of Cognitive Neuroscience, researchers found that second-year biology students had an easier time learning new information if it was related to programs they were already studying. "If you don't immediately know the answer to a question, you could first try recalling what you already know about that topic. This might help you to come up with the right answer after all," concludes one of the researchers.
In other words, the more widely knowledgeable we are about a subject, the easier it is to retain and retrieve information. So, read books and the news widely. The more you know, the more you'll be able to know.