3 plans to avoid blackouts using 100% renewable energy

Researchers have proposed three different methods for providing consistent power in 139 countries using 100 percent renewable energy.

The inconsistencies of power produced by wind, water, and sunlight and the continuously fluctuating demand for energy often hinder renewable energy solutions. In a new paper, which appears in Renewable Energy, the researchers outline several solutions to making clean power reliable enough for all energy sectors—transportation; heating and cooling; industry; and agriculture, forestry, and fishing—in 20 world regions after all sectors have converted to 100 percent clean, renewable energy.

The researchers previously developed roadmaps for transitioning 139 countries to 100 percent clean, renewable energy by 2050 with 80 percent of that transition completed by 2030. The present study examines ways to keep the grid stable with these roadmaps.

Multiple solutions

“Based on these results, I can more confidently state that there is no technical or economic barrier to transitioning the entire world to 100 percent clean, renewable energy with a stable electric grid at low cost,” says lead author Mark Z. Jacobson, a professor of civil and environmental engineering at Stanford University who is also a senior fellow at the Stanford Precourt Institute for Energy and the Stanford Woods Institute for the Environment.

“This solution would go a long way toward eliminating global warming and the 4 million to 7 million air pollution–related deaths that occur worldwide each year, while also providing energy security.”

The paper builds on a previous 2015 study by Jacobson and colleagues that examined the ability of the grid to stay stable in the 48 contiguous United States. That study only included one scenario for how to achieve the goals. Some criticized that paper for relying too heavily on adding turbines to existing hydroelectric dams—which the group suggested in order to increase peak electricity production without changing the number or size of the dams.

The previous paper was also criticized for relying too much on storing excess energy in water, ice, and underground rocks. The solutions in the current paper address these criticisms by suggesting several different solutions for stabilizing energy produced with 100 percent clean, renewable sources, including solutions with no added hydropower turbines and no storage in water, ice, or rocks.

“Our main result is that there are multiple solutions to the problem,” says Jacobson. “This is important because the greatest barrier to the large-scale implementation of clean renewable energy is people’s perception that it’s too hard to keep the lights on with random wind and solar output.”

Meeting demand

At the heart of this study is the need to match energy supplied by wind, water, and solar power and storage with what the researchers predict demand to be in 2050. To do this, they grouped 139 countries—for which they created energy roadmaps in a previous study—into 20 regions based on geographic proximity and some geopolitical concerns.

Unlike the previous 139-country study, which matched energy supply with annual-average demand, the present study matches supply and demand in 30-second increments for 5 years (2050-2054) to account for the variability in wind and solar power as well as the variability in demand over hours and seasons.

For the study, the researchers relied on two computational modeling programs. The first predicted global weather patterns from 2050 to 2054. From this, they further predicted the amount of energy that could be produced over time from weather-dependent sources: onshore and offshore wind turbines, solar photovoltaics on rooftops and in power plants, concentrated solar power plants, and solar thermal plants. These types of energy sources are variable and don’t necessarily produce energy when demand is highest.

The group then combined data from the first model with a second model that incorporated energy produced by more stable sources of electricity, like geothermal power plants, tidal and wave devices, and hydroelectric power plants, and of heat, like geothermal reservoirs. The second model also included ways of storing energy when there was excess, such as in electricity, heat, cold, and hydrogen storage. Further, the model included predictions of energy demand over time.

With the two models, the group was able to predict both how much energy could be produced through more variable sources of energy, and how well other sources could balance out the fluctuating energy to meet demands.
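
To make the balancing idea concrete, the matching step can be caricatured as a simple dispatch loop (a minimal sketch with invented numbers, not the authors’ model): variable wind and solar supply is used first, flexible generators and storage fill any gap, and a blackout is logged whenever demand still goes unmet.

```python
# Toy grid-balancing sketch (illustrative only; all numbers are invented,
# not from the study). At each step: use variable supply, then dispatch
# flexible generation, then discharge storage; record a blackout if
# demand is still unmet. Any surplus charges storage.

def balance(demand, variable, flexible_cap, storage_cap):
    storage = storage_cap / 2  # start half-charged
    blackouts = 0
    for d, v in zip(demand, variable):
        supply = min(v, d)                 # variable generation first
        gap = d - supply
        dispatch = min(flexible_cap, gap)  # hydro/geothermal-style fill-in
        gap -= dispatch
        discharge = min(storage, gap)      # then draw down storage
        storage -= discharge
        gap -= discharge
        if gap > 1e-9:
            blackouts += 1
        surplus = max(v - d, 0)            # excess charges storage
        storage = min(storage_cap, storage + surplus)
    return blackouts

demand   = [10, 12, 15, 14, 11]
variable = [14, 9, 6, 13, 16]   # fluctuating wind/solar output
print(balance(demand, variable, flexible_cap=3, storage_cap=5))  # 1 blackout
```

The real model does this every 30 seconds for five simulated years across 20 regions; the loop above only shows the bookkeeping logic of matching variable supply against demand.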

Keeping the lights on

Scenarios based on the modeling data avoided blackouts at low cost in all 20 world regions for all five years examined and under three different storage scenarios. One scenario includes heat pumps—which are used in place of combustion-based heaters and coolers—but no hot or cold energy storage; two add no hydropower turbines to existing hydropower dams; and one has no battery storage.

The fact that no blackouts occurred under three different scenarios suggests that many solutions to grid stability with 100 percent wind, water, and solar power are possible, a conclusion that contradicts previous claims that the grid cannot stay stable with such high penetrations of renewables.

Overall, the researchers found that the cost per unit of energy—including the cost in terms of health, climate, and energy—in every scenario was about one quarter of what it would be if the world continued on its current energy path. This is largely due to eliminating the health and climate costs of fossil fuels. Also, by reducing water vapor, the wind turbines included in the roadmaps would offset about 3 percent of global warming to date.

Although the cost of producing a unit of energy is similar in the roadmap scenarios and the non-intervention scenario, the researchers found that the roadmaps roughly cut in half the amount of energy needed in the system. So, consumers would actually pay less.

The vast majority of these energy savings comes from avoiding the energy needed to mine, transport, and refine fossil fuels; converting from combustion to direct electricity; and using heat pumps instead of conventional heaters and air conditioners.

“One of the biggest challenges facing energy systems based entirely on clean, zero-emission wind, water, and solar power is to match supply and demand with near-perfect reliability at reasonable cost,” says Mark Delucchi, coauthor of the paper and a research scientist at the University of California, Berkeley. “Our work shows that this can be accomplished, in almost all countries of the world, with established technologies.”

Planning ahead, working together

Jacobson and his colleagues say that a remaining challenge of implementing their roadmaps is that they require coordination across political boundaries.

“Ideally, you’d have cooperation in deciding where you’re going to put the wind farms, where you’re going to put the solar panels, where you’re going to put the battery storage,” says Jacobson. “The whole system is most efficient when it is planned ahead of time as opposed to done one piece at a time.”

In light of this geopolitical complication, they are also working on smaller roadmaps to help individual towns, many of which have already committed to achieving 100 percent renewable energy.

Additional coauthors of this paper are from Aalborg University in Denmark and UC Berkeley.

Source: Stanford University

The post 3 plans to avoid blackouts using 100% renewable energy appeared first on Futurity.

Just 1 degree changes our risk of severe weather

Current commitments won’t meet the Paris Agreement’s aspirational goals for limiting temperature rise—and that shortfall could make the world a degree warmer and considerably more prone to extreme weather.

The difference between this UN goal and the actual country commitments is a mere 1 C, which may seem negligible. But a new study in Science Advances finds that even that 1-degree difference could increase the likelihood of extreme weather.

In this study, Noah Diffenbaugh, professor of earth system science at Stanford University’s School of Earth, Energy & Environmental Sciences, and colleagues expanded on previous work analyzing historical climate data, which demonstrated how greenhouse gas emissions have increased the probability of record-breaking hot, wet, and dry events in the present climate.

Now, the group analyzed similar models to estimate the probability of extreme weather events in the future under two scenarios of the Paris Agreement: increases of 1.5 to 2 degrees if countries live up to their aspirations, or 2 to 3 degrees if they meet the commitments that they have made.

“The really big increases in record-setting event probability are reduced if the world achieves the aspirational targets rather than the actual commitments,” says Diffenbaugh, who is also senior fellow in the Stanford Woods Institute for the Environment. “At the same time, even if those aspirational targets are reached, we still will be living in a climate that has substantially greater probability of unprecedented events than the one we’re in now.”

Droughts, floods, and heat

The new study is the latest application of an extreme event framework that Diffenbaugh and other researchers at Stanford have been developing for years. They have applied this framework to individual events, such as the 2012-2017 California drought and the catastrophic flooding in northern India in June 2013. In their 2017 paper on severe events, they found that global warming from human emissions of greenhouse gases has increased the odds of the hottest events across more than 80 percent of the globe for which reliable observations were available, while also increasing the likelihood of both wet and dry extremes.

The framework relies on a combination of historical climate observations and climate models that are able to simulate the global circulation of the atmosphere and ocean. The group uses output from these models run under two conditions: one that includes only natural climate influences, like sunspot or volcano activity, and another that also includes human influences like rising carbon dioxide concentrations. The researchers compare the simulations to historical extreme event data to test whether the condition with natural or human influences best represents reality.

For the new study, the researchers expanded the number of climate models from their previous paper that had investigated the 1 degree of global warming that has already occurred, strengthening their earlier conclusions. Then, they used their findings to predict the probabilities of severe events in the two Paris Agreement scenarios.
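
The core comparison behind the framework can be sketched in a few lines (hypothetical values, not the study’s data): estimate how often simulated years exceed the observed record under natural-only forcing versus natural-plus-human forcing, then take the ratio of those probabilities.

```python
# Sketch of the record-event comparison (all values invented for
# illustration, not the study's data): count how often simulated
# values exceed the observed record under each forcing scenario,
# then compare the resulting probabilities.

def record_probability(simulated, record):
    exceed = sum(1 for x in simulated if x > record)
    return exceed / len(simulated)

observed_record = 2.0  # e.g. hottest temperature anomaly on record

natural_only = [1.2, 1.5, 0.9, 1.8, 2.1, 1.1, 1.6, 1.4, 1.9, 1.3]
with_human   = [1.9, 2.4, 2.2, 1.7, 2.6, 2.1, 2.3, 1.8, 2.5, 2.0]

p_nat = record_probability(natural_only, observed_record)  # 0.1
p_hum = record_probability(with_human, observed_record)    # 0.6
print(p_hum / p_nat)  # ~6x more likely with human influences
```

The actual analysis does this with large ensembles of climate-model runs and formal statistics, but the quantity of interest is the same: how much a given forcing scenario multiplies the probability of a record-setting event.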

Two very different scenarios

Although the researchers knew that increases in temperature would very likely lead to increases in severe events, the stark difference in the outcomes of the two scenarios surprised them.

The researchers found that emissions consistent with the commitments countries have made are likely to result in a more than fivefold increase in probability of record-breaking warm nights over approximately 50 percent of Europe, and more than 25 percent of East Asia.

This 2 to 3 degrees of global warming would also likely result in a greater than threefold increase in record-breaking wet days over more than 35 percent of North America, Europe, and East Asia. The authors found that this level of warming is also likely to lead to increases in hot days, along with milder cold nights and shorter freezes.

Meeting the Paris Agreement’s goal of keeping the global-scale warming to less than 2 degrees is likely to reduce the area of the globe that experiences greater than threefold increases in the probability of record-setting events. However, even at this reduced level of global warming, the world is still likely to see increases in record-setting events compared to the present.

When people build a dam, plan the management of a river, or build on a floodplain, it is common practice to base decisions on past historical data. This study provides more evidence that these historical probabilities no longer apply in many parts of the world. The new analysis helps clarify what the climate is likely to look like in the future and could help decision makers plan accordingly.

“Damages from extreme weather and climate events have been increasing, and 2017 was the costliest year on record,” Diffenbaugh says. “These rising costs are one of many signs that we are not prepared for today’s climate, let alone for another degree of global warming.”

“But the good news is that we don’t have to wait and play catch-up,” Diffenbaugh adds. “Instead, we can use this kind of research to make decisions that both build resilience now and help us be prepared for the climate that we will face in the future.”

Additional coauthors of this paper are Deepti Singh, postdoctoral fellow at Columbia University and incoming faculty at Washington State University, and Justin Mankin, visiting research scholar and incoming faculty member at Dartmouth College, and scientist with Columbia University and the NASA Goddard Institute for Space Studies.

The School of Earth, Energy & Environmental Sciences and the Woods Institute for the Environment at Stanford University; The Earth Institute and Lamont-Doherty Earth Observatory of Columbia University; and the US Department of Energy funded this work.

Source: Stanford University

Millennials don’t want to delay spouse, house, kids

Millennials are marrying, buying homes, and starting families later in life. But this group—young adults in their 20s and 30s—hopes to reach important life goals at the same ages as previous generations, including those now in their 60s, 70s, and older, according to a new study.

Researchers found that the ideal timing of major milestones has remained relatively constant across generations.

“Millennials want to achieve the same things around the same time as everyone else,” says Tamara Sims, a research scientist at the Stanford Center on Longevity, about the findings of the study, called the Milestones Project.

On average, people over 25 said they ideally wanted to marry by 27, buy a home by 28, and start a family by 29. However, the extent to which people reached these goals decreased with every successive generation, with those between 25 and 34 being the least likely to achieve them.

“Our findings suggest that young adults are not the disruptors that they have been made out to be,” Sims says. “They are indeed getting married, buying a home, and starting a family later than their ideal age at lower rates than other generations, but this decline did not start with them.”

As part of the project, researchers surveyed four generations—1,716 participants ranging from ages 25 to 75 and older—to find out when people hoped to attain their goals versus when they actually reached them.

The study shows that home ownership was the goal that the fewest American millennials actually reached. And millennials are not alone: researchers found that even those between 35 and 54 experienced a seven-year gap between when they intended to buy a home and when they did. Those 65 and older reported buying homes only one or two years after their ideal age for home ownership.
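
The ideal-versus-actual comparison reduces to simple arithmetic per cohort; the sketch below uses invented median ages chosen only to mirror the gaps reported above.

```python
# Minimal sketch of the ideal-vs-actual milestone gap (ages are
# hypothetical placeholders, not the survey's data; only the overall
# pattern mirrors the findings).

ideal_age = {"marry": 27, "buy_home": 28, "start_family": 29}

# hypothetical median age of first home purchase by cohort
actual_home_purchase = {"65_plus": 30, "35_54": 35, "25_34": None}  # None = not yet reached

def gap(actual, ideal):
    """Years past the ideal age, or None if the milestone wasn't reached."""
    return None if actual is None else actual - ideal

for cohort, age in actual_home_purchase.items():
    print(cohort, gap(age, ideal_age["buy_home"]))
```

Under these placeholder numbers the oldest cohort is two years late, the middle cohort seven years late, and the youngest has not yet reached the milestone, matching the pattern the study describes.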

In addition, the study showed that millennials want to save for retirement sooner than previous generations did, and 43 percent are actually doing so, more than any older generation did at that age. This finding could be attributed to an increase in policies and programs promoting retirement savings in recent years, Sims says.

“Beliefs and values about the right way of doing things—in this case, when you should get married, buy a home—are very ingrained in our culture,” says Jeanne L. Tsai, a Stanford professor of psychology, commenting on the new study. “At the same time, I think the results on saving for retirement are really encouraging. They suggest that with education and alternative models for doing things, beliefs, expectations, and even behavior can change.”

Discrepancies between what people desire and what actually happens in their lives can reliably predict poorer health and well-being, Sims says, citing previous research. She notes that it is important to track these generational changes and strive to reduce those discrepancies.

“People are appearing to pursue ideals for life that were set around World War II, and it doesn’t make sense that we as a society haven’t questioned these ideals,” Sims says. “We hope this study, along with the center’s broader mission, helps people rethink their goals in this era of long life and empower younger generations.”

Source: Stanford University

Product labels like ‘Fair Trade’ mean less than you’d think

Buying ethically sourced products is not as straightforward as it might seem, according to the first large-scale analysis of sustainable sourcing practices.

Imagine, for example, you want some chocolate. You scan the market shelf for a bar with a Fair Trade or Rainforest Alliance certification because you don’t want your indulgence to drive labor abuse and deforestation. It’s the right thing to do, right?

While more than half of the global companies surveyed apply sustainability practices somewhere in their supply chain, according to the study, these efforts actually tend to have a much more limited reach than consumers might imagine given media attention to the issue and the proliferation of sustainable product labeling.

“Our results show a glass half full and half empty,” says study coauthor Eric Lambin, professor in Stanford University’s School of Earth, Energy & Environmental Sciences and a senior fellow at the Stanford Woods Institute for the Environment.

The paper, published in the Proceedings of the National Academy of Sciences, relates sourcing practices to the UN Sustainable Development Goals, an agenda for a sustainable global economy. With global supply chains touching more than 80 percent of global trade and employing more than one in five workers, corporate supply chains have the potential to play an outsized role in achieving the UN goals.

The researchers analyzed 449 publicly listed companies in the food, textile, and wood-products sectors, and found about half use some form of sustainable sourcing practice ranging from third-party certification of production standards to environmental training for suppliers. Among their findings:

  • More than 70 percent of sustainable sourcing practices cover only a subset of input materials for a given product. For example, a company might use recycled materials for the packaging of a product, but leave the remainder of a product’s upstream impact unaddressed.
  • Only 15 percent of sustainable sourcing practices focus on health, energy, infrastructure, climate change, education, gender, or poverty.
  • Almost all sustainable sourcing practices address only a single tier in the supply chain, usually first-tier suppliers, such as the textile factories that sew T-shirts. Often, the remaining processes, from dyeing the cloth to growing the cotton, remain unaddressed.
  • More than a quarter of sustainable sourcing practices apply to only a single product line. For example, a company may use Fair Trade certification for only one type of chocolate bar among many that it sells.

“Advancing environmental and social goals in supply chains can quickly become very complex,” says study coauthor Joann de Zegher, a postdoctoral fellow at the Stanford Graduate School of Business. “This complexity is reflected in our findings that companies use a broad range of strategies and that current efforts have limited reach.”

On a hopeful note, the researchers find that companies on the receiving end of consumer and civil society pressure are “significantly more likely” to adopt at least one sustainable sourcing practice. So, perhaps unsurprisingly, companies headquartered in countries with many active nongovernmental organizations are more likely to use sustainable sourcing practices, according to the study.

“The pressure consumers put on firms when they demand more sustainable products might be paying off,” says study lead author Tannis Thorlakson, a graduate student in the Emmett Interdisciplinary Program in Environment and Resources of Stanford’s School of Earth, Energy & Environmental Sciences.

“I hope this paper acts as a call to action for those 48 percent of companies that aren’t doing anything to address sustainability challenges in their supply chain.”

The National Science Foundation and the Teresa Elms and Robert D. Lindsay Fellowship at Stanford supported the work.

Source: Stanford University

On ‘Day Zero,’ will Cape Town shut off its water?

Cape Town, South Africa—a modern city of nearly 4 million residents (plus over 1.5 million tourists yearly)—is on the brink of running out of water. In May, the city could be forced to cut off the vast majority of its taps.

Buzz Thompson, a water law expert at Stanford University, recently talked about how Cape Town got into this dire situation, what will happen on “Day Zero” (the day an entire city runs out of water), and whether other cities face similar crises.

Lesser-known relative of the laser could leave the lab soon

Researchers may have found a way to overcome a key weakness of a type of light source similar to the laser. The alternative light source could lead to smaller, lower-cost, and more efficient sources of light pulses.

Although critical for varied applications, such as cutting, welding, surgery, and transmitting bits through optical fiber, lasers have a key limitation: they produce light only in limited wavelength ranges.

Now, researchers have modified similar light sources, called optical parametric oscillators, to overcome this obstacle.

Until now, these lesser-known light sources have been mostly confined to the lab because their setup leaves little room for error—even a minor jostle could knock one out of alignment.

Their work, which appears in Physical Review Letters, demonstrates a new way to produce femtosecond pulses—pulses measured in quadrillionths of a second—in desirable wavelength ranges using this light source. The technology could potentially lead to better detection of pollutants and diseases by merely scanning the air or someone’s breath.

Turning knobs

The light source these researchers study consists of an initial step where pulses of light from a traditional laser are passed through a special crystal and converted into a wavelength range that’s difficult to access with conventional lasers. Then, a series of mirrors bounce the light pulses around in a feedback loop. When this feedback loop is synchronized to the incoming laser pulses, the newly converted pulses combine to form an increasingly strong output.
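
The build-up in the feedback loop can be caricatured as a geometric accumulation (a deliberately linear toy with invented numbers; it shows only the resonant build-up idea, not the nonlinear conversion physics the researchers actually study).

```python
# Cartoon of pulse build-up in a synchronized feedback loop (a linear
# toy model with invented numbers; real optical parametric oscillators
# are nonlinear). Each round trip, a fraction of the circulating pulse
# survives the mirrors and a newly converted pulse is added in sync.

def steady_state_amplitude(reflectivity, injected, round_trips=200):
    a = 0.0
    for _ in range(round_trips):
        a = reflectivity * a + injected  # feedback + new converted pulse
    return a

# With synchronized feedback, the circulating pulse grows well beyond a
# single injected pulse; the limit is injected / (1 - reflectivity).
print(steady_state_amplitude(reflectivity=0.9, injected=1.0))  # ~10.0
```

The counterintuitive result in the paper is that lowering the reflectivity, which this linear cartoon would predict to weaken the build-up, actually doubled the conversion efficiency once the nonlinear pulse dynamics are accounted for.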

Traditionally, such a contraption converted only a small fraction of the initial light pulses into the desired output. To be effective in real-world applications, the group had to bump up that conversion efficiency.

“We needed higher conversion efficiency to prove it was a source worth studying,” says Alireza Marandi, a staff member in the Ginzton Lab at Stanford University. “So we just say, ‘OK, what are the knobs we have in the lab?’ We turned one that made the mirrors reflect less light, which was against the standard guidelines, and the conversion efficiency doubled.” The researchers published their initial experimental results two years ago in Optica.

Cranking up the power in a conventional design usually results in two undesirable outcomes: The pulses lengthen and the conversion efficiency drops. But in the new design, where the researchers significantly decreased the reflectivity of their mirrors, the opposite occurred.

“We were thinking about this regime based on the standard design guidelines, but the behavior we would see in the lab was different,” says Marc Jankowski, lead author of the paper and a graduate student in the Ginzton Lab. “We were seeing an improvement in performance, and we couldn’t explain it.”

In the palm of a hand

After more simulations and lab experiments, the group found that the key was not just making the mirrors less reflective but also lengthening the feedback loop. The longer loop increased the time it took for the light pulses to complete each circuit, which should have knocked them out of sync with the incoming pulses. But the lower reflectivity, combined with the time delay, caused the pulses to interact in unexpected ways, pulling them back into synchronization with their incoming partners.

This unanticipated synchronization more than doubled the bandwidth of the output, which means it can emit a broader span of wavelengths within the range that is difficult to access with conventional lasers. For applications like detecting molecules in the air or in a person’s breath, light sources with greater bandwidth can resolve more distinct molecules. In principle, the pulses this system produces could be compressed to as short as 18 femtoseconds, which can be used to study the behavior of molecules.

The decision to reduce the mirror reflectivity had the surprising consequence of making a formerly persnickety device more robust, more efficient, and better at producing ultra-short light pulses in wavelength ranges that are difficult to access with traditional lasers.

The next challenge is designing the device to fit in the palm of a hand.

“You talk with people who have worked with this technology for the past 50 years and they are very skeptical about its real-life applications because they think of these resonators as a very high-finesse arrangement that is hard to align and requires a lot of upkeep,” says Marandi, who is also coauthor of the paper. “But in this regime of operation these requirements are super-relaxed, and the source is super-reliable and doesn’t need the extensive care required by standard systems.”

This newfound design flexibility makes it easier to miniaturize such systems onto a chip, which could lead to many new applications for detecting molecules and remote sensing.

“Sometimes you completely reshape your understanding of systems you think you know,” Jankowski says. “That changes how you interact with them, how you build them, how you design them, and how useful they are. We’ve worked on these sources for years and now we’ve gotten some clues that will really help bring them out of the lab and into the world.”

Additional coauthors of the paper are from Stanford University, ETH Zurich, and the US Military Academy at West Point.

The US Department of Defense and the National Science Foundation funded the research.

Source: Stanford University

Killer cells target leukemia broadcasting ‘come and get me’

Researchers used CRISPR gene-editing to equip certain immune cells with a homing beacon to target leukemia.

Leukemia is a deadly cancer in which rogue white blood cells roam the bloodstream, slowly killing the body that gave them life. But it has an Achilles’ heel. Many leukemia cells are betrayed by a molecule on their exterior surfaces known as CD19.

When activated, CD19 will kill the cancer cell to which it is attached. To cancer biochemists, CD19 is like a tiny radio signal broadcasting to the world, “I’m leukemia. Come and get me.” But when a body is without the immune cells equipped to hear CD19’s siren song, the leukemia is free to carry on its lethal business undeterred.

So, researchers created leukemia-specific human immune cells that track down and kill any leukemia cell exhibiting the CD19 signal.

Developing better hunter-killer cells to target cancers is part of what goes on in the lab of Stanley Qi, assistant professor of bioengineering and of chemical systems biology.

Though this is still basic research, Qi’s approach could one day lead to new ways to treat the roughly 170,000 Americans who were diagnosed with leukemia and other blood-related cancers last year.

Beyond leukemia

But leukemia is just the beginning. Cancers of the blood system account for a mere fraction of all cancers, most of which are solid tumors—clumps of cells that grow inappropriately in breasts, ovaries, lungs, and prostate, for example.

Solid tumors take refuge within a complex microenvironment of molecules, hormones, and growth factors that help these unwanted cells spread and suppress the immune system agents that seek to kill the tumor.

Qi hopes to prove that his technique could work on all cancers because it targets a beacon found not just on leukemia, but on almost every type of cell in the body, including solid cancers.

By using CRISPR to hack ever more precisely into the genome, Qi believes it may one day be possible to bioengineer therapeutic agents that dial in on not just cancers, but other diseases that use the same radio-like signaling already exploited to attack leukemia.

Tuning the antennae

Qi’s team used the CRISPR gene-editing technique to modify cellular receivers known as G protein-coupled receptors—GPCRs for short.

One of the largest and most important families of chemical receptors in human physiology, GPCRs are like cellular antennae, constantly searching for biochemical signals that allow cells to communicate and to function together as tissues.

When antennae molecules recognize a particular signal—a molecule like CD19, for instance—they initiate a cascade of cellular communications with the nucleus that triggers a broad array of genetic outcomes ranging from immune responses to chemical generation to cell reproduction.

When GPCRs detect opiates, for instance, they instruct cells to flood the body with pleasure-enhancing, painkilling dopamine. As such, GPCRs are the gateways—the input/output devices—by which various important hormones, proteins, fatty acids, and drugs communicate on a cellular level.

GPCRs are found on the surface of almost every cell type in the body. Of the 20,000 or so genes that make up the human genome, 800 alone are dedicated to distinct GPCR variations. “That’s a huge proportion of our genetic code,” Qi says, noting that some 40 percent of all drugs already on the market today target GPCRs.

Therein lies the excitement in this research. By developing a technique that can turn the plethora of GPCRs into tattletales for different illnesses and dysfunctions, Qi’s team created a platform for hacking into the body’s biochemical communications network to battle disease. In the cancer example described above, the team has been able to recalibrate the GPCR antennae to home in on key molecules present in the tumor microenvironment.

Doing the ChaCha

Qi’s team has dubbed its variation of the CRISPR technique “ChaCha” for the way it involves a dance of two molecules to modify the genetic code of GPCRs.

“With ChaCha we can now create GPCR antenna devices that recognize virtually any molecule imaginable, including hormones, cellular growth factors, and synthetic drugs,” he says.

While there are existing CRISPR techniques that target GPCRs, ChaCha has two key advantages. First, it’s dose dependent. A GPCR trained to recognize a specific hormone, for instance, would be able to modulate its response based on the relative presence of that hormone—more hormone would mean a greater response, and vice versa.

“This is a programmable logic by which cells can figure out what their charge is and when they have completed an assigned task,” he notes. “We’re trying to design smarter cells.”

The second advantage is that ChaCha is reversible. A cell modified for a specific task could be returned to its normal state once its duty was complete.

“As bioengineers, we want total controllability,” Qi says. “ChaCha-modified cells would cease working when the cancer cells are gone.”

Early clinical trials have been promising and are already leading to new leukemia therapies. What has been most revolutionary, however, is a growing ability to use living cells as therapies, opening a world beyond traditional chemotherapies.

Qi and collaborators are excited by the broader prospect of adapting their genetic approach to an array of diseases ranging from solid tumors to neurological disorders such as Parkinson’s disease and autoimmune disorders like lupus.

Asked about next steps for ChaCha, Qi says he plans to continue to test the bounds of his technique to make it easier to create cells to attack disease or to conjure desirable chemicals in the body. There has already been commercial interest in the approach.

“We are just at the beginning of a very exciting period in predictably designing living cells for medical uses,” Qi says. “Now we’re moving quickly and in the right direction.”

The researchers report their findings in the journal Nature Communications.

Primary funding for this research was provided by the Stanford School of Engineering, the Stanford School of Medicine, the Pew Charitable Trusts, and the Cancer Research Institute.

Source: Stanford University

The post Killer cells target leukemia broadcasting ‘come and get me’ appeared first on Futurity.

Good attitude about math gets kid brains in high gear

Having a positive attitude about math is connected to better function of the hippocampus, an important memory center in the brain, during performance of arithmetic problems, a new study of elementary school students suggests.

Educators have long observed higher math scores in children who show more interest in math and perceive themselves as being better at it. But it has not been clear if this attitude simply reflects other capacities, such as higher intelligence. The new study marks the first time that scientists have identified the brain pathway that links a positive attitude toward math to achievement in the subject.

The new study also found that, even once IQ and other confounding factors were accounted for, a positive attitude toward math still predicted which students had stronger math performance.

Arithmetic attitude

“Attitude is really important,” says Lang Chen, the study’s lead author and a postdoctoral scholar in psychiatry and behavioral sciences at Stanford University. “Based on our data, the unique contribution of positive attitude to math achievement is as large as the contribution from IQ.”

The scientists had not expected the contribution of attitude to be so large, Chen says. The mechanism underlying its link to cognitive performance was also unexpected.

“It was really surprising to see that the link works through a very classical learning and memory system in the brain,” says senior author Vinod Menon, professor of psychiatry and behavioral sciences.

“Having a positive attitude acts directly on your memory and learning system…”

Researchers had previously hypothesized that the brain’s reward centers might drive the link between attitude and achievement—perhaps children with better attitudes were better at math because they found it more rewarding or motivating.

“Instead, we saw that if you have a strong interest and self-perceived ability in math, it results in enhanced memory and more efficient engagement of the brain’s problem-solving capacities,” Menon says.

The researchers administered standard questionnaires to 240 children ages 7 to 10, assessing demographics, IQ, reading ability, and working-memory capacity. The children’s level of math achievement was measured with tests of their knowledge of arithmetic facts and ability to solve math word problems. Parents or guardians answered surveys about the children’s behavioral and emotional characteristics, as well as their anxiety about math and general anxiety. Children also answered a survey that assessed their attitude toward math, including questions about interest in math and self-perceived math ability, as well as their attitude toward academics in general.

Forty-seven children from the group also participated in MRI brain scans while performing arithmetic problems. Tests were conducted outside the MRI scanner to discern which problem-solving strategies they used. An independent group of 28 children also was given MRI scans and other assessments in an attempt to replicate the findings from the cohort previously given brain scans.

Math and memory

Math performance correlated with a positive attitude toward math even after statistically controlling for IQ, working memory, math anxiety, general anxiety, and general attitude toward academics, the study found.

Children with poor attitudes toward math rarely performed well in the subject, while those with strongly positive attitudes had a range of math achievement.

“A positive attitude opens the door for children to do well but does not guarantee that they will; that depends on other factors as well,” Chen says.

From the brain-imaging results, the scientists found that, when a child was solving a math problem, his or her positive-attitude scores correlated with activation in the hippocampus, an important memory and learning center in the brain. Activity in the brain’s reward centers, including the amygdala and the ventral striatum, was not linked to a positive attitude toward math.

Statistical modeling of the brain imaging results suggested that the hippocampus mediates the link between positive attitude and efficient retrieval of facts from memory, which in turn is associated with better problem-solving abilities.
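
The mediation logic described above can be sketched as a simple regression decomposition. The snippet below is a hypothetical illustration on simulated data (the variable names and effect sizes are ours, not the study’s actual model): the total effect of attitude on performance splits exactly into a direct path and an indirect path through the hippocampal measure.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated data following the hypothesized causal chain:
# positive attitude -> hippocampal activation -> math performance.
attitude = rng.normal(size=n)
hippocampus = 0.6 * attitude + rng.normal(scale=0.8, size=n)
performance = 0.5 * hippocampus + 0.1 * attitude + rng.normal(scale=0.8, size=n)

def slope(x, y):
    """Ordinary least-squares slope of y on x (with an intercept term)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# Baron-Kenny-style decomposition of the total effect:
total = slope(attitude, performance)              # total effect c
a = slope(attitude, hippocampus)                  # path a: attitude -> mediator
X = np.column_stack([np.ones(n), attitude, hippocampus])
coef = np.linalg.lstsq(X, performance, rcond=None)[0]
direct, b = coef[1], coef[2]                      # direct effect c', path b
indirect = a * b                                  # effect mediated by hippocampus

print(f"total={total:.2f} direct={direct:.2f} indirect={indirect:.2f}")
```

For linear models fit by ordinary least squares, the identity total = direct + indirect holds exactly; the study itself fit such a model to brain-imaging and behavioral data, and this toy version only illustrates the logic of a mediated effect.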

“Having a positive attitude acts directly on your memory and learning system,” Chen says. “I think that’s really important and interesting.”

The study could not disentangle the extent to which a positive attitude came from a child’s prior success in math.

“We think the relationship between positive attitude and math achievement is mutual, bi-directional,” Chen says. “We think it’s like bootstrapping: A good attitude opens the door to high achievement, which means you then have a better attitude, getting you into a good circle of learning. And it can probably go the other way and be a vicious circle, too.”

‘Maximizing learning’

The findings may provide a new avenue for improving academic performance and learning in children who are struggling, Menon says, cautioning that this idea still needs to be tested through active interventions.

“Typically, we focus on skill learning in individual academic domains, but our new work suggests that looking at children’s beliefs about a subject and their self-perceived abilities might provide another inroad to maximizing learning,” Menon says.

The findings also offer a potential explanation for how a particularly passionate teacher can nurture students’ interest and learning capacities for a subject, he adds. Inspiring teachers may be instinctively sharing their own interest, as well as instilling in students the belief that they can be good at the subject, building a positive attitude even if a student did not have it before.

Funding for the research came from the National Institutes of Health. Stanford’s psychiatry and behavioral sciences department also supported the work.

The researchers report their findings in the journal Psychological Science.

Source: Stanford University

The post Good attitude about math gets kid brains in high gear appeared first on Futurity.


How wings let bugs take over the world

The evolution of wings may have been central to insects’ becoming as abundant and widespread as they are today.

Comprising up to 10 million living species, insects today live on all seven continents and inhabit every terrestrial niche imaginable. But according to the fossil record, they were scarce before about 325 million years ago, outnumbered by their arthropod cousins the arachnids (spiders, scorpions, and mites) and myriapods (centipedes and millipedes).

The oldest confirmed insect fossil is that of a wingless, silverfish-like creature that lived about 385 million years ago. It’s not until about 60 million years later, during a period of the Earth’s history known as the Pennsylvanian, that insect fossils become abundant.

When insects such as this Meganeura monyi developed wings roughly 325 million years ago, the insect population exploded, researchers found. (Credit: Alexandre Albore/Wikimedia Commons)

“There’s been quite a bit of mystery around how insects first arose, because for many millions of years you had nothing, and then just all of a sudden an explosion of insects,” says first author Sandra Schachat, a graduate student at Stanford University’s School of Earth, Energy & Environmental Sciences (Stanford Earth).

Many ideas have been proposed to explain this curious lacuna in the insect fossil record, which scientists have dubbed the Hexapod Gap.

According to one popular hypothesis, the amount of oxygen available in Earth’s atmosphere during the late Devonian period limited insect size and abundance.

The strongest evidence for this theory is a model of atmospheric oxygen during the past 570 million years that the late Yale University geologist Robert Berner developed by comparing ratios of oxygen and carbon in ancient rocks and fossils.

According to Berner’s model, the atmospheric oxygen level about 385 million years ago during the start of the Hexapod Gap was below 15 percent, so low that wildfires would have been unsustainable. (For comparison, today’s atmospheric oxygen concentration is about 21 percent.)

Another possibility is that insects were abundant before 323 million years ago, but don’t show up in the fossil record because the kinds of terrestrial sediments capable of preserving them didn’t survive.

‘Bad rocks’

In the new study, which appears in the journal Proceedings of The Royal Society B, Schachat and her colleagues tested both of these arguments—that low oxygen limited insects or that the rocks weren’t right to preserve fossils.

First, the team revised Berner’s nearly decade-old model using the latest carbon records.

When they did this, the dip in atmospheric oxygen during the late Devonian disappeared. “What this study shows is that environmental inhibition by low oxygen can be ruled out because it is not compatible with the most current data,” says study coauthor and Stanford Earth paleontologist Jonathan Payne.

To test the “bad rocks” hypothesis, the team analyzed a public database of North American rock types for different periods in the Earth’s history and found nothing unusual about the sediments of the late Devonian.

“The rocks could have contained insect fossils. The fact that they don’t indicates the dearth of insects during this period is real and not just an artifact of bad luck with preservation,” says Schachat, who is also a fellow at the Smithsonian Institution in Washington, DC.

‘Instantaneous’ diversification

Not only do the two most popular explanations for the Hexapod Gap appear to be unsubstantiated, the scientists say their study of the insect fossil record suggests that the Hexapod Gap itself might be an illusion.

As part of the new study, the team reexamined the ancient insect fossil record and found no direct evidence for wings before or during the Hexapod Gap. But as soon as wings appear 325 million years ago, insect fossils become far more abundant and diverse.

“The fossil record looks just how you would expect if insects were rare until they evolved wings, at which point they very rapidly increased in diversity and abundance,” Payne says.

Schachat says it’s notable that the first two winged insects in the fossil record are a dragonfly-like insect and a grasshopper-like insect. These represent the two main groups of winged insects: dragonflies have “old wings,” which they cannot fold down onto their abdomens, and grasshoppers have “new wings,” which are foldable.

“The first two winged insects in the fossil record are about as different from each other as you could possibly expect,” Schachat says. “This suggests that, once winged insects originated, they diversified very, very quickly. So quickly that their diversification appears, from a geological perspective and the evidence available in the fossil record, to have been instantaneous.”

The benefit of being first

Being the first and only animals able to fly would have been extremely powerful. Flight allowed insects to explore new ecological niches and provided new means of escape.

“All of a sudden, your abundance can increase because you can just get away from your predators so much more easily,” Schachat says. “You can also eat the leaves that are at the top of a tree without having to walk up the entire tree.”

Flying insects could also create niches that didn’t exist before. “Imagine an omnivorous insect that flies to the top of trees to feed,” Schachat says. “Suddenly, there’s a niche for a predator that can fly to the top of the tree to eat that insect. Wings allowed insects to expand the suite of niches that could be filled. It really was revolutionary.”

While the new study links the evolution of flight to the ascension of insects, it raises new questions about how and why they evolved wings in the first place, says coauthor Kevin Boyce, an associate professor of geological sciences at Stanford Earth.

“In the Devonian, there were only a few insects, all wingless,” Boyce says. “But you come out the other side and we have flight. What happened in between? Good question.”

Payne is also a member of Stanford Bio-X and an affiliate of the Stanford Woods Institute for the Environment. Additional coauthors of the paper are from the Smithsonian Institution, Ohio State University, and the University of Iowa.

Funding for the study came from the National Science Foundation.

Source: Stanford University

The post How wings let bugs take over the world appeared first on Futurity.

Neglected kids do better with earlier foster family placement

Neglected children placed with foster families earlier in life are more likely, by their teenage years, to be as resilient and as socially, academically, and physically competent as peers who were never institutionalized, according to new research focused on children in Romania.

“These kids are not doomed, and many of them end up with normal outcomes…”

Researchers discovered that 56 percent of previously institutionalized children who were randomly placed with foster families when they were between 6 months and 2 ½ years old were as competent across a range of metrics as their peers at age 12. That is more than double the rate among children who remained in institutional care, only 23 percent of whom were deemed competent at age 12, according to the new study, which appears in the Journal of Child Psychology and Psychiatry.

The proportion of children who met the researchers’ threshold for competent outcomes was even greater among those placed in family care at the youngest ages. Of children placed at age 20 months or younger, 79 percent were deemed competent, nearly the same rate as children who were never institutionalized at all.

Early action, better outcomes

“This study proves that resilient outcomes can be promoted by placing kids into foster care earlier in life,” says Kathryn Humphreys, a postdoctoral scholar in psychology at Stanford University and a lead author of the study.

“These kids are not doomed, and many of them end up with normal outcomes. So it’s important for us to work on removing them from those neglecting environments as soon as possible.”

There are about 8 million orphaned children in the world, and the new research can help clarify how best to take care of them, Humphreys says. In the United States, where institutional care is less common than in other places, neglect is still the most common reason for child maltreatment cases reported to child protective services.

Recognizing cases of neglected children and placing them into positive foster care environments is something even developed countries need to strive for, Humphreys says.

The study evaluated children who have been part of a long-term randomized, controlled trial in Romania, called the Bucharest Early Intervention Project, which started in 2000.

Previous research on orphanages and other early institutional care has shown that children in institutions experience severe psychosocial deprivation, leading to long-term developmental challenges. Adding to that, the Bucharest project found that children who were institutionalized longer early in life had poorer IQ scores and mental health.

‘A hopeful lens’

The new study takes a broad approach to analyzing children’s functioning, says Humphreys, who conducted the research during her postdoctoral fellowship at Tulane University’s School of Medicine and continued the work at Stanford, where she is working with Ian Gotlib, professor of psychology.

Researchers assessed the children’s level of resilience to early deprivation across seven categories of well-being: family relations, peer relations, academic performance, physical health, mental health, substance use, and risk-taking behavior. If a child scored positively in at least six of the seven categories, the researchers deemed that child to have met the threshold for adaptive functioning.
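
As a minimal sketch of how such a criterion works (the function and variable names here are hypothetical, not taken from the study’s materials), the six-of-seven rule can be written as:

```python
# The study's competence criterion: a child meets the adaptive-functioning
# threshold if at least six of the seven well-being categories score positively.
DOMAINS = ["family relations", "peer relations", "academic performance",
           "physical health", "mental health", "substance use", "risk-taking"]

def meets_threshold(scores, required=6):
    """scores maps each domain to True (positive) or False (negative)."""
    positives = sum(scores[domain] for domain in DOMAINS)
    return positives >= required

child = dict.fromkeys(DOMAINS, True)
child["mental health"] = False          # one negative domain out of seven
print(meets_threshold(child))           # six positives -> True
```

The point of such a rule is that a single negative domain does not disqualify a child from being counted as resilient, but two or more do.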

“There is no one metric of resilience,” Humphreys says. “But this was our way of using the existing available data to measure how well children are doing relative to their peers.”

Humphreys says one finding was a surprise: 40 percent of all children who had ever experienced institutional care met the threshold for resilience.

“When we think about kids in institutional care, we often think about how they end up not faring well,” Humphreys says.

“This research gives us a different, hopeful lens. A lot of kids seem to be doing just as well as their peers. It also gives us a window into how to promote resilience in children who experience neglect—namely, placing them in family care as early as possible.”

Additional coauthors are from Tulane University School of Medicine; the University of Washington; the University of North Carolina, Chapel Hill; Harvard University; and the University of Maryland.

Source: Stanford University

The post Neglected kids do better with earlier foster family placement appeared first on Futurity.