Big ideas are getting harder and harder to find, and innovations have become increasingly massive and costly endeavors, according to new research.

As a result, tremendous continual increases in research and development will be needed to sustain even today’s low rate of economic growth.

This means modern-day inventors—even those in the league of Steve Jobs—will have a tough time measuring up to the productivity of the Thomas Edisons of the past.

Nicholas Bloom, senior fellow at the Stanford Institute for Economic Policy Research and coauthor of a paper released this week by the National Bureau of Economic Research, contends that so many game-changing inventions have appeared since World War II that it’s become increasingly difficult to come up with the next big idea.

“The thought now of somebody inventing something as revolutionary as the locomotive on their own is inconceivable,” Bloom says.

“It’s certainly true if you go back one or two hundred years, like when Edison invented the light bulb,” he says. “It’s a massive piece of technology and one guy basically invented it. But while we think of Steve Jobs and the iPhone, it was a team of dozens of people who created the iPhone.”

(more…)

By: Timothy H. Dixon, University of South Florida

This summer I worked on the Greenland ice sheet, part of a scientific experiment to study surface melting and its contribution to Greenland’s accelerating ice losses. By virtue of its size, elevation and currently frozen state, Greenland has the potential to cause large and rapid increases to sea level as it melts.

When I returned, a nonscientist friend asked me what the research showed about future sea level rise. He was disappointed that I couldn’t say anything definite, since it will take several years to analyze the data. This kind of time lag is common in science, but it can make communicating the issues difficult. That’s especially true for climate change, where decades of data collection may be required to see trends.

A recent draft report on climate change by federal scientists exploits data captured over many decades to assess recent changes, and warns of a dire future if we don’t change our ways. Yet few countries are aggressively reducing their emissions in the way scientists say is needed to avoid the dangers of climate change.

While this lack of progress dismays people, it’s actually understandable. Human beings have evolved to focus on immediate threats. We have a tough time dealing with risks that have time lags of decades or even centuries. As a geoscientist, I’m used to thinking on much longer time scales, but I recognize that most people are not. I see several kinds of time lags associated with climate change debates. It’s important to understand these time lags and how they interact if we hope to make progress.

(more…)

By: Benjamin F. Jones, Northwestern University and Mohammad Ahmadpoor, Northwestern University

What does hailing a ride with Uber have to do with 19th-century geometry and Einstein’s theory of relativity? Quite a bit, it turns out.

Uber and other location-based mobile applications rely on GPS to link users with available cars nearby. GPS technology requires a network of satellites that transmit data to and from Earth; but satellites wouldn’t relay information correctly if their clocks failed to account for the fact that time is different in space – a tenet of Einstein’s general theory of relativity. And Einstein’s famous theory relies on Riemannian geometry, which was proposed in the 19th century to explain how spaces and curves interact – but dismissed as derivative and effectively useless in its time.
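
To get a feel for why satellite clocks need that correction, the rough estimate below (a back-of-the-envelope Python sketch using standard values for Earth's gravitational parameter and the GPS orbital radius, none of which come from the article) reproduces the well-known figure of roughly 38 microseconds per day by which an uncorrected GPS clock would drift ahead of one on the ground.

```python
import math

# Standard physical constants and nominal GPS orbital parameters (assumptions
# for this illustration, not figures from the article).
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8         # speed of light, m/s
r_earth = 6.371e6        # mean Earth radius, m
r_orbit = 2.6571e7       # GPS orbital radius (~20,200 km altitude), m
SECONDS_PER_DAY = 86_400

# General relativity: the weaker gravitational potential in orbit makes the
# satellite clock tick faster than a clock on the ground.
gravitational = GM / c**2 * (1 / r_earth - 1 / r_orbit)

# Special relativity: the satellite's orbital speed makes its clock tick slower.
v = math.sqrt(GM / r_orbit)            # circular orbital speed, ~3.9 km/s
kinematic = -v**2 / (2 * c**2)

to_us_per_day = SECONDS_PER_DAY * 1e6  # fractional rate -> microseconds per day
print(f"gravitational: {gravitational * to_us_per_day:+.1f} us/day")
print(f"kinematic:     {kinematic * to_us_per_day:+.1f} us/day")
print(f"net drift:     {(gravitational + kinematic) * to_us_per_day:+.1f} us/day")
```

At the speed of light, a timing error of about 38 microseconds corresponds to positioning errors on the order of 10 kilometers accumulating every day, which is why the satellites' clocks are deliberately tuned to compensate.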

The point is not just that mathematicians don’t always get their due. This example highlights an ongoing controversy about the value of basic science and scholarship. How much are marketplace innovations, which drive broad economic prosperity, actually linked to basic scientific research?

It’s an important question. Plenty of tax dollars and other funds go toward the research performed in academic centers, government labs and other facilities. But what kind of return are we as a society recouping on this large investment in new discoveries? Does scientific research reliably lead to usable practical advances?

(more…)

A journal’s impact factor looks at the number of citations within a particular year, but the significance of some research extends beyond a one-year time frame. To recognize such work, Google Scholar released its Classic Papers collection, which highlights highly cited papers that have stood the test of time.

“This release of classic papers consists of articles that were published in 2006 and is based on our index as it was in May 2017,” Sean Henderson, software engineer at Google Scholar, said in a release. “The list of classic papers includes articles that presented new research. It specifically excludes review articles, introductory articles, editorials, guidelines, commentaries, etc. It also excludes articles with fewer than 20 citations and, for now, is limited to articles written in English.”
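
The selection rules Henderson describes are concrete enough to express as a simple filter. The sketch below is purely illustrative – the record fields and helper function are invented for this example and are not Google Scholar’s actual data model or code – but it captures the stated criteria: 2006 articles presenting new research, written in English, with at least 20 citations, excluding reviews, editorials and the like.

```python
from dataclasses import dataclass

# All names here are invented for illustration; this is not Google Scholar's
# data model or selection pipeline.
EXCLUDED_TYPES = {"review", "introduction", "editorial", "guideline", "commentary"}

@dataclass
class Article:
    title: str
    year: int
    article_type: str   # e.g. "research", "review", "editorial"
    citations: int
    language: str

def is_classic_candidate(article: Article) -> bool:
    """Apply the quoted criteria: 2006 research articles in English
    with at least 20 citations, excluding reviews, editorials, etc."""
    return (
        article.year == 2006
        and article.article_type not in EXCLUDED_TYPES
        and article.citations >= 20
        and article.language == "en"
    )

papers = [
    Article("Example research article", 2006, "research", 120, "en"),
    Article("Example review article", 2006, "review", 400, "en"),
    Article("Lightly cited research article", 2006, "research", 12, "en"),
]
print([p.title for p in papers if is_classic_candidate(p)])
# ['Example research article']
```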

In the category of electrochemistry, works by ECS members Gleb Yushin, Christopher Johnson, Yuri Gogotsi, and Bernard Tribollet made the list.

Additionally, Michael Graetzel’s 2006 paper published in the Journal of The Electrochemical Society (JES), “Highly Efficient Dye-Sensitized Solar Cells Based on Carbon Black Counter Electrodes,” claimed the number eight spot.

“A journal from a professional society like ECS will look at the value of the science and not necessarily what its pizzazz is at that particular time,” Robert Savinell, editor of JES, told ECS in a recent podcast. “I think that’s one of the reasons we have this 10-year impact factor that’s at the top of the list. We’re looking at quality of the science in the long term.”

By: Andrew J. Hoffman, University of Michigan

When politicians distort science, academics and scientists tend to watch in shock from the sidelines rather than speak out. But in an age of “fake news” and “alternative facts,” we need to step into the breach and inject scientific literacy into the political discourse.

Nowhere is this obligation more vivid than the debate over climate change. Contrary to the consensus of scientific agencies worldwide, the president has called climate change a “hoax” (though his position may be shifting), while his EPA administrator has denied even the most basic link to carbon dioxide as a cause.

It’s another sign that we, as a society, are drifting away from the use of scientific reasoning to inform public policy. And the outcome is clear: a misinformed voting public and the passage of policies to benefit special interests.

Using data to meet predetermined goals

We saw this dynamic at work when President Trump announced his intention to withdraw from the Paris Agreement on climate change. In making his case, he presented an ominous economic future: “2.7 million lost jobs by 2025,” and industries devastated by 2040: “Paper – down 12 percent. Cement – down 23 percent. Iron and steel – down 38 percent. Coal – and I happen to love the coal miners – down 86 percent. Natural gas – down 31 percent.”

These data were drawn from a study – one study! – funded by the American Council for Capital Formation, a pro-business lobbying group, and conducted by National Economic Research Associates (NERA), a consulting firm for industrial clients often opposed to environmental regulations. The New York Times Editorial Board called the data “nonsense” and “a cornucopia of dystopian, dishonest and discredited data based on numbers from industry-friendly sources.”

(more…)

By: Erin Baker, University of Massachusetts Amherst

The U.S. Department of Energy spends US$3-$4 billion per year on applied energy research. These programs seek to provide clean and reliable energy and improve our energy security by driving innovation and helping companies bring new clean energy sources to market.

President Trump’s detailed budget request reportedly will ask Congress to cut funding for the Energy Department’s clean energy programs by almost 70 percent, from $2 billion this year to $636 million in 2018. Clean energy advocates and environmental groups strongly oppose such drastic cuts, but some reductions are likely. Where should DOE focus its limited funding to produce the greatest energy and environmental benefits?

My colleagues Laura Diaz Anadon of Cambridge University and Valentina Bosetti of Bocconi University and I recently reviewed 15 studies that asked this question. We found a number of clean energy technologies in electricity and transportation that will help us slow climate change by reducing greenhouse gas emissions, even at lower levels of investment.

(more…)

By: Mohammad S. Jalali, Massachusetts Institute of Technology

From social to natural and applied sciences, overall scientific output has been growing worldwide – it doubles every nine years.
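
For context, a nine-year doubling time implies growth of roughly 8 percent per year, as the minimal calculation below shows (it assumes steady exponential growth; the doubling time is the only figure taken from the text).

```python
# Implied annual growth rate if scientific output doubles every nine years,
# assuming steady exponential growth.
doubling_time_years = 9
annual_growth = 2 ** (1 / doubling_time_years) - 1
print(f"~{annual_growth:.1%} growth per year")   # ~8.0% per year
```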

Traditionally, researchers solve a problem by conducting new experiments. With the ever-growing body of scientific literature, though, it is becoming more common to make a discovery based on the vast number of already-published journal articles. Researchers synthesize the findings from previous studies to develop a more complete understanding of a phenomenon. Making sense of this explosion of studies is critical for scientists not only to build on previous work but also to push research fields forward.

My colleagues Hazhir Rahmandad and Kamran Paynabar and I have developed a new, more robust way to pull together all the prior research on a particular topic. In a five-year joint project between MIT and Georgia Tech, we worked to create a new technique for research aggregation. Our recently published paper in PLOS ONE introduces a flexible method that helps synthesize findings from prior studies, even potentially those with diverse methods and diverging results. We call it generalized model aggregation, or GMA.

Pulling it all together

Narrative reviews of the literature have long been a key component of scientific publications. The need for more comprehensive approaches has led to the emergence of two other very useful methods: systematic review and meta-analysis.

In a systematic review, an author finds and critiques all prior studies around a similar research question. The idea is to bring a reader up to speed on the current state of affairs around a particular research topic.
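
Meta-analysis, the quantitative counterpart to a systematic review, most commonly pools the effect estimates from individual studies using inverse-variance weights. The sketch below shows that textbook fixed-effect calculation with made-up numbers; it is a standard baseline, not the GMA method introduced in the paper, which is designed to handle studies with more varied designs and diverging results.

```python
import math

# Fixed-effect meta-analysis via inverse-variance weighting.
# The effect sizes and standard errors are invented for illustration.
studies = [
    (0.30, 0.10),   # (effect estimate, standard error) from study 1
    (0.45, 0.15),   # study 2
    (0.25, 0.08),   # study 3
]

weights = [1 / se**2 for _, se in studies]          # inverse-variance weights
pooled = sum(w * effect for (effect, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled effect: {pooled:.3f} (95% CI {pooled - 1.96*pooled_se:.3f} "
      f"to {pooled + 1.96*pooled_se:.3f})")
```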

(more…)

By: Bruce Weinberg, The Ohio State University

Science funding is intended to support the production of new knowledge and ideas that develop new technologies, improve medical treatments and strengthen the economy. The idea goes back to influential engineer Vannevar Bush, who headed the U.S. Office of Scientific Research and Development during World War II. And the evidence is that science funding does have these effects.

But, at a practical level, science funding from all sources supports research projects, the people who work on them and the businesses that provide the equipment, materials and services used to carry them out. Given currently proposed cuts to federal science funding – the Trump administration has, for instance, proposed a 20 percent reduction for the National Institutes of Health – it’s important to know what types of people and businesses are touched by sponsored research projects. This information provides a window into the likely effects of funding cuts.

Most existing research into the effects of science funding tries to quantify research artifacts, such as publications and patents, rather than tracking people. I’ve helped to start an emerging project called the UMETRICS initiative, which takes a novel approach to thinking about innovation and science. At its core, UMETRICS views people as key to understanding science and innovation – people conduct research, people are the vectors by which ideas move around and, ultimately, people are one of the primary “products” of the research enterprise.

UMETRICS identifies people employed on scientific projects at universities and the purchases made to carry out those projects. It then tracks people to the businesses and universities that hire them, and purchases to the vendors from which they come. Since UMETRICS relies entirely on administrative data provided by member universities (now around 50), the U.S. Census Bureau and other naturally occurring data, there are no reporting errors, sample coverage concerns or burden for people. It covers essentially all federal research funding as well as some funding from private foundations.

(more…)

While not the only source of science, government-funded research plays a huge role in the lives of many individuals. From something as simple as the weather apps underpinned by the National Weather Service to the Food and Drug Administration’s work on preventing Salmonella, this taxpayer-funded research shapes lives and helps provide the knowledge needed to make crucial decisions.

On January 23, word came from the White House that almost all U.S. scientific government agencies had been temporarily barred from communicating with the public via press releases, blogs, and social media.

It’s not currently clear how extensive the gag order is – some reports say that even explanations of just-published, peer-reviewed research are barred, while others describe a much more lenient scenario – but it is confirmed that almost all agencies, from the U.S. Department of the Interior to the Department of Health and Human Services, received a memo restricting, to some degree, outreach to the public.

Even after the gag order was put in place, some federal accounts kept posting: Badlands National Park, for instance, continued tweeting a stream of facts pertaining to climate change from its official account. The tweets have since been deleted, though the park did address the president in a letter published on the Huffington Post.

(more…)

John Staser, professor of chemical engineering at Ohio University
Image: Ohio University

ECS member and Ohio University professor John Staser was recently granted $1.5 million from the U.S. Department of Energy for biofuels research. Staser and his team will work to develop technology to make biorefineries more efficient and profitable, thereby reducing the cost of environmentally friendly biofuels.

Biofuels are combustible fuels created from biomass. Currently, they are the only viable replacement for petroleum-based transportation fuels because they can be used in existing combustion engines. Biofuels are typically produced from food crops (sugar cane, corn, soybean, etc.) or materials such as wood, grass, or inedible parts of plants. Ethanol and biodiesel are prominent biofuels that offer an alternative to transportation fuels such as petroleum and jet fuel.

Staser will lead an interdisciplinary team to develop ways to process lignin, a class of complex organic polymers that is one of the many waste products of the biorefining process.

“It’s not really competitive with gasoline, especially if oil is $40 a barrel,” Staser says. “Before this biofuel becomes feasible, we have to find a way to reduce the manufacturing cost. One way to do this is to come up with a secondary revenue stream for the refinery. So, if biorefineries could use waste lignin to do so, biofuel would become a more financially feasible option.”

(more…)
