
Health/Medical

Title: The man who made scientists question themselves has just exposed huge flaws in evidence used to give drug prescriptions
Source: qz
URL Source: https://qz.com/1016554/aziz-ansari- ... ed-brown-women-to-a-punchline/
Published: Sep 20, 2016
Author: Akshat Rathi
Post Date: 2017-06-29 21:28:25 by Anthem
Keywords: None
Views: 1238
Comments: 5


At least the experiment is reproducible. (Flickr/Julien Belli under CC-BY)

Over the past decade, scientists have increasingly become ashamed of the failings of their own profession: due to a lack of self-policing and quality control, a large proportion of studies have not been replicable, scientific frauds have flourished for years without being caught, and the pressure to publish novel findings—instead of simply good science—has become the commanding mantra. In what might be one of the worst such failings, a new study suggests that even systematic reviews and meta-analyses—typically considered the highest form of scientific evidence—are now in doubt.

The study comes from a single author: John Ioannidis, a highly respected researcher at Stanford University, who has built his reputation showing other scientists what they get wrong. In his latest work, Ioannidis contends that “the large majority of produced systematic reviews and meta-analyses are unnecessary, misleading, or conflicted.” Systematic reviews and meta-analyses are statistically rigorous studies that synthesize the scientific literature on a given topic. If you aren’t a scientist or a policymaker, you may have never heard of them. But you have almost certainly been affected by them.

If you’ve ever taken a medicine for any ailment, you’ve likely been given the prescription based on systematic reviews of evidence for that medication. If you’ve ever been advised to use a standing desk to improve your health, it’s because experts used meta-analyses of past studies to make that recommendation. And government policies increasingly rely on conclusions stemming from evidence found in such reviews. “We put a lot of weight and trust on them to understand what we know and how to make decisions,” Ioannidis says.

But, he says, in recent years, systematic reviews and meta-analyses have often been wrong.

The race to publish

Ioannidis shows that there has been a startling rise in the number of such reviews published each year.

He notes that one reason for this trend may be the pent-up demand to understand the heaps of evidence accumulated in different scientific disciplines. Systematic reviews and meta-analyses weren’t much in use before the 1980s. As the number of scientists and the papers they’ve published has grown, with governments around the world investing more into basic research, the amount of evidence from single studies has been gathering pace too.

Systematic reviews and meta-analyses go a long way in helping us make sense of such evidence. They do not simply summarize what previous studies have shown. Instead, they scrutinize the evidence in each of the previous studies and draw a meaningful, statistically-valid conclusion on which way the data points. After their utility was shown in the 1980s, especially in the medical sciences, more and more researchers started publishing these sorts of reviews looking at niches within their own fields.
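That “statistically-valid conclusion” is typically reached by weighting each study’s estimate by its precision. As a rough sketch of the core arithmetic—inverse-variance (fixed-effect) pooling, with invented numbers, none of which come from the article:

```python
# Three hypothetical studies of the same question: each reports an
# effect size and a sampling variance. Studies with smaller variance
# (more precise estimates) get more weight in the pooled result.
effects = [0.30, 0.10, 0.25]     # per-study effect estimates (invented)
variances = [0.04, 0.01, 0.09]   # per-study sampling variances (invented)

weights = [1.0 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5

print(f"pooled effect = {pooled:.3f}, SE = {pooled_se:.3f}")
```

The pooled estimate lands closest to the most precise study, which is the whole point: a meta-analysis is not a vote count but a precision-weighted average.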

However, Ioannidis argues that a lot of the studies published over the past few years are redundant. Many of these problematic papers, he says, are coming from China.

In the last decade, scientists from China have gone from publishing barely any meta-analyses to outstripping the US in their production. Though China’s investment in science has increased in that time, it alone does not explain the explosion in meta-analyses.

“If you look at the papers at face value, they seem to be very well done,” Ioannidis says. “But there is one critical flaw.”

A lot of the Chinese meta-analyses are in the field of genomics looking at candidate-gene studies. These studies often rely on genomic datasets collected as part of large health studies, involving tens of thousands of patients. Among such data, if enough people with a certain gene are found to suffer from a certain disease, the gene is linked to that disease and scientists can then publish a meta-analysis about the correlation.

The problem is that such studies have been shown to be useless. “About 10 years ago, we saw that doing these studies addressing one or a few genes at a time was leading us nowhere,” Ioannidis says. “We realized that we had to look at the entire genome and combine data with careful planning from multiple teams and with far more stringent statistical rules.”

The vast majority of diseases are the result of the interaction between many genes and many environmental factors. In other words, cherry-picking information about one or a handful of genes has no practical use.

Chinese scientists know this as well as researchers anywhere, but they continue to publish these useless genomic meta-analyses because of a skewed incentive structure. These scientists are often evaluated on the basis of how many studies they’ve published rather than the quality of those studies. And candidate-gene meta-analyses are easy to do. “All you need is tables of people with and without the gene and whether they do and do not get disease,” Ioannidis says.
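To see just how easy, here is a minimal sketch of the calculation behind a candidate-gene association: one 2x2 table of carriers versus non-carriers against disease status, reduced to an odds ratio. All counts are invented for illustration.

```python
# Invented 2x2 contingency table for a hypothetical gene variant.
carriers_with_disease = 120
carriers_without_disease = 880
noncarriers_with_disease = 90
noncarriers_without_disease = 910

# Cross-product odds ratio: >1 suggests the variant is associated
# with the disease in this sample.
odds_ratio = (carriers_with_disease * noncarriers_without_disease) / (
    carriers_without_disease * noncarriers_with_disease
)
print(f"odds ratio = {odds_ratio:.2f}")
```

A table like this takes minutes to assemble from an existing dataset, which is exactly why such papers are so cheap to produce—and, per Ioannidis, so uninformative.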

The science-industrial complex

If the problem were restricted to China and the field of genomics, it might not be that bad, since most genomics studies can’t harm or improve people’s health. But other seriously problematic systematic reviews and meta-analyses have affected the course of public medicine. Consider the case of antidepressants.

Between 2007 and 2014, 185 systematic reviews were published on the use of antidepressants. About 30% (58) made some negative statement about antidepressants.

The trouble is that, among those 58, only one had an author who was an employee of a pharmaceutical company at the time—despite the fact that 54 of the 185 total reviews (about 30%) had at least one industry author. That means a review with an industry author was roughly 22 times less likely to make a negative statement.
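The “22 times” figure can be roughly checked from the counts given, assuming the reviews partition as stated: 54 reviews with an industry author (1 negative) and 131 without (57 negative).

```python
# Back-of-the-envelope check of the article's figures. The partition
# into industry/non-industry negatives is assumed from the text.
total_reviews, negative_reviews = 185, 58
industry_reviews, industry_negative = 54, 1

non_industry_reviews = total_reviews - industry_reviews        # 131
non_industry_negative = negative_reviews - industry_negative   # 57

rate_industry = industry_negative / industry_reviews           # ~1.9%
rate_non_industry = non_industry_negative / non_industry_reviews  # ~43.5%
ratio = rate_non_industry / rate_industry

print(f"non-industry reviews were {ratio:.1f}x more likely to be negative")
```

The computed ratio comes out near 23, close to the article’s “22 times”; the small gap likely reflects rounding or a slightly different ratio definition in the underlying study.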

“We have a massive factory of industry-supported reviews that paint a picture of antidepressants being wonderful and easy-to-take,” Ioannidis says. “These systematic reviews have become a marketing tool.” And companies have emerged that take advantage of this tool. Over the past decade, Mapi Group, Abacus International, Evidera, Precision for Value, and others have begun to offer their services to the pharmaceutical industry, running meta-analyses for a fee.

In a study that is still under review, Ioannidis found that the vast majority of meta-analyses done by these contractors are never published. “If the paying customer doesn’t want to see the results because they are negative, the contractor doesn’t publish them,” Ioannidis says. This produces a skewed picture of the evidence, which is exactly what systematic reviews and meta-analyses are supposed to guard us against.

“There’s nothing wrong with such services helping whoever is ready to pay to understand what the academic literature says about a subject,” says Malcolm Macleod, professor of neurology and translational neuroscience at Edinburgh University, where he also focuses on the development and application of systematic reviews and meta-analyses. The concern, he says, is that there are likely cases where “the question is asked in such a way that the answer they find is in the interest of whoever is sponsoring the analysis.”

Consider this example: A pharmaceutical company conducts a meta-analysis of a cancer drug and finds the overall outcome is slightly negative. However, if they eliminate data collected on Tuesdays, which because of statistical randomness had a slightly greater number of negative data points, the overall outcome of the meta-analysis shifts to being positive. This company can then employ an external company and pay them to conduct a meta-analysis without using the data collected on Tuesdays. When published, the meta-analysis would, as the company wanted, show a slightly positive outcome for the cancer drug.
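The “Tuesday” trick can be illustrated with a toy simulation (all numbers invented): a set of trial results whose overall mean is slightly negative flips to positive once one arbitrarily chosen subset is dropped.

```python
# Invented per-trial effect estimates, tagged by weekday of collection.
# Overall they average slightly negative; excluding "Tue" flips the sign.
trials = [
    ("Mon", 0.4), ("Tue", -1.1), ("Wed", 0.2),
    ("Thu", -0.3), ("Tue", -0.9), ("Fri", 0.5),
]

all_mean = sum(effect for _, effect in trials) / len(trials)
kept = [effect for day, effect in trials if day != "Tue"]
kept_mean = sum(kept) / len(kept)

print(f"all data: {all_mean:+.2f}, without Tuesdays: {kept_mean:+.2f}")
```

The exclusion criterion sounds methodological when written into a protocol, but here it was chosen after the fact precisely because it changes the answer—which is the abuse Macleod describes.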

Overall, Ioannidis found that only a tiny sliver of systematic reviews and meta-analyses—about 3%—are both correct and useful.

“What Ioannidis has done is provide empirical evidence to support what has been a growing concern among those of us working in the field,” Macleod says. “To my knowledge, he is the first to lay bare the phenomenal increase in the proportion of bad reviews being done.”

What can we trust?

“We should be worried, but I wouldn’t just stop at the level of worry,” Ioannidis says. “There are things we can do to improve the situation.”

The vast majority of systematic reviews and meta-analyses are “retrospective,” which means researchers analyze data collected in the past and try to make sense of it. Retrospective reviews have serious limitations. For example, the past studies could be of varying quality and, even though they may ask the same question, they most likely will follow different protocols to collect their data. In such cases, the data sets from the many studies needed for the review would need to be rejiggered to make them comparable—but that can lead to less-than-perfect results. Or the authors of original studies may no longer be easily reachable—so the meta-researchers can’t track down details about the study that weren’t published but are crucial for the meta-analysis.

An initiative called Prospero was started to address these limitations. Prospero is a website where researchers can pre-register a review they are planning. “You make a research plan ahead of time,” Ioannidis says. “You think about how the studies are conducted and data collected. Then you start collecting the data, analyze it, and publish the systematic review.”

These types of reviews are known as “prospective,” and they overcome many of the limitations of retrospective reviews: pre-registration ensures reviewers have compatible data sets and access to the authors of the studies they are reviewing for any further questions. Currently, pre-registration on Prospero is voluntary, but if it were required by the top journals, it could radically increase the number of prospective reviews that get published.

The other way to fix the problem is to fix the input. A systematic review is only as good as the studies fed into the analysis. If the studies have an overwhelming bias because most are, say, industry-sponsored or because negative studies have remained unpublished, then the systematic review will likely give an incomplete picture of the state of the science for the subject under review.

To fix the input problem, scientists need to become more transparent about their work. The AllTrials initiative is trying to achieve that. It has teamed up with the world’s leading health organizations and even some pharmaceutical companies to try to get a commitment from all those conducting clinical trials to publish the results regardless of their outcome. After years of campaigning, last week the United Nations joined the cause, asking governments to ensure that all clinical trials are published.

Another way to fix the input problem may be to make systematic reviews and meta-analyses into living documents, like Wikipedia pages. Each such page, Ioannidis suggests, would be managed by a group of independent researchers interested in the subject area. This way, instead of researchers publishing new reviews every few years, a consistent group of researchers will use standard methodology to update the living document in an ongoing manner.

The more science we produce, the greater our need for high-quality systematic reviews and meta-analyses. So, though the flaws Ioannidis has highlighted may now put the highest form of evidence in doubt, they also give scientists a chance to pull up their socks and fix what is needed to keep science moving forward.


Poster Comment: Some of us have been emphasizing "trust" as a social / cultural issue that is overlooked both in its cause and effect. The causes of "high trust societies" are debated, but the effect of low trust is unavoidably a reduction of social cohesion.




#1. To: A K A Stone (#0)

This auto categorized before I caught it. You may want to change the category.

Anthem  posted on  2017-06-29   21:29:46 ET  Reply   Trace   Private Reply  


#2. To: Anthem (#0)

The entire peer-review process is becoming discredited. And the meta-analysis industry has been pretty dubious all along but has veered into outright fraud over the last decade.

Tooconservative  posted on  2017-06-30   10:27:53 ET  Reply   Trace   Private Reply  


#3. To: Tooconservative (#2)

From the article:
"Another way to fix the input problem may be to make systematic reviews and meta-analyses into living documents, like Wikipedia pages. Each such page, Ioannidis suggests, would be managed by group independent researchers interested in the subject area."

Yes, and his recommendation to improve the process is still invidious. I have found that to get "independent researcher" status I have to be affiliated with an approved organization or be a registered student.

Anthem  posted on  2017-06-30   12:09:32 ET  Reply   Trace   Private Reply  


#4. To: Anthem (#3)

It's the ol' Quis custodiet ipsos custodes? again.

Who will guard the guardians? They become the high priests and gatekeepers of knowledge and "truth". Any system like a Wikipedia is going to be subject to those same problems.

The only real cure is to have more actual replication of experimental data. Science needs to spend more time on replicating claimed experimental data. We need to incentivize fact-checking via replication in science endeavors.

It's the only way to keep them honest.

I think a good start would be to require half of any scientific grant to be used to replicate approved major studies in the scientific discipline. You'd still have problems with entire fields of study being fundamentally corrupt (global warming) or areas of science which aren't really hard science (the social sciences). But you would weed out a lot of useless fluff.

There is also the problem of complete information overload for scholars and researchers. The universities are mills of research, much of it spurious work that someone did as their M.A. or Ph.D. thesis so they could command a higher salary.

We have to value replication of research at least as much as original research itself.

Tooconservative  posted on  2017-06-30   12:32:22 ET  Reply   Trace   Private Reply  


#5. To: Tooconservative (#4) (Edited)

I think a good start would be to require half of any scientific grant to be used to replicate approved major studies in the scientific discipline.

This sounds too "top down". The best work is done organically -- by a curious individual, either testing a result or doing original experiments.

This may sound like a leap, but the problem is too much centralization. Big universities crowd out smaller schools in the States, the feds use agency grants to control funding for science, and the old journals act as gatekeepers for publication. The latter problem is being somewhat obviated by the Internet, but still exists. Access to these journals is restricted, as is commentary on studies and experiments. And (here's the leap) it is the Fed Reserve Bank - Federal Government debt nexus that allows centralization to occur. It makes the Fed Gov the fountain of money from which most must drink. Even private R & D is affected by the flow of Fedgov money, as well as by the investment banks who profit from the FRB-FG nexus and use those profits to fund consolidation of industries into giants that in turn dominate the infosphere.

I am in agreement with what you are calling replication. The veracity of science depends not on some vote of "truth", but on the robustness of the falsification process. Popper's Critical Rationalism is an excellent philosophy of science, where nothing is "true"; it is either falsified or not (yet) falsified. E.g., Newton's Mechanics was falsified outside its domain, causing the advance that became known as Special Relativity. SR has been falsified in some situations. Bridgman's Operationalism is improving the social sciences by making experiments auditable and repeatable. "Accepted science" doesn't exist. What exists is a body of work that attempts to falsify a result and fails. The larger that body of work, the greater the reliability of the theory, as even falsification outside a domain can improve application by restricting it to the domain where it has survived numerous falsification attempts.

Anyway, the main problem that I see is centralization -- a top down approach to science. The problem of information overload is solved organically by different species of scientist. Some are highly specialized and some are cross discipline. Meta-analysis is useful if it is subject to the discipline of Operationalism, but, as has been said, it is highly misused and abused by agenda driven funding. It is the centralization of funding that is the root cause of the problems.

Anthem  posted on  2017-07-01   9:18:59 ET  Reply   Trace   Private Reply  




Please report web page problems, questions and comments to webmaster@libertysflame.com