Joseph Mangano seems happily surprised that people once again are falling for his junk science.
This time the title of the CounterPunch article starts with the rather cynical "Let the Counting Begin", followed by "Fukushima's Nuclear Casualties". It is just a calculation exercise for Joe, and it could have been an interesting one if it weren't for the fact that:
he is counting dead people in Japan during 2011, claiming that the cause of death for 38,700 of them is unexplained, with the implication that radioactivity from Fukushima is the cause, and
a closer scrutiny shows that once again he is handling the data in a very irresponsible way in order to push his own anti-nuclear agenda.
The recently launched Nuclear Literacy Project has a welcome entry: a person with knowledge of radioactivity and nuclear engineering reports on a visit to an anti-nuclear seminar with Helen Caldicott. The person, PhD student Kallie Metzger, entered the meeting with some hope of a good discussion, with room for incorrect statements to be straightened out.
What Kallie found, however, was that the renowned anti-nuclear activist was more keen on scaring people into thinking like herself, and questions from the audience were responded to in a hostile and arrogant manner, if at all.
After watching a few videos featuring Helen Caldicott, including her infamous TV debate with George Monbiot from last year, we are, unfortunately, not surprised by her behaviour. The good news is that Kallie went to listen to Caldicott and reported about it. We need more people like Kallie who attend these kinds of meetings and try to raise relevant questions when remarkable claims are being made. If Caldicott continues her tour in the same arrogant manner, her audience should diminish rather quickly down to the die-hard fans of her outrageous claims.
So the hero of the week is Kallie Metzger. Read her account of the Caldicott seminar here. Then ask yourself: will you be our next hero?
Once again the would-be world savers Janette Sherman (MD) and Joseph Mangano (something) are pushing for another round of scaremongering dressed in a scientific coat. They have got their nonsense about increased US infant mortality due to Fukushima published in a peer-reviewed journal. This time they have extended their faulty study and extrapolated the effect to the entire US. Lo and behold, 14,000 deaths so far, they claim! The article, published in the International Journal of Health Services, can be found here. For somewhat easier reading, the press release here will probably do.
We will not spend too much time scrutinizing this study, and others are already on to it, for instance Michael Moyer in Scientific American and Barbara Feder Ostrov in Reporting on Health. Furthermore, S&M have not made any amends for their first two faulty attempts (our comments here and here), and since the new article follows the same line of reasoning, we can only condemn them for trying to pull the same lousy trick a third time. This is political activism from anti-nuclear icons; it is not science.
From the media releases about this, we find some interesting statements by Mangano in Medpage Today:
In a telephone press conference, Mangano said the finding is a "clarion call for more extensive research."
But he told MedPage Today that the researchers can't rule out factors other than the Fukushima radiation that might have accounted for the excess.
"There are probably a variety of factors that could be linked to this excess of 14,000 deaths," he said. "But it does raise a red flag."
This is indeed a clarion call. It is a call for celebrities like Alec Baldwin and Christie Brinkley to start contemplating what kind of nut-cases they support financially. And it does raise a red flag: the umpire raises the red flag after three strikes. Sherman & Mangano, you're OUT!
Other posts in various media on the same subject
2011-12-16: Eric McErlain on NEI Nuclear Notes, "Note to Reporters: Be Sure to Fact Check Joseph Mangano, Janette Sherman and Robert Alvarez"
2011-12-19: Eric McErlain on NEI Nuclear Notes, "Joseph Mangano Contradicts His Own Press Release on Fukushima Research"
2011-12-20: Rod Adams on Atomic Insights, "Mangano and Sherman have released another bogus study seeking to scare people about radiation"
2011-12-20: Barbara Feder Ostrov in Reporting on Health, "Fukushima: Alarmist Claim? Obscure Medical Journal? Proceed With Caution"
2011-12-20: Michael Moyer in Scientific American, "Researchers Trumpet Another Flawed Fukushima Death Study"
2011-12-20: Will Davis on Atomic Power Review, "Radiation deaths in US due to Fukushima Daiichi: Nope."
2011-12-21: Barbara Feder Ostrov in Reporting on Health, "Fukushima Fallout and Infant Deaths: International Journal of Health Services' Vicente Navarro Responds"
2011-12-21: Linda Carroll on the MSNBC Vitals blog, "Experts discount claims of U.S. deaths from Japan radiation"
2011-12-21: Nuit Blanche blog, "Pre-publication Peer Review and Lazy Science Reporting"
2011-12-23: Eric McErlain on NEI Nuclear Notes, "Dr. Robert Emery Disputes Joe Mangano's Findings on Radiation and Fukushima"
2011-12-23: Eric McErlain on NEI Nuclear Notes, "Dr. Robert Peter Gale's Statement on the Mangano-Sherman Report on Fukushima Fallout"
2012-01-08: Alfred Körblein in Strahlentelex Nr. 600-601, scrutinizes the study and finds serious flaws, "14.000 Tote in den USA?" (in German)
2012-01-11: Josh Bloom in Forbes, "Garbage In, Anti-Nuclear Propaganda Out: The 14,000 Death Fukushima Lie"
Warning: The following text may contain personal attacks and wild speculations about certain people. This will not make any future attempts at dialogue with them any easier, but in my humble opinion they have had their chances.
By concentrating only on the CDC (Centers for Disease Control) data – incomplete at best – and ignoring the on-going radioactive releases from Fukushima, it is apparent that the pro-nuclear forces are alive and active.
Hey, wait a minute. It was not the pro-nuclear forces (whatever that is, but let's embrace the term with a jolly "Fooorward!") who started tampering with the CDC data in a way that would flunk any undergraduate student in Statistics 101; it was you, remember? This does not mean that we ignore the rest of the issue, but we do take offence when anti-nuclear forces fail to use the information from Fukushima to their advantage and instead have to cook up alarmistic results in order to make the situation look worse than it is. This is indeed very remarkable: aren't the actual events in Fukushima bad enough for you?!?
If I had the mindset of S&M, I would write
By mis-treating the CDC data - incomplete at best - and ignoring all knowledge about radiation effects, and actual radiation levels in the US due to Fukushima, it is apparent that the anti-nuclear scaremongers are alive and active.
This is clearly not a way forward, at least not if you hope for a dialogue and an improvement of the nuclear debate. Oh well, let's move on.
The second section explains that the titles of the previous articles (there are two versions, one in CounterPunch and one in San Francisco Bay View) include question marks in order to
stimulate interest and prompt demand for governments – Japan and the U. S. at least – to provide definitive and timely data about the levels of radioactivity in food, air and water.
Hm, Janette and Joe. May I kindly ask: wouldn't it be better to try to stimulate this interest in a way that does not include cherry-picking, unfounded alarmism that scares the heck out of millions of parents of infants, and throwing random pieces of data around in a way that should reduce whatever credibility you might have had before to a new low point?
We received many responses, some in support of our concerns and some critical about how we used CDC data, including outright ad hominid attacks accusing us of scaremongering and deliberate fraud.
Oh really? I guess that I personally have to plead guilty to this charge, but they link to the blog entry by Michael Moyer in Scientific American as an example of an ad hominid attack. I re-read Moyer's scrutiny of the CounterPunch article, but fail to see any personal attacks there. Ok, he uses words like "scaremongering", "froth up", "data fixing", "critically flawed - if not deliberate mistruths". Still, Moyer attacks S&M's actions, not their personal traits.
Or do S&M really mean ad hominid attacks? My first assumption was that they mixed up "ad hominem" and "ad hominid", but maybe they do know the difference? I bring the rest of us up to date by quoting the text on this link: "The former [ad hominem] is a criticism of a particular person; the latter [ad hominid] is a commentary upon a species." So, have Moyer, myself, or anybody else involved in this issue, referred to S&M as neanderthals, platypus, or similar? Not that I can see, but it could be an interesting path to digress upon. Anyhow: Sensitive bunch, those scaremongers...
Now it becomes interesting:
Given the fallibility of humankind, we may have erred, and if so, will admit it. Given the delay in collecting data and the incompleteness of the collection, the criticism may be valid. MMWR (CDC’s Morbidity and Mortality Weekly Report) death reports have certain limits – representing only 30 percent of all U.S. deaths. They list deaths by place of occurrence, while final statistics are place of residence and deaths by the week the report is filed to the local health department, rather than date of death. Finally, some cities do not submit reports for all weeks. The CDC data are available at http://www.cdc.gov/mmwr/mmwr_wk/wk_cvol.html.
So, do they admit that they may have erred, or do they say that they may admit it if it is proven to be the case? Or...? I am not sure, but this is probably as close to admitting anything as they will ever get. It is of course not their fault; it is the limitations of the CDC data that we should blame, with not a second thought about their method.
My first interpretation from reading the section is that they admit that the statement in the first articles about a statistically significant 35% increase in infant deaths may not be correct (no matter who is to blame). Unfortunately, it turns out that I am severely mistaken in my interpretation:
Since the article was originally published, we have had the chance to further analyze the CDC data. Historically, the change in infant deaths for the previous six years in eight Pacific Northwest cities from weeks 8-11 (pre-Fukushima) to weeks 12-21 (post-Fukushima) is about 6 percent – never above 11 percent. But in 2011, the change was 35 percent, far above anything ever experienced.
The same eight cities, the same comparison – four weeks 8-11 vs. 10 weeks 12-21 infant deaths:
2005 +4.1 percent
2006 +10.0 percent
2007 +5.1 percent
2008 +5.5 percent
2009 +2.8 percent
2010 +10.9 percent
The average for 2005-2010 is + 6.1 percent for a total of 1,249 infant deaths.
2011 +35.1 percent (162 infant deaths)
Before 2005, there were missing data. But the years 2005 to 2010 are about 98 percent complete.
Argh! Now I really do want to turn ad hominid on these people, whatever species (sparsis timoris?) they may belong to! First they say that maybe they were wrong, and if so they maybe will admit it. Then they go on to do more "analysis" of the CDC data base, using the same lousy way of handling the data!
There has been plenty of text here, so let's lighten up with a few plots showing the CDC data that S&M are mistreating. The plots from my previous posts about S&M should be enough to show that this is rubbish, but one more round with the CDC data base will not hurt, in case somebody still believes in S&M's fables. We start with plotting the change in infant deaths between weeks 8-11 and weeks 12-21 for the years 2005-2011, i.e. the ones that S&M now claim to have analyzed more carefully.
The blue squares show the data as given by S&M. Nothing wrong with the data: this is what you get when you apply the same treatment as S&M, weeks 12-21 give a higher weekly infant mortality rate than weeks 8-11 by a few percent every year. Except for 2011, where the 35% increase looks really alarming. But we know from before that they have cherry-picked the weeks by only using four weeks before Fukushima and 10 weeks after. If we also plot the ratio of weeks 12-21 over weeks 1-7, shown as red diamonds, we get the following trend:
Quite interesting that there is a decreasing trend, and that it reaches its lowest level this year: a 20% reduction, whereas before there was usually a slight increase (the 30% for 2005 is almost as much as the 35% that S&M have made so much noise about). In other words, if we compare the weeks after Fukushima with weeks 1-7, there seems to be a very beneficial effect on infant mortality. Could it be that hot particles are beneficial for infants? This is rubbish, of course, but so is the question asked by S&M.
Still, why the drastic increase this year if they only used four weeks for the years 2005-2010 as well? And why do the results look completely opposite if we compare with weeks 1-7 instead of weeks 8-11? The answer is: statistical variation. We are watching random noise, not real trends due to a single cause that can be easily deduced. To see this we need to look at longer time spans (for a start; we probably need to do a lot more as well, but let us not confuse the S&M fans by introducing too much real scientific reasoning). But in order to understand the discrepancy, we have to remember that they present the ratio between the two time periods, not the actual numbers. In the original articles S&M talk about increased infant mortality (you may remember that the articles start with "U.S. babies are dying at an increased rate."). Let's take a look at this.
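The sensitivity to the choice of baseline weeks is easy to demonstrate with toy numbers. The sketch below uses made-up weekly counts (a flat 10 deaths per week, with a dip to 7 during weeks 8-11), purely to illustrate the arithmetic; the real figures come from the CDC MMWR tables discussed in the text.

```python
def percent_change(weekly, before, after):
    """Percent change in the weekly death rate between two inclusive week ranges."""
    def rate(first, last):
        n_weeks = last - first + 1
        return sum(weekly[w] for w in range(first, last + 1)) / n_weeks
    r0, r1 = rate(*before), rate(*after)
    return 100.0 * (r1 - r0) / r0

# Toy data: a flat 10 deaths/week for weeks 1-21...
weekly = {w: 10 for w in range(1, 22)}
for w in (8, 9, 10, 11):
    weekly[w] = 7          # ...with a random-looking dip in weeks 8-11

print(percent_change(weekly, (8, 11), (12, 21)))   # ~ +42.9, "alarming"
print(percent_change(weekly, (1, 7), (12, 21)))    # 0.0, nothing happened
```

Against a short baseline that happens to fall in a dip, the very same weeks 12-21 show a "dramatic" 43% increase; against the longer weeks 1-7 baseline there is no increase at all.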
Now we see something interesting. Not only is the number of infant deaths for weeks 8-11 at an all-time low during 2011, the number of infant deaths for weeks 12-21 is also at a very low level! U.S. babies are dying at an increased rate? I think not! S&M are shamelessly (if they really believe in their own conclusions then they are indeed very incompetent) showing relative numbers, not absolute numbers. That is perfectly ok if you are honest about how you handle the data, can refrain from cherry-picking the weeks and the areas, and do not suffer from an alarmistic version of Tourette's syndrome. But S&M prefer to show a relative increase of 35% for this year, while the absolute numbers show a decrease compared with earlier years.
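The relative-versus-absolute point can be made with two lines of toy arithmetic. The counts below are invented for illustration only; they are chosen so that the weekly-rate changes mimic the quoted percentages while the totals move the other way:

```python
# Hypothetical (before, after) = total deaths in weeks 8-11 and weeks 12-21.
years = {2010: (28, 77), 2011: (16, 54)}

results = {}
for year, (before, after) in sorted(years.items()):
    # weekly rate: 4 weeks before, 10 weeks after
    rate_change = 100 * (after / 10 - before / 4) / (before / 4)
    results[year] = (rate_change, before + after)
    print(year, f"{rate_change:+.1f}%", "total deaths:", before + after)
```

The 2011 row shows the larger "increase" (+35% versus +10%) despite clearly fewer total deaths: an unusually quiet baseline period inflates the ratio.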
Still not convinced? Let us include weeks 1-7, weeks 1-21, and weeks 1-52 for each year. Please note that the value for 2011 in the weeks 1-52 curve (black line) only includes the data for weeks 1-25, so it can change somewhat depending on the number of infant deaths that occur during weeks 26-52 (S&M would probably be able to predict the future; I will not make any such attempts).
If anybody wants to pursue the idea that there is a drastic increase of infant deaths in northwest U.S. after Fukushima, that person will have many things to explain, for instance the drastic increase in infant mortality this year for weeks 1-7. The data does not support the idea of increased infant mortality due to Fukushima, no matter what S&M say.
One more quote from the article by S&M:
We acknowledge that many factors can cause infant deaths, but the critics who ignore Japanese fallout as possible contributing factors are acting irresponsibly.
So, cheating with the CDC data is a more responsible way to act? Nobody is ignoring the fallout. We are just fed up with false claims wrapped in an alarmistic package. Therefore I will, like many others, once again ignore everything else that is written in the S&M article; a mixture of some valid concerns that are heavily diluted with half-truths, advertisements for the Chernobyl book (edited by Sherman) and other reports (written by Mangano), scaremongering, claims about lies and cover-ups, and some more nonsense claims. It would be interesting to discuss the valid concerns that are raised, but if we have to filter them out of a sea of myths every time, we would rather spend our time elsewhere.
Janette Sherman and Joseph Mangano: We do care about the consequences of Fukushima; many of them will surely be serious. Many of us are also annoyed by the media's lack of interest in writing properly about the present status. But we do not subscribe to your way of portraying it, as long as you show yourselves unable to stick to the truth. Concerned citizens have nothing to learn from you. We do see the elephant in the room, while you try to make it into a mammoth. May your methods and your dishonest claims suffer the same fate as that extinct species.
One final quote. At the beginning of the article there is a picture of a pretty baby, with the following figure caption:
Not only is it unthinkable to put our babies at risk by continued use of nuclear power plants, but infant mortality is an indication of an entire population’s health. When an unusual number of babies are dying, we are all at risk and must take a stand.
S&M must have plotted all the data from 2005 until now. If not, here I have done it for them:
Ok, I have plotted all the years on top of each other, which is a bit messy. A better alternative would be to plot the years after each other, as done in this figure by Alexey Goldin in his entertaining statistical analysis of the S&M hoax. One may also try to see seasonal trends by plotting mean values over a number of years. Either way, S&M still owe us an explanation for their claim of an unusual number of babies dying. Can we close this chapter now, please?
Ad hominem/hominid attacks
I am starting to ask myself, are there Cliffs Notes for epidemiology? Is this how Mangano passed his courses for the MPH degree? Since my first encounter with Cliffs Notes about 20 years ago as a foreign exchange student in the U.S., the company seems to have expanded its activities substantially and is now also available on the internet. At that time I was only aware of Cliffs Notes that cover novels that you are expected to read in school. If you are too lazy to read the novel, the Cliffs Notes give a summary of the book, typical issues and questions that are likely to be on the exam (or good to be aware of if you want to pretend that you read the book and do not want to get expelled from the book-reading club), and a few other shortcuts for the illiterate who still want to graduate from high school. I find no Cliffs Notes for epidemiology, but there are indeed Cliffs Notes on statistics! Well Joe, if this is how you made it through college, please go back and read the following part:
In my first post on this subject the title was Shame on you, S&M! I should probably reconsider this title. If they had done this wilfully then I would stand my ground, but after reading their last article I get more and more convinced that it is just incompetence. They truly believe in what they are doing, and because they are victims of the Dunning-Kruger effect there is no way to make them understand that somewhere along the road they lost contact with reality. And just like Helen Caldicott, who in her debate with George Monbiot said "doctors can't lie", they are convinced that they speak from a higher moral ground.
So to tell somebody to be ashamed because they are incompetent in their field is about as useful as telling my 3-year-old daughter to be ashamed for not knowing Bulgarian grammar. The difference is that my daughter has a good chance of picking it up in a couple of years, if she would like to. For S&M, who have claimed expertise in their field for a long time, I see no hope at all. Maybe, just maybe, if they read Fooled by Randomness by Nassim Nicholas Taleb, or some similar literature. They could really pick up some good lessons from a few chapters there. But it wouldn't work; as far as I know, Taleb's books are not available in a Cliffs Notes format.
/Mattias Lantz - member of the independent network Nuclear Power Yes Please
In the first post about S&M (here) I was criticized by a commenter for stating that Mangano has a track record of not handling data in an honest way, but I had not given any reference or link to back up this statement. That has been corrected and I have put two links in that post. But from now on it will be much easier. Three nonsense articles by Joe Mangano in slightly over two weeks, all three based on cherry picking. That is a track record as good (hrm, bad...) as any. To make it worse, good ol' Joe has proudly put them on the RPHP web page:
And still no reaction from CounterPunch regarding their strange analysis
I have written twice to Alexander Cockburn, but have received no response. Instead there are a number of new articles critical of nuclear power of all kinds. Fine with me, but the lack of interest in getting the strange re-analysis by Pierre Sprey corrected makes me wonder about the statement "Ours is muckraking with a radical attitude". CounterPunch is certainly full of articles with a lot of attitude, but the muckraking seems to be missing.
Muck-raking is a journalistic activity with a proud history that, since the days of Ida M. Tarbell and Jacob Riis, has exposed cases of fraud, social injustice, conspiracies, environmental pollution, and other inconvenient truths to the public. On a number of occasions it has led to changed laws and policies, and to the end of political careers when somebody's darker sides have been exposed. One important aspect of this activity is to check the facts carefully in order to get them right. Otherwise the muck-raking turns into cheap sensationalism aimed at selling a few extra copies. Every country has its own collection of this less honourable tradition that stems from the yellow press days of William Randolph Hearst and Joseph Pulitzer. However, the people behind the bi-weekly newsletter CounterPunch proudly refer to themselves in the following way:
Ours is muckraking with a radical attitude and nothing makes us happier than when CounterPunch readers write in to say how useful they've found our newsletter in their battles against the war machine, big business and the rapers of nature.
In a follow-up editorial on the Sherman-Mangano study (link to the original article here), CounterPunch editor Alexander Cockburn explains that they received plenty of critique after publishing the article; several readers suspected cherry-picking. So they had their "statistical consultant", Pierre Sprey, go through the data. And indeed, he found that there was no ground for the claims by Janette Sherman and Joe Mangano when one includes a longer time span for the period before Fukushima. By increasing the period from the four weeks that happen to fall in the dip to ten weeks, the relative increase in infant mortality after Fukushima disappears. So far so good; their check of the data verifies the conclusion that I and others independently made (my version here). Now comes the interesting part. Cockburn says it best himself:
But then Sprey went further and looked at the Sherman/Mangano selection of eight cities from the 122 reporting to CDC: the eight were Berkeley, Portland, Sacramento, San Francisco, San Jose, Santa Cruz, Seattle and Boisie. Apparently, they selected Pacific Coast cities that were more or less within 500 miles of the coast and north of Santa Cruz. However their selection did not include all CDC cities within this categorization, because they left out Tacoma and Spokane, thus leaving themselves open to suspicions of cherry-picking cities. So Sprey included Tacoma and Spokane in the data set he reviewed in order to be geographically complete. When Sherman and Mangano's overall selection of cities failed to produce a significant result for ten weeks before and ten weeks after March 11, 2011 (as well for the ten equivalent weeks in 2010 as compared with the same weeks in 2011) Sprey elected to look at smaller, geographically consistent groupings of cities. The results were striking. Simply by moving the boundary line northward from Santa Cruz Sprey found that the four northernmost Pacific Northwest cities in the CDC sample – Portland, Tacoma, Seattle and Spokane – show remarkably significant results – a larger infant mortality increase than the original Sherman-Mangano results. During the ten weeks before March 11 those four cities suffered 55 deaths among infants less than one year old. In the ten weeks after Fukushima 78 infants died – a 42 per cent increase and one that is statistically significant. To confirm once again that these results were not due to seasonality Sprey compared these infant deaths in the ten weeks after Fukushima to the deaths in the equivalent ten weeks a year earlier. The results were almost identical with the ten weeks before Fukushima in 2011. Within the equivalent ten weeks of 2010 53 infants died in these four cities.
Imagine my surprise. I had been playing with the data a bit, and had also checked what happens if you include Spokane and Tacoma (see the forum post for details). My conclusion was that Spokane and Tacoma did not matter, while Sprey's re-analysis shows a 42% increase! I must have made some error, right? There were several steps when I copied the information into a spreadsheet, and it was quite tedious to get it into the format that I wanted, so there were many possibilities for mistakes. So I made a few random double checks and could not find any error, but it would be very time consuming to go through the data for every week again. I also checked the published errata on the CDC pages to see if I had missed some vital correction. But then I found a quicker way on the CDC web pages: it turns out that one can also download the data for individual cities or regions directly for the entire year (here). So I extracted the data for the four cities from this link; this way I would get an independent check of my earlier results. Anyone can check it themselves on the link above. Here is my table:
The table includes weeks 1-23 for 2011. If I understand what Cockburn writes correctly, Sprey has used weeks 2-11 for the time period before Fukushima (compared with the four weeks used by Sherman and Mangano) and weeks 12-21 for the time period after Fukushima (i.e. the same as Sherman and Mangano). So, with the use of a very complicated mathematical operation, called addition, I get 59 for the period before Fukushima, and 53 for the period after. This is identical to my earlier count. As a final check I asked another member of NPYP to make an independent extraction of the data from the CDC data base and then perform the mathematical operation mentioned above. Once again identical results. Sprey got 55 and 78; close enough on the first one, but the second...
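As an aside, the statistical weight of the two sets of numbers can be checked with an exact conditional test: assuming a constant underlying death rate and equal ten-week exposures, the "after" count is binomially distributed with p = 0.5, given the total. The test choice is mine, not Sprey's; only the standard library is used:

```python
from math import comb

def binom_tail_geq(n, k, p=0.5):
    """Exact P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Sprey's figures: 55 deaths in the ten weeks before, 78 in the ten weeks after.
p_sprey = binom_tail_geq(55 + 78, 78)
# Our recount from the CDC tables: 59 before, 53 after.
p_recount = binom_tail_geq(59 + 53, 53)

print(round(p_sprey, 3))    # around 0.03: would be significant, if the counts were real
print(round(p_recount, 3))  # well above 0.5: no sign of any increase at all
```

In other words, 55 versus 78 would indeed be statistically significant; the problem is that the CDC tables do not appear to contain those numbers.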
Unless I am missing something vital, the numbers in the table above speak for themselves: there is no dramatic increase for these four cities, so something must have gone seriously wrong in Sprey's re-analysis. Actually there are more strange things. Cockburn writes that for the 8 cities the new analysis gave "an increase of infant deaths of only 2.4 per cent" after Fukushima, while my analysis gave a 14% decrease (it is not statistically significant, but if I were pro-nuclear in the same way that Sherman and Mangano are anti-nuclear, I would of course argue that Fukushima has caused a reduction in infant deaths in the northwestern U.S., and that I had the numbers to show it). Could it be the addition part that failed, or does wishful thinking of the style "there must be an increase in infant mortality somewhere due to Fukushima" play a part? Whatever the cause, Cockburn explains the significance of some details in the data from Sprey:
Looking a little more closely at the time trend of the infant deaths after Fukushima, Sprey found that the most dramatic increases in deaths were in the two weeks right after the March 11 disaster. Those two weeks saw a near tripling of weekly deaths, followed by a period of somewhat elevated weekly deaths lasting for about five weeks – roughly 25 per cent over the pre-March 11 rate, then settling down close to the average pre-Fukushima death rate for the last three weeks of the ten week period post-disaster. These results are necessarily approximate because the weekly sample of deaths is too small for precise statistical conclusions.
This part is of course nonsense when we look at the numbers. Let's plot it as well:
Well, as Cockburn says, there is a "dramatic" increase immediately after Fukushima, by a factor of 3. Now that we have the numbers ourselves, we can conclude that a factor of 3 means a jump from 3 on week 11 to 9 on week 12. We have a similar increase in numbers between weeks 5 and 6, but Cockburn does not describe that one as dramatic; in fact he does not mention it at all. In his defense, the numbers he is looking at are not the same as mine, but where do they come from? Whatever the cause of the error, the dramatic 42% increase now looks more like a...decrease; the mean values are shown as horizontal lines (blue before Fukushima, orange after) in the plot above. Let's plot the data again to make sure. We can do it in the same way as Mangano does, to make it clear for everyone:
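Jumps like 3-to-9 are exactly what small counts do on their own. A quick simulation sketch (assumption: independent Poisson weeks with a mean of 5 deaths, roughly the level in the table) shows how often a 21-week stretch contains at least one week-to-week jump by a factor of 3 or more:

```python
import math
import random

rng = random.Random(42)   # fixed seed, for reproducibility

def poisson(lam):
    """Poisson draw via Knuth's multiplication method (fine for small means)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

trials, hits = 2000, 0
for _ in range(trials):
    weeks = [poisson(5.0) for _ in range(21)]   # one simulated "year"
    # any consecutive pair where the later week is at least 3x the earlier one?
    if any(a >= 1 and b >= 3 * a for a, b in zip(weeks, weeks[1:])):
        hits += 1

frac = hits / trials
print(frac)   # typically around 0.8
```

So in roughly four out of five purely random 21-week stretches you can find a "dramatic tripling" somewhere, with no Fukushima required.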
This is, of course, a very dishonest way of plotting things if you want to show the whole picture, but if Joseph Mangano can do it, then why shouldn't we? The main point is clear anyhow: there is no increase in infant mortality in the United States of America due to Fukushima. Got it? And if there is, we will not find it out through sloppy analysis by charlatans like Sherman and Mangano. And, as it seems, not through the statistical consultants that the muck-raking journal CounterPunch is using. It will take careful analysis by serious researchers to find out if there is any real effect. If they would bother to start looking. But Mangano said in the Fox News interview
this is a red flag to raise for more studies to be done
Actually, he is right. It is a red flag raised for careful scrutiny of all the earlier work by Sherman, Mangano, and their alarmist friends. Some of their earlier studies have become "common knowledge" in anti-nuclear groups. Last year a member of the Green Party in Sweden stood in the parliament during a debate about nuclear power and referred to studies of increased childhood leukemia authored by Sherman and Mangano, and of course to the Chernobyl book edited by Sherman. Maybe we all can move on with life and more pressing issues now, ok? Those who want to hang on to all the bad things about Fukushima still have plenty of material to work on. But stick to the facts, please. Ok? Two questions remain:
Why is Alexander Cockburn's editorial so forgiving towards Sherman and Mangano? If I were the editor of a journal, I would be furious if it turned out that some authors had made fools of themselves, and of me as an editor, by cheating with data, and I would make sure that they never published anything in my journal again. Ever. Especially when it comes to such an important issue that worries millions of people, not least parents of small children. But Cockburn is so happy to have found, through Sprey's re-analysis, that there was indeed an increase: Sherman and Mangano were right even if they cheated, so no shadow should fall upon them. After all, CounterPunch is a muck-raking newsletter with a radical attitude, so there must be some muck to find, and indeed they found it. But now that it turns out that not even this was right, what will he write in the next editorial? "We follow proudly in the sensationalist footsteps of William Randolph Hearst!", or what? We have already established that Janette Sherman and Joseph Mangano should be very ashamed of themselves. If I were Alexander Cockburn, I would at least be quite embarrassed.
What went wrong in Sprey's re-analysis?
An email has been sent to Alexander Cockburn, requesting that they do a muck-raking investigation of the skills of their statistical consultant. Furthermore, CounterPunch now has a great opportunity to recover its lost credibility: how about a couple of articles with in-depth investigations of the earlier works of Janette Sherman and Joseph Mangano? This could be the starting point of a long series of muck-raking articles where all the controversial statements from the anti-nuclear icons are carefully scrutinized. Could it be that there are more "common truths" out there that are based on the same weak evidence (i.e. none) as in the present case? Alexander Cockburn, are you up to the task?
Mattias Lantz - member of the independent network Nuclear Power Yes Please
P.S. What about Seattle? Some observant persons may have noticed from the table above that Seattle does have an increase in infant mortality by a factor of 3 between week 11 (2 deaths) and week 12 (6 deaths). At least Seattle has an effect due to Fukushima, pretty please? Well, I'll give you the plot for Seattle:
6 cases the week that the radioactivity reached Seattle, compared to 2 the week before: that must be significant! If you still insist on this kind of reasoning, it means that you have somehow ignored the plots I show here and here. Still not convinced? Then go to the CDC data base and pick out the numbers for yourself, and please do not forget to check the pattern for Seattle week by week for earlier years. Here are the links to the CDC data base; instructions for some of them are shown if you place the pointer over the link. I recommend starting with the last one, as it is the easiest to handle:
The previous post was written in a rush, and somewhat in anger. Here is an attempt to explain better.
It seems as if it wasn't enough for Joe Mangano to spread fear in the northwestern part of the U.S., so he decided to play the same game for Philadelphia, PA. One reason may be that relatively high levels of Iodine-131 in water have been recorded there recently. This is of course a reason for concern, and several explanations have been suggested (see for instance here(1), here(2) and here(3)).
We leave that discussion aside and focus on Mangano's new claims. For Philadelphia he suggests that a 48% increase in infant mortality is due to Fukushima. Below are the very detailed graphics (n.b.: irony) of the weekly rate of infant mortality before and after the radiation from Fukushima reached Philadelphia. The picture is a screen dump from the Fox News interview.
Please note that it is not enough for Mangano to cherry-pick the data; he is also keen on making the increase look very high by cutting the scale at 4, so that the "After"-pillar looks three times larger than the "Before"-pillar. Oh well, let's not stay sore over that; back to his claims.
Mangano's "Before"-pillar corresponds to weeks 7 to 11 (black boxes). For the northwest, Sherman and Mangano used weeks 8-11; now it apparently seemed important to bring down the average value by adding week 7, diluting the high value for week 8. The average becomes 5.0, to be compared with the ten following weeks (weeks 12-21), which have an average of 7.4. This is a 48% higher value, just as Mangano says.
But once again, if we look at the first 6 weeks of the year, we find that the average for that period is 9.5! And the average value for the first 21 weeks of 2011 is 7.42, i.e. slightly higher than the alarming "48% increase" level that gave Mangano a few minutes of fame on TV. In other words, Joe is trying to take us for a ride again. Shame on him.
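For those who want to check the arithmetic themselves, here is a small Python sketch. The weekly averages are the ones quoted above; the function itself is just my illustration of how the choice of "before" window drives the whole result:

```python
def percent_change(before_avg, after_avg):
    """Relative change of the 'after' average vs the 'before' average, in %."""
    return 100.0 * (after_avg - before_avg) / before_avg

# Weekly-average infant deaths for Philadelphia, as quoted in the text
weeks_1_6 = 9.5    # start of 2011, conveniently left out of Mangano's plot
weeks_7_11 = 5.0   # Mangano's "Before" window
weeks_12_21 = 7.4  # Mangano's "After" window

print(percent_change(weeks_7_11, weeks_12_21))  # Mangano's alarming +48%
print(percent_change(weeks_1_6, weeks_12_21))   # vs early 2011: a decrease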
During the interview with Fox News he says, among other things:
The real benefit is that this is a red flag to raise for more studies to be done.
Is this a fluke or is there some other reason? We'll see, but we can't rule out Japan. It's too...too distinct.
The talk about a red flag could be a valid argument if the data were more convincing. But the data only become "...too distinct" if we allow Mangano to play his tricks with them, i.e. by not showing a longer time trend before Fukushima. There is indeed a red flag to raise: a warning flag that Joseph Mangano is not a man to trust in these matters. And if he has done this, what says that any of his earlier studies were performed in a more honest way?
During the Fox News interview Mangano also claims that the CDC data show a decreasing trend for the corresponding weeks during the previous six years, and that now, after Fukushima, we have a peak instead. Based on what we have seen above, this is a meaningless statement. My intention was to double-check all the data for the last six years, but it takes some time to extract the data from the CDC data base and I have better things to do. Therefore I only show the data for 2010, with the same weeks marked for comparison.
The mean values for the different time periods are:
Weeks 1-6: 6.5 (9.5 for 2011)
Weeks 7-11: 5.6 (5.0 for 2011)
Weeks 12-21: 6.1 (7.4 for 2011)
Weeks 1-21: 6.1 (7.43 for 2011)
Weeks 1-52: 6.7 (answer for 2011 will come in January 2012)
So, just by comparing weeks 12-21 between 2010 and 2011 we see that yes, there is an increase for 2011, and Mangano may be correct about a decreasing trend for the years 2005-2010. But considering the great variation in the data over the year, this does not indicate anything. There are many other things that Mangano would have to explain as well, for instance why a weekly average of 7.4 infant deaths is of concern while 9.5 is not. Ah, silly me, a relatively high value is only important if there has been a nuclear accident during that time. Who am I to question that? 😉
Mattias Lantz - member of the independent network Nuclear Power Yes Please
Update 25 June 2011
Several persons have asked about how to get access to the raw data. I put a summary of the links I have used on the follow-up post regarding the strange results from the CounterPunch re-analysis of the data (here), but I will now put them here as well. The last link on the list is the one that is the easiest one to use. Some information will appear if you hold the pointer over each link:
The article, published on 10 June 2011, is authored by Janette D. Sherman and Joseph Mangano, both renowned persons in the anti-nuclear movement. In the text the authors claim a statistically significant 35% increase in infant mortality after the Fukushima accident in eight selected cities on the U.S. west coast:
The recent CDC Morbidity and Mortality Weekly Report indicates that eight cities in the northwest U.S. (Boise ID, Seattle WA, Portland OR, plus the northern California cities of Santa Cruz, Sacramento, San Francisco, San Jose, and Berkeley) reported the following data on deaths among those younger than one year of age:
4 weeks ending March 19, 2011 - 37 deaths (avg. 9.25 per week)
10 weeks ending May 28, 2011 - 125 deaths (avg. 12.50 per week)
This amounts to an increase of 35% (the total for the entire U.S. rose about 2.3%), and is statistically significant. Of further significance is that those dates include the four weeks before and the ten weeks after the Fukushima Nuclear Power Plant disaster.
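Sherman and Mangano do not say which statistical test they used, so as an illustration only, here is a Python sketch of an exact conditional Poisson rate test on the counts they quote (37 deaths over 4 weeks versus 125 deaths over 10 weeks), assuming independent weekly counts. This is my own choice of test, not theirs:

```python
from math import comb

def poisson_rate_pvalue(c1, t1, c2, t2):
    """One-sided p-value for rate(period 2) > rate(period 1), via the exact
    conditional test: given the total count n = c1 + c2, under equal rates
    c2 ~ Binomial(n, t2 / (t1 + t2))."""
    n = c1 + c2
    p = t2 / (t1 + t2)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c2, n + 1))

# 37 deaths in the 4 weeks before, 125 in the 10 weeks after (the quoted figures)
pval = poisson_rate_pvalue(37, 4.0, 125, 10.0)
print(f"one-sided p = {pval:.3f}")
```

Run it and judge for yourself how convincing the claimed significance is. And remember that this calculation takes the cherry-picked 4-week "before" window at face value; with a longer baseline the effect disappears entirely, as shown above.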
Furthermore, they try to link the releases of radioactivity from Fukushima and Chernobyl to the relatively high infant mortality rate in the U.S. A look at the data used by Sherman and Mangano does indeed seem to indicate an increase in the number of infant deaths in northwest U.S. after Fukushima, see the plot below:
The Fukushima events started on March 11, i.e. at the end of week 10. It then took slightly more than a week for the first release of radioactivity to reach the northwest part of the U.S. The data do show an increased infant mortality rate after Fukushima. The black line shows the average value for the 4 weeks before March 19, and the orange line shows the average value for the 10 weeks after that. The error bars on each data point indicate the statistical uncertainties.
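For readers wondering where those error bars come from: raw weekly death counts are usually treated as Poisson-distributed, so the 1-sigma statistical uncertainty of a count N is simply the square root of N. A minimal sketch (my own illustration, not from the article):

```python
from math import sqrt

def poisson_error(count):
    """1-sigma statistical uncertainty of a raw count, assuming Poisson statistics."""
    return sqrt(count)

# A week with 9 reported deaths carries an uncertainty of +/-3, so
# week-to-week jumps of a few counts are entirely expected by chance.
for deaths in (2, 6, 9):
    print(deaths, round(poisson_error(deaths), 2))
```

This is why a single week with a few extra deaths carries very little information on its own; only a sustained shift well outside these error bars would mean anything.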
But why are the 10 weeks after Fukushima compared with only 4 weeks before? There seems to be a reason for it, commonly referred to as cherry-picking: you select the data that support your theory without showing the full picture, because showing the full data set might falsify your claim. This is quite common in politics and among people who have an agenda that is more important than the truth. But here we have two persons in medicine, one Medical Doctor and one Master of Public Health; they should be trustworthy professionals who are keen on giving people honest information, right? Let's check their deck of cards more closely.
So, if we include data for, say, the first 7 weeks of 2011, we get a very different idea about the situation:
Very interesting: the first seven weeks of 2011 actually have a higher infant mortality than the weeks after Fukushima, quite different from what Sherman and Mangano want us to believe. There is no spike after Fukushima; instead there is a dip during the 4 weeks before! A more detailed report on the closer scrutiny of Sherman and Mangano's article is found in our Deep Repository.
So, why does a Medical Doctor mistreat official data in this way? It is quite remarkable, and embarrassing, especially since Janette Sherman writes about herself on her web page (http://janettesherman.com/about/):
Dr. Sherman’s primary interest is the prevention of illness through public education and patient awareness.
She seems to have forgotten about that primary interest in this case; I fail to see how cherry-picking data can be part of public education and patient awareness. And if anybody can see how you can prevent illness by scaring people with false statistics, then please explain it to me. Embarrassing, Janette Sherman...
What baffles me the most is that she and Mangano try to get away with this alarmist claim despite such a lousy handling of official data. Anybody can easily check it for themselves and see that Sherman and Mangano are wilfully interpreting the data to agree with their already decided view of things. What is worse, they are scaring a lot of people with their claims, for no reason at all. Therefore: shame on you!
/Mattias Lantz - member of the network Nuclear Power Yes Please
The Buzz Blog on Physics Central comments on the scrutiny done in Scientific American and asks why Sherman and Mangano are doing this nonsense: "Beware the Evil Scientists"
The uvdiv blog has a guest post by Alexey Goldin that hopefully is enjoyable also for non-statistics nerds, and he shows data for several years back: A curious case of cherry-picking data for the greater good. I can only agree with his final statement: "At this point it is worthwhile to question either the scientific integrity or statistical competence of Sherman and Mangano. They might be decent people and believe in what they say, but allow themselves to say "small lies" in a service of "Greater Truth". This never ends up well. Because they are likely to kill some unstable people with their small lies."