
What if an original research paper published in a respected international scientific journal declared that keeping men out of women’s sports or banning the gender transitioning of minors caused an increase in suicide attempts among transgender youth? Wouldn’t you expect them to show actual evidence of an increase? Well, you ought to. But they didn’t.

Article summary

Nature Human Behaviour is an offshoot of Nature, one of the longest-running and most respected names in scientific publishing. Launched in January 2017, it claims to be “dedicated to the best research into human behaviour from across the social and natural sciences.” [1]

On September 26, 2024, it published online the article “State-level anti-transgender laws increase past-year suicide attempts among transgender and non-binary young people in the USA,” which later appeared in the November 2024 print edition.[2] The article makes a bold claim: in the 19 states that passed “anti-transgender” laws between 2018 and 2022, there was a 7–72% increase in past-year suicide attempts among transgender and non-binary (TGNB) young people, beginning in the second year after enactment.

This bombshell enjoyed breathless, widespread, and uncritical coverage across much of the mainstream news media, including NPR, NBC, Time, CNN, MSN, and many others – with headlines declaring as fact that “more trans teens attempted suicide after states passed anti-trans laws.”[3]

So, did this research actually show an increase in suicide attempts after so-called “anti-transgender laws”? Not remotely.

Study methodology

The six coauthors of this paper all list as their primary affiliation “The Trevor Project,” an LGBTQ advocacy group founded in 1998 and based in West Hollywood, California. This organization has been exposed by National Review for hosting online forums where underage children freely engage in sexually explicit dialogue with anonymous adults.

The researchers compared survey results between states that did or did not pass so-called “anti-transgender” laws, defined as laws limiting women’s sports to women, requiring bathroom usage in accordance with anatomy rather than identity, or restricting medical experimentation on minors. Using a statistical technique called “difference in differences,” they reported significant increases in suicidality among states that passed such legislation versus those that did not.

The term “suicidality” covers a lot of ground. It could mean thinking about suicide, attempting suicide once, making multiple suicide attempts, or completing suicide. The researchers offer no data on completed suicides. This isn’t too surprising, considering that suicide victims can’t complete online surveys, but it is information one would want before reaching any profound conclusions. What the authors do claim is that there were more past-year suicide attempts in the study population compared to controls and that more subjects reported at least one past-year suicide attempt. Paradoxically, there was no statistical increase in seriously considering suicide (more on this later).

Critical analysis

1. Ideological verbiage

Throughout the report, one encounters the sort of verbiage more appropriate to Queer Nation than to a serious scientific journal. What’s disturbing is that the editors of Nature Human Behaviour somehow found it acceptable. Laws designed to protect women and children or to prevent state intrusion into the parent-child relationship are misrepresented as “anti-transgender.” Worse, those laws supposedly communicate “that their identities and bodies are neither valid nor worthy of protection,” which is abject nonsense. Transgender people are defined as those “who identify with a gender identity that is different from societal expectations surrounding their sex assigned at birth,” even though no one has ever had their sex “assigned” at birth. The authors repeat the trope that “gender-affirming healthcare has well-established benefits” even though no one has actually been able to prove any benefit, while the harms are very real (see the Cass Report).

2. Dubious data source

The research team did no field work and collected no data on suicide or attempted suicide. Rather, their exclusive data source is the “Trevor Project National Survey,” a non-scientific, non-random survey conducted annually by their host organization, with subjects recruited through social media. This alone should cast suspicion on the results, as online surveys are often unrepresentative and subject to bias.[4],[5] There was no means to verify the truthfulness of responses, and the survey was not longitudinal. Each annual survey, of which there were five, was a snapshot in time, meaning there was no way to follow individual respondents from one year to the next. The respondents could have been entirely different people every year, rendering claims of an “increase” immediately suspect.

Because the authors exclusively rely upon data from their own activist organization, the quality and reliability of the data cannot be assessed. Furthermore, the authors have declined to make their data publicly available, claiming it would “compromise research participant privacy” – although the survey results could readily be shared without the disclosure of personal identifiers.

3. Opaque methodology and misuse of statistics

I consulted a research statistician who has been affiliated with major universities. He notes that the “difference-in-differences” methodology is generally considered a reliable technique, but that it should be used with caution on longitudinal data such as this. More importantly, it is impossible to know whether the statistical methods used by the researchers were appropriate and valid. Moreover, “The paper provides no references on the technique or clear explanation on the model used, i.e., how DiD was applied to the analysis data set. Given the lack of clarity on what model was used, how it was applied to this data set, and limitations of the data set, I do not find these results credible.”

4. Trans-gressions of regression

For the difference-in-differences method to yield evidence of causality, there should be no other factors that differ between the various states over the time period in question. The authors necessarily stipulate, “The DD research design assumes that there are no time-varying confounders between treatment and control states (that is, the parallel trends assumption).” Yet there was at least one extraordinary “time-varying confounder” in play, namely a nationwide pandemic that varied over time and between states. They supposedly controlled for COVID using state mortality data, but it’s hard to see why death rates would be relevant. Lockdown policies and social isolation, on the other hand, had profound effects on youth mental health and varied drastically from one state to another. Nationwide, youth suicide rates rose significantly in 2020[6] and again in 2021 but then decreased significantly in 2022.
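
To see how a violated parallel-trends assumption plays out, here is a minimal sketch with made-up numbers (none of these figures come from the study): if a time-varying confounder, such as lockdown severity, moves the treated and control states by different amounts, the difference-in-differences estimate attributes that entire gap to the law.

```python
# Hypothetical 2x2 example with made-up numbers (not the study's data).
# Suppose a time-varying confounder (e.g., lockdown-era distress) adds
# +2.0 in the "treated" states but only +1.0 in the control states,
# and the law itself has zero effect.
treated_pre, treated_post = 10.0, 12.0   # +2.0, all from the confounder
control_pre, control_post = 10.0, 11.0   # +1.0 from the confounder

did = (treated_post - treated_pre) - (control_post - control_pre)
print(did)  # 1.0 -- the confounder's uneven impact is misread as a "law effect"
```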

Nowhere in the article do the authors provide any actual numbers for suicide attempts among the various study groups. The only results they provide are regression coefficients derived from a comparison between (claimed) suicide attempts in legislating versus non-legislating states. [In this context, the regression coefficient is the estimated difference in the change over time between the two groups: a coefficient of 0.5 would mean that if the control group’s measure rose by 1.0 over a period, the subject group’s rose by 1.5 over the same period.] The regression coefficients supposedly become positive in the “treatment” states after the legislation is enacted, but this does not mean there was an increase in reported attempts. Let’s see why.

Using the DiD technique, the authors calculate regression coefficients representing the relative risk of the “treated” population compared to the untreated. But this tells us nothing about the trend within the “treatment” group itself: whether it went up, went down, or stayed flat. In the following illustrations, let’s imagine some condition, say autism, for which a cure has been found, but half the states don’t trust it. In the first graph, we see that in the “blue” states prevalence drops by a fixed 0.1 per unit of time, but in the more trusting “green” states it drops by 0.2.

[Figure 1: hypothetical prevalence falling by 0.1 per unit of time in the blue states and by 0.2 in the green states.]

Now, let’s plot the difference between blue states and green states in purple:

[Figure 2: the same two lines with their difference (blue minus green) plotted in purple, rising by 0.1 per unit of time.]

If we then declare “green” as the baseline, as done in this study, the regression coefficient between the two groups is a positive 0.1: relative to green, blue gains 0.1 per unit of time even though both are falling. Nifty, huh? A positive regression coefficient and a declining rate. Well, that’s why the regression coefficients are meaningless without more context than the authors provide.
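
The same arithmetic can be checked directly. Below is a minimal sketch using the hypothetical blue/green numbers from the illustration above (not the study’s data), computing the canonical two-by-two difference-in-differences estimate at each period:

```python
# Hypothetical prevalence values from the blue/green illustration above
# (not data from the study): both groups decline every period, but the
# trusting "green" states decline faster.
periods = range(5)
blue  = [1.0 - 0.1 * t for t in periods]   # falls 0.1 per period
green = [1.0 - 0.2 * t for t in periods]   # falls 0.2 per period

def did(treated, control, pre, post):
    """Canonical 2x2 difference-in-differences estimate."""
    return (treated[post] - treated[pre]) - (control[post] - control[pre])

for t in list(periods)[1:]:
    print(f"period {t}: blue={blue[t]:.1f} (falling), "
          f"green={green[t]:.1f} (falling), DiD={did(blue, green, 0, t):+.1f}")
# Every DiD estimate is positive (+0.1 per elapsed period) even though the
# outcome declines in BOTH groups: a positive coefficient says nothing about
# whether the treated group's own rate went up or down.
```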

This exercise is more than purely hypothetical. The Trevor Project report covers the period from 2018 to 2022 with five annual surveys. The CDC reports annual youth suicide rates, and its data show that nationwide, the suicide rate for ages 10–24 dropped over 8% in 2022 compared to 2021.[7]

5. Using different “Difference-in-differences” makes a difference.

There are two approaches to doing difference-in-differences. The first, referred to in the article as the “two-way fixed effects” model, divides the results into just four groups: before and after, for both subjects and controls. Using this method, one can look at the entire group of subjects with simple before-and-after outcomes. The authors ran their data through the two-way fixed effects model and found that among the “treated” subjects, legislation resulted in…NADA! Nothing. There was no statistical difference between subjects and controls after legislation was passed:

“In the alternative specification using two-way fixed effects DD models to summarize the overall effects in the post-treatment periods, we found small effects that were not statistically significant in both the model with the full sample and with TGNB young people aged 13–17.”

Now, this seems terribly inconvenient since it is the exact opposite of what the authors claim to have discovered. But, if you torture the data long enough, it will confess to anything. So, they did. And it did.

The other way of parsing the data is the “event-driven” method, which breaks the results down by year. In so doing, the researchers serendipitously found statistically significant correlations that had at first been elusive. Statisticians might frown on this approach for a number of reasons. Some economists warn against using DiD in this way with longitudinal data because it generates incorrect variance estimates.
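
For readers unfamiliar with the distinction, here is a minimal sketch of the two specifications, fitted to made-up panel data with statsmodels. The paper does not describe its model in enough detail to reproduce it, so the variable names, the data, and the exact formulas below are illustrative assumptions, not the authors’ code:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Made-up state-year panel: half the states pass a law starting in year 3.
rows = []
for i in range(20):
    treated = int(i < 10)
    for year in range(5):
        post = int(year >= 3)
        rows.append({
            "state": f"s{i}",
            "year": year,
            "treated": treated,
            "post": post,
            # years since the law took effect; -1 for control/pre periods
            "event_time": (year - 3) if treated and post else -1,
            "outcome": rng.normal(loc=0.2 * treated * post, scale=1.0),
        })
df = pd.DataFrame(rows)

# (a) Two-way fixed-effects DD: one pooled coefficient for the whole
#     post-treatment period, with state and year fixed effects.
twfe = smf.ols("outcome ~ treated:post + C(state) + C(year)", data=df).fit()
print(twfe.params.filter(like="treated"))

# (b) Event-study DD: a separate coefficient for each post-treatment year,
#     relative to the pre/control baseline (event_time == -1).
event = smf.ols("outcome ~ C(event_time) + C(state) + C(year)", data=df).fit()
print(event.params.filter(like="event_time"))
```

Nothing in sketch (b) is wrong in itself; the concern raised above is that when the pooled specification (a) shows no effect, reporting only the year-by-year coefficients that happen to reach significance invites exactly the cherry-picking that the authors’ own null two-way fixed-effects result suggests.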

6. Attempting more suicide while not thinking about it?

The Trevor Project group focused on three outcome variables: (1) any past-year suicide attempt, (2) number of past-year suicide attempts, and (3) thoughts of suicide. According to their data, there was no statistically significant difference between subjects and controls in contemplating suicide. So, apparently, the subjects were more likely to attempt suicide and made more attempts, yet were simultaneously no more likely to “seriously consider suicide in the past year.” (I suppose this is what you get when you put all your eggs in the “thirteen-year-olds-recruited-on-social-media-filling-out-online-surveys” basket.)

“We concluded that our analysis provided minimal evidence that state governments enacting state-level anti-transgender laws had a statistically reliable impact on TGNB young people who reported seriously considering suicide in the past year.”

Even more telling, their chart indicates that in year 3, TGNB youth in the “anti-trans” states were statistically less likely to have seriously considered suicide. In their figure below, the open circles are statistically significant. It is clear that in year 3, both in the total population of ages 13-24 and in the subset of ages 13-17, subjects were less likely to have been considering suicide than the control population. True, it’s a tiny difference, but their entire argument rests on tiny differences. The positive regression coefficients for “at least one past-year suicide attempt” were also less than 0.10.

[Figure from the study: year-by-year regression coefficients for ages 13–24 and 13–17; open circles mark statistically significant estimates.]

7. Inconsistency with real-world data

The Trevor Project surveys have been notorious for reporting very high levels of suicidal ideation in the LGBTQ population across the board. While they can cite similar data from the CDC’s “Youth Risk Behavior Survey,” suicide researchers tend to think these figures are significantly inflated. Writing in the Archives of Sexual Behavior, Alison Clayton of the Society for Evidence-Based Gender Medicine notes:[8]

Furthermore, the suicidality of GD youth presenting at CAGCs, while markedly higher than non-referred samples, has been reported to be relatively similar to that of youth referred to generic child and adolescent mental health services (Carmichael, 2017; de Graaf et al., 2022; Levine et al., 2022). A recent study reported that 13.4% of one large gender clinic’s referrals were assessed as high suicide risk (Dahlgren Allen et al., 2021). This is much less than conveyed by the often cited 50% suicide attempt figure for trans youth (Tollit et al., 2019). A recent analysis found that, although higher than population rates, transgender youth suicide (at England’s CAGS) was still rare, at an estimated 0.03% (Biggs, 2022).

In their perfectly reasonable actions to restrict the medicalization of gender dysphoria in minors, American states (over half, at present) are merely following the lead of several Western European states that are well ahead of the US on this front. The United Kingdom first implemented a ban on puberty blockers in December 2020 following a High Court decision in Bell vs Tavistock. The ban was temporarily overturned by an appeal court in September of 2021, only to be effectively reinstated when the Tavistock gender identity clinic was ordered to close following the initial Cass report in July of 2022. There were still loopholes through other practitioners, but on December 11, 2024, the British government declared a comprehensive ban on puberty blockers for gender incongruence or dysphoria for any under 18 years of age.

As in the US, there were howls of protest and extravagant claims of “an explosion” in suicides following the initial ban in 2020. But a comprehensive audit of gender clinic patients uncovered a total of only 12 suicides between 2018 and 2024, roughly two per year, among many thousands of patients.[9] There was no meaningful difference between the first and second halves of that period; suicides did not increase. By contrast, in the ongoing NIH-funded trial run by Johanna Olson-Kennedy, there were two suicides out of 315 children undergoing “gender-affirming” treatment.[10]

Earlier this year (2024), one of the largest and best studies to date of suicide mortality, examining all gender-referred adolescents in Finland, found 20 suicides out of 2,083 patients between 1996 and 2019 (0.51 per 1,000 person-years), a very low rate but still four times higher than that of the control group. However, when compared only to controls undergoing psychiatric treatment, the difference vanished.[11]

Responsible discussion of suicide

It’s not merely that reporting like this is unscientific and dishonest. It is potentially dangerous, tossing fuel on the very fire its authors claim to be fighting. The suicide prevention community has established guidelines for the responsible reporting of suicide, noting that “over 100 studies worldwide have found that risk of contagion is real and responsible reporting can reduce the risk of additional suicides.” Dangerous practices to be avoided include “oversimplifying or speculating on the reason for suicide” and “overstating the problem of suicide by using descriptors like ‘epidemic’ or ‘skyrocketing.’”[12] The Samaritans, another suicide prevention charity, based in the UK, warns that:[13]

Young people are a particularly vulnerable audience in relation to media coverage of suicide. They are more susceptible to imitational suicidal behaviour and more likely to be influenced by the media than other age groups. Young people are also at greater risk of contagion if they have been affected by a suicide. Often the deaths of young people receive disproportionate, emotive coverage compared with other deaths by suicide, which can increase the risk of influencing imitational suicidal behaviour.

The warning of the UK independent report is worth quoting:

The way that this issue has been discussed on social media has been insensitive, distressing and dangerous, and goes against guidance on safe reporting of suicide. One risk is that young people and their families will be terrified by predictions of suicide as inevitable without puberty blockers – some of the responses on social media show this.

Another is identification, already-distressed adolescents hearing the message that “people like you, facing similar problems, are killing themselves”, leading to imitative suicide or self-harm, to which young people are particularly susceptible.

Conclusion

Despite its claims to the contrary, the recent article in Nature Human Behaviour affords no actual evidence of an increase in suicidal ideation or suicide attempts. Its case is built on hypothetical modeling from a dubious data source using questionable statistical methods, and where its own data are self-contradictory, the contradictions are ignored.

Too many families are being terrorized with the pernicious lie, “Would you rather have a live son or a dead daughter?” Those lies must stop. Suicide is a tragedy, not a rhetorical trump card or marketing slogan.

  1. “Journal Information | Nature Human Behaviour.” Accessed December 16, 2024. https://www.nature.com/nathumbehav/journal-information.


  2. Lee, Wilson Y., J. Nicholas Hobbs, Steven Hobaica, Jonah P. DeChants, Myeshia N. Price, and Ronita Nath. “State-level anti-transgender laws increase past-year suicide attempts among transgender and non-binary young people in the USA.” Nature Human Behaviour (2024): 1-11.


  3. Simmons-Duffin, Selena. “More Trans Teens Attempted Suicide after States Passed Anti-Trans Laws, a Study Shows.” NPR, September 26, 2024, sec. Shots – Health News. https://www.npr.org/sections/shots-health-news/2024/09/25/nx-s1-5127347/more-trans-teens-attempted-suicide-after-states-passed-anti-trans-laws-a-study-shows.


  4. Deming CA, Harris JA, Castro-Ramirez F, Glenn JJ, Cha CB, Millner AJ, Nock MK. Inconsistencies in self-reports of suicidal ideation and attempts across assessment methods. Psychol Assess. 2021 Mar;33(3):218-229. doi: 10.1037/pas0000976. Epub 2021 Mar 11. PMID: 33705163.


  5. Singh, Swarndeep, and Rajesh Sagar. “A critical look at online survey or questionnaire-based research studies during COVID-19.” Asian Journal of Psychiatry 65 (2021): 102850.


  6. Bridge, J. A., Ruch, D. A., Sheftall, A. H., Hahm, H. C., O’Keefe, V. M., Fontanella, C. A., Brock, G., Campo, J. V., & Horowitz, L. M. (2023). Youth suicide during the first year of the COVID-19 pandemic. Pediatrics, 151(3), Article e2022058375. https://doi.org/10.1542/peds.2022-058375


  7. CDC. “CDC Newsroom,” January 1, 2016. https://www.cdc.gov/media/releases/2023/s0810-US-Suicide-Deaths-2022.html.


  8. Clayton, A. Gender-Affirming Treatment of Gender Dysphoria in Youth: A Perfect Storm Environment for the Placebo Effect—The Implications for Research and Clinical Practice. Arch Sex Behav 52, 483–494 (2023). https://doi.org/10.1007/s10508-022-02472-8


  9. GOV.UK. “Review of Suicides and Gender Dysphoria at the Tavistock and Portman NHS Foundation Trust: Independent Report.” Accessed December 20, 2024. https://www.gov.uk/government/publications/review-of-suicides-and-gender-dysphoria-at-the-tavistock-and-portman-nhs-foundation-trust/review-of-suicides-and-gender-dysphoria-at-the-tavistock-and-portman-nhs-foundation-trust-independent-report.


  10. D. Chen et al., “Psychosocial Functioning in Transgender Youth after 2 Years of Hormones,” N Engl J Med 388, no. 3 (Jan 19 2023): 240-50, https://doi.org/10.1056/NEJMoa2206297, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10081536/pdf/nihms-1877102.pdf.


  11. Ruuska, Sami-Matti, Katinka Tuisku, Timo Holttinen, and Riittakerttu Kaltiala. “All-cause and suicide mortalities among adolescents and young adults who contacted specialised gender identity services in Finland in 1996–2019: a register study.” BMJ Ment Health 27, no. 1 (2024).


  12. Reporting on Suicide. “Recommendations.” Accessed December 20, 2024. https://reportingonsuicide.org/recommendations/.


  13. Samaritans. “Samaritans’ Media Guidelines for Reporting Suicide.” Accessed December 20, 2024. https://www.samaritans.org/about-samaritans/media-guidelines/media-guidelines-reporting-suicide/




About Author

Steven Willing MD, MBA

Dr. Steven Willing received his medical degree from the Medical College of Georgia, completed an internship in pediatrics at the University of Virginia, undertook a residency in diagnostic radiology at the Medical College of Georgia, and completed a fellowship in neuroradiology at the University of Alabama at Birmingham. Dr. Willing spent 20 years in academic medicine at the University of Louisville, the University of Alabama at Birmingham, and Indiana University-Purdue University Indianapolis (IUPUI). He also earned an MBA from the University of Alabama at Birmingham in 1997.

During his academic career, Dr. Willing published over 50 papers in the areas of radiology, informatics, and management. He is the author of "Atlas of Neuroradiology", published by W. B. Saunders in 1995.

Now retired from clinical practice, Dr. Willing serves as a radiology consultant to Tenwek Hospital in Bomet, Kenya both remotely and on-site. He is presently the Alabama State Director for the American Academy for Medical Ethics, an adjunct Professor of Divinity at Regent University, and a Visiting Scholar for Reasons to Believe.
