# Talk:Time to extinction of civilization

## Estimating the mean time between nuclear crises[edit source]

I suspect a typo in the section Estimating the mean time between nuclear crises. I believe that -35 rather than -21 is the correct parameter in the formula in the second paragraph. Please take a look. Thanks! --Lbeaumont (discuss • contribs) 11:25, 31 December 2018 (UTC)

- Duh. Pretty obvious. Thanks. DavidMCEddy (discuss • contribs) 13:44, 31 December 2018 (UTC)

## Like the Drake Equation[edit source]

It might be useful to summarize and parameterize your analysis in a form similar to the Drake equation. Parameters might include the number of nuclear-capable states, the number of nuclear weapons on the planet, the number of international conflicts, the number of authoritarian governments, the effectiveness of safety systems, etc. The result could be the probability of killing p% of the earth's population in the next n years, or something similarly sobering. Thanks! --Lbeaumont (discuss • contribs) 11:36, 31 December 2018 (UTC)

- Yes, it might be.
- I've contacted the Union of Concerned Scientists, and I plan to contact others, including Daniel Ellsberg, to try to create a team of experts to refine this analysis. I mentioned in the section on "Time between nuclear crises" that other lists of incidents might be used. Part of the problem is that some nuclear incidents, like the 1983 Soviet nuclear false alarm incident, would likely be highly classified, and the public would likely not learn about them until decades later, if ever. That incident reportedly became publicly known only in 1998, with the publication of the memoirs of Yury Votintsev. And if the Soviet Union had not imploded, it would likely still be classified "Top Secret".
- If I get other collaborators, we will need to model the time to publication of an event in addition to the time between events. I would probably want to use a zero-inflated exponential distribution. Then we'd have to ask the experts for two probabilities: (a) the probability that the public would learn of an incident more or less as it happened, and (b) the probability that it might lead to a nuclear Armageddon.
- We will likely also want to model other parts of a Drake analogue, like whether a nuclear war between India and Pakistan might loft enough soot into the stratosphere to reduce incident sunlight and surface temperature worldwide, and, if so, by how much. If, for example, only half of the world's population starved to death, that would likely not destroy civilization. I think a nuclear war between India and Pakistan has been simulated, but I do not remember the results. That would need to be explored and modeled.
- I do not have personal relations with Ellsberg or anyone else who might have expertise in this field, and I have only limited contact with organizations working on issues like this. If you feel inclined to contact others about this, you can assure them that I'm actively working to improve this methodology and accelerate the dissemination of the results, and that I'm eager to find collaborators. My resume (as a statistician, Spencer Graves, not as "DavidMCEddy") includes two books, three patents, over 30 published technical papers, and software used all over the world, and most of those publications are joint with others.
- Only yesterday, I computed 80 and 95 percent confidence intervals for the time between such incidents of (20, 284) and (13, 981) years, respectively. Those assume that the time between such incidents follows a fixed exponential distribution, which seems reasonable but is far from certain. I'm not yet certain how I want to present this additional uncertainty, because there likely were other such events that are still highly classified.
- I also plan to present this at the Joint Statistical Meetings, July 27-August 1, 2019, and to submit a version of whatever I have by their deadline for publication in the conference proceedings.
- Thanks again for your interest. DavidMCEddy (discuss • contribs) 14:36, 31 December 2018 (UTC)
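The zero-inflated exponential publication-delay idea above can be illustrated with a small simulation. This is a minimal sketch in Python (the author's companion code is in R); every parameter value, and the function name itself, is a hypothetical placeholder rather than anything estimated in the article:

```python
import random

def simulate_observed_incidents(n_years=73, mean_gap=30.0,
                                p_immediate=0.5, mean_delay=20.0, seed=0):
    """Simulate incidents over a window of n_years with exponential gaps,
    then apply a zero-inflated exponential publication delay: with
    probability p_immediate an incident is public at once; otherwise the
    delay to publication is exponential with mean mean_delay.
    Returns (all incidents, incidents published before "now").
    All parameter values are hypothetical placeholders."""
    rng = random.Random(seed)
    t, incidents = 0.0, []
    while True:
        t += rng.expovariate(1.0 / mean_gap)  # expovariate takes a rate
        if t > n_years:
            break
        incidents.append(t)
    observed = []
    for t_i in incidents:
        delay = 0.0 if rng.random() < p_immediate \
            else rng.expovariate(1.0 / mean_delay)
        if t_i + delay <= n_years:  # published before the end of the window
            observed.append(t_i)
    return incidents, observed
```

The gap between `incidents` and `observed` is the censoring problem described above: still-classified events simply do not appear in the observed list.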

## L = 420.6 years[edit source]

Michael Shermer calculated the average duration of a civilization (L in the Drake Equation) to be 420 years. See, for example: https://michaelshermer.com/2002/08/why-et-hasnt-called/ This calculation may influence, support, or dispute some of the calculations and conclusions in the essay. Thanks! [This unsigned comment was by user:Lbeaumont at 11:45, 31 December 2018.]

- Thanks. I should look at that. I'd like to see his data set. I'd be happy to collaborate with him and/or you and/or others to create a separate Wikiversity article on that, using the censored data analysis methodology described in this paper. Then we could add a section on that to this current article, perhaps right before "Fermi's paradox", with a heading like "Shermer on the lifetime of a civilization".
- I currently have a companion RMarkdown document to this "Time to extinction" article that includes the computation of confidence intervals for the average lifetime using Wilks' theorem. I plan to publish that RMarkdown document as part of the Ecfun package for R, initially on R-Forge, probably later this week. After that, we can discuss trying to contact Shermer, etc.
- DavidMCEddy (discuss • contribs) 15:35, 31 December 2018 (UTC)
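The Wilks (likelihood-ratio) intervals mentioned in this thread can be sketched in outline: for an exponential model with observed gaps, the interval for the mean keeps every mu whose deviance, 2n(log(mu/mu_hat) + mu_hat/mu - 1), stays below the appropriate chi-squared(1) quantile. A minimal Python sketch (the author's actual code is in R; the gap data in the test are hypothetical, not the article's incident list):

```python
import math

# chi-squared quantiles with 1 degree of freedom
CHI2_1 = {0.80: 1.642374, 0.95: 3.841459}

def wilks_ci_exponential_mean(gaps, level=0.95):
    """Likelihood-ratio (Wilks) confidence interval for the mean of an
    exponential distribution, given observed gaps between events."""
    n, total = len(gaps), sum(gaps)
    mu_hat = total / n  # MLE of the mean gap
    crit = CHI2_1[level]

    def deviance(mu):
        # 2 * (loglik(mu_hat) - loglik(mu)); zero at mu_hat, rising on each side
        return 2.0 * n * (math.log(mu / mu_hat) + mu_hat / mu - 1.0)

    def bisect(lo, hi):
        # find mu with deviance(mu) == crit, keeping the root bracketed
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if (deviance(mid) - crit) * (deviance(lo) - crit) <= 0:
                hi = mid
            else:
                lo = mid
        return 0.5 * (lo + hi)

    lower = bisect(mu_hat * 1e-6, mu_hat)
    upper = bisect(mu_hat, mu_hat * 1e6)
    return lower, upper
```

Like the intervals quoted above, these can be very wide: with few observed gaps, the deviance curve is shallow on the high side, so the upper limit runs far beyond the point estimate.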

## Title is too general[edit source]

The title of this article, *Time to extinction of civilization*, is more general than the topics discussed. The article only considers extinction via nuclear weapons, and does not consider other existential threats, such as asteroid collision, epidemic, or AI runaway. I suggest either that these threats be mentioned, or that the title be made more accurate. Thanks! --Lbeaumont (discuss • contribs) 23:55, 31 December 2018 (UTC)

- Sure. I added a section on "Other existential risks".
- However, I'm willing to consider another name, e.g., "Time to nuclear Armageddon"?
- DavidMCEddy (discuss • contribs) 00:46, 1 January 2019 (UTC)

- @Marshallsumter: What do you think of possibly changing the title of this article to, e.g., "Time to nuclear Armageddon"? See the discussion above. Thanks, DavidMCEddy (discuss • contribs) 23:38, 1 January 2019 (UTC)

- @DavidMCEddy: I liked the title as it is! Here's why: any kind of massive natural disaster short of whatever caused the extinction of the dinosaurs (about every 240 million years) is unlikely to cause the extinction of civilization. For AI runaway, see Technological singularity. For epidemics, consider the bubonic plague. For glacial advances, consider the history of modern humans; see Paleoanthropology. In other words, when I started reading the essay, I figured from the earlier essays you had already announced that it dealt with nuclear annihilation. --Marshallsumter (discuss • contribs) 00:56, 2 January 2019 (UTC)

- @Marshallsumter: Great. I'll leave it as it is. I felt a need to ask you, since User:Lbeaumont suggested it be changed. 2605:A601:4EF:9100:1C9A:E3DA:9188:C443 (discuss) 01:26, 2 January 2019 (UTC)

## Visual presentation of results[edit source]

It would be helpful to add a visual presentation of the results. This might be a graph with "years from now" as the X axis and "probability of (defined) Armageddon" on the Y axis. The plot would show the most likely values as well as error bars, and extend out for perhaps the next 100 years. The italicized text in the introduction provides data points for X=1 and X=40. Thanks! --Lbeaumont (discuss • contribs) 00:58, 1 January 2019 (UTC)

- Thanks. I'm working on something like that. After I post that, you can tell me what you think. I don't know if I'll have it ready to post today. DavidMCEddy (discuss • contribs) 18:30, 1 January 2019 (UTC)
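Under the exponential assumption used elsewhere in this discussion, the requested curve has a closed form: P(at least one incident within t years) = 1 - exp(-t/mu), where mu is the mean gap between incidents, and the error bars follow by plugging in the confidence limits on mu. A minimal Python sketch of the plot data (the specific mean-gap values in the test are placeholders, not the article's estimates):

```python
import math

def p_at_least_one(t_years, mean_gap):
    """P(at least one incident within t_years) for a Poisson process
    with the given mean gap (exponential waiting times)."""
    return 1.0 - math.exp(-t_years / mean_gap)

def curve(mean_gap, lo_gap, hi_gap, horizon=100, step=10):
    """Rows of (years from now, lower band, point estimate, upper band).
    A LONGER mean gap gives a LOWER probability, so the confidence
    limits on the mean gap swap roles in the probability band."""
    return [(t,
             p_at_least_one(t, hi_gap),    # lower band: longest mean gap
             p_at_least_one(t, mean_gap),  # point estimate
             p_at_least_one(t, lo_gap))    # upper band: shortest mean gap
            for t in range(0, horizon + 1, step)]
```

Feeding these rows to any plotting tool gives exactly the figure requested: probability on the Y axis against years from now on the X axis, with a widening band.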

## Please label the (left-most) axis of the normal probability plot[edit source]

To make this analysis clearer to the reader (I have above-average math literacy, but I am not a specialist in statistics), please label the axis of the normal probability plot. I am perplexed to see the −4 to 4 range on the (left-hand) y axis. This overlaps the range of 0-1 on the (right-hand) y axis, yet only one series appears. How is this plot best interpreted? Thanks! --Lbeaumont (discuss • contribs) 12:56, 8 January 2019 (UTC)

- @Lbeaumont: These are two different interpretations of the same single line: the right axis is probability, while the left is normal scores. I've expanded the figure caption to explain that, with links.
- What do you think of the new figure caption?
- Thanks for your suggestion.
- (Please excuse the delay in responding. I do not recall having received a notice of your post, and I could not find it in my email. It could have been blocked by a temporarily hyperactive spam filter, or maybe I misfiled it before reading it.) DavidMCEddy (discuss • contribs) 14:28, 7 February 2019 (UTC)
- @DavidMCEddy: Thanks. Is the left axis a z score or a normal score derived from ranks? If it is a z score, perhaps you could link directly to that page. Thanks! --Lbeaumont (discuss • contribs) 15:52, 7 February 2019 (UTC)

- @Lbeaumont: The left axis is a normal score derived from ranks.
- I do not recall ever seeing the term "normal score" used in the sense of a w:z score.
- The Wikipedia article on "w:normal score" has only one reference. I did not check that reference, but it is cited to justify the definition of a normal score derived from ranks, and I believe it supports only that definition, not both, though I could be mistaken.
- In any event, before I posted my reply to you, I added {{cn}} to the first two paragraphs of that article and then added a discussion to that article's "Talk" page questioning the "z score" interpretation. In the latter, I pinged every contributor to that article that looked like a human. If I get no reply in a week or so, I plan to copy those first two paragraphs to the "Talk" page and delete them from the article. If someone thinks they belong there, they need to provide a reference.
- Please excuse the long rant, but as far as I know the "z score" interpretation of that term is not a standard usage. DavidMCEddy (discuss • contribs) 16:17, 7 February 2019 (UTC)
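The relationship between the two axes discussed above can be made concrete. Normal scores derived from ranks are often approximated by Blom's formula, z_i = Phi^{-1}((i - 0.375)/(n + 0.25)), and the right-hand probability axis is recovered from the left-hand score axis by applying the normal CDF. A short Python sketch (Blom's offsets are one common convention; the actual plot may use another):

```python
from statistics import NormalDist

def normal_scores(n):
    """Normal scores derived from ranks via Blom's approximation to the
    expected order statistics of a standard normal sample of size n."""
    nd = NormalDist()  # standard normal: mean 0, sd 1
    return [nd.inv_cdf((i - 0.375) / (n + 0.25)) for i in range(1, n + 1)]

def score_to_probability(z):
    """Map a left-axis normal score to the right-axis probability:
    the two axes label the same points, related by the normal CDF."""
    return NormalDist().cdf(z)
```

So the single series is plotted once, and each point can be read off either axis: its normal score on the left, or the corresponding cumulative probability on the right.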

## More sources describing existential risks[edit source]

Here are additional resources that describe and discuss existential risks: futureoflife.org, lifeboat.com, existential-risks.com, existentialhope.com, and Macro Challenges (Xrisks directory). These may be worth exploring to broaden the discussion in the article. Thanks! --Lbeaumont (discuss • contribs) 20:41, 21 October 2021 (UTC)