Astronomical Waste
June 29, 2007

<p>Nick Bostrom has a good paper, <a href="http://www.nickbostrom.com/astronomical/waste.html">Astronomical Waste: The Opportunity Cost of Delayed Technological Development</a>, which argues that under most reasonable ethical systems that aren't completely selfish or very parochial, our philanthropic efforts ought to be devoted primarily to preventing disasters that would cause the extinction of intelligent life.</p>

<p>Some people who haven't thought carefully about the Fermi Paradox may overestimate the probability that most of the universe is already occupied by intelligent life. A very high estimate of that probability would invalidate Bostrom's conclusion, but I haven't found any plausible argument that would justify so high a probability.</p>

<p>I don't want to completely dismiss Malthusian objections that life in the distant future will be barely worth living, but the risk of a Malthusian future would need to be well above 50 percent to substantially alter the optimal focus of philanthropy, and the strongest Malthusian arguments I can imagine leave much more uncertainty than that. (If I thought I could alter the probability of a Malthusian future, perhaps I should devote effort to that, but I don't currently know where to start.)</p>

<p>Thus the conclusion seems too obvious to need repeating, yet it's far enough from our normal experience that most of us pay inadequate attention to it. So I'm mentioning it to remind people (including myself) of the need to devote more of our time to thinking about risks such as those associated with <a href="http://www.intelligence.org/upload/artificial-intelligence-risk.pdf">AI</a> or <a href="https://bayesianinvestor.com/blog/index.php?p=167/">asteroid impacts</a>.</p>

Categories: AI, Fermi. Tags: effective altruism, ethics, existential risks.