My Expectations for the Future

No one can be certain what the future will bring, but we all consider some outcomes more likely than others. So here are my personal scenarios and probability estimates for the near future of humanity over the next 100 years (please do comment and criticize …):

Scenarios:

Young girl with great expectations (Young Girl at the Well, Johann Georg Meyer von Bremen, PD)
  1. Business-as-usual: We successfully continue with our present strategies, including fossil-fuel use, resource depletion, pollution, population growth, etc.
  2. Sustainable & Just High-tech Future: We envision the world in which we want our children to live and, following our values, peacefully achieve a transformation. We eradicate poverty, hunger, and social insecurity, stop population growth, and distribute the available wealth and opportunities for well-being equitably. We achieve land-degradation neutrality, implement sustainable, high-efficiency farming and reduce meat consumption to eradicate hunger, convert to a carbon-free, fully recycling economy, implement negative CO2 emissions to keep delayed global warming in check, preserve our biodiversity heritage, and generate enough renewable energy to support a healthy, high-quality lifestyle for 10-12 billion people.
  3. Sustainable Low-tech Future: We peacefully return to a pre-industrial, low-tech, sustainable lifestyle.
  4. Transformation Catastrophe: We are unable to govern ourselves, to plan ahead, or to build the necessary infrastructure for the future. We are too slow in creating global justice, social security, and health care, leading to unchecked population growth. We are too slow to build 100% renewable energy systems (wind/water/solar plus storage and distribution systems). We are forced into continued, unsustainable overexploitation of our planet, driving continued loss of arable land and fossil-fuel consumption, leading to ever-increasing global warming, weather extremes, reduced fresh-water supply, and a severe reduction in food production. The greed of the rich for more luxury and lifestyle foods (e.g. meat), as well as the hunger of the poor, overwhelms our social and political systems. Before reaching a new stable state, humanity passes through a period of interstate and civil wars.
  5. Doom: The end of humanity, as a result of natural disasters (asteroid, supervolcano, gamma-ray burst) or human overpopulation and war.
  6. Technological Singularity: Artificial intelligence enters a self-improvement cycle, develops self-consciousness, and becomes smarter than all humans combined. Concluding that humankind is unable to govern itself, it exercises its superiority through networked fighting machines, either exterminating the unsustainable human population or strongly reducing it (and placing the remainder under a conservation management plan).
  7. Space: We escape from this planet and continue to live on other planets, moons, or in other solar systems.
Great expectations, too little initiative (Hermann Vogler, PD)

My personal probability estimates (a quick consistency check follows the list):

  1. Business-as-usual: 0%
  2. Sustainable & Just High-tech Future: 30%. Many excellent people and organizations are working towards this goal. But the speed of change is too slow. Then again, my motivation is to change that, so I'd better make that 80%.
  3. Sustainable Low-tech Future: 0%. This scenario may appear appealing, but a) many preindustrial civilizations were not sustainable at all, and b) it implies a global population of between 1 billion (the value in 1800) and perhaps 3 billion (assuming some agricultural, medical, and technological progress is maintained). And people do not disappear peacefully.
  4. Transformation Catastrophe: 69%. Unfortunately, this seems to be the path we are taking. But we want to change that, so I'd better make that 19%. 🙂
  5. Doom: 0.1%. Even after a nuclear war, almost any asteroid impact, etc., humans will most likely persist, albeit under very, very difficult conditions.
  6. Technological Singularity: 0.9%. I do agree that warnings about a possible technological singularity should be heeded and that we have to expect accidents. However, I personally consider it simplistic to assume that sentient silicon life will be able to “simply” take over. I believe in human resilience here!
  7. Space: 0%. Check your physics. We may manage to survive for many generations in a spaceship or on Mars, with 100% renewable energy, 100% recycling of water and mineral resources, and 100% food self-sufficiency. But whatever Earth will look like, it will be easier and less risky to use the same technologies on Earth. Forget Hollywood and focus on Earth, a beautiful, self-sufficient, mostly renewable-energy-driven spaceship.
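As a quick sanity check on the arithmetic, both sets of estimates add up to 100%. The following minimal Python sketch merely restates the percentages from the list above; the script itself (variable names, labels, output format) is illustrative and not part of the original post.

    # Consistency check: do the probability estimates above sum to 100%?
    # Percentages are copied from the list; everything else is illustrative.
    original = {
        "Business-as-usual": 0.0,
        "Sustainable & Just High-tech Future": 30.0,
        "Sustainable Low-tech Future": 0.0,
        "Transformation Catastrophe": 69.0,
        "Doom": 0.1,
        "Technological Singularity": 0.9,
        "Space": 0.0,
    }

    # The "so I'd better make that ..." revisions for scenarios 2 and 4.
    revised = {**original,
               "Sustainable & Just High-tech Future": 80.0,
               "Transformation Catastrophe": 19.0}

    for label, estimates in (("original", original), ("revised", revised)):
        print(f"{label} estimates sum to {sum(estimates.values()):.1f}%")
    # -> original estimates sum to 100.0%
    # -> revised estimates sum to 100.0%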

This is certainly simplistic, and I could define dozens of sub-scenarios. It is also important to stress the 100-year perspective here. Intermediate steps, i.e. the question of how long we might be able to persist without change, are more difficult to estimate. I have chosen 100 years because I am less interested in how I will live (my life is quite good, thank you) than in how my children and grandchildren will live.


Image at top: View of Loch Lomond, © A. Hussain, M.-H. Ashrafi, M. Farooq, F. Akhtar, M. Shah, CC BY-SA 2.5, from Commons.


(© Gregor Hagedorn 2017, CC BY-SA 4.0, publ. 2017-09-26, updated 2018-03-27)

2 thoughts on “My Expectations for the Future”

  1. It’s really difficult to know what to expect from the future. The way I see it is changing all the time depending on how I feel, what I’ve been reading, who I’ve met….
    That’s why your scientific approach with 9 different scenarios is a great help to “organize” one’s mind.

    I would also picture the future as a patchwork of different scenarios.
    I would give more than 0.9% to technological singularity. I can’t help thinking that we can’t stop technological progress and that most people have developed a sort of faith in it. It looks like traditional religious faith isn’t enough anymore. The question now is what use to make of the new technology. The question is wide open, and when it comes to making important decisions we sometimes tend to make terrible mistakes. The TV show “Black Mirror” makes this clear: it shows examples of stupid decisions driven by new technologies, and it is pretty scary.
    Regarding this matter, I would recommend this interview (https://www.theguardian.com/culture/2017/mar/19/yuval-harari-sapiens-readers-questions-lucy-prebble-arianna-huffington-future-of-humanity) with the Israeli historian Yuval Noah Harari, who says: “With the new revolution in artificial intelligence and biotechnology, there is a danger that again all the power and benefits will be monopolised by a very small elite, and most people will end up worse off than before.”


    1. Excellent point you make about the people part of the singularity equation. Basically, my argument is that I see good chances that united humans can overcome the dangers and accidents of sentient technology. But stupid people, for whom technology is a replacement for ethics and religion, and greedy people, trying to monopolize such technology for their own advantage, certainly present a more complex situation. Like everyday life! Thanks for your insight and comment!
