Here's an interesting concept I recently stumbled upon. A civilization can climb the ladder of evolution and technological progress to a point where it becomes able to destroy itself. It is generally accepted that, when nuclear weapons were invented, humanity reached that milestone.
Now combine this concept with the Fermi paradox: probabilistic arguments suggest a pretty good chance that there is life on other planets, and yet we haven't made any contact. If those estimates are correct, how is that possible? I'm not talking about a flying saucer over the Empire State Building, but a signal of any kind.
If the previous assumptions are right, there must be something that makes it unlikely for civilizations to reach a level of technological progress advanced enough to establish contact beyond their home planet. We can then infer that once civilizations reach the critical point where they can destroy themselves, they seldom survive long enough to take the next step. This is one candidate explanation for what is called the Great Filter.
This concept has been used to predict an inevitable nuclear war that will kill us all. And while that prediction was more fashionable back in the Cold War days than it is now, the risk remains today. But it also resonates with more current topics like global warming. When you read people saying that this phenomenon is no longer a theory but a reality, when they start talking about adaptation instead of prevention, I cannot help but think of the Great Filter.
The question, in more general terms, is whether the strategies that make a species a winner in the evolutionary race will doom it in the long run no matter what. "Only the strongest survive" leads to a set of behaviours that, while effective for individuals and small groups, end up being deadly for the whole.
Are YOU willing to sacrifice your comfort TODAY for the sake of billions you don't know?
Are YOU willing to embark on a project whose results will only be seen beyond your lifetime?