Most of us are content to just worry about the future of humanity in our spare time, but there's an entire group of academics at Oxford University in England who make that their professional mission.
Each member of the Future of Humanity Institute has a different focus. Some are concerned with climate change and its impact on humanity; others with the future of human cognition. Institute director Nick Bostrom, whose paper "Existential Risk Prevention as Global Priority" has just been published, has a long history of worrying about our future as a species. Bostrom posits that humanity itself is the greatest threat to humanity's survival.
Bostrom's paper is concerned with a particular time-scale: Can humanity survive the next century? This rules out some of the unlikelier natural scenarios that could snuff out humans in the more distant future: supervolcanoes, asteroid impacts, gamma-ray bursts and the like. The chances of one of those happening within so narrow a timeframe are, according to the paper, extremely small. Further, most other natural disasters, such as pandemics, are unlikely to kill all humans; we as a species have survived many pandemics and will likely continue to do so.
According to Bostrom, the types of civilization-ending disasters we may unleash upon ourselves include nuclear holocaust, badly programmed superintelligent entities and, my personal favorite, the scenario in which "we are living in a simulation and it gets shut down." (As an aside, how the hell do you prepare for that eventuality?) Additionally, humans face four different categories of existential risk: