“The main purpose of studying the future is to look at what may happen if present trends continue, decide if this is desirable, and, if not, work to change it.”
The roots of futures thinking – the imagining of the future in human minds – can be traced back to the beginnings of human societies. The formalized study of futures came much later, but the most advanced civilizations tended to project their thinking forward and to use basic methods of planning and foresight. The Greek philosopher Plato developed the concept of an ideal society with perfect justice in “The Republic,” and his vision has inspired countless thinkers to imagine the future.
In 1900, Smithsonian Institution curator John Elfreth Watkins wrote an article for The Ladies’ Home Journal, entitled “What May Happen in the Next Hundred Years” filled with predictions that many of his readers probably scoffed at as ridiculously improbable. Indeed, Watkins was pretty far off about some things. He predicted, for example, that the letters ‘C,’ ‘X’ and ‘Q’ would vanish from the alphabet, streets would be relocated underground, and farms would grow strawberries as large as apples. But what’s more impressive is the extent to which Watkins’ vision of the future actually has come to pass — wireless phone networks on which a person in New York could talk to another in China, live TV images being transmitted around the globe, MRI machines, aerial warfare, and high-speed trains traveling between cities at 150 miles per hour. Watkins even predicted the food trucks that have become a fad worldwide.
I would like to continue our previous post about how to think about tomorrow’s world. The plan for this series is to cover the following topics:
- List the key futurists in the world
- Review the history of futurism
- Identify key books and publications in this domain
- Review top books about megatrends and trends in IT and technology
I would like to devote today’s post to those of you who are interested in thinking about the future, trends, and futurological predictions, and in particular to the main books and authors in this field. This post will be helpful for entrepreneurs who want to change the world but have no idea where to start. So, listen to the gurus.
Top 10 Groundbreaking Futurists
Today’s futurists — who aim to forecast the trends, inventions and events of the decades to come — have developed sophisticated methods for divining what may lie ahead. As Timothy Mack, president of the World Future Society, explains on the organization’s Web site, futurists systematically scan the news media and published results of scientific studies, and conduct carefully structured surveys called “Delphi polls” in which they probe the minds of experts in various fields. Many also now create computer simulations and even conduct role-playing games in an effort to foresee what events and trends might result from certain changes, such as worsening environmental problems, the development of new energy sources or changes in the tax system.
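The Delphi polls Mack describes can be sketched as a toy simulation. The panel, the question, and the revision rule below are all illustrative assumptions, not the World Future Society’s actual procedure: each hypothetical expert forecasts the year an event will occur, then revises partway toward the group median over several rounds.

```python
import statistics

def delphi_rounds(estimates, n_rounds=3, pull=0.4):
    """Run simplified Delphi rounds: after each round, every expert
    revises their estimate partway toward the group median."""
    for _ in range(n_rounds):
        median = statistics.median(estimates)
        estimates = [e + pull * (median - e) for e in estimates]
    return estimates

# Hypothetical panel answering "In what year will X happen?"
panel = [2030, 2035, 2050, 2042, 2038]
revised = delphi_rounds(panel)
print(statistics.median(revised))  # consensus stays at 2038.0 while disagreement shrinks
```

Because each estimate moves monotonically toward the median, the spread of opinion shrinks every round while the central forecast is preserved, which is the intended effect of iterated Delphi feedback.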
There is even an entire scientific discipline devoted to the study of the future (see http://en.wikipedia.org/wiki/Futures_studies).
Let us review the best-known futurists whose predictions have shaped the world.
Arthur C. Clarke
Sir Arthur Charles Clarke, CBE, FRAS (Sri Lankabhimanya Arthur Charles Clarke) (16 December 1917 – 19 March 2008) was a British science fiction writer, science writer, inventor, undersea explorer, and television series host.
He is perhaps most famous as co-writer of the screenplay for the movie 2001: A Space Odyssey, considered by the American Film Institute to be one of the most influential films of all time. His other science fiction writings earned him a number of Hugo and Nebula awards, along with a large readership, making him one of the towering figures of the field. For many years he, along with Robert Heinlein and Isaac Asimov, was known as one of the “Big Three” of science fiction.
Clarke was a lifelong proponent of space travel. In 1934, while still a teenager, he joined the British Interplanetary Society. In 1945, he proposed a satellite communication system — an idea that, in 1963, won him the Franklin Institute’s Stuart Ballantine Medal. He later chaired the British Interplanetary Society from 1946 to 1947 and again from 1951 to 1953.
As a science writer, Clarke was both an avid populariser of space travel and a futurist of uncanny ability, and he won the Kalinga Prize (a UNESCO award for popularising science) in 1961. All of this eventually earned him the moniker “prophet of the space age”.
Clarke emigrated to Sri Lanka in 1956, largely to pursue his interest in scuba diving. That year he discovered the underwater ruins of the ancient Koneswaram temple in Trincomalee.
Clarke augmented his fame later on in the 1980s, by being the host of several television shows such as Arthur C. Clarke’s Mysterious World.
He lived in Sri Lanka until his death. He was knighted by Queen Elizabeth II in 1998 and was awarded Sri Lanka’s highest civil honour, Sri Lankabhimanya, in 2005.
Alvin Toffler

If you’re puzzled by corporate executives and politicians who incessantly speak in jargon such as “game changer” and “change agent,” thank Alvin Toffler, who worked as a business journalist for Fortune magazine and as a consultant for technology companies such as IBM, Xerox and AT&T. When it was published in 1970, “Future Shock” seemed like a blueprint for a creepy, dysfunctional dystopian society in which a high-tech elite would strive to keep the stressed-out masses under control — rather like the science fiction movie “Soylent Green,” but without an angry Charlton Heston raging against cannibalism. But in the decades since, we’ve seen Toffler’s predictions become reality in myriad ways, ranging from disposable mobile phones to virtual corporations and “flash mobs” of individuals who gather briefly for a common purpose and then just as suddenly vanish.
Michio Kaku

As a scientist, Kaku, a professor of theoretical physics at the City University of New York, has done important work on string theory, which tries to reconcile Einsteinian relativity and quantum mechanics by proposing that the fundamental units of nature are incredibly tiny strings of energy.
In his 2011 book, “Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100,” Kaku relies heavily on the “Delphi poll” method, informally surveying experts in various scientific fields, and even visiting their laboratories to study prototypes of inventions that already exist, in an effort to predict future game-changing developments. Based on that data, Kaku envisions a future society with technologies that would seem like science fiction fantasies today.
He predicts that computers will be able to read our minds, which will give us the power to move objects and machines by thinking about them. He also predicts advances in biotechnology that will enable humans to extend their own life span and to fashion new organisms not found in nature. And nanotechnology will give us the ability to take an object or material and tinker with it at the molecular level to convert it into something completely different, fulfilling the dreams of medieval alchemists who searched for a way to turn lead into gold. Moreover, Kaku envisions national differences eventually fading by 2100, so that the world develops a single, planetary civilization.
Christopher Ahlberg

A former Swedish Army ranger with a Ph.D. in computer science, Ahlberg is chief executive of Recorded Future, a Cambridge, Mass.-based firm that has pioneered real-time use of the Web and social networks as a way to predict events in the near future. Recorded Future’s computers continually scour tens of thousands of Web sites, blogs and Twitter accounts, and use sophisticated analytical software in an effort to spot “invisible links” between items that actually refer to the same people and future events in which they may be involved. Recorded Future also tracks the volume of such links, and then tries to use that information to gauge the momentum behind a future event, the likelihood that it will happen, and when and where it may occur.
Recorded Future’s technology has enough potential that both Google and the U.S. Central Intelligence Agency have become investors. But as Ahlberg admitted in a 2011 interview with Business Insider, the software has limitations, in that it’s better at some sorts of predictions than others. It does pretty well with forecasting events that occur frequently, such as stock market volatility, but not so well with infrequent events, such as elections. And predicting so-called “black swan events” — random, seemingly unlikely disasters, such as the cascading near-collapse of Wall Street in 2008, that somehow happen despite the odds against them — remains a daunting challenge.
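Recorded Future’s actual analytics are proprietary, but the volume-tracking idea can be illustrated with a deliberately simple sketch. The daily counts and the ratio-based score below are invented for illustration: the function just compares recent mention volume against an earlier baseline, so a score well above 1.0 suggests growing momentum behind an event.

```python
def mention_momentum(daily_counts, window=3):
    """Compare average mention volume in the last `window` days
    to the average over the earlier baseline period."""
    recent = daily_counts[-window:]
    baseline = daily_counts[:-window]
    recent_avg = sum(recent) / len(recent)
    baseline_avg = sum(baseline) / len(baseline)
    return recent_avg / baseline_avg if baseline_avg else float("inf")

# Hypothetical daily counts of documents mentioning the same future event
counts = [2, 3, 2, 4, 3, 9, 14, 22]
score = mention_momentum(counts)
print(round(score, 1))  # 5.4 — recent chatter is over five times the baseline
```

A real system would of course weight sources by reliability and deduplicate items referring to the same story, but even this crude ratio captures why frequent, chatter-heavy events are easier to forecast than rare ones.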
Dirk Helbing

Like Ahlberg and Recorded Future, Helbing — a German-born physicist, mathematician and sociologist — hopes to use computer software to achieve a level of prescience that the priests of the ancient Oracle at Delphi would envy. But Helbing wants to cast an even wider net for data, in hopes of getting a glimpse not just of a few isolated events, but of the big, sweeping, longer-term changes that will affect humans all over the planet.
At the Swiss Federal Institute of Technology in Zurich, Helbing is leading the creation of the Living Earth Simulator Project, a $1.4 billion effort to build a massive supercomputer system capable of modeling just about any sort of event that could occur on Earth. LES, which Helbing describes as a “nervous system for the planet,” would amass everything from government economic statistics to tweets from everyday Joes. It could also tap into data generated by the increasing number of Internet-connected machines and sensors, and even peruse photos uploaded to the Web by smartphone cameras.
To make sense of this bewildering tsunami of seemingly unrelated data, LES employs complicated algorithms, or predictive equations, to look for interconnections between apparently disconnected events. Helbing envisions that the simulator will be able to predict events ranging from wars to financial crises to epidemics of infectious disease — ideally, with enough time to spare so that political, business and scientific leaders can take steps to avert disasters before they actually occur.
Ray Kurzweil
As a 13-year-old, the New York City-born Kurzweil used telephone parts to fashion a calculator that could find square roots, and by the time he reached Massachusetts Institute of Technology in the late 1960s, he’d already founded a successful analytic software company and sold it for $100,000. In the decades that followed, Kurzweil dreamed up a slew of world-changing innovations, ranging from optical character recognition software to voice and music synthesizers. But the man who arguably is America’s greatest living inventor — Inc. magazine once called him the “rightful heir to Thomas Edison” — probably has achieved even more fame as a futurist.
Kurzweil wasn’t the first to predict that machines eventually would eclipse human intelligence, but he’s boldly put a date on the Singularity, as futurists call that anticipated event. In a 2005 essay, Kurzweil declared that by 2045, “nonbiological intelligence,” as he calls it, will not only have surpassed human capabilities, but will be 1 billion times smarter than the cumulative total of human thinking ability today. But Kurzweil isn’t afraid that some malevolent machine will decide to destroy the human race, the scenario depicted in the “Terminator” film series. Instead, he anticipates a future in which human and machine intelligence will blend together to achieve even more amazing innovations and progress. Kurzweil also envisions humans becoming increasingly artificial in other ways as well. By the early 2030s, he predicts, most of our internal organs will have been replaced by tiny robots, which will last longer than flesh and work more efficiently.
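Kurzweil’s billionfold figure is easy to sanity-check with doubling arithmetic. Assuming, purely for illustration, that machine capability doubles once a year (a stylized stand-in for his law of accelerating returns, not his exact model), a billionfold gain requires about 30 doublings, which fits comfortably inside the 40 years between the 2005 essay and 2045:

```python
import math

# How many doublings produce a billionfold gain in capability?
target = 1_000_000_000
doublings = math.log2(target)
print(round(doublings, 1))  # about 29.9 doublings
```

This is the essence of exponential forecasts: the headline number sounds fantastical, but the per-step growth rate it implies is modest.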
William Gibson

Unlike forecasters who rely on crunching data, the South Carolina-born Gibson — author of novels such as “Neuromancer,” “Virtual Light,” “Pattern Recognition” and the recent “Zero History” — is more of a latter-day Jules Verne, using his imagination to concoct a science-fiction vision of the future. Gibson, who now lives in Canada, began writing fiction in the early 1980s on an old-fashioned manual typewriter, and it was in those early stories that he coined the term “cyberspace.” His fantasy bore a startling resemblance to today’s actual multimedia Internet, which at the time existed only as a bare-bones system connecting a few university and military research institutions. Indeed, as science journalist Pagan Kennedy noted in 2012, “A decade later, when we all stepped into cyberspace, the word seemed just right.” But the future Gibson sketches is dark and dystopian rather than glittering with promise. His 1988 book “Mona Lisa Overdrive,” for example, describes a phenomenon called “neuroelectronic” addiction, in which “wireheads” become so addicted to digital content that they end up as shriveled, comatose wraiths in cots, hardwired to modems. But Gibson has also predicted more uplifting uses of technology. In his 1996 novel “Idoru,” he depicts a Chinese city that’s demolished by authorities, only to be defiantly resurrected in cyberspace as an online oasis for political and creative freedom.
Aubrey de Grey
Centuries ago, the Spanish explorer Ponce de Leon sought a mythical fountain of youth, whose waters were believed to reverse the ravages of old age. Today, the British-born de Grey predicts a future in which we’ll be able to actually achieve that, by altering our bodies at the cellular and molecular level to repair damage or even prevent the changes associated with aging. Not only that, but he’s helping to lead research efforts to accomplish the dream of a human lifespan that would be vastly longer than it is now.
The Cambridge University alumnus started out in computer science but then switched to the emerging field of biogerontology. De Grey has sketched out an actual plan for rejuvenating the human body, which he calls Strategies for Engineered Negligible Senescence (SENS); it breaks the phenomenon of aging into seven specific classes of damage and identifies detailed approaches for addressing each. De Grey now heads the SENS Foundation, a nonprofit organization that promotes research, and is editor-in-chief of Rejuvenation Research, a peer-reviewed scientific journal. In a 2010 interview with the Guardian, a British newspaper, de Grey said that he believes the human lifespan eventually will be extended to 1,000 years, and estimated that there is a 30 to 40 percent chance that the first person who will live for a millennium is already walking the planet.
Paul Roberts
Roberts, a 1983 graduate of the University of Washington, is a journalist who has written for Harper’s magazine, National Geographic and numerous other publications. He covers the complex interplay of economics, technology and the natural world. He’s one of the most prominent forecasters promoting the theory of “peak oil,” which holds that the world may already have achieved its maximum petroleum production, and that supplies of the fuel will decline dramatically in decades to come.
In his 2004 book, “The End of Oil,” Roberts predicts that energy demand will continue to rise as people in developing nations clamor for the automobiles, larger air-conditioned homes, and electronic entertainment available in the U.S. and other technologically and economically advanced societies. Increasingly intense competition for shrinking supplies of petroleum and other fossil fuels, in turn, will lead to conflict and political instability. At the same time, climate change, driven by humans burning petroleum and other fuels and releasing greenhouse gases into the atmosphere, will have increasingly destructive effects.
“As energy supplies become harder to transport, as environmental effects worsen, and as energy diplomacy sows even greater geo-political discord, the weight of the existing energy order becomes less and less bearable — and the possibility of a disruption more undeniable,” Roberts writes. He sees it as imperative for the U.S., a major consumer of the world’s energy, to avert an eventual global catastrophe by becoming more energy-efficient and developing alternative energy sources to replace petroleum and other fossil fuels.
Faith Popcorn

Popcorn gained fame for spotting the emerging trend of “cocooning,” in which people overloaded with stimulation stay at home and watch videos instead of going to movie theaters, and have take-out food from restaurants delivered to their doors. She also accurately predicted that many women who had gained professional opportunities would eventually become disillusioned with the corporate “rat race” and quit in search of healthier, simpler lives. Since then, Popcorn has predicted a variety of other consumer trends. Some, such as rising demand for cosmetic surgery, tattooing and other forms of body modification, have already come to pass. But others — such as Popcorn’s prediction that young consumers would begin rejecting name brands and altering designer clothes and logos to express their individuality — have yet to take hold. (Incidentally, those same predictions emerged in a William Gibson novel.)
John Naisbitt

A former U.S. Marine and executive for IBM and Kodak, Naisbitt served as an aide to two presidents, John F. Kennedy and Lyndon B. Johnson, before authoring the 1982 bestseller “Megatrends,” which predicted the rise of a fast-moving global economy and a society in which information would be a commodity on a par with manufactured products. In the days before news was available on the Web, Naisbitt based his predictions on what was essentially an analog, paper-based form of Googling: he and his staff searched through more than 200 daily newspapers, looking for recurring events and public behavior.
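Naisbitt’s newspaper-scanning method is, at bottom, frequency counting, and its modern equivalent is a few lines of code. The article snippets and tracked keywords below are invented for illustration; the sketch simply counts how many articles mention each topic and keeps those that recur widely enough to suggest a trend.

```python
from collections import Counter

def recurring_topics(articles, keywords, min_articles=3):
    """Count how many articles mention each tracked keyword and keep
    those that recur in at least `min_articles` of them."""
    counts = Counter()
    for text in articles:
        lowered = text.lower()
        for kw in keywords:
            if kw in lowered:
                counts[kw] += 1
    return {kw: n for kw, n in counts.items() if n >= min_articles}

# Hypothetical article snippets and tracked topics
articles = [
    "Factory jobs give way to information services in the region",
    "Global trade in information services accelerates",
    "Local manufacturing plant closes; data-processing firm opens",
    "Survey: households now rank information access with basic goods",
]
topics = recurring_topics(articles, ["information", "manufacturing"])
print(topics)  # {'information': 3}
```

Scaled up from four snippets to two hundred newspapers a day, this is essentially the content-analysis method behind “Megatrends”: a topic that keeps reappearing across independent outlets is treated as a signal rather than noise.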
Since then, Naisbitt has written numerous other books, including a 1990 sequel to “Megatrends,” a version of “Megatrends” aimed at women, and the 2010 “China’s Megatrends,” in which Naisbitt predicted that China eventually would create an entirely new social and economic system that would serve as an alternative to western-style democracy. Naisbitt also forecast, among other things, the growing intellectual freedom in China and rise of a Chinese version of country music.