Supercomputers are the fastest and most powerful computers in the world, capable of quadrillions of floating-point operations per second. The first supercomputers entered regular use in the early 1970s. In the 1990s, machines with thousands of processors began to appear, marking the beginning of the modern supercomputing era; now, in the 21st century, tens of thousands of processors are the norm. These powerful machines can help you and me, and the rest of humanity, in many areas of life. In July 2015, President Obama signed an executive order calling for the U.S. to build the world’s fastest computer by 2025, a sign of just how important supercomputers have become. They can help solve the world’s most challenging problems.
Supercomputers won’t take over the world, but they can help us do massive calculations. Calculations a supercomputer can finish in a matter of seconds would take a normal personal laptop over 65 years to complete. And that’s thinking small. Supercomputers are now one of our best bets for improving civilization. They should lead to advances in modern medicine, business, science, technology, surveillance and security, and more. One day, perhaps, supercomputers could help end cancer for good.
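For a sense of scale, here is a rough back-of-the-envelope version of that comparison. The machine and laptop speeds below are illustrative assumptions (roughly a Tianhe-2-class system, the fastest supercomputer in 2015, versus a typical consumer laptop), not figures from any specific benchmark:

```python
# Back-of-the-envelope speed comparison. Both performance figures are
# illustrative assumptions, not measured values.
super_flops = 33.86e15   # ~34 petaflops: floating-point ops/sec, supercomputer
laptop_flops = 50e9      # ~50 gigaflops: floating-point ops/sec, laptop

# A job that keeps the supercomputer busy for one hour...
job_ops = super_flops * 3600

# ...would take the laptop this many years:
laptop_years = job_ops / laptop_flops / (3600 * 24 * 365.25)
print(round(laptop_years))  # on the order of 77 years
```

Change either assumed speed and the ratio shifts, but the gap stays in the range of human lifetimes versus seconds or hours.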
Yet the biggest challenge in supercomputing is developing useful, functional architectures and the right mechanisms for processing large datasets. Data is accumulating at an immense rate, but so far we have extracted little meaningful information from it.
Here are 10 Mysteries You Didn’t Know Supercomputers Could Solve In The Future.
10. Cancer Diagnosis and Treatments
A genome is the complete set of DNA instructions in one of your body’s cells. Since 1990, various human genome projects have gathered huge amounts of data through bioinformatics, a field that relies heavily on supercomputers. As the technology improves, the cost of sequencing a single human genome keeps falling, and we can analyze more genomes at a faster pace. The data is applicable across a vast range of medical and scientific disciplines, including cancer treatment.
In the future, we could develop therapies personalized to each cancer patient, and genome analysis will give us a more thorough understanding of how cancer works. By identifying different forms of cancer, we can develop more specialized drugs that target specific genomes, predict how fast a cancerous cell will progress, or estimate how likely a tumor is to return. All this with far more accuracy and patient personalization than is now possible.
9. Supersonic Flight
Fluid dynamics, the study of fluids (liquids, gases, and plasmas) in motion, can help us in our efforts to make supersonic flight more suitable for commercial use. Supercomputer research labs around the world are yielding more accurate and complex fluid dynamic simulations than ever before. This larger computational power could thrust us into a new understanding of aerodynamics. Only two civilian supersonic transports (SSTs), taking passengers at speeds greater than the speed of sound, were ever realized. The famous Concorde took its last commercial flight in 2003. And the Tupolev Tu-144 took its last passenger flight in 1978.
There are many obstacles to overcome before we will be ready for the next wave of SSTs, including the loud sonic booms and expensive construction materials. But General Electric’s research and development division is currently tackling the problem. So a new era of commercial supersonic flight could be closer than we think.
8. Cures For Strokes And Heart Failure
Supercomputers could offer the first high-resolution 3D simulation of “hemodynamics in the systemic arterial tree at cellular resolution.” In plain English: blood flow through the cardiovascular system. This new technology could give a more complete picture of a patient’s cardiovascular condition, allowing physicians to find solutions to circulatory problems much earlier than ever before.
The 3D models would help physicians identify and treat patients at risk for major health concerns, such as strokes and heart failure, much more efficiently than previously possible. Physicians could analyze a patient’s circulatory system at lightning speed. All this could greatly reduce the number of deaths each year from cardiovascular conditions, including but not limited to heart defects, aneurysms, and atherosclerosis.
7. The Consumer’s Next Purchase
Data mining is like mining for gold: it lets organizations comb through billions of data points for the one valuable piece of information that leads to groundbreaking new techniques in areas such as medicine, science, business, and even video games. Data mining helps businesses reach decisions faster, surfacing insights at a remarkable speed. A human being would never have the time or brainpower to sort through all that data.
Currently, life insurance companies use supercomputers to lower business risks, cutting costs while analyzing the benefits of different treatment options. Supercomputers are ideal for solving some of the most important computational problems of “Web 3.0,” such as social network mining, customer relationship management, personalized recommendations, spam filtering, and privacy solutions. With all the data piling up, it takes a supercomputer to crunch the numbers.
6. Global Climate Change
Climate models simulate the interactions of the main drivers of climate, including the atmosphere, oceans, land surface, and ice. In recent years, supercomputers have begun delivering high-resolution climate models. But even with all that power, average forecasting skill still extends only about six days ahead.
The “transient climate response” (TCR) describes the rise in global mean temperature (global warming) that would occur after 70 years of CO2 (carbon dioxide) concentrations increasing by 1% per year, by which point the concentration has roughly doubled. The TCR is one of the largest sources of uncertainty about the impacts of climate change. According to Chris Hope of the University of Cambridge, we could see a 50% reduction in the TCR’s uncertainty range by 2030, or earlier, if international computing centers dedicate themselves to running high-resolution global climate models. This is remarkable considering that only five years ago, calculating climate change up to 100 years ahead wouldn’t have been possible.
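The 70-year window in that definition isn’t arbitrary: a 1% annual increase compounds, so after about 70 years the CO2 concentration has doubled. A quick sanity check:

```python
# Compound a 1% annual increase in CO2 concentration over 70 years.
growth = 1.01
years = 70
factor = growth ** years  # concentration relative to the starting level
print(round(factor, 2))   # ~2.01, i.e. roughly a doubling
```

That is why the TCR is often described as the warming at the time of CO2 doubling.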
5. The Human Brain
Hundreds of researchers and partner institutions are looking for ways to simulate the human brain. Neuroscientist Henry Markram leads two of the most ambitious projects. “The Blue Brain Project,” founded in 2005, uses one of the fastest computers in the world to reconstruct 3D models of the human brain, with the mission of uncovering the brain’s architectural and functional principles. A cellular-level model of the human brain could be possible by 2023.
“The Human Brain Project,” also headed by Henry Markram, aims to provide an informatics infrastructure for brain research by 2023. This could give us deep insight into the way the brain works, providing the scientific community with computational tools, mathematical models, and databases comprehensive enough to share data in a meaningful way. The Human Brain Project could also have a decisive impact on neurorobotics, guiding ongoing efforts in robotics and artificial intelligence into their next stages of evolution.
4. Long-Term Source of Energy
By harnessing the power of supercomputers, a secure and reliable long-term source of plasma energy becomes a real possibility for the 21st century. The concept comes from the world of nuclear fusion, which could also have a major impact on resolving climate change because the fusion process releases no carbon dioxide. In theory, lower CO2 emissions would allow greater amounts of infrared radiation to escape into space, reducing humanity’s harmful effects on the Earth’s climate.
To model this, physicists combine advanced computation, experiment, and theory, and it is supercomputers that make such complex simulations of confined plasmas possible. The calculations involve billions of particles and would be impossible without access to some of the world’s most powerful computers. As a result, physicists are gaining an ever deeper understanding of plasma, and we can expect major milestones in nuclear fusion energy projects as early as 2023.
3. Low-Risk Pharmaceuticals With Superpowers
The pharmaceutical industry wants to make drugs that cure new diseases, cost less, and have fewer adverse side effects. To do this, the industry has opened up a discipline dubbed “polypharmacology,” which depends on supercomputers for its underlying experimentation. The goal is to create a drug that can interact with many proteins, or to remodel an existing drug so that it makes beneficial secondary interactions with alternative protein structures. In effect, “polypharmaceuticals” could treat multiple diseases while avoiding harsh side effects, making the drugs much more effective.
Designing a poly-pharmaceutical is a great challenge, requiring the most powerful computers in the world and their ability to process mountains of data in seconds. Researchers estimate that the Department of Energy’s Titan Supercomputer can process almost 40 million compounds in a single day.
2. The Next Week Of Weather… With Pinpoint Accuracy
Numerical weather prediction (NWP) is a set of mathematical models used in weather forecasting. A hundred years ago, meteorologists carried out NWP by hand. It wasn’t until the 1950s that computers became a bona fide tool for calculating weather forecasts.
Even now we can only predict the weather about six days in advance, so a huge focus of research and development within the meteorological community is on supercomputers. The idea is to create mathematical models that we can feed into supercomputers to produce more accurate weather forecasts and extend the lead time of predictions, especially in cases of hurricanes, tsunamis, earthquakes, and so forth. “The advanced computers are critical to advancing NOAA’s (National Oceanic and Atmospheric Administration) ability to make ever-increasingly accurate weather forecasts and climate outlooks,” according to Louis W. Uccellini, director of the NOAA National Weather Service, which now employs three supercomputers in its repertoire.
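To give a flavor of what those mathematical models look like, here is a deliberately tiny sketch: a one-dimensional advection equation (something like wind carrying a patch of moisture) solved with a first-order upwind finite-difference scheme. Real NWP systems couple many such equations over millions of grid points; all the numbers here (grid size, wind speed, time step) are illustrative assumptions, not values from any real forecast model.

```python
# Toy 1D advection model: a miniature of the kind of equation NWP
# systems solve at massive scale. All parameters are illustrative.
N = 100   # number of grid points
dx = 1.0  # grid spacing
c = 1.0   # constant "wind" speed
dt = 0.5  # time step (satisfies the CFL stability condition c*dt/dx <= 1)

# Initial condition: a square bump of "moisture" in mid-domain.
u = [1.0 if 40 <= i < 60 else 0.0 for i in range(N)]

def step(u):
    """Advance one time step with a first-order upwind scheme.

    Periodic boundaries: u[i - 1] wraps around at i = 0 because
    Python's u[-1] is the last element.
    """
    r = c * dt / dx
    return [u[i] - r * (u[i] - u[i - 1]) for i in range(N)]

for _ in range(100):  # march the bump downwind for 100 steps
    u = step(u)
```

The scheme conserves the total amount of “moisture” while transporting the bump across the domain; a supercomputer’s job is to do the same kind of arithmetic for the whole atmosphere, in three dimensions, fast enough to beat the weather itself.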
1. The Universe
Astrophysicists are working on simulations of the entire Universe, built from carefully specified initial conditions. An estimated 75 percent of cosmological theorists make use of these computer simulations. The goal is to model the Universe from the Big Bang, starting when it was smaller than a single atom, all the way up to the present day, and, far in the future, to create simulations with resolutions equal to every individual particle in the Universe, including quarks, electrons, and photons. Current simulations already let astrophysicists tackle some of the most extreme questions about the Universe, including the current theories of dark matter and dark energy, and how massive black holes, a billion times the mass of the sun, formed so shortly after the Big Bang. This makes way for dramatic answers about our Universe that we have never been able to reach before.