For several decades, the Oil and Gas industry has been continuously challenged to produce more hydrocarbons at lower cost in response to the growing world demand for energy.
Progress in data acquisition, rapid advances in rock physics laboratories, and more powerful computers have greatly contributed to the development of advanced numerical algorithms that integrate increasingly complex physics and deliver high value to the O&G industry.
As we approach the exascale era, we expect that the order-of-magnitude increase in computing capability, combined with the emergence of high-performance data analytics (HPDA) and machine learning, will further contribute to the development of more accurate and efficient algorithms.
However, as we transition from post-petascale to pre-exascale HPC technologies, scientists face challenging problems and questions. There is a notable increase in complexity, in both the hardware and the software stack. The technology roadmap is still evolving and is sometimes difficult to interpret from a user perspective: which technology for which application? The HPC landscape is becoming more and more heterogeneous. How will the emergence of cloud computing impact HPC and the future of the datacenter? With the predicted end of Moore's law by the mid-2020s, how will HPC evolve? Are there disruptive technologies to be explored or investigated, and what will their impact be on numerical algorithms and software development?
In this presentation, we will review Total's experience in HPC and demonstrate the value that HPC has delivered to the O&G industry. We will see that our industry can still leverage the rapid evolution of computing capability, as highlighted by new "players" taking advantage of advances in HPC technologies. However, as we move toward exascale, we are witnessing a growing and increasingly diverse HPC ecosystem; from a user perspective, we will discuss the different challenges we are facing and the necessity of exploring new disruptive technologies.