It’s hard to believe that just 25 years ago seismic interpretation was a manual process. Rather than workstations, provisioning a geoscientist meant providing a solid drafting table, good light, plenty of colored pencils, and paper weights. The process of developing a prospect could take months or even years. The advent of computer-based systems was a boon to the industry and has drastically reduced workflow times and the risk of drilling dry holes. The infrastructure evolved from large, expensive turnkey systems shared by many geoscientists to large, isolated, individual workstations. For data backup, disk was expensive and tape ubiquitous.
The client/server model coupled with shared storage decreased costs, reduced the use of tape, and streamlined access to data. Thin client technology, which puts minimal hardware on the desktop and relies on the computing power of the server, was yet another innovation used to reduce costs. One could argue that today we have come back to thick clients with “fat” (large memory, multiple CPUs) Windows and Linux based workstation applications.
A new IT revolution is upon us – the shift toward cloud computing. The private cloud, deployed on a company’s private LAN/WAN, is a dramatically more efficient and effective model for delivering IT as a service, making IT easier and more cost-effective to install, use, and manage. Large companies with sensitive data that must be both well protected and easily accessible are turning to the private cloud. A private cloud can accelerate geoscience workflows, increase efficiency, reduce costs, and provide a more robust and resilient IT infrastructure that can be rapidly deployed and better protected. Part of the journey to the private cloud is virtualizing applications.
CIOs of oil and gas companies around the globe recognize the advantages of the private cloud and are embracing the journey by virtualizing applications. The benefits are too great and numerous to ignore. Corporate applications such as email, HR, and accounts payable are already being converted to this new model, and the benefits are tangible. Server virtualization alone eliminates redundant hardware, optimizes infrastructure costs, and increases operational efficiency.
The challenge with cloud computing in upstream oil and gas
Applying this virtualized IT model to the upstream computing environment is what we call the “PetroCloud”. The biggest challenge in realizing the PetroCloud is virtualizing geology and geophysics applications, because they require high-end graphics. There are promising technologies on the immediate horizon that should address that issue.
Exploration and production teams rely heavily on interactive 3D applications as part of their daily workflows. These applications generally achieve usable performance only if the 3D rendering is hardware-accelerated, i.e., graphics accelerators sit in the geoscientist’s workstation; the desirable performance for these applications is 20–30 frames per second. Organizations that would like to move toward a more centralized PetroCloud model of application deployment have therefore been constrained by their inability to move key 3D applications off the user’s desktop.
These applications are designed to render data in the graphics hardware on the desktop, which requires the data to be local or supplied via a very large network pipe. The other option is to execute the application on the server and have it send all of its drawing commands and data, both 2D and 3D, to the client’s rendering hardware. Neither method supports real-time 3D rendering performance over a WAN (wide area network).
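The bandwidth gap can be illustrated with back-of-envelope arithmetic. The resolution, frame rate, and compression ratio below are illustrative assumptions, not vendor figures:

```python
def raw_stream_mbps(width, height, fps, bytes_per_pixel=3):
    """Bandwidth in megabits/s needed to ship uncompressed 24-bit frames."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

# Streaming raw 1080p frames at the 30 fps target:
uncompressed = raw_stream_mbps(1920, 1080, 30)
print(f"Uncompressed 1080p @ 30 fps: {uncompressed:.0f} Mbps")   # ~1493 Mbps

# Even an aggressive (assumed) 50:1 codec still needs ~30 Mbps --
# routine on a LAN, marginal on many WAN links, and still far cheaper
# than moving a multi-gigabyte 3D volume for client-side rendering.
print(f"With 50:1 compression: {uncompressed / 50:.0f} Mbps")
```

The point of the arithmetic is that the cost of shipping rendered frames is fixed by screen size and frame rate, whereas the cost of shipping the 3D data grows with the survey itself.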
How does new technology support 3D rendering applications in the cloud?
New virtual graphics technology from companies like VMware provides the architecture for 3D rendering to occur on the server machine, where there is a fast, direct link between compute, graphics, and storage resources. Only the resulting 2D images must then be sent to the client, and they can be delivered at the same frame rate regardless of the size of the 3D data files used to generate them.
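The model can be sketched in a few lines. This is a minimal illustration, not any vendor’s protocol: real remoting stacks use hardware video codecs rather than the general-purpose `zlib` compression that stands in here, and `encode_frame`/`decode_frame` are hypothetical names.

```python
import struct
import zlib

def encode_frame(pixels: bytes) -> bytes:
    """Server side: compress the already-rendered 2D frame.
    Only this message ever crosses the wire to the thin client."""
    payload = zlib.compress(pixels)
    return struct.pack(">I", len(payload)) + payload  # length-prefixed message

def decode_frame(message: bytes) -> bytes:
    """Thin client side: recover the 2D image for display."""
    (length,) = struct.unpack(">I", message[:4])
    return zlib.decompress(message[4:4 + length])

# A stand-in rendered frame: the wire cost depends only on the 2D image
# dimensions, never on the size of the 3D volume that produced it.
frame = bytes(640 * 480 * 3)        # blank 640x480 RGB frame
assert decode_frame(encode_frame(frame)) == frame
```

The design point is that the server does the heavy 3D work next to the data, and the client needs only enough capability to decode and display a stream of 2D images.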
As technologies that support server side hardware rendering are becoming available from companies like VMware, the important items to consider are:
- Is server-side graphics rendering done in hardware? Software rendering is not fast enough to support interactive applications.
- Can the technology leverage one graphics card to render and support multiple users or does it require one server and one graphics card per client?
- Is the technology compatible with industry-leading virtualization technologies?
Clearly, information infrastructure technology is rapidly evolving. The PetroCloud is on the horizon now and moving closer to an upstream environment near you. Companies that are quick to deploy this technology to achieve a more streamlined, better protected, and cost-effective infrastructure will surely enjoy a competitive advantage in both computing flexibility and overall cost to the operation.