Houston 2009 Annual Meeting


Prescient geophysicists discuss ground-breaking research and future expectations

Nina M. Rach, Senior Editor, E&P


This first special session featured eight leading researchers from North America, Europe, and Saudi Arabia. It was chaired by Masoud Nikravesh and Dave Wilkinson.

A. J. Berkhout, Delft University of Technology, presented “Grand challenges for geophysics—a seismic vision of the future.” Berkhout considers multiples “beauties” and says that we need to delve more deeply into the “science of noise.” All multiples, even internal multiples in the inverse data space, can be used to elicit additional information from our data. Interfaces are “doubly illuminated” and can be imaged from both the top and the bottom. He sees a bright future for full-wavefield migration processing and for further development of incoherent shooting.

Juan Meza, Lawrence Berkeley National Laboratory, presented “Future Directions in High Performance Computing (HPC), 2009–2018,” characterized by the session chairs as the “WOW” talk of the day. Meza said that computing is changing more rapidly than ever before, with a turning point in 2004, when 15 years of exponential clock-rate growth ended and processor performance stopped tracking Moore’s law.

Now that PC and desktop systems are no longer the economic drivers, architecture is again about to change. The number of cores per chip will double every 18–24 months, said Meza, but clock speed will not necessarily increase. Meza expects systems in 2014 to be clusters of many-core nodes with many sockets per node. Memory bandwidth and power consumption will be the limiting factors, requiring low-power designs.
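For a sense of scale, the short Python sketch below projects Meza’s doubling rule out to 2018; the 2009 baseline of four cores per chip is an illustrative assumption, not a figure from the talk.

# Back-of-the-envelope projection of Meza's core-count doubling rule.
# The 2009 baseline of 4 cores per chip is an assumed starting point.
base_year, base_cores = 2009, 4

for year in range(2009, 2019):
    fast = base_cores * 2 ** ((year - base_year) * 12 / 18)  # 18-month doubling
    slow = base_cores * 2 ** ((year - base_year) * 12 / 24)  # 24-month doubling
    print(f"{year}: {slow:6.0f} to {fast:6.0f} cores per chip")

Under these assumptions, a four-core 2009 chip becomes roughly a 90- to 256-core chip by 2018, which is why memory bandwidth and power, not clock speed, become the binding constraints.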

Panos G. Kelamis, Saudi Aramco, presented “A new era in land seismic: The near-surface challenge.” More than 60% of the world’s proven reserves are on land, he said, and the near surface presents problems with signal-to-noise ratio, statics, imaging, and surface multiples. Future solutions will hinge on the integration of complementary data sets, such as joint inversion combined with micro-gravity. Acquisition will be enhanced with wireless systems, low-frequency sources and receivers, buried sensors, and simultaneous, blended sources. Kelamis stressed that the effects of the near surface should be recognized as an imaging issue, not a statics issue.

Felix J. Herrmann, Univ. of British Columbia, presented “Sub-Nyquist sampling and sparsity: How to get more information from fewer samples.” The impediments to full-waveform analysis are acquisition cost and long turnaround time. Herrmann promotes a combined strategy of linear dimensionality reduction, such as randomized simultaneous acquisition with source encoding, followed by nonlinear recovery. Compressive sensing (CS) couples randomized subsampling, which turns aliased interference into incoherent noise, with sparsity promotion. Herrmann says that dimensionality reduction will revolutionize the field and reduce acquisition costs.
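A minimal numerical sketch of that recipe, in Python: a sparse spike train is recovered from a quarter of the usual number of samples via sparsity promotion. The generic random measurement matrix and the iterative soft-thresholding (ISTA) solver are illustrative assumptions, not Herrmann’s actual workflow.

# Compressive-sensing toy: recover a sparse signal from few random
# measurements by sparsity-promoting (L1) recovery with ISTA.
import numpy as np

rng = np.random.default_rng(0)

n = 512                       # full trace length
k = 8                         # number of nonzero spikes (sparse reflectivity)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

m = n // 4                    # keep only 25% as many measurements
A = rng.standard_normal((m, n)) / np.sqrt(m)   # randomized "encoded" sampling
y = A @ x

def ista(A, y, lam=0.01, n_iter=500):
    """Iterative soft thresholding for min 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of gradient
    xk = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ xk - y)                 # gradient step
        z = xk - g / L
        xk = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return xk

x_rec = ista(A, y)
print("relative error:", np.linalg.norm(x_rec - x) / np.linalg.norm(x))

The point of the toy is the one Herrmann makes: random subsampling spreads aliasing into noise-like interference, which a sparsity-promoting solver can then separate from the signal.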

Mark Thompson, StatoilHydro, presented “Past and future trends: Two decades of ocean bottom seismic experience in light of Moore’s law.” StatoilHydro has performed 62 ocean-bottom seismic (OBS) surveys off Norway, beginning with a Gullfaks trial in 1989. The resulting data produce better imaging and feature resolution. Processing each survey required nearly a year until a breakthrough on the 120-sq.-km, 36-terabyte Statfjord 3D survey in 2002, when processing was completed in only 123 days. This was improved again in 2008, when data from the Snorre field were processed in only six hours using a Linux cluster. “The ability to make informed business decisions is driving data throughput,” Thompson said.

Fred Aminzadeh, University of Southern California, presented “Soft Computing (SC) for intelligent oilfield application.” SC involves artificial neural networks, fuzzy logic, and genetic algorithms. Hard computing, by comparison, involves histograms, variograms, clustering, and kriging. He compared the relative abilities of hard and soft computing to solve practical problems, and presented five actual problems solved with a combination of techniques. It’s “important to see the intrinsic power and value of each method,” he said.
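As one flavor of what soft computing means in practice, here is a toy genetic algorithm in Python; the objective function and parameters are illustrative assumptions, not drawn from Aminzadeh’s oilfield examples.

# Toy genetic algorithm: evolve 8-parameter candidates toward a target.
import numpy as np

rng = np.random.default_rng(1)

def fitness(pop):
    # Illustrative objective: reward closeness to 0.3 in every parameter.
    return -np.sum((pop - 0.3) ** 2, axis=1)

pop = rng.random((50, 8))                  # 50 random candidate solutions
for gen in range(100):
    f = fitness(pop)
    parents = pop[np.argsort(f)[-25:]]     # selection: keep the fittest half
    mates = parents[rng.permutation(25)]
    cut = rng.integers(1, 8, size=25)[:, None]
    mask = np.arange(8)[None, :] < cut
    children = np.where(mask, parents, mates)         # one-point crossover
    children += rng.normal(0, 0.02, children.shape)   # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax(fitness(pop))]
print("best solution:", np.round(best, 3))  # converges toward 0.3 per gene

Unlike kriging or variogram fitting, nothing here assumes a statistical model of the data; the population simply evolves toward whatever the fitness function rewards, which is the appeal of SC methods for messy oilfield problems.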

Hilde Nakstad, Optoplan AS, presented “Toward the optical oil field,” which she defined as a producing offshore field where key downhole and subsea monitoring is conducted by an optical fiber sensing system and all data is transferred to shore through optical networks.

Optical fibers are thinner than 0.1 mm, have an outer polymer protective layer, and transfer data at speeds greater than 1 million Mb/sec (1 terabit per second). The first optical pressure-temperature gauges were installed in the Gyda field by BP in 1994. Statoil ran a pilot at the Snorre field in 2008 and later calculated that time-lapse reservoir monitoring yielded a 6% reduction in drilling cost and 5% additional reserve capture.

Richard Johnston, Schlumberger, in tandem with Dwight Smith, InnerLogix, presented “The effects of generational prejudice on working environments” to round out the session. The Big Crew Change is already here, they said, and today’s experienced professionals are working with “Millennials,” young scientists raised in an information-rich environment. Unstructured data needs to be harnessed, they argued: with robust auto-translation engines that extend access to content in nonnative languages, intelligent character recognition (ICR) systems, and indexed geographic routing that presents spatial content to users in natural language. Solutions are available!

Photographs provided by Barchfeld Photography