The Algorithm in the Oilfield

Industry | 8 min read
Khalid had spent nineteen years in the oil and gas industry before he ever typed a single prompt into an AI model. He'd started as a junior reservoir engineer in the dusty fields outside Dhahran, climbed his way up to senior production engineer at a mid-sized operator, and earned a reputation as someone who could look at a pressure curve and feel, almost instinctively, whether a well was healthy or dying. His colleagues called him "the old-school one"—not because he was old, but because he trusted his hand calculations, his laminated reference tables, and his battered copy of Craft & Hawkins more than he trusted any piece of software.
So when his company rolled out an enterprise license for a large language model in early 2024, Khalid was skeptical. He attended the training session, nodded politely, and went back to his spreadsheets. This isn't for engineers, he thought. This is for the marketing people.
The shift began on a Tuesday, during what should have been an ordinary well review.
Khalid had been handed a portfolio of forty-three mature wells in a declining field and asked to identify candidates for workover. The data was a mess—decades of production records in inconsistent formats, some from legacy systems that hadn't been touched since the early 2000s, others exported from SCADA dumps, still others buried in scanned PDFs of handwritten field reports. His manager wanted recommendations in two weeks. Khalid calculated that, working his usual way, it would take him at least six.
That evening, frustrated and staring down the prospect of months of tedious data wrangling, he remembered the training session. He opened the AI tool, feeling slightly foolish, and typed: "I have production data from 43 wells in multiple formats. Can you help me think through how to standardize it for decline curve analysis?"
The response surprised him. It didn't try to do the work for him—it asked him questions. What data fields did he have? Was he working with monthly or daily production? Did he have water cut and GOR data? What decline model was he planning to use—Arps hyperbolic, exponential, or was he considering something more modern like a Duong model for unconventional behavior?
Khalid found himself in a conversation, not an interrogation. The AI was treating him like a peer. He explained his situation, and within an hour the two of them had sketched out a workflow: a Python script to parse the various file formats, a standardized schema for the well data, and a methodology for handling the gaps in historical records. The AI wrote the initial script. Khalid, who knew just enough Python to be dangerous, read through it, caught two assumptions that wouldn't work for his specific field (the AI had assumed constant bottomhole pressure; his wells had significant pressure variation), and asked for corrections. The AI revised the approach without ego, without defensiveness.
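The core of a workflow like theirs is fitting an Arps decline model to each well's cleaned rate history. The snippet below is a minimal sketch of that step only, fitting the hyperbolic form to synthetic monthly data; the rates, noise level, and parameter bounds are illustrative assumptions, not Khalid's actual dataset or script.

```python
import numpy as np
from scipy.optimize import curve_fit

def arps_hyperbolic(t, qi, di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*di*t)^(1/b)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

# Synthetic "cleaned" history: 60 months of rate data (STB/day)
# generated from known parameters plus 2% multiplicative noise.
t = np.arange(0, 60, dtype=float)
rng = np.random.default_rng(0)
q_obs = arps_hyperbolic(t, qi=800.0, di=0.08, b=0.6)
q_obs = q_obs * (1.0 + 0.02 * rng.standard_normal(t.size))

# Fit with physically sensible bounds (b is conventionally kept in (0, 2]).
popt, _ = curve_fit(
    arps_hyperbolic, t, q_obs,
    p0=[q_obs[0], 0.05, 0.5],
    bounds=([0.0, 1e-6, 1e-3], [np.inf, 1.0, 2.0]),
)
qi_fit, di_fit, b_fit = popt
```

In practice this fit would run per well over the standardized schema, with the exponential model (the b → 0 limit) as a fallback where the hyperbolic fit is poorly constrained.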
By Friday, he had clean data for all forty-three wells.
The real transformation came the following week. Khalid had always been strong at the physics of reservoir engineering, but he'd struggled with the statistical side—Bayesian updating, Monte Carlo simulations, the kind of probabilistic reasoning that modern production forecasting demanded. He'd faked it for years, leaning on deterministic estimates and adding conservative safety factors. Now, with the AI as a patient tutor, he started asking the questions he'd been too proud to ask his younger colleagues.
"Explain Bayesian updating to me like I'm a petroleum engineer who slept through statistics class."
The AI did. Then it walked him through applying it to his actual wells, using his actual data. When Khalid pushed back on a concept that didn't match his intuition about reservoir behavior, the AI didn't just capitulate—it explained the underlying math and showed him where his intuition was actually correct and where it was leading him astray. He learned more about uncertainty quantification in three weeks than he had in the previous decade.
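The kind of Bayesian updating he was learning has a simple conjugate-normal special case: a prior belief about a parameter is combined with a noisy measurement by weighting each with its precision. The sketch below applies it to a well's annual decline rate; the prior, observation, and uncertainties are made-up numbers for illustration, not values from his field.

```python
def normal_update(prior_mean, prior_var, obs, obs_var):
    """Posterior for a normal mean after one noisy observation.

    Precisions (inverse variances) add; the posterior mean is the
    precision-weighted average of prior and observation.
    """
    post_precision = 1.0 / prior_var + 1.0 / obs_var
    post_var = 1.0 / post_precision
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# Prior from analog wells: decline rate ~12%/yr, fairly uncertain.
mean, var = 0.12, 0.03 ** 2
# New estimate from this well's own data: 9%/yr, tighter uncertainty.
mean, var = normal_update(mean, var, obs=0.09, obs_var=0.02 ** 2)
# The posterior lands between 9% and 12%, pulled toward the more
# precise observation, and its variance shrinks below both inputs.
```

The intuition that had matched Khalid's instincts is visible in the arithmetic: the data-rich observation dominates, but the analog-well prior still tempers it.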
His workover recommendations, when he submitted them, were different from anything he'd produced before. Instead of a ranked list with single-value NPV estimates, he delivered probability distributions. For each candidate well, he showed the expected value, the P10 and P90 outcomes, and the sensitivity of the economics to the three most important uncertain parameters. His manager, Fatima, read through the report twice and then called him into her office.
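A report like that rests on a Monte Carlo loop: sample the uncertain inputs, compute NPV for each draw, and read P10/P50/P90 off the resulting distribution. The sketch below is a minimal, self-contained version for one hypothetical workover candidate; the distributions, capex, and discount rate are all invented for illustration. Note the oilfield convention that P10 is the high-side outcome (the 90th percentile of NPV).

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Uncertain inputs (illustrative distributions):
q_incr = rng.lognormal(mean=np.log(50.0), sigma=0.3, size=n)  # incremental rate, STB/day
decline = rng.uniform(0.10, 0.30, size=n)                     # annual decline fraction
price = rng.normal(70.0, 10.0, size=n)                        # oil price, $/STB

capex = 500_000.0   # workover cost, $
disc = 0.10         # annual discount rate
years = np.arange(1, 6)  # five-year evaluation window

# Yearly cash flow per draw: rate * 365 * price, declining
# exponentially, discounted back to time zero.
cash = (q_incr[:, None] * 365.0 * price[:, None]
        * np.exp(-decline[:, None] * years)
        / (1.0 + disc) ** years)
npv = cash.sum(axis=1) - capex

# P10 = high side, P90 = low side (oil & gas convention).
p90, p50, p10 = np.percentile(npv, [10, 50, 90])
```

Sensitivity to the three biggest uncertain parameters, as in Khalid's report, would follow naturally: rerun the percentiles while holding each input at its P50 and compare the spread.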
"Khalid," she said, "this is the best technical work I've seen from this department in two years. Where did this come from?"
He told her the truth. He'd been using the AI tool, not as a replacement for his judgment, but as a collaborator. It had helped him do the grinding data work faster, yes, but more importantly, it had helped him become a better engineer. It had filled in gaps in his knowledge without making him feel small for having them. It had caught his arithmetic errors. It had suggested modeling approaches he wouldn't have considered. And crucially, it had pushed back when he was wrong—gently, but firmly.
Over the following months, Khalid built what he came to think of as his "second brain." He used the AI to draft field trip reports, which freed him to spend more time actually thinking at the wellsite instead of scribbling notes. He used it to translate technical documents from the Chinese equipment vendor, catching nuances that the old translation software missed entirely. When a new graduate joined his team and needed mentoring, Khalid used the AI to help design a training curriculum, drawing on resources and case studies he wouldn't have had time to compile himself.
But he was careful. He'd seen younger colleagues become over-reliant on AI tools, pasting in outputs without understanding them, making recommendations based on reasoning they couldn't defend. Khalid set rules for himself. He never included a calculation in a report unless he could reproduce the logic by hand on a whiteboard. He never cited a reference the AI provided without verifying it existed. And he always, always, applied his own judgment last. The AI was a powerful assistant; it was not an engineer. It didn't know that the pump jack on Well 17 had a vibration issue that the data didn't capture. It didn't know that the operator on the night shift tended to underreport gas production. It didn't know the field the way Khalid knew the field.
Eighteen months after he first typed that hesitant prompt, Khalid was promoted to Technical Lead for unconventional assets. He presented at an industry conference on integrating AI tools into traditional reservoir engineering workflows. In his closing remarks, he said something that stuck with the audience:
"The AI didn't make me obsolete. It made me the engineer I should have been ten years ago. It took the parts of my job that were grinding me down—the data cleanup, the repetitive calculations, the translation of my ideas into polished reports—and it gave me back the time to do what I actually love. Which is sitting with the data, thinking about the reservoir, and figuring out what the earth is trying to tell us."
He paused, then added with a small smile: "It also forced me to learn statistics. Finally."
The old-school engineer had become something else. Not a new-school engineer. Just a better one.