Digital transformation was a hot topic during the first day of the 23rd Annual ARC Industry Forum. Analysts, vendors and customers all had various angles and advice, which led to some thought-provoking discussions and questions. There was also an abundance of parallel tracks — and I am still catching up on some highlights from colleagues. However, I did see a number of trends come into focus from the opening keynote and the track I was in (Analytics and Machine Learning).
So without further ado, here are the "Five Ds" that I noted.
Direction

In the opening keynote, Mike Guilfoyle of ARC pointed out an all-too-common problematic path with digital transformation. Faced with so many possible choices and directions, many people abandon trying to clearly define a direction at the outset and jump straight into a technology.
In one slide, Mike showed several paths that one can pursue, and then half-jokingly went on to note that he could have listed 10 more slides like it — or even 100 more.
The key point is to focus first on which business problem you want to solve, and then look for the technology that can solve it. Nihar Satapathy of thyssenkrupp North America noted that one way his firm fostered this approach was to develop a team of "translators" who were adept at bridging the gap between business needs and the technology implementers.
Debt

One particularly prevalent aspect of digital transformation in the process industries is technical debt. Many companies have expensive legacy technology in place, and it is not easy to simply rip it out and replace it wholesale.
Larry Megan of Praxair Digital summarized this well, pointing out that digital transformation is going to be unique to each environment. As he noted, for some companies, just being able to get all their data into one dashboard may be a significant step in their digital transformation, even if this is usually considered an early step in a modern business information evolution. So progress is very much a measure of starting position as well as goals.
Domain Expertise

On the morning panel, Patrick Holcomb of Hexagon PPM made a point about the need for domain expertise. The panel discussed the multitude of horizontal approaches available in areas like machine learning and AI (which I also noted in my blog on the FT Energy summit).
Patrick specifically pointed out that one needs expertise in failure mechanisms to actually address asset performance management (APM). As an example, he talked about the detailed knowledge required of compressor blades, including their design, seizing and other factors, to understand what would cause a compressor to fail in the first place.
Distributed

The afternoon panels contained several real-life use cases. Of particular interest were some examples from Eyad Buhulaiga of Saudi Aramco, who talked through the use of machine learning to model flares. He also talked about the intense level of security that is needed for plant data, and how that is driving distributed computing to the edge of the plants.
Of course, none of this is without tradeoffs. However, the advantages clearly stood out and satisfied the needs of the environment, including limited or no connectivity, bandwidth costs, cybersecurity risks and latency.
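The talk did not go into implementation detail on the flare-modeling work, but the general pattern, learning a predictor of flaring events from plant sensor data, can be sketched in a few lines. Everything below is hypothetical: the features (pressure, temperature, flow), the labels and the simple linear classifier are stand-ins for illustration only, not Saudi Aramco's actual approach.

```python
# Purely illustrative sketch: predict flaring events from (fabricated)
# plant sensor readings using a simple linear classifier.
import numpy as np

rng = np.random.default_rng(0)

# Fabricated sensor features: pressure, temperature, flow (500 readings).
X = rng.normal(size=(500, 3))
# Fabricated label: 1 = flaring event, here driven by pressure + temperature.
y = ((X[:, 0] + X[:, 1]) > 1.0).astype(float)

# Fit a linear model by least squares (a minimal stand-in for a real model).
A = np.hstack([X, np.ones((500, 1))])  # add an intercept column
w, *_ = np.linalg.lstsq(A, y, rcond=None)

# Threshold the linear score to get a class prediction.
pred = (A @ w > 0.5).astype(float)
accuracy = (pred == y).mean()
print(f"in-sample accuracy: {accuracy:.2f}")
```

In a real edge deployment of the kind described above, a lightweight model like this would be trained centrally and pushed to compute at the plant edge, so that raw sensor data never has to leave the site, which addresses the connectivity, bandwidth and security constraints mentioned.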
(Just) Do It
With so many options, so much jargon and so many legacy applications, where does one start? Ultimately, Paula Hollywood, one of the panelists in AspenTech’s session on Digital Acceleration, summed it up by noting that people can’t get caught up in “analysis paralysis” – instead, pick a project, start, and look for the value. Be discerning, and don’t try to “boil the ocean.”
And that was just a sliver of day 1. I’m excited for the next few days of the digital journey — thanks to ARC for a great start so far!