This is the first part of a two-part series looking into how AI and automation are impacting wind turbine blade maintenance. In this post, I will look at blade inspections and the asset management side of them. The second part, to be published in August, will concentrate on the development of robotics and automated blade maintenance.
AI has been in the news a lot for the past few years. On one hand, the Microsoft Bing search engine now utilizes the OpenAI GPT-4 model to produce amazingly useful replies (and sometimes funny or questionable ones), and can even create art to your heart’s desire. On the other hand, some of the smartest and most influential people on the planet are warning that AI might kill us all. While the jury is still out on whether AI will become the civilization-ending entity pictured in many movies, as in most things in life, the truth lies somewhere in the middle.
“What has this got to do with wind turbine blade maintenance?” you might ask.
For many years, we and others in the industry have been tackling the daunting task of blade data analysis. In many cases, this blade data has meant visual images taken from a drone, a ground-based camera, or by a technician working up close to the blade. Other technologies, such as ultrasonic phased array scanning and thermal imaging, exist as well, but for this post I will concentrate on visual images, as they represent most of the data currently being utilized.
It all started with images taken by hand with a digital camera, followed by someone manually sorting them on a laptop and producing a written report, an analysis, and a maintenance plan. This approach led to some hilarious incidents. I was once handed a USB memory stick by a customer who asked me “to take a look and give us a quote.” Back at the office, I found that the stick contained 8,000 unorganized images with no inspection notes at all. Nowadays an advanced AI model might be able to make something of those images; back then, their value was zero.
A lot has happened since then, and today there are exceptionally good tools for imaging blades in an organized manner. Drone inspections have become the preferred method of image capture, and many companies can produce high-quality data sets with a high level of operational autonomy built into the process. The cost of drone inspections has decreased to the point where I see no reason not to utilize them at scale and often.
“The more data, the better, right?” Well, yes and no.
Visual images often tell only half of the story. We regularly see situations where the initial analysis and repair plans, based on visual images, change by 30-50% once the actual repair work begins on-site. There have been cases where the delta between the original analysis and the actual repair needs exceeded 100%. In almost any other industry, this would be utterly unacceptable.
In June, Siemens Energy announced that its wind turbine issues could cost more than 1 billion euros to fix and could affect as many as 15-30% of its fleet. The announcement mentioned only that “while rotor blades and bearings were partly to blame for the turbine problems, it could not be ruled out that design issues also played a role.” Although I could not find any more details about the issues, I would not be surprised if blades constituted most of the problem. This is not to say that Siemens’ blades are somehow bad; they most certainly are not. It just underlines how complicated wind turbine blades are as a component. They pose special challenges in identifying issues and managing their life cycle, all the way from the design table to field maintenance and end-of-life.
The problem is that many blade issues are underlying and can only be detected once repair work begins and material is removed, or once the issue becomes severe enough to be visually detectable. This creates a special challenge for applying AI models to visual images: the models can only identify the issues they have been trained to look for, and if the issues are not visible, they cannot be identified.
The issue is made worse by the fact that there is currently no industry-wide data available, so training any AI model to be truly game-changing will require vast datasets. This problem has been recognized elsewhere: the recent announcement by Microsoft and Meta to open-source the Llama 2 model was not made out of goodwill, but because they recognize that the most powerful AI model will be the one with the widest reach and the largest datasets available to it.
AI is already employed by many companies working with wind turbine blades, and I have no doubt that in the future it will significantly improve workflows and further reduce costs. But I have yet to see the “golden nugget” that solves the blade data analysis problem, in a way where the difference between the AI analysis of visual images and the real-world situation would consistently be under 5%.
If you have such a solution, please contact me.
PS. This post was written the old-fashioned way, without AI help. I am sure that will change soon too.