
Aeronautics: big data helps accelerate test flights

22 December 2015


To test fly new aircraft, more and more data must be measured, collected and processed, in ever shorter time frames. The methods themselves are also evolving: we are moving from flight-by-flight, system-by-system analysis to multi-flight, multi-system analysis, in order to cross-check information.
So how do you read these terabytes of data? The solution is big data, which should allow testing to be accelerated and could eventually reduce the number of test flights, and thus their cost. The real challenge now is to bring this new method into industrial use.

When it comes to test flight data and certifying aircraft, the aeronautics sector is becoming more and more demanding. The method consists of placing sensors on the aircraft (there are around 6,000 on an A350, measuring indicators such as pressure, acceleration, temperature and position) and comparing the measurements with the expected values provided by the engineering office or the suppliers. Any discrepancy indicates an anomaly, which must be corrected in order to ensure the aircraft's safety. The task is becoming more arduous, as the volume of data from test flights increases with each new generation of aeroplane, because of the growing sophistication of their systems.
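In practice, this comparison boils down to checking each measured value against its expected value and a tolerance. Here is a minimal sketch of that check; the sensor names, expected values and tolerances are invented for illustration, not taken from any manufacturer's data.

```python
# Minimal sketch (hypothetical data and tolerances): flag sensor readings that
# deviate from the expected values supplied by the engineering office.
expected = {"cabin_pressure_hpa": 780.0, "wing_temp_c": -42.0, "accel_g": 1.02}
tolerance = {"cabin_pressure_hpa": 5.0, "wing_temp_c": 3.0, "accel_g": 0.05}

measured = {"cabin_pressure_hpa": 792.5, "wing_temp_c": -41.2, "accel_g": 1.01}

# Keep only the readings whose gap to the expected value exceeds the tolerance.
anomalies = {
    name: (value, expected[name])
    for name, value in measured.items()
    if abs(value - expected[name]) > tolerance[name]
}

for name, (value, target) in anomalies.items():
    print(f"Anomaly on {name}: measured {value}, expected {target}")
```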

Increasing test productivity

The problem is being able to analyse the collected data and then cross-check it with subsequent test flights. The volume is colossal (2 terabytes of data per flight) and is spread across hundreds of sensors, each covering a particular part of the plane. Operators in the engineering offices must be able to exploit this data, which comes from different flights. So how can the productivity of these tests be increased while keeping them reliable? Put more directly, how can the 'return on investment' (ROI) of test flights be improved?

Barring delays, test flights follow one another, often daily. To release an aircraft for tomorrow's flight, it is preferable to analyse the day's results, check the coherence between sensors and on-board equipment, correct anomalies and better anticipate failures (in other words, make predictions).

The solution comes from the internet, via fields such as marketing and behavioural analysis. It is big data, which, let's remind ourselves, consists of processing huge masses of unstructured data. This is done with the help of new technologies such as business intelligence, where classic tools such as information management systems and databases show their limitations.

Until now, there has been no database for test flights: access was sequential and data was stored for only three months, while a test flight campaign lasts between seven and eighteen months. Furthermore, data analysis was conducted flight by flight and system by system. Big data is changing this rapidly.

Big data to reduce number of test flights

Big data is already bringing about drastic change elsewhere: it is used to analyse climate information, social network posts, videos posted on the internet and online transactions, amongst other things.

Its use in industry, and in aeronautics in particular, is more pioneering, precisely because of the constraints described above: no database, sequential access and only three months of storage. In the short term, rather than analysing data flight by flight and system by system, analysis will have to become multi-flight and multi-system, to improve incident detection, find the cause of problems more quickly and reduce the number of test flights.
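To give an idea of what multi-flight, multi-system analysis means in practice, here is an illustrative sketch: it stacks the records of several flights and keeps only the deviations that recur across flights. The schema, flight identifiers and thresholds are hypothetical, chosen only for the example.

```python
# Illustrative sketch (hypothetical schema): instead of looking at one flight in
# isolation, stack the records of several flights and look for deviations that
# recur across flights and across systems.
import pandas as pd

records = pd.DataFrame(
    [
        # flight_id, system, sensor, deviation (measured minus expected)
        ("F101", "hydraulics", "press_left", 0.2),
        ("F101", "avionics", "bus_voltage", 0.0),
        ("F102", "hydraulics", "press_left", 0.3),
        ("F103", "hydraulics", "press_left", 0.25),
    ],
    columns=["flight_id", "system", "sensor", "deviation"],
)

# A deviation seen on a single flight may be noise; one seen on several flights
# points at a systematic problem worth investigating.
recurring = (
    records[records["deviation"].abs() > 0.1]
    .groupby(["system", "sensor"])["flight_id"]
    .nunique()
    .loc[lambda s: s >= 2]
)
print(recurring)
```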

The objective, however, is not to turn everything on its head. Analysis and the correction of anomalies must still be up to human operators. Therefore, the important thing is to make their work easier, by improving concurrent access to data, and not changing the analysis and monitoring tools that have already been developed internally by aircraft manufacturers. Neither should the working methods of users or test operators be altered, thus allowing them to focus on their work as engineers.

Dual competency needed

A number of precautions must be taken. First of all, to improve concurrent access to data, it is absolutely necessary to check the compatibility of big data technology with the aeronautical standards of aircraft manufacturers.

To this end, feasibility must be demonstrated. Choosing the right solution matters: the aim is to build a prototype suitable for industrial use, which can then be put through a data-ingestion test. One measures, for example, how the prototype reacts to injected data, first for a few users and then for up to 300 simultaneous users. Its performance is observed and compared with that of the classic platform currently in operation.
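The paragraph above describes a concurrency ramp-up test. A rough sketch of that kind of test follows; the query function is a placeholder and only the figure of 300 simultaneous users comes from the article, so nothing here reflects the manufacturer's actual benchmark.

```python
# Rough sketch of a concurrency ramp-up test: ramp the number of simulated
# users and record the worst observed latency at each level.
import time
from concurrent.futures import ThreadPoolExecutor

def run_query(user_id: int) -> float:
    """Stand-in for one operator reading flight data; returns elapsed seconds."""
    start = time.perf_counter()
    time.sleep(0.01)  # replace with a real read against the prototype platform
    return time.perf_counter() - start

for n_users in (5, 50, 300):  # from a few users up to 300 simultaneous users
    with ThreadPoolExecutor(max_workers=n_users) as pool:
        latencies = list(pool.map(run_query, range(n_users)))
    print(f"{n_users:>3} users: worst latency {max(latencies):.3f}s")
```

The same figures would then be compared with those of the classic platform to decide whether the prototype holds up.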

The system must then be configured as well as possible, which means adapting it to different use cases. To guarantee optimum use and satisfy the needs of an application, at least 100 parameters can be adjusted, and altering just one of them can have an impact on all the others. Arriving at the correct settings therefore requires combining big data expertise with an understanding of how the system is being applied.

Ultimately, technical competence must be combined with knowledge of one’s profession.

A tangible return

The future of this system, which is already in operation at one big aircraft manufacturer, lies in the coherent view it can offer testers. It allows them to correlate data from numerous tests, and even to correlate this with non-test data such as the weather, equipment information or, more generally, metadata. Eventually it will even be possible to improve the predictive maintenance carried out by aircraft companies.
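Correlating test results with non-test data is, at its simplest, a join on a common key. The sketch below illustrates the idea with a hypothetical weather table keyed by flight; the columns and values are invented for the example.

```python
# Hypothetical sketch: join flight test results with weather metadata so a
# deviation can be read in the light of the conditions it was measured in.
import pandas as pd

tests = pd.DataFrame(
    {"flight_id": ["F101", "F102"],
     "sensor": ["wing_temp_c", "wing_temp_c"],
     "deviation": [2.8, 0.4]}
)
weather = pd.DataFrame(
    {"flight_id": ["F101", "F102"],
     "outside_temp_c": [-55, -38],
     "turbulence": ["high", "low"]}
)

# One row per test result, enriched with the conditions of its flight.
print(tests.merge(weather, on="flight_id"))
```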

A number of flights have already yielded positive results. While the improvement is limited for an individual tester, it is significant when many testers work simultaneously. We have thus observed that, for a “complex” breakdown, the time taken for diagnosis could fall from 50 hours with the classic approach to 2 hours using big data. One thousand parameters are analysed each second. In the next phase of the project, the aim is, among other things, to put in place prediction algorithms that anticipate test equipment failures, so that a flight does not have to be repeated because of a breakdown. Let's not forget that one hour of flight costs $10,000. The ambition is also to avoid the operational checks of test equipment before a flight, which can take several hours.
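The article does not say which prediction algorithms are envisaged, so the sketch below is only a stand-in: a simple drift check on a test-equipment parameter that flags readings moving away from a baseline before the equipment fails outright. The readings and threshold are invented.

```python
# Purely illustrative stand-in for failure anticipation: flag readings that
# drift more than three standard deviations from an initial baseline.
import statistics

readings = [1.00, 1.01, 0.99, 1.02, 1.00, 1.08, 1.15, 1.23]  # invented drift
window = 5
baseline = readings[:window]
mean, stdev = statistics.mean(baseline), statistics.stdev(baseline)

for i, value in enumerate(readings[window:], start=window):
    if abs(value - mean) > 3 * stdev:
        print(f"Reading {i} ({value}) drifts from baseline: check equipment before flight")
```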

In aeronautics, big data has shown itself to be a useful and effective tool for reducing test times, increasing their ROI and even, ultimately, reducing the number of test flights and the time it takes to give an aircraft clearance.
