Arvind Thapliyal
24 Sep 2024
Déjà Vu: Avoiding Repetitive Accidents in The Indian General Aviation Sector
Almost two decades ago, I attended an Accident Investigation Course at the French Institute of Flight Safety (IFSA). During the introductory session, a senior trainer made a point that has stayed with me since. Using several examples, he explained how the causes of air accidents have remained the same since 1903; just the people, the places, and the aircraft have kept changing. A year or so later, I underwent the ICAO “Train the Trainer” Safety Management System (SMS) Course in New Delhi. During a discussion, the lead instructor, a highly experienced ICAO aviation safety professional, said precisely the same thing, and mentioned how ICAO was deeply concerned about addressing repetitive accidents.
Years of observation and experience in aviation safety have strengthened my belief that both these gentlemen were right: accidents repeat themselves all too often. Here are a few examples, culled from the broad spectrum of civil aviation, both in India and abroad.
1. Fuel Starvation/ Exhaustion
On 09 Jan 1968, a Beechcraft Bonanza crashed shortly after takeoff from Amausi airfield, near Lucknow. The cause was found to be fuel starvation. Almost 50 years later, on 01 Dec 2017, a Piper Seneca crashed near Dhule in Maharashtra during night flying. The cause was found to be fuel exhaustion, as the aircraft had carried less fuel than required for the mission profile.
2. Runway Overrun
On 02 Aug 2005, Air France Flight 358, an Airbus A340, overshot the runway at Toronto-Pearson International Airport after landing. The cause was a combination of bad weather and human error. Almost exactly 15 years later, on 07 Aug 2020, Air India Express Flight AXB 1344, a Boeing 737-8HG, met the same fate while landing at Calicut airfield. Similarities in the factors that contributed to both accidents (thunderstorms nearby, wet runway, long landing, human factors) are starkly evident.
3. Controlled Flight Into Terrain
On 18 Mar 1984, a Pushpak of a Flying Training Organisation (FTO) crashed into the sea off the coast of Chennai (then Madras) while carrying out unauthorised low flying. Over the years, this scenario has repeated itself in FTOs, including one confirmed case 33 years later, of a Diamond DA-42 on 26 Apr 2017 near Gondia, and at least two highly probable cases within the last five years, whose final investigation reports are still awaited.
Why do accidents repeat themselves?
After more than 120 years of heavier-than-air flight, you’d think that the aviation community as a whole might have discerned repetitive accident patterns and instituted remedial action. That, sadly, has not happened, and one of the reasons for it is that some of us simply refuse to learn from the past.
Accidents occur across the spectrum of aviation organisations; some are repetitive. However, the problem is not spread evenly across all civil aviation sectors. As per IATA’s latest annual safety report, the “All Accident” rate per million airline sectors dropped from 1.30 in 2022 to 0.80 in 2023. Despite the expansion of air traffic over the years, airline accident rates have not increased over the past decades. This is because the airline industry, by and large, has a structured and robust safety management system in place, which is constantly improving safety practices and tools, a trend seen in India as well. The bottom line is that lessons learnt from accidents are quickly translated into remedial actions.
However, the story is different in our General Aviation (GA) sector. In the Indian context, this sector comprises all FTOs and certain niche service providers, such as skydiving aircraft operators and survey, firefighting, and rescue aircraft operators. Until a few years ago, it was a small sector, largely outside the ambit of institutionalised, professional safety management. Over the last five or so years, this sector has burgeoned, and accidents within it have accordingly increased.
The intent here is to address the problem of repetitive accidents on General Aviation aircraft in India. Because of sheer numbers, much of what will be stated applies to FTOs, and may not always resonate in other GA organisations. We will examine the following aspects:-
1. How a typical GA accident investigation process unfolds
2. Why companies don’t learn from GA accidents
3. Suggested measures to improve the situation
A Typical Accident Investigation Process
A regulatory body investigates all accidents in India. Every accident investigator in this body will agree that five questions must be answered after an accident. These are: What? Where? When? How? and Why? Due to the omnipresence of social and digital media, the first three are swiftly answered, within minutes in some cases.
The How and the Why take the most time.
Answering the How is hard, grinding work. It involves evidence gathering, sometimes laying out the wreckage, reconstructing the path of the aircraft, interviewing witnesses and correlating their statements with other evidence, analysing radar tracks, radio calls, Flight Data Recorders (FDRs) and Cockpit Voice Recorders (CVRs), and much more. Here, the focus is on establishing the timeline of events, from when the aircraft was boarded with the intent of flight, till the accident occurred. Usually, the How is also answered satisfactorily, barring cases of aircraft lost at sea or in inhospitable regions, where access to wreckage or evidence is impossible.
The Why is more complex, especially in fatal accidents. In the case of a GA accident, or any accident for that matter, regulatory investigators face several problems, some of which are:-
- As in other fatal accidents, without the crew’s statements, the investigative team relies only on material evidence and witness statements.
- In an accident where the prima facie cause is Human Factors (earlier generically and incorrectly named “Pilot Error”), a key aspect is the state of mind of the deceased crew, which can, at best, only be approximated, based on radio calls, CVR inputs, interviews with witnesses, colleagues, and other staff, sometimes even near relatives.
- Some witnesses, especially those in the affected company, often distort the narrative. Examples are concealing related events that happened earlier, or giving sweeping, unverifiable statements such as, “I heard he (the deceased pilot) used to take a lot of chances, so this was bound to happen.” The reasons for these distortions are complex, but a common thread is a subdued yet perceptible atmosphere of panic that prevails in the company as long as the investigators are at the site, caused, in turn, by fears of getting roped into the investigation and becoming collateral damage.
- With the mass induction of glass cockpit aircraft, modern GA aviation activities are being conducted in increasingly intricate operating and technical environments. Multi-causation (more than one cause for an accident, such as Technical Failure + Human Factors), is more challenging to detect in a small organisation where inputs are limited.
- Finally, since the mid-1990s, it has been acknowledged that organisational safety culture and behaviour are often causal factors in accidents. Accident investigators, therefore, need to probe into these areas, focusing on some critical facets such as organisational structure, safety infrastructure and training, leadership, management, and organisational audit processes. This is not an easy task.
With so many intangibles in this matrix, regulatory investigation teams have an uphill task, often working on a tight timeline and lacking local knowledge. Consequently, despite being voluminous, detailed documents, complete with photographs and videos, some investigation reports answer everything except the most important question – Why? Why did it happen? Why, for example, did a serviceable aircraft, being flown in good weather, suddenly end up as a flaming wreck? Why did the crew do what they did?
This flawed process all but ensures that the root cause of the accident is not determined in a disturbingly high proportion of GA accidents. The natural consequence is a recurrence of the same or a similar accident. It might take months, years, or decades, and it may occur in another organisation, but it will reoccur. Here are a few reasons why this trend prevails.
Why Companies Don’t Learn From Accidents
The “Business as Usual” Approach
In the highly competitive GA world, one stance taken after a fatal accident is, “It’s unfortunate, and we pray for the departed souls, but we do need to move on.” This spiel, spun out by or on behalf of senior managers, is more common than one might think. Why is everyone so keen to move on? The reason is that accidents often reveal embarrassing flaws and loopholes in organisational processes, some of which may not even emerge in the regulatory investigation. For various reasons, including shielding the higher management, fear of damage to the reputations of individuals and the company, and forestalling uncomfortable questions from insurers, organisations tend to bury the past and look forward to the future.
As a result, practically no internal discussion takes place on the causes and possible remedial measures. It is pertinent to mention here that regulatory investigation reports can take up to two years after the accident to be published. What does an organisation do in the interim? Well, nothing, in most cases. This is not a theory; some GA organisations, particularly those whose top management is bereft of experienced aviators, have been known to avoid any post-accident introspection or review, and actively dismiss or discourage attempts to “look within.” This ostrich-like policy is a recipe for disaster. When no time is spent deliberating on the shortfalls or organisational weaknesses that were revealed by the accident, a repetition is all but guaranteed.
Unsafe Organisational Culture
Organisational culture is a multifaceted entity. In the context of safety, this culture can be defined as the atmosphere created by senior management which shapes workers’ attitudes towards safety practices. Aviation safety professionals agree that data is the lifeline of a safety programme. Thus, a strong indicator of a positive safety culture is the manner in which safety data is created and processed. This data encompasses safety occurrence reports, defects noted in aircraft during operations, audits, surveys, and many other inputs. Of these, safety occurrence reporting is a crucial element. The number, quality, and frequency of safety occurrence reports raised within an organisation are often a good yardstick to measure the health of its safety culture.
There are several benchmarks for measuring safety culture, of which three are considered here:-
1. Generative — the company values safety reporting
2. Bureaucratic — the company restrains safety reporting
3. Pathological — the company prohibits safety reporting
A generative safety culture is ideal and is marked by sharing responsibilities, encouraging filing of safety reports, welcoming new ideas, scrutinising failures, and other proactive measures. Essentially, the organisation treats a safety occurrence, especially an accident, as the failure of a process and tries to address the lacunae that caused the failure through focused and time-bound measures.
Bureaucratic cultures are marked by a high tolerance for minor incidents and attempts to put them into convenient boxes, the use of red tape and financial constraints to block corrective actions (or implement them only on paper), and lenient treatment of individual professional failures. The process is similar to how some bureaucracies work – they appear to be doing a lot but, in reality, do nothing meaningful. A few clear indicators of a bureaucratic culture are – excessive conferences, meetings and discussions (all with no Minutes, decisions, or follow-up), letters written to so-and-so and then forgotten, and acceptance of delayed timelines.
The worst safety culture is a pathological one, whose indicators are – a lack of safety occurrence reports, the reluctance of employees to speak up, cover-ups, shouting down or attacking those who raise safety issues, and an overall disregard for safety. This type of culture is often driven top-down; the boss makes it generally known that he doesn’t want to hear any bad news, and some employees fall in line by hiding safety occurrences. When other employees encounter hostility or belittlement from the top management after raising safety issues, they stop highlighting them, even if they know the likely consequences. This culture often leads to a catastrophic accident, since minor safety occurrences, also called incidents, are early indicators of a systemic problem. It also spawns two more problems: a lack of safety data storage, and loss of institutional memory.
Lack of Safety Data
When collated and analysed by professionals, safety inputs or data can present an unbiased view of trends and problem areas. If non-reporting is the norm and safety reports are scarce, it becomes challenging for safety operatives to discern trends, educate people about them, or employ mitigation measures. The importance of data is illustrated by the Heinrich Principle, first proposed by Herbert William Heinrich in 1931. It states, “In a workplace, for every accident that causes a major injury, there are 29 accidents that cause minor injuries and 300 accidents that cause no injuries.” This relationship is often shown pictorially as a triangle or pyramid. Although the principle emerged from Heinrich’s studies of accidents in factories, and opinions are divided on whether or how it can be applied in an aviation setting, it is a valuable indicator of how things can go wrong if minor incidents are not analysed to discern trends. A rise in incidents often precedes accidents, and a lack of reporting and analysis inhibits the employment of timely mitigation strategies.
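The ratio is easiest to appreciate with numbers. The short sketch below (in Python, with invented figures and a hypothetical helper function, not drawn from any real safety programme) shows how an organisation could set its own reported-occurrence counts against Heinrich’s 300:29:1 reference; a no-injury count far below the reference line usually points to under-reporting rather than to a safe operation.

```python
# Hypothetical sketch: comparing an organisation's occurrence data against
# Heinrich's 300:29:1 reference pyramid. All figures below are invented.

HEINRICH_REFERENCE = {"no_injury": 300, "minor_injury": 29, "major_injury": 1}

def normalised_pyramid(counts: dict) -> dict:
    """Scale raw occurrence counts so that major-injury accidents equal 1,
    making them directly comparable with the 300:29:1 reference."""
    major = counts.get("major_injury", 0)
    if major == 0:
        # No major accident yet: a very low no-injury count may still
        # indicate under-reporting rather than a genuinely safe operation.
        return {}
    return {k: round(v / major, 1) for k, v in counts.items()}

# Illustrative annual data for a small FTO (invented numbers)
reported = {"no_injury": 45, "minor_injury": 12, "major_injury": 1}

pyramid = normalised_pyramid(reported)
for severity, reference in HEINRICH_REFERENCE.items():
    print(f"{severity}: observed {pyramid.get(severity, 0)} vs reference {reference}")
# A no-injury figure far below the reference 300 is more likely to signal
# non-reporting than the absence of hazards.
```

Treated this way, even a crude count becomes a prompt for questions: if minor occurrences are barely being reported, the pyramid is invisible, and so are the trends it would otherwise reveal.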
Lack of Institutional Memory and Continuity
This malaise infects all GA organisations, and is probably the most critical cause of repetitive accidents. Organisations just don’t keep good records of past safety occurrences. Thanks to computerisation and digitisation of records, this problem is gradually being resolved – one hopes! However, a further issue is the lack of continuity of key safety staff. During a visit to Russian aviation safety organisations some years ago, I learnt that the average staff tenure varied between 10 and 15 years. Compare this with many GA organisations, where the safety heads and staff are short-term, contractual employees, often just biding time to join an airline, and leaving as soon as an opportunity presents itself. These factors ensure poor institutional memory and, over time, lead to repetitive accidents.
The “What, Me Worry?” Syndrome
Readers who grew up in the ‘60s and ‘70s will remember Mad Magazine’s flag-bearer, the gap-toothed Alfred E. Neuman, mouthing his famous catchphrase, “What, me worry?” It epitomised a carefree, uninformed person with no interest in exploring matters that don’t affect him personally. In the context of GA operators, it underpins an attitude that “accidents happen only to other people.” No attempt is made to share safety occurrences with other companies flying the same aircraft, or performing the same role. This approach pays no dividends.
Complacency and the Red Queen Effect
When an organisation hasn’t had an accident for a long time, an illusion is often created that no accident will ever occur. The result is a slacking of effort, reduced focus on safety oversight, training, and processes, and an overwhelming complacency at all levels. This is very dangerous. Complacency breeds carelessness, and carelessness breeds accidents. The problem and its solution are illustrated by the Red Queen Effect, a concept drawn from evolutionary biology. The Red Queen is a character in Lewis Carroll’s “Through the Looking-Glass,” who advises, “It takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that.” In other words, if an organisation wants to remain ahead of threats that could cause accidents, it has to ensure that its accident prevention tools remain sharp, since the threats themselves are constantly evolving and morphing. Failing that, the organisation will fall into the trap of complacency and become accident-prone.
The Safety Department’s Conundrum
Safety is not achieved by accident, and safety operatives have a daunting task. They often have to operate in an environment that is peppered with overt and covert problems and walk a fine line between doing too much and too little. A tilt towards the former earns them managerial frowns, hostility from peers, and allegations of being perpetual questioners of operational plans, never mind how hare-brained the plans may be. On the other hand, the ‘do less’ approach is sure to invite censure during regulatory audits, or even dismissal. Motivation, too, is a problem; there are no rewards or even acknowledgements for an extended accident-free period, but severe adverse consequences ensue when even an incident occurs. Safety operatives often function in this highly demotivating environment. Frequently, they take the path of least resistance, which is what the organisation tacitly wants. The fallout is operational overreach at the cost of safety oversight. Such situations are like ticking time bombs, waiting to go off anytime.
Financial Constraints
Money makes the world go round. Without funding, there can be no safety organisation, no safety programme, and no safety oversight. In the GA world, where profit margins are usually slim, a company accountant always looks for ways to cut costs. On the other hand, like the captain of a cricket team, a safety head can deliver only if the organisation provides him with the requisite manpower and tools - both of which require funds. This leads to a clash that takes place across the entire range of GA organisations.
The result of this tussle is quite often an implicit policy of downsizing the safety assets whenever cost-cutting is required. During the COVID pandemic, one company head, looking to cut the salary bill, decided that a particular key safety functionary was too highly paid. This was just a whimsical decision, with no logic behind it. The head of safety dug his heels in, and refused to play along, which led to tensions and harassment. Six months later, the same functionary played a crucial role in averting a major accident in the organisation, leaving the company head red-faced. However, that was a fortunate turn of events; in most cases, whenever an organisation starts loading the “safety vs productivity” scale in favour of the latter, it loses aircraft and lives.
Top managers and finance advisors fail to realise that short-term gains accrued by sacrificing safety assets to cut costs can, and often do, lead to long-term losses. Every accident has direct and indirect costs; the latter can be as much as four to six times the former. Indirect costs accrue for various reasons, including changes in procedures, retraining of pilots or engineers, reputational damage, lowering of employee morale, purchase and maintenance of additional flight simulators, and legal costs, to name just a few. Regulators also step in and add their own opinions on what the organisation should implement or buy, which further increases costs.
Regulatory Shortfalls
Composition of Regulatory Bodies: Indian regulatory investigation bodies comprise mainly people with an engineering background, unlike those in other advanced countries, which include a sizeable component of aviators. The advantage of such a mixed team is a more holistic investigation process, and more precise answers to the “Why” of the accident.
Delays in Publishing Accident Reports: This problem affects aviation worldwide: accident reports, particularly GA accident reports, take far too long to be published. IATA’s analysis of accident investigations from 2018 to 2022 showed that only 54% of the accidents had an investigation report published. This leads to delayed implementation of recommendations and rule amendments, wherever necessary. In the interim, another accident with the same root cause can occur.
Suggested Measures to Improve the Situation
What is necessary to prevent repetitive GA accidents? Not much. Here are a few suggestions for implementation at local and national levels.
Company-level Measures
Responsibility of the Top Management
The primary aim of a safety programme is accident prevention. Company heads must allot time, manpower, and resources to achieve this aim. Before that, it is necessary to acknowledge the importance of safety management and shift the organisational policy from the adversarial “safety vs productivity” stance to a more harmonious “safe production” one. Some steps that need to be taken by the leadership are given below.
What, We Worry: This needs to replace the “What, me worry?” attitude. Leaders must buy into safety and commit to it wholeheartedly. Company heads often restrict their safety responsibilities to mouthing clichés such as “safety is paramount” and “there will be no compromise on flight safety” at regular intervals. These words have to be followed up by action. They need to walk the talk by encouraging all employees to report safety occurrences without fear and ensuring that risk assessments are conducted before any major changes in operating philosophy, manpower, or aircraft fleet are implemented. Audit findings need to be actioned within stipulated timeframes. Spot checks, walkaround inspections, and surprise checks must be carried out diligently, and their findings need to be taken by department heads as professional advice, not acts of war.
No, it Isn’t Business as Usual: After an accident or a serious incident, there is no need to wait for the publication of the regulatory investigation report to take remedial action. Every case needs to be internally analysed by the safety dept, and a comprehensive briefing must be held for all post-holders. The approach should be curative, avoiding finger-pointing and blame. Each department head needs to understand what he or she could have done to avoid the occurrence, and convey the message downwards. Internal measures to prevent recurrence should kick in at the earliest, not two years later.
Handholding the Safety Dept: As already mentioned, safety operatives walk a fine line. When the head of safety shows up on his morning rounds of the ramp or hangar, it can be a very challenging experience. Employee behaviour resembles that of a herd of deer that has sensed a tiger in the vicinity. People suddenly vanish or become very busy, and walk around with a wary look in the eye, giving monosyllabic answers to casual questions. This inhibits hazard detection and hobbles safety management. The way out is for the boss to clarify at all levels that he backs the safety department to the hilt and that its staff are just doing their jobs. When reporting drops, department heads need to be held accountable. The consistent message from the boss should be that an accident averted benefits everyone, and that safety personnel, like all others, are professionals trained to pick up signs of impending trouble and forestall it. Working harmoniously with them is not an option, but an imperative.
Just Culture and Follow-Up: An often misunderstood term, it defines a culture that promotes open reporting and encourages sharing mistakes so they can be fixed without adverse consequences to the reporter. It envisages an environment where “psychological safety” exists in the team. As described by one author, team psychological safety is a shared belief held by team members that it is OK to express their ideas and concerns, to speak up with questions, and to admit mistakes — all without fear of negative consequences. Essentially, Just Culture is designed as a protective umbrella for self-reporters, and intended to create a positive reporting culture. It is not, however, a get-out-of-jail-free card; the circumstances of the incident, as well as the experience and training levels of the concerned people, are evaluated, and specific acts (wilful negligence, blatant disregard of rules, illegal acts, and substance abuse) are outside its ambit.
Follow-up through feedback loops goes hand in hand with Just Culture and psychological safety. The leadership must promptly and seriously address safety concerns and issues raised by employees. If nothing happens after problems are raised, then employees and customers will stop raising them, seeing it as an exercise in futility. A leadership that not only listens but also acts is essential for an effective safety culture. Feedback to individual occurrence reporters is also necessary, because they get the satisfaction of knowing that their input is valued, which builds confidence across the entire group.
Resource Allocation: A company once set up an FDR analysis programme, aiming to analyse all flights. Unfortunately, financial constraints ensured that inadequate resources were allocated for the task. Loopholes in the analysis thus existed, and were known to the operators. The outcome of this ill-advised attempt to cut costs was a fatal accident. Higher management needs to understand that one cannot put too high a premium on safety, and that adequate resources must be allotted to fulfil the department’s mandate.
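For readers unfamiliar with what such a flight data analysis programme actually does, the sketch below illustrates the basic idea in Python: each flight’s recorded peak values are screened against parameter limits, and exceedances are flagged for review. The parameter names, limits, and registration are invented for illustration and do not describe any particular aircraft, software, or operator.

```python
# Hypothetical sketch of flight data (FDR/FOQA-style) exceedance screening.
# Parameter names and limits are illustrative, not taken from any real programme.

EXCEEDANCE_LIMITS = {
    "touchdown_vertical_speed_fpm": 600,   # hard-landing screening threshold
    "approach_speed_deviation_kt": 15,     # speed above reference on final
    "bank_angle_below_500ft_deg": 30,      # excessive bank close to the ground
}

def screen_flight(flight_id: str, recorded_peaks: dict) -> list:
    """Return exceedance findings for one flight.

    recorded_peaks maps a parameter name to the worst (peak) value
    extracted from that flight's FDR download.
    """
    findings = []
    for parameter, limit in EXCEEDANCE_LIMITS.items():
        value = recorded_peaks.get(parameter)
        if value is not None and value > limit:
            findings.append(f"{flight_id}: {parameter} = {value} (limit {limit})")
    return findings

# Illustrative use: this flight exceeds only the hard-landing threshold.
peaks = {"touchdown_vertical_speed_fpm": 720, "approach_speed_deviation_kt": 9}
for finding in screen_flight("VT-ABC sortie 14", peaks):
    print(finding)
```

Even a screening this simple protects nobody if only a fraction of flights are downloaded and reviewed; the half-resourced programme described above left exactly those gaps.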
National-level Measures
Sharing of Best Practices
Until just a few years ago, the national aviation regulator's efforts seemed to focus mainly on the safety programmes and performance of scheduled airlines. During the tenure of the previous Civil Aviation Minister, the regulator started taking a hard look at GA accidents and the entire GA safety infrastructure. As a result, some good things happened, such as approving safety heads and SMS Manuals and increasing interaction with GA operators. More needs to be done here. Some GA operators have highly qualified and experienced personnel in safety appointments, while others have freshers who need training, but don’t know where to obtain it. A forum for safety heads of GA operators needs to be established under the aegis of the regulator. Seminars and conferences can be organised regularly, where safety heads of all GA organisations share their experiences and best practices transparently. In this context, it is heartening to hear the present Civil Aviation Minister encouraging the regulator to maintain focus on improving FTO safety trends, a move which will undoubtedly bear fruit.
Appointing Qualified Safety Operatives
Safety management is a bona fide profession that requires people with the requisite training and expertise to undertake it. Getting trained manpower is not easy, because approved SMS training organisations, for some reason, have not been set up in the country. The shortfall is evident; in the recent past, almost 90% of FTOs had to make multiple attempts to get their SMS Manuals and their Safety Managers approved by the regulator. Appropriate infrastructure needs to be set up to ensure that every GA safety head is a trained and certified accident investigator who should have undergone an SMS course with particular emphasis on hazard identification and risk assessment.
Commissioning of Study
Concurrently, an independent study covering GA accidents over the last 50 years must be commissioned. The period of 50 years has been chosen to broadly cover all the current and legacy aircraft being used in GA organisations; it can be increased or decreased. The purpose should be to discern accident trends in this period, and focus on repetitive accidents. Clear recommendations to avoid them need to emerge at the end of the process.
Regulatory Staff Enhancement
GA is mushrooming in India. New FTOs and charter companies are being opened almost every month, leading to a corresponding rise in incidents. The same regulatory staff is entrusted with safety oversight of both scheduled operators and non-scheduled/GA organisations. Is it qualitatively and quantitatively sufficient to address the growing number of civil aviation aircraft in Indian skies, which stands at roughly 1000 to date, with about 1500 more on order? Do we have enough investigators, safety auditors, risk assessment specialists, and other safety specialists in our regulatory oversight bodies? One indicator suggests otherwise: the excessive time taken for accident reports to be published. If there is a shortfall, retired, trained investigators and safety specialists from the military or other government organisations can be hired on a contractual basis.
Conclusion
Aviation is an expensive proposition. An accident's human and material costs are tremendous, and two of them in one GA organisation can spell its doom. It is not possible to list every reason why an accident might occur, because the GA field has so many variables in terms of aircraft types, operating environments, and managerial structures, among others. However, it is a fact that regulators and operators have followed a generally reactive approach to GA accidents for far too long. The outcome of this faulty strategy has been a marked failure to answer the question “Why” after numerous mishaps. It is time to go proactive; the first step would be to try and avoid repetitive accidents. For that to happen, a national effort spearheaded by the Ministry of Civil Aviation must commence as soon as possible. The internal issues that GA organisations need to look into are leadership commitment, hiring expert safety staff and backing them, revamping organisational safety culture, open reporting, data gathering and analysis, and strongly endorsing safety initiatives. On a national level, the regulator needs to push for the sharing of best practices amongst GA operators, imparting of training to freshers by the qualified and experienced safety heads among them, setting up of SMS and accident investigation training organisations for GA safety heads, commissioning a holistic study on GA accidents, and induction of sufficient, experienced staff in the concerned directorates and investigative bodies.
A final word of caution
The origins of the ICAO SMS are often thought to lie in the Air Ontario accident at Dryden, Canada, in 1989, which resulted from competitive pressures caused by commercial deregulation in the country and the burgeoning of regional airlines. These pressures cut into safety standards and resulted in several sloppy practices and questionable procedures. As FTOs and other GA organisations multiply in India, the number of fatal accidents and serious incidents is on the rise. We need to act now, not later. The remedies suggested in this write-up are simple; it is only a question of intent for the stakeholders. Rather than waiting for our own Dryden to hit us, it would be wiser to forestall it with simple, proactive measures that only require commitment. Only then can we produce aviation professionals who will live by the philosophy of “Safety First.”
About the Author
Group Captain Arvind Thapliyal (Retd) is a veteran Indian Air Force fighter pilot, instructor, examiner, and senior aviation safety professional. After serving for over three decades in the IAF, he worked as Safety Manager and Chief Ground Instructor at a reputed civil flying training organisation for seven years. A post-graduate in Defence and Strategic Studies and an ATPL holder, he is also an experienced aviation safety trainer, having undergone the Train-the-Trainer course under ICAO and the Accident Investigation Course at IFSA, Paris.
References
https://www.ehsinsight.com/blog/understanding-the-safety-pyramid
https://en.wikipedia.org/wiki/Red_Queen_hypothesis
https://www.iata.org/en/publications/safety-report/executive-summary/
https://hbr.org/2023/02/what-is-psychological-safety
https://en.wikipedia.org/wiki/Air_Ontario_Flight_1363