Online application

11 October 2013



Marketing authorisation applications are becoming larger and increasingly complicated, and the switch from paper to electronic dossiers in recent years has added to the overall complexity of the process. GlaxoSmithKline’s Alastair Nixon examines how the electronic common technical document’s implementation as a global standard affects verification on a regional level.


From the regulatory authority's perspective, receipt and handling of dossiers has changed significantly, from the logistics of a paper-based process some years ago to today's fully electronic process, where the dossier is delivered by an applicant directly to the regulatory authority via a portal or electronic gateway.

This brings its own challenges: the dossier content needs to be structured in accordance with the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH)'s electronic common technical document (eCTD), and in line with the regional guidelines for the content in the regional module of the dossier.

While the ICH eCTD is a global standard, all authorities worldwide adopting the eCTD have also issued their own detailed guidance on the processes involved in creating and submitting regulatory dossiers. These complement the ICH guidance, but also add specific regional context and instructions for applicants.

The eCTD is a well-structured entity and, as such, an eCTD submission can be subjected to various technical tests to determine whether it is valid and meets the specification requirements. In the early years of eCTD implementation, ICH issued guidance on typical issues seen at the time when eCTD submissions were received by regulatory authorities. This guidance, written up as a question-and-answer document - Q&A 36 - summarised 23 typical errors made by applicants that would result in an eCTD being difficult to process or read. The Q&A quickly became a set of standard tests against which all applicants would check their eCTD submissions.

Since these tests also cover the regional part of the dossier, regional authorities have generally taken Q&A 36 as a basis and added their own region-specific tests, issuing regional lists of validation criteria against which all incoming eCTD submissions are tested. These lists of validation criteria have become increasingly sophisticated.

Many agencies, including the US FDA, issue graded criteria where a severity rating is added to each individual test. Tests with a high severity rating must be passed, medium ones tend to involve some level of agency discretion, and tests listed as low severity tend to be advice only, and will not lead to a rejection on technical grounds. There is, however, a downside to this approach. Having a set of validation criteria for each incoming dossier that involves agency discretion and, thus, potential human intervention can be very resource-intensive for the agencies.

Adjusting the criteria

At the beginning of 2011, the European eCTD validation criteria severity ratings were changed from A, B and C (serious, medium and low) to a simpler pass/fail (P/F) and best practice (BP) approach, where all pass/fail criteria had to be passed in order for the dossier to be processed by the agency's internal systems.

This approach allows agencies to automate the validation process, sending applicants system-generated receipts that confirm their eCTD submissions' compliance with the validation criteria. Portals and gateways have since been developed that simply return an invalid submission to the applicant for correction and resubmission. Full automation of the validation process for electronic regulatory submissions also brings its own challenges, however.
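To make the pass/fail logic concrete, the sketch below shows how an automated gateway might apply a criteria list split into P/F and BP severities: any failed P/F check blocks processing and produces a rejection, while BP findings are reported back to the applicant without blocking. The rule numbers, checks and receipt format here are invented for illustration and do not reproduce the actual EU validation criteria or any agency's system.

```python
# Illustrative sketch only: the rule numbers, checks and receipt shape are
# invented and do not reproduce the EU validation criteria or any agency system.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Criterion:
    rule_id: str
    severity: str                      # "PF" (pass/fail) or "BP" (best practice)
    description: str
    check: Callable[[dict], bool]      # True means the submission passes this test

# Hypothetical criteria; a real list would mirror the published validation rules.
CRITERIA: List[Criterion] = [
    Criterion("1.1", "PF", "index.xml is present",
              lambda sub: "index.xml" in sub["files"]),
    Criterion("2.4", "PF", "every referenced file exists in the submission",
              lambda sub: all(ref in sub["files"] for ref in sub["references"])),
    Criterion("9.2", "BP", "file and folder names are lower case",
              lambda sub: all(path == path.lower() for path in sub["files"])),
]

def validate(submission: dict) -> Dict[str, object]:
    """Run every criterion and decide automatically whether the dossier can be
    loaded: a single P/F failure rejects it, BP findings are reported only."""
    pf_failures = [c.rule_id for c in CRITERIA
                   if c.severity == "PF" and not c.check(submission)]
    bp_findings = [c.rule_id for c in CRITERIA
                   if c.severity == "BP" and not c.check(submission)]
    return {"accepted": not pf_failures,
            "pf_failures": pf_failures,
            "bp_findings": bp_findings}

# Example: the submission references a file it does not actually contain,
# so P/F rule 2.4 fails and the system-generated receipt reports a rejection.
receipt = validate({
    "files": ["index.xml", "m1/eu/cover.pdf"],
    "references": ["m1/eu/cover.pdf", "m1/eu/application-form.pdf"],
})
print(receipt)   # {'accepted': False, 'pf_failures': ['2.4'], 'bp_findings': []}
```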

Neither the pharmaceutical industry nor regulatory authorities are in the business of marketing software. As data exchange standards and validation criteria are developed, the process is dependent on the vendors that develop and maintain software to build, view, validate and review compliant submissions.

"As data exchange standards and validation criteria are developed, the process is dependent on the vendors that develop and maintain software to build, view, validate and review compliant submission."

The standards are continually being changed and, as more regions worldwide adopt the ICH eCTD, the number of changes that the industry has to deal with is increasing significantly.

The eCTD is now accepted in the US, the EU, Japan, Switzerland, Canada, South Africa and Saudi Arabia. At the ICH level, there is global collaboration on development of the eCTD, but at a regional level, maintenance is driven by local factors, such as bug fixes and change requests for the regional part of the specification, or changes in local legislation.

In the EU, the eCTD guidance needs to be maintained on behalf of all member states. As agency names are updated, they need to be amended in the regional eCTD specification and, more importantly, as the EU expands and countries join, they must be added to the eCTD.

A recent example is the accession of Croatia into the EU in July 2013. In order to build eCTDs that include Croatian content, the country 'Croatia' and the Croatian language needed to be added to the pick lists in the eCTD's regional specification. This in turn also drove an update to the validation criteria. Since any change to the regional specification is costly and time-consuming for industry, agencies and software vendors, European authorities have tried to minimise the number of changes. For this reason, the recent update to include Croatia also incorporated a number of other changes that had been on hold pending a major update, such as the European Medicines Agency's change of acronym, which occurred in 2009.

The European authorities always allow six months between release of a new specification and its implementation to give software vendors time to develop solutions, and industry and agencies time to implement them. But there is now some recognition that six months may not be long enough for all the required activities to be carried out with sufficient robustness. This is more the case today than in the past because the business requirement to automate the eCTD submission validation process means that a successful transition to a new specification depends on a consistent interpretation of the specification, and of the associated validation criteria, by software vendors.

All software that validates eCTD submissions must give the same result; if not, an applicant could find that a submission that seemed valid when tested by their chosen system is deemed invalid on receipt at the agency, which may use a different vendor's tool. Any ambiguity in the written specification can result in tools behaving differently, which can in turn result in conflicting validation reports for the same eCTD submission. Each instance requires the issue to be escalated to the team responsible for the specification so the intended meaning can be clarified, and a written Q&A document to be issued to resolve the ambiguity. There were several instances of such ambiguity when the European validation rules were updated in February 2012: submissions tested with one tool passed, but failed when tested with another.

In the months following the release of EU Module 1 v2.0, it also became apparent that some of the validation criteria were not being interpreted as intended.

Validation, validation, validation

In recent years, some agencies saw cases where applicants had used life cycle in an eCTD to modify content already submitted, but had provided the replacement content in a different location in the eCTD table of contents. This poses a problem for eCTD viewing: how should a tool display the relationship between the replaced and replacement documents? Should they appear in the new location, the old location or both?

To get around this, a new validation criterion - 11.10 - was introduced to check for such instances. Rule 11.10 states: "For all leaves with an operation attribute value of replace, delete or append, the modified file must be present in the same CTD section of the dossier". Further guidance on what constitutes a section was also provided: "'Same CTD section' refers to the position in the table of contents. Sections are defined by the CTD and also by attributes in the eCTD. For example, applicants cannot replace content in the application form section with revised content that is being provided in the cover letter section. The eCTD attributes also create applicant-defined sections.

"The efficiency that is being introduced by the drive towards automated validation comes at the cost of increased complexity."

For example, each substance or manufacturer attribute in m3-2-s-drug-substance, or product-name attribute in m3-2-p-drug-product will create a new CTD section, and life cycle between these sections is also not allowed."
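To illustrate the kind of check rule 11.10 implies, here is a minimal sketch that walks a deliberately simplified model of an eCTD backbone: each leaf records its CTD section (including the attribute-defined part of the table of contents), its operation and, where relevant, the leaf it modifies. The flat dictionaries, leaf identifiers and error wording are hypothetical; a real validator would parse index.xml and the EU regional backbone rather than work from a structure like this.

```python
# Simplified sketch of the rule 11.10 check: a leaf that replaces, deletes or
# appends to earlier content must sit in the same CTD section as the leaf it
# modifies. The flat dictionaries stand in for a parsed eCTD backbone.
from typing import Dict, List

def check_rule_11_10(leaves: Dict[str, dict]) -> List[str]:
    """Return error messages for leaves that modify content
    located in a different CTD section (rule 11.10)."""
    errors = []
    for leaf_id, leaf in leaves.items():
        if leaf["operation"] not in ("replace", "delete", "append"):
            continue   # "new" leaves have nothing to compare against
        target = leaves.get(leaf["modifies"])
        if target is None:
            errors.append(f"{leaf_id}: modified leaf {leaf['modifies']} not found")
        elif leaf["section"] != target["section"]:
            errors.append(
                f"{leaf_id}: modifies content in section {target['section']} "
                f"from section {leaf['section']} - not allowed by rule 11.10")
    return errors

# The 'section' value includes the attribute-defined part of the table of
# contents (e.g. the substance/manufacturer in m3-2-s), so life cycle between
# two substance sections would also be flagged.
leaves = {
    "leaf-001": {"section": "m1-eu/application-form", "operation": "new",
                 "modifies": None},
    "leaf-002": {"section": "m1-eu/cover-letter", "operation": "replace",
                 "modifies": "leaf-001"},
}
for message in check_rule_11_10(leaves):
    print(message)   # flags leaf-002: application form replaced from the cover letter section
```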

Most vendors of eCTD validation tools, however, interpreted the definition of "section" more strictly, and their tools reported validation failures where another attribute, such as language, differed between the original document location and the replacement. Left unchallenged, this would have meant that applicants creating valid eCTD submissions received validation failure reports, either when running their own tests or following submission to the regulatory authority. To resolve the problem, the agencies, in collaboration with industry, worked with the key vendors to explain the intention behind the rule in more detail, and issued a fuller description of the error in a subsequent Q&A document to remove the ambiguity.

As mentioned earlier, the European Medicines Agency's change of acronym - from EMEA to EMA - was one of the key changes in this version of the eCTD specification. But when vendors of eCTD validation software started issuing new versions that handled the new criteria, it became apparent that under the same rule 11.10, most vendors were treating EMA as a different section to EMEA. This was not intended; the values EMEA or EMA are from a pick list of countries, and the only allowable value was changed from one to the other in the updated specification. Therefore, if the EMEA section were treated as a different section to EMA under validation criterion 11.10, it would be impossible for the applicant to fix the problem because the original section identifier was no longer an allowed value in the pick list. Again, vendors were contacted by the agencies, a Q&A was written and the problem resolved.

Both of these issues, with just one validation criterion, illustrate how difficult it is to write a specification that will be interpreted in the same way by different software vendors.

Discretionary measures

The approach to eCTD validation is similar in other countries, but at the present time, most agencies still incorporate some level of discretion. In the US, the FDA has already issued validation criteria with an intended implementation date of June 2014 and 30 days' advance notice. These criteria also have severity ratings of high, medium and low, where high means "the error is a serious technical error that prevents the processing of the submission and will require resubmission; the submission is considered not received by the FDA". For medium-severity criteria, however, the FDA still indicates some discretion ("the submission might be considered received by FDA").

Health Canada's eCTD validation rules are based on three severities: error, warning and information. Swissmedic has recently changed its validation criteria to follow the European P/F approach, and the Saudi Food and Drug Authority also uses P/F.

Full automation of receipt and validation of eCTD submissions is not yet a reality in Europe, and may not have the highest priority in other regions. In the EU, however, automation is part of the overall vision for the future of eSubmissions, which includes extended use of portals, introduction of smart electronic application forms and electronic signatures, mandatory eCTD and the introduction of eCTD v4.0.

Complexity and communication

As the technical specifications of the eCTD evolve and the trend towards automation of technical validation continues, it will become more critical that the criteria for technical validation of eCTD submissions are written in a way that is totally unambiguous; the aim is that, once issued, all software based on the criteria will behave in the same way.

For this reason, in the EU, the validation criteria themselves are changing from what was once a set of business rules and advice to applicants (such as the original ICH Q&A 36), which could be read and understood relatively easily, to a set of system rules that are more complex to read and grasp.

This trend continues with the latest version of the eCTD - eCTD v4.0 - where the relative complexity of the message has made the underlying technical files far more difficult to work with for those without dedicated software.

The efficiency that is being introduced by the drive towards automated validation comes at the cost of increased complexity, and will certainly result in a much greater reliance on effective collaboration between agencies, industry and software vendors.

The history of eCTD validation criteria in the EU.
Alastair Nixon is director of publishing (UK) at GlaxoSmithKline.

