What is Similarity?

The European Medicines Agency (EMA) has published the final version of its reflection paper on "similarity". The paper discusses the appropriate use of statistics during drug development when a comparative assessment of quality attributes (QAs) is needed (e.g. in the case of manufacturing changes, or in biosimilar and generic development).
In 2017, the EMA released a draft Reflection Paper (RP) on statistical methodology for the comparative assessment of QAs. This was followed by a one-year public consultation phase ending on 31 March 2018. In total, about 300 general and 1,000 specific comments were received by the time the consultation period ended. When finalizing the RP, the EMA considered these comments as well as input from a workshop conducted in 2018. According to the EMA, examples where the comparative evaluation of QAs plays a major role in regulatory decision-making include:

  • Comparison of two drug product versions, "e.g. when it becomes necessary to establish a bridge between clinical trials, where the two versions of the drug product were used exclusively", 
  • Comparison of a biosimilar product to a reference medicinal product, in particular in development programmes with (suggested) abbreviated clinical development parts,
  • Comparison of a generic product to the reference medicinal product, e.g. when a bioequivalence trial is waived, and pivotal evidence to support similarity comes from comparative dissolution experiments.

Consideration of comments received

According to the agency, one of the most relevant comments received during the consultation period was that the draft RP did not answer the question: "What is similarity?". Thus, a new concept, the "similarity condition", is introduced in the revised RP. The RP points out that a definition is first required of when the two data distributions to be compared (e.g., data taken from two manufacturing processes) would allow a conclusion of "similarity". In addition, it is important to differentiate the "similarity condition" from the "similarity criterion": the former corresponds to a consensus between the stakeholders involved in regulatory decision-making (applicants as well as regulators); the latter is a concrete instruction on how data should be analyzed to check whether the a priori agreed "similarity condition" can be assumed to hold or not.

  • Similarity condition: It is emphasized in the final RP that "any similarity / equivalence claim eventually made after a comparative data analysis always refers to two underlying data distributions, and not to the two sets of samples taken from two processes (e.g. two sets of batches)". Such a claim would first require an agreement on a "similarity condition", i.e. "a concise description for when two data distributions allow a conclusion of similarity". To specify the similarity condition, considerations related to the maximum allowed difference between the two data distributions for the specific QA are of interest. This should be guided by an understanding of what differences in a QA could have an impact on clinical efficacy or safety. Similarity could then be defined, for example, based on the overlap of the test distribution with a reference range derived from the distribution for the reference product.

  • Similarity criterion: The agency states that "a clear idea regarding the similarity condition for a specific QA in a given context is essential before a similarity criterion is applied". The revised RP provides an overview of frequently used similarity criteria, such as criteria based on statistical intervals.

  • QA data comparison protocol: The draft RP included an Appendix regarding the planning of tasks related to QAs' data comparison (“check list”). The revised RP contains a more concrete recommendation to plan prospectively for any QA data comparison which might influence regulatory decision-making. The preparation of a "QA data comparison protocol" containing the following topics is suggested:

    - The objective as well as the context of the QA data comparison,
    - Elaborations concerning the similarity condition,
    - Considerations concerning the sampling strategy,
    - The choice of similarity criteria,
    - A description of how the combined comparative evaluation of several QAs would eventually translate into an overall conclusion of similarity at the quality level.
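To make the distinction between "similarity condition" and "similarity criterion" concrete, the sketch below illustrates one frequently discussed interval-based approach: a reference range is derived from reference-product batch data (here, mean ± 3 standard deviations, a common but assumed choice), and the criterion checks whether the test-batch results fall within that range. This is an illustrative example only; neither the specific interval, the factor k, nor the data are taken from the reflection paper, which deliberately does not endorse any single criterion.

```python
# Illustrative sketch of an interval-based similarity criterion.
# NOT a method endorsed by the EMA reflection paper; all numbers are hypothetical.
import statistics

def reference_range(ref_values, k=3.0):
    """Reference range as mean +/- k standard deviations of the reference batches.

    k=3.0 is an assumed, commonly cited choice, not a regulatory requirement.
    """
    m = statistics.mean(ref_values)
    s = statistics.stdev(ref_values)
    return m - k * s, m + k * s

def fraction_within(test_values, low, high):
    """Fraction of test-batch results falling inside the reference range."""
    inside = sum(low <= x <= high for x in test_values)
    return inside / len(test_values)

# Hypothetical QA measurements (e.g. % purity) from reference and test batches
ref = [98.2, 99.1, 100.4, 101.0, 99.7, 100.2, 98.9, 100.8]
test = [99.0, 100.1, 99.6, 100.9, 98.7]

low, high = reference_range(ref)
frac = fraction_within(test, low, high)
print(f"Reference range: [{low:.2f}, {high:.2f}], fraction of test batches inside: {frac:.0%}")
```

Note that the hard part in practice is not this computation but the prospective agreement on the similarity condition itself (how much overlap, for which QA, with what clinical justification) before any such criterion is applied.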

What is a reflection paper?

The following definition is provided for the RP in the Procedure for European Union Guidelines and Related Documents within the Pharmaceutical Legislative Framework:
"A reflection paper may be developed to communicate the current status of discussions or to invite comment on a selected area of medicinal product development or a specific topic. It can provide a framework for discussion or clarification particularly in areas where scientific knowledge is fast evolving or experience is limited. A reflection paper does not provide scientific, technical or regulatory guidance, but may contribute to future development of such guidelines, or related documents."

FDA's statistical approach

The U.S. Food and Drug Administration (FDA) previously withdrew its draft guidance on "Statistical Approaches to Evaluate Analytical Similarity", issued in September 2017. According to the FDA, the draft guidance was intended to advise sponsors developing biosimilar products on the evaluation of analytical similarity between a proposed biosimilar product and a reference product. After considering public comments, the FDA decided to withdraw the draft guidance. The comments addressed a range of issues that could impact the cost and efficiency of biosimilar development, including the number of reference product lots the draft guidance recommended sampling.

For more information, please see the final "Reflection paper on statistical methodology for the comparative assessment of quality attributes in drug development", published together with the comments received on the draft RP on the EMA website.