Nothing evolves in a vacuum. While today we can transmit reports, research, and more at the click of a button, fifty years ago this was science fiction. Regulatory information management was then a vague concept: paper submissions were physically transported to reviewers, regulations were national, and determining the quality of research required review of a physical dossier.
Technology and the evolution of regional standards changed the regulatory landscape. Electronic data storage eliminated the need for printed submissions but created the need for a central record that eased regulators' access to information. Technology filled that gap with solutions built to meet regulatory requirements. Today, regulatory operations and regulatory affairs rely on regulatory information management (RIM) systems developed to capture and manage submission publishing metadata in support of regional regulatory requirements.
As regulatory requirements expanded, RIM systems evolved to accommodate increasing complexity and to match standard industry workflows and processes. That complexity, however, fragmented workflows and processes, and centralized management with standardization across countries, where possible, emerged as the industry consensus. That viewpoint led to a push for end-to-end technologies with workflows and triggers connecting regulatory strategy, submission planning and tracking, content management, publishing, registration management, and health authority correspondence management. Data, however, remained siloed and processes separate in order to stay within the bounds of regulatory requirements.
The transition from physical submissions to innovative technology opens a conversation that could permanently change the face of regulatory within life sciences: a comprehensive solution that meets regulatory needs and ensures safe and effective drugs and therapies in every corner of the globe.
Life sciences industry participants and regulatory agencies pursue a shared objective – bringing safe and effective therapies to market. These stakeholders diverge only in scope. A regulator's focus is regional and singular: regulators are bound by regional law, and while agencies must operate within a budget, financial viability, operational efficiency, and product planning are not among their primary considerations.
Industry is responsible for global compliance — all regulations in all countries where it maintains marketing authorization for products or conducts clinical trials must be met. Information must also be formatted in accordance with each country's technical specifications, which often vary. The duplication of effort and lack of visibility into what's ultimately submitted in each region are catalysts to harmonize data and processes across regions. It's no surprise that effective global lifecycle management with a focus on data governance continues to emerge as a foundational strategy for industry, even if regulators do not require it.
Evolution of Regulators and Requirements
Historically, each country developed its own national standards for managing drug safety, each overseen by a different governing organization, and the pharmaceutical industry had to comply with all of them. While some organizations, like the United States Food and Drug Administration (U.S. FDA), date back to the 1930s, the substantial increase in laws and guidelines governing the safety and efficacy of drugs and therapeutics in the 1960s and 1970s necessitated additional regulatory oversight organizations in each region. Multiple agencies meant multiple standards, diversifying the data and formats pharmaceutical organizations needed for submission.
The advent of multiple agencies increased the pressure on life sciences organizations to align to different requirements in different regions, slowing therapies' path to market and increasing industry costs. To manage this divergent trajectory, the International Council for Harmonisation (ICH) was formed and created the Common Technical Document (CTD), which standardized what each agency would accept while leaving room for national differences.
Recognizing the need to harmonize not only document structure but also data across countries, the International Organization for Standardization (ISO) began developing a way to globalize the Identification of Medicinal Products (IDMP). By gaining agreement across all 167 members of ISO and developing a set of standards to facilitate consistent data exchange between regulators and industry, IDMP helps improve patient safety and gives life sciences companies an opportunity to interlink product registrations and pharmacovigilance data.
With the original standards published in 2012, both regulators and industry had access to new standards for moving toward global data harmonization. This acknowledged the need for harmonized data sets and ways of operating that improve drug safety, manage drug shortages, and increase transparency between countries. Around the same time, the medical device industry and its regulators recognized the same need and agreed on guidance to standardize and harmonize Unique Device Identification (UDI) systems at the global level.
In other words, regulation first emerged to harmonize and manage static content and is now evolving to harmonize global data exchange.
Technology Unlocks the Power of Data
At the same time global standards were emerging, technology advanced by leaps and bounds. Significant advances in the ability to store and share massive amounts of data now enable instantaneous global connectivity. Where organizations once favored on-premises software, they now prefer software-as-a-service (SaaS) applications delivered via the cloud, which offer seemingly unlimited scalability while reducing overhead costs. In less than a century, technology has evolved from disparate systems to interconnected ones that mirror an increasingly global economy.
Going a level deeper, software itself continues to evolve. Alexa and Netflix cater to and anticipate users' preferences, making content available on demand in regionally specific languages and in compliance with local and national regulations. By connecting vast amounts of data and applying automation and predictive analytics, these technologies meet needs instantly and learn to meet them better. The amount of available data is increasing exponentially, but thankfully technology has advanced to where capabilities like artificial intelligence can harness it to inform smarter decisions.
The Challenge in an Evolving Landscape
Technology is not the limiting factor in remaking the regulatory landscape. Designing processes to match new technical capabilities is the key that unlocks progress, and it will require courage as stakeholders shift away from the old way of doing things. Technological limitations and geography both pushed legacy processes and information sharing toward siloed data, and budgets, teams, and systems were built within the same silos. However, there is an extensive list of examples proving that processes, corporate structure, budgets, and technology can advance to meet modern needs. In the case of Defense Finance and Accounting Services (DFAS), a footprint of over 300 physical office sites translated into a sprawling information technology infrastructure with over 330 different systems. By consolidating those 300 office sites into 10 regional locations, DFAS was able to reduce the number of IT systems to 70. The pharmaceutical industry, as well as regulators, should similarly prioritize process reinvention to take advantage of technological advancements.
Instead of having multiple independent systems that cannot communicate with each other and that require data to be copied and re-entered between them, pharma needs to move toward end-to-end connectivity, establishing the source of truth for each dataset and then reusing that dataset across the entire ecosystem.
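To make the idea concrete, here is a minimal sketch of a single source of truth in Python. All names (`MasterDataRegistry`, `ProductRecord`, "MP-001") are invented for illustration, not drawn from any real RIM product: downstream consumers resolve the same authoritative record rather than holding their own copies, so there is nothing to re-enter or reconcile.

```python
from dataclasses import dataclass, field

@dataclass
class ProductRecord:
    """One authoritative record for a medicinal product (illustrative)."""
    product_id: str
    name: str
    attributes: dict = field(default_factory=dict)

class MasterDataRegistry:
    """Holds exactly one copy of each dataset; consumers read references."""
    def __init__(self) -> None:
        self._records: dict[str, ProductRecord] = {}

    def register(self, record: ProductRecord) -> None:
        self._records[record.product_id] = record

    def get(self, product_id: str) -> ProductRecord:
        # Publishing, registration tracking, and safety reporting all
        # resolve the same object -- no copy, no re-entry.
        return self._records[product_id]

registry = MasterDataRegistry()
registry.register(ProductRecord("MP-001", "Examplomab",
                                {"dose_form": "solution for injection"}))

publishing_view = registry.get("MP-001")
safety_view = registry.get("MP-001")
assert publishing_view is safety_view  # one source of truth, reused everywhere
```

In a real landscape the registry would be a governed master data management service rather than an in-memory dictionary, but the principle is the same: every system reads from, and writes changes back to, one authoritative record.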
The remit of regulatory within life sciences is no longer simply to maintain compliant submissions; it has expanded to include maintaining compliant data that can serve a variety of needs, including authoring documentation, submitting data, making strategic decisions, tracking manufacturing, and monitoring drug safety.
Reimagining the Regulator-Industry Dynamic
Global regulatory agencies accustomed to managing static content are now evolving to facilitate the exchange of harmonized data. Not only do global data standards ease the operational burden of life sciences organizations, but they also open new pathways for the future.
While life sciences organizations seek to meet global regulatory requirements for drug safety, international standards such as IDMP and UDI create common terminologies and identifiers and provide a framework for global operations. Global standards create transparency, offer a view of the data in each country, and begin the process of harmonizing that data to the standard to improve drug safety and decrease time to market. Unlike regulators, who are focused on aligning their national data to IDMP, industry can align all of its data across all regions and all countries. This vantage point puts industry in a unique position to offer regulators recommendations on how best to implement IDMP: industry can do what individual regions cannot do as easily – provide a global perspective. Industry can also drive adoption of the technologies best equipped to harmonize the approach.
What’s the Future of Regulatory and Technology?
Regulatory is on the verge of foundational change. On one side, we see an ecosystem driven by regulators and their requirements, but we’re transitioning to a new paradigm where industry takes the lead, driving toward purpose-built solutions that embrace transformative innovation and create 360-degree visibility across the regulatory lifecycle. Through the harmonization of data, and therefore regulations, industry can push for faster overall results; this achieves the joint goal of regulators and industry to bring safer and more effective drugs to market with greater speed and urgency.
IDMP standardizes referentials from drug discovery to post-market authorization. The current moment offers the life sciences industry a unique opportunity to revisit internal standards and practices, ensuring they are compliant, optimized from drug discovery onward, and positioned to leverage the new standard. While the five standards within IDMP will shape forthcoming national standards, a global standard also lets organizations streamline beyond the data required for regulatory compliance and pivot toward an open architecture that supports further growth, embraces new and emerging technologies, and enables better strategic decision making.
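For reference, the five ISO standards that make up IDMP can be laid out as a simple lookup table. The mapping below reflects the published scope of each standard; the dictionary and helper function are illustrative only, not an official IDMP data model.

```python
# The five ISO standards comprising IDMP, mapped to the referential
# domain each governs (illustrative structure, not an official model).
IDMP_STANDARDS = {
    "ISO 11615": "Regulated medicinal product information (MPID)",
    "ISO 11616": "Regulated pharmaceutical product information (PhPID)",
    "ISO 11238": "Substance identification",
    "ISO 11239": "Dose forms, units of presentation, routes of "
                 "administration, and packaging",
    "ISO 11240": "Units of measurement",
}

def standards_for(domain_keyword: str) -> list[str]:
    """Return the IDMP standard(s) whose domain mentions the keyword."""
    return [
        std for std, domain in IDMP_STANDARDS.items()
        if domain_keyword.lower() in domain.lower()
    ]

print(standards_for("substance"))  # → ['ISO 11238']
```

Treating these referentials as structured, queryable data rather than static text is precisely the shift the standard enables: the same controlled vocabulary can feed authoring, submission, and registration tracking alike.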
IDMP also standardizes beyond the established national and regional standards – creating a springboard for industry stakeholders to reconsider how data is accepted by regulatory agencies and how they formulate future initiatives to provide better transparency. With industry leading the way and leveraging global standards, does that mean regulators lose authority? Not at all. Instead, the use of global standards streamlines this joint industry-health authority endeavor while benefiting patients, who will be able to receive needed care more quickly and gain improved transparency into the safety and effectiveness of their drugs and devices.
The sheer prevalence of advanced technologies affords industry and regulators the ability to choose the solutions best suited to consolidate an approach around emerging standards like IDMP. Artificial intelligence (AI), natural language processing (NLP), and machine learning (ML) allow vast amounts of data to be quickly processed and assessed, driving new insights. Advances in master data management and microservices allow for the alignment and dissemination of data. Improvements in cross-functional operational automation allow systems to easily talk to one another and exchange data. If industry and regulators work together to standardize and ease this exchange within an open architecture framework, all parties will benefit from reduced duplication and re-entry of data, thereby improving confidence in authoritative data sources.