Next in our series on EMA’s Guideline on Computerized Systems and Electronic Data in Clinical Trials, we look at the sections on data transfer and data migration.
The guideline does not define data transfer, but defines migration “as opposed to…transfer” as a process of “permanently moving existing data (including metadata) from one system into another system.” This definition could encompass transfers, which are sometimes “permanent” and may involve transfer of metadata. We offer this alternative definition, which we believe captures the guideline’s intent: data migration is the permanent movement of data and metadata from one system to another with the intention of managing the data henceforth in the new system; data transfer is the movement of a subset of data from one system to another for any other purpose.
For example, if an electronic copy of a protocol generated in a RIMS system is automatically forwarded with its metadata to the eTMF system, that’s a transfer. If all the documents in an eTMF system are downloaded with their metadata and uploaded to a new eTMF system so they can be managed there for the duration of the study, that’s a migration.
Regulatory authorities have increasingly focused on data transfer over the past few years, requesting data flow documentation during inspections and questioning sponsors closely about the controls at every step of the workflow. Per the guideline, data transfer tools and processes must be validated, including “appropriate challenging test sets.” Although the guideline does not give any examples, most sponsors use validated systems and access controls. Where systems are not validated, they may employ hash comparisons to confirm that the file received is identical to the file sent, or manual checks such as verifying that the size of the received file matches the size of the file sent. Adoption of these techniques is not uniform across the industry; we see many sponsors and CROs with validated secure File Transfer Protocol (sFTP) systems, and about an equal number of sponsors and CROs that use poorly controlled, unvalidated Box or SharePoint accounts or email for this purpose, although the balance is starting to tip in favor of greater control.
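To illustrate the hash-comparison technique mentioned above, here is a minimal sketch in Python. The function name and chunk size are our own choices, not anything prescribed by the guideline: the sender computes a SHA-256 digest before transmission, the receiver recomputes it on arrival, and any mismatch signals loss or corruption in transit.

```python
import hashlib

def sha256_of_file(path, chunk_size=65536):
    """Compute the SHA-256 digest of a file, reading in chunks
    so that large data files do not have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical workflow: the sender records sha256_of_file("export.csv")
# in the transfer documentation; the receiver recomputes the digest on
# the received copy and confirms the two values match.
```

A content hash is a stronger check than comparing file sizes alone, since two files of identical size can still differ; in practice the digest value is typically recorded alongside the transfer documentation.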
As sponsors trend toward standing up their own eTMF systems, we are seeing more TMF migrations than we used to. “Rescue” situations also result in migrations of safety data, Interactive Response Technology (IRT) data, and Electronic Data Capture (EDC) data. Per the guideline, migration processes and tools need to be validated in addition to validation of the systems involved. Validation starts with a risk analysis, which leads to a risk-appropriate migration plan, including a mapping of data from one system to the other. The process and tools are tested using mock data in a test instance of the target system, after which key data are verified to show that they were transferred without loss or corruption. Sponsors should retain the plan, validation documentation, and documentation of the actual migration, including verification.
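The verification step described above — confirming that key data arrived in the target system without loss or corruption — can be sketched as a reconciliation of fingerprints computed from source and target extracts. This is our own illustrative approach, not a method prescribed by the guideline; the record structure and function names are hypothetical.

```python
import hashlib

def record_fingerprint(record):
    """Hash a record's fields in a canonical order so that the same
    record produces the same fingerprint in either system."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify_migration(source_records, target_records):
    """Compare extracts from the source and target systems and report
    records missing from, or unexpectedly present in, the target."""
    src = {record_fingerprint(r) for r in source_records}
    tgt = {record_fingerprint(r) for r in target_records}
    return {
        "missing_in_target": src - tgt,
        "unexpected_in_target": tgt - src,
    }
```

In a real migration the extracts would cover the key data identified in the risk analysis, and a non-empty result set would be investigated and documented as part of the verification record.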
When sponsors migrate documents from one system to another, they frequently find that the target system cannot accommodate the source system’s audit trail. Per the guideline, in these situations, “adequate mitigating actions should be taken to establish a robust method to join the audit trail and data for continuous access by all stakeholders.” For example, the audit trail at the time of migration might be extracted from the source system in .pdf format and placed in a repository where users of the target system could access it. If this is not done, the guideline sternly cautions us, “a detailed explanation is expected.”