Table 5 Summary of recommendations

From: Changing platforms without stopping the train: experiences of data management and data management systems when adapting platform protocols by adding and closing comparisons

Each entry below lists the process, the associated risks to patient safety or data integrity, and the recommendations.

Process: CRF development and maintenance
Risks:
- An increased number and/or complexity of CRFs may lead to erroneous completion, impacting data quality and possibly safety data.
- The additional time required to incorporate new data requirements may impact other data management tasks.
Recommendations:
- Consider which CRFs will be generic and which comparison-specific, for both initial and additional comparisons.
- Consider the possible future impact of changes to the visit schedule and data collection.
Process: Randomisation system
Risks:
- Ineligibility checks incorrectly implemented, or unable to be implemented at all, leading to patients being incorrectly randomised to a specific comparison, or randomised when they should not be.
- Allocation and stratification weightings for new and ongoing comparisons incorrectly implemented, or unable to be implemented at all.
Recommendations:
- Ensure the chosen randomisation system^a can incorporate changes to questions, eligibility, multi-randomisation sub-groups, tables, weightings and stratification factors.
- Check for imbalances before resetting tables; adjust if appropriate.
- Write thorough test scripts and test updates before go-live.
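To make the weighting recommendation concrete, the following is a minimal sketch (not a validated randomisation system) of per-comparison allocation weightings that can be updated when a comparison is added or closed. The comparison names, arm names and ratios are illustrative assumptions, not taken from the trial.

```python
import random

# Illustrative allocation weightings; each comparison carries its own
# ratio, so adding a comparison only adds an entry (assumed structure).
COMPARISONS = {
    "comparison_A": {"control": 1, "research_A": 1},  # 1:1 allocation
    "comparison_B": {"control": 1, "research_B": 2},  # 1:2 allocation
}

def allocate(comparison: str, rng: random.Random) -> str:
    """Pick an arm for one patient, weighted by the allocation ratio."""
    arms = COMPARISONS[comparison]
    return rng.choices(list(arms), weights=list(arms.values()), k=1)[0]

# Fixed seed so the sketch is reproducible; a real system would not do this.
rng = random.Random(0)
allocations = [allocate("comparison_B", rng) for _ in range(300)]
```

Because each comparison's weightings are held separately, an update to one comparison can be tested in isolation before go-live, in line with the test-script recommendation above.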
Process: Database: design
Risks:
- Increasing complexity increases the time needed to incorporate new data requirements.
- Complexity, legacy issues and redundancy may impact database performance, taking time away from other tasks.
- Complexity, legacy issues and redundancy may increase the risk of incorrect specification, programming and testing, which in turn risks incorrect data capture and validation in the live environment.
- Database change control, or the need for an entirely new database design, may impact timelines for activating a comparison or other data management tasks.
Recommendations:
- The chosen CDMS^a must be flexible enough to incorporate many, increasingly conditional, changes.
- Consider a single database design or multiple modular designs to incorporate multiple comparisons. The former may be efficient if new comparisons involve only limited, non-complex changes; the latter increases the number of databases to maintain but contains complexity and limits each database's lifespan. If using multiple designs, look for logical separations between them.
- The chosen CDMS^a should have a proven record of handling large amounts of data, in terms of the overall numbers of trial participants, questions and validations.
- Minimise long eCRFs and the number and complexity of on-screen validations.
- Write thorough test scripts and test updates before go-live.
- Re-use shared elements to save development time.
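As one way of picturing an on-screen validation, here is a minimal sketch of a reusable range check that could be shared across generic eCRFs. The field, limits and message wording are illustrative assumptions, not part of any named CDMS.

```python
# Minimal sketch of a shared on-screen validation rule (assumed design):
# one range-check function re-used by any numeric eCRF field.
def validate_field(value, lower, upper):
    """Return a query message if the value is missing or out of range, else None."""
    if value is None:
        return "Value is missing"
    if not lower <= value <= upper:
        return f"Value {value} outside expected range {lower}-{upper}"
    return None

# Illustrative use: a blood-pressure field with assumed limits.
msg = validate_field(210, lower=30, upper=200)
```

Sharing one such function across forms is an example of the re-use recommendation above: the rule is specified, programmed and tested once rather than per CRF.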
Process: Database: table structure
Risks:
- The growing number of data points in a CDMS may impact database performance, which puts existing data at risk if any errors occur while saving.
- Data entry may take longer, drawing resources away from other trial tasks.
Recommendations:
- Ensure the data storage capacity of the chosen CDMS^a is scalable for the forecast number of patients and additional comparisons.
- Use multiple databases, or set up the CDMS to partition data across multiple tables. Consider the logical separation of expected data across multiple comparisons.
- Put processes in place to manage partitioned data (e.g., reports, data mergers).
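The data-merger recommendation can be sketched as follows, assuming data have been partitioned into one table per comparison; the table contents and field names here are illustrative assumptions.

```python
# Minimal sketch: merge per-comparison partitions into one dataset for
# reporting, tagging each row with its source comparison (assumed layout).
partitions = {
    "comparison_A": [{"patient_id": "A001", "visit": "baseline"}],
    "comparison_B": [{"patient_id": "B001", "visit": "baseline"},
                     {"patient_id": "B002", "visit": "week4"}],
}

def merge_partitions(tables: dict) -> list:
    """Flatten per-comparison tables, recording each row's comparison."""
    merged = []
    for comparison, rows in tables.items():
        for row in rows:
            merged.append({**row, "comparison": comparison})
    return merged

report = merge_partitions(partitions)
```

Keeping the comparison label on every merged row preserves the logical separation even after the partitions are recombined for a report.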
Process: Database: support
Risks:
- Existing trial data may be at risk if new bugs occur in a database that can no longer be fixed. Additional work may be required to transfer data to a new CDMS or CDMS version.
- Updating an already complex database may require in-depth existing knowledge. If staff change, the database may not be updated or regression-tested appropriately, risking data capture and validation.
Recommendations:
- When starting the trial, investigate whether the predicted support lifetime of the chosen CDMS^a lines up with the predicted timelines for the maximum number of comparisons to be added.
Process: Training and documentation
Risks:
- Increasingly complex guidance for multiple comparisons risks a lack of understanding and misreporting of data at sites, or mishandling of data at the sponsor.
Recommendations:
- Prepare to regularly update an increasingly large set of documentation, and to train sites on both generic and comparison-specific data management processes.
Process: Competing, concurrent tasks: data queries and CRF chases
Risks:
- Data for some comparisons may not be queried before analyses take place, risking data quality during this period.
- Insufficient data cleaning before analysis is a risk to data integrity in any trial. It may be greater in platform protocols with a shared control arm, because reporting on the control arm may be imbalanced if it has been chased more frequently.
Recommendations:
- Plan for the possibility of priority analyses occurring close together, without neglecting other comparisons.
- Send queries for all comparisons where possible. If the volume is too great, queries may have to be split by patient, site or CRF.
- Ensure both the control arm and the research comparisons are sufficiently chased and cleaned before analysis.
- Before any analysis, check for imbalances in reporting (missing expected forms and triggered event forms) between the control arm and the research comparison.
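The pre-analysis imbalance check can be sketched in a few lines. The counts and the 5% tolerance below are illustrative assumptions; in practice the figures would come from the CDMS and the tolerance from the trial's data management plan.

```python
# Minimal sketch of a pre-analysis imbalance check on missing expected
# forms; all numbers are assumed for illustration.
def missing_rate(expected: int, received: int) -> float:
    """Proportion of expected forms not yet received."""
    return (expected - received) / expected

control = missing_rate(expected=200, received=180)    # 10% missing
research = missing_rate(expected=200, received=150)   # 25% missing

# Flag the comparison if the arms' missing-form rates differ by more
# than an assumed tolerance, prompting extra chasing before analysis.
TOLERANCE = 0.05
imbalanced = abs(control - research) > TOLERANCE
```

A flagged comparison would trigger additional chasing of the under-reported arm, addressing the shared-control-arm imbalance risk described above.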
Process: Competing, concurrent tasks: opening new comparisons while managing existing comparisons
Risks:
- Data management for existing comparisons may be neglected if staff spend time setting up a new comparison.
- A new comparison may not adequately incorporate new data requirements if staff are occupied with existing comparisons.
Recommendations:
- Resource adequately for both ongoing data management and the work required to add a new comparison.
- Consider competing trial priorities when planning to activate a new comparison.
Abbreviations: CRF, case report form; eCRF, electronic case report form.

^a The choice of an in-house or third-party clinical data management system (CDMS) is likely made at unit level, and there may not be scope for choosing a different approach or for switching between third-party systems. The recommendations in the table on how to set up any given CDMS should be considered to reduce size and complexity where possible.