
to specify appropriate methods and oversee and guide the day‐to‐day work of a more hands‐on but potentially less experienced statistician.

      It is usually helpful to establish an advisory group during the planning stage: a strong advisory group that endorses the IPD meta‐analysis project may add credibility, help to persuade trial investigators to participate, and strengthen funding applications.

      As participant‐level data are being used, commissioners may sometimes suggest that an IPD meta‐analysis project should establish governance structures similar to those for a clinical trial. However, with the exception of those that are prospective (Section 3.12), IPD meta‐analysis projects use participant‐level data that have already been collected in existing trials and no new participants are being recruited; therefore, there should usually be no requirement for a steering or data monitoring and ethics committee (as would be needed for a clinical trial).

      Unlike a conventional systematic review, the pace and progress of an IPD meta‐analysis project are not entirely under the control of the research team. The time required for some activities can be estimated reasonably well, such as producing a protocol, running searches, screening for eligibility, and preparing a data dictionary (Section 4.2). However, the time taken to assemble, code and check the quality and applicability of received IPD is highly dependent on the actions of the data providers (Sections 4.4 to 4.6), and so is often difficult to estimate in advance.

      There is inevitably a lag between inviting trial investigators to participate and receiving their IPD. After being personally persuaded of the value of the proposed project (which may itself take some time), some trial investigators will need to obtain institutional or other approval to release data. This can take several months, particularly if approval must be granted at institutional review board or committee meetings, which may be scheduled infrequently. Data and documentation may then have to be located in stores and archives, and prepared for release.

      After data have been received, the central research team need to check the trial IPD, send queries to trial investigators, and resolve any issues. Again, this can take considerable time, particularly if the lead trial investigator needs to liaise with others, such as the trial data manager or statistician, to determine exactly how data have been coded or which variables were used to define outcomes in the original trial analyses. It is important to remember that, whilst the IPD meta‐analysis may be a top priority for the research team leading it, the same may not be true for trial investigators, who are likely to have many competing demands on their time.

      Unless there are very specific assurances that IPD will be released rapidly and that any issues encountered during data checking will be resolved promptly by trial investigators, a rule of thumb is to allow upwards of a year for collecting and checking the IPD from the full set of eligible trials. For large projects involving many trials, IPD collection, cleaning and harmonisation may take much longer, and typically 18 to 24 months is needed before the IPD meta‐analysis itself can begin. Much of the elapsed time is taken up by communications and by trial investigators gaining approvals. Older trials may be difficult and time‐consuming to trace, and agreement may be harder to reach for controversial topics, so these sorts of issues should also be factored into project timelines.

      Likewise, there will usually be a lag between requesting IPD from a data repository and the subsequent provision of that IPD. Repositories usually have a process for approving project proposals prior to releasing a trial’s IPD, and this may take several months to complete. As noted earlier, experience to date has been mixed: whilst some teams have found that communication with data providers has become more streamlined and that pre‐coded datasets can reduce the time taken to prepare data for analysis,74 others have found it difficult to obtain permissions when multiple data owners are involved.75 Based on the limited experience so far, it is sensible to factor in at least a year to obtain the necessary approvals and gain access to IPD from data repositories and data‐sharing platforms.

      As trial data usually arrive sporadically over a period of time, data checking is usually done concurrently with data collection and coding. Data are checked as soon as possible after receipt, and any issues discussed and resolved with trial investigators, usually with a series of iterations (Section 4.5.4). This often includes analysing each trial individually and comparing with any published analyses, both to understand any differences that may arise as a result of, for example, using different outcome definitions, and to ensure that the central research team has understood the data correctly. The time and resource needed for checking will depend on how clean the data are on arrival and the extent of checking required. Allowing about three or four days per trial for carrying out data checking is a good starting point. It is useful to allow extra time beyond when the last dataset might be received, in order to complete the checking processes.
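      To make the per‐trial checking step more concrete, the sketch below re‐derives some simple summary statistics from a received trial dataset and compares them with figures reported in that trial’s publication, flagging discrepancies as queries to send back to the trial investigators. It is purely illustrative and not taken from the book: the trial identifiers, column names (arm, event) and reported values are hypothetical placeholders, and real checking would cover many more items (such as baseline characteristics, randomisation dates, follow‐up and outcome definitions).

# Illustrative sketch (not from the book): re-derive simple counts from a
# received trial dataset and compare them with the published figures.
# Trial identifiers, column names and reported values are hypothetical.
import pandas as pd

# Figures transcribed from each trial's publication (hypothetical)
reported = {
    "TRIAL_A": {"n_randomised": 412, "events_treatment": 57, "events_control": 81},
    "TRIAL_B": {"n_randomised": 268, "events_treatment": 34, "events_control": 40},
}

def check_trial(trial_id, ipd):
    """Return a list of queries where the IPD disagrees with the publication."""
    pub = reported[trial_id]
    queries = []

    # Compare the number of participants in the IPD with the number randomised
    if len(ipd) != pub["n_randomised"]:
        queries.append(f"{trial_id}: {len(ipd)} participants in IPD vs "
                       f"{pub['n_randomised']} reported as randomised.")

    # Compare event counts derived from the IPD with those reported per arm
    for arm in ("treatment", "control"):
        derived = int(ipd.loc[ipd["arm"] == arm, "event"].sum())
        if derived != pub[f"events_{arm}"]:
            queries.append(f"{trial_id}: {derived} {arm} events derived from IPD vs "
                           f"{pub[f'events_{arm}']} reported; check outcome coding.")
    return queries

# Example: a made-up dataset for TRIAL_B with one deliberate discrepancy
ipd_b = pd.DataFrame({
    "arm": ["treatment"] * 134 + ["control"] * 134,
    "event": [1] * 34 + [0] * 100 + [1] * 39 + [0] * 95,
})
for query in check_trial("TRIAL_B", ipd_b):
    print(query)  # queries to raise with the trial investigators

      In practice, scripts of this kind would be run trial by trial as each dataset arrives, so that queries can be raised and resolved with investigators while IPD from the remaining trials is still being collected.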
