Normalization methods in time series of platelet function assays: A SQUIRE compliant study

Sven Van Poucke*, Zhongheng Zhang, Mark Roest, Milan Vukicevic, Maud Beran, Bart Lauwereins, Ming-Hua Zheng, Yvonne Henskens, Markus Lance, Abraham Marcus

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

1 Citation (Web of Science)

Abstract

Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide, and by viscoelastic tests such as rotational thromboelastometry (ROTEM). Extracting meaningful statistical and clinical information from the high-dimensional data spaces of temporal multivariate clinical data, represented as multivariate time series, is a complex task. Building insightful visualizations for multivariate time series demands adequate use of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, and the approach best suited to platelet function data series is discussed. Normalization was calculated per assay (test) across all time points and per time point across all tests. Interquartile range, range transformation, and z-transformation preserved the correlation, as calculated by the Spearman correlation test, when normalized per assay (test) across all time points. When normalizing per time point across all tests, no correlation could be extracted from the charts, nor when all data were normalized as one dataset.
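The four normalization methods named in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the function names and the sample values are hypothetical, and the IQR variant assumes the common convention of centring on the median and scaling by the interquartile range. Normalizing "per assay across all time points" versus "per time point across all tests" corresponds to applying one of these functions along different axes of the assay-by-time matrix.

```python
import numpy as np

def z_transform(x):
    # Centre on the mean, scale by the standard deviation.
    return (x - x.mean()) / x.std()

def range_transform(x):
    # Rescale linearly to the interval [0, 1].
    return (x - x.min()) / (x.max() - x.min())

def proportion_transform(x):
    # Express each value as a fraction of the series total.
    return x / x.sum()

def iqr_transform(x):
    # Centre on the median, scale by the interquartile range.
    q1, q3 = np.percentile(x, [25, 75])
    return (x - np.median(x)) / (q3 - q1)

# Hypothetical assay readings over five time points (illustrative values only).
x = np.array([40.0, 55.0, 60.0, 70.0, 85.0])
print(range_transform(x))
```

To normalize per assay versus per time point on a 2-D array (assays as rows, time points as columns), the same function would be applied row-wise or column-wise, e.g. with `np.apply_along_axis(z_transform, 1, data)` versus axis `0`.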
Original language: English
Journal: Medicine
Volume: 95
Issue number: 28
DOIs
Publication status: Published - Jul 2016

Keywords

  • aggregometry
  • data space
  • high-dimensional
  • multivariate
  • normalization
  • platelets
  • thromboelastometry
