Oral

Survival Analysis 2

Presenter: Mailis Amico

When: Tuesday, July 12, 2016      Time: 11:00 AM - 12:30 PM

Room: Salon B Carson Hall (Level 2)

Session Synopsis:

The single-index / Cox proportional hazard mixture cure model: a new modelling approach for the mixture cure model.

In biometry, as in other fields, the classical survival analysis assumption that all observations will experience the event of interest (if followed long enough) is often unrealistic (e.g. time to relapse of cancer, time to onset of a disease). Survival data can contain a “cured” or an “immune” fraction that cannot be handled by classical survival models. Cure models extend classical survival analysis models to address this situation. The mixture cure model is one of the two broad classes of cure models that have been proposed in the literature. Considering the population of interest as a mixture of cured and uncured individuals, the model is composed of two parts: the incidence part, referring to the probability of being uncured, and the latency part, corresponding to the survival function of the uncured observations. Most often, the probability of being uncured is modelled parametrically via a logistic regression model. In this research, we propose a semi-parametric model through a single-index structure, which offers more flexibility than a parametric approach while avoiding the curse of dimensionality encountered in nonparametric modelling. We use a kernel estimator for the unknown link function in the single index and develop an estimation method based on the EM algorithm. We propose a data-driven (cross-validation) method to determine the bandwidth within the EM algorithm. Identifiability of the model is guaranteed by the zero-tail constraint for the mixture cure model and by identifiability conditions specific to the single-index structure. Based on simulations, we demonstrate the good performance of the proposed method for both continuous and categorical covariates, whether or not the true model for the incidence part is of logistic form.
We also present an application of our methodology to a real dataset and contrast our results with those obtained assuming a logistic regression model for the incidence part.
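The mixture-cure structure described above can be sketched in a few lines. The simulation below is a hypothetical illustration, not the authors' implementation: it writes the population survival as S_pop(t|x) = 1 − p(x) + p(x)·S_u(t), and uses a Nadaraya-Watson kernel smoother (Gaussian kernel, fixed bandwidth, censoring ignored for brevity) as a stand-in for the kernel estimator of the unknown link g in the single-index incidence p(x) = g(γ'x).

```python
import numpy as np

rng = np.random.default_rng(0)

def s_pop(t, p_uncured, s_uncured):
    """Population survival of a mixture cure model:
    S_pop(t|x) = 1 - p(x) + p(x) * S_u(t)."""
    return 1.0 - p_uncured + p_uncured * s_uncured(t)

def nw_link(index_grid, index_obs, uncured_obs, h):
    """Nadaraya-Watson estimate of the link g in p(x) = g(gamma' x),
    smoothing cure status against the index (Gaussian kernel)."""
    k = np.exp(-0.5 * ((index_grid[:, None] - index_obs[None, :]) / h) ** 2)
    return (k @ uncured_obs) / k.sum(axis=1)

# Simulate a case where the true link happens to be logistic in gamma' x.
n = 2000
x = rng.normal(size=(n, 2))
gamma = np.array([1.0, -0.5])          # illustrative index coefficients
idx = x @ gamma
p_true = 1.0 / (1.0 + np.exp(-idx))
uncured = rng.uniform(size=n) < p_true

grid = np.linspace(-2, 2, 9)
g_hat = nw_link(grid, idx, uncured.astype(float), h=0.3)
g_true = 1.0 / (1.0 + np.exp(-grid))
print(np.max(np.abs(g_hat - g_true)))  # small on this simulated example
```

With a logistic true link, the kernel estimate recovers g closely on this simulated example; in the paper the bandwidth would instead be chosen by cross-validation within the EM algorithm, and censoring would be handled through the E-step.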

Presenter: Denekew Belay

Session Synopsis:

A Bayesian joint model for time to malaria and mosquito abundance data from Ethiopia

Introduction: Currently, more than two billion people live at risk of contracting malaria, and the estimated global annual incidence of clinical malaria is greater than 300 million cases. More than one million people die every year from the direct causes of malaria, with children under five years of age living in sub-Saharan Africa at highest risk. Data: We used data from a project originally aimed at assessing the effect of the construction of a mega hydropower dam in southwest Ethiopia on malaria incidence. Sixteen villages were selected, at various distances from the dam shore. In each village on average 130 households were sampled, and in each household one child of age below ten was randomly chosen and followed from the start of the study. Each child of the total cohort of 2,040 children was visited weekly during the 23 months of the study, and where relevant a blood test was performed to establish whether the child had contracted malaria. Methodology: We used a Bayesian joint model for time to malaria and mosquito abundance data. For the abundance we used a linear mixed-effects model with monthly rainfall, seasonal variation and each child's distance to water as covariates. For time to event we used a proportional hazards model with the latent mosquito abundance as a main effect and an interaction of this abundance with a measure of the local incidence of malaria in the previous three weeks. Inference is carried out via MCMC using the approach of the R package JMbayes [1]. Results: Based on the joint model, there is a significant association between latent mosquito abundance and time to malaria. Most importantly, the interaction between newly confirmed malaria cases around each child and the latent mosquito abundance has a strong, significant association with time to malaria. This indicates that preventive intervention could advantageously target the pool of malaria parasites available for infection, in addition to mosquito control.
The observed mosquito abundance is significantly associated with distance to the dam shore. This is joint work with Arnoldo Frigessi, Ayele Taye, Yehenew Getachew, Luc Duchateau and Jon Michael Gran. References: [1] Rizopoulos, D. (2014). The R Package JMbayes for Fitting Joint Models for Longitudinal and Time-to-Event Data using MCMC. arXiv preprint arXiv:1404.7625.
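As a rough sketch of the model structure (coefficient values and function names here are illustrative assumptions, not the fitted Ethiopian model), the hazard's linear predictor combines the latent abundance from the mixed-effects model with its interaction with recent local incidence:

```python
import numpy as np

def latent_abundance(rain, dist, u, b0=0.5, b_rain=0.8, b_dist=-0.3):
    """Illustrative linear mixed-effects structure for latent abundance:
    m = b0 + b_rain * rain + b_dist * dist + u (village random effect u).
    Seasonal terms are omitted from this sketch."""
    return b0 + b_rain * rain + b_dist * dist + u

def relative_hazard(abundance, recent_cases, alpha_main, alpha_inter):
    """Proportional-hazards linear predictor with latent abundance m(t)
    and its interaction with recent local incidence c(t):
    exp(alpha1 * m + alpha2 * m * c), relative to baseline."""
    return np.exp(alpha_main * abundance + alpha_inter * abundance * recent_cases)

m = latent_abundance(rain=1.2, dist=0.5, u=0.1)
hr_no_cases = relative_hazard(m, recent_cases=0.0, alpha_main=0.4, alpha_inter=0.9)
hr_cases = relative_hazard(m, recent_cases=1.0, alpha_main=0.4, alpha_inter=0.9)
print(hr_cases > hr_no_cases)  # interaction raises hazard when cases are nearby
```

The positive interaction term mirrors the abstract's main finding: the same latent abundance implies a higher malaria hazard when recent cases have occurred around the child.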

Presenter: Coraline Danieli

Session Synopsis:

Competing risk modelling of time-varying drug exposures

An accurate assessment of the safety or effectiveness of drugs in pharmaco-epidemiological studies requires defining an etiologically correct time-varying exposure model, which specifies how previous drug use affects the hazard of the event of interest. An additional challenge is to study the possible multitude of adverse events that may be associated with the use of a given drug, each possibly involving a different biological mechanism. To simultaneously address both challenges, we develop, and validate in simulations, a new approach that combines (1) flexible modelling of the cumulative effects of time-varying exposures with (2) competing risks methodology. In particular, we use the Lunn-McNeil data augmentation approach [1] to separate the effects of the same drug exposure on different outcomes (e.g. adverse events A vs B), within a competing risks extension of the Cox proportional hazards model. Then, to account for the dosage, duration and timing of past exposures, we rely on a flexible weighted cumulative exposure (WCE) model [2]: WCE(u | x(t), t ≤ u) = Σ_{t≤u} w(u − t) x(t), where x(t) denotes the exposure at time t and w(·) is a weight function estimated from the data.
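The WCE term can be illustrated numerically. In the sketch below, the exponential weight function and the dose values are purely illustrative assumptions; in the WCE model of [2] the weight function w(·) is estimated flexibly with splines from the data:

```python
import numpy as np

def wce(u, times, doses, w):
    """Weighted cumulative exposure at time u:
    WCE(u | x(t), t <= u) = sum_{t <= u} w(u - t) * x(t)."""
    past = times <= u
    return np.sum(w(u - times[past]) * doses[past])

times = np.array([1.0, 2.0, 3.0, 4.0])   # exposure times
doses = np.array([10.0, 0.0, 5.0, 5.0])  # dose x(t) at each time
decay = lambda s: np.exp(-s)             # illustrative weight w(u - t)

print(wce(2.0, times, doses, decay))  # only exposures at t <= 2 contribute
```

Recent doses receive larger weights than distant ones, so the same cumulative dose taken recently produces a larger WCE value, which is exactly the etiological flexibility the model provides.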

Presenter: Ruth Keogh

Session Synopsis:

Dynamic prediction of survival using landmarking in large longitudinal observational patient databases: challenges and solutions

In ‘dynamic’ prediction of survival we make updated predictions of individuals’ survival over time as new information becomes available about their health status; e.g. given survival to time t, we may estimate the probability of survival to time t+5 years based on data up to time t. Landmarking is an attractive and flexible method for dynamic prediction modelling. The times at which predictions are made are the ‘landmarks’, and Cox models are used to estimate survival from each landmark to some time horizon using data up to the landmark. Large observational patient databases provide longitudinal data on clinical measurements and present opportunities to develop ‘personalised’ dynamic predictions of survival. However, these data also present challenges for landmarking, including that the frequency of measurements varies between individuals and that a single measurement may not best represent current status, due to random short-term fluctuations. Applications of landmarking have so far used the ‘last observation carried forward’ approach to obtain measurements at each landmark time. However, this may result in biased estimates due to the challenges noted above. More recently, mixed models have been suggested as a way to predict the measurements at each landmark, which can accommodate error due to random fluctuations. However, this approach has not yet incorporated differing measurement frequencies or allowed for uncertainty in the predictions. We show how the frequency of measurements can be incorporated into mixed prediction models in this context, and how uncertainty in predictions can be incorporated using a multiple imputation technique for mixed models. This work is motivated by the aim of developing dynamic prediction models for survival in people with cystic fibrosis (CF), one of the most common inherited life-shortening diseases, using data from the US CF Patient Registry, which contains longitudinal clinical data on over 45,000 people.
Measures are obtained at visits to the care team, which occur at frequencies that differ considerably between individuals. Here, the landmarks are ages. The data used for dynamic prediction include current lung function and its recent trajectory, airway infections, and number of exacerbations. Our methods are compared in simulation studies and illustrated using the CF data.
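As a minimal sketch of the landmarking setup (subject data, landmark and horizon values are invented for illustration), the last-observation-carried-forward construction the text warns about looks like this: at each landmark, subjects still at risk contribute their most recent measurement, and follow-up is administratively censored at the prediction horizon.

```python
def locf_at_landmark(meas_times, meas_values, landmark):
    """Most recent measurement at or before the landmark, or None."""
    past = [v for t, v in zip(meas_times, meas_values) if t <= landmark]
    return past[-1] if past else None

def landmark_dataset(subjects, landmark, horizon):
    """Rows (id, covariate, follow-up time, event indicator) for a Cox
    model fitted from this landmark to landmark + horizon."""
    rows = []
    for sid, (times, values, event_time) in subjects.items():
        if event_time <= landmark:     # not at risk at the landmark
            continue
        x = locf_at_landmark(times, values, landmark)
        if x is None:                  # no measurement yet
            continue
        rows.append((sid, x,
                     min(event_time, landmark + horizon),   # censor at horizon
                     event_time <= landmark + horizon))     # event in window?
    return rows

# Illustrative subjects: (measurement times, values, event/censoring time)
subjects = {
    "A": ([10.0, 12.0], [80.0, 75.0], 20.0),  # survives past the horizon
    "B": ([11.0], [60.0], 14.0),              # event inside the window
    "C": ([9.0], [90.0], 11.5),               # event before the landmark
}
rows = landmark_dataset(subjects, landmark=12.0, horizon=5.0)
print(rows)
```

The mixed-model alternative discussed in the abstract would replace `locf_at_landmark` with a prediction from a fitted mixed model, with multiple imputation used to propagate the prediction uncertainty.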

Presenter: Rodrigo Pescim

Session Synopsis:

The destructive Poisson-OLLGHN model for survival data with cure rate

In this paper, we propose a new flexible cure rate survival model by assuming that the initial number of competing causes of the event of interest follows a Poisson distribution and that the time to event has the odd log-logistic generalized half-normal (OLLGHN) distribution (Cordeiro et al., 2015). This new survival model provides a realistic interpretation of the biological mechanism of the occurrence of the event of interest in studies related to carcinogenesis (initiation of a tumor, promotion and progression of the tumor to a detectable cancer) in the presence of competing latent causes. This mechanism includes a process of destruction of tumor cells after an initial treatment, or the capacity of an individual exposed to irradiation to repair initiated cells that would otherwise result in cancer (Borges et al., 2012). We estimate the model parameters using maximum likelihood. We derive the appropriate matrices for assessing local influence diagnostics on the parameter estimates under different perturbation schemes. In addition, we define martingale and modified deviance residuals to detect outliers and evaluate the model assumptions. We also demonstrate that the extended cure rate regression model can be very useful in the analysis of real survival data, providing more realistic fits than other cure rate survival regression models commonly used in the literature. The potential of the new cure rate survival model is illustrated by means of a real data set.
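The Poisson cure-rate structure underlying the model can be sketched as follows. If the number of initial competing causes N is Poisson(θ) and each cause has latent event-time CDF F(t), then S_pop(t) = E[(1 − F(t))^N] = exp(−θ F(t)), with cure fraction exp(−θ). The sketch below substitutes a plain half-normal F(t) for the OLLGHN distribution purely for brevity, and ignores the destructive (thinning) mechanism:

```python
from math import erf, sqrt, exp

def half_normal_cdf(t, sigma=1.0):
    """CDF of a half-normal latent event-time distribution, t >= 0.
    Stands in for the OLLGHN distribution in this sketch."""
    return erf(t / (sigma * sqrt(2.0)))

def s_pop(t, theta, cdf=half_normal_cdf):
    """Population survival of the Poisson cure-rate model:
    S_pop(t) = exp(-theta * F(t)), cure fraction exp(-theta)."""
    return exp(-theta * cdf(t))

theta = 1.2  # illustrative mean number of competing causes
print(s_pop(0.0, theta))  # equals 1 at time zero
print(s_pop(1e6, theta))  # levels off at the cure fraction exp(-theta)
```

The plateau of S_pop at exp(−θ) > 0 is what distinguishes cure-rate models from proper survival models; the destructive mechanism of the paper effectively thins θ, raising the cure fraction after treatment.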

Presenter: Jane Hutton

Session Synopsis:

Assessing and Modelling informative missing data in survival analysis

One approach to assessing the impact of missing covariate data on parametric regression models for lifetime data is to jointly model the survival times (T) and the missing data mechanism. An accelerated life model has log(T) as a linear combination of covariates plus an error term, which might follow a normal or logistic distribution. A linear model with normal error for a continuous variable, M, with only the sign of M observed, is used for the missingness mechanism. When the error terms in the lifetime and missing data models are correlated, missingness is informative. Simulation studies have shown that such models provide reliable and precise estimates of survival. Sensitivity to the untestable assumptions for the missingness mechanism can also be investigated. Chain event graphs (CEGs) have been shown to be useful in exploring different patterns of missing data with discrete variables (Barclay et al., 2014). A CEG is similar to a Bayesian network model, as it expresses the distribution as a series of conditional independence statements. However, the conditioning is on the values of each variable, which yields a probability tree. These probability trees are then simplified to a more compact graph using observed symmetries. This provides more nuanced descriptions of data, which allow subsets defined by a particular value of one discrete variable to have sub-graphs with uninformative missing data. This framework is extended to allow the final (outcome) variable in the graph to be a lifetime, possibly censored. Data from people with cerebral palsy who were under the care of an English paediatrician during 1951 to 1964 provide an opportunity to assess mortality in older people with cerebral palsy (Hemming et al., 2006). Impairments are known to have substantial effects on survival, but there is substantial missing impairment data. The joint modelling of missingness and lifetime, and the chain event graph approach, will be illustrated using these data.
Results will be compared with imputation-based methods for semi-parametric proportional hazards models. References: Barclay, L.M., Hutton, J.L. and Smith, J.Q. (2014). Chain event graphs for informed missingness. Bayesian Analysis, 9, 53-76. Hemming, K., Hutton, J.L. and Pharoah, P.O.D. (2006). Long term survival for a cohort of adults with cerebral palsy. Dev. Med. Child Neurol., 48, 90-95.
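The joint lifetime/missingness mechanism can be illustrated with a small simulation (all parameter values are invented for illustration): a latent normal variable M determines, through its sign, whether a covariate is recorded, and correlation between the error terms of the two models makes the missingness informative about the lifetime.

```python
import numpy as np

rng = np.random.default_rng(1)

# Correlated error terms: eps for the accelerated life model, eta for
# the latent missingness variable. rho != 0 => informative missingness.
n, rho = 50_000, 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])
eps, eta = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

x = rng.normal(size=n)
log_t = 0.5 + 0.8 * x + eps   # accelerated life model for log(T)
m = 0.2 + 0.4 * x + eta       # latent normal M; only its sign is observed
observed = m > 0              # covariate x is recorded only when M > 0

# Informative missingness: mean log-lifetime differs between subjects
# with observed and missing covariate data.
gap = log_t[observed].mean() - log_t[~observed].mean()
print(gap)  # positive here, because rho > 0 and the x-coefficients agree in sign
```

An analysis that ignored the missingness mechanism (e.g. complete-case analysis) would overstate survival in this setup, which is precisely the bias the joint model corrects; sensitivity analyses vary rho, since it is not identifiable from the observed data alone.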