
ANGELO MINEO

Modelling the background correction in microarray data analysis

Abstract

Microarray technology has been adopted in many areas of biomedical research for quantitative and highly parallel measurements of gene expression. In this field, the high-density oligonucleotide microarray is the most widely used platform; on this platform, oligonucleotides 25 bases long are used as probes for the genes of interest. Two types of probes are considered: perfect match (PM) and mismatch (MM) probes. In theory, MM probes are used to quantify and remove two types of error: optical noise and non-specific binding. The correction of these two types of error is known as background correction. Preprocessing is an essential step of the analysis, in which the intensity read from each probe is processed to obtain an expression measure for each gene. In this paper, we introduce a new method for background correction that uses a calibration approach based on a generalized linear mixed model (GLMM).
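As an illustrative sketch only (the notation below is assumed for exposition and is not the specific GLMM formulation proposed in the paper), probe-level intensities in this setting are commonly decomposed into optical noise, non-specific binding, and true signal, with the MM probe carrying no signal component:

% Illustrative probe-level decomposition (assumed notation, not the
% paper's GLMM): O = optical noise, N = non-specific binding,
% S = signal proportional to target abundance, for probe j in probe set i.
\begin{align}
  PM_{ij} &= O_{ij} + N^{PM}_{ij} + S_{ij}, \\
  MM_{ij} &= O_{ij} + N^{MM}_{ij}.
\end{align}
% Background correction aims to recover S_{ij} (or a transform of it)
% from the observed PM and MM intensities.

Under a decomposition of this kind, the MM intensities provide information on the background components, which a calibration approach based on a mixed model can exploit to estimate and remove the background from the PM intensities.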