The doctoral dissertations of the former Helsinki University of Technology (TKK) and Aalto University Schools of Technology (CHEM, ELEC, ENG, SCI) published in electronic format are available in the electronic publications archive of Aalto University - Aaltodoc.
Dissertation for the degree of Doctor of Science in Technology to be presented with due permission of the Department of Computer Science and Engineering for public examination and debate in Auditorium T2 at Helsinki University of Technology (Espoo, Finland) on the 29th of October, 2004, at 12 o'clock noon.
Overview in PDF format (ISBN 951-22-7343-8) [7381 KB]
Dissertation is also available in print (ISBN 951-22-7342-X)
Contemporary science produces vast amounts of data. The analysis of this data plays a central role in all empirical sciences, as well as in the humanities and arts that use quantitative methods. A central task of the information scientist is to provide this research with sophisticated, computationally tractable data analysis tools.
When the information scientist confronts a new target field of research that produces data for her to analyse, she has two options. She may make specific hypotheses, or guesses, about the contents of the data and test these using statistical analysis. Alternatively, she may use general-purpose statistical models to gain better insight into the data before making detailed hypotheses.
Latent variable models are one class of such general models. In particular, this thesis discusses latent variable models in which the measured data are generated by hidden sources through some mapping. The task of source separation is to recover these sources; additionally, one may be interested in the details of the generation process itself.
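As a minimal sketch of this generative picture (the notation here is generic and not taken verbatim from the thesis), the simplest, linear instance assumes that the observed signals are an unknown mixture of the hidden sources plus noise:

    x(t) = A s(t) + n(t)

where x(t) denotes the measured signals, s(t) the hidden sources, A the unknown mixing matrix, and n(t) noise. Source separation then amounts to estimating s(t), and possibly A, from x(t) alone.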
We argue that when little is known of the target field, independent component analysis (ICA) serves as a valuable tool for solving a problem called blind source separation (BSS). BSS means solving a source separation problem with no, or at least very little, prior information. When more is known of the target field, it is natural to incorporate that knowledge into the separation process, and we also introduce methods for this incorporation. Finally, we suggest a general framework of denoising source separation (DSS) that can serve as a basis for algorithms ranging from almost blind approaches to highly specialised, problem-tuned source separation algorithms. We show that certain ICA methods can be constructed in the DSS framework, which leads to new, more robust algorithms.
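The following is a minimal, illustrative one-unit sketch of how a DSS-style iteration might look in code; it assumes the data have already been whitened (sphered), and the tanh nonlinearity used as the denoising function is only an illustrative placeholder, not the specific denoisers developed in the thesis.

    import numpy as np

    def dss_one_unit(X, denoise, n_iter=100, tol=1e-8):
        """Illustrative one-unit denoising-source-separation-style iteration.

        X       : (channels, samples) array, assumed already whitened.
        denoise : function mapping a 1-D source estimate to its denoised version;
                  this step is where prior knowledge about the sources enters.
        """
        rng = np.random.default_rng(0)
        w = rng.standard_normal(X.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            s = w @ X                     # current source estimate
            s_plus = denoise(s)           # denoising step
            w_new = X @ s_plus            # re-estimate the projection vector
            w_new /= np.linalg.norm(w_new)
            if abs(abs(w_new @ w) - 1.0) < tol:   # converged (up to sign)
                w = w_new
                break
            w = w_new
        return w, w @ X

    # Example usage with a crude tanh "denoiser", reminiscent of ICA-style DSS:
    # w, s = dss_one_unit(X_whitened, denoise=np.tanh)

With a mild, signal-agnostic denoiser such as this, the procedure behaves like a blind method; replacing the denoiser with one tuned to the target field turns the same iteration into a specialised separation algorithm.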
It is natural to use the accumulated knowledge from applying BSS in a target field to devise more detailed source separation algorithms. We call this process exploratory source separation (ESS). We show that DSS serves as a practical and flexible framework to perform ESS, too.
Biomedical systems, such as the nervous system and the heart, are arguably among the most complex systems that human beings have ever studied. Furthermore, contemporary physics and technology have made it possible to study these systems while they operate in near-natural conditions. The use of these sophisticated instruments has resulted in a massive explosion of available data. In this thesis, we apply the developed source separation algorithms to the analysis of the human brain, using mainly magnetoencephalograms (MEG). The methods are directly usable for electroencephalograms (EEG) and, with small adjustments, for other imaging modalities such as (functional) magnetic resonance imaging (fMRI) as well.
This thesis consists of an overview and the following six publications:
Keywords: exploratory source separation, independent component analysis, blind source separation, denoising, denoising source separation, biomedical systems, biomedical data, magnetoencephalograms, electroencephalograms
This publication is copyrighted. You may download, display and print it for your own personal use. Commercial use is prohibited.
© 2004 Helsinki University of Technology