Bayesian Source Estimation

Kevin H. Knuth
kevin.h.knuth@nasa.gov

http://www.huginn.com/knuth

This site is an updated version of my older BSE site, which contains derivations that can also be found in the publications.
Feel free to visit the old site, but be aware that this one is more up to date.

Last updated: 6 November 2004


SOURCE SEPARATION
Source separation is a ubiquitous problem in the sciences: multiple signal sources s(t) are recorded by multiple detectors, so that each detector records a mixture x(t) of the original signals. The goal is to recover estimates y(t) of the original source signals.
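
To make the model concrete, here is a minimal sketch of the instantaneous linear mixing assumed above; the waveforms, mixing matrix, and noise level are invented for illustration and are not taken from any of the papers below.

    # Minimal sketch of the linear mixing model x(t) = A s(t) + noise.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 1000)

    # Two hypothetical sources s(t)
    s = np.vstack([np.sin(2 * np.pi * 7 * t),            # source 1
                   np.sign(np.sin(2 * np.pi * 3 * t))])  # source 2

    A = np.array([[1.0, 0.6],      # each detector records a different mixture
                  [0.4, 1.0]])

    x = A @ s + 0.05 * rng.standard_normal((2, t.size))  # recordings x(t)

    # Source separation: estimate an unmixing matrix W so that y = W x
    # recovers the sources (up to scaling and permutation).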

SOURCE SEPARATION AS INFERENCE
The problem of source separation is by its very nature an inductive inference problem. There is not enough information to deduce the solution, so one must use any available information to infer the most probable solution. This information comes in two forms: the signal model and the probability assignments. By adopting a signal model appropriate for your problem, you can develop an algorithm that is specially tailored to suit your needs. Many people like the idea of a general blind source separation algorithm that can be applied anywhere; however, since the quality of the results depends on the information you put into the algorithm, you will do better with an algorithm that incorporates more specific knowledge.
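
Schematically, and following the derivations in the papers below, both kinds of information enter through a posterior probability of the form

    p(A, s | x)  ∝  p(x | A, s) p(s) p(A),

where the likelihood p(x | A, s) encodes the signal model (how the mixtures x arise from the sources s through the mixing matrix A), and the priors p(s) and p(A) encode what is known about the sources and the mixing before the data are seen.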

THE BAYESIAN APPROACH
What I appreciate about the Bayesian approach is that it requires you to make your assumptions explicit. This is not the case with ad hoc source separation algorithms, which are almost impossible to modify intelligently if they do not quite work for a particular application. With a Bayesian solution, one needs only to trace the problem back to the model, the probability assignments, or a simplifying assumption and modify it appropriately. While this is often easier said than done, it is still better than the situation one is in when dealing with an ad hoc algorithm, where the model and assumptions are implicit and often unknown.

 

EXAMPLES
Here are four examples of Bayesian source separation algorithms that I have worked on...

Back to top

 


Bayesian derivation of ICA
In this paper, I demonstrated how one can derive the Bell & Sejnowski Infomax ICA algorithm in the Bayesian framework. The advantage here is that the prior probabilities and the signal model can be varied at will to produce a host of source separation algorithms specifically tailored to suit the needs of a particular problem. My subsequent work demonstrated this approach on various applications.

Knuth K.H. 1999. A Bayesian approach to source separation. In: J.-F. Cardoso, C. Jutten and P. Loubaton (eds.), Proceedings of the First International Workshop on Independent Component Analysis and Signal Separation: ICA'99, Aussois, France, Jan. 1999, pp. 283-288. [arXiv link] [pdf (218 kb)]
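
For concreteness, here is a hedged sketch of the natural-gradient form of the Bell & Sejnowski Infomax update, which the paper re-derives as a MAP estimate. The logistic source prior yields the familiar (1 - 2*sigma(y)) score function; under the Bayesian derivation, swapping in a different source prior changes only that one line. The function name, learning rate, and iteration count are my own illustrative choices, not code from the paper.

    # Hedged sketch: Infomax ICA as iterative ascent with a logistic source prior.
    import numpy as np

    def infomax_ica(x, n_steps=2000, lr=0.01):
        """x: (n_channels, n_samples) array of zero-mean mixtures."""
        n = x.shape[0]
        W = np.eye(n)                                  # unmixing matrix estimate
        for _ in range(n_steps):
            y = W @ x                                  # current source estimates
            score = 1.0 - 2.0 / (1.0 + np.exp(-y))     # -d/dy log p(y) for a logistic source prior
            W += lr * (np.eye(n) + (score @ y.T) / x.shape[1]) @ W   # natural-gradient Infomax update
        return W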

Back to example list

 


ICA with source position priors
In this picture, the sensors around the bridge recorded sounds from each of the characters during one of the crew's weekly catastrophic events. Since we know that the Starship Enterprise officers won't wander far from their posts, we can use their approximate locations to help separate their recorded speech signals from the mayhem.
[Image: On the Bridge of the Enterprise]

In the SPIE'98 paper below, I considered an example where the source positions are known with some accuracy. This knowledge, combined with the inverse-square propagation law of the signals, leads to a prior probability on the values of the mixing matrix, which in general improves the separation. The results aren't perfect, however, because I use a prior on the source amplitude histograms that is inappropriate for some of the other recorded signals, such as the photon torpedo blast. These difficulties are discussed in the MaxEnt97 paper below, although in a different context. More detailed information can be found at my old BSE site and in the papers below.
I won't tell you who survived, but it's a sure bet that Ensign Jones is toast.
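
Here is a rough sketch of how approximately known source positions can induce a prior on the mixing matrix, in the spirit of the SPIE'98 paper below: with inverse-square propagation, the expected coupling between source j and detector i falls off as 1/r_ij^2. The coordinates, the normalization, and the prior width are invented for illustration.

    # Sketch: a prior on the mixing matrix from believed source positions.
    import numpy as np

    detectors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])   # detector coordinates
    sources   = np.array([[1.0, 1.0], [3.0, 2.5]])               # believed source coordinates

    r = np.linalg.norm(detectors[:, None, :] - sources[None, :, :], axis=-1)
    A_expected = 1.0 / r**2    # inverse-square law gives the prior mean of each coupling

    # e.g. a Gaussian prior on the mixing matrix A, centred on A_expected, whose
    # width reflects how well the source positions are actually known:
    sigma_A = 0.2 * A_expected
    def log_prior_A(A):
        return -0.5 * np.sum(((A - A_expected) / sigma_A) ** 2)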

Knuth K.H. 1998. Bayesian source separation and localization. In: A. Mohammad-Djafari (ed.), SPIE'98 Proceedings: Bayesian Inference for Inverse Problems, San Diego, July 1998, pp. 147-158. [arXiv link] [pdf (363 kb)]

Knuth K.H. 1998. Difficulties applying recent blind source separation techniques to EEG and MEG. In: G.J. Erickson, J.T. Rychert and C.R. Smith (eds.), Maximum Entropy and Bayesian Methods, Boise 1997, Kluwer, Dordrecht, pp. 209-222. [pdf (421 kb)]

Back to example list


EEG/MEG Source Separation and Localization
Introducing information about the signal propagation, and modeling the source locations as well as their signals, leads naturally to joint source separation and localization. We investigated this briefly in the paper below, where we demonstrated that the selected model of the physical system is all that separates source separation problems from source localization problems. The basic problem considered in this paper was the estimation of neural sources. (Yes, that is my brain in the picture.)

Knuth K.H., Vaughan H.G., Jr. 1999. Convergent Bayesian formulations of blind source separation and electromagnetic source estimation. In: W. von der Linden, V. Dose, R. Fischer and R. Preuss (eds.), Maximum Entropy and Bayesian Methods, Munich 1998, Kluwer, Dordrecht, pp. 217-226. [pdf (204 kb)]
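
The following is a conceptual sketch of the point above: separation and localization share the same posterior and differ only in how the mixing is modelled. The forward_model() function here is a hypothetical stand-in for a real head model, not code from the paper.

    # Sketch: the mixing matrix as a free parameter vs. a function of position.
    import numpy as np

    def forward_model(source_positions, detector_positions):
        """Hypothetical physics: coupling falls off with squared distance."""
        r = np.linalg.norm(detector_positions[:, None, :] - source_positions[None, :, :], axis=-1)
        return 1.0 / r**2

    # Blind separation:  the mixing matrix A itself is the unknown parameter.
    # Localization:      A = forward_model(positions, detectors), and the source
    #                    positions are the unknowns; the rest of the posterior is unchanged.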

Back to example list


Differentially Variable Component Analysis (dVCA)
This is a highly specialized algorithm that takes into account the fact that EEG/MEG experiments record data in a finite number of experimental trials. Since the activity produced by neural ensembles varies from trial to trial, our signal model accounts for this by allowing the source waveshape to vary in both amplitude and latency. These effects are estimated for each trial along with the stereotypic source waveshape. The relevant publications are below:

Knuth K.H., Shah A.S., Truccolo W.A., Bressler S.L., Ding M., Schroeder C.E. 2005. Differentially variable component analysis (dVCA): Identifying multiple evoked components using trial-to-trial variability. Submitted. [on request]

Truccolo W.A., Knuth K.H., Shah A.S., Bressler S.L., Schroeder C.E., and Ding M. 2003. Estimation of single-trial multi-component ERPs: Differentially variable component analysis. Biol. Cybern. 89(6): 426-38. [PubMed link] [pdf (717kb)]

Shah A.S., Knuth K.H., Lakatos P., Schroeder C.E. 2003. Lessons from applying differentially variable component analysis (dVCA) to electroencephalographic activity. In: G.J. Erickson, Y. Zhai (eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Jackson Hole WY 2003, AIP Conference Proceedings 707, American Institute of Physics, Melville NY, pp. 167-181. [pdf (445 kb)]

Shah A.S., Knuth K.H., Truccolo W.A., Ding M., Bressler S.L., Schroeder C.E. 2002. A Bayesian approach to estimating coupling between neural components: evaluation of the multiple component event related potential (mcERP) algorithm. In: C. Williams (ed.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Moscow ID 2002, AIP Conference Proceedings 659, American Institute of Physics, Melville NY, pp. 23-38. [pdf (1.05 mb)]

Truccolo W.A., Ding M., Knuth K.H., Nakamura R. and Bressler S.L. 2002. Variability of cortical evoked responses: implications for the analysis of functional connectivity. Clinical Neurophysiol. 113(2): 206-26. [PubMed link] [pdf (433 kb)]

Truccolo W.A., Knuth K.H., Ding M., Bressler S.L. 2001. Bayesian estimation of amplitude, latency and waveform of single trial cortical evoked components. In: R.L. Fry and M. Bierbaum (eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Baltimore 2001, AIP Conference Proceedings 617, American Institute of Physics, Melville NY, pp. 64-73. [pdf (100 kb)]

Knuth K.H., Truccolo W.A., Bressler S.L., Ding M. 2001. Separation of multiple evoked responses using differential amplitude and latency variability. Proceedings of the Third International Workshop on Independent Component Analysis and Blind Signal Separation (ICA 2001), San Diego CA. [arXiv link] [pdf (248 kb)]
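
To illustrate the signal model described above, here is a hedged sketch of synthetic single-trial data of the kind dVCA is designed for: a stereotypic component waveshape appears in every trial with a trial-specific amplitude and latency and is coupled into the recording channels. All of the numbers below are invented for illustration; this is not code from the papers.

    # Sketch: synthetic single-trial data with trial-to-trial amplitude and latency variability.
    import numpy as np

    rng = np.random.default_rng(1)
    n_trials, n_chan, n_time = 50, 4, 300
    t = np.arange(n_time)

    s = np.exp(-0.5 * ((t - 150) / 20.0) ** 2)       # one stereotypic component waveshape
    C = rng.uniform(0.5, 1.5, size=(n_chan, 1))      # coupling of the component to the channels

    x = np.empty((n_trials, n_chan, n_time))
    for r in range(n_trials):
        alpha = 1.0 + 0.3 * rng.standard_normal()    # trial-specific amplitude
        tau = int(rng.integers(-10, 11))              # trial-specific latency shift (in samples)
        x[r] = alpha * (C @ np.roll(s, tau)[None, :]) + 0.1 * rng.standard_normal((n_chan, n_time))

    # dVCA estimates the waveshape s, the coupling C, and the per-trial amplitudes
    # and latencies jointly from the recorded trials x.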

Back to example list