Date: Friday, September 15, 2017
Location: 1084 East Hall (4:10 PM to 5:00 PM)
Title: Generalized CP Tensor Decompositions
Abstract: Given a tensor (i.e., a multi-dimensional array), a CP decomposition is formed by finding a low-rank tensor approximation. The CP decomposition is one of several extensions of matrix decompositions (i.e., decompositions of two-dimensional arrays) to tensors and has many exciting applications in fields ranging from neuroscience to network science and signal processing. In this talk, we will see two example applications: a) finding latent structure (i.e., political parties) in Senate voting data and b) distinguishing gases in chemo-sensing data.
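To make the idea concrete, here is a minimal NumPy sketch (not from the talk; the factor-matrix names A, B, C and dimensions are illustrative) of how a rank-R CP model reconstructs a 3-way tensor as a sum of R rank-one tensors:

```python
import numpy as np

def cp_reconstruct(A, B, C):
    """Rank-R CP model of a 3-way tensor from factor matrices
    A (I x R), B (J x R), C (K x R):
        X_hat[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
    i.e., a sum of R rank-one (outer-product) tensors."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Illustrative sizes: a 4 x 5 x 6 tensor approximated at rank 2.
rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 2
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))
X_hat = cp_reconstruct(A, B, C)
print(X_hat.shape)  # (4, 5, 6)
```

A CP decomposition of a data tensor X then amounts to choosing A, B, C so that this reconstruction fits X well under some loss.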
The CP decomposition, in particular, forms a low-rank tensor approximation that has good fit as measured by the total entry-wise squared difference. However, different fit/loss functions may be more appropriate in some cases and can provide new ways to look at the data. For example, the logistic loss is a natural choice for binary tensors. We propose a new generalized tensor decomposition method that allows users to select a generic loss function. To solve the resulting optimization problem, we use a stochastic gradient algorithm from machine learning to exploit the fact that approximate gradients can be computed efficiently from small samples of the entries.
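The sampling idea above can be sketched as follows. This is a hedged illustration, not the speaker's exact algorithm: it fits a rank-R CP model to a binary 3-way tensor under the logistic loss, estimating gradients from small random batches of entries. The step size, batch size, and initialization scale are all illustrative assumptions.

```python
import numpy as np

def sgd_gcp_logistic(X, R, iters=2000, batch=32, lr=0.05, seed=0):
    """Stochastic gradient sketch of a generalized CP decomposition.

    X : binary 3-way array (entries in {0, 1}).
    Model value for entry (i, j, k): m = sum_r A[i,r] * B[j,r] * C[k,r].
    Logistic loss per entry: log(1 + exp(m)) - x * m, whose derivative
    in m is sigmoid(m) - x. Each step samples `batch` entries and
    updates only the factor rows those entries touch."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = 0.1 * rng.standard_normal((I, R))
    B = 0.1 * rng.standard_normal((J, R))
    C = 0.1 * rng.standard_normal((K, R))
    for _ in range(iters):
        # Sample a small batch of entry indices uniformly at random.
        ii = rng.integers(I, size=batch)
        jj = rng.integers(J, size=batch)
        kk = rng.integers(K, size=batch)
        m = np.sum(A[ii] * B[jj] * C[kk], axis=1)     # model values
        g = 1.0 / (1.0 + np.exp(-m)) - X[ii, jj, kk]  # dloss/dm per entry
        # Chain rule: gradient w.r.t. each touched factor row.
        gA = g[:, None] * (B[jj] * C[kk])
        gB = g[:, None] * (A[ii] * C[kk])
        gC = g[:, None] * (A[ii] * B[jj])
        # Scatter-add updates (rows may repeat within a batch).
        np.add.at(A, ii, -lr * gA)
        np.add.at(B, jj, -lr * gB)
        np.add.at(C, kk, -lr * gC)
    return A, B, C
```

Because each step touches only the sampled entries and their factor rows, the per-iteration cost is independent of the total number of tensor entries, which is the efficiency the abstract alludes to.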
Speaker: David Hong
Institution: University of Michigan
Event Organizer: Audra McMillan (email@example.com)