SIPL Annual Event 2016

Photos from the Event


Gathering & Poster Session

13:30-14:00


Welcome

14:00-14:10

Prof. David Malah, Head of SIPL


Nonparametric Canonical Correlation Analysis

14:10-14:35

Prof. Tomer Michaeli

Abstract:

Canonical correlation analysis (CCA) is a classical representation learning technique for finding correlated variables in multi-view data. This tool has found widespread use in various fields, including recent applications to natural language processing, speech recognition, genomics, and cross-modal retrieval. One of the shortcomings of CCA is its restriction to linear mappings, since many real-world multi-view datasets exhibit highly nonlinear relationships. In recent years, several nonlinear extensions of the original linear CCA have been proposed, including kernel and deep neural network methods. These approaches significantly improve upon linear CCA in many practical applications, but have two limitations. First, they are still restricted to families of projection functions that the user must specify (by choosing a kernel or neural network structure). Second, they are computationally demanding.

In this work, we derive a closed-form solution to the nonlinear CCA problem without any functional restrictions. We show that the optimal projections can be obtained from the singular value decomposition of a certain operator associated with the joint density of the views. Thus, by estimating the population density from training data, we obtain a practical nonparametric CCA (NCCA) algorithm, which reduces to solving an eigenvalue system. Superficially, this is similar to kernel CCA, but importantly, NCCA does not require the inversion of any kernel matrix. We also derive a partially linear CCA (PLCCA) variant in which one of the views undergoes a linear projection while the other is nonparametric. Finally, we show how our algorithms can be constrained to output non-redundant projections, a feature not possessed by any other nonlinear CCA algorithm.
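The reduction to a singular value decomposition can be illustrated on a toy discrete example. The sketch below is an illustration of the idea, not the authors' code: for two discrete views, the operator with kernel p(x,y) / (p_X(x) p_Y(y)) has the same singular structure as the matrix Q = D_X^{-1/2} P D_Y^{-1/2} built from the joint pmf P, so the optimal projections fall out of one SVD. The joint pmf used here is made up.

```python
import numpy as np

# A made-up 3x3 joint pmf over the two views X and Y.
P = np.array([[0.20, 0.05, 0.05],
              [0.05, 0.20, 0.05],
              [0.05, 0.05, 0.30]])
px = P.sum(axis=1)                 # marginal of view X
py = P.sum(axis=0)                 # marginal of view Y

# Normalized joint-density operator in matrix form.
Q = P / np.sqrt(np.outer(px, py))
U, s, Vt = np.linalg.svd(Q)

# The leading singular value is always 1, with constant projections;
# the next singular pair gives the maximally correlated projections.
f = U[:, 1] / np.sqrt(px)          # f(x): zero-mean, unit-variance under p_X
g = Vt[1] / np.sqrt(py)           # g(y): zero-mean, unit-variance under p_Y
rho = s[1]                         # their canonical correlation
```

In the continuous case the same SVD is applied to a density estimate evaluated on the training samples, which is what makes the resulting eigenvalue system practical.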

As we demonstrate on several test cases, our NCCA and PLCCA algorithms are memory-efficient, often run much faster than kernel CCA, outperform it, and perform comparably to deep CCA.

This is joint work with Weiran Wang, Karen Livescu and Yochai Blau.


Wilk Family Award Ceremony

14:35-14:45


Advanced Learning for Deep Reinforcement Learning

14:45-15:05

2nd prize winner in the Kasher undergraduate project contest in the EE faculty

Shai Reozenberg, Nadav Bhonker

Supervisor: Itai Hubara


Break & Poster Session

15:05-15:35


Review of Teaching Activity in SIPL

15:35-15:50

Nimrod Peleg


Audio QR Over Streaming Media

15:50-16:10

Wilk family award winner

Gal Binyamin, Itai Dagan

Supervisor: Alon Eilam

In cooperation with: Prontoly


Usage of Surface EMG on the Forearm to Classify Palm Movement

16:10-16:30

Wilk family award winner

Aviv Peleg, Or Dicker

Supervisor: Tal Shnitzer

In cooperation with: Dr. Oscar Lichtenstein, Faculty of BioMedical Engineering, Technion


Manifold Learning for Anomaly Detection in High-Dimensional Data

16:30-17:00

Gal Mishne, Ph.D. student

Advisor: Prof. Israel Cohen

Abstract:

In this talk, I will present manifold learning-based methods for anomaly and target detection in supervised and unsupervised settings. Our approach achieves strong results on various remote sensing image datasets and is shown to be independent of the imaging sensor and noise model. I will also present Diffusion Nets, a geometric auto-encoder that incorporates a manifold embedding of the data into the deep learning framework.
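A common building block behind such methods is a diffusion-map embedding of the data, with off-manifold points flagged as anomalies. The sketch below is a minimal illustration of that idea, not the speaker's method: the data (a noisy circle plus a few interior points), the kernel-scale heuristic, and the nearest-neighbor score are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
# Made-up data: 200 inliers on a noisy circle plus 5 off-manifold points.
theta = rng.uniform(0.0, 2.0 * np.pi, 200)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.02 * rng.standard_normal((200, 2))
X = np.vstack([X, rng.uniform(-0.4, 0.4, (5, 2))])

# Gaussian affinities; the kernel scale is a simple quantile heuristic.
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
eps = np.quantile(D2, 0.05)
W = np.exp(-D2 / eps)

# Symmetrized diffusion operator: same spectrum as the random walk D^-1 W.
d = W.sum(axis=1)
A = W / np.sqrt(np.outer(d, d))
vals, vecs = np.linalg.eigh(A)                 # ascending eigenvalues

phi = vecs[:, ::-1] / np.sqrt(d)[:, None]      # eigenvectors of D^-1 W
emb = phi[:, 1:4] * vals[::-1][1:4]            # 3-D embedding, trivial mode skipped

# One simple anomaly score: mean squared distance to the 10 nearest
# neighbors in diffusion space; isolated points tend to score high.
E2 = ((emb[:, None, :] - emb[None, :, :]) ** 2).sum(-1)
score = np.sort(E2, axis=1)[:, 1:11].mean(axis=1)
```

The symmetrized form A is used only so that a symmetric eigensolver applies; its eigenvectors are rescaled back to those of the row-stochastic random-walk operator.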