Algorithms
Practical solutions to challenging problems.
Researchers in the Algorithms group at MERL develop solution methods for optimization problems involving very large numbers of variables. Typically these arise in inference problems involving images, video, or audio; network transport problems; coding and compression problems; or design problems. These problems are usually characterized by very complicated probability distributions in extremely high-dimensional spaces. Because classical approaches to these problems are infeasible, our results can open new business opportunities where there are no competitive technologies. Another main research theme involves adaptively sampled distance fields, which provide superior font and graphical rendering for digital displays.
Most of the group's work revolves around graph-based optimization and inference, where the graph is a representation of the problem constraints and a probability distribution over possible solutions. Through formal analysis we identify tractable estimation or approximation schemes. This meshes with MERL's expertise in fields and technologies such as belief propagation, machine learning, computer vision, dynamic programming, convex optimization, coding and communications theory, and signal processing.
Research Team

Awards

AWARD 2017 Graph Challenge Student Innovation Award Date: August 4, 2017
MERL Contact: Andrew Knyazev
Research Areas: Data Analytics, Algorithms, Machine Learning
Brief: David Zhuzhunashvili, an undergraduate student at the University of Colorado Boulder, and Andrew Knyazev, Distinguished Research Scientist at MERL, received the 2017 Graph Challenge Student Innovation Award. Their poster "Preconditioned Spectral Clustering for Stochastic Block Partition Streaming Graph Challenge" was accepted to the 2017 IEEE High Performance Extreme Computing Conference (HPEC '17), held 12-14 September 2017 (http://www.ieee-hpec.org/), and the paper was accepted to the IEEE Xplore HPEC proceedings.
HPEC is the premier conference in the world on the convergence of high-performance and embedded computing. The DARPA/Amazon/IEEE Graph Challenge is a special HPEC event that encourages community approaches to developing new solutions for analyzing graphs derived from social media, sensor feeds, and scientific data, enabling relationships between events to be discovered as they unfold in the field. The 2017 Streaming Graph Challenge is Stochastic Block Partition: it seeks to identify optimal blocks (or clusters) in a large graph with known ground-truth clusters, with performance evaluated against baseline Python and C codes provided by the Graph Challenge organizers.
The proposed approach is spectral clustering, which partitions a graph into blocks using eigenvectors of a matrix representing the graph. The Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) method iteratively approximates a few leading eigenvectors of the symmetric graph Laplacian for multiway graph partitioning. Preliminary tests on all static cases of the Graph Challenge demonstrate 100% correct partitioning under every IEEE HPEC Graph Challenge metric, while also running approximately 500-1,000 times faster than the provided baseline code; e.g., the 2M static graph is 100% correctly partitioned in ~2,100 sec. Warm starts of LOBPCG further cut the execution time 2-3x for the streaming graphs.
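The pipeline described above can be sketched in a few lines. This is a minimal illustration using SciPy's `lobpcg` on a tiny synthetic graph (two 10-node cliques joined by a single edge), not the Graph Challenge data or the award-winning implementation; the toy graph, the tolerance, and the two-way split via the sign of the Fiedler vector are all assumptions made for the demo.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import laplacian
from scipy.sparse.linalg import lobpcg

# Toy graph: two 10-node cliques joined by one edge (nodes 9 and 10).
n = 20
A = sp.lil_matrix((n, n))
A[:10, :10] = 1
A[10:, 10:] = 1
A[9, 10] = A[10, 9] = 1
A.setdiag(0)          # no self-loops
A = sp.csr_matrix(A)

# Symmetric normalized graph Laplacian.
L = laplacian(A, normed=True)

# LOBPCG iteratively approximates the k smallest eigenpairs,
# starting from a random block of vectors.
k = 2
rng = np.random.default_rng(0)
X = rng.standard_normal((n, k))
eigvals, eigvecs = lobpcg(L, X, largest=False, tol=1e-8, maxiter=200)

# Two-way partition from the sign pattern of the Fiedler vector
# (the eigenvector for the second-smallest eigenvalue).
fiedler = eigvecs[:, np.argsort(eigvals)[1]]
labels = (fiedler > np.median(fiedler)).astype(int)
```

For large graphs, `lobpcg` also accepts a preconditioner through its `M` argument, which is where the "preconditioned" part of the approach enters; multiway (k > 2) partitioning would cluster the rows of several eigenvectors instead of thresholding one.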

AWARD Fellow of the Society for Industrial and Applied Mathematics (SIAM) Date: March 31, 2016
MERL Contact: Andrew Knyazev
Research Areas: Algorithms, Advanced Control Systems, Decision Optimization, Dynamical Systems, Machine Learning, Predictive Modeling, Wireless Communications & Signal Processing
Brief: Andrew Knyazev was selected as a Fellow of the Society for Industrial and Applied Mathematics (SIAM) for contributions to computational mathematics and the development of numerical methods for eigenvalue problems.
The Fellows program honors SIAM members who have made outstanding contributions to the fields served by the society. Andrew Knyazev was among a distinguished group of members nominated by peers and selected for the 2016 Class of Fellows.

AWARD Professor Emeritus, University of Colorado Denver Date: January 6, 2016
MERL Contact: Andrew Knyazev
Research Area: Algorithms
Brief: Andrew Knyazev was awarded the title of Professor Emeritus at the University of Colorado Denver, effective 1/31/2016. The award letter from the Chancellor of the University of Colorado Denver cites a record of excellence over 20 years of contributions to the university, including the 2008 CU Denver Excellence in Research Award, the college's 2000 Teaching Excellence Award, supervision of Ph.D. students, and two decades of uninterrupted external research funding from the US National Science Foundation and the Department of Energy.
See All Awards for MERL 

News & Events

NEWS Andrew Knyazev (MERL) presents at the Schlumberger-Tufts U. Computational and Applied Math Seminar Date: April 10, 2018
MERL Contact: Andrew Knyazev
Research Areas: Data Analytics, Algorithms, Machine Learning
Brief: Andrew Knyazev, Distinguished Research Scientist at MERL, has accepted an invitation to speak about his work on Big Data and spectral graph partitioning at the Schlumberger-Tufts U. Computational and Applied Math Seminar. A primary focus of this seminar series is on mathematical and computational aspects of remote sensing. A partial list of the topics of interest includes: numerical solution of large-scale PDEs (a.k.a. forward problems); theory and numerical methods for inverse and ill-posed problems; imaging; and related problems in numerical linear algebra, approximation theory, optimization, and model reduction. The seminar meets on average once a month, with the location alternating between Schlumberger's office in Cambridge, MA and the Tufts Medford campus.
Abstract: Data clustering via spectral graph partitioning requires constructing the graph Laplacian and solving the corresponding eigenvalue problem. We consider and motivate using negative edge weights in the graph Laplacian. Preconditioned iterative solvers for the Laplacian eigenvalue problem are discussed and preliminary numerical results are presented.
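To make the abstract concrete, the sketch below builds a small graph Laplacian with a negative edge weight in NumPy. It uses the common convention that node degrees sum the absolute values of the edge weights, which keeps the matrix symmetric positive semi-definite; the 3-node graph and this particular convention are illustrative assumptions for the demo, not the specific construction advocated in the talk.

```python
import numpy as np

def signed_laplacian(W):
    """Graph Laplacian for a weight matrix W that may contain negative
    (repulsive) edge weights: degrees use |w_ij| so that L = D - W
    stays symmetric positive semi-definite."""
    D = np.diag(np.abs(W).sum(axis=1))
    return D - W

# Tiny signed graph: edge 0-1 is attractive (+1), edge 1-2 repulsive (-1).
W = np.array([[0.0,  1.0,  0.0],
              [1.0,  0.0, -1.0],
              [0.0, -1.0,  0.0]])
L = signed_laplacian(W)

# The quadratic form is sum of w_ij*(x_i - x_j)^2 for positive edges and
# |w_ij|*(x_i + x_j)^2 for negative edges, so the spectrum is nonnegative
# and standard eigensolvers for the smallest eigenpairs (e.g., LOBPCG)
# still apply.
eigvals = np.linalg.eigvalsh(L)
```

With this convention a repulsive edge pushes its endpoints toward opposite signs in the leading eigenvectors, which is the behavior negative weights are meant to encode in clustering.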

TALK Theory and Applications of Sparse Model-Based Recurrent Neural Networks Date & Time: Tuesday, March 6, 2018; 12:00 PM
Speaker: Scott Wisdom, Affectiva
MERL Host: Jonathan Le Roux
Research Areas: Multimedia, Speech & Audio
Brief: Recurrent neural networks (RNNs) are effective, data-driven models for sequential data, such as audio and speech signals. However, like many deep networks, RNNs are essentially black boxes; though they are effective, their weights and architecture are not directly interpretable by practitioners. A major component of my dissertation research is explaining the success of RNNs and constructing new RNN architectures through the process of "deep unfolding," which can construct and explain deep network architectures using an equivalence to inference in statistical models. Deep unfolding yields principled initializations for training deep networks, provides insight into their effectiveness, and assists with interpretation of what these networks learn.
In particular, I will show how RNNs with rectified linear units and residual connections are a particular deep unfolding of a sequential version of the iterative shrinkage-thresholding algorithm (ISTA), a simple and classic algorithm for solving L1-regularized least-squares problems. This equivalence allows interpretation of state-of-the-art unitary RNNs (uRNNs) as an unfolded sparse coding algorithm. I will also describe a new type of RNN architecture called deep recurrent nonnegative matrix factorization (DR-NMF). DR-NMF is an unfolding of a sparse NMF model of nonnegative spectrograms for audio source separation. Both of these networks outperform conventional LSTM networks while also providing interpretability for practitioners.
See All News & Events for Algorithms 

Recent Publications
 "Preconditioned Spectral Clustering for Stochastic Block Partition Streaming Graph Challenge", IEEE HPEC Graph Challenge, DOI: 10.1109/HPEC.2017.8091045, September 2017, pp. 16. ,
 "A Brief Theory of Guided Signal Reconstruction", International Conference on Sampling Theory and Applications (SampTA), July 2017. ,
 "Signal reconstruction via operator guiding", International Conference on Sampling Theory and Applications (SampTA), July 2017. ,
 "Recent implementations, applications, and extensions of the Locally Optimal Block Preconditioned Conjugate Gradient method (LOBPCG)", Householder Symposium on Numerical Linear Algebra, June 2017. ,
 "Signed Laplacian for Spectral Clustering Revisited", arXiv, January 2017.BibTeX Download PDFRead TR2017001
 @techreport{MERL_TR2017001,
 author = {Knyazev and A.},
 title = {Signed Laplacian for Spectral Clustering Revisited},
 institution = {MERL  Mitsubishi Electric Research Laboratories},
 address = {Cambridge, MA 02139},
 number = {TR2017001},
 month = jan,
 year = 2017,
 url = {http://www.merl.com/publications/TR2017001/}
 }
,  "RayleighRitz Majorization Error Bounds of the Mixed Type", SIAM Journal on Matrix Analysis and Applications, DOI: 10.1137/16M1058121, Vol. 38, No. 1, pp. 3049, January 2017. ,

Videos

Free Downloads