How difficult is the variable selection problem? Specifically, suppose one is doing variable selection for linear regression: given a matrix B and a target vector y, one wants to find an x such that Bx is approximately y. For that problem alone, ordinary least squares is the answer. In variable selection, however, x should in addition have (approximately) as few nonzeroes as possible.
It would be desirable that if there were an x* with k nonzeroes such that Bx* = y, then the algorithm would find x*. Unfortunately, finding x* is NP-Complete, so it is basically hopeless to find a polynomial-time algorithm that finds x*.
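To make the problem concrete, here is a minimal sketch (the matrix B, vector y, and helper name are made up for illustration): ordinary least squares returns a generally dense x, while the sparsest exact solution can be found by brute force over all supports, at a cost that grows exponentially with the number of columns, which is the intuition behind the hardness.

```python
import itertools
import numpy as np

# A toy instance: the third column equals y exactly, so a 1-sparse
# solution x* = (0, 0, 1) exists.
B = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])
y = np.array([1.0, 1.0, 2.0])

# Ordinary least squares ignores sparsity entirely.
x_ols, *_ = np.linalg.lstsq(B, y, rcond=None)

def sparsest_exact(B, y, tol=1e-9):
    """Brute-force search for an x with Bx = y and as few nonzeroes as
    possible. Tries every support of size k = 0, 1, 2, ... in turn;
    the number of supports is exponential in the number of columns."""
    n = B.shape[1]
    for k in range(n + 1):
        for support in itertools.combinations(range(n), k):
            cols = list(support)
            if k == 0:
                residual = y
                x_s = np.zeros(0)
            else:
                # Least squares restricted to the chosen columns.
                x_s, *_ = np.linalg.lstsq(B[:, cols], y, rcond=None)
                residual = y - B[:, cols] @ x_s
            if np.linalg.norm(residual) <= tol:
                x = np.zeros(n)
                x[cols] = x_s
                return x
    return None

x_sparse = sparsest_exact(B, y)
```

On this instance the search stops at k = 1, recovering the sparse solution; in general, nothing better than this kind of exhaustive search is known to be possible, which is the point of the talk.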
However, what if we let the algorithm "cheat"? More precisely, suppose we seek an algorithm that finds an x with Bx "approximately equal" to y, with x having "only slightly" more nonzeroes than x*. Maybe such a "relaxed" algorithm could run in polynomial time?
Unfortunately, no. In the talk I will show that (for suitable definitions of "approximately equal" and "only slightly"), no such algorithm can exist.
If you know nothing about NP-Completeness, don't panic. I will define all the terms and use nothing but basic linear algebra.
Speaker - Howard Karloff
After receiving a PhD from Berkeley, Howard Karloff taught at the University of Chicago and Georgia Tech before leaving Georgia Tech as a full professor to join AT&T Labs--Research in 1999. He left AT&T Labs in 2013 to join Yahoo Labs in New York, where he stayed until February 2015. A Fellow of the Association for Computing Machinery (ACM), he has served on the program committees of numerous conferences, chaired the 1998 Symposium on Discrete Algorithms (SODA) program committee, and was general chair of the 2012 and 2014 Symposia on the Theory of Computing. He is the author of numerous journal and conference articles and the Birkhauser book "Linear Programming." His interests include algorithms, optimization, machine learning, and data science.