Dan's research interests lie at the intersection of machine learning and continuous optimization. His main focus is the development of efficient algorithms with novel, provable performance guarantees for machine learning, large-scale data analysis, sequential decision-making, and optimization problems.

Dan completed his bachelor's degree in electrical engineering, his master's degree in computer science, and his PhD in operations research, all at the Technion - Israel Institute of Technology. After completing his PhD, Dan spent one year as a Research Assistant Professor at the Toyota Technological Institute at Chicago. In 2017, Dan joined the Department of Industrial Engineering and Management at the Technion as a senior lecturer.

Selected Publications

  • Dan Garber and Elad Hazan. Universal Adaptive Linear Filtering. IEEE Transactions on Signal Processing, 61(7): 1595-1604, 2013
  • Dan Garber and Elad Hazan. Playing Non-linear Games with Linear Oracles. Foundations of Computer Science (FOCS), 420-428, 2013, Berkeley, CA
  • Christos Boutsidis, Dan Garber, Zohar Karnin and Edo Liberty. Online Principal Component Analysis. Symposium on Discrete Algorithms (SODA), 887-901, 2015, San Diego, CA
  • Dan Garber and Elad Hazan. Faster Rates for the Frank-Wolfe Method over Strongly-Convex Sets. International Conference on Machine Learning (ICML), 541-549, 2015, Lille, France
  • Dan Garber, Elad Hazan and Tengyu Ma. Online Learning of Eigenvectors. International Conference on Machine Learning (ICML), 560-568, 2015, Lille, France
  • Dan Garber and Elad Hazan. Sublinear Time Algorithms for Approximate Semidefinite Programming. Mathematical Programming Series A, 158(1): 329-361, 2016
  • Dan Garber and Elad Hazan. A Linearly Convergent Conditional Gradient Algorithm with Applications to Online and Stochastic Optimization. SIAM Journal on Optimization, 26(3): 1493-1528, 2016
  • Dan Garber, Elad Hazan, Chi Jin, Sham M. Kakade, Cameron Musco, Praneeth Netrapalli and Aaron Sidford. Faster Eigenvector Computation via Shift-and-Invert Preconditioning. International Conference on Machine Learning (ICML), 2626-2634, 2016, New York, NY
  • Jialei Wang, Weiran Wang, Dan Garber and Nathan Srebro. Efficient Globally Convergent Stochastic Optimization for Canonical Correlation Analysis. Neural Information Processing Systems (NIPS), 766-774, 2016, Barcelona, Spain
  • Dan Garber. Faster Projection-free Convex Optimization over the Spectrahedron. Neural Information Processing Systems (NIPS), 874-882, 2016, Barcelona, Spain
  • Dan Garber and Ofer Meshi. Linear-memory and Decomposition-invariant Linearly Convergent Conditional Gradient Variant for Convex Optimization over Structured Polytopes. Neural Information Processing Systems (NIPS), 1001-1009, 2016, Barcelona, Spain


Research Interests

Machine Learning, Continuous Optimization, Large-Scale Data Analysis, Design and Analysis of Algorithms.

Contact Info

Room 406 Bloomfield Building