BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//5.981//EN
X-WR-TIMEZONE:Asia/Jerusalem
BEGIN:VEVENT
UID:163@web.iem.technion.ac.il
DTSTART;TZID=Asia/Jerusalem:20220117T113000
DTEND;TZID=Asia/Jerusalem:20220117T123000
DTSTAMP:20220110T064342Z
URL:https://web.iem.technion.ac.il/site/iemevents/a-first-order-method-for
-solving-non-differentiable-and-non-strongly-convex-bilevel-optimization/
SUMMARY:A First Order Method for Solving Non-Differentiable and Non-Stron
 gly Convex Bilevel Optimization
DESCRIPTION:By: M.Sc. Roey Merchav\nAdvisor: Prof. Shoham Sabach\nWhere: Z
 oom\nFrom: Technion\nAbstract:\n\nBi-level optimization problems seek t
o find a minimizer of an outer objective function constrained to the minim
izers set of an inner optimization problem. These problems\, both in the c
onvex and non-convex settings\, have diverse applications in the fields of
machine learning\, signal processing and many more. Since these problems
inherently don't satisfy Slater's condition\, exact solutions are mostly d
ifficult to find. Consequently\, existing algorithms only focus on approxi
mating them. Most algorithms for bi-level optimization problems are based
on regularization techniques\, which regularize the outer objective functi
on with the inner objective function in a certain way. Moreover\, in the c
 onvex setting\, all the proposed algorithms provide a convergence rate re
 sult (if any) only in terms of the inner objective function\, and only un
 der restrictive assumptions\, such as the outer objective function being s
 mooth and/or strongly convex (except for a very recent work that propose
 s a very complicated algorithm).\n\nIn this thesis\, we propose the Bi-
SG) method\, which is a generalization of the classical sub-gradient metho
d to the setting of bi-level optimization problems. This is a first-order
method that is very easy to implement in the sense that it requires only a
computation of the associated proximal mapping or a sub-gradient of the o
uter objective function\, in addition to a proximal gradient step on the i
nner optimization problem (a step that is shared by many algorithms). We s
how\, under very mild assumptions\, that Bi-SG tackles bi-level optimizati
on problems and achieves several theoretical guarantees. First\, Bi-SG enj
oys sub-linear rates both in terms of the inner and outer objective functi
ons. Moreover\, if the outer objective function is additionally strongly c
 onvex (though possibly still non-smooth)\, the outer rate can be improve
 d to a linear rate. Lastly\, we prove that the distance of the generate
 d sequence to
the set of optimal solutions of the bi-level problem converges to zero.\n\
 nIt should be noted that our mild assumptions include neither differenti
 ability of the outer objective function nor its strong convexity. For ins
 tance\, any smooth or Lipschitz continuous outer objective function satis
 fies the needed assumptions. Finally\, we demonstrate Bi-SG's performanc
 e in an extensive numerical comparison.\n\nZoom Link\n\nhttps://te
chnion.zoom.us/j/3800541616
CATEGORIES:Graduate Student Seminar,Seminars
END:VEVENT
BEGIN:VTIMEZONE
TZID:Asia/Jerusalem
X-LIC-LOCATION:Asia/Jerusalem
BEGIN:STANDARD
DTSTART:20211031T010000
TZOFFSETFROM:+0300
TZOFFSETTO:+0200
TZNAME:IST
END:STANDARD
END:VTIMEZONE
END:VCALENDAR