Simple bilevel problems are optimization problems in which we want to find an optimal solution to an inner problem that minimizes an outer objective function. Such problems appear in many machine learning and signal processing applications as a way to eliminate undesirable solutions. However, since these problems do not satisfy regularity conditions, they are often hard to solve exactly and are usually approached via regularization (e.g., LASSO and ridge regression). In the past few years, several algorithms have been proposed to solve these bilevel problems directly and to provide a rate for obtaining feasibility, assuming that the outer function is strongly convex. In this work, we suggest a new approach designed for bilevel problems with simple outer functions, such as the l1 norm, which are not required to be either smooth or strongly convex. In our new Iterative Approximation and Level-set Expansion (ITALEX) approach, we alternate between expanding the level-set of the outer function and approximately optimizing the inner function over this level-set. ITALEX guarantees that at each iteration of the algorithm the outer objective function is super-optimal, a property not known to hold for other bilevel algorithms. We show that optimizing the inner function through first-order schemes such as proximal gradient and generalized conditional gradient results in a feasibility convergence rate of O(1/k), a rate previously shown only for smooth and strongly convex outer functions. Under more restrictive assumptions, which nevertheless hold for important cases such as Basis Pursuit, we obtain a convergence rate of O(1/k^2).
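
To make the alternation concrete, here is a minimal sketch of the level-set expansion scheme described above, under assumed choices not taken from the paper: the inner objective is a least-squares function, the outer objective is the l1 norm, the inner step uses projected gradient over the current level-set, and the level is expanded by a fixed increment delta each round. The paper's actual expansion rule, inner solver, step sizes, and stopping criteria may differ.

```python
# Sketch of the ITALEX alternation (assumed instantiation, not the paper's exact method):
# inner f(x) = 0.5*||Ax - b||^2, outer w(x) = ||x||_1, level-set {x : ||x||_1 <= alpha}.
import numpy as np

def project_l1_ball(v, radius):
    """Euclidean projection of v onto the l1 ball {x : ||x||_1 <= radius}."""
    if np.sum(np.abs(v)) <= radius:
        return v
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - radius))[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def italex_sketch(A, b, n_outer=50, n_inner=100, alpha0=0.0, delta=0.1):
    """Alternate level-set expansion with approximate inner minimization."""
    m, n = A.shape
    x = np.zeros(n)
    alpha = alpha0
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the least-squares gradient
    for _ in range(n_outer):
        alpha += delta  # expand the outer-function level-set
        # approximately minimize the inner function over {||x||_1 <= alpha}
        for _ in range(n_inner):
            grad = A.T @ (A @ x - b)
            x = project_l1_ball(x - step * grad, alpha)
    return x

# Usage on a small synthetic Basis-Pursuit-style instance:
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true
x_hat = italex_sketch(A, b)
print(np.sum(np.abs(x_hat)), np.linalg.norm(A @ x_hat - b))
```

Because the iterates always lie inside the current level-set, the outer objective along the way never exceeds the current level alpha, which is the sense in which the scheme keeps the outer function super-optimal while the inner residual is driven down as the level expands.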