
    Optimal convex optimization under Tsybakov noise through reduction to active learning (Aarti Singh, Carnegie Mellon University)

    We show that the complexity of convex minimization is determined solely by the rate of growth of the function around its minimizer, as quantified by a Tsybakov-like noise condition (TNC) with exponent k. Specifically, we establish optimal first-order stochastic optimization rates that depend precisely on the TNC exponent k and include as special cases the classical rates for convex (k tending to infinity), strongly convex (k = 2), and uniformly convex (k > 2) optimization. Even faster rates (nearly exponential) can be attained if the exponent satisfies 1 < k < 2.
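
    For intuition, a minimal LaTeX sketch of the kind of growth condition the abstract describes and the rates it implies; the exact constants, norm, and oracle model here are assumptions, not the talk's precise statement:

    \documentclass{article}
    \usepackage{amsmath, amssymb}
    \begin{document}
    % Tsybakov-like noise condition (TNC) with exponent k > 1:
    % f grows at least polynomially around its minimizer x^* over the set S.
    \[
      f(x) - f(x^*) \;\ge\; \lambda \, \|x - x^*\|^{k}
      \qquad \text{for all } x \in S,\ \text{some } \lambda > 0,\ k > 1.
    \]
    % Optimal function-error rate after T queries to a stochastic
    % first-order oracle (assumed form):
    \[
      \mathbb{E}\!\left[ f(x_T) \right] - f(x^*)
      \;=\; \Theta\!\left( T^{-\frac{k}{2k-2}} \right).
    \]
    % Sanity checks against the special cases in the abstract:
    % k = 2 (strongly convex) gives \Theta(1/T);
    % k -> infinity (merely convex) gives \Theta(1/\sqrt{T});
    % as k -> 1 the exponent k/(2k-2) blows up, which is the source
    % of the "nearly exponential" rates for 1 < k < 2.
    \end{document}

    Note how the single exponent k interpolates the whole spectrum of classical rates, which is the sense in which the TNC alone governs the complexity of the problem.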