Convex Optimization: Algorithms and Complexity (Foundations and Trends® in Machine Learning #26) (Paperback)

$95.00
Usually Ships in 1-5 Days

Description


This monograph presents the main complexity theorems in convex optimization and their corresponding algorithms. It begins with the fundamental theory of black-box optimization and proceeds to guide the reader through recent advances in structural optimization and stochastic optimization. The presentation of black-box optimization, strongly influenced by the seminal book by Nesterov, includes the analysis of cutting plane methods as well as (accelerated) gradient descent schemes. Special attention is also given to non-Euclidean settings (relevant algorithms include Frank-Wolfe, mirror descent, and dual averaging), and their relevance in machine learning is discussed. The text provides a gentle introduction to structural optimization with FISTA (to optimize the sum of a smooth term and a simple non-smooth term), saddle-point mirror prox (Nemirovski's alternative to Nesterov's smoothing), and a concise description of interior point methods. For stochastic optimization, it discusses stochastic gradient descent, mini-batches, random coordinate descent, and sublinear algorithms. It also briefly touches upon convex relaxation of combinatorial problems and the use of randomness to round solutions, as well as methods based on random walks.
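
The black-box model referenced above assumes the algorithm interacts with the objective only through a first-order oracle, i.e. it can query gradients at chosen points. As a minimal illustrative sketch (not taken from the monograph; the function names and the least-squares example below are assumptions chosen purely for illustration), projected gradient descent against such an oracle could look like this in Python:

import numpy as np

def projected_gradient_descent(grad, project, x0, step_size, n_iters=100):
    # Minimize a convex function using only gradient queries (first-order oracle),
    # keeping iterates feasible via Euclidean projection onto the constraint set.
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = project(x - step_size * grad(x))
    return x

# Hypothetical example: minimize ||Ax - b||^2 over the unit Euclidean ball.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
grad = lambda x: 2 * A.T @ (A @ x - b)                # gradient oracle
project = lambda x: x / max(1.0, np.linalg.norm(x))   # projection onto the unit ball
L = 2 * np.linalg.norm(A, 2) ** 2                     # smoothness constant of the objective
x_hat = projected_gradient_descent(grad, project, np.zeros(5), step_size=1.0 / L)

With the fixed step size 1/L, this scheme attains the standard O(1/t) convergence rate for smooth convex objectives, one of the basic black-box results the monograph analyzes.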

Product Details
ISBN: 9781601988607
ISBN-10: 1601988605
Publisher: Now Publishers
Publication Date: October 28th, 2015
Pages: 142
Language: English
Series: Foundations and Trends® in Machine Learning