About this product
Product Identifiers
Publisher: Cambridge University Press
ISBN-10: 1108494315
ISBN-13: 9781108494311
eBay Product ID (ePID): 22038694812
Product Key Features
Number of Pages: 704
Language: English
Publication Name: Beyond Worst-Case Analysis
Subject: General
Publication Year: 2021
Type: Textbook
Author: Tim Roughgarden
Subject Area: Mathematics, Computers
Format: Hardcover
Dimensions
Item Height: 1.6 in
Item Weight: 49.4 oz
Item Length: 10.2 in
Item Width: 7.4 in
Additional Product Features
Intended Audience: Scholarly & Professional
LCCN: 2020-049362
Reviews:
'Many important algorithmic problems are considered intractable according to the conventional worst-case metrics of computational complexity theory. This important book demonstrates that, for many such problems, it is possible to craft algorithms that perform well under plausible assumptions about the structure of the inputs that are likely to be presented. It may well mark a turning point in the field of algorithm design and analysis.' Richard M. Karp, University of California at Berkeley
'The book is a must have for any aspiring algorithm researcher ... Essential.' D. Papamichail, Choice Magazine
Dewey Edition: 23
Illustrated: Yes
Dewey Decimal: 518.1
Table of Contents:
Foreword; Preface
1. Introduction (Tim Roughgarden)
Part I. Refinements of Worst-Case Analysis:
2. Parameterized algorithms (Fedor Fomin, Daniel Lokshtanov, Saket Saurabh, and Meirav Zehavi)
3. From adaptive analysis to instance optimality (Jérémy Barbay)
4. Resource augmentation (Tim Roughgarden)
Part II. Deterministic Models of Data:
5. Perturbation resilience (Konstantin Makarychev and Yury Makarychev)
6. Approximation stability and proxy objectives (Avrim Blum)
7. Sparse recovery (Eric Price)
Part III. Semi-Random Models:
8. Distributional analysis (Tim Roughgarden)
9. Introduction to semi-random models (Uriel Feige)
10. Semi-random stochastic block models (Ankur Moitra)
11. Random-order models (Anupam Gupta and Sahil Singla)
12. Self-improving algorithms (C. Seshadhri)
Part IV. Smoothed Analysis:
13. Smoothed analysis of local search (Bodo Manthey)
14. Smoothed analysis of the simplex method (Daniel Dadush and Sophie Huiberts)
15. Smoothed analysis of Pareto curves in multiobjective optimization (Heiko Röglin)
Part V. Applications in Machine Learning and Statistics:
16. Noise in classification (Maria-Florina Balcan and Nika Haghtalab)
17. Robust high-dimensional statistics (Ilias Diakonikolas and Daniel Kane)
18. Nearest-neighbor classification and search (Sanjoy Dasgupta and Samory Kpotufe)
19. Efficient tensor decomposition (Aravindan Vijayaraghavan)
20. Topic models and nonnegative matrix factorization (Rong Ge and Ankur Moitra)
21. Why do local methods solve nonconvex problems? (Tengyu Ma)
22. Generalization in overparameterized models (Moritz Hardt)
23. Instance-optimal distribution testing and learning (Gregory Valiant and Paul Valiant)
Part VI. Further Applications:
24. Beyond competitive analysis (Anna R. Karlin and Elias Koutsoupias)
25. On the unreasonable effectiveness of satisfiability solvers (Vijay Ganesh and Moshe Vardi)
26. When simple hash functions suffice (Kai-Min Chung, Michael Mitzenmacher, and Salil Vadhan)
27. Prior-independent auctions (Inbal Talgam-Cohen)
28. Distribution-free models of social networks (Tim Roughgarden and C. Seshadhri)
29. Data-driven algorithm design (Maria-Florina Balcan)
30. Algorithms with predictions (Michael Mitzenmacher and Sergei Vassilvitskii)
Synopsis:
There are no silver bullets in algorithm design, and no single algorithmic idea is powerful and flexible enough to solve every computational problem. Nor are there silver bullets in algorithm analysis, as the most enlightening method for analyzing an algorithm often depends on the problem and the application. However, typical algorithms courses rely almost entirely on a single analysis framework, that of worst-case analysis, wherein an algorithm is assessed by its worst performance on any input of a given size. The purpose of this book is to popularize several alternatives to worst-case analysis and their most notable algorithmic applications, from clustering to linear programming to neural network training. Forty leading researchers have contributed introductions to different facets of this field, emphasizing the most important models and results, many of which can be taught in lectures to beginning graduate students in theoretical computer science and machine learning.
Understanding when and why algorithms work is a fundamental challenge. For problems ranging from clustering to linear programming to neural networks, there are significant gaps between empirical performance and the predictions of traditional worst-case analysis. The book introduces new methods for assessing algorithm performance.
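The synopsis contrasts worst-case analysis with alternative frameworks such as the smoothed analysis covered in Part IV. As a minimal formal sketch of that contrast (the notation below is mine, not drawn from the book or this listing): worst-case analysis judges an algorithm A by its maximum cost over all inputs of a given size n, while smoothed analysis bounds the expected cost after a small random perturbation of each adversarial input.

% Sketch only; notation is illustrative, not quoted from the book.
% Worst-case cost: maximum over all inputs z of size n.
% Smoothed cost: worst input, averaged over a random perturbation r scaled by sigma.
\[
  C_{\mathrm{wc}}(A, n) = \max_{|z| = n} \mathrm{cost}(A, z),
  \qquad
  C_{\mathrm{sm}}(A, n, \sigma) = \max_{|z| = n} \, \mathbb{E}_{r}\!\big[\mathrm{cost}(A, z + \sigma r)\big].
\]

An algorithm such as the simplex method has exponential worst-case cost but, as the chapters on smoothed analysis discuss, a much smaller smoothed cost, which is one way the book explains the gap between theory and observed performance.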