Statistics helps guide us to optimal decisions under uncertainty. A large variety of statistical problems are essentially solutions to optimization problems, and the mathematical techniques of optimization are fundamental to statistical theory and practice. In this book, Jagdish Rustagi provides full-spectrum coverage of these methods, ranging from classical optimization and Lagrange multipliers, to numerical techniques using gradients or direct search, to linear, nonlinear, and dynamic programming using the Kuhn-Tucker conditions or the Pontryagin maximum principle. Variational methods and optimization in function spaces are also discussed, as is stochastic optimization in simulation, including annealing methods.

The text features numerous applications, including:
- Finding maximum likelihood estimates
- Markov decision processes
- Programming methods used to optimize monitoring of patients in hospitals
- Derivation of the Neyman-Pearson lemma
- The search for optimal designs
- Simulation of a steel mill

Suitable as both a reference and a text, this book will be of interest to advanced undergraduate or beginning graduate students in statistics, operations research, management and engineering sciences, and related fields. Most of the material can be covered in one semester by students with a basic background in probability and statistics.

Key features:
- Covers optimization from traditional methods to recent developments such as Karmarkar's algorithm and simulated annealing
- Develops a wide range of statistical techniques in the unified context of optimization
- Discusses applications such as optimizing the monitoring of patients and simulating steel mill operations
- Treats numerical methods and applications
- Includes exercises and references for each chapter
- Covers topics such as linear, nonlinear, and dynamic programming, variational methods, and stochastic optimization