Kernelization : theory of parameterized preprocessing

By: Fomin, Fedor V.
Contributor(s): Lokshtanov, Daniel | Saurabh, Saket | Zehavi, Meirav
Material type: Text
Publication details: Cambridge: Cambridge University Press, 2019
Description: 515 p.
ISBN: 9781107057760 (hardback)
Subject(s): Electronic data processing | Data reduction | Kernel functions | Parameter estimation
DDC classification: 005.72
Summary: "Preprocessing, or data reduction, is a standard technique for simplifying and speeding up computation. Written by a team of experts in the field, this book introduces a rapidly developing area of preprocessing analysis known as kernelization. The authors provide an overview of basic methods and important results, with accessible explanations of the most recent advances in the area, such as meta-kernelization, representative sets, polynomial lower bounds, and lossy kernelization. The text is divided into four parts, which cover the different theoretical aspects of the area: upper bounds, meta-theorems, lower bounds, and beyond kernelization. The methods are demonstrated through extensive examples using a single data set. Written to be self-contained, the book only requires a basic background in algorithmics and will be of use to professionals, researchers and graduate students in theoretical computer science, optimization, combinatorics, and related fields"--
Item type: BK
Call number: 005.72 KER
Status: Available
Barcode: 54423

"Preprocessing, or data reduction, is a standard technique for simplifying and speeding up computation. Written by a team of experts in the field, this book introduces a rapidly developing area of preprocessing analysis known as kernelization. The authors provide an overview of basic methods and important results, with accessible explanations of the most recent advances in the area, such as meta-kernelization, representative sets, polynomial lower bounds, and lossy kernelization. The text is divided into four parts, which cover the different theoretical aspects of the area: upper bounds, meta-theorems, lower bounds, and beyond kernelization. The methods are demonstrated through extensive examples using a single data set. Written to be self-contained, the book only requires a basic background in algorithmics and will be of use to professionals, researchers and graduate students in theoretical computer science, optimization, combinatorics, and related fields"--

There are no comments on this title.

to post a comment.

Powered by Koha