Paper by Erik D. Demaine

Reference:
Ilya Baran, Erik D. Demaine, and Dmitriy A. Katz, “Optimally Adaptive Integration of Univariate Lipschitz Functions”, Algorithmica, volume 50, number 2, February 2008, pages 255–278. Special issue of selected papers from the 7th Latin American Symposium on Theoretical Informatics, 2006.

Abstract:
We consider the problem of approximately integrating a Lipschitz function f (with a known Lipschitz constant) over an interval. The goal is to achieve an error of at most ε using as few samples of f as possible. We use the adaptive framework: on all problem instances an adaptive algorithm should perform almost as well as the best possible algorithm tuned for the particular problem instance. We distinguish between DOPT and ROPT, the performances of the best possible deterministic and randomized algorithms, respectively. We give a deterministic algorithm that uses O(DOPT(f, ε) ⋅ log(ε^{-1} / DOPT(f, ε))) samples and show that an asymptotically better algorithm is impossible. However, any deterministic algorithm requires Ω(ROPT(f, ε)^2) samples on some problem instance. By combining a deterministic adaptive algorithm and Monte Carlo sampling with variance reduction, we give an algorithm that uses at most O(ROPT(f, ε)^{4/3} + ROPT(f, ε) ⋅ log(1/ε)) samples. We also show that any algorithm requires Ω(ROPT(f, ε)^{4/3} + ROPT(f, ε) ⋅ log(1/ε)) samples in expectation on some problem instance (f, ε), which proves that our algorithm is optimal.
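
Illustrative sketch (not the paper's algorithm): the following Python code is a minimal example of deterministic adaptive integration in this setting, assuming a known Lipschitz constant L > 0. It repeatedly bisects the subinterval whose Lipschitz-based worst-case error bound is largest until the total bound drops below ε; the function name adaptive_lipschitz_integrate, the bisection rule, and the trapezoid estimate are illustrative assumptions, not the algorithm analyzed in the paper.

import heapq

def adaptive_lipschitz_integrate(f, a, b, L, eps):
    # Approximate the integral of f over [a, b] to within eps, assuming
    # |f(x) - f(y)| <= L * |x - y| with L > 0.  Illustrative sketch only.

    def bound(x0, y0, x1, y1):
        # Worst-case error of the trapezoid estimate on [x0, x1]: the
        # Lipschitz condition confines f to a rhombus of area
        # (L^2 h^2 - (y1 - y0)^2) / (2L); the trapezoid is its centerline,
        # so the estimate is off by at most half that area.
        h = x1 - x0
        return (L * L * h * h - (y1 - y0) ** 2) / (4.0 * L)

    fa, fb = f(a), f(b)
    samples = 2
    total = bound(a, fa, b, fb)
    # Max-heap (via negated keys) of subintervals ordered by error bound.
    heap = [(-total, a, fa, b, fb)]

    while total > eps:
        neg_e, x0, y0, x1, y1 = heapq.heappop(heap)
        total += neg_e                       # remove old contribution
        m = 0.5 * (x0 + x1)
        ym = f(m)                            # one new sample per refinement
        samples += 1
        for lo, ylo, hi, yhi in ((x0, y0, m, ym), (m, ym, x1, y1)):
            e = bound(lo, ylo, hi, yhi)
            total += e
            heapq.heappush(heap, (-e, lo, ylo, hi, yhi))

    estimate = sum(0.5 * (y0 + y1) * (x1 - x0)
                   for _, x0, y0, x1, y1 in heap)
    return estimate, samples

For example, adaptive_lipschitz_integrate(math.sin, 0.0, math.pi, 1.0, 1e-4) returns an estimate within 10^{-4} of 2, spending more samples where the Lipschitz bound leaves the most slack. The paper's randomized algorithm additionally applies Monte Carlo sampling with variance reduction on top of such a deterministic refinement; that part is not sketched here.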

Comments:
This paper is also available from SpringerLink.

Availability:
The paper is available in PostScript (594k), gzipped PostScript (257k), and PDF (289k).

Related papers:
Integration_LATIN2006 (Optimally Adaptive Integration of Univariate Lipschitz Functions)


See also other papers by Erik Demaine.
These pages are generated automagically from a BibTeX file.
Last updated November 12, 2024 by Erik Demaine.