Near-optimal labeling schemes for nearest common ancestors

Stephen Alstrup, Esben Bistrup Halvorsen, Kasper Green Larsen

16 Citations (Scopus)

Abstract

We consider NCA labeling schemes: given a rooted tree T, label the nodes of T with binary strings such that, given the labels of any two nodes, one can determine, by looking only at the labels, the label of their nearest common ancestor. For trees with n nodes we present upper and lower bounds establishing that labels of size (2 ± ε) log n, ε < 1, are both sufficient and necessary. Alstrup, Bille, and Rauhe (SIDMA'05) showed that ancestor and NCA labeling schemes have labels of size log n + Ω(log log n). Our lower bound increases this to log n + Ω(log n) for NCA labeling schemes. Since Fraigniaud and Korman (STOC'10) established that labels in ancestor labeling schemes have size log n + Θ(log log n), our new lower bound separates ancestor and NCA labeling schemes. Our upper bound improves the 10 log n upper bound by Alstrup, Gavoille, Kaplan, and Rauhe (TOCS'04), and our theoretical result even outperforms some recent experimental studies by Fischer (ESA'09), in which variants of the same NCA labeling scheme are all shown to have labels of size approximately 8 log n.
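To make the problem statement concrete, here is a minimal sketch of a naive NCA labeling scheme, not the paper's near-optimal construction: label each node with its root-to-node path of child indices, so the NCA's label is simply the longest common prefix of two labels. This gives labels of size O(depth), far worse than the (2 ± ε) log n bound of the paper; the example tree is hypothetical.

```python
def path_labels(tree, root):
    """Assign each node its root-to-node path of child indices.

    A naive NCA labeling: labels have O(depth) entries, far from the
    near-optimal (2 +/- eps) log n bits achieved in the paper.
    """
    labels = {root: ()}
    stack = [root]
    while stack:
        u = stack.pop()
        for i, v in enumerate(tree.get(u, [])):
            labels[v] = labels[u] + (i,)
            stack.append(v)
    return labels

def nca_label(a, b):
    """Compute the NCA's label from two labels alone: the longest common prefix."""
    k = 0
    while k < min(len(a), len(b)) and a[k] == b[k]:
        k += 1
    return a[:k]

# Hypothetical example tree as a children adjacency dict, rooted at 0.
tree = {0: [1, 2], 1: [3, 4], 2: [5]}
L = path_labels(tree, 0)
print(nca_label(L[3], L[4]))  # label of node 1: (0,)
print(nca_label(L[3], L[5]))  # label of the root: ()
```

The point of a labeling scheme, as the abstract states, is that `nca_label` looks only at the two labels, never at the tree; the challenge addressed by the paper is compressing such labels to near the information-theoretic minimum number of bits.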

Original language: English
Title of host publication: Proceedings of the Twenty-Fifth Annual ACM-SIAM Symposium on Discrete Algorithms
Editors: Chandra Chekuri
Number of pages: 11
Publisher: Society for Industrial and Applied Mathematics
Publication date: 2014
Pages: 972-982
ISBN (Print): 978-1-61197-338-9
ISBN (Electronic): 978-1-61197-340-2
Publication status: Published - 2014
Event: Twenty-Fifth Annual ACM-SIAM Symposium on Discrete Algorithms, Portland, United States
Duration: 5 Jan 2014 - 7 Jan 2014
Conference number: 25


