Training big random forests with little resources


    Abstract

    Without access to large compute clusters, building random forests on large datasets remains a challenging problem. This is particularly the case when fully-grown trees are desired. We propose a simple yet effective framework that makes it possible to efficiently construct ensembles of huge trees for hundreds of millions or even billions of training instances using a cheap desktop computer with commodity hardware. The basic idea is a multi-level construction scheme, which builds top trees for small random subsets of the available data and subsequently distributes all training instances to the top trees' leaves for further processing. While the scheme is conceptually simple, its overall efficiency crucially depends on the particular implementation of the different phases. The practical merits of our approach are demonstrated using dense datasets with hundreds of millions of training instances.
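
    The following is a minimal, in-memory sketch of the two-level idea described in the abstract: fit a shallow "top tree" on a small random subset, route every training instance to a top-tree leaf, then grow a fully-grown tree per leaf bucket. It is not the authors' implementation (whose efficiency relies on careful, possibly out-of-core handling of the distribution phase); class and parameter names such as TwoLevelForestSketch, subset_size, and top_max_leaves are illustrative, and integer class labels are assumed.

    ```python
    # Illustrative sketch only: small top trees on random subsets, then one
    # fully-grown bottom tree per top-tree leaf, combined by majority vote.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    class TwoLevelForestSketch:
        def __init__(self, n_estimators=4, subset_size=10_000,
                     top_max_leaves=64, random_state=0):
            self.n_estimators = n_estimators
            self.subset_size = subset_size        # size of the random subset for the top tree
            self.top_max_leaves = top_max_leaves  # controls how many buckets the top tree induces
            self.rng = np.random.default_rng(random_state)
            self.members = []                     # list of (top_tree, {leaf_id: bottom_tree})

        def fit(self, X, y):
            n = X.shape[0]
            for _ in range(self.n_estimators):
                # Phase 1: build a small top tree on a random subset of the data.
                idx = self.rng.choice(n, size=min(self.subset_size, n), replace=False)
                top = DecisionTreeClassifier(max_leaf_nodes=self.top_max_leaves)
                top.fit(X[idx], y[idx])
                # Phase 2: distribute *all* training instances to the top tree's leaves.
                leaves = top.apply(X)
                # Phase 3: grow a fully-grown bottom tree per leaf bucket.
                bottoms = {}
                for leaf in np.unique(leaves):
                    mask = leaves == leaf
                    bottoms[leaf] = DecisionTreeClassifier().fit(X[mask], y[mask])
                self.members.append((top, bottoms))
            return self

        def predict(self, X):
            votes = []
            for top, bottoms in self.members:
                leaves = top.apply(X)
                pred = np.empty(X.shape[0], dtype=int)
                for leaf in np.unique(leaves):
                    mask = leaves == leaf
                    pred[mask] = bottoms[leaf].predict(X[mask])
                votes.append(pred)
            # Majority vote across ensemble members (assumes non-negative integer labels).
            votes = np.stack(votes)
            return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
    ```

    In this toy version everything lives in main memory; the point of the paper's framework is precisely that Phase 2 and Phase 3 can be organized so that the full dataset never has to be held in memory at once.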

    Original language: English
    Title of host publication: KDD 2018 - Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
    Publisher: ACM Association for Computing Machinery
    Publication date: 2018
    Pages: 1445-1454
    ISBN (Print): 9781450355520
    DOIs
    Publication status: Published - 2018
    Event: 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD 2018 - London, United Kingdom
    Duration: 19 Aug 2018 - 23 Aug 2018

    Conference

    Conference: 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD 2018
    Country/Territory: United Kingdom
    City: London
    Period: 19/08/2018 - 23/08/2018
    Sponsor: ACM SIGKDD, ACM SIGMOD

    Keywords

    • Classification
    • Ensemble methods
    • Large-scale data analytics
    • Machine learning
    • Random forests
    • Regression trees
