
    Feature selection is a crucial process in machine learning and data mining that identifies the most pertinent and valuable features in a dataset. By reducing the number of features, it improves classification accuracy, lessens the computational burden, and enhances the overall performance of predictive models. This study proposes the improved binary golden jackal optimization (IBGJO) algorithm, an extension of the conventional golden jackal optimization (GJO) algorithm, as a search strategy for wrapper-based feature selection. IBGJO comprises three key components: a population initialization process with a chaotic tent map (CTM) mechanism that enhances exploitation ability and guarantees population diversity, an adaptive position update mechanism based on cosine similarity that prevents premature convergence, and a binary mechanism well-suited to binary feature selection problems. We evaluated IBGJO on 28 classical datasets from the UC Irvine Machine Learning Repository. The results show that the proposed CTM mechanism and the cosine-similarity-based position update strategy significantly improve the rate of convergence of the conventional GJO algorithm, and that IBGJO's classification accuracy is significantly better than that of competing algorithms. Additionally, we evaluated the effectiveness and performance of each enhanced component individually.
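The three components named above (chaotic tent-map initialization, a cosine-similarity measure between positions, and binarization of continuous positions into feature masks) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the tent-map form (breakpoint 0.7), the 0.5 binarization threshold, and all function names are assumptions, since the abstract does not give the exact formulas.

```python
import numpy as np

def tent_map_init(pop_size, dim, seed=None):
    """Initialize a population in [0, 1] with a chaotic tent map (CTM).

    Assumed form (common in metaheuristics, may differ from the paper):
        x' = x / 0.7          if x < 0.7
        x' = (10/3) * (1 - x) otherwise
    """
    rng = np.random.default_rng(seed)
    pop = np.empty((pop_size, dim))
    # Start away from the map's fixed points at 0 and 1.
    x = rng.uniform(0.01, 0.99, size=dim)
    for i in range(pop_size):
        x = np.where(x < 0.7, x / 0.7, (10.0 / 3.0) * (1.0 - x))
        pop[i] = x
    return pop

def binarize(pop, threshold=0.5):
    """Map continuous positions to binary masks (1 = feature selected)."""
    return (pop > threshold).astype(int)

def cosine_similarity(a, b):
    """Cosine similarity between two position vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```

In a wrapper-based setting, each binary mask would be scored by training a classifier on the selected feature subset; the cosine similarity between a jackal's position and the leaders' positions could then gate how aggressively positions are updated, which is the role the abstract ascribes to the adaptive update mechanism.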

    Citation

    Kunpeng Zhang, Yanheng Liu, Fang Mei, Geng Sun, Jingyi Jin. IBGJO: Improved Binary Golden Jackal Optimization with Chaotic Tent Map and Cosine Similarity for Feature Selection. Entropy (Basel, Switzerland). 2023 Jul 27;25(8).


    PMID: 37628158
