Contextual Categorization Enhancement through LLMs Latent-Space
Authors:
Zineddine Bettouche
Anas Safi
Andreas Fischer
Keywords: Natural Language Processing, Contextual Categorization, Large Language Models, BERT, Convex Hull, Hierarchical Navigable Small Worlds, High-dimensional Latent Space, Dimensionality Reduction.
Abstract:
Managing the semantic quality of categorization in large textual datasets, such as Wikipedia, presents significant challenges in terms of complexity and cost. In this paper, we propose leveraging transformer models to distill semantic information from texts in the Wikipedia dataset and their associated categories into a latent space. We then explore different approaches based on these encodings to assess and enhance the semantic identity of the categories. Our graphical approach relies on convex hulls, while the hierarchical approach uses Hierarchical Navigable Small World (HNSW) graphs. To compensate for the information loss caused by dimensionality reduction, we formulate an exponential decay function driven by the Euclidean distances between the high-dimensional encodings of the textual categories. This function acts as a filter built around a contextual category and retrieves items with a certain Reconsideration Probability (RP). Retrieving high-RP items serves as a tool for database administrators to improve data groupings by providing recommendations and identifying outliers within a contextual framework.
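As a rough illustration of the decay-based filter described in the abstract, the sketch below computes an RP from the Euclidean distance between two high-dimensional encodings. The paper's exact formula and parameters are not given here, so the function name `reconsideration_probability` and the `decay_rate` parameter are assumptions for illustration only.

```python
import numpy as np

def reconsideration_probability(category_vec: np.ndarray,
                                item_vec: np.ndarray,
                                decay_rate: float = 1.0) -> float:
    """Exponential decay over the Euclidean distance between two
    high-dimensional encodings (e.g., BERT sentence embeddings).

    Items closer to the contextual category in latent space receive
    a higher Reconsideration Probability (RP)."""
    distance = float(np.linalg.norm(category_vec - item_vec))
    return float(np.exp(-decay_rate * distance))

# Hypothetical usage: score an item against a contextual category.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    category = rng.normal(size=768)  # e.g., a 768-dim BERT encoding
    item = rng.normal(size=768)
    rp = reconsideration_probability(category, item, decay_rate=0.1)
    print(f"Reconsideration Probability: {rp:.4f}")
```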
Pages: 6 to 11
Copyright: Copyright (c) IARIA, 2024
Publication date: April 14, 2024
Published in: COMPUTATION TOOLS 2024, The Fifteenth International Conference on Computational Logics, Algebras, Programming, Tools, and Benchmarking
ISSN: 2308-4170
ISBN: 978-1-68558-158-9
Location: Venice, Italy
Dates: from April 14, 2024 to April 18, 2024