
Capacity Matters: a Proof-of-Concept for Transformer Memorization on Real-World Data

  • Anton Changalidis*
  • Aki Harma
  • R. Jia
  • E. Wallace
  • Y. Huang
  • T. Pimentel
  • P. Maini
  • V. Dankers
  • J. Wei
  • P. Lesci

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic › peer-review

Abstract

This paper studies how model architecture and data configuration influence the empirical memorization capacity of generative transformers. The models are trained on synthetic text datasets derived from the Systematized Nomenclature of Medicine (SNOMED) knowledge graph: triplets, representing static connections, and sequences, simulating complex relation patterns. The results show that embedding size is the primary determinant of learning speed and capacity, while additional layers provide limited benefit and may hinder performance on simpler datasets. Activation functions also play a crucial role: Softmax demonstrates greater stability and higher capacity. Furthermore, increasing dataset complexity appears to improve final memorization. These insights improve our understanding of transformer memory mechanisms and provide a framework for optimizing model design with structured real-world data.
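The abstract describes two synthetic datasets built from the SNOMED knowledge graph (static triplets and longer relation sequences) and an exact-recall notion of memorization capacity. The sketch below illustrates that setup on a toy graph; the helper names (`triplet_sentences`, `walk_sentences`, `memorization_capacity`, `model_generate`) and the example edges are hypothetical stand-ins, not the authors' released code or the real SNOMED data.

```python
# Minimal sketch of the data setup described in the abstract.
# Hypothetical names and a toy graph; the paper uses SNOMED-derived data.
import random

# Toy knowledge graph: (subject, relation, object) edges.
EDGES = [
    ("myocardial infarction", "is_a", "heart disease"),
    ("heart disease", "finding_site", "heart"),
    ("aspirin", "treats", "myocardial infarction"),
    ("heart", "part_of", "cardiovascular system"),
]

def triplet_sentences(edges):
    """'Triplets' dataset: one static edge rendered as one training sample."""
    return [f"{s} {r} {o}" for s, r, o in edges]

def walk_sentences(edges, num_walks=8, walk_len=3, seed=0):
    """'Sequences' dataset: random walks that chain edges into longer
    relation patterns, simulating more complex structure."""
    rng = random.Random(seed)
    adjacency = {}  # subject -> list of (relation, object)
    for s, r, o in edges:
        adjacency.setdefault(s, []).append((r, o))
    walks = []
    for _ in range(num_walks):
        node = rng.choice(list(adjacency))
        parts = [node]
        for _ in range(walk_len):
            if node not in adjacency:
                break
            r, o = rng.choice(adjacency[node])
            parts += [r, o]
            node = o
        walks.append(" ".join(parts))
    return walks

def memorization_capacity(model_generate, train_samples):
    """Empirical memorization: fraction of training samples reproduced
    exactly when the model is prompted with the sample's first token.
    `model_generate` would wrap the trained transformer's greedy decoding."""
    hits = sum(
        model_generate(sample.split()[0]) == sample for sample in train_samples
    )
    return hits / len(train_samples)

if __name__ == "__main__":
    print(triplet_sentences(EDGES))
    print(walk_sentences(EDGES))
```

Under this framing, capacity is compared across model variants (embedding size, depth, activation function) by measuring how many training samples each variant can reproduce exactly after training.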
Original language: English
Title of host publication: Proceedings of the First Workshop on Large Language Model Memorization, L2M2
Publisher: Association for Computational Linguistics (ACL)
Pages: 227-238
Number of pages: 12
ISBN (Print): 9798891762787
Publication status: Published - 2025
Event: 1st Workshop on Large Language Model Memorization - L2M2, Vienna International Centre, Vienna, Austria
Duration: 1 Aug 2025 - 1 Aug 2025
https://sites.google.com/view/memorization-workshop/

Conference

Conference: 1st Workshop on Large Language Model Memorization - L2M2
Country/Territory: Austria
City: Vienna
Period: 1/08/25 - 1/08/25
Internet address: https://sites.google.com/view/memorization-workshop/
