Transformer Based Sequential Recommender System / (Record no. 594917)

000 -LEADER
fixed length control field 02074nam a22001697a 4500
003 - CONTROL NUMBER IDENTIFIER
control field NUST
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER
Classification number 005.1,FAR
100 ## - MAIN ENTRY--PERSONAL NAME
Personal name Farooq, Nadia
245 ## - TITLE STATEMENT
Title Transformer Based Sequential Recommender System /
Statement of responsibility, etc. Major Nadia Farooq
264 ## - PRODUCTION, PUBLICATION, DISTRIBUTION, MANUFACTURE, AND COPYRIGHT NOTICE
Place of production, publication, distribution, manufacture Rawalpindi
Name of producer, publisher, distributor, manufacturer MCS, NUST
Date of production, publication, distribution, manufacture, or copyright notice 2023
300 ## - PHYSICAL DESCRIPTION
Extent x, 27
505 ## - FORMATTED CONTENTS NOTE
Formatted contents note Recommender systems (RS) aid end users by providing suggestions and predicting items of interest on e-commerce and social media platforms. Sequential recommendation systems (SRS) use the sequence of a user's historical preferences to predict the next user-item interaction. In recent literature, various deep learning methods such as CNNs and RNNs have shown significant improvements in generating recommendations; however, anticipating the next item from a user's interaction history remains challenging. With the introduction of the transformer architecture, SRS have gained a major performance boost in generating precise recommendations. Recently proposed transformer-based models predict the next user-item interaction by exploiting item identifiers only. Regardless of the efficacy of these models, we believe that the performance of recommendation models can be improved by adding descriptive item features alongside the item identifiers. This paper proposes a transformer-based SRS that models user behavior sequences by incorporating auxiliary information along with item identifiers to produce more accurate recommendations. The proposed model extends the BERT4Rec model to incorporate auxiliary information by exploiting the Sentence Transformer model to produce sentence representations from the textual features of items. This dense vector representation is then merged with the user's item representations. Comprehensive experiments on various benchmark datasets show remarkable improvements when compared with other similar state-of-the-art models.
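The abstract above describes merging Sentence Transformer embeddings of item text with item-identifier embeddings before feeding them to a BERT4Rec-style sequence model. The following Python sketch illustrates that fusion step only; the checkpoint name (all-MiniLM-L6-v2), the embedding sizes, and the linear fusion layer are illustrative assumptions, not details taken from the thesis.

import torch
import torch.nn as nn
from sentence_transformers import SentenceTransformer

class ItemRepresentation(nn.Module):
    def __init__(self, num_items, id_dim=64, text_dim=384, hidden_dim=64):
        super().__init__()
        self.id_embedding = nn.Embedding(num_items, id_dim)          # learned item-identifier embedding
        self.text_encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed sentence-transformer checkpoint (384-dim output)
        self.fuse = nn.Linear(id_dim + text_dim, hidden_dim)         # merge identifier and textual representations

    def forward(self, item_ids, item_texts):
        id_vec = self.id_embedding(item_ids)                                      # (batch, id_dim)
        text_vec = self.text_encoder.encode(item_texts, convert_to_tensor=True)   # (batch, text_dim)
        return self.fuse(torch.cat([id_vec, text_vec.to(id_vec.device)], dim=-1))

# Usage: one item, with its title/description supplied as auxiliary text.
model = ItemRepresentation(num_items=1000)
fused = model(torch.tensor([42]), ["wireless noise-cancelling headphones"])
print(fused.shape)  # torch.Size([1, 64])

The fused vectors would then replace the plain item-ID embeddings in the input sequence of a BERT4Rec-style transformer.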
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element MSCSE / MSSE-27
690 ## - LOCAL SUBJECT ADDED ENTRY--TOPICAL TERM (OCLC, RLIN)
Topical term or geographic name as entry element MSCSE / MSSE
700 ## - ADDED ENTRY--PERSONAL NAME
Personal name Supervisor Dr. Naima Iltaf
942 ## - ADDED ENTRY ELEMENTS (KOHA)
Source of classification or shelving scheme
Koha item type Thesis
Holdings
Withdrawn status
Permanent Location Military College of Signals (MCS)
Current Location Military College of Signals (MCS)
Date acquired 06/06/2023
Full call number 005.1,FAR
Barcode MCSTCS-547
Koha item type Thesis