1
Benlahbib A, Boumhidi A, Nfaoui EH. Mining online reviews to support customers' decision-making process in e-commerce platforms: A narrative literature review. Journal of Organizational Computing and Electronic Commerce 2022. DOI: 10.1080/10919392.2022.2053454
Affiliation(s)
- Abdessamad Benlahbib, Lisac Laboratory, Faculty of Sciences Dhar El Mehraz (F.S.D.M), Sidi Mohamed Ben Abdellah University, Fes-Atlas, Morocco
- Achraf Boumhidi, Lisac Laboratory, Faculty of Sciences Dhar El Mehraz (F.S.D.M), Sidi Mohamed Ben Abdellah University, Fes-Atlas, Morocco
- El Habib Nfaoui, Lisac Laboratory, Faculty of Sciences Dhar El Mehraz (F.S.D.M), Sidi Mohamed Ben Abdellah University, Fes-Atlas, Morocco
2
López M, Martínez-Cámara E, Luzón MV, Herrera F. ADOPS: Aspect Discovery OPinion Summarisation Methodology based on deep learning and subgroup discovery for generating explainable opinion summaries. Knowl Based Syst 2021. DOI: 10.1016/j.knosys.2021.107455
3
Lamsiyah S, Mahdaouy AE, Ouatik SEA, Espinasse B. Unsupervised extractive multi-document summarization method based on transfer learning from BERT multi-task fine-tuning. J Inf Sci 2021. DOI: 10.1177/0165551521990616
Abstract
Text representation is a fundamental cornerstone that impacts the effectiveness of many text summarization methods. Transfer learning using pre-trained word embedding models has shown promising results. However, most of these representations consider neither the order of words in a sentence nor the semantic relationships between them, and thus they do not capture the meaning of a full sentence. To overcome this issue, the current study proposes an unsupervised method for extractive multi-document summarization based on transfer learning from a BERT sentence embedding model. Moreover, to improve sentence representation learning, we fine-tune the BERT model on supervised intermediate tasks from the GLUE benchmark datasets, using both single-task and multi-task fine-tuning. Experiments are performed on the standard DUC'2002–2004 datasets. The results show that our method significantly outperforms several baseline methods and achieves comparable, and sometimes better, performance than recent state-of-the-art deep learning–based methods. Furthermore, the results show that fine-tuning BERT with multi-task learning considerably improves performance.
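The abstract does not detail the selection step beyond embedding sentences and scoring them. A common unsupervised baseline consistent with that description ranks sentences by cosine similarity to the document centroid and extracts the top-scoring ones. The sketch below illustrates that idea only; the bag-of-words `embed` function is a self-contained stand-in for a BERT sentence encoder, and all names are illustrative, not taken from the paper.

```python
import math


def embed(sentences):
    # Hypothetical stand-in for a BERT sentence encoder: plain
    # bag-of-words counts, used only to keep the sketch runnable.
    vocab = sorted({w for s in sentences for w in s.lower().split()})
    index = {w: i for i, w in enumerate(vocab)}
    vectors = []
    for s in sentences:
        v = [0.0] * len(vocab)
        for w in s.lower().split():
            v[index[w]] += 1.0
        vectors.append(v)
    return vectors


def _cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1e-12
    nb = math.sqrt(sum(x * x for x in b)) or 1e-12
    return dot / (na * nb)


def centroid_summary(sentences, k=2):
    """Rank sentences by cosine similarity to the document centroid
    and return the top-k sentences in their original order."""
    vectors = embed(sentences)
    dim = len(vectors[0])
    centroid = [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]
    scores = [_cosine(v, centroid) for v in vectors]
    ranked = sorted(range(len(sentences)), key=scores.__getitem__)
    top = sorted(ranked[-k:])  # restore document order
    return [sentences[i] for i in top]
```

With a real sentence encoder swapped in for `embed`, the centroid captures the dominant topic of the document cluster, so off-topic sentences score low and are dropped from the summary.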
Affiliation(s)
- Salima Lamsiyah, Laboratory of Informatics, Signals, Automatic, and Cognitivism, FSDM, Sidi Mohamed Ben Abdellah University, Morocco; Laboratory of Engineering Sciences, National School of Applied Sciences, Ibn Tofail University, Morocco
- Abdelkader El Mahdaouy, School of Computer Science (UM6P-CS), Mohammed VI Polytechnic University (UM6P), Morocco
- Saïd El Alaoui Ouatik, Laboratory of Informatics, Signals, Automatic, and Cognitivism, FSDM, Sidi Mohamed Ben Abdellah University, Morocco; Laboratory of Engineering Sciences, National School of Applied Sciences, Ibn Tofail University, Morocco
- Bernard Espinasse, LIS UMR CNRS 7020, Aix-Marseille Université/Université de Toulon, France