Coding Small Group Communication with AI: RNNs and Transformers with Context
Andrew Pilny, Joseph Bonito & Aaron Schecter
Abstract
This study compares the performance of recurrent neural networks (RNNs) and transformer-based models (DistilBERT) in classifying utterances as dialogue acts. The results show that transformers consistently outperform RNNs, highlighting their usefulness in coding small group interaction. The study also explores the impact of incorporating context in the form of preceding and following utterances. The findings reveal that adding context leads to modest improvements in model performance; however, in some cases it can slightly decrease performance. The study discusses the implications of these findings for small group researchers employing AI models for text classification tasks.
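The abstract does not specify how context was encoded, but a common approach is to join an utterance with its neighbors using the model's separator token before classification. Below is a minimal sketch of that idea using the Hugging Face transformers library; the window of one preceding and one following utterance, the four-label head, and the example transcript are illustrative assumptions, not details taken from the article.

```python
# Sketch: classifying an utterance with DistilBERT, with optional context
# from the preceding and following utterances. NOT the authors' exact
# pipeline; model, window size, and labels are assumptions for illustration.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=4  # hypothetical dialogue-act classes
)

# Hypothetical three-utterance exchange from a small group discussion.
transcript = [
    "Okay, so what should we do first?",    # preceding utterance
    "I think we should list the options.",  # target utterance
    "Good idea, I'll start a document.",    # following utterance
]

def with_context(utterances, i):
    """Join the target utterance with its neighbors using the [SEP] token."""
    sep = f" {tokenizer.sep_token} "
    prev_u = utterances[i - 1] if i > 0 else ""
    next_u = utterances[i + 1] if i + 1 < len(utterances) else ""
    return sep.join([prev_u, utterances[i], next_u]).strip()

inputs = tokenizer(
    with_context(transcript, 1),
    return_tensors="pt", truncation=True, max_length=128,
)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()
print(predicted_class)  # meaningless until the classification head is fine-tuned
```

In this setup, comparing a context window of zero (target utterance only) against windows that include neighboring utterances would reproduce the kind of with/without-context comparison the study describes.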