What is it about?

The paper discusses deep transfer learning using transformer language models and its implications for information systems (IS) research. It comprises two parts: first, a survey of these new techniques, and second, a survey of information systems research using text analytics that could be improved by them.

Why is it important?

New advanced NLP techniques have applications across a wide variety of social science disciplines. This survey offers an introduction for those researchers, focusing specifically on applications for information systems (IS) research. No single published or online source offers a comprehensive survey at this level of detail, and that level of detail is necessary to understand and use these technologies effectively in practice. The article introduces these concepts in a way that is not esoteric, but easy to grasp for NLP novices and experts alike.


I hope that this article helps researchers from IS and other social science disciplines quickly get up to speed with the powerful techniques now at the forefront of natural language processing research. It can also serve as a guide to where new and novel opportunities may exist for IS researchers. The online appendix goes into detail about how these techniques could be applied to existing IS research, discussing when they should and should not be used, as well as which methods are appropriate for different research topics. The methods discussed are very powerful, and I hope you find the article easy to understand so that you can begin applying them in your own research quickly.

Dr. Ross Gruetzemacher
Wichita State University

Read the Original

This page is a summary of: Deep Transfer Learning & Beyond: Transformer Language Models in Information Systems Research, ACM Computing Surveys, January 2022, ACM (Association for Computing Machinery),
DOI: 10.1145/3505245.