A review of digital formative assessment tools: Features and future directions.


Çekiç A., Bakla A.

INTERNATIONAL ONLINE JOURNAL OF EDUCATION AND TEACHING, vol.8, no.3, pp.1459-1485, 2021 (Peer-Reviewed Journal)

  • Publication Type: Article / Full Article
  • Volume: 8 Issue: 3
  • Publication Date: 2021
  • Journal Name: INTERNATIONAL ONLINE JOURNAL OF EDUCATION AND TEACHING
  • Journal Indexes: EBSCO Education Source, Educational Research Abstracts (ERA), ERIC (Education Resources Information Center), MLA - Modern Language Association Database, Directory of Open Access Journals, TR DİZİN (ULAKBİM)
  • Page Numbers: pp.1459-1485
  • Affiliated with Sivas Cumhuriyet University: Yes

Abstract

The Internet and mobile app stores offer a huge number of digital tools for virtually any task, and those intended for digital formative assessment (DFA) have burgeoned in the last decade. These tools vary in terms of their functionality, pedagogical quality, cost, supported operating systems, and so forth. Teachers and learners therefore need guidance on choosing the most effective DFA software and making the most of it. This study provides an in-depth critical review of the features of the most popular formative assessment tools available on the Internet. It aims to unearth what current DFA tools are capable of doing and what further developments are needed for more effective use. The tools were sampled based on their frequency of mention in educational technology websites and blogs and in two scholarly databases (Web of Science and Scopus). After identifying the most frequently recommended, reviewed, and researched formative assessment tools, the researchers inspected 14 tools across a range of issues: supported platforms and devices, item types offered by the software, features for monitoring student performance and providing feedback (through student/instructor dashboards), grading, scoring of open-response items, and collaborative responses. The results indicated that closed-ended items were common to all the tools examined and were scored automatically, whereas only a few tools offered methods, still underdeveloped, for grading open-ended items. All the tools provided learner analytics with diverse forms of data and different mechanisms for feedback, yet the most common forms of data were immediate answers and numerical scores. It was also clear that popularity did not necessarily translate into more functionality or better tools. Based on the current status of the tools, avenues for further research are discussed.