Blog

CREATE Shares Their Rubric for Technical Assistance Evaluation!

The digital technology landscape is constantly evolving. To sustain innovative, high-quality instruction that prepares adult learners to use digital technologies in school and at work, adult education program leaders and teachers must continuously evolve how they use digital technologies in their programming. This can be a dizzying challenge: practitioners and program leaders may feel they lack the experience and confidence to decide what technology to use, when to use it, and for what purpose. Quality technical assistance (TA) resources can help fill this gap by providing guidance on selecting and implementing new digital technology resources and tools. 

Developed by World Education's CREATE team, the CREATE Adult Skills Network Edtech Technical Assistance Library (the TA Library) is a compilation of technical assistance resources that help adult education leaders and practitioners plan and implement instruction using digital technologies. The resources in the TA Library are organized by categories that cover the tasks for which teachers might use digital technologies:

  • Planning technology use 
  • Communicating with learners 
  • Managing content/instruction 
  • Determining instructional content 
  • Providing instruction through different modes 
  • Assessing learning 

The resources in the TA Library are all free, were crowdsourced from adult educators, and were evaluated before being added to the TA Library. 

The utility of the TA Library rests on the quality of its resources. Deciding which resources to keep or rule out was a challenge. A strong evaluation rubric was the key. We constructed our rubric by looking at research that evaluated the quality of technical assistance resources and open educational resources. The research we found most useful came from the fields of organizational leadership, medical education and training, education evaluation, open education, and online education. 

We built the rubric through a multistep process. We located findings from publications in these fields, identified common characteristics of quality in technical assistance materials within those findings, and then gauged the relevance of those characteristics for adult education. A key consideration for the final structure of the rubric was usability, which gave us an incentive to keep it simple and relatively short. Additionally, the literature provided guidance on the structure and content of the rubric itself (Yuan & Recker, 2015). The rubric we created was the result of team discussions, iterative development, trial use, and revision, repeated until we felt we had a tool that could inform our decisions. 

The initial rubric had six criteria: clarity and comprehensibility, content and technical accuracy, accessibility, quality of content, ease of use, and provenance. Working through the iterative process described above, we condensed these to three main categories with clear definitions. The evaluation criteria, their definitions, and the most salient literature that supported our understanding of them are reflected below.

Content Quality 

  • The content in the resource is comprehensible to adult education practitioners and written in plain language. 
  • The content is accurate, current, and based on some expert knowledge or evidence. 
  • The content is well constructed, without grammatical or typographical errors, broken links, or obsolete formats. 
  • The content is logically arranged and easy to follow. 
  • The content is relevant and conveys the information it purports to cover. 

References that informed this category: 

Haughey, M., & Muirhead, B. (2005). Evaluating learning objects for schools. E-Journal of Instructional Science and Technology, 8(1), n1. https://eric.ed.gov/?id=EJ850358

Pérez-Mateo, M., Maina, M. F., Guitert, M., & Romero, M. (2011). Learner generated content: Quality criteria in online collaborative learning. European Journal of Open, Distance and E-Learning, 14(2). https://old.eurodl.org/?p=special&sp=articles&article=461&article=455&article=459

Plainlanguage.gov | Clear writing and plain language. (n.d.). https://www.plainlanguage.gov/about/definitions/clear-writing-and-plain-language/

Research guides: Open Educational Resources: Evaluating OER. (n.d.). https://libguides.lehman.edu/c.php?g=612694&p=4255719

Ease of Use and Accessibility 

  • The content is not behind a paywall or login. 
  • The content presents information in a format that supports usability (e.g., layout and navigation are easy to follow) and accessibility (e.g., may include alt text, closed-captioning). 

References that informed this category: 

Achieve. (2011). Rubrics for evaluating Open Education Resource (OER) objects. Washington, DC. http://www.achieve.org/publications/achieve-oer-rubrics

Haughey, M., & Muirhead, B. (2005). Evaluating learning objects for schools. E-Journal of Instructional Science and Technology, 8(1), n1. 

Kurilovas, E., Bireniene, V., & Serikoviene, S. (2011). Methodology for evaluating quality and reusability of learning objects. Electronic Journal of e-Learning, 9(1), 39-51. https://academic-publishing.org/index.php/ejel/article/view/1604

Nesbit, J., Belfer, K., & Leacock, T. (2007). Learning object review instrument (LORI), Version 1.5. E-Learning Research and Assessment (eLera) and the Portal for Online Objects in Learning (POOL). https://edutechwiki.unige.ch/en/Learning_Object_Review_Instrument

Pérez-Mateo, M., Maina, M. F., Guitert, M., & Romero, M. (2011). Learner generated content: Quality criteria in online collaborative learning. European Journal of Open, Distance and E-Learning, 14(2).

Research guides: Open Educational Resources: Evaluating OER. (n.d.). 

Provenance 

  • The author or authoring organization is known and has made previous contributions to the field. 

References that informed this category: 

Custard, M., & Sumner, T. (2005). Using machine learning to support quality judgments. D-Lib Magazine, 11(10), 1082-9873. https://www.dlib.org/dlib/october05/custard/10custard.html


We have created a public version of the rubric to help educators discern the quality of TA resources they encounter. We hope the rubric, the research literature introduced here, and our TA Library all inform quality technology-rich instruction in adult education. 

Try out the TA Evaluation Rubric 

Explore the TA Library

References 

Yuan, M., & Recker, M. (2015). Not all rubrics are equal: A review of rubrics for evaluating the quality of open educational resources. International Review of Research in Open and Distributed Learning, 16(5), 16-38.