Resource
The Rapid Response, Innovation, and Challenges of Sustainability in the Time of COVID-19: Reports from the Field
Description
A new report explores how adult education programs have adapted over time to constantly changing conditions caused by ongoing but sporadic COVID-19 infections and shifting local public health policies. The report also looks at lessons learned after more than a year of remote teaching.
Resource
Digital Resilience in the American Workforce: Technical Assistance Pilot Program
Description
The Technical Assistance (TA) Pilot Program is a training program for state professional development leaders and for instructors in adult digital literacy programs funded under the Adult Education and Family Literacy Act (AEFLA). The pilot offers participants an opportunity to engage in peer learning and develop plans for using materials created by the DRAW team to enhance digital literacy instruction and make supplementary resources available to adult education learners. The program will include a webinar, state and local courses, and coaching.
Resource
Researcher Guide on Interpreting Impacts
Description
IES released a guide to help researchers avoid common misinterpretations of statistical significance and report study impacts that are more actionable for end users. Improving the quality and relevance of education studies is IES Director Mark Schneider's central goal for the Standards for Excellence in Education Research (SEER).
The guide introduces BASIE (Bayesian Interpretation of Estimates), an alternative framework to null hypothesis significance testing, and walks researchers through the key steps of applying BASIE:
Select prior evidence based on the distribution of intervention effects from existing impact studies (e.g., IES’ What Works Clearinghouse database).
Report traditional (based only on study data) and shrunken (based on both study data and prior evidence) impact estimates.
Interpret impact estimates using Bayesian posterior probabilities (or credible intervals).
Examine the sensitivity of shrunken impact estimates and posterior probabilities to what prior evidence is used.
The guide includes “express stops” and a simple Excel tool so that researchers can quickly start using BASIE. Detailed “local stops,” technical appendices, and programming code are also provided for evaluation methodologists.
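To make the shrinkage and posterior-probability steps above concrete, here is a minimal sketch in Python of the calculations for a single impact estimate under a simple normal-normal model. It is not the guide's Excel tool or code, and the impact estimate, standard error, and prior values below are invented purely for illustration.

import numpy as np
from scipy import stats

# Illustrative inputs (not from the guide): a study's traditional impact
# estimate and its standard error, plus a prior distribution of intervention
# effects summarized from existing impact studies.
impact_est = 0.12   # traditional estimate, based only on study data
impact_se = 0.08    # standard error of the traditional estimate
prior_mean = 0.05   # mean of the prior distribution of effects
prior_sd = 0.10     # standard deviation of the prior distribution

# Shrunken estimate: a precision-weighted average of the study estimate and
# the prior mean (normal-normal conjugate model).
prior_var, est_var = prior_sd ** 2, impact_se ** 2
weight_on_study = prior_var / (prior_var + est_var)
shrunken_est = weight_on_study * impact_est + (1 - weight_on_study) * prior_mean
posterior_sd = np.sqrt(prior_var * est_var / (prior_var + est_var))

# Posterior probability that the true effect is positive, and a 95%
# credible interval for it.
prob_positive = 1 - stats.norm.cdf(0, loc=shrunken_est, scale=posterior_sd)
lo95, hi95 = stats.norm.interval(0.95, loc=shrunken_est, scale=posterior_sd)

print(f"Traditional estimate:  {impact_est:.3f}")
print(f"Shrunken estimate:     {shrunken_est:.3f}")
print(f"P(true effect > 0):    {prob_positive:.2f}")
print(f"95% credible interval: ({lo95:.3f}, {hi95:.3f})")

The sensitivity step then amounts to rerunning the same calculation with alternative values of prior_mean and prior_sd drawn from different bodies of prior evidence.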
This guide is one of a series that helps researchers implement SEER. Guides on generalizability and sharing study data were recently released, and a guide on implementation research is in development and will be announced on https://www.ies.ed.gov.
The Institute of Education Sciences, a part of the U.S. Department of Education, is the nation's leading source for rigorous, independent education research, evaluation, statistics, and assessment.
Resource
Guidelines for Technology-Based Assessment
Description
The Association of Test Publishers and the International Test Commission have collaborated to develop Guidelines for Technology-Based Assessment to promote best practices and ensure fair and valid assessment in a digital environment. These Guidelines are now in draft form and are available for public comment through May 15, 2022. The draft Guidelines are the product of a multiyear effort that involved dozens of invited authors, ad hoc technical reviewers, and extensive review by ten advisory groups representing practice areas and regions of the world.
Resource
CREATE Adult Skills Network COABE Flyer
Description
This flyer provides background on the CREATE Adult Skills Network and lists the sessions that Network partners and research project teams are presenting at COABE 2022 (April 10-13).
Resource
Enhancing the Generalizability of Impact Studies in Education
Description
This guide will help researchers design and implement impact studies in education so that the findings are more generalizable to the study's target population. Guidance is provided on key steps that researchers can take, including defining the target population; selecting a sample of schools (and replacement schools, when needed); managing school recruitment; assessing and adjusting for differences between the sample and the target population; and reporting information on the generalizability of the study findings.
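As a rough illustration of the step on assessing differences between the sample and the target population, the sketch below compares hypothetical school-level characteristics using standardized mean differences. The guide describes the recommended procedures and reporting in detail; the variable names and values here are invented.

import pandas as pd

# Hypothetical school-level covariates for the recruited sample and the
# target population (names and values are illustrative only).
sample = pd.DataFrame({
    "pct_free_lunch": [62, 55, 70, 48],
    "enrollment":     [450, 380, 520, 610],
    "pct_ell":        [12, 9, 18, 7],
})
population = pd.DataFrame({
    "pct_free_lunch": [58, 40, 75, 52, 66, 35],
    "enrollment":     [500, 300, 620, 410, 580, 350],
    "pct_ell":        [10, 5, 22, 8, 15, 4],
})

# Standardized mean difference for each covariate: (sample mean minus
# population mean) divided by the population standard deviation. Values
# near zero suggest the sample resembles the target population on that
# characteristic; larger gaps flag where reweighting or replacement
# schools may be needed.
smd = (sample.mean() - population.mean()) / population.std(ddof=1)
print(smd.round(2))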
Resource
Digital Skills Frameworks and Assessments: A Foundation for Understanding Adult Learners’ Strengths and Learning Needs
Description
The CREATE Adult Skills Network (the Network) research teams are developing technology-supported learning and assessment tools and implementing curricula to help adult learners build digital skills. Throughout this work, each team has noted the importance of gaining a better understanding of the digital skills learners need to fully participate in the research projects. To that end, this Network Brief introduces several widely used digital literacy frameworks and assessment strategies relevant to adult education.
The brief provides high-level descriptions of the following frameworks:
Northstar Digital Literacy standards
The ISTE SkillRise Profile of a Lifelong Learner
Seattle Digital Equity Initiative’s (SDEI) Digital Skills Framework
The Maryland Department of Labor/Adult Education’s Digital Literacy Framework for Adult Learners
Resource
Researcher Guide on Sharing Study Data
Description
IES has released a guide to help researchers who, in support of open science, are deciding how to safely and appropriately share study data. This includes deciding which study data to share, how to organize the data, what documentation to include, and where to share the final dataset. Making data open is one of the Standards for Excellence in Education Research (SEER) that IES Director Mark Schneider identifies as essential to making research transformational.
The guide offers researchers tips to address common challenges in sharing study data, such as how to balance privacy with transparency, along with concrete steps to take throughout the research process. Key principles include:
The goal of data sharing is to produce something of value for science and, ultimately, for the improvement of education.
Focusing on sharing a well-organized and well-documented dataset can improve the organization and efficiency of the original study team.
Researchers should commit to sharing some data or code to facilitate additional analysis.
There is no single approach to sharing study data, and tradeoffs may be necessary.
The guide also provides links to other resources, a checklist, templates, and sample materials.
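As a loose illustration of the documentation piece, the sketch below builds a minimal data dictionary from a hypothetical analysis file. The guide's own templates and checklist are the authoritative starting point; the variables here are invented.

import pandas as pd

# Hypothetical analysis file (variable names are illustrative only).
df = pd.DataFrame({
    "student_id": [101, 102, 103],
    "treatment":  [1, 0, 1],
    "posttest":   [78.5, 64.0, 81.2],
})

# Minimal data dictionary: one row per variable with its type, the count of
# non-missing values, and an example value. A real codebook would add
# variable labels, sources, units, and any use restrictions.
dictionary = pd.DataFrame({
    "variable": df.columns,
    "dtype": [str(t) for t in df.dtypes],
    "n_nonmissing": df.notna().sum().values,
    "example": df.iloc[0].values,
})
dictionary.to_csv("data_dictionary.csv", index=False)
print(dictionary)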
This guide is one of a series that helps researchers implement SEER to improve the quality and relevance of their education studies. A guide on generalizability was recently released, and another guide on how to report more interpretable impact findings is forthcoming and will be announced on https://www.ies.ed.gov.
The Institute of Education Sciences, a part of the U.S. Department of Education, is the nation's leading source for rigorous, independent education research, evaluation, statistics, and assessment.