NERA Webinar Series 2019 Summarize Your Research in Three (3) Minutes
On Wednesday, May 1st at 12:30pm (EDT), the Northeastern Educational Research Association (NERA) hosted a webinar on the 3-Minute Thesis Competition and a comparable competition for professionals, both of which will be held during NERA’s 50th Annual Meeting at the Trumbull Marriott in Trumbull, CT (October 16-18, 2019). The competitions will challenge participants to present their research in just 180 seconds, in an engaging form that can be understood by an intelligent audience with no background in the research area. In keeping with this year’s conference theme, “Transformation by Design,” this will be another way we can improve how we share our research with a broader audience.
Presented by: Bridget Thomas, 2019 NERA Conference Co-chair
NERA Webinar Series 2017
Cultivating Sociocultural Consciousness in Students: Using Town Hall Meetings in the Urban Classroom
Monday, May 22nd from 6:00 to 7:00 PM EDT
Presented by Melissa Soto, John F. Kennedy High School
Moderated by Darlene Russell, Teacher-as-Researcher Committee Chair
Bio: Ms. Soto is a William Paterson University alumna with a degree in Secondary Education and History. She has taught for three years at the JFK Campus High School as a Global Studies teacher. Currently, Melissa is the Social Studies Department Chair and has led professional development workshops in culturally responsive teaching.
Instructions: Please register for the event using the following link: https://www.viethconsulting.com/members/evr/regmenu.php?orgcode=NERA. Further instructions will be provided upon registration.
NERA Webinar Series 2015
Past NERA Webinars
Measuring Student Learning Outcomes in Higher Education: Current State, Research Considerations, and an Example of Next Generation Assessment
Tuesday, February 24, 2015 from 3PM-4PM EST
Presented by Ou Lydia Liu, Educational Testing Service and Katrina Roohr, Educational Testing Service & NERA conference co-chair
Influences and pressures from statewide governing boards, state mandates, regional and program accreditors, and a greater drive for accountability have resulted in an increase in the measurement of student learning outcomes (SLOs), or competencies, across United States colleges and universities (Kuh, Jankowski, Ikenberry, & Kinzie, 2014; Richman & Ariovich, 2013; Toiv, 2013). Institutions use a variety of tools to assess SLOs, such as national and locally developed surveys, rubrics, performance assessments, e-portfolios, and standardized measures. Despite the increased use of these measures across higher education institutions, a number of challenges remain in SLO assessment, such as insufficient predictive evidence, design and methodological issues with value-added research, and concerns about the effect of student motivation on test performance (Liu, 2011). The purpose of this presentation is to provide an introduction to SLO assessment, discussing its current state and the challenges in both implementation and use. We focus our discussion on research that has been conducted on student motivation. We also discuss the importance of developing next-generation SLO assessments and provide a working example of a next-generation assessment in quantitative literacy.
A 2013-2014 Presidential Initiative by NERA President John Young
Coordinated and Hosted by NERA Presidential Advisor on Special Projects Steven Holtzman
Modeling Item Response Profiles Using Factor Models, Latent Class Models, and Latent Variable Hybrids
Wednesday, August 13, 2014 from 4PM-5PM EDT
Presented by Dena Pastor, James Madison University
There are a wide variety of latent variable models, both old and new, appropriate for modeling the relationships among observed variables. More traditional models include those using only continuous latent variables (e.g., factor models, item response theory models) or categorical latent variables (e.g., latent class analysis). More recently introduced models, known as latent variable hybrid models (Muthén, 2008), incorporate both continuous and categorical latent variables (e.g., factor mixture models or mixture factor models). To enhance the understanding of the hybrid models, Muthén (2008) provided a framework that highlights how the hybrid models are similar to and different from one another and from their traditional counterparts. This presentation will use Muthén’s framework to illustrate the model-implied item response profiles (IRPs) that correspond to these latent variable models when used with dichotomous item response data. The IRPs associated with the more traditional latent class and factor models are presented along with the IRPs of the newer hybrid models. A graphical representation of item response profiles is first described, followed by a discussion of the ways in which IRPs can vary across examinees and the latent variable models best suited to capture certain forms of variation. To illustrate the use of these models in practice, an exploratory model selection approach is used to determine which of the various models best represents college students’ responses to items assessing their knowledge of ethics and generalizability in psychosocial research.

The Promise of Learning Progressions for Identifying Pathways for Student Success
Wednesday, June 18, 2014 from 4PM-5PM EDT
Presented by NERA Secretary Jennifer Kobrin, Pearson
In recent years, learning progressions have captured the interest of educators and policy makers. As defined by the National Research Council (NRC, 2001), learning progressions are “descriptions of the successively more sophisticated ways of thinking about a topic that can follow one another as children learn about and investigate a topic over a broad span of time” (pp. 165-166). In this webinar, I will define and provide examples of learning progressions, and discuss the process for collecting validity evidence to support them. Then, I will describe how learning progressions can be used in formative assessment and to provide feedback to teachers. I will end by talking about current challenges and future research directions to realize the promise of learning progressions for identifying pathways for student success.
National Research Council [NRC] (2001). Knowing what students know: The science and design of educational assessment (J. Pellegrino, N. Chudowsky, & R. Glaser, Eds.). Committee on the Foundations of Assessment, Board on Testing and Assessment, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.
To learn more about Pearson's learning progression research, please visit Jennifer's blog.
Taking the Mystery Out of Qualitative Data Analysis
Wednesday, April 23, 2014 from 4PM-5PM EDT
Presented by Felice Billups, Johnson and Wales University
Have you decided to conduct a qualitative study involving interviews, focus groups, observations, artifact/document reviews, or some combination of these data collection strategies? Or have you already collected your qualitative data and are unsure how to analyze pages and pages of transcriptions or notes, or where to begin? Just as there are numerous statistical tests to run for quantitative data, there are a variety of options for qualitative data analysis. This overview is designed to give beginning qualitative researchers the tools to apply data analysis strategies appropriate to each of the various qualitative research designs.
Expanding Education’s Predictors and Criteria: The Research and Assessment of 21st Century and Noncognitive Skills
Friday, February 28, 2014 from 3PM-4PM EST
Presented by NERA 2014 Conference Co-chair Ross Markle, Educational Testing Service
Over the past two decades, a great deal of educational research and practice has expanded our notions of what students should learn and our understanding of the factors that influence learning. This expanded set of skills, behaviors, and attitudes goes by many names (e.g., psychosocial skills, noncognitive factors, 21st century skills) and contains many constructs (e.g., motivation, time management, interpersonal skills, self-regulation). This webinar will provide an overview of several key issues in the research and assessment of noncognitive skills. First, we will discuss research into the role of these factors, focusing on their relevance to other important student outcomes. Second, we will examine several innovations in the assessment of these skills that can improve or move beyond self-report methods. Finally, we will look at various ways that noncognitive factors have impacted educational practice, including effective student intervention and models of student learning outcomes.
Applying for 2014 Summer Internships
Wednesday, December 4, 2013 from 5PM-6PM EST
Presented by NERA President John Young, Educational Testing Service
The first-ever NERA webinar took place on Wednesday, December 4th from 5:00 to 6:00 PM EST and focused on applying for 2014 summer internships. It was led by NERA’s President, Dr. John Young of Educational Testing Service (ETS), who had been involved in the intern selection process at ETS for the past several years. Information was provided on the summer internship programs at ETS and other organizations, and there was time for Q&A.