Here is a list of our invited speakers:
1. Preston Green, Professor of Education, University of Connecticut
2. Ronald Hambleton, Professor, University of Massachusetts Amherst
3. Joanna Gorin, Vice President of Research, Educational Testing Service
Wednesday, October 21, 2015, 3:15pm – 4:45pm
Preston Green, John and Carla Klein Professor of Urban Education, University of Connecticut, Neag School of Education
Public Funding, Private Rules: How Charter Schools Have Taken Advantage of Their Hybrid Characteristics
Since 1991, forty-two states and the District of Columbia have enacted legislation for charter schools. While charter schools are generally characterized as “public schools,” they are in reality hybrid institutions that exhibit both public and private characteristics. This presentation discusses how charter schools have used their public characteristics to qualify for public funding under state constitutional law, while highlighting their private characteristics to exempt themselves from other laws that apply to public schools. It concludes by discussing how legislatures can design their charter school laws to provide greater protection to the public and to students.
Preston Green is the John and Carla Klein Professor of Urban Education at the University of Connecticut’s Neag School of Education. He is also a professor of educational leadership and law at the University of Connecticut. Before coming to the University of Connecticut, he was the Harry Lawrence Batschelet II Chair Professor of Educational Administration at Penn State, where he was also a professor of education and law and the program coordinator of Penn State’s educational leadership program. In addition, Dr. Green was the creator of Penn State’s joint degree program in law and education. Further, he ran the Law and Education Institute at Penn State, a professional development program that teaches teachers, administrators, and attorneys about educational law. At the University of Massachusetts, Dr. Green was an associate professor of education. He also served as the program coordinator of educational administration and Assistant Dean of Pre-Major Advising Services. Dr. Green has written four books and numerous articles and book chapters pertaining to educational law. He primarily focuses on the legal and policy issues pertaining to educational access and school choice.
Thursday, October 22, 2015, 4:00pm – 5:30pm
Ronald K. Hambleton, Professor, University of Massachusetts Amherst
Invited Interview (Interviewer: Daniel Jurich, National Board on Medical Examiners)
There are few topics in psychometrics that occupy the space between policy and operational testing in such an interesting, challenging, and consequential way as equating and score comparability. Technical and methodological choices can have an enormous impact on results, and sometimes the context for equating brings in considerations that are well outside the textbook procedures for ensuring test and form equivalence. Professor Ronald K. Hambleton has over twenty years’ worth of hands-on experience in this area within the context of the Massachusetts K-12 assessment system and other state testing programs such as those in Alaska, Delaware, New Jersey, Pennsylvania, and Virginia, and for several years he has had a front-row seat for this topic at the national level in his role as a member of the PARCC Technical Advisory Committee. In this featured interview, Professor Hambleton will share his accumulated understanding and insights on large-scale equating in the context of both state testing and PARCC, with a special focus on technical problems, policy implications, and research opportunities.
Friday, October 23, 2015, 11:30am – 1:00pm
Joanna Gorin, Vice President of Research, Educational Testing Service
Next Generation Performance Assessment for Improved Assessment and Learning
As education standards have increasingly emphasized higher-order cognitive processes, assessment developers have responded by including more short and extended constructed-response items, most notably essays, on their high-stakes assessments. In terms of both face validity and construct validity, the use of performance tasks is appealing because they can be more closely aligned with the targeted reasoning and higher-order cognitive skills. However, while performance assessments overcome several of the limitations of traditional item types, they are not without costs. Scoring cost and reliability, performance generalizability and person-by-task interactions, and logistical and financial constraints are challenges the educational testing community has faced as large-scale performance assessments have become more common on student and teacher assessments.
In this talk I will discuss the opportunities and challenges facing next-generation performance assessments, emphasizing in particular what is new now that may have been insurmountable in past performance assessment movements. I will focus on assessment capabilities that leverage emerging technologies to afford a wider range of examinee interactions, assessment contexts, and real-time process data to support inferences and decisions about the complex skills stakeholders want to measure. I will also review the rigorous research that is needed to address a set of common questions including fairness and bias, generalizability, scoring and scaling, construct representation, utility and validity, data structure and processing, psychometric and statistical modeling, and reporting and validation. Finally, I will provide specific examples of simulation and games-based assessment in a discussion of strategies for building next generation performance assessments and associated validity arguments.
As Vice President of Research at ETS, Joanna Gorin is responsible for a comprehensive research agenda to support current and future educational assessments for K–12, higher education, global, and workforce settings. She manages a portfolio of internally funded research, including agendas on next generation assessment, cognitive and learning science, and validity and fairness for learners and teachers globally. Joanna's own research has focused on the integration of cognitive theory and psychometric theory as applied to principled assessment design and analysis. Her work has included studies of item difficulty, cognitive complexity, and automated item generation for assessments of quantitative reasoning, spatial reasoning, and verbal reasoning. With a publication record that includes articles in top-tier educational assessment journals and chapters in numerous edited volumes, her recent publications have focused on the role of cognitive and psychometric models, methods, and tools to support improved measurement of complex competencies, including literacy and the Next Generation Science Standards. Gorin received her Ph.D. in Quantitative Psychology (minor: cognitive psychology) from the University of Kansas in 2002; her M.A. in Educational Psychology (major: quantitative methods, minor: learning) from the University of Texas at Austin in 1999; and her B.A. in Psychology from the University of California at Los Angeles in 1995. She is the 2007 recipient of NCME's Jason Millman Promising Measurement Scholar award and founding president of AERA's Cognition and Assessment Special Interest Group.