Sep 8, 2021
9:00 AM - 12:00 PM
|
Pre-Conference Workshops
LINGUISTIC DISCRIMINATION AND THE ORAL ABILITY - IS YOUR RUBRIC PRODUCING INEQUITY?
Poppy Slocum, Ph.D - LaGuardia Community College & Carlos de Cuba, Ph.D - Kingsborough Community College Zoom Room 1
Oral communication skills are a component of many colleges’ and universities’ assessment plans for good reason - they are listed in the top 10 attributes sought by employers in a 2019 report by the National Association of Colleges and Employers. Rubrics for assessing these skills, however, often allow for linguistic discrimination and prejudice to affect the scoring. They often ask the reader to evaluate the speaker’s pronunciation, articulation, and/or grammar, with an unspoken assumption that there is only one language that is “appropriate” or “effective” in speaking contexts, the “standard” language. Mirroring the general public, higher education is steeped in Standard Language Ideology, which is the belief that there is a “standard” variety of a language that is superior to other varieties of that language (Lippi-Green 2012). This belief has been widely debunked by linguists and educators. Linguistic discrimination arises from a system of myths about language and grammar and a lack of awareness of the biases many of us have against speakers of so-called “nonstandard” varieties of language. These biases can have adverse effects on speakers of these varieties of English, including lower grades, lower expectations from teachers, and disengagement from school. At the end of this workshop, participants will be more aware of dialectal variation and understand that all languages and dialects are equally logically valid. They will gain an understanding of the damage linguistic discrimination does to students, and how assessment rubrics can perpetuate this damage. Participants will examine how these rubrics can reinforce linguistic discrimination and fail to fairly or accurately assess student speech, and gain practice writing oral rubrics that assess effective communication without unfairly penalizing students for their linguistic variety.
Attendees of this workshop will be able to:
Understand the illogical, racist, and classist nature of Standard Language Ideology.
Gain the tools to remove Standard Language Ideology from their Oral Communication assessment rubrics.
THE UNTOLD STORY OF ASSESSMENT: CREATING A CULTURE THAT WORKS FOR YOU
Chadia Abras, Ph.D & Janet Schreck Ph.D - Johns Hopkins University Zoom Room 2
This workshop is relevant, especially in today’s environment, as assessment is becoming a key to the success of an institution in delivering effective learning, demonstrating student engagement and growth throughout the learning journey, and establishing the value proposition for higher education. Accrediting bodies are looking for direct and multi-measure evidence of how formative and summative assessments can improve attainment of student learning outcomes in their journey at the institution and beyond. This workshop is designed to guide participants in creating a culture of assessment at the university level. Presenters will discuss approaches that target various levels of the university ecosystem, including university administration, deans, faculty, student affairs staff, students, and others. The workshop will also highlight how the university implemented and onboarded a new assessment management system in a holistic manner across 9 schools and divisions that are decentralized and operate as semi-independent entities. Lastly, the audience will be invited to share experiences with the presenters and will benefit from feedback given by the presenters and the audience on how to maximize and improve their processes. Attendees will acquire knowledge and skills in creating structures and processes designed to rally stakeholders around the concept of assessment as a tool for improvement and not just for accreditation. The audience will acquire a toolkit for identifying and engaging appropriate stakeholders and for understanding what is needed to maintain their engagement. The audience will learn how to gain buy-in from stakeholders, create effective partnerships, and promote ownership of assessment practices. Attendees will also learn how to utilize technology as a tool to promote culture change.
Attendees of this workshop will be able to:
Involve stakeholders across multiple schools to create an effective culture of assessment.
Engage stakeholders from various levels of the university ecosystem, including university administration, deans, faculty, student affairs staff, students and others, to gain their buy-in into assessment and eventually create a culture of ownership of assessment practices.
Implement an assessment management system across multiple divisions and schools in a decentralized environment.
ASSESSMENT 101: A PRACTICAL GUIDE FOR NON-ACADEMIC UNIT ASSESSMENT
Nasrin Fatima, Ph.D - Binghamton University Zoom Room 3
Although student learning is directly affected by instruction in the classroom, it is indirectly affected by the processes, services, and resources of the operational/administrative support units of an institution. Because these units have great impact on the environment and tools of the classroom, the goals/objectives of these units must be assessed on a continuous basis. In addition, the MSCHE Standards of Accreditation require an institution to continuously assess and improve its programs and services (Standard VI). However, many institutions struggle to create a culture of administrative unit assessment that is systematic, organized, and sustainable. Misconceptions about what assessment is and how administrative unit assessment is relevant to the overarching institutional mission, goals, and objectives are the principal barriers. This workshop will provide step-by-step guidelines to create a framework for developing and successfully implementing an organized, systematic, and sustainable non-academic unit assessment process geared to improve institutional effectiveness. The workshop will provide an overview of the administrative/operational unit assessment process, why these units need to conduct assessment, and the characteristics of effective administrative unit-level assessment. The presenter will describe the ten-step assessment framework that was developed and used to implement administrative unit-level assessment at Binghamton. The presenter will also provide tips and techniques for writing outcomes-based administrative/operational unit-level assessment plans, as well as how to define and link strategic outcomes and operational outcomes. Attendees will learn how to convert performance measures to outcomes to create a sustainable administrative/operational assessment culture that leads to continuous improvement. Additionally, the presenter will describe how institutional context, culture, and the roles of senior administrative leaders impact the effectiveness of the assessment implementation process. Finally, the presentation will focus on the challenges confronted and lessons learned: the presenter will provide an overview of the challenges encountered during the implementation process and the strategies used to overcome them. In addition, the workshop has two exercises embedded in the materials in which attendees will participate. The concluding Q&A section encourages attendees to engage in active discussion.
Attendees of this workshop will be able to:
Increase their knowledge about the outcomes assessment process.
Distinguish between learning outcomes and operational outcomes.
Write measurable and meaningful learning/operational outcomes and implement them in non-academic unit-level assessment.
|
Sep 8, 2021
10:00 AM - 12:00 PM
|
AALHE Assessment Institute Session 1
AALHE Assessment Institute Session 1 of 3
Jane Marie Souza, PhD - University of Rochester & Catherine M. Wehlburg PhD - Marymount University Zoom Room
Dr. Jane Marie Souza and Dr. Catherine Wehlburg, both past presidents of the Association for the Assessment of Learning in Higher Education (AALHE), will lead this workshop-style institute. These facilitators will bring a mix of theory and practice along with an engaging and participatory mix of information, practice, feedback, and skill-building. Participants will leave this institute with a solid foundation in the assessment of student learning, multiple resources, and a network of colleagues from across the country. Using their experiences at the course, program, institution, and national levels, the facilitators will foster lively conversations about what has worked, what hasn’t worked, and how higher education can best focus on improving and enhancing the quality of student learning at our institutions. This 6-hour workshop is intended for anyone who would benefit from a comprehensive review of assessment concepts, beginning with the basics. This is a wonderful opportunity to address knowledge gaps in your assessment education. The concepts will be introduced and immediately followed by learning activities and discussion. Topics include, but are not limited to: defining assessment and evaluation, direct and indirect measures, formative and summative assessments, qualitative and quantitative measures, rubric development and use, reliability and validity, goals and objectives, test development, assessing reflection papers, and making good use of assessment data.
By creating a network, participants will have access to each other, the facilitators, and many other resources long after the end of the program. Recognizing that each institution has a different mission and culture, this Institute will provide a framework for ways to better understand how to use information and data to inform decision making. The facilitators will work to use examples from many different types of institutions and will encourage dialogue among all participants in order to model good practices for determining how, when, and why to use assessment. Participants will leave with handouts of all slides, case studies, and templates. In addition, references, lists and other resources will be shared during the session and in communications following the institute. Institutions are encouraged to send more than one person to this Institute, but all participants will benefit from making new connections for future communications and shared resources.
|
Sep 8, 2021
11:00 AM - 12:00 PM
|
Networking Session
AEFIS Speed Networking
AEFIS Academy Link to the event: https://www.aefisacademy.org/community-event/aefis-speed-networking-drexel2021/
Speed Networking is an opportunity to gather and learn about AEFIS Solutions and best practices in assessment from VIPs (very important partners) as expert users and AEFIS Team Members. Join us for an exciting opportunity to experience AEFIS through the lens of academic partners and learn how to employ best practices to support initiatives at your institution.
|
Sep 8, 2021
12:45 PM - 2:00 PM
|
Welcome and Opening Plenary
Opening Remarks
Joseph M Hawk - Drexel Assessment Conference Co-Chair
Opening Plenary
COL(R) Gerald C. Kobylski, Ph.D., M.B.A., P.E. - US Military Academy
|
Sep 8, 2021
2:00 PM - 4:00 PM
|
AALHE Assessment Institute Session 2
AALHE Assessment Institute Session 2 of 3
Jane Marie Souza, PhD - University of Rochester & Catherine M. Wehlburg PhD - Marymount University Zoom
Dr. Jane Marie Souza and Dr. Catherine Wehlburg, both past presidents of the Association for the Assessment of Learning in Higher Education (AALHE), will lead this workshop-style institute. These facilitators will bring a mix of theory and practice along with an engaging and participatory mix of information, practice, feedback, and skill-building. Participants will leave this institute with a solid foundation in the assessment of student learning, multiple resources, and a network of colleagues from across the country. Using their experiences at the course, program, institution, and national levels, the facilitators will foster lively conversations about what has worked, what hasn’t worked, and how higher education can best focus on improving and enhancing the quality of student learning at our institutions. This 6-hour workshop is intended for anyone who would benefit from a comprehensive review of assessment concepts, beginning with the basics. This is a wonderful opportunity to address knowledge gaps in your assessment education. The concepts will be introduced and immediately followed by learning activities and discussion. Topics include, but are not limited to: defining assessment and evaluation, direct and indirect measures, formative and summative assessments, qualitative and quantitative measures, rubric development and use, reliability and validity, goals and objectives, test development, assessing reflection papers, and making good use of assessment data.
By creating a network, participants will have access to each other, the facilitators, and many other resources long after the end of the program. Recognizing that each institution has a different mission and culture, this Institute will provide a framework for ways to better understand how to use information and data to inform decision making. The facilitators will work to use examples from many different types of institutions and will encourage dialogue among all participants in order to model good practices for determining how, when, and why to use assessment. Participants will leave with handouts of all slides, case studies, and templates. In addition, references, lists and other resources will be shared during the session and in communications following the institute. Institutions are encouraged to send more than one person to this Institute, but all participants will benefit from making new connections for future communications and shared resources.
|
Sep 8, 2021
2:15 PM - 3:15 PM
|
Concurrent Session 1
Using Universal, Multi-Disciplinary Competency Rubrics to Efficiently Collect Data and Satisfy Multiple Accreditors’ Requirements
Ricki Kaplan, Ph.D & Dr. Karen Ann Tarnoff - East Tennessee State University Zoom Room 1
Many schools address the requirements of their multiple accreditors by designing a separate assessment process for each. This session will share a simple approach to creating universal competency rubrics that allow data collection from multiple disciplines and satisfy the requirements of multiple accreditors. Universal competency rubrics, created by multidisciplinary faculty groups, eliminate the need for inefficient, redundant data collection while providing high quality assessment data to drive student-oriented improvement efforts. Universal rubrics allow for the aggregation or disaggregation of data at multiple levels to enhance the impact of implemented improvements. Utilizing universal rubrics, while increasing the efficiency and ease of data collection, also benefits faculty by providing a standardized set of performance expectations to students. Further, they help properly focus faculty on the improvement process by simplifying compliance with the requirements of multiple accreditors.
Learning Outcomes:
Participants will gain an understanding of how to develop multidisciplinary learning objectives/outcomes and utilize universal rubrics to collect data and drive impactful student-oriented improvements.
Participants will gain an understanding of how to structure a single assessment process to easily meet the requirements of multiple accreditors while efficiently using faculty time and effort
Equity in Assessment, Assessment for Equity: Dismantling the Master's House
Dr. Joel Bloom - Hunter College, CUNY Zoom Room 2
Audre Lorde famously said "The master's tools will never dismantle the master's house." In this session, I will host an interactive discussion of ways in which the very tools we use to assess student learning are themselves racist, and discuss more equitable options. Next, we will focus on ways we can use assessment analysis and reporting to further more equitable outcomes. We know American higher education is built on a racist foundation; we also know many of the assessment tools currently in use frequently have racist outcomes, even where there was no racist intent. By discussing/examining/brainstorming alternatives to these flawed tools, as well as ways in which we can conduct analyses and reporting in a more equitable manner, we can hopefully steer our campus discussions in an anti-racist direction. Many assessment professionals have been trained to believe that the methods we use are neutral when it comes to racial equity. Unfortunately, decades of research have shown that many tools do indeed discriminate -- from "objective" multiple choice exams to the ways we write learning outcomes and design rubrics. Assessment professionals would benefit from considering issues of equity in every aspect of our work.
Learning Outcomes:
Participants will understand ways in which certain facially neutral assessment tools still discriminate on the basis of race & ethnicity; and think about how to design more equitable tools.
Participants will learn ways in which we can use assessment results -- our analysis and reporting -- to expand equity on our campuses.
Learning, Limited - How Bad Faith Assessment Lowered Academic Achievement and Increased Inequality in America
Michael Seelig - Medgar Evers College CUNY Zoom Room 3
In a time when America is struggling with its role and identity as a democracy, it is imperative that education leaders understand the neoliberal influences and the effortful disenfranchisement of select Americans embedded in the DNA of American education, and how to build strategies to increase equality and democracy. The modern education system was not designed to serve democracy, but rather to preserve 20th century power dynamics through selective disenfranchisement. Bad faith assessment has been at the core of this effort. This presentation provides an overview of key policy changes that have enabled this behavior and how to fix them. Assessment professionals are faced with the challenge of setting and helping institutions meet goals and outcomes, but those short-term outcomes often come at the expense of broader philosophical goals for society. Understanding the broader neoliberal philosophy will inform our understanding of how best to guide our institutions forward.
Learning Outcomes:
Participants will be able to identify instructional design principles and assessment technologies that can be adapted to professional learning for faculty and staff at their institution to empower their contribution to continuous improvement and change.
Participants will be able to examine principles of adult-learning theory combined with faculty feedback to design a process for continuous improvement institution-wide.
Addressing Missing Links in Interprofessional Education and Assessment
Dr. Melissa Vitek - Salus University Zoom Room 4
Salus University offers an Interprofessional Evidence Based Practice (IPEBP) required course for first-year graduate students that includes the Interprofessional Education Collaborative (IPEC) core competencies. According to the IPEC learning continuum, this is an “exposure” activity. The identified need for curricular continuity led to the development of an “immersion” interprofessional activity. Developing and implementing learning activities that are effective and sustainable require strong assessment practices. Assessment tools beyond attitudes and perceptions will be introduced, appraised and applied to sample activities. Attendees will have the opportunity to discuss applications to their specific student populations along the full IPEC learning continuum. The IPEC-defined core competencies and learning continuum provide a roadmap for educators to develop and implement effective educational activities that align with their learners’ curricular and professional development. The use of effective and timely assessment tools is essential in informing pedagogical approaches, measuring benchmarks and making strategic, evidence-based curricular modifications.
Learning Outcomes:
Participants will be able to define the interprofessional education collaborative (IPEC) core competencies and the interprofessional learning continuum.
Participants will be able to design an immersion interprofessional activity and discuss strategies to effectively assess the outcomes of that activity.
Developing Effective Evaluation Tools to Promote Feedback and Discussion Beyond the Classroom
Dr. Dana Scott - Thomas Jefferson University, East Falls Campus Zoom Room 5
This interactive workshop will explore a variety of evaluation tools covering Instructor Feedback, Peer-to-Peer Feedback, and Self-Reflection, that can be used across platforms and beyond the classroom. Examples of how to bring quantifying elements to these methods will be given to allow for both quality improvement and data collection. Rubric styles and their appropriate uses will be examined for both instructor and peer-to-peer evaluation. The integration of self-reflection into these tools will be discussed, as well as separate methods to bring elements of metacognition to the evaluation process. Participants will engage in discussions on a variety of methods for feedback, be provided with sample instruments and invited to interact with digital platforms to create a framework for effective feedback beyond a face-to-face environment.
Learning Outcomes:
Participants will be able to expand understanding of the use of rubrics and examine a variety of instruments and digital platforms as effective tools for feedback and evaluation.
Participants will be able to identify methods to quantify the evaluation process for both quality improvement and data collection.
|
Sep 8, 2021
3:30 PM - 4:30 PM
|
Concurrent Session 2
Building And Institutionalizing Sustainable Self-Study Practices During A Pandemic
Dr. Kate Wolfe & Dr. Nelson Nunez Rodriguez & Sarah Brennan - Hostos Community College: City University of New York Zoom Room 1
The two self-study co-chairs and one working group co-chair (Standard I) will discuss how a sustainable, structured, and engaging process of reaffirming college accreditation navigated the unexpected pandemic disruption and capitalized on the new landscape to uncover new processes and institutionalize lessons learned while keeping to the initial self-study timeline. Planning processes created for the new scenario that emerged during the pandemic, and processes that were reimagined to maintain the self-study, are discussed. Attendees will discuss and brainstorm ways to maneuver around unexpected challenges and barriers to this effort. We will provide strategies and tools for institutions navigating or preparing to navigate the self-study process, such as creating a timeline for planning and meetings, a SharePoint site as an evidence repository, regular steering committee and working group meetings, and having one faculty and one staff member as working group co-chairs.
Learning Outcomes:
Participants will be able to create evaluation and operational processes that remain in place beyond the accreditation process.
Participants will be able to develop strategies to address Self-Study findings during the accreditation period.
Amplifying diverse voices in assessment: strategies, pitfalls, and benefits
Dr. Susan Donat - Messiah University Zoom Room 2
This session explores the mutual benefits of partnerships which amplify campus voices. Participants share strategies for cultivating productive partnerships. In addition to exploring the difference between echoing and amplifying voices, we will discuss strategies for amplifying diverse voices on campus to promote student learning. Partnering is an essential aspect of assessment work, yet earning faculty trust can be a challenge for assessment professionals. Awareness of power differentials helps us to identify the voices and ideas that would benefit the campus community but may not be heard within their departments. A critical element in improving student learning is discussing learning data and developing improvement plans. Research suggests that amplifying colleagues’ voices benefits the status of the original speaker, the person who amplifies their voice, and the organizational goal (Bain et al., 2021).
Learning Outcomes:
Participants will be able to discuss and connect current research and theory on voice and caring to assessment practices.
Participants will be able to brainstorm strategies to empower diverse campus voices in efforts to improve student learning.
No One’s Buy-in: Don’t Sell Assessment; Build Trust Instead
Dr. Patricia Coughlan & Jeff Bonfield - Rowan University Zoom Room 3
Faculty are more likely to conduct meaningful assessment if they trust the people responsible for assessment. Assessment professionals should earn faculty’s trust by demonstrably trusting faculty first. Trust is shown in word and deed, of course, but it can also be shown in the systems and practices of assessment. If program assessment consists of faculty completing reports that are all form with no substance, it will accomplish nothing. Adopting a multi-framed perspective and identifying peers who promote a growth mindset expand faculty investment in assessment and improve the usefulness of that assessment. By utilizing the strategies explored in this session, participants can create a university assessment culture that transitions from compliance to trust, improves timely and thoughtful participation, and strengthens connections between institutional expectations and faculty practice. This leads to substantive data collection for decision making, program improvement, and ultimately improved student learning.
Learning Outcomes:
Participants will be able to analyze their current systematic assessment tools and faculty participation to identify trust barriers that negatively impact substantive assessment and assessment culture.
Participants will examine how a multi-framed perspective results in new insights and strategies that can be implemented to develop trust, increase substantive data collection, and grow a positive assessment culture.
Backstage Tips for Front-of-House Speaking: Conveying Textual Meaning with Expressiveness
Dr. Nancy Bandiera - LaGuardia Community College & Cheyenne Seymour - Bronx Community College & Patricia Sokolski - LaGuardia Community College Zoom Room 4
Instructors of all disciplines can learn strategies to help students gain confidence as speakers. Easy-to-replicate exercises demystify the teaching of oral communication. Assessing oral communication without falling into the reductive teaching of “General/Standard American English” is delicate. The presenters will offer techniques and a rubric that participants will take home. Public speaking can be a source of anxiety, but students succeed academically when they confidently deliver oral presentations. Effective speaking skills also help students professionally (Murphy, 2014; Meertins et al., 2021). Employers value candidates with these soft skills, especially as telecommunication and videoconferencing become more widespread in the workplace. At the end of the session, attendees will have participated in hands-on exercises that they will be able to replicate and assess in their classrooms to improve students’ confidence and delivery skills.
Learning Outcomes:
Participants will be able to help their students improve confidence and delivery skills
Participants will be able to assess their students' oral communication
Embedding Project-Based Instruction, UDL, and 5E into Course Design as a Means of Assessment
Dr. Carlos Guevara & Jacqueline M. DiSanto & Meg Ray - Hostos Community College - CUNY, The City University of New York Zoom Room 5
The relationship between learning and assessment drove the redesign of a course in educational technology. Project-based assessment, Universal Design for Learning (UDL), and the 5E Model for Instruction serve as both pedagogy and scaffolded measurement. Assessment is continuous through each student's autonomous application of theory, with an emphasis on student-centered learning. What do UDL and 5E look like as assessment? Differentiated instruction through student-designed projects that demonstrate practical application of course content is intended to support academic success by serving as both content and assessment. This session provides an opportunity for participants to reconsider assessment design to better include students in its design and implementation. It also shares a real-life example of modeling theory within a course so that students gain both knowledge and practical experience.
Learning Outcomes:
Participants will be able to consider scaffolded student-designed projects as a continuous assessment tool.
Participants will be able to discuss using universal design for learning as pedagogy and how the 5E Model of Instruction promotes student-centered learning.
|
Sep 9, 2021
9:00 AM - 11:00 AM
|
AALHE Assessment Institute Session 3
AALHE Assessment Institute Session 3 of 3
Jane Marie Souza, PhD - University of Rochester & Catherine M. Wehlburg PhD - Marymount University Zoom Room
Dr. Jane Marie Souza and Dr. Catherine Wehlburg, both past presidents of the Association for the Assessment of Learning in Higher Education (AALHE), will lead this workshop-style institute. These facilitators will bring a mix of theory and practice along with an engaging and participatory mix of information, practice, feedback, and skill-building. Participants will leave this institute with a solid foundation in the assessment of student learning, multiple resources, and a network of colleagues from across the country. Using their experiences at the course, program, institution, and national levels, the facilitators will foster lively conversations about what has worked, what hasn’t worked, and how higher education can best focus on improving and enhancing the quality of student learning at our institutions. This 6-hour workshop is intended for anyone who would benefit from a comprehensive review of assessment concepts, beginning with the basics. This is a wonderful opportunity to address knowledge gaps in your assessment education. The concepts will be introduced and immediately followed by learning activities and discussion. Topics include, but are not limited to: defining assessment and evaluation, direct and indirect measures, formative and summative assessments, qualitative and quantitative measures, rubric development and use, reliability and validity, goals and objectives, test development, assessing reflection papers, and making good use of assessment data.
By creating a network, participants will have access to each other, the facilitators, and many other resources long after the end of the program. Recognizing that each institution has a different mission and culture, this Institute will provide a framework for ways to better understand how to use information and data to inform decision making. The facilitators will work to use examples from many different types of institutions and will encourage dialogue among all participants in order to model good practices for determining how, when, and why to use assessment. Participants will leave with handouts of all slides, case studies, and templates. In addition, references, lists and other resources will be shared during the session and in communications following the institute. Institutions are encouraged to send more than one person to this Institute, but all participants will benefit from making new connections for future communications and shared resources.
|
Sep 9, 2021
9:30 AM - 10:30 AM
|
Concurrent Session 3
Unusual Artifacts to Uncover to Assess Institutional Student Learning Goals
Dr. Sr. Janet Thiel - Georgian Court University Zoom Room 1
The opportunity to include direct and indirect assessment artifacts that align with institutional learning goals at both the undergraduate and graduate levels offers an overview of learning success that expands the usual boundaries of graduation and retention rates. This session will explore the areas often overlooked as providing solid evidence of achievement of student learning goals and outcomes. Participants will be given examples of artifacts and data to explore to validate student learning within stated institutional goals and outcomes. Attendees will be able to use this opportunity to dig through their own institution’s data troves and treasures to apply some uncovered gems to their assessment reports.
Learning Outcomes:
Participants will view available assessment data with an expanded viewpoint upon return to campus.
Participants will be aware of aligning various data sources to institutional goals and outcomes.
Participants will be able to train department chairs and program directors to retrieve various artifacts that may expand assessment data not currently captured by direct and indirect evidence.
Let’s talk about it: Creating and measuring the work of a systemic justice taskforce
Dr. Jasmine Tenpa-Lama, Shoshana Sicks, Niekan Ukanna, Brooke Salzman, Amber King, Pamela Gassman, Quadira McPherson, Richard Hass - Thomas Jefferson University Zoom Room 2
In spring 2020, a University Center formed a Racial and Social Justice Taskforce (RSJT) - comprised of staff and faculty members - committed to expanding conversation around systemic justice issues and implementing change in individual/team practices. This session will focus on the center’s experiences, challenges, lessons learned, and future plans. Systemic change is difficult to make and measure, and systemic injustice conversations can be difficult to start and navigate. However, developing thoughtful processes and intentional change requires them. Participants will receive resources, lessons learned, and strategies on how to successfully build and measure more informed, inclusive, and socially aware teams.
Learning Outcomes:
Participants will be able to increase their knowledge and awareness of social and racial justice issues
Participants will be able to gain the skills to start conversations/taskforces around systemic injustice
Mirror, Mirror on the Wall, What Course Assessment Tells Us All
Dr. Karen LaPlant & Zala Fashant - Metropolitan State University Zoom Room 3
This presentation will provide a Cycle of Course Assessments that develops effective teaching and creates significant student learning. Participants will analyze ways they reflect on their current practice, improve courses to provide a deeper measurement of student learning, and increase their teaching ability to meet student needs. With greater levels of accountability for learning across multiple deliveries of a course, participants need to closely examine whether they are delivering what they state they are. During the pandemic, course changes took place quickly, perhaps chaotically. By analyzing course assessment, proper alignment between needs and outcomes can be re-achieved. This session offers reflective assessment strategies to provide a deeper understanding of the ways designed assessments offer insights into student learning, outcome mastery, program assessment, and teaching and learning improvement. Participants will take away a cycle of course assessments to examine their current analysis of how students are progressing.
Learning Outcomes:
Participants will be able to analyze their current course evaluation practices
Participants will be able to evaluate the model to gain a broader perspective of course assignments.
Rubric Design Challenges of Assessing Writing across the Disciplines
Dr. Jane Detweiler - University of Nevada, Reno Zoom Room 4
Academic literacy instruction is difficult to integrate into—and assess in—degree programs. This session focuses on designing authentic, efficient rubrics for first-pass programmatic inquiry. Participants will examine sample communication-centered SLOs, operationalizing for assessment of disciplinary literacy practices, and building a rubric item and scoring scale for a first-pass evaluation. Articulating and operationalizing disciplinary literacy practices for assessment is a thorny challenge faced in assessing communication. This discussion engages the thorniest questions: How should communication be evaluated? Where are critical assignments, in which courses? How can first-pass rubric items identify problem areas for closer examination in a more in-depth assessment? This presentation draws on recent writing assessment research and on practical experience of developing a program assessment “two-step”: a rubric for first-pass, overarching study of communication across the disciplines, which degree program faculty can use to efficiently home in on problem areas with a finer-tuned, discipline-specific rubric.
Learning Outcomes:
Participants will be able to build or revise a communication-related rubric item to use for "first pass" assessment.
Participants will be able to build or revise a communication-related rubric item to use for in-depth assessment.
Artificial Assessment: An Exploration of Tools to Help Your Assessment Process
Mark Green - Holy Family University Zoom Room 5
The transition to online and blended learning this last year has resulted in everyone becoming experts in teaching through web conferencing tools like Zoom, Big Blue Button, and WebEx. In this session, participants will learn about a tool called Otter.AI to see how it can help with both assessment and students with accommodation requests. This review is focused on improving learning through web conferencing. Participants will learn how to leverage this tool to enhance data from student responses. Participants are asked to share other tools and software they found useful during the transition to remote learning. This session will provide an easy way to use Otter.AI to transcribe Zoom or other recorded sessions and help analyze key data patterns.
Learning Outcomes:
Participants will be able to learn about Otter AI and how to use it in their course
Participants will be able to understand the basic principles of loading and analyzing responses
|
Sep 9, 2021
10:45 AM - 11:45 AM
|
Concurrent Session 4
Showcasing a Sustainable Student Learning Assessment Reporting System Commended by Middle States for Academic Programs and General Education Results and Actions to Close the Loop
Dr. Brett Everhart & Dr. Ed Bowman - Lock Haven University Zoom Room 1
This session will provide a deep look at the details and timeline that make up a sustainable assessment plan for annual reporting of student learning outcomes (SLO) and actions to improve for both general education and academic program SLOs. Universities are regularly faced with challenges in developing and maintaining assessment reporting plans in a way that gains and keeps faculty buy-in within a faculty-driven process for assessing student learning on campus. This session will provide participants with ideas and tools that they can use for their own assessment reporting needs. This session will showcase annual timelines for all involved in the assessment data collection and reporting process. It will also show how rubrics, the assessment website, the reporting tool (Nuventive IMPROVE), and faculty discussions all play a role in the improvement of student learning and the reporting processes.
Learning Outcomes:
Participants will review and analyze specific assessment collection and reporting details and tools for possible implementation
Participants will compile a list of strategies for redesign, modification or development of student learning assessment data collection and annual reporting
Honestly, Is This The Best Policy? An Equity Framework for Policy Review and Revision
Jeff Bonfield - Rowan University Zoom Room 2
College and university policies are written and enforced with the best intentions, such as promoting student learning and safety. Yet many policies affect different populations differently, often with disproportionately negative outcomes for the most institutionally vulnerable populations. This session will present a framework for evaluating policies for equity. At a time when most colleges and universities are placing greater emphasis on diversity, equity, and inclusion, our institutions’ policies are upholding status quo power structures. Policies that on the surface appear perfectly neutral are reinforcing systemic racism, sexism, classism, ableism, etc. Attendees will practice using proven tools for interrogating college and university policies. Attendees will be able to apply what they learn to their own institution’s policies. This anti-racist work is an essential part of rooting out entrenched systemic racism and other evils that harm students, faculty, and staff.
Learning Outcomes:
Participants will be able to persuasively explain the importance of conducting equity-based reviews of their institutions’ policies.
Participants will be able to analyze their institutions’ policies and suggest revisions that promote greater equity.
Stealing Strategies: How to borrow practices for work-integrated learning assessment
Kristen Gallo - Temple University & Dr. Liza Herzog - Drexel University Zoom Room 3
This session will take a deep look at how assessment of a long-standing work-integrated learning program at a large, private university influenced assessment of a small, newer on-campus internship program at a large, public university. This example of scaling best practices from one institution to another, with differing programs, will allow the audience to learn how to integrate assessment practices into their own programs, large or small, public or private. As experiential and work-based learning programs continue proliferating in higher education, it is essential to find meaningful ways to assess and show the value of these programs to stakeholders.
Learning Outcomes:
Participants will be able to learn portability practices to successfully activate meaningful assessment tools across institutional and programmatic types.
Participants will be able to gain exposure to two differing models of work-based learning programs and assessment.
When does learning occur? Assessing learning from a sport coach perspective
Dr. Cameron Kiosoglous - Drexel University Zoom Room 4
In sport, evaluation and assessment are used interchangeably and confusion remains about their meaning. Here we highlight ways feedback can help coaches improve and evaluate their performances against a standard. Four conditions of assessment are highlighted to help advance coaching quality and improve athlete experiences (Hay, Dickens & Cruddington, 2012). Assessment has been overlooked within the research on coaching and coach learning (McCarthy, Allanson & Stoszkowski, 2021). While much attention has been focused on how coaches learn, the conversation of assessment in coaching is complex and challenging. This session focuses on establishing a common language to use. Coaching is a complex activity and when assessing a complex activity, like in many forms of assessment, there is a high risk of making mistakes. Lessons from the field will help to highlight how assessment is used to help improve coaching practices and help optimize the athlete’s experiences.
Learning Outcomes:
Participants will be able to apply practical examples from the field to help improve coach and athlete learning.
Participants will be able to clarify the use of key terms in assessment and evaluation in the sport context.
Improving Adult Learning: Empowering Diverse Faculty as Assessment Leaders at Your Institution
Dr. Colin Suchland - Lincoln Land Community College Zoom Room 5
This session shares outcomes of implementing peer-to-peer learning structures and adult learning theory to support faculty who are at the center of continuous improvement and change. Using an intentional approach, Lincoln Land Community College’s faculty-driven model provides a professional learning blueprint that can be adapted and scaled at your institution. On any campus, efficiency and collaboration are essential processes in establishing a culture for assessment. At the center are faculty, who are expected to implement instructional strategies that support student growth and success as well as make meaningful decisions, using data, about how students are learning. This session highlights the implementation of an online course to design and deliver professional learning using a peer-to-peer learning model. Outcomes of this design support the diversity, equity, and inclusion of faculty who are expected to contribute to assessment for academic improvement but need efficient ways to do so.
Learning Outcomes:
Identify instructional design principles and assessment technologies that can be adapted to professional learning for faculty and staff at your institution to empower their contribution to continuous improvement and change.
Examine principles of adult-learning theory combined with faculty feedback to design a process for continuous improvement institution-wide.
|
Sep 9, 2021
12:15 PM - 1:15 PM
|
Luncheon Round Table
Round Table with Assessment Leaders
Colin Suchland, PhD - Lincoln Land Community College & Chadia Abras, PhD - Johns Hopkins University & Nasrin Fatima, PhD - Binghamton University
Participate as assessment and accreditation leaders share their thoughts on how to make assessment manageable and sustainable. A Q&A period will be included.
|
Sep 9, 2021
1:30 PM - 2:00 PM
|
Vendor Session
Vendor Sessions
Various
|
Sep 9, 2021
2:15 PM - 3:15 PM
|
Graduate Student Poster Session
|
Sep 9, 2021
3:15 PM - 3:45 PM
|
Yoga
AEFIS Virtual Yoga Level 1: 30-Minute Easy Stretching & Meditation While Sitting at Your Desk
AEFIS Academy Link to the event: https://www.aefisacademy.org/community-event/aefis-virtual-yoga-level-1-drexel2021/
This activity is intended for everyone, and no experience is necessary. All you need to do is show up! You will not need a yoga mat, special shoes, or any specific clothing. All you will need to do is sit up in your chair and, ideally, be close to your camera showing your shoulders and head. During our session, we will focus on gentle neck and shoulder stretches, breathing techniques, and a short meditation for each class, where we will set an intention for our day.
|
Sep 9, 2021
3:45 PM - 4:45 PM
|
Snapshot Session
A Sustainable Assessment System for Aligning Assessment Reporting Processes with Middle States Standards
Dr. Ed Bowman & Dr. Brett Everhart - Lock Haven University Zoom Room 1
Learning Outcomes:
Participants will gain ideas and strategies to fill gaps in their own assessment plans
Participants will gain an understanding of the alignment of Middle States assessment-related standards with a campus-wide reporting system
Virtual Reality (VR) as a tool to promote growth-fostering relationships and formative assessment in the classroom
Dr. Abby Dougherty - Drexel University Zoom Room 1
This presentation will demonstrate how technology like VR can be used as a formative assessment tool while also building authentic, connective relationships in the classroom. Connective relationships are critical to creating safe learning spaces for all students. VR may create safer spaces for all students in the classroom. We will show how virtual reality (VR) technology can be used simultaneously to develop deep connective and inclusive relationships with adult students, while also being used as a formative assessment tool in the classroom. This session will provide an example to educators about how they might use VR as an assessment tool, and as a way to build deeper and more connective relationships with their students.
Learning Outcomes:
Participants will be able to identify how VR can be used as a formative assessment tool.
Participants will be able to articulate how VR can be used to build growth-fostering relationships in the classroom.
Improvements to chemistry curriculum based on annual program assessment reports--closing the loop
Dr. Kristen Grant & Dr Gabriela Smeureanu - Hunter College CUNY Zoom Room 1
Institutional learning outcome assessment may seem like a chore, but it can be useful in getting valuable information regarding programs offered by the department. Aligning institutional LOs to program and course LOs was used to close the loop on student learning in chemistry lab courses for chemistry majors. Two separate lab courses were chosen to complete assessments of institutional learning outcomes. Conclusions from these assessments were used to redesign the lab manuals to address gaps in student learning in the chemistry program. Reassessment of curriculum updates will be discussed.
This session will show how we used institutional LOs and rubrics for assessment to streamline our annual programmatic assessment reports and used the data to help improve our 100-level general chemistry lab course curriculum.
Learning Outcomes:
Participants will learn how to align institutional learning outcomes with program and course learning outcomes
Participants will be able to reflect on how to "close the loop" in the assessment cycle.
Placement Process Appraisal
Dr. Deborah Greenberg - Howard Community College Zoom Room 1
English placement has been identified as a key factor in success in college. Incorrectly placing students into their first-semester English course can adversely affect a student’s likelihood of completion, especially among at-risk and minority populations. This presentation examines the methodology used to evaluate English placements, using Guided Self-Placement, in a community college setting. The need to establish a process that ensures adequate inter-rater reliability among faculty raters, as well as how to differentiate among types of students who may be best served by different placement methods, will be examined. Finally, the presenter will discuss the development of an automated process to collect data accurately and continuously.
Learning Outcomes:
Attendees will gain insight into assessment strategies for English placement tests.
Attendees will identify approaches for assessing placement within their institutions.
Inequities in College and Career Readiness for Students with Disabilities and ELLs
Dr. Victoria Shriver - Drexel University Zoom Room 1
Inclusion and equity are crucial values that should drive our school systems to prepare all students for college and career readiness. The gap in academic opportunities needs to be spotlighted in order to advocate for and support success, improved graduation rates, and postsecondary admission for students with disabilities and English Language Learners (ELLs). Disparities exist in graduation rates and postsecondary attendance for both populations. This achievement gap is associated with inequities in academic tracking that inhibit both populations of students from accessing college and career readiness courses. The data presented will inform current and future educators of the needs of both populations of students and the possible changes to implement to counter this achievement gap, prompting educator involvement in addressing these inequities to evoke change.
Learning Outcomes:
|
Sep 9, 2021
3:45 PM - 4:45 PM
|
AEFIS Lightning Talks
AEFIS Lightning Talk Series—Assessment for Learning: Engaging Faculty and Students in Reflection and Improvement
AEFIS Academy and Conference Attendees Link to the event: https://www.aefisacademy.org/community-event/lightning-talks-series-assessment-for-learning-engaging-faculty-and-students-in-reflection-and-improvement/
Lightning Talks are short, targeted presentations that provide insight into best practices, challenges, and successes in assessment. Listen in as some of the greatest minds and thought leaders share authentic assessment processes, continuous improvement stories, and innovative approaches toward lifelong learning and student success. This Lightning Talk Series consists of two presentations both centered on engaging faculty and students in reflection and improvement through active learning and research. Hear from leaders who are giving terrific opportunities for learners to provide feedback on teaching and learning through a robust syllabi review at Brigham Young University-Hawaii and graduate student research at Drexel University.
|
Sep 9, 2021
5:00 PM - 6:00 PM
|
Professional Associations Networking Event
Professional Association Meet & Greet
Stephen P. Hundley Ph.D - IUPUI, Andre Foisy - Excelsior College, Josephine Welsh PhD - Southern California University of Health Sciences, Gianina Baker PhD - NILOA, Suzanne Carbonaro - AEFIS Academy, Dr. Joseph D. Levy - National Louis University Zoom Room 1
Network with your peers as you meet representatives from some of the major professional associations that are devoted to assessment and accreditation. Hear what they have to offer and then learn more in a break-out room.
Participant associations will include: AALHE, AAC&U, the IUPUI Assessment Institute, NILOA and AEFIS Academy.
|
Sep 10, 2021
9:30 AM - 10:30 AM
|
Panel Discussion
Panel Discussion
Various
|
Sep 10, 2021
10:30 AM - 11:00 AM
|
Networking Event
|
Sep 10, 2021
11:00 AM - 12:00 PM
|
Concurrent Session 5
To Infinity and Beyond! Designing a Sustainable Self-Study Process
Dr. Karol Batey, Dr. David Allen - Texas A&M International & Suzanne Carbonaro - AEFIS Zoom Room 1
When completing a self-study at any point in an institution’s term, the focus should be on celebrating resilience and the return on investment from the process, not on the anxiety of how it will get done. Preparing your institution for accreditation can be stressful for all stakeholders, especially if they are going through their first self-study together. This session demonstrates how Texas A&M International University developed, adjusted, and learned from its initial approach to create a more efficient process for its SACSCOC Fifth-Year Interim Report. The session presents an innovative approach to the completion of the Fifth-Year Report, lessons learned, and opportunities for stronger collaboration among stakeholders, accountability of tasks, and a fresh perspective toward continuous improvement. Newly appointed assessment professionals and other coordinators will benefit from these lessons learned.
Learning Outcomes:
Identify your goals for completing your self-study, keeping in mind the culture and make-up of your institution.
Determine the specific timelines, processes, and return on investment your institution will gain from this process, and backward design appropriately.
Creating and Assessing the Strategic Plan: A sustainable model for outcomes assessment and continuous improvement
Dr. Janine Fredericks-Younger - Rutgers School of Dental Medicine, Rutgers University Zoom Room 2
Members of the Rutgers School of Dental Medicine will share how the school uses a sustainable model for assessing its Strategic Plan through an annual cycle of assessment and institutional review of outcome measures. This model represents the culmination of dynamic engagement and contributions from various community constituencies. All institutions of higher education must demonstrate the achievement of their mission, vision, and goals. This presentation will demonstrate use of a successful, closed-loop model for documenting compelling evidence comprised of tangible indicators of institutional effectiveness and the systematic use of data for continuous improvement. RSDM’s assessment process begins with a set of measurable outcomes with benchmarks. Each year, the Planning Committee reviews these outcome data, assessing whether RSDM is meeting its benchmarks, and recommends action plans as appropriate. The presentation will share our best practices in a manner translatable to other programs.
Learning Outcomes:
Participants will be able to engage their campus communities in the development, assessment, and ongoing review of their institutional strategic plan.
Participants will be able to document the assessment of their strategic plan to respective stakeholders and constituencies to clearly demonstrate the achievement of their institutional mission, vision, and goals.
Analysis of Leadership Education Training at U.S. Allopathic and Osteopathic Medical Schools
Dr. Michelle Schmude, Venard Koerwer and Sooyoung VanDeMark - Geisinger Commonwealth School of Medicine
Healthcare is an increasingly complex industry. Do institutions that do not provide a foundation in the basics of leadership theory and practice do a disservice to their students? Medical students not expressly trained for the role(s) of leadership risk entering the field unprepared, and possibly, experiencing earlier career burnout. During their career, many physicians assume leadership responsibilities. Although medical schools educate future physicians clinically, do they prepare them to assume leadership roles as a physician leader? This session will discuss our research findings regarding medical school curricula and how medical schools are educating the next generation of physician leaders. As healthcare becomes increasingly complex to deliver, our research suggests that medical schools consider explicit leadership training for aspiring physicians. If attendees have not recently assessed their own institution’s mission, vision, and value statements, and more importantly, the deliverables related to them, this presentation will encourage them to do so.
Learning Outcomes:
Participants will be able to identify whether their institution is delivering on what it proposes in its mission, vision, and value statements regarding leadership.
Participants will be able to develop a plan to analyze whether the institution is delivering a curriculum that aligns with its mission, vision, and values statements.
A comparison of examination modality: In person versus Remote NBME Subject exams
Dr. Carolyn Giordano & Dr. Michael Howley - Drexel University Zoom Room 4
NBME Subject Exams are high-stakes examinations given to medical students to ensure comparable clinical learning. This is one type of examination that moved remote, and many institutions had to work nimbly to adapt to online examinations. This session shares lessons learned, technological resources, proctoring guides, and institutional resources. Prior to the 2020 COVID-19 pandemic, high-stakes exams such as the NBME Subject Exams were administered in person. When the pandemic hit, institutions had to quickly shift exam administration modality to remote proctoring. This session describes lessons learned and shares some comparison data from prior years. We needed to ensure comparability from in-person to remote testing, and data were compared year to year. We will share some aggregate data showing that the examination experience wasn't compromised.
Learning Outcomes:
Participants will be able to demonstrate how to administer a high stakes examination remotely.
Participants will be able to identify the challenges of administering a high stakes examination remotely.
Transforming Assessment in a Virtual Environment
Dr. Gracie Williams & Nicole Harris - Tarrant County College District Zoom Room 5
Assessment of student learning involves intentional engagement and is a continuous improvement process. Institutions must find innovative ways to reframe assessment; focus on what is actionable and bring consistency to the assessment of student learning for greater utilization of existing resources. College enrollment has become increasingly diverse in terms of students’ race, ethnicity, gender identity, socioeconomic status, sexual orientation, age, ability, etc. Conducting assessment in a manner that takes into consideration the various needs of different student populations is a responsibility of higher education. In the era of COVID-19, virtual planning and implementation of instructional assessment is crucial and vital to the success of any institution. Participants will learn how to adapt, maintain, and achieve continuous improvement of student learning outcomes by reorienting instructional assessment with a flexible and manageable approach.
Learning Outcomes:
Participants will be able to describe what reframing assessment virtually would look like at their institution.
Participants will be able to identify missing elements of current assessment strategies and develop an action plan.
|
Sep 10, 2021
12:30 PM - 1:00 PM
|
Closing Remarks
Closing Remarks
Drexel Conference Committee Zoom Room 1
Join the conference committee and your peers for some final thoughts as we close the conference for 2021.
|