
2023 Preliminary Program

Date/Time Session

Sep 13, 2023
8:00 AM - 8:45 AM

Coffee Break Room

Virtual Coffee Room Conference Committee Zoom Room 1

Come and “hang out” with the conference event planners before the beginning of the conference.
Sep 13, 2023
9:00 AM - 10:00 AM

Opening Remarks

Opening of the Conference Paul Jensen, PhD. - Executive Vice President and Nina Henderson Provost, Drexel University Zoom Room 1

The Provost of Drexel University will help kick off the conference.

Opening Remarks Joseph Hawk - Executive Director of Assessment and Accreditation & Accreditation Liaison Officer - Drexel University Zoom Room 1

Opening Remarks from the conference co-chair.
Sep 13, 2023
10:15 AM - 11:15 AM

Concurrent Session 1

SHARED SESSION 1: Mobile Application-Based Assessment of Teamwork Skills with Jefferson Teamwork Observation Guide® (JTOG) App Maria Brucato & Julianne Liskov, Thomas Jefferson University Zoom Room 1

App-based assessment and reporting allow students to receive feedback on their teamwork skills and discuss opportunities for growth in real time as a group, rather than in isolation after a delay as traditional methods require. Learning to assess and improve upon teamwork skills can be beneficial to students from various disciplines. We will present the use of a mobile application, the Jefferson Teamwork Observation Guide® (JTOG), developed and implemented with thousands of health professions students, clinicians, faculty, patients, and caregivers at Thomas Jefferson University, to assess teamwork skills during collaborative experiences. The app can be used for quick assessments and basic reporting of results to enable critical reflection and opportunities for improvement. Attendees will learn about the benefits of implementing app-based assessments of teamwork skills, including time savings for data collection, reporting, and distributing results. Specifically, the JTOG app is an existing tool that can be used for assessment of teamwork skills, which is a desirable learning outcome for students and professionals of many disciplines.

Audience Level: Intermediate

Learning Outcomes

  • Participants will understand the benefits and use of the Jefferson Team Observation Guide app for learners, educators, and researchers.
  • Participants will apply the Jefferson Teamwork Observation Guide® to a case scenario to assess teamwork skills.
    SHARED SESSION 2: Future Learning: Using AI Tools for Good! Karen LaPlant, Metropolitan State University & Zala Fashant, Minnesota State Colleges & Universities Zoom Room 1

    There are many sources for obtaining quality learning for students at an institution or employees in the workplace. Our institutions need to design for a future that strategically meets and assesses the needs of all learners. If we want lifelong learning that is truly student-centered, we need to make it easy for students to learn what they need when they need it. AI tools have changed the world of learning assessment! How will faculty, designers, and institutions prepare to adapt to changes that are centered more on learners? Join us to explore ways to use artificial intelligence (AI) to demonstrate learning and assess the achievement of outcomes. How can AI be used as an opportunity to increase student performance? Participants can share responses through the Chat tool and the Q&A portion of the presentation to gain greater strategies for redesigning courses to maximize student-centered, lifelong learning. Over the past few years we have discovered that the way we have traditionally taught may not have been the best for all students. Courses will need to be redesigned to consider the changes in pedagogy and access to delivery that the future of learning requires.

    Audience Level: Beginner, Intermediate

    Learning Outcomes

  • Participants will gain greater strategies for redesigning courses to maximize student-centered lifelong learning.
  • Participants will discuss the assessment practices presented as they apply to their own teaching.
    Harnessing Canvas Data to Identify Student Needs and Improve Learning Outcomes Shannon Bertha-Angulo & Shannon Osborn-Jones, Middlesex College Zoom Room 2

    The use of Canvas Assignments for assessments can provide valuable data and insights into student performance and learning outcomes. Using the dashboards, instructors can identify areas where students are excelling and where they may need support, which can help inform instructional strategies and improve overall learning outcomes. Middlesex College is using rubrics attached to course learning outcomes in Canvas to help with assessment collection. We will discuss how this approach provides valuable insights into student learning outcomes, allowing instructors to identify areas where students might need additional support. This session will highlight the benefits of this approach for both instructors and department chairs. Attendees will learn how to effectively use Canvas Assignments for assessment and program evaluation, ultimately improving their ability to support students in their learning and contribute to the success of their programs. This session will provide attendees with practical tools and insights that they can apply in their colleges.

    Audience Level: Intermediate, Advanced

    Learning Outcomes

  • Participants will learn how to set up Canvas Assignments and sub-accounts for the Canvas Outcomes Assessment.
  • Participants will learn how to analyze student proficiencies connected to student data displayed on a Tableau Dashboard. Strategies for using the dashboard will also be explored.
    Post-Pandemic Interoperability: Using Agnostic Course Design to Cross Modalities Jacqueline DiSanto, Nelson Nunez Rodriguez and Antonios Varelas, Hostos Community College Zoom Room 3

    In this post-quarantine world, can faculty fluidly use skills, resources, and techniques developed for face-to-face courses prior to the pandemic in tandem with those acquired to teach online under emergency circumstances? Agnostic course design may be particularly helpful to students who have spent the majority of the last three years online rather than on site. By incorporating what works online with what is effective in person, faculty can develop an interoperable course to engage students where they are learning today. Focus groups discussed how agnostic course design could engage students and maintain rigor and teaching identity regardless of modality. We will pose the same questions used in our focus groups so attendees can consider whether agnostic course design could work on their campus, in their classes, or simply to identify what skills and strategies they have acquired and where they are most effective.

    Audience Level: Intermediate

    Learning Outcomes

  • Attendees will be able to identify in-person and online teaching skills, resources, and strategies they use interoperably across modalities.
  • Attendees will be able to determine whether agnostic course design could be applied to their courses.
    Unleashing the Potential: Leveraging Technology and AI to Propel Administrative Goals and Objectives to New Heights James Marshall & Dr. Alexa Beshara-Blauth, Ocean County College Zoom Room 4

    One of the most frequently cited Middle States standards for non-compliance/follow-up is Standard VI. Attendees will learn about a successful planning and budgeting process that clearly links new money requests to institutional goals. We demonstrate how AI can be used to craft measurable and achievable goals and objectives. The OCC Planning and Budgeting Council is critical in meeting Middle States accreditation standards. It connects planning and budgeting, ensuring that new budget requests are linked to specific goals or objectives. This session explores the process and documentation for the PBC, issues encountered, and how we’ve solved them by leveraging AI. Attendees will be able to bring an inclusive, innovative approach that links planning, budgeting, and institutional improvement back to their campuses. Artificial Intelligence is the future of technology, and higher education will need to embrace and leverage it in the pursuit of institutional effectiveness.

    Audience Level: Intermediate

    Learning Outcomes

  • Participants will learn how the Planning and Budgeting Council (PBC) functions and review the unit planning documents.
  • Participants will explore how AI is being leveraged to craft more meaningful goals and objectives.
    Assessment and Continuous Improvement through Multi-Modal Student Engagement Christina Sax, Maryland University of Integrative Health Zoom Room 5

    This session describes MUIH’s systematic and multi-modal approach to engage students in the assessment of their experiences with courses, programs, and services. It provides an overview of the strategies that have been used to cultivate relationships with students and the non-traditional and traditional means of assessment that have been used. This holistic and inclusive methodology provides a framework adaptable by other institutions seeking to deepen their assessment of the student experience by more effectively and broadly engaging with students. This approach creates a picture that helps institutions fully understand and support the student experience. This occurs through formal and informal means, assessment of the overall experience as well as focused and timely topics, assessment through standardized means and direct student voice, and with retrospective and forward-looking lenses. Specific examples will help individuals and institutions build capacity and expertise and ensure sustainable assessment and continuous improvement within ongoing practices.

    Audience Level: Intermediate

    Learning Outcomes

    Sep 13, 2023
    11:30 AM - 12:30 PM

    Concurrent Session 2

    Embrace the bot: Using AI Technology to Develop Objective Assessment Items Marylee Demeter & Alicia J. Morrill-Foerster, Western Governors University Zoom Room 1

    Harnessing the power of AI enables faculty to save time when developing fair assessments. Rather than struggling to write many items, faculty only need to review and revise AI-generated items for content and fairness. The best items can then be selected for inclusion in formative and summative assessments to measure student progress and achievement. Grading adjustments “based on the curve” result when an assessment fails to measure what it was intended to measure. Item writing is a skill few faculty acquire in preparation for an academic career. Understanding best practices used to generate fair items that align with course outcomes will help faculty create more reliable, valid assessment items. This session will host a live demonstration of an artificial intelligence (AI) technology used to generate objective assessment items based on course learning objectives/outcomes/competency statements. Participants will learn best practices in developing input statements and conducting reviews of AI-generated items to ensure test fairness and alignment with course objectives.

    Audience Level: Beginner

    Learning Outcomes

  • Participants will be able to draft input statements based on course objectives/competencies/learning outcomes and psychometric guidelines to develop AI-generated objective assessment items.
  • Participants will be able to review and revise AI-generated objective items for fairness and course alignment based on subject content as related to course objectives/competencies/learning outcomes and psychometric guidelines.
    Grade inflation and improving student learning – are our students getting better? Cameron Kiosoglous, Drexel University Zoom Room 2

    Measuring program impact and assessing student learning in the field is the focus here. This presentation draws lessons from different programs centered on sport coaching education and development. Student course evaluations can provide valuable information about curriculum effectiveness and student learning, and being systematic about program improvement is an important consideration. The presentation reflects on what questions are asked in the course evaluation process and how student responses can inform strategic decisions about program improvement and enhancing student learning.

    Audience Level: Intermediate

    Learning Outcomes

  • Participants will explore ways to measure program impact and assess student learning.
  • Participants will examine ways course evaluations can lead to program improvement.
    Institutional Use of Data and Performance Indicators: Where We Want to Be, Need to Be, and By When Robert McLoughlin & Kellie Delmonico, United States Military Academy at West Point Zoom Room 3

    Institutions face many unique challenges with the abundance of available data. Many institutions want to improve how effectively they use data but often have minimal dedicated staff. The discussion will follow a scalable template for organizing and using data that will be handed out to the audience. The presenters will discuss the evolution of USMA's Performance Indicators, challenges encountered, processes developed, and lessons learned with their implementation. The presenters will also describe how they incorporate MSCHE’s expectations on data submissions into their work and how their institution is beginning to disaggregate data using on-hand technology. As a starting point, it is essential to understand the different definitions of, perspectives on, and types of data; what data is available or easily available for the institution; what data is required by accreditors, in particular MSCHE; and how and when data will be used for institutional improvement.

    Audience Level: Intermediate

    Learning Outcomes

  • Participants will describe definitions of data, performance indicators, and evidence
  • Participants will consider an application of the template at an attendee’s institution
    Engaging Faculty: The LIU Assessment Fellows Program Brian N. Sweeney & Maureen Tuthill, Long Island University Zoom Room 4

    The Faculty Assessment Fellows model lies at the heart of outcomes assessment at Long Island University. Working in conjunction with the Office of Assessment, faculty peers encourage pedagogical collaborations and build positive working relationships between faculty and administrators. Faculty Assessment Fellows help faculty to articulate learning outcomes, identify measurement tools, interpret data, and report results. The faculty fellows model has proven successful at creating a culture of assessment and empowering faculty to drive the process of improving learning in their programs through assessment.

    Audience Level: Beginner to Intermediate

    Learning Objectives:

  • Attendees will understand the benefits of faculty-administrative collaboration in outcomes assessment.
  • Attendees will identify potential innovations arising from effective outcomes assessment processes.
    INCLUSIVITY: LEARNING DESIGNED FOR ALL Karen LaPlant, Metropolitan State University & Zala Fashant, Minnesota State Colleges & Universities Zoom Room 5

    Often, extroverts look at introverts with a not-yet-ready-for-prime-time bias. Many tend to think being an extrovert should be everyone’s goal and that it is normal for everyone to reach an extroverted level, causing some introverts to suffer from self-esteem issues and think they are less successful. This inclusivity session discusses designing successful learning activities and assessments so that all learners, both introverts and extroverts, learn best. A variety of learning activities and assessments will provide chances for all students to learn in their more comfortable mode and offer ways to expand their learning and performance experiences in preparing for careers. Participants will discuss activities to make inclusivity intentional in course design. Introverted faculty tend to design courses with a more reflective, internalized approach. Extroverted faculty design courses which capitalize on student interaction. Unless they have designed ways to include learning experiences that champion both introverts and extroverts, the class appeals to one type more than the other.

    Audience Level: Beginner, Intermediate

    Learning Outcomes

  • Participants will reflect on their level of introversion and extroversion.
  • Participants will discuss learning activities and assessments to make inclusivity intentional in the course design.
    Sep 13, 2023
    12:30 PM - 1:30 PM

    Lunch and Vendors

    Lunch Break/Vendor Presentations Various Vendors Various

    Our vendors will be presenting on their various products while you have a lunch break.
    Sep 13, 2023
    1:45 PM - 2:45 PM

    Snapshot Session

    SS1: More Feedback is Better for Everyone: Ending the End-of-Semester-Only Evaluation Model Tess Cherlin, University of Pennsylvania Zoom Room 1

    My session is about implementing post-exam surveys to engage students in meta-cognition and sharing actionable feedback with the goal of improving student learning. I think this session will help attendees build deeper connections with their students and have a more accurate understanding of their students' learning experience if they are not already implementing continuous course evaluation assessments. This content is important because timely constructive feedback is important for everyone! By implementing post-exam surveys, instructors are able to assess how the students are doing and feeling as they progress through the course.

    Audience Level: Beginner

    Learning Outcomes

  • Attendees will appreciate the benefit of implementing continuous course evaluations throughout a semester.
  • Attendees will be able to implement their own post-exam surveys in their next courses if they choose.
    SS2: A Comparative Analysis Process to Maximize Insights and Leverage Actions Across Programs Dina Eagle & Stacy Sculthorp, PhD, Strayer University Zoom Room 1

    It’s important to continuously refine our processes so that our students benefit from the best practices in assessment, and that students have a fair evaluation of their learning. Comparative analysis across Strayer University has enabled academic leaders to share insights/actions that can be leveraged between programs. The goal of this process is to improve student learning by implementing ideas and actions that have shown success, and to engage academic leaders in assessment conversations leading to improved student performance. We adjusted our process from the Peregrine assessment to a comparative analysis because the objective assessment of Peregrine did not align with how our students learned, particularly because our students are accustomed to learning through project based/rubric based assessments. Developing a new process ensured a better comparison of student learning.

    Audience Level: Intermediate

    Learning Outcomes

  • Participants will identify the essential components of a comparative analysis process.
  • Participants will generate ideas to improve an existing comparative analysis process.
  • Participants will discuss how the refinements of a comparative analysis process can better serve students.
    SS3: Having a Generational Diversity Mindset will Fuel Human and Organizational Performance Michelle Keusch, Indiana Wesleyan University Zoom Room 1

    Organizations and institutions have communication blind spots and generational advantage gaps that are unrecognized; therefore, performance suffers. There are commonalities and differences within the generations that will blend well when understood and accepted. Valuing and understanding the differences within generations will bring advantages. Baby Boomers, Generation X, Millennials, and Generation Z all contribute to society and to the workforce in different capacities. Each generation is shaped by history, and each has unique experiences and expectations as a part of their human design. Learning to lead (and follow) others within different generations will bring individual success and collective greatness. Building bridges, connecting, and having meaningful conversation with the different generations within a group will improve scholarship, leadership, followership, and workmanship.

    Audience Level: Beginner

    Learning Outcomes

  • Participants will understand enhanced team formulation and collaboration through acceptance and understanding of generational differences.
  • Participants will recognize and maximize the strengths of individuals within a team.
    SS4: Muddiest Point Clarified in Three Steps to Engage Students with Content Joanne Mathiasen, Drexel University Zoom Room 1

    Utilizing a classroom assessment technique commonly referred to as the "Muddiest Point", we have modified this for use in our courses to help students identify confusing concepts, engage with content, and encourage earlier review of materials prior to studying for exams. Often students wait until close to an exam to review or study material. This technique encourages earlier engagement with the material using a seek-and-find approach that leads to better retention. This session will demonstrate how we are using this technique in simple steps leading to effective student engagement with content. This is an easy-to-use technique, requiring minimal instructor effort, that is effective and well received by students. Additionally, it provides an opportunity for students to earn extra points toward their overall score while enhancing their learning.

    Audience Level: Beginner

    Learning Outcomes

  • Participants will develop a classroom technique that engages students with the course content.
  • Participants will understand where concept confusion is occurring to allow resolution prior to exams.
    SS5: Initial COVID Curricular Response Review: Perspectives as the Dust Settles Greg Null, University of Pittsburgh Zoom Room 1

    This session will present findings from the School of Medicine’s review of its initial curricular response to COVID completed in 2020. We will also examine how the template created allowed for future monitoring within the five areas: Student and Staff Support, Online Learning, Curriculum Governance, Technical Resilience, and Crisis Communication.

    Audience Level: Intermediate

    Learning Outcomes

  • Attendees will learn to use After Action Reviews in their home institutions.
  • Attendees will initiate AAR planning prior to the start of projects.
    SS6: Challenges and Opportunities in Assessing Programs from a Perspective of a Neophyte Liberal Arts Faculty Member Gerard Dumancas, Dr. Satyajit Ghosh, & Dr. Vanessa A. Jensen, Scranton University Zoom Room 1

    While the literature reports several grand challenges for assessment in higher education, none has so far reported a case study on the specific challenges and opportunities of assessing programs in a private liberal arts university setting, especially from the perspective of a new faculty member. This session will discuss various challenges and opportunities of assessing programs at a private liberal arts university from the perspective of a neophyte faculty member. It will give the audience a feel for the structure, processes, challenges, and opportunities involved in the assessment of educational programs from the perspective of a new faculty member in a private liberal arts university setting.

    Audience Level: Beginner

    Learning Outcomes

  • Attendees will explore the challenges and opportunities of assessing educational programs in a private liberal arts university setting from the perspective of a new faculty member.
  • Presenters will develop ideas on how to further support new faculty members in their assessment needs.
    Sep 13, 2023
    3:00 PM - 4:00 PM

    Concurrent Session 3

    Are Your Students Learning Online? A Glimpse at Best Practices in Online Assessment Denise Cummings-Clay, Hostos Community College Zoom Room 1

    The session will identify and describe specific practices that are effective in assessing student understanding of course content in the online modality (i.e., synchronous and asynchronous). Examples of best practices in online assessment will be shared. Using effective online assessment practices will help ensure that learning outcomes for students increase in this modality. This is crucial, especially since more people participate in and request courses online due to work restrictions, childcare responsibilities, and other reasons impeding them from face-to-face college attendance. Attendees will choose three best practices to focus on and nurture. Attendees will be able to adopt at least three best practices that can be used immediately. It is hoped that the best practices adopted by attendees will strengthen their capacity to assess student assignments in the online modality.

    Audience Level: Beginner

    Learning Outcomes

  • Attendees will be able to describe at least three best practices that can be used to assess learning in the online modality.
  • Attendees will be able to design one online learning assessment by the end of the session
    Data-Driven Decisions Unleashed: The Power of Decentralization in Assessment Irina Koroleva & Michelle Zhu, Montclair State University Zoom Room 2

    Join our presentation, "Data-Driven Decisions Unleashed: The Power of Decentralization in Assessment," to see how decentralization can revolutionize university-level assessment and empower faculty. Discover the potential of decentralization, unlocking insights from various departments and reshaping your institution's future through data-driven decision making. This session introduces a groundbreaking approach to data-driven decision making in assessments. By decentralizing the process and empowering faculty, valuable insights can be gained from multiple departments, leading to positive results and the potential for reshaping institutions through informed decisions. This session will empower attendees to transform their day-to-day work by revolutionizing data-driven decision making and solving assessment-related challenges. It will reshape their institution's future and keep them current with innovative practices in assessment and evaluation.

    Audience Level: Beginner, Intermediate

    Learning Outcomes

  • Participants will understand the game-changing approach to transform assessment processes at the University level.
  • Participants will develop strategies to decentralize assessment and empower faculty for insightful data-driven decision making.
    Determining Agency to Improve Student Learning Will Miller, Embry-Riddle Aeronautical University Zoom Room 3

    Assessment professionals as well as teaching and learning center directors devote significant energy to helping faculty advance better learning when faculty are the units of change (better pedagogical practices, curricular revisions, better outcome alignment on syllabi, etc.), but they cannot always do as much if departmental, college, or institutional barriers are preventing necessary action. In this session, we will discuss what the end stages of continuous improvement can look like for assessment, areas of the assessment process with high faculty agency, and strategies for helping faculty and staff navigate areas where their agency is limited, including political and accreditation efforts underway across the country today. Changes at these levels require more buy-in and more capital (of all types) but have a significant impact on student learning and, consequently, assessment. Some barriers may not even be able to be solved or moved, but without being fully aware of them, we cannot avoid having all of our assessment asks fall on faculty.

    Audience Level: Beginner, Intermediate, Advanced

    Learning Outcomes

  • Attendees will identify areas of assessment of high and low faculty agency.
  • Attendees will describe two strategies for addressing assessment barriers that have low faculty agency.
    To Infinity and Beyond – Activating Data Beyond Accreditation Janet Thiel, Georgian Court University & Doug Masterson, University of Southern Mississippi Zoom Room 4

    Year after year and conference after conference, accreditation representatives across the country share that higher education leaders overcomplicate their self-study and process. This session unpacks that and provides a blueprint for making data stories meaningful in day-to-day decision-making. As higher education leaders, we all have a storybook of our experiences. Sharing these stories helps us remember the struggles and triumphs and why we do what we do to serve our students and community. Accreditation should not be the key motivator that drives us to collect our stories, but rather our desire to improve and grow. Data can help tell stories, but if the data are not meaningful or digestible, they will just be left in reports for someone to dig through or check a box. This session provides key ways that data can be used to tell stories from infinity and beyond. Data analytics is more than a trend; it is a way of life. But if data are collected and not placed in the hands of assessment leaders and program coordinators across an institution, then they become a compliance exercise that is lifeless and resented. This session provides a blueprint for how to collect and use data and make an impact, using data to tell stories about your successes, your needs, and future planning.

    Audience Level: Intermediate

    Learning Outcomes

  • Participants will identify the components of a truly collaborative planning and improvement process that is inclusive of stakeholders accessing data to drive their decision-making and tell compelling stories.
  • Participants will describe the ideal data analytics initiative that supports accreditation and beyond.
    Sep 13, 2023
    4:15 PM - 5:15 PM

    Speaker/End of Conference

    Panel: Thoughts on Current and Future Usage of AI in Higher Education Panelist Information Below Zoom Room 1

    The panel will discuss how AI is currently being used within higher education and their role with supporting it. They will also be asked about effective ways to implement an AI policy, plus what the landscape could look like in the future. There will also be time for questions from the audience.

    Panelists:

  • Zala Fashant, Professor, Dean & Faculty Developer - Minnesota State University (retired)
  • Marylee Demeter, Senior Assessment Developer – Western Governors University
  • Anne Converse Willkomm, Associate Dean, Graduate College & Associate Teaching Professor, Dept. of Communications – Drexel University
  • Colin Gordon, Associate Professor, College of Computing and Informatics – Drexel University
    Closing of Day 1 Joseph Hawk, Executive Director of Assessment and Accreditation - Drexel University Zoom Room 1

    Sep 14, 2023
    7:30 AM - 8:30 AM

    Breakfast

    Breakfast in PISB PISB Atrium

    Join us for breakfast in the PISB atrium where the conference registration will be.
    Sep 14, 2023
    8:30 AM - 9:30 AM

    Plenary/Discussion

    How Data-Informed Decision Making Can Impact Institutional Effectiveness Dr. Doug Masterson, Senior Associate Provost for Institutional Effectiveness at the University of Southern Mississippi & Dr. Janet Thiel, Associate Vice President, University Assessment, Georgian Court University PISB 120

    Once you commit the human and financial resources to support the onboarding of an assessment management system, the clock starts ticking to see the return on your investment. As assessment leaders and their institutions experience change, is the data still being collected? Is the data meaningful? Is it being used? How has the presence of a platform to support continuous improvement impacted your institution and its data literacy? Assessment leaders from institutions of a variety of sizes, regions, and scopes discuss the launch, use, and impact of assessment management software on their culture of improvement and on the stakeholders who are using reports and visualizations to support their self-studies, decision-making, and future planning.

    Learning Outcomes:

  • Identify key uses of technology in practice for continuous improvement and development.
  • Discuss the change and impact software has had on the growth of an institution and its data literacy.
  • Highlight innovative ways that leaders have designed and implemented assessment and planning processes using technology to enhance teaching, learning and improvement.
    Sep 14, 2023
    9:45 AM - 10:45 AM

    Concurrent Session 4

    SHARED SESSION 1: A Self-Study Tool to Evaluate Diversity, Equity, and Inclusion in Higher Education Curricula Maria Brucato & Jeannette Kates, Thomas Jefferson University PISB 104

    The Jefferson Center for Interprofessional Practice & Education’s (JCIPE) Racial & Social Justice Taskforce (RSJT) developed and piloted a self-study tool to evaluate educational programming with a racial and social justice lens. The tool allows faculty and staff to reflect on program strengths and areas to better address race and social justice. Creating inclusive working and learning environments is essential in higher education. Our self-study tool can be implemented at attendees’ own institutions, with the potential of fostering diverse perspectives and improving inclusivity in attendees’ curricula. Using the self-study tool allows individuals to brainstorm ways to improve upon content and track incremental changes to curricula. Examples will be shared of faculty and staff’s reflections using the tool, as well as changes JCIPE has implemented as a result of the self-study process.

    Audience Level: Intermediate

    Learning Outcomes

  • Participants will describe the components of a self-study tool to evaluate educational programming with a racial and social justice lens.
  • Participants will explore how a racial and social justice self-study tool could be applied to curriculum.
    SHARED SESSION 2: Embracing Culturally Responsive Assessments At An HBCU Business School Nicole Buzzetto-Hollywood & Leesa Thomas-Banks, University of Maryland Eastern Shore PISB 104

    Assessments are often plagued by biases that impact students based on personal characteristics, diminishing the quality of educational delivery, serving as an obstacle to student success, leading to incorrect inferences about student abilities, and contributing to the systematic oppression of students from historically marginalized groups. A business department at a Mid-Atlantic HBCU has embraced culturally responsive assessments that reflect the assets of students and that are consonant with its mission to provide a holistic learning environment that fosters multicultural diversity, academic success, and intellectual and social growth of students from traditionally underserved populations. Institutions committed to DEI must evaluate their assessments for equity, understanding that thoughtful education provides mirrors and windows, has cultural validity, affords multiple mechanisms for success, and is centered around student assets. With culturally responsive assessment (CRA), one considers the qualities students bring in order to create equitable, learner-centered opportunities for the demonstration of mastery.

    Audience Level: Beginner, Intermediate, Advanced

    Learning Outcomes

  • Attendees will understand the attributes of culturally responsive assessments.
  • Attendees will be provided with a set of questions to consider when looking at their assessments for cultural responsiveness.
    These Boots Were Made for Assessment! Lessons Learned When Hosting a Summer Faculty Development Assessment Boot Camp Jared Brown, Tracy Kaiser-Goebel & Susan Masciantonio, Montgomery County Community College PISB 106

    This session will offer structured narrative and small-group collaboration. The presenters will facilitate sample sections from the boot camp training to provide participants with context and hands-on simulation experience; re-engaging faculty post-pandemic is the goal. To accomplish this, we focused on an in-person faculty engagement opportunity regarding assessment. Assessment planning is much easier to accomplish in an in-person collaborative space. The facilitators built the training with the purpose of engagement, utilizing peer-to-peer and small-group work in person only, with no virtual sessions. Participants will experience a version of the training to create a similar session for faculty at their institution. Presenters will share organization and planning tips, content, and some “lessons learned”. Participants will discover strategies for high-impact and scalable training for faculty, focusing on assessment strategies.

    Audience Level: Intermediate, Advanced

    Learning Outcomes

  • Participants will identify strategies that inspire faculty to engage in meaningful assessment of student learning while acquiring interactive and collaborative faculty program tactics.
  • Participants will come away with an organized plan to replicate the boot camp training at their institution.
    It’s Not Them – It’s Us!: Assessing Institutional Barriers to Student Engagement in Experiential Learning Michele Deegan & Regina Lau, Muhlenberg College PISB 108

    Unintended institutional barriers may limit student participation in learning opportunities such as internships, research, and study away. Exploring ways in which assessment can be used to identify these barriers is essential to student success. These barriers include communication, financial/administrative support, and lack of an inclusive culture for certain students. This session describes the use of focus groups to identify institutional barriers to student engagement in experiential learning opportunities. It highlights the benefits of experiential learning, opportunities to engage student research assistants in the practice of institutional research, and assessment of institutional barriers to student participation in these experiences. Participants will have the opportunity to identify possible institutional barriers and hear about solutions to assess and address these barriers from panelists and attendees. They will be able to return to their campuses with ideas for exploring this issue with other campus administrators responsible for the delivery of these experiences.

    Audience Level: Beginner, Intermediate

    Learning Outcomes

  • Participants will be able to assess the extent to which program access and program quality are consistent across all experiences and equitable for all students.
  • Audience members will be provided with examples of how focus groups and other research instruments can be used to assess program goals.
    Cohort Modeling: Strategies for Supporting Success Across Diverse Student Groups Delarious Stewart, Jeffery Fleming & April Massey, University of the District of Columbia GHALL 108

    Cohort models are widely employed in graduate and professional education where program structure (admissions, clock, pedagogies, and curriculum) may drive assumptions about student preparation, availability, and time to completion. Cohort models support community building and belonging in academic programs. These experiences are foundational to improving student success, and they can impact student engagement, satisfaction, and completion. This session explores the 1) fundamental expectations of cohort modeling; 2) implications of traditional cohort designs for student persistence; and 3) the opportunities and challenges of translating traditional cohort expectations to diverse and non traditional student groups. This session examines opportunities for successful cohort modeling in undergraduate and graduate student populations that may present at the margins of these assumptions.

    Audience Level: Intermediate

    Learning Outcomes

  • Participants will describe the impacts of program variables on determining cohort model efficacy.
  • Participants will describe opportunities to create cohort strategies
    Sep 14, 2023
    11:00 AM - 12:00 PM

    Concurrent Session 5

    Facilitating the Facilitators in High-Fidelity Simulation Exercises for Physiology and Pharmacology Joanne Mathiasen, Drexel University PISB 104

    High-fidelity patient simulation (HFS) has shown a beneficial effect on student learning in medically related curricula (Cortigani, 2015; Meyers, 2020). Drexel University College of Medicine (DUCOM) uses simulation as a low-stakes, pseudo-clinical environment representing practical clinical scenarios within the physiology and pharmacology threads in the medical curriculum. Each session in the simulation rooms has a faculty facilitator to guide students through the case. The facilitators are briefed on the case ahead of time, including the best intervention(s) for resolution of the case. The role of the facilitator can be seen either as a “spirit guide,” providing minimal content expertise, or, alternatively, as an opportunity for teaching the material. After evaluating video recordings of facilitators who use these two approaches, we plan to advise facilitators toward the “spirit guide” method, as this better supports the student-driven problem-solving objectives of the activity. While the advantage to student learning has been demonstrated with simulation-based education (Heitz, 2009; Harris, 2012), the role of the facilitator has not been explored.

    Audience Level: Intermediate

    Learning Outcomes

  • Participants will gain an understanding of how team-based, student-driven problem solving during a simulated urgent situation benefits learning retention.
    That's the Sound of Inevitability: Using ChatGPT and other LLMs to Create Cross-Curricular Writing Assignments Christina Markle & Bradley Markle, Penn State University PISB 106

    ChatGPT and other LLMs are inevitable. OpenAI and Microsoft will integrate the technology with Microsoft Word in the near future. Any educator who has anxiety about the technology or is interested in how to use it as a tool to help, rather than hinder, their educational goals should attend this session. This panel seeks to counter the fear of ChatGPT in the classroom through practical instruction. We will explain the capabilities and limitations of LLMs and offer concrete examples of leveraging the technology in the classroom by learning to create meaningful assignments that emphasize reading, analysis, and modifying lines of inquiry. Teachers are expected to integrate technology into their classrooms, but often lack concrete training. In the education community, new technologies are often met with apprehension. This session will provide teachers with real-world examples to instruct students to use LLMs to improve their learning experience via composition rather than hinder it.

    Audience Level: Beginner, Intermediate

    Learning Outcomes

  • Participants will understand the abilities and limitations of ChatGPT through direct instruction.
  • Participants will create an original writing assignment for their subject area using ChatGPT as a resource.
    Retrospective assessment of an online asynchronous STEM Gen-ED course for non-STEM students at an HBCU Lucia Santacruz, Bowie State University PISB 108

    Assessment of student learning and teaching extends beyond programs subject to external professional accreditation (e.g., nursing or education). This session will provide the audience with examples of how assessment data can be applied to improve student learning outcomes. Data obtained from assessment of student learning and teaching are a key element in ensuring that course and curricular content provide today's students with the knowledge, skills, and abilities to be successful beyond the classroom. This session is centered around assessment data used to drive course redesign and provide recommendations for curricular design and advisement for student success.

    Audience Level: Beginner, Intermediate, Advanced

    Learning Outcomes

  • Participants will apply information gleaned from assessment data to drive course/curriculum redesign
  • Participants will use assessment data to develop student academic advisement policies to support student success
    CANCELLED: Show ‘em What We’ve Got: Program Showcase, A Strategy for Closing the Loop and Breaking down Silos Corbyn Wild & Kristie Camacho, College of the Desert PISB 108

    At one community college, a Program Showcase designed to encourage collaborative interaction with success measures and frank conversations about assessing and improving our students’ experiences shifted siloed interactions with data. Discover a practical plan to encourage faculty to embrace the power of data across success metrics, use it appropriately, and find an authentic path to celebrate their students and programs. By implementing the Program Showcase model and leveraging data in conjunction with narrative, participants can learn how to celebrate student success, engage the larger campus community, and improve student outcomes. This topic is relevant because continuous improvement and assessment of student learning outcomes are crucial in higher education. However, these processes are often completed in isolation, leading to a lack of collaboration and a missed opportunity for improving student outcomes. This session offers a practical plan for embracing the power of data, celebrating student success, and fostering a culture of empowerment and meaning around continuous improvement. Participants will learn how to adapt the Program Showcase principles to their own campus and develop an action plan for implementing similar strategies.

    Audience Level: Beginner, Intermediate

    Learning Outcomes

  • Participants will develop techniques for leveraging data in conjunction with narrative to assess and improve student experiences in programs.
  • Participants will explore innovative approaches, such as a program showcase, to celebrate student success and engage the larger campus community in continuous improvement efforts.
    Sep 14, 2023
    12:00 PM - 1:00 PM

    Lunch/Networking

    Lunch and Networking PISB Atrium

    Join us for boxed lunch in the PISB atrium and network with your peers.
    Sep 14, 2023
    1:15 PM - 2:15 PM

    Snapshot Session

    SS1: "Don't Wait - Disaggregate!" Collect and Examine Assessment Data to Ensure All Student Groups are Thriving J Bret Bennington & S. Stavros Valenti, Hofstra University PISB 106

    DEI is emphasized in many sections of the recently revised MSCHE accreditation standards. Sustainable methods for collecting disaggregated student learning data are necessary to support the continuous improvement of learning by all student demographic groups. Our session will share methods for the sustainable collection of disaggregated student data – including the Python computer code we wrote to support Qualtrics-based assessment surveys – and present the results of our most recent learning assessments across gender, race, and ethnicity. We will demonstrate a method of collecting and analyzing disaggregated learning outcomes data that uses common online survey tools. A simple Python computer program will be made available to attendees for creating customized instructor surveys that embed course roster information.

    Audience Level: Intermediate

    Learning Outcomes

  • Attendees will learn how to collect disaggregated assessment data with common online survey tools.
  • Attendees will learn methods of analysis and communication of disaggregated assessment data to support the continuous improvement of learning by all student demographic groups.
    SS2: A Survey of Alumni, Students and Applicants to Gauge Preference of In-person or Online Learning Modalities Including Synchronous, Asynchronous, or Both Mary Carty, Drexel University PISB 106

    We decide program modalities for many reasons, but most of the time we do not directly consult students regarding their feelings or preferences. Gathering this data, along with assessing the longitudinal trends, will allow us to make more informed decisions on program delivery. Our Department of Microbiology & Immunology offers some programs both online and in person (MS in Infectious Disease, MS in Immunology & MS in Molecular Medicine). An analysis was developed to see why students preferred one modality over the other, whether they still would have chosen the program if it were only online, whether we should add more synchronous sessions to the asynchronous classes, and what the admissions trends for these modalities have been over the years. It would provide a framework for making informed decisions on the best ways to meet student demand for online and in-person learning. By knowing what your audience prefers and why they prefer it, programs are more marketable and successful.

    Audience Level: Beginner, Intermediate, Advanced

    Learning Outcomes

  • Participants will build an assessment of program modality to determine the most efficacious approach.
  • Participants will explore trends for current and potential Drexel graduate students on learning preferences.
    SS3: Diversifying a Rubric with Weights Diane DePew, Drexel University PISB 106

    The importance of assignment components varies based on the goal of the assessment. This session will describe how to diversify rubrics by weighting each criterion. The session will show how to develop a weighted rubric that aligns with the focus of learning outcomes.

    Audience Level: Intermediate

    Learning Outcomes

  • Participants will be able to determine the importance of rubric criteria.
  • Participants will be able to design a rubric to reflect the learning outcomes.
    SS4: Student Success Impacts of SI PASS Online and In Person Delana Gregg, University of Maryland Baltimore County PISB 106

    What can we learn from COVID-induced online academic support? How do those lessons translate during the return to in-person learning? We collected and analyzed data on the effects of the intervention on student participation, feedback and success in Calculus 1, including the impact of SI session modality and course modality. Does online Supplemental Instruction help students, compared to in-person SI? Come learn from our statistical analysis estimating the effects of placing an SI Leader in each section of Calculus 1 from spring 2021-fall 2022 on SI student participation, success (D/F/W rates, retention) and survey feedback during online and in-person learning. This statistical study adds to the body of research indicating positive effects from Supplemental Instruction. COVID-induced remote teaching and academic support provided a unique research opportunity to compare previous data on the effects of in-person support to online support and the hybrid model we offered after the return to in-person instruction.

    Audience Level: Intermediate, Advanced

    Learning Outcomes

  • Participants will understand Supplemental Instruction’s effects on student success (D/F/W grades, retention) in Calculus 1, both during in-person and online learning.
  • Participants will understand how to assess academic support intervention effects on student success using propensity score matching and student survey feedback.
    SS5: An Exploration of Students' Perceptions of Bias and Racism in their Learning Environment Simran Shamith & Dr. Carolyn Giordano, Drexel University PISB 106

    While studies have been conducted to understand racial allyship, the literature lacks an understanding of how the learning environment impacts allyship in future physicians. Literature shows racism and bias exist in medical education. Experiences of bias in medical education permeate into future interactions with diverse patient populations. Developing assessment tools that can identify sources of racism and bias in the learning environment in medical education is important to promote allyship. This session will discuss the creation of a tool to understand the factors that influence students’ perceptions of bias and racism in medical education. Racism in medical education impacts everyone. The creation of an anti-racism assessment tool will identify the sources of racism and bias in medical institutions. Increasing awareness and education of bias, racism, and racial allyship will not only improve medical education, but will improve the lives of everyone.

    Audience Level: Intermediate

    Learning Outcomes

  • Participants will understand the importance of evaluating student learning environments, especially as it relates to bias and racism.
  • Participants will be able to identify methods to evaluate and assess the effectiveness of an anti-racism survey tool in medical education.
    Sep 14, 2023
    2:30 PM - 3:15 PM

    Vendor/Coffee Break

    Coffee Break and Vendor Presentations Various Various Rooms in PISB

    Grab some coffee while the vendors provide presentations.
    Sep 14, 2023
    3:30 PM - 4:30 PM

    Concurrent Session 6

    SHARED SESSION 1: Communications Internship Handbook: What HBCU Students Need to Know Rochelle R. Daniel & Karima A. Haynes, Bowie State University PISB 104

    At a time when the United States is engaged in a national conversation on diversity, equity, and inclusion, it is important for faculty, internship coordinators, and career development staff to address and assess how students of color perceive themselves in the workplace and how they are perceived by internship supervisors. Internships, especially in the communications field, are essential for students to achieve their career goals. This session will provide insights and practical advice to faculty, academic advisers, internship coordinators, internship supervisors, and career development staff regarding the unique challenges students at historically Black colleges and universities face during their internships. The presenters will share with attendees a brief history of historically Black colleges and universities; how students of color balance internships with classes, work, finances, and family obligations; how students present themselves in virtual and in-person settings; workplace sexism, racism, and ageism; and advice for interns from communications professionals.

    Audience Level: Beginner

    Learning Outcomes

  • Attendees will understand the challenges HBCU communications students face in acquiring internships and performing well during their internship experience.
  • Attendees will understand how to direct HBCU students toward internships where they have the best opportunity to learn under a communications professional.
    SHARED SESSION 2: Academic Affairs Assessment Survey Results and Template Design to Support Regional and Specialized Accreditation Efforts Becky Verzinski, Bowie State University PISB 104

    Assessing an institution's programmatic assessment practices can be a daunting task, but a necessary one, especially as it relates to the review and evaluation process required for self-studies. This session will share with participants the process, resources, and survey instrument used to collect data on assessment practices in the academic affairs division. Session participants will have the opportunity to learn how Bowie State University developed an internal assessment survey, disseminated the instrument to achieve a 51% return rate, and collected and analyzed programmatic survey data for continuous improvement. The goal of the session is to provide participants with resources and recommendations for developing and implementing their own assessment survey. A brief overview of the assessment survey results will be provided and further examined as evidence for supporting regional accreditation as well as specialized programmatic accreditation. Understanding academic assessment practices through survey data helps continuous improvement processes.

    Audience Level: Beginner, Intermediate

    Learning Outcomes

  • Participants will be able to develop an internal assessment survey to collect data on the dynamics of programmatic assessment at their institution.
  • Participants will be able to implement various best practices with survey administration
    Assessment, Higher Education, and the Essential Connectedness of Everything Joel Bloom, Hunter College PISB 106

    Assessment offices vary a tremendous amount with regard to how connected they are to other offices that relate to institutional effectiveness and student success. This matters because institutions cannot be effective without effective communication and collaboration among units. Dr. Bloom will discuss a variety of ways in which all aspects of higher education are interconnected, with a particular focus on assessment, IR, and accreditation. Unless we recognize this essential interconnectedness and start building bridges across divisions rather than reinforcing (and enforcing!) silos, we will fail to carry out our institutional mission and our values; and unless we recognize and develop the necessary connections among them in an intentional, comprehensive, and collaborative manner, we will not be effective as an institution. The Middle States Commission has just adopted new Standards of Accreditation that specifically require that assessment of the student experience (Standard IV) and educational outcomes (Standard V) be done in ways that allow disaggregation by demographic groups to ensure equitable outcomes across groups. Some schools have already been doing this, but for most schools in the Middle States region, this new requirement will necessitate the kinds of communication and collaboration discussed in this session. In fact, strategic assessment professionals can use this requirement to advocate for greater communication and collaboration on their campus.

    Audience Level: Beginner, Intermediate, Advanced

    Learning Outcomes

  • Participants will learn about the new Middle States standards requirement for data disaggregation.
  • Participants will learn the importance of greater communication and collaboration between assessment and other offices across units and divisions, to facilitate accreditation and achieve institutional effectiveness.
    From Zero to Ninety-Nine: Reflections on Restructuring an Existing Assessment Program for Improved Student Learning, Data Collection, and Accreditation at a Small Midwestern College Yasmin Rioux, PISB 108

    This session features relevant and current best practices and approaches to assessment programs and responses to HLC’s Core Component 4.B. The presenter will help audience members examine their own assessment efforts and offer tips on how to effectively create a sustainable program for unique learning environments in higher education. The presenter will discuss how HLC’s request to examine our assessment processes led to the complete overhaul and restructuring of existing assessment efforts to create a systematic and sustainable assessment program. Attendees will learn to address important assessment program subjects like assessment culture and infrastructure, software selection, faculty engagement, college-wide interest and motivation, informal/formal communication, and sustainability. The presenter will also offer practical insights into the complete revamping of an existing assessment program at the presenter's current institution, and will explain how those efforts constantly rely on qualitative and quantitative data and input to ensure the program is effective and sustainable and that the relevant stakeholders and participants are engaged and integrated.

    Audience Level: Beginner, Intermediate

    Learning Outcomes

  • Participants will be able to more effectively analyze their own assessment efforts and programs to address possible HLC concerns.
  • Participants will be able to use our information to build more effective assessment infrastructure and communication strategies, improve faculty engagement, and assess the sustainability of their existing assessment programs.
    Data Driven Shift Toward a Curricular Approach within Residence Life Ashley Sardik, Muhlenberg College GHALL 108

    During this session attendees will learn about a mixed-methods approach to assessing Case Western Reserve University's residential environment that led Residence Life to adopt a curricular approach to its programs and services. Student affairs practitioners will learn how assessment can inform learning-outcomes-based strategies, programs, and services, while the session also shows what academic affairs can learn about the student experience through student affairs assessment activities. Student affairs practitioners will also gain methods and templates for assessing their programs and services and for leveraging quantitative and qualitative data to advocate for divisional change and funding to support their work.

    Audience Level: Intermediate

    Learning Outcomes

  • Attendees will be able to assess learning in the residential environments at their institutions.
  • Attendees will be able to leverage resources to build curricular approaches to student learning in the co-curricular environment.
    Sep 14, 2023
    4:45 PM - 5:30 PM

    Speaker/End of Conference

    Closing Remarks Conference Chair PISB 120

    We finish off the 10th anniversary of the Drexel Assessment Conference together.