Final Program

Date/Time Session
Sep 7, 2022
9:00 AM - 9:45 AM

Informal Meet and Greet Conference Committee

Meet the Conference Committee and your colleagues

Sep 7, 2022
10:00 AM - 11:00 AM

Opening Plenary

Opening Panel: Learning is the New Pension Moderator: Suzanne Carbonaro - AEFIS

Sep 7, 2022
11:15 AM - 12:15 PM

Concurrent Session 1

Visualization of Student Cohort Data: Creating an Interactive Dashboard to Visualize Student Flow and Success Nasrin Fatima - Binghamton University

Scaling for Large Enrollment Courses Melissa Kaufman - Drexel University

An Effective and Scalable Framework for Integrating Institutional Strategic Planning, Resourcing, and Assessment Gerald Koblyski - United States Military Academy at West Point

Starting from the Very Beginning: Reflections on the Experiences of Totally Revamping an Assessment Program in the Setting of a Small Midwestern College Yasmin Rioux - Divine Word College

In this session, we discuss how the HLC’s request to examine our assessment processes led to the complete overhaul and restructuring of existing assessment efforts to create a systematic and sustainable assessment program. We address important assessment program subjects such as assessment culture and infrastructure, software selection, faculty engagement, college-wide interest and motivation, informal/formal communication, and sustainability. This topic matters because it emphasizes relevant, current best practices and approaches to assessment programs and responses to HLC’s Core Component 4.B. We help audience members examine their own assessment efforts and offer tips on how to effectively create a sustainable program for unique learning environments in higher education. We offer practical insights into the complete revamping of an existing assessment program at our institution. Further, we explain how our efforts continually rely on qualitative and quantitative data and input to ensure our program is effective and sustainable and that relevant stakeholders/participants are engaged and integrated.

Session Level: Beginner

Learning Outcomes:

  • Participants will be able to more effectively analyze their own assessment efforts and programs to address possible HLC concerns.
  • Participants will be able to use our information to build more effective assessment infrastructure, communication strategies, improve faculty engagement, and assess the sustainability of their existing assessment programs.
Technology and Assessment: Using Multiple Platforms for Cohesive Programming Janet Thiel - Georgian Court University & Suzanne Carbonaro - Heliocampus

S1: Preparing STEM Trainees for Assessment: Win-Win for Students and Administrators & S2: Teaching from the Middle: Assessing Past Experiences & Future Ambitions to Sustain Learning and Engagement S1: Natalie Chernets - Drexel University & S2: Dhymsy Vixamar-Owens - University of the District of Columbia

S1: The quality of HigherEd depends on effective and continuous assessment and evaluation to align student-learning outcomes with the program goals and prepare trainees for future employment. Program and course assessments require a working knowledge of assessment tools; however, trainees in biomedical sciences do not build competencies to lead assessment efforts. We developed a 1-year program to introduce graduate students and postdoctoral fellows in biomedical sciences to best practices in assessment and evaluation. Since assessment and evaluation are continuous endeavors, providing trainees with tools to perform these activities will develop skills for those pursuing biomedical-science faculty careers. We will share the program motivation, design, timeline, and outcomes to inspire participants to replicate the program at their institution. We will share lessons learned, including participants' impressions of the program and its value for the administrators. Lastly, we will share plans to sustain the program and engage participants to brainstorm project ideas to benefit their institutions.

Session Level: Beginner

Learning Outcomes:

  • Participants will be able to discover how to train graduate students and postdoctoral fellows in STEM to conduct assessments and evaluations through a professional development opportunity.
  • Participants will be able to brainstorm ideas for projects at their institutions that can serve as an experiential learning component if they choose to replicate the program at their institution.
S2: Approximately 32.9% of undergraduates do not complete their degree program (Hanson, 2022). As students reach courses that demand increased conceptualization of abstract ideas, application, and overall higher-level learning, repeated failure and disinterest increase. Such courses can benefit greatly if taught from the middle. Teaching from the Middle (TFM) means honoring what students bring to the learning environment, keeping previously learned information alive, and establishing connections to the courses and practical experiences that follow. This session will explore the value of assessing where students are at the start and end of a course to TFM. TFM implores educators to take inventory of who they are teaching and intentionally connect to other courses and lived experiences. Through TFM, students capitalize on their strengths while developing and improving new skills. During this session, we discuss the value of assessment to TFM and explore effective strategies.

Session Level: Intermediate

Learning Outcomes:

  • Participants will be able to transform concept/theory to application.
  • Participants will be equipped with teaching/learning/assessment strategies with lasting impact that align with TFM.
Sep 7, 2022
12:30 PM - 1:30 PM

Luncheon Discussion Panel

Accreditation Plenary Panel Moderator: Joseph Hawk - Drexel University

Sep 7, 2022
1:45 PM - 2:45 PM

Graduate Student Poster Session

Graduate Student Poster Session Various Presenters

Sep 7, 2022
3:00 PM - 4:00 PM

Concurrent Session 2

Developing Learning Taxonomies to Foster Common Language Across Your Curricular and Co-Curricular Experiences Suzanne Carbonaro - Heliocampus

One institution's lessons learned about institutional readiness for adoption of an assessment management system. Cynthia Burns Martin - New England College

Faculty-Facilitated Assessment and Continuous Improvement of DEI Course Design Principles James Snow - Maryland University of Integrative Health

Cross Purposes? Aligning Institutional Assessment with DEI Russell Stone - Boston University

The pandemic exacerbated inequities in student success that many institutions discussed pre-2020. Diverse students’ needs became more evident even while assessment practices remained relatively static. Institutions must move to a more nuanced, mosaic-like approach to assessment that reflects their campus populations and can accomplish this by inviting stakeholders to participate. Assessment can whitewash; even projects designed to respond to DEI can fail. The key is to engage many divergent perspectives in the development, practice, and interpretation of learning assessment, in response to the needs/interests of diverse and changing student populations, rather than continue to reify past or more homogenous ones. This session will discuss real-world challenges to DEI in assessment, offering a framework for responsive assessment, opportunities to identify potential stakeholders and their possible roles, and the means by which to activate a more inclusive, accessible, and productive assessment ecology by engaging students, faculty, and administrators’ active and direct participation.

Audience Level: Intermediate

Learning Outcomes:

  • Participants will leave the session with clear ideas about how to reconcile the homogenizing forces of institutional assessment with the more atomistic requirements of meaningful DEI work, in and through assessment.
  • Participants will leave the session with clear ideas and options for how to bring stakeholders into the process, how to engage them in a negotiated and shared purpose for both global and local emphases, and who to look to on their campuses as allies in this complex and satisfying work.
The Ongoing Debate: Can Virtual Labs Effectively Replace In-Person Labs in Science Lab Courses? Laura Bianco - Delaware Technical Community College

S1: What Do Students Want? Assessing Student Engagement in the New Normal & S2: A post-pandemic re-examination of Public Speaking assessment: Mixing feedback types for effective student engagement towards more creative, virtual speaking skills outcomes. S1: Julia Campana - Temple University & S2: Lucy Gichaga - Bowie State University

S1: Students' wants and needs have changed in the wake of a continued pandemic and the Great Resignation. We have been left trying to understand the best service delivery models - in-person, hybrid, or online. The Temple University Career Center began a proactive assessment strategy to answer these questions. Understanding which services and modalities students utilize allows administrative support offices at universities to remain relevant. This is important as the purpose and structure of higher education are reevaluated. Formative assessment of behaviors allows departments to create innovative methods of support and programming and respond to changing student needs. Attendees will walk away with strategies that can be implemented in a variety of settings.

Session Level: Intermediate

Learning Outcomes:

  • Participants will learn how a central university department has approached changes in students' appointment-scheduling behavior.
  • Participants will learn ways to apply appointment-related assessment in their own offices.
S2: Grading multiple speeches with the same rubric in the Public Speaking courses proved less engaging, so we needed to rethink student engagement strategies in the virtual-focused, post-pandemic context. Speeches delivered after mid-terms had become of “lesser quality” once students noticed that they could still get a good grade while barely meeting the rubric expectations. The strategies employed to improve student engagement and boost creative skills were informed by the NCA’s performance-based Public Speaking Competence Rubric (PSCR) and by class-created rubrics that leverage pedagogical partnerships. Mixed assessments help engage students in the post-pandemic, virtual-focused context. We retained the previously used clear behavioral objectives and contract grading strategies, coupled with peer reviews designed along the five canons of rhetoric. Speech evaluations were comprehensive student-to-student peer reviews in both verbal and discussion formats. The final project was a website containing all speech videos from the class, along with images, audio, and other items to illustrate students' ideas. This approach helped students observe significant improvement in their speech-making skills by the end of the class.

Audience Level: Intermediate

Learning Outcomes:

  • Participants will observe that a great rubric should “show” not “tell” the skill that needs to be internalized by learners, especially for an important creative skill like Public Speaking in the virtual space (video speeches).
  • Participants will see rubrics for other feedback types, including peer review rubrics and discussion boards, to help build individual, internalized skills.
S1: A PBL course centered on advocacy and multi-modal writing that integrates authentic assessment and ethnography & S2: Building Connections between Institutional, Academic, and Student Affairs/Co-Curricular Assessment S1: Jaime Groockett - Rowan University & S2: Bri Lauka - Johns Hopkins University

S1: Writing Out Loud! Writing for Social Justice is an overview of a Project-Based Learning course designed for a First-Year Writing program. The course integrates multi-modal writing, equitable awareness of audience, and the hallmarks of PBL to create a real-world writing experience that empowers students to advocate for the nonprofit of their choice.

Session Level: Beginner

Learning Outcomes:

  • Participants will understand the role ethnography plays in developing a real-life, equitable, antiracist awareness of audience.
  • Participants will understand the design principles of PBL and be able to apply them to their own course design.
S2: University assessment goals are multi-dimensional. Institutional and academic assessment units typically directly measure student learning for attainment of learning outcomes and quality improvement, whereas student affairs/co-curricular assessment units frequently prioritize indirect and operational outcomes and measures. Despite their disparate processes, all efforts seek to advance institutional priorities, especially student learning and development. This session will discuss benefits, barriers, and strategies for collaboration across divisions and linking institutional, academic, and student affairs/co-curricular learning assessment; provide examples of such collaboration within an institutional context; and propose collaborative assessment practices that center around student learning and promote innovation. Despite barriers to implementation, connecting assessment goals and activities improves the utility of results, builds opportunity for collaboration and innovation across the institution, and helps students, staff, and faculty understand and communicate the value of the holistic learning experience, thus further enhancing student engagement and outcomes.

Session Level: Advanced

Learning Outcomes:

  • Participants will be able to articulate similarities and differences in the goals and methods of academic, institutional, and student affairs/co-curricular assessment units.
  • Participants will be able to describe the benefits, barriers, and strategies for collaboration between institutional, academic, and student affairs/co-curricular assessment administrators and practitioners.
Sep 7, 2022
4:15 PM - 5:15 PM

Snapshot Session

SS1: Describing the strategy and process for improving the assessment program at a STEM graduate school. Andrea Baker PhD - Air Force Institute of Technology

SS2: Establishment of a new IDEAS Hub for Capstone Senior Design David Brookstein & Keya Sadeghipour - Temple University

SS3: Assessment and Accreditation: Quality Assurance and/or Quality Improvement Christopher Davis - University of Maryland Global Campus

SS4: Data-Informed Decision-Making in Higher Education: Evidence-based Evaluation of Digital Learning Sima Caspari-Sadeghi & Jutta Mägdefrau - University of Passau

SS5: Academic course transformation to technologically redesign introductory accounting to improve retention and pass rates Symon Manyara - Bowie State University

SS6: Getting Academic Program Reviews On Track: How our Academic Assessment Team Made it Happen Elizabeth Mosser - Harford Community College

SS7: Assessment Systems: How We Use Them and How Long We Keep Them George Nickels & Dale Carpenter - Western Carolina University

SS8: Closing the Loop: Using end-of-program survey results to improve a Doctor of Physical Therapy curriculum Sheryl Sanders, Tzurei Chen, Michael Bridges & Rebecca Reisch - Pacific University

Sep 7, 2022
5:30 PM - 6:00 PM

Closing

Closing Remarks Drexel Assessment Conference Planning Committee