
Sep 11, 2019
9:00 AM - 12:00 PM

Pre-Conference Workshops

Cognitive Load & Student Success: Examining the Critical Balance Between Content, Instruction & Assessment Kristen Betts, Dana Kemery & Joanne Serembus - Drexel University. Karyn Holt - University of Nevada, Las Vegas Pearlstein 101

Advancements in neuroscience are transforming what is known about the brain, mind, and learning. For educators, these advancements provide key insights that can be applied to course development, instruction, and assessment to support student success. This interactive session is designed for individuals who teach, plan to teach, design courses, or are involved in a course review process. The session discusses evidence-based practices related to student learning, the Carnegie Unit, credit-hour policy, and the critical balance between course content, instruction, and assessment. Attendees are invited to bring a laptop and syllabus to engage in individual and group activities. This session is facilitated by a transdisciplinary team of faculty from three colleges/schools at Drexel University.

The session begins with a short interactive presentation on the human learning process and cognitive load. Attendees then complete a self-reflective exercise. Following a course calculations demonstration, attendees have access to a cloud-based application to review their own course using interactive dashboards to examine three types of interaction: student-instructor, student-content, and student-student. A paper-based version will be available. Attendees will revisit the self-reflective exercise and then work in small groups to share preliminary findings. The session concludes with a collective discussion on policies/regulations, cognitive load, and shared best practices.

Attendees of this workshop will be able to:

  • Define cognitive load
  • Describe the effects of cognitive load on learning
  • Discuss the Carnegie Unit and U.S. credit hour policy
  • Assess three types of course interaction: student-instructor, student-content, student-student
  • Apply strategies that support learning in alignment with research on cognitive load/ overload
  • Integrate evidence-based practices into courses to support student success
Learning Outcomes:

1. Describe the effects of cognitive load on learning
2. Assess three types of course interaction: student-instructor, student-content, student-student
3. Integrate evidence-based practices into courses to support student success
    Dazzling Others with your Effective Teaching Through Rubrics, Analytic Tools, and Cool Charts Phyllis Blumberg & Kymber Taylor - University of the Sciences Pearlstein 102

    Higher education institutions need to demonstrate that their teaching is effective to varied stakeholders, including accreditors, legislators, students, employers, and the public at large. However, faculty and administrators may be at a loss on how to define good teaching and how to assess teaching practices. Learning-centered teaching is an evidence-based best educational practice that defines aspects of effective teaching (Blumberg, 2009). Teaching effectiveness can be measured using learning-centered teaching rubrics. This workshop will introduce a revision of the components of learning-centered teaching along with revised evidence-based rubrics. The rubrics suggest actions that increase learning-centered teaching. Using the assessment cycle, participants will learn how to collect, analyze, summarize, plan for change, and effectively communicate teaching effectiveness. All of these processes and methods identify strengths and areas for improvement and determine whether benchmarks were achieved. They foster teaching improvement actions both for individual faculty members and for faculty collectively within departments or colleges. We will discuss and practice using a currently recommended analysis tool, SOAR, and a new way to represent data, dumbbell charts, which can be used with a wide variety of data types.
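
    For readers unfamiliar with the chart type: a dumbbell chart pairs two values per category (for example, a rubric score before and after a course redesign) with a connecting line, so gaps are easy to compare across categories. Below is a minimal illustrative sketch in Python with matplotlib; the component names and scores are invented for demonstration and are not data from this workshop.

```python
import matplotlib.pyplot as plt

# Hypothetical before/after rubric scores (1-4 scale), invented for illustration.
components = ["Role of instructor", "Function of content", "Responsibility for learning"]
before = [1.8, 2.1, 1.5]
after = [3.0, 2.6, 2.9]
y = list(range(len(components)))

fig, ax = plt.subplots()
ax.hlines(y=y, xmin=before, xmax=after, colors="gray")  # connector of each dumbbell
ax.scatter(before, y, label="Before", zorder=3)
ax.scatter(after, y, label="After", zorder=3)
ax.set_yticks(y)
ax.set_yticklabels(components)
ax.set_xlabel("Learning-centered teaching rubric score (1-4)")
ax.legend()
plt.tight_layout()
plt.show()
```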

    Learning Outcomes:

    1. Participants will be able to use the evidence-based model of learning-centered teaching and its assessment rubrics to make improvements in their own teaching or to suggest ways for others to improve their teaching effectiveness.
    2. Participants will be able to use data to: a) plan for concrete actions to improve teaching using SOAR analyses and b) create easy-to-interpret graphs, including dumbbell charts, to concisely communicate individual and programmatic assessment results and to suggest concrete actions for quality improvement.
    Going Beyond Needs Assessments: Using Learning Metrics to Support High-Impact Faculty Development Carol A. Hurney - Colby College Pearlstein 302

    Developing high-impact educational development experiences for faculty often begins by determining their needs. To do this, many Centers for Teaching and Learning and other faculty development entities deploy surveys to their faculty and instructors. Unfortunately, needs assessment surveys, in my experience, tell educational developers what they already know: faculty need everything and have no time to address these needs. Additionally, most needs assessment instruments related to teaching only include items about pedagogical innovations or other instructional strategies and rarely ask faculty about needs related to growing a fulfilling teaching career in academia. Thus, program development emanating from needs assessment tends to target the “low-hanging fruit” that faculty think they need, rather than taking a more holistic approach to teaching and teaching careers. This session explores how the Faculty Learning Outcome (FLO) framework, developed as an assessment tool, can also support robust faculty development program design and implementation.

    Learning Outcomes:

    1. Participants will reflect on ways the Faculty Learning Outcome framework supports faculty development strategic planning processes.
    2. Participants will utilize assessment data from the FLO framework as a case study for creating high-impact educational development experiences.
    3. Participants will apply the Faculty Learning Outcome framework to their institutional context.
    Increasing Equity Using Evidence-Based Assessment Karen Singer-Freeman & Christine Robinson - University of North Carolina at Charlotte Pearlstein 303

    To achieve equity, we must examine the extent to which assessment choices contribute to the perpetuation of achievement gaps. Montenegro and Jankowski (2017) assert the importance of employing culturally responsive assessment as a means of increasing equity in higher education. In this workshop we will teach participants about a model of culturally relevant assessment (Singer-Freeman, Hobbs, & Robinson, in press) and evidence-based methods of assessment that measure learning equally well across all groups of students. We will share innovative uses of data from our campus in which we have disaggregated assignment grades to reveal areas in which assessments appear to evoke false achievement gaps. Participants will leave the workshop with an understanding of the role of assessment choices in the perpetuation of achievement gaps, methods of identifying achievement gaps using campus data, and a set of best practices to support increasing equity in assessment.

    Learning Outcomes:

    1. Participants will be prepared to identify questions about inclusive assessment and student learning that can be addressed using existing sources of campus data.
    2. Participants will be prepared to use data triangulation to evaluate student learning outcomes attainment.
    3. Participants will be prepared to communicate the results of data triangulation analyses to different audiences.
    Integration of General Education and the Major Sandra Bailey - Oregon Institute of Technology. David Marshall - University of California, San Bernardino & NILOA Pearlstein 307

    Our institutions tend not to be organized to function toward intentional alignment of student learning experiences. Students learn everywhere, but the institutional organization tends to require students to take the jumble of experiences and organize them for themselves. How do we help students make sense of it all?

    This workshop will use the Learning Systems Paradigm, a framework to help participants reflect on the organization of their institution, how work might be accomplished within that organization, and whom they might involve in that work. The framework encourages:

  • Working collaboratively across typical divisions
  • Intentionally aligning learning experiences
  • Addressing needs of the institution’s particular students
  • Building transparency for all participants and stakeholders
    The workshop facilitators will share with participants their experience at two different institutions with meaningful mapping of curriculum, integration of general education, and re-envisioning of assessment. Participants will leave with action plans for furthering this work on their campuses. They will learn about various resources and publications available to assist in their efforts to better align and integrate general education and the major; explore various approaches to curriculum mapping; and learn from national efforts to enhance the effectiveness of general education.

    Workshop participants will develop individual action plans based on an institutional alignment issue defined in the workshop. Implementation of the plan will be within the realm of the participant's current role at their institution.

    Learning Outcomes:

    1. Participants will test their action plans with a colleague, soliciting feedback.
    2. Participants will implement the plan utilizing resources within the realm of control of their current role at the institution.
    3. Participants will assess the effectiveness of the change implemented based on their action plan and make adjustments as necessary using the Design Thinking process modeled in the workshop.
    Trends in Assessment: Enduring Principles, Emerging Opportunities Stephen P. Hundley & Susan Kahn - IUPUI Pearlstein 308

    What should students know and be able to do? What credible evidence is used to determine progress toward learning goals? How can assessment practices and results support meaningful improvements in student learning and institutional effectiveness? These questions have been at the heart of assessment’s efforts in higher education for the last quarter century. But, recently, we have also been asking some new questions: Can assessment help us understand how and why, as well as what, students learn? How can assessment help us to educate and develop the whole student?

    This interactive workshop outlines enduring principles that have influenced the development of assessment and improvement practices and emerging opportunities for assessment, including implications for higher education’s future. Content for this workshop is informed by Assessment Update, a bimonthly publication from Wiley/Jossey-Bass with a national readership; the Assessment Institute in Indianapolis, now the nation’s oldest and largest event of its type; and Trends in Assessment: Ideas, Opportunities, and Issues for Higher Education, a forthcoming book co-edited by the workshop facilitators (Stylus; release date October 2019).

    Workshop Learning Outcomes

    1. Participants will be able to describe enduring principles that have influenced assessment and improvement practices.
    2. Participants will be able to explain emerging trends in assessment and improvement, informed by national perspectives.
    3. Participants will be able to discuss the implications of enduring principles and emerging trends for higher education’s future.
    4. Participants will be able to share additional resources to enhance our understanding of assessment.
    5. Participants will be able to develop action plans and priorities for incorporating assessment principles and trends in a given context.
    Sep 11, 2019
    9:00 AM - 4:30 PM

    AALHE Assessment Institute (All Day)

    AALHE Assessment Institute (All Day) Jane Marie Souza - University of Rochester. Catherine Wehlburg - Marymount University Skyview Room

    This AALHE Assessment Institute is a full-day workshop with lunch provided, offered to a cohort of no more than 40 participants. Dr. Jane Marie Souza and Dr. Catherine Wehlburg, both members of the Board of AALHE, will lead this workshop-style institute. These facilitators will bring a mix of theory and practice along with an engaging and participatory mix of information, practice, feedback, and skill-building. Participants will leave this institute with a solid foundation in the assessment of student learning, multiple resources, and a network of colleagues from across the country. Using their experiences at the course, program, institution, and national levels, the facilitators will foster lively conversations about what has worked, what hasn’t worked, and how higher education can best focus on improving and enhancing the quality of student learning at our institutions.

    The AALHE Assessment Institute is intended for…

  • Anyone who would benefit from a comprehensive review of assessment concepts, beginning with the basics
  • Anyone who would like to address knowledge gaps in their assessment education
  • Anyone who would like hands-on practice applying fundamental assessment strategies
    Topics will include, but are not limited to…

  • Defining assessment and assessment-related terms
  • Identifying ways to practice formative and summative assessment
  • Describing both qualitative and quantitative methods for gathering meaningful data
  • Illustrating good psychometric practices that can be used by anyone
  • Practicing good rubric development and use
  • Including many opportunities for discussion and active learning
    Note: Each concept will be introduced and immediately followed by learning activities and discussion. An important aspect of this Institute is the cohort-based approach. Participants will spend the day learning together, lunching together, and attending a plenary session together. By creating a network, participants will have access to each other, the facilitators, and many other resources long after the end of the program. Recognizing that each institution has a different mission and culture, this Institute will provide a framework for ways to better understand how to use information and data to inform decision making. The facilitators will work to use examples from many different types of institutions and will encourage dialogue among all participants in order to model good practices for determining how, when, and why to use assessment.

    Participants will leave with…

  • Handouts of all slides, case studies, and templates
  • Reference lists and other resources, shared during the session and in communications following the institute
  • Contact information for cohort members.
    Sep 11, 2019
    9:00 AM - 4:30 PM

    Work Integrated Learning Assessment Workshop

    Work Integrated Learning Assessment Workshop Kristen Gallo-Zdunowski & Karen Nulton - Drexel University. Nancy Johnston - Simon Fraser University GHALL 220

    Educators from any school interested in creating or augmenting work-integrated learning offerings (internships, cooperative education, or flexible work arrangements) are encouraged to attend the September 11 workshop. This hands-on experience will allow participants to link theory with practice and to grapple with real-world implementation choices. Internationally recognized facilitators will use sample data and scenarios from the Steinbright Career Development Center to engage participants in real-world discussions as they move into the rich and growing potential of work-integrated learning. We recommend that schools send teams of at least three to facilitate post-conference planning and implementation. Attendees are strongly encouraged to register for the full-day experience, though half-day sessions are available.

    The morning sessions will explore how to create, run, and assess a WIL program. Sessions will include:

  • Cultivating and maintaining professional work partnerships
  • Managing student applications for work opportunities and matching students with employers
  • Awarding credit for work-integrated learning and incorporating it with school curriculum
  • Collecting and analyzing assessment material from students and employers
  • Analyzing qualitative data to augment information from quantitative data
    The afternoon sessions will focus on how to make clear links between academics, work-integrated learning, and research. Sessions will include:

  • Using data from work-integrated learning to inform curricular changes
  • Using WIL data to facilitate faculty research programs
  • Analyzing WIL qualitative and quantitative data
  • Linking curricular efforts to WIL efforts
    Full-day participants will be able to:

  • Create and augment work-integrated learning experiences
  • Partner with faculty on work-integrated learning research
  • Use data from work-integrated learning experiences to enrich curriculum
  • Contact a network of colleagues both nationally and internationally
    Sep 11, 2019
    12:45 PM - 2:00 PM

    Welcome and Opening Plenary

    Welcome Remarks M. Brian Blake, Senior Executive Vice President and Nina Henderson Provost Mandell Theater

    Opening Plenary: Measuring Faculty Learning about Teaching: Evidencing the Impact of Educational Development Carol A. Hurney, PhD - Colby College Mandell Theater

    Biography

    Carol A. Hurney earned her Ph.D. in biology at the University of Virginia. Currently she is the founding director of the Center for Teaching & Learning at Colby College. In this role, she works with students and faculty to enhance the Colby academic culture through programs that encourage fresh perspectives on the teaching and learning endeavor, informed by the scholarly literature. During Carol’s 20+ years of college teaching, she has taught introductory courses for Biology majors and non-majors, infused with active learning, inquiry-based labs, and authentic writing experiences. Her scholarly interests include learner-centered teaching, active learning, and measuring the impact of educational development on faculty.

    Sep 11, 2019
    2:15 PM - 3:15 PM

    Concurrent Session 1

    Assessing Teaching in an Online-Learning Environment Sandy Figueroa & Carlos Guevara - Hostos Community College (CUNY) PISB 104

    The teaching ability of faculty has a critical impact on tenure and promotion. Interpreting the guidelines for peer observations already established in face-to-face courses to address course development criteria for the online environment can enhance both faculty professional growth and student achievement. This session examines assessing faculty teaching in an online-learning environment through course design and peer observations. The topics will be 1) establishing online course design criteria to support student success, 2) applying classroom observation standards to online teaching, and 3) using course design and peer observation as faculty development opportunities. During the workshop, participants will be able to correlate their campus’s design requirements for online courses with their peer-observation guidelines. Attendees will also be prepared to set a collegial atmosphere conducive to professional growth.

    Learning Outcomes:

    1. Participants will be able to correlate campus design requirements for online courses with peer observation guidelines.
    2. Participants will be able to apply peer observation guidelines to an online learning environment.

    Audience: Intermediate

    Civic Learning and Intercultural Competency: Key Tools and Strategies for Assessment Javarro Russell - Educational Testing Service (ETS) PISB 106

    Many courses, programs, and institutions are acknowledging the importance of constructs such as civic learning and intercultural competence. However, traditional assessment tactics, such as learning outcomes, alignment, and data collection, can face certain challenges when assessing these constructs. Developing learning outcomes can be impeded by an array of construct definitions. Aligning institutional efforts requires the consideration of curricular and co-curricular interventions. Thus, innovative strategies are needed when assessing these constructs. This session will review innovative strategies and tools to address the assessment of civic learning and intercultural skills. These include frameworks of knowledge, skills, and dispositions that articulate complex constructs, frameworks for aligning various institutional efforts, and the integration of multiple sources of data.

    Learning Outcomes:

    1. Participants will identify the knowledge, skill, and attitudinal components of civic learning and intercultural competence.
    2. Participants will compare and contrast multiple data sources in the assessment of civic learning and intercultural competence.

    Audience: Intermediate

    Making Hard Decisions: Using Data in Program Prioritization Process Barbara Chessler, Thomson Ling & Ellina Chernobilsky - Caldwell University PISB 108

    In the current landscape, institutions of higher education often face budget crises. This workshop will provide guidance on how assessment data can be utilized to help faculty, Institutional Research, and administration make difficult prioritization decisions. Only about half of university presidents feel confident about the financial viability of their institution over the next decade (Jaschik & Lederman, 2018). One way institutions have chosen to address financial challenges is through program prioritization. However, faculty should be involved and provide input in program prioritization decisions (AAUP, 1990). Participants will examine data at three levels (course, program, and institution) and will learn best practices for making data-driven decisions to determine which programs to keep, which to cut, and which to revise.

    Learning Outcomes:

    1. Participants will conceptualize the interaction and alignment between course-level, program-level, and institution-level assessment data.
    2. Participants will collaborate in examining the data and making informed decisions about program prioritization.

    Audience: Intermediate

    It’s not just about Academic Freedom: Building Bridges with Faculty at Large Colleges Teresa Frizell - Community College of Philadelphia. Tracy Kaiser-Goebel - Montgomery County Community College. Dorothy Schramm - Northampton Community College. Elizabeth Gordon - Community College of Philadelphia Pearlstein 101

    This panel discussion will highlight how staff at large community colleges serve our students by negotiating the sometimes conflicting demands to offer assistance in assessment practice, to report on assessment practice, to collect assessment data, and to honor academic freedom. At large institutions transparent, actionable data from student assessment is often collected, analyzed, and reported by staff. Questions over the legitimacy of assessment compliance requirements and academic freedom abound within the academy. Relationships between assessment staff and faculty, administration, and accrediting bodies are absolutely vital to these processes. At the same time, conflicting uses of data, repetitive reporting requirements, and secrecy can strain these relationships. Staff members will share strategies and practices we have used to help create and maintain workable relationships with faculty. This discussion will be example-heavy so that participants can contemplate and discuss the relevance of the topic to their day-to-day work lives and use their experience to help solve problems at their home institutions.

    Learning Outcomes:

    1. Participants will be able to articulate some of the theoretical and practical tensions inherent in assessment in the context of large institutions.
    2. Participants will be able to apply takeaways from the discussion to their daily work

    Audience: Intermediate

    Connecting Passion with Pedagogy: Authentic Community Engagement in Higher Education Mindy Smith & Shardé Hardy - Messiah College Pearlstein 102

    Within a traditional classroom setting, addressing tangible societal issues may be challenging. However, engaging students in service and partnership encourages realistic practice and reflection on experiences (de Groot, Alexander, Culp, & Keith, 2015). Creatively pursuing community engagement enhances student experience and has a positive impact on the world in which we live. In this session, participants are invited to consider diverse and meaningful opportunities for community engaged learning in higher education. This workshop-style approach will engage participants in personal reflection on connecting their passions with disciplinary objectives and community partners. There will be opportunity for discussion and practical examples from the presenters. A pedagogical emphasis on service and community relationships involves learners in sustainable practices to enhance wellness and address social justice through stewardship (Culp, 2016). Personal reflection and listening to the stories of others equip session participants to pursue action steps toward community engagement in their own professional work.

    Learning Outcomes:

    1. Participants will actively reflect on and discuss their own passions, relevant issues within their disciplines, potential community partners, and the support needed to integrate community engaged learning.
    2. Participants will create a concept map that shows connections between their interests and their program and course objectives, highlighting tangible action steps for creatively pursuing community engaged learning.

    Audience: Intermediate

    Assessment Awards: Sharing Effective Practices Faculty Assessment Fellows - Drexel University GHALL 108

    Each year, Drexel University awards an individual, group, and/or program with the Drexel University Assessment and Pedagogy Award. The award recognizes individuals and teams that have utilized assessment to improve teaching and learning initiatives and, as a result, have significantly impacted curriculum design and the overall quality of teaching and learning at Drexel. This year’s awardees, along with first runners-up, will participate in a panel discussion describing their efforts, successes, problems, and constraints when implementing assessment approaches at Drexel University. The panel will show how various strategies can be implemented at the same institution and that assessment is not a ‘one size fits all’ proposition.

    Using Focus Groups for Holistic Assessment on Campus Will Miller - Jacksonville University GHALL 109

    Complementing survey research and other assessment methodologies with more in-depth focus groups can help to triangulate student-based data on campus. In this session, we discuss how focus groups can be used as a key piece of the larger assessment pie, including an overview of design and analysis strategies. Ultimately, data today help campuses explain what is happening, but they do not always help us understand why. When we take the time to sit down and talk with students about their experiences, we are able to discover insights and information not available from a cursory look. The overarching goal of this workshop session is to send attendees back to campus armed with a new way to gather insights from their students, along with actionable templates and design ideas for maximizing student information on campus.

    Learning Outcomes:

    1. Participants will explain the value of focus groups for holistic assessment on campus.
    2. Participants will describe the necessary steps for successfully utilizing focus groups as part of a holistic assessment culture.

    Audience: Beginner

    Vendor Session: AEFIS Active Learning Lab (ALL): How to Build Comprehensive Learner Records AEFIS PISB 105

    The AEFIS Active Learning Lab (ALL) series provides all-inclusive sessions on selected topics where attendees can learn and build solutions specific to their institution or program via a backward-design approach. This particular lab unpacks the real-time assessment process of the Comprehensive Learner Record (CLR), a cloud-based outcomes transcript and evidence portfolio of student growth across the learning lifecycle. The CLR is a dynamic student outcomes transcript representing the shared responsibility of all stakeholders for student learning. Attendees create an end-to-end plan for how the CLR can be implemented at their institution. Takeaways include a schema of how they will provide evidence of learning, programmatic improvement, and achievements of curricular, co-curricular, and experiential education, all exportable and shareable with employers by students at any time.

    Sep 11, 2019
    3:15 PM - 3:30 PM

    Break 1

    Sep 11, 2019
    3:30 PM - 4:30 PM

    Concurrent Session 2

    Visualizing Data: How can we Effectively use Data to Tell our Story? Mark Green & Matthew Kegerise - Drexel University PISB 104

    Over 90% of the world’s data was created in the last two years, and 2.5 quintillion bytes of data are created every single day. DOMO (https://www.domo.com) published a report in 2018 estimating that by 2020, 1.7 megabytes of data will be created every second for every person on earth. For perspective, a quintillion is a billion billions. To equal a quintillion dollars, we would need to find one billion Bill Gates and put them in a room together. That’s a lot of data. And with it comes the demand for understanding what the data tell us and how to use them. In this session participants will learn tools and strategies for effectively displaying and visualizing data. The session will discuss using visualizations to explore a dataset to identify trends and ask deeper questions. We will work with a sample dataset and go through techniques for visualizing data. The goal is to see how exploring data with visuals can be a powerful tool when learning from a dataset.
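
    As a concrete illustration of matching a question to a chart type, the short Python sketch below generates a small synthetic dataset and explores it two ways: a histogram for a distribution question and a scatter plot for a relationship question. The data and variable names are invented for demonstration and are not from the session.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=42)

# Synthetic course data, invented for illustration: weekly study hours vs. final grade.
hours = rng.normal(loc=8, scale=2, size=200).clip(min=0)
grades = (60 + 3.5 * hours + rng.normal(0, 6, size=200)).clip(0, 100)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Distribution question ("what do grades look like?") -> histogram.
ax1.hist(grades, bins=20, edgecolor="white")
ax1.set_xlabel("Final grade")
ax1.set_ylabel("Number of students")
ax1.set_title("Distribution: histogram")

# Relationship question ("does study time track grades?") -> scatter plot.
ax2.scatter(hours, grades, alpha=0.5)
ax2.set_xlabel("Weekly study hours")
ax2.set_ylabel("Final grade")
ax2.set_title("Relationship: scatter plot")

plt.tight_layout()
plt.show()
```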

    Learning Outcomes:

    1. Participants will be able to utilize exploratory data analysis visualization techniques to analyze a dataset
    2. Participants will be able to identify the appropriate graph to use given the type of data they are displaying

    Audience: Advanced

    Snapshot Sessions: A Collection of Mini Presentations Various Presenters PISB 106

    SS1: The ‘It’s My Birthday!’ mini-challenge: Looking at Problem-based Learning to Assess Mathematics
    Sandy Vorensky, Metuchen School District

    SS2: Making Room for Your Own Office’s Assessment Plans
    John Andelfinger, Holy Family University

    SS3: Assessment on a Budget: Using Everyday Technology to Fulfill your Assessment Needs
    Stephany Giovinazzo, Adelphi University

    SS4: Three Standards of ESL Assessment: Where Do We Go From Here?
    Greg Jewell, Drexel University

    SS5: Answering the Call of the Adult Student Learner: Compressed and Distance Learning Course Options
    Thomas Licata, The University of the District of Columbia

    SS6: What's on the Menu: What can we Meaningfully Assess?
    Adrian Zappala, Peirce College

    SS7: Appreciate the Value of Student Evaluation of Teaching
    Jie Zhang, Stevens Institute of Technology

    Let Me Tell You a Story: Creating an Assessment Narrative Anne Converse Willkomm - Drexel University PISB 108

    In assessment, the numbers and data alone often don’t tell the whole story. When the assessment professional unpacks the numbers, student comments, faculty responses, and other inputs, they are better able to evaluate and communicate the benefits, challenges, and achievements of a course, program, or professor. Numbers predict numbers and provide little insight into the meaning of those numbers, whereas an assessment narrative speaks to student/faculty attitudes, behavior, and even effort, as well as programmatic vision and progress. For example, a professor may rank low numerically, but the narrative details a new, innovative teaching approach. This session will give attendees hands-on experience in using numbers and other data to create the narrative. While bar graphs tell one story, the narrative might suggest something completely different. In addition, the narrative not only informs programmatic changes and enhancements, but it also provides a meaningful vehicle for communication.

    Learning Outcomes:

    1. Participants will be able to identify at least three ways they can develop a narrative from standard assessment data
    2. Participants will be able to speak with colleagues about the value of using a narrative to convey assessment data, specifically identifying at least two reasons why using a narrative is beneficial

    Audience: Intermediate

    Data-Driven Development: Faculty Development Programming to Promote Greater Faculty Involvement in the Assessment Process Kathleen Landy & Ian Beckford - Queensborough Community College (CUNY) Pearlstein 101

    This session is intended to guide attendees through a process for designing faculty development programming based on local sources of institutional/assessment data. For those who attend, this workshop will support the design of professional development opportunities that are maximally relevant to the faculty for whom they are intended. This session highlights the necessity of an interdepartmental/interdivisional approach to identifying faculty development needs to promote student success. This session is relevant to both the higher education assessment and faculty development communities because it highlights the necessity of making data-informed decisions and design choices to ensure that faculty development programming responds to actual student, faculty, and/or institutional needs.

    Learning Outcomes:

    1. Participants will be able to identify local sources of assessment/institutional data to help inform the planning of meaningful faculty development programming.
    2. Participants will be able to articulate what “data-informed professional development” might look like at their own campuses.

    Audience: Beginner

    CANCELLED - Ideas to Frame and Capture those HIP Experiences on Campus - CANCELLED Tanya Williams - Hood College. Gigi Devanney - CampusLabs Pearlstein 102

    This presentation examines the variety of programs and areas where an institution can add High Impact Practice (HIP) outcome overlays for students and use an ePortfolio to demonstrate connections to active learning. Making active learning connections in cross-curricular and co-curricular ways can be challenging to capture. Using an ePortfolio and shifting the focus to the student’s interaction with the outcomes is a way for an institution to capture the HIP. A HIP demonstrates the connection between classroom learning and the experience. The application of experiential learning and student engagement in the content is important for institutions to be able to demonstrate effectiveness.

    Learning Outcomes:

    1. Participants will describe what a High Impact Practice is, as defined by AAC&U, and what it could look like on their campus, including ways to actively engage faculty and develop buy-in for framing and capturing these HIP activities.
    2. Participants will learn more about how to design an ePortfolio process to capture the identified HIPs through a close examination of three specific areas where HIP overlays can be created and applied.

    Audience: Intermediate

    Diving into Assessment and Bringing Students into the Assessment Loop Sherese Mitchell, Denise Cummings-Clay & Sarah Church - Hostos Community College GHALL 108

    Having an understanding of an assignment and the expectations of a professor is key for student success, so providing students with specific strategies for understanding and applying rubrics to their work is essential. The Hostos Community College Early Childhood Education Unit’s collective development and revision of a common assignment rubric will be explored; these are key elements that students should be aware of as well. As professors purposefully pared down program learning outcomes, decisions needed to be made about which to remove and which to keep while maintaining the integrity and rigor of the assignment. The idea that less is more was reviewed: making an impact with one or two program learning outcomes seemed more effective than treating many as a checklist. This session will provide attendees with a lens through which to view their current rubrics. The discussion will be a place to share best practices from other campuses. Gaining insight into the work of others and meeting on common ground can create a casual and supportive environment in which to revisit rubrics currently used in participant classrooms.

    Learning Outcomes:

    1. Participants will be able to re-evaluate/develop rubrics in their classrooms
    2. Participants will be able to assist their students in self-evaluation via rubrics.

    Audience: Beginner

    Something Old, Something New: The Importance and Feasibility of Product and Process in Writing Assessment William McCauley, E. Jann Harris & Andreas Mechsner - University of Nevada, Reno GHALL 109

    Many disciplines require students to 'show their work' for evaluation and assessment, but this is particularly essential in writing, where students often draw independent meaning and/or reach unique conclusions. Thus, we argue that meaningful writing assessment must include not only written products but some careful consideration of writing processes. This content matters because so many writing assessments, especially WAC/WID assessments, reveal shortcomings but offer very little insight into what circumstances or practices have produced those outcomes. In short, only looking at products does not reveal production flaws. Including processes in assessments of writing provides greater insight and options for intervention. This session uses current writing pedagogy to exemplify our larger themes and activate the benefits of including writing processes in writing assessment: teaching for writing transfer, teaching for writer agency/self-efficacy, and cloud-based word processing. Participants will leave with concrete applications for their campuses and tangible illustrations of our larger arguments.

    Learning Outcomes:

    1. Participants will be able to enhance existing writing assessments with questions/assessments of writing processes/development.
    2. Participants will be able to apply an understanding of writing as development/process to new assessments of written products.

    Audience: Intermediate

    Beginner Networking Event The Drexel Conference Planning Team Pearlstein 302

    New to the conference or just looking to network with other conference attendees? Come and join the conference committee in some fun and games to start off your Drexel Conference experience.

    Vendor Session: AEFIS Speed Networking Event AEFIS PISB 105

    Network with expert users and AEFIS Team members to learn more about best practices and winning assessment strategies. Solution-specific "stations" will be set up for collaboration in a speed-networking format. Come share and learn while you network with colleagues and friends from across the country and the AEFIS Team.
    Sep 11, 2019
    4:45 PM - 5:30 PM

    Ice Cream Social

    Ice Cream Social - Sponsored by AEFIS PISB Lobby/Registration Area

    Enjoy some ice cream and mingle with your colleagues. Rumor is that the Drexel mascot, Mario the Magnificent, may also make an appearance.
    Sep 11, 2019
    5:30 PM - 6:00 PM

    Return to Sonesta Hotel

    Sep 11, 2019
    5:30 PM - 7:30 PM

    CASTLE Pedagogical Happy Hour

    Hosting Happy Hours as a Strategy to Improve Teaching and Learning Center for the Advancement of STEM Teaching and Learning (CASTLE) - Drexel University CASTLE Idea Lab: 3401 Market St. Philadelphia, PA 19104

    Sep 12, 2019
    8:45 AM - 9:45 AM

    Morning Plenary

    Morning Plenary: Work-Integrated Learning Panel Moderator: Nancy Johnston - Simon Fraser University Mandell Theater

    Come meet our panel of experts, who will discuss Work-Integrated Learning (WIL) and how it helps students apply their academic knowledge in a professional work environment. What is it? Why does it matter? What is the challenge in assessing it? Is WIL the best example of the connection to our theme, “Moving from Concept to Practice”? This panel will be moderated by Dr. Nancy Johnston, President of the World Association for Co-op and Work Integrated Education (WACE).

    Panelists

  • Suzanne Carbonaro: Director of Assessment - University of the Sciences
  • Christian Jordal: Director of the Master of Family Therapy Program, Associate Clinical Professor, Coordinator of Student Experiential Learning - Drexel University
  • Andrew Wolf: Director of Educational Effectiveness & Assistant Professor of Clinical Nursing - University of Rochester
    Sep 12, 2019
    10:00 AM - 11:00 AM

    Concurrent Session 3

    Don’t Buy Wholesale: A Better VALUE in Learning Outcomes Assessment Jeffrey Bonfield - Rowan University PISB 104

    Technology has made assessment work easier, but for the most part it has been built to execute existing processes rather than to enable better ones. The presenters will share a methodology that makes novel use of existing technology to improve the scope and validity of learning outcomes assessment while providing data that departments can use to improve curricula, pedagogy, and classroom assessment. Typical general education and university outcomes assessment models involve collecting samples of student work that faculty evaluate using a common rubric, often AAC&U’s VALUE Rubrics. Doing so can provide reliable evidence of student learning, but it is time-consuming and often does little to directly inform pedagogy or classroom assessment. The assessment strategy detailed in this presentation has the advantage of measuring student attainment of learning outcomes across departments and throughout students’ undergraduate education, from a baseline in general education courses through upper-level courses. Employing this strategy can benefit institutions equally, regardless of whether they utilize the VALUE Rubrics.

    Learning Outcomes:

    1. Participants will be able to identify flaws in typical outcomes assessment methodologies.
    2. Participants will be able to implement an assessment methodology that has advantages over AAC&U’s suggested application of the VALUE Rubrics, both in the reliability of the data and in its pedagogical and curricular relevance to faculty.

    Audience: Advanced

    Including Different Voices in Assessment System (Re)design Royce Robertson - Le Moyne College PISB 106

    The purpose of the presentation is to provide an overview of a redesigned annual program assessment process, particularly the inclusion of different voices in the selection and mapping of outcomes and program assessments. The presentation includes a description of the revised annual assessment process and the design considerations of the system and plan, as well as the results of inquiries regarding the alignment of courses, assessments, and outcomes. A just (fair, equitable) assessment system is one that establishes compromise between the content, context, faculty, students, and professional community without simply being a unilateral determination. This session will help attendees apply assessment practices on their respective campuses: including underserved populations in the assessment system design and implementation ensures a diverse, broad, and encompassing approach. The presentation provides at least 15 individual actions aimed at increasing participation in the assessment system process and improving system maturity.

    Learning Outcomes:

    1. Participants will be able to define assessment system maturity and recognize its qualities at their home institution
    2. Participants will be able to identify specific ways and means of including multiple stakeholders in assessment system (re)design

    Audience: Intermediate

    Re-Thinking Service Learning for Enhancing Student Engagement and Strengthening Community Partnerships Melissa Krieger - Bergen Community College PISB 108

    The value that service has on students' engagement with their studies may be clear, yet assessing learning outcomes within this context is challenging. This session will explore the impact Service Learning participation can have on student engagement and retention, all while promoting mutually beneficial collaborations between an academic institution and local community partners. The presentation will include a review of a sustainable Service Learning project, one that includes practical strategies for development and assessment management. Participants will examine how civic engagement and competency can be fostered through a thoughtful and practical Service Learning project. Participants will explore Service Learning participation as a means to ignite a passion in students to make a difference in their community by focusing on their studies. There are specific criteria to consider when developing an effective project, and these aspects will be presented and examined.

    Learning Outcomes:

    1. Participants will be able to explore effective practices for planning and assessing Service Learning experiences
    2. Participants will be able to preview the organization of a sustainable Service Learning project and accompanying assessment tools

    Audience: Beginner

    Utilizing Educational Technology to Advance Career Development of Students and Faculty Members Michelle Schmude, Scott Koerwer & Erin Sutzko - Geisinger Commonwealth School of Medicine Pearlstein 308 (Formerly Pearlstein 101)

    This session will focus on how digital innovation is leveraged to support an advising, coaching, and mentoring model housed within a course to advance students’ professional identity formation and career development and achieve programmatic goals. In addition, the model has allowed faculty to advance their professional identity and career trajectory. This innovative digital career development model improves student academic success and retention and enhances placement into professional schools and careers within the healthcare arena. The model also provides faculty with much-needed professional development activities that support their career growth and professional identity. This session can assist attendees in understanding how using technology to innovate academic and co-curricular resources can positively impact student success. In addition, as faculty roles evolve, adequate training needs to be provided so they can develop and grow their advising, mentoring, and coaching skills, thereby advancing their professional identity.

    Learning Outcomes:

    1. Participants will be able to leverage educational technology as an engagement tool to promote learning outcomes related to coaching, advising and mentoring
    2. Participants will be able to construct and implement an all-inclusive faculty engagement plan focused on coaching, advising, and mentoring

    Audience: Intermediate

    Providing Catalytic Data Resources: Deciding what to Keep, Cut, and Tweak for Program Review Kate Colello - Saint Leo University GHALL 209 (Formerly Pearlstein 102)

    In assessment, we guide others through a critical look at their programs, but how often do we turn the magnifying glass onto our own “program”? This session shows how an Academic Program Review process used quantitative research techniques to improve our services, saving time and improving the effectiveness of data use. A lot of time and energy is expended to generate relevant data for programs’ use in program improvement, yet we have found that the data we provide often go unused. How can you decide which sources to continue providing, which to cut, and which to tweak in how they are presented? Participants will make a plan for evaluating the data they provide. We’ll address questions like: Which data act as the biggest catalysts for programmatic change? Attendees will discover which data should incite action but generally don’t inspire much change. This process of inquiry results in more efficient and purposeful support for programs.

    Learning Outcomes:

    1. Participants will be able to design and implement a plan for improving the effectiveness of their program review process.
    2. Participants will be able to improve the relevance of the data sources they provide to programs to inspire effective change.

    Audience: Intermediate

    Curriculum Mapping: An Effective Assessment Tool Phyllis Blumberg GHALL 108

    Participants will learn how to construct meaningful curriculum maps that relate to the mission of the institution and show data about student learning outcomes. They will receive a Google spreadsheet template and practice entering data into it. Once curriculum maps are completed, they offer useful assessment information. We will discuss how curriculum maps provide data to close the assessment loop, including how maps are useful for curriculum review and revision, for accreditation reporting, and as an aid to decision making for improvement. These maps can be shared with students to increase transparency and used to orient new faculty to the educational program.
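
    To make the structure concrete: a curriculum map is essentially a matrix of courses against program outcomes, with each cell recording the level of coverage. The sketch below illustrates this in Python with pandas, using the common I/R/M (Introduced/Reinforced/Mastered) convention; the course names, outcomes, and cell values are invented, and the presenter's actual Google spreadsheet template may differ.

```python
import pandas as pd

# Invented example: rows are courses, columns are program learning outcomes (PLOs).
# Cells use the common I/R/M convention: Introduced, Reinforced, Mastered.
curriculum_map = pd.DataFrame(
    {
        "PLO1: Written communication": ["I", "R", "M", ""],
        "PLO2: Quantitative reasoning": ["I", "", "R", "M"],
        "PLO3: Ethical judgment": ["", "I", "R", "M"],
    },
    index=["COURSE 101", "COURSE 201", "COURSE 301", "COURSE 499 (capstone)"],
)
print(curriculum_map)

# A quick gap check that supports closing the loop: does every outcome
# reach the Mastered level somewhere in the curriculum?
for outcome in curriculum_map.columns:
    mastered = (curriculum_map[outcome] == "M").any()
    print(f"{outcome}: {'mastered' if mastered else 'GAP - never mastered'}")
```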

    Learning Outcomes:

    1. Participants will be able to record meaningful, easy-to-understand data on curriculum maps about their educational program.
    2. Participants will be able to use the data on curriculum maps to inform decisions and close the assessment loop.

    Audience: Beginner

    Mindset As a Roadmap to Student Success Nicole Buzzetto-Hollywood & Bryant C. Mitchell - University of Maryland Eastern Shore. Austin Hill - Harford County Public Schools GHALL 109

    Can a mindset intervention built into a freshmen development course and developed after years of longitudinal research have a positive impact on the outlook, achievement, and persistence of first-generation and under-prepared students attending a minority-serving institution? For many new college students, freshman year can be an exciting and daunting experience. Freshmen development courses are designed to help excite, prepare, and orient students into the college experience, predicated on enhancing student success. The concept of “grit” as a set of traits that lead to the persistence and perseverance to complete long-term goals in the face of obstacles has been a focus in academia for the past 12 years. It is often coupled with such concepts as Growth Mindset, Clarity of Purpose, and Self-Efficacy.

    Learning Outcomes:

    1. Participants will learn about mindset interventions and their efficacy with students
    2. Participants will be exposed to a set of custom designed models, student-centered reflective exercises, and assessments used as part of a mindset intervention lesson with freshmen students

    Audience: Intermediate

    Vendor Session: Respondus PISB 105

    Sep 12, 2019
    11:00 AM - 11:15 AM

    Break 2

    Sep 12, 2019
    11:15 AM - 12:15 PM

    Concurrent Session 4

    Making Your Data Count: A Taxonomy, Process, and Rubric to Achieve Broader Institutional Impact Jennifer Harrison & Sherri N. Braxton - University of Maryland, Baltimore County PISB 104

    Assessment technologies can help contextualize learning analytics with student learning outcome evidence, but how can institutions integrate these data? Institutions need tools that integrate multiple measures of student success—especially direct evidence—to deepen insights about student learning. To bridge student success and outcomes data, we need software that enables institutions to aggregate outcomes data by rolling up direct evidence to the institutional level. Our session explores technologies that enable institutions to systematize outcomes data, so direct learning evidence can add depth and nuance to learning and predictive analytics and deepen our understanding of student learning. Our goal is to help faculty, staff, and other campus leaders create a culture of data-informed decision making by interacting with three tools we created to help institutional leaders begin to systematize learning assessment data: a taxonomy, a process, and a rubric.

    Learning Outcomes:

    1. Participants will be able to classify technology tools and their assessment uses
    2. Participants will be able to customize a planning process to their institutional culture and develop criteria to evaluate technologies for specific uses

    Audience: Intermediate

    Assessing Transformative Experience: Preemptively Addressing "When will I Use this in the 'Real World'?" Nick Dix - University of Northern Colorado PISB 106

    If the statement “when will I ever use this content again?” frustrates you, this session is for you. When designing curriculum and course schedules, educators believe the content in question logically fits and is delivered effectively. Some students may not agree. We, as educators, must ask why and adjust. Transformative experience focuses on the meaning and relevance of course content and course scheduling to scaffold on initial situational interest. Faculty and staff use transformative experience to improve student learning outcomes and retention. Central to the discussion is understanding how to assess curriculum and scheduling to promote transformative experience. Assessing transformative experience improves the work lives of faculty and advisors through examining content, four-year planning, and student interaction. The session will use effective teaching demonstrations, curriculum mapping, and proactive advising techniques to illustrate transformative experience. Attendees of this interactive session can share the content takeaways beyond the conference.

    Learning Outcomes:

    1. Attendees will be able to assess and navigate conversations regarding student interest development.
    2. Attendees will be encouraged to reflect on current content and four-year degree planning to encourage desired student outcomes.

    Audience: Intermediate

    Tame the Beast: How a Simplified System of Annual Assessment Reporting Can Conquer the Complex Task of Academic Program Review. Ellen Boylan - Saint Leo University PISB 108

    Producing an Academic Program Review can be as daunting as writing a dissertation. The burden can be eased, this presenter proposes, by breaking the major components of an academic program review into individual blocks of annual reporting that come together, after a set number of years, in a comprehensive, dynamic report that conveys currency, foresight, and agility at its core. Rapid response to conditions uncovered by yearly dives into assessment outcomes and administrative operations improves prospects for student learning and continuous program improvement, especially in the eyes of accreditors. Preserving faculty time and interest matters, too, since their commitment enhances efforts to meet the challenge of comprehensive Academic Program Review. This system of academic assessment is simple and uses technology readily available to any institution and assessment office. The process of creating the components will be described, including ways to involve faculty and executive administration in crafting the reporting vehicles, piloting them, solving problems, and promoting engagement in their use.

    Learning Outcomes:

    1. Participants will be able to discriminate the relative impact of their own assessment processes against the annual/comprehensive model presented in "Taming the Beast."
    2. Participants will be able to assemble a working group of key players who can collaborate, design, test, and operationalize a new, institution-wide assessment reporting initiative.

    Audience: Intermediate

    Using Assessment to Develop Leaders in Higher Education Assessment Terri Shapiro & Comila Shahani-Denning - Hofstra University Pearlstein 308 (Formerly Pearlstein 101)

    Assessment leaders must lead faculty, a group often thought of as “independent professionals,” toward shared, sustainable, data-based assessment strategies. Higher education institutions need to provide these leaders with the tools and skills to help their institutions meet the increasingly difficult demands of accreditation and of ongoing, meaningful assessment. While US organizations may spend up to $14 billion per year on leadership development (O’Leonard & Loew, 2012), leadership development in higher education is practically non-existent. We detail two leadership development initiatives, the second building on the first, in which we employed validated leadership assessment tools, based on personality and values, as the basis for a leadership program to help faculty and administrators taking on new roles, including assessment and accreditation. We will provide a framework for discussion and implementation of practical leadership initiatives that may work in higher education settings.

    Learning Outcomes:

    1. Participants will understand the importance of developing future academic/assessment leaders and how to begin.
    2. Participants will learn how to employ assessment tools to facilitate leadership development.

    Audience: Intermediate

    Experiential Learning and the Journey of Institution-wide Learning Outcomes Assessment Ingrid Kirschning - Universidad de las Américas Puebla (UDLAP) GHALL 209 (Formerly Pearlstein 102)

    We share the process for institution-wide changes to assess experience-based learning, along with lessons learned and results obtained to date. UDLAP committed to this project as part of the requirements for the reaffirmation of the SACS-COC accreditation (2015-2025). The 5-year journey has shown the importance of commitment at all levels. All academic accreditation agencies focus on assurance of learning (AoL). Generic rubrics can be adapted for others to use, but there is no recipe for the modifications needed in an institution’s organization, processes, and corresponding budget. It takes a great amount of effort, time, and resources that have to be considered and approved. Our testimonials, together with activities, should lead attendees to reflect on their own situations, regardless of the stage they are at in their AoL process, and to explore the requirements for an institution-wide change from their own institution’s view. This should assist in creating arguments to begin or to close the loop.

    Learning Outcomes:

    1. Participants will be able to identify the requirements to make a successful institution-wide change for LO Assessment
    2. Participants will be able to explore from a theoretical standpoint the feasibility of such a change and whether it is sustainable over time

    Audience: Advanced

    PLOs and the Future of Work: Using the Student Co-op Experience to Inform Curriculum Design Kristen Gallo-Zdunowski & Liza Herzog - Drexel University GHALL 108

    Millennials change jobs, on average, every two years, and GenZ stands to experience as much or more churn. To best position our students for the future of work, IHEs must understand evolving employer needs and priorities. Self-sufficiency, resilience, opportunity recognition, effective communication… these skills double as Program-Level Outcomes and as competencies central to employer needs. With students reporting their desire to practice these skills to better understand their application, it's critical for IHEs to design discrete ways for students to do so. This session will focus on the development of a career capstone course. Fast-moving information, finite resources, and a consummate gig economy mean that today's employers must continuously adapt their ways of doing business. Session participants will come away with insights into bridging PLOs and CLOs to professional workplace priorities, with ways to activate those outcomes to meet students' professional outlook, and with several design ideas around building their own curriculum supports to prepare students for today’s workforce.

    Learning Outcomes:

    1. Participants will explore students’ perception of their own learning in the workplace, including reflections on their ability to activate and leverage key skills.
    2. Participants will appreciate student understanding of the interconnectedness of academic curriculum and workplace experience.

    Audience: Intermediate

    Data Collection in Support of Institutional Effectiveness: An Accreditation Strategy Jane Marie Souza - University of Rochester GHALL 109

    What if there were a way to routinely collect information that the institution really cares about while also addressing standards? How might the Self-Study experience change? Participants will be provided with a strategy for meaningful data collection, and for using that information in a painless, focused way to consider and represent institutional effectiveness. Regional accreditors, specialized accreditors, and CHEA are interested in quality and improvement in every aspect of an institution’s operations (aka institutional effectiveness). Therefore, ongoing assessment of teaching/learning, as well as of all operations, is incorporated into their standards. Many institutions neglect to collect data on a regular basis to support their claim of institutional effectiveness. Rather, what often occurs is that, every eight years, the institution creates teams to collect information to address accreditation standards. Hundreds of person-hours are spent scouring files for evidence to check boxes.

    Learning Outcomes:

    1. Participants will be able to create a plan for routinely collecting information important to the institution that is aligned with accreditation standards (a template will be provided)
    2. Participants will be able to articulate to technology support personnel what is required to support the data collection

    Audience: Intermediate

    Guaranteed High Response Rate for Online Exit Survey: New Technology Solutions and Strategies Yilian Zhang - University of South Carolina Aiken Pearlstein 302

    One critical issue in online Student Evaluations of Teaching (SET) is the response rate. Various strategies have been suggested to improve online SET response rates: daily email reminders, short assessment questions, setting aside class time, and so on. However, low response rates on lengthy online program exit surveys remain a challenge. We developed a QR-code-based evaluation system that provided a guaranteed student response rate for such an online exit survey. Unlike the traditional email survey link approach, the system enhanced student anonymity through QR entry/exit codes. No student information was associated with a student’s response at any level, yet each student’s completion of the survey could still be tracked. These tools and strategies can be applied to online SET and other assessments where a high response rate is critical. The system also provides an efficient solution for midterm course evaluations because of its high degree of confidentiality.

    Learning Outcomes:

    1. Participants will learn about assessment tools that improve the response rate of online assessment
    2. Participants will understand how technology ensures confidentiality and anonymity of student responses

    Audience: Beginner

    Vendor Session: Respondus Respondus PISB 105

    Sep 12, 2019
    12:30 PM - 2:00 PM

    Luncheon & Plenary

    Leadership for Assessment and Improvement: Contexts, Imperatives, and Competencies Stephen P. Hundley, IUPUI Behrakis Grand Hall

    Presentation Description

    This session discusses the contexts, imperatives, and competencies necessary to fully realize leadership for assessment and improvement. Contexts refer to the places in which learning and improvement occur. Imperatives are the strategic considerations that leaders—at all levels and in various contexts—need to embrace in order to cultivate leadership for assessment and improvement. Competencies are the knowledge, skills, and dispositions required to effectively lead assessment and improvement practices and priorities. Participants will leave this interactive session with an action plan to inform their own contexts, imperatives, and competencies.

    Learning Outcomes:
    Upon completion of this presentation, participants should be able to:

    1. Explain the significance of providing leadership for assessment and improvement;
    2. Describe the contexts in which leadership occurs;
    3. Identify the imperatives necessary to cultivate leadership for assessment and improvement;
    4. Outline the competencies required to lead successfully; and
    5. Develop an action plan to enhance leadership for assessment and improvement in a given context.

    Biography

    Stephen P. Hundley, Ph.D. is Senior Advisor to the Chancellor for Planning and Institutional Improvement and Professor of Organizational Leadership at IUPUI. He chairs the Assessment Institute in Indianapolis and serves as Editor of Assessment Update. With Susan Kahn, he is co-editor of the forthcoming book entitled Trends in Assessment: Ideas, Opportunities, and Issues for Higher Education. Stephen has addressed audiences throughout the United States and in over 30 foreign countries. His prior administrative leadership roles include program director, department chair, associate dean for academic affairs, associate vice chancellor for strategic initiatives, and interim dean and associate vice chancellor for undergraduate education. He holds a bachelor’s and master’s degree from Virginia Commonwealth University and a doctorate from American University in Washington, D.C.

    Sep 12, 2019
    2:00 PM - 2:15 PM

    Break 3

    Sep 12, 2019
    2:15 PM - 3:15 PM

    Concurrent Session 5

    Improving Student and Faculty Success with Comprehensive Learner Records (CLRs) Suzanne Carbonaro - University of the Sciences. Caitlin Meehan - AEFIS PISB 104

    This session unpacks the assessment power of the Comprehensive Learner Record (CLR), a digital skills portfolio that helps students better understand their learning and share a verifiable record of their knowledge and accomplishments with various stakeholders, including employers. The CLR represents an intentional approach to creating agency, transparency, and shared responsibility among all stakeholders for student learning. Visible assessment design, adapted from the research of John Hattie, provides students and faculty with real-time feedback, with a novel approach to formative assessment built into its distribution and display. The CLR is a dynamic student outcomes transcript, a transition from grades to transferable learning experiences, highlighting curricular, co-curricular, and experiential education evidence, all of which students can share with employers. Attendees will begin to develop a plan for how the CLR can be implemented at their institution.

    Learning Outcomes:

    1. Participants will develop a plan for how a Comprehensive Learner Record approach can be implemented at their institution
    2. Participants will design a structure for evidence collection and feedback to impact student achievement and programmatic success

    Audience: Intermediate

    Snapshot Sessions: A Collection of Mini Presentations Various Presenters PISB 106

    SS1: Not All Who Travel are Lost: Becoming an Assessment-guide to Short-term Study-away Faculty
    Sharon Livingston & Don Livingston - LaGrange College

    SS2: Preparing Millennial African Learners for the 21st Century Challenges and Opportunities: Teaching, Learning and Assessing What Matters
    Timothy Chiwiye - Zimbabwe School Examinations Council

    SS3: Assessing Faculty Development: Exceeds, Meets or Does not yet Meet
    Karyn Holt - University of Nevada, Las Vegas

    SS4: Using Technology to Enhance Learning Outside of the Classroom
    Angelita Howard - Morehouse School of Medicine

    SS5: Invigorating Your Practice of Continuous Improvement
    Will Miller - Jacksonville University

    SS6: Making Sense of the Institutional Learning Outcomes Revision Process
    Beth Ross - Emmanuel College

    SS7: Evaluation Model for the Effectiveness of Entrepreneurship Education Based on the Triangulation Theory
    Zhichao Wang & Haibin Liu - Northeast Normal University

    Academic Program and Administrative Unit Review: Strategies for Developing an Effective, Comprehensive Assessment Process Beth Wuest - Texas State University PISB 108

    Holistic academic and administrative reviews quantitatively and qualitatively track change, guide planning, and promote improvement in higher education. This happens through thoughtful discussion among faculty, administrators, and staff as they examine the alignment between academics; administration; personnel; fiscal, physical, and technological resources; strategic objectives and institutional priorities; and student needs. This presentation demonstrates and encourages the pragmatic development of a robust, integrative, comprehensive, and formalized academic and administrative review process which includes strategies for guiding the preparation of a self-study, evaluation by external reviewers, and development of an informed plan of action. Because no single review process works for all institutions, this presentation illustrates commonalities and variations in academic and administrative review processes and assists administrators and assessment personnel in establishing and revising criteria, strategies, and techniques to devise and lead an effective process for specific institutional needs.

    Learning Outcomes:

    1. Participants will be able to examine criteria for developing policies and procedures to guide an effective, holistic review process to meet both internal needs and external standards
    2. Participants will be able to establish processes and techniques for efficiently reporting self-study observations, peer review findings, and plans for improvement

    Audience: Intermediate

    Taking a LEAP [Learn, Engage, Apply, Perform] into Leadership Exploration - Drexel University's Provost Fellows Rajashi Ghosh, Juan Poggio, Jennifer Quinlan, & Richard Frankel - Drexel University Pearlstein 308 (Formerly Pearlstein 101)

    The Drexel University Provost Fellows’ program provides exceptional faculty with a more expansive range of leadership opportunities at the University. As part of the program, Fellows become familiar with university-wide academic initiatives, strategic planning, budget issues, and overall challenges in higher education, while working closely with members of the senior administration to spearhead a project in the Provost’s portfolio. Fellows also receive mentorship and are able to conceptualize, develop, implement, and assess institutional initiatives.

    This session will highlight four projects from current and past Drexel University Provost Fellows. Participants will learn about envisioning and implementing projects that range in scope across Faculty Affairs; Academic Programs and Strategic Initiatives; and Assessment, Accreditation, and Effectiveness. These will include:

  • Mid-career faculty engagement and professional development for associate professors;
  • Strategic planning and higher education leadership;
  • Standardizing languages and processes around establishing institutional agreements, developing guidance processes and disseminating institution wide;
  • Envisioning, developing, and launching the NEW Teaching and Learning Center at Drexel University.

    A Way Forward: Mitigating Gaps in Performance Assessments Marylee Demeter, Brianna Bellanti, Bob Brown, Heather Hayes, Racheal Killian, John Morris, Rob Neilsen & Goran Trajkovski - Western Governors University GHALL 209 (Formerly Pearlstein 102)

    This session will focus on understanding the differences between manifest (observed, objective) and latent (subjective, underlying meaning) content presented within performance assessments (PAs), as well as the gaps often observed between the two types of content. When choosing language in a PA, we want to ensure the observed content represents the underlying task and its associated expectations. When language is misaligned with intent, it is difficult for students to accurately determine what is expected of them and, more importantly, to demonstrate their competence and success. This session will increase faculty awareness of language and identify best practices to ensure latent content is aligned and accurately represented in task descriptions and their associated measures. An aligned approach leads to increased student understanding of task requirements, resulting in increased demonstration of student success and competency.

    Learning Outcomes:

    1. Participants will develop an understanding of the differences between manifest and latent content in performance assessments
    2. Participants will identify best practices and cultivate a framework that aligns manifest and latent content in the development of PAs

    Audience: Beginner

    Overcoming the Barriers to Subject-Specific and College-Wide Assessment: Case Studies from an Urban Four-Year Public College Hollie Jones & Augustine Okereke - Medgar Evers College GHALL 108

    The presentation includes a review of the strategies a public college used to address barriers to the use of assessment among stakeholders. We include assessment case studies from an adult learner initiative, an accredited nursing program assessment, and broader institutional assessment. Specific strategies and tools will be shared with attendees. This content will be useful for staff balancing assessment needs and requirements for various programs, initiatives, and stakeholders. Attendees will learn about the following assessment strategies to improve the use of assessment among stakeholders: (a) balancing the use of external assessment and teacher-created assessment in the assessment of learning, (b) aligning assessment with various educational standards, student learning objectives, and curriculum and (c) assessment for discipline-specific accreditation.

    Learning Outcomes:

    1. Attendees will be able to identify the barriers to assessment among stakeholders
    2. Attendees will be able to describe potential strategies for addressing assessment barriers in their own institution

    Audience: Beginner

    Institutional Assessment Practices that Align with MSCHE Revised Standards Janet Thiel - Georgian Court University GHALL 109

    Each of the revised MSCHE Standards of Accreditation requires evidence of compliance through periodic assessment of the standard. These standards differ in content and focus, so there is no "one size fits all" approach to gathering this evidence. The presentation will help the audience develop a toolbox of such methodologies. The presentation and exchange of ideas will focus on periodic assessment of each of the MSCHE Standards of Accreditation. The audience would include those responsible for or contributing to an MSCHE or other accreditation self-study.

    Learning Outcomes:

    1. Participants will identify simple and practical practices aligned with the periodic assessment of each of the MSCHE Standards of Accreditation.
    2. The presentation will provide a forum for exchange of ideas and practices related to the periodic assessment of MSCHE's Standards of Accreditation.

    Audience: Intermediate

    Vendor Session: SPOL SPOL PISB 105

    Sep 12, 2019
    3:15 PM - 3:45 PM

    Break and Network with Vendors

    Sep 12, 2019
    3:45 PM - 5:15 PM

    Concurrent Session 6 (Workshops)

    “You said Please, so I Thought Assessment was Optional?!” Developing Assessment Culture in Community Colleges Kalina White & Caroline Evans - Community College of Allegheny County PISB 104

    Community colleges face unique challenges in contextualizing assessment information often designed for universities. Furthermore, perpetual budget challenges and resistance to change can paralyze action. Although community colleges have traditionally responded nimbly to stakeholder needs, the consequences of failing to develop assessment solutions can be devastating. So how do we build a culture of assessment at a community college? This session will examine this question through guided peer-review discussion of the actions taken by one community college, where initial accreditation concerns flowered into an exciting time of broad, college-wide culture change. Through a very short presentation and guided discussion, participants will learn of one path for changing workplace culture, reflect on how the methods could be applied at their own college, share community college assessment-related wins and challenges, and learn strategies from other community colleges facing similar problems.

    Learning Outcomes:

    1. Participants will begin to develop a strategy for culture change at their community college
    2. Participants will network to share actionable items applicable to their home institution

    Audience: Intermediate

    Train the Trainer: Implementing Administrative Assessment Creation and Support Jan Schumacher, Deborah Tamte-Horan & Nicole Hammel - Muhlenberg College PISB 106

    Muhlenberg College views assessment as part of how we do our work and has put a process in place to assist with collecting information for internal and external (accreditation) use. The college believes in providing offices and departments with structures that enable each to determine priorities. This approach has helped to generate positive experiences within the campus community. Through the process of sharing Muhlenberg College’s administrative assessment evolution, participants will be encouraged to consider how to apply shared ideas and techniques to their own offices and campuses. The presentation will utilize a workshop-within-a-workshop format to demonstrate how to engage offices and departments in the assessment process. Participants will have a clearer understanding of how assessment helps to improve their offices’ work. Additionally, they will hear ideas from the presenters and other participants on how to present assessment techniques to others.

    Learning Outcomes:

    1. Participants will be able to conduct a workshop at their own institution
    2. Participants will understand the structure of administrative assessment planning for a division/office or campus-wide

    Audience: Intermediate

    DIY – Building Excel Dashboards with Institutional Data Mark Green & Lora Furman - Drexel University PISB 108

    Data is everywhere! Each day more data is created than we can effectively use. This workshop will focus on identifying, analyzing, and presenting meaningful data in an effective manner. Participants will review example dashboards and learn how to build them in Excel. By leveraging promising practices regarding data, we are able to help our Colleges and programs enhance their effectiveness through evidence-informed decision-making. This session will connect practitioners with common tools and show how to utilize them to answer complex questions. This beginner-friendly workshop will give practical examples of dashboards that may be used to present data to stakeholders. The session will review the aspects of effective dashboards and then turn to a practical application. Participants will be given access to a practice dataset in Excel and will learn the steps to recreate one of the example dashboards during the presentation. Participants who bring their laptops will walk away from the presentation with a dashboard they may connect to their own institution’s data.

    Learning Outcomes:

    1. Participants will understand the process of building an interactive dashboard in Excel
    2. Participants will be able to apply the foundations of effective dashboards to their own institution’s dataset

    Audience: Beginner

    Assessing Writing In and Of a General Education Program: Evaluating the Impact of a Required Competence on Student Learning and Success Russell Stone & Jane Detweiler - University of Nevada, Reno Pearlstein 308 (formerly Pearlstein 101)

    Assessing student learning within fundamental competencies is a challenge – and often a requirement – across all types of institutions. Participants will hear how the presenters have responded by designing and implementing an assessment of written communication project to provide data and direction for three campus audiences: students, faculty, and administrators. This workshop addresses the multiple perspectives that should inform an institution-wide assessment of written communication. The presenters will discuss how we implemented an assessment framework that accounts for the various backgrounds of our students and the expectations for student learning in our general education and major curricula. This workshop is designed to assist those with an intermediate level of experience in assessment in resolving two significant challenges in implementing meaningful evaluations of student learning: designing assessments that account for the diversity of experiences and backgrounds of an institution’s students and making institution-wide use of assessment results.

    Learning Outcomes:

    1. Participants will be able to apply knowledge of writing assessment fundamentals to designing and implementing effective, valid assessment of any student performance or competence that requires collection of performance data or material in written form
    2. Participants will be able to make use of data from an assessment of written communications skills to inform curricular design for general education and major programming

    Audience: Intermediate

    Attaining a University-Wide System of Assessment and Data Collection Through the Use of Rubrics Dana Scott - Jefferson University GHALL 209 (Formerly Pearlstein 102)

    This hands-on workshop will present how a university-wide system of assessment can be attained through the use of rubric alignment. Participants will review a broad range of rubric styles, work collaboratively on rubric development, and be presented with tools and insights to enhance and create their own rubrics. The session will examine assessment at both the macro and micro level, covering a comprehensive, university-wide system down to the classroom level. A collaborative exchange of learning, using a broad range of rubrics, makes this session appropriate for beginner to intermediate levels of knowledge. This workshop will introduce tools and examples enabling participants to advance assessment from classroom rubrics to university-level outcomes.

    Learning Outcomes:

    1. Participants will be able to align learning outcomes from individual projects with university-level outcomes
    2. Participants will be able to recognize a process for creating quantitative performance task assessment for a variety of assignments

    Audience: Intermediate

    Writing, Reviewing, and Revising Learning Outcomes: An exercise in wordplay Amy Simolo & Dr. Mary Jane DiMattio - University of Scranton GHALL 108

    Colleges and academic departments are under increasing pressure to provide evidence of assessment of learning outcomes. Properly worded outcomes allow for clear mapping between assessment artifacts and the goals of the course/program/institution. This session will explore the components of properly written learning outcomes (LOs), including their assessability and level of assessment per Bloom’s taxonomy. Through the analysis of sample LOs, participants will be able to identify common LO writing mistakes and make recommendations for revision. Faculty members and administrators in attendance will be able to use what they learn to evaluate and revise their own course or program LOs, ensuring that assessment of the LOs is accurate and appropriate. Attendees will be able to make recommendations for LO revision to ensure clarity and assessability.

    Learning Outcomes:

    1. Participants will be able to identify the components of a properly written learning outcome, as well as the common mistakes made when writing LOs
    2. Participants will be able to write and/or revise learning outcomes for their own courses or programs, and/or provide guidance to colleagues on LO revision

    Audience: Intermediate

    The Path Towards Meaningful Assessment: Student-Faculty Partnership Nicholas Curtis - Marquette University. Robin Anderson - James Madison University GHALL 109

    Institutions often invest in assessment with little return in relation to learning improvement. The purpose of this workshop is to assist participants in engaging students in the assessment process in a way that brings new insights into students’ educational experience and elevates their voice in the learning improvement process. While improvement of learning has become a greater focus of assessment, key stakeholders such as students often do not have a seat at the table. Without the perspective of our students, all inferences and actions resulting from the assessment process are based on unrealistic assumptions about the student experience. Participants will develop strategies for partnering with students to make better inferences from assessment data that are more likely to result in action. Participants will also develop strategies for partnering with students to reduce resistance to assessment among other key stakeholders (e.g., faculty and administrators).

    Learning Outcomes:

    1. Participants will discuss current/existing examples of how students are being engaged at various institutions with colleagues and supervisors
    2. Participants will be able to identify the potential for student engagement at each of the steps along the assessment cycle and create opportunities for student-faculty assessment partnerships at their own institution

    Audience: Intermediate

    AEFIS User's Meeting AEFIS Pearlstein 302

    The AEFIS Team is excited to host our next AEFIS Users Meeting as part of the Drexel Assessment Conference. As the highlight of our event, we will have Lightning Presentations by some of the greatest minds and leaders in assessment, continuous improvement and accreditation. We will also host the AEFIS Partner Awards Ceremony to show our gratitude and recognize some of our great partners.

    Sep 12, 2019
    5:15 PM - 6:00 PM

    Transportation to Hotel & Reception

    Sep 12, 2019
    6:00 PM - 8:00 PM

    Reception at the Museum of the American Revolution

    Network and enjoy drinks and appetizers at the newest Philadelphia museum. Make sure to explore the exhibits and don't miss the show. Shuttle service will be provided.

    Sep 13, 2019
    7:30 AM - 8:30 AM

    Continental Breakfast 2

    Sep 13, 2019
    8:45 AM - 9:45 AM

    Concurrent Session 7

    Moving from LOTS to HOTS: Integrating Ed-Tech Tools for Assessment Jayanthi Rajan & Soma Ghosh - Albright College PISB 104

    Formative assessments work well in undergraduate education when they incorporate interactive student engagement. In this session, we will share how some select Ed-Tech tools such as Flipgrid, Insert Learning, and E-link can be used to develop formative assessments that help students develop cognitive skills aligned with Bloom’s Taxonomy. This hands-on session will share the methodology of using the tools, effective teaching tips, formative assessment methods and lessons learned. Participants will have the opportunity to first interact with the tools and brainstorm on how to use them in their own classrooms. The presentation is relevant for instructors working with the challenges of low attention span amongst digital natives. The attendees will gain insights into how formative assessments can be incorporated into their courses using Ed-Tech tools through our first-hand experience and learning.

    Learning Outcomes:

    1. Participants will be able to choose tools that are aligned with their course objectives and learning outcomes
    2. Participants will be able to create short assignments for formative assessments

    Audience: Beginner

    Improving Your Assessment Process while Demonstrating Continuous Improvement Bliss Adkison & Janyce Fadden - University of North Alabama PISB 106

    Many of the processes and policies within higher education that are designed to reach institutional goals are outdated and siloed, and so inhibit the institution from advancing and accomplishing those goals. Unless higher education embraces the necessary changes quickly and efficiently, this stagnancy threatens the modern university’s ability to continue its legacy. This session will present the Lean Thinking methodology, which provides an institution with an adaptable, evidence-based practice for demonstrating continuous improvement. This methodology creates and standardizes processes that ultimately save time and money, thus producing a quality student experience, increasing the opportunity for research among faculty, and creating value-added experiences for staff in institutions of higher education. The Lean Thinking methodology provides the institution with a way to demonstrate continuous improvement in its programs and assessment processes. With it, academic programs are able to create an agile assessment cycle that produces concrete evidence of continuous improvement.

    Learning Outcomes:

    1. Participants will be able to understand how Lean, using Rapid Improvement Events, can be useful to Institutional Effectiveness
    2. Participants will be able to understand how Lean practices can result in more effective student learning outcomes and programs and accelerate the formative assessment of a program

    Audience: Intermediate

    The Role of Feedback and Holistic Scoring in Building a Growth Mindset Kimberly Chappell - Fort Hays State University PISB 108

    For success in the workplace, we need to assist students in developing a growth mindset as well as knowledge and skills. A growth mindset and increased self-efficacy can be developed through targeted feedback and holistic assessment strategies. This session highlights the principles learned through program improvement efforts regarding the role of feedback and holistic scoring strategies in building a growth mindset culture. Soft skills, information literacy, and self-efficacy were also developed significantly in the process. These principles and practical strategies are presented for use across any discipline. Approaching learning with a growth mindset increases student achievement (Dweck, 2006). Participants will leave the session with a set of principles for developing a growth mindset culture in their setting. Practical feedback strategies and holistic scoring methods will also be presented. Principles and strategies can be implemented immediately. Tips for building these across courses will be shared for broader application.

    Learning Outcomes:

    1. Participants will identify feedback principles and holistic scoring strategies to develop a growth mindset in students
    2. Participants will identify assessment opportunities to apply strategies for developing mindset in their discipline

    Audience: Intermediate

    Rethinking Assessment to Embrace the Faculty's Unique Approach and Encourage Participation Jacqueline M. DiSanto, Sarah Brennan, Kate Wolfe & Antonios Verelas - Hostos Community College/CUNY Pearlstein 308 (Formerly Pearlstein 101)

    Administrators at a community college were encouraged by faculty efforts to reexamine how the college’s original assessment plan could be tailored to suit existing measures in distinct departments. The presenters will share the growth of this initiative and how faculty now have a voice in an assessment project that began in administration. They will also share how, despite different approaches to assessment, one college was able to develop an assessment initiative to map learning outcomes to accreditation standards and institutional missions, overcome challenges at both institutional and departmental levels, and nurture the culture of assessment while building structures for accountability. Participants will identify assessment activities from their own institutional perspectives and their unique challenges and impediments to assessment. They will develop strategies for standardizing assessment initiatives that address faculty culture and consider institutional dynamics, and share strategies for mapping general-education and program-level learning to accreditation standards and the institutional mission.

    Learning Outcomes:

    1. Participants will identify campus-wide and department-level challenges that can impede an assessment project and discuss potential solutions.
    2. Participants will develop strategies for including faculty culture at the department-level and institutional dynamics in assessment projects.

    Audience: Intermediate

    Innovative Ways of Engaging Faculty in Assessment Practices Faculty Assessment Fellows - Drexel University GHALL 209 (Formerly Pearlstein 102)

    Not all faculty have the same knowledge regarding assessment. To promote assessment best practices, the University Assessment Fellows at Drexel University developed an Instructional Assessment Certificate Program. The program is an online professional development opportunity for full-time, part-time, and adjunct faculty. This presentation will describe the factors that led to the decision to develop learning modules, the content and structure of the modules, as well as the expected outcomes of the program. We will also demonstrate the online structure of the program.

    Learning Outcomes:

    1. Participants will be able to describe two different approaches to engage faculty in classroom assessment practices
    2. Participants will be able to describe essential components of a modular online course designed to instruct users in best practices of course-level assessment

    Audience: Intermediate

    “Speak my Language”: How to Translate Assessment into ‘Foreign’ Languages Dr. Kate Oswald Wilkins & Dr. Susan Donat - Messiah College GHALL 108

    To educators, it may sound like assessment professionals are speaking a different language. If assessment professionals are to help educators make evidenced-based improvements, they must be able to communicate about assessment in ways that resonate with educators from all disciplinary backgrounds. We present a matrix on the “language of learning” in various academic disciplines to help participants consider underlying assumptions, motivations, and values that differ from their own. Then, we discuss ways to adapt assessment goals to these audiences. This session is important in light of the significant shift from compliance-focused assessment to improvement-focused assessment (Russell, 2017). Meaningful campus assessment relies on faculty engagement (Hutchings, 2010), but only if assessment professionals “locate assessment in the commitments faculty hold” and communicate compellingly about assessment (Hackman & Jankowski, 2015, p. 20). We provide assessment professionals with rhetorical tools to address educators’ skepticism toward learning assessment, embedded in language about learning.

    Learning Outcomes:

    1. Participants will be able to describe common forms of resistance to assessment that arise from differences in various academic disciplines’ language (and underlying assumptions and values) about learning
    2. Participants will be able to explain how assessment can improve courses and curricula using the language of learning in a variety of academic disciplines

    Audience: Intermediate

    Don't GET Ready, STAY Ready for Accreditation and Reaffirmation Patti Griffin - Lipscomb University GHALL 109

    Assessment and the accreditation process cannot exist independently, since assessment is a vital part of the accreditation process, and a successful outcome is just that: a process. Don’t GET Ready, STAY Ready uses actual academic and operational data from each participant’s organization to experientially master the assessment and accreditation processes. This content matters because successful accreditation and reaffirmation require a demonstration of continuous improvement reflected in trended data supporting that improvement. Narrative alone describing success will not result in accreditation; the improvement must be demonstrated, "show me the money" style. Accreditation liaisons need to know this process. Once this process is in place, day-to-day work lives are improved, and the typical accreditation problems are solved. Hint on how to know it is a process problem: when you go home at night and someone says “How was your day?” and you say, “It happened again,” it’s a process problem.

    Learning Outcomes:

    1. Participants will learn and practice the PDCA (Plan, Do, Check, Act) model of continuous improvement required for both assessment and full accreditation
    2. Participants will leave the workshop confident in their ability to return to their organization and successfully lead the assessment and reaffirmation process by using the following skills: strategic thinking and planning, brainstorming, and process improvement

    Audience: Beginner

    Sep 13, 2019
    9:45 AM - 10:00 AM

    Break 5

    Sep 13, 2019
    10:00 AM - 11:00 AM

    Concurrent Session 8

    Using Online Surveys for Internal Assessment: The Process from Idea to Final Report Molly Sapia & Dana Dawson - Temple University PISB 104

    Surveys are a powerful tool for gathering quantitative and qualitative data from hundreds of people at once, but knowing where to start can be daunting. Findings are only as good as the survey design and data-gathering process. We will introduce participants to all they need to conduct a survey properly on campus, including the nuances of question and survey design, the navigation of university bureaucracy, the psychology of survey response, the software available, and crucial data considerations. This session introduces each step in conducting a survey for assessment: moving from idea, to research question, to a well-designed questionnaire, to an effectively programmed online instrument, to data analysis and writing. We rely on both the survey methods literature and examples of surveys we have conducted on our campus for assessment in GenEd. Participants will begin thinking through how to use a survey to answer their own research questions and will gain resources to continue learning after the session.

    Learning Outcomes:

    1. Participants will be introduced to the basic principles of survey questionnaire design, and gain a good idea of where to continue this education later on their own
    2. Participants will learn the logistical steps to survey programming, distribution, and analysis in a university setting for the purposes of assessment

    Audience: Intermediate

    Values-Centered Assessment: Moving from Compliance to Transformation Joel Bloom - Hunter College (CUNY) PISB 106

    With increasing demands on assessment for compliance, assessment professionals have often focused either on quality issues of validity and reliability or on quantity issues – producing a report for our evidence inventory. This session will start with a review of the recent literature on assessment, institutional effectiveness, and student success, and move on to discuss ways assessment can be used to inform curricular and pedagogical improvements that will really transform students' educational experiences, consistent with institutional values, mission, goals, and strategic priorities. It's easy for assessment professionals to get bogged down in compliance and data issues (which programs submitted reports, and whether they did them "right") and lose sight of why we are here. This impacts our morale and makes it harder to make the case for why anyone should be doing assessment. In this session, I'll try to bridge that gap.

    Learning Outcomes:

    1. Participants will think about ways in which assessment can be about more than reliability, validity, and external compliance, and can also reflect our values
    2. Participants will begin to develop strategies on how to conduct values-centered assessment, by lining up priorities suggested by institutional or unit values with compliance and data quality concerns

    Audience: Intermediate

    Mapping the Curriculum: EdD Program Assessment as Faculty Inquiry in Action Joy Phillips & Deanna Hill - Drexel University PISB 108

    This session builds upon ongoing Drexel EdD faculty self-assessment work to illustrate how elements of national accreditation processes can be mapped to program curriculum. With this work, faculty exercise agency to engage in in-depth program inquiry that goes beyond simple accreditation compliance to focus on evidence of authentic graduate student learning. Scrutiny of higher education programs by external entities has prompted national and regional program accreditation organizations’ staff to expand and deepen their scope. Central to this process is the need for university program faculty to produce genuine evidence that students have mastered learning outcomes. Higher education faculty are not exempt. Traditionally, faculty engagement in program assessment has been cursory. The changing university assessment climate provides an opportunity to involve more faculty in a cycle of continuous program improvement. By taking an inquiry stance and using mechanisms such as curriculum maps, faculty can more easily pinpoint problem areas and identify solutions.

    Learning Outcomes:

    1. Participants will become familiar with an EdD (graduate) program curriculum map that includes national and regional program assessment standards and national (Carnegie Project on the Education Doctorate-CPED) program design principles
    2. Participants will have an opportunity during this interactive session to experiment with applying their own graduate programs to the provided curriculum map model

    Audience: Intermediate

    Assessment (not) Anonymous: The NJ Assessment Affinity Group Danielle Zimecki-Fennimore, Ed.D - Rowan College at Gloucester County. Marianne Baricevic, Ph.D - Raritan Valley Community College. Paula Roberson, Ed.D - Hudson County Community College. Terri Orosz - Bergen Community College Pearlstein 308 (Formerly Pearlstein 101)

    While regional accreditation and assessment requirements expand, too often institutional budgets contract. It has become more important for institutions to share their experiences and knowledge. The New Jersey statewide assessment affinity group creates a structure for assessment and institutional research specialists to meet to discuss assessment, accreditation and institutional effectiveness. This session will describe the history, structure, and function of a statewide assessment affinity group. Participants will learn how an affinity group or similar body can provide a forum for professionals to share their knowledge of assessment, accreditation and institutional effectiveness, identify best practices and offer support to one another. A sector wide affinity group provides two main benefits. First, it serves as a resource for those responsible for assessment, accreditation and institutional effectiveness. Members can turn to the affinity group for ideas and best practices. Second, it encourages colleagues from various institutions to develop a network and sense of camaraderie.

    Learning Outcomes:

    1. Attendees will recognize the purpose and benefits of a sector wide affinity group
    2. Attendees will be able to structure a system-wide affinity group

    Audience: Beginner

    Generate Mindful Movement in Diversity and Inclusion Planning Jacqueline Snyder - SUNY Fulton-Montgomery Community College. Mary Ann Carroll - SUNY Herkimer County Community College GHALL 209 (Formerly Pearlstein 102)

    Does your college include some combination of the following words in its mission, vision, values, or goal statements: diversity, inclusive, global, tolerance, equity, or multiculturalism? Do these equate to more of a concept on your campus than a practice? Does your institution have an articulated diversity and inclusion plan that is a bit confusing, vague, and collecting dust? Higher education has a long-standing reputation for appreciating both diverse and inclusive values. However, there is a difference between a reactive and a proactive inclusive culture. Committing to a proactive vs. reactive culture takes strategic thinking, a documented plan, and key performance indicators to create a collective responsibility for diversity and inclusion. In this session, participants will explore how two institutions are utilizing varied assessments to support diversity and inclusion plan development and implementation. Learn how the practice of continual evaluation creates avenues for campus-wide input, feedback, and dialogue. As an institutional plan that supports core values and mission statements, diversity plan goals should be infused in all planning documents (Strategic Plan, Enrollment Management Plan, Academic Affairs Plan, Student Services Plan, etc.). This session will provide concrete examples of how to map diversity and inclusion goals into key institutional planning documents.

    Learning Outcomes:

    1. Participants will be able to analyze current higher education diversity and inclusion practices and identify assessments that measure proactive diversity and inclusion plans
    2. Participants will be able to review evaluation strategies that support an infused and respected inclusive culture

    Audience: Intermediate

    Outcomes Outside the Classroom: A Collaborative Approach to Developing SLO Assessment in Student Support Services Andrea Kirshman, R. Chad Brown, Marlene Fares, Lori Lentz & Rachel Fager - Kutztown University GHALL 108

    Because student learning occurs both inside and outside the classroom, it is important to assess some of the learning experiences in both settings. Created in fall 2018, the Administrative Assessment Council comprises representatives of departments within academic affairs that directly provide student support services (e.g., Career Development, Disability Services, Grants and Special Projects, Registrar, Library Services, Undeclared Advisement, Information Technology, Tutoring and the Center for Academic Success & Achievement). As a council, members have over 20 years of assessment experience in academic and student affairs. As a group housed in academic affairs, we are committed to both providing and enhancing academic support services. Representatives from academic advising, the center for academic success and achievement, the library, the registrar, and tutoring services will provide specific examples of student learning outcomes designed for student support services. We will also outline the methodology currently in use to assess the student learning outcomes. Audience members will learn how to collaborate in the development of student learning outcomes for academic support services outside of the classroom. We also developed program outcomes to improve student services and will discuss how those relate to the learning outcomes. Audience members will be able to implement student learning outcomes assessment in academic support services.

    Learning Outcomes:

    1. Participants will be able to collaborate in the development of student learning outcomes for academic support services outside of the classroom.
    2. Participants will be able to implement student learning outcomes assessment in academic support services

    Audience: Beginner

    From the Ground Up: Designing, Implementing, and Assessing First Year Experience at West Chester University Rodney Mader, Lisa Marano, & Shannon Mrkich - West Chester University GHALL 109

    First Year Experience courses are high-impact practices implemented in many public universities to improve retention, completion, and engagement, especially among first-generation and under-represented students. WCU’s First Year Experience (FYE) pilot was very successful from student, participating faculty, and administration perspectives. WCU's FYE is a 4-credit course developed as part of General Education reform. Moving from pilot to full implementation, we measured student learning outcomes to understand consistency, monitor variance, and assess effectiveness among classes. WCU's FYE syllabi are developed through a “grass-roots” approach with common learning goals alongside disciplinary outcomes. Our training model offers institutional support but is driven by faculty expertise, teamwork, and multiple feedback points from faculty and students.

    Learning Outcomes:

    1. Participants will be able to discuss the benefits of a grassroots approach to developing FYE syllabi
    2. Participants will be able to generate ideas to support implementation and assessment of FYE on their own campuses

    Audience: Intermediate

    Sep 13, 2019
    11:15 AM - 12:00 PM

    Closing Remarks

    Closing Remarks & 2020 Registration Raffle Stephen DiPietro, Drexel Assessment Conference Chair PISB 120

    Come say goodbye and reflect on the conference with Conference Chair Stephen DiPietro. There will also be a raffle for a free registration for the 2020 conference.