2017 Program

Date/Time Session
Sep 13, 2017
9:00 AM - 12:00 PM

Pre-Conference Workshops

An Administrator's Guide to Fostering a Faculty-Led Assessment Process Jacob Amidon & Debora Ortloff, Finger Lakes University Pearlstein 302

The conundrum for those of us who are tasked with overseeing an assessment process at a college is that, for the process to be effective, sustainable, and meaningful, it must be faculty-led; yet faculty will not, on their own, embrace the assessment process. In this workshop we will explore several techniques and tools that can be used to foster a faculty-led assessment environment. These include reframing the act of assessment, building the capacity of the faculty to engage in assessment, creating efficient processes around assessment, and managing up to resource and protect the faculty-led process. Participants will work through several hands-on exercises around these core concepts so they can begin to create their own guide to apply within their own campus context.

Outcomes:

At the conclusion of this workshop participants will be able to:

  • Develop ideas for framing assessment on their campus.
  • Create an initial targeted professional development plan to support faculty leadership in assessment.
  • Map out efficiency improvements to support high-quality assessment.

Creating & Assessing Campus Climates that Encourage Civic Learning & Engagement Robert D. Reason, Iowa State University Pearlstein 303

After a brief discussion about the connections between campus climates and students’ civic learning and engagement, this session will focus on specific ways institutional leaders can create and assess those campus climates that encourage civic learning and engagement. Although the emphasis of the workshop will be on participants’ campus contexts, we will use data from the Personal and Social Responsibility Inventory (PSRI), an ongoing climate assessment project at over 40 institutions, to examine what we know about these relationships broadly.

Outcomes:

At the conclusion of this workshop participants will be able to:

  • Articulate an understanding of how climate shapes learning on college campuses.
  • Draw connections between current (and future) campus programs and climates that encourage civic learning and engagement
  • Develop a plan that incorporates campus climate, institutional policies and programs, and student engagement activities to comprehensively assess the development of civic learning outcomes

Ready, Set, Go: The New Middle States Standards and Your Assessment Practice Jodi Levine Laufgraben, Temple University Pearlstein 307

Implementation of the new Middle States standards provides an ideal opportunity to reengage your campus in conversations about assessment. How do your current practices align with the new standards? Where might you improve? In this workshop we will discuss strategies for using the new standards to renew faculty commitment to the assessment of student learning and reenergize the campus commitment to assessing institutional effectiveness.

Outcomes:

At the conclusion of this workshop participants will be able to:

  • Outline how their campus's strengths and weaknesses align with the new standards
  • Plan one or more ways to use the new standards to renew campus commitment to assessment

Assessment Toolbox: Supercharge the Direct Assessment of Student Services Michael C. Sachs, John Jay College Pearlstein 101

The Middle States Commission on Higher Education’s publication Student Learning Assessment: Options and Resources, Second Edition states that “the characteristics of good evidence of student learning include considerations of direct and indirect methods for gathering evidence of student learning.” Creating direct student learning assessment tools within student support services can be challenging for student service professionals. Many student service programs rely solely on indirect assessment techniques such as focus groups, evaluations, satisfaction surveys, and NSSE results.

This workshop will explore the direct student learning assessment tools available to Offices of Student Affairs and other services offices on campus. These techniques and tools are both qualitative and quantitative in intention and design. This workshop will also enable participants to develop program goals, rubrics, and direct student learning outcomes for their student service areas – linked, of course, to their college’s mission and/or strategic plan. Participants should bring copies of their institutional strategic goals and mission.

Outcomes:

At the conclusion of this workshop participants will be able to:

  • Explain the importance of direct assessment for planning, resource allocation and student learning.
  • Recognize and understand the differences between direct and indirect assessment in student services.
  • Create direct assessment of Student Learning Outcomes for their individual areas / programs that can be incorporated into assessment plans.

Leading Change: Tackling Institution, Program, and Individual Challenges that Derail Assessment Initiatives Catherine Datte, Gannon University & Ruth Newberry, Blackboard Inc. Pearlstein 102

In keeping with the theme Facilitating Conversations that Matter, this interactive workshop engages participants in conversations focused on successful change initiatives related to assessment. Participants will learn to implement the Kotter change model, prioritize initiatives, solicit support, and develop an implementation plan to move a change initiative toward success. Success involves a thoughtful, realistic project plan, driven by a coalition and supported by a “volunteer army” that can serve as spokespersons, role models, and leaders to move the effort forward. Participants will also learn from one another successful strategies to overcome barriers and resistance that limit forward movement. Attendees will document their SWOCh, gaps, and vision with the assistance of the co-presenters Catherine Datte and Ruth Newberry using the Change Leadership Workbook. In a combined approach of information gathering and self-appraisal, attendees will begin to develop their unique implementation plans and receive guidance regarding specific nuances and challenges related to their institution. Throughout the workshop, Catherine and Ruth will award books related to the specific challenges that are often associated with assessment planning, change leadership, and team building.

Outcomes:

At the conclusion of this workshop participants will be able to:

  • Identify and prioritize critical actions associated with best practices in program or institution assessment along with documenting practical action steps.
  • Learn strategies from peers and share challenges and successes.
  • Create individualized action steps that drive their assessment process.
Sep 13, 2017
1:00 PM - 2:00 PM

Welcome and Opening Plenary

Welcome M. Brian Blake; Executive Vice President for Academic Affairs & Provost, Drexel University Mandell Theater

The expectations placed on higher education to foster and document students’ active and deep learning have never been higher. We live in a time of economic uncertainty, global interdependence, and urgent challenges. If our students are to be equipped with the skills to succeed in such a future, we must reject any claims of quality learning that do not include as their focus students’ active learning and understanding and our ability to assess such claims.
At Drexel, our assessment activities are based on institutional values that aim to produce relevant and functional data for aligning curricular design, course content, and pedagogical approaches with Drexel’s mission and values. In all assessment activities, the faculty and staff endeavor to take full consideration of the different educational and cultural backgrounds of our increasingly diverse student population. The primary objective of our assessment program is to establish a practice of action research that informs planning and results in tangible improvements for our students.
In attending the Annual Conference on Teaching & Learning Assessment, you will enjoy three days of thought-provoking speakers, workshops, and invaluable networking on Drexel's beautiful campus, just minutes from the heart of historic Philadelphia and the birthplace of our nation. Come join us as we work together to ensure that all students have continuous opportunities to apply their learning to the significant, real-world challenges which, no doubt, lie ahead for them.

Opening Plenary: Creating a College Culture Where Assessment is a Pathway to Student Success. Sylvia Jenkins; President, Moraine Valley Community College Mandell Theater

Sep 13, 2017
2:00 PM - 2:15 PM

Break 1

Sep 13, 2017
2:15 PM - 3:15 PM

Concurrent Session 1

Building Faculty Support for a Quantitative Reasoning Requirement: Holistic Assessment of Curriculum and Learning J Bret Benington, S. Stavros Valenti, Frank Gaughan & Terri Shapiro, Hofstra University PISB 104

We will present a holistic model of outcomes assessment that addresses the ‘fit’ between learning goals and learning opportunities in the curriculum while also collecting data on student learning. To illustrate our model, we will present data and analyses from a recent assessment of Quantitative Reasoning. If done well, analyses of goal-curriculum fit can be powerful motivators for faculty and administration to cooperate on curricular innovation. This approach led to a broadly supported improvement in the general education curriculum at Hofstra, a quantitative reasoning requirement, that was adopted less than two years after first being proposed. This session will provide attendees with a blueprint for holistic assessment (combining curriculum analysis with student learning assessment) as well as a sustainable method for collecting data using online survey tools that could be scaled up to large numbers of participants with little added effort.

Learning Outcomes:

1. Participants will learn how to collect and analyze data on learning opportunities and engagement within the curriculum (i.e., goal‐curriculum fit).
2. Participants will learn a sustainable / scalable method for measuring student learning outcomes.

Audience: Intermediate

From First to Final Draft: Developing a Faculty-Centered Ethical Reasoning Rubric Genevieve Amaral, John Dern and Dana Dawson, Temple University PISB 106

In this session, we will address how faculty and administrators implemented a faculty‐centered rubric development process. Over the course of one academic year, the team developed, refined and deployed a rubric for the assessment of ethical reasoning in a core text program at a large, urban, state‐related institution. What began as an institutionally mandated process ultimately shed light on unstated, but inherent program goals, and created opportunities to raise awareness of how ethical reasoning informs text selection and learning activities. Presenters will review the program’s history, challenges and lessons learned during rubric development, and plans for implementation. Participants will gain insight into the creation of organic assessment tools that contribute meaningfully to day‐to‐day teaching and curriculum development, and the process of building rubrics to address skills such as ethical reasoning which can be ambiguous and value‐laden.

Learning Outcomes:

1. Participants will better understand the stages of crafting an assessment rubric, and strategies for involving faculty in all aspects of the process.
2. Participants will better understand how to validate a rubric and employ it to carry out a direct assessment of student learning.

Audience: Beginner

Students Leading the Way: Student-Driven Assessment Timothy Burrows, Virginia Military Institute PISB 108

This session details the development of a student-driven assessment of leadership outcomes that supports the Virginia Military Institute’s mission of developing confidence “in the functions and attitudes of leadership.” Often students are not familiar with the role of assessment in higher education and lack a general understanding of the processes in place to help an institution improve. Including students helped to create a high level of buy-in and a sense of ownership (Kuh, Ikenberry, Jankowski, Cain, Ewell, Hutchings, & Kinzie, 2015). This session is relevant because it provides a positive example of participant evaluation and assessment (Fitzpatrick, Sanders, & Worthen, 2011) in a holistic and natural setting. This process highlights how the relationship between several academic-support units and students can foster stakeholder buy-in and ownership.

Learning Outcomes:

1. Participants will be able to develop new possibilities for student-driven assessment practices at their home institution.
2. Participants will be able to debate the benefits, pitfalls, and challenges facing the implementation and use of student-driven assessments.

Audience: Intermediate

Closing the Loop on Data Collection and Program Improvement Chadia Abras and Janet Simon Schreck, Johns Hopkins University Pearlstein 101

This session presents how to collect effective data from course assessments using descriptive rubrics, and how the collected data can be analyzed and used to close the loop on course and program improvements. Creative and effective ways to derive meaningful inferences from assessment data sets will be explored. In light of an assessment-driven culture at most institutions of higher education and compliance with accrediting agencies, data-driven decision making is essential to maintaining successful and effective programs. Data analysis is central to assessing the effectiveness of student learning and curricular relevance, and closing the loop on data collection is key to making smart decisions in program design, improvement, and delivery.

Learning Outcomes:

1. Participants will be guided through effective strategies for creating descriptive assessment rubrics.
2. Participants will be exposed to strategies on how to analyze data for course, program, and unit level improvements. They will understand how to triangulate multiple measures in order to drive decisions for curriculum effectiveness.

Audience: Intermediate

Criterion Met. Now Time to Reflect Kathryn Strang, Rowan College at Burlington County Pearlstein 102

Rowan College at Burlington County’s assessment process serves as a systematic mechanism to measure the strengths and weaknesses of the college’s academic offerings on a continuous basis. Assessment Chairs use self-reflection summaries to highlight what they have learned by conducting the assessments, whether the criterion was met or not. Often these summaries involve very detailed and specific adjustments to the curriculum and instructional delivery. Kathryn will lead a PowerPoint presentation followed by a learning activity and a Q&A session designed for professionals with experience in assessment, teaching, or learning who seek to develop strategies for a continuous improvement process. In this session she will outline RCBC’s academic assessment process and the tools and strategies used to establish a strong continuous improvement cycle. Kathryn will take participants through the process of generating assessment results, interpreting those results, and analyzing their implications through the use of a reflection summary instrument. At the end of the session, participants will understand how outcomes can be used to create an environment of continuous improvement.

Learning Outcomes:

1. Participants will be able to foster a culture of continuous improvement by learning how to implement change based upon assessment outcomes from various well-defined performance indicators.
2. Participants will be able to foster a culture of continuous improvement by learning how to design a reflective summary tool to use at their college.

Audience: Intermediate

Implementing Assessment in Student Conduct: Understanding a Balancing Act of Challenge, Support, Accountability, and Growth Jeff Kegolis, The University of Scranton GHall 109

When considering assessment within a Division of Student Affairs, Student Conduct has historically been a functional area that creates challenges for administrators and educators. Although learning may take place over the course of a student's experience with the conduct process, it may be difficult to understand how a student believes they are growing through their circumstances and/or which competencies they have improved upon through reflection on and processing of their situation. Ultimately, this session focuses on how assessment of the conduct process was implemented, specifically related to conduct meetings and the results students identified related to their experience. Depending on the size of one's institution or Student Affairs division, attendees will be able to engage in dialogue about how best to implement assessment to understand competency measurement and the importance of connecting assessment to one's office mission statement, division's priorities, and university's strategic plan. Additionally, Student Conduct is a programmatic area that may be difficult to assess due to the nature of the process and a student's lack of willingness to be held accountable. Therefore, through this session, attendees will engage in conversation around their individual department's implementation of assessment and complete a SWOT analysis of how their assessment is implemented.

Learning Outcomes:

1. Participants will discuss best practices related to assessment within one's functional area, and the future direction of their assessment based on lessons learned from previous assessments.
2. Participants will acquire skills to help develop assessment of specific competencies in relation to their programmatic areas.

Audience: Intermediate

Faculty as Networked Improvement Community: Alignment of EdD Program Learning Objectives, Standards, and Measurable Outcomes Joy Phillips, Kathy Geller and Ken Mawritz, Drexel University GHall 209

This session describes how Drexel University School of Education faculty have aligned EdD program principles with Carnegie Project on the Education Doctorate (CPED) design principles, Council for the Accreditation of Educator Preparation (CAEP) Advanced Program Standards, Drexel Student Learning Priorities (DSLPs), and the Drexel School of Education Program Themes. The session provides an example of a participatory, bottom-up process to align outcomes and assessment activities, including data and evidence, with program-level learning priorities. Faculty in the EdD program aligned Program Learning Outcomes with national and institutional standards. Participants can use this process as a model for developing a cycle of continuous program improvement. Faculty will share a multi-step process of identifying program learning outcomes (PLOs), beginning with individual course learning outcomes. This interactive session provides participants with example templates as working documents to enable them to engage in such assessment work at their own institutions.

Learning Outcomes:

1. Participants will reflect on a discussion of faculty working as a networked improvement community to align EdD program learning objectives with national standards and measurable student learning outcomes.
2. Participants will gain examples in the form of templates for conducting program‐level alignment of program learning objectives, standards, and student learning outcomes.

Audience: Intermediate

Building a Culture of Assessment and Embracing Technology: A Communication Studies Program Success Patricia Sokolski, Jaimie Riccio and Poppy Slocum, LaGuardia Community College GHall 108

This session will tell the success story of a community college communication studies program faced with the challenge of implementing and assessing new general education competencies. Revising objectives and learning outcomes, creating new assignments, and implementing assessment mechanisms while wrestling with technical limitations resulted in a stronger and better articulated program. Building a culture of assessment is the best answer to the current skepticism over the value of a college education. The systematic inquiry into teaching and student learning provides a way for higher education institutions to demonstrate accountability. Our presentation will show the applicability of our college’s assessment model. We will explain how we gained faculty participation, how we implemented a loop system of assessment, how we overcame technological limitations, and what we learned in the process. This should help attendees who have to revise curriculum and develop relevant methods of assessment, especially for a digital ability.

Learning Outcomes:

1. Participants will formulate a proposal for programmatic assessment.
2. Participants will design exercises that assess digital ability effectively.

Audience: Beginner

Sep 13, 2017
3:15 PM - 3:30 PM

Break 2

Sep 13, 2017
3:30 PM - 4:30 PM

Concurrent Session 2

Self-Esteem is Doomed: A Paradigm Shift to Self-Compassion Allows Everyone to Thrive in Higher Education Laura Vearrier, Drexel University PISB 104

The goal of this session is to teach educators about the elements of self-compassion (self-kindness, shared humanity, and mindfulness) and how this construct is more productive than self-esteem. Self-esteem involves the need to feel above average and special in comparison to others and will inevitably wane in higher education settings. In a society where being average is unacceptable but the norm, most assessments will be perceived as failures and be unpleasant for the educator and the learner. Self-compassion involves transitioning from self-judgement to self-kindness, from isolation to common humanity, and from disconnection to mindfulness. This construct allows for a more positive experience. Self-compassion is for the attendee to learn about for personal well-being; they can then guide assessments with the principles of self-compassion for a more fulfilling, productive process for themselves as well as the learner.

Learning Outcomes:

1. Participants will be able to understand the components of self‐compassion and how it differs from self‐esteem
2. Participants will be able to apply self-compassion to themselves and then use the construct to guide productive self-reflection in learners

Audience: Advanced

Snapshot Sessions (5 Minute Mini-Sessions) Various Presenters PISB 106

Does Class Size Matter in the University Setting?
Ethan Ake and Dana Dawson, Temple University

I See What you Mean: Using Infographics and Data Visualizations to Communicate your Assessment Story
Tracey Amey, Pennsylvania College of Technology

The Impact of the 3R2V Strategy on Assessment Questions in the Science Classroom.
Deshanna Brown, Barry University and Broward County Public Schools

Assessing and Addressing the Digital Literacy Skills of First-Generation College Students
Nicole Buzzetto-Hollywood and Magdi Elobeid, University of Maryland Eastern Shore

Utilization of External Reviewers for Student Learning Assessment
Anthony DelConte, Saint Joseph’s University

Core Curriculum Outcomes: Reflections, Reactions, Results, and Other Assessment Tales
Seth Matthew Fishman, Marylu Hill and Peter Spitaler, Villanova University

Developing an Exceptional Academic Advising Program Using Student Satisfaction Survey Data
Debra Frank, Drexel University

Faculty Centered Assessment: Getting the Right People to the Right Place at the Right Time
Brooke Kruemmling, Salus University

Showing Educators How to Teach Traumatized Students
Jonathan Wisneski and Anne Hensel, Upper Darby School District and Drexel University

Assessing Critical Reflection: Learning in Faculty-led Short Term Study Abroad Programs: Students in Developed Countries Akosa Wambalaba, United States International University PISB 108

Critical reflection is a transformative learning outcome of the embedded faculty-led short-term study abroad program (Gaia, 2015; Russell and Reina, 2014), Windows to the World-France, at United States International University. We look at short-term study abroad assessment activities and assess evidence of critical reflection skill learning by developing-country students in a developed country, with implications for assessment objectives and methods. The focus is on assessing process-oriented transformative learning (Mezirow, 1998) through multiple channels: engaging field discussions, informal interviews (a novel method), videos, home campus presentations, and thematic research. Faculty in short-term study abroad programs must continuously provide evidence of the competencies students acquire that are transformative, transferable, and have a lasting positive impact on their adaptation to an increasingly interconnected world. Attendees will assess engaging paths to student self-discovery with examples of informal interviews, student videos, research, and mapping.

Learning Outcomes:

1. Audiences will learn approaches used to identify thought-provoking scenarios at short-term study abroad sites, using knowledge of student cultural assumptions gathered during the initial course phase on the home campus.
2. Audiences will learn how to use literary texts as discussion points to identify different and similar cultural perspectives between the home and study country, and how to use them as a basis for critical reflection.

Audience: Beginner

Both a Science and an Art: Designing, Developing, and Implementing Academic Program Evaluations that Work Erica Barone Pricci and Alicia Burns, Lackawanna College Pearlstein 101

This session is about building and implementing a useful, sustainable, improvement‐focused academic program evaluation model that meets the needs of multiple stakeholders, including faculty teaching within the program, faculty teaching outside the program, administration, students, accreditors, and the industries that ultimately seek to employ our graduates. Systematic, robust academic program evaluation is not only necessary to remain in compliance with accreditation requirements, but also informs decisions about the critical issues of resource allocation; faculty development, program growth, adaptation, or elimination; curricular revisions; and appropriateness of academic policies. Program evaluation results support and empower data‐based decision making. This session will provide attendees with information about how to launch, re‐energize, or maintain a program evaluation model on their campuses. It will share some of the potential challenges, as well as solutions and will focus on strategies to promote sustainability and faculty and staff engagement in the process.

Learning Outcomes:

1. Participants will learn specific action steps to implement a successful program evaluation model from conceptualization to actual delivery
2. Participants will learn tools to assist attendees with using the knowledge gained in the session, including an evaluation model and a template for sharing program evaluation results

Audience: Intermediate

Lost with ILO Assessment? No Worries, We Bet You Are Heading in the Right Direction Jacqueline Snyder and Mary Ann Carroll, SUNY Fulton-Montgomery Community College Pearlstein 102

Does your Institutional Learning Outcomes assessment feel like you are randomly winding down a long path toward a goal, trying to get from point A to point B? Do you have feelings of being lost and never reaching the destination? ILO assessment can appear as a never-ending, foggy path with many twists and turns, leaving you guessing whether a final documented systematic process will ever be achieved. Two State University of New York community colleges will share their experiences of feeling lost, yet eventually landing in a place to document ILO systems and processes that meet Educational Effectiveness Assessment (Standard V). Assessment leaders from SUNY Fulton-Montgomery CC and SUNY Herkimer CCC, both small, rural institutions, will provide an interactive session that focuses on concrete examples of how to optimize current resources to assess ILOs and draft an ILO assessment plan that meets Standard V criteria.

Learning Outcomes:

1. Participants will be able to identify institution‐level specific processes and systems available for data collection.
2. Participants will be able to draft an ILO plan outline to meet Standard V criteria.

Audience: Beginner

Our QuEST for Improving Learning: Year Two Analysis of Wellness Course Revisions Mindy Smith and Susan Donat, Messiah College GHall 109

We revised our general education learning objectives for our student wellness program in 2015‐2016. We continued improving our curriculum by revising course materials in 2016‐2017, changing our required readings to include multimedia content. Assessment findings revealed significant improvement in several areas of student‐reported learning, with implications for other content areas. Higher education professionals face the challenge of engaging students in relevant content with learning experiences that will inspire critical thinking and deeper application. Our session details the process of moving from an instructional focus on wellness research articles to evidence‐based wellness tools such as organizational websites, infographics, and TED talks. Improving student learning is an ongoing process, where one change can open additional avenues for improvement. Participants will reflect on the learning activities and required readings in their own courses and explore other creative tools that they could integrate based upon the findings from our assessment work.

Learning Outcomes:

1. Participants will interact with the revised general education wellness instructional resources at Messiah College and review the assessment findings regarding improvement in identified student outcomes.
2. Participants will raise questions and share innovative recommendations for engaging learners in relevant instructional content using creative and evidence‐based instructional resources.

Audience: Intermediate

Assessors of the Galaxy: Using Technology Integration to Shift a Culture Ryan Clancy, Mark Green and Nina Multak, Drexel University GHall 209

This presentation highlights the integration of new technology for exam mapping in a graduate-level health professions training program. Presenters will share the rationale, challenges, and successes of the assessment transformation. The development of an assessment framework will be explored by attendees through the presentation and an experiential learning activity. Use of current technology allows for more effective assessment development and implementation, student evaluation, and program self-assessment and improvement as required by accrediting organizations. These evaluation methods also allow for early student intervention and remediation before learners take high-stakes certification exams. Attendees will leave with an example of how the program used a change in technology as an opportunity to develop evidence-based practices and shift a culture toward continual improvement. In addition, the learning activity will promote a simple and effective way to start a discussion on exam blueprint mapping.
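
By way of illustration, here is a minimal sketch of what an exam blueprint map can look like in code. The outcome labels, question numbering, and target weights are hypothetical, not drawn from the presenters' program; the sketch simply shows the bookkeeping behind labeling exam questions and checking coverage.

    # Hypothetical exam blueprint: map each exam item to the outcome it
    # assesses, then compare actual coverage against intended weights.
    from collections import Counter

    blueprint = {  # assumed item-to-outcome labels
        "Q1": "patient_history", "Q2": "patient_history",
        "Q3": "diagnosis", "Q4": "diagnosis", "Q5": "diagnosis",
        "Q6": "treatment_plan", "Q7": "treatment_plan", "Q8": "ethics",
    }

    target_weights = {  # assumed intended share of items per outcome
        "patient_history": 0.25, "diagnosis": 0.375,
        "treatment_plan": 0.25, "ethics": 0.125,
    }

    counts = Counter(blueprint.values())
    total = len(blueprint)
    for outcome, target in target_weights.items():
        actual = counts[outcome] / total
        flag = "" if abs(actual - target) < 0.05 else "  <-- revisit"
        print(f"{outcome:16s} target={target:.3f} actual={actual:.3f}{flag}")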

Learning Outcomes:

1. Participants will be able to create a blueprint map and label exam questions.
2. Participants will be able to understand one process of using a change in technology to shift a culture towards continual improvement.

Audience: Beginner

30 Minute Split Sessions Debbie Kell, Deborah E. H. Kell, LLC and Melissa Krieger, Bergen Community College GHall 108

Session 1: Assessing Student Learning in Student Affairs: There’s Just Not Enough Time!
Debbie Kell, Deborah E. H. Kell, LLC

Student Affairs Divisions are challenged when asked to assess student learning. Unlike faculty, who see students for a full semester, SA staff often see students in large groups and/or in single, shorter, tightly defined sessions. This presentation highlights some tangible techniques that can be used to study the student learning take-away. Student Affairs and Academic Affairs should rightfully work in partnership in delivering on and assessing student learning outcomes, especially general education. But the assessment findings for each area need to be in place for meaningful conversations to yield intentional responses. Institutions wish to capture more of the totality of the student experience, especially relating to general education. As a “student-facing” service, student affairs divisions are a critical piece of the puzzle. Despite the challenges, it is possible to move beyond indirect measures such as office visits, participation rates, and surveys.

Learning Outcomes:

1. Participants will be able to articulate and study samples of measurable student affairs student learning outcomes and determine alignment with common general education focus areas.
2. Participants will be able to determine easily-deliverable assessment tools suitable for the typical student affairs environment.

Audience: Intermediate

Session 2: Assessment Tools for Experiential Learning and Other Highly Impactful Practices
Melissa Krieger, Bergen Community College

Experiential learning activities can be assessed through thoughtfully developed rubrics directly aligned with the learning outcomes instructors wish students to master. The presenter will share the assessment results of a student service learning project her students completed as an example of effective assessment of a High Impact Practice. Attendees will participate in an interactive visual and verbal presentation to examine High Impact Practices for teaching and learning and discuss possible assessment tools for gathering data on student learning. Participants will have the opportunity to brainstorm learning outcomes that may be challenging to assess via experiential learning activities. Thoughtful development of fair assessments is imperative, though challenging, for the assessment of learning outcomes in experiential activities and assignments. As mere student participation does not suffice as evidence that learning outcomes were achieved, attendees will have the opportunity to explore worthwhile assessments for engaging and impactful instructional practices.

Learning Outcomes:

1. Attendees will participate in an interactive visual and verbal presentation to examine experiential learning activities and possible assessment tools of these practices.
2. Participants will explore grading criteria that could be used to assess identified learning outcomes within experiential learning activities or other HIPs.

Audience: Beginner

Sep 13, 2017
4:45 PM - 5:30 PM

Ice Cream Social

Sep 13, 2017
6:00 PM - 10:00 PM

Night Out with the Phillies

Sep 14, 2017
7:30 AM - 8:30 AM

Continental Breakfast 1

Sep 14, 2017
8:45 AM - 9:45 AM

Morning Plenary

Morning Plenary: Reclaiming Assessment: Unpacking the Dialogues of our Work Natasha Jankowski, National Institute for Learning Outcomes Assessment (NILOA) Mandell Theater

Abstract: Assessment professionals increasingly find themselves navigating various spaces of difficult conversations. Is assessment for compliance or improvement? Why is it worth faculty time to engage in assessment processes? Have we created a reporting system at the expense of student learning? Is assessment fundamentally about improving individual student learning or improving institutional performance? This keynote will explore various lenses related to meaningful assessment by focusing on the conversations and philosophies underpinning our work. In order to reclaim assessment for our students and ourselves, we will unpack together the narratives and dialogues around our work so that we can best position ourselves to facilitate conversations that matter.

Dr. Natasha Jankowski is Director of the National Institute for Learning Outcomes Assessment (NILOA) and research assistant professor with the Department of Education Policy, Organization and Leadership at the University of Illinois Urbana-Champaign. She is co-author, along with her NILOA colleagues, of the book Using Evidence of Student Learning to Improve Higher Education, as well as co-author of the recently released book, Degrees That Matter: Moving Higher Education to a Learning Systems Paradigm. Her main research interests include assessment, organizational evidence use, and evidence-based storytelling. She holds a PhD in higher education from the University of Illinois, an MA in higher education administration from Kent State University, and worked with the Office of Community College Research and Leadership studying community colleges and public policy.
Sep 14, 2017
9:45 AM - 10:00 AM

Break 3

Sep 14, 2017
10:00 AM - 11:00 AM

Concurrent Session 3

Background, Methods, and Results of a 7-year Longitudinal Assessment of Undergraduate Business Writing Scott Warnock, Drexel University PISB 104

I will present a 7-year assessment of undergraduate business major writing, including results of more than 3,700 assessments of 2,000 documents and comparisons between English and business assessors. I’ll discuss two curricular interventions. Overall, the student writing was assessed as good, but we found differences between English and business assessors. This presentation is based on a unique, published research project that takes a longitudinal approach to writing assessment. I also describe specific issues with writing assessment and how this study attempted to overcome them, largely through a situated assessment approach. The way we conducted this assessment represents a potentially reproducible approach that departments and institutions could use for a variety of purposes, including satisfying accreditation requirements.

Learning Outcomes:

1. Participants will learn how to create a large, longitudinal writing assessment.
2. Participants will learn how situated assessment can help their campus assess the writing of its students.

Audience: Intermediate

The Wizards of Assessment: Peel Back the Curtain and Experience the Art and Science of the Assessor Mark Green and Ray Lum, Drexel University PISB 106

During this hands-on session, conference attendees will be invited to engage in a lighthearted but rigorous process of creating assessment tools. Whether the subject is complex or simple, evidence-based assessment techniques will be the foundation of the process. Be surprised as we demystify assessment by using the most unassuming subjects. Participants will work collaboratively with one another to develop assessment tools that are reliable and measurable. A panel of distinguished evaluators will determine the efficacy and validity of the tools. Alternatively, conference attendees may observe the quick-witted panel as participants gain insightful feedback and quips regarding their assessment tools. They will witness an array of techniques used. In addition, attendees will identify themes of best practice and tips for improvement. While networking during the session is prize enough for some, the top assessment tools presented will receive additional recognition and, of course, bragging rights for the year. This presentation will focus on making assessment relatable to all audience members and on promoting networking opportunities for attendees. The presentation will give an example of a way to engage colleagues in a conversation about assessment that is fun at the same time.

Learning Outcomes:

Audience: Beginner

Learner-Focused Assessment for the Creative Mind: Cultivating Growth for All Learners Amanda Newman-Godfrey and Lynn Palewicz, Moore College of Art & Design PISB 108

This session shares the process and results of learner‐focused assessment, specifically rubric designs that emphasize growth. Attendees will participate in an activity demonstrating the instrumentation of a learning focused visual arts activity and its subsequent assessment. Group discussion will question perceptions of ability typically seen as inherent to an individual. Visual arts education designed to support and fairly assess creative and diverse learners levels the performance playing field. Pedagogical practices focused on helping learners develop a mindset for growth rather than relying solely on performance‐based outcomes allows for enhanced teacher‐student dialogue, authentic assessment practices, and sustained engagement in coursework. This session will share how a belief in growth mindset (Dweck, 2006) instead of fixed mindset enhances not only how we see learning in our students, but how we engage in personal and professional development. Additionally, this session will offer strategies on reaching struggling students and diverse learners.

Learning Outcomes:

1. Attendees will gain the ability to use rubrics emphasizing growth mindset to assess students’ project specific performance as well as their learning readiness and potential.
2. Attendees will receive hands‐on experience using a student‐centered, growth mindset oriented rubric to assess their performance on a visual arts lesson and measure it against a traditional teacher‐directed didactic.

Audience: Beginner

Working Hand-in-Hand: Programmatic Assessments and Institutional Outcomes Frederick Burrack and Chris Urban, Kansas State University Pearlstein 101

This presentation will focus on managing programmatic and institutional assessment in ways that promote authentic assessment within disciplinary contexts and enable the collection and reporting of institutional data. The presenters will demonstrate examples of processes and technologies to integrate data from a variety of sources into a seamless continuum. Programmatic student learning is integral to the expectations of higher education in liberal education as well as occupational preparation. Managing assessment results so that they equally inform institutional and programmatic constituents is a challenge. The processes shared will be of interest to assessment facilitators, administrative leaders, and programmatic faculty. Automating assessment data collection and reporting allows programs and administrative areas the opportunity to focus on the meaning behind data and guide resulting decisions toward program and institutional improvement. Data visualizations encourage enhanced consideration of student learning and lead to insights not easily found empirically.
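
As a small illustration of the kind of automation described above, the following sketch aggregates program-level rubric scores into a summary a program or institution could report. It assumes scores have been exported to a CSV file with columns program, outcome, and score; the file name, column names, and benchmark are hypothetical, not the presenters' system.

    # Summarize rubric scores per program and outcome: mean score and
    # share of students at or above an assumed benchmark.
    import csv
    from collections import defaultdict

    BENCHMARK = 3.0  # assumed "meets expectations" level on a 4-point rubric

    def summarize(path):
        scores = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                scores[(row["program"], row["outcome"])].append(float(row["score"]))
        for (program, outcome), vals in sorted(scores.items()):
            mean = sum(vals) / len(vals)
            met = sum(v >= BENCHMARK for v in vals) / len(vals)
            print(f"{program} | {outcome}: n={len(vals)}, "
                  f"mean={mean:.2f}, {met:.0%} at benchmark")

    summarize("rubric_scores.csv")  # hypothetical exported file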

Learning Outcomes:

1. The participants will learn how automation of data collection and reporting can contribute to advanced consideration through analysis of student learning data.
2. The participants will recognize how programmatic student learning data can contribute to institutional assessment expectations.

Audience: Intermediate

Assessing Engagement in Active Learning Classrooms Dawn Sinnot, Susan Hauck and Courtney Raeford, Community College of Philadelphia Pearlstein 102

In this session, we will share a research study that evaluates student perception of engagement in parallel active learning classroom (ALC) and traditional classrooms. Significant findings include increased classroom problem solving, student collaboration, and student cooperation. Participants will gain perspective on both design and assessment decisions they might face in implementing similar facilities. As technology is implemented to facilitate learning, it is important to assess its effect to ensure continuous improvement. This session outlines a study done to measure the effect of technology on student engagement in an ALC environment using parallel control groups of the same course with the same instructor. This study provides context for developing innovative assessment strategies. This case study will engage the audience in considering what counts as evidence of the results of technology used in the classroom. Can tangible effects of integrating technology be measured, and how can these results be used to improve teaching and learning?

Learning Outcomes:

1. Participants will discover the purpose and practice of assessment and evaluation in Active Learning classrooms.
2. Participants will gain insight into how the Active Learning Classroom encourages problem‐solving, collaboration and cooperation.

Audience: Intermediate

Collecting Meaningful Assessment Data: an Accreditation Strategy Jane Marie Souza, University of Rochester GHall 109

The session concerns creating a non-threatening strategy for collecting assessment data from all operational units within an institution to inform accreditation reporting. The word “accreditation” often elicits a grimace when mentioned on campuses. It can be associated with countless hours spent retrieving information that has been forgotten or stored in files that can no longer be located. It is not unusual for institutions to reflect on all the required standards only when an accreditation event is imminent. However, there is another way. The session provides a framework to support a culture of continuous data collection such that use of assessment data becomes systemic, expected, and even celebrated each year through an internal report on progress made across the standards.

Learning Outcomes:

1. Participants will be able to create a plan for routinely collecting information important to the institution that is aligned with accreditation standards.
2. Participants will be able to implement the plan in such a way that it can be embraced by the campus community and become sustainable.

Audience: Intermediate

Peer-to-Peer Blueprints: Leveraging Hierarchical Learning Outcomes and Peer Consultants to Foster Faculty Discussions of Assessment Michael Wick and Anne Marie Brady, St. Mary's College of Maryland GHall 209

We will describe a toolkit used to enable a group of faculty leaders to propose, analyze, and refine student learning outcomes for courses and programs from disciplines disparate from their own, but within a consistent framework of institutional outcomes. The toolkit proved effective in elevating faculty engagement with assessment. The toolkit developed, peer-to-peer blueprints, is a highly effective means of quickly and painlessly moving faculty up the learning curve of student learning assessment. It fosters a community approach to assessment that provides administrative support to faculty who are already overburdened with work and too often underappreciated by external stakeholders. Attendees will gain access to concrete, fully implemented Excel templates in support of the peer-to-peer blueprint approach. The templates can be applied immediately in on-the-ground assessment processes.

Learning Outcomes:

1. Attendees will understand the peer‐to‐peer blueprint toolkit for student learning assessment as demonstrated by summarizing the blueprint's operation.
2. Attendees will be able to apply the peer‐to‐peer blueprint toolkit to student learning outcomes from their home institutions.

Audience: Beginner

We Need More: Novel Metrics for Classroom Assessment and Proposed Standards in Nonformal Learning Caitlin Augustin, John Harnisher and Kristen Murner, Kaplan Test Prep GHall 108

Non-formal education is a growing field that is a complement and corollary to formal education. However, most programs are evaluated by post hoc surveys that rely on a student’s memory rather than standardized assessments. This session focuses on three arms of non-formal education management: defining the space, discussing assessment, and proposing industry standards. Both students and professionals often lack the expert judgment to evaluate an investment in non-formal education, leaving them vulnerable to “gain claims” and other ambiguous statements of impact. However, non-formal education has been globally recognized as a tool for meeting educational needs. In a digestible format, we present results of both primary experiments and secondary analysis of success measurement in the non-formal space. Most non-formal learning arrangements include a digital learning component, and these learning management systems (LMSs) allow evaluators to observe how a student truly engages with a non-formal course. However, expected measures of engagement and effectiveness of non-formal courses have not kept pace with available data. We discuss methods of both capitalizing on, and setting best practices for, measurement and assessment that make the most of the LMS.
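
As a rough sketch of the LMS-based measurement discussed above, the snippet below computes a simple per-student engagement score from an event log. The event types, weights, and log format are illustrative assumptions, not the presenters' model.

    # Weight each logged interaction type and sum per student.
    from collections import defaultdict

    EVENT_WEIGHTS = {  # assumed relative value of each interaction
        "video_view": 1.0,
        "practice_question": 2.0,
        "discussion_post": 3.0,
        "quiz_submission": 4.0,
    }

    events = [  # toy event log of (student_id, event_type) pairs
        ("s1", "video_view"), ("s1", "practice_question"),
        ("s1", "quiz_submission"), ("s2", "video_view"), ("s2", "video_view"),
    ]

    scores = defaultdict(float)
    for student, event_type in events:
        scores[student] += EVENT_WEIGHTS.get(event_type, 0.0)

    for student, score in sorted(scores.items()):
        print(f"{student}: engagement score {score:.1f}")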

Learning Outcomes:

1. Participants will discuss multiple methods of performance measurement for non‐formal education programs.
2. Participants will be able to identify the value of standards of practice in non‐formal education and propose standards for evaluations

Audience: Beginner

Sep 14, 2017
11:00 AM - 11:15 AM

Break 4

Sep 14, 2017
11:15 AM - 12:15 PM

Concurrent Session 4

All About that ’Base: Database Design as Part of Your Assessment Toolkit Krishna Dunston, Community College of Philadelphia PISB 104

Webinars purporting to present a customized assessment database sometimes deliver only “how I explained to IT what I wanted.” This how-to presentation will showcase the design process of an Access web app. Participants will receive a copy of the presenter's Access app design as a starting point for building their own online assessment entry form and reporting site. Database design is rarely considered part of the skill set of the assessment professional. Yet the essential tasks of making alignment clear, collecting assessment data, articulating connections between learning outcomes, and creating reports for stakeholders are all tasks that can be made more manageable with a database. A good relationship with the offices managing your institution’s data is key, but relying on your own skills to manage ad hoc or creative approaches can become an immensely satisfying part of the job. And building your own solutions can be more effective than forcing data into an assessment management system designed to standardize.
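
For readers curious what such a design looks like under the hood, here is a minimal relational sketch of an assessment database: outcomes, courses, and the results that link them. SQLite stands in for the Access web app discussed in the session, and all table and column names are hypothetical.

    # Tiny assessment schema plus one reporting query.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE outcomes (
        outcome_id INTEGER PRIMARY KEY,
        description TEXT NOT NULL
    );
    CREATE TABLE courses (
        course_id INTEGER PRIMARY KEY,
        title TEXT NOT NULL
    );
    CREATE TABLE results (
        result_id INTEGER PRIMARY KEY,
        course_id INTEGER REFERENCES courses(course_id),
        outcome_id INTEGER REFERENCES outcomes(outcome_id),
        term TEXT NOT NULL,
        pct_meeting REAL  -- percent of students meeting the outcome
    );
    """)
    conn.execute("INSERT INTO outcomes VALUES (1, 'Quantitative reasoning')")
    conn.execute("INSERT INTO courses VALUES (1, 'MATH 101')")
    conn.execute("INSERT INTO results VALUES (1, 1, 1, 'Fall 2017', 78.5)")

    # Reporting query: outcome attainment by course and term.
    for row in conn.execute("""
        SELECT c.title, o.description, r.term, r.pct_meeting
        FROM results r
        JOIN courses c ON c.course_id = r.course_id
        JOIN outcomes o ON o.outcome_id = r.outcome_id
    """):
        print(row)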

Learning Outcomes:

1. Participants will examine the design components of an Access web app: tables, queries and views.
2. Participants will apply the process of building a custom entry form to solve an assessment challenge.

Audience: Advanced

Task-based Assessment: A Step-by-Step Guideline Ramy Shabara, The American University in Cairo, Egypt PISB 106

This session provides attendees with the opportunity to learn the techniques and procedures used to develop appropriate CEFR-aligned classroom task-based assessments through a step-by-step guideline, and to develop multi-stage rubrics to collect and document evidence about learners’ language skills. Task-based assessment is a key indicator of the quality and amount of language acquired by learners; however, many teachers know little about its characteristics, components, task selection and development, and rubric design. By attending this presentation, attendees will improve their classroom assessment competencies and practices; they will be able to design valid task-based assessments and multi-stage rubrics with which they can collect and document evidence about learners’ language skills.

Learning Outcomes:

1. Participants will be able to develop valid task-based assessments whereby students' language skills can be assessed
2. Participants will be able to design multi-stage task-based rubrics whereby students' language skills can be accurately assessed.

Audience: Intermediate

Cracking the Code of Creative “Capital”: Assessing Student Creativity in Science, Engineering and Technology Courses Jen Katz-Buonincontro, Drexel University PISB 108

This three-part session addresses the gap in assessment knowledge and practices in the area of STEM creativity. First, presenters will clarify the key operational definitions of creative thinking and creative problem solving (5 minutes). In the second part, seven PhD in Education students who completed a Creativity and Innovation in STEM Education course in the School of Education at Drexel during spring 2017 will discuss their assessment plans in these four projects (30 minutes):

  • Designing Digital Media Tools for Environmental Advocacy
  • Game-Based Learning in an Evolutionary Biology Classroom
  • Divergent Thinking in Ecology
  • The Creative Problem-Solving Process in Engineering

Attendees will glean relevant assessment ideas during the presentations and then discuss and jump-start their own assessment plans in the final, third part of the session (25 minutes). A Checklist for Assessing Student Creativity, including helpful planning tips and resources, will be provided to attendees. Although “critical and creative thinking” is one of eleven Drexel student learning priorities and is frequently mentioned as a 21st-century competency in national learning standards, faculty members, researchers, and teachers are underprepared to plan for and to assess students’ creativity.

Learning Outcomes:

1. Participants will gain concrete understanding of four new creative thinking and problem solving rubrics in science, engineering, and technology.
2. Participants will engage in hands‐on activities for adapting the rubrics to meet specific learning needs.

Audience: Intermediate

Comparative Program Assessment to Increase Student Access, Retention, and Completion Catherine Carsley and Lianne Hartmann, Montgomery County Community College Pearlstein 101

This presentation outlines a unique assessment process called PEER (Program Excellence and Effectiveness Review) that provided Montgomery County Community College with increased insight into access, retention, and completion issues across programs. The process supplements historical (and therefore somewhat problematic) IPEDS data with current student data to give the College a snapshot of what is happening right now in our programs while ensuring findings are actionable. Supplementing IPEDS data with current enrollment data allowed the College to make specific observations across programs. For example, programs with strong co-curricular activities also had higher student persistence rates or lower times to completion. Some programs enrolled students who were disproportionately female and low-income (based on Pell percentages), and some programs attracted students who were relatively more male and wealthy. Faculty and staff can now craft up-to-date and actionable interventions based on PEER data. Coordinators of programs that are alike in some way (regardless of enrollment or division) might benefit from discussions with other coordinators whose programs share similar issues; conversely, program coordinators who observe a specific program weakness could be matched with coordinators who were successful in those areas for help.

Learning Outcomes:

1. Participants will be able to craft actionable interventions based on up‐to‐date rather than historical data.
2. Participants will be able to compare programs across disciplines via new data elements and visualization tools.

Audience: Intermediate

Acting on Data: Lessons about the Use of Student Engagement Results to Improve Student Learning Jillian Kinzie, Indiana University Pearlstein 102

The ultimate goal of assessment projects, including the National Survey of Student Engagement (NSSE), is not to gather data, but to catalyze improvement. This session presents lessons about acting on data from institutional stories of NSSE use, including approaches to sharing results, triangulating data, and involving students in interpreting evidence. The content for this session draws from more than 50 accounts of NSSE data use from campuses that have effectively used results to facilitate conversations and take action to improve undergraduate education. Instructive, field‐tested lessons are important for illustrating foundational principles of assessment and furthering best practice. Attendees will gain a handful of field‐tested lessons about what works, reflect on their own data use and consider what stimulates and impedes action, and be encouraged to adopt a new strategy to take action on assessment results.

Learning Outcomes:

1. Participants will reflect on their assessment data use and consider what stimulates and impedes action.
2. Participants will be able to adopt a new strategy to take action on assessment results.

Audience: Intermediate

Critical Thinking: It's Not What You Think! Janet Thiel, Georgian Court University GHall 109

This session will examine the academic quality of various intellectual skills currently classified as critical thinking. Participants will consider the various nuances of critical thinking and its assessment. The definition of critical thinking will be teased out as problem-solving, reflective, self-aware, metacognitive, creative, and critique thinking. Appropriate teaching methods and ways to assess these intellectual skills will be presented. Participants will consider how critical thinking is defined and assessed on their own campus and within its various programs, for learning both inside and outside the classroom. Critical thinking has often been relegated to tests of inferential reading skills, or assessed through logic-intensive courses or quantitative reasoning. This presentation considers alternate ways to define critical thinking that are applicable not only to academic learning but also to assessment within student life experiences.

Learning Outcomes:

1. Participants will analyze critical thinking beyond the testing parameters of inferential reading ability.
2. Participants will consider appropriate assessment of various intellectual skills classified as critical thinking.

Audience: Intermediate

Text Analysis as Assessment for Ethical and Diagnostic Purposes Fredrik deBoer, Brooklyn College GHall 209

The purpose of this session is to demonstrate, practically and theoretically, how software-based text analysis can help administrators and instructors make more ethical and pragmatic decisions regarding student writing ability. Text analysis software lets us examine large sets of student writing very quickly, mining that data for information relevant to how we assess writing ability. This in turn allows us to make better decisions about placement and the use of scarce institutional resources. I'll show attendees how to access free-to-use software to analyze student writing samples and how that analysis can make our writing programs more fair, efficient, and effective.
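
The presenter's specific software is not named in this abstract. As one hedged illustration of the general approach, the free Python library textstat can batch-score writing samples on surface measures; the samples, cutoff, and placement labels below are invented for the example and would need local validation.

```python
import textstat

# Hypothetical writing samples; in practice these would be read from files.
samples = {
    "student_a": "The experiment demonstrated that reaction rates increase with temperature.",
    "student_b": "i think the book was good because it was interesting and i liked it a lot.",
}

for student, text in samples.items():
    grade = textstat.flesch_kincaid_grade(text)  # estimated U.S. grade level
    words = textstat.lexicon_count(text)         # word count
    # An invented advisory threshold; any real cutoff requires local validation.
    flag = "review for placement" if grade < 6 else "standard placement"
    print(f"{student}: grade={grade:.1f}, words={words}, {flag}")
```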

Learning Outcomes:

1. Participants will be able to identify text analysis software that is useful for making decisions in writing program placement and evaluation.
2. Participants will be able to name several types of analysis that can be undertaken using such software.

Audience: Intermediate

Assessment as Research: Using Compelling Questions to Inspire Thoughtful Assessment Practices Javarro Russell, Educational Testing Service (ETS) GHall 108

This session asks participants to reconsider their assessment process. By brainstorming the compelling questions an institution could ask, and by engaging in targeted data analysis activities, participants will gain insight into how to develop assessment processes that address compelling questions about student learning and success. Far too often, institutions begin to ask meaningful questions about student success and student learning only after they have started or completed their assessment process, and they then realize that they did not collect enough data, or the right kind of data, to answer their questions. Our assessment colleagues are often discouraged by the lack of data use in their assessment work. This presentation offers ideas for collecting the kind of data that could compel colleagues to action.

Learning Outcomes:

1. Participants will be able to identify compelling questions about student success and student learning that encourage the closing of the assessment loop.
2. Participants will be able to identify analyses for uncovering the performance of students on learning outcomes.

Audience: Beginner

Sep 14, 2017
12:30 PM - 1:45 PM

Luncheon & Plenary

A Panel Discussion with Accreditation Leaders Beth Sibolski, President, Middle States Commission on Higher Education; Belle Wheelan, President, Southern Association of Colleges and Schools Commission on Colleges; Patricia O'Brien, Senior Vice President, Commission on Institutions of Higher Education Behrakis Grand Hall

Sep 14, 2017
1:45 PM - 2:00 PM

Break 5

Sep 14, 2017
2:00 PM - 3:00 PM

Concurrent Session 5

Knowing More about our Students in Foundational Math and Writing Reform: Building Multi-Faceted Assessment on the Front End Fiona Glade, University of Baltimore PISB 104

In 2014, University of Baltimore redesigned developmental math and writing with a focus on student self-efficacy to improve retention. After describing the revision process and impact of the new structure, which includes Directed Self-Placement using multiple measures, the presenter will share multiple measures materials and show participants how to adapt them to their own contexts.

Learning Outcomes:

1. Participants will learn how Directed Self-Placement works based on multiple measures of writing assessment, including guided self-assessment.
2. Participants will learn how to use their own institutional context to create Directed Self-Placement appropriate to supporting developmental writing students.

Audience: Advanced

Snapshot Sessions Various Presenters PISB 106

Assessing our Assessment: A Process for Reviewing Annual Assessment Reports
Gina Calzaferri, Temple University

Turning 120 Annual Reports Into a Searchable Online Planning/Reporting Database Linked to Strategic Plans
Wenjun Chi, Saint Joseph’s University

Rubrics: Facets That Matter
Diane DePew, Drexel University

The Impact of Co-Curricular Activities as an Assessment Tool on the University Students
Muhammad Farooq and Gehan El Enain, Abu Dhabi University

Learning from the Assessment Process: HBCU Faculty Perspectives on Classroom and Program Review
Pamela Felder and Michael Reed, University of Maryland Eastern Shore

Enhancing Online Course Design by Developing Faculty-Administration Collaborations and Using Quality Matters Rubrics
Moe Folk and Doug Scott, Kutztown University

Collaboratively Assessing Collaboration: Self, Peer and Program-level Assessment of Collaborative Skills
Janet McNellis, Holy Family University

Developing an On-Line Simulation Activity: Assessing the Need and Implementing Action!
Margaret Rateau, Robert Morris University

Assessment in Science using I-LEARN Model
Hamideh Talafian, Drexel University

Implementing a Student Assessment Scholar Program: Students Engaging in Continuous Improvement Nicholas Truncale, Elizabeth Chalk, Jesse Kemmerling and Caitlin Pelligrino, University of Scranton PISB 108

This session covers the implementation of a student assessment scholars program. Student scholars aid in continuous improvement by collecting viewpoints via student-led focus groups and tendering suggestions to stakeholders within the university community. Topics include training students to design projects and how to administer the program. One goal of assessment is to improve the student learning experience, so it makes sense to involve students in the assessment process. Student scholars are able to collect evidence from their peers via focus groups that may not be accessible through the usual means of data collection (e.g., surveys, course evaluations). Assessment of a general education program and institutional learning outcomes can be a burden for faculty at institutions that lack the support structure necessary to accomplish these tasks; having a dedicated group of students invested in their university collect evidence from other students can help immensely.

Learning Outcomes:

1. Participants will review an example of a student assessment scholar program by interpreting a program blueprint given by the presenters.
2. Participants will discuss and simulate the administration of the scholar program by considering a sample student assessment project at their own institution.

Audience: Beginner

Organizing Program Assessment as Collaborative Problem Solving Barbara Masi, Penn State University Pearlstein 101

In this session, participants will learn how to structure program assessment as collaborative problem solving across the institution. The presenter will lay out the steps needed to support faculty and administrators in reframing student learning assessment as a coherent system of student, course, program, and institution information and resource elements. Faculty and administrators will learn how to connect formal program assessment processes with institution-wide curricular innovation processes, allowing them to target scarce institutional resources toward the curricular innovations with the greatest impact on students. All institutions strive for efficient and effective processes to improve student learning; at many institutions, however, those processes are hampered by fragmented information and resource flows from the course to the program to the institution level. The organizational framework and templates presented simplify this work. The result is a streamlined process for improving student learning.

Learning Outcomes:

1. Participants will be able to define and explain the elements of a streamlined institution student learning assessment framework.
2. Participants will be able to use the streamlined institution student learning assessment framework to improve processes at their institution

Audience: Intermediate

Educational Development and Assessment: Simultaneously Promoting Conversations that Matter Phyllis Blumberg, University of the Sciences Pearlstein 102

To promote conversation, this session will report on research that served two mutually complementary purposes: to determine a) the impact of faculty development efforts and b) whether colleges met their strategic goals of implementing learning-centered teaching. Participants will discuss how Suskie's assessment cycle is useful for both educational development and assessment. This session illustrates why educational developers should become involved with ongoing assessment projects. When developers broaden their scope from working with individuals to collaborating with departments or colleges on programmatic assessment, they are poised to improve the overall educational quality of programs through significant conversations. Educational developers are often trusted by faculty and administration; therefore, they can be change agents who promote institutional development through assessment efforts. When engaged in assessment efforts, developers provide formative feedback for the purpose of improvement and make useful information available to institutions for accreditation reporting.

Learning Outcomes:

1. Participants will be able to articulate why they should be involved with programmatic assessment and how they can promote overall educational quality improvement.
2. Participants will be able to use Suskie's assessment cycle in significant conversations that lead to improvement efforts, programmatic assessment, and data for accreditation reports.

Audience: Beginner

Listening for Learning: Using Focus Groups to Assess Students’ Knowledge Corinne Dalelio and Christina Anderson, Coastal Carolina University; Gina Baker, Liberty University GHall 109

We will present experiences and lessons learned using focus groups as an innovative, collaborative, and discussion‐based method to assess higher‐order learning outcomes. Specifically, we will demonstrate a design used to assess students’ ability to evaluate communication processes and messages; think critically about human interaction; and analyze principles of communication. Those in charge of assessing student learning may wish to use qualitative approaches to be able to observe how their students actually apply their learning in a classroom environment. This session will showcase how focus groups can be used to assess student learning emerging through group discussion. We will introduce not only an innovative assessment method, but also a mindset that, in pursuing assessment, we should think about ways of effectively capturing student learning, rather than just relying on the default methods or devices that may be the most frequently used or made most easily accessible (Angelo, 1999).

Learning Outcomes:

1. Participants will gain a broadened understanding of the techniques and methods that can be used for assessing student learning.
2. Participants will be able to apply a focus group method to assess students’ higher order thinking skills and embedded knowledge in their disciplines.

Audience: Intermediate

Rebooting Work-Based Assessment for the 21st Century: Shifting to Digital Technologies for Student Nurses Sian Shaw and Anne Devlin, Anglia Ruskin University (UK) GHall 209

This session presents our case-study research into the introduction of digital practice assessment for 500 second-year student nurses. It offers practical examples of how barriers to adoption within the community of practice were overcome and how learning analytics are used to enable robust governance of work-based assessment. The United Kingdom, like the USA, faces a national shortage of nurses. In the UK, this has led to new routes of entry, including work-based pathways, partly driven by a Government employer levy to fund apprenticeships, which creates a need for innovative solutions for work-based assessment. To reduce attrition it is essential to be able to track students' progress in practice, ensure that assessments are completed on time, and provide timely support. Leading digital change is hard. This session will outline how putting users first and technology at the heart can create better, more sustainable assessment.

Learning Outcomes:

1. Participants will learn strategies to overcome obstacles in leading the adoption of digital assessment in a work‐based community of practice.
2. Participants will experience using the tablet and web‐based assessment used by Anglia Ruskin University for student nurses so that they are able to critically discuss how learning analytics gathered from digital work‐based assessment can be used to enhance

Audience: Intermediate

Drexel Outcomes Transcript and Competency Portfolio: Empowering Students and Faculty with Evidence of Learning Using Effective Assessment Mustafa Sualp, AEFIS; Stephen DiPietro and Donald McEachron, Drexel University GHall 108

In higher education, courses and instructors are often functionally siloed, and students fail to see the connections between curricular elements. Outcomes-based design and assessment should address this problem but often does not, due to a significant disconnect between what students and faculty understand about the significance of student learning outcomes. In an effort to address these issues, a complete assessment management solution and software are being designed and implemented to create 'learning outcomes transcripts' that transcend individual courses and educational experiences. By providing developmentally relevant feedback to students in real time, these transcripts may promote significant student ownership of learning outcomes, creating a stronger sense of purpose and curricular continuity. That, in turn, should promote more effective student learning and academic performance.
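
As a rough sketch of the idea of an outcomes transcript, achievement organized by learning outcome across courses rather than by course alone, consider the following hypothetical example. The outcomes, courses, and scores are invented; this is not the AEFIS product's actual data model.

```python
from collections import defaultdict

# Hypothetical evidence records: (course, outcome, rubric score on a 1-4 scale).
evidence = [
    ("BIO 101", "Critical Thinking", 2),
    ("ENG 102", "Communication",     3),
    ("BIO 240", "Critical Thinking", 3),
    ("CAP 490", "Critical Thinking", 4),
    ("CAP 490", "Communication",     4),
]

# An outcomes "transcript" groups achievement by outcome rather than by course,
# so a student can see growth across curricular elements.
transcript = defaultdict(list)
for course, outcome, score in evidence:
    transcript[outcome].append((course, score))

for outcome, history in transcript.items():
    latest = history[-1][1]
    print(f"{outcome}: {history} -> current level {latest}/4")
```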

Learning Outcomes:

Audience: Advanced

Sep 14, 2017
3:00 PM - 3:15 PM

Break 6

Sep 14, 2017
3:15 PM - 4:15 PM

Concurrent Session 6

Assessing Our Assessment: Findings and Lessons Learned Three Years Later Victoria Ferrara, Mercy College PISB 104

Mercy College has learned quite a bit about the process used to measure educational effectiveness and about the quality of its assessment work. This session will engage participants in a discussion of how the institution used assessment findings to improve faculty development opportunities, faculty assessment activities, and the institution's assessment process. Many institutions struggle to implement formal processes for measuring the effectiveness of educational assessment activities, yet engaging in systematic self-assessment is critical to an institution's ability to identify faculty development needs, support faculty in determining whether students are meeting expectations, and ensure the effectiveness of the institution's assessment process. This session will engage attendees in meaningful conversation about how to develop a process to assess their own assessment efforts. Time will be spent discussing findings, lessons learned, and the evolution of the rubric used to assess the work Mercy's academic programs do to measure student learning.

Learning Outcomes:

1. Participants will be able to develop a process for evaluating academic assessment efforts.
2. Participants will be able to use assessment results to improve the assessment of student learning.

Audience: Advanced

Everything I Ever Wanted to Know About Assessment I Learned from Reality Cooking Shows Krishna Dunston, Community College of Philadelphia PISB 106

Reality cooking shows, like assessment plans, need to balance the evaluation of discrete technique, project-based application, and the synthesis of critical thinking skills and knowledge. In this interactive workshop, participants will deconstruct an episode of Top Chef to identify and align program goals, program objectives, student learning outcomes, and criteria for evaluative rubrics. For those new to assessment (or newly tasked with documenting or collaborating on assessment projects), getting the 'lingo' down can be daunting. This exercise allows for productive dialogue about what each assessment term means and how it is used to improve our understanding of student learning. The workshop offers the beginning practitioner or facilitator a low-stakes method of opening dialogues with colleagues and getting beyond what we mean to what is meaningful.

Learning Outcomes:

1. Participants will be able to identify examples of program goals, objectives, student learning outcomes and criteria for evaluative rubrics and discuss how each of these distinct elements works cohesively in a balanced assessment plan.
2. Participants will discuss how to use the cooking show genre as a low‐stress way of engaging campus discussion on a variety of assessment topics.

Audience: Beginner

Grit in the Classroom Rebecca Friedman, Johns Hopkins University PISB 108

All teachers praise students, but are we thinking critically about how we praise? What words are we using, and what impact could they have? This workshop explores praising effort over intelligence, and participants will brainstorm how they can encourage students to develop grit. Incorporating a growth mindset makes a big difference in how children perform academically. Even children who consider themselves "gifted" often avoid challenge, for fear they might lose status if they fail. But when we teach youth that intelligence is malleable, they tend to persist through difficulties and experience intellectual growth (Blackwell, Trzesniewski, & Dweck, 2007). When struggling students learn how to "drive their brains" through the use of cognitive strategies such as the growth mindset, they are more likely to be able to learn and think at higher levels. Teachers often say they need strategies for helping students learn how to increase their attention.

Learning Outcomes:

1. Participants will gain a deep understanding of how growth mindset and grit are related.
2. Participants will learn specific ways we can encourage students to develop a growth mindset.

Audience: Beginner

Decoupling and Recoupling: The Important Distinctions between Program Assessment and Course Assessment Nazia Naeem, Lesley Emtage, Debbie Rowe and Xiaodan Zhang, York College Pearlstein 101

The panel discussion focuses on one of the key issues in program assessment: the differences between program assessment and course assessment. The panelists discuss the importance of the distinctions and connections between these two kinds of assessment, including how they serve different purposes, how assignments and rubrics are designed differently, and how the two are connected from a program self-study point of view. Mistaking course assessment for program assessment is common in faculty practice, even when faculty conceptually know the differences between the two. We maintain that decoupling program assessment from course assessment and then recoupling the two enables faculty not only to examine students' performance in specific areas of knowledge and skill but also to adopt a broader view of program goals and curriculum beyond an individual course. The panel discussion helps attendees take away important strategies for program self-study, including rubric and assignment designs specifically for self-study, and interpretation of data on student learning related to the evaluation of program curriculum: the sequence of courses at different levels, coverage of major knowledge and skills across frequently-offered courses, etc.

Learning Outcomes:

1. Participants will learn how to turn an existing course rubric that a faculty member commonly uses only for giving students a holistic grade into a rubric that examines important learning outcomes a program expects its students to achieve.
2. Participants will learn how to distinguish between course formative assessment and program formative assessment, and then understand how to design an assignment to collect student work for the purpose of improving teaching and learning in specific areas of knowledge and skill

Audience: Beginner

Encouraging Meaningful Assessment By Celebrating It! Letitia Basford, Hamline University Pearlstein 102

The importance of learning outcomes assessment is now clearly recognized at most universities across the country, and increased accountability and reporting demands have supported the momentum. But a greater focus on authentic assessment that facilitates student learning is not simply the result of succumbing to pressure. In order to build ongoing assessment routines that improve student learning, Hamline University has focused on celebrating the efforts and outcomes of all who engage in this work. This session will highlight the different ways our university celebrates assessment efforts and will provide participants an opportunity to share their own efforts and ideas. We have defined our learning outcomes, collected student work, and had conversations about what we are learning from this work; but this effort takes time and demands ongoing encouragement to maintain momentum. Hamline University has undertaken several efforts to celebrate how programs are using assessment data to create meaningful change. From creating a video that highlights such assessment efforts, to holding regular "Cookies and Assessment" workshops that showcase meaningful assessment on campus, to administrators praising the important work faculty and staff are engaged in, Hamline University is focusing on celebrating and rewarding this work.

Learning Outcomes:

1. Participants will analyze the celebratory strategies at one university that encourage ongoing and meaningful assessment work.
2. Participants will share, collaborate and design practices for their own campuses that serve to reward meaningful assessment work.

Audience: Beginner

Application of the Collaborative Active Learning Model (CALM) Simulation: An Experiential Service Learning Approach Francis Wambalaba and Peter Kiriri, United States International University GHall 109

At the United States International University, my colleague and I have been experimenting with the CALM approach over the last three years in a graduate research methods course. The model is premised on three key activities: cooperative learning (teams), experiential participatory learning (project), and service learning (learning outcomes quality assessment). Studies have focused on active learning (Paulson and Foust, 1998; Mascarenhas, 1991; Aison, 2010; Carlson and Winquist, 2011); effective learning (Li, 2012; Shen, Wu, Achhpiliya, Bieber and Hiltz, 2004); constructivist learning from experience to knowledge (Cooperstein and Kocevar-Weidinger, 2003); and forms of effective learning (Prince, 2004). Why the CALM approach? Educators are always searching for effective ways of teaching and adapting to the changing student environment. This session engages participants in a simulation used as an introductory overview to research methods, followed by a reflective discussion. Finally, an overview of the service learning experiential research project will be presented.

Learning Outcomes:

1. Participants will be able to identify key features in their courses that can be reorganized to include a CALM approach.
2. Participants will be able to identify courses in their university that could be appropriate for using a service learning approach and hopefully work towards reorganizing them for that purpose.

Audience: Intermediate

Process-Based Assessment and Concept Exploration for Personalized Feedback and Course Analytics in Freshman Calculus Mansoor Siddiqui, Project One and Kristen Betts, Drexel University GHall 209

Synapse, a STEM assessment platform, emphasizes grading based on the process rather than the final answer, rewarding students for demonstrating their thinking throughout a problem against a rubric. Students can explore their mistakes, generating data that is used to provide personalized feedback and course-efficacy analytics for professors and administrators. Online assessment has long been inefficient at capturing meaningful data about where students are making mistakes and why; in freshman-level STEM courses, grading assignments by hand is often the only option, but it is too time-consuming. This platform saves teachers time, gives students better feedback, and generates meaningful course-efficacy analytics. For teachers, grading time can be cut down tremendously, and deep insight into students' strengths and weaknesses can be gained after every assignment. Administrators gain a method of measuring student learning outcomes and course efficacy at the student, section, and program level.
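
A minimal sketch of process-based scoring follows, assuming a hypothetical four-step calculus rubric. It illustrates the general idea of crediting each solution step and aggregating where a section loses points; it is not the Synapse platform's implementation.

```python
from collections import defaultdict

# Hypothetical rubric: maximum points per solution step for one problem.
rubric_max = {"setup": 2, "derivative": 3, "critical_points": 3, "conclusion": 2}

# Each submission records rubric points earned per step (invented data).
submissions = [
    {"setup": 2, "derivative": 3, "critical_points": 1, "conclusion": 2},
    {"setup": 2, "derivative": 2, "critical_points": 0, "conclusion": 1},
    {"setup": 1, "derivative": 3, "critical_points": 2, "conclusion": 2},
]

# Process-based score: sum of step credit, not correctness of the final answer alone.
for i, sub in enumerate(submissions, 1):
    print(f"student {i}: {sum(sub.values())}/{sum(rubric_max.values())}")

# Section-level analytics: which step loses the most credit on average?
lost = defaultdict(float)
for sub in submissions:
    for step, pts in sub.items():
        lost[step] += (rubric_max[step] - pts) / len(submissions)
print(max(lost, key=lost.get))  # e.g. 'critical_points' -> target feedback there
```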

Learning Outcomes:

1. Attendees will learn how improved user experiences can make grading more efficient in large, intro-level STEM courses.
2. Attendees will learn how machine learning on data gathered from students' exploration of their mistakes can lead to greater insight.

Audience: Intermediate

Sep 14, 2017
4:30 PM - 5:00 PM

Transportation

Sep 14, 2017
5:00 PM - 7:00 PM

Pyramid Club Reception

Sep 15, 2017
7:30 AM - 8:30 AM

Continental Breakfast 2

Sep 15, 2017
8:45 AM - 9:45 AM

Concurrent Session 7

It’s Prime Time to Shift IE to EE Planning and Assessment – but How? Mary Ann Carroll, SUNY Herkimer County Community College and Jacqueline Synder, SUNY Fulton-Montgomery PISB 104

An entertaining, practical look at how your institution can use current practices and processes to tell its accreditation story, particularly for Standards V and VI. Explore a paradigm shift from Institutional to Educational Effectiveness in planning and documentation for all college units. This interactive session uses TV storytelling to help participants connect the essential elements of IE/EE planning toward meeting students' needs and expected outcomes. The presenters suggest viewing existing structures through a close-up lens, then zooming out to the big picture to re-brand your current IE plan. Whether your IE story unfolds like a dreaded sitcom, docudrama, or baffling mystery, an assessment leader can direct the institution's characters toward a new adventure in plot development. Take the leading role to script your effectiveness story from the student's POV, adjusting planning to the plot twists posed by new accreditation standards and using practical constructs and tools to achieve the epilogue your campus is targeting.

Learning Outcomes:

1. Attendees will be able to identify and fill gaps in institutional processes and practices by connecting the essential elements of IE to EE planning.
2. Attendees will be able to use practical examples and models to help create integrated planning and practices that meet standards for Educational Effectiveness.

Audience: Advanced

Curriculum Maps: Who Starts a Trip Without a Map? Alaina Walton and Anita Rudman, Rowan College at Burlington County PISB 106

Our session will present curriculum maps as the foundation for successful institution‐wide assessment, including linking outcomes to general education requirements, to divisional budgets, and to the strategic plan. In addition, attendees will have the opportunity to practice creating/revising program curriculum maps. It is imperative that institutions have a solid foundation as they attempt to organize and link the massive amount of assessment data that is collected on an annual basis. Sound curriculum maps offer this foundation and move the institution forward in the cycle of continuous improvement. The opportunity to actually build a program curriculum map and see how that map can link to so many areas of the institution will prove invaluable as the attendees return to their own institutions to implement this foundational change in assessment planning.
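
As one hypothetical way to picture the data structure behind a curriculum map, the sketch below models courses against program outcomes using the common introduced/reinforced/mastered convention and audits for gaps. The course numbers and outcome codes are invented, not the presenters' materials.

```python
# A curriculum map is essentially a courses-by-outcomes matrix.
# "I" = introduced, "R" = reinforced, "M" = mastered (a common convention).
curriculum_map = {
    "ENG101": {"PO1": "I", "PO2": "I"},
    "ENG205": {"PO1": "R", "PO3": "I"},
    "ENG310": {"PO2": "R", "PO3": "R"},
    "ENG490": {"PO1": "M", "PO2": "M", "PO3": "M"},  # capstone
}

# A quick audit: does every program outcome reach mastery somewhere?
outcomes = {po for row in curriculum_map.values() for po in row}
for po in sorted(outcomes):
    levels = [row[po] for row in curriculum_map.values() if po in row]
    status = "OK" if "M" in levels else "GAP: never mastered"
    print(po, levels, status)
```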

Learning Outcomes:

1. Attendees will be able to recognize the importance of creating and maintaining curriculum maps to further institution‐ wide assessment.
2. Attendees will be able to design a program curriculum map that links program outcomes to various institutional outcomes.

Audience: Intermediate

Using Course Evaluations to Better Understand what Your Academic Program is Messaging to Your Students Beverly Schneller and Larry Wacholtz, Belmont University PISB 108

The session will concentrate on how we adopted the BLUE course evaluation software from Explorance and how we tailored it to provide information on student learning outcomes that could be used for course and program assessment, turning it into a learning tool for faculty and colleges. The content will show faculty and assessment leaders how to create a narrative about what your program is really telling students, based on how students respond to the course evaluation prompts and how they score the questions. Using BLUE's top five strengths and top five weaknesses creates an opportunity for dialogue within the department and also informs professional development activities that will enable faculty members to connect more readily with the learning needs of their students. Course evaluations are systemic to the university; thinking of them as an appreciative assessment tool may move them beyond the Rate My Professor mentality to broader use as a learning tool for the campus as a whole.

Learning Outcomes:

1. Participants will learn about a model for starting a conversation about what our course evaluations are telling students we value in their learning.
2. Participants will be able to consider their own needs in answering the question of how they might better use information collected and stored about teaching and learning outside of the tenure and promotion process.

Audience: Intermediate

Mission Impossible and Other Assessment Tales: Snapshots Joanna Campbell, Maureen Ellis-Davis, Gail Fernandez, Ilene Kleinman, Melissa Krieger, Amarjit Kaur and Jill Rivera, Bergen Community College Pearlstein 101

This session will focus on a collaborative endeavor among faculty, staff, and administration to create and sustain a culture of continuous improvement. Through "assessment tales," components of institution-wide assessment will be examined. Practical strategies for sharing assessment data with stakeholders and for facilitating faculty buy-in will also be presented. Effective assessment is complex but need not be complicated: institution-wide assessment can be broken into manageable components. This presentation will highlight the essential components of effective assessment design and target strategies to overcome obstacles that arise as the assessment process unfolds. Attendees will explore program assessment, from coursework design to the development of assessment tools. The information provided will assist attendees in creating assessment processes at their own institutions. Challenges such as addressing faculty resistance, demonstrating accountability to stakeholders, and faculty-administration collaboration will also be discussed. The session is organized as seven five-minute "snapshot" presentations, one per presenter.

Learning Outcomes:

1. Participants will be able to identify the components of an effective, sustainable assessment process.
2. Participants will be able to align stakeholder expectations to an institution’s priorities.

Audience: Beginner/Intermediate

Developing Sustainable General Education Assessment: The Example of Oral Communication Assessment at St. Lawrence Valerie Lehr, Christine Zimmerman and Kirk Fuoss, St. Lawrence University Pearlstein 102

Using the example of oral communication, we will highlight a method for conducting general education assessment that a) is sustainable; b) is applicable to learning goals satisfied in multiple courses across the curriculum; c) minimizes workload by being course-embedded; and d) enables us to look at student growth and "close the loop." Many colleges have learning goals addressed in multiple courses at all levels. Assessing learning across the curriculum, however, can be difficult because assignments must be meaningful and comparable, faculty must be engaged in the assessment of general education goals, and, in areas where students are not growing, there needs to be expertise to address the gaps. Because conducting meaningful assessment with a feedback loop is difficult for some general education goals, our session will help those charged with developing models to do so, as well as provide an opportunity for others doing this work successfully to discuss their approaches.

Learning Outcomes:

1. Participants will be able to think through the components of a successful general education assessment project and how to develop such an approach in a way that is efficient, as well as effective.
2. Participants will gain examples on how one might “close the loop” with a large project that touches many departments.

Audience: Intermediate

Beyond the Classroom: A Collaborative Pilot to Unify Learning Assessment Across Six Academic Support Units Jocelyn Manigo and Janet Long, Widener University GHall 109

This session explores a pilot effort to unify assessment practices for six administrative units within the department of Academic Support Services: Career Services, Counseling, Disabilities Services, Exploratory Studies, Student Success and Retention, and Tutoring. Presenters will highlight a year-long, collaborative journey to design, implement, and analyze the assessment of student learning outcomes around the theme of critical thinking. Given the highly varied nature of administrative units in higher education, the task of creating a unified assessment process can seem daunting and overly complex. Educational leaders may require a starting point as well as strategies to develop a common language and align thinking. Through interactive conversation and activities, attendees will gain practical strategies and materials to initiate a similar assessment model within their institution. This session will especially benefit those seeking a more uniform framework to assess outcomes across disparate administrative units in higher education.

Learning Outcomes:

1. Participants will gain straightforward strategies and process steps to initiate, align, and implement an assessment program for co‐curricular units that support student learning and development outside of the classroom.
2. Attendees will gain strategies to apply common themes and consistent language to learning outcomes among administrative departments with diverse missions and functions.

Audience: Beginner

Many Questions, Multiple Methods: Assessing Technological Literacy and Course Design in a Modular Team-Taught Course Dana Dawson, Temple University GHall 209

Assessing technological literacy is challenging, as it addresses both technical and conceptual competencies. Even more challenging is assessing technological literacy in the context of piloting a modular, team‐taught course. This presentation will review models for thinking about technological literacy from both a course design and assessment perspective. Technological advances deeply influence our students’ experiences, expectations, and career trajectories. It is crucial that higher education institutions grapple with what it means to be technologically literate. The information shared in this presentation stems from a multi‐year project to develop, pilot and assess courses that would directly address technological literacy. In this presentation, attendees will be exposed to different models for addressing technological literacy in course design, including the two models we piloted over the past year. We will discuss how the learning goals that informed those courses were used to guide a direct assessment of student learning.

Learning Outcomes:

1. Participants will increase understanding of key dimensions of technological literacy.
2. Participants will increase understanding of the relationship between learning outcomes for a course and tools to assess student learning.

Audience: Intermediate

Best Practices in Assessment: A Story of Online Course Design and Evaluation Gulbin Ozcan-Deniz, Philadelphia University GHall 108

This session will demonstrate how online assessment can be achieved successfully through both traditional and not-so-traditional course activities. The impact of course design and the use of Blackboard as the Learning Management System (LMS) will be discussed. The challenge of assessing online student learning is obvious, and this presentation will focus on best practices of online assessment to help future online educators. Formative and summative assessment techniques will be shared with the audience through specific examples, and the application of Bloom's Taxonomy will be discussed as a basis for best practices in online assessment. Upon completion of this session, attendees will have the basics of online course assessment and will be equipped to identify their own best practices in online assessment using an LMS. This will help attendees improve their teaching-assessment-feedback loop in online courses with current technology (e.g., Blackboard).

Learning Outcomes:

1. Attendees will gain a working knowledge of online course assessment.
2. Attendees will be able to identify best practices in online assessment for their own courses.

Audience: Beginner

Sep 15, 2017
9:45 AM - 10:00 AM

Break 7

Sep 15, 2017
10:00 AM - 11:00 AM

Concurrent Session 8

Expanding and Developing Assessment Practices to include Administrative, Educational, and Student Support (AES) Units Christopher Shults, Erika Carlson and Marjorie Dorime-Williams, Borough of Manhattan Community College PISB 104

The Middle States Commission on Higher Education now requires that both student learning and the environment that supports student learning be assessed. Many institutions struggle to develop appropriate assessment methodologies for AES units. We will focus on the development of mission, goals, student learning outcomes, and support outcomes for AES units, and will highlight the importance of assessing AES units, how to choose appropriate assessment methods, and how to make use of results. The audience will leave with a framework for AES assessment that explains how it all fits together. Participants will develop an understanding of the role of student learning and support outcomes in AES units; create student learning outcomes, support outcomes, a mission, and goals for AES units; and evaluate current assessment practices in AES units and implement valid assessment methodologies as an alternative.

Learning Outcomes:

1. Develop an understanding of the role of student learning outcomes and support outcomes in AES units
2. Evaluate current assessment practices in AES units and implement valid assessment methodologies as an alternative

Audience: Advanced

Creating a General Education Capstone: Assessing Institutional Outcomes through General Education Jenai Grigg and Gina MacKenzie PISB 106

This session will focus on the development, implementation, and assessment of a General Education Capstone course that serves as a site for institutional outcomes assessment and summative engagement with our university's mission. The content addresses the conference's sub-theme, "Institutional Change and Assessment," showing how bold change can reinvigorate an institution's approach to the nature of liberal arts education and assessment. It will serve as an example to other institutions struggling to find mechanisms for institutional assessment, engagement with general education, and/or mission focus.

Learning Outcomes:

1. Analyze current institutional outcomes and their relationship to programs and courses.
2. Develop mechanisms for assessment of institutional outcomes within general education.

Audience: Intermediate

Assessing Information Literacy for Community College Students: Faculty and Librarian Collaboration Leads to Student Improvement Janis Wilson-Seeley and Graceann Platukus, Luzerne County Community College PISB 108

Through faculty and librarian collaboration, the college piloted a program that embedded information literacy into general education as a core competency. The pilot consisted of pre- and post-testing students' competency while they used ProQuest's Research Companion, an information literacy learning tool, throughout the semester. Information literacy, the ability to find, evaluate, and use information effectively, is the foundation of critical thinking. Buzzwords such as "fake news" and "alternative facts" have brought information literacy to the forefront of American politics and of conversations about the control and dissemination of information and how its misuse can be detrimental. Our study shows that Research Companion is an effective tool for improving information literacy. Attendees will learn about the project's evolution, data collection, and assessment, and ultimately how it improved students' competency, and will gain insight into how to develop collaborations of their own to assess these critical skills.
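
For readers unfamiliar with pre/post designs, a minimal sketch of the comparison step follows, using a paired t-test on the same students measured twice. The scores are invented; the presenters' actual instrument and analysis are not specified in this abstract.

```python
from scipy.stats import ttest_rel

# Hypothetical pre- and post-test information literacy scores for the same ten students.
pre  = [55, 62, 48, 70, 66, 59, 73, 51, 64, 58]
post = [63, 70, 55, 74, 71, 68, 78, 60, 69, 61]

t, p = ttest_rel(post, pre)  # paired test: same students, before and after
gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"mean gain = {gain:.1f} points, t = {t:.2f}, p = {p:.4f}")
```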

Learning Outcomes:

1. Attendees will be able to recognize ACRL’s information literacy framework and explain its importance as a means of student assessment.
2. Attendees will be able to implement similar collaboration efforts to their own institutions.

Audience: Beginner

Community-building through Assessment Design: Reframing Disciplinary Student Outcomes as Inquiry-based Brad Knight, American University Pearlstein 101

American University recently approved a Core Curriculum emphasizing metacognition and habits of mind, replacing a traditional distribution requirement. This session describes the iterative stages of learning outcome development undertaken by faculty and provides insights from lessons learned as we strove to shift from content-centered outcomes to modes of inquiry. A ground-up, faculty-driven process over many months allowed us to reach consensus on three to four student learning outcomes that would not be about disciplinary instruction in the way courses for a major function in a curriculum; instead, content would be the vehicle for delivering instruction in a mode of inquiry. During this session, we will share how we borrowed from the tools of design thinking to engage more than 75 faculty in our working groups. A course proposal process that grapples with the competing interests of openness and selectivity will also be described, and sample rubrics discussed.

Learning Outcomes:

1. Participants will be able to adapt, to their specific context, the steps involved in one approach to developing faculty‐driven student learning outcomes.
2. Participants will be able to select productive prompts for use with faculty when developing inquiry‐based learning outcomes.

Audience: Beginner

Data-driven Conversations to Make a Difference in Campus-wide General Education Mindi Miller, Molly Hupcey Marnella and Bob Heckrote, Bloomsburg University of Pennsylvania Pearlstein 102

Assessing General Education (GE) can be challenging when analyzing evidence across disciplines. Presentation examples explain the use of rubric guidelines to review aggregate data and learning outcomes. Campus-wide targets for student achievement (such as information literacy, communication, and healthy living) can focus GE outcomes beyond a specialized topic or major. The session activity uses sample outcomes to discuss possible institutional recommendations and improvements. Accreditation standards and the goals of higher education require an evaluation process for obtaining evidence of mission and value attainment. With practical assessment guidelines, data from interdisciplinary GE courses help to document and provide evidence of student achievement. Academic and co-curricular GE results may be analyzed against the same benchmark-to-capstone GE criteria. These data alone, however, are insufficient without department, division, and institutional discussions; comparisons of GE trends with other outcome data, such as post-graduation surveys and conversations with alumni and employers, are also needed.
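
A hedged sketch of the benchmark-to-capstone comparison described above appears below. The courses, ratings, and campus target are invented for illustration and are not Bloomsburg's actual data.

```python
import pandas as pd

# Hypothetical rubric ratings (0-4 scale) on one GE outcome, information literacy,
# collected in benchmark (first-year) and capstone (senior) courses.
ratings = pd.DataFrame({
    "level":  ["benchmark"] * 5 + ["capstone"] * 5,
    "course": ["ENG101", "HIS110", "BIO105", "ENG101", "COM120",
               "ENG490", "HIS480", "BIO470", "COM450", "ENG490"],
    "score":  [1, 2, 2, 1, 3, 3, 4, 3, 2, 4],
})

# Compare aggregate achievement at the two levels.
summary = ratings.groupby("level")["score"].agg(["mean", "count"])
print(summary)

target = 3.0  # hypothetical campus-wide capstone target
capstone_mean = summary.loc["capstone", "mean"]
print("capstone target met" if capstone_mean >= target else "discussion needed")
```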

Learning Outcomes:

1. Identify the connection between rubrics, aggregate data, outcomes, and targets. The presentation provides background information on general education assessment while showing the importance of department, division, and campus discussions about the use of assessment data.
2. Using samples, analyze campus-wide GE improvements and further recommendations based on GE assessment data and post-graduation evaluation.

Audience: Beginner

Should You Take Student Surveys Seriously? Zvi Goldman, Jeremi Bauer, Susan Lapine and Chris Szpryngel, Post University GHall 109

Internal, end-of-class student surveys at Post University provide useful feedback on students' perceptions of course content, instructor performance, success factors, and overall satisfaction. The immediate benefit of these surveys, providing instructors and course developers with the feedback necessary to continuously improve the student academic experience, is obvious. However, the surveys yield much deeper applications and value once student responses are trended over time and modeled into performance indicators. This presentation will share Post's survey methodology, integrated approach to result validation, and some specific higher-value applications that have been derived from the survey outcomes.
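
A minimal, hypothetical sketch of trending survey items into simple indicators follows. The term labels, items, rolling window, and alert threshold are all assumptions for illustration, not Post University's methodology.

```python
import pandas as pd

# Hypothetical mean item scores (1-5 scale) by term for one course.
df = pd.DataFrame({
    "term":         ["F15", "S16", "F16", "S17", "F17"],
    "content":      [4.1, 4.0, 3.8, 3.6, 3.5],
    "instructor":   [4.5, 4.4, 4.5, 4.6, 4.5],
    "satisfaction": [4.2, 4.1, 3.9, 3.8, 3.7],
}).set_index("term")

# Smooth term-to-term noise with a three-term rolling mean.
rolling = df.rolling(window=3, min_periods=3).mean()
print(rolling.tail(1))

# Flag items trending downward across the full period (invented alert threshold).
trend = df.iloc[-1] - df.iloc[0]
declining = trend[trend < -0.3].index.tolist()
print("items to investigate:", declining)
```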

Learning Outcomes:

1. Attendees will learn about Post's survey methodology and integrated approach to results validation
2. Attendees will learn about some specific higher‐value applications that have been derived from the survey outcomes

Audience: Intermediate

Skipping Stones or Making Splashes: Embedding Effective Assessment Practice into the Faculty Repertoire Dana Scott, Philadelphia University GHall 209

This hands-on session will present how a community-based model of assessment can be effective at multiple levels of faculty development, from large-scale, university-wide presentations, to smaller group workshops, down to one-on-one sessions with assessment advocates. The model demonstrates how to give faculty a big-picture overview of best practices, how to hone in on areas that need improvement, and how to provide one-on-one training and feedback to address specific assessment needs. The session will cover faculty development from the macro to the micro level. Participants will learn to use strategic design tools to identify key issues and insights in their own assessment procedures, and can continue to use these tools to generate ideas for impactful faculty development.

Learning Outcomes:

1. Recognize a process for creating a holistic system of faculty development, including strategies to organize larger presentations and systems for creating a community of assessment for implementation on multiple levels.
2. Examine key issues and insights in their assessment procedures, categorizing areas of need, potential, and accomplishment, and generating ideas for impactful faculty development.

Audience: Intermediate

Sep 15, 2017
11:15 AM - 12:00 PM

Closing Remarks