2016 Pre-Conference Workshops
Workshop #1
Implementing Curriculum Review: From Designing the Process to Using the Findings
Jane Marie Souza, Ph.D.
Abstract
Periodic curriculum review is essential to maintaining a quality educational program. While faculty and administrators may clearly agree with that statement, implementation of the review process may be much less evident. Questions abound: How do we schedule the review? How long should it take? How are duties assigned? How do we manage the process? What should we look at when reviewing individual courses? What evidence do we use to support our conclusions? And perhaps most importantly: How do we plan for use of our findings? The answers to these and other common questions will be explored in this pre-conference workshop.
This workshop will present a strategy for establishing a Curriculum Review timeline and distributing the workload. Then a review process will be outlined employing a series of questions that can be researched through an established evidence bank. It will be demonstrated how questions posed for the review process can be aligned with targeted goals and specific sources of evidence. Finally, a plan will be suggested for the important step of following through on resulting recommendations.
Participants in the workshop will be provided handouts including a set of possible research questions, a sample evidence bank, and tools to align course-level assessments. They will then be tasked with using the tools to outline a process to fit their unique educational settings.
Outcomes
At the conclusion of this workshop participants will be able to:
- Outline a plan and timeline for a curriculum review process.
- Draft research questions to guide an effective curriculum review.
- Identify appropriate sources of evidence to address research questions.
- Outline a process to follow through on review findings.
Workshop #2
Developing Direct and Indirect Measures of Student Learning
Jodi Levine-Laufgraben, Ph.D.
Abstract
This pre-conference workshop will focus on strategies for selecting the right assessment approach with which to measure student learning outcomes. We will discuss how to design and implement direct measures of student learning, and how best to use indirect measures of student learning to complement your direct assessment efforts.
Outcomes
At the conclusion of this workshop participants will be able to:
- Identify direct and indirect measures of student learning outcomes
- Select assessment strategies that best align with their learning outcomes
- Design a direct measure of student learning
Workshop #3
Winning Arts and Minds: Assessing the Creative Disciplines
Krishna Dunston
Abstract
Assessment advocates and leaders in the creative disciplines often find themselves squeezed between the right and left brains of the college campus. We comprehend the urgent need to demonstrate student competency, but find that what fits most easily into a spreadsheet has little or nothing to do with creative success. It is easy to get stuck collecting meaningless data that neither improves student outcomes nor allows for informed program improvement. This can prove frustrating to institutions that need arts programs to "close the loop." A well-constructed plan can lift the blinders from both sides and reveal how arts pedagogies provide some of the most important skills of a 21st-century education: collaboration, self-assessment, innovation, discipline, and adaptability.
This workshop is for those looking to revitalize their own course or program assessment plan; preparing to build new creative programs; seeking inspiration as an assessment facilitator; or wanting to learn more about authentic assessment. Participants will experiment with a variety of tools and discuss their use in an assessment structure that balances the evaluation of artistic product with an examination of creative process. The presenter will share how unconventional assessment metaphors (the elementary school science fair, NASA vs. Google, Venn diagrams, and even reality cooking shows) have proved useful models for opening dialogues, breaking the cycle of useless reporting, and encouraging the creation of meaningful assessment processes.
Outcomes
At the conclusion of this workshop participants will be able to:
- Identify and build from existing pedagogies;
- Emphasize process in authentic assessments;
- Discuss the balance of process, product and reflection; and
- Investigate new models and metaphors for program mapping.
Workshop #4
Assessment Toolbox: Supercharge the Direct Assessment of Student Services
Michael Sachs, PhD
Abstract
The Middle States Commission on Higher Education's publication Student Learning Assessment: Options and Resources, Second Edition states that "the characteristics of good evidence of student learning include considerations of direct and indirect methods for gathering evidence of student learning." Creating direct student learning assessment tools within student support services can be challenging for student service professionals. Many student service programs rely only on indirect assessment techniques such as focus groups, evaluations, satisfaction surveys, and NSSE results.
This workshop will explore the many direct student learning assessment tools available to Offices of Student Affairs and other service offices on campus. These techniques and tools are both qualitative and quantitative in intention and design. This workshop will also enable participants to develop program goals, rubrics, and direct student learning outcomes for their student service areas, linked, of course, to their college's mission and/or strategic plan. Participants should bring copies of their institutional strategic goals and mission.
Outcomes
At the conclusion of this workshop participants will be able to:
- Explain the importance of direct assessment for planning, resource allocation, and student learning.
- Recognize and understand the differences between direct and indirect assessment in student services.
- Create and use rubrics for student learning outcomes.
- Create direct assessments of student learning outcomes for their individual areas/programs that can be incorporated into assessment plans.
Workshop #5
How I Learned to Stop Worrying and Love Accreditation: Working with the New MSCHE Standards
Sean McKitrick, PhD, Vice President, Middle States Commission on Higher Education
Abstract
In accordance with 34 CFR 602.21, Review of Standards, the Commission conducts a regular review of its accreditation standards. During spring 2013 the Commission began its latest comprehensive review of the standards. These efforts were led by a Steering Committee representing MSCHE member institutions, the MSCHE staff, and the general public. The Steering Committee followed four Guiding Principles, developed by the Commission to reflect the areas identified as most important to its membership: Mission-Centric Quality Assurance, the Student Learning Experience, Continuous Improvement, and Supporting Innovation.
The Commission approved a plan to implement the revised standards through a unique Collaborative Implementation Project. The project involves a cohort of 15 institutions that are scheduled to submit their self-studies and host evaluation teams during the 2016-2017 academic year. Over the next two years, these 15 institutions will undergo a "high touch" experience in which they will speak frequently with members of the Commission staff and with each other as they engage in self-study. They will also play an active role in preparing other institutions to use the revised standards. All institutions hosting an evaluation team visit in the 2017-2018 academic year and beyond will engage in self-studies guided by the revised standards.
Outcomes
At the conclusion of this workshop participants will be able to:
- Discuss and explain the new MSCHE standards
- Demonstrate how the new standards focus on the student learning experience
Workshop #6
From A – Z: An Assessment Toolkit
Joanna Campbell, Professor, Bergen Community College
Maureen Ellis-Davis, Associate Professor, Bergen Community College
Gail Fernandez, Associate Professor, Lead Assessment Fellow, Center for Institutional Effectiveness, Bergen Community College
Dr. Amarjit Kaur, Managing Director, Center for Innovation in Teaching & Learning (CITL), Bergen Community College
Dr. Yun Kim, Vice President, Center for Institutional Effectiveness, Bergen Community College
Dr. Ilene Kleinman, Associate Dean of Curriculum, Bergen Community College
Jill Rivera, Associate Dean of Student Success and Completion, Bergen Community College
Shyamal (Sony) Tiwari, Faculty, Bergen Community College
Abstract
You are charged with developing a robust and formal assessment program at your institution. How do you get started? What are the necessary components of a successful program? Who is in charge of the process? Who are the stakeholders? What value does the institution place on assessment as demonstrated by the institutional resources and commitment? In this workshop, the assessment team at Bergen Community College will (1) help you identify key components that lay the foundation for an effective assessment program, and (2) share “add-ons” that may help sustain and nurture the assessment program at your institution.
Session attendees will have an opportunity to begin building their assessment platforms and will receive an assessment toolkit to bring back to their institutions.
Outcomes
At the conclusion of this workshop participants will be able to:
- Describe the components of an effective and sustainable assessment program.
- Identify the assessment tools to use at their institutions.
- Begin the process of building an assessment platform that meets the needs of their institution.
- Explain how institutional commitment translates into necessary resources.