University Assessment

CDI Assessment Grants Summer 2013

In spring 2013, the University Assessment Center, in collaboration with the Teaching, Learning and Technology Center, announced a request for proposals for department/program assessment grants. The grants gave faculty across the University an opportunity to advance the culture of assessment in their department or program. A variety of project types were invited, from assessing several student learning outcomes program-wide, to assessing a particular skill in a capstone course, to using clickers to assess student learning outcomes. A large number of proposals were submitted and underwent a competitive review process conducted by the members of the University Assessment Steering Committee and coordinated by Paul Fisher, Director of TLTC; Dr. Theresa Bartolotta, Office of the Provost; and Dr. Michael Vigorito, University Assessment Center. Eight projects were funded. Faculty members working on each project will provide regular updates on their progress at meetings of the University Assessment Committee during the 2013-2014 academic year. Abstracts of the funded proposals appear below:

Enhancing the Culture of Assessment in the Department of Biological Sciences

Project Directors: Jane Ko, PhD & Marian Glenn, PhD Department of Biological Sciences, College of Arts & Sciences

The Department of Biological Sciences is assessing its largest program, the BS Biology major. Students begin this program with four required Biology lecture-plus-lab courses, then complete four Biology electives and a capstone course, as well as eight required common courses in Math, Chemistry, and Physics. The BS Biology program has three goals: (1) educate and empower students in the classroom setting; (2) educate and empower students in the laboratory setting; and (3) apply the knowledge and skills of the Biological Sciences to the greater good of society. This project aims to assess five Student Learning Objectives (SLOs) developed by the faculty. Goal 1, SLO 1: Students will learn the principles and concepts of biological systems at the molecular, cellular, and organismal levels; specifically, students will demonstrate understanding of the structure, function, and classification of various organisms and biological systems. Goal 1, SLO 2: Students will acquire a solid foundation in Chemistry, Biochemistry, Mathematics, and Physics, integrating essential principles from these disciplines into the Biological Sciences and enhancing their understanding of various biological disciplines. Goal 1, SLO 3: Students will learn the essential tools to equip them for college life and beyond; specifically, students will demonstrate proficiency in reading/writing, numeracy, critical thinking, oral communication, and information fluency. Goal 2, SLO 1: Students will learn essential biological techniques; specifically, students will develop technological literacy in biology. Goal 3, SLO 1: Students will apply their Bioscience knowledge and skills to scientific research.

Four faculty members will assess SLOs in individual courses (two required and two electives), and a commercial assessment test, the Biology ACAT (Area Concentration Achievement Test), will be used to assess knowledge from the four common freshman- and sophomore-year biology courses. Assessment protocols will be developed, applied, and reported on during the fall 2013 and spring 2014 semesters. Faculty will explain how the outcome assessment will be used to improve their courses. At the program level, faculty will propose changes to address the results of the ACAT and facilitate a collegial discussion within and across departments about course and curricular changes.

Integrating the New Jersey Library Association Progression Standards for Information Literacy into English 1201 and 1202: A Joint Project of the English Department and the University Libraries

Project Director: Martha Deyrup, PhD, University Libraries

The English Department currently uses e-portfolios as part of its student assessment. At the end of the fall 2013 semester, the English Department and the Library will jointly evaluate the portfolios for English 1201 and 1202. The Library will assess Information Literacy (IL) skills as defined by the New Jersey Library Association (NJLA) Progression Standards for Information Literacy. A team of librarians will work this summer with the directors of the writing center to: a) decide which sections of the IL progression standards to assess; and b) ensure that all 1201 and 1202 syllabi incorporate the IL progression standards. In addition, the library faculty will administer a mid-semester test to assess basic IL skills. This test will be constructed in collaboration with the English Department faculty and taken by all students. Currently, all English 1201 and 1202 students are required to attend a research session conducted by a librarian, and the Library has created two LibGuides for 1201 and 1202 that serve as resources in Blackboard modules. Assessment strategies include summative assessment (the research paper and e-portfolio), direct assessment (the mid-semester test), and indirect assessment (research logs and reflection).

Best Practices and Course-Embedded Student-Learning Outcome Assessments in Computing and Decision Sciences Courses

Project Directors: Leigh Onimus, JD, Associate Dean, & David Rosenthal, PhD, Department of Computing and Decision Sciences, School of Business

The Department of Computing and Decision Sciences (CDS) in the School of Business will work on a two-part project that will build upon ongoing efforts to assess student learning outcomes related to: (1) Excel spreadsheet design and preparation skills taught in BITM 2701 (undergraduate) and (2) application and interpretation of important statistical analysis techniques taught in BQUA 2811 (undergraduate).  In both cases, the goal is to develop/refine one or more course-embedded assessments that would be applied in upper-level courses to evaluate student learning outcomes of the prerequisite IT/statistical analysis course.  An additional goal of the project is to develop a standardized set of “best practices” in spreadsheet design, which would be referenced and applied in subsequent business core and elective courses.  Because learning outcomes in CDS courses will be tested in other business core courses, this work would have a direct impact on the School’s undergraduate and MBA curricula as well as provide the opportunity for important curriculum development collaboration across departments.

College of Education and Human Services "edTPA" Pilot Project for the Authentic Assessment of Senior Student Teacher Candidates

Project Director: Gerald Babo, EdD, Department of Education Leadership, Management and Policy, College of Education and Human Services

The overall goal of this assessment project is to determine which student teacher performance assessment practice might provide a more detailed, comprehensive, and efficient authentic assessment for SHU senior student teacher candidates. Current NJ State policy initiatives, along with the national accreditation body (CAEP), have placed a premium on accurate and authentic assessment of classroom instructional practice for those teacher candidates being prepared to assume the responsibility of teaching our nation's youth. This project intends to provide the College of Education & Human Services (CEHS) with important data and information to determine a systematic assessment process that will address this overall policy goal. An ongoing goal of the CEHS is to ensure and facilitate, for each of our teacher graduates, a successful transition to the field of education in an ever-changing educational milieu.

The assessment plan for this multi-level project will focus on the triangulation of multiple sources and forms of data (i.e., student-created documents, artifacts, and products) collected from a purposeful sample of senior teacher candidates. By comparing a national performance assessment model supported by AACTE (edTPA), an "in-house" EDST/ELMP collaborative initiative, and a random sample of "traditionally prepared" senior student candidates, the project intends to identify the most effective performance assessment model. This triangulated effort will provide both the teacher candidates and the faculty of the CEHS with valid information on the overall effectiveness of our teacher preparation program. In so doing, the CEHS will be better able to gauge both compliance with and achievement of national accreditation standards and NJDOE standards.

Development of an Inter-Rater Reliable Instrument for Assessment of CEHS Junior Interns' Field Experience

Project Director: Marietta Peskin, EdD, Elementary and Special Education Teacher Training Program, College of Education and Human Services

Currently, CEHS junior interns are formally assessed twice during their 72-hour field experience: once by a university supervisor, whose evaluation is based on observation of one lesson taught by the intern, and once at the conclusion of the experience by the intern's cooperating teacher, who bases the evaluation on global impressions gathered over the 72 hours. The two evaluations are independent of each other, and both use an online evaluation comprising 45 criteria. This evaluation is 'top down' and global. Research indicates that the reliability of rating by a single observer ranges from .27 to .45.1 The reliability of the current evaluations would be considerably lower due to measurement error resulting from different performance sampling by evaluators. The proposed project seeks to develop an inter-rater reliable instrument from the 'bottom up'. Three domains will be included, representing the criteria most critical to teaching a lesson. Both the supervisor and the cooperating teacher will observe the same intern-taught lesson and assess the performance using the new instrument. This dual-observation technique falls in line with the March 2013 TEACHNJ Regulations for teacher evaluation, which call for "double-scored observations".2 Once inter-rater reliability is established, the strengths and weaknesses of candidates, individually and as a group, can be determined. Cumulative results will be calculated and will provide data that can be used to inform changes in the program curricula.

2. NJ Department of Education: TEACHNJ Regulation Proposals: Building an Effective Evaluation System for Teachers and Principals (March 6, 2013)

The University Assessment Center (UAC) solicited proposals for integrating clickers into the classroom for instruction and assessment. Clickers are now available in many formats, ranging from devices that students purchase from the Seton Hall Bookstore to applications that can be downloaded to laptops, smartphones, iPads, and tablets. The i>clicker brand, for example, allows students to use any of these clicker formats in the same class. Clickers can be used for everyday classroom activities (e.g., attendance, points for class participation, assessing understanding of a specific topic) and can also be used for broader assessment purposes. The abstracts that follow describe three projects that were funded to investigate the effectiveness of clickers for teaching and assessment.

Using a Clicker App and iSpring Quizmaker to Assess Real-time Critical Thinking and Course-specific Competencies in MHA Students

Project Directors: Anne M. Hewitt, PhD & Nalin Johri, PhD, MPH, Master's in Healthcare Administration (MHA) Program, College of Arts & Sciences

The Master's in Healthcare Administration (MHA) program at SHU conducts rigorous, ongoing evaluation activities to assess both program effectiveness and student competency attainment through a variety of measures. However, faculty recently recognized that current assessment efforts were not adequately establishing a link between attainment of course objectives and mapped course-specific student competencies, such as critical thinking. The MHA program recently introduced the "Flipped Classroom Model" (FCM) into its curriculum to increase the use of experiential classroom activities and facilitate student learning outcomes such as critical thinking. Currently, Blackboard quizzes embedded in FCM pre-class lectures have been used for critical thinking assessment in online courses, but an in-class assessment for on-campus students has been missing.

The goal of this grant is to improve current assessment activities, which are not sensitive enough to connect students' use of critical thinking in specific courses to desired and stated outcomes and competencies. MHA faculty will integrate multi-platform clicker apps and iSpring Quizmaker as a valid technique for assessing SLOs in both FCM pre-class and in-class sessions across online and on-campus offerings of courses. This Program Assessment Grant will enable the faculty to add innovative components to the current program assessment strategies and provide opportunities to formally assess critical thinking experiences.

Using Clickers to Assess Candidates' Content Knowledge and Field Placement Activities

Project Directors: Alisa Hindin, EdD & Debra Zinicola, EdD, Department of Educational Studies, College of Education and Human Services

In this project, we will use clickers to assess candidates' developing content knowledge in our literacy and science methods courses and to gather data about candidates' field placements. We have three goals for our project. First, we want to help our teacher candidates better understand their own areas of strength and weakness with regard to course content. Second, we want to find ways to provide faculty members and candidates with "easy to analyze" data to see whether students need additional instruction or clarification. Lastly, we would like to utilize clickers to improve the connection between field experiences and coursework and to optimize the value of the four 72-hour field placements in terms of professional preparation.

Using Clicker Technology in a Clinically-based Occupational Therapy Course: Comparing Student Perceptions of Participation and Learning

Project Director: Thomas J. Mernar, PhD, Occupational Therapy Program, School of Health and Medical Sciences

Student–teacher interaction is ranked highly among the factors influencing learning performance (Bullock et al., 2002; Hake, 1998). Clicker technology has been changing how students and their instructors interact within the classroom by providing new opportunities to enhance in-class participation and learning in traditional, lecture-hall classroom environments (Blasco-Arcas et al., 2013). However, less is known about how clicker technology improves in-class participation and learning in courses with more clinical content. This project examines how learning and in-class participation are affected by the use of clicker technology within clinically based courses.

Blasco-Arcas, L., Buil, I., Hernández-Ortega, B., & Sese, F. J. (2013). Using clickers in class: The role of interactivity, active collaborative learning and engagement in learning performance. Computers & Education, 62, 102-110.

Bullock, D. W., LaBella, V. P., Clingan, T., Ding, Z., Stewart, G., & Thibado, P. M. (2002). Enhancing the student-instructor interaction frequency. Physics Teacher, 40, 535–541.

Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66, 64–74.
