"

5 Evaluation

E-learning Goals

The outcomes for this course (training) are more implicit than explicit: the overarching aim is to remove barriers to future coursework.  Specifically, learners will develop (1) skill in navigating the LMS and other institutional technologies, (2) proficiency in the mechanics of scholarship (bibliography as a means of information management), and (3) an understanding of the scope and tasks of theological education.

Kirkpatrick Levels

Level 1 Reaction:  Learners will be surveyed early (week two or three), mid-semester, and toward the end of the course about their reactions to various aspects of the course (especially the shift in modality).

Level 2 Learning:  All course assessments are geared toward measuring student learning: Cognitive (knowledge), Behavioral (skill), and Affective (attitude, confidence, and commitment).  The first two are the industry focus in higher education, while the last is coming into prominence.

Level 3 Behavior:  [In development:  I am working on ways to track behavior in the post-instruction period.  Much will depend on resources and institutional access to data (i.e., permission to see whether students put into practice in other courses what was learned here).]

Level 4 Results:  This training is not mission-critical for the institution, and there are no plans to evaluate it at this level (see Kirkpatrick and Kirkpatrick, p. 149, “Pitfall #2”).

Assessment Techniques and Tools

Techniques

Formal Grading:  Because this is a required, credit-bearing course within accredited degree programs, formal grading is required.

Formative Assessment: Learners will have the opportunity to monitor their progress through low-stakes weekly skill-building practice, end-of-module knowledge quizzes, and module-specific assignments.

Summative Assessment:  Learners will produce a summative assessment in the form of a 30-item categorized and annotated bibliography.  M.A. students will focus the bibliography on their field of specialization while including relevant works from other subdisciplines; M.Div. students will balance entries across the identified subdisciplines to reflect the breadth of their course of study.  Aspects of several formative assessments will scaffold elements of this summative assessment.  The intention is that the bibliography serve as a working resource list for M.A. students in their capstone project and as a map of the interconnected subfields that M.Div. students will engage.

Validity and Reliability:  Validity is tied to making assessments as close to the actual expected behaviors as possible; in essence, content-related validation.  Reliability will depend on clearly articulated rubrics for scoring learner work.

Types of Knowledge:  The balance will be tilted toward procedural knowledge; learners will be asked to demonstrate how to appropriately gather and organize specific pieces of information by doing it.  There will certainly be aspects of declarative knowledge, but these will largely involve settling terminology and understanding procedure.  [Given the higher ed context, I do hesitate on this a bit: opportunities to actually observe performance will be relatively rare.  We are attentive to process but will largely be assessing product.]

Types of Data:  Both quantitative and qualitative data will be collected.  The quantitative data will be limited (low-stakes quiz scores and item counts in the summative assessment) but important.  The qualitative data will offer a more detailed picture of each learner’s conceptual progress.

Role of Instructor:  The instructor will conduct synchronous sessions, respond to questions and concerns, meet with learners as needed, provide weekly encouragement and announcements, and grade student assessments with feedback.  Again, this is a shift from a more synchronous modality to a more asynchronous one.

Tools

For quizzes and a pre-test/post-test I will use the native test functionality in the seminary’s LMS, Populi.  The functionality could be better, but it is good enough, and it reduces both the students’ burden of learning and navigating multiple systems and the instructor’s burden, since scores are recorded automatically.
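
Because scores are recorded automatically, the pre-test/post-test results can be exported and summarized outside the LMS.  The following is a minimal sketch in Python, assuming a hypothetical CSV export named pretest_posttest.csv with columns student, pre, and post (percent scores); the file name and column layout are illustrative, not Populi’s actual export format.

    import csv

    def summarize(path="pretest_posttest.csv"):
        # Read the assumed export: one row per student with "pre" and "post" percent scores.
        with open(path, newline="") as f:
            rows = list(csv.DictReader(f))
        pre = [float(r["pre"]) for r in rows]
        post = [float(r["post"]) for r in rows]
        n = len(rows)
        print(f"n = {n}")
        print(f"mean pre  = {sum(pre) / n:.1f}%")
        print(f"mean post = {sum(post) / n:.1f}%")
        # Normalized gain: the share of possible improvement actually realized.
        gains = [(b - a) / (100 - a) for a, b in zip(pre, post) if a < 100]
        if gains:
            print(f"mean normalized gain = {sum(gains) / len(gains):.2f}")

    if __name__ == "__main__":
        summarize()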

I will use the polls/quizzes functionality in Zoom for increased engagement and anonymous feedback during synchronous sessions.  I will need to capture this data afterward for analysis and for possible use as Level 1 evaluation data.
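
One workable way to capture this data is to download the per-meeting poll reports that Zoom makes available and combine them with a small script, so responses can be tallied alongside the other Level 1 survey data.  The sketch below (Python) assumes a folder of CSV exports named zoom_polls/ with Question and Answer columns; the folder name and column labels are assumptions for illustration, not Zoom’s guaranteed report layout.

    import csv
    from collections import Counter
    from pathlib import Path

    def tally_polls(folder="zoom_polls"):
        # Map each poll question to a count of how often each answer was chosen.
        counts = {}
        for report in sorted(Path(folder).glob("*.csv")):
            with open(report, newline="", encoding="utf-8") as f:
                for row in csv.DictReader(f):
                    question = row["Question"].strip()
                    answer = row["Answer"].strip()
                    counts.setdefault(question, Counter())[answer] += 1
        return counts

    if __name__ == "__main__":
        for question, answers in tally_polls().items():
            print(question)
            for answer, n in answers.most_common():
                print(f"  {n:3d}  {answer}")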

Several assessments reflect traditional graduate school assignments and will be completed using word processing programs.

One assessment will ask students to use MS Visio (or another tool of their choice) to create a mind map (or other visual representation) of their understanding of the subfields of graduate theological education and their interrelationships.  The seminary provides access to Visio through the MS 365 student suite of applications and, while it may not be the perfect tool for the job, it does get students out beyond the Word, Excel, and PowerPoint neighborhood and thinking a bit about tools (and visual communication).

Questionnaire

  1. What were the specific activities or elements of the course where your interest was the greatest?
  2. Given the schedule flexibility (few set meeting times), did you find it easier or more challenging to engage the course material?
  3. Thinking about what you knew about theological education coming into the course, what was an important thing that you learned in this course?

 
