Professor Simon Mukwembi


TEACHING PORTFOLIO
2014
NATIONAL EXCELLENCE IN TEACHING AND LEARNING AWARDS


Methods of assessing students' work and performance

Our college adopted continuous assessment and final examination as the overall approach to assessing undergraduate students. Methods of continuous assessment vary from one lecturer to another. Where I am the sole lecturer for a course, I always assess my students' work and performance by giving them three major tests, the first written early in the semester, the second at mid-semester and the third towards the end of the semester. Since 2008 I have introduced, in addition to the major tests, tutorial tests which are written at the end of some well-chosen tutorials. I incorporate the assessment procedures in my course information sheet, which is provided to students either at registration or on the first day of the module, and I post a copy on the course website on Moodle, thus giving students room to plan ahead.

When setting a weekly tutorial, I use a developmental approach in which my assessment exercises increase in difficulty as students progress through the tutorial (Annexure 3A). In the tutorial session my tutors and I give guidance, suggest hints and provide feedback to individual students as they attempt each of the tutorial tasks. Although some of my colleagues do not give tutorial tests, I believe that a short (15 to 20 minutes) test at the end of the tutorial should be given in order to assess students' ability to demonstrate knowledge of the material covered in the tutorial (Annexure 3B). The tutorial tests, just like the major tests, also help improve students' skills in communicating effectively. I have always guided my tutors to mark tutorial tests and major tests as thoroughly as possible, to indicate where students can improve their mathematical language (Annexure 3B), and to give encouraging comments. I also give them an elaborate marking memorandum to standardize marking. A corollary of the above is that, since tutorial test scripts are returned to students before the next tutorial session, the test results provide timely feedback to students. I believe that in this way we can mould a well-rounded graduate who would be convincing in the world of work and in academia. I also use tutorial test performance to gauge the success of my lectures thus far. Further, I attempt to structure my tutorial tests in such a way that I do not unnecessarily overload students. Tutorial tests have received positive comments from students. To illustrate, responding to the question on which aspects of the module they found most useful, MATH236 students' responses included the following:

Tutorials and tutorial tests.
Tutorials.
All aspects.
All the aspects. The entire course was useful (Annexure 2B).

There are indications that the use of tutorial tests improves pass rates. For instance, when I taught MATH144 in 2007 I did not use tutorial tests and the final exam pass rate was 73.68%. I introduced tutorial tests the following year and the pass rate rose to 77.08%.

When setting tests or exams, I follow several guidelines. First, I should maintain (or even improve) the standard of the course. Second, memorization or reciting of material is one of the major negative effects of continuous assessment; to reduce it, I set new questions every year, since students tend to be well connected to those who came before them, and besides I also provide students with past tests and exams. Third, a test should examine a reasonable range of class material, whereas an exam should cover almost every aspect of the course, with emphasis on important concepts and on the understanding of methods and principles. Fourth, part of the purpose of tests is to help students learn; tests should also expose students' weaknesses. I keep track of my old test and exam material in order to get feedback for setting the next test or exam. During the semester, I ask students to write anonymous comments (Annexure 3C) about each test, which I use when creating the next test or the final exam. This affords students participation and allows them to be part of their learning rather than merely passive learners.

I consider myself most successful in achieving the above goals. To illustrate, as mentioned earlier, at the end of each semester I have always enjoyed good pass rates without compromising the standards and quality of the courses. For example, I began teaching MATH236 in 2008, yielding a pass rate of 71.3% compared to 53.15% and 42.06% in the two previous years. In 2010, the pass rate was 78.20% (Annexure 3D). One colleague, who was sitting in on my MATH236 lectures simply to enrich her own knowledge of the subject, pointed out in an unsolicited e-mail, regarding one tutorial test, that

It covered some really messy theorems and ideas very well, but the numbers were small enough that the test examined if the students understood the concepts (Annexure 3E).

In MATH144 I fulfilled my goal of increasing the examination content while maintaining a healthy pass rate, and still remaining within the bounds of the course syllabus, when I took over in 2007. In particular, in addition to what was examined in 2006, I examined systems of difference equations, Dijkstra's algorithm, Kruskal's algorithm, the labelling technique and the Minmax Theorem. To demonstrate the maintenance of healthy pass rates, as indicated earlier, in 2007 and 2008 the pass rates were 73.68% and 77.08%, respectively.
