Chemistry is an experimental science, and the laboratory remains a cornerstone of chemical education primarily for this reason. As teachers, we use this setting to reinforce and extend fundamental principles presented in the classroom. In addition, the undergraduate laboratory represents the primary environment for developing practical skills and exposing students to core techniques. These include proficiency in handling standard laboratory glassware and equipment, experimental precision and accuracy, safety awareness, and generic skills such as problem-solving, time management, teamwork, preparation and communication. As many readers will be aware, these skills are not only key graduate attributes for chemistry students; they are also essential in occupations beyond science-based careers.
Too much time assessing and not enough time teaching?
Traditionally, the performance of students in the undergraduate laboratory is measured using quantitative assessment models (e.g. performance is expressed as a percentage score). At the University of Tasmania in recent years, we became concerned that the focus of the first-year undergraduate chemistry laboratory program had shifted away from developing student competency towards quantitatively assessing it. Furthermore, our observations and interactions with undergraduates indicated that the existing form of assessment often led students to focus solely on obtaining a good grade, to the extent that the acquisition of fundamental skills became a much lower priority. Consequently, the techniques and concepts presented at first year were often overshadowed and/or quickly forgotten. We were also concerned that these issues were hindering the development of skills required in second-year chemistry units and in other degree programs that feature first-year chemistry, such as pharmacy, agricultural science and biomedical science.
After considering a range of alternative assessment strategies, we decided to investigate the viability of a competency-based assessment model in the first-year undergraduate chemistry laboratory. We recently published our preliminary findings.*
What are competency-based assessment models?
There are alternative approaches to measuring performance in the chemistry laboratory, such as competency-based assessment methods. These assessment models strive to shift the focus away from the end product (the grade) towards the development of key competencies, such as the fundamental skills noted above. Recently, there has been a push to incorporate competency-based elements into our assessment structures and use ‘blended’ approaches in which both competency- and performance-based quantitative assessment are employed. We were interested in taking this a step further by removing all quantitative assessment in the first-year laboratory – committing to a competency-based assessment model exclusively and evaluating the outcomes of this change.
Introducing a competency-based assessment model
One of the most challenging aspects associated with introducing this model into the first-year laboratory was the design of an appropriate criterion-based assessment framework. Consultation and fact-finding were central to the development process. This included discussions with staff and students at the University of Tasmania, informal conversations with colleagues at other institutions, and consulting the literature to familiarise ourselves with the outcomes of related studies. In addition, we endeavoured to ensure that our competency-based assessment model was aligned with the intended learning outcomes (ILOs) associated with our first-year chemistry units and the national Threshold Learning Outcomes (TLOs) for undergraduate university-level chemistry in Australia devised by the Australian Learning and Teaching Council in 2011.
Through this process, we identified 11 key criteria to underpin our new competency-based assessment structure:
C1: Proficiency in using analytical glassware
C2: Proficiency in using chemical glassware
C3: Experimental accuracy
C4: Recording observations
C5: Mastering chemical calculations and equations
C6: Understanding and applying chemical principles
C7: Heating, cooling and isolation
C8: Safety awareness in a chemical laboratory
C9: Efficiency and time management
C10: Professionalism and preparation
C11: Collaboration and teamwork
Each criterion is assessed a number of times during the semester, giving students many opportunities to demonstrate competency in different laboratory contexts. In addition to what could be considered ‘chemistry-centric’ criteria, the model also assessed soft skills such as time management, safety awareness and working in groups.
Did the focus of the lab program shift?
Our study employed three primary approaches to measure the effects of the new competency-based assessment model: a paper-based questionnaire to monitor student perceptions of the laboratory experience, a standard online unit evaluation survey administered by the university, and monitoring of the cohort’s academic performance in the laboratory. These data provided a framework for evaluating the effects of our competency-based assessment and how it was received by the students. The results were then benchmarked against baseline data collected in the previous year, when the laboratory program used a traditional quantitative assessment model.
The baseline data collected prior to implementing this new assessment model suggested that many students had only superficial knowledge of skills presented in the laboratory and treated all experiments as isolated experiences. As a result, students often had to be retaught these same techniques in subsequent first-year experiments and in higher years. When compared to the baseline data, post-implementation responses from students indicated an improved ability to identify the skills they had acquired and enhanced capacity to link these skills to specific experiments.
Of the survey questions posed, the question, ‘What skills have you acquired or improved upon completing the chemistry laboratory component?’ offered the best insight.
Prior to implementation of the competency model, analysis of the responses indicated that the most common theme reported by students was an improvement in their general laboratory techniques (28% of the cohort); however, these responses did not identify specific skills. After implementation, the most common theme again centred on laboratory techniques (14% of the cohort), but these responses went further, detailing specific procedures such as handling glassware, operating spectrophotometers, and performing titrations and synthetic processes.
Did the student experience improve?
Without corresponding data from higher years of study, we cannot draw any conclusions about the long-term effects of these changes. However, we noted that pass rates for the laboratory program, and for the unit more generally, were essentially unchanged. A greater percentage of students indicated that they found the overall program more enjoyable. Furthermore, for six of the seven questions in the survey used to obtain specific feedback on aspects of the student experience in the laboratory, responses improved by 5–15% following the change.
Deeper analysis, by comparison with the baseline data, revealed other, more nuanced positive effects. A significant portion of student responses cited benefits including reduced stress when undertaking experiments in a laboratory environment, additional time to learn from mistakes, and a better opportunity to overcome laboratory anxiety. This was mirrored in anecdotal observations from the laboratory instructors, who noted that students were more focused on understanding the experiments and asking questions, including seeking out one-on-one opportunities to practise or demonstrate their ability to meet specific competencies.
One of our concerns prior to undertaking this investigation was how students would approach ‘hurdle-based’ laboratory assessment. By the midpoint of the program, the majority of students could have satisfied the defined competency standard, at which point they could potentially disengage from the remaining experiments. Our solution was to mandate 100% attendance across the eight experiments comprising the program, with a two-day intensive ‘block’ laboratory program offered at the end of semester allowing students to catch up on any sessions they had missed.
We were pleased to note that the overwhelming majority of students approached this new competency-based assessment structure in the spirit that it was intended.
Where to next?
We will continue to monitor the effectiveness of this approach in first-year laboratories and its potential long-term effects on later years of study. This study stemmed from our interest in trying something new to address issues that we had observed. We anticipate that this assessment model will continue to evolve as we endeavour to keep improving our capacity to educate future generations of chemists and enhance the student experience.