Gold Silver Bronze (GSB) Quality Report 2012 Pilot

Our Gold, Silver and Bronze (GSB) quality system originated in early 2012. The first few pilots did not succeed, but by persevering and building on the experience of each one we eventually created an excellent system that provides an accurate measure of course activity and quality. This blog post outlines the original 2012 GSB pilot and the lessons learned.

The main objective of the GSB system was to create a report identifying the engagement and quality of every course taught at Croydon College and University Centre Croydon in our Virtual Learning Environment (VLE). The end result would be a positive impact on teaching, learning and assessment for both our students and staff.

By measuring the progress of course quality, Paul (ICLT Director) and I wanted a reporting system that could be used to identify areas that needed additional support, provide targeted training, share good practice and, most importantly, identify where the VLE was not fit for purpose, so that we could refine the system to meet the needs of the end user.

September GSB 2012 Quality Report

Overview

After conducting research and testing a few of the GSB systems available at the time (along with looking at Moodle's built-in reporting), we decided that we needed a better reporting system for the GSB. This meant building our own quality report. The very first report was created by identifying the elements of teaching, learning and assessment that determine the quality of courses at four tiers: Certificate, Bronze, Silver and Gold. After the E-Learning team drafted the criteria with Julie (Head of Division for Business, Science, IT and Travel and Tourism), and following consultation with all the Heads of Division (HoDs), we settled on the following brief summary of the criteria:

Certificate Award criteria covered very basic administration information. Bronze Award criteria covered the essential information that students needed in order to complete a course, including key deadlines and resources for catch-up and revision. Silver Award criteria covered the use of feedback and formative and summative assessment mechanisms, and Gold Award criteria covered extended opportunities for peer work and independence when completing a course online.

Implementation

The GSB 2012 quality report data collection and distribution system comprised three main elements:

1. The criteria for each tier were mapped out in a document, complete with a rationale, and grouped together by tier. All lecturers were given access to this document through the intranet and could use it to understand and implement the GSB criteria, broaden their understanding of the facilities offered by the VLE and develop their understanding of what constitutes a high-quality course.

2. The quality checking system consisted of a spreadsheet with all courses listed in the first column and the GSB criteria mapped along the top row. A member of the E-Learning team would then perform a manual yes/no check for every course in the VLE to see whether each criterion had been met. I added validation to the spreadsheet fields to reduce user error and increase overall accuracy when completing the check.

The spreadsheet containing the data would be uploaded to a Moodle course for all HoDs to access. For ease of use, I added a macro-based navigation system to the spreadsheet, complete with a main menu page.

GSB 2012 Main Menu (Early draft)

Quite early on in the trial, staff fed back that making sense of the data and processing it into usable information through pivot charts and scatter diagrams was too complicated and time consuming. To solve this problem, I wrote a couple of VBA scripts to process the data automatically through a series of drop-down menus. The end user only needed to select the division or team they wished to analyse and then click a sequence of six clearly laid out buttons to automatically build the pivot and scatter charts, as sketched below.

GSB 2012 Filter Menu

Pivot chart location - After selecting the relevant filters, a pivot chart would automatically appear in this space

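The original scripts were written in Excel VBA, but the underlying data processing can be sketched in Python with pandas. This is a rough, assumed reconstruction rather than the actual implementation: the file name and column headings (Division, Course, Award, CriteriaMet) are illustrative, not the real 2012 spreadsheet layout.

```python
# Rough sketch (not the original VBA) of the filter-then-chart processing.
# Assumes one row per course with illustrative columns: Division, Course,
# Award (Certificate/Bronze/Silver/Gold) and CriteriaMet (number of criteria met).
import pandas as pd
import matplotlib.pyplot as plt

checks = pd.read_excel("gsb_2012_checks.xlsx")   # hypothetical export of the checking spreadsheet

division = "Business, Science, IT and Travel and Tourism"   # the division the end user selects
subset = checks[checks["Division"] == division]

# Pivot: how many courses sit at each award level within the chosen division.
pivot = pd.pivot_table(subset, index="Award", values="Course", aggfunc="count")
pivot.plot(kind="bar", legend=False, title=f"GSB awards - {division}")

# Scatter: criteria met per course, to spot courses needing additional support.
plt.figure()
plt.scatter(range(len(subset)), subset["CriteriaMet"])
plt.xlabel("Course (index)")
plt.ylabel("Criteria met")
plt.show()
```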
3. The final part of the GSB implementation was a self-training Moodle course that explained how to use the spreadsheet. I created the training course by adding resources explaining all of the relevant information about the data processing (including anomalies) and by recording a series of screencasts demonstrating how to use the spreadsheet, including the filters and the interpretation of results.

GSB training materials 2012

The training course for the GSB 2012

Conclusion

Unfortunately, the pilot did not succeed in meeting our main objective, but it did give us enough information to move forward to the next pilot, GSB 2013.

Lessons learned and changes made

The GSB 2012 pilot revealed many problems with the system, and we had to put significant changes in place. In summary: some of the GSB criteria were an incorrect measure of quality and some items were irrelevant; the system was inflexible and missed many areas of good practice; because it was a manual checking process, it was too resource intensive and time consuming, and the information took too long to reach HoDs and Team Leaders to be useful; and the pilot revealed lots of empty courses coming in from MIS, which meant we had poor visibility. The GSB was not fit for purpose. A more detailed analysis follows:

Point system

The main change we had to make was to move the GSB measure from a rigid boundary assessment to a point-weighted system. The 2012 system was too restrictive and did not take into account different methods of teaching and individual style; it was a one-size-fits-all model. For example, every element of the Bronze and Silver Awards had to be met before a Silver Award could be achieved, but not all of the Silver criteria were suitable for every programme of study, audience or teaching practice. The system would force the use of a resource or activity without justification and without regard for the teaching and learning. Furthermore, the stepping-stone approach meant that if a course achieved all of the Silver and Gold criteria but not the underlying Bronze, none of this work would be recognised and the course would be deemed poor. Every element of a rigid set of criteria had to be met before a course could be recognised at the next award level, and this did not encourage good use of technology.

The only way to address this was to build the new system on point-weighted thresholds, with the criteria at each level being worth a different score. With the new system, you can pick and choose several items from the Silver category (since those criteria all address a similar aspect of pedagogy, including the provision of feedback, guidance and support), but it is up to you as a lecturer which activity you select. As long as you meet a couple of the criteria and have enough points from the Bronze category, you are awarded a Silver Award, and the delivery mechanism (such as the means of feedback) remains flexible, based on your course and style. The other main benefit of this system is that hard work is always recognised. For example, if I miss a Bronze criterion worth 3 points, I can make up the difference with two Silver activities or three Gold activities (covering the missing 3 points) to take me over the boundary and gain the Bronze Award. The system also highlights areas of excellent practice, even if the rest of the award criteria have not been met, and it quickly shows the areas of a course to develop.
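As a minimal sketch of this logic, the scoring can be reduced to a few lines of code. The point values and thresholds below are illustrative assumptions, not the actual figures used in the later GSB systems.

```python
# Illustrative point-weighted award logic. The point values and thresholds are
# made-up examples for the sketch, not the real GSB figures.
BRONZE_THRESHOLD = 12
SILVER_THRESHOLD = 20
GOLD_THRESHOLD = 28

def award(criteria_met):
    """criteria_met: list of (tier, points) tuples for each criterion a course satisfies."""
    points = sum(p for _tier, p in criteria_met)
    if points >= GOLD_THRESHOLD:
        return "Gold"
    if points >= SILVER_THRESHOLD:
        return "Silver"
    if points >= BRONZE_THRESHOLD:
        return "Bronze"
    return "None"

# A course that misses one 3-point Bronze criterion can still cross the Bronze
# boundary with points earned from Silver (or Gold) activities.
course = [("Bronze", 3), ("Bronze", 3), ("Bronze", 3),   # one further 3-point Bronze item missed
          ("Silver", 2), ("Silver", 2)]                  # two Silver activities cover the gap
print(award(course))  # -> Bronze (9 Bronze points + 4 Silver points = 13 >= 12)
```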

Engagement and usage

The pilot showed us that points from the criteria had to be directly connected to engagement, not just to the presence of a resource or activity. A big problem with the GSB 2012 system was that a good course could be built, meet all of the criteria for several awards and then never be used. This meant the report could be tricked, and the main purpose of using a VLE was null and void. For the next system, we had to make sure that the awards were connected to usage: only higher-usage courses can achieve a Silver or Gold Award.
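One way to express that rule, as a sketch rather than the actual implementation, is to cap the attainable award by a usage measure such as log entries per learner; the thresholds below are assumptions.

```python
# Sketch: cap the attainable award by course usage so that a well-built but
# unused course cannot reach Silver or Gold. The thresholds are assumptions.
ORDER = ["None", "Bronze", "Silver", "Gold"]

def usage_cap(log_entries_per_learner):
    if log_entries_per_learner >= 50:
        return "Gold"      # high engagement: no cap
    if log_entries_per_learner >= 20:
        return "Silver"
    return "Bronze"        # low engagement: Bronze at best

def capped_award(points_award, log_entries_per_learner):
    cap = usage_cap(log_entries_per_learner)
    return ORDER[min(ORDER.index(points_award), ORDER.index(cap))]

print(capped_award("Gold", 12))   # -> Bronze: built to Gold standard, but barely used
```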

Yes/No criteria check

Instead of a simple Yes/No check to indicate whether a resource or activity exists, the next system needed to include a count of each type of resource and activity to give a clearer picture of course usage and overall development over time. The first report was flawed in that it would indicate whether a course contained a resource or activity but offered no detail about how many resources or activities the course contained: a course containing 20 resources and a course containing 200 would both obtain a 'Y', even though they are hugely different in scale. I had originally settled for a Yes/No check against a threshold of 20 resources and activities, rather than totalling each resource and activity type individually, because it would have taken too long to count these items manually for every course.
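A per-type count becomes straightforward once the check is automated. The sketch below assumes a flat export of (course, activity type) rows from Moodle; the data shown is illustrative only.

```python
# Sketch of the per-type count that replaced the simple Yes/No check.
# `modules` stands in for an export of (course, activity type) rows from Moodle.
from collections import Counter, defaultdict

modules = [
    ("BTEC Business Year 1", "resource"),
    ("BTEC Business Year 1", "resource"),
    ("BTEC Business Year 1", "quiz"),
    ("A Level Physics", "forum"),
]   # illustrative rows only

counts = defaultdict(Counter)
for course, module_type in modules:
    counts[course][module_type] += 1

for course, per_type in counts.items():
    print(course, dict(per_type), "total:", sum(per_type.values()))
# A course with 20 resources and a course with 200 now look very different,
# rather than both earning the same 'Y'.
```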

Time 

Checking the quality of all courses took approximately six weeks of solid work. It was a long and repetitive process, prone to user error, and by the time we had gathered the data it was already out of date. We needed another solution for the next pilot.

MIS Data

The manual checking process was continually hampered by irrelevant data (in the form of empty courses) coming in from MIS. This also meant that we could not quickly identify whether a course was actually being used by a lecturer. Sometimes lecturers were missing from a course altogether, so we could not offer them any support. We therefore had to look at changing the data that came into Moodle for the next GSB system.
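The kind of filter we needed can be sketched as below; the course fields are assumptions about a simple export, not a real Moodle API.

```python
# Sketch: flag MIS-created courses that are empty or have no lecturer enrolled,
# so they can be excluded from the report or chased up. The fields are assumed.
courses = [
    {"name": "AAT Level 2 Group A", "module_count": 0,  "teachers": []},
    {"name": "BTEC Business Year 1", "module_count": 34, "teachers": ["J. Smith"]},
]   # illustrative export only

empty_courses = [c["name"] for c in courses if c["module_count"] == 0]
unowned_courses = [c["name"] for c in courses if not c["teachers"]]

print("Empty courses from MIS:", empty_courses)
print("Courses with no lecturer enrolled:", unowned_courses)
```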

Removal of Certificate Award items and restructuring of Bronze criteria

From the trial, we also realised that some of the information we had deemed necessary for students to know about could be centralised on open-access Moodle courses, saving the replication of information and making it easier to keep the information up to date. It also prompted us to look at our working practices and increase overall efficiency.

Bronze restructuring

Following research into other emerging GSB award systems, feedback from students and the removal of the Certificate Award, the Bronze criteria needed to become the minimum standard we expect to see on a course: clear information for students, up-to-date resources providing context for self-study and revision, and a consistent structure for navigation. There are also a few clearly defined resources that we expect to see in every course, such as a scheme of work.

Silver restructuring

We removed an item or two from the Silver Award criteria and added a few more. The Silver criteria now include resources or activities that provide summative and/or formative feedback (including self-assessments and anti-plagiarism tools for developing research skills), tracking, student voice feedback, interactive multimedia SCORMs, and Functional Skills, Employability or PLTS strand mapping (promoting differentiation). The new Silver Award places greater emphasis on resources that enable students to interact with Moodle in order to develop their understanding in some form and to gain information about how to develop in their learning.

Gold restructuring

We included activities that promote peer or group work, together with activities that are interactive and that increase independence and overall engagement.
