
- 192 pages
- English
- ePUB (mobile friendly)
- Available on iOS & Android
About this book
The fourth volume in this series deals with one of the most ubiquitous subjects in higher and further education. With a practice-based approach, the text avoids being overly academic and instead uses a case study format to detail a wide range of approaches to assessment.
SECTION 1
INFORMATION TECHNOLOGY: ONE ANSWER TO ASSESSMENT IN LARGE CLASSES
CHAPTER 1
TAKING THE BYTE OUT OF COMPUTER-BASED TESTING
Issues raised
This case raises the issue of managing some of the difficulties that arise when computer-based tests replace paper and pencil tests for campus-wide assessment.
Background
The Center for Assessment and Research Studies (CARS) at James Madison University (JMU) in Harrisonburg, Virginia, collaborated with the Office of Information Technology (OIT) and the General Education programme to open two computer-testing laboratories in 1999. During the 1999–2000 academic year, General Education Cluster 1, Skills for the 21st Century, required incoming first-year students to show competency in Microsoft Word, Microsoft PowerPoint, technology knowledge and information seeking by the end of their second semester. A 50-seat computer-testing lab was open approximately 25 hours per week for drop-in testing. Over 3,000 first-year students eventually completed their Cluster 1 competency tests during that first year of testing. The events described in this case took place during the first semester of that year and the first author is the narrator.
PART 1
This did not look promising! As I walked up the stairs, I began to see the students. There were at least 200 of them – maybe more – forming a staggered line down the hall. Some were sitting down and others were standing. I saw students eating their lunches and chatting with friends, while others were leaning against the wall with their eyes closed. These students had been waiting quite a while and, by the looks of things, they would probably be waiting some time yet.
I made my way down the hall to the entrance of the computer-testing lab. As I entered the lab, I saw Frank (one of the test proctors) leaning over a student's chair assisting with a login procedure, while our programmer/analyst, David, was working on some kind of technical malfunction at an empty computer station. After a moment, I caught Frank's eye. He must have been surprised to see me, because he asked, "Is everything okay?" I said, "The proctor scheduled for this afternoon's session has cancelled and I'll be filling in for the rest of the day." Frank raised an eyebrow and said wryly, "What a convenient time for the proctor to have to cancel. I hope you're ready for a challenging afternoon!"
The lab was busier than it had been at any time during the semester so far. In August, I had accepted responsibility for staffing and scheduling the new computer-testing lab. For many weeks, the proctors had reported that only one or two students per week came to take the required tests. Had we recalled our own days as university students, we could probably have predicted that many students would procrastinate as long as possible before meeting their testing requirement. We were now at the day before the deadline and it was going to be a long afternoon.
The room was filled with 48 students because two of the available computers had crashed and would not reboot. Unfortunately, no one from computing support had been able to attend to the problem yet. I saw a girl over in the corner with her hand raised, so I asked her what was wrong. She said with feeling, "I've just spent 20 minutes taking my PowerPoint test, but the computer froze while calculating my score. Now what happens?" I had seen this problem a few times during the semester, so I knew a few things to try. Unfortunately, all my efforts failed, so I had to tell her, "I'm sorry, but you'll have to retake the test". This particular student turned out to be more understanding than most, but I still found it difficult to have to tell her that her score was not retrievable.
As I moved on to the next request, I noticed that Frank and David, like me, were still busy assisting students with problems and questions. This time, a student told me, "This test question says to underline the word 'managed' in this sentence, but how can I do that when the word isn't there to underline?" This problem was easily solved, but now the student had less time to complete the rest of the timed exam. Throughout the semester, I had become aware that the software being used to assess competency in Microsoft Word and PowerPoint had some programming glitches, but the full consequences of these difficulties were only now becoming apparent as the volume of testing increased dramatically.
As I continued to walk around the room, I noticed that our Web-based Information Seeking Skills Test (ISST) was particularly slow this day. Because this test is Web based, the items load most slowly at times when the network is getting the most use on campus. The ISST links students to specific databases where they can search for answers to test questions. Unfortunately, if the databases are unavailable for some reason, the student receives a "failure to connect" message and cannot respond to the question. One frustrated student asked, "How in the world am I supposed to meet the testing deadline when I can't even get the test to work?!"
So went the "long afternoon". I wish I could say that my experience in the testing lab that afternoon was unique – but I can't and it wasn't. As each testing deadline came and passed during that first semester, the experience was repeated. The more time I spent in the lab, the more I wished we could go back to using the faithful paper tests, pencils and bubble answer sheets. As long as I had enough supplies for everyone, the tests always "worked". Ah for the "good old days". Realistically, I knew that we had to forge ahead with our computer-testing efforts, but I also knew that we had to do something about the problems – and quickly. What were the most important issues to deal with? And what should we do?
What steps would you recommend taking to improve the performance of the computer-testing programme?
What do you think was actually done?
PART 2
Well, we did not make it through our first semester of computer-based testing as gracefully as I had hoped, but we did make it through. The majority of students had completed their required tests by the end of the semester and we had a readable data file for each test. Based on our experiences with implementing computer-based testing on such a large scale, we identified a number of crucial issues and began revising the testing process immediately. Our two primary groups of concerns were technical and administrative and they led to the following conclusions:
- The testing software had to be properly evaluated and made more reliable.
- The dependability of the Web-based parts of the test had to be improved.
- The scheduling of testing had to be improved so that overcrowding would not occur at particular times.
- Overall management of the testing lab had to be more clearly defined.
After our first semester's experience in the lab, we realized that there were several problems with the software that was assessing the students' knowledge of Microsoft Word and PowerPoint. First, the software interface was not user-friendly. As a result, two pages of written directions were needed to assist students to begin their tests. In addition to the difficulty of getting logged in to take the exam in the first place, students were often being kicked out of the test because of various programming errors. At one point, we calculated an error rate of 6 per cent for our testing sessions. Basically, 3 out of every 50 students were being forced to retake a test because of the inconsistency of the testing software.
This error rate concerned us, and we shared this information with others in our Assessment Center. We decided that an error rate above 1 per cent was unacceptable, and we requested a meeting with all the relevant parties at JMU and the software company. Frank, David and I along with other members from our respective offices and representatives from OIT attended a meeting early in the second semester. At this meeting, we learnt that JMU was the first institution to implement this software for such widespread testing. We were discovering problems that no one else had ever seen. The software company representative agreed that the company would address these problems in a new version of the software that would be released in a few months.
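The arithmetic behind those figures is simple, and a small monitoring check makes the target concrete. The following is a minimal sketch only – the case describes no such tooling, and the function names are hypothetical – showing the 3-in-50 sessions reported above set against the 1 per cent ceiling the Center adopted.

```python
# Minimal sketch (hypothetical tooling): computing a testing-session error
# rate from proctor logs and checking it against an acceptability ceiling.
# The numbers mirror the case: 3 forced retakes in every 50 sessions is 6%.

def error_rate(forced_retakes: int, total_sessions: int) -> float:
    """Fraction of sessions lost to software errors."""
    return forced_retakes / total_sessions if total_sessions else 0.0

def is_acceptable(rate: float, ceiling: float = 0.01) -> bool:
    """Anything above 1% was treated as unacceptable."""
    return rate <= ceiling

if __name__ == "__main__":
    rate = error_rate(forced_retakes=3, total_sessions=50)
    print(f"Observed error rate: {rate:.0%}")               # 6%
    print(f"Within the 1% ceiling: {is_acceptable(rate)}")  # False
```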
Before agreeing to use the software again, however, our Center's director requested a mass pilot testing of the new software before use the following academic year (2000–01). When we attended the pilot testing in August, we found that the software company had made noticeable improvements in the interface and having almost 40 people enter the software simultaneously did not crash the system. We began using the new software in September 2000. Along with the many improvements over the previous year, a few new problems cropped up as well. (For example, on the PowerPoint test, students were instructed to move a slide to the end of the presentation. When they performed the task, they received an error message: "An error has occurred and this exam will be shut down". Beneath that window could be seen the PowerPoint window error that "This program has performed an illegal operation and will be shut down". Since it was proprietary testing software, there was little we could do about the problem.) Overall, the software had been improved from the previous year, but evaluating its weaknesses and bringing them to the attention of the software company will be an ongoing process.
Just as the software we purchased presented challenges, the Web-based tests developed by CARS presented another technical problem. Because they are Web based, these tests were slow at times when the network was busy or bogged down. The movement of student response data between the computer lab and the file server was often delayed. This was not a problem for tests in which all the items were loaded at once and the student merely scrolled down through the test, but, unfortunately, a slow exchange between the lab computer and the server severely limited the possibility of implementing computer-adaptive tests. On such tests, a student is given his or her next test item based on the response to a previous item. This back-and-forth interaction between lab computers and the server was found to be painfully slow during peak times. We are in the process of trying a variety of possible solutions. First, we are looking at the physical location of the file server. Although we have fibre optic cable, having responses travel long distances and through routers might slow transmissions. Second, when we cannot move a file server to the respective computer-testing lab, we are trying to reduce the number of routers and nodes that transmitted responses have to pass through. Third, we are looking at how responses are transferred: one test item at a time, or the entire test transmitted as a whole.
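The latency issue can be seen by comparing the two delivery patterns directly. The sketch below is purely illustrative – the interfaces are hypothetical and this is not the CARS or ISST code – but it shows why a fixed-form test pays the network cost once, while an adaptive test pays it on every item, because each new item depends on the previous response.

```python
# Illustrative sketch (hypothetical interfaces, not the CARS/ISST software):
# a fixed-form test needs one round trip to the server, while an adaptive
# test needs one per item, so peak-time network delay multiplies quickly.

import time

ROUND_TRIP = 0.5  # illustrative per-request delay (seconds) at peak load

class ItemServer:
    """Stand-in for the file server holding test items."""
    def __init__(self, items):
        self.items = list(items)

    def fetch_all(self):
        time.sleep(ROUND_TRIP)                # one round trip, total
        return list(self.items)

    def fetch_next(self, previous_response):
        time.sleep(ROUND_TRIP)                # one round trip *per item*
        # A real adaptive engine would select the item from an ability
        # estimate; here we simply hand out the next one.
        return self.items.pop(0) if self.items else None

def run_fixed_form(server, answer):
    """All items load at once; the student scrolls through them."""
    return [answer(item) for item in server.fetch_all()]

def run_adaptive(server, answer):
    """Each new item is chosen only after the previous response arrives."""
    responses, item = [], server.fetch_next(None)
    while item is not None:
        responses.append(answer(item))
        item = server.fetch_next(responses[-1])
    return responses

if __name__ == "__main__":
    items = [f"item {n}" for n in range(1, 11)]
    start = time.time()
    run_fixed_form(ItemServer(items), lambda item: "A")
    print(f"fixed form: {time.time() - start:.1f}s of network wait")
    start = time.time()
    run_adaptive(ItemServer(items), lambda item: "A")
    print(f"adaptive:   {time.time() - start:.1f}s of network wait")
```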
One of our administrative concerns was the long line of students that formed around testing deadlines, resulting in students having to wait two hours or more to take their tests. Typically, students were not particularly excited about required testing anyway, so the long wait made their experience less pleasant. During the first year of testing, we had only one 50-seat computer lab available. Initially, we had made drop-in testing available all semester long, but we found that most students waited until a day or two before the deadline to come to take their tests.
Our second year of computer-based testing commenced at the start of the 2000–01 academic year, and part of our solution to the problem of overcrowding has been to open an additional 100-seat computer lab. We have also developed a system of staggered testing deadlines based on students' ID numbers. At JMU, student ID numbers are randomly assigned. Our 3,000 first-year students have been divided into groups of approximately 300 by assigning testing deadlines based on the last digit of their ID numbers. Groups of about 300 students match our current testing resources, because we now have computer lab seating for 150 students. By dividing the students into these smaller groups, we have been able to set 10 different testing deadlines and so accommodate all of the students who come in to test.
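The grouping rule itself is trivial to express. As a rough sketch (hypothetical code, not JMU's registration system), assigning each student to one of 10 deadline groups by the last digit of a randomly assigned ID splits roughly 3,000 students into groups of about 300:

```python
# Rough sketch (hypothetical): staggering deadlines by the last digit of a
# randomly assigned student ID. With ~3,000 students and 10 possible digits,
# each deadline group ends up holding roughly 300 students.

from collections import Counter
import random

def deadline_group(student_id: str) -> int:
    """Deadline group 0-9, taken from the last digit of the ID."""
    return int(student_id[-1])

if __name__ == "__main__":
    # Simulated random IDs, standing in for JMU's randomly assigned numbers.
    ids = [f"{random.randrange(10**9):09d}" for _ in range(3000)]
    print(Counter(deadline_group(i) for i in ids))  # ~300 per group
```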
Typically, the testing lab is not open for many evening hours during the semester because it is difficult to staff the lab, and most students have time during the day when they can come in for testing. On days that coincide with deadlines, the test administrator schedules evening lab hours to minimize problems for students who have schedule conflicts. Another approach to reducing waiting lines has been to encourage faculty members to schedule times near the beginning of the semester when they can bring their classes in for testing.
A further administrative concern that we had about the testing lab during its first year was its management – a responsibility that was mine that year. Trying to manage the lab part-time was difficult as the demand for its use increased. The first year, we staffed the lab with graduate students who were trained to be proctors. Piecing together a schedule that would provide good coverage and evening hours was difficult. I also found it difficult to inform eight different proctors about the solution to a specific problem or about a policy change. Because of the location of my office, there were many times that I could not be at the lab, and I thought to myself that it would be better to be closer when difficulties arose. Eventually, a lab manager was hired to coordinate proctors and work with the different parties on campus who were involved in the computer-based testing effort.
Implementing computer-based testing has been a challenge that will continue. As our tests and technology continue to evolve, there will be new issues to address and old issues to be reconsidered. However, we believe that working through the challenges posed by computer-based testing has definitely been worth the effort.
How well do you think the problems were handled?
What alternatives might there have been?
What general lessons can be drawn from the case?
CASE REPORTERS' DISCUSSION
We learnt many lessons when the transition from paper/pencil testing to computer-based testing occurred. First, when a computer-testing lab is being established, it is important to consider how the network will be set up and where the servers will be located. We described in the case how these factors affected our computer-testing programme.
Other technical issues involve software and hardware. Regarding software, will the Internet be used in a Web-based test or will a microcomputer-based test be constructed? If a Web-based test is chosen, occasions when the Internet response time is slow must be expected. With some tests, selected Web sites and pages can be copied and the test made to look as though one is going "to the Web". We have had trouble using Perl code and now use JavaScript. And although the Web was originally created for text (ie HTML) exchanges, improvements are on the way.
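One simple way to realize the "copied pages" approach is to serve the saved copies from a local or campus server, so that test items link to the local copies rather than to the live sites. The snippet below is a minimal sketch using Python's standard library; it assumes the selected pages have already been saved into a local copied_pages directory and is offered only as an illustration, not a description of how the CARS tests were built.

```python
# Minimal sketch (illustration only, not the CARS implementation): serving
# locally saved copies of Web pages so a test behaves as though it is going
# "to the Web" without depending on outside sites during the exam.

from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Assumes the selected sites/pages have been saved under ./copied_pages
Handler = partial(SimpleHTTPRequestHandler, directory="copied_pages")

if __name__ == "__main__":
    # Test items can then point at http://localhost:8000/... so response
    # time no longer depends on outside sites or peak campus traffic.
    HTTPServer(("", 8000), Handler).serve_forever()
```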
Also with respect to software, before settling on the purchase or lease of any proprietary testing software, it is wise to check out the helpfulness of the company's support personnel. When we could identify particular problems in the software, we had difficulty getting the software company to correct the mistakes. If they say a correction will be included in the next version, it would pay to find out when the next version will be available.
Regarding hardware, what level of microcomputer is being considered for purchase? Strictly speaking, Web-based tests may not need large memory. If a test requires more calculations or distributed processing, the microcomputer must have sufficient memory and processing speed. We recommend headphones so that audio capabilities may be exploited. For instance, in our oral communication test, students listened to speeches and small group discussions. Video capacity of the microcomputer is a consideration when students view art or other visual images.
At the same time, when multimedia capabilities are utilized in a testing situation, network personnel should be contacted to determine how best, or if, such media can be transported. We had good cooperation with network personnel in piloting the simultaneous transmissi...
Table of contents
- Cover
- Half Title
- Title Page
- Copyright Page
- Table of Contents
- Contributors
- Introduction
- Section 1: Information Technology: One Answer to Assessment in Large Classes
- Section 2: Reflective Assessment: Journals, Logbooks, Portfolios, Peer Assessment
- Section 3: Institution-wide Assessment Programmes: the US Perspective
- Section 4: Assessment Methods for Special Purposes
- Section 5: Addressing the Needs of Individual Students in Assessment
- Section 6: Hands-on Assessment: Everyday Problems in Assessment Practice
- Conclusion
- Further reading
- Index