
Assessments - A Primer for Remote Teaching (E-CORE Webinar)

Trying to transition your assessments and evaluations online? Take a look at this before you start!

This is a summary of the Assessments - A Primer for Remote Teaching webinar presented by instructors from the University of British Columbia and Queen's University in collaboration with E-Core. The webinar offers tips and tricks on adapting assessments to an online platform while ensuring student success and academic integrity.

What is E-Core?

E-Core is a national collaborative initiative recently established by the Canadian Engineering Education Association in response to COVID-19. It helps ensure that remote engineering program delivery maintains the standards of the Canadian Engineering Accreditation Board, and it provides support and resources to engineering educators across the country.

Scenario 1

Say you are a student in a circuit analysis course. The learning outcome is to demonstrate the ability to analyze AC and DC circuits. The instructor introduces component fundamentals and analysis theory in online lectures. The TAs demonstrate solution procedures and check weekly homework completion (30%). The final exam is worth 70%.

As a student, what would your reaction be?

"I'm going to fail."

The student will also feel panic, fear, stress, and pressure: the weighting on the final exam is very heavy. Some students may be tempted to turn to Chegg or other sources.

Assessments and Roles

Assessments are not just about grades. Assessments have many purposes including:

  • Providing feedback - identifies strengths and weaknesses; allows students to adjust and adapt their study efforts and methods
  • Enhancing learning - the act of recalling information and applying it is a powerful learning tool
  • Boosting motivation - provides students extrinsic motivation to study and holds them accountable to their course and peers
  • Evaluating students - used to measure student performance

Assessments must constructively align with course and learning outcomes. Students should not be surprised when presented with your assessment. Making it unnecessarily difficult or deviating from learning outcomes will only add needless pressure and stress on your students.

Assessment Planning Framework

While developing your assessment, make sure to:

  1. Identify the context
    • Policies
    • Student numbers and needs (account for scalability in your assessments)
    • Course role within the program
    • Development time and resources
  2. Identify learning outcomes
    • Is this assessment related to previous learning activities?
    • Does it align with the course learning outcomes?
  3. Identify tasks that address assessment goals
    • Feedback / Learning / Motivation / Evaluation

Consider:

  • Academic integrity in the remote environment
  • Student equity: hardware limitations, access to quality internet, timezones
  • Student and instructor workload
  • Authenticity: assessments should also reflect the behaviour of a professional working in this field

Scenario 2

Suppose we have a physics course with 400 students across Canada, India, and China. The learning outcomes include creating simple models of simple systems using Newtonian mechanics to predict behaviour, and recognizing and using physics concepts - including force, power, energy, torque, and momentum.

The assessment plan is as follows:

  • Completion grade for homework uploaded weekly (10%)
  • Two randomized 2-hour multiple-choice midterms (20% each)
  • Remote-proctored 3-hour multiple-choice final exam (50%)

How well does this marking scheme work?

  • ✓ Randomized and remote-proctored tests can be used to maintain academic integrity
  • ✓ Keeping instructor workload manageable is key to creating fair tests
  • ✗ Completion grades provide little opportunity for feedback and learning
  • ✗ MC exams do not address the learning outcome of creating simple models
  • ✗ Equity may be an issue due to remote proctoring (time zones)
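When designing a scheme like this, a quick sanity check is that the component weights total 100%. A minimal sketch in Python; the component names and example scores are illustrative, not from the webinar:

```python
# Sanity-check that assessment weights sum to 100% and compute a
# weighted final grade. Components mirror the Scenario 2 plan;
# the example scores below are made up for illustration.
weights = {
    "homework": 10,     # completion grade, weekly uploads
    "midterm_1": 20,    # randomized multiple-choice
    "midterm_2": 20,
    "final_exam": 50,   # remote-proctored multiple-choice
}
assert sum(weights.values()) == 100, "weights must total 100%"

def final_grade(scores_pct):
    """Weighted average; scores_pct maps component -> score in [0, 100]."""
    return sum(weights[c] * scores_pct[c] for c in weights) / 100

print(final_grade({"homework": 100, "midterm_1": 70,
                   "midterm_2": 80, "final_exam": 75}))  # 77.5
```

Keeping the weights in one place like this makes it easy to experiment with rebalancing (e.g. shifting weight from the final exam to low-stakes components) before committing to a syllabus.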

Academic Integrity

To avoid misconduct, we must address the student perspective. Assessments are stressful for students at the best of times. Many students are under pressure to pass or to obtain good marks. In-person assessments limit many (but not all) opportunities for misconduct.

Instructors are now faced with two tasks:

  1. Creating online assessment strategies accessible for all students
  2. Ensuring the integrity of these assessments

Fraud Triangle

The Fraud Triangle states that fraud and misconduct are influenced by three main factors: Pressure, Opportunity, and Rationalization.

Pressure on students, whether for high grades or to pass, can be a double-edged sword. While it boosts motivation, impulsive "panic cheating" can occur on high-stakes exams. We can reduce this pressure by offering flexible grading, multiple low-stakes evaluations, and alternative (authentic) assessments.

Opportunities to engage in misconduct must be minimized. This can be done through proctoring for exams, controlling exam times and access, individualizing assessments and accountability.

Rationalization for misconduct may result from believing that potential rewards outweigh the risks, skewed views on cheating, or lack of understanding of expectations. To avoid the rationalization of "everyone is doing it," or "it's not a big deal," it is important to weave integrity throughout the program. Have clear guidelines of what is acceptable and provide integrity pledges on all assessments. Address misconduct cases accordingly and clearly state the consequences.

Remote Proctoring

*Please note that the Engineering Faculty does not currently support remote proctoring.

Why is it considered?
  • Reduces opportunity and rationalization
  • Increases fairness by discouraging collusion and use of prohibited aids
  • Protects the confidentiality of exam questions

Cons & Concerns
  • Student privacy concerns
  • Additional stress on students
  • Student equity (may require additional hardware and reliable internet)
  • Complexity and cost

Scenario 3

  • An 800-student, traditionally taught technical foundations course
  • Face-to-face consisted of:
    • Auto-graded homework on LMS (10%)
    • In-class participation (5%)
    • Two midterm exams (30%)
    • Written final (50%)

How should the grading scheme change?

  A. Maintain the face-to-face approach, but use proctored exams
  B. Make homework worth 25% and replace the 3 exams with 7 unproctored quizzes (10% each)
  C. Replace the 3 exams with a major team design project (45% for the technical component, 25% for the oral component, 15% for peer evaluation)
  D. Replace the 2 midterms with 5 unproctored quizzes, and make the final exam an oral exam


The class is too big for everyone to do an oral exam, so we can't choose D. Large class sizes can also make team projects challenging, so C may be difficult to implement. B is the most feasible, though quizzes should be randomized to ensure integrity. A can work as well, but only if the institution allows proctored exams.

Scenario 4

  • Small (12-student) senior technical elective course; all students in Canada
  • Face-to-face consisted of:
    • Weekly interactive problem solving tutorials (20%)
    • Closed-book final exam (80%)
  • For remote course:
    • Can replace tutorials with video solutions of tutorial problems
    • Zoom does not allow video proctoring of exams

How should the grading scheme change?

  A. 20% for viewing tutorial videos; 80% final oral exam
  B. No grades for videos; add a midterm oral exam (40%) and make the final exam oral (60%), both using Zoom
  C. Students complete tutorials as homework before videos are released and are then graded for correct answers (20%); add an open-book midterm exam (30%) and an open-book final exam with open-ended scenario questions (50%), written simultaneously by all students
  D. Same homework as C (20%); add a challenging time-limited midterm exam (30%) and an open-book final exam of closed-ended calculation questions (50%) written over a 48-hour period

Students get no feedback through route A, so it is not ideal. B can be feasible, but only if the institution allows Zoom to be used for oral exams. Since upper-year students are more mature and tend to have fewer integrity issues than lower-year students, C and D are both feasible. Ultimately, C is the better choice.

Scenario 5

  • Moderate (100-student) introduction to laboratory methods course including theory and practice
  • Face-to-face consisted of:
    • One graded team experiment design project (35%)
    • 10 labs with graded reports done in pairs (20%)
    • One midterm exam (15%)
    • Final exam (30%)
    • Students must pass weighted exams to pass the course (for individual accountability)
  • Remote labs will be done individually with equipment mailed to students

How should the grading scheme change?

  A. Remove the "pass the weighted exams to pass the course" requirement
  B. Remove the exams and shift the weight to the project and lab reports
  C. Add a short (10-minute) oral follow-up with each student after the final exam, asking them to explain their thought process on several randomly chosen exam questions
  D. Conduct the exams as multiple-choice and parametric calculation questions on the LMS, with each student getting a different question mix through randomized pools; hold daytime and evening sittings (for different time zones) with completely different question pools

Route A reduces pressure slightly but otherwise changes little. With B, we cannot ensure that we are assessing students individually: although the remote labs are supposed to be done individually, we cannot ensure that students are not working together. C adds pressure on students and raises difficult questions: what would you do if you were not sure the students did the work themselves, and how would you pursue it? This type of grading may serve mainly as a deterrent. D is the most feasible option.
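The randomized question pools in option D can be sketched in a few lines of Python. The pool contents, sizes, and the per-student seeding scheme below are illustrative assumptions, not part of the webinar:

```python
import random

# Hypothetical per-topic question pools; IDs like "F1" are placeholders.
# A daytime and an evening sitting would each draw from a completely
# separate set of pools like this one.
pools = {
    "forces":   ["F1", "F2", "F3", "F4"],
    "energy":   ["E1", "E2", "E3", "E4"],
    "momentum": ["M1", "M2", "M3", "M4"],
}

def build_exam(student_id: str, sitting: str = "daytime") -> dict:
    """Pick one question per topic pool, deterministically per student."""
    rng = random.Random(f"{sitting}:{student_id}")  # reproducible mix
    return {topic: rng.choice(questions) for topic, questions in pools.items()}

exam = build_exam("s0042")
print(exam)  # one randomly chosen question ID per topic
```

Seeding the generator with the student ID makes each student's mix reproducible, so the same exam can be regenerated later for grading or appeals; in practice most LMS platforms provide this pooling feature natively.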

Missed the Session?

Here are the audio recording and slides!

More E-Core and CEEA events can be found here.
