📘 Jupyter Notebook

Overview

An AutoTest V2 configuration for a Jupyter Notebook assignment has two stages:

  • The Setup stage: this is where you set up the autograding environment. You can install packages and upload files. This stage builds into an image on which all tests are executed. The configuration here runs only once for all submissions;

  • The Tests stage: this is what students will see, and where you configure all the tests that you want to run on every student’s code. Everything that you specify here is executed in the student’s working directory.

Steps are created in a “Scratch-like” block layout and can be dragged and dropped, reordered and nested to create an AutoTest program. We are continuously adding new blocks to AutoTest V2 to make it more powerful and easier to use.

AutoTest V2 is still available alongside our original AutoTest. To start setting up AutoTest V2, select it when creating a new AutoTest. Already have an original AutoTest? You can switch to AutoTest V2 by deleting it, pressing "Select another version of AutoTest", and then selecting AutoTest V2. Please note that your original AutoTest is deleted when you press the "Delete" button!

Developing, snapshots and publishing to students

When developing your AutoTest V2 configuration, you can continuously test your configuration on the "Test Submission".

After configuring something, press the “Build Snapshot” button in the bottom right corner of the screen. This builds your AutoTest into a Snapshot (a history of your snapshots is available via the Snapshots tab).

After pressing "Build Snapshot", you can:

  • Test the configuration and see the results within seconds.

  • If you are ready to publish your AutoTest to your students press the big green Publish button.

  • If you make any changes, build again and, once you are satisfied, republish to your students.

  • If you want to unpublish your snapshot, you can simply go to it in the green bar and press the red “Unpublish” button.

Step 1: Setup

CodeGrade AutoTest V2 runs on Ubuntu (20.04 LTS) machines, which you can configure in any way that you want. Common software is pre-installed, most importantly python3 with pip3. Jupyter Notebook has to be installed separately.

In the setup section of your AutoTest, you can install software or packages and upload any files you might need for testing. The setup section builds an image and runs only once for all submissions.

You can upload files using the "Upload Files" block. If you intend to use Python unit testing with PyTest, for instance, this is where you upload your unit test file, or your NBGrader scripts. These files are placed in the $UPLOADED_FILES directory on the virtual server.

You can install software and packages (or configure the build image in any other way) using the "Script" block. To use Jupyter Notebooks in CodeGrade, run python3 -m pip install notebook to install Jupyter Notebook. You may use a command like python3 -m pip install pandas to install pandas, or run python3 -m pip install semgrep if you plan to use code structure tests with Semgrep.

Step 2: Tests

Now that you have created the setup, it's time to create the actual tests. This is done on the Tests tab. Select one of the many test blocks to configure an AutoTest V2 procedure in a "Scratch"-like format. Some tests can be nested; you can choose to connect tests to rubric categories with a certain weight, hide aspects of tests, and enable internet access for specific tests.

Set up an IO Test

You can use IO tests to grade Jupyter Notebooks. You can assess the entire output, test individual functions / variables or use a grading script to assess individual code cells!

Want to easily grade individual code cells? Contact our support team via support@codegrade.com for our easy to use Jupyter Notebook Grading Script.

Step 1: Converting Jupyter Notebooks to Python

  1. Drag a "Script" block to your Test Procedure.

  2. Add the following line to the program to run:

    jupyter nbconvert --to python STUDENTFILE.ipynb
  3. This converts the Jupyter Notebook to a Python script, which we can then use in the IO test. Replace STUDENTFILE.ipynb with the name of the file your students submit.
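After conversion, the resulting .py file contains the notebook's code cells in order, each preceded by a cell marker comment, roughly like this (a sketch; the exact markers depend on your nbconvert version):

```python
#!/usr/bin/env python
# coding: utf-8

# In[1]:
x = 3 + 4

# In[2]:
print(x)
```

Because the converted file is an ordinary Python script, it can be imported and inspected like any other module in the next step.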

Step 2: Testing converted scripts with IO tests

In Python, you can also execute the student's program interactively and specify Python code as input.

  1. Drag the "IO Test" block to your Test Procedure. In the code section, write the command to run Python interactively and import your converted script; this could be something like: python3 -ic "import calculator".

    • This will import everything in their program and allow you to access variables and call any functions they create. All Input fields then accept valid Python code, as the input is fed to the Python interpreter.

  2. Drag one or more "Substring Match" block(s) inside your "IO Test" block. Specify an Input (Python code) and an Expected Output.

  3. Give the blocks you have created clear names, so that students understand what is being tested.

  4. Optionally: drag in a "Connect Rubric" block and drag your "IO Test" block into it, to connect it to a rubric category and use it to grade your students.

We can now interact with the notebook. For example we could:

  • Print the result of a function, e.g. print(<your_script>.<your_function>(1, 5)).

  • Or we could test a stored variable using <your_script>.<your_variable>

As the input is regular Python code, you are essentially writing to the Python interpreter. You can call functions, do arithmetic operations and print variables, as in the examples above. You can also test numpy arrays and pandas dataframes with IO tests.
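To make this concrete, here is a self-contained sketch of this kind of interaction. The module name calculator and its contents are hypothetical stand-ins; in a real test they come from the student's converted script, imported via python3 -ic "import calculator":

```python
# Stand-in for a student's converted script (hypothetical names).
import types

calculator = types.SimpleNamespace(
    add=lambda x, y: x + y,  # hypothetical student function
    total=6,                 # hypothetical stored variable
)

# These lines are what you would write in the Input fields:
print(calculator.add(1, 5))  # expected output: 6
print(calculator.total)      # expected output: 6
```

Each Input line is matched against the Expected Output of its "Substring Match" block.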

Importing Python code without printing

One thing to be aware of is that we check the stdout of the scripts, which run completely when imported. As a result, students can clutter the output with additional print statements outside of functions.

You can import the script with stdout redirected to prevent superfluous output. Save the following snippet as import_without_print.py, upload it in the "Setup" tab using an "Upload Files" block, and run it via python3 -i $UPLOADED_FILES/import_without_print.py:

from contextlib import redirect_stdout
from os import devnull

with redirect_stdout(open(devnull, 'w')):
    import jupyter  # <-- replace with the name of your script

Set up a PyTest unit test

You can use PyTest to write unit tests for Jupyter Notebooks like you would for regular Python code. Upload a PyTest unit test file in the Setup tab; it can then easily be used to check the students' code. Until the "Unit Test" block is available in AutoTest V2, we provide a simple wrapper script to run PyTest unit tests.

Imagine students have to code a very simple calculator in a file called calculator.py:

def add(x, y):
    ans = x + y
    return ans

def subtract(x, y):
    ans = x - y
    return ans

def multiply(x, y):
    ans = x * y
    return ans

def divide(x, y):
    if y == 0:
        raise ValueError("Cannot divide by zero!")
    ans = x / y
    return ans

A powerful way to test the individual functions is to write unit tests. In this example, we can test the student's code with the following unit tests in a file called test_calculator.py:

import pytest
import calculator

def test_add():
    assert calculator.add(3, 2) == 5
    assert calculator.add(1, -1) == 0
    assert calculator.add(-1, -1) == -2

def test_subtract():
    assert calculator.subtract(5, 2) == 3
    assert calculator.subtract(1, -1) == 2
    assert calculator.subtract(-1, -1) == 0

def test_multiply():
    assert calculator.multiply(3, 2) == 6
    assert calculator.multiply(1, -1) == -1
    assert calculator.multiply(-1, -1) == 1

def test_divide():
    assert calculator.divide(10, 2) == 5
    assert calculator.divide(1, -1) == -1
    assert calculator.divide(-1, -1) == 1
    assert calculator.divide(5, 2) == 2.5
    with pytest.raises(ValueError):
        calculator.divide(10, 0)

Creating the unit test in CodeGrade

  1. Upload the PyTest unit test file and the run_pytest.py script (download above) to AutoTest V2 under the Setup tab. Also install PyTest using pip3 install pytest in a "Script" block.

  2. Continue under the Tests tab: create and drag a "Custom Test" block; using the run_pytest.py script, we can run our unit test here.

  3. In the code field, use the uploaded wrapper script as follows: python3 $UPLOADED_FILES/run_pytest.py $UPLOADED_FILES/test_calculator.py. Of course, always replace test_calculator.py with the name of your uploaded unit test script.

  4. Optionally: Use a "Connect Rubric" block to connect your unit tests to a rubric category.

Set up a Flake8 Code Quality test

Automatically run static code analysis on the student's code, using Flake8. Until the "Code Quality" block is available in AutoTest V2, we have created a simple wrapper script to use Flake8 easily in AutoTest V2.

This will then run automated quality checking on the student's code and assign a score. Code Quality comments are not yet generated on the lines of code, but will appear in the output of this test.

Running Flake8 in CodeGrade

  1. Upload the Flake8 script (run_flake8.py, download above) to AutoTest V2 under the Setup tab. Also, in a "Script" block, install Flake8 via pip3 install flake8.

  2. Continue under the Tests tab: create and drag a "Custom Test" block; using the run_flake8.py script, we can run Flake8 here.

  3. In the code field, use the wrapper script as follows: python3 $UPLOADED_FILES/run_flake8.py <STUDENTFILE> <MAX ERRORS>. Max errors is the number of errors at which the score becomes 0.0; other scores are calculated linearly based on this.

  4. Optionally: Use a "Connect Rubric" block to connect your code quality check to a rubric category.
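The linear scoring rule described in step 3 can be sketched as follows. This is our reading of the rule, not the wrapper's actual source, and flake8_score is a hypothetical name:

```python
def flake8_score(errors: int, max_errors: int) -> float:
    """Linear score: 1.0 at zero errors, 0.0 at `max_errors` errors or more."""
    if max_errors <= 0:
        raise ValueError("max_errors must be positive")
    return max(0.0, 1.0 - errors / max_errors)

print(flake8_score(0, 10))   # 1.0 -- no errors, full score
print(flake8_score(5, 10))   # 0.5 -- halfway to the cutoff
print(flake8_score(12, 10))  # 0.0 -- at or past the cutoff
```

Choosing a larger MAX ERRORS therefore makes the test more forgiving, since each individual error costs a smaller fraction of the score.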

Set up a Code Structure test

CodeGrade integrates a tool called Semgrep to make it easy to perform more complex code analysis, by allowing you to write rules in a human-readable format. You can provide generic or language-specific patterns, which Semgrep then matches against the code. With its pattern syntax, you can find:

  • Equivalences: Matching code that means the same thing even though it looks different.

  • Wildcards / ellipsis (...): Matching any statement, expression or variable.

  • Metavariables ($X): Matching expressions whose exact form you do not know in advance, but that should refer to the same variable in multiple parts of your pattern.

Writing the tests

Semgrep can be installed in the Setup step using pip: pip3 install semgrep. The easiest way to use Semgrep in AutoTest V2 is via our structure.py script (this will be built in in a later version). With this script, you write your Semgrep patterns directly in the Input field of an IO Test. You can try out Semgrep patterns in their online editor.

For instance, this is a pattern to detect a for-loop in the student code:

pattern: |
  for $EL in $LST:
    ...
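For example, this pattern matches any Python for-loop over a sequence, regardless of the variable names, such as either loop in this snippet:

```python
# Both loops match the `for $EL in $LST: ...` pattern above:
# $EL binds to the loop variable, $LST to the iterable, and the
# ellipsis matches whatever statements make up the loop body.
total = 0
for number in [1, 2, 3]:
    total += number

for char in "ab":
    pass

print(total)  # 6
```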

Creating the test in CodeGrade

  1. Under the Setup tab, upload the structure.py file you can download above. Also, in a "Script" block, install Semgrep via pip3 install semgrep.

  2. Continue in the Tests tab: first drag the "IO Test" block to your Test Procedure. In the code field, execute the script as follows: python3 $UPLOADED_FILES/structure.py <STUDENTFILE>. For a Fibonacci assignment, this could be: python3 $UPLOADED_FILES/structure.py fibonacci.py.

  3. Create and drag one or more "Simple Test" block(s) inside your "IO Test" block. Each block checks one Semgrep structure. As Input, write the Semgrep pattern (including pattern: or similar, like pattern-either:). For example, to check whether the student uses numpy, use: pattern: import numpy.

  4. As Expected Output, write "Found!" if the student code should match the pattern, or "Not found!" if it should not contain the pattern (e.g. when students are not allowed to use numpy).

  5. Optionally: Use a "Connect Rubric" block to connect your structure check to a rubric category.

  6. Optionally: Use a "Hide" block to hide the "config" (i.e. input / pattern) of your tests so that students cannot see the pattern you are checking for.

Step 3: Start the AutoTest

Once you have created a configuration, create a Snapshot to test it on a "Test Submission". In the General Settings, you should have already uploaded a Test Submission, so you will see its results straight away.

Once everything works as expected, press the green "Publish to students" button to publish your AutoTest V2 to students. If they use the editor, they can run it instantly; if they hand in any other way, they will get their AutoTest results immediately after each submission.
