# Multiple Choice Question

Sometimes, programming assignments aren't the most efficient way of testing students' knowledge of programming theory. In those situations, Multiple Choice Questions (MCQs) may be more practical. The Multiple Choice Question block allows you to set up a fully automatically graded MCQ in your assignments.

{% hint style="success" %}
Students can fill in Quizzes only by using CodeGrade's Online Editor. For an assignment that includes quizzes, the Online Editor must therefore be the only allowed way of creating submissions. In this case, it is also convenient to enable the [Simple Submission Mode](/setup-assignment/build-assignment/general-settings.md#step-2-set-submission-settings), so that students are taken to the Editor automatically when they launch the assignment.
{% endhint %}

### Create a Multiple Choice Question Test

To create an MCQ, first add a Quiz Block, and then include a Multiple Choice Question Block within it.

In a Multiple Choice Question Block, you can set:

* The name of the Block so that the student knows what the question is about;
* A description of the question, by filling in the **Question** field. This field supports markdown so that you can format the text or include links to external resources, such as in the following example:

```markdown
Which of the following functions prints the first `n` [perfect squares](https://en.wikipedia.org/wiki/Perfect_square)?
```

Let's look in more detail at how to set up the answers:

* New answers can be added by clicking the :heavy\_plus\_sign: icon on the right of an answer block;
* The order in which the answers are displayed can be either random (different for each student) or fixed (the same order in which you added them);
* You will have to select which answer is the correct one by clicking the circle on the left of the corresponding block;
* Each answer has its own markdown description field. Here, you can add formatted text or code snippets, as in the following example:

````markdown
```python
def print_first_n_squares(n):
    for i in range(1, n + 1):
        print(i * i)
```
````
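For the incorrect answers, a plausible distractor is often a variant of the correct solution with a subtle bug. As a hypothetical sketch (not a prescribed answer), a distractor for the question above could be an off-by-one variant:

````markdown
```python
def print_first_n_squares(n):
    # Off-by-one: prints 0, 1, 4, ..., (n - 1)^2 instead of 1, 4, ..., n^2
    for i in range(n):
        print(i * i)
```
````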

* For each answer, you can provide a hint, again in Markdown, by clicking the :bulb: icon. The hint is displayed when the student chooses that answer. It could be a simple confirmation message like `Correct, well done!`, or feedback that helps the student understand why the selected answer is wrong.
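As a sketch of what such a hint might contain (hypothetical wording, tied to the example question above), a hint for an incorrect answer could explain the bug:

```markdown
Not quite: `range(1, n)` excludes `n` itself, so the last square `n * n` is never printed.
```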

<figure><img src="/files/VuNqO4R2X4HIYuWbGVkI" alt=""><figcaption><p>The configuration of a Multiple Choice Question in AutoTest.</p></figcaption></figure>

### Workflow suggestions

#### Hiding information from the student

It is generally good practice to use **Hide Blocks** with Multiple Choice Questions, as students, by default, can see:

* Whether their answer is right or wrong. You can use a **Hide Result Block** to prevent this (hints will be masked too);
* The correct answer, by inspecting their own submission after handing in. You can use a **Hide Configuration Block** to prevent this.

Remember that any hidden information can be released to the students after the deadline or lock date!

#### Limiting the number of submissions

With Quizzes, you can also [limit the number of submissions](/setup-assignment/build-assignment/general-settings.md#step-2-set-submission-settings) that a student can make. This is especially useful for MCQs nested within a **Run Only In Block**. When set to **Submission**, the **Run Only In Block** forces the student to spend a submission attempt in order to receive feedback on their MCQs.

#### Example

To prevent students from answering an MCQ by brute force, you could, for example:

* Wrap a **Run Only in Submission Block** around our Quiz Block;
* Wrap a **Hide Configuration Block** around our MCQ Block;
* Set the number of submission attempts to a small number like 2.

In this way, the student is only told whether their answer is correct after submitting, and, if needed, has just one more attempt to find the correct answer.

<div data-full-width="true"><figure><img src="/files/UnU9OHv60T1j7rf3puv6" alt=""><figcaption><p>The Select All Question configuration discussed in the example.</p></figcaption></figure></div>

{% hint style="success" %}
To make Quizzes count toward the grade, remember to wrap a Connect Rubric Block around them.
{% endhint %}

{% hint style="success" %}
Quizzes can be used in combination with Run-If blocks. You may, for example, run other tests conditionally on the result of a Quiz.
{% endhint %}

### AutoTest Snapshot and Test Student Submission

With your Multiple Choice Question Block now set up, follow these instructions to build the AutoTest configuration and publish it to the students:

1. Click the bottom right **Build Snapshot** button to build your AutoTest Snapshot. The first time you do so, you will need to create a dummy Test Submission; the file you upload doesn't matter, it only exists to let you build and publish your snapshot. Your Multiple Choice Question Tests will fail because the answer file has not been filled in yet. Nonetheless, go ahead and publish your snapshot.
2. Navigate to the Submission Overview page and click on the Test Submission.
3. Open the Test Student submission in the editor by clicking the top right **Edit** button.
4. Fill in the Multiple Choice Question with the correct answer, delete the dummy file you submitted in step 1, and finalize the submission by clicking the **Hand in** button in the bottom right corner.
5. Go back to the AutoTest configuration and click the **Build Snapshot** button again. This time you should see your MCQ Tests passing. If not, return to step 2 and correct the Test Submission.

After publishing your snapshot, the student can fill in the Multiple Choice Question and receive feedback in the Online Editor as shown below:

<div data-full-width="true"><figure><img src="/files/2o2GBODBLKFBOyX2m5Ig" alt=""><figcaption><p>An example of the feedback the student receives when filling in a Multiple Choice Question in the Editor.</p></figcaption></figure></div>
