Usage - Graphical interface

We describe here a usage example with the graphical interface, from the design of the multiple-choice test to the export of the students' scores.

Creating a new project and subject

Let's open the graphical interface. This is ordinarily done by selecting Applications → Education → Auto Multiple Choice in the general menu of Gnome (or its equivalent in KDE or another desktop), but the command auto-multiple-choice can also be used directly.

Let's now create a new project with Project → New. A window opens that shows the existing project names (if any) and lets us choose a name for our new project (made of simple characters; « test » will do for our short test), which we type in the Project name field. Then we push the New project button.

Now we must choose a LaTeX file as a source for the multiple-choice test. Several possibilities are offered:

  • model: this choice lets you pick, from the models shipped with AMC, an exam to customize later.

  • file: this choice lets you select a LaTeX file already prepared for this exam. Somebody may have prepared the exam for you, or you may have prepared it outside AMC, using your favorite LaTeX editor.

  • empty: with this choice, an empty LaTeX file will be created. You will have to edit it to compose the exam from scratch.

  • archive: use this choice if you have a zip or tgz archive containing the exam definition (the LaTeX source file, along with image files and a parameters file, for example). Such an archive can be produced by external software; it can also be a backup of one of your AMC projects.

For our test, let us choose model. The next window presents the available models: choose for example Simple example from the [EN] Documentation group. We can now modify the shape of the document or the questions with the Edit LaTeX file button, which launches the default editor.

Preparing the subject

Preparing a project is done in two steps. First we must build the reference documents from the LaTeX source file, by clicking the Update documents button (or pressing Alt+U). The following documents are produced:

  • The question. This file can be printed to distribute its pages to students (see below).

  • The solution. We can use it to check that the chosen answers are the correct ones. It can also be distributed to the students.

Once produced, these documents can be viewed (and possibly printed) with the corresponding buttons.

Now we can carry out the last step of the preparation: analyzing the layout, launched with the Layout detection button. This analysis detects, on every page of the subject, the exact position of every element that will have to be analyzed on the students' completed pages.

To check whether the layouts have been correctly detected, we can use the Check layouts button. A quick look lets us verify that the red boxes are correctly located over the boxes of the subject.

Mailing the exam to the students (with or without password)

Emailing exam sheets is possible.

  • You must prepare nominative sheets (see Nominative sheets (LaTeX) or Nominative sheets (AMC-TXT)).

  • Select printing to files: Preferences → Main → Printing → Printing method → to files.

  • Select the extracting method: Preferences → Main → Printing → Extracting method → pdftk / gs (ghostscript) / qpdf (default) / sejda-console (not installed).

    [Warning]Warning

    If the option pdfform is used, select sejda-console. To install it:

    cd
    wget https://github.com/torakiki/sejda/releases/download/v3.2.85/sejda-console-3.2.85-bin.zip
    unzip sejda-console-3.2.85-bin.zip
    sudo ln -s ~/sejda-console-3.2.85/bin/sejda-console /usr/local/bin
    
  • Print the files.

[Note]Note

AMC allows you to protect all the files with your owner password, and each file with a user password.

The students list must then include a column titled password:

# STUDENTS / 1ST YEAR
surname,name,id,email,password
Bienvenüe,Alexis,001,paamc@passoire.fr,123456
Boulix,Jojo,002,jojo.boulix@rien.xx,789test01
Noël,Père,003,pere.noel@pole-nord.xx,clavierbepo

Printing and exam

Two alternative workflows can be considered:

  • The most robust: create as many exam sheets as necessary for all your students, with different sheet numbers, and print them all. Each page can be identified by the numbers and boxes at its top, so you can safely scan the same completed answer sheet page several times.

  • Alternatively, you can print a few subjects (or only one if you like) and photocopy them to get one subject per student. Question shuffling will be less effective, and if you feed AMC the scan of the same page several times, it won't be able to detect this and will create an unwanted duplicate.

[Warning]Warning

To use this second workflow with photocopies, there must be only one page for the students to write on (using a separate answer sheet can help here). Otherwise, you won't be able to continue with AMC: it would be impossible for AMC to link two pages coming from the same student.

When the preparation is over, we can print the subject and distribute it to the students. In simple cases, we can print directly from the viewer (after clicking the Subject line in the list of working documents). When it is better to print the copies separately (for example when copies span multiple pages and the printer can staple them together), we should rather use the Print copies button, after the layout has been processed.

Test

Let the students take the exam.

[Important]Important

Once the subject is printed and distributed, we must no longer modify the working documents, because they must remain identical to the distributed copies.

It is preferable that students use a black or blue pen, or a B or HB pencil.

Depending on the situation, you can ask the students to tick or fill the boxes.

Tick the boxes

If you ask the students to tick the correct boxes, they can correct a ticked box by erasing their mark with an eraser or white-out fluid. However, they must not try to redraw the box outline: doing so, they could draw lines inside the box that could later be detected as a tick.

You can also let the students correct ticked boxes by filling them completely. If you choose this option, you have to set the upper darkness threshold (from the Preferences menu, Project tab) to some value less than 1 (but not too low). If the darkness ratio of a box is between the darkness threshold and the upper darkness threshold, the box is considered to be ticked. If the darkness ratio is greater than the upper darkness threshold, the box is considered not to be ticked.
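
As a sketch of this decision rule (hypothetical code, not AMC's implementation; the default values match the Default choices section at the end of this chapter):

```python
def box_state(darkness_ratio, threshold=0.15, upper_threshold=1.0):
    """Classify one box from its ratio of black pixels, following the
    rule described above (sketch, not AMC's code)."""
    if darkness_ratio < threshold:
        return "not ticked"              # too light: left blank
    if darkness_ratio <= upper_threshold:
        return "ticked"                  # between the two thresholds
    return "not ticked"                  # filled completely: cancelled
```

With upper_threshold lowered to 0.9, a box filled completely (darkness ratio 0.95, say) is treated as cancelled rather than ticked.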

Fill the boxes

When the letters (or numbers) referencing the answers are printed inside the boxes, you must tell the students to fill the correct boxes completely, as AMC cannot tell the difference between a box containing a letter and a ticked box.

The students can correct a filled box by erasing their mark with an eraser or white-out fluid, but they have no other way to correct it. In this case you must set the upper darkness threshold to 1.

Reading the copies

Now we describe the data capture from the students' completed pages, which can be done automatically and/or manually.

Let's move to the Data capture tab of the graphical interface.

Automated input

For automatic recognition of the boxes ticked on the students' pages, the pages must first be digitized. I use a copier/scanner which does this automatically (all the pages in a bundle, without interaction), with the following settings: 300 dpi; OCR mode (black and white without grayscale - the scanner does not actually perform any character recognition); one TIFF file per page.

[Note]Note

To analyze the scans, we must have them in one or several image files (TIFF, JPG, PNG, etc.). Vector graphics formats (PDF, PS or EPS) are also suitable: scans will then be converted into PNG by AMC before analysis.

[Note]Note

When giving scans for automated data capture for the first time, you will have to tell AMC which method you used: either different printed papers, or photocopied papers (see Printing and exam).

Then we select this set of scan files in the dialog opened by the Automated button of the Data capture after examination section, and validate with the OK button. AMC starts optical mark recognition: it detects the position of the four round corner marks on each scan, locates the boxes, and measures the amount of black pixels in each box.

The result of the analysis of each page is indicated in the lists of the section Diagnosis:

  • The value update displays the date the page was last modified. Hidden by default; click the columns button to show it.

  • The value MSD (mean square deviation) indicates the quality of the framing of the marks (the four black dots surrounding each copy). When it is too great, the framing must be checked (right-click on the page's line, then choose page to view the scanned page and the boxes as they were detected).

  • The value sensitivity indicates how close the boxes' filling is to the threshold. If it is too great (from 8 up to its maximum value 10), we must check whether the boxes recognized as ticked are the right ones (right-click on the page's line, then choose zoom to view the set of boxes in the copy, verify that detection worked correctly, and fix it if needed by drag-and-dropping the box images).

  • The value scan file displays the name of the processed scan. Hidden by default; click the columns button to show it.

Manual input

If the scanner cannot easily be used, or if, for a few papers, the automated data capture did not work as expected, we can perform the input manually. To do so, let's open the right window with the Manual button of the Data capture after examination section. In that window, we can report the ticked boxes ourselves (by clicking on them) on the relevant pages.

[Warning]Warning

Any manual input will overwrite results coming from a previous or later automated input for the same page.

Viewing empty or invalid questions

By clicking on the page numbers, AMC outlines the answer boxes with a colored square:

  • cyan for empty answers,

  • yellow for invalid answers.

You may change these colors in Preferences → Display → Scan.

It is possible to search for a specific question (see Select a specific question).

Select a specific question

This option makes it possible to mark a specific question manually on-screen. It saves you from having to look for the question on each page when questions are shuffled.

Mark an open question on-screen
  • open the manual input tab and select "scan" as background,

  • select the question to mark (drop-down menu above the list of pages).

The open question's check-boxes are at the top of the window, and when you click next you move on to the following student, staying on the same question.

[Note]Note

All questions can be checked like this.

Check on-screen pages with invalid or empty questions
  • the marking must be completed beforehand (see the Process section under Correction),

  • open the manual input tab and select "scan" as background,

  • choose whether you want to navigate through all pages, through pages with invalid answers (inv), or through pages with invalid or empty answers (i & e).

Correction

In the Marking tab of the graphical interface, the Marking section lets us compute the students' scores from the data capture, and also read the codes written by the students (see the section called “Code acquisition”).

Process

The computation of the scores is launched with the Mark button, but first we must make the following choice:

  • If we check the box Update marking scale, the scoring strategy will first be extracted from the LaTeX source file. This lets us try several strategies at the end of the correction process. This action also updates which answers are marked as correct or wrong, so potential mistakes in the answers can easily be fixed after the exam. How to specify the strategy in the LaTeX file is explained in the section Scoring strategy (a default scoring strategy is used when no indication is given).

When we click the Mark button, the correction is computed (this can take some time if we also asked for the scale to be re-read).

Scoring strategy

The strategy used to score the papers is indicated in the LaTeX source file, with the command \scoring. It can be used in a question or questionmult environment, to set the strategy for the whole question, and also inside the choices environment, to give scoring indications for a single answer. The argument of the \scoring command is made of indications of the form parameter=value, separated by commas. The usable parameters are the following:

  • e (question, simple or multiple): the score given when the responses are incoherent: several boxes checked for a simple question, or, for a multiple question, the box "none of the responses are correct" checked together with another box.

  • v (question): the score given when there is no response (no box is checked).

  • d (question): an offset, i.e. a value added to every score not covered by the parameters e and v.

  • p (question): the bottom score. If the computation of the score for the question yields a value below p, the score is set to p.

  • b (question or answer): the score for a good response.

  • m (question or answer): the score for a bad response.

  • without parameter name (answer; syntax: \scoring{2}): the score to give if the student has checked this answer.

  • auto (question): the value of the answer numbered i will be auto+i-1. This option is mainly used with \QuestionIndicative (see section Questions and answers).

  • mz (multiple question): "maximum or zero" scoring: if all the answers are correct, the score is mz; otherwise, the score is zero.

  • haut (multiple question): when you give this parameter a value n, a perfect response is scored n, and one point is withdrawn for each error. haut=n is rewritten as d=n-N, p=0.

  • MAX (multiple question): the maximal value given for the question (for a "question scored 5", one can write MAX=5). To be used only when it differs from the score obtained by ticking every good answer. If you specify MAX=3 for a question worth 4 points, a student can get a score of 4/3 on this question.

  • formula (question): the score to be given for the question, often a formula using some variables (see the section called “Global scoring strategy”), without taking the b and m values into account.

  • set.XXX (question or answer): gives a particular value to the variable named XXX, which will then be available in a formula. In an answer context, the value is assigned to the variable only if the box is ticked. As a particular case, give a non-null value to the variable INVALID to declare the responses incoherent (so that the score will be given by the variable e).

  • setglobal.XXX (question or answer): gives a value to the variable XXX for all following questions (relative to the lexicographic order of the question IDs).

  • default.XXX (question): gives a value to the variable XXX when no ticked box gave a value to XXX. If you use the parameter default, you must declare the variable XXX inside the scoring. You may then use the variable's value for b and m, as in:

    \begin{questionmult}{Q1}
    \scoring{default.CONF=1,m=-CONF,b=CONF}

  • requires.XXX (question): tells that the variable XXX has to be defined; otherwise the data is declared incoherent and the question is scored with the value of e.

The default scale for a simple question is e=0,v=0,b=1,m=0, which gives one point for a good response and no point in the other cases. The default scale for a multiple question is

e=0,v=0,b=1,m=0,p=-100,d=0

which gives one point for every correct decision on a box: a good box checked, or a wrong box left unchecked.
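
To make this scale concrete, here is a hypothetical sketch (not AMC's code) of how a multiple question is scored under parameters like these:

```python
def score_multiple(correct, ticked, b=1, m=0, d=0, p=-100):
    """Score one multiple question under a scale such as the default
    e=0,v=0,b=1,m=0,p=-100,d=0 described above (sketch, not AMC code).
    correct/ticked are per-answer booleans; each answer where the
    student's decision matches the expected one earns b, otherwise m."""
    score = d + sum(b if c == t else m for c, t in zip(correct, ticked))
    return max(score, p)  # the bottom score p acts as a floor
```

With the default scale, a student who ticks only the first of two good answers among three gets 2 points: two of the three per-answer decisions are correct.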

The LaTeX command \scoring can also be used outside question definitions, to set whole-examination parameters:

  • SUF=x gives a total number of points sufficient to get the maximal mark. For example, with 10 for the maximal mark and parameter SUF=8, a student getting a total of 6 points will get mark 6/8*10=7.5, whatever the value of the total number of points for a perfect answer sheet.

  • allowempty=x allows the student to leave x questions unanswered: when summing up question scores, up to x unanswered questions are cancelled.
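
The SUF computation above can be checked numerically. This is a hypothetical sketch, assuming the resulting mark is capped at the maximal mark:

```python
def mark_with_suf(total_score, suf, max_mark=10):
    """Mark when SUF points are sufficient to reach the maximal mark,
    as described above (sketch; the cap at max_mark is an assumption)."""
    return min(total_score / suf * max_mark, max_mark)
```

mark_with_suf(6, 8) reproduces the 6/8*10 = 7.5 example above.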

Combining all of these parameters allows many kinds of scoring strategies to be defined, as in the following example:

\documentclass{article}

\usepackage[utf8x]{inputenc}
\usepackage[T1]{fontenc}

\usepackage[box,completemulti]{automultiplechoice}

\begin{document}

\element{qqs}{
\begin{question}{good choice}
  How many points would you like for this question?
  \begin{choices}
    \correctchoice{Maximum: 10}\scoring{10}
    \wrongchoice{Only 5}\scoring{5}
    \wrongchoice{Two will be enough}\scoring{2}
    \wrongchoice{None, thanks}\scoring{0}
  \end{choices}
\end{question}
}

\element{qqs}{
\begin{questionmult}{added}
  Get free points checking the following boxes:
  \begin{choices}
    \correctchoice{2 points}\scoring{b=2}
    \wrongchoice{One negative point!}\scoring{b=0,m=-1}
    \correctchoice{3 points}\scoring{b=3}
    \correctchoice{1 point}
    \correctchoice{Half point}\scoring{b=0.5}
  \end{choices}
\end{questionmult}
}

\element{qqs}{
\begin{questionmult}{3 or zero}\scoring{mz=3}
  Only a perfect response will be scored 3 points - otherwise, null score.
  \begin{choices}
    \wrongchoice{Wrong}
    \wrongchoice{Wrong}
    \correctchoice{Right}
    \correctchoice{Right}
  \end{choices}
\end{questionmult}
}

\element{qqs}{
\begin{questionmult}{all for 2}\scoring{haut=2}
  Perfect response scored 2 points, and give back one point for any error...
  \begin{choices}
    \correctchoice{Right}
    \correctchoice{This one is OK}
    \correctchoice{Yes!}
    \wrongchoice{False!}
    \wrongchoice{Don't check!}
  \end{choices}
\end{questionmult}
}

\element{qqs}{
\begin{question}{attention}\scoring{b=2}
  One very bad answer leads here to a negative score (-2), but the correct answer is rewarded with 2 points.
  \begin{choices}
    \correctchoice{Good!}
    \wrongchoice{Not correct}
    \wrongchoice{Not correct}
    \wrongchoice{Not correct}
    \wrongchoice{Very bad answer!}\scoring{-2}
  \end{choices}
\end{question}
}

\element{qqs}{
\begin{questionmult}{as you like}
  Choose how many points you need:
  \begin{choices}
    \correctchoice{You take two points here}\scoring{b=2}
    \wrongchoice{Check to give 3 points}\scoring{b=0,m=3}
    \correctchoice{Get one if checked, but give one if not}\scoring{m=-1}
  \end{choices}
\end{questionmult}
}

\element{qqs}{
\begin{questionmult}{03}
  \scoring{default.COMP=10,default.PROP=11,formula=(COMP==PROP ? 1 : 0)}
  \AMCdontAnnotate
  Name an important gas in the air and its percentage.
    \begin{choices}
      \wrongchoice{steam}
      \wrongchoice{gas}
      \correctchoice{nitrogen}\scoring{set.COMP=1}
      \wrongchoice{oxygen}\scoring{set.COMP=2}
      \wrongchoice{carbon dioxide}
      \wrongchoice{20\%}\scoring{set.PROP=2}
      \wrongchoice{40\%}
      \wrongchoice{60\%}
      \correctchoice{80\%}\scoring{set.PROP=1}
    \end{choices}
\end{questionmult}
}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\onecopy{20}{

\noindent{\bf QCM  \hfill Scoring strategy test}

\vspace*{.5cm}
\begin{minipage}{.4\linewidth}
\centering\large\bf Test\\ Jan. 2008\end{minipage}
\namefield{\fbox{\begin{minipage}{.5\linewidth}
Name:

\vspace*{.5cm}\dotfill
\vspace*{1mm}
\end{minipage}}}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\setgroupmode{qqs}{withreplacement}

\insertgroup{qqs}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

}

\end{document}

Global scoring strategy

To use a strategy globally for a set of questions, one can define it in a LaTeX command, as in the following example:

\def\barQmult{haut=3,p=-1}

\begin{questionmult}\scoring{\barQmult}
  [...]
\end{questionmult}

Another possibility is given by the LaTeX commands \scoringDefaultS and \scoringDefaultM, to be used at the beginning of the document (outside the \onecopy command), which set default values for the scoring strategy of simple and multiple questions:

\scoringDefaultM{haut=3,p=-1}

If you use formula with \scoringDefaultM or \scoringDefaultS, you must cancel it when giving particular questions a specific scoring:

\begin{questionmult}\scoring{b=1,m=-0.5,formula=}
  [...]
\end{questionmult}

In some cases, it can be useful to define a global strategy that depends on the number of proposed answers. To do so, just use the value N. For example, to get a scale yielding 4 as the maximal score, and such that the expected score of a student ticking the boxes at random is 1, one can use the scale d=4,b=0,m=-(4-1)*2/N (which gives the score -2 if every decision is wrong, i.e. all the wrong boxes are checked and the right boxes are not). Operations allowed in these formulas are the four basic operations (+ - * /), the test operator ( ? : ), parentheses, and all Perl operators.
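
The arithmetic behind this example can be verified with a short sketch (hypothetical code, using exact fractions): a random ticker gets N/2 wrong decisions on average, and a student with every decision wrong hits the minimum.

```python
from fractions import Fraction

def scale_m(N):
    """m = -(4-1)*2/N from the scale above, as an exact fraction."""
    return Fraction(-(4 - 1) * 2, N)

def expected_random_score(N):
    """d + (expected number of wrong decisions) * m, with each of the
    N boxes ticked independently with probability 1/2."""
    return 4 + Fraction(N, 2) * scale_m(N)

def all_wrong_score(N):
    """Score when every one of the N per-answer decisions is wrong."""
    return 4 + N * scale_m(N)
```

Whatever the value of N, the expected score of a random ticker is 1 and the worst score is -2, as claimed above.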

[Note]Note

The test operator is written

( test ? value_if_true : value_if_false )

The test part can use operators like > (greater than), >= (greater than or equal), < (less than), <= (less than or equal), == (equal), != (not equal), || (or), && (and).

Other variables can also be used:

  • N is the number of proposed answers, not counting the answer possibly added by the completemulti option.

  • NB is the number of correct answers to the question (regardless of which boxes are checked).

  • NBC is the count of correct responses which have been checked.

  • NM is the number of wrong answers to the question (regardless of which boxes are checked).

  • NMC is the count of wrong responses which have been checked.

  • IS is set to 1 if the question is simple and 0 if not.

  • IMULT is set to 1 if the question is multiple and 0 if not.
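
As an illustration of these variables (a hypothetical question, not taken from the example above), a formula can grant full marks only for a perfect set of ticks:

```latex
\begin{questionmult}{allornothing}
  % 4 points only when every correct box is ticked (NBC==NB)
  % and no wrong box is ticked (NMC==0); otherwise 0.
  \scoring{formula=(NBC==NB && NMC==0 ? 4 : 0)}
  Which of these are even numbers?
  \begin{choices}
    \correctchoice{2}
    \correctchoice{4}
    \wrongchoice{3}
  \end{choices}
\end{questionmult}
```

This is equivalent to mz=4 described earlier, but written explicitly with the formula variables.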

From scoring strategy to students marks

Here is how students' marks are computed: for every student,

  1. The scoring strategy is applied for each question in turn, to get the questions scores.

  2. All question scores (except those of indicative questions) are added to get the student's total score.

  3. If a positive maximal mark is given as a parameter (in the Project tab of the Edit → Preferences window), the student's total score is divided by the maximum total score (the total score for a perfect paper), multiplied by the difference (maximal mark - minimal mark), and added to the minimal mark, which gives the student's mark. This way, a student who answered all questions perfectly gets the maximal mark, and a student with a null score gets the minimal mark. If you set the maximal mark to 100 and the minimal mark to 0, the student's mark can be read as a percentage of good answers.

  4. This mark is rounded using the following settings from Edit → Preferences → Project:

    • Grain: set it to 1 if you need an integer value, set it to 0.25 if you need to round up to a quarter, etc. Set it to 0 if you want to cancel rounding.

    • Rounding type: lower, normal, greater
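
Steps 3 and 4 can be sketched as follows (hypothetical code, not AMC's implementation; the parameter names are invented):

```python
import math

def compute_mark(total, max_total, max_mark, min_mark=0.0,
                 grain=0.25, rounding="normal"):
    """Scale the raw score between min_mark and max_mark (step 3),
    then round to a multiple of grain (step 4)."""
    mark = min_mark + total / max_total * (max_mark - min_mark)
    if grain == 0:
        return mark                      # rounding cancelled
    ticks = mark / grain
    if rounding == "lower":
        ticks = math.floor(ticks)
    elif rounding == "greater":
        ticks = math.ceil(ticks)
    else:                                # "normal"
        ticks = math.floor(ticks + 0.5)
    return ticks * grain
```

For example, a raw total of 7 out of 9 with a maximal mark of 10 gives 7.78, which a grain of 0.25 with "lower" rounding turns into 7.75.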

Correct the scoring errors

Even after the test, you may change the scoring. However, you must never update the working documents. Instead, open the LaTeX source file with a text editor, make the changes, and save it.

You may:

  • turn correct answers into wrong answers,

  • turn wrong answers into correct answers,

  • modify the scoring scale for one or several questions, or the default scoring.

You cannot:

  • turn a simple question into a multiple question,

  • turn a multiple question into a simple question,

  • add questions or answers,

  • remove questions or answers,

  • modify the order of the questions and/or answers.

[Note]Note

If you want to cancel a question, use the strategy \scoring{b=0,m=0,e=0,v=0}, or use \QuestionIndicative.

Identification of the students

This stage is not mandatory. It consists of associating each paper with a student. The students' names are not read automatically, but two reasonable possibilities are offered:

  1. It is possible to ask students to identify themselves on their paper with their student number, written by ticking one box per digit. A LaTeX command is provided to lay this out on the paper (see the section called “Code acquisition”). After the exam, the papers will be identified automatically using a list matching the students' numbers with their names.

  2. If the students' numbers were not captured, or if the automated identification was not fully successful (for example when a student ticked a wrong digit), the graphical interface allows an assisted manual association.

Let's first move to the Marking tab of the graphical interface.

List of the students

We must first supply a list of the students. This list can of course be reused for several multiple-choice tests. It is a CSV file, with optional comment lines at the beginning prefixed with `#', as in the following example:

# STUDENTS / 1ST YEAR
surname:name:id:email
Bienvenüe:Alexis:001:paamc@passoire.fr
Boulix:Jojo:002:jojo.boulix@rien.xx
Noël:Père:003:pere.noel@pole-nord.xx

The first lines of the file beginning with the character `#' are comments. The first of the remaining lines contains the column titles, separated by the character `:'. Then, one line per student, we write the corresponding information. There must be at least one column named name or surname.

[Note]Note

The separator `:' can be replaced by a comma, a semicolon or a tab. However, the same separator must be used throughout the students list file. The separator is detected as the character (out of the four possible ones) which appears most frequently in the first non-comment line.
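
This detection rule can be sketched in a few lines (hypothetical code, not AMC's implementation):

```python
def detect_separator(lines):
    """Return the separator of a students list: among ':', ';', ','
    and tab, the one appearing most often in the first non-comment
    line (the column titles line), as described above."""
    for line in lines:
        if not line.startswith("#"):
            return max((":", ";", ",", "\t"), key=line.count)
    raise ValueError("no data line found")
```

On the example above, the titles line "surname:name:id:email" contains three `:' and no other candidate, so `:' is chosen.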

Any CSV file should be suitable.

[Warning]Warning

Format the CSV file carefully if you want to send the same test to multiple recipients. Use either:

  • a semicolon, colon or tab to separate the columns, and a comma to separate the email addresses; or

  • a comma to separate the columns, with the email addresses enclosed in quotation marks:

name,forename,email
Boulix,Jojo,"jojo@boulix.fr,parents@boulix.com"

The prepared students list is then selected with the Set file button in the Students identification section. We must also choose one of the columns as a unique key identifying the students (generally the column containing the student number). Last, to prepare an automated association, we must choose the name of the relevant code used in the LaTeX command \AMCcode (if any).

Association

Automated association
Without barcode
[Warning]Warning

To make an automated association, at least one \AMCcode command is required in the LaTeX source file (see the section called “Code acquisition”), as well as a students list with a column containing a reference (generally a student number) identical to the input given in the boxes produced by the \AMCcodeGrid command.

To perform an automated association, fill in the two drop-down menus on the Marking tab:

  • Primary key from this list: the name of the column, in the list of students, with the students' numbers (see the section called “List of the students”).

  • Code name for automatic association: the identifier chosen in the LaTeX command \AMCcodeGrid (see the section called “Code acquisition”).

  • Check the name field type: Preferences → Project → Name field type → Image.

  • When we push the Automatic button in the Students identification section, the matching of the codes entered by the students begins. We can review or improve the result later with a (partial) manual association.

With a barcode

The barcode must be stuck in the space created by \champnom.

[Note]Note

AMC does not create barcodes; it is your responsibility to generate them with external software.

To perform an automated association, fill in the two drop-down menus on the Marking tab:

  • Primary key from this list: the name of the column, in the list of students, with the students' numbers (see the section called “List of the students”).

  • Code name for automatic association: _namefield.

  • Check the name field type: Preferences → Project → Name field type → Barcode or Barcode tail.

  • Decode the name fields: menu Project → Decode name fields.

  • When we push the Automatic button in the Students identification section, the matching of the codes entered by the students begins. We can review or improve the result later with a (partial) manual association.

Manual association

To open the window for identifying the students' names, click the Manual button in the Students identification section. This window has an upper part presenting, one after another, the images of the names written by the students; a lower part containing a button for each student of the supplied list; and a right part for browsing easily through the papers still to be identified. For each presented page, click the button matching the written name (by default, only papers that are unidentified or badly identified are presented - this can be changed with the associated check box). When every page has been processed, a blue background appears instead of the name images, and we just have to click the Save button to finish the association.

Exporting the scores list

At this stage, we can export the list of scores in various formats (currently CSV and OpenOffice), with the Export button. The export is followed by the opening of the produced file with the appropriate software (if available).

Export to ODS (OpenOffice, LibreOffice)

In the exported file, the following colors are used:

gray

is used for non-applicable cells: for example scores of absentees, or scores for a question that was not shown to this student.

yellow

is used for questions that have not been answered by the student.

red

is used for questions with an invalid answer: the student ticked several boxes in a simple question, or ticked the box None of these answers are correct together with another box.

purple

is used for indicative questions.

green

is used for the total score of questions in the same group (score or percentage). See Identifier and Groups of questions.

Annotation

When we push the Annotate papers button, annotation of the papers begins: on every scan, the following annotations are made (these are the defaults, and can be configured):

  • The boxes wrongly checked by the student will be circled in red;

  • the unchecked boxes which should have been checked are marked in red;

  • the checked boxes which had to be checked are marked in blue;

  • for each question, obtained and maximal scores are indicated;

  • the global score of the copy is indicated on the first page of the copy.

The text written on the first page of each paper can be configured (Edit → Preferences → Annotation → Header, or Edit → Preferences → Project → Papers annotation → Header text). Substitutions are made within the provided text (see the section called “From scoring strategy to students marks” for details on the meaning of these values):

%S

is replaced by the student's total score.

%M

is replaced by the maximum total score.

%s

is replaced by the student's mark.

%m

is replaced by the maximum mark.

%(ID)

is replaced by the student's name.

%(COL)

is replaced by the value of column COL in the students list for the current student.

This operation is carried out for each page, producing annotated PDF papers. The name of the PDF file containing a student's corrected paper is based on the template given in the File name model field. In that template, every substring like « (col) » is replaced by the content of the column named col in the students list file (see section List of the students). If we leave this field empty, a default value is built from the student's name and student number.
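
The « (col) » substitution can be sketched as follows (hypothetical code; the template and column names are invented, and this is not AMC's implementation):

```python
import re

def annotated_filename(template, student):
    """Replace every '(col)' in the file name model by the value of
    column col from the students list, as described above.
    Unknown columns are left as-is."""
    return re.sub(r"\(([^)]+)\)",
                  lambda m: str(student.get(m.group(1), m.group(0))),
                  template)
```

For instance, the model "(surname)-(id).pdf" applied to the student Boulix with id 002 would give "Boulix-002.pdf".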

Options with the separateanswersheet option

  • Only pages with answers: only the answer sheets are annotated.

  • Question pages from subject: the answer sheets are annotated, and the subject is included in the PDF file.

  • Question pages from correction: the answer sheets are annotated, and the corrected version is included in the PDF file.

Marks position

You may choose the marks' position with the menu Edit → Preferences → Project → Marks position.

Default choices

The default values can be edited with the menu Edit → Preferences → Scan.

Scans conversion

  • Vector formats density (DPI): 250

  • Black & white conversion threshold: 0.60

  • Erase red from scans: unticked

  • Force conversion: unticked

Detection parameters

  • Marks size max increase: 0.20

  • Marks size max decrease: 0.20

  • Default darkness threshold: 0.15

  • Default upper darkness threshold: 1

  • Measured box proportion: 0.80

  • Process scans with 3 corner marks: unticked