Forum .LRN Q&A: Re: Request for advice from the OCT

Posted by Malte Sussdorff on
Hi Ernie, as I mentioned earlier:

<blockquote> Now, internally the assessment system is flexible enough to enable sequencing on its own. This sequencing, though, has nothing to do with IMS Simple Sequencing. The assessment's sequencing is used for branching, and for displaying multiple questions on a page and then moving on to the next one.
</blockquote>

I see a clear distinction between what an assessment does *internally* and how it is called in a learning context. If you talk about IMS sequences within an assessment, they have to be controlled by the assessment system using the functions provided by the assessment.

But this is not what an assessment is all about in a learning context. In a learning context an assessment is only *part* of the learning experience, and this learning experience includes other objects as well (e.g. LORSm content, grades given in oral exams, ...). The SS package will deal with the conditions and rules that govern how the sequence between these learning objects is created.

Let's try to get the distinction utterly clear, as I think this is the reason for the confusion.

  • Question four follows question two if the answer to question one was "bar"; otherwise display question three. Strictly internal to the assessment package.
  • Display the assessment "higher mathematics" if the paper on mathematics has been read. SS functionality.
  • Display questions a, b, f, g if the paper on mathematics has been read; display questions b, d, g, j otherwise. SS functionality. Footnote: a, b, f, g is one assessment, b, d, g, j is another assessment.
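The first bullet can be sketched as code. This is purely illustrative (the function and question names are my own invention, not part of the assessment package's API): a branching rule that the assessment package evaluates entirely internally, with no SS involvement.

```python
# Hypothetical sketch of an assessment-internal branching rule.
# Names (next_question, "q1".."q4") are illustrative only.

def next_question(answers):
    """Return the id of the next question, given the answers so far."""
    if "q1" not in answers:
        return "q1"
    if "q2" not in answers:
        # question two always follows question one
        return "q2"
    # branch on the answer to question one
    if answers["q1"] == "bar":
        return "q4"
    return "q3"

print(next_question({"q1": "bar", "q2": "x"}))  # question four
print(next_question({"q1": "foo", "q2": "x"}))  # question three
```

The point of the sketch is that the branching condition only looks at answers *inside* the assessment, which is why no external (SS) rule engine is needed here.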
You asked where we store the grades. The assessment system *internally* stores percentages. The results of an assessment will be *pushed* to the Evaluation package. If the SS system wants to create a rule based on grades, it has to query the Evaluation package, *not* the assessment system (though it might query the assessment system if it so pleases, I just don't think that would make sense).
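The grade flow described above can be sketched roughly as follows. These are stand-in classes with invented names, not the real OpenACS APIs; the sketch only shows the direction of the data flow: the assessment *pushes* its percentage to evaluation, and an SS rule *queries* evaluation, never the assessment.

```python
# Illustrative data flow: assessment -> evaluation <- SS rule.
# All names here are hypothetical stand-ins, not real package APIs.

class Evaluation:
    """Stand-in for the evaluation package's grade store."""
    def __init__(self):
        self._grades = {}

    def record(self, user, assessment, percentage):
        self._grades[(user, assessment)] = percentage

    def grade(self, user, assessment):
        return self._grades.get((user, assessment))

class Assessment:
    """The assessment computes a percentage and *pushes* it out."""
    def __init__(self, name, evaluation):
        self.name = name
        self.evaluation = evaluation

    def finish(self, user, percentage):
        self.evaluation.record(user, self.name, percentage)

def ss_rule_passed(evaluation, user, assessment, threshold=50):
    """An SS rule based on grades only ever talks to evaluation."""
    grade = evaluation.grade(user, assessment)
    return grade is not None and grade >= threshold

ev = Evaluation()
Assessment("higher-math", ev).finish("alice", 80)
print(ss_rule_passed(ev, "alice", "higher-math"))  # True
```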

<blockquote> Now, when a student/learner goes about taking that particular assessment, the Assessment package asks the SS to deliver the correct sequence for this user. Once the SS engine returns the appropriate sequence, the Assessment package renders it accordingly to the user.
</blockquote>
No. This is not the case. The assessment knows on its own which sequence to use, as an assessment *internally* does not differentiate between items and sections depending on external conditions. If you want to modify an assessment based on external conditions, you should create two assessments; otherwise the results of *one* assessment are no longer comparable within that assessment.

This might be where I run head-on against the "standards" wall, but unless I see a real use case where an assessment's display is governed by external conditions, I am not keen on designing it that way from the beginning (you can always exchange the *internal* sequencing engine at a later stage, if utterly necessary).

If a student leaves an assessment in the middle, the assessment system knows where to continue. There is no need for the SS system to supply the next questions; this is something the assessment does all by itself, *internally*.
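The resume behaviour can be sketched in a few lines. Again, this is a minimal illustration with invented names, assuming only that the assessment persists a per-user position: continuing a half-finished assessment needs nothing from the SS engine.

```python
# Minimal sketch of assessment-internal resume: the assessment keeps
# its own per-user position, so no SS call is needed to continue.
# Class and method names are hypothetical.

class AssessmentSession:
    def __init__(self, questions):
        self.questions = questions
        self.position = {}  # user -> index of the next question

    def next_question(self, user):
        i = self.position.get(user, 0)
        return self.questions[i] if i < len(self.questions) else None

    def answer(self, user, response):
        # persist the answer (storage omitted) and advance the position
        self.position[user] = self.position.get(user, 0) + 1

s = AssessmentSession(["q1", "q2", "q3"])
s.answer("bob", "foo")         # bob answers q1, then leaves
print(s.next_question("bob"))  # q2 on return, purely from internal state
```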

My whole point is that there is a clear distinction between how sequencing is done *internally* in a package and *externally*. You are not going to make the SS package responsible for the sequence of paragraphs in a document. Neither do you have to make it responsible for knowing the sequence *within* an assessment. But it is *very much* responsible for providing the sequence between the document and the assessment.

Can you see this distinction, and does it make sense to you?

P.S.: I do agree that it would be nice to use the API and storage capabilities of an SS package for handling sequences internally in an assessment. But until we have such a generic API and storage capabilities, we are stuck with the engine currently described in the design specification. If someone (Ola, Ernie 😊) wants to take a look at it and modify it in a way that we could split this out and make it into an SS API, that's fine with me. Please look at https://openacs.org/projects/openacs/packages/assessment/design/sequencing.

Posted by Ernie Ghiglione on
Hi Malte,

Thanks for taking the time to explain this a bit more clearly. It has been really helpful. We should have more of these discussions, as they really help to get everyone on the same page.

<blockquote> No. This is not the case. The assessment knows on it's own
the sequence which to use for the assessment as an assessment
*internally* does not differentiate between items and sections
depending on external conditions.
</blockquote>

But then, is it possible to say that QTI sequencing has nothing to do with IMS SS? For instance, can a sequence of activities not reach (for lack of a better word) a single, individual question within an assessment? If not, then we might need to figure out what we can do, as Simple Sequencing does not place any restrictions on what can be sequenced in such a tree. (http://www.imsglobal.org/simplesequencing/ssv1p0/imsss_bestv1p0.html#1500831)

Moreover, can a QTI assessment be sequenced using IMS SS?

<blockquote> My whole point is that there is a clear distinction between how
sequencing is done *internally* in a package and *externaly*. You are
  not going to make the SS package responsible for the sequence of
paragraphs in a document.
</blockquote>

That is true. However, that deals with the granularity of the activities defined in the sequence. It won't be able to sequence paragraphs, as they are part of a learning object, which I believe is the smallest unit that can be part of an activity, right?

However, from reading the specs I was under the assumption that it was possible to sequence individual questions, as they are the smallest units (the atoms) of QTI that can be sequenced. But I'm not so sure any more 😊

<blockquote> Can you see this distinction and does it make sense to you ?
</blockquote>

Yes, I believe I do. Summarizing:

IMS SS = sequencing of activities (learning objects, entire assessments, etc.)
IMS QTI (assessment) = internal sequence of questions, as given by the assessment creator before the assessment was uploaded into the system

Right?

Ernie