Forum OpenACS Q&A: Surveys and Branching

Posted by Kolja Lehmann on
The data model for branching is already done, but has anyone started implementing anything for branching yet?
If not, we will more or less port the thing Malte and I did for ACES. Well, a bit cleaner, maybe.
2: Re: Surveys and Branching (response to 1)
Posted by Dave Bauer on
Please ask around; I think there are other use cases besides "branching" in the IMS sequencing standards that might need to be addressed. You might not have to code them, but leaving the flexibility to implement them later could be important.
3: Re: Surveys and Branching (response to 1)
Posted by Kolja Lehmann on
What are the IMS sequencing standards? Are there any documents I should be reading?
4: Re: Surveys and Branching (response to 3)
Posted by Ernie Ghiglione on
Kolja,

IMS Question & Test Interoperability Specification

Malte has put the testing module package we did online now... you will find that it is based on IMS QTI.

IMS Simple Sequencing

Enjoy,

Ernie

5: Re: Surveys and Branching (response to 1)
Posted by Malte Sussdorff on
Just wondering: is the new workflow module written by Collaboraid a way to implement the branching and sectioning? Lars, do you have any insight here? Especially interesting would be whether workflow supports branching based on a condition of an object (as we currently plan to define sections as ACS objects).
6: Re: Surveys and Branching (response to 1)
Posted by Dave Bauer on
Malte,

Are you working with the current survey package for OpenACS 4.6? Survey sections are objects in there; they just are not exposed, and the hard work of writing the rules is not done.

Do you have a list of requirements that your branching work will fulfill? I know that for IMS support there is a concept of sequencing, which may or may not mean the same thing as branching :) Specifically, sequencing deals with repeating a question, either a fixed number of times or based on a response. Another aspect mentioned is the sequencing of questions from a larger pool of questions. For example, a test may contain 50 questions randomly chosen from a list of 100.
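
For the random-pool case, here is a minimal Tcl sketch of what I have in mind (the proc and table names are made up purely for illustration):

    ad_proc -private survey::random_questions {
        -pool_id:required
        {-n 50}
    } {
        Return n question_ids chosen at random from a question pool.
    } {
        # Fetch all candidate question ids for the pool.
        set pool [db_list pool_question_ids {
            select question_id from survey_questions
            where pool_id = :pool_id
        }]
        # Fisher-Yates shuffle in plain Tcl.
        for {set i [expr {[llength $pool] - 1}]} {$i > 0} {incr i -1} {
            set j [expr {int(rand() * ($i + 1))}]
            set tmp [lindex $pool $i]
            lset pool $i [lindex $pool $j]
            lset pool $j $tmp
        }
        # Keep the first n of the shuffled list.
        return [lrange $pool 0 [expr {$n - 1}]]
    }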

I thought workflow might be an interesting way to try to sequence objects. Besides that, another useful feature I noticed was the templating wizard, which shows progress along a list of tasks. It requires your code to keep track of the state, so an ideal solution would probably be a mix of workflow and the wizard.

Anyway, since I can't actually work on this, I hope this information helps you.

7: Re: Surveys and Branching (response to 1)
Posted by Malte Sussdorff on
We have opened up the current development state of the survey module for public UI scrutiny. If you are interested, here is a quick rundown of the feature changes.

- A survey can be structured into sections. You can enable this by clicking on the sectioning link on the admin page of a survey. Sections are ordered and will be displayed one after another.
- Public surveys: a survey can be made public so that even unregistered users of the system can fill it out.
- Copy survey: lets you use an existing survey as a template for the creation of a new one.
- Branching: a section can be made a branch (in the section properties). You can jump to branches depending on the answer to a (previous) question (see the sketch below).
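
Conceptually, branch evaluation works like this rough Tcl sketch (the proc and table names are illustrative, not the actual package code):

    ad_proc -private survey::next_section {
        -survey_id:required
        -current_section_id:required
        -response_id:required
    } {
        Return the section to display next: a branch target if the
        given answer triggers one, otherwise the next section in order.
    } {
        # Does a branch rule on this section match the given answer?
        set target [db_string branch_target {
            select target_section_id
            from   survey_branches
            where  section_id = :current_section_id
            and    trigger_response_id = :response_id
        } -default ""]
        if {$target ne ""} {
            return $target
        }
        # No branch matched: fall through to the next section in order.
        set ordered [db_list ordered_sections {
            select section_id from survey_sections
            where survey_id = :survey_id
            order by sort_order
        }]
        set idx [lsearch -exact $ordered $current_section_id]
        return [lindex $ordered [expr {$idx + 1}]]
    }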

Things missing:

- Unfinished survey answers: if you have submitted, e.g., 2 of 5 sections, you should be able to take a break and finish the rest when you come back.
- Postgres support (this is why we haven't uploaded the code yet).

Nice ideas collected so far:
- Dave had the idea of repeating questions: e.g. if you answer the question "How many kids do you have?" with "3", you should be asked "What is the name?" three times (a sketch follows this list).
- Not sure if loops are supported by the branching system.
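
A rough sketch of how the repeating idea could look (purely hypothetical, none of this is implemented):

    ad_proc -private survey::expand_repeats {
        -label:required
        -count:required
    } {
        Expand one follow-up question into count numbered copies,
        e.g. "What is the name?" asked once per kid.
    } {
        set labels {}
        for {set i 1} {$i <= $count} {incr i} {
            lappend labels "$label ($i of $count)"
        }
        return $labels
    }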

It would be nice if interested people could take a look and comment. Login information:

http://aiesec-dev.sussdorff-roy.com:8008/survey/admin
login: openacs@sussdorff-roy.com
pw: oacs

8: Re: Surveys and Branching (response to 1)
Posted by Stan Kaufman on
Malte, very cool! The branching in "Malte's test" is great! Congrats on your progress!

I'm willing to help with the PG port if you want to parcel out the effort.

I've been writing up some specs for a "son of survey" package I'm currently calling "Assessment", following the terminology used by the IMS Global Learning Consortium. Our particular need is for tools for clinical trials, patient registries, etc., where a variety of data validation/verification steps, audit trails, etc. are required. However, the needs there are very similar to those in educational apps.

I've posted these documents (in progress) in the "Bay Area OpenACS Users Group" area I set up at www.epimetrics.com -- specifically here. Quite a few people in the community are already in this group, but anyone else can let themselves in (i.e. it has an "open" policy) once they've registered.

I'd certainly welcome any comments (and help as time goes on) from anyone in the community. I'm hoping that once this Assessment package gets done it will go into the OpenACS distro if people find it meritorious. Several of us are working in the same direction; it would be good to collaborate directly, IMHO.

9: Re: Surveys and Branching (response to 8)
Posted by Carl Robert Blesius on
Stan,

I would welcome direct collaboration on this. After looking at the overview you put together it is clear you have already done a lot of the preliminary work.

Here is a suggested game plan:

1. Get the relevant parts of your document up on OpenACS in the projects section and generalize it.
2. Integrate other proposals (the Sloan RFP, the proposal by Matthew Geddert, and anything else I am missing) and find interested parties (I know of at least two in addition to Heidelberg).
3. Make sure we have taken any additional developments and requirements into consideration.
4. Use halftime to plan the next steps as a group (potential users+developers).

So... as part of step one we add this page:
https://openacs.org/projects/openacs/packages/assessment/
as a mutual workspace (which would replace
https://openacs.org/xowiki/survey/)

Sound good?

10: Re: Surveys and Branching (response to 1)
Posted by Stan Kaufman on
Carl, I'd welcome bringing the discussion into the OpenACS site at whatever point it seems appropriate. I gather you have edit permissions at openacs.org that allow you to accomplish this? I think it would be particularly helpful to pool our knowledge about existing standards, requirements, and competitive analyses.

Also, early on I think we should examine whether the various spaces we're discussing (educational, clinical research, generic data collection mechanisms for use elsewhere in the OpenACS toolkit, others?) actually do share enough commonality that it makes sense to try to create one general solution, or whether the appearance of commonality is superficial and crisper, separate solutions might be superior.

I've added some additional information and links here about CDISC, the Clinical Data Interchange Standards Consortium, and several big vendors of Electronic Data Capture solutions. The field is occupied by big players, but their big, big price tags create the need for alternatives -- and open up the opportunity for them.

12: Re: Surveys and Branching (response to 1)
Posted by Nima Mazloumi on
Hi everybody,

Here (user: student@student.de, pw: student) is what I have gathered so far regarding complex surveys. You can see everything first-hand if you follow the links below.
I hope you find this list useful.
Due to security restrictions I was not able to post the HTML file directly in this forum, so sorry for the trouble. But at least you can see our test .LRN server at the University of Mannheim.

Sources
  1. AIESEC.net
  2. Itemdevil (user:nima, pw:mazloumi)
  3. SCORM v1.3 Application Profile, Working Draft
  4. Seminararbeit - Fragebogengenerator (a German seminar paper on a questionnaire generator)
  5. QuestionPro.com (Login: User:mazloumi, pw:nima)
13: Re: Surveys and Branching (response to 1)
Posted by Nima Mazloumi on
SCORM 1.3 is next on my list... especially section 5 is relevant for SCORM-compliant sequencing.
Posted by Carl Robert Blesius on
Stan,

Sorry for the delay in response.

I am sure that there is enough commonality to warrant us working together on this.

I created a new page on the openacs.org site here:
https://openacs.org/projects/openacs/packages/assessment/

I then started to cut and paste.

I would be grateful if you would finish the cut-and-paste job (so I don't have to feel guilty about blatantly copying the work you have done so far -- same goes for you, Nima).

Both you and Nima now have ETP rights on the page (you should both see an ETP link at the lower right of the page when logged in on openacs.org). 30 fingers are better than 10. Caroline sent me a copy of the old Sloan RFP; I will try to add parts of that over the next couple of days too.

Anyone else want to help on getting the assessment roadmap written up?

14: Re: Surveys and Branching (response to 1)
Posted by Staffan Hansson on
FWIW, way back I collected my thoughts on assessment functionality for Survey in a sketchy draft. It's not much of an input, but at least it serves to remind us that, from the perspective of Curriculum, an assessment tool is very much appreciated. We plan to hook a future assessment service up to the Curriculum package as the third phase of tool enhancement, after we've added an IMS Simple Sequencing engine and turned Curriculum from a passive, flat sequencing tool into an active, branched one.
15: Re: Surveys and Branching (response to 1)
Posted by Dave Bauer on
I really think the curriculum package and the assessment package should coordinate. If you look at the IMS assessment specs, they defer to the Simple Sequencing spec for sequencing of assessment objects.

If the curriculum package is planning to implement sequencing, it would probably be good to make the assessment package utilize the sequencing service of curriculum instead of duplicating the effort.

This is just a conceptual recommendation. I haven't looked into the practical aspects of actually writing the code to do this.

16: Re: Surveys and Branching (response to 1)
Posted by Staffan Hansson on
As I've told you before, Dave, I like that concept too. But as I look through my old notes, I see that I once concluded that the IMS Simple Sequencing model should not be viewed as a generic sequencing engine but as a specific one for curriculum management -- and that it should not be pulled out of the Curriculum package, which was the topic of the day back then. Unfortunately, I forgot to make a note of my grounds for that conclusion...

So I revisited the IMS Question & Test Interoperability - ASI Selection & Ordering Specification: it never mentions "simple sequencing," only "sequencing" -- it seems to propose a somewhat simpler type of sequencing for assessment. (When IMS people wish to describe a more complex type of sequencing, they prepend the word "simple"...) The same spec also mentions an IMS Sequencing working group, which sounds like a team that might know a thing or two about this issue. I searched the site for this group but was only offered a dead link.

This sequencing confusion probably only emphasizes your point that the Curriculum package and the Assessment package should be coordinated (as sequencing is introduced into Curriculum, which we'd be happy to do this summer if the funding can be found). I doubt that the two of them are as similar to one another as they seem conceptually, but they will hopefully work together at some point. I think Pind's Rule of Five might apply here.

17: Re: Surveys and Branching (response to 1)
Posted by Ernie Ghiglione on
Staffan,

It would be great to have a look at your design for the sequencing engine once you have it. As I believe I mentioned to you before, I really would like to use this sequencing engine to launch different learning objects according to IMS Simple Sequencing rules.

I'm sure I gave you this link already. Here's the conceptual definition of simple sequencing from Carnegie Mellon. It might help.

Thank you,

Ernie

18: Re: Surveys and Branching (response to 1)
Posted by Ernie Ghiglione on
Sorry, forgot the link

Ernie

19: Re: Surveys and Branching (response to 1)
Posted by Staffan Hansson on
Yes, that document looks like a pretty good summary of what IMS Simple Sequencing is conceptually about. However, as implementers, we're better off sticking to the original IMS Simple Sequencing Specification, IMHO.

Let me share my thoughts a little. We see Curriculum as the UI or front end to the simple sequencing engine. But Curriculum will have working relations with at least two other packages: Assessment and Learning Object Repository (LOR). Our common job as developers must be to coordinate these projects and fit these packages together. How do they fit together? What are their relations? This is my basic understanding:

Curriculum and LOR - As I've suggested in an earlier post, it's not LOR's task "to include the sequencing engine to be able to launch different learning objects according to IMS simple sequencing rules"; it's the task of Curriculum to be able to map content in LOR (or elsewhere) to activities in the curriculum sequences and launch them. LOR's task is to get the relevant information from Curriculum in order to export sequences, right?

Curriculum and Assessment - Here Curriculum is the client that wants to be able to ask the Assessment package for the results of a test taken and check these against the sequencing objectives and rules for the particular curriculum sequence set up in the Curriculum package. This is not something that the developers of Assessment have to worry much about, though.
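
In code terms, I imagine the exchange being as simple as this (hypothetical proc names; neither package has such an API yet):

    # Curriculum asks Assessment for a learner's result and checks it
    # against the objectives of the current curriculum sequence.
    set score [assessment::get_score -user_id $user_id \
        -assessment_id $assessment_id]
    if {$score >= $min_score} {
        curriculum::mark_activity_complete -user_id $user_id \
            -activity_id $activity_id
    }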

This is how I see things right now. Does it make sense?

20: Re: Surveys and Branching (response to 1)
Posted by Ernie Ghiglione on
Staffan,

> we're
> better off sticking to the original IMS Simple Sequencing
> Specification, IMHO.
Yes, that is cool. However, take a look at the doc, since it describes a bit more technically the way a simple sequencing engine should be designed. IMS specs don't go into too many technicalities; those are left (unfortunately or fortunately, depending) to the designers and developers. It might give you some interesting perspectives on how to approach the design and construction.

> Curriculum will have
> working relations with at least two other packages:
> Assessment and Learning Object Repository (LOR).
I agree.

However, let me ask you if I could do the following. Let's say I have a course that has three learning objects (A, B, and C). Also included in this course are the specified rules for sequencing those learning objects (i.e. if learning object A is completed and ScoreGreaterThan X, then continue with B). Would I be able to pass these rules (or sets of activities or objectives) to the Curriculum API and have it give me the right sequence of learning objects to be launched for a particular learner who has taken (or not taken) the course before?

Or, when I upload the course, should I pass the acs_object_id for my course and then all the sequencing rules for the objects, so the curriculum package "knows" how the scheme for launching the objects should be laid out?

I think, basically, we need to define the API to the sequencing engine and what the SS engine will do.

> As I've suggested in an earlier post[1], it's not
> LOR's task "to include the sequencing engine to be able to launch
> different learning objects according to IMS simple sequencing rules";
> it's the task of Curriculum to be able to map content in LOR (or
> elsewhere) to activities in the curriculum sequences and launch them.

I guess it makes sense to launch LOs from the LOR. However, we would need to work something in between to support SCORM. It has its own funky API which is very specific. But I think we can work something around it.

Now, about assessment. From what you say, must the Assessment package have objectives and activities set according to the IMS specs if it wants to use the SS engine?

The way I would like to see it (and please correct me if you think there is something I might have missed) is to be able to give the SS engine the activities and objectives first [i.e. ssengine::new_set_of_sequences -new_course_id $acs_object_id -set_of_rules_and_objectives -map_to_LO {the acs_object_ids of the LOs in the content repository}]. Then I would like the SS engine to give me the proper sequencing for a given learner at any time [i.e. ssengine::request_sequence_obj(user_id, course_id)], returning the sequence for this particular user given what he has done in the past.

Moreover, an [ssengine::check_activity(user_id, course_id, activity_id)] call that returns the status of the activity (satisfied, completed, attempted, etc.).

Also some calls that deal with reporting (what a learner has taken, where he is now, how many times he has taken it, and all that jazz). Do we kinda have the same thing in mind?
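
To make that concrete, here is a rough Tcl sketch of the signatures I have in mind (all hypothetical; nothing of this exists yet):

    # Register a course's sequencing rules and objectives, mapped to
    # the learning objects (acs_object_ids) in the content repository.
    ad_proc -public ssengine::new_set_of_sequences {
        -new_course_id:required
        -set_of_rules_and_objectives:required
        -map_to_LO:required
    } { Store the rules, objectives, and LO mappings for a course. } {
        # ... persist the rules against the course's acs_object_id ...
    }

    # Return the proper sequence of learning objects for a learner,
    # taking into account what he has done in the past.
    ad_proc -public ssengine::request_sequence_obj {
        -user_id:required
        -course_id:required
    } { Return the ordered learning objects to launch for this user. } {
        # ... evaluate the rules against the learner's tracking data ...
    }

    # Return the status of one activity for a learner: satisfied,
    # completed, attempted, etc.
    ad_proc -public ssengine::check_activity {
        -user_id:required
        -course_id:required
        -activity_id:required
    } { Return the learner's status for the given activity. } {
        # ... look up the tracking data for this activity ...
    }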

Thank you,

Ernie

21: Re: Surveys and Branching (response to 1)
Posted by Ola Hansson on
Ernie, I have responded to your post in a new thread. Sorry about the delay.
22: Re: Surveys and Branching (response to 1)
Posted by Stan Kaufman on
Carl and all the rest: I've been away for a week but am finally back on the case. I've posted (without much change) the stuff I'd written previously (overview, user requirements) to the page Carl opened up here: https://openacs.org/projects/openacs/packages/assessment/

I'll continue to edit/add to it, and I hope others do, too. Carl, do you want to add Comments to the page to facilitate discussion there, or is it better to keep discussion here in the Forums?

Posted by Malte Sussdorff on
As I could not add the stuff we have done to the assessment page, I'll just continue to post the latest news here :).

Since the last demo, where we added branching and sectioning, a couple of new features have made it into survey.

  • Question catalog:
    A site-wide administrator has the option to create a predefined catalog of questions. Predefined questions have the benefit of being available in multiple surveys, and a user who has filled out a predefined question before will be presented with his answer in other surveys he is answering. The questions can be of the following types:
    • Normal predefined question: these are questions whose answers are stored normally in the survey system. A copy of the question is made for each particular survey, and the individual question uses a parent_question_id to point to the predefined question at hand.
    • Database-stored question: the answer to this type of predefined question is stored in a database table. When adding this type of question you have to give a table_name, a column_name, and a user_id_field. The column of the table is updated with the answer to the question for the row where the user_id matches the user_id_field (see the sketch after this list).
    In the not-so-distant future we want to add:
    • Tcl-based question: the answer will trigger a Tcl procedure that executes with some predefined values (e.g. user_id, site_node, whatever) and the answer to the specific question.
    • LDAP-stored question: same as the database-stored question, just storing the data in an LDAP server.
  • Clustering of sections: in the new version, all sections are displayed on one page until we hit a breaker. The need derives from the following scenario: on page one you have 5 questions, each with 2 different follow-up questions depending on the answer. In the old section model, the user would be presented pages two to six with one question each. This does not make sense, as you could display pages two to six on a single page, given that you already have all the necessary branch information.
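
For illustration, here is roughly how a database-stored answer gets written back (a simplified sketch; db_dml is the standard OpenACS database API, but the proc name is made up):

    ad_proc -private survey::store_db_answer {
        -table_name:required
        -column_name:required
        -user_id_field:required
        -user_id:required
        -answer:required
    } {
        Write the answer for a database-stored question into the
        configured table and column, for the row matching the user.
    } {
        # table_name, column_name, and user_id_field come from the
        # question definition; only the answer is bound as a value.
        db_dml update_db_answer "
            update $table_name
            set    $column_name = :answer
            where  $user_id_field = :user_id
        "
    }
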
Currently in development:
  • Templating of questions: instead of executing survey-specific code to display the forms in a survey, we are switching the code to use ad_form (a minimal sketch follows this list).
  • Templating of surveys: uploading an adp file will allow you to change the positioning and layout of sections within a survey.
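
For example, a templated question then boils down to a plain ad_form declaration (the element names here are illustrative; ad_form itself is the standard OpenACS form API):

    ad_form -name survey_response -form {
        question_id:key
        {response:text(textarea)
            {label "Your answer"}
            {html {rows 4 cols 60}}}
    } -on_submit {
        # ... persist the response for the current user ...
    }
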
Plans for the future:
  • Make our survey i18n-aware and add the English translation to the language catalog.
  • Add assessment capabilities to the survey -- or, more specifically, to a new package making use of survey capabilities.
  • Make survey adhere to the IMS standard.
  • Add generic import capabilities to support imports from third-party products (e.g. Blackboard, WebCT, others).
The code is still strictly Oracle, which is why we haven't bothered uploading it to OpenACS so far. Dave mentioned the possibility of uploading survey to /contrib, so once we are done with the templating I'll make a snapshot and contribute it there. In case you are in dire need of sectioning and branching, I'm more than willing to give you a current snapshot.