Meeting Minutes: October 2012

DAS Subcommittee Meeting

October 29-30, 2012

Chicago, Illinois

 

Monday, October 29, 2012

 

Present: Lori Lindberg, chair; Mahnaz Ghaznavi, Liz Bishoff, Veronica Martzahl, Glen McAninch, Subcommittee members; Sandra Dolan, Psychometrician; Solveig De Sutter and Amanda Look, SAA staff. Absent: Jackie Esposito, Subcommittee member.

The group came to consensus that DAS Subcommittee course shepherds should not be anonymous, in order to foster a working relationship with developers and instructors. Shepherds are positioned as the subcommittee liaisons who convey reviewer comments and subcommittee intent. Reviewers thus remain anonymous, even when the shepherd is also a reviewer. All information and communication is still channeled through Solveig, who will arrange conference calls between shepherds and instructors/developers when necessary. The Subcommittee's monthly conference calls will continue to serve as the shepherds' updates to the group on courses and any communication that has taken place.

What constitutes a 'revision,' as opposed to just a 'tweak,' in content? All content should be dated so we know when it was last used. Revisions should be major, e.g., a change from a two-day course to a one-day course or a total revision of web seminar content (a whole new offering). Course content should be updated at least once a year to include needed tweaks.

The group settled on a checklist for each course to keep course/web seminar revisions on track. SD to adapt MG's example of a cover sheet the committee could use.

 

Item Writing Training: Sandy Dolan

Dolan gave a quick summary of the exam creation process: 1) define the major domains of the exam and 2) assign weightings. A certification exam usually uses a Job Task Analysis (JTA); a certificate program exam follows the curriculum.

Options discussed included using the Foundational course topics only (8) or using the course topics from all tiers.

  • Is it possible to offer a base exam testing foundational material, plus a second exam for which people self-select a 'specialty'? Yes, but it would be an administrative nightmare and is not the ideal option.
  • If we went with the latter option (testing on everything), we would not cover Tools & Services (too specific) and would ask very general questions.

Weighting options: Option one (testing Foundational material only) would give equal weight to each course/domain (12.5% each) or weight by the number of topics per course/domain. Option two (testing on all tiers) would be weighted by the number of courses required; for example, Foundational 44%, Tactical and Strategic 33%, Tools and Services 11%, Transformational 11%.
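The option-two percentages follow from the required-course counts. A minimal sketch of that arithmetic in Python; the counts used here (4 Foundational, 3 Tactical and Strategic, 1 Tools and Services, 1 Transformational) are an assumption, chosen because they reproduce the quoted split (which rounds to 99%):

    # Assumed required-course counts per tier (not stated in the minutes);
    # any counts in the ratio 4:3:1:1 yield the same percentages.
    courses = {
        "Foundational": 4,
        "Tactical and Strategic": 3,
        "Tools and Services": 1,
        "Transformational": 1,
    }
    total = sum(courses.values())  # 9 required courses in this assumption
    for tier, n in courses.items():
        print(f"{tier}: {n}/{total} = {n / total:.0%}")
    # Foundational: 4/9 = 44%, TST: 3/9 = 33%, T&S: 1/9 = 11%, TR: 1/9 = 11%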

 

The group agreed to:

  • 55% Foundational, 35% Tactical and Strategic, and 10% Transformational.
  • 100-item exam, 3 major domains (Foundational, TST, Transformational)
  • Creating two versions of the test (Form A and Form B), with 60% of the questions shared between both; this means writing additional questions for each domain (see the sketch after this list).
  • Update the comprehensive examination at least once a year.
  • Look at questions in each domain to set the pass percentage. This will happen at the mid-winter face-to-face meeting and will require 7-10 subject matter experts.
  • Keeping the items secure by creating new questions for the comprehensive exam instead of rewriting questions that were used in the course exams. This also ensures that no one has seen these questions before.
  • Testing competencies that were addressed in the course content.
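A rough item-count implication of the two-form decision, as a sketch (the totals are back-of-envelope arithmetic, not figures from the meeting):

    # Two 100-item forms sharing 60% of their questions.
    form_length = 100
    shared = form_length * 60 // 100        # 60 items appear on both forms
    unique_per_form = form_length - shared  # 40 items unique to each form
    total_items = shared + 2 * unique_per_form
    print(total_items)  # 140 distinct items at minimum
    # The 150-170 writing target later in these minutes leaves headroom
    # for items cut during review or flagged in beta testing.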

 

After a discussion on challenge exams, the subcommittee recommended that we change the language on our website to say, "The challenge exam pertains to this course and not the overall discipline/core or subject matter."

The subcommittee recommended that SAA allow challenge exam takers one attempt at the exam, with no retakes.

If accepted by the Committee on Education, staff will need to change the language on the website, in the Bookstore, and on the ClassMarker site, and create new exams just for challengers (same questions, new link, new language). Carlos will have to create another category for challenge purchases versus web seminar attendee exam takers.

The group decided to test on:

Checksum (integrity checking)

Compression (bit loss)

DoD 5015.2

Emulation

Format (open, vendor/proprietary)

Functional

Normalization

OAIS (software package, standards, compression scheme)

Optical/magnetic

PDF/A

Rights Management

Standard (OAIS, PREMIS, Dublin Core, METS)

 

These topics were mapped to archival functions:

Appraisal: DoD 5015.2; Records

Accession: Ingest; PDF/A

Arrangement/Description: PREMIS; PDF/A

Preservation: Digitization; CMS

Access: (none noted)

And to curriculum tiers:

Foundational [F]: DoD 5015.2; Ingest; Dublin Core; PDF/A; Integrity checks (checksums, option B, option C); Transmission protocols (compression, option B, option C); CMS; Rights management

Tactical and Strategic [TST]: Retention/selection; Policy; CMS; PDF/A; Digital Repository; RFI/RFP

Transformational [T]: XML email preservation schema

Tools & Services [T&S]: CMS; PDF/A

Presentation on Exam Construction: Sandra Dolan

 

A test of ability should be like a ruler, with a common frame of reference, taking on more meaning than just the items it contains.

  • Items must form a continuum from easier to harder; ask of each item, "does the person have this knowledge?"
  • When a candidate passes no more items, you have determined their ability.
  • To make a better test measuring advanced ability, use more complex items requiring application and analysis.
  • To make a better test measuring fundamental ability, use more items requiring basic knowledge and some comprehension.
  • Either way, the meaning of the test and of ability does not change; the frame of reference is stable.

Measuring by the results – how many pass this question?

  • We must keep in mind that tests are NOT given to determine if a candidate knows the answers to particular items.
  • We are interested in whether the candidate understands the concepts that the items tap.
  • We are not interested in the items themselves, but rather in the underlying latent trait or concept that the item investigates. 
  • Again, the test should take on more meaning than the items it contains.
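The minutes do not name a measurement model, but the "ruler" framing above (items ordered by difficulty, ability read off where a candidate stops passing items, pass rates revealing item difficulty) matches the Rasch item-response model. A minimal sketch assuming that model, with illustrative numbers only:

    import math

    def p_correct(ability: float, difficulty: float) -> float:
        """Rasch model: probability of a correct answer."""
        return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

    # Items on a continuum from easier (-2) to harder (+2); one candidate.
    ability = 0.5
    for difficulty in (-2, -1, 0, 1, 2):
        print(f"difficulty {difficulty:+}: P(correct) = {p_correct(ability, difficulty):.2f}")
    # P(correct) drops below 0.5 once item difficulty exceeds the candidate's
    # ability, the point where they "pass no more items." Difficulty is a
    # property of the item, so the test measures the underlying latent trait
    # rather than the particular items it contains.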

Reading level: the committee agreed on a 12th-grade reading level, which should be kept in mind when writing questions.

Demographics: staff to send an email to everyone pursuing the DAS certificate asking demographic questions.

  • Test on knowledge, comprehension, and application.
  • Avoid the words “should, could, and might.”
  • If an answer choice completes the stem sentence, it should end with a period.
  • Unless the answers are proper names/places, they should be lowercase.
  • Always use plausible wrong answers.
  • Avoid using none of the above/all of the above.
  • Avoid true/false (it’s a 50/50 guess).
  • Avoid combination answers such as "A and B," "B and C," etc.
  • Avoid negative statements – “which is not…”
    • If you have to use these kinds of questions, group them together so people don't have to switch their thought process repeatedly each time they hit a negative question.

Share these rules with the DAS instructors so they can apply them while writing their questions.

Major = Foundational, TST, T&S/TR

Minor = appraisal, accession, description

Minor = specific tool

Add bibliographies and assigned pre-readings to the Subcommittee DAS site.

To develop the comprehensive exam we need to:

  • Write 150-170 items (see the consistency check after this list):
    • 83 Foundational
    • 53 Tactical and Strategic
    • 14 Tools & Services/Transformational
  • Ask instructors, or anyone who writes items, to sign an NDA (and give them the item-writing guidelines so that we don't have to rewrite/restructure items); be specific about what the questions should ask: content area, competencies, how many of each, etc.
  • Review the items
  • Approve the items
  • Create the test
  • Review the test
  • Approve the test
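The writing targets line up with the agreed domain weighting. A quick sketch of that check (the "buffer" framing is an inference, not meeting language):

    # Item-writing targets versus the agreed 55/35/10 exam blueprint.
    targets = {
        "Foundational": 83,
        "Tactical and Strategic": 53,
        "Tools & Services/Transformational": 14,
    }
    total = sum(targets.values())  # 150 items
    for domain, n in targets.items():
        print(f"{domain}: {n}/{total} = {n / total:.0%}")
    # Foundational 55%, TST 35%, T&S/TR 9%: the pool mirrors the blueprint,
    # so items can be cut in review without skewing domain coverage.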

Scantron setup/finalizing the exam: staff to research the options and set deadlines for submissions.

Staff to explore testing options via Peach New Media, Drupal, and/or Moodle.

Standard-setting meeting: 7-10 experts will be present at a live meeting to come up with a passing score. This happens after the exam is written and reviewed, and can be done before or after the exam has been given (doing it afterwards enables a look at the data from that administration).

Beta testing? Offer it in March for 5-10 selected individuals. Tell them up front that this is a beta test, etc., and that if we drop bad items, they won't be penalized for wrong answers on those items. Offer it for free.
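One way the "no penalty for dropped items" promise could work is to rescore against only the surviving items; the formula and numbers below are an assumption for illustration, not a method set at the meeting:

    # Hypothetical rescoring after dropping bad beta items.
    total_items = 100          # assumed beta form length
    correct = 70               # candidate's raw correct count
    dropped = 8                # items cut after beta review
    dropped_correct = 3        # of the dropped items, candidate got 3 right
    kept_total = total_items - dropped
    kept_correct = correct - dropped_correct
    print(f"{kept_correct}/{kept_total} = {kept_correct / kept_total:.0%}")
    # 67/92 = 73%, versus 70% if the dropped items had counted against them.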

Subcommittee began writing items.

Meeting adjourned at 6:00pm.

 

Tuesday, October 30, 2012

 

Present: Lori Lindberg, chair; Mahnaz Ghaznavi, Liz Bishoff, Veronica Martzahl, Glen McAninch, subcommittee members; Solveig De Sutter and Amanda Look, SAA staff. Absent: Jackie Esposito, subcommittee member.

The subcommittee spent the morning writing items based on the course slides, beginning with the Foundational courses. By the end of the day, 48(?) items had been written.

Slide change needed: F4_Standards, slide 9: take out "published" in "de jure / published."

Slide change needed: F6_Appraisal, slide 46: change to "GIS files" and add "there are more complex structures"?

Since item writing needs to be completed by mid-December, the group agreed to two conference calls each in November and December dedicated to reviewing items that individual members have written.

 

Call dates:

Thursday, November 15 at 12:00pm Central (10:00am Mountain; 1:00pm Eastern)

Monday, November 26 (change to 30?)  at 12:00pm Central (10:00am Mountain; 1:00pm Eastern)

Friday, December 7 at 12:00pm Central (10:00am Mountain; 1:00pm Eastern)

Tuesday, December 18 at 12:00pm Central (10:00am Mountain; 1:00pm Eastern)

To achieve fair distribution of item writing, the following schedule was worked out:

 

Lori: TST 03 (5 Qs), TST 08 (5 Qs), F 07 (8 Qs) = 18 Qs

Liz: F 08 (8 Qs), TST 05 (5 Qs), TST 11 (5 Qs) = 18 Qs

Mahnaz: TST 06 (5 Qs), TST 07 (5 Qs), T&S 01 (2 Qs), T&S 03 (2 Qs), TR 02 (2 Qs) = 16 Qs

Veronica: F 02 (8 Qs), TST 01 (5 Qs), TST 10 (5 Qs), T&S 02 (2 Qs), TR 01 (2 Qs) = 22 Qs

Glen: F 06 (8 Qs), TST 02 (5 Qs), TST 04 (5 Qs), T&S 04 (2 Qs) = 20 Qs

 

The Education committee meeting is scheduled for February 27-28, and the group agreed to piggyback the mid-winter DAS Subcommittee meeting onto the end of it (March 1-2) to take advantage of Education committee members, who will be asked to stay over and serve as subject matter experts for the weighting of the items.

 

Meeting adjourned at 1:45pm.

 

Additional notes that won’t be reflected in the official minutes:

For reviewers, explore Peter Hirtle, Richard Marciano, Anne Gilliland, Gregor Trinkaus-Randall, Robert Spindler, Richard Pearce-Moses, Aaron Rubinstein, and Melley?

 

For SMEs from the Education meeting, ask David Kay, Naomi Nelson, and Lorraine Dong.