Meeting Minutes: October 2017

Society of American Archivists

Digital Archives Specialist Subcommittee Meeting

October 9 – 10, 2017

Chicago, Illinois

 

Meeting Notes

 

Attendees:  Glen McAninch (Chair), Martin Gengenbach (Vice Chair), Jennifer Brancato, Kate Legg, Alice Prael, Sibyl Schaefer, Jennifer Spamer, Ashley Taylor, Marlee Graser (Intern), Veronica Martzahl (Comprehensive Exam Development Consultant), Solveig De Sutter (Education Consultant), Brianne Downing (Education Coordinator), and Nancy Beaumont (SAA Executive Director). Staff liaison Mia Capodilupo did not attend.

 

Monday, October 9

 

McAninch called the meeting to order at 8:35 am and briefly reviewed the proposed agenda.

 

The minutes of the Subcommittee’s September 18 conference call were approved with no changes.

 

Implementation Plans/Activities for the Goals/Objectives

 

Current Goals/Objectives

 

  1. Examine the role of testing in the DAS program and make adjustments.
    a. Activities could include review of exam evaluation responses and other feedback.
    b. Review the DAS mission/mandate against the goals of DAS testing.
    c. Brainstorm and identify other ways of establishing DAS student competency/success.
    d. Who else can we work with (in/out of SAA) to gain additional perspectives on this question?
    e. Determine whether online administration or elimination of the DAS Comprehensive Exam is an option to reduce hardship for certificate candidates and reduce SAA time/cost expense.

 

  2. Promote use of and refine the new SAA educational management tools to provide metrics that would inform the DAS planning process.

 

  3. Review course offerings supervised by liaisons and update the process workflow (please see the recently updated spreadsheet of courses and liaisons).  [Success is defined by a documented and publicly available course prep workflow on the DAS site.]

 

  4. Mentor archivists through the Mosaic Scholars program, meeting the requirements of the program.

 

  5. Work with the new intern to determine a project fitting the proposed study.

 

  6. Work with appropriate SAA sections to promote SAA educational goals for DAS.
    a. Mark Matienzo’s report at the 2017 Research Forum on SAA metadata and digital practice review could be helpful here.
    b. Would be good to have a wider discussion on DAS relationships with other sections and how to better leverage them.
    c. Last year, contact was made with the Electronic Records Section regarding a suggested appointment.

Re 6.c.:  What would be the role of this person going forward? Should they attend online meetings? Perhaps advising on what courses are needed/desired and pointing us to the best resources and people to teach such sessions. Make it a limited role; we’ll consult as needed, so it is not necessarily a huge time commitment on the part of this liaison.

Gengenbach:  What is the possibility of having formal liaison relationships with other SAA sections? Extend calls to include individuals from key groups (e.g., MDOS, ERS, C&U, MRS) who may be targeted to work with DAS; ask them to make a presentation on what they’re working on?

 

De Sutter: While going through new courses in development, let’s keep in mind whom we can reach out to for help in bolstering the usefulness and accuracy of a given course.

 

Should there be an external expert as an official reviewer for new courses? Locating the right person may be challenging.

 

Marty Olliff reached out to the Committee on Ethics and Professional Conduct and the Standards Committee re a webcast on Ethics in Action.  COE provided specs for this course and asked for CEPC’s input on the course content as well as suggested developers/presenters.

 

McAninch will contact the new chair of Electronic Records Section and get the ball rolling on getting a liaison. We should present some basic ideas for the role.

 

Gengenbach:  Maybe there won’t be a need for an official liaison, but merely increased contact and invitations to participate in the monthly call.

 

De Sutter: So perhaps we prepare a brief presentation that we can send to other sections as an invitation? Or as we are developing new courses? Use that as a basis for contact?

 

McAninch: They might review the course or point to someone with expertise to review.

 

COE suggested an OAIS course so that this content need not be included in all courses.

 

Beaumont: How are individuals feeling about the goals/objectives?

 

Legg:  The elephant in the room is the comprehensive examination.

 

Gengenbach: Would feel better if we had clear performance measures for the goals/objectives by the end of this meeting.

 

McAninch:  We will revisit these at the end of this meeting to determine paths forward.

 

Workflows for Course Development and Review Each Year

Gengenbach:  Suggest that we create a narrative from the course development diagram [prepared by Mahnaz Ghaznavi].

February:  DAS Midwinter Meeting (virtual)

 

Q1 > Course maintenance / new course review

          SAA budget development (March/April)

Q2 > New course offered

Q3 > Find developers and contract

         SAA Annual Meeting:  First offerings of new courses

Q4 > Accept new course offerings

 

The workflow looks good, but add the number of people who review, link to the documents that are referenced, and specify how communication takes place.

 

“New course” = New (unsolicited) course proposal received via online proposal form.

Goes to DAS (via monthly online discussion) to determine acceptability of course proposal (yes/no).

 

  1. Course proposer completes online course proposal form (https://www2.archivists.org/prof-education/Proposal-Form). Sent to SAA staff (director) and then to DAS.
  2. DAS decides on acceptability of course.  (This conversation occurs on the standing monthly call. Discussion may help the proposer flesh out the form re, e.g., ACE, tier, etc.)

    2a. If DAS decides to move forward, a DASS liaison is assigned to work with staff.

  3. Education Director or liaison (identified by chair, if needed) communicates the points of this discussion back to the proposer. The proposal is fleshed out according to this feedback. (The Education Director will help if needed.)
  4. Proposer completes full proposal.
  5. Full proposal reviewed and accepted by liaison and reviewers.

-          Reviewer(s) assigned by liaison suggestion or Ed Director’s list of volunteers.

-          Reported on by liaison in monthly call, but full proposal does not come back to all of DASS.

  6. Accepted proposal completed, contract issued.

-          The contract sets up deliverables and a timeline. Because the liaison needs to check in to make sure these are on schedule, it is crucial to include the liaison on all communications about deadlines and expectations.

  7. Deliverables for developer:

-          Course content (Course description received by staff and shared with liaison for review/approval.)

-          Bibliography

-          Pre-course readings

-          Handouts/exercises

-          Software requirements

-          Room setup

-          Exam questions (at least one week prior to offering) (2 sets of 20 questions each for a one-day face-to-face offering, 2 sets of 10 questions for a webcast)

 

[Staff:  Set up item writing training for DASS members.]

 

  8. Provided to developer:

-          Contract (with deadlines)

-          Test question guidelines

-          Slide template

-          Review forms

-          Description of the role of the liaison

  9. Liaison expectations:

-          Check-ins

-          Monthly reporting (on DASS call)

-          Review drafts / webcast practice sessions

  10. Draft presentation complete at least 2 months before course. Liaison and reviewers review (and take part in webcast practice session, which is a full presentation).

 

  11. Developer submits final course content. Education staff uploads course content.
  12. Goes to review cycle.

 

De Sutter:  Each webcast should have a complete practice session, start to finish. If liaisons could be part of practice session, that would be a big step forward in ensuring the quality of the finished product. Software orientation for 30 minutes.

 

Gengenbach:  Tie items in Liaison Checklist to duties. Provide guidance on behaviors and tasks expected. Revise layout. (Combine with DAS Shepherd duties.)

 

Subgroup to complete draft revision of these materials:  Gengenbach, Schaefer, Brancato (for November DASS conference call).

 

Review the Course Review Calendar

History:  The DACE task force laid out a curriculum for what should be in each tier -> spent the first few years developing all of those courses -> the DAS Subcommittee sought out new courses -> the comprehensive exam was created -> existing courses were reviewed. In the past year or so, the emphasis has been on creating webcasts.

Workflows:  http://files.archivists.org/education/dasfiles/general/Course%20Development%20Workflow.jpg

 

Who is the initiator for these courses? It seems like most of them were offered by instructors, in addition to the first courses developed before the certificate was offered.

Courses have to be reviewed every two years to keep up with technical and legal concerns.

“Edu” in the workflow refers to Education Department staff.

In the past year there has been increased development of webcasts because that is the format that is being requested. Need to consider using this format for more subject areas.

 

Need to document who is doing what along the way and determine the desired outcome from each of these steps.  (Per Jeni:  We should get volunteers to work on this; perhaps it could go into Glen’s summary of deliverables needed?)

 

Per Jeni:  Consolidated Liaison Guidebook (combining/revising existing guidelines and checklist) to be worked on via Google Docs by Gengenbach, Schaefer, Taylor, and Brancato.

 

Duties of DAS Subcommittee Members

 

  1. Course monitoring duties of liaisons

-          Course Liaison Checklist review. New subgroup of DASS will edit the Checklist and present a revised document for feedback on the November 2017 DASS conference call. Collapse sidebar into actual document so that people understand what they’re supposed to be doing in each group (exams, etc.).  No notes re who is on the subgroup.

 

  2. Auditing courses

-          SAA encourages DASS members to attend whenever possible and convenient. Per NPB: SAA will support travel and lodging for DASS members to attend courses, with an emphasis on newly developed courses, for this year (i.e., FY 2018).

-          Contact staff and ask to audit.

-          Try to attend first offering of newly developed course.

-          Try to audit courses beyond those for which you’re the liaison.

-          Honest feedback required!

-          Must take test!

 

  3. Ensuring that there are two sets of tests in proper form.

 

Review of Courses

 

Gengenbach:  What kind of statistics do we need/want? Summarize information about students, faculty, tests, and reviews, and what employers want to know.

 

See spreadsheet for added notes.

 

The Digital Project Management course doesn’t seem to be a good fit for the DAS Program.

 

It would be a good use of time for this subcommittee to brainstorm course ideas for the Transformational tier.

 

De Sutter:  Would be nice to include course objectives in evaluation for each course, as we’re not getting feedback specific to the objectives due to technical limitations.

 

Tuesday, October 10

 

Learning Management System
             

Review of Capabilities:  Downing reviewed BlueSky capabilities and provided an overview of the administrative back-end.  Access is provided in tiers; everyone can see course materials.  We must all make sure that instructors understand that once you send materials to SAA, that is the final version; don’t make changes in a personal version when students will be downloading the one you sent.

 

Access to Course Information: Downing will compile the most recent documents into DASS Course Portal by EOY 2017 and provide a walk-through of the portal to DASS when it’s ready.  Three weeks after a course closes, we will run reports and put them in BlueSky. Staff will send reminder. DASS portal will house all reports that we run.

 

Keep only most recent version of slides, etc., on BlueSky. All historical documents to be retained on the SAA server site.

 

Statistics Generated:

 

How often is course offered? Total number of attendees?  Is interest growing/waning?  Some information currently available in BlueSky for webcasts (after July 2017). For in-person courses, we will have to gather information. Extra data, such as sales reports, will be added later (once we have completed the transition to a new association management software system).

 

Set up references in BlueSky portal.

 

Re-forward to DASS the compiled survey feedback from the past 2 years.

 

Consider what other materials circulated in the past should be re-sent or posted on the BlueSky portal. 

-          Two surveys of DAS certificate holders.

-          Curriculum review done by Lindberg and Ghaznavi.

-          Membership Committee survey on barriers to participation.


Additional needs for measurement?  Examples: Measure the progress of the program / determine future needs / see if we’re satisfying students’ need for validation of knowledge.

 

Gengenbach:  Participant feedback. Why are you taking the class? Personal enrichment, employer-mandated, career advancement? What is motivation? Did it work (i.e., did you get promotion, salary increase)? Did taking the course have the desired outcome?

 

McAninch: Do we collect student profiles?  No, not as of now.

 

Brancato:  Did a survey of certificate holders in 2016. Could do a survey of people working toward the certificate, with follow-up in 12 to 18 months?

 

Gengenbach: Add question to course survey?  (De Sutter: Adams added great question to current course survey.)

 

Legg:  Would you recommend the DAS program? If not, why not? How likely are you to recommend this course to a colleague (not likely -> very likely).

 

Spamer:  Did you find this course to be worth the money?

 

Once we have new AMS, we will be able to query those who have completed only one or two courses and then didn’t sign up for certificate.

 

Gengenbach:  Are you taking this course for a certificate?

 

Staff to send blank survey to everyone. Please review and suggest revisions. What is the single most important question you’d like to ask?

 

Intern’s Role / Project

 

The previous task was to review job postings specifically related to archives programs (with libraries secondary) to look for mentions of DAS knowledge/skills. Continue what was being done, but focus it based on DASS’s determination. Add a correlation to what we’re teaching. How many job postings actually mention the DAS certificate? Or a recognizable set of skills that we’re trying to address in the program?

 

Schaefer and Brancato forwarded previous reports to DASS list for Graser’s use.

 

Survey at SAA Research Forum on graduate archival education > what is covered and where > another survey on job listings and what is required for LIS jobs (done by Monica Licelli).

 

Intern project > reviewing students and classes > Graser will send a project proposal upon receiving and reviewing the previous intern report.

Try to get employers to tell us whether they value the DAS certificate.

 

Mosaic Scholars Involvement with DAS Program

 

Interaction with Mosaic scholars has been very limited. They are not obligated to attend courses or to take DAS exam. It appears that “mentorship” has been more of an orientation to DAS than helping individuals to plan a specific program of study.

 

Beaumont to reach out to Mosaic Program administrators (ARL) to inform them that scholars have been non-responsive to communications from DAS mentors.  Follow up with Legg and Brancato about their experiences (e.g., Brancato encouraged Carbajal to attend DAS course in Portland).  McAninch has Sandra Delaney; Taylor has Nadia Clifton. Others are Julie Parks and Ashlynn Prasad.

 

DAS Comprehensive Exam and Other Tests

 

[Martzahl joined the Subcommittee via teleconference line for this portion of the meeting.]

 

Martzahl:  The next test is scheduled for early November, with the same questions as the July exam (which, in turn, had the same questions as February/March). There are 4 or 5 courses that need questions added to the exam. Integration of new questions runs about a year behind so that people have a chance to take the new courses.  (Per Jeni: Veronica indicated that she had hoped to have questions for us to review today, but will get them to us so that they can be tested in November.)

 

A potential negative of doing away with the test is the loss of analyses of the questions.

Beaumont: Trying to get it as close to “certification” as we can without using the word. It is a certificate of professional competency.

 

Is that what we [still] want?

 

Gengenbach:  We modeled DAS on ACA.  If we don’t need to do that, why copy?  It puts us in competition with library schools and already-existing certification programs.  We may be risking ire, especially since we’ve been getting such negative feedback about the exam and the travel costs. If people aren’t thrilled, why don’t we try to find something that can demonstrate skills and proficiencies and that also can have an impact on the organizations in which these students are working? A field experience, or kind of capstone project, to sum up DAS experiences rather than comp exam? Redirect intellectual time and energy to some other experience?

 

McAninch:  Each course might end with scenario for student to engage with based on experience?

 

Prael:  Question of exam depends on our primary focus as a group.  Do we want to provide a credential or do we want to provide continuing education? If we’re going to set it up as a certification but avoid the word, then we just need to embrace it and make it an actual credential. If we want to treat it as education, then the test doesn’t help achieve that goal.

 

Brancato:  Found the test useful; she had to go above and beyond the courses that she had taken.  The certificate does help people with jobs.  The main things people don’t like are the travel and not being able to take the exam right away.

 

Taylor:  Pursued because even in grad school the content wasn’t discussed. Doing it this way to get experience and make herself competitive. Online testing shouldn’t be an issue. Could be better designed to demonstrate practical knowledge. Also, many issues are so vague that taking a test seems to reward the ability to take tests, not the creative and abstract thought that this educational path really emphasizes.

 

Schaefer:  Not sustainable for the subcommittee in the long term to manage the test, particularly if we lose Martzahl, as it requires so much effort.  Exam questions take up 50% of the subcommittee’s time. Would also do away with end-of-course exams.

 

De Sutter:  Trying to elevate continuing education to what other professional organizations have, which helps differentiate members, provides them an opportunity to fill gaps in their education, and also helps archivists to be more respected and professional.  It has also been our intent to reach out to employers to emphasize the importance of this certificate. (Schaefer:  If that’s the case, then someone needs to be hired specifically to handle just the DAS program.)

 

Spamer:  Like Brancato, found the test valuable.  Surprised that use of resources wasn’t allowed during the test. Felt formal. But she thought it provided motivation to build a resume. Had to do more readings to “earn” a certificate. If the people who tend to give feedback are the negative ones, how do we really know that everyone hates the exam? Right now, it’s professional development AND certification; she doesn’t know if that’s the way we should continue operating this in the future.  Believes it needs to be an online exam and that it’s okay to allow people to have resources available when taking the test. Taking it away would make a lot of people who took the curriculum feel invalidated.

 

Legg:  Likes the idea of a practical project, but that would require additional staffing at SAA, which is critical. The test does make you learn more than what would have been required just from the classes taken.  Likes the online test idea.  For the future state: a practical project; for the current state: move the exam online.

 

Martzahl:  From a sustainability standpoint (per Schaefer), getting rid of the test as a whole is a good idea.  Look at the structure of the courses themselves – require X number of core courses that have more rigorous testing, plus electives, with a certificate at the end. That would mitigate the disgruntlement of those who’ve taken the exam. Doesn’t think that we can do a practical project; there is an issue of travel, because “more rigorous” may mean more in-person. There is no practical way to do something more evaluative, because it would be too subjective and time-consuming.

 

Brancato:  Likes the idea of mandatory classes as a central “track,” creating more rigorous courses for that track.  Students would have to take more classes, but a higher number of online courses (as opposed to webcasts).

This would have to be phased in. There must be a period when we evaluate testing, but also how students interact with online courses.

 

Legg:  Can we think of this as phased in—start with online exam and plan a different way?

 

McAninch: Likes the idea of a practicum inside the courses, maybe in those core courses. Thinks we should begin phasing out the exam.  It does have the benefit, though, of taking the instructor out of the exams. During the phase-in, evaluate the way in which students interact with online courses (not webcasts). How might we integrate practicums? Assignments for each module, or an assignment at the end of a group of modules? How to build in interaction with the instructor?

 

Beaumont:  From a governance standpoint, the format of exam administration (i.e., in person vs. online) can be handled with a recommendation to the Committee on Education and a decision at that level. Restructuring of the program (including doing away with the comprehensive exam) rises to the level of the SAA Council, as it could have a significant impact on the SAA brand and revenue. In that case, DASS would make a recommendation to the COE that would then be made to the Council.  All recommendations should be as complete as possible (laying out timelines and providing a balanced presentation of pros and cons, based on data rather than just personal experiences and anecdotal evidence, whenever possible). Sustainability is always a concern, and it should be balanced against the key role that the program has played in terms of SAA branding and revenue.

 

Gengenbach:  Data must be behind these decisions.  Timeline? We might have this ready by the end of next year.  What are the key things we need to know to make a good recommendation?  For example: who takes courses for an information need versus for the actual certificate, and what they like and dislike.  How much actual time does the exam work take? This coming year has to be all about planning for this.

 

The group conducted a vote (unanimous) to move the comprehensive exam online in 2018, beginning as early as possible in the year.

 

In addition, subcommittee members agreed to spend the year examining the data around the possibility of restructuring the DAS program and doing away with the comprehensive exam.

 

By October 2018, DASS will have fleshed out its recommendations regarding program restructuring, with a goal of submitting them by early 2019.

 

Timeline for recommendation on online comprehensive exam:

 

-          11/17/17:  COE conference call.

 

-          By 11/10/17:  Submit for COE consideration a recommendation to move the DAS comprehensive exam online.

 

-          By 11/1/17:  McAninch, Gengenbach, Brancato, and Spamer will draft the DASS recommendation to move the DAS comprehensive exam online.  The draft will be forwarded to all DASS members for a quick review and will be submitted for consideration by COE on its 11/17 conference call.  (Draft should answer:  Offering frequency? Expiration? Eligibility? Creation of testing batches? Metrics and statistics for reporting?)  Beaumont to send template for recommendations to the Council, which can also be used for recommendations to the Committee.

 

-          By week of 10/23/17:  De Sutter to set up a call with the psychometrician; Capodilupo to send sample test data to the psychometrician; query her re how this affects her workflow, what she needs from SAA to continue her work, and whether online administration affects how statistics are recorded/reported.  All available information to be forwarded to the working group that is drafting the DASS recommendation.

 

Long-term recommendations group:  Brancato, Gengenbach, Martzahl, McAninch, Prael, Schaefer, Taylor.

 

Timeline for long-term recommendations:

 

-          For discussion at October 2018 in-person DASS meeting.

-          Recommendations to COE January-March 2019 (for inclusion of scenario in draft budget).

-          Recommendation from COE to the Council’s May 2019 meeting.

 

Wrap-Up

 

Add to pre-class form:  Do you intend to pursue the DAS certificate?

 

Review current touch points with students to make the most of these opportunities to interact and gather data/information.

 

The meeting was adjourned at noon.