How the Test was Developed


Summary

After careful analysis of the ACRL Framework for Information Literacy and the current assessment literature, Carrick Enterprises developed a test blueprint. We applied instructional design principles to the development of the test questions, creating an engaging test-taking experience and taking advantage of modern Web capabilities. We developed a bank of test questions of varying difficulty levels to describe the information literacy of learners moving from novice to expert. The test items are based on outcomes and performance indicators that were themselves inspired by the ACRL Framework.


Steps

The ACRL Framework for Information Literacy ushered in a new era for theorizing, teaching, and assessing information literacy. The Framework provided a welcome impetus to pursue new ways of measuring students’ information literacy in order to discover more about how we are preparing students to be skilled, reflective information users and creators.

Carrick Enterprises started the process by assembling a panel of experts. The Threshold Achievement Team brings together the professional expertise, judgment, and observations of experienced librarians and professors on a national Advisory Board.

In concert with the Board, we worked through six stages of test development.

1. Specify Test Content

Librarians and other educators participated in an iterative process to articulate definitions and descriptions of information literacy for college students. The process started with identifying undergraduates’ common information literacy behaviors, beliefs, knowledge, and misunderstandings. These profile elements were then organized into core/periphery charts, which were compared with the knowledge practices and dispositions outlined in the Framework. Finally, constructs were defined, and the experts gave holistic feedback on the list of constructs and outcomes.

Concurrently, the panel considered information literacy dispositions, recommending that the test provide insight not only into students’ knowledge but also into the learning dispositions they draw on when faced with information literacy challenges. A structuralist analysis of the Framework document by the team’s rhetoric consultant identified and described the disposition content to be included in the test.

2. Define Test Structure and Question Format

Once the panel had grappled with the enormous breadth and depth of information literacy as a whole, they recommended creating four separate test modules. This approach lets educators assess a particular domain instead of administering one very large test all at once. Educators can focus their assessments on the Framework skills and abilities they teach in their classes and programs, making direct connections between students’ results and the content and techniques of their information literacy instruction.

Test questions use a variety of structured response formats that rely on test-taker interaction, along with more standard multiple-choice questions. This combination of question types lets the test reach concepts that demand higher-order thinking and are more cognitively complex than definition and identification. These item types also increase students’ engagement with the test, leading to results that more fully represent student understanding.

3. Write Test Questions

We created blueprints for each test module to codify specifications and guide the creation of test questions. Each blueprint records a question’s difficulty level (novice or high-level) and its category: for knowledge questions, understanding, critical thinking, or problem solving; for disposition questions, mindful self-reflection, productive persistence, toleration of ambiguity, or responsibility to community.
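
To make the blueprint idea concrete, the sketch below shows how a single blueprint entry might be represented as a data structure. The field names and example values are our own illustration, not the actual TATIL blueprint format.

```python
from dataclasses import dataclass

# Hypothetical shape of one blueprint entry; field names and values are
# illustrative only, not the actual TATIL blueprint format.
@dataclass
class BlueprintEntry:
    module: int        # which of the four test modules (1-4)
    outcome: str       # the outcome the question is aligned to
    difficulty: str    # "novice" or "high-level"
    item_type: str     # "knowledge" or "disposition"
    category: str      # e.g. "critical thinking" or "productive persistence"

example = BlueprintEntry(
    module=1,
    outcome="Evaluates the authority of a source in context",
    difficulty="novice",
    item_type="knowledge",
    category="critical thinking",
)
```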

The panel extended the work on outcomes by developing performance indicators that served as the basis for each test question. Performance indicators must be aligned, observable, unambiguous, and not compound. Working in teams, advisory board members generated test questions that went through multiple iterations and revisions. Cognitive interviews with students, field testing at multiple institutions, and further revision rounded out the test question process. Read details of test question creation on the Information Literacy Assessment blog.

4. Determine Scoring Method

TATIL is a criterion-referenced test, which means that test-taker performance is evaluated against the established criteria described in stage 1 above, not through comparison to other test-takers. Scoring on the knowledge portion is based on a partial credit model and on difficulty level. Students can earn full, partial, or no credit on a question, and the score a student achieves reflects how difficult it is to earn that amount of credit on that question. Difficulties are calibrated against a database of student scores from all participating institutions. Because questions differ in difficulty, they also differ in maximum score.
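
The sketch below illustrates the general idea of partial credit weighted by calibrated difficulty. The difficulty values and the additive weighting are assumptions made for illustration, not TATIL’s actual psychometric model.

```python
# Illustrative partial-credit scoring. Each credit step on a question has
# its own calibrated difficulty (here, invented values); earning a harder
# step contributes more to the score.
step_difficulties = {
    "Q1": [0.4, 0.9],  # two credit steps: partial credit, then full credit
    "Q2": [0.7],       # an all-or-nothing question
}

def question_score(question_id: str, steps_earned: int) -> float:
    """Sum the difficulty weights of every credit step the student earned."""
    return sum(step_difficulties[question_id][:steps_earned])

# A student who earns partial credit on Q1 and full credit on Q2:
total = question_score("Q1", 1) + question_score("Q2", 1)
print(total)  # 0.4 + 0.7 = 1.1 (illustrative units)

# Maximum scores differ because difficulties differ:
# Q1 max = 0.4 + 0.9 = 1.3, while Q2 max = 0.7
```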

Scoring for disposition questions is based on a student’s judgments about strategies. Students earn high scores on these items when they judge behaviors associated with the disposition as useful and behaviors not associated with the disposition as not useful.
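
A minimal sketch of this scoring rule follows; the behaviors and answer key are hypothetical examples, not actual TATIL item content.

```python
# Hypothetical disposition item: the key marks whether each behavior is
# associated with the disposition (True = judging it useful earns credit).
answer_key = {
    "re-reads the assignment prompt when confused": True,
    "abandons a search after one failed query": False,
}

def disposition_score(judgments: dict) -> float:
    """Fraction of behaviors the student judged in line with the key."""
    matches = sum(judgments[behavior] == useful
                  for behavior, useful in answer_key.items())
    return matches / len(answer_key)

# This student's judgments match the key on both behaviors:
print(disposition_score({
    "re-reads the assignment prompt when confused": True,
    "abandons a search after one failed query": False,
}))  # 1.0
```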

5. Set Performance Standards

Panels of librarians and other educators participated in standard setting sessions so that we could report more than raw scores. By setting standards, we can make and test claims about what students’ scores indicate about their exposure to and mastery of information literacy. The panels’ efforts resulted in three knowledge performance levels, defined separately for each module: conditionally ready, college ready, and research ready. Disposition levels were also identified; like the knowledge levels, they help test users interpret results.
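
In practice, standard setting amounts to choosing cut scores that divide the score scale into the named levels, as in the sketch below. The cut score values here are invented; the real cuts are set by the expert panels and differ by module.

```python
# Illustrative mapping from a scaled score to a knowledge performance
# level. The cut scores (60 and 80) are invented for this example; actual
# cuts are determined by the standard setting panels and vary by module.
def performance_level(scaled_score: float) -> str:
    if scaled_score >= 80.0:
        return "research ready"
    if scaled_score >= 60.0:
        return "college ready"
    return "conditionally ready"

print(performance_level(72.5))  # "college ready"
```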

6. Create Useful Reports

With input and feedback from the Advisory Board and field test participants, we developed reports that are robust, detailed, and informative. The institutional reports offer summary and detailed results, performance level indicators, disposition descriptions, subgroup breakouts, cross-institutional comparisons with peer institutions and other institutional groupings, and suggestions for targeted readings that can assist in following up on the results.

Individual student reports are optional and describe the student’s performance and offer recommendations for improvement.

Development Milestones

March 2014

A two-day workshop is held with academic librarians from around the United States. The goal of the workshop is to determine whether a new instrument should be created given ACRL’s move to the Framework. The consensus is that a new standardized test will bring valuable insights as schools make the transition from the Standards to the Framework.

September 2014

Dr. April Cunningham accepts the position of Project Leader for the new instrument. Her first action is to assemble an advisory board to assist with the design and development of the test.

October 2014

Our first (virtual) Advisory Board meeting is held. Work begins on writing outcomes and performance indicators based on the ACRL Framework.

December 2014

The Threshold Achievement web site goes live.

February 2015

ACRL files the final version of the ACRL Framework for Information Literacy.

Item writers are selected and begin working in teams to create test questions. Item writing and revising continues through summer 2017.

March 2015

The Threshold Achievement Test for Information Literacy is introduced at the ACRL Conference in Portland, OR.

We lay out a plan for two rounds of field testing for each of the four modules and solicit participants for field testing the first two modules.

April 2015

Colleen Mullally, then at Pepperdine University, becomes the first person to register as a TATIL field tester.

Summer 2015

Cognitive interviewing with students commences, allowing us to learn more about the readability, clarity, and validity of the test items.

Hal Hannon, our Rhetoric and Composition consultant, and project lead April Cunningham conduct an analysis of the Framework to investigate and determine information literacy dispositions.

October 2015

A student completes the very first TATIL test. By May 2016, a total of 848 students complete Module 1 and 780 complete Module 2, wrapping up data collection for the first round of field testing.

April 2016

The first test reports are generated for field test schools.

May 2016

Dr. Bozhidar Bashkov, who holds a Ph.D. in Measurement and Evaluation, provides psychometric analysis of the Module 1 and Module 2 tests, paving the way for informed revisions.

July 2016

A panel of librarians comes together to conduct standard setting for Modules 1 and 2, determining levels of achievement and writing performance level descriptions.

Fall 2016 – Spring 2017

Sixteen institutions participate in the second round of field testing for Modules 1 and 2 and seven institutions try out Modules 3 and 4.

February 2017

The format and content for TATIL reports and data files are finalized.

June 2017

Round 2 field testing for Modules 1 and 2 and round 1 field testing for Modules 3 and 4 are completed.

The standard setting panel reconvenes to focus on Modules 3 and 4, defining performance levels and descriptions.

July 2017

Final touches to items in Modules 1 and 2 are made, based on analysis of round 2 field testing. Items in Modules 3 and 4 are revised as appropriate.

August 2017

Modules 1 and 2 become available for production use. Modules 3 and 4 begin the second round of field testing.

May 2018

Field testing for Modules 3 and 4 ends with more than 1,100 students having taken a test. The standard setting panel meets one last time to review and adjust performance levels.

July 2018

Improvements are made to items in Modules 3 and 4. As of July 25, 2018, all four modules are complete!

Ongoing

We continue to add functionality and enhance features of the testing system, reports, and support material.