Social reformers used to worry about the “digital divide,” the gap between schools and communities that had easy access to technology and those that did not. Now they need to worry about a different kind of disparity — the proficiency gap. Just because classrooms and homes are equipped with hardware does not mean that teachers and students know how to use it and exploit it.

Educational Testing Service, which began testing college students six decades ago, realized it needed to react to the dramatic change in the way the world learns. And it saw an opportunity to add a new test to its formidable repertory. It came up with the Information and Communication Technology test (ICT) for evaluating technical literacy. The test is unusual because it measures how students use technology to solve problems. A pilot program has just ended, and students will begin to take the ICT in April.

The test “presents real-time, scenario-based tasks designed to be highly engaging and valid,” says a spokesperson. It consists of 14 four-minute tasks and one 15-minute task, and it takes 75 minutes to complete. A demonstration of the ICT, which costs $33 to $35 and can be taken only on computers, is available at www.ets.org/ictliteracy.

U.S. 1 Newspaper assigned two freelance writers, Euna Kwon Brossman and Julianne Herts, to take the pilot version of the ICT and report on the experience. Brossman graduated from college before technology mattered and is a self-professed technophobe. Herts, on the other hand, represents the exact profile of the student for whom the ICT was written. She is a junior at West Windsor-Plainsboro South. Their stories begin on the opposite page.

Work on the ICT began in 2001 when the Rosedale Road-based ETS brought together an international panel of technology experts, educators, and business and industry leaders to look at the notion of technology literacy, defined as “the ability to use digital technology, communication tools, and/or networks appropriately to solve information problems in order to function in an information society.”

ETS wanted to focus on the ability to use technology as a tool to research, organize, evaluate, and communicate information. It also wanted to include a fundamental understanding of the ethical and legal issues surrounding the access and use of information.

“We recognized that over the last decade there was a growing digital divide between schools that had access to technology and those that did not, and some kids had an advantage or disadvantage,” explains Terry Egan. As project manager in the new product development area at ETS, she has been working on the ICT project for the last two years. “Now the issue has grown to be one of a proficiency divide. It’s not enough to have the equipment. We need to provide professional development to teach teachers how to use the technology effectively. If we’re not teaching and measuring the right skills, we’re not using technology to its fullest potential.”

The seven charter clients that worked on the pilot program were the California Community College System, California State University System, University of California, Los Angeles (UCLA), University of Louisville, University of North Alabama, University of Texas System, and the University of Washington. That group of advisors later expanded to include Arkansas State University, Bowling Green State University, Miami Dade College, Oklahoma State University College of Education, Portland State University, Purdue University, and the University of Memphis.

This group offered the first large-scale literacy assessment in January of last year. Students from more than 30 secondary and post-secondary schools took the test but did not receive individual scores. Their institutions did receive scores. “What the colleges and universities wanted to do was to get a sense of where their kids stood in relation to others who were tested,” explains Egan.

“We learned that we wanted to focus on a new design for an individual version of the test. Many who took the first test said it was too long, at just a little over two hours. We have cut it back to 75 minutes,” says Egan. The test initially targeted rising juniors in college. “What we heard from colleges was that they wanted an assessment of students as they began their college careers, students transitioning from high school to college.” ETS decided to move forward with two levels of the assessment and piloted the advanced level in October and November of last year.

Now there are two versions of the ICT Literacy Assessment. The core academic assessment targets students transitioning to college and is appropriate for all high school seniors, community college students, and freshmen and sophomores at four-year schools. It provides administrators and faculty with an understanding of the cognitive and technical proficiencies of a student doing entry-level coursework.

The advanced assessment targets students transitioning to upper-level coursework and is appropriate for rising juniors at four-year schools. The scores provide guidance for rising juniors and their teachers.

The core assessment pilot study, which ended February 17, was administered to students in 27 to 30 schools, some near the end of high school, others at the beginning of their college career. “It was a trial run,” says Egan. “Students did not get individual results. We’re using this pilot to work out the kinks and figure out problems, including unexpected technical problems.”

One obvious glitch, as the U.S. 1 reporters discovered, occurs when the lab has not been configured according to the requirements of the test. Says Egan: “It’s delivered with a secure browser so it’s important that the labs be set up with all that software and it needs to be downloaded in advance. If that’s not done, there will be technical problems.”

April is the first time the test takers will receive individual scores. The core version of the assessment will be scored on a scale of 0 to 300, the advanced level from 400 to 700.

Egan does not see this test being required for college entrance as the SAT is right now, but she does see schools using it for placement in certain courses.

She also expects that there might be some differences in results based on geography or socioeconomic levels, and ETS will be addressing those issues. “We will compare various populations to see if there are differences in performance. We’ll make sure that we include representation from a variety of educational settings as we develop the test to make sure we don’t disadvantage any populations. If there’s a regionalism that makes something make sense in New York City that wouldn’t make sense in Kansas, we would catch that.”

Egan, who lives in New Brunswick, grew up outside Philadelphia as one of 14 children. Her mother stayed home to raise them and her father was a restaurant manager. She attended Archbishop Prendergast High School, and pursued a degree in American Studies at Rutgers University. She earned a master’s at Rutgers in educational administration, taught for 10 years in Philadelphia and New Brunswick, and came to ETS in 1999 as part of a teacher team working on the National Board for Professional Teaching Standards. She worked on that program for five years before moving into the new product development group. Her husband, Joseph Egan, is a construction manager in Morristown. They have four children, ranging from 19 to 32, who are all very comfortable with technology.

Egan says she hopes the ICT Literacy Assessment will help prepare a new generation of students to face the challenges of the technology age head-on, confident in using it to its maximum potential.

If students need to be tested on how to solve problems using technology, so do teachers. The realization that teachers may need to improve their own technical skills may be one of the most significant outcomes of a test that demonstrates just how to integrate technology into all subject areas. Apparently not all teachers know how to do that.

Says Egan: “When we demonstrate this test at the high school level, we hear kids saying they want their teachers to take the test too.”