Having rejected common tests, Alabama opts for new ACT exam

The new comprehensive assessment system being designed by ACT has claimed its first official defection from the common-assessment states: Alabama.

Iowa-based ACT made a point of announcing late last week that Alabama has signed on to use its new assessment system. It isn't unusual for testing companies to make these sorts of announcements; they're fond of issuing press releases when they snag big contracts. But this one is different because of the resonance it carries as two big groups of states push hard to craft new tests for the Common Core State Standards.

When ACT's partnership with Pearson on a "next generation" suite of tests came to light, ACT was serving as a subcontractor to PARCC, one of the federally funded state consortia. PARCC saw the project as sufficiently competitive with its own work to pose a conflict of interest for ACT, and ACT withdrew from the PARCC contract. Recently, Alabama announced that it was pulling out of both consortia, PARCC and Smarter Balanced, and going its own way with a testing system for federal accountability. And lo and behold, what did Alabama choose but the ACT/Pearson suite of tests.

Alabama is only one of 44 states working with one or both of the two consortia, so it's not as if this one defection endangers the future viability of the two assessment groups. (You might recall that federal rules require each testing consortium to have a minimum of 15 state members in order to be eligible for federal funding.) But if many states follow in Alabama's footsteps, PARCC and Smarter Balanced will have even more cause for worry about their futures than they do now.

Sustainability is a huge issue on the consortia's radar right now, as we've reported to you. Their federal Race to the Top money runs out in September 2014, so they have to figure out how to stay afloat through the spring of 2015, when their tests are first slated to be administered, and in the years after that, if they want to ensure that they can update the item banks, manage the technology platforms, and pursue an ambitious research agenda to validate the tests as proxies for college readiness.