
State Officials Vague on New Standardized Test

By Doug Page

Massachusetts students in Grades 3 through 8 will take a new standardized test this month, but it’s unclear how new the test really is.

Dubbed “MCAS 2.0” in October 2015 by Mitchell Chester, commissioner of the Massachusetts Department of Elementary and Secondary Education (DESE), the new test was proposed by Chester as a compromise between the aging MCAS and the controversial Common Core-aligned PARCC exam. When suggesting a hybrid test, Chester said it would be a combination of Common Core-based questions, like those found on the PARCC, along with those not aligned with Common Core standards. At the time, critics predicted that MCAS 2.0 would be akin to a wolf in sheep’s clothing — a backdoor way to administer the PARCC test to Massachusetts public school students without the critical backlash of adopting the PARCC outright.

“This is a political deal that was made in order to pretend that we’re not doing PARCC,” Massachusetts Teachers Association President Barbara Madeloni said following the state Board of Education’s (BOE) November 2015 decision to adopt MCAS 2.0. “We’re hiding PARCC in MCAS.”

Now, 16 months later, with the English and math test about to hit 425,000 student desks across the Commonwealth, Chester and DESE officials have refused to release a breakdown of the test stating what percentage of questions are based on Common Core standards/PARCC and what percentage are MCAS-like, refueling critics’ claims.

“The source of the majority of questions on the tests varies by subject area and grade level,” DESE spokesperson Jacqueline Reis said. “On some tests, the majority of questions are from MCAS. On some tests, the majority of questions are from PARCC. Each year, we’ll look at the pool of questions and figure out what the best questions are to include in our tests.”

“If this is just a shell game to hide PARCC within MCAS, then it will add to the brewing discontent with the accountability systems,” Madeloni said recently. “The speed at which it’s being designed suggests that they’re not formulating new questions and not entirely designing a new test. I wouldn’t be surprised if [MCAS 2.0] is the PARCC test.”

“I haven’t heard anything more about it,” said Tom Scott, executive director of the Massachusetts Association of School Superintendents, when asked if he knew how much of a hybrid test MCAS 2.0 would be.




Just prior to its November 2015 vote of approval, BOE members heard an impassioned plea for the new, hybrid exam from state Secretary of Education James Peyser: “By incorporating the best of both MCAS and PARCC, we can develop, maintain, and improve a stronger assessment system than would be possible with either test on its own.”

For nine months, baystateparent has been requesting an MCAS 2.0 breakdown from DESE, inquiring how much of the test is composed of PARCC questions and how much of MCAS questions. The department, which is in charge of all Massachusetts K-12 public schools and has the final say on the test’s design, has not provided specific percentages.

When adopting MCAS 2.0 in November 2015, the BOE didn’t put any stipulations on the ratio of PARCC to MCAS questions, leaving that open to interpretation by DESE.

The problem with PARCC

Teachers, parents, school administrators, and even elected officials across Massachusetts and the U.S. have railed against Common Core standards, and by extension the PARCC exam, over the past several years. The standards were championed by the federal government to level the playing field in the quality of education offered across the country. The goal: that a student in Alabama, for example, would have the same quality of instruction as one in Massachusetts. While the federal government cannot mandate state education, it can place stipulations on states accepting federal money for education.

In July 2010, Massachusetts was one of many states that accepted federal money (a $250 million Race to the Top grant) from the U.S. Department of Education (DOE) in exchange for adopting Common Core standards in its public schools. The Massachusetts BOE — not the state legislature or Bay State voters — was empowered with this decision, a move that didn’t sit well with critics who believe the issue should be decided by more than 11 BOE members. Common Core standards have come under fire from parents, educators, and elected officials across the nation who say education should not be standardized. The standards are too rigid, stifle creativity, and lead to “teaching to the test,” in which a teacher teaches only what will be on the annual exam, they claim.

The adoption of Common Core standards meant that the state’s existing exam (the MCAS, given annually since 1998) was technically out of date because it did not reflect the new standards. This left the state to find — or make — a new test: enter the PARCC (Partnership for Assessment of Readiness for College and Careers). The PARCC was a ready-made, Common Core-aligned standardized test, yet outcry against its potential adoption led the Massachusetts Board of Education to hold five hearings around the Commonwealth throughout spring and summer 2015. At these public forums, BOE members heard testimony from Common Core supporters and critics.

Meet MCAS 2.0

The MCAS questions were developed by Measured Progress, the Dover, N.H.-based testing company that also designs and develops MCAS 2.0, DESE spokesperson Reis said, while the PARCC questions were obtained from Pearson PLC, a multi-billion-dollar British company. The MCAS questions were reviewed by Bay State educators, experts, and DESE staff, while PARCC questions were reviewed by educators, experts, and education officers in various states around the country, Reis noted.

DESE recently signed a new, five-year, $150.8 million contract with Measured Progress to create the MCAS 2.0 test. According to PARCC’s federal tax documents, the testing consortium — comprising eight states plus the District of Columbia — made two payments to Pearson, in 2014 and 2015, totaling just over $50 million. Inquiries to PARCC officials regarding payments to Pearson went unanswered.

Chester’s preference

The commissioner has stated a preference for PARCC questions.

In December 2016, baystateparent caught up with Chester at a public forum at Harvard University’s Graduate School of Education and inquired again about the content of MCAS 2.0.

He provided emphatic support for PARCC questions, saying of the English test that “PARCC-type items … elevates the kind of critical thinking and writing skills we’re asking our students to do.”

On math, Chester said, “… PARCC has a much greater emphasis on open-ended problem solving, applying your math skills to novel and real-world situations than the MCAS did, so it elevates the expectation for what students can do.”

Chester also declined to say what ratio of MCAS to PARCC questions would appear on MCAS 2.0.

It’s hard to say whether PARCC or the older version of MCAS is a better predictor of a student being college- and career-ready — one of DESE’s objectives with the new test. Just prior to the BOE’s vote approving MCAS 2.0, the results of a study of both tests, conducted by Princeton, N.J.-based Mathematica Policy Research, were released. The study reported both exams were in a dead heat for predicting student success in college English and math.

“It’s important to know that the accountability system we have in place overvalues standardized test scores in a way that’s entirely out of balance for what the test score actually represents,” the MTA’s Madeloni said. “It’s a high-stakes test.  The score on that test [MCAS 2.0] has consequences for evaluating teachers and school districts.”

In addition, a child’s ability to think critically and their “ability to articulate themselves and enter the world with a broad view is being narrowed by a standardized test,” she said.


4 Comments

  • Monty Neill
    March 14, 2017, 6:10 pm

    Yes, MCAS and PARCC were in a virtual dead heat, but the ability of either to predict future success was very low. If both fail in that job, it is time to stop attaching high stakes for students, educators and schools on test results.

    • William tells all@Monty Neill
      March 15, 2017, 2:03 pm

Mr. Neill, with all due respect, you are criticizing the MCAS over something it was NEVER designed to assess: future performance. Rather, it is a Standards-based test designed to discern whether a student has learned what he or she is expected to have learned at various points in his or her public school education.

Granted, one can critique the formal curriculum expectations, but one cannot honestly diss a well-designed Standards-based test’s assessment of a student’s test performance.

      And as for PARCC, granted: it is supposed to ALSO make an effort to assess college or other future education potential – but such is only a portion of its test design mission statement within – again – a sound Standards-based design protocol.

Granted also, the best measure of grades is grades. Similarly, the best measure of one’s career is one’s career.

      At the same time, how would you endeavor to assess things in advance of having any direct data (e.g., a job history)?

      Or are you suggesting to just drop the whole standardized assessment process and instead basically – well – wing it per local ad hoc sort of assessment protocols?

  • William tells all
    March 15, 2017, 1:51 pm

People can moan and groan about MCAS and PARCC all they want, but so long as both are based on Standards-based methodology — a la the federal gold-standard test, the National Assessment of Educational Progress (NAEP) — as opposed to the normative-based tests traditionally used by essentially ALL other states in years past, whatever the Massachusetts Department of Elementary and Secondary Education (DESE) is using is going to prove to be a decent measure of the skill levels of MA public school students.

    Granted, no sample (such as a standardized test) is a perfect measure – but a methodologically sound measure used in combination with other measures is a perfectly reasonable way to go AND I say this as someone with years of experience successfully challenging standardized testing.

    Say what one may, reasonable measures of performance that have both internal and external validity are necessary to discern what’s working, where things are working and where they are not.

    For example, providing some sort of viable quality assurance as to what a MA high school diploma means is only reasonable to expect.

Finally, for those who decry that MCAS and PARCC encourage teaching to the test, sound research shoots down this claim. Simply put, either a child is taught AND effectively learns what she or he is supposed to learn during his or her YEARS in school and so does well on the test, or the child does not. That, and teaching a child how to think and so be better able to take a test.

    Think what one may, such is not teaching to the test. Rather, it is TEACHING to what is supposed to be taught.


    • JB@William tells all
      March 23, 2017, 8:16 pm

In my view, these tests do not measure what they purport to measure, and they certainly take time and money that should be devoted to educating the whole child. Let’s stop the psychometric babble and get back to honest, authentic teaching and learning.
