Smarter Balanced Tests—One Year Later, Same Shameful Tests

Mar 23, 2016 | No Comments

In March 2015, SR Education Associates released a scathing report on the quality of the Smarter Balanced tests for mathematics used by 17 states to measure student proficiency with respect to the Common Core State Standards for Mathematics. We found that the quality of the tests was so poor that they should have been removed from use before they were ever administered. Of course, they were not. While hundreds of thousands of parents and students, aware of the shortcomings of these tests and dismayed by the relentless testing their children were being subjected to, took the matter into their own hands and opted out of the testing, more than 90% of students in Smarter Balanced states took the test as planned.

Of course, when the results of the Smarter Balanced tests were announced in the late summer and fall of 2015, they were entirely predictable: In state after state, the majority of students failed to meet the standard their state had set. In California, for instance, where 3 million students in grades 3 to 8 and grade 11 took the tests, 38% received a score of 1 (Standard Not Met), 29% a 2 (Standard Nearly Met), 19% a 3 (Standard Met), and 14% a 4 (Standard Exceeded). When disaggregated by race/ethnicity, the results were even more troubling: Only 16% of African-American students, 21% of Hispanic students, and 22% of American Indian/Alaska Native students either met or exceeded the standard set by the state. The state also reported that “Economically Disadvantaged” students did poorly as a group (21% met or exceeded the standard) and that “English Learners” fared worse than any other group (only 11% met or exceeded the standard).

Scores on the Smarter Balanced tests were sent to parents across the country. Parents of failing students were devastated. Many states faced a public relations crisis, and a few Smarter Balanced (and PARCC) states abandoned the tests because of the political pressure the poor results created. Public pressure varied from state to state because each state set its own cut scores to define proficiency.

How much of the high failure rate, however, was due to inadequate student knowledge of mathematics, and how much was due to the poor quality of the questions asked and the convoluted, confusing test interface designed by the test makers? Unfortunately, we will never know.

The March 2015 SR Education Associates report accurately predicted this dilemma. While I have seen no studies that examine the degree to which the Smarter Balanced online tests themselves contributed to the devastating student results, several analyses of the PARCC test results substantiated our prediction that poor computer test design can contribute significantly to poor test results. (See https://www.washingtonpost.com/news/education/wp/2016/02/04/report-kids-who-took-common-core-test-online-scored-lower-than-those-who-used-paper/ and http://mobile.edweek.org/c.jsp?cid=25919761&bcid=25919761&rssid=25919751&item=http%3A%2F%2Fapi.edweek.org%2Fv1%2Few%2F%3Fuuid%3D6D674BE0-CA7F-11E5-A935-71C9B3743667&cmp=SOC-SHR-TW) And the students who are penalized most by these shoddy tests are struggling students, who have the least access to computers and are the most vulnerable in their mathematical self-esteem. Given the findings of our report on the Smarter Balanced tests, it is our firm belief that an examination of the Smarter Balanced results would mirror the findings on PARCC: A substantial factor in the failure of many students on the Smarter Balanced tests in 2015 was poor test design and implementation. The poor craftsmanship of the tests invalidates all of the results.

Has the situation gotten any better in 2016? Unfortunately, it has not. Is there any chance that the Smarter Balanced tests will get better in future years? The answer, again, is “No.”

After the 2015 tests were administered, McGraw-Hill, the developer of the Smarter Balanced tests, sold its CTB/McGraw-Hill testing division to Data Recognition Corporation and laid off most of the staff who had worked on the tests. As we stated in our report, “In the current political climate, there will not be funding available for those who could fix them to actually fix them.” McGraw-Hill, having profited handsomely from its contracts to develop the tests, bailed out before it could be held responsible for the disaster it created. In the year since we released our in-depth report, Smarter Balanced has improved its website design and rolled out a big public relations campaign to convince parents of the virtues of the test, but it has done little or nothing to improve the tests themselves. Before writing this blog post, I took several of the 2016 practice and training tests available on the Smarter Balanced site. While different problems have been pulled from the testing data bank to offer up as items on these practice tests, the new items suffer from the same terrible design as the 2015 items. It didn’t take long to come across the problem below on the high school math training test:

Never mind the ridiculous context of the problem—a paper route sustained in 2016 by delivering 8 papers per hour—and focus instead on the partially filled-in table provided by the test authors. The problem states, “Lisa and John continue to deliver newspapers at the same rate for the next 2 hours.” The question asks, “Complete the table to show how many papers each delivers during the second and third hours.” The answer is simple: During the second hour they will deliver 8 and 15 papers, respectively, and during the third hour they will deliver 8 and 15 more papers. But the first entry in the table has been filled in as 16 by the test author. Didn’t they read their own question? 16 is the cumulative number of papers Lisa will have delivered by the end of the first two hours, not the number she delivers during the second hour!
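To see the distinction the test author missed, here is a minimal worked sketch, assuming only the rates stated in the item (8 papers per hour for Lisa, 15 for John). It lists per-hour deliveries next to cumulative totals; the pre-filled 16 appears only in the cumulative column, not in the per-hour column the question actually asks students to complete.

```python
# A minimal sketch (not part of the Smarter Balanced item) using the rates
# stated in the problem: Lisa delivers 8 papers per hour, John delivers 15.
lisa_rate, john_rate = 8, 15

for hour in range(1, 4):
    per_hour = (lisa_rate, john_rate)                  # what the question asks for
    cumulative = (lisa_rate * hour, john_rate * hour)  # running totals
    print(f"Hour {hour}: per hour {per_hour}, cumulative {cumulative}")

# Hour 2 gives per hour (8, 15) and cumulative (16, 30): the 16 the test author
# pre-filled is Lisa's two-hour total, not her second-hour deliveries.
```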

Welcome to the 2016 testing season. Parents and students, prepare to opt out—again.

Last Week Tonight with John Oliver: Standardized Testing (HBO)

May 4, 2015 | No Comments

A fantastic segment from this week’s show. A must watch! Here is the link: https://www.youtube.com/watch?v=J6lyURyVz7k.

Suggestions for NCTM Regarding High-Stakes Testing and CCSSM

Apr 29, 2015 | No Comments

This week I have rerun my ad promoting my critique of the Smarter Balanced tests for mathematics in the National Council of Teachers of Mathematics (NCTM) electronic publication, Summing Up. Hopefully, friends and colleagues in the NCTM community who find my critique will also take the time to read my blog. This post and the one below it are directed largely to members of my NCTM community.

First, I think it is great that NCTM has advocated on the Hill for less testing. In a letter to the Chairman and Ranking Member of the Senate committee considering the reauthorization of the Elementary and Secondary Education Act (ESEA), NCTM President Diane Briars and Executive Director Robert Doucette wrote:

(1) Performance-based assessments should be mandatory, not optional. (2) Including multiple statewide assessments every year should be eliminated because they take too much time and are of limited value. Investing in formative assessments at the school and classroom level provides better monitoring of student progress. (3) States must demonstrate that the data collected via assessment is valid, reliable, of high quality, and comparable among states — not just among LEAs within each state.

Following up on this initiative, I think there are several important actions NCTM could take to give these positions traction in the public arena as well as move the Common Core effort forward. NCTM should:

1. Produce a position paper on high-stakes testing articulating NCTM’s views. This issue is too important for our positions to be buried in the archives of our advocacy program. We need to make our views widely known. A position paper will encourage our state and other affiliates to advocate at the state level in concert with our national organization.

2. Convene a task force to closely examine the Smarter Balanced and PARCC tests for mathematics and make independent recommendations to the organizations that are responsible for these assessments on ways they can be improved. NCTM should monitor the evolution of these tests to make sure they are improving.

3. Convene a national conference to initiate discussion on Common Core State Standards for Mathematics 2.0. Invite all stakeholders to the conference. It is past time to tap our mathematics education community’s experience with CCSSM iteration 1 and leverage our expertise on mathematics standards in order to strengthen CCSSM and propose what should change for CCSSM 2.0.

4. Encourage and solicit articles in NCTM journals, presentations at NCTM conferences, and comments on NCTM blogs to foster the conversations that would help prepare our profession for a CCSSM 2.0 conference. Nothing is gained by leaving that discussion to extremists who want to undermine the very idea of common standards.

5. Form alliances with parent organizations, unions, and other professional organizations that are hard at work pushing back against relentless and oppressive regimens of school testing.

If you have other suggestions, I’d like to hear from you. I plan to write an open letter to NCTM’s leadership advocating for these and other points in time for the summer NCTM Board meeting.

Common Core “Yes”—Smarter Balanced Tests “No”

Mar 31, 2015 | No Comments

How can I feel that the Common Core State Standards for Mathematics (CCSSM) are a step forward and yet criticize the Smarter Balanced tests as harshly as I have in the paper “The Smarter Balanced Common Core Tests for Mathematics Are Fatally Flawed and Should Not Be Used”?

The movement to push back against the domination of testing in public education includes participants with diverse educational views—from people who believe the Federal government has no role in public education to parents of special education students outraged that their children are subjected to inappropriate measures. It includes some who see the Common Core standards and the tests aligned with them as one and the same, and others, like me, who do not. I believe that endorsing or not endorsing the Common Core standards and liking or not liking the high-stakes tests associated with them are two entirely different issues. To say that well-considered standards and these poorly crafted high-stakes tests are inseparable misses the mark. I am sure some who dislike the Common Core State Standards for Mathematics will exploit my critique for their own goals—goals very different from mine. I have gone to great lengths to be clear about my position. I was invited to testify against the Common Core standards and the Smarter Balanced tests in a politically motivated hearing sponsored by a state politician with views as far from my own as one could possibly imagine. I declined to testify, but suggested that if the politician wanted to change his bill to support Common Core but ban the tests, I’d be happy to participate.

Students and teachers deserve an assessment system that promotes high-quality education and that accurately assesses what kids know. These tests do neither. If we fall into the trap of defending tests that are not defensible, we will lose our professional integrity as educators. The student walkouts and protests precipitated by high-stakes tests that we are seeing around the country are a sign that something is wrong in education and needs correcting. Parents, kids and teachers are fed up with testing as the main diet in schools. As a mathematics education professional I am with them in this movement. Sheila Cohen, President of the Connecticut Education Association, summarized the issue well in an op-ed piece for the Hartford Courant, “The driving force behind decisions affecting Connecticut public schools should be what is best for children. However, the reality is that schools have been hurt by relentless and snowballing testing that has left reason and learning behind.”

When I say, “Common Core ‘Yes,’” I do not mean that there are no problems with the Common Core State Standards for Mathematics as they were written. I am particularly sensitive to concerns that the Common Core standards in the early grades may be developmentally shaky. And had I written a detailed critique of those standards, you can be sure I would have found much to take issue with. Many educators whom I admire feel that Common Core should be abandoned. I believe differently. I think, at least in mathematics, the Common Core standards have initiated an important and useful process of examining what we teach when, and what we value in mathematics instruction. As I point out in my critique of the Smarter Balanced tests, I especially like the Standards for Mathematical Practice. For those of you who are familiar with the work of the National Council of Teachers of Mathematics (NCTM), I think that the NCTM Principles and Standards for School Mathematics, articulated in 2000, would have been a better starting place for national consensus than CCSSM. More time was taken in the development of NCTM’s standards than was taken with the rushed Common Core standards, and more people with deep subject-matter knowledge and extensive classroom experience informed their contents. NCTM published drafts of its standards, and an open period of public comment and revision led to far more consensus among stakeholders regarding their content than we had with Common Core. But here we are in 2015, and Common Core has been put in place by a majority of states. “Woulda coulda shoulda” won’t move us forward.

Diane Ravitch has an excellent post on her blog titled The Fatal Flaw of the Common Core Standards in which she argues that the Common Core are not really standards at all because: “They were written in a manner that violates the nationally and internationally recognized process for writing standards.” Ravitch points out: “There is a recognized protocol for writing standards, and the Common Core standards failed to comply with that protocol. In the United States, the principles of standard-setting have been clearly spelled out by the American National Standards Institute (ANSI).” While I agree with her, I am also a pragmatist. After participating in adult battles over school mathematics “standards” for 25 years, I think it’s time to get on with educating kids. Standards fights have diverted professional energy away from teaching for too long. The mathematics teaching community has been tortured by standards debates for so long that I fear teachers are abandoning the profession over these battles and young people are reluctant to go into teaching for fear of being caught in a politicized “standards” quagmire. Let’s accept CCSSM as a reasonable starting point for agreement on what should be taught in school and how, and let’s set up a real process—using the ANSI benchmarks that Ravitch discusses. South Korea has refined its national standards for mathematics education through seven periodic cycles of development, use, and revision, while we in the U.S. can’t even agree on a starting place for a healthy standards development process. So when I say, “Common Core ‘Yes,’” I really mean “Common Core Iteration 1 ‘Yes.’” I’m ready for Common Core 2.0!

And speaking of Common Core 2.0, friend, retired teacher, and active mathematics professional Henri Picciotto has written a thoughtful paper describing changes he’d like to see in CCSSM. Henri’s paper is a must-read: http://www.mathedpage.org/teaching/common-core. Whether or not you agree with all of Henri’s points, it is just this sort of analysis and dialog that our community needs in order to capitalize on our experience and expertise and strengthen our standards going forward.

There are other opinions on Common Core worth considering. A new professional friend recently pointed me to an excellent January 23, 2014, piece on the Washington Post website by Valerie Strauss titled: The coming Common Core meltdown. Whatever your position on Common Core, this piece will really make you think about it.

Ever since I publicly challenged the quality of the tests, I have received a trickle of messages from professional friends who think I may have done irreparable harm to the effort to improve testing in mathematics. They feel that my position may lead to the clock being turned back and a reversion to the days of multiple-choice tests and narrow curricula devoid of the things that engage students and make them think deeply. I am sure many others, friends and people who don’t know me, concur.

One friend, whom I respect highly, put it like this:

What I am really afraid of in California is that people like [name omitted] will use this to support their case that this whole exercise is a bad idea, and the result will be that we will go back to the 1997 standards and the CSTs [California Standards Tests]. As flawed as the SBAC assessments might be, and as much room as there is for improvement, they are so much better than the CSTs that I hate to see the language you used in your critique. To say that they are flawed, and can and should be improved, is great. But I did not see a recognition that what they are trying to do is so much better than what we have had, and that the intent is well worth pursuing. And in truth, the fact that there are performance tasks and constructed response items is a huge, huge improvement over the CSTs.

There is no doubt about the validity of my friend’s point: performance tasks and constructed-response items in standardized assessments are important improvements over previous generations of fill-in-the-bubble multiple-choice tests—if they are designed well. We must continue to demand that future tests be based on these types of items. After all, performance tasks and constructed responses are the basis of classroom instruction, and reflecting the norm of educational activity on standardized tests makes basic educational sense. But this argument does not mean that teachers, students, parents, or even this education professional must accept, without critical analysis, any test that claims to be an improvement.

I considered my friend’s position before I wrote my critique—and I concluded, as I stated in it: “Unfortunately, the Smarter Balanced tests are lemons. They fail to meet acceptable standards of quality and performance, especially with respect to their technology-enhanced items. They should be withdrawn from the market before they precipitate a national catastrophe.” Even in light of criticism from friends and colleagues I trust and respect, I stand by my position.

One could easily argue that my position is inconsistent: I argue that CCSSM should be improved through iterative cycles, while the Smarter Balanced tests should not be used at all. But I think there is a vast difference between the role of standards and the role of tests in education. Standards are guidelines used and mediated by professional educators—teachers, administrators, curriculum developers—who can address their shortcomings and protect students from the impact of those shortcomings. High-stakes tests administered by computers, with the intent of taking teachers and all other human intermediaries out of the equation, impact students directly. On a good day, “relentless” high-stakes testing has dubious value. The direct blows dealt to student self-efficacy, desire to learn, and conceptual understanding by shoddily crafted, punitive, high-stakes tests that aren’t even intended to inform learning rise to the level of child abuse.

I also believe that technology gives us, perhaps for the first time, useful tools for assessing students in ways compatible with good teaching and learning. The technology implemented in the Smarter Balanced tests, however, is not that technology. Nor can it be easily transformed into that technology. I say the tests are fatally flawed because the technology engines that drive them are poorly informed and poorly conceptualized. No mere tinkering will correct the flaws. The technology employed is not ready for prime time.

I do not think that pushing ahead with unwavering support for the testing juggernaut represented by Smarter Balanced is going to get us to a place where education will be better. In fact, it is impossible to achieve the goals of better teaching and deeper learning in mathematics classrooms by using poorly crafted high-stakes tests as the benchmark for success. As I state in my critique, I believe that the flaws in these tests will doom the Common Core standards—and I think this would be a step backwards for education in the United States. Better classroom instruction in mathematics is possible in the U.S. These tests are incompatible with that goal.

To say one supports the Common Core standards does not mean that one must also support these “relentless and snowballing” poor-quality high-stakes tests. I wish the tests were better. There are bright spots in some places, but there are far too many low points. When high-quality assessments were promised as part of the U.S. Education Department’s Race to the Top initiative, I had high hopes. But Smarter Balanced didn’t deliver. If Smarter Balanced wanted to play the high-stakes game, they should have gotten their tests right—before 10 million students were forced to take them. Maybe future tests can make the grade.

If the only chants are “Stop Common Core and SBAC” and “Common Core and SBAC Forever,” then my analysis is too nuanced for the debate. Perhaps, however, it will help people see that we actually have more than two choices. Improving mathematics education in the United States is a complex process. No simplistic strategy will lead to success.

Connecticut Education Association says “State Must Curb Relentless School Testing”

Mar 30, 2015 | No Comments

While my own critique of the Smarter Balanced Common Core tests for mathematics is written from the perspective of a mathematics education professional, many other education professionals with different perspectives share the belief that “relentless” high-stakes school testing has been a disaster for public education. The op-ed posted in the March 16 edition of the Hartford Courant by Connecticut Education Association President Sheila Cohen is an excellent example.

http://www.cea.org/issues/news/2015/mar/17/state-must-curb-relentless-school-testing.cfm

Why some students are refusing to take the Common Core test

Mar 23, 2015 | No Comments

The segment below aired on PBS on March 11, documenting the grassroots movement of students refusing to take the Common Core tests. The students in the video do a great job of articulating how these tests undermine their education. It is definitely worth watching.

http://www.pbs.org/newshour/bb/students-refusing-take-common-core-test/

5th Grade Class Revealed Poor Design of Smarter Balanced Tests—But No One Listened

Mar 17, 2015 | No Comments

Earlier this month, soon after I released my critique of the Smarter Balanced tests for mathematics, I received the following note from a fifth-grade teacher in Michigan.

Steve,

After reading your piece covering the flaws you found on the Smarter Balanced assessment, I had to reach out and thank you. I teach fifth grade. I put my students on the math test, made a video and sent it to Smarter Balanced. My students are on computers almost every day–they are tech savvy. The video is worth a watch:

The email exchange with Smarter Balanced is below. An interesting read. I recently put my kids on the fifth grade ELA practice test and was astounded at the questions my 10 year old students were asked. Questions where each possible answer was so ambiguous it was impossible to answer with any amount of certainty. As a matter of fact, a few of us teachers put the questions on social media and gave adults a shot at these fifth grade ELA questions. No one could come to a consensus. When I retrieved the answer key, we found most of us were incorrect. I forwarded my concerns to the Michigan Department of Education and they wrote me back, this week, informing me they would remove them from Michigan’s sample items (Michigan is purchasing Smarter Balanced questions to populate their state test).

Needless to say, I’m in the trenches with these assessments and I applaud your work in this area. I actually enjoy the study of assessment and data–I’m not a Common Core naysayer nor am I ‘anti-assessment’–quite the opposite. This assessment is simply unfair.

—Elizabeth L. Willoughby, 5th Grade Teacher, Clinton Township, Michigan

The exchange with Smarter Balanced chronicled Ms. Willoughby’s attempts to bring the problems with the user interface to the consortium’s attention, as well as the consortium’s response. Of particular note was the following email from March 2014:

From: Smarter Balanced Help Desk
Sent: Monday, March 03, 2014 4:40 PM
To: Elizabeth Willoughby
Subject: Smarter Balanced Assessment Consortium

Dear Ms. Willoughby:

Thank you for bringing to our attention the challenges you and your students experienced with the Grade 5 math Practice Test. As I’m sure you know, test development is an iterative process, with improvement coming through rounds of review and editing. It is just this sort of feedback from teachers that is helping us to continue to improve the assessment in advance of its operational launch next year. We are now in the process of updating the practice test, which was released last May, to reflect our latest work in test development. So your feedback is extremely timely.

One of the elements we have continued to work to improve is the student interface. We think your students will have fewer challenges when they take the upcoming Field Test because of those improvements. However, a primary purpose of the Field Test is to identify those questions that are not working as they should so that they can be fixed or eliminated. We will be on the lookout for items in which the test interface is confusing for students.

Please continue to provide us with your feedback. The goal of all the states in the Consortium is to produce assessments that are valid, reliable and fair and that provide valuable information to educators. Hearing from dedicated teachers like you will help us meet that goal.

Sincerely,
Smarter Balanced Assessment Consortium

Had anything significant changed in the Smarter Balanced interface between March 2014 and March 2015, I would not have produced my critique. And more importantly, Ms. Willoughby’s class and the millions of students taking the Smarter Balanced tests this spring might have had a better testing experience.

Steve

Special Project: Fund City Kids at a Rural School

Mar 15, 2015 | No Comments

Help Steve raise money to support students from MetWest, a small public high school in East Oakland where Steve tutors math, in attending the Woolman Semester School, a Quaker school focused on peace, justice, and environmental sustainability.