Wednesday, April 27, 2016

I Went To Another College Board Presentation About The New SAT...

Back in September, I attended the College Board's Annual Counselor Workshop at the University of Richmond. You can read about the session in detail in the post I wrote at the time, but the short version is that they gave a very detailed description of the new SAT/PSAT but seemed unprepared for the negative reactions of many of the counselors in attendance.

Last week, I was in Ocean City, Maryland to attend the annual meeting of the Potomac and Chesapeake Association for College Admission Counseling (PCACAC), the regional affiliate of the National Association for College Admission Counseling (NACAC). PCACAC is made up of members from Delaware, Maryland, Virginia, and West Virginia, as well as the District of Columbia. I was there to represent my company, Method Test Prep, but I was also able to attend the session held by the College Board about the new SAT and how it will be used by colleges and universities in the next year.

The College Board was represented by Mr. René Rosa and Ms. Cassandra Allen. They were joined by Josh Lubben, Assistant Director of Admission at the University of Maryland, Baltimore County. The plan for the hour-long presentation seemed to be an overview of the changes to the SAT, followed by information about the data the new test will provide and how secondary and higher-ed personnel can interpret it. Josh was there to describe how he and his colleagues have been preparing to receive scores from the new SAT. Unfortunately, the presentation went off the rails pretty quickly and did not go as intended. Once again it seemed that the College Board lacks the emotional intelligence to anticipate how audiences will respond to its presentations.

The session started with Mr. Rosa asking "how many of you are educators?" This seemingly innocuous icebreaker was met with a rude reply of "we're not educators!", which flustered Mr. Rosa and prompted the follow-up "I'm sorry--how many of you are K-12 teachers?" That really ticked off the room, as more than one person said "we're counselors" and "there are no teachers here!" I would dispute the idea that a person can't be both, but the hostility seemed to wrong-foot Mr. Rosa, and he had some trouble regaining the room.

Mr. Rosa finished up his section by describing the Khan Academy/College Board partnership, but I'm not sure that he did so effectively. While he touted that students can have a "personalized study plan" if they link their College Board accounts to their Khan Academy accounts, he also told us that so far over 1.1 million students have "logged on" to the Khan Academy SAT prep program, but only 300,000 have linked their College Board scores (and he had no details about how much work those students have since done in the program). He also--inadvertently--weakened his argument that "kids can use Khan Academy to supplement what they learn in school" by describing how his elementary-school-age son has used Khan's SAT prep to work on math. While I'm sure that young master Rosa is a very brilliant boy, it undercuts the idea of the SAT as a useful tool for gauging college readiness if kids can work with it before entering middle school.

The next presenter was Ms. Allen. Her presentation took up the majority of the session, and it was another example of failing to respond to the emotions of the audience. Many people in high schools have been very unhappy with College Board tardiness, especially the late release of PSAT scores and the delay of March SAT scores until after the May test (which made it impossible to use the first test to prepare for the second one). Ms. Allen blithely told us that "March scores will be out in mid-May, along with a concordance" to compare the new test to the old one. At no point did she offer even a token apology to mollify an audience that was rightly skeptical of College Board overpromising and underdelivering.

She then went on to talk about the revised College Board benchmarks, a preliminary version of which is also due in mid-May. I wrote about this four years ago, but simply put, the College Board believes that it can correlate performance on the SAT with future success in college. The old SAT generated a single benchmark score, but the new one has five. Students who achieve the benchmark are believed to have "a 75% chance of earning a C or better by the end of their first semester in college". That doesn't actually mean much when you think about it, because 90% of college grades are C or better, but I guess it's a start.

One of the attendees asked Ms. Allen how the College Board could calculate these benchmarks when no scores have been earned on the new test yet. "Easy", we were told. Apparently the College Board has a top-secret concordance based on "a pilot group of 2000 highly incentivized students": members of the class of 2015 who took the new SAT last summer, before it was publicly available. Ms. Allen did note that the benchmarks "may or may not change" (which, of course, is true of everything) and might get recalculated after the class of 2017 finishes its first year of college.

According to Ms. Allen, there are grade-level benchmarks on every test the company offers, so the (meaningless) PSAT 8/9 is a benchmark for success on the (largely meaningless) PSAT, which is a benchmark for success on the SAT, which is a benchmark for success in college.

At this point, Ms. Allen began saying things that frustrated the audience, and she did not react well. She noted that everyone has "the ability to go online and pull data yourself" and visualize it via "nice pretty pictures to print out", and that this is better than the previous system, in which the College Board sent schools a CD-ROM with all the relevant data. Unfortunately, not every counselor has full, unfettered access to this online data, a problem Ms. Allen minimized by saying that "District level access managers" grant view privileges. She then elaborated that there are actually five levels of access to the online report and indulged in her favorite binary truism: "You may or may not have full access to data. You may only see summary data, not student data". She even noted that one unnamed group ("I forget what it's called") doesn't get access at all.

A member of the audience asked how she, as a college counselor, could use this data, and Ms. Allen unwisely dismissed her question by saying, "If I was a counselor I wouldn't use this information. I would encourage my math and English teachers to use it so that they can see how their kids will do on other Common Core tests." Besides the fact that her answer was unhelpful to private schools and to schools in Virginia and West Virginia (both states eschew the Common Core), it was also a little brusque.

Ms. Allen was asked about anecdotal reports from schools which seemed to indicate that PSAT scores in Math were lower than in other sections of the new test. After some hemming and hawing, she was asked "are you saying that the math is harder?", to which she responded, "that's what I want to say without saying it." That point of view seems to jibe with surveys taken of students who took the test in March, but colleagues of mine in attendance were a little unhappy to hear that the math is perceived as more difficult. 

Ms. Allen was asked numerous questions about average scores (on the old test, each administration was curved so that the average score would be 500 in each section), but every time she answered with references to benchmarks (which are set at 510, for now). That was disappointing, and she never answered the question of how much the new test has to be curved to get to that level. As this part of the conversation spiraled out of control, Ms. Allen lost the room for good. She was asked how students would be able to use the PSAT to predict SAT scores (since the PSAT is scored on a 320-1520 scale, as opposed to the SAT's 400-1600 scale). "Any student ready for college should be able to understand ratios like 600/720 vs. 600/800, so it won't be hard", we were told. Ms. Allen's fellow panelists visibly winced at this.
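
For what it's worth, the arithmetic Ms. Allen was gesturing at is simple proportional scaling. Here is a minimal sketch, assuming a naive straight-line mapping between the redesigned section scales; this is my own illustration, not an official College Board concordance (none existed yet as of this writing):

```python
# A minimal sketch of the proportional-scaling arithmetic Ms. Allen was
# gesturing at. My own illustration, NOT an official College Board
# concordance (none had been published when this was written).

PSAT_SECTION_MAX = 760  # redesigned PSAT sections top out at 760
SAT_SECTION_MAX = 800   # redesigned SAT sections top out at 800

def naive_psat_to_sat(psat_section_score):
    """Scale a PSAT section score straight onto the SAT section scale."""
    return psat_section_score * SAT_SECTION_MAX / PSAT_SECTION_MAX

print(round(naive_psat_to_sat(600)))  # a 600 PSAT section "projects" to ~632
```

Of course, real concordances are not straight lines, which is exactly why "just understand ratios" was such an unsatisfying answer.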

Mr. Rosa returned to present his information on "Higher Education: Using and Interpreting Scores", but there was very little time remaining, so he had to scramble. He noted that colleges will now get a lot more data about each applicant: not just total and section scores, but also cross-test scores, test scores, and subscores. He noted that "It is up to colleges to decide what to do with this." When asked whether the College Board has given colleges guidance, or received hints from higher education about how these scores will fit into a "holistic" evaluation, he said no. "They are interested in receiving the scores", we were told, "but they are in 'wait and see' mode to find out if the new scores are valid for admission decisions." In other words, the College Board admits that the test more than a million kids will take next year will not be seen as fully "valid for admission decisions" next year.


Mr. Lubben described the multiple committees at UMBC that have been set up to prepare the institution for the new SAT. "The reality at this stage", he said, "is we are still trying to figure this out." They don't know what to expect from applicants in the class of 2017, and he said they are also trying to figure out how long they will accept old SAT scores (in the case of transfers or non-traditional students). Ultimately, he concluded, nothing has changed: "students should send us every score because we will use whatever is most advantageous" to the applicant.

As my colleague Evan Wessler has pointed out, colleges and universities will not be able to superscore between the old and new SAT, because the assessments are so different. But colleges will need a concordance table to compare a student's score on the old test with the same student's score on the new one (not to mention with an ACT score). This concordance table will be "available soon" and will have three columns: new SAT score, old SAT score, and ACT score, with both the new SAT and the ACT concorded to the old SAT. So a person who wants to compare the new SAT to the ACT will actually be doing two conversions: new SAT to old SAT, then old SAT to ACT.
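
To make the two-step process concrete, here is a minimal sketch of the lookup that a three-column table implies. The score pairs below are made up for illustration; the actual concordance had not been published when I wrote this.

```python
# Hypothetical score pairs, for illustration only; the real concordance
# table was still "available soon" at the time of writing.
new_sat_to_old_sat = {1300: 1280, 1400: 1390}  # column 1 -> column 2
old_sat_to_act = {1280: 27, 1390: 30}          # column 2 -> column 3

def new_sat_to_act(new_sat_score):
    """Compare a new SAT score to an ACT score via two conversions."""
    old_sat_score = new_sat_to_old_sat[new_sat_score]  # conversion 1
    return old_sat_to_act[old_sat_score]               # conversion 2

print(new_sat_to_act(1300))  # -> 27, using the made-up pairs above
```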

I still believe that the new SAT is a much better test than the old one was, and I think it is much more directly relevant to what students see in their high school careers. But I am baffled as to why the College Board's presenters are so apparently clueless about their audience, unaware of how education professionals on both sides of the admission desk use test scores, and unable to empathize with the concerns that are shared with them. Soon enough, people will forget that this rocky conversion process ever happened, but in the meantime, it is challenging all around!

Monday, April 4, 2016

Thoughts on the Reuters "Expose" About SAT Cheating-Part Two

This is the second of two posts taking a look at Reuters' recent exposé of SAT test security, especially as it relates to international student admissions. You can find the first one here.

**********************

One of my biggest problems with the articles was their constant (and inaccurate) conflation of "the test prep industry" with "Asian test prep centers", such as:
"East Asian cram schools have repeatedly...breach[ed] the SAT, and the College Board has come to see the test-prep industry as a daunting adversary....The ability to obtain inside knowledge of what's going to appear on upcoming exams is critical to the test-prep operators." 
"Basically, the only way to survive in the industry is to have a copy of the test" in advance of a sitting, said Ben Heisler, who offers test-prep and college-consulting services in South Korea. "It's like doping in the Tour de France," Heisler said. "If you don't do it, someone else will." 

"Test prep" in America is a billion-dollar industry, and it includes companies large and small, national and local. I am a part of that industry, and I am glad to say that the people with whom I work are honorable, considerate educators who take seriously our mission to improve the lives and opportunities of the students we teach. It's possible that the intense competition for places in American colleges and universities leads to some otherwise unethical behavior in overseas test prep situations, but that is not the case with leading American test prep companies. Standardized tests are predictable because they conform to a published standard, and test prep companies help students prepare. The articles' casual implication that all test prep is corrupt is unwelcome and inaccurate.

Unfortunately, it's not just Reuters that seems to confuse ethical test prep educators with amoral cheaters like Ben Heisler and his ilk; before the first administration of the new SAT on March 5, the College Board cancelled the registrations of every person who was not "degree seeking", ostensibly to prevent the kind of cheating that Reuters described. As the College Board noted in its rebuttal to the Reuters article,

"We recently refused entry to the first administration of the redesigned SAT to a number of high-risk registrants — many of whom make a living violating security protocols. In the future, we will need to do even more."

It's all well and good to include the qualifier "many of whom", but the College Board essentially accused every test prep provider (all of whom were eager to experience the first example of the test we had spent months preparing kids for, to make sure that our methods match the actual questions) of being dishonest. As my colleague Evan Wessler wrote in a scathing blog post about the frustration of being turned away from a test that the majority of test prep professionals take every year:

"If this were all really about “security”, the College Board could instruct schools to segregate non-high-school-age testers (both physically during the test and statistically in the score-scaling process) from the students taking the exam....Plainly and simply, the College Board doesn't want us seeing its test. It is afraid we’ll be able to use what we see (and the subsequent insights) to help students “defeat” the exam. This is strange, since the College Board maintains that prep courses, tutoring, and the like are ineffective at improving students’ scores. So what are the folks at the College Board afraid of? ...The move we just experienced...is a heavy-handed and capricious attempt at obscuring the test."
I agree with Evan, and if the College Board makes its exclusion of all adult test takers permanent, I don't think it will erode students' demand for test prep--it will just drive them to take the ACT instead, after first making sure to find professional help to prepare for that test.



**********************

The previous point illustrates one of the biggest weaknesses of the articles: they focus exclusively on the College Board and do not address ACT test security. The College Board and the ACT are very similar. Both administer standardized tests to high school students for the purposes of college admission and (increasingly) secondary school evaluation. Both compete for market share (and the ACT has the bigger slice of the pie). And both have been sued over the sale of student information to marketers--the SAT charges 37 cents per name to the ACT's 38 cents. But the articles focus solely on the perceived transgressions of the SAT, with the implication that this test is the most important one. Maybe this is because Reuters did not have access to leaked documents from ACT; maybe it really is related to the larger percentage of international students taking the SAT. But overall more students take the ACT than the SAT, and it is hard to escape the thought that the ACT got a free pass from Reuters' otherwise discerning gaze.

College admissions is an important topic to millions of people, and informing the public about flaws in the leading admissions tests is a vital public service. But the Reuters articles were heavy-handed attempts to make the College Board appear to be a clueless penny-pincher enabling foreign students to take over America's groves of Academe for nefarious reasons. The public deserves better.


Thoughts on the Reuters "Expose" About SAT Cheating-Part One

If you've been a reader of this blog for a while, you know that I was concerned with standardized testing even before I started working with Method Test Prep. It's no secret that I have been critical of, among other things, the role of standardized testing in college admissions, the value of standardized tests, and the overall competence of the College Board and the ACT. However, I'm even less of a fan of bad journalism. Last week I read a pair of articles from Reuters that, to me, went far beyond criticism and veered all the way into "hatchet job" territory. In response, I have compiled my own two-part piece. You can find the second one here.


On Monday, March 28, Reuters published a set of very long articles (seriously, they combine to over 8300 words) by Renee Dudley, Steve Stecklow, Alexandra Harney and Irene Jay Liu. The first one, "As SAT Was Hit By Security Breaches, College Board Went Ahead With Tests That Had Leaked", was accompanied by part two, menacingly titled "How Asian Test-Prep Companies Swiftly Exposed The Brand-New SAT". Part one of my response will focus on what seems like inconsistent editing and reasoning within the articles, and on the hints of "Yellow Peril" nativism within the Reuters pieces. The authors seem convinced that Chinese (and, to a lesser extent, South Korean) students are gaming the system to "take" places at American colleges and universities from (presumably honest) Americans.

**********************

Here is a brief synopsis: the piece begins with the tale of Xingyuan Ding, a sophomore at UCLA ("one of America's most exclusive public universities") who got a perfect 800 on the Critical Reading section of the (now defunct) old SAT as a result of illicit aid from "a Shanghai test-preparation school" that took advantage of the College Board's recycling of previously administered tests for overseas administration. According to the article, Ding says that "...he already knew the answers to about half of the reading section when he took the test in Hong Kong". The article goes on to refer to a leaked internal College Board PowerPoint that seems to acknowledge that "half of the SATs in inventory" in June 2013 were "compromised"--the College Board's term for exams whose contents have leaked outside the organization. Four of the exams had been compromised by an unnamed "Chinese website."

The article goes on to describe the rising numbers of international students in American colleges and universities (particularly from Asia) and asserts that the College Board's reliance on old tests is a major security hole that calls into question the validity of thousands of test scores. The articles are solely about the College Board and the SAT, apparently because "[t]he SAT is taken by far more foreign students applying to U.S. colleges than the ACT is." The authors are very hung up on the growth of international student enrollment, to the point where they seem to have gotten carried away. At one point they call China "the SAT's largest market by far", and then note that "29,000 [students] from China" took the test in 2013-14. But considering that almost 2,000,000 Americans took it that year, it's hard to see how China is a market that challenges the US in size.

The article attempts to lay out economic motivations for the College Board's inaction in the face of what Reuters believes to be widespread dishonesty, dishonesty that gives "foreign applicants" an "unfair advantage" and leads to "displacing Americans". Blame is also apportioned to what are variously (and perhaps carelessly) labeled "test-preparation centers" and "Asian cram schools", as well as to less-than-virtuous American teenagers who take to Reddit and College Confidential as soon as they leave a test center, violating the confidentiality agreement they signed only hours before.


**********************

As I noted above, these articles seem to substitute what Stephen Colbert would call "truthiness" for actual facts, mostly through insinuation that creates what may be a false sense of cause and effect, and through lapses in logic that expose the weakness of some of the "evidence" supplied by the authors.

One example of insinuation appears when the authors discuss the economic implications of increased test security. As noted above, the article is built on the revelation that the College Board administers previously used tests overseas, tests whose contents have, by then, leaked into the public domain; the article implies that the College Board is too cheap to spend the money to prevent this. Discussing the costs of reducing the number of international test dates (which would require the use of fewer old tests), the authors write:
"...Reuters calculated that limiting the number of sittings in China would have cost the College Board roughly $1.2 million to $1.5 million in lost revenue over the next fiscal year. The College Board declined to comment on that estimate. The organization reported net income of $99 million on revenue of $841 million in the year ended June 2014."
The College Board is officially a non-profit, though figures like the ones above make that designation somewhat laughable. But let's assume that Reuters is right, and the College Board was aware of serious security issues with its tests. It doesn't seem likely that a major corporation would risk its reputation and market share for 1.2% of its net annual income (0.14% of total revenue), especially when the solution could be as simple as commissioning more original tests.

According to the article, if the College Board were to stop reusing old exams, it might be forced to "double the current fee" to cover the cost of creating new tests. The authors note that "it costs $54.50 to take the SAT; students in East Asia also pay a $53 surcharge. That's a fraction of the thousands of dollars many Asian students pay to attend test-prep centers." So international students already pay nearly 100% extra to take the test; does that mean they would pay 200% extra for secure tests? Or would everyone face a 100% increase? The article isn't clear.
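
To show how much rides on that ambiguity, here is the arithmetic under each reading, using the article's figures; the two scenarios are my own interpretations, not anything Reuters or the College Board has specified.

```python
# Fee figures from the Reuters article.
base_fee = 54.50   # standard SAT registration fee
surcharge = 53.00  # extra fee charged to students in East Asia

print(surcharge / base_fee)  # ~0.97: the "nearly 100% extra" paid today

# Reading 1: only the base fee doubles and the surcharge stays put.
reading_1 = 2 * base_fee + surcharge   # $162.00 total in East Asia
# Reading 2: the entire East Asian price doubles.
reading_2 = 2 * (base_fee + surcharge) # $215.00 total in East Asia

print(reading_1, reading_2)
```

A $53 gap between the two readings is not trivial, so the vagueness matters.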

Another example of the article being oddly vague is in how it refers to the Educational Testing Service (ETS). Most people think of the College Board as the company behind the SAT, but the test is actually the product of the Educational Testing Service. According to its website, "ETS is a private nonprofit organization devoted to educational measurement and research, primarily through testing. We develop and administer more than 50 million achievement and admissions tests each year at more than 9,000 locations in the United States and 180 other countries." In other words, ETS devises and creates the SAT to the specifications of the College Board. However, according to the article, ETS is "the College Board's Security Contractor". I found this to be curiously less than accurate, and I still can't figure out why Reuters chose this appellation, especially since the article keeps berating the College Board, not ETS, for security lapses.

Another example of insinuation: even though the authors acknowledge that SAT content has been compromised in "South Korea, Egypt, Saudi Arabia and China", the focus, right from the beginning of the article, is on China (and, to a much lesser degree, South Korea). The authors state plainly that:

"Security breaches abroad are increasingly significant for U.S. higher education because schools are allocating more seats than ever to foreign students. About a third of the 761,000 degree seeking foreign students in America come from China, according to the Institute of International Education. Overseas students are especially attractive because most don't qualify for financial aid and thus pay full price. Chinese students spent almost $10 billion on tuition and other goods and services in America in 2014, Department of Commerce statistics show."

$10 billion is a lot of money, but American GDP is $18 trillion, so Chinese student spending amounts to less than 0.06% of it--not even a drop in the overall bucket. I think this is another example of the article implying that something is more significant than it actually seems at first blush.

As far as international enrollment goes, the always indispensable Jon Boeckenstedt has compiled information from the Integrated Postsecondary Education Data System (IPEDS) confirming that from Fall 2004 to Fall 2014, international enrollment at the bachelor's level increased 66%. To dive more deeply into this, one can hover over Reuters' accompanying graphs, which reveal that at the undergraduate level (i.e., among the students who would take the SAT), there were 124,000 Chinese-born freshmen enrolled in American colleges and universities in 2014. That's a big number, but the article doesn't put it in perspective: there were nearly 18 million college students in total, so it's not as if U.S. citizens are getting elbowed out of the classroom.


All of the information in the charts and the text seems to conflate "international" with "Chinese". As I noted above, this is misleading, and furthermore, the article seems to imply something more sinister. The authors suggest that "Evidence that some foreign applicants are displacing Americans because of an unfair advantage on the SAT could add to a backlash against standardized testing in college admissions." They quote Douglas L. Christiansen, Dean of Admissions at Vanderbilt and chair of the College Board's Board of Trustees, who says, "It is hurting all students when someone cheats on any aspect of their application. It is displacing someone else."

The implication, of course, is that "someone" is Asian and "someone else" is American. This resembles some of the divisive and inflammatory language we have seen in this year's Presidential election, and it is not something I'd expect from a leading journalistic outlet. Consider the "top hosting institutions" listed in Reuters' accompanying graphic. None of them is what you would call a "small, residential college". While they all might guarantee housing for first-year students, their admissions offices are not limited by available beds. Furthermore, while it might seem logical that every international student who enrolls reduces the number of spaces for domestic students by one, that is not actually the case. Colleges typically allocate only a small percentage of places to international students, so in reality students from other countries are competing with each other, not with Americans. Colleges and universities decide how many students to enroll based on many complex factors, which makes this analysis rather simplistic. And if anyone knows that standardized test scores are not the most important factor in admissions, it is the admissions people themselves.

Please read the rest of my thoughts on the Reuters articles here, and feel free to share your thoughts in the comments below.