Sunday, April 30, 2017

I Keep Going To College Board Presentations: SAT/PSAT Status Report Spring 2017

As a former college counselor and current representative for Method Test Prep (a company that works with over 1,000 schools nationwide to help students prepare for the SAT and ACT), I have a strong interest in changes to standardized testing for college admissions. Whenever the College Board (parent of the SAT) or ACT, Inc. plans to present its latest and greatest at a national or regional conference, you can count on me to be there. I've been to quite a few of these presentations now, and in late April attended the latest update from the College Board on how high schools and colleges can use the PSAT and SAT. It was fairly interesting, and as always I've included my thoughts along with a summary. If you missed any of the previous installments, here they are:


In September 2015 the whole college admissions industry was curious to hear what the College Board had to say about the upcoming wholesale rewrite (reinvention?) of the SAT; the session in April 2016 was marked by a capacity audience's naked hostility to the College Board representatives' fumbling attempts to explain what had been a rocky rollout of the new test. In September 2016, the most noteworthy aspect of College Board president David Coleman's presentation at the National Association for College Admission Counseling was his frequent, explicit apologies for the poor communication and execution of the rollout of the new assessment.


In late April I attended this spring's College Board presentation at the Potomac and Chesapeake Association for College Admission Counseling (PCACAC) annual conference in Williamsburg, Virginia. The official title of the session was "The New SAT: Mid Year Review Session-Doing More With Data". Note that this presentation was also offered on the same day at the annual conferences of the Southern, Texas, and Rocky Mountain ACACs, so my guess is that if you have the chance to attend a regional ACAC meeting this spring you can see a version of it for yourself. The PCACAC version featured Amy Miranda, a higher ed rep from the College Board; Josh Lubben from the University of Maryland (who was on the panel the year before); and Edrika Hall of Prince George's County Public Schools, along with Ms. Allen. I expect that if you get to see this presentation in your area it will feature a college and a high school representative from your region.

As I mentioned before, last year's version of this talk drew a crowd of over 100 that overflowed the capacity of the room in which it was held; this year's attendance was only about 30. Perhaps the diminished audience was due to other interesting sessions scheduled at the same time (several people told me they went to the one on applying to British universities, for instance), but the smaller crowd, of which about two-thirds were high school counselors or independent educational consultants, was attentive and interested.

My perception (and if anyone from the College Board is reading this, please correct me) is that they expected the mix to be closer to 50/50, or perhaps skewed more toward the college side. After all, now that they've straightened out test procedures and score reporting, the big unanswered questions (such as the ETA for an official rubric to compare new SAT scores to the ACT) seem to sit more on the college side of the process. Either way, the session devoted its first half to high schools and its second to colleges, and there was plenty of information for everyone.

The counselors in last year's audience were resentful of being referred to as "K12 teachers", and this year the speakers tried to address that head on by focusing on the tools that school districts and schools could use to analyze student performance on the SAT (as well as the PSAT, PSAT 10, and PSAT 8/9). The presentation began by discussing the successes of "Counselor Week", a week of special programming after the release of PSAT scores to educators but before students could see them. According to the presentation, there was a 165% increase in educators logging in to access test scores compared to last school year, and most of the high school part of the audience indicated that they had done so themselves.

While this topic was interesting, it rather elided the serious problems with score reporting in 2016--a year when students got scores before schools could parse them, when schools expected materials that had been discontinued, and when the College Board was uncommunicative at best and brusque and dismissive at worst when counselors had questions or complaints. This was why the upfront and frequent apologies in Coleman's speech were so appreciated. Unfortunately, whether through a corporate decision that all apologies had been made or a personal style that tends toward brusqueness, Ms. Allen did not follow Coleman's lead. In fact, she started by saying, "You should all be familiar with the K-12 portal now, correct? So I'm not showing you anything you don't already know." This was borderline confrontational, and from my seat in the back of the room I saw several people register surprise at the tone.

The K12 portion of the session focused on the data available on students and how to use it. They spent quite a lot of time discussing their "College And Career Readiness Benchmarks", a proprietary rating that purports to predict grades of C or better in college courses. These numbers are very important for the increasing number of states that require all high school students to take the SAT prior to graduation (it often figures into the state's rating for the high school), but they are not really relevant to college admissions. In fact, it struck me that the College Board really shouldn't keep referring to the Benchmarks as "BMs", considering what so many people think the numbers are full of...

Ms. Hall from Prince George's County explained that her district has fully bought into the College Board offerings: they use the PSAT 8/9 for high school course placement, they use the AP Potential report to "identify and recruit" students for Advanced Placement courses, and since all 11th graders in Maryland have to prove that they are "college and career ready", the district uses the SAT to demonstrate this status. Students who do not meet the threshold are required to use Khan Academy to brush up on the content before taking the test again as seniors. Clearly this was meant to be seen as a model for other schools to adopt. My 19-year career was in independent schools, which I suspect would balk at this suggestion, but I can imagine it becoming a very popular model in public schools in the coming years. And a profitable one for the College Board...

A member of the audience asked Ms. Allen how to use this kind of data to help students, which led to a discussion of the Khan Academy practice tools. According to the presentation, 1.1 million students have linked their College Board accounts to Khan Academy. This is unquestionably a big number, but when one considers that nearly 7 million people took the SAT or PSAT in 2015-16 (and presumably the number will be similar this year), use of Khan is not as prevalent as it could be. The College Board considers an "active user" of Khan to be anyone who has completed at least one practice SAT problem (which means I am an active user, having answered at least three questions!) and claims that there are over 200,000 active users (not unique users) in any given week, and that on average they use the program for 44 minutes per week. They also claim (without explaining their source) that there has been a 10% drop in "paid-for commercial test prep resources".

If hundreds of thousands of students are actively spending roughly an hour each week preparing for the SAT, that is unquestionably a good thing, but I wonder how true it is. Working as I do for "paid-for commercial test prep", I can confidently say that we are working with more students than we did last year. I can also say that while I am a huge fan of the College Board's partnership with Khan Academy, I do see a problem with the College Board telling schools to refer kids to Khan: there is no way for a school official (counselor, administrator, teacher) to know a) whether the student is actually working with Khan, and b) whether they are actually learning anything. In fact, this is the biggest selling point for Method Test Prep, as we can offer schools robust reporting tools that let them track student progress through our program.

After the high school section was completed, the second half of the talk focused on colleges. Not a single member of the higher education contingent in the audience had logged into the College Board portal, which Ms. Miranda accepted with a rueful grin--she clearly expected a low number! Ms. Miranda was an excellent presenter, with a friendly, self-deprecating approach to the material, and I found her engaging and collegial. Most of the information she showed described how admission offices can see summary data on the students who send scores to their institutions. The college people in the audience were interested, but I saw several perusing spreadsheets on their laptops; this might be due to the date being a week before May 1--audiences at summertime ACAC meetings might be more attentive.

What stood out to me the most was that colleges can supplement the SAT data with the College Board's "Enrollment Planning Service" (for $7,300 per year) to see overlap data on the students who send them their scores. In other words, they can see a ranked list of the other colleges to which students sent their scores. This was presented to us as something colleges would receive in a .pdf report, but it seemed like there might be a way to dig deeper to see the other destinations for test scores on an individual basis. Important clarification: I've been in touch with Ms. Miranda, and she has confirmed that as far as colleges seeing overlap data for individual, specific students, "the answer is a definite NO. Enrollment Planning service and Higher Ed Portal users only get aggregate information and do not have the ability to drill down to the individual student level. As part of our commitment to protecting students' privacy, we never share individual student information unless directed by students (score sending)."

That said, the SAT and PSAT generate lots of useful data, such as reports on "top feeder high schools", "top geomarkets", and summary data on ethnicity/race and first-generation college-goer status. Currently, colleges and universities around the country pay large amounts of money to consultants to help them identify, market to, and recruit their classes. I wonder if the College Board's new focus on this kind of data will have deleterious effects on some of those consulting companies. And if the College Board intends to take some of that market for itself?

All in all, I thought that this presentation from the College Board was interesting, if far less controversial than previous ones I had attended. In a lot of ways, of course, this must be an easier presentation to put together: while last year's snafus required the College Board people to say "we will have reports and tools for you--trust us", this year they were able to show completed offerings that have been used by many of the people in the audience. I was a little surprised that they still had nothing to say about an SAT/ACT concordance, and I wish that Ms. Allen had been more conciliatory and taken more of a customer service approach, but I think that this presentation shows a College Board that is confident in its test, its tools, and its position in the market. After years of uncertainty due to changes in the test, this is a "new normal", and it will be interesting to see going forward whether the College Board opts for continued humility or becomes the 800-pound gorilla it used to be. Time will tell!