Saturday, October 6, 2018

What I Saw At NACAC 2018: A Summary of "Test Optional Admission: Current Trends, Best Practices, and Future Prospects"


I attended the 2018 National Association for College Admission Counseling conference in Utah in the last week of September. This was the fourth straight year I've been able to attend, thanks to my day job with Method Test Prep, a leading ACT/SAT preparation company. I was there to work, and consequently spent most of my time in the exhibit hall meeting current and future customers, but fortunately I was able to attend a session on Thursday called "Test Optional Admission: Current Trends, Best Practices, and Future Prospects".


As a representative of a test prep company, and more significantly as a proud alum of Hampshire College, which has been test optional since it opened in 1970 (the college and I are the same age) and which four years ago became the only "test blind" four-year institution in America, I have had a deep interest in the "test optional movement" for decades. I was eager to learn more, and I wasn't disappointed: the presentation was a good one, with several thought-provoking moments.

    The session met in a very large room with many hundreds of chairs. I had left my phone back at the exhibit hall, which vexed me for two reasons: the pedometer didn't count what must have been over a thousand steps to get to the room (the Salt Lake City convention center is surprisingly large!), and I couldn't take a picture of the space. While the room was not "packed", a quick count suggested there were well over 200 attendees [note: Bob Schaeffer of FairTest says that the room had 655 chairs, and that he estimates about 350 people attended]. My suspicion was that the session would attract more people from the college/university side of things, but a closer look (and a show-of-hands poll) made it seem that there were probably more high school counselors/independent counselors in the room. Interesting!

    The presenters were:

    • Bob Schaeffer, FairTest
    • Whitney Soule, Bowdoin College
    • Andrew Palumbo, Worcester Polytechnic Institute (WPI)
    • Jon Boeckenstedt, DePaul University

    If you've been reading this blog over the years, you know that I am a big fan of Jon Boeckenstedt's views on the social value of college admission, his facility with words and data, and his contributions to the field. Last year at NACAC I wrote about a session that featured a refusal to address his questions from the floor, and he follows me on Twitter, which makes me think I'm cool (in a geeky sort of way). I've met Bob Schaeffer at previous NACAC conferences and chatted with him earlier that morning; he laughed at the irony of a Hampshire grad working for a test prep company and seemed very excited about a news article from California that he would mention during the session. I had not previously interacted with Whitney Soule or Andrew Palumbo.

    The session started on time, and Bob was the moderator as well as a speaker. He told us that handouts would be available in the conference's mobile app for us to download later, but they had yet to appear 10 days after the session. Unfortunately, my lack of a camera and my trust that the slides would be downloadable mean that some of the data I refer to below may not be wholly accurate; it represents my best effort to type notes while the speakers were presenting. Any misstatements are mine, and not the fault of the panelists. [Note: Bob Schaeffer sent me the speakers' slides after I published this post, so I have re-edited it to make sure that numbers are correctly represented.]

      Bob Schaeffer spoke first, introducing the speakers. He noted that Bowdoin is approaching the 50th anniversary of becoming the first test-optional college, that Whitney Soule and Andy Palumbo had spent their whole careers at test-optional schools, and that Jon Boeckenstedt would have "numbers and critical thinking" to offer us.

      "49 years and some months since Bowdoin started the test-optional movement", Schaeffer noted, "there are now 1,022 accredited, four-year test- optional colleges and universities, including more than half of the 'Top 100' national liberal arts colleges and many national universities." Schaeffer mentioned the annual Inside Higher Ed survey showing that "a majority of counselors want to do away with the ACT and SAT". I haven't done a blog post on that article, but I did write a mini-Twitter thread on it, if you are interested. Schaeffer concluded his remarks by noting that "more than half of all the schools in the Northeast and a growing number in the mid-Atlantic" have test-optional policies, and he quoted the 2018 Fiske Guide as saying " Students can avoid getting involved in the admissions test rat race and still have a range of colleges and universities from which to choose."

      Overall, Bob Schaeffer's tone was similar to the triumphalist one that pervades almost every publication from FairTest, nearly all of his emails to the NACAC e-list, and, I assume, most of his conversations. While he surely knows better, Schaeffer seems to conflate "test optional policy" with "students don't have to take these tests" and "the tests are dying out" (the latter two are not direct quotes, but the impression I get from him). I was a college counselor for over eight years, and I always urged students to consider test-optional destinations if I thought that their scores might not represent their capabilities as students; but in many cases admissions policies are not in line with policies for awarding aid or determining sports eligibility, or with requirements tied to other factors (such as the applicant's national origin). All of which makes the test-optional scene a very pleasing range of grey shades, and not a black-or-white proposition. [note: after Bob Schaeffer read this post, he wrote that "I always try to explain that ACT/SAT 'optional' policies mean exactly that: applicants have the option -- or choice -- about whether to have their scores considered in the admissions process. FairTest has never claimed that test-optional admissions is the same as ignoring standardized exam results in all circumstances..."]

        Whitney Soule began her remarks by noting that her 27 years in admissions have all been spent at test-optional schools. According to Soule, "multiple studies have shown that Bowdoin's academic assessment is a good predictor of first year success with or without test scores." She said that at Bowdoin, the transcript and school profiles put students into context, as do student essays and recommendation letters; if test scores are submitted, they are also used contextually.

        Soule told us that Bowdoin became a test-optional college out of "principles of access", and that over the years, typically 66%-75% of applicants do send test scores. As she said "we all know that anyone applying to Bowdoin has taken these tests multiple times", but she was clear that Bowdoin performs a "holistic review" and that test optional is an important part of that.

        "I know that counselors are skeptical that there isn't an implied weakness for kids who don't send scores," Soule said. "We don't hit a speed bump, we don't stop and think and try to imagine what the applicants' scores were, or their motives for not sending scores for us to review. We're not asking because we make affirmative decisions derived from the materials students do send. Test scores are not heavily weighted, rather they are an extra piece of information, like an interview."

        Soule's final point was that any school that uses test scores (whether they practice holistic admissions or not) "needs to understand what they are using them for, and that expectations match what the test was designed to do." This was a thought-provoking statement, and also a great segue to the next speaker. I say "thought-provoking" because in my line of work I deal with many people who are not using the tests for their designed purpose. As you may know, dozens of states require all graduates to take either the ACT or SAT (and these highly profitable contracts are the biggest area of growth for the testing companies). Consequently, many schools are evaluated on the ACT or SAT scores of their students, even the ones who do not intend to attend a four-year college. A major problem at these schools is that the administration places great value on an assessment that many students find valueless, and that tests a lot of students on material they have not had to master in their high school careers. I'm always glad to be able to help these schools help their students, but we are definitely dealing with "off-label" uses of these college admissions tests.

          Andrew Palumbo noted that, like Whitney Soule, he has only worked at test-optional schools, but that at his previous three workplaces (Union College, the Sage Colleges, and Plymouth State) he was part of the decision to switch to test-optional. "You can imagine my relief," he said, "when I got to WPI and they were already test-optional."


          Palumbo observed that WPI became the first national STEM university to go test-optional in 2007, but that "test scores were never a major factor" at WPI. He told us that there are certain pillars of the WPI Plan, including:
          • Collaborative learning
          • Project-based learning
          • Accelerated terms
          • Non-punitive grading
          and that they wanted their admissions process to reveal students who would "thrive" at WPI. He said that "standardized testing doesn't match up" with such an individualized program, and that scores don't show how well students work with others, or their independence. Palumbo said that the most common question in admissions committee deliberations is "How would this applicant perform on a project team?" and that question cannot be answered by standardized test scores.

          Palumbo told us that WPI's goal is "a more effective and equitable admissions process", and that in the decade since going test-optional, the student body grew by 41%, the number of women by 81%, under-represented students of color by 156%, and under-represented women students of color by 232%. Besides these undeniably good results, Palumbo stated that Worcester Polytechnic has continued to review all of its policies; recently the university stopped giving scholarships to National Merit students because the designation is based on standardized test scores that not all students will have or will want to submit.

          I think that is super; I was already positively disposed to WPI thanks to the experiences of some of my former students there, but I really liked hearing how thoughtfully they work to make sure that their admissions policies are aligned with their institutional mission. One last thing that stood out: Palumbo said that WPI had previously required students who applied without test scores to complete some extra steps that students who sent scores did not have to do. The university realized that this might have inadvertently been a barrier to applicants, and in the first year after dropping these requirements, the number of non-submitters "increased three-fold", as did diversity.

          Bravo! While FairTest and the mainstream media so often present "test-optional" as an elimination of standardized testing, a closer reading of schools' press releases reveals that this is usually not the case: either the policy doesn't apply to all applicants, or non-submitters face extra hurdles. I was very pleased to learn that WPI has eliminated those obstacles, and I encourage other admissions offices to consider doing the same.

            Jon Boeckenstedt's presentation took the form of a "Personal and Professional Evolution" in three phases. He began by noting that he was a "diamond in the rough kid" (high test scores and low grades; first-generation student), the sort that is often cited by the ACT and the SAT as benefitting from standardized tests. Boeckenstedt said that while he knows that "for some kids, these tests can be a path up", they aren't worth it for the bulk of the country.

            I've been a big fan of DePaul's since I paid them a spontaneous visit five years ago (and I still wear the t-shirt they gave out that day), and for some of the same reasons that I find WPI to be so admirable: a clear focus on their mission and on trying to make their admissions policies match their stated goals. Boeckenstedt said that DePaul's mission "explicitly discusses helping students who wouldn't otherwise have access to college." He said that as their admissions pool grew, they feared that "students we feel compelled to serve might be squeezed out". Further, internal studies of first-year students showed that students with good grades who study and work hard are likely to be "achievers and succeeders", and the ACT and SAT don't evaluate those attributes.


            Boeckenstedt observed that the natural way to form a class is to lop off the bottom, but doing so "quantitatively" often results in culling the "kids you want to help the most." He said that there was a sense of self-loathing in discussing applicants as "3.7, 1220" rather than as individuals, and told us that "we wanted to engage more deeply with kids".


            "Phase two" of the story is what Boeckenstedt called "The Onslaught of Stupid". He warned the audience that "if your school goes test-optional, be prepared for know nothings to air their opinions" (while showing a Chicago Sun-Times blog headline proclaiming the "dumbing down" of DePaul).

            After careful study, DePaul found that "test scores co-vary strongly with social capital, which is not what we are trying to do." Boeckenstedt stated that "so much of what matters (work ethic, 'showing up', not playing video games) can't be shown in an application at all. And test scores don't add anything to the equation. But they have a low rate of false positives, so selective schools like them." He went on to note that "many colleges have a faculty full of people who got where they are partly due to test scores". The admissions office spent a year presenting to all of their constituencies (including faculty) before settling on their test-optional plan. Boeckenstedt said that he was able to convince people by saying "I can guarantee you that we live by the idea of academic quality."

            Annually, 10-12% of applicants are non-submitters. Most students who apply to DePaul come from states that require scores, so most send them. But "scores are not a big factor. They are looked at as an interesting, extra, non-required item, equivalent to a student who sends a book of their poetry, or additional recommendation letters."

            "Phase three" consisted of "pushback from major testing organizations". Boeckenstedt (whose harvesting of public data to explain trends in higher education has been a staple of my reading for years) says that Wayne Camara (top psychometrician at the ACT) accused him of "not being able to read data--one year after I was invited to Iowa City to teach ACT engineers how to use Tableau software to present data"! He also mentioned the refusal to let him ask questions at last year's NACAC presentation on the book that was sponsored by testing companies and featured research by the College Board's and ACT's pet scholars and that seemed determined to provide evidence for why schools should not choose test-optional policies.

            Boeckenstedt concluded by telling us that "it dawned on me that people are unaware that the College Board and ACT are less accountable to government and the people than the average hot dog vendor on the streets of New York City. Their goal is to create and drive curricula to build and promote testing down to kindergarten, but they are accountable to no one but themselves."

            "Why do we take this?", he asked. "I don’t know. This doesn’t mean you shouldn’t require tests, if that right for you. But I’m actually just sorry you have to sit through this crap every year."

            As always, Jon Boeckenstedt proved himself to be an engaging presenter. What came through to me, reinforcing what Andrew Palumbo told us, was that institutions considering test-optional policies need to make sure that their choice reflects the institutional mission as well as the school's long- and short-term goals, and that all stakeholders have a chance to weigh in while being assured that the policy is being changed to make the school better, with more capable students.

            What also came through from all three presenters is that even at these prominent test-optional schools, the majority of students still submit their scores. Figuring out why they do so goes beyond the scope of this article, but it seems clear to me that testing companies can definitely co-exist with test-optional admissions policies, and that we are far from a "tipping point" that will lead to the tests' irrelevance. Whether that is good or bad is also a topic for another day, but please share your thoughts below.

              A question and answer session was next, and there were microphones set up in the aisles for people to use. While questioners were asked to identify themselves when posing their queries, not all did.

              The first question came from Chad Austin at Kean University in New Jersey. Austin said that they are considering test-optional, but they rely on the tests for merit aid, and asked the panelists for their thoughts. Boeckenstedt said that DePaul has an academic index computed in such a way that doesn't need scores; Palumbo reminded Austin that "this could be a chance to revisit your approach", as WPI did. Finally, Soule pointed out that "using test scores to award scholarships is a way to guarantee that most of your money goes to affluent kids".

              David Sheehy, of Boise High School, asked if there is any evidence showing it is worthwhile to have students send extra materials in lieu of test scores (which many test-optional policies require). The consensus was that none of the three panelists' institutions asked for anything extra.

              Tara Miller, from Austin High School, urged Bob Schaeffer and FairTest to do a better job of engaging counselors. She said that the test agencies work hard to reach out to counselors every year, and that many of her peers don't know how to explain test-optional to their communities. Schaeffer's answer was that FairTest is a "three FTE (full-time equivalent) organization. We wish we could do more to reach new counselors and students, but in the meantime, look at our website, www.fairtest.org." I have to say, I had no idea that FairTest was so small--very interesting!

              A fellow calling himself "Christopher from Honolulu" (who has listened to too much Larry King, I think) said that his school is "investigating the Mastery Transcript, and eliminating grades, but the word we are getting from colleges is that test scores will be needed to make sense of our students." He asked what information the panelists would want to see on an "unusual transcript". Whitney Soule told him that "colleges will want a method of assessing kids in context; maybe not test scores, but we will need a way to understand what their achievements are. Bowdoin had kids apply from 4,500 different schools: we know how to evaluate different environments, but we need the high schools to help us." Andrew Palumbo agreed that "your students are better off with the more context you can provide." Jon Boeckenstedt opined that "the Mastery Transcript is another opportunity to give privileged white kids more opportunity to stand out. Not that we shouldn't reduce stress on kids, but the Mastery Transcript isn't the way."

              As a graduate of a college without grades I certainly applaud any effort to reduce subjectivity under the false flag of numerical objectivity on the part of learning institutions. But as someone who worked with a wide range of students applying to a bewilderingly broad array of colleges, I know that the people deciding on our students' applications need to be able to put kids into context one way or another. I wish Christopher had told us the name of his school, because I'd love to find out what they eventually choose to do.

              The remaining questions were not very substantive. A woman who didn't identify herself asked a confusing question about yield, which the panel didn't seem to know how to answer. Someone called Dorothy, who said that she works with the State Department helping students from overseas apply to American colleges, asked if there are any "implications" when international students apply test-optional (she clearly wasn't listening to Whitney Soule). Whitney courteously chose not to point out that she'd already addressed this, and noted that Bowdoin only asks for a test of English proficiency from applicants outside the US, with which the other panelists concurred.

              The final question came from a man who also did not identify himself. He was a bit pugnacious, and pushed the panelists to say how "transparent" their policies are and whether "kids can call you to ask if they should or shouldn't submit their scores". Whitney Soule fielded this one, saying "we can't really answer this in a vacuum. The point of our focus on 'context' and 'holistic admissions' is that a 'good score' might vary depending on the applicant." This was a very polite answer. If I were on the panel, I would probably have asked whether he and his students even bother to read the admissions websites to learn their test policies; in my experience these are almost always spelled out clearly, which is pretty "transparent".


                I found this presentation to be very informative, and it made me think. Not having ever worked on the college side of admissions, I don't think that I can add anything to the discussion of whether or not a particular college/university should adopt a test-optional policy. But what leaped out at me, as someone who has spent a long time advising students and parents, is that the number of places that will consider applicants who do not send test scores is growing, and that there is not likely to be a stigma or implied weakness for those applicants. At the same time, the vast majority of students do send test scores, even to test-optional schools, so in many cases there is really no wrong answer to the question "should I send my scores?"

                As I mentioned above, Bob Schaeffer was gleeful about news that came out the day before that the University of California system was planning to evaluate whether ACT and SAT scores were actually predictive of college success (which is, of course, a major argument the companies make in favor of their products). Reading about the announcement, however, makes me somewhat skeptical that this will prove to be the death knell of standardized testing that Schaeffer seemed to be hoping for. In fact, it just seemed to indicate that the University of California system wants to study whether their policies align with their mission, which is what the panelists endorsed. [note: in his email, Bob Schaeffer clarified that "What I tried to convey is that, just as lobbying UC to adopt the SAT as an admissions requirement was a key part of the College Board's campaign to make it a national test (see Nick Lemann's history, The Big Test), so too could a UC decision to de-emphasize standardized exam scores (as a result of this study) encourage many other institutions to go test-optional."]

                Personally, I do not see any sign of the standardized test companies losing prestige or market share anytime soon. While test scores were designed to be useful only to colleges and universities, the companies' customers are the students who take the tests, not the admissions offices that interpret the scores (in the sense that it is students who pay for the tests). And with more and more states making the ACT and/or SAT a mandatory assessment, the pool of students taking the tests isn't going to shrink significantly in my lifetime. If two-thirds or more of applicants at these test-optional colleges and universities still choose to send their scores, the test companies will continue making healthy (non) profits for the foreseeable future.

                I hope you found this to be an interesting post. If you are looking for more, here is the summary of the session from Inside Higher Education that includes some useful links.

                Wednesday, September 19, 2018

                A College Counselor's Thoughts On ACT Academy

                As you may know if you've visited this blog in the past, and in the interest of full disclosure: I am a former college counselor and now work for Method Test Prep, a company that partners with over 1,000 schools, community organizations, and independent counselors to help students prepare for the ACT and SAT. My company has an online ACT/SAT program, so while I see everyone in the college admissions field as colleagues, it's reasonable for a reader to consider us to be in competition with ACT in the online space. Also, I sincerely think it's great that ACT and the College Board no longer persist in the idea that their tests can't be prepared for, and I applaud them for investing in free resources for kids. As a citizen I'm rooting for them to provide excellent tools for students, but I don't think ACT Academy is there just yet.

                When ACT rolled out ACT Academy in the spring, I speculated that it was a "Trojan Horse": less an ACT preparation tool and more an attempt to make ACT, Inc. the preferred clearinghouse for the online video resources schools use in their curricula. Colleagues of mine who looked it over were surprised to see that much of the material was not actually relevant to the ACT test, but rather related to "high school subjects" writ large.

                I just spent an hour watching a webinar from ACT talking about the new counselor/teacher/parent side of the ACT Academy, and it has definitely confirmed my suspicions. Here are my thoughts and observations:

                First, the webinar was not a strong presentation. Besides being filled with advertisements for future webinars and publications, IT WAS A PRE-RECORDED WEBINAR FROM AUGUST, though that wasn't admitted until I asked. Because ACT has chosen to show this in recorded format, I assume that they are comfortable with my quoting the presentation and that nothing in it was "misspoken". The session was led by Deb Rolfes, ACT's Senior Manager of Content Marketing--she was on live, answering questions via text, but the two presenters were pre-recorded, which led to a lot of disconnect. For instance, apparently 95% of attendees today were 9-12 teachers, but the recorded presentation kept talking about elementary school users. It also kept referring to September 6th in the future tense. I'd expect better from such a large organization that gives so many presentations.

                Second, I was surprised that we were told no fewer than five times that "ACT is a non-profit organization". As in "Did you know that ACT is a non-profit organization? We donated over 660,000 fee waivers last year." I mean, yes, this is technically true for taxation purposes, but it felt like a concerted effort to make the audience see ACT as a friendly non-profit instead of the multi-million dollar enterprise it is. And is it really a "donation" if you give away something that can only be used for your own product?

                The pre-recorded presenters were Lisa Wolf, Director of K-12 Partnerships (who seemed to be on a speakerphone and was hard to hear) and Polina Babina, Product Manager for ACT Academy. The format was that Lisa would tee up discussion topics and Polina would go in depth, though sometimes Lisa chimed in with some further details.

                Polina began by telling us that ACT Academy as initially launched was "a toe in the water" and was just ACT prep for students. But now they have a "giant catalog of over 500,000 resources (videos and games). We can track how they are used to help students. Our videos help students grasp the concept and DO NOT TEACH TO THE ACT TEST." The capitalization was Polina's, not mine, so that does seem to be confirmation of my theory.

                I asked a couple of questions via chat--here they are with the answers from Deb:

                1) Are you in licensing agreements with content providers? Is material made for you, or are you linking to pre-existing content?

                A. "We have both-we work with Open Educational Resources (OERs), we have partnerships directly with many of our content providers. It is NOT made especially for us.

                2) How many of the 500K resources are ACT Prep?

                A. "I don't know, but I'll find out and get back to you. But we have about 60 short quizzes and 2 full length exams."

                For question 1, I think that in contrast to the custom SAT-specific content created by Khan Academy through its partnership with the College Board, this offers much more material; but since it isn't focused on a particular topic or area, it can seem diffuse, and it can be hard to find the specific content one needs. Also, since it comes from an array of sources, the presentation style and methods of instruction are by definition inconsistent. For the second question, again, I think we can see that while the product is called "ACT Academy", preparing students for the ACT is not even close to its primary function.

                A new feature since early September is that teachers can create classes and assign students to them, to easily keep track of what their kids are working on and to see what they are learning. Teachers choose the grade, subject, and level they are teaching, can even pick the textbook they use, and will find pre-selected, "curated" suggestions for videos and games kids can use. I spent 12 years teaching history and I always appreciated the teacher resources that came with my textbook, so I can definitely see how a teacher might find this valuable. But I'm not sold on the execution.

                I set up a teacher account for myself, telling the program I was an ACT prep teacher, and the screenshot below shows what I got when trying to find resources for geometry. The results are not that easy to navigate, and it's not clear which of the resources might be more "geometry" than "ACT geometry".

                
                [Screenshot: ACT Academy resource search results for geometry]
                

                In my job I work with teachers all over the country, and in my experience many of the "test prep" teachers in our schools are not trained in, or experts on, the ACT and SAT. That's fine, and programs like this could help them to help their students, but I would worry about option overload leading to analysis paralysis if I had to decide which two or three videos to use for tomorrow's class.

                The program now offers a report called the "Mastery Chart" that shows student-by-student performance on every quiz question they've answered. Interestingly, it compares a student's success to her CLASS, but not to the total user base, which seems like a missed opportunity. The Mastery Chart will also suggest specific videos and games to send to individual students. Cool, but again something that could be pretty time-consuming if a teacher had to decide what individual items to send to every student in class.

                Teachers who have already assigned students to a class can click on "Voluntary Resources" to see what work kids have done on their own. Apparently this has been "extraordinarily successful" with beta testers, whatever that means. Where this could be helpful is in a self-paced scenario: a college counselor could log in to ACT Academy and look at what a student has been working on in order to prepare for a meeting to discuss testing strategies. That's cool! But it can only happen if the teacher has created the class already. It seems that students can be batch uploaded, which might simplify things, but there are definitely some prerequisites before all of our colleagues can start pulling reports.

                When I first saw ACT Academy in the spring I thought that it was odd to see links to "premium" paid content for students. Apparently "95%" of teacher resources are free, but they do have test banks from Houghton Mifflin Harcourt and Pearson  that schools can subscribe to for $10/month/teacher. Coming in October will be special "academies" for ASPIRE, WORKKEYS and other ACT products. Also there will be an "engine" that will tailor resources to every student in every subject (right now, the robot only suggests content to kids in the ACT mode). 

                Finally, Polina and Lisa tried to tout the support available to users. There is live chat (hard to tell if it is available 24 hours per day or just for the school day) from team members who are "specially trained" to help, and Deb typed in that schools can contact ACT to arrange in-person training sessions with local reps. On the other hand, one of the pre-recorded questions was "how would I implement this in my curriculum?" and Polina (remember that she's the product manager) responded with "Um, I don't know. It's just a supplement to curriculum." Lisa leaped in to say "this is a great resource for all subjects and all classes and all students at all grade levels."

                So what is it? "One ring to rule them all, and in Iowa City bind them"? Or "Um, I don't know"? I suspect the latter...at least for now. This looks like it could definitely grow into a fine technology clearinghouse for a school or district willing to trust ACT's "curation" of resources to help provide their teachers with indexed audio/visual resources across the K-12 spectrum. But as "ACT Prep software" I still think it is bewilderingly overcrowded and unwieldy. Thanks for reading, and I'd love to see your thoughts in the comments!

                Saturday, June 16, 2018

                Thoughts About The New ACT/SAT Concordance Tables-Summer 2018

                Longtime readers of this blog will know that I work with Method Test Prep, a national ACT/SAT preparation company whose mission is to level the playing field of standardized testing. As such, I have previously written in this space about developments relating to the ACT and SAT (here's just one example) and will continue to do so when developments call for it. The joint publication of concordance tables by ACT and the College Board will answer many questions about how to compare scores across the two tests, but it also raises some questions that I hope will encourage discussion among all stakeholders. My colleague Evan Wessler and I collaborated on an article about this for the Method Test Prep blog, and I've reposted it here as well. I hope that you will find it interesting and useful.

                **********

                In Accordance With Concordance

                by Evan Wessler and Ethan Lewis


                Where some see significant change, those who look deeper see something quite different. When it comes to SAT and ACT concordance, it's important to know what's at stake.
                Let's [Concor]dance!
                Because the ACT and SAT are different tests with distinct scoring scales, students' results are not easily comparable. But there needs to be a way to reconcile scores. Students who have taken both exams naturally want to know if their scores on one test are higher than their scores on the other; counselors want to be able to advise their students properly; colleges, universities, and scholarship providers want to make sure that student scores meet or exceed their cutoff criteria. To accomplish this, we need a document called a concordance. When the College Board released a new SAT in 2016, it changed the test's scoring scale (shifting from 2400 points back to 1600) and unilaterally released a concordance that converted new SAT scores to old ones, and then converted these to ACT scores. This provoked the ire of the ACT, which dismissed the new tables as invalid due to a lack of available score data from the new SAT. Eventually, the College Board committed to cooperating with the ACT to establish new (and, in the eyes of the ACT, credible) concordance tables; two years later, the new concordance is now available, and should be used by all parties interested in comparing scores across the two exams.
                The more things change...
                The whole idea behind this spat was that the SAT changed in a big way––so big, in fact, that the previously determined concordance between the exams would no longer hold. While the new SAT is genuinely a very different exam than its predecessor, the more things change, the more they stay the same. Here's a snapshot of the former and current SAT-ACT concordance tables, with old and new conversions shown. 
                [Table image: old vs. new SAT-ACT concordance conversions]
                Adapted from Guide to the 2018 ACT®/SAT® Concordance, The College Board, 2018.
                The yellow cells highlight the scores that have apparently shifted. Looks like a lot of change, doesn't it? The conclusion most organizations have drawn is that things have gotten "better" for SAT students and "worse" for ACT students. For an example that shows why, take a look at the 1340 SAT score. According to the table, this used to be equivalent to a 28 on the ACT, but is now worth a 29. Conversely, in the reverse direction, a 28 on the ACT now lands a student the equivalent of a 1320 on the SAT, whereas it used to be worth as much as a 1340. This interpretation, however, is a bit too simplistic.
                Free Samples!
                When we take a deeper look, however, we begin to see how such small differences are all but irrelevant. To understand why, we must learn more about the statistical methods used to generate these concordance tables.
                In order to produce a concordance, the College Board and ACT must collect data by sampling. That is, because it would be impractical for the organizations to use data from every single SAT and ACT examinee or test, they instead make inferences from a subset of the available data (in this case, 589,753 members of the class of 2017 who took both tests). Regardless of the statistical methods used to generate average score equivalences across exams, sampling inherently generates a certain degree of variability, known by statisticians as standard error, in the final numbers. You're probably familiar with standard error of a sampling statistic: when you see a "±" value, that's the standard error talking.
                In its concordance guide, the College Board states the standard error of the score conversion values as follows:
                When using the SAT Total and ACT Composite concordance table to estimate a student’s proximal ACT Composite score from their SAT Total score, the estimates in the table have a standard error of approximately ± 2.26 (2) ACT Composite score points on its 1–36 point scale. When using this table to estimate a student’s proximal SAT Total score from their ACT Composite score, the estimates have a standard error of approximately ± 79.57 (80) SAT Total score points on its 400–1600 point scale. (The emphasis is my own.)
                Let's return to the example scores we used before to demonstrate how things supposedly got "better" for SAT takers and "worse" for ACT takers. Using the table alone, we might conclude that an SAT score of 1340 used to concord to a 28 on the ACT, but now concords to a 29. But the 29 in this table is not really a 29: it's 29 ± 2. Because of the way standard error is calculated, the practical interpretation of the measurement plus-or-minus the standard error is this: we are 68% confident that a score of 1340 on the SAT concords to an ACT score between 27 and 31. Notice how this range comfortably includes the 28 that the 1340 used to "equal". It doesn't take long to see that, when extended to all of the other values in the table, the standard error erases the apparent changes in the tables, placing them well within the ranges of confidence produced by the sampling method.
                The long and short of it is this: any sampling method used to generate concordance produces not "exact" numbers, but instead ranges within an acceptable degree of confidence, or certainty. Thus, the concordance table alone does not tell the whole story. When standard error of the numbers in this table is taken into account, we reach a simple conclusion: the concordance table hasn't really changed, and things have not gotten markedly "better" or "worse" for either SAT or ACT takers. 
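                To make the plus-or-minus reasoning concrete, here is a minimal sketch (mine, not the College Board's or ACT's) of the confidence band around a concorded score; the rounded standard error of 2 ACT points comes from the passage quoted above, and the act_band helper is just an illustrative name:

```python
# Minimal sketch of the "29 is really 29 +/- 2" reasoning above.
# The rounded standard error (2 ACT points) comes from the quoted guide;
# act_band() is an illustrative helper, not an official API.

SAT_TO_ACT_STANDARD_ERROR = 2  # ~2.26 ACT points, rounded to 2

def act_band(concorded_act, se=SAT_TO_ACT_STANDARD_ERROR):
    """Return the roughly 68% confidence band (+/- 1 standard error) for a concorded ACT score."""
    low = max(1, concorded_act - se)
    high = min(36, concorded_act + se)
    return list(range(low, high + 1))

# The new table concords SAT 1340 to ACT 29; the old table said 28.
band = act_band(29)
print(band)        # [27, 28, 29, 30, 31]
print(28 in band)  # True: the "old" equivalent sits comfortably inside the band
```

                Run the same check on the other values in the table and you'll see the same pattern: the old and new conversions sit within one standard error of each other.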
                So You're Saying I Have A Chance...
                Despite the mathematical fact that scores on the two tests are essentially the same (in relation to each other) as they were before the new concordance tables were published, the story doesn't end there. Many colleges and universities publish score thresholds for scholarships based on the old, unadjusted concordance. Similarly, some states offer their residents reduced (or free) tuition based on test scores, and unless they speedily change their documentation, the new concordance tables might seem to advantage one test over another. Let's take a look at a few examples:
                At Louisiana State University, recipients of the Academic Scholars Award get $15,500 per year based on an ACT score of 30-32 or an SAT score of 1330-1430 and a cumulative 3.0 GPA. With the new concordance, 30-32 ACT concords to 1360-1440. So, well-meaning advisors might tell students that they should take the SAT because they can score lower (by getting a 1330, which concords to a 29) and still be awarded the scholarship. 
                At Liberty University, students can get into the Honors Program with a 28 ACT or a 1330 SAT. Since the new concordance equates a 1330 to a 29 ACT, if Liberty doesn't change its documentation, students might conclude that it would be wiser to take the ACT and shoot for a 28.
                At the University of Arizona, the "Wildcat Excellence" award criteria are on a sliding scale based on ACT or SAT scores and high school GPA. As you can see in the table below, a small difference in test scores can be worth a lot of money.
                [Screenshot: University of Arizona "Wildcat Excellence" award table, showing scholarship amounts by GPA and ACT/SAT score]
                For instance, a student with a 3.8 GPA and a 29 ACT stands to receive $18,000. With the new concordance, a 29 on the ACT is a 1330-1350 on the SAT. But a 1380 on the SAT now concords to a 30 on the ACT, which, based on the chart, would get our student $25,000. What should our student do? Take the ACT again and shoot for an actual 30? Take the SAT and try for a 1390? Either option might work, but for $7,000, it would make sense to do something, unless Arizona updates its table.
                Similarly, the state of Florida's Bright Futures Program is a wonderful tool for ensuring college access and rewarding students with high test scores and grades. "Florida Academic Scholars" get 100% free tuition plus a stipend for books at state universities with a weighted cumulative GPA of 3.50 and a 29 ACT or 1290 SAT.
                As we just saw, under the new concordance a 29 ACT equates to an SAT score between 1330 and 1350. So if all a student must do to be a Florida Academic Scholar is get a 1290 (which now concords to a 27 ACT), it would seem that they should eschew the ACT and pursue the SAT instead, shooting for a 1290 rather than the 1330 that a 29 ACT now represents. As we now know, the standard error makes the apparent difference mathematically insignificant, but if the state of Florida doesn't update its criteria, then there is an effective difference in "real life".
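                Here is a similarly hedged sketch of the comparison an advisor might make for the Bright Futures example; the cutoffs and the tiny SAT-to-ACT excerpt are simply the numbers quoted in this post (nothing here is an official table or tool), and the standard error caveat above still applies:

```python
# Sketch of the Bright Futures comparison above. The cutoffs and the tiny
# SAT -> ACT excerpt are the numbers quoted in this post, not an official
# table, and the +/- 2 point standard error caveat still applies.

FLORIDA_ACADEMIC_SCHOLARS = {"act": 29, "sat": 1290}

# Excerpt of the new SAT -> ACT concordance, limited to rows used in this post.
SAT_TO_ACT = {1290: 27, 1330: 29, 1340: 29, 1380: 30}

def easier_route(cutoffs, sat_to_act):
    """Compare the published SAT cutoff (converted to ACT points) with the published ACT cutoff."""
    sat_cutoff_in_act_points = sat_to_act[cutoffs["sat"]]
    if sat_cutoff_in_act_points < cutoffs["act"]:
        return "SAT cutoff concords lower: on paper, the SAT looks like the easier route"
    if sat_cutoff_in_act_points > cutoffs["act"]:
        return "ACT cutoff concords lower: on paper, the ACT looks like the easier route"
    return "The two cutoffs concord to each other"

print(easier_route(FLORIDA_ACADEMIC_SCHOLARS, SAT_TO_ACT))
# -> SAT cutoff concords lower: on paper, the SAT looks like the easier route
```

                Run the same comparison on the LSU and Liberty examples above and it flips in different directions, which is exactly why it matters whether schools and states update their published thresholds.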
                Final Thoughts
                Because the SAT and ACT generate so much stress for students and uncertainty for everyone in the college admissions process, any change in the tests can generate a disproportionate level of anxiety. The hubbub over the concordance tables is understandable, and the tables are surely something that anyone involved in the college process should understand. It's good that there is now an official, universally agreed-upon conversion between the two college admissions tests, but it is crucial that applicants, advisors, and advocates make sure that colleges, universities, and scholarship agencies have updated their score thresholds so that students can pursue the test preparation that makes the most sense for them.

                Monday, February 19, 2018

                "So You Don't Have To": A Visit to Duke University

                It's been a long time since I was able to visit a college! Last summer my wife and I bought a new house, and moving and getting settled pretty much consumed my free time for the last six months. But things are calming down, and I expect to be able to do a few more college visits in 2018.

                In mid-February I had the chance to visit Duke University. I was very impressed with the school and also with the town of Durham, which looks like a really excellent college town. And located as it is in the "Research Triangle" with close proximity to the University of North Carolina, it would be a great place for a family to do a multi-school college visit.  There's a lot to like about Duke, and while it is one of the most selective institutions in America, for students who are looking at that kind of college it would be an excellent option.


                Duke University At A Glance

                Size: Just over 6,800 undergraduates (approximately 49% women/51% men). Duke University is one of the most selective colleges in America, having accepted about 3,300 of its 34,800 applicants to fill a first-year class of 1,750, an overall acceptance rate of around 9.5%.
                Programs of Study: 53 majors and 52 minors for undergraduates, with 33 interdisciplinary certificates. Duke is a research university with graduate programs in business, law, and medicine (among others), and students interested in pursuing careers in these fields are well prepared.
                Sports: Duke has 25 NCAA Division I sports (13 women's/12 men's). Duke also has 37 club sports (which compete intercollegiately) and numerous intramural athletic options.
                Campus Life: Duke's website lists over 700 student activity organizations, including over 100 clubs and organizations on campus as well as 22 fraternities and 17 sororities. On-campus housing is guaranteed for all four years, and students are required to live on campus for the first three years.
                Costs & Aid: Tuition, room & board, and fees total just about $72,200. Parents need to fill out the FAFSA (Free Application for Federal Student Aid) and the CSS Profile. Duke has a financial aid budget of nearly $120 million and is fully "need-blind"; they guarantee to meet 100% of the need of admitted students who are American citizens or permanent residents.
                Deadlines: Duke University applicants can choose binding Early Decision, with a deadline of November 1, or Regular Decision, with a deadline of January 2. Students use the Common App or the Coalition App. The application fee is $85.
                Tests: Duke requires considerable testing: students must submit either the ACT (with optional writing section) or the SAT (with optional essay). Students who take the SAT are encouraged to take two Subject Tests as well.



                ©2018 Ethan Lewis
                Duke University traces its history back to the 19th century, when it was a Methodist school called Trinity College. But the generosity of the Duke family (founders of American Tobacco, an original stock on the Dow Jones Industrial Average) financed massive expansion. Today, Duke University boasts a 9,350-acre campus including a 7,000-acre forest, a 55-acre garden, 2 undergraduate colleges, and 9 graduate and professional schools. There is also a new campus in China.

                Undergraduate applicants must specify whether they want to be in the Trinity College of Arts and Sciences or the Pratt College of Engineering. Approximately 80% of students choose Trinity, leaving only a fifth of students in Pratt (though students can minor, or even double-major, in the other college); this makes Duke a good choice for talented students interested in engineering or computer science who seek a small, close-knit community. According to the admissions presentation, 12% of Duke students double major, 12% have a major and an interdisciplinary certificate, and fully half of all students have a major and a minor. According to my tour guide, Duke (especially Pratt) is generous with awarding Advanced Placement credit, which helps students carry such heavy loads. Overall, the most popular majors include:

                • Public Policy Studies
                • Economics
                • Biomedical Engineering
                • Psychology
                • Biology

                The admissions presentation also shared the most popular careers for Duke graduates (I wish everyone did this!) and they include: consulting, education, engineering, finance and health.

                Duke proudly touts a commitment to diversity; 25% of the class of 2021 identify as Asian, 13% as African-American and 14% as Hispanic/Latinx.  Additionally, Duke has students from over 180 countries and all 50 states; the states sending the most students to Durham last year were:

                • North Carolina
                • New York
                • Florida
                • California
                • Texas

                As you might know, North Carolina has been in the news over the last year or more due to issues involving inclusion for LGBT people. The Research Triangle area (along with Charlotte, Greensboro, and Asheville) is more liberal than the rest of the state. Along those lines, I found it noteworthy that Assistant Director of Admissions Chris Briggs (who gave an excellent presentation for the info session) immediately introduced his preferred pronouns and invited us to visit the welcome center bathrooms ("two gender specific restrooms; use whichever you feel most comfortable with"). This was in a crowded room of nearly 400 visitors, and I can imagine admissions reps at other schools being more cautious to avoid seeming too "liberal", but I interpreted it as a very warm gesture of welcome, inclusion, and community that gave me very positive feelings about Duke.

                © 2018 Ethan Lewis
                The tour only covered the West Campus (home to lovely neo-Gothic architecture). While there are quite a few residential options on West Campus, all first-years live on East Campus, which I didn't get to see (our tour guide encouraged visitors to approach students at random and ask to see their dorms, but I decided not to). After my visit the Admissions Office sent me a link to a webpage that shows all the first-year housing options. My tour guide, Liz, was very enthusiastic in her description of first-year housing; two things that stood out to me were that every dorm has a resident faculty member to add mentorship and support for the students, and that each dorm has a librarian attached to it, so students always have someone to go to for help with research. As a veteran of 19 years of teaching at boarding schools who is married to a school librarian, I think these are both excellent features. Way to go, Duke!

                © 2018 Ethan Lewis
                The West Campus is not small, but it is quite walkable, with a great deal of lighting: lampposts were everywhere, and I expect that even students on campus late at night (the libraries are open 24/7) won't have to worry about it being too dark. Liz told us that she thinks it takes 20 minutes maximum to walk from one end to the other.

                Shuttle buses run all day and most of the night (Liz said they don't run from 4-7am) to help students get from place to place (including the other campuses and downtown Durham). For the automotively inclined, all students, including first-years, can have cars on campus.

                Due to the large number of visitors, the list of things NOT shown on this tour was lengthy. We were not shown:

                • the inside of any of the academic spaces (usually these are included)
                • the inside of any of the libraries (usually these are included)
                • the inside of any dining or health facility (usually these are included)
                • the inside or outside of any dorms
                • the inside or outside of any athletic/recreational athletic facility

                 © 2018 Ethan Lewis
                I've been on a LOT of campus tours, and it's not unusual for guides to skip some of these; with hundreds of visitors on the day I was there, I can understand the wish to avoid crowding or inconveniencing students and staff. That said, I think these are all important things to see. If possible (and especially if you are traveling a great distance), you might want to contact the admissions staff prior to your visit to make sure that you can see some or all of these spaces. I did make a point of eating lunch on campus at the new Brodhead Center, which contains over a dozen really awesome restaurants; I got an amazing BBQ seitan dish at a vegan place, which was next to an Indian place, next to a Southern place, near an Italian place, and on and on. Do yourself a favor and check this place out if you are on campus--your stomach will thank you!

                As for student life, Liz told us that about 1/3 of students participate in Greek life, but she noted that pledging doesn't start until winter of the first year, so new arrivals can focus on academics. This timing also coincides with basketball season, which is a pretty big deal at Duke. I didn't get any pictures (I was pressed for time), but my visit was smack in the middle of "tenting", when students set up a 24-hour-per-day campsite for six weeks to make sure that they are at the front of the line for tickets to the annual basketball game against arch-rival UNC. The tent city is known as Krzyzewski-ville, after legendary coach Mike Krzyzewski. This was held out as a particular example of school spirit, but it seems that Duke students are extremely proud of being Duke students, and not just for basketball.

                © 2018 Ethan Lewis
                Academically, Duke is clearly a top-notch educational institution. The admissions presentation spent some time talking about an interesting program called "Duke Engage", a competitive program (students have to apply) that pays all the expenses for a student-planned educational trip with a social engagement component for a summer. Chris Briggs described it as "a fully funded chance to do something in the world; an 8-10 week summer experience to go humbly to learn and serve". Funded by the Bill and Melinda Gates Foundation (Mrs. Gates is a double alumna of Duke), this is a VERY exciting program. I read this article in the Duke magazine about it, which indicates that 80% of students who do Duke Engage "say that the experience influenced their career plans". I have no doubt that this would be a very strong reason to consider Duke for many students, especially those who are already committed to community service and engagement.

                Chris Briggs told the audience at the info session that there are "no minimums or cutoffs" and "no formulas" for admission to Duke. He said that Duke is looking for "talented, engaged, impactful, ambitious, thoughtful and diverse students". Students should take 5 academic courses per year, including 3 years of foreign language, and applicants to the Pratt College of Engineering need to take calculus, with physics "strongly recommended". We were told that students should take the most rigorous courses available to them and that they should aim for "not straight A's, but more A's than any other grade". While he was saying this, I was reading the admissions guide they hand out to visitors, which notes that "[m]ost students who apply to Duke are in the top 10% of their class." This suggests to me that a student whose academic path doesn't include the hardest courses her high school offers might have difficulty gaining admission, even if she is taking the most rigorous courses available to her.

                Due to my job with Method Test Prep, I am especially attuned to how colleges employ standardized tests in the admissions process, and Duke has definitely gone "all in" on these tests. While Chris Briggs tried to downplay the tests by saying "we know it's just one Saturday in your life", the admissions guide says that students have to submit "either the ACT with writing or the SAT with essay. We also strongly recommend that students who submit the SAT also submit two Subject Test scores of their choice." Realistically, most students applying to Duke will take the ACT and/or SAT multiple times, and two Subject Tests would require yet another test date; this means that Duke applicants will be paying a lot of extra money to the test agencies and will need to start testing no later than winter of junior year to be able to fit everything in.

                And then when you look at the actual test scores Duke receives, the picture becomes more complicated. I totally believe them when they say that "there is no minimum score requirement", but these numbers somewhat belie that:

                Trinity College of Arts and Sciences Middle 50% SAT: 1440-1570 | Middle 50% ACT: 31-35
                Pratt College of Engineering: Middle 50% SAT: 1490-1570 | Middle 50% ACT: 33-35

                In other words, about a quarter of admitted students score a 35 or 36 on the ACT; a perfect 36 is something that only about one-tenth of one percent of test takers manage, and the rate is similar for students who score in the 99th percentile on the SAT (which is true for scores over 1530). Of course, it also means that 25% of admitted students score below the middle range, but how much lower? And do those students stand out on campus? I don't think that people with average scores have "no chance" at Duke, but this is definitely a place where higher scores will really help.

                Applicants can use either the Common Application or the Coalition Application, and can choose between Early Decision and Regular Decision; the numbers make it look like there is a great advantage to applying early. Take a look at last year's figures:

                Early Decision Total Applicants: 3,503 |  Regular Decision Total Applicants: 30,985
                Early Decision Total Accepted:      864  |  Regular Decision Total Accepted:      2,423
                Early Decision % Accepted:          25%  |  Regular Decision % Accepted:               8%

                Enrolled: 859 from ED, 892 from RD

                Half the class came from the Early Decision pool. Does this mean that it's "easier" to get in by applying early? I doubt it, but it shows how very difficult it is to get into Duke through the regular admissions path. Oh, and Duke claims not to track "demonstrated interest" in admissions (though I think it's still worth it to visit such a super place).
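                For readers who like to check the arithmetic, here is a quick back-of-the-envelope sketch using only the figures above (mine, not anything from Duke's admissions office):

```python
# Back-of-the-envelope check of the Early Decision vs. Regular Decision
# figures quoted above; all numbers come from this post, not from Duke.

early = {"applied": 3503, "accepted": 864, "enrolled": 859}
regular = {"applied": 30985, "accepted": 2423, "enrolled": 892}

for label, pool in (("Early Decision", early), ("Regular Decision", regular)):
    rate = 100 * pool["accepted"] / pool["applied"]
    print(f"{label}: {rate:.1f}% accepted")  # ~24.7% ED vs. ~7.8% RD

ed_share = 100 * early["enrolled"] / (early["enrolled"] + regular["enrolled"])
print(f"Share of the enrolled class from ED: {ed_share:.1f}%")  # ~49.1%, about half
```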

                Duke is beyond generous with financial aid. Families must submit the FAFSA and the CSS Profile to allow Duke to best estimate their need. Duke is completely need-blind, and they guarantee to meet 100% of need (for American citizens and permanent residents). Families who earn less than $60,000 per year will have NO parent contribution; that said, "half of families receiving aid" earned over $100,000. Aid comes primarily in the form of grants and work-study; loans are capped at $5,000 per year. The average amount of student debt at graduation is $18,000 (over $12,000 less than the national average). Applicants are automatically considered for over 100 merit scholarships, over half of which have "a need-based component". So yes, the "sticker price" of over $72,000 per year is eye-popping, but very few people will actually pay that much to go to Duke, so excellent students of limited means should definitely consider Duke when looking at colleges.

                Duke may not be for everyone. While my tour guide Liz made a point of saying how helpful and supportive the faculty and fellow students are, my guess is that students who are not Type A, highly motivated, hard-working people might struggle at Duke. Further, while there seem to be ample support resources, people who aren't already very good students would probably fall behind. Based on Liz's stories and my reading of the campus newspaper and the alumni magazine, Duke students seem proud of how "hard" Duke is, and many people aren't looking for that in a college. But future doctors, lawyers, and businesspeople could do much worse than Duke. Factor in the charming city of Durham and the opportunities for jobs and cultural experiences in the Research Triangle (one of America's fastest-growing metro areas), and Duke looks like a very good choice.