Sunday, November 4, 2018

"So You Don't Have To": A Visit to American University

In early November I had the chance to visit American University. This was actually the second time I've toured American, the previous time being in the summer of 2007 (years before I started this blog), which makes this the first in this series that sees me revisiting a school. I've been recommending American to students for over a decade--would I still feel positively about the university the second time around? Read on!



American University At A Glance

Size: Just over 8,100 undergraduates (approximately 35% men / 65% women), with an additional 5,700+ graduate and law students also on campus. American is a very selective institution, having admitted about 5,500 of 18,700 applicants for a 29% acceptance rate. Students come from all 50 states and over 120 countries.
Programs of Study: 90 majors and minors with pre-professional programs in law and medicine; American is a liberal-arts institution that takes great advantage of its location in the nation's capital to provide research, internship, and job placement opportunities for its students.
Sports: American has 14 NCAA Division I teams (8 women's / 6 men's). American also has 28 club sports (8 women's / 8 men's / 10 co-ed) which compete intercollegiately, plus numerous intramural athletic options.
Campus Life: The admissions pamphlet lists over 200 clubs and organizations. On-campus housing is guaranteed for the first two years (if application deadlines are met), with limited housing options for juniors and seniors. 70% of students study abroad (more than 7x the national average), and financial aid travels with the student.

Costs & Aid: Tuition, room & board, and fees total about $65,048 (for "average room and board"--costs could be higher or lower depending on dorm/dining options). Parents need to fill out the FAFSA (Free Application for Federal Student Aid) and the CSS Profile. American boasts of bestowing over $90 million in aid, and it guarantees to meet 100% of demonstrated need for admitted students who are American citizens or permanent residents. 30% of students receive merit aid.
Deadlines: American University applicants can choose binding Early Decision, with a deadline of November 15 (notified by December 31); binding Early Decision II, with a deadline of January 15 (notified by February 15); or Regular Decision, with a deadline of January 15 (notified by April 1). Students use the Common App or the Coalition App. The application fee is $70.
Tests: American is test-optional; students (U.S. or international) do not need to submit ACT or SAT scores. Students who apply without standardized tests are fully considered for all available merit aid. Between 15% and 18% of applicants choose not to submit test scores.


©2018 Ethan Lewis
American University traces its history to the Gilded Age, when it was chartered by an act of Congress and was the only university within 200 miles of the nation's capital that educated both men and women regardless of race. Today, American boasts a burgeoning campus in Northwest Washington D.C. (near the National Cathedral and Embassy Row).

As I mentioned above, I visited American in 2007 and was struck at the time by the beauty of the 84-acre campus (all of which is classified as an arboretum), the easy walkability, and the advantages of being a discrete suburban campus in the midst of a world-class city (American is only four miles from the White House and is located on public transportation lines). I have a pretty good memory, and I was quite (pleasantly) surprised to see a very different (and bigger) campus than I recalled; according to the university's Wikipedia page, several new academic buildings have been built in the last decade, and dormitory space has increased student housing by over 1,000 beds. When I visited this time, the center of campus was dominated by a gigantic crane, part of the construction of a new Hall of Science.

Our tour also took me through the Technology and Innovation Building, which opened earlier in 2018. Many of the students I met (only some of whom were part of the admissions program) made deprecating remarks about AU's continuous construction projects, but all were very proud of the new buildings and the resources that they provide.

As is depressingly common nowadays, the tour did not take us into the library or a classroom, though we did walk through several of the key academic buildings. The tour did show a "typical" dorm room and common area; as a veteran of 19 years' work in boarding schools this doesn't do much for me--at some level, all dormitories are pretty much the same. But it's clear that families appreciate the chance to see where their students will sleep. Other than wishing that we had been able to go into the student center, the library, and the athletic center, I thought that the tour was perfectly satisfactory. My guides, a junior and a freshman, were attentive and engaged and clearly loved their university. Tyler (the junior) was constantly being waved at by other students on our walk. His partner, Scout, was more of a beginner but did a good job. She also fielded a question about Advanced Placement by noting that AU will grant incoming students up to 30 AP credits; she entered with 19 and will be considered a junior at the end of this school year--pretty impressive!

What did not impress me was the information session. The admissions office is now located on the second floor of the Katzen Arts Center, a beautiful structure across Massachusetts Avenue from the heart of campus (which Tyler proudly told us is longer than the Washington Monument is tall)--I seem to think that site was empty space in 2007, but I could be wrong. The session was without question the most inept I've ever seen. The presenter had some trouble with the technology in the room (she said it was a brand new presentation and that she hadn't rehearsed it), which is understandable, but she also read the text of her presentation extremely quickly; she wouldn't give John Moschitta a run for his money, but she was hard for me to follow--I can only imagine that visitors whose first language isn't English would have been left behind. Also, the format of the presentation involved her synchronizing her speech with a movie playing on screen, which I think is distracting. Finally, she was prone to malapropisms ("our faculty are truly extinguished"; "legal studies, or the theology behind the law"; calling the FAFSA the FASFA) and to referring to government agencies and consulting firms by acronyms rather than their full names.

One other quibble is that she referred to how generous AU is to provide "free Metro access" to its students, noting that no other DC school does this. But it is clearly spelled out on the tuition and fees page that students pay $136/year for their Metro passes. That is a very good deal, but it certainly isn't "free". Of course this was probably just one presenter having a bad day, but if you are serious about American and/or do not speak English as your first language, it might be a good idea to request a chance to speak face to face with the admissions officer on duty during your visit, just to make sure that you leave having learned all that you need, and all that American University wants you to know.

One thing that I liked about American in 2007 and is still true is that the main quadrangle is the heart of the campus, and the academic department headquarters are all there. Of course, this means that when class periods are over, several thousand students will converge on this relatively small space, but as Tyler and Scout pointed out to me, there are 20 minutes of passing time between classes, and the daily schedule runs from 8am-10pm, so it is not likely that all 8,000 students will be walking there at once. We were told in the info session that there are no classes on Wednesday, which provides some built-in time for students to do internships--apparently the average AU student does three internships during her college career.

Though we were not taken into the library, it was pretty clear that it is a vital part of campus life. The library is open "24/5" and 9 am-9 pm on weekends. Scout mentioned that one of her first tasks this year was to write a research paper using all primary sources (which she had not had to do in high school) and that the librarians were very accessible. It seems that they even reach out to academic departments by having office hours in academic buildings, which is pretty cool. All first-year students participate in the "American University Experience" which prepares them for a "problem-based education" and, ultimately, for their senior capstone project.  These classes are capped at 19 students and are taught by the students' faculty advisors (who themselves have a load of no more than 76 advisees, which should make them more able to focus on individuals).

Students at American have until the end of their second year to declare a major, and when prospective students apply they will be admitted to every program in the university, so there is no need to rush into a commitment to be, say, a Dance major. Students can study in the following colleges:


   •   College of Arts and Sciences
   •   Kogod School of Business
   •   School of Communication
   •   School of Education
   •   School of International Service
   •   School of Public Affairs
There is also a 3-2 engineering degree program with Columbia University.


All students follow a core curriculum, including at least one writing-intensive and one math-intensive class, but there are options for students focusing in any of the areas of study. We were told in the info session and by the tour guides that many students pursue double majors, or a major and a minor (perhaps facilitated by the generous Advanced Placement credit).

I found the discussion of the admissions process to be quite interesting and was surprised by some things that were emphasized, and by some things that were not emphasized.  Let's start with some basic numbers provided on the screen in the information session:



          Admitted students (mid-50%)    Honors program admits (mid-50%)
GPA       3.47-4.0                       4.0-4.3
SAT       1240-1390                      1405-1540
ACT       27-32                          33-35


Those are pretty impressive, and the high grades and test scores can explain American University's overall 29% admissions rate.  But during the presentation, we were told that admission rates during the Early Decision and Early Decision II rounds are 80%! That is a dramatic difference that tells me that motivated students who have good grades should seriously consider applying early to American. 

On the other hand, there was something else that came up which might give one pause--the waitlist. It was hinted at in the presentation, where we were told that one of the four possible outcomes for an application is the waitlist; some additional detail was provided by the presenter, who said that in the 2017-18 application year "we took students off the waitlist" but that in the previous year they did not. The very nice admissions packet states:
"We offer a limited number of students a waitlist decision each year. If space becomes available in the fall class and you sign up to remain on the waitlist, we will notify you in May if we're able to offer you admission. But we do suggest that students deposit at another institution to secure a spot somewhere for the fall."
That certainly sounds like good advice. I tend to think, however, that if I were an applicant who got waitlisted I would take solace in the idea of "a limited number" of waitlist decisions. But according to Common Data Set information, American "offered" 4,049 students a waitlist spot in 2017-18; that number is about 20% of total applicants, which is no solace at all! Not only did American University accept or waitlist 9,547 out of 18,699 applicants (51%), but only 609 of the people who were "offered" a place on the waitlist "accepted" it (which means 85% of those waitlisted said, "no thank you"). This tells me that American must have trouble becoming its applicants' first choice (which could be why nearly every ED applicant gets admitted); further, it tells me that students who have AU ranked high on their lists should do everything they can to visit before the middle of the summer before senior year, to see if they want to apply Early Decision.
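For readers who want to check my math, here is the arithmetic behind those percentages, using the Common Data Set figures quoted above. This is just a back-of-the-envelope sketch (the variable names are mine, not the Common Data Set's):

```python
# American University Common Data Set figures for 2017-18, as quoted above
applicants = 18699             # total applicants
accepted_or_waitlisted = 9547  # admitted or offered a waitlist spot
waitlist_offers = 4049         # students "offered" a waitlist spot
waitlist_takers = 609          # students who "accepted" the waitlist spot

# Roughly one in five applicants was offered a waitlist spot
print(f"waitlisted: {waitlist_offers / applicants:.0%}")

# Just over half of all applicants were admitted or waitlisted
print(f"admitted or waitlisted: {accepted_or_waitlisted / applicants:.0%}")

# Share of waitlisted students who declined their spot
print(f"declined waitlist: {1 - waitlist_takers / waitlist_offers:.0%}")
```

Run as-is, this prints roughly 22%, 51%, and 85%--consistent, to rounding, with the figures above.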

So--what did I think after revisiting American University? I still like it! I definitely need to replace my mental image of the school with an updated one of a bustling campus full of LEED-certified new construction, but I continue to think that American could be the perfect urban college for a student who doesn't want to live in the heart of a big city. During my day on campus I saw a fraternity (about 10% of students participate in Greek life) putting together a fun fair of sorts on the quad (there was music, volleyball, and people in big plastic bubbles knocking down giant bowling pins) that had about 100 people participating. I was also stopped on a path by a former student of mine, wearing an AU sweatshirt, who told me that she loves American and is studying in the School of International Service. She was happy and healthy, which I guess is the best advertisement for any university.

A student with good grades and test scores (or just good grades who doesn't want to submit test scores) looking for a vibrant campus experience AND an urban location would do well to consider American University. I imagine that if the student is also interested in government and wants a wide array of study abroad options, AU could be a contender for the top of their list.





See the full series of my blog posts about visits to colleges and tell me what you think!

Saturday, October 6, 2018

What I Saw At NACAC 2018: A Summary of "Test Optional Admission: Current Trends, Best Practices, and Future Prospects"


I attended the 2018 National Association for College Admission Counseling conference in Utah in the last week of September. This was the fourth straight year I've been able to attend, thanks to my day job with Method Test Prep, a leading ACT/SAT preparation company. I was there to work, and consequently spent most of my time in the exhibit hall meeting current and future customers, but fortunately I was able to attend a session on Thursday called "Test Optional Admission: Current Trends, Best Practices, and Future Prospects".


As a representative of a test prep company, and more significantly as a proud alum of Hampshire College, which had been test-optional since its opening in 1970 (the college and I are the same age) and four years ago became the only "test blind" four-year institution in America, I have had a deep interest in the "test optional movement" for decades. I was very interested to learn more about it, and I wasn't disappointed. The presentation was a good one, with several thought-provoking moments.

The session met in a very large room with many hundreds of chairs. I had left my phone back at the exhibit hall, which vexed me for two reasons: the pedometer didn't count what must have been over a thousand steps to get to the room (the Salt Lake City convention center is surprisingly large!), and I couldn't take a picture of the space. While the room was not "packed", I tried a quick count and there were well over 200 attendees [note: Bob Schaeffer of FairTest says that the room had 655 chairs, and that he estimates about 350 people attended]. My suspicion was that the session would attract more people from the college/university side of things, but a closer look (and a show-of-hands poll) made it seem that there were probably more high school counselors/independent counselors in the room. Interesting!

The presenters were:

   •   Bob Schaeffer (FairTest)
   •   Whitney Soule (Bowdoin College)
   •   Andrew Palumbo (Worcester Polytechnic Institute)
   •   Jon Boeckenstedt (DePaul University)
If you've been reading this blog over the years, you know that I am a big fan of Jon Boeckenstedt's views on the social value of college admission, his facility with words and data, and his contributions to the field. Last year at NACAC I wrote about a session that featured a refusal to address his questions from the floor, and he follows me on Twitter, which makes me think I'm cool (in a geeky sort of way). I've met Bob Schaeffer at previous NACAC conferences and had chatted with him earlier that morning, during which he laughed at the irony of a Hampshire grad working for a test prep company and seemed very excited about a news article from California that he would mention during the session. I have not previously interacted with Whitney Soule or Andrew Palumbo.

The session started on time, and Bob was the moderator as well as a speaker. He told us that handouts would be available in the conference's mobile app for us to download later, but they have yet to appear 10 days after the session. Unfortunately, my lack of a camera and my trust that the slides would be downloadable mean that some of the data I refer to below may not be wholly accurate, but it represents my best efforts to type notes while the speakers were presenting. Any misstatements are mine, and not the fault of the panelists. [Note: Bob Schaeffer sent me the speakers' slides after I published this post, so I have re-edited it to make sure that numbers are correctly represented.]

Bob Schaeffer spoke first, introducing the speakers. He noted that Bowdoin is celebrating the 50th anniversary of becoming the first test-optional college, that Whitney Soule and Andy Palumbo had spent their whole careers at test-optional schools, and that Jon Boeckenstedt would have "numbers and critical thinking" to offer us.

"49 years and some months since Bowdoin started the test-optional movement", Schaeffer noted, "there are now 1,022 accredited, four-year test-optional colleges and universities, including more than half of the 'Top 100' national liberal arts colleges and many national universities." Schaeffer mentioned the annual Inside Higher Ed survey showing that "a majority of counselors want to do away with the ACT and SAT". I haven't done a blog post on that article, but I did write a mini-Twitter thread on it, if you are interested. Schaeffer concluded his remarks by noting that "more than half of all the schools in the Northeast and a growing number in the mid-Atlantic" have test-optional policies, and he quoted the 2018 Fiske Guide as saying "Students can avoid getting involved in the admissions test rat race and still have a range of colleges and universities from which to choose."

Overall, Bob Schaeffer's tone was similar to the triumphalist one that pervades almost every publication from FairTest, nearly all of his emails to the NACAC e-list, and, I assume, most of his conversations. While he surely knows better, Schaeffer seems to conflate "test optional policy" with "students don't have to take these tests" and "the tests are dying out" (the latter two are not direct quotes, but the impression I get from him). I was a college counselor for over eight years and I always urged students to consider test-optional destinations if I thought that their scores might not represent their capabilities as students, but in many cases admissions policies are not in line with policies for awarding aid, or for sports eligibility, or other factors (such as the national origin of the applicant). All of which makes the test-optional scene a very pleasing range of grey shades, and not a black or white proposition. [note: after Bob Schaeffer read this post, he wrote that "I always try to explain that ACT/SAT 'optional' policies mean exactly that: applicants have the option -- or choice -- about whether to have their scores considered in the admissions process. FairTest has never claimed that test-optional admissions is the same as ignoring standardized exam results in all circumstances..."]

Whitney Soule began her remarks by noting that all 27 of her years in admissions have been spent at test-optional schools. According to Soule, "multiple studies have shown that Bowdoin's academic assessment is a good predictor of first year success with or without test scores." She said that at Bowdoin, the transcript and school profiles put students into context, as do student essays and recommendation letters; if test scores are submitted, they are also used contextually.

Soule told us that Bowdoin became a test-optional college out of "principles of access", and that over the years, typically 66%-75% of applicants do send test scores. She said, "we all know that anyone applying to Bowdoin has taken these tests multiple times", but she was clear that Bowdoin performs a "holistic review" and that being test-optional is an important part of that.

"I know that counselors are skeptical that there isn't an implied weakness for kids who don't send scores," Soule said. "We don't hit a speed bump, we don't stop and think and try to imagine what the applicants' scores were, or their motives for not sending scores for us to review. We're not asking because we make affirmative decisions derived from the materials students do send. Test scores are not heavily weighted, rather they are an extra piece of information, like an interview."

Soule's final point was that any school that uses test scores (whether they practice holistic admissions or not) "needs to understand what they are using them for, and that expectations match what the test was designed to do." This was a thought-provoking statement, and also a great segue to the next speaker. I say "thought-provoking" because in my line of work, I deal with many people who are not using the tests for their designed purpose. As you may know, dozens of states require all graduates to take either the ACT or SAT (and these highly profitable contracts are the biggest area of growth for the testing companies). Consequently, many schools are evaluated on the ACT or SAT scores of their students--even the ones who do not intend to attend a four-year college. A major problem at these schools is that the administration places great value on an assessment that many students find valueless, and that tests a lot of students on material that they have not had to master in their high school careers. I'm always glad to be able to help these schools to help their students, but we are definitely dealing with "off-label" uses of these college admissions tests.

Andrew Palumbo noted that, like Whitney Soule, he has only worked at test-optional schools, but at his previous three workplaces (Union College, the Sage Colleges, and Plymouth State) he was part of the decision to switch to test-optional. "You can imagine my relief," he said, "when I got to WPI and they were already test-optional."


Palumbo observed that WPI became the first national STEM university to go test-optional in 2007, but that "test scores were never a major factor" at WPI. He told us that there are certain pillars of the WPI Plan, including:

   •   Collaborative learning
   •   Project-based learning
   •   Accelerated terms
   •   Non-punitive grading

and that they wanted their admissions process to reveal students who would "thrive" at WPI. He said that "standardized testing doesn't match up" with such an individualized program, and that scores don't show how well students work with others, or their independence. Palumbo said that the most common question in admissions committee deliberations is "How would this applicant perform on a project team?" and that question cannot be answered by standardized test scores.

Palumbo told us that WPI's goal is "a more effective and equitable admissions process", and that in the decade since going test-optional, the student body grew by 41%, the number of women by 81%, under-represented students of color by 156%, and under-represented women students of color by 232%. Besides these undeniably good results, Palumbo stated that Worcester Polytechnic has continued to review all of its policies; recently it stopped giving scholarships to National Merit students because the award is based on standardized test scores that not all students will have or will want to submit.

I think that is super; I was already positively disposed to WPI thanks to the experiences of some of my former students there, but I really liked hearing how thoughtfully they work to make sure that their admissions policies are aligned with their institutional mission. One last thing that stood out was that Palumbo said that WPI had previously required students who applied without test scores to complete some other steps that students who sent scores did not have to do. The university realized that this might have inadvertently been a barrier to applicants, and in the first year after dropping these requirements, the number of "non-submitters increased three-fold", as did diversity.

Bravo! While FairTest and mainstream media so often present "test-optional" as an elimination of standardized testing, closer reading of the press releases from schools reveals that this is not usually the case, and that it either isn't true for all applicants, or that non-submitters have extra hurdles. I was very pleased to learn that WPI has eliminated those obstacles, and encourage other admissions offices to consider doing the same.

Jon Boeckenstedt's presentation took the form of a "Personal and Professional Evolution" in three phases. He began by noting that he was a "diamond in the rough kid" (high test scores and low grades; first-generation student), the sort that is often cited by the ACT and the SAT as benefitting from standardized tests. Boeckenstedt said that while he knows that "for some kids, these tests can be a path up", they aren't worth it for the bulk of the country.

I've been a big fan of DePaul's since I paid them a spontaneous visit five years ago (and I still wear the t-shirt they gave out that day), and for some of the same reasons that I find WPI to be so admirable: a clear focus on their mission and on trying to make their admissions policies match their stated goals. Boeckenstedt said that DePaul's mission "explicitly discusses helping students who wouldn't otherwise have access to college." He said that as their admissions pool grew, they feared that "students we feel compelled to serve might be squeezed out". Further, internal studies of first-year students showed that students with good grades who study and work hard are likely to be "achievers and succeeders", and the ACT and SAT don't evaluate those attributes.


Boeckenstedt observed that the natural way to form a class is to lop off the bottom, but doing so "quantitatively" often results in culling the "kids you want to help the most." He said that there was a sense of self-loathing in discussing applicants as "3.7, 1220", as opposed to a more individualistic way, and told us that "we wanted to engage more deeply with kids".


"Phase two" of the story is what Boeckenstedt called "The Onslaught of Stupid". He warned the audience that "if your school goes test-optional, be prepared for know-nothings to air their opinions" (while showing a Chicago Sun-Times blog headline proclaiming the "dumbing down" of DePaul).

After careful study, DePaul found that "test scores co-vary strongly with social capital, which is not what we are trying to do." Boeckenstedt stated that "so much of what matters (work ethic, 'showing up', not playing video games) can't be shown in an application at all. And test scores don't add anything to the equation. But they have a low rate of false positives, so selective schools like them." He went on to note that "many colleges have a faculty full of people who got where they are partly due to test scores". The admissions office spent a year presenting to all of their constituencies (including faculty) before settling on their test-optional plan. Boeckenstedt said that he was able to convince people by saying "I can guarantee you that we live by the idea of academic quality."

Annually, 10-12% of applicants are non-submitters. Most students who apply to DePaul come from states that require scores, so most send them. But "scores are not a big factor. They are looked at as an interesting, extra, non-required item, equivalent to a student who sends a book of their poetry, or additional recommendation letters."

"Phase three" consisted of "pushback from major testing organizations". Boeckenstedt (whose harvesting of public data to explain trends in higher education has been a staple of my reading for years) said that Wayne Camara (top psychometrician at the ACT) accused him of "not being able to read data--one year after I was invited to Iowa City to teach ACT engineers how to use Tableau software to present data"! He also mentioned the refusal to let him ask questions at last year's NACAC presentation on the book that was sponsored by testing companies, featured research by the College Board's and ACT's pet scholars, and seemed determined to provide evidence for why schools should not choose test-optional policies.

Boeckenstedt concluded by telling us that "it dawned on me that people are unaware that the College Board and ACT are less accountable to government and the people than the average hot dog vendor on the streets of New York City. Their goal is to create and drive curricula to build and promote testing down to kindergarten, but they are accountable to no one but themselves."

"Why do we take this?", he asked. "I don't know. This doesn't mean you shouldn't require tests, if that's right for you. But I'm actually just sorry you have to sit through this crap every year."

As always, Jon Boeckenstedt proved himself to be an engaging presenter. What came through to me was the point (reinforcing what Andrew Palumbo told us) that institutions considering test-optional policies need to make sure that their choice reflects the institutional mission as well as the long- and short-term goals of the school, and that all stakeholders have a chance to weigh in, while being assured that the policy is being changed to make the school better, with more capable students.

What also came through (from all three presenters) is that even at these prominent test-optional schools, the majority of students still submit their scores. Figuring out why they do so goes beyond the scope of this article, but it seems clear to me that testing companies can definitely co-exist with test-optional admissions policies, and that we are far from a "tipping point" that will lead to the tests' irrelevance. Whether that is good or bad is also a topic for another day, but please share your thoughts below.

A question and answer session was next, and there were microphones set up in the aisles for people to use. While questioners were asked to identify themselves when posing their queries, not all did.

The first question came from Chad Austin at Kean University in New Jersey. Austin said that they are considering test-optional, but they rely on the tests for merit aid, and asked the panelists for their thoughts. Boeckenstedt said that DePaul has an academic index computed in such a way that it doesn't need scores; Palumbo reminded Austin that "this could be a chance to revisit your approach", as WPI did. Finally, Soule pointed out that "using test scores to award scholarships is a way to guarantee that most of your money goes to affluent kids".

David Sheehy, of Boise High School, asked if there is any evidence showing it is worthwhile to have students send extra materials in lieu of test scores (which many test-optional policies require). The consensus was that none of the three panelists' institutions asked for anything extra.

              Tara Miller, from Austin High School, urged Bob Schaeffer and FairTest to do a better job engaging counselors. She said that the test agencies work hard to reach out to counselors every year, and that many of her peers don't know how to explain test-optional to their communities. Schaeffer's answer was that FairTest is a "three FTE (full-time equivalent) organization. We wish we could do more to reach new counselors and students, but in the meantime, look at our website, www.fairtest.org." I have to say, I had no idea that FairTest was so small--very interesting!

              A fellow calling himself "Christopher from Honolulu" (who has listened to too much Larry King, I think) said that his school is "investigating the Mastery Transcript, and eliminating grades, but the word we are getting from colleges is that test scores will be needed to make sense of our students." He asked what information the panelists would want to see on an "unusual transcript". Whitney Soule told him that "colleges will want a method of assessing kids in context; maybe not test scores, but we will need a way to understand what their achievements are. Bowdoin had kids apply from 4,500 different schools: we know how to evaluate different environments, but we need the high schools to help us." Andrew Palumbo agreed that "your students are better off with the more context you can provide." Jon Boeckenstedt opined that "the Mastery Transcript is another opportunity to give privileged white kids more opportunity to stand out. Not that we shouldn't reduce stress on kids, but the Mastery Transcript isn't the way."

              As a graduate of a college without grades, I certainly applaud any effort by learning institutions to move away from subjectivity dressed up as numerical objectivity. But as someone who worked with a wide range of students applying to a bewilderingly broad array of colleges, I know that the people deciding on our students' applications need to be able to put kids into context one way or another. I wish Christopher had told us the name of his school, because I'd love to find out what they eventually choose to do.

              The remaining questions were not very substantive. A woman who didn't identify herself asked a confusing question about yield, which the panel didn't seem to know how to answer. Someone called Dorothy, who said she works with the State Department helping students from overseas apply to American colleges, asked if there are any "implications" if international students apply test-optional (she clearly wasn't listening to Whitney Soule). Whitney courteously chose not to point out that she'd already discussed this, and noted that Bowdoin only asks for a test of English proficiency from applicants outside the US, with which the other panelists concurred.

              The final question came from a man who also did not identify himself. He was a bit pugnacious, and pushed the panelists to say how "transparent" their policies are and whether "kids can call you to ask if they should or shouldn't submit their scores". Whitney Soule fielded this one, saying "we can't really answer this in a vacuum. The point of our focus on 'context' and 'holistic admissions' is that a 'good score' might vary depending on the applicant." This was a very polite answer. If I were on the panel, I would probably have asked whether he and his students even bother to read the admissions websites to learn their test policies; in my experience these are almost always spelled out clearly, which is pretty "transparent".


                I found this presentation informative and thought-provoking. Never having worked on the college side of admissions, I don't think I can add anything to the discussion of whether a particular college or university should adopt a test-optional policy. But what leaped out at me, as someone who has spent a long time advising students and parents, is that the number of places that will consider applicants who do not send test scores is growing, and that those applicants are unlikely to face a stigma or implication of weakness. At the same time, the vast majority of students do send test scores, even to test-optional schools, so in many cases there is really no "wrong answer" to the question of "should I send my scores?"

                As I mentioned above, Bob Schaeffer was gleeful about news that came out the day before that the University of California system was planning to evaluate whether ACT and SAT scores were actually predictive of college success (which is, of course, a major argument the companies make in favor of their products). Reading about the announcement, however, makes me somewhat skeptical that this will prove to be the death knell of standardized testing that Schaeffer seemed to be hoping for. In fact, it just seemed to indicate that the University of California system wants to study whether their policies align with their mission, which is what the panelists endorsed. [note: in his email, Bob Schaeffer clarified that "What I tried to convey is that, just as lobbying UC to adopt the SAT as an admissions requirement was a key part of the College Board's campaign to make it a national test (see Nick Lemann's history, The Big Test), so too could a UC decision to de-emphasize standardized exam scores (as a result of this study) encourage many other institutions to go test-optional."]

                Personally, I do not see any sign of the standardized test companies losing prestige or market share anytime soon. While test scores were designed to be useful only to colleges and universities, the companies' customers are the students who take the tests, not the admissions offices that interpret the scores (in the sense that it is students who pay for the tests). And with more and more states making the ACT and/or SAT a mandatory assessment, the pool of students taking the tests isn't going to shrink significantly in my lifetime. If two-thirds or more of applicants at these test-optional colleges and universities still choose to send their scores, the test companies will continue making healthy (non)profits for the foreseeable future.

                I hope you found this to be an interesting post. If you are looking for more, here is the summary of the session from Inside Higher Education that includes some useful links.

                Wednesday, September 19, 2018

                A College Counselor's Thoughts On ACT Academy

                As you may know if you've visited this blog in the past, and in the interest of full disclosure, I am a former college counselor and now work for Method Test Prep, a company that partners with over 1,000 schools, community organizations and independent counselors to help students prepare for the ACT and SAT. My company has an online ACT/SAT program, so while I see everyone in the college admissions field as colleagues, it's reasonable for a reader to consider us to be in competition with ACT in the online space. Also, I sincerely think it's great that ACT and the College Board no longer persist in the idea that their tests can't be prepared for, and I applaud them for investing to provide free resources for kids. As a citizen I'm rooting for them to provide excellent tools for students, but I don't think ACT Academy is there just yet.

                When ACT rolled ACT Academy out in the spring I speculated that it was a "Trojan Horse"; less of an ACT preparation software tool and more of an attempt to make ACT, Inc. schools' preferred clearinghouse for online video resources in their curricula. Colleagues of mine who looked it over were surprised to see that much of the material was not actually relevant to the ACT test, but rather was related to "high school subjects" writ large.

                I just spent an hour watching a webinar from ACT talking about the new counselor/teacher/parent side of the ACT Academy, and it has definitely confirmed my suspicions. Here are my thoughts and observations:

                First, the webinar was not a strong presentation. Besides filling it with advertisements for future webinars and publications, IT WAS A PRE-RECORDED WEBINAR FROM AUGUST, though that wasn't admitted until I asked.  Because ACT has chosen to show this in recorded format I assume that they are comfortable with my quoting the presentation and that nothing in it is "misspoken". The session was led by  Deb Rolfes, ACT's Senior Manager of Content Marketing--she was on live answering questions via text, but the two presenters were pre-recorded, which led to a lot of disconnect. For instance, apparently 95% of attendees today were 9-12 teachers, but the recorded presentation kept talking about elementary school users.  It also kept referring to September 6th in the future tense. I'd expect better from such a large organization that gives so many presentations. 

                Secondly, I was surprised that we were told no less than five times that "ACT is a non profit organization". As in "Did you know that ACT is a non-profit organization? We donated over 660,000 fee waivers last year."  I mean, yes this is technically true for taxation purposes, but it felt like a concerted effort to make the audience see ACT as a friendly non-profit instead of the multi-million dollar enterprise it is. Because is it really a "donation" if you give away something that can only be used for your own product?

                The pre-recorded presenters were Lisa Wolf, Director of K-12 Partnerships (who seemed to be on a speakerphone and was hard to hear) and Polina Babina, Product Manager for ACT Academy. The format was that Lisa would tee up discussion topics and Polina would go in depth, though sometimes Lisa chimed in with some further details.

                Polina began by telling us that ACT Academy as initially launched was "a toe in the water" and was just ACT prep for students. But now they have a "giant catalog of over 500,000 resources (videos and games). We  can track how they are used to help students. Our videos help students grasp the concept and DO NOT TEACH TO THE ACT TEST." The capitalization was Polina's not mine, so that does seem to be confirmation of my theory. 

                I asked a couple of questions via chat--here they are with the answers from Deb:

                1) Are you in licensing agreements with content providers? Is material made for you, or are you linking to pre-existing content?

                A. "We have both--we work with Open Educational Resources (OERs), we have partnerships directly with many of our content providers. It is NOT made especially for us."

                2) How many of the 500K resources are ACT Prep?

                A. "I don't know, but I'll find out and get back to you. But we have about 60 short quizzes and 2 full length exams."

                For question 1, I think that in contrast to the custom SAT-specific content created by Khan Academy through their partnership with the College Board, this offers much more material; but since it isn't focused on a particular topic or area, it can feel diffuse, and finding the specific content one needs can be difficult. Also, since the material comes from an array of sources, the presentation style and methods of instruction are by definition inconsistent. As for the second question, again, we can see that while the product is called "ACT Academy", preparing students for the ACT is not even close to its primary function.

                A new feature since early September is that teachers can create classes and assign students to them to easily keep track of what their kids are working on and to see what they are learning. Teachers choose the grade/subject and level they are teaching and can even pick the textbook they use and will find pre-selected, "curated" suggestions for videos and games kids can use. I spent 12 years teaching history and I always appreciated the teacher resources that came with my textbook, so I can definitely see how a teacher might find this valuable. But I'm not sold on the execution.

                I set up a teacher account for myself, telling the program I was an ACT prep teacher, and the screenshot below shows what I got when searching for geometry resources. The results are not easy to navigate, and it's not clear which of the resources might be more "geometry" than "ACT geometry".

                
                [Screenshot: ACT Academy search results for geometry resources]
                

                In my job I work with teachers all over the country, and in my experience many of the "test prep" teachers in our schools are not trained in, or experts on, the ACT and SAT. That's fine, and programs like this could help them to help their students, but I would worry about option overload leading to analysis paralysis if I had to decide which two or three videos to use for tomorrow's class.

                The program now offers a report called the "Mastery Chart" that will show student by student performance on every quiz question they've answered. Interestingly, it compares a student's success to her CLASS, but not to the total user base, which seems like a missed opportunity.  The Mastery Chart will also suggest specific videos and games to send to individual students. Cool, but again something that could be pretty time consuming if a teacher had to decide what individual items to send to every student in class. 

                Teachers who have already assigned students to a class can click on "Voluntary Resources" to see what work kids have done on their own. Apparently this has been "extraordinarily successful" with beta testers, whatever that means. Where this could be helpful is in a self-paced scenario: a college counselor could log in to ACT Academy and look at what a student has been working on, to prepare for a meeting to discuss testing strategies. That's cool! But it can only happen if the teacher has created the class already. It seems that students can be batch uploaded, which might simplify things, but there are definitely some prerequisites before all of our colleagues can start pulling reports.

                When I first saw ACT Academy in the spring I thought that it was odd to see links to "premium" paid content for students. Apparently "95%" of teacher resources are free, but they do have test banks from Houghton Mifflin Harcourt and Pearson  that schools can subscribe to for $10/month/teacher. Coming in October will be special "academies" for ASPIRE, WORKKEYS and other ACT products. Also there will be an "engine" that will tailor resources to every student in every subject (right now, the robot only suggests content to kids in the ACT mode). 

                Finally, Polina and Lisa tried to tout the support available to users. There is live chat (hard to tell if it is available 24 hours per day or just for the school day) from team members who are "specially trained" to help, and Deb typed in that schools can contact ACT to arrange in person training sessions from local reps.  On the other hand, one of the pre-recorded questions was "how would I implement this in my curriculum?" and Polina  (remember that she's the project manager) responded with "Um, I don't know. It's just a supplement to curriculum." Lisa leaped in to say "this is a great resource for all subjects and all classes and all students at all grade levels."

                So what is it? "One ring to rule them all, and in Iowa City bind them"? Or "Um, I don't know"? I suspect the latter...at least for now. This looks like it could definitely grow into a fine technology clearinghouse for a school or district willing to trust ACT's "curation" of resources to provide their teachers with indexed audio/visual resources across the K-12 spectrum. But as "ACT prep software" I still think it is bewilderingly overcrowded and unwieldy. Thanks for reading, and I'd love to see your thoughts in the comments!

                Saturday, June 16, 2018

                Thoughts About The New ACT/SAT Concordance Tables-Summer 2018

                Longtime readers of this blog will know that I work with Method Test Prep, a national ACT/SAT preparation company whose mission is to level the playing field of standardized testing. As such, I have previously written in this space about developments relating to the ACT and SAT (here's just one example) and will continue to do so when developments call for it. The joint publication of concordance tables by the ACT and SAT will be a great source of answers to questions for people about how to compare scores across the two tests, but it also raises some questions that I hope will encourage discussion among all stakeholders. My colleague Evan Wessler and I collaborated on an article for the Method Test Prep blog about this and I've reposted it here as well. I hope that you will find it interesting and useful.

                **********

                In Accordance With Concordance

                by Evan Wessler and Ethan Lewis


                Where some see significant change, those who look deeper see something quite different. When it comes to SAT and ACT concordance, it's important to know what's at stake.
                Let's [Concor]dance!
                Because the ACT and SAT are different tests with distinct scoring scales, students' results are not automatically easily comparable. But there needs to be a way to reconcile scores. Students who have taken both exams naturally want to know if their scores on one test are higher than their scores on the other; counselors want to be able to advise their students properly; colleges, universities, and scholarship providers want to make sure that student scores meet or exceed their cutoff criteria. To accomplish this, we need a document called a concordance. When the College Board released a new SAT in 2016, it changed the test's scoring scale––shifting from 2400 points back to 1600 points––and unilaterally released a concordance that converted new SAT scores to old ones, and then converted these to ACT scores. This provoked the ire of the ACT, which dismissed the new tables as invalid due to a lack of available score data from the new SAT. Eventually, the College Board committed to cooperating with the ACT to establish new (and, in the eyes of the ACT, credible) concordance tables; two years later, the new concordance is now available, and should be used by all parties interested in comparing scores across the two exams.
                The more things change...
                The whole idea behind this spat was that the SAT changed in a big way––so big, in fact, that the previously determined concordance between the exams would no longer hold. While the new SAT is genuinely a very different exam than its predecessor, the more things change, the more they stay the same. Here's a snapshot of the former and current SAT-ACT concordance tables, with old and new conversions shown. 
                [Table: former and current SAT-ACT concordance values. Adapted from Guide to the 2018 ACT®/SAT® Concordance, The College Board, 2018.]
                The yellow cells highlight the scores that have apparently shifted. Looks like a lot of change, doesn't it? The conclusion most organizations have drawn is that things have gotten "better" for SAT students and "worse" for ACT students. For an example that shows why, take a look at the 1340 SAT score. According to the table, this used to be equivalent to a 28 on the ACT, but is now worth a 29. Conversely, in the reverse direction, a 28 on the ACT now lands a student the equivalent of a 1320 on the SAT, whereas it used to be worth as much as a 1340. This interpretation, however, is a bit too simplistic.
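The 1340/28/29 example can be sketched as a tiny lookup. This is just an illustration using the values quoted in this article; the official tables cover the full score ranges.

```python
# Toy concordance lookup using only the values quoted in this article;
# the official 2018 tables cover every SAT and ACT score.
sat_to_act = {
    # SAT total: (old ACT equivalent, new ACT equivalent)
    1340: (28, 29),
}
act_to_sat = {
    # ACT composite: (old SAT equivalent, new SAT equivalent)
    28: (1340, 1320),
}

def act_shift(sat_score):
    """How many ACT points the new table apparently 'adds' for an SAT score."""
    old, new = sat_to_act[sat_score]
    return new - old

print(act_shift(1340))  # 1: the 1340 apparently 'gained' an ACT point
```

Reading the two dictionaries side by side shows why the shift looks like it favors SAT takers: the same 1340 buys a higher ACT equivalent, while a 28 ACT now buys a lower SAT equivalent.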
                Free Samples!
                When we take a deeper look, however, we begin to see how such small differences are all but irrelevant. To understand why, we must learn more about the statistical methods used to generate these concordance tables.
                In order to produce a concordance, the College Board and ACT must collect data by sampling. That is, because it would be impractical for the organizations to use data from every single SAT and ACT examinee or test, they instead make inferences from a subset of the available data (in this case, 589,753 members of the class of 2017 who took both tests). Regardless of the statistical methods used to generate average score equivalences across exams, sampling inherently introduces variability into the final numbers; the typical size of that variability is what statisticians call the standard error. You're probably familiar with the standard error of a sampling statistic: when you see a "±" value, that's the standard error talking.
                In this document, the College Board states the standard error of the score conversion values as follows.
                When using the SAT Total and ACT Composite concordance table to estimate a student’s proximal ACT Composite score from their SAT Total score, the estimates in the table have a standard error of approximately ± 2.26 (2) ACT Composite score points on its 1–36 point scale. When using this table to estimate a student’s proximal SAT Total score from their ACT Composite score, the estimates have a standard error of approximately ± 79.57 (80) SAT Total score points on its 400–1600 point scale. (The emphasis is my own.)
                Let's return to the example scores we used before to demonstrate how things supposedly got "better" for SAT takers and "worse" for ACT takers. Using the table alone, we might conclude that an SAT score of 1340 used to concord to a 28 on the ACT, but now concords to a 29. But the 29 in this table is not really a 29: it's 29 ± 2. Because of the way standard error is calculated, the practical interpretation of the measurement plus-or-minus the standard error is this: we are 68% confident that a score of 1340 on the SAT concords to an ACT score between 27 and 31. Notice how this range comfortably includes the 28 that the 1340 used to "equal". It doesn't take long to see that, when extended to all of the other values in the table, the standard error erases the apparent changes in the tables, placing them well within the ranges of confidence produced by the sampling method.
                The long and short of it is this: any sampling method used to generate concordance produces not "exact" numbers, but instead ranges within an acceptable degree of confidence, or certainty. Thus, the concordance table alone does not tell the whole story. When standard error of the numbers in this table is taken into account, we reach a simple conclusion: the concordance table hasn't really changed, and things have not gotten markedly "better" or "worse" for either SAT or ACT takers. 
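The reasoning above can be sketched in a few lines: treat a concorded ACT score as a point estimate plus or minus one standard error (roughly a 68% confidence interval, assuming approximately normal sampling error), then check whether the old equivalent still falls inside the new interval.

```python
# Sketch of the standard-error argument above. SE_ACT uses the rounded
# ACT-side standard error from the concordance guide (2.26 -> 2).
SE_ACT = 2

def act_interval(concorded_act, se=SE_ACT):
    """68% confidence interval around a concorded ACT score (plus/minus one SE)."""
    return (concorded_act - se, concorded_act + se)

def within_sampling_error(old_equiv, new_equiv, se=SE_ACT):
    """True if the old equivalent lies within one SE of the new one."""
    lo, hi = act_interval(new_equiv, se)
    return lo <= old_equiv <= hi

# The 1340-SAT example: old equivalent 28, new equivalent 29.
print(act_interval(29))               # (27, 31)
print(within_sampling_error(28, 29))  # True: the 'change' is within sampling error
```

Every apparent shift in the published table is one point on the ACT side (or at most a few tens of points on the SAT side), so the same check comes back True across the board.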
                So You're Saying I Have A Chance...
                Despite the mathematical fact that scores on the two tests are essentially the same (in relation to each other) as they were before the new concordance tables were published, the story doesn't end there. Many colleges and universities publish score thresholds for scholarships based on the old, unadjusted concordance. Similarly, some states offer their residents reduced (or free) tuition based on test scores, and unless they speedily change their documentation, the new concordance tables might seem to advantage one test over another. Let's take a look at a few examples:
                At Louisiana State University, recipients of the Academic Scholars Award get $15,500 per year based on an ACT score of 30-32 or an SAT score of 1330-1430 and a cumulative 3.0 GPA. With the new concordance, 30-32 ACT concords to 1360-1440. So, well-meaning advisors might tell students that they should take the SAT because they can score lower (by getting a 1330, which concords to a 29) and still be awarded the scholarship. 
                At Liberty University, students can get into the Honors Program with a 28 ACT or a 1330 SAT. Since the new concordance equates a 1330 to a 29 ACT, if Liberty doesn't change its documentation, students might conclude that it would be wiser to take the ACT and shoot for a 28.
                At the University of Arizona, the "Wildcat Excellence" award criteria are on a sliding scale based on ACT or SAT scores and high school GPA. As you can see in the table below, a small difference in test scores can be worth a lot of money.
                [Table: University of Arizona "Wildcat Excellence" award amounts by GPA and ACT/SAT score]
                For instance, a student with a 3.8 GPA and a 29 ACT stands to receive $18,000. With the new concordance, the 29 on the ACT is a 1330-1350 on the SAT. But a 1380 on the SAT now concords to a 30 on the ACT, which, based on the chart, would get our student $25,000. What should our student do? Take the ACT again and shoot for an actual 30? Take the SAT and try for a 1390? Either option might work, but for $7,000, it would make sense to do something, unless Arizona updates its table.
                Similarly, the state of Florida's Bright Futures Program is a wonderful tool for ensuring college access and rewarding students with high test scores and grades. "Florida Academic Scholars" get 100% free tuition plus a stipend for books at state universities with a weighted cumulative GPA of 3.50 and a 29 ACT or 1290 SAT.
                As we just saw, under the new concordance, a 29 ACT equates to an SAT score between 1330 and 1350. So if all a student must do to be a Florida Academic Scholar is get a 1290 (which is now a 27 ACT), it would seem like they should eschew the ACT and pursue the SAT instead, shooting for a 1290 rather than the 1330 that a 29 ACT now implies.  As we now know, the standard error makes the apparent difference mathematically insignificant, but if the state of Florida doesn't update its criteria, then there is an effective difference in "real life".
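The stale-threshold arithmetic can be made concrete with a hypothetical helper: compare a published SAT cutoff with the SAT score the new concordance assigns to the corresponding ACT cutoff. The numbers are the Bright Futures figures quoted above; the one-row concordance dictionary is purely illustrative, not the full official table.

```python
# Hypothetical helper illustrating why stale cutoffs matter. The single
# concordance entry below is the low end of the SAT range for a 29 ACT,
# as quoted in this article; a real tool would load the full 2018 tables.
new_sat_equivalent = {29: 1330}

def sat_route_discount(act_cutoff, published_sat_cutoff):
    """Points by which the published SAT cutoff sits below the ACT cutoff's
    new SAT equivalent; positive means the SAT route looks 'easier' on paper."""
    return new_sat_equivalent[act_cutoff] - published_sat_cutoff

# Bright Futures: 29 ACT or 1290 SAT for Florida Academic Scholars.
print(sat_route_discount(29, 1290))  # 40
```

A 40-point gap is well inside the roughly ±80-point SAT-side standard error, so it is statistically meaningless, but a student choosing which test to retake sees it as a real difference as long as the published criteria stand.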
                Final Thoughts
                Because the SAT and ACT generate so much stress for students and uncertainty for everyone in the college admissions process, any change in the tests can generate a disproportionate level of anxiety. The hubbub over the concordance tables is understandable, and is surely something that should be understood by anyone involved in the college process. It's good that there is now an official, universally agreed-upon conversion between the two college admissions tests, but it is crucial that applicants, advisors, and advocates make sure that colleges, universities, and scholarship agencies have updated their score thresholds so that students can pursue the test preparation that makes the most sense for them.