Testing the SAT: Does It Measure College Success and Preparedness?


Status: Finished  |  Genre: Non-Fiction  |  House: Booksie Classic
We place a lot of importance on college these days. As more and more students apply, fewer are accepted, and colleges grow more dependent on standardized tests like the SAT and ACT. But are these tests reliable? Can they outweigh an entire high school career? Are they enough to show whether someone could succeed in college?

Submitted: August 11, 2013





by Chanpreet Singh

 

 

A serious concern has spread among parents, students, and teachers following the release of the College Board's 2012 SAT report, “Only 43 Percent of 2012 College-Bound Seniors Are College Ready.” According to the report, 57 percent of all test takers failed to achieve the SAT benchmark score of 1550 out of 2400.

 

The College Board, the owner of the exam, claims that students who score 1550 or higher have a 65 percent probability of earning at least a B- average during their first year of college.

 

Are these students really dumb? Many would blame the students themselves for their performance, but College Board President Gaston Caperton, in the same report, holds the education system responsible: “Our nation’s future depends on the strength of our education system. When less than half of kids who want to go to college are prepared to do so, that system is failing.” Caperton believes that teachers need to expose their students to more rigorous coursework.

 

But should the College Board be trusted? Are these college-bound students truly unprepared for college? No, because the SAT inaccurately measures a student’s true ability. Research points to three main flaws in the test: the influence of family income on scores, an emphasis on speed and stamina over knowledge, and the test’s high coachability.

 

The College Board claims that “the SAT is consistently shown to be a fair and valid predictor of college success for all students, regardless of gender, race or socio-economic status.” But that claim does not hold up. Students from high-income families have an advantage over those from middle- and lower-income families. Valerie Strauss, author of the article “What Do SAT, ACT Scores Really Mean?,” believes there is a direct relationship between family income and SAT performance. Lyndsey Layton and Emma Brown, who cover national education issues for The Washington Post, have examined the SAT’s equity. Their research concludes that “SAT scores increase with every $20,000 in additional family income.”

 

A few years ago, two professors from Temple University challenged the SAT’s accuracy by retaking it. Christopher Harper, author of the article “Two Professors Retake the SAT: Is It a Good Test?,” was one of the two. It was Harper’s fifteenth year in the journalism department at Temple when he began his experiment. He found the test to measure the opposite of what it was designed for: instead of measuring knowledge of math, English, and reading, the SAT measures stamina in those subjects. On the math section, Harper reports feeling tricked and seeing only rare applications of arithmetic, algebra, and geometry concepts. He even admits struggling with the grammar section: “I teach writing and journalism, yet I found some questions were written so awkwardly—although they were grammatically correct—that I wanted to take a red pen to them and demand that they be rewritten” (Harper).

 

Only those who have been taught the SAT can solve some of the rarer problems on the test. Math instructor Dana Mosely believes the SAT is highly coachable. He notes that there are classes devoted solely to teaching “SAT Math,” and he has created a series of DVDs himself that are used to tutor students for the exam. He claims that the test’s writers sometimes invent their own mathematical symbols. In such situations, his only advice to test takers is to guess by elimination or to plug in the answer choices, because the only way to recognize those symbols is to have studied “SAT Math” (Harper).

 

Even though the test has numerous flaws, some research takes the College Board’s side. Gadi Dechter, author of the article “SAT Scores Well in Predicting College Success,” describes a study conducted at several four-year institutions in Maryland. Students were divided into three groups by score: Group A, those who scored 1100 or higher; Group B, between 800 and 1099; and Group C, below 800. Researchers determined that students in Group A had a 74 percent likelihood of earning a degree within six years, compared with 57 percent in Group B and only 44 percent in Group C. The major argument against this study is that it dates back to when the highest possible score was 1600. But times have changed: along with the scoring scale, the test takers and their colleges have changed as well.

According to Strauss, “875 accredited, bachelor-degree granting colleges and universities do not require all or many applicants to submit test scores before making admissions decisions.” If the SAT is such a credible measure, why are colleges backing away from it?

 

Bob Schaeffer, the public education director of FairTest, has an answer: “High school grades — even with all the variety between schools and courses — are better predictors of a teenager's performance in higher education, particularly the likelihood of graduation” (Strauss). FairTest, or the National Center for Fair & Open Testing, is a non-profit organization dedicated to ending the flaws and misuses of standardized testing.

 

If colleges are becoming SAT-independent, then students should, too. But how do students use the SAT? They look at the subjects they did well on and choose a related career. Harper’s high school SAT scores suggested he pursue a career in math. “When I took the SAT in high school, I scored significantly higher in Math than in Writing. In college I took Business Administration, Calculus, Accounting, and Statistics my first year. I performed well but found myself far more engaged when I transferred into journalism and English literature—my weakest subjects, according to the SAT. I graduated Phi Beta Kappa with a double major in English literature and journalism and went on to graduate school. Had I followed the path suggested by my SAT scores, I probably would have become a disgruntled numbers cruncher instead of a satisfied journalist who worked for more than 20 years at the Associated Press, Newsweek, and ABC News before joining academe in 1994. I wonder how many students fall into the trap of basing college and career decisions on their standardized-test scores. I am glad I did not” (Harper).

 

The SAT has also been wrong many times, especially about celebrities; the exam can predict neither college success nor success in life beyond college. The article “Famous SAT Scores,” posted on University Language, reveals the names and scores of a few of them. Comedian Janeane Garofalo, known for her roles in films like The Cable Guy and Mystery Men, scored 950 out of 1600; her score still got her into Providence College. Radio personality Howard Stern scored 870, yet he made his way into Boston University and earned a diploma at the Radio Engineering Institute of Electronics in Fredericksburg, Virginia. Bill Cosby’s score was even lower than 500, yet he was accepted to Temple University. It hardly matters that he came in on a track scholarship; he rose to become the school’s most famous alumnus. Even the SAT didn’t predict that.


© Copyright 2019 Chanpreet Singh. All rights reserved.
