Poets & Writers Responds to Open Letter

We are disheartened to have read the open letter written on behalf of creative writing teachers and program directors protesting our publishing the 2012 rankings of MFA and PhD programs. We do, of course, respect the signatories’ right to express their complaints, and we take them with the utmost seriousness.

Our mission is and always has been to serve writers. Poets & Writers Magazine is published by the forty-year-old literary nonprofit Poets & Writers, Inc., which gives more money directly to writers to give readings and facilitate workshops—many in underserved communities—than any other nonprofit in the country. While the magazine’s editorial integrity has been called into question around the issue of the rankings, I can say with conviction that our editorial staff has worked tirelessly to ensure that our magazine adheres to the highest journalistic standards.

In publishing the rankings it is precisely these editorial standards that we are striving to uphold. Our ethical obligation is to be transparent to our readers about the source of the rankings and how they were derived, which we have done consistently and without reservation. We lay out our methodology explicitly in a four-page FAQ section that precedes the actual rankings in print, dealing directly with many of the issues raised in the open letter. The first question in the FAQ is: “Should I rely on these tables to choose where to apply?” We answer unequivocally in the negative, giving afterward an explanation of why this would be a bad idea. To characterize this as a “disclaimer” that we posted only on our website is, to say the least, an oversimplification. But then to state, “Regrettably, it appears on a separate page,” further misrepresents our clear intentions, and also disregards our readers’ ability to think critically. Do the letter’s signers seriously think that anyone contemplating a writing life will not have the desire, common sense, or attention span to read beyond the rankings?

We know our readers. They are writers, some of them emerging or unpublished, but all of them individuals who believe in the written word and identify themselves as committed to it. They actually read our magazine thoroughly. And our responsibility is to serve them and their particular needs—in this case, providing a comparative overview of leading programs’ features, plus other articles on the issues pertaining to graduate creative writing programs.

So let’s raise the tone. Reasonable people can disagree on methodology, and there is surely more than one way to skin this particular cat. But we have labored mightily to contextualize the material put before our readers, presenting the rankings along with guidance for how we think they should be used, plus other advice—much of it the same as that offered by the signers of the open letter—about how students can determine which programs are best for them. Since undertaking this work, we have carefully considered the criticisms we’ve received. The advocacy organization for creative writing programs, the Association of Writers and Writing Programs, for example, criticized us for not taking faculty enough into account. We maintain that faculty quality is too complex to assess because an excellent teacher for one student is not necessarily a good fit for another. But we considered the criticism and decided to address it by including coverage of notable new faculty hires. Subsequently, we have directed readers from our pages to our free online MFA database, which includes a list of core faculty for each program. In response to criticism that our coverage did not include the perspective of program representatives, we included in this year’s issue a 3,000-word feature (an expanded 7,000-word version appears online) of advice directly from this group, some of whom took the opportunity to criticize the rankings.

While we readily consider reasoned criticisms of our work, we cannot in good conscience make editorial decisions in response to outside pressure from those groups and individuals who disagree with our coverage, much less those that threaten to withdraw advertising as a means of influencing editorial content. Our responsibility is to our readers. And we would hope that, as writers, our critics would understand and respect this obligation.

Why did we decide to publish rankings in the first place? With the proliferation of MFA programs, whether or not to attend one has become a growing question among our readership. We began to see more and more users visiting the MFA thread in our Speakeasy Message Forum hosted on pw.org, where we discovered a burgeoning community of individuals exchanging sound advice and wisdom about creative writing programs. Users shared their online research about program features, inside information from students, and advice from mentors. We saw too that there was a growing need for this information to be more widely distributed—many users posted questions asking which programs were strongest and why, while many others expressed their frustrations at not being able to easily navigate some programs’ websites.

We engaged Seth Abramson to assist us in sharing this information. Abramson has been collecting data about applicants’ preferences and about MFA programs for five years, and we stand behind his integrity. In order to tap into the collective wisdom of as many applicants as possible, we turned to the Creative Writing MFA Blog, a website founded by Stanford University professor Tom Kealey, author of The Creative Writing MFA Handbook (Continuum, 2005). We decided to survey “readers of one blog,” as Deborah Landau puts it in the press release that accompanied the open letter, because it is arguably the most highly trafficked website about creative writing programs. Like our MFA thread, it is a gathering place where students, applicants, and creative writing teachers engage in conversation, sharing the information and advice they receive from online research, mentors, and one another. It’s hardly a group whose opinions don’t matter. Instead, it represents a well-researched sample of the annual pool of applicants to creative writing programs. Reporting on the trends of where this group is applying is valuable information to share with our readers.

Why didn’t we survey MFA faculty and students about the quality of MFA programs? To continue the analogy Leslie Epstein used to describe our approach in the press release, that would be like asking diners who only frequent their favorite restaurant to assess the quality of all restaurants.

While applicants are not experts on creative writing programs, they do have a vested interest in researching the various qualities of a number of programs and comparing them. They are advised by their teachers and mentors. And those who have joined online communities focused on creative writing programs are informed by the research shared there. Students and faculty affiliated with any one program do not necessarily have a vested interest in researching other programs, and it makes sense that they’d be biased toward their own. In fact, in some cases, promoting his or her program is a requirement of a faculty member's job. How can they be expected to be fair about assessing their own program or informed about assessing any other? 

But along with the popularity ranking, we include in our coverage five other categories of rankings—all of them based on hard data—plus eleven other categories covering program features.

While we must stand strong in expressing our right to pursue an editorial project such as this, we do not stand in opposition to the many hardworking teachers and administrators who may feel slighted by our work. We have already reached out to some teachers and administrators, and will continue to do so, in order to learn more about their concerns and deepen the conversation. In the meantime, I invite anyone who is interested in sharing a civil and productive conversation on the subject to call or e-mail me directly.

Sincerely,
Mary Gannon
Editorial Director
(212) 226-3586, ext. 209
mgannon@pw.org

[Below is a copy of the content contained in the press release and open letter that have been circulating.]

PRESS RELEASE

September 8, 2011

One Hundred and Ninety Writers Agree: The Poets & Writers Creative Writing Program Rankings Are “Specious” and “Misleading”

Creative writing faculty representing MFA/PhD programs from all over the United States—including those who teach in a majority of writing programs ranked in the top ten—have signed an open letter admonishing Poets & Writers for the methodology behind their 2012 program rankings.

The writers who’ve signed the letter—including notable authors such as David Shields, C.D. Wright, Bob Shacochis, David Lehman, Tony Hoagland and Heather McHugh—all believe the Poets & Writers rankings give disingenuous, oversimplified, and incomplete information to those researching creative writing programs.

Leslie Epstein, the celebrated novelist and director of the Boston University creative writing program, said of the rankings, which are based on polling prospective program applicants about where they’re planning to apply to graduate school, “It’s analogous to asking people who are standing outside a restaurant studying the menu how they liked the food. Why wouldn’t you ask those who’ve actually eaten there for an informed opinion?”

Epstein also noted that he recently asked Poets & Writers why the rankings don’t take the reputation of a program’s faculty into serious account. In an email response from the rankings architect, Epstein was informed that the teaching and writing reputations of his poetry faculty—including literary heavyweights Robert Pinsky, Louise Glück and Dan Chiasson—were “irrelevant” in determining P&W’s ranking of Boston’s fabled writing program.

Deborah Landau, critically acclaimed poet and director of the prestigious creative writing program at New York University, gave her opinion: “The Poets & Writers rankings are extremely misleading, a disservice to MFA applicants, and devoid of significance. If the Poets & Writers list were entitled ‘MFA Programs Most Frequently Applied to by Readers of One Blog’ that would be accurate. I’m puzzled that Poets & Writers, a fine publication, continues to publish this misleading list.”

The mission of the signed letter (see below) is to give expert advice to potential writing program applicants about how to research and apply to programs in a meaningful way. Those who’ve signed the letter ask that Poets & Writers stop publishing a ranking that does a disservice to those trying to make an informed decision about their education.

For more information, please contact Erin Belieu, Director of the Creative Writing Program at Florida State University, at ebelieu@fsu.edu.

AN OPEN LETTER FROM CREATIVE WRITING FACULTY REGARDING THE POETS & WRITERS PROGRAM RANKINGS

The people who have signed this letter have all taught as creative writing program faculty. Many of us are now program directors and serve as members of our admissions committees. Most of us also hold MFA and/or doctoral degrees. We hope our collective experience and expertise will provide good counsel to anyone thinking about applying to writing programs.

To put it plainly, the Poets & Writers rankings are bad: they are methodologically specious in the extreme and quite misleading. A biased opinion poll—based on a tiny, self-selecting survey of potential program applicants—provides poor information. Poets & Writers itself includes on its website a disclaimer suggesting the limitations of these rankings, recommending that potential applicants look beyond them. Regrettably, the information appears on a separate page.

What’s worse, if a program declines to encourage a bad process by choosing not to provide information, P&W’s process insists on including that program as though the information were negative, a procedure we think is unethical as well as statistically misleading.

The P&W rankings, in their language and approach, labor to create the impression that the application process between applicants and programs is adversarial. It is not, as any proper, sensible survey of MFA students and alumni would indicate.

Instead of asking such students and alumni about quality of instruction, or anything else about actual program content, P&W’s rankings are heavily skewed toward viewing a program’s financial aid offer as the final arbiter of that program’s overall quality. We agree that financial aid must be a serious consideration, but a student’s relationship with his or her faculty—what and how one learns—is at least equally important.

In economic times like these, there is no immediate correspondence between any degree and employment. This is particularly true of the MFA in creative writing and PhD in English with a creative dissertation. While we work hard to help our graduates find jobs, it is essential to understand that creative writing for the vast majority is not a profession. Some writers earn their living as teachers, but others are lawyers, full-time homemakers, doctors, editors, business owners, sales clerks, and mechanics. No applicant should consider pursuing a creative writing degree assuming the credential itself leads to an academic job. And no applicant should put her or himself in financial peril in order to pursue the degree.

Our best advice is to research the programs you’re considering directly. If you are able to visit those programs, ask to sit in on classes and for the contact information of current and recent students. Talk to people you respect about different programs. Read work by the instructors.

Most programs have basic academic and financial information available on their websites. But don’t hesitate to ask questions of the program directors, admissions committee members, and students presently attending the programs. This kind of commonsensical research will help you find a program suited to your hopes and talents.

Sincerely,

Jonathan Aaron, Emerson College
Lee K. Abbott, Ohio State University
Jonis Agee, University of Nebraska – Lincoln
Marla Akin, University of Texas Michener Center for Writers
Julianna Baggott, Florida State University
Sally Ball, Arizona State University
Aliki Barnstone, University of Missouri – Columbia
Steven Barthelme, University of Southern Mississippi
Jocelyn Bartkevicius, University of Central Florida
Robin Behn, University of Alabama
Erin Belieu, Florida State University
Karen E. Bender, University of North Carolina Wilmington
April Bernard, Skidmore College
Mark Bibbins, The New School
Mary Biddinger, The University of Akron
Scott Blackwood, Roosevelt University
Robert Boswell, University of Houston
David Bosworth, University of Washington
Mark Brazaitis, West Virginia University
Lucie Brock-Broido, Columbia University
Ben Brooks, Emerson College
John Gregory Brown, Sweet Briar College
Andrea Hollander Budy, Lyon College
Janet Burroway, Florida State University
Robert Olen Butler, Florida State University
Sarah Shun-Lien Bynum, University of California, San Diego
Scott Cairns, University of Missouri – Columbia
Kara Candito, University of Wisconsin - Platteville
Kevin Canty, University of Montana at Missoula
Mary Carroll-Hackett, Longwood University
Michelle Carter, San Francisco State University
Alexander Chee, Columbia University
Alan Cheuse, George Mason University
Jeanne E. Clark, California State University Chico
Brian Clements, Western Connecticut State University
Mick Cochrane, Canisius College
Michael Collier, University of Maryland
Gillian Conoley, Sonoma State University
Bob Cowser, St. Lawrence University
Jennine Capó Crucet, Florida State University
Kelly Daniels, Augustana College
R. H. W. Dillard, Hollins University
Chitra Divakaruni, University of Houston
Jim Dodge, Humboldt State University
Timothy Donnelly, Columbia University
Michael Dumanis, Cleveland State University
Camille Dungy, San Francisco State University
Karl Elder, Lakeland College
Leslie Epstein, Boston University
Elaine Equi, New York University
David Everett, Johns Hopkins University
Kathy Fagan, Ohio State University
Andrew Feld, University of Washington
Elizabeth Stuckey-French, Florida State University
Ned Stuckey-French, Florida State University
Forrest Gander, Brown University
Eric Gansworth, Canisius College
Steve Garrison, University of Central Oklahoma
Maria Gillan, Binghamton University, State University of New York
Michele Glazer, Portland State University
Tod Goldberg, University of California, Riverside Palm Desert
Eric Goodman, Miami University of Ohio
Jaimy Gordon, Western Michigan University
Carol Guerrero-Murphy, Adams State College
Corrinne Clegg Hales, California State University, Fresno
Rachel Hall, State University of New York at Geneseo
Barbara Hamby, Florida State University
Cathryn Hankla, Hollins University
James Harms, West Virginia University
Charles Hartman, Connecticut College
Yona Harvey, Carnegie Mellon University
Ehud Havazelet, University of Oregon
Steve Heller, Antioch University Los Angeles
Robin Hemley, University of Iowa
DeWitt Henry, Emerson College
Michelle Herman, Ohio State University
Laraine Herring, Yavapai College
Sue Hertz, University of New Hampshire
Tony Hoagland, University of Houston
Janet Holmes, Boise State University
Garrett Hongo, University of Oregon
Ha Jin, Boston University
Arnold Johnston, Western Michigan University
Diana Joseph, Minnesota State University, Mankato
Laura Kasischke, University of Michigan
Catherine Kasper, University of Texas at San Antonio
J. Kastely, University of Houston
Richard Katrovas, Western Michigan University
Christopher Kennedy, Syracuse University
Richard Kenney, University of Washington
David Keplinger, American University
James Kimbrell, Florida State University
David Kirby, Florida State University
Binnie Kirshenbaum, Columbia University
Karen Kovacik, Indiana University - Purdue University Indianapolis
Stephen Kuusisto, Syracuse University
Deborah Landau, New York University
Jeanne Larsen, Hollins University
David Lehman, The New School
Dana Levin, Santa Fe University of Art and Design
Lisa Lewis, Oklahoma State University
Catherine Lewis, Purchase College, State University of New York
Samuel Ligon, Eastern Washington University
Robert Lopez, The New School
Denise Low, Haskell Indian Nations
Kirsten Lunstrum, Purchase College, State University of New York
Patrick Madden, Brigham Young University
Megan Marshall, Emerson College
Michael Martone, University of Alabama
Cate Marvin, College of Staten Island, The City University of New York
Gail Mazur, Emerson College
Janet McAdams, Kenyon College
Shara McCallum, Bucknell University
Karen Salyer McElmurray, Georgia College & State University
Heather McHugh, University of Washington
Sarah Messer, University of North Carolina Wilmington
Jennifer Militello, River Valley Community College
Wayne Miller, University of Central Missouri
Debra Monroe, Texas State University
Dinty W. Moore, Ohio University
Brian Morton, Sarah Lawrence College
Rick Mulkey, Converse College
Brighde Mullins, University of Southern California
Antonya Nelson, University of Houston
Ian Blake Newhem, Rockland Community College, State University of New York
Thisbe Nissen, Western Michigan University
Daniel Orozco, University of Idaho
Pamela Painter, Emerson College
Alan Michael Parker, Davidson College
Jeff Parker, University of Tampa
Oliver de la Paz, Western Washington University
Donna de la Perriere, San Francisco State University
Joyce Peseroff, University of Massachusetts Boston
Todd James Pierce, California Polytechnic State University
Robert Pinsky, Boston University
Kevin Prufer, University of Houston
Imad Rahman, Cleveland State University
Ladette Randolph, Emerson College
Marthe Reed, University of Louisiana Lafayette
Nelly Reifler, Sarah Lawrence College
Frederick Reiken, Emerson College
Paisley Rekdal, University of Utah
R. Clay Reynolds, University of Texas at Dallas
Kathryn Rhett, Gettysburg College
David Rivard, University of New Hampshire
Richard Robbins, Minnesota State University, Mankato
Mary F. Rockcastle, Hamline University
Robin Romm, New Mexico State University
Michael Ryan, University of California, Irvine
Benjamin Alíre Sáenz, University of Texas at El Paso
Martha Serpas, University of Houston
Bob Shacochis, Florida State University
Brenda Shaughnessy, New York University
Aurelie Sheehan, University of Arizona
David Shields, University of Washington
John Skoyles, Emerson College
Tom Sleigh, Hunter College
Casey Smith, Corcoran College of Art and Design
Maya Sonenberg, University of Washington
Gregory Spatz, Eastern Washington University
Brent Spencer, Creighton University
Sheryl St. Germain, Chatham University
Les Standiford, Florida International University
Domenic Stansberry, Vermont College
Thom Tammaro, Minnesota State University Moorhead
Alexandra Teague, University of Idaho
Daniel Tobin, Emerson College
Mark Todd, Western State College
Ann Townsend, Denison University
Peter Turchi, Arizona State University
Paul Vangelisti, Otis College of Art & Design
Sidney Wade, University of Florida
Jerald Walker, Emerson College
Rosanna Warren, Boston University
Laura Lee Washburn, Pittsburg State University
Joshua Weiner, University of Maryland
Lesley Wheeler, Washington and Lee University
Richard Wiley, University of Nevada, Las Vegas
Ann Joslin Williams, University of New Hampshire
David Wojahn, Virginia Commonwealth University
Gregory Wolfe, Seattle Pacific University
C.D. Wright, Brown University
Robert Wrigley, University of Idaho
Steve Yarbrough, Emerson College
Stephen Yenser, University of California, Los Angeles
C. Dale Young, Warren Wilson College
Matthew Zapruder, University of California, Riverside Palm Desert
Lisa Zeidner, Rutgers-Camden, The State University of New Jersey
Alan Ziegler, Columbia University
Leni Zumas, Portland State University

Comments

Response to Open Letter

I don't think the 190 people who signed the letter object to the information being made available, and I can tell you that I found it mighty handy when I was applying to MFA programs. The problem, which your response doesn't address, is the presentation of this information as a ranking. The programs are ordered by "votes"--i.e. the number of visitors to a few websites who have stated their intention to apply. Supposedly, this is because the people who voted in this way are better informed than the general population of applicants. I don't see any reason to make that assumption; in fact, it's a major selection bias. Why not just publish an alphabetized table with all the same information on it?

Well, we'll consider that

Well, we'll consider that for next year. The reason we haven't done it that way so far is that we thought it was important to present the reasoning behind why we were including only 50 programs (plus the honorable mentions) out of the total number of programs. Our goal was to try to capture what were considered the leading programs. Also, we use the word "rankings" because there are several rankings included in the charts. Further, we didn't assume that the online community surveyed was necessarily better informed than any other applicants, only that they had, in fact, done research. And we knew this because they were a group we could identify. We couldn't survey all applicants because how would we find them? But, again, thanks for your productive suggestions. We'll take them into consideration. I appreciate your taking the time to comment.

Methodology

Mary,
I understand that methods will vary and that they will always be flawed, but what really worries me about this methodology is that it fails to account for the power of the previous rankings to perpetuate ordering into the current and future rankings. In the first set of rankings, Seth points out that the US News and World report rankings are both dated and flawed, hence the need for the PW rankings.

If that's the case, then your rankings should be pretty different from the USNWR rankings... However, in the 2009 version, 20 of the top 25 of PW's rankings were represented in the USNWR top 25, and every one of the top 10 still graces your lists... Sure, strong programs stay strong. But when students are looking to apply, they look to your list, then they log onto the MFA blog and report which schools they applied to (influenced by your list), perpetuating said list.

This method was fine for year 1 and maybe year 2, but the longer it goes on, the more incestuous it will become (right now it's still correcting for the criteria you highlight... So Columbia plummets because you have decided to have 3 categories about funding, in which it is lacking, and therefore the new applicants see this and continue Columbia's descent...) Basically, by setting the criteria (and what's important) for the rankings and then ranking based on those who use the rankings, all you do is make your rankings reflect the criteria you highlight...

Main Concern

Dear Mary,

First off, I have no doubt that there were good intentions behind making this list; I have always read P&W and will continue to read your magazine.

But my main concern is for schools like #117 Univ. of Alaska - Fairbanks or #121 UCF - Orlando; programs like these are put at a disadvantage for being at the end of the list. Not many students (and I know this, having gone through the selection process several years ago and having witnessed undergrads go through the process recently) want to even consider going to those schools. This treatment of barely acknowledging them (except as the "remaining 81 programs") hinders the possibility for these programs to stay competitive.

And what about the non-ranked schools? Because they were not ranked, one would get the feeling they are not legitimate. But shouldn't that call into question the knowledge of the potential students you interviewed? How knowledgeable are these students if none even knew of programs like UT - El Paso, BYU, UC - Chico (etc)?

To summarize, I feel too much emphasis is given to ranking MFA Programs, which misleads many prospective students. Much of the information provided is great, but could easily have been provided with the programs organized alphabetically.

Thanks,

H.W.

Leslie Epstein's complaint,

Leslie Epstein's complaint, in the open letter, evidently is that applicants aren't sufficiently aware of the stars on the Boston U. faculty.

Problems with this: My sense is that applicants have learned that MFA programs often hire star faculty as a marketing strategy. The stars tend to have light, difficult-to-predict teaching loads; in the worst case, their main contribution to the program may very well be name recognition. Also, those stars command large salaries; if the program has less competitive funding, as Boston U. does, an applicant may speculate that that's where the money goes.

So maybe it's not that applicants are unaware of the eminence of the Boston U. faculty, but rather that they're skeptical of the hire-stars approach to running an MFA program.

Boston U. also is virtually brand-new as an MFA program. It was just a 1-year MA program until recently. Perhaps applicants aren't willing to take a chance on a program that has just reinvented itself?

I guess my point is that, in my view, applicants aren't as easily led astray as the open letter would suggest. If a program doesn't rank highly, perhaps there are very good reasons for that. And perhaps the 199 professors who signed the letter can entrust applicants with the rankings -- having faith that applicants will put the data to good use without being misled through a lack of street smarts and common sense.

Richard M, Thanks for your

Richard M,

Thanks for your insights. I see your point about the rankings affecting the rankings. Our challenge was to try to find a way to provide to our readers an overview of what are considered the best programs, while also informing them that the best program for each of them is really the one that meets the needs and expectations they have for their experience. The rankings are not presented as the entire picture, but as a way to begin thinking about all that's involved in making such an important decision.

H.W., Thanks for taking the

H.W.,

Thanks for taking the time to comment. While I see your point, it's a challenge for us logistically to include all 200 programs alphabetically in our pages, which is why we tried to narrow it down. Many of our readers have either received their MFAs or don't care to pursue the degree at all. So we try to provide coverage for that portion of our readers who are interested in pursuing an MFA without having it dominate the pages of any one issue, because we have a larger group of writers to serve. But, again, I appreciate your input. We are considering all constructive criticism carefully.

Re: Lapwing

I'm not sure you are understanding the idea behind the letter, and the 200 faculty members who have signed it. You mostly focus on Leslie Epstein and B.U., alleging poor spending habits (on faculty) and lack of students applying because of the newness of the program and perhaps misspending. Do you have proof of this, or do you just like to speculate? Because B.U. is a private institution, I would assume their financial records weren't made available to you--unlike public institutions, where financials are disclosed. Actually, I recommend you research the top public schools listed on the rankings and see how much they pay their "star" faculty, and then consider their enrollment. I'm sure you'll find that there are many "star" faculty members who rarely teach, yet the program is still strong. Lastly, when did Ms. Epstein say that people weren't aware of the B.U. program? She seems to have only said that polling students who have not attended a program seems wrong and misleading. Your focus on B.U. seems obsessive and fraught with subjective emotions.

But all of this doesn't answer why schools like Iowa, University of Houston and Columbia have also signed the letter. Are they bitter, too?

And lastly, your comment "perhaps there are very good reasons" why some schools are at the bottom of the list is silly and unfounded. Who are you to judge programs you have no idea about? Unless you personally researched each program... You seem to only be able to make blanket statements, and in doing so, sound immature and misinformed.

H.W.

Mary, Thank you for your

Mary,

Thank you for your response letter. Since your final paragraph seems to invite constructive suggestions, I thought I’d offer a few friendly ones that might help to resolve this issue once and for all.

First of all, Poets & Writers has always been, in my opinion, a wonderful resource for writers, a place where we can go to learn about fellowships, awards, writing contests, and other opportunities in the field. It has also been a place where we can read interviews, articles and profiles that offer insights into both the artistic and business side of writing. In short, Poets & Writers has always been a “friend” to writers, a place that provides information that helps us on our individual journeys.

That said, what people seem to object to most about these rankings is that they go beyond simply providing information to young writers and instead assign numerical values to specific programs, leaving many people unhappy, frustrated, and confused. With this in mind, I think a subtle tweaking in the actual language you use to present this information would go a long way toward pleasing everyone. Here are my suggestions:

1. Remove the word “rankings.” I know this word is much more attractive than a phrase like “Popularity among 2012 applicants,” but it’s also not an accurate description of what the data actually is--applicants are not asked to “rank” programs; they’re asked to list the schools they’re applying to. If you simply included a hierarchical list based on “popularity,” and called it that, you’d be providing the exact same information without inviting any of the unnecessary criticism.

2. Remove the word “votes” as well as the “votes” column. I don’t think the specific number of votes is really that relevant to most applicants, and, again, as someone mentioned earlier, applicants are not actually being asked to “vote.” If you took away that word and that column, I think a lot of the criticism would simply go away.

3. When possible, I’d also remove numerical values in other parts of the rankings and simply use general categories, just as you do with other aspects like "program size." For example, with funding, you could have “Excellent,” “Good,” “Fair,” and “Poor.” Or with selectivity, you could have “Extremely,” “Very,” “Moderately,” and so on. Basically, by removing the numerical values, you’d be removing the impression that there are enormous gaps between particular programs, while at the same time still providing applicants with the exact same information.

Personally, I think that if you made some of these very small changes, nobody in their right mind would have a problem with the service you were providing to potential applicants. The applicants would still benefit greatly from the information, and nobody on the other end would feel left out or misrepresented. And perhaps most importantly, you’d remain in keeping with what has traditionally been the role of Poets & Writers in the literary world: a friend to writers and a source of information.

Please don't resort to an unranked list for next year! The ranking, like all of Seth Abramson's work, has been incredibly helpful. Thank you for publishing it and for standing behind its credibility.

- To Leslie Epstein: It's hard not to be a little offended by your analogy about "asking people who are standing outside a restaurant studying the menu how they liked the food." A more apt analogy would be asking people who had eaten at the restaurant how they'd compare it to restaurants they've never tried. Abramson makes it clear why MFA applicants are the best study pool in the ranking FAQ.

- To hello world's comment about low-ranking schools being unfairly represented: I'm baffled. If we didn't have the ranking, would schools like U of Alaska Fairbanks and UCF somehow get more attention? The fact is, the only reason I know of these schools at all is Seth's article in the Huffington Post, "The Top 25 Underrated Creative Writing MFA Programs." By your logic, the school that came in 26th could make the same complaint.

- Most of all, as an MFA applicant, I feel talked down to by this letter. Deciding to be a writer is such a fragile act of faith. And the application road is terrifying and confusing and lonely. The first site I looked at was UT, James Michener staring me down, saying "Don't waste your time or my money." Good Lord, who do I think I am to apply to a place like this? That's what we all think, even the most exceptionally talented among us.

The P&W ranking isn't much hand-holding, but it's enough to get us started. And it mirrors the conversations you hear at workshops, the advice you get from writing teachers and forums. It's important that we know which schools are rising to the top in the minds of the people most invested in the subject, and why. It helps us form our own filters for which programs might fit us best, which is what we applicants all ultimately do: we put the rankings aside and make our own map.

Uncleuncle,

I definitely appreciate why you and other current applicants find these rankings helpful, but you also have to keep in mind that at some point in the future—perhaps after you’ve gone through an MFA Program and feel an allegiance to that program—your feelings on this issue might change.

In other words, it’s not a coincidence that the people defending these rankings are primarily “current” applicants, and the people questioning them are individuals who have already gone through the experience and completed their degree. At some point, you’ll probably be a member of the latter group, and, accordingly, your feelings might change.

Let me give you just one example of what I’m talking about. Let’s say you get into a program you’re excited to attend; you have an amazing experience there and gain an appreciation for aspects of the program that aren’t reflected in these rankings (the high quality of teaching, the incredible reading series, the friendly and supportive nature of the community, etc.). Then let’s say that three years down the road the university that houses your program decides to cut the program’s budget for reasons beyond the program’s control. Word gets out about the change in funding, application numbers drop, the program’s ranking plummets 15 spots in the P&W rankings, and Seth (or someone else) begins to make disparaging comments online about how your program doesn’t support its students. You try to protest that your program does support its students, that there’s much more to your program than funding, that your program’s poor ranking isn’t an accurate reflection of how good it really is--but no one wants to hear you. What they see is a number placed next to your program, and that number determines, to a large extent, how they feel about it.

This is just one example of something that happens fairly often, but I hope you can see how easily you might find yourself questioning these rankings once you’re on the other side and feel that your own program isn’t being treated fairly and that the applicants determining your program's ranking aren't getting the whole picture.

I'm the one who wrote the initial posting on the page devoted to the top 50 programs. I can appreciate the editorial response above for its calm tone. And I, too, don't see how an alum of a particular program could possibly compare that program with others.

On the other hand, I don't know how the editorial staff of P&W can possibly know for sure how thoroughly people read their magazine.

But what I really take issue with is the claim that the magazine "adheres to the highest journalistic standards" (apart from the rankings, I respect the magazine very much, by the way). Where I don't see that happening is in the handling of reader comments on this issue. I avoided commenting on the rankings (which I regard as poll results) on various blogs where they were discussed because I'd seen extremely rancorous arguments between S. Abramson and others about that topic and others, and I suspected that I lacked the stomach for that kind of exchange. I assumed, though, that the comments section on the Web site of a major publication would involve a more civil exchange of views.

Pretentiousness is not generally considered online abuse. Nonetheless, if I came across as pretentious in my responses to Seth, I'm sorry. I can't say what "anyone who knows me" thinks of me, because I can't pretend to know the opinions of everyone who knows me. But I can say with some confidence that those who know me well don't regard me as pretentious--and pretentiousness is conduct they don't tolerate in anyone. In responding to Mr. Abramson's reaction to my posting, I fell into a fight-fire-with-fire mentality, and I realize that was a mistake.

At the same time, I stand by my original posting on that page. There was nothing inflammatory or unreasonable about it. In the comments sections of major newspapers and magazines, I've never seen the author of an article repeatedly argue with his or her critics. I can't imagine, say, Eugene Robinson arguing with his readers or jumping onto others' blogs to argue with readers there who disagree with his views; I can't imagine his doing that even if the Washington Post didn't have its current policy regarding its journalists' contact with readers, a policy I consider too restrictive. I think an author's responses can play an important role in setting the tone for readers' comments. (Someone on the other page told me to "shut the f*ck up," something I cannot imagine saying--with or without the asterisks--to anyone else online. Someone else ordered me to "leave Seth alone." S. Abramson's tendency to argue with others on numerous other sites should make clear that he hardly needs protection from me. In fact, my response to the rankings is the first time I've ever commented online about him or his work.)

He identifies himself as a freelance journalist. I worked for two and a half years as a journalist for a company owned by a large Midwestern newspaper. I never would have dreamed of engaging in angry debates with my readers. Perhaps it would be more appropriate--and more professional--for P&W authors to write one response at the end of the period during which readers can comment. That's the way it has traditionally been done at many other magazines.

Various news organizations--Reuters, the Guardian, etc.--have specific policies on how journalists should interact with readers online. The one recommended on this site is more or less typical of what I've seen:

http://www.ghnewsroom.com/newsroomhandbook/ethics/x1958452807/Ethics-Guide-Ethics-on-the-Web

In one of Mr. Abramson's postings, part of my name was revealed. The option that allows commenters to identify themselves by a username led me to believe that my privacy would be respected. I think P&W should post the magazine's online privacy policy in a prominent location. If authors are allowed to reveal the identities of their critics, shouldn't potential critics be informed of this before they post a comment?

Thanks.

Dear Unclenuncle,

-1st off, I assure you, applying to any school is a daunting experience, especially a top-ranked one. But the only thing you can lose is the application fee.

-Secondly, if you had read my post you would have noticed that I believe listing the programs alphabetically would work best. I'm sure that P&W would rather have them ranked because it seduces many applying students (like yourself) into buying a copy--much like the US News & World Report college ranking issues.

-Also, I don't understand how you equate The Huffington Post's printing only the Top 25 Underrated programs with my comments. The Huffington Post is not an authority on CW programs, and that article had a specific focus in mind (25 underrated programs), whereas P&W is a magazine focused on writers, and its article consists of a complete list of every program. And back to my earlier point: wouldn't you get the same information, the same introduction to these non-ranked programs, if they were merely listed?

-Next, Leslie's analogy is actually a very good one. It is merely pointing out that students not currently attending a program shouldn't be asked to evaluate programs they haven't applied to, or won't apply to. Your analogy is off because the customers (the students in this case) haven't eaten at any restaurant, so they have no background for judging any restaurant. Like Leslie said, they are merely looking at the menu. I'd like to add that it seems like you are more offended by Leslie wanting to cut you out of the process than by her statements.

-Lastly, Seth does mention that students are his best avenue for these rankings, but that doesn't mean it's true--it merely means that was what was most available to him. As you can tell by the 200 faculty members who have signed the letter, many professionals in the field disagree.

HW: Who is this "Ms. Epstein" you're referring to? I'm not familiar with the woman you're defending, so I'm not sure I understand your rude comments.

Lapwing,

You show your immaturity by pointing out a simple mistake. Do you have ties to BU? Were you rejected by their program? I feel there needs to be transparency in your comments toward him.

That, or just grow up.

I am a serious applicant to MFA programs and have spent an enormous amount of time researching the particulars of programs, which is truly an ordeal. It is peculiar to me that schools that are voted into the top 10 of the Atlantic's best programs are happy to flaunt that appointment, yet when their rank isn't upheld by another, arguably more in-depth ranking system, these same schools become hostile.

As an unbiased applicant who would have liked to apply to Boston U or Columbia, for example, I have found in my own research that these schools have run on unmerited prestige without being able to back their programs' notoriety with the things most important to us prospective students: most notably, funding, job placement, fellowship placement, and departmental requirements. Neither BU nor Columbia should find it so heinous an insult to be ranked where they were, given their vastly inferior if not nonexistent funding, poor job and fellowship placements, and arbitrary language requirements. It is as pretentious as it is presumptuous to assume and expect one's ranking without having to prove it.

I am thankful for the important work done by Seth Abramson and P&W; I will continue to use the rankings, hope they continue, and, like any reasonable person, use them as a jumping-off point from which to further my research. Schools should not lash back with contempt but rather inquire into the nature of these things, consider the voices of those seeking their schools, and honestly, sincerely work toward providing programs that better serve their purpose.

As for Mr. Epstein's analogy, I would like to say that it makes equally little sense to ask the chefs of the restaurants in question to rate themselves...a clear conflict of interest.

Itsalwaysdarkest,

-When magazines (like US News & World Report) rank programs, they don't allow faculty or administrators to rate their own institutions. They use a peer-evaluation form, asking them to rate programs based on their personal knowledge, from 1 (marginal) to 5 (distinguished), with "don't know" if they are not familiar with a program. In other words, the chefs don't rate their own meals. This is one major problem with the P&W rankings: they don't use peer evaluation; they use data from a group who have no experience with MFA programs. It would be like asking me to rank law schools--I have no knowledge of law schools.

-Also, I direct this to Seth (if he's here): in your methodology, you seem to exaggerate the weight that lawyers, judges, and law-firm hiring coordinators (found under "Assessment Scores by Lawyers/Judges") have in the ranking of law schools. It should be noted, for those reading, that "Quality Assessments" is broken into two groups: .15 for "Assessments by Lawyers/Judges" and .25 for "Peer Assessment"--which equals .40 for the law school rankings. As you can see, the lawyers/judges assessment doesn't even count for half of that .40. On top of that, assessments aren't even half of the overall ranking score. I will also say that at least the lawyers and judges have gone through programs and can reasonably evaluate them through past experience and personal knowledge--and if they can't, they put "don't know." MFA applicants have no point of reference; they are new to the process. I feel it is misleading to equate an MFA applicant with a lawyer or judge. Lastly, why do you think CW programs were receptive to giving you information for the rankings? I am merely wondering...

-To those interested, US News has a very good explanation of their methodology. Below is the Law School methodology.

http://www.usnews.com/education/best-graduate-schools/articles/2011/03/14/law-school-rankings-method...
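To make the weight arithmetic above concrete, here is a minimal sketch of how a composite score of this kind combines its pieces. The .15 and .25 weights come from the discussion above; the 0.60 "everything else" weight and all sample scores are hypothetical placeholders, not the actual US News formula.

```python
# Hypothetical sketch of weighted-score arithmetic. The .15/.25 assessment
# weights are cited in the comment above; the 0.60 remainder and the sample
# scores are made-up placeholders, not the real methodology.

W_LAWYERS_JUDGES = 0.15  # "Assessment Scores by Lawyers/Judges"
W_PEERS = 0.25           # "Peer Assessment"
W_OTHER = 0.60           # placeholder for all non-assessment factors

def composite(lawyers_judges, peers, other):
    """Weighted average of scores, each on a 1-5 scale."""
    return (W_LAWYERS_JUDGES * lawyers_judges
            + W_PEERS * peers
            + W_OTHER * other)

# Quality assessments together carry .15 + .25 = .40 of the total...
assessment_weight = W_LAWYERS_JUDGES + W_PEERS
# ...the lawyers/judges share is less than half of that .40, and the .40
# itself is less than half of the whole ranking, as the comment notes.
assert W_LAWYERS_JUDGES < assessment_weight / 2
assert assessment_weight < 0.5

print(round(composite(4, 3, 5), 2))  # 0.6 + 0.75 + 3.0 = 4.35
```

The point of the sketch is only that a .15 weight moves the composite far less than the factors it is bundled with, which is the commenter's objection to overstating it.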

Curious

I suppose it's nice to be able to both condemn a ranking and then celebrate it when it benefits you:

http://www.ohio.edu/compass/stories/11-12/9/phd-ranked-third.cfm

Ohio University’s Creative Writing Ph.D. program was ranked third in the nation for 2012 by Poets & Writers, a nationally distributed literary and educational magazine.

This is the first time that the creative writing program has received an official ranking by a national organization. There have been unofficial rankings in the past where OHIO’s program has always been highly regarded.

Dinty Moore, professor of English and director of the Creative Writing Program, responded to the news, "We were thrilled to be ranked the number three program nationally; both because the ranking is certainly impressive, but also because the faculty and students have been working very hard these past years to enhance the profile of the program."

I obviously can't speak for Dinty W. Moore of Ohio University. Perhaps he hadn't read the methodology earlier (it does take some time to read).

Here's a link to a site that covers some of the most basic challenges in using statistics. The author is a professor emeritus of mathematics at UT-Austin who was also affiliated with the university's Division of Statistics and Computational Science. (You can find out more about her on the "About" page included on the site.)

http://www.ma.utexas.edu/users/mks/statmistakes/StatisticsMistakes.html

Dear Mary,

What saddened me about your response letter is that its tone struck me as defensive, even aggressive at times. I can understand that nobody likes to be criticized, and that this whole ordeal has been a bit embarrassing for the magazine, but I really feel like you missed a golden opportunity to extend an olive branch to the signatories and move toward some type of friendly resolution.

Instead, you seemed to spend most of the letter defending the rankings and presenting what seemed to me a point-by-point counterargument, and the overall tone was a sort of “us vs. them” one, not a “let’s work together” one. To put it another way, the content of the letter sounded eerily similar to the types of defenses I’ve heard Seth Abramson make countless times online. Unlike Seth, however, your primary audience is not just students applying to MFA programs but ALL writers, including the signatories of that letter, many of whom have contributed to your magazine in the past and even appeared on its cover.

My concern is that this issue simply isn’t going to go away. If the people opposing these rankings had 200 distinguished signatories this year, they’ll have 400 next year, and even more the year after that. What’s going to happen is that the writing world is going to become completely polarized over this—with MFA professors and graduates on one side and prospective MFA students on the other—and the source of this division will, of course, be the rankings. I know that Seth has made his own allegiances clear, but that doesn’t mean that Poets & Writers has to take Seth’s side simply because he’s the author of the rankings. In fact, my hope is that you’ll find a way to distance yourself from Seth and work with some of the signatories toward a friendly resolution, one that leaves everyone happy and the MFA world unified once again. I suspect that Seth will stubbornly resist any changes to his rankings, but that doesn’t mean that you have to concede to him. After all, Seth works for you, not the other way around.

It surprises me that P&W is circling the wagons on this issue. Providing trustworthy information on MFAs should be a very high priority for P&W. It's too important a mission to carry out with indefensible methodology, alienating in the process more than 200 of the most experienced and talented professionals working in the field. P&W's brand has been diminished. I just don't see any benefit in continuing to provide this vital service in a careless and cheap way. This is a real opportunity for P&W. Now that you have everybody's attention, I would assume that at least a few of them would be willing to contribute to a methodologically sound assessment of MFA programs for next year. Don't lose this opportunity by circling the wagons and wrapping yourself in the flag of 'serving writers.' Please, change your mission to "serving writers well." Clearly, these rankings do not. It takes only a little common sense, fifteen minutes spent reading the many methodology disclosures and cautions, and a glance at the names on the letter to recognize that.

I noticed earlier: in paragraph #8 of the magazine's response to the open letter, you identify Tom Kealey as a "Stanford University professor." According to his own online information and other sources, he was a Stegner Fellow (and that, to me, is about as impressive as it gets following--or sometimes in place of--an MFA, so I'm not diminishing that achievement in the least). My understanding is that in the wake of his Stegner, he became a Jones Lecturer at Stanford. (He has a lectureship there now, but I don't know its designation.) Anyway, as someone who used to write for a couple of newspapers, I thought I'd clarify that detail.