2013 MFA Index: Further Reading

by
Seth Abramson
From the September/October 2012 issue of
Poets & Writers Magazine

Cost of Living

The cost of living in the various programs' listed host locations was determined using Sperling's Best Places (www.bestplaces.net/COL/default.aspx). All cost-of-living data were then compared to a randomly selected national-average-range constant, in this case Ann Arbor, Michigan.

Notations used for cost of living are as follows: Very Very Low (25 percent or more below Ann Arbor, Michigan, the national-average-range constant for the 2013 MFA Index); Very Low (between 16 and 24 percent below); Low (between 6 and 15 percent below); Average (between 5 percent below and 5 percent above); High (between 6 and 15 percent above); Very High (between 16 and 24 percent above); and Very Very High (25 percent or more above the cost of living in Ann Arbor). While some students may choose to live outside the boundaries of their program's host location, commuting to an MFA program rather than living near campus carries hidden costs of its own, both pecuniary and nonpecuniary. For this reason, only a program's host location was assessed for this measure. Cost-of-living adjustments were also used to determine the package value at individual programs for the funding and “full funding” categories (see “Full-Residency Program Profiles: Additional Program Measures: Funding” and “Full-Residency Program Profiles: Additional Program Measures: Full Funding”).
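For readers who want the notation scheme above expressed as an explicit decision rule, the following sketch shows one way to map a location's percentage difference from the Ann Arbor constant to the index's labels. It is an illustrative reconstruction rather than the index's actual tooling; the function name and the treatment of values falling exactly on band boundaries are assumptions.

    def col_notation(percent_vs_ann_arbor):
        # Map a host location's cost of living, expressed as percent above (+)
        # or below (-) the Ann Arbor, Michigan constant, to the index's notation.
        # Band boundaries follow the text; edge handling is an assumption.
        p = percent_vs_ann_arbor
        if p <= -25:
            return "Very Very Low"
        if p <= -16:
            return "Very Low"
        if p <= -6:
            return "Low"
        if p <= 5:
            return "Average"
        if p <= 15:
            return "High"
        if p < 25:
            return "Very High"
        return "Very Very High"

    # Example: a host city 18 percent cheaper than Ann Arbor
    print(col_notation(-18))  # "Very Low"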

Teaching Load

While individual applicants' interest in teaching composition, rhetoric, literature, or creative writing to undergraduates will vary, generally speaking the most popular teaching load is a 1/1 (one course taught in the fall semester, one in the spring semester). The teaching loads of individual programs have not been ordered in a vertical hierarchy per se, yet this 1/1 standard has been used to determine whether a given program's teaching load is considered Low (“L”), Average (“A”), or High (“H”). That is, because the 1/1 load is the most popular amongst applicants—though it is not the most common teaching load at MFA programs—average annual teaching loads of 0/0, 0/1, 1/0, and 1/1 have been denominated “Low.” An average annual teaching load of 2/1 or 1/2 (the most common teaching load) is termed “Average,” while an average annual teaching load of 2/2 is considered “High.” Note that the term “load” is not used here pejoratively; some applicants will wish to teach more rather than less, even as other applicants prefer to do no teaching at all. At present the MFA Index takes no position whatsoever on the academic or professional value of teaching a large or small number of undergraduate sections per academic year, nor on the effect such teaching may or may not have on individual students’ personal writing practices.

The term “average” is used here in two different senses: First, to denote a category of teaching load; second, to indicate that all programs are assessed by their “average” teaching load. Because many programs offer different teaching-load packages to different students, and/or increase or decrease teaching load over the duration of the program, the average (mean) number of courses taught per year per student in each program is used. In some instances, students may request and/or get assigned—once admitted to a program—a larger and therefore better-compensated teaching load. Such additional teaching sections are by no means guaranteed, however, and therefore are not noted in or considered by the MFA Index.
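As a concrete illustration of both senses of “average” described above, the sketch below computes a program's mean annual load from per-year course counts and then applies the Low/Average/High bands. It is an assumption about how such a calculation might be performed, not the index's actual method; in particular, how fractional averages falling between the named load patterns would be binned is a guess.

    def classify_teaching_load(courses_per_year):
        # Average (mean) courses taught per student per year across the program,
        # then classify: 0/0 through 1/1 (at most 2 courses/year) -> "L" (Low),
        # 2/1 or 1/2 (3 courses/year) -> "A" (Average), 2/2 (4 courses/year) -> "H" (High).
        # Binning of fractional averages between these points is an assumption.
        mean_load = sum(courses_per_year) / len(courses_per_year)
        if mean_load <= 2:
            return "L"
        if mean_load <= 3:
            return "A"
        return "H"

    # Example: a three-year program whose students teach 1/1, then 2/1, then 2/2
    print(classify_teaching_load([2, 3, 4]))  # mean 3.0 -> "A"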

Some programs fund a small enough percentage of admittees through teaching assistantships that to assign such programs an "average teaching load" would be to wrongly imply that admitted students are likely to receive an assistantship. For this reason, programs that offer assistantships to less than one-third of their incoming cohort received an “--” in the “teaching load” column. Programs eligible for a “teaching load” designation, but which do not publicly disclose the teaching load they assign their teaching assistants, are indicated with a “no data available” (“n.d.”) notation.

CGSR Compliance

The Council of Graduate Schools Resolution, also known as the “April 15th Resolution,” states that graduate programs that are signatories to the Resolution shall keep funded offers of admission open through April 15 of each application cycle. Colleges and universities that adhere to the Resolution represent that all of their constituent programs and departments adhere to the terms of the Resolution, which include mailing a copy of the Resolution with all acceptances. Under the terms of the Resolution, programs may neither rescind nor threaten to rescind offers of admission to which any funding whatsoever is attached prior to April 15, nor may they explicitly or implicitly indicate to such accepted candidates, in writing or in person or via telephone, that there is any deadline for their matriculation decision other than April 15. Historically, MFA applicants have reported widespread noncompliance with the Resolution, which is problematic for applicants because CGSR-violative programs often require final matriculation decisions from applicants well before they have heard admissions responses from the other programs to which they applied. Applicants increasingly see such acceptances as excessively restrictive of their options and opportunities.

At present, only two CGSR signatories are believed to be noncompliant with the contract they and more than a hundred other universities signed and published for prospective applicants. This said, the CGSR Compliance category does not distinguish between programs known to have already violated the Resolution and those nonsignatories that simply could do so without running afoul of their host universities’ administrative policies. Therefore, while applicants should exercise due diligence and caution in applying to programs that are not CGSR compliant, they should also not presume violations will occur. The best policy is to contact nonsignatory programs directly and inquire regarding their CGSR-related policies; needless to say, some programs will welcome such queries more than others, as the question of the CGSR’s viability for creative writing MFA programs has lately been hotly contested by certain nonsignatory programs.

Any signatory to the CGSR found to be in violation of that contract will be listed as noncompliant, whether or not the program’s host college or university continues to be a CGSR signatory. Compliance inquiries are initiated on the basis of applicant self-reporting; since 2006, fully 100 percent of applicant complaints regarding programs’ CGSR-related policies have been found, following an investigation, to be meritorious. Indeed, in all but one instance the offending program ultimately confessed to the violation.

GRE Required

This category indicates whether or not a program requires applicants to submit Graduate Record Examination (GRE) General Test scores as part of their applications. Generally, programs that offer a substantial portion of incoming students some form of financial aid require these scores, and so applicants are advised to take this test prior to applying in order to avoid artificially limiting their application options. In most instances, student scores are only lightly scrutinized (or simply ignored altogether) by the programs themselves, and instead reviewed—where they are reviewed—by individual universities’ Graduate Colleges, which often have minimum GRE-score requirements (typically very generous ones). Creative writing MFA applicants should not avoid the GRE General Test for fear of the Mathematics portion of the exam; even those programs that do give minor weight to standardized test scores in their admissions processes generally look only at applicants’ Verbal and Analytical Writing scores. At present no programs require the GRE Subject Test in English Literature, though two programs (Johns Hopkins University in Baltimore and Boston University in Massachusetts) strongly suggest that applicants sit for and submit their scores from this exam. Applicants should also be aware that certain university-wide fellowships and grants require the submission of GRE scores. Applicants who do not submit such scores with their applications cannot be considered for these forms of financial aid.

Language Required

This category indicates whether or not a program requires applicants to exhibit proficiency in a foreign language prior to graduation. Some programs with a foreign-language requirement allow applicants to place out of this requirement through the submission and application of prior foreign-language course credits at the college level; other programs require that applicants take an exam (often a reading-knowledge-only translation exam) to show proficiency, regardless of their prior foreign-language experience. At present only a small minority of programs—nine of the 85 listed in the index, or 11 percent—have a foreign-language requirement as part of their curriculum. However, the category is presented here due to applicants’ great interest in, and sometimes anxiety about, such prerequisites for graduation.

Cross-Genre

Certain MFA programs require that individuals who apply and are admitted in a particular genre take only workshops in this “declared” genre while in-program. Other programs permit, or even require, matriculated students to take out-of-genre workshops—and among this latter group are two further subcategories of programs, those that permit students to take as many out-of-genre workshops as they wish, and those that permit or require only a limited number of out-of-genre workshops.

The past five years of online, public discussions between and amongst MFA applicants suggest that the availability of cross-genre study has become one of the top concerns for applicants seeking additional curricular information about the programs to which they wish to apply. Many applicants already write in more than one genre, and hope to have their multifaceted talents as literary artists shepherded, rather than impeded, by the curricula of programs on their chosen application list; other students are merely curious about genres other than their own, and view their in-program time as a rare opportunity to experiment with modes of literary art other than those with which they are already conversant. A smaller—but growing—subset of the applicant pool is composed of self-styled “literary artists” rather than simply “poets” or “writers,” and these individuals already incorporate so many different aesthetic traditions into their work that to be limited to either “poetry workshops” or “prose workshops” would (in their view) be a betrayal of their artistic vision.

Because the availability of cross-genre study is such a prominent concern amongst the applicant class, it is listed as a separate category here. All data for this category were taken directly from program websites; any program that permits or requires applicants to take out-of-genre workshops, in whatever number, has been listed in this column as a “yes” (“Y”). Programs that explicitly prohibit such study are indicated with a “no” (“N”). Because the tradition, among MFA programs, has been to disallow cross-genre study, programs whose websites were silent on the question of such study were also treated as, and are listed in the index as, a “no” for this measure.
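The coding rule just described can be summarized in a few lines. The sketch below is purely illustrative; the policy labels passed to the function are assumptions, not values drawn from any program's website.

    def cross_genre_code(policy):
        # Per the text: programs that permit or require out-of-genre workshops are "Y";
        # programs that explicitly prohibit them are "N"; programs whose websites are
        # silent on the question default to "N". The label strings are assumptions.
        if policy in ("permits", "requires"):
            return "Y"
        return "N"  # "prohibits" or no statement found

    print(cross_genre_code(None))  # "N" -- website silent on cross-genre study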

Dates of Establishment

Listing the dates of establishment for the nation’s full-residency MFA programs offers critical historical context for the full-residency program index, the institutions profiled and assessed in the chart, and the very degree that is the focus of both the chart and the institutions whose attributes the chart’s surveys and hard-data measurements catalogue. This column of data does not apply to nonterminal, academic master’s programs in creative writing (with or without the option of a creative thesis), which are different in form and function from their longer, generally better-funded, more studio-oriented, terminal, art-degree MFA peers.

Previous survey methodologies used in assessing terminal-degree creative writing programs have leaned heavily on the somewhat tendentious factor of program visibility. When programs are assessed by individuals already within the system, the natural result is that older programs—whatever their selectivity, financial resources, faculty resources, curriculum, pedagogy, or student outcomes—move into positions of prominence due to their profile advantage. Yet applicants report only limited interest in programs’ historical pedigrees, as pedigree itself is often considered a suspect quantity in the national literary arts community. By publishing, for the first time, the dates of establishment of 84 of the nation’s 167 full-residency MFA programs, the 2013 MFA Index permits applicants and other consumers of these data both to disassociate historical pedigree from the distinct question of program quality and to better understand the historical context in which the creative writing MFA has achieved such cultural prominence.

Creative writing as an academic discipline originated in the late nineteenth century, yet by January of 1964 there was still only one MFA-conferring graduate creative writing program in the world. In fact, though the first MFAs in any field were granted in the 1920s, and the MFA-conferring Iowa Writers’ Workshop was founded in 1936, the MFA as a degree would have no abiding place in the national literary arts community until the 1980s. The 1940s, 1950s, and much of the 1960s were marked by attempts to find alternative models to the one provided by the Iowa Writers’ Workshop: first, in the degree-granting, relatively nonselective, grade-free creative writing program at Black Mountain College, which was founded in the 1930s but had its heyday in the late 1940s and early to mid-1950s; second, in the undergraduate-only creative writing program at Stanford University (founded in 1947 by Wallace Stegner) and other undergraduate programs modeled closely upon it; third, in institutional but non-degree-granting programs like the Writers’ Program at the University of California in Los Angeles, founded in 1964; fourth, in non-institutional workshops such as the Umbra Workshop, a forerunner of the Black Arts Movement, founded on Manhattan’s Lower East Side in 1962; and fifth, in nonterminal MA programs in creative writing founded at a number of institutions, including Johns Hopkins University in Baltimore (1946), University of Denver in Colorado (1947), Cornell University in Ithaca, New York (1948), Indiana University in Bloomington (1948), University of Florida in Gainesville (1948), and Hollins University in Roanoke, Virginia (1960). Some of these latter programs required academic theses of their students rather than creative ones.

Ultimately, certain elements of the Iowa Writers’ Workshop MFA model became ascendant—after these and other elements had been experimented with by the types of degree programs listed above—because of a grass-roots campaign by working creative writers (among both faculties and student bodies at various institutions) to gain greater administrative, pedagogical, and creative autonomy from the academy to which they had previously been attached. Most of the early MFA programs appear to have been founded only after years—in some cases several decades—of struggle between creative writers and university bureaucrats, with the two primary bases for the latter’s objection to the MFA being that it cost much more than the MA to administer (due to the need for greater faculty resources, and the necessity of awarding tuition-remission-eligible assistantships to many terminal-degree candidates) and that it permitted universities less immediate oversight of their resident literary artists. Far from a “cash cow” warmly embraced by U.S. universities, the creative writing MFA was for decades rejected by America’s universities—and often their English department faculties—as too exotic, too expensive, and too distant from the traditional academic functions of an American English department.

At the beginning of the 1980s there were still fewer than two dozen creative writing MFA programs in the world. It was not until the turn of the century that the rate of MFA-program creation significantly increased, as indicated by the table below, which catalogues MFA programs’ dates of establishment by decade:

MFA Programs Founded, by Decade

1920s: 0
1930s: 1
1940s: 0
1950s: 0
1960s: 11
1970s: 11
1980s: 27
1990s: 41
2000s: 94
2010s: 132 *

* = This is prorated from the number of programs founded in the first thirty months of the decade.
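The arithmetic behind the asterisked figure is simple proration: a count observed over the first thirty months of the decade is scaled to the decade's full 120 months. In the sketch below, the observed count of 33 programs is inferred by working backward from the prorated total of 132 reported above; it is not independently reported.

    # Prorate a founding count observed over the first 30 months of the 2010s to a
    # full 120-month decade. The observed count of 33 is inferred from the prorated
    # figure of 132 given in the table above, not independently verified.
    observed_programs = 33
    months_observed = 30
    months_in_decade = 120

    prorated = observed_programs * months_in_decade / months_observed
    print(prorated)  # 132.0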

Location Assessments

While not listed in the 2013 MFA Index, location was one of the seven categories used to determine inclusion in the index. Programs located in or within thirty miles of a location that appeared in any one of the following eight national media assessments of the best places for individuals (particularly students and young professionals) to live and work were included. Those assessments were Bloomberg Businessweek (“America’s 50 Best Cities,” 2011); U.S. News & World Report (“10 Great College Towns,” 2011); Parents & Colleges (“Top 10 Best College Towns,” 2011); Travel + Leisure (“America’s Coolest College Towns,” 2009); The American Institute for Economic Research (“Best College Towns and Cities,” 2011); StudentUniverse (“Top 10 Cities to Visit in Europe,” 2011); MoneySense (“Canada’s Best Places to Live,” 2012); or ELM (“2011 Top Cities to Live and Work Abroad in Asia,” 2011).

Some of the criteria used by the listed media organizations included: the concentration of student-centered restaurants, bars, museums, pedestrian malls, bicycle paths/lanes, parks, hiking trails, sporting events, theaters, and concert venues; the total number of restaurants, bars, and museums per capita; the total number of colleges, libraries, and professional sports teams; income, poverty, unemployment, crime, and foreclosure rates; the percentage of the local population with bachelor’s degrees; park acres per 1,000 residents; and air quality.

Application Fee

The application fee column lists each program’s application fee for the most recent application cycle.

The relevance of these data has increased in recent years, as three distinct but related phenomena have been observed in the MFA admissions system over the past six admissions cycles: acceptance rates at the nation’s most competitive programs are steadily declining; applicants are responding to this trend by applying to a larger and larger number of programs each year (the conventional wisdom in 2005 was that the average applicant should apply to eight to ten programs; now, applicants are regularly advised to apply to between twelve and fifteen programs, and more if financially feasible); and, consequently, the total cost of a single application season is rising.

Given the cost of the Graduate Record Examination (GRE) General Test ($160), the cost per GRE “score report” to be sent to individual programs ($23), and the cost per transcript ordered from an alma mater (costs as high as $50 per transcript at some institutions, and rarely if ever less than $5 per transcript; some applicants, particularly nontraditional and international applicants, must order transcripts from multiple alma maters), applicants are increasingly unable to afford to apply to programs with high application fees. And because of the importance of applicant surveys to the MFA Index, programs with higher application fees are likely to receive fewer applications per annum and thus place lower in the one-year and four-year applicant popularity surveys than they otherwise would.
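As a rough illustration of how these fixed costs accumulate across an application season, the sketch below totals the fees cited above for a hypothetical applicant. The number of programs, the average application fee, and the per-transcript figure used in the example are assumptions; the GRE test and score-report costs are those given in the text.

    def application_season_cost(num_programs, avg_application_fee,
                                transcript_fee=10.0,
                                gre_test_fee=160.0, gre_report_fee=23.0):
        # One-time GRE General Test fee, plus a per-program application fee,
        # GRE score report, and transcript. GRE figures ($160, $23) are from the
        # text; the default transcript fee and the example arguments below are
        # illustrative assumptions.
        return gre_test_fee + num_programs * (avg_application_fee + gre_report_fee + transcript_fee)

    # Example: fifteen programs at an average $60 application fee
    print(application_season_cost(15, 60.0))  # 1555.0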

LOW-RESIDENCY PROGRAM PROFILES: ADDITIONAL PROGRAM MEASURES

Genre Availability for Study

Genre Availability for Study measures the number of genres in which a given low-residency program offers either a formal track or an informal concentration or focus. Because many programs occasionally offer courses in genres for which they do not offer a track or concentration, it can be difficult to assess which genres a given program supports strongly enough that it is likely to receive creative portfolios in them. The aim of the Genre Availability for Study category is to include only those genres in which a program is likely to attract an appreciable number of applicants—as opposed to a genre offered only as an occasional brief-residency course lacking substantial nonresidency, academic-year faculty support.

Residency

Residency measures the number of days per year a low-residency student in the program is required to be on campus. While at present there is no conventional wisdom as to the value of a long residency versus a shorter one, low-residency programs' residencies are typically the only opportunity for students to interact in person with their professors and to meet their classmates; consequently, many applicants to low-residency programs may prefer longer residencies.

 

Seth Abramson is the author of Northerners (Western Michigan University Press, 2011), winner of the 2010 Green Rose Prize from New Issues Poetry & Prose, and The Suburban Ecstasies (Ghost Road Press, 2009). A contributing author to The Creative Writing MFA Handbook (Continuum, 2008) and a 2008 recipient of the J. Howard and Barbara M. J. Wood Prize from Poetry, his work has recently appeared in Best New Poets 2008, American Poetry Review, Boston Review, New York Quarterly, and elsewhere. Currently a doctoral candidate in English at the University of Wisconsin in Madison, he is a graduate of Harvard Law School and the Iowa Writers' Workshop.

Comments

Many of the same flaws

I've so far put off commenting on this cosmetically altered version of the "rankings." So, apparently, have others. The people I've discussed this topic with have not based their decision not to respond on the judgment that they consider the problems with this barely-different "methodology" solved; they've based it on the question of whether or not we should ignore this enterprise altogether.

I decided I shouldn't. The rankings are based on so many logical and empirical flaws that it's important, I think, for someone to address them (and I'm hardly alone in this opinion). So I'm gonna add my many 2 cents in the next few weeks, when I have spare moments.

As I mentioned last year, my brother is a mathematical (as opposed to applied) statistician--which means he also understands the applications of stats. Having already read half of the "methodology" (it's still so long), my brother raises the obvious question about sample size. Before I received a response from him, I'd raised to him the question of the Central Limit Theorem, and the question of when it does not apply. (I first encountered the theorem in stat 101.) It does not apply well to this kind of sampling.

As I did early on last year, I won't read any of Seth's responses unless a friend tells me that I ought to because of some inaccuracy he's stated about my claims or other good detail I should consider. I've seen smart and fair questions raised in response to Mr. Abramson's claims: e.g., on (I believe) HTMLGiant, a woman raised the quite reasonable point that maybe it was "misleading" for Abramson to say that Iowa's program was the "best" MFA program long before any other program existed; after all, "better" and "best" imply that there's something to which the item in question can be compared. His response to her was, in my view, rude and unfair.

Besides, I'm not really writing to him anyway.

 

P.S.

P.S. Sorry for the typos in my first post! I find spell checks useful (though I never rely on grammar checks)--and for that reason, I miss the squiggly red line! (Those can catch clerical errors as well!)

The assumptions you cannot make in science

Bit by bit, I want to respond to several assumptions made by Mr. Abramson. What I'll say tonight:

He describes his respondents as "well researched," yet he provides no empirical evidence whatsoever to support this claim. Also, he states in his "methodology" that his enterprise isn't "scientific" because, he argues, not all programs have responded. The problem, however, is that even if every program WERE to provide all the data he's searching for, his "rankings" (or "index," or whatever P&W wants to call it this year) would STILL be unscientific, and here's part of the reason that's the case:

One cannot make assumptions about one's sample without supporting evidence; also, human subjects, when it comes to their OPINIONS or FEELINGS (as opposed to, say, tissue samples), are fraught with well-known interpretive difficulties that go beyond those found in the typical study in the natural or physical sciences.

Anyone who fully understands the scientific method understands at least this much: Making unsupported assumptions about your sample is NOT THE WAY SERIOUS SCIENCE IS DONE.

Anyway, more later...

 

Oh, and I'll add...

He argues that surveying MFA graduates would create a biased sample because graduates would tend to rank their own programs highly. Fair enough--at least in theory.

But his method of examining prospective applicants doesn't get rid of the problem of bias; it merely replaces one kind of bias with a set of other biases beyond funding: location, picking "easy" programs regarding admission, having a "connection" to a particular faculty member, etc., etc., etc.

Noticing such details isn't rocket science.

(Oh, by the way: I'm trying to figure out how to separate paragraphs with white space. It seemed easier last year!) 

 

I agree with Caterina

These rankings continue to be an absurd blemish on PW's otherwise superb support for the CW community. The whole debate seems very simple to me - the information is useful, so make it available. But ranking requires criteria, and no one has yet come up with sensible and generally applicable criteria for ranking MFA programs. Seth Abramson's criteria might work for him, and that's great. But putting PW's name on Abramson's ranking is silly (almost as silly as prorating the number of MFA programs founded in the 2010s on the basis of the number founded in 'the first thirty months' of the decade).

A few final (?) thoughts

I agree with you, TimO'M.

A few additional claims made by the poll's creator that I'd wanted to respond to, including statements made by him that I think of as "Sethisms--a term I don't mean as denigrating but use because Seth has made a number of claims I'd never heard elsewhere but that he presents as if they should be believed merely because he made them (unless he thinks they carry some other obvious force: e.g., that they're self-evident or were handed down by the MFA Goddess and transcribed by Seth):

1) That MFA programs provide a "nonprofessional, largely unmarketable degree..." The problem with this claim is that this wasn't the case before the number of MFA programs mushroomed (I think there are too many MFA programs now, and I suspect that some of them were created as cash cows--little equipment required but good salaries for the faculty and cheap labor from those who do manage to get funding from them). Although most Harvard law grads probably do manage to find good-paying jobs in the profession, the same phenomenon, more or less, has happened with law schools--a professional and marketable degree, traditionally: http://www.nytimes.com/2011/01/09/business/09law.html?pagewanted=all.

In fact, numerous law professors and members of the American Bar Association have questioned the ethics of this phenomenon.

2) That teaching is a relatively unimportant component in the MFA experience. While Seth is welcome to his opinion on this matter, that's all it is: his opinion. I earned my MFA from a program that, according to Seth, is associated with high post-grad employment. Why did I choose to apply there, though? A) the quality of the alumni; and B) the quality of the writers on the faculty. Most others I knew who had applied to harder-to-get-into programs considered the same two factors.

Although being a good writer doesn't guarantee that one will be a good teacher, I've had only one writing teacher who excelled at the former but not the latter. Most good writers are good readers. How can that help (enormously, I'll add) an MFA student? By being read by a nuanced reader who understands the art form--someone who isn't also in competition with you, by the way--you can learn what you're doing well and what you're not doing so well. (Many of us have witnessed or even experienced this phenomenon: where one student will make in workshop a humane and fair-minded criticism of another student's piece and then the latter will later say, as payback, something nasty about the former's work. It's childish but also human, and it's more likely to occur among peers.)

No precise scientific measure will be created for MFA rankings, and I suspect that's why Abramson treats, for example, the quality of the faculty as rather trivial. How would he be able to measure the quality of the writers on the faculty? By awards won? Which awards? As imperfect as it was, I find the old U.S. News & World Report ranking helpful in that a) a faculty respondent was unable to rank her own school and b) faculty, who often guest-teach at other programs, have an idea of where the better students tend to be studying. Any more "scientific" a ranking seems highly unlikely to me.

3) That it's really one's classmates--peers--that determine the quality of one's experience in an MFA program. Again, Mr. Abramson is entitled to his opinion, but that's all it is. A talented poet, and perhaps the gentlest person in my class, left after the first year (of a four-year program) for what he said would just be a "leave." He never returned. One thing he told me before he left was that he'd found no "writing community" there. Others did. But let's face it, an MFA program can include a lot of back-biting among students. (A friend of mine who attended Iowa in the '80s said that a running joke there was that the Iowa Writers' Workshop kept the student counseling services plied with clients. Perhaps the environment there is more humane now. It's refreshing to see the current director publicly state that applicants she's strongly supported--based on their writing sample--have sometimes been, to her surprise, rejected by the votes of the rest of the faculty.)

4) This distinction between "studio" and "academic" MFA programs, terminology I hadn't encountered pre-Seth Abramson (though I'd done an enormous amount of research on programs before I applied). He's said that Iowa is one of the "least academic" programs. By what measure? That they don't give grades? I know someone who took, during his MFA program there, a seminar that included classical Greek thought and was taught by James Alan McPherson: Pulitzer winner, Guggenheim and MacArthur fellowship recipient, and graduate of Harvard Law School before he attended the IWW. (Ever read any of his essays, often known for their intellectual, as well as emotional, nuance?) Not an "academic" program? (In contrast, my more "academic" program focused on reading literature as an art form; no postmodernist/post-structuralist/cultural studies-based lit-crit was involved. Otherwise, I wouldn't have attended.)

5) That the level of the writing of MFA students at Iowa (or similar programs) is exceptional (I wish I could find the reference to that--if I do, I'll include it)--another justification for the claim that teaching isn't all that important?

While, as an undergraduate, I was taking other kinds of courses at Iowa, I used to sneak to the bin of fiction submissions for workshop (but only after the workshop had met) and steal the one or two leftovers (I wanted to write fiction but was also scared by the prospect). Some of the writing was exceptional. Some of it, though, was rough-hewn (it was a workshop, after all)--and occasionally it was relatively bad, even if the prose was pretty good. My friend once described to me Frank Conroy's response to such stories: "Beautiful prose in the service of what?" (i.e., where was the plot, the characterization, the conflict, the sensory detail...?) Yes, this is hearsay, but I've heard the same depiction from several other grads of the IWW.

Given all of these obvious questions in response to this "ranking" system, what is it that has convinced P&W to attach its name to it and give it such exposure?

I want to raise one more matter (one I consider at least as important as the above concerns I expressed), but it's getting rather late, so I'll sign off for now.

My feline pal, Caterina (one of three cats I live with), thanks you on my behalf for your indulgence--assuming you've made it this far into my comments.

Oh, just caught:

Forgot the end-parentheses in the second sentence above ("Sethisms").

While I'm thinking of it...

On the positive side: The application numbers are being called “popularity,” as they should be.

On the less positive side: It appears that Seth has still failed to distinguish “selectivity” from “acceptance rate.” As a Yale University administrator, whom I quoted last year, pointed out, the quality of the applicant pool makes a huge difference. In other words, a program that has a 25 percent acceptance rate might be more selective than some schools with, say, 10 percent acceptance rates. (And I have no bone to pick here: According to Mr. Seth’s own measures, the program I finished has a 4-5% acceptance rate.)

I’m, of course, in the above references, talking about Columbia (and some of the other NYC schools). For whatever reasons, Columbia’s MFA program has been associated with an exceptionally large number of fine writers. Tom Kealey and Seth Abramson were correct in alerting MFA applicants to the reality that funding is more available at some schools than at others, and that some of those latter schools are incredibly expensive if you don’t get funding. But it seems that Mr. Seth categorizes such schools as moral transgressions, even though some students get funding from them. (And anyway, if you’re living in NYC and you’ve got the money...)

I also wrote earlier about Seth’s distinction between “studio” and “academic” MFA programs in creative writing, a distinction that caught my attention because I’d never encountered it anywhere when I applied to programs in the ‘90s—which is why I came to call such terms “Sethisms.”

 Again: I have a friend who, during his MFA program at Iowa, took a seminar under James Alan McPherson--who also has a Harvard Law degree--on early classical Western thought. How is that not “academic”?? And why should we conclude that artistry and intellect are mutually exclusive? Since when? The idea that they're deeply different is a fairly recent distinction in the West.

 And one more time: In my own four-year program, we didn’t study Derrida or Foucault, etc., etc... So is that "academic" or not?

Oh, and I’ll add for good measure: I think Jorie Graham is, at least in her later work, a fantastically bad poet. Iowa (IWW) is lucky to be rid of her. And if we’re talkin’ intellectual stuff: Graham’s stupidly irrelevant references to obscure Latin botanical terms and to quantum theory say one thing she seems to want others to believe about her above all other possibilities: “I’m really really really really smarter than you!!”

(And I'll later post a small bit about Columbia.)