Models for Writing

Randall Stephens

When I teach writing I use a short piece by William Zinsser from the American Scholar: "Writing English as a Second Language" (Winter 2010). Yes, my students are native speakers.  Regardless, this essay is spot on for college students. (I've blogged about it before here.)
A WPA poster from 1937. Courtesy
of the Library of Congress.

Zinsser offers up a host of great tips:

Cut horrible, long Latin-origin words: "communicated, conversion, reconciliation, enhancements, verification." Overused, they lead to stilted, stuffy prose.

Use good, short, simple nouns: "infinitely old Anglo-Saxon nouns that express the fundamentals of everyday life: house, home, child, chair, bread, milk, sea, sky, earth, field, grass, road."

"I have four principles of writing good English. They are Clarity, Simplicity, Brevity, and Humanity."

"So remember: Simple is good. Writing is not something you have to embroider with fancy stitches to make yourself look smart."

Undergrads and grad students need to hear that advice over and over again.

Zinsser also remarks "We all need models. Bach needed a model; Picasso needed a model. Make a point of reading writers who are doing the kind of writing you want to do."  I like to add that if a student is not interested in reading and makes no effort to read good prose, then he/she will most likely never become a good writer.

The point about reading and having a model is excellent. I have had students in my history methods/historiography course bring in one or two books--fiction or nonfiction--that they have enjoyed and that we can use to spark a discussion. We go around the table and ask, "What makes this a good book? How does the author set the scene and use color? What do I like about the writing, organization, etc.? How could I model my own writing, in some way, on this?"

Here are some of the authors and books that I enjoy reading.  (I've brought a few of these titles into class for the above exercise.)  I can only hope a little of the style of these authors will rub off on me.  

Jane Smiley, The All-True Travels and Adventures of Lidie Newton (1998)


Thomas McMahon, McKay's Bees (1979)

Peter Guralnick, Last Train to Memphis: The Rise of Elvis Presley (1995)

David Foster Wallace, A Supposedly Fun Thing I'll Never Do Again: Essays and Arguments (1998)

Richard Russo, Straight Man (1998)

Mary McCarthy, The Groves of Academe (1952)

Flannery O'Connor, Wise Blood (1952)

Cormac McCarthy, The Road (2006)

Kurt Vonnegut, Breakfast of Champions (1973)

Haruki Murakami, The Wind-Up Bird Chronicle (1998)

Sherwood Anderson, Winesburg, Ohio (1919)

John Dos Passos, The 42nd Parallel (1930)

What about you?  What models do you have? 


Ashes of Ancient Halls Made Into Mounds


Archaeologists in England have unearthed the burned remains of two massive 6,000-year-old buildings whose ashes were shoveled into piles to make burial mounds.

"The buildings seemed to have been deliberately burned down," says Julian Thomas, a professor at the University of Manchester who’s leading the excavation. Researchers believe these halls of the living may have been transformed into "halls of the dead" after a leader or important social figure died.
The remnants were uncovered in an open field near Dorstone Hill in Herefordshire. For decades, amateur archaeologists have noticed pieces of flint blades in the area and wondered whether the land there contained relics of a long-forgotten time.
 According to LiveScience.com:
When Thomas and his team began excavating, they found two large burial mounds, or barrows, that could have held anywhere from seven to 30 people each.  
The smaller barrow contained a 23-foot-long (7 meters) mortuary chamber with sockets for two huge tree trunks. Digging deeper, the researchers uncovered postholes, ash from the timbers, and charred clay from the walls of an ancient structure. 
These burnt remains came from what were once two long-halls, the biggest of which was up to 230 feet (70 m) long, with aisles delineated by wooden posts and several internal spaces.
It’s not clear who built the halls and barrows, though the building construction is similar to that found in England between 4000 B.C. and 3600 B.C., predating the construction of Stonehenge by a millennium.

Artist's conception of one of the long halls.


Accessing History: A “Laboratory” at Gettysburg

Eric Schultz

I was fortunate in early July to attend three days of the 150th commemoration of the Battle of Gettysburg, including a number of events sponsored by the Gettysburg Foundation. It was busy, colorful, sometimes somber but always tropical, a good reminder of what conditions were like in July 1863.  The battlefield itself, nearly 6,000 acres and sometimes called the “symbolic center of American history,” is both inspiring and beautiful. 

The 150th commemoration included a retelling
of the battle and featured first-person accounts.
Events included a spectacular retelling of the battle (focused on first-person accounts), and the grand opening of the Seminary Ridge Museum at which visitors could climb its historic cupola to get a bird’s eye view of the battlefield and town.

As I attended various gatherings, however, it struck me that Gettysburg was nothing less than a kind of living laboratory for how people access history.

For example, there were lots (and lots) of folks taking tours of the battlefield, often led by certified National Park Service guides.  To walk in the footsteps of soldiers and view the battle lines redefines history in a whole new way for many.  Likewise some of the largest groups could be found on Little Round Top, where Col. Joshua Chamberlain made his famous stand--a tribute not only to Chamberlain and his troops, but to the power of Hollywood and films like Gettysburg, capable of creating historical celebrities. 

There were plenty of other visitors taking self-guided tours, some with maps and some with iPods, and some with their noses pressed to air-conditioned windows as they followed along an auto tour. There were individuals, couples, and families, the latter often gathered around a monument while Dad took pictures and the youngest played surreptitiously on a Game Boy. (Years later. . . Dad: “Remember when we took that great trip to Gettysburg?” Son: “Um, I think.”) There’s no telling how many monuments and battlefield scenes were “Instagrammed” that weekend, speaking of interesting new ways to access history.

At one event, a young Marine sat next to me.  He’d served three tours in Iraq and was now, in his words, “doing time” as an instructor.  He said he’d driven over from Quantico for the weekend, spent all day on his mountain bike touring the battlefield, went to every event he could attend, and planned to do the same for every 150th Civil War celebration he was able.  He was clearly and enthusiastically engaged in accessing history.

The bookstores were full, and, I’m told, did a landmark business in souvenirs and (I hope) books. There were certainly plenty of people in hotel lobbies, restaurants, and under trees reading as they tried to absorb and understand events. At the same time, souvenirs are ever-important and, for that matter, first cousin to “relic hunting,” a time-honored (if not always honorable) way of accessing history. Gettysburg’s famous copse of trees, perhaps the most sacred spot on the battlefield because it represented the “high tide of the Confederacy” where Pickett’s and Pettigrew’s men were finally turned back, had to be fenced in as early as 1887 because so many souvenir hunters were cutting branches to make walking sticks.
 A view of the battlefield, including tents of
some of the reenactors, from the cupola of the
new Seminary Ridge Museum.

Some ways of accessing the history of Gettysburg are clearly unwelcome.  The National Park Service has long held that the battlefield ought to be unblemished so that visitors can use "grounded imagination" to experience the battle.  When businessman Thomas Ottenstein erected a 307-foot galvanized steel viewing tower--“a classroom in the sky”--near the battlefield in 1974, it was enjoyed by many but seen by others as an abomination.  The structure stood until 2000, when the National Park Service seized it under eminent domain and knocked it down with explosive charges.

In Sacred Ground, Edward Tabor Linenthal describes the long, often controversial history of the Gettysburg battlefield as veterans attempted to access their own history.  Beginning in the 1870s, Pennsylvania chapters of the Grand Army of the Republic (GAR) held reunions at Gettysburg.  With the dismantling of Reconstruction, Union troops were joined by Confederate veterans.  Combined groups tended to emphasize valor on both sides, as veterans like Maj. Gen. Daniel E. Sickles proclaimed at the 25th reunion in 1888, "To-day there are no victors, no vanquished.  As Americans we may all claim a common share . . . in the new America born on this battlefield." 

Not everyone agreed.  Addressing a chapter of the GAR that same year, Bvt. Brig. General J.P.S. Gobin angrily declared that he was "tired of this gush and pretense for the glorification of the veteran simply because he wore a gray uniform with a Southern flag printed on his badge.  That badge meant treason and rebellion in 1861, and what it meant then it means now. . . ."   Others felt that in the rush to reconcile North and South, the plight of blacks and the issue of slavery were lost.

At the fiftieth anniversary of the battle in 1913, enough time had passed that 55,000 Union and Confederate veterans converged on Gettysburg for a four-day celebration.  Festivities included 6,500 tents, 173 kitchens, stores filled with pennants and flags, and a handful of fistfights.  Nonetheless, the event emphasized collective heroism and healing, and featured the ideology of the Lost Cause that had developed in the postwar South.  Capt. Bennett H. Young, commander of the United Confederate Veterans, now accessed a kind of combined history by saying, "It was not Southern valor nor Northern valor.  It was, thank God, American valor." There was a famous handshake near the copse of trees when 300 veterans of Pickett's and Pettigrew's charge and defense met.  Four years later, Virginia became the first former Confederate state to erect a monument on the battlefield.

At the 75th in 1938, 1,400 Union and 500 Confederate veterans--average age 94--were still hearty enough to gather at Gettysburg for the "Last Reunion of the Blue and the Gray."  But something interesting happened with the passing of the last veterans of the Civil War in the 1940s: A kind of enthusiastic “subculture” arose as a way to continue accessing history.  Civil War Roundtables (discussion groups begun in Chicago), relic hunters and collectors, war-gamers and, of course, reenactors emerged--the latter being among the most controversial.  Dressed in authentic period clothing and intent on recreating the battle experience in every way, reenactors were among the most visible visitors during my time at the Park.  I found General and Mrs. Lee escaping the heat in my hotel lobby, for example.  There were fields of tents spread around the park and soldiers at every turn.

Frankly, reenactment doesn’t seem like much fun to me.  I was hot enough in shorts, and much of the time I saw the troops, clothed in wool, standing at attention in the hot sun.  (Not to mention, I gave up sleeping in pup tents when my son graduated from Cub Scouts.)  And there is certainly a school of thought that abhors reenactors as much as it does galvanized steel “classrooms in the sky.”  Popular Civil War historian Bruce Catton was especially critical of battle enactments which, he said, "require us to reproduce, for the enjoyment of attendant spectators, a tin shadow-picture of something which involved death and agony for the original participants."
Despite the heat, guided tours of the
battlefield and monument were in full
swing throughout the 150th.

However, the view is entirely different among reenactors, who staged two large-scale battles during the 150th commemoration.  One participant wrote passionately afterwards, “The horror of the Civil War hit me then, in ways that history books and Ken Burns’ films never had.  I was watching real people, all of them Americans, killing each other.  I knew it wasn’t real, but I also knew that if it had been, I would have fallen on the ground and sobbed.”

It’s pretty hard to say that that’s not accessing history.

Some 235,000 people visited Gettysburg during the commemoration. They read, walked, drove, toured, listened, visited museums, bought souvenirs, took thousands of digital pictures, camped, mountain-biked, and reenacted, accessing history in all sorts of interesting ways; one new film even uses drones to illustrate the battle. As with most things Gettysburg, however, it may be best to look to Abraham Lincoln for the final word: How we preserve and interpret the battle’s meaning--and by implication, find ways to make and keep it accessible--should be, he said, all part of the "great task remaining before us."


History and the Voting Rights Act Roundup


NPR Staff, "The Voting Rights Act: Hard-Won Gains, An Uncertain Future," NPR, July 21, 2013

. . . . Congress also noted, however, that the Voting Rights Act was still needed, and it had been used hundreds of times since 1982 to protect against discrimination.

Fort Scott, Kansas, Tribune, August 6, 1965, p. 1.
From the Google News Archive.

But in his opinion, Chief Justice John Roberts suggested it is a new era. "Our country has changed," he wrote for the majority.

But Rep. Lewis says race is still very much at the forefront.

"I think there has been a deliberate and systematic effort on the part of certain forces in our country to take the whole idea of race out of public policy," he says. "Race is involved in everything that makes up America, and we cannot escape it. We have to deal with it face on.">>>

John Paul Stevens, "The Court & the Right to Vote: A Dissent," New York Review of Books, August 15, 2013

In Bending Toward Justice, Professor Gary May describes a number of the conflicts between white supremacists in Alabama and nonviolent civil rights workers that led to the enactment of the Voting Rights Act of 1965—often just called the VRA. The book also describes political developments that influenced President Lyndon Johnson to support the act in 1965, and later events that supported the congressional reenactments of the VRA signed by President Richard Nixon in 1970, by President Gerald Ford in 1975, by President Ronald Reagan in 1982, and by President George W. Bush in 2006.>>>

"The Future of the Voting Rights Act," On Point, WBUR, June 27, 2013

The Voting Rights Act was the monumental achievement of the civil rights movement, a powerful federal response to racist policies like poll taxes and literacy tests that kept blacks home on Election Day. But that was 1965.

Today southern blacks vote at even higher rates than whites. So this week the Supreme Court struck down the heart of the law, freeing mostly southern states to change their election laws without federal approval. Guests: Vernon Burton, Kareem Crayton, and Ronald Keith Gaddie.>>>

John Fund, "A Civil-Rights Victory," National Review, June 25, 2013

The Supreme Court’s decision today to overturn a small part of the 1965 Voting Rights Act is actually a victory for civil rights.  As the court noted, what made sense both in moral and practical terms almost a half century ago has to be approached anew. . . . Clint Bolick, director of litigation for the conservative Goldwater Institute in Arizona, says the demise of Section 5 of the Voting Rights Act will also reduce the balkanization of racial gerrymandering that has become so popular lately. “Voting districts drawn on racial or ethnic lines divide Americans,” he says. “This decision helps move us toward the day in which racial gerrymandering becomes a relic of the past.”>>>

"New obstacles for voters," Baltimore Sun, July 22, 2013

If there were any doubt the Supreme Court erred badly in the term just ended by striking down a key provision of the Voting Rights Act designed to protect minorities' access to the polls in states with a history of voter discrimination, it's been dispelled by the swift reaction in states formerly covered by the law's pre-clearance requirement. Officials there have lost no time in using the ruling as a license to start discriminating again.

Barely two hours after the court declared unconstitutional Section 4 of the act, which determined which states were required to get Justice Department approval before changing election laws in ways that disproportionately affected minority voters, Texas' attorney general announced that a 2011 voter-ID law a lower court had blocked as discriminatory would go into effect "immediately." Over the next week, four more former pre-clearance states moved to tighten restrictions on voting.>>>


Memo to America, Re: Welfare in the Olden Days

Gabriel Loiacono 
 
One evening, while I was chatting with friends from church, one of them asked me what kind of history I focused on. I told him: the history of welfare in early America. He said: what welfare in early America?

"The drunkard's progress, or the direct
road to poverty, wretchedness & ruin," 1826.
Courtesy of the Library of Congress.
I find myself having a conversation like that one more and more these days.  Whether on the left or the right politically, high school grads or Ph.D.s, most Americans I talk to assume that welfare is a creation of the twentieth century: midwifed by Franklin D. Roosevelt or Lyndon B. Johnson.  Those hearty, independent minutemen of the Revolutionary period, they assume, either made the poor find work or relied only on churches for charity. 

Occasionally, this assumption is voiced explicitly in national, political discourse.  For example, in a famous September 12, 2011 Republican Presidential Primary debate, Representative Ron Paul described assistance to the poor in the past thus: “Our neighbors, our friends, our churches would do it.”  Less off-the-cuff, respectable-looking websites will tell you that charity was almost entirely private before FDR, aside from a few dark and dingy poorhouses, which were more effective at driving inmates out than keeping them comfortable.  And it is not only critics of welfare who think this; one can find defenders of welfare describing the U.S.A. as essentially without welfare before FDR.[1]

More often, this assumption is implicit.  You can see this in recent discussions of food stamp policy and the Farm Bill.  When both critics and defenders of welfare policy bring history into the argument, they usually head back to the 1960s.  Occasionally, they reach back to the 1930s.  They almost never go further back in time.  On both sides, the assumption is that prior to the New Deal, there was no welfare to discuss.  Thus, these are the good old days or the bad old days depending on what you think about welfare today. 

It is for this reason that I fantasize about writing the following memo:

MEMORANDUM

TO: The American People
FROM: A Historian
CC: Candidates, Think Tanks, Warriors of the Internet Comment Boards 
SUBJECT: Um, actually there was welfare when the United States was founded


I would go on, of course, to flesh this statement out with some background, evidence, and precision.  I would point out that poor laws came to North America almost with the first British settlers, and that a large welfare state developed in almost every English municipality.  I would cite figures showing that poor relief comprised more than half of most municipalities’ budgets before the 1820s, when school and road costs grew large enough to match poor relief.  I would feel compelled to mention that poor relief could mean a poorhouse, but more often some combination of cash, food, clothes, firewood, doctor’s attention, medicine, or even full-time nursing care.  I would highlight how significant local taxes were to most early Americans, compared with much lower state taxes and almost non-existent federal taxes. 
"Publicly-owned poor houses like the Dexter Asylum
in Providence, Rhode Island did not come cheaply."
Courtesy of the New York Public Library.

This would lead to the obvious comparisons.  Americans spent more than half of their taxes on poor relief when George Washington was president, compared to 12% on the federal “safety net” today, or 55% if you include Social Security, Medicare, Medicaid, and the Children’s Health Insurance Program.  Unlike today’s contributors to Social Security and Medicare, however, most taxpayers (read: property owners) in 1789 would not have expected to benefit from poor relief in their lifetimes.  They could depend on it, though, if they ever met with a financial catastrophe.  I would almost certainly quote historian Elna Green’s witticism, that so many grocers, doctors, wood-hewers, etcetera made money from the town by helping the poor that the poor law system should be called the “welfare/industrial complex.” [2]

Finally, I would point out one big difference between early America and the present: Today’s welfare is largely federal while early America’s was largely municipal.  In fact, I think the local nature of early American welfare is the reason why so many policy analysts overlook welfare’s past.  They just don’t look at the state and local levels of government. 

My fantasy memo is not a prelude to some specific policy prescription for the present day.  I just wish that when we do bring history to the argument, we use a reasonably correct version.  As an historian writing about pre-Civil War poor relief, I find myself cringing almost every time the history of welfare surges into public discourse.  Usually, there is a 300-year hole in the story.  For colonial American and U.S. history, that is a pretty big hole!

Surely, though, I am not alone among historians.  What about the rest of you? What makes you cringe?  You historians who see your subjects of expertise routinely misrepresented, what do you do?  What is your responsibility?  How do you lend your expertise in a helpful way?

_____________________

[1] On respectable-looking websites, see “The Poor in America Before the Welfare State,” at Intellectual Takeout: Feed Your Mind www.intellectualtakeout.org/library/sociology-and-culture/poor-america-welfare-state.  For a defender of welfare on the non-existent past of welfare, see Charles Michael Andres Clark, “The Truth Deficit: Four Myths About Deficit Spending,” in Commonweal July 12, 2011. 

[2] Elna C. Green, This Business of Relief: Confronting Poverty in a Southern City, 1740-1940, p. 1.


Tablet Tells of Mayan 'Snake Queen'

A tablet from 564 A.D. found beneath the main temple of the ancient Mayan city El Perú-Waka’ in northern Guatemala reveals what archaeologists describe as a “dark period” in Mayan history, including the violent story of a 6th-century “snake queen.”


The stone tablet stood exposed to the elements for a hundred years before being buried as an offering in a funeral for another queen. Epigrapher Stanley Guenter, who deciphered the text, believes the tablet was dedicated by King Wa’oom Uch’ab Tzi’kin, a name that translates roughly as “He Who Stands Up the Offering of the Eagle.”


“The information in the text provides a new chapter in the history of the ancient kingdom of Waka’ and its political relations with the most powerful kingdoms in the Classic period lowland Maya world.”

 
Lady Ikoom was a predecessor to one of the greatest queens of Classic Maya civilization, the seventh-century Maya Holy Snake Lord known as Lady K’abel who ruled El Perú-Waka’ for more than 20 years with her husband, King K’inich Bahlam II. 
 
She was the military governor of the Wak kingdom for her family, the imperial house of the Snake King, and she carried the title “Kaloomte,” translated as “Supreme Warrior,” higher in authority than her husband, the king.


Around the year 700, the tablet, known as Stela 44, was brought to the main city temple by command of King K’inich Bahlam II to be buried as an offering, probably as part of the funeral rituals for his wife, Queen Kaloomte’ K’abel.
Image: A portion of the newly discovered tablet.


Diaspora: A Useful Idea

Kevin Kenny*

The word “diaspora” is remarkably popular. But what does it mean? 


The New Shorter Oxford English Dictionary defines diaspora as “The dispersion of Jews among the Gentile nations,” giving as a secondary definition “all those Jews who lived outside the biblical land of Israel.”

A Harlem "Back to Africa" announcement,
Negro Club, New York (1929). Courtesy of
the New York Public Library.

By this definition, diaspora is both a process (the scattering of Jewish populations) and a thing (their communities abroad). This double meaning is the source of much confusion.

Until recently, diaspora referred almost exclusively to Jewish history. In the 19th and 20th centuries, many other groups—including people of African, Armenian, and Irish descent—adopted the term and molded it to their own purposes. As its illustrative example, the OED cites “[t]he famine, the diaspora and the long hatred of Irish Americans for Britain.” Today, diaspora is applied to virtually every group that moves from one place to another.

The key period of expansion in the usage of diaspora was the half-century after World War II. The reasons are not hard to find. During the era of decolonization, globally scattered populations of African origin forged new transnational bonds of solidarity. Migrant and ethnic communities of Asian origin (e.g., ethnic Chinese in Indonesia and Vietnam or South Asians in East Africa) were expelled from their adopted countries. Global migration assumed a massive new scale. And the international recognition of refugees brought new attention to involuntary migration in particular.


With the proliferation of usage, inevitably, came a decline in coherence. Diaspora is now virtually a synonym for “migration” and “ethnic group”—two related but distinct terms that duplicate the basic bifurcation in the dictionary definition. And because diaspora is now used so broadly, it has lost its analytical bite.

Over the last generation, sociologists and political scientists have responded to this conceptual confusion by constructing elaborate typologies concerning the origins, scale, nature, and character of diasporas. Their goal is to clarify, once and for all, which kinds of migration and ethnic group count as diasporic, and which do not.

But typologies have some obvious drawbacks. They can be arbitrary: Who gets to decide on the criteria? Conversely, they can be over-inclusive, with all forms of population movement lumped together under the ecumenical umbrella of diaspora. Between the two extremes lies the checklist approach: Tick off seven out of ten requirements and your group qualifies as diasporic. But if these requirements belong to different orders of experience—some concerning the process of migration and others the lives of migrants abroad—comparison between diasporas becomes impossible. The result is incoherence.

In the end, trying to pin the term diaspora down under a single, fixed definition is futile. Yet leaving its meaning entirely open-ended drains the term of analytical value. Is there a way out of this dilemma?

For historians, the solution is to reframe the question. Instead of seeking a definitive answer to the timeless and static question “What is a diaspora?” we can examine evidence to determine how and why people use the idea in specific times and places.

Approached in this way, diaspora is neither a process nor a thing, but an idea that people use to make sense of the world created by migration. But what kinds of people? Scholars like us? Policy makers and governments? Or migrants in their everyday lives? In other words, is diaspora a category of analysis or a category of practice? The answer is that it is both.

Diaspora is not simply a construct that scholars use to analyze the social, cultural, or political world. It is also an idea that migrants use to make sense of their everyday experience and that governments deploy for political and economic purposes.

As an idea, diaspora has three overlapping elements that vary in relevance depending on the historical circumstances. These can be labeled as follows: relocation, connection, and return. It should be added that people do not necessarily need to use the word “diaspora” to think about migration issues within this framework.

The first element is relocation. If diaspora meant the same thing as migration, there would be no need for both words. The idea of diaspora allows us to distinguish between migration in general and particular forms of migration. Diaspora has its greatest explanatory power when applied to involuntary migration—involving, for example, slavery, famine, or genocide.

The second element is connection. When the members of a migrant community in a given country of settlement involve themselves in the affairs of their homeland, they may or may not begin to see themselves as a diaspora. This form of interaction, after all, is commonplace. But when the connections in question involve not only the homeland, but also a web of globally scattered communities, they assume a multipolar rather than a unilinear form. The interrelated communities can then be seen as nodes within a diasporic network or web.

The third and final element of diaspora is return. Every conception of diaspora features a homeland, whether real or imagined. Return to this homeland can be literal, as in the Zionist movement, but it is more often metaphorical or spiritual. The great majority of African-descended people in the Americas, for example, could never hope to move literally to Africa. But this very fact helps explain the cultural and political potency of their desire to do so.

Relocation, connection, return: These are the three elementary aspects that make up the idea of diaspora. Together they constitute a powerful framework for thinking about migration.

Kevin Kenny is Professor of History at Boston College. He is the author of Diaspora: A Very Short Introduction (New York: Oxford University Press, 2013).


PhD Applicant Beware

Randall Stephens

The July 11-17 issue of Times Higher Education includes a must-read article for the grad school bound.  In "10 truths a PhD supervisor will never tell you" (11 July 2013) Tara
Brabazon writes: "As a prospective PhD student, you are precious. Institutions want you – they gain funding, credibility and profile through your presence. Do not let them treat you like an inconvenient, incompetent fool. Do your research. Ask questions." Some of her ten tips apply more to the UK setting, but most are right on target for students in the US as well.

Prospective PhD students in history should think long and hard about whom they want to work with. Ask around. Get to know something about the scholar you'd like to be your mentor. Has this individual shepherded other PhDs? Do his/her students land good jobs? What is your prospective mentor's publishing record like? Is he/she a good fit for your project? What will it be like to work with him/her? Will he/she lend a hand or remain aloof and passive?

Brabazon offers some dos and don'ts and, most of all, warns, "don't let the supervisors grind you down." Here's one of her particularly helpful pieces of advice:

The key predictor of a supervisor’s ability to guide a postgraduate to completion is a good record of having done so. Ensure that at least one member of your supervisory team is a very experienced supervisor. Anyone can be appointed to supervise. Very few have the ability, persistence, vision, respect and doggedness to move a diversity of students through the examination process. Ensure that the department and university you are considering assign supervisors on the basis of intellectual ability rather than available workload. Supervising students to completion is incredibly difficult.   

 
Read more here.


FDR, Disability, and the Journal of the Historical Society

Randall Stephens

Scott Hovey, managing editor of the Journal of the Historical Society, points us to the July 12th issue of Time magazine online. In it, Matthew Pressman, a doctoral student in history at Boston University, challenges the idea that a "gentlemen's agreement" existed between the press and Franklin Roosevelt regarding the president's disability. Writes Pressman:

The recently discovered film clip of President Franklin D. Roosevelt being pushed in a wheelchair, despite showing neither Roosevelt’s face nor the wheelchair, has become an object of considerable public interest. One reason people find the clip so fascinating is that it seems to represent a radically different era in American political life—one in which the president could rely on the press corps to help him hide from the larger public something so glaringly obvious as the fact that he was a paraplegic from having contracted polio at age 39. 


An NBC Nightly News report on the discovery stated that there was “a gentlemen’s agreement” between FDR and the press corps to hide the extent of his disability, and the Associated Press wrote that it was “virtually a state secret.” That has long been the conventional wisdom, repeated in countless books and articles. But it is inaccurate. In fact, the press sometimes described his condition in great detail. (read more)

Find out more in the September 2013 issue of the Journal of the Historical Society, which will include Pressman's article on the subject. Here is the TOC for that forthcoming issue:

PETER A. COCLANIS, "Editor’s Introduction"

JAMES B. LEWIS, SEONG HO JUN, AND DANIEL SCHWEKENDIEK, "Toward an Anthropometric History of Chosŏn Dynasty Korea, Sixteenth to Eighteenth Century"

KAREN M. HAWKINS, "A Moderate Approach: How the War on Poverty Was Kept Alive in Eastern North Carolina, 1963-1968"

MATTHEW PRESSMAN, "Ambivalent Accomplices: How the Press Handled FDR’s Disability and How FDR Handled the Press"

WYATT WELLS, "Research Note: Appointments of Catholics during the New Deal"


Summer Scholarship for the #altac

Elizabeth Lewis Pardoe

As I struggle to find the energy, focus, and drive to meet my summer writing deadlines, the opening lines of Thomas Paine’s The Crisis take on new meaning:

THESE are the times that try men's souls. The summer soldier and the sunshine patriot will, in this crisis, shrink from the service of their country; but he that stands by it now, deserves the love and thanks of man and woman.

For those of us “Alternative Academics,” marked by #altac hashtags on Twitter, the summer IS the season that tries our souls.  Our tenure-line colleagues disappear into the archives and post to Facebook from glamorous destinations around the globe. At the same time we work full time and wonder whether or not to attempt CPR on the scholarly commitments we left flailing for breath during the academic year. 

The difference appears less acute from September to June.  I may advise while others teach, but the strain on scholarship seems less stark then.  In the summer, when the professoriate retires from lectures, seminars, and office hours, I still Skype with fellowship applicants as registrars revise databases.  In some ways the summer pressure is less.  Undergraduates don’t line the halls.  Thus, the summer #altac scholar thinks a flurry of productivity just might be possible.


Other hindrances crop up as well.  For instance, if we stand by our scholarship as good patriots of the academic cause, no one thinks we deserve accolades and thanks. Simply put, no one cares.  My annual review holds no space for academic conference presentations and publications.  I can practice semantic gymnastics and squeeze mention of my scholarship into some discussion of professional development. I know full well, though, that no increase in title or pay will result.  That is not what the university hired me to do.  And still, I don’t think I would be capable of advising students on scholarly development if I were not an active scholar myself. I am, however, in a distinct minority. 
 

Some of us in administration are trained in history.  Many more have degrees in higher ed.  The research of the latter covers the practicalities of university administration.  As it happens, my scholarship sometimes involves educational institutions too. True enough, from 1550 to 1750 few people fretted about MOOCs and multicultural curricula.  Still, those institutions from long ago struggled with parallel problems and offer instructive lessons for today’s educators. My research subjects speak to me from the grave.  My colleagues with contemporary topics can circulate surveys among the living.

As summer progresses, I eventually find my way back into on-line archives, thankful for the treasures the digital humanities offers #altacs and independent scholars. I can log on and dive into documents from my desk while I eat lunch or from my sofa while my children play in the sun.  For that, this #altac summer scholar is always thankful and sometimes productive.


History in the News Roundup


Vox Tablet, "The Dreyfus Affair Holds a Sacred Place in French History. Is There Room for Debate?" Tablet, July 11, 2013

Nearly 120 years after the Dreyfus Affair shook the world, you would think we know all there is to know about the seminal case involving a French Jewish officer falsely accused of treason. Alfred Dreyfus was found guilty and deported to prison on a small, remote island, and it was only after his family, joined by leading intellectuals of the time, rallied in protest that he was acquitted, his case becoming a cornerstone of the democratic French republic.>>>

Mark Feeney, "Edmund Morgan, 97; professor, leading historian of Colonial era," Boston Globe, July 10, 2013

A frequent contributor to The New York Review of Books and other publications, Dr. Morgan strove to appeal to the interested layperson as well as fellow historians. “In writing about the past, there’s more of an aesthetic dimension than people realize,” he told the Globe. “You’re trying to see connections, patterns, to tell a story. The dispute among historians as to whether there should be narrative is misguided. All good history is narrative. History that doesn’t tell a story just hasn’t gotten far enough; people have been too lazy to tell the story.”>>>

Len Barcousky, "Historian McCullough gets bridge to call his very own," Pittsburgh Post-Gazette, July 1, 2013

"My father's business on First Avenue was right by the Smithfield Street Bridge," the Pittsburgh native recalled in a recent phone interview. "And I remember very distinctly sixth grade at Linden School when students in an older class made wooden models of all the different kinds of bridges. They were set out on the classroom windowsills, and I was utterly fascinated.">>>

Martin Pengelly, "The Maine lesson of Gettysburg: real history is never so romantic as reel," July 2, 2013

The story goes like this: 150 years ago today, Little Round Top was the key to the Union position at the battle of Gettysburg. If the Confederates had taken the hill, they would have won the battle. If the Confederates had won the battle, they would have won the war. >>>


Michal Jan Rozbicki on "The Rise of Learned Hagiography"

Randall Stephens

The following excerpt is from Michal Jan Rozbicki's review essay in the June 2013 issue of Historically Speaking. Rozbicki uses Jon Meacham's Thomas Jefferson: The Art of Power (Random House, 2012) to delve into "the cult of great individuals," which, even in the present, does not lack enthusiasts. Rozbicki is professor of history and director of the Center for Intercultural Studies at Saint Louis University. He is the author of Culture and Liberty in the Age of the American Revolution (University of Virginia Press, 2011).

One would be hard pressed to find a dull period in Thomas Jefferson’s life. Mindful of that, I began devouring Jon Meacham’s 800-page Thomas Jefferson: The Art of Power anticipating a gourmet biographical feast. It started off as an enjoyable and well-paced story but it was not long before the taste of syrup began to take over. By the time I reached page eight I had already been informed that “Jefferson was the
most successful political figure of the first half century of the American Republic,” “had a remarkable capacity to marshal ideas and to move on, to balance the inspirational and the pragmatic,” and was “a transformative leader.” He was a “formidable man” and his “genius” was that he “dreamed big but understood that dreams become reality only when their champions are strong enough and wily enough to bend history to their purposes.” A case of “the rare leader who stood out from the crowd without intimidating it,” his “bearing gave him unusual opportunities to make the thoughts in his head the work of his hands, transforming the world around him.” Not only “a man of Enlightenment, always looking forward, consumed by the quest for knowledge” but also “an inveterate walker,” “fit and virile,” “never tired of invention and inquiry,” “delighted in archeology paleontology, astronomy, botany, and meteorology,” “drew sustenance from music and found joy in gardening,” “bought and built beautiful things,”“was an enthusiastic patron of pasta,” and “enjoyed the search for a perfect dressing for his salads.” Moreover, he “knew Latin, Greek, French, Italian, and Spanish,” was “a student of human nature, a keen observer of what drove other men,” “perfectly acquainted” with “subject after subject,” an “extraordinary man.” Politically, he was “the father of the ideal of individual liberty, of the Louisiana Purchase, of the Lewis and Clark expedition, of the American West,” “the author and designer of America, a figure who articulated a vision of what the country could be.” A “builder and a fighter,” he “gave the nation the idea of American progress.” He was able to operate “on two levels, cultivating the hope of a brighter future while preserving the political flexibility and skill to bring the ideal as close as possible to reality.” The world “found him charming, brilliant, and gracious,” friends thought “he was among the greatest men who had ever lived, a Renaissance figure who was formidable without seeming overbearing, sparkling without being showy, winning without appearing cloying,” and “women in particular loved him.” Oh, and “he had great teeth” (xvii-xxiv).

The first pages are a herald of what is to come. The book is a glitzy glorification of Jefferson. But it is much more than that. It represents a new and intriguing genre that is perhaps best described as learned hagiography, an eclectic mix of seemingly incompatible components like old-fashioned hero worship, elite-centered topic, seductive narrative aimed at popular readership, solid scholarly research with a heavy apparatus of citations, and a didactic political objective.

The cult of great individuals has never lacked enthusiasts. Some are drawn to the Nietzschean belief that strong personages, superior to ordinary people, are the driving force of history. Others long for a savior who will liberate us from all evil. Some are passive followers finding comfort in believing that this or that leader is wise, knows best, and will guide everyone through the turmoil of existence. Others are mesmerized by the pomp and majesty of office. All reflect the uncanny ability of people to create idealized fictions and then believe their own wishful thinking. Their heroes have one thing in common: they are not so much who they are as who they ought to be.

Greatness sells, which is why hero-worshiping literature has a long and venerable
tradition, rooted in both popular and elite culture. It stretches from Greek mythology, Gesta Romanorum, lives of saints, Arthurian legends, chivalric romances, troubadour songs, and chronicles of monarchs, through the political drama of Renaissance and the epic poetry of European romanticism, to didactic biographies of leaders (such as Parson Weems’s life of Washington) and historical novels. These writings—as opposed to modern, critical history aiming to explain why things happened—mostly describe prominent people and events in time, often embellishing them with invented episodes, folk legends, and even the personal views and experiences of the authors. The goals are usually fairly simple: exalt the qualities of the great and the saintly, lionize the powerful, and point to their role in changing the course of history. One of the distinctive features of this literature is that it was an instructional tool. Its aim was pragmatic and rooted in the present. Authors hoped that their works would supply the collective memory with worthy themes and symbols that bind societies and invite followers. The enduring attractiveness of such stories lies less in their adeptness in reconstructing facts than in their ability to conjure up ideal types, to celebrate the potential of the individual person, and to offer positive models of virtue—all qualities that defy the incoherence of the world.

By selecting Jefferson, Meacham taps directly into the pantheon of national icons. Reverence for the Founders is perhaps the most conspicuous form of hero worship in America. Politicians, journalists, and ordinary people look to them for usable history, and revel in “discovering” their own views in the Founders’ minds. A suitable quote can buy a badge of legitimacy for a broad variety of endeavors. For writers of historical biographies, this appeal has been a gold mine. Hardly a few months go by without a new volume, and the demand seems insatiable. All things considered, this is terrific for history, and helps thwart a broader weakening of public interest in the country’s past (exemplified, for instance, by the decline of U.S. survey courses as core requirements in colleges). . . . 

 
Read more from Historically Speaking at Project Muse


A Preview of Historically Speaking, June 2013

Randall Stephens

Before long, Project Muse will post the June issue of Historically Speaking. In the meantime, copies are being shipped across the country and overseas to subscribers.

The latest issue contains a lively range of essays.  This month we have the last of Joe Amato's three essays on revitalizing local history. Here he sketches a sensory history of 20th-century rural America. He then explores some causes and effects of the countryside’s marginalization in modern American society. On a related note, Don Yerxa interviews Canadian cultural historian Constance Classen about sense history.  Classen has written extensively on the senses, exploring the lived experiences of embodiment from the Middle Ages to modernity and helping us appreciate the tactile foundations of Western culture.  

Also in this issue are pieces on history and political thought, Mormon historical studies, Stalin and Nazi Germany, Civil War naval history, Britain and the Treaties of 1713 and 1763, family life in colonial New England, and a critique of hagiographical popular history. To round it out, Sean McMeekin speaks with Don Yerxa about the significance of July 1914 and the coming of World War I.

In "It’s Complicated: Rethinking Family Life in Early New England" Allegra di Bonaventura writes of the private lives of colonial Americans, teasing out a fascinating tale from the existing evidence.  "[W]riting about the early African-American family in New England," she notes, "depends unavoidably on the histories of the literate English who often owned and worked their African neighbors and bedfellows. Less evident, however, is that any mainstream history of the New England family faces a similar reckoning with the ethnic, logistical, and emotional complexities of real households."

She begins her piece with a harrowing story:

When his pregnant wife Joan and son Jack were taken from him by law and forced into slavery, a penniless former slave named John Jackson refused to submit to the prevailing powers or to society’s conception of him. Instead, he bided his time and planned a most audacious rescue of his family. After many months of preparation, he pulled it off—traveling across land and rough waters in the middle of the night to break into the house where they were held and “steal” them home. The year in which John Jackson staged this daring rescue was 1711, and the place he ran back home to was New London, Connecticut.


John and Joan Jackson belonged to New England’s first generations of enslaved people, men and women dispersed across English households and found especially along the region’s long coastline and in its port cities. John had arrived in Connecticut in 1686 as a young man of eighteen, emerging as freight and human property from the hold of a West Indian trade ship. At first, he probably worked the wharves at New London Harbor, then went on to receive training in husbandry, the stock and trade of the vast majority of New England men. His wife Joan was a different sort of New Englander. Hers was already an “old” New England family by colonial standards, one that could trace itself at least to the 1650s and the early years of settlement in New London. Joan herself was a native Connecticut girl. Arriving enslaved in a new land, John Jackson would nevertheless make a place for himself in its cold, unwelcoming clime. In time, he would unabashedly assert a life and family of his own—in freedom. 

Around 1700 John Jackson became free and married Joan. Still, whatever happiness the couple felt at uniting in marriage was dimmed by the reality that they had to live apart. Joan was an enslaved woman living in another man’s house entirely under another man’s control. Their first two children, a boy and girl, were born into this grim and uncertain reality—mother and father separated against their will and by miles. Within just a few years, however, Joan Jackson received a highly unusual grant of freedom from her master and mistress, acknowledging in part her dutiful service. Once free, Joan was able to join her husband, but even that reunion was cruelly shortchanged. Because their children, toddler Adam and baby Miriam, had been born while Joan was in bondage, they inherited their bonded status from her as well. By law, Adam and Miriam would be perpetual slaves, the property of their mother’s former owner. When she left to join her husband, Joan was forced to leave the children behind. She and John could visit them, but they would never live together as a family. Yet the Jacksons added a succession of additional children to their family, spending nearly a decade together in relatively peaceful domesticity. John was a farmer who owned no land but who could nevertheless hire himself out in support of his family. Joan was skilled at housewifery, an occupation that she, too, performed for others, as necessary. Any tranquility was abruptly halted in 1710, however, when a powerful local landowner claimed ownership of Joan in court, calling her freedom into question and eventually winning her as his property at trial. The sheriff came and seized a then pregnant Joan, along with their youngest child, two-year-old Jack. Joan and little Jack were taken to live in slavery across the Sound on Long Island, and it was from there that their husband and father would rescue them.

When John Jackson did act, he was not alone. With him that night in 1711 when he retrieved Joan and two of his children (a baby, Rachel, was born in slavery on Long Island) was an aging merchant by the name of John Rogers. The merchant, too, was an ardent family man, and one who had also found it necessary to fight for his family when the law took his wife and children away. For Rogers, it had not been slavery but religious difference that led him to suffer that deeply personal loss repeatedly. A religious radical, Rogers founded his own Baptist Sabbatarian sect, diverging from the prevailing Congregational way. Rogers was rich; Jackson, poor. Rogers was English; Jackson, African. Rogers had bought and owned Jackson after he arrived in Connecticut, and freed him more than a decade later.

The bond between Jackson and Rogers had unusual characteristics, formed first in the injustice of slavery, but also steeped in the new ideas and common purpose of a shared insurgent faith. . . .
Read more by subscribing to Historically Speaking.  Or, access the June issue on Project Muse through your library's account.


Is There Such a Thing as the American Character? Or, Is It American Caricature? A Roundup


Ca. 1863. Courtesy of the Library of Congress.
Terry Eagleton, "No Self-Mockery, Please, We're American," Chronicle of Higher Ed, July 1, 2013

Can one even speak of Americans and Europeans in this grandly generalizing way? Is this not the sin of stereotyping, which all high-minded liberals have learned to abhor? Nobody falls into a general category. Everyone is his or her own elite. As a character in one of James's novels proudly puts it, "We are all princes here." . . .

In The American Scene, [Henry] James writes of the country's disastrous disregard for appearances. For the Calvinist, a delight in anything for its own sake is sinful. Pleasure must be instrumental to some more worthy goal, such as procreation, rather as play on children's television in America must be tied to some grimly didactic purpose. It can rarely be an end in itself. The fact that there is no social reality without its admixture of artifice, that truth works in terms of masks and conventions, is fatally overlooked.>>>

Jason Bailey, "Nashville in Paris: The Quintessential American Film, as Seen Abroad. On July 4, in France, I felt just how well Robert Altman captured our national character," Atlantic, July 4, 2013

Watching Nashville from outside of that country puts Altman's intentions to the test. Perhaps critics like Greil Marcus and Robert Mazzocco were right; maybe he is, in fact, judging these people, pointing and laughing at them, as we snicker when Haven Hamilton sings his insipid ballad "For the Sake of the Children," or when Barbara Jean tees up another down-home chestnut. But I don't think so--I didn't before, and I certainly didn't in Paris, where the French audience seemed just as willing to accept Altman's 24 characters, with all of their faults and flaws, into their open arms. They are with these people, and with the film, and they gasp at its ending (despite all of its broad foreshadowing). When Haven Hamilton picks up the microphone and implores the crowd, "This is Nashville! You show 'em what we're made of," the gooseflesh rises, and it continues through the heartbreaking sing-along of "It Don't Worry Me," as good a choice for an alternate national anthem as any.>>>
"We are all princes here." . . .

Alan Ryan, "America’s Unthinking Majority," Time Higher Ed, June, 20 2013

. . . . From the beginning, the American view of politics was that of the radicals in the English Civil War. For all Jefferson’s high-flown rhetoric about natural rights, the colonists held old-fashioned English views about the likely wickedness of all holders of monarchical authority; it was British rights they thought they were protecting, and English radicals who did their thinking. Once independence was achieved, the arguments that roiled 19th-century Europe couldn’t gain any purchase. The hereditary principle was excluded by the Constitution; universal suffrage (for free white men) was inevitable; everyone was committed to social mobility (for free white men); religious barriers to political office were illegal. Not until the rise of the robber barons did European socialist ideas get any sort of a hearing in the US, and one of the curious features of that period is the extent to which socialists complained of the loss of the old agrarian America: not the world of a land-owning aristocracy but that of the yeoman farmer.>>>

Andro Linklater, "The Men Who Lost America, by Andrew O’Shaughnessy," American Prospect, June 29, 2013

The birth of the United States was a more complex — and less heroic — drama than the one enshrined in American folklore. . . .

Central to O’Shaughnessy’s thesis is his well-sustained argument that in Britain neither politicians nor generals believed military means alone could restore parliament’s power to tax colonists who were so numerous and so motivated to resist. It was George III, he suggests, who personally silenced his ministers’ doubts by insisting that acceptance of colonial demands would deliver an irreparable blow to national prestige. ‘We are contending for our whole consequence,’ he declared, ‘whether we are to rank among the Great Powers of Europe or to be reduced to one of the least considerable.’ A divided leadership ensured that every attempt at a political solution was compromised, while at crucial moments Lord George Germain, minister for the American department, undermined the military effort by diverting resources to other, more winnable conflicts.>>>

Gordon Wood, "Dusting Off the Declaration," New York Review of Books, August 14, 1997

Scholars who talk about America’s “civic religion” often don’t appreciate the half of it. Not only have we Americans turned profane political beliefs into a hallowed religious-like creed, but we have transformed very secular and temporal documents into sacred scriptures. We have even built a temple to preserve and display the great documents consecrating the founding of the American creed—the Declaration of Independence, the Constitution, and the Bill of Rights. At the National Archives in Washington, D.C., these holy texts are enshrined in massive, bronze-framed, bulletproof, moisture-controlled glass containers.>>>
