Associate Professor of English, University of Michigan-Flint. I research and teach rhetoric, technology, and writing.

how soon


When my daughter was born I felt a love and connection I’d never felt before: a surge of tenderness harrowing in its intensity. I knew that I would kill for her, die for her, sacrifice anything for her, and while those feelings have become more bearable since the first delirious days after her birth, they have not abated. And when I think of the future she’s doomed to live out, the future we’ve created, I’m filled with rage and sorrow.

Every day brings new pangs of grief. Seeing the world afresh through my daughter’s eyes fills me with delight, but every new discovery is haunted by death. Reading to her from “Polar Bear, Polar Bear, What Do You Hear?,” I can’t help marveling at the disconnect between the animal life pictured in that book and the mass extinction happening right now across the planet. When I sing along with Elizabeth Mitchell’s version of “Froggie Went a-Courtin’,” I can’t help feeling like I’m betraying my daughter by filling her brain with fantastic images of a magical nonhuman world, when the actual nonhuman world has been exploited and despoiled. How can I read her “Winnie the Pooh” or “The Wind in the Willows” when I know the pastoral harmony they evoke is lost to us forever, and has been for decades? How soon do I explain to her what’s happening? In all the most important ways, it’s already too late.

-- Roy Scranton, "Raising My Child in a Doomed World"

Read the whole story
betajames · 15 hours ago · Michigan

EPA needed to act more quickly, forcefully on Flint water crisis, report says

The Environmental Protection Agency should have reacted much more quickly and forcefully to the Flint water crisis, even if it meant asserting emergency authority over the Michigan Department of Environmental Quality, according to a report released today by the EPA's Office of Inspector General.



Read the whole story
betajames · 16 hours ago · Michigan

We should be building cities for people, not cars


Devon Zuegel says that our cities and the people who live in them would be much better off if we designed them around people and not cars.

Unfortunately, America’s inherited infrastructure is more like the old Embarcadero Highway than the boulevard that replaced it. Urban planners spent the 20th century building cities for cars, not people, and alternatives to driving have been systemically undervalued. This legacy has resulted in substandard health outcomes, missed economic opportunities, and a shortage of affordable housing.

We can’t wait around for another earthquake to reverse generations of bad policy. Luckily, it doesn’t require a natural disaster to begin reshaping our infrastructure. Small changes can have an outsized impact in expanding alternatives for how people move around. Rebuilding our infrastructure to enable walking, cycling, and mass transit would bring health and economic benefits that far outweigh its price tag.

People who live in rural areas more or less need their own cars in order to do anything, but private cars in cities are much less necessary. Cities should optimize for buses, subways, cyclists, and pedestrians — they get people to where they’re going without all the outsized infrastructure, waste, and pollution. *repeatedly sticks pin into voodoo doll of Robert Moses*

Tags: architecture   cars   cities   Devon Zuegel
Read the whole story
betajames · 16 hours ago · Michigan

The Radical Notion of a Smartphone-Free Campus


 

There’s a scene in Don DeLillo’s story “Midnight in Dostoevsky” that reflects on the current omnipresence of digital media and the relative oasis that the college classroom can be. Here we are in a laughably self-serious logic seminar, where the wizardly professor, Ilgauskas, utters one-line axioms before the small group of anxious, if intrigued, students:

“The atomic fact,” he said.

Then he elaborated for ten minutes while we listened, glanced, made notes, riffled the textbook to find refuge in print, some semblance of meaning that might be roughly equivalent to what he was saying. There were no laptops or handheld devices in class. Ilgauskas didn’t exclude them; we did, sort of, unspokenly. Some of us could barely complete a thought without touch pads or scroll buttons, but we understood that high-speed data systems did not belong here. They were an assault on the environment, which was defined by length, width, and depth, with time drawn out, computed in heartbeats. We sat and listened or sat and waited. We wrote with pens or pencils. Our notebooks had pages made of flexible sheets of paper.

I don’t want to wax nostalgic for an earlier era when college students dutifully shunned digital technology or didn’t have it to begin with. I do want, as my university often encourages me, to meet my students “where they are.” But sometimes the imperative to digital mediation overwhelms me and makes me wonder about the threshold of these different ways of being: analog and digital. But of course, it’s never that simple, never a clear-cut binary.

Here’s a story that may sound apocryphal, but this really happened: One spring day on my campus, I saw a student who was staring into his smartphone walk straight into a light pole. He crashed into it, stumbled backward, and looked around to see who had seen him. (I was some distance away; he didn’t notice me.) Then he adjusted his course and went back to whatever he had been doing on his phone, unfazed. This is one of the often ignored, occasionally painful, and sometimes embarrassing consequences of what Ian Bogost discusses in an article for The Atlantic called “Hyperemployment, or the Exhausting Work of the Technology User.” Hyperemployment is the endless work we do for unseen agencies, owners, and conglomerations while seemingly merely tapping away at our phones, communicating or otherwise being entertained.

Around that time, I had been tuning into how hyperemployed people are on my campus. Just a few days before the student and the light pole, I had dropped my iPhone, and the screen shattered. The phone still worked, more or less, but after the fall, it lived on my desk in a Ziploc freezer bag, glass splinters crumbling away and accumulating gradually into tiny glinting dunes in the corners of the bag. So I had been reexperiencing my life without smartphone and especially reconsidering how these things permeated my workplace, the university.

After a few weeks of being smartphone free, from this altered vantage point I noticed just how busy everyone seemed to be, all the time. Whether in class, in meetings, or in the hallways—everyone was on their phones. And I don’t say this from an easy standpoint of judgment, for I had grown so accustomed to being on my phone, justifying my near constant attachment to it by the fact that it was allowing me flexibility and freedom. I would draft essays and outline book chapters on my phone’s notepad in the middle of the night. I emailed frantic students at all hours, reassuring them about assignments, missed classes, or exams. I carried on committee work long after meetings had let out, hashing out the fine points of strategic planning and SWOT analyses. I networked with remote colleagues on Twitter and set energizing collaborations into motion. This all seemed worthwhile and productive—and I suppose it was, for the most part.

I’m fully conscious of my own cyborg existence, and I have always been a lenient professor when it comes to students and their technologies. I generally don’t police their use in the classroom and have called students out only a handful of times when their texting got too conspicuous or a facial expression suggested that they had become totally distracted by something on their phone. For the most part, I accept that these things have interpenetrated our lives so thoroughly that it is impractical and unrealistic to try to sanction their use in the classroom. Rather, figuring out the etiquette and subtleties of smartphone use in everyday life is one of the innumerable soft skills that should be learned over the course of college.

But that was before my smartphone hiatus. During those weeks, I found myself walking to work, feeling great. Why? Because I was not thumbing madly and squinting into my hand as I stumbled along, neck craned, tripping over the curb. I was swinging my arms and looking around. Between meetings on campus, I was processing things people said as I strolled back to my office, rather than going immediately to my email inbox, replying to messages as I marched upstairs. I wasn’t leaving my classes and getting directly on Slack to catch up with my collaborators; I was decompressing and thinking about what my students brought up in our discussions. I thought my smartphone was granting me freedom, but it was more like the opposite.

I began to see these things everywhere on campus, and they were increasingly disgusting to me. This has been a difficult piece to write because I am aware of how my criticism verges on hypocrisy, or almost depends on it: I appreciate what smartphones can do—are doing—on a daily basis. But seeing these things from a slight remove, they became revolting to me. I saw my students and colleagues tethered to their smartphones, and I wondered how these things were meshing with—or not—our ostensibly collective purpose of higher education: working together to make the world better, at least our human part in it. I realized how entangled with my smartphone I had become and how different—how refreshing—it felt to be without it. I started reading (books!) for uninterrupted minutes in ways I hadn’t been able to for years because I always felt the need to live tweet or cross-reference whatever I was reading.

I talked to my students about this at one point during this time, extrapolating that they, too, probably didn’t realize how supplementary they had become to their phones—to which they looked at me wide-eyed as if to say, Oh, yes, we well realize this. The look they gave me was tragic, their faces creased in quiet despair. I told my students I was writing a piece on my experience of being without my iPhone, and they viewed me with sardonic skepticism. Good luck with that, they seemed to be thinking.

One student later emailed me a timely New Yorker piece called “The Useless Agony of Going Offline,” in which Matthew J. X. Malady describes the pointlessness of going off his handheld devices cold turkey. He tries it for seventy-two hours and concludes: “I would like to say that I reached some time-maximization epiphany … but I’m not sure that I used my time any ‘better’ than I normally would have during that span. I just used it differently, and found myself frequently bored as a result.” Malady complains that he was basically less informed when off his handheld devices, and the piece ends with a sort of discursive shrug, as if to suggest that it is futile to resist the hegemony burning away in our hands, pockets, and brains. It is a persuasive and shrewd article, and my student seemed to be daring me to prove Malady wrong. But I’m not trying to make a wholesale pronouncement against these things. My relationship with my phone persisted during that time the screen was shattered—it’s just that I didn’t see the thing for hours at a time, particularly when I was on campus.

I told my colleague Tim Welsh about the shattering of my iPhone, and he quickly dialed up dozens of bizarre, hilarious YouTube videos testing various fall heights and reporting the damage incurred by different devices put under various forms of duress. Take after slow-motion take of smartphones crashing into the pavement, being dipped in miscellaneous liquids, and being run over by SUVs. But these were tutorials ultimately geared toward protecting one’s phone or purchasing the most durable model out there. I was watching these videos from the other side, my phone having already been smashed. And perhaps the videos served as yet one more layer of entertainment and seamless commerce, no matter why they were dialed up in the first place.

The weird thing is that I probably wouldn’t have done it on my own; I don’t have the self-discipline to simply use the phone less (some people do, I understand). It took an accidental fall. And then, not wanting to spend a few hundred dollars to replace it, or suffer through the ordeal of an average AT&T or Apple customer-service experience, I just let the phone lie there in its bag, mostly inert, for several weeks. It was functional but changed, limited in a new way. As I was checking my phone one day, sheathed in its plastic envelope, my partner Lara remarked how having it in a gallon-size Ziploc freezer bag made the ridiculousness of these things wickedly obvious: we’re all hanging around gripping and staring into these awkward containers full of junk.

In his 1996 novel, Infinite Jest, David Foster Wallace imagines an early-twenty-first-century technological stress point as phone calls become increasingly video projected to others around the world—in short, a surge of things eerily like Skype or FaceTime. Wallace goes on to ponder what would follow: a feedback loop of increasingly reflexive self-awareness brought about by the constant demand of face-to-face communication by screen. Wallace conjures an elaborate cottage industry of ultrarealistic masks and background dioramas that would bloom alongside “videophonic” devices. Little did Wallace know that the problem would not be rampant self-presentation or its artifice. Our handheld devices are far more insidious for how they seduce us into tuning out while believing that we are tuned in. What Wallace got right was just how much power such communicative media could come to have over our whole bodies.

Only when you no longer carry around a phone all day—or have it at your bedside all night, to look at when you get up to pee and first thing in the morning—will you realize how chained to it you’ve become. I realize I’m making something of an arbitrary distinction here. The Internet for many of us is so intertwined in our lives that it has become a ubiquitous dwelling space, the dispersed hearths of modern homes. Where one device ends and another begins is no simple matter. The smartphone is a special kind of device, though—not because it merely gives us more of the Internet but because the smartphone gets insinuated into our creaturely lives. It has thoroughly “extended our senses and our nerves,” to borrow a line from Marshall McLuhan. McLuhan wrote those words in 1964; he was concerned that an “Age of Anxiety” was in the offing, thanks to new communications and entertainment technologies. Wallace, in the nineties, was projecting this age’s next phases. This is basically where we are now: in this age of posttruth, where the reality of where my body ends and communicative technologies begin is relentlessly complicated by the very object in question. Being without my iPhone clarified many of McLuhan and Wallace’s observations and insights about the addictive, accelerative qualities of our latest electronic media.

Not that I have any easy solution, beyond my own personal revelation. And it was a short-lived one. By the time I was revising this piece for my book, I had a new iPhone sitting mere inches from my fingers as I typed these words, and I eagerly awaited its alerts and epiphanies. Our imbrications with these devices are complex and intricate, to say the least. However, there is something to be said for the jolt of a fresh perspective. I understand that not everyone is caught in the vicious circle of checking their phone every few minutes. I recognize that many people have more self-discipline when it comes to these things. Not everyone is staring into a screen—at least, not yet.

These days, I’ve got an even newer iPhone burning away in my pocket. I don’t feel good about this, and I would be glad to discard it, especially if I were forced to again. I realize that this is a strange sentiment to articulate while not seeming able to act on it. My campus, Loyola University New Orleans, recently went smoke-free. The effect was striking. Where smokers used to sit in a place called “smokers’ alley” in the central quad, now they group together on a road adjacent to campus and puff away, newly organized, if also visibly abject. I wonder, though, if a smartphone-free campus might be a far more radical—and perhaps ultimately healthier—move for a university these days.

 

Christopher Schaberg is the Dorothy Harrell Brown Distinguished Professor of English at Loyola University New Orleans. He is the author of The Textual Life of Airports (2013), The End of Airports (2015), Airportness (2017), and The Work of Literature in an Age of Post-Truth (2018), as well as the coeditor of Deconstructing Brad Pitt (2014), all published by Bloomsbury.

This piece was adapted from The Work of Literature in an Age of Post-Truth, publishing July 26 by Bloomsbury.

Read the whole story
betajames · 21 hours ago · Michigan
cjmcnamara · 1 day ago

Migrating Arctic Geese Are Confused, Exhausted By Rising Temperatures

Barnacle geese have sped up their migration to their breeding grounds because of warming Arctic temperatures.

Warmer weather means that barnacle geese fly faster to their breeding grounds, leaving them too tired to lay eggs right away. By the time they're ready, the babies have missed the best food.

(Image credit: Thomas Lameris/NIOO-KNAW)

Read the whole story
betajames · 21 hours ago · Michigan: "Welcome to the club"

How Cars Divide America


Urbanists have long looked at cars as the scourge of great places. Jane Jacobs identified the automobile as the “chief destroyer of American communities.” Cars not only clog our roads and cost billions of dollars in time wasted commuting; they are also a terrible killer, causing more than 40,000 deaths in 2017, including some 6,000 pedestrians and cyclists.

But in the United States, the car plays a fundamental role in structuring the economy, our daily lives, and the political and social differences that separate us.

Writing from prison in the 1930s, the Italian Marxist Antonio Gramsci dubbed our modern economic system Fordism—invoking the system of automotive production developed by Henry Ford. On the factory floor, Fordism described the powerful synthesis of scientific management and the moving assembly line, which revolutionized industrial production. Applied to the economy, the term captured Ford’s move to higher pay for his workers—the famous $5-a-day wage—that enabled them to buy the cars they produced. At a broader societal level, Fordism catalyzed the shift to a mass suburbanized society.

As Ford himself once put it: “We shall solve the city problem by leaving the city.” The car enabled the American suburban dream, prompting the relocation of the middle class, industry, and business from the city. In doing so, it helped shape the relatively short-lived era of post-World War II prosperity and the rise of a stable, blue-collar middle class, stoking economic demand for the products coming off the country’s assembly lines.

But today, the car plays a central role in worsening America’s social, political, and economic divides.

This can be seen in a simple statistical-correlation analysis by my colleague and frequent collaborator Charlotta Mellander. Mellander ran correlations for the share of workers who drive their cars to work alone, along with three other types of commuting: taking transit to work; walking to work; and biking to work. She compared these to certain key features of our economic and political geography, including income, education, and occupational class; population size and density; and political affiliation and voting.

As usual, I point out that correlation in no way implies causation, but simply points to associations between variables. (All of the correlations reported below are statistically significant.)
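The kind of metro-level correlation analysis described here can be sketched in a few lines of Python. The numbers below are invented for illustration only; they are not Mellander's data, and the variable names are hypothetical.

```python
# Illustrative sketch of a metro-level correlation analysis.
# The per-metro figures below are made up for demonstration purposes.
import numpy as np
from scipy import stats

# Hypothetical data: share of commuters driving alone, and median wage,
# for six imaginary metros.
drive_alone_share = np.array([0.88, 0.91, 0.76, 0.62, 0.85, 0.70])
median_wage = np.array([48_000, 45_000, 61_000, 72_000, 50_000, 66_000])

# Pearson correlation coefficient and its two-tailed p-value.
# A negative r means: the more people drive alone, the lower the wages.
r, p_value = stats.pearsonr(drive_alone_share, median_wage)
print(f"r = {r:.2f}, p = {p_value:.4f}")
```

With real data, each array would hold one value per metropolitan area, and the p-value is what backs the "statistically significant" qualifier above.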

She found sharp differences between metropolitan areas where a high share of people drive their cars alone to work and those where greater shares of people take transit, walk, or bike to work. These differences are especially striking in light of the fact that an overwhelming share of Americans—85 percent of us—drive alone to our jobs. Car dependence also encompasses both liberals and conservatives: 73 percent of independents, 86 percent of Republicans, and more than three-quarters of Democrats say that they depend on their cars to get to work.

The key is not individuals’ car use, but the way we sort into communities based on our reliance on cars.

For one, the geography of car use tracks with income and wealth: Car-dependent places are considerably less affluent. Metros in which a higher share of people depend on their cars to get to work are poorer, and those where more people use transit or bike or walk to work are considerably more affluent. The share of commuters who drive to work alone is negatively correlated with both wages and income. Conversely, in more affluent metros, a higher proportion of commuters use transit, walk, or bike.

                    Drive alone to work   Take transit to work   Walk to work   Bike to work
Income                      -.36                  .53                .21            .23
Wages                       -.49                  .62                .22            .27
College Grads               -.53                  .50                .50            .52

The geography of automobile dependence is also related to divisions along educational lines. Metros with lower levels of educational attainment (measured as the share of adults who have a college degree) are those where a larger share of commuters drive to work. In more highly educated metros, larger shares of commuters use transit or bike or walk to work.

                                    Drive alone to work   Transit to work   Walk to work   Bike to work
Knowledge workers/creative class            -.43                .47              .32            .31
Working class                                .48               -.33             -.34           -.34
Innovation (patents per capita)             -.27                .36              .32            .37

America’s geography of car dependence also reflects differences in the kinds of work we do. Car dependence is a feature of working-class metros, while metros with higher concentrations of knowledge workers and the creative class have much higher shares of people who use transit or walk or bike to work.

We see the same basic pattern when we look at metros that are knowledge and tech hubs. Driving to work alone is negatively associated with the innovativeness of metros (measured as patents per capita), whereas the share of commuters who use transit or bike or walk to work is positively associated with innovation.

America is an increasingly polarized and politically divided nation, and the car both reflects and reinforces those divisions. Car-dependent places are much more likely to have voted for Trump in 2016. Although the associations are stronger for Trump votes, the same basic pattern holds for Romney votes in 2012. On the flip side, metros that voted for Hillary Clinton in 2016 and Barack Obama in 2012 have much higher shares of commuters who use transit or walk or bike to work.

           Drive alone to work   Public transit to work   Walk to work   Bike to work
Clinton            -.48                   .48                  .34            .36
Trump               .54                  -.50                 -.40           -.43
Obama              -.38                   .44                  .35            .30
Romney              .40                  -.44                 -.36           -.32
Density            -.53                   .62                  .30            .28

Of course, voting patterns differ based on the size and density of places as well as their educational and class composition. It is well-known that Trump took the presidency by winning smaller and medium-sized places and rural areas, whereas Hillary Clinton took America’s largest, densest, and most productive areas. Car dependence is negatively associated with the size and density of metros. People in larger, denser urban areas are more likely to commute to work by transit or bike or walk (although the correlation between population and biking to work is statistically insignificant).

All of this raises the question: How exactly does this geography of car dependence work to divide us?

The detailed historical research by Stanford political scientist Clayton Nall offers some clues. Nall’s work shows how road infrastructure that has promoted car use—and in particular America’s massive investment in the federal interstate highway system—played a profound role.

The car and car-dominated infrastructure propelled suburbanization and white flight. They split our society into white, affluent suburbs and poor black and minority cities. The car shaped the rise of what Richard Nixon identified as a “silent majority” of suburban whites back in the late 1960s, and is a precursor to the suburban and rural backlash that lifted Donald Trump to victory in 2016.

Nall has written that “Democrats and Republicans have adopted increasingly different positions on spatial policy issues such as transit and highways. Transportation infrastructure has been a necessary condition of large-scale suburban growth and partisan change, facilitating migration into rural areas that were previously unoccupied and inaccessible to metropolitan commuters and workers.” In other words, the car and the infrastructure that enables it had a huge influence on the disparities that vex us today.

The car’s politically divisive role extends beyond America. It has helped shape the politics of my adopted hometown of Toronto. Indeed, dependence on the car was a key factor in whether someone voted for the city’s late, dysfunctional mayor Rob Ford. Ford singled out so-called “urban elites” for waging a “war on the car,” and promised supporters he would remove bike lanes to give more room on roads to drivers. According to detailed research by political scientist Zack Taylor, commuting to work by car and living in the suburbs (inside the city limits) were among the strongest factors in electoral support for Rob Ford.

I’m not trying to blame the car for everything that’s wrong in America. But it is increasingly clear that in addition to wasted time and productivity, reduced quality of life, and even fatalities, the automobile takes another toll. It may be that cars are not only the chief destroyer of our communities, but are tearing at the nation’s political and social fabric.

Read the whole story
betajames · 1 day ago · Michigan