Charles Kenneth Roberts

Politics, History, Culture

Links to Read: April 2019

The LeBron Paradox: This is a good LeBron James article because it gets at what I think is the fundamental aspect of James, something that’s applicable to him and just a tiny handful of living human beings: it’s impossible for normal people to imagine what it’s like to be him. It’s easier for me to imagine being the president, who is in the end no more than a particularly busy and public bureaucrat, than to imagine being LeBron James. As Brian Phillips puts it: “I can’t imagine what it’s like to be him. Can you? With any confidence? To have gone in the blink of an eye, while still an adolescent, into a state of almost unfathomable fame, cameras everywhere you turn, so many people to tell you ‘yes,’ so few people to tell you ‘no.'” And that doesn’t even consider the transcendent physical talent.

Is Prison Necessary? Ruth Wilson Gilmore Might Change Your Mind: Whether or not you are a prison abolitionist, I think a general policy that aims us toward prison abolition, even if we never get there, would be better for society.

How to reduce digital distractions: advice from medieval monks: “They complained about being overloaded with information, and about how, even once you finally settled on something to read, it was easy to get bored and turn to something else. They were frustrated by their desire to stare out of the window, or to constantly check on the time (in their case, with the Sun as their clock), or to think about food or sex when they were supposed to be thinking about God.”

The Raisin Situation: I wondered why everyone kept talking about raisins on the internet. This is why. It’s good! Raisins are serious business.

 


“The Self-Defeating Logic of the Attack on Pearl Harbor”

This is the text of a talk I gave to the Randolph County Historical Society, titled “The Self-Defeating Logic of the Attack on Pearl Harbor.”

The title of my talk today is “the self-defeating logic of the attack on Pearl Harbor.” I’d like to discuss why Japan attacked the United States at Pearl Harbor and why, despite being a short-term victory, it ended up being a long-term failure.

There are many examples in history of winning a battle but losing a war. Sometimes, it’s the fact of winning that battle that loses the war. I don’t think there’s a better example of this than the Japanese attack on Pearl Harbor on December 7, 1941, which has gone down in history as “a date which will live in infamy,” in the famous words of American president Franklin D. Roosevelt. The Imperial Japanese government believed that its attack would accomplish a number of different aims that would guarantee Japanese success and power in the Pacific. As it turned out, the very act of attacking Pearl Harbor ended up guaranteeing that Japan could never maintain a position as the leading power in the Pacific.

Links to Read: March 2019

The Epic Hunt for a Lost World War II Aircraft Carrier: I had the exact same reaction reading this that Ed Caesar had reporting it: surprise when “I realized how invested I had become in the Petrel’s finding the Wasp.”

Why do so many Egyptian statues have broken noses?: Assuming human civilization survives another 4,000 years (it won’t), I wonder what future archaeologists and historians will make of all of our weird stuff. Will they think Banksy was some kind of priest? There’s really only one piece of art I care about surviving, of course.

The Day the Dinosaurs Died: “A 2013 study in the journal Astrobiology estimated that tens of thousands of pounds of impact rubble [from the asteroid that killed the dinosaurs] may have landed on Titan, a moon of Saturn, and on Europa and Callisto, which orbit Jupiter—three satellites that scientists believe may have promising habitats for life. Mathematical models indicate that at least some of this vagabond debris still harbored living microbes.”

Links to Read: February 2019

How Tech Utopia Fostered Tyranny: Perhaps the first response to this essay is a thought about unintended consequences, how the blinkered optimism of our tech elites has ended up empowering the worst kind of actors. But I think the real lesson here, at least in the democratic-ish West, is the even greater need for mental discipline. It’s necessary to be conscious about the kind of media you consume and, more specifically, the kind of media ecosystem you create around yourself. Maybe it’s in the service of crass consumerism, maybe it’s promoting conspiracy theories, but there are ideas circulating that are intentionally or unintentionally engineered to change our behavior and thought processes. What we think about and react to has to become an intentional decision.

The Secrets of the World’s Greatest Art Thief: This is a fascinating story, although I find it rather distasteful to romanticize the actions of what appears to be a mildly sociopathic thief. Finkel gave a similar treatment to “the Last True Hermit,” although that was a somewhat more sympathetic character.

A Journey into the Animal Mind: I think the general failure to treat other living creatures, human and non-human alike, in a decent way will go down as the greatest sin of our species.

Climate Change Enters Its Blood-Sucking Phase: A drastic increase in tick-related fatalities among moose is the result of a combination of an environmental success (the revival of moose numbers in New England) and an environmental disaster (climate change).

A Whale’s Afterlife: I love this kind of Blue Planet stuff. What’s fascinating to me about this brief article is how connected different natural environments are; there are weird bone-eating worms on the bottom of the ocean, thousands of feet down, that apparently only exist because of whale carcasses that sink to the bottom. I should have been some kind of oceanographer.

 

Links to Read: January 2019

The Sea Was Never Blue: Something I think about fairly often is how much of what we consider objective is subjectively experienced. Color seems as universal as anything can be: our eyes receive light waves of varying wavelength and brightness. But different societies divide the color spectrum differently, and they emphasize colors in different ways; one society might refer more to brightness, another to hue. People speaking languages without a blue-green distinction have trouble differentiating between colors that seem obviously separate to speakers of other languages.

Sidney Wants to Be Someone Else: What if we just let people go to high school as often or as long as they wanted? The cost would be a downside, but I don’t know what other serious problems there would be.

What Life Is Like When Corn Is off the Table: Americans live, of course, in a nightmare dystopia, though one that is fairly pleasant as far as nightmare dystopias go. There are all kinds of bizarre aspects to it that will make fascinating topics for future writers of cultural anthropology, history, and sociology books. The omnipresence of corn is one of those things: if you’ve got a severe corn allergy, everything in America is trying to kill you, because everything has corn in it, because the federal government uses a variety of subsidies and penalties to make corn cheap, because corn farmers have outsized political influence.

A 4-Year-Old Trapped in a Teenager’s Body: “Didn’t he know how lonely it was? Didn’t he know my brain was unfit to handle the hormones assailing my body? Didn’t he know the behavior for which he was constantly punishing me was out of my control? Of course he did. He’d been through it himself. But he had dealt with precocious puberty by lying about it, concealing it, and ignoring it, and that was how he wanted me to deal with it too — as if it didn’t exist, as if the cause of my misbehavior was simply my own immaturity, poor decision-making, and lack of self-discipline, all things I could control if I weren’t so weak.”

Links to Read: December 2018

Since I lack the time (or time management skills) to blog as much as I’d like, I’m going to try using this blog as a platform for something useful: a reading journal of sorts, in which I will link to interesting articles I have read that I think other people may be interested in. Will I stick with it? Tune in to find out!

How Much of the Internet Is Fake? Turns Out, a Lot of It, Actually.: “Studies generally suggest that, year after year, less than 60 percent of web traffic is human” is perhaps the most interesting thing here. Increasingly I view the internet like the automobile: something that, despite its great value and potential, turned out to have an overall harmful impact on society.

Something this article doesn’t really address, but which I think is part of the same phenomenon, is how the internet, or more specifically social media, drives the most artificial, robotic behavior, not by force but by some perverted choice. I recall how distinctly unnerving it was during the Kavanaugh hearings to watch people who were clearly real humans, not bots, go through Twitter’s search function to respond almost word-for-word with the same argument to random people’s tweets. You can also include people who respond to tweets or in the comments section with identical “fake news!!!”-type responses over and over again. This isn’t a spontaneous human response, but it’s not a bot, either. We’re becoming the lamest kind of cyborgs.

Here’s a sort of related story about how to abuse the Amazon Marketplace. Here’s a sort of related story about, well, whatever this is.

How Mark Burnett Resurrected Donald Trump as an Icon of American Success: This article captures much of what’s wrong with America. It’s tough to pick a single excerpt, but this is relevant:

“The Apprentice” was built around a weekly series of business challenges. At the end of each episode, Trump determined which competitor should be “fired.” But, as Braun explained, Trump was frequently unprepared for these sessions, with little grasp of who had performed well. Sometimes a candidate distinguished herself during the contest only to get fired, on a whim, by Trump. When this happened, Braun said, the editors were often obliged to “reverse engineer” the episode, scouring hundreds of hours of footage to emphasize the few moments when the exemplary candidate might have slipped up, in an attempt to assemble an artificial version of history in which Trump’s shoot-from-the-hip decision made sense.

The Insect Apocalypse Is Here: I did not expect this first batch of links to be such downers, but here we are.

Nancy, a 1930s comic strip, was the funniest thing I read in 2018: Here’s something uplifting. Nancy is so good.

My Academic Notebook Productivity System

There are a lot of productivity systems out there. I have developed one that works for me, and I’m laying it out here in case you might find it useful, too. It’s a mash-up drawing especially on Getting Things Done, but also on Bullet Journal, Autofocus, and Raul Pacheco-Vega’s Everything Notebook. This system won’t work for everyone, but I think it will work for people with jobs like mine. I work at a small, teaching-oriented college where everyone has to wear a number of different hats. During the summer, I can focus on bigger or more intensive projects, but during the school year, I have what I think of as lots of little stuff to deal with. I have a heavy teaching load and a heavy service load, and I try (with limited success) to get some scholarship done during the school year as well. My day is interrupted frequently with classes or scheduled meetings. This is the system I developed to manage that.

It has three basic parts:

  • Everything I have to do goes on a to-do list, either the master weekly to-do list or its own separate to-do list.
  • Everything that has to be done at a specific time goes in an electronic calendar with notifications.
  • Everything has a home where it belongs in a physical or electronic filing system.

The notebook. The most important part of my personal productivity system is the notebook to-do list system. For me, the most valuable part of Getting Things Done is the concept of universal capture–you never have to worry about remembering things because they are all written down and organized. But I am easily and often distracted. If I don’t have a reminder directly in front of me, I forget. I need to be able, at a glance, to see everything I have got going on right now. So, directly beside my keyboard on my desk lives a notebook. During the school year, it looks something like this:

[Photo: a sample weekly to-do list page in the notebook]

I like to use legal pads because they’re longer, but you can use whatever notebook you want. I rewrote this sample week to be easier to read, which distressingly means that this is as neatly as I can handwrite anything, even when I am specifically trying to. If I have a weekly to-do list that is longer than a single notebook page, I take that as a sign that I have to re-prioritize or reorganize something in my work life.

At the beginning of each semester and the summer, I go through the notebook and write down the week number, date, and any to-do items that are recurring (like grading online discussion threads every week) or known well ahead of time (like preparing for final exams or revising a paper to present at a conference). As the semester goes by, I add new things as needed. If there’s going to be a meeting I need to prepare for in three weeks, that goes in the list three weeks from now. If someone mentions something that I don’t quite know what to do with right now, it goes on the list for further processing. Each item is an action item–it is a specific task that I can do in a reasonable amount of time to further some goal or meet some responsibility.

In this system, most of the things I have to do are given a specific day to be done, but not a specific time. This allows for greater flexibility. I used to try scheduling specific times for grading or revisions, but one emergency meeting or rescheduled essay and the whole day is off-kilter. As I go, it is easy to add new items to the list; because the day of the week is written in the margin on the left, adding new things does not mess up the order or confuse me later. If something is too big for a single action item, it gets a separate list of its own. Like, I assume, most college professors, I produce a tremendous amount of ephemeral printed material, so the back of an unused meeting agenda or old quiz usually becomes the home for a sub-to-do list and gets tucked behind the appropriate weekly page in my to-do notebook. Full-blown projects get their own separate to-do list and folder(s), with the relevant weekly responsibilities also being listed in the notebook.

As each item is finished, it gets marked with a single line. Sort of like the Bullet Journal, I have a system of indicating whether a task was completed, deferred, or not completed. So, by the end of the day on Friday, my to-do list might look like this:

[Photo: the same weekly to-do list at the end of the week, with items marked off]

Something marked through is completed. An arrow indicates that an action item was moved–perhaps a committee meeting was rescheduled, so I didn’t have to prepare for it. An X indicates that I did not complete an item. I can mark through something but also X it to indicate to myself that I partially or unsatisfactorily completed an item, perhaps that I didn’t sharpen up a paper presentation as much as I had hoped and had to add it to next week’s list. Things with just an X weren’t completed; sorry I didn’t finish grading your essays, HIS 105! Anything more complicated to remember than that becomes a note or memo in the appropriate file.

In addition to the listed action items, the notebook can serve as an inbox of sorts (because if I have another separate inbox where I do not constantly see everything, I will forget about it). Papers to process, a new copy of an academic journal, or notes I wrote to myself can be piled on/under the notebook to remind me to deal with them. I don’t need to write “Read Agricultural History” when I can just set my copy under the notebook, where it is impossible to miss. If I have something that’s more substantial than a single action item, like reviewing a book, it can be turned into to-do list action items.

Finally, at the bottom of the notebook is an orange tab. That’s my “everything else” section, a combination list of ongoing projects and things to think about when the semester is over or which otherwise don’t require immediate attention. A fellow professor mentioned using Kahoot in class and recommended it, but after some thought I decided I’m going to need to consider if or how to implement it more carefully, so it goes to the end of the notebook. Periodically (i.e. when it gets to be more than a page long, so I can’t see it all at once) I reorganize that section of the notebook into a series of more manageable reminders or to-dos.

The calendar. When I was an undergraduate, every semester, I did the exact same thing. I got a physical planner, maybe one handed out by the college, or one gifted by an optimistic relative, or one I wasted money on by purchasing myself. I wrote all my classes down in the schedule and put important due dates throughout. I’d use it for about two weeks, and then one day I would forget to look at it, and then that was it. If I printed out a schedule and taped it over my desk, that sometimes worked better, but the real lesson is that I don’t remember things and need to be told what to do constantly like a child.

Thanks to the magic of computer technology, I can have something tell me what to do constantly. I use Google Calendar (which links to my college e-mail address, since we’re a Google-powered campus, but you can use whatever works for you), and every single scheduled event goes into it. This means all committee meetings, college events, meetings with students, classes, and even my office hours. Everything gets a ten-minute notification, which makes a distinct buzz on my phone and a little pop-up on my computer; things that are farther away or for which I am afraid I will forget to prepare get an earlier reminder, too.
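(If you ever wanted to script this rather than click through the Google Calendar interface, a minimal sketch using Google’s Calendar API v3 Python client would look something like the following. The meeting name, dates, and credential file here are hypothetical, just to show the shape of the notification setup I described; this isn’t something I actually run.)

```python
# Hypothetical sketch: create an event with the same reminder setup described above,
# a ten-minute pop-up plus an earlier heads-up, via the Google Calendar API v3.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Assumes you have already completed the usual OAuth flow and saved a token file.
creds = Credentials.from_authorized_user_file(
    "token.json", ["https://www.googleapis.com/auth/calendar"]
)
service = build("calendar", "v3", credentials=creds)

event = {
    "summary": "Curriculum committee meeting",  # hypothetical meeting
    "start": {"dateTime": "2019-04-15T14:00:00-05:00", "timeZone": "America/Chicago"},
    "end": {"dateTime": "2019-04-15T15:00:00-05:00", "timeZone": "America/Chicago"},
    "reminders": {
        "useDefault": False,
        "overrides": [
            {"method": "popup", "minutes": 10},       # the ten-minute buzz
            {"method": "popup", "minutes": 24 * 60},  # earlier reminder for things that need prep
        ],
    },
}

service.events().insert(calendarId="primary", body=event).execute()
```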

The end result of this is that I never worry if I have missed a meeting or that I am supposed to be somewhere that I am not, because my calendar tells me what to do. During the work week when I am not at a scheduled something, I’m doing whatever is on my to-do list for the day, or in the unlikely event I am ahead, taking care of things for later in the week.

The filing. In a way, my notebook is a combination to-do list and chronological index of or hub for my ongoing projects and responsibilities. Instead of constantly having all kinds of stuff rattling around in my head, I can safely let it go, assured that I will remind myself what I need to do and when I need to do it. Everything else gets put away until it is needed. A research project, course redesign, or some other big enterprise I’m working on has its set of to-do lists and notes, but it’s at home in a manila file folder in my desk unless I’ve specifically noted to do something with it on my weekly to-do list or I come across some relevant information that needs to be processed.

Banning Laptops in the College Classroom

There has been another internet dust-up about whether or not professors should ban laptops in their classrooms. Matthew Numer says students should be insulted by laptop bans, and Kevin Gannon writes that the argument behind the laptop ban “fails miserably as an intellectual position about teaching and learning.” But I think the arguments against a laptop ban are, generally speaking, wrong. With all the usual caveats and exceptions, in my experience laptop bans are a good idea in introductory-level college classes. I would (and do) recommend them to my colleagues.

To me, there are three strong arguments against the laptop ban, and I think the two articles above do a good job of capturing them: 1) laptop bans are unfair to students requiring disability accommodations; 2) laptop bans are a bad response to larger issues in education, like poor pedagogical techniques or overcrowded courses; 3) laptop bans fail to treat students like the young adults they are.

Only the first argument has any real traction with me. That’s why I meet with the director of our campus’s disability services office each semester before classes begin, to discuss how the ban might affect the students taking my class that semester. I’m fortunate to work at a college small enough that it’s realistically possible to discuss the impact of classroom policy changes like this at the individual level, and I’m aware that not every professor has that luxury. Also (as I think everyone who does not allow laptops in class feels compelled to explain), I don’t have a blanket ban; students can get special permission to use a computer. I don’t call people out in class for using a laptop, although I do try to catch them as they leave to make sure they know the policy. So far, most of my students requiring accommodation prefer to record lectures in addition to taking notes by hand, rather than typing notes on a computer. The most regular laptop users I have are our frequently injured student athletes; there is consistently someone in the room who has permission to use a computer, so I don’t think using one would single out a disabled student, any more than other, more obvious accommodations do, like missing class or leaving early to take an exam or quiz with extended time. I don’t think this is a perfect solution; I can imagine a shy student who would be better accommodated using a laptop but who decides not to because they don’t want to (in their view) cause trouble. But it’s also the case that the argument for better disability accommodations could work the other way, in favor of a laptop ban. Limiting the extent to which additional screens can distract someone is an upside to a technology ban that people rarely talk about.

That laptop bans are a bad response to larger issues in higher education: well, yeah. Allowing laptops in class isn’t going to make the college enough money to fund the additional faculty members to allow me to have the ideal seven-person seminar-style class I want to hold. It won’t make the college enough money that we could loan all our students laptops or have all our classes in computer labs so that I could re-structure class meetings to incorporate technology (at my institution, I can’t be certain a student has a car or a cell phone, much less a laptop; I’ve yet to find a way to incorporate electronic devices into the classroom that doesn’t leave some people behind). Until then, we’re all making do.

The most common argument for allowing laptops in the college classroom is that students are adults and should be allowed to make these choices on their own; if they don’t choose to participate in class, that may or may not be a rational choice, but it’s their decision to make. On this view, we the professors have the obligation to win their attention. To quote Matthew Numer:

Our students are capable of making their own choices, and if they choose to check Snapchat instead of listening to your lecture, then that’s their loss. Besides, it’s my responsibility as an educator to ensure that my lecture is compelling. If my students aren’t paying attention, if they’re distracted, that’s on me. The same goes for anyone presiding over a business meeting.

(I feel like they will fire you if you have a meeting with the boss and play on your laptop instead of at least appearing to pay attention, but it’s been a while since I had a job outside academia.)

and Kevin Gannon:

Ultimately, it comes down to how we see our students. Do we see them as adults, or at least as capable agents in their own learning? Or do we see them as potential adversaries in need of policing? Do we see them as capable of making their own decisions and learning from those that aren’t necessarily the best ones?

The arguments that students are grown-ups and that it’s the fault of the professor if they or their course are insufficiently engaging seem to me manifestly contradictory. One thing that successful adults do is pay attention to things they find boring. I have to do it all the time! I’m not enthralled navigating SACSCOC regulations or doing my income taxes, but I do it, because it’s important. I don’t personally find everything in history (even in my field or my classroom) equally exciting or interesting, but there are boring things you have to understand to do the neat stuff. Everybody wants to be able to order wine in a fancy Parisian restaurant; nobody wants to learn irregular verb conjugations.

I teach at a tiny, mostly-associate’s degree granting institution in the rural Deep South, largely teaching non-majors in survey courses. I can’t always count on my students to know what the best decisions are; that’s part of what I am trying to teach them. I could let a student learn that watching soccer online instead of paying attention will lead to a failing grade; in fact, I did, in one of my larger classes before I stopped allowing students free use of electronic devices in class. I don’t feel like doing so empowered the student.

Perhaps most importantly, I don’t want to allow students to make bad decisions that make learning harder for other students in the classroom. I’m hesitant to write further, even anonymously, about my current or former students on the internet, but I think most people who teach survey-level courses have had students who simply don’t know how to be in a college classroom, or students who are smart enough to get through a survey class without trying particularly hard in class. I have seen such students act in ways that make it more difficult for other, struggling students to succeed. I’ve been teaching long enough not to take it personally when students look at their phones rather than listen to me, but students participating in classroom discussion or making a presentation might take it differently.

I can imagine an introductory history survey that uses laptops or other electronic devices in a way that makes the class better (though I can’t easily imagine it realistically happening where I currently teach). Generally speaking, and in my experience, students using electronic devices do not improve the classroom, and they usually make it worse. And I’ve not heard differently from a student. Anecdotes aren’t data, etc., but I’ll finish this with a story: at the end of the semester, I usually give each class as a whole a bonus opportunity: for every day that the entire classroom is engaged (including reading the material beforehand, etc., but most importantly, nobody focusing more on their cell phones than on the class), the entire section gets a bonus on the final exam. From what I can tell, this improves grades, but more than that: I’ve also had several students tell me that I should do the same thing the whole semester, that they felt like they did better when they knew they couldn’t become distracted. I’ve yet to find the student who does better with more distractions in the classroom.

“Good riddance”

There are a variety of things wrong with the Democratic Party; there are a lot of reasons why it’s, generally speaking, losing in the 21st century. I think the most important reason is that, despite what the base wants and the overall popularity of the policies supported by rank-and-file Democratic membership, the party leadership is beholden to the economic elite who favor policies which benefit the already-rich and powerful. Universal healthcare is a good example of a policy that most Democrats, especially the most motivated and committed, want but that the party leadership has hesitated to get behind.

But that’s not the only thing wrong with the Democrats. Josh Marshall has written an article about Alabama’s Republican nominee for Senate (and likely winner) Roy Moore, whose most important financial backer has expressed Christian supremacist and neo-Confederate ideas and is associated with the League of the South (as is Moore).

That’s awful, though it’s not terribly surprising. But whenever you look at stories like this on liberal/Democratic-leaning websites, or when they’re posted by prominent liberal Twitter accounts, there’s a common tendency in the comments or responses. By my count, a solid majority of the replies to those tweets and a sizable chunk of the comments on TPM are some variation of “Good riddance!” or “Let them go” or “Kick them out.”

So: an explicitly Christian supremacist with ties to more-or-less openly white supremacist organizations wants to take control of a state that votes 35-40% Democratic and is 30% or 35% non-white (depending on how you count Hispanics). For a sizable portion of the people who care enough to post about it on the internet, the response is to just let him do that.

This isn’t a matter of whether Democratic policies favor those voters, or whether Republican policies harm them. Compare the way that conservatives respond to stories about, say, the (much-overstated) liberal bias on college campuses. There are calls for affirmative action for conservative academics, calls for the firing of professors who make inflammatory liberal statements, insistence on equal rights to use college spaces for even the most extreme conservative/anti-liberal speakers, and demands that college administrators or even state legislatures step in to protect conservative voices. There’s an element of “that’s what you get when you go to liberal colleges,” or talk about conservative alternatives, but the overall response is extremely supportive of conservatives who decide to attend those institutions (a self-selected identity, unlike where you’re born).

I know it’s the internet, and it’s Twitter, and it’s the comments section. I know a lot of people think it’s funny to post such thoughts and wouldn’t actually favor secession. But I also know that lots of black people didn’t vote in 2016 because they didn’t think the Democratic Party did anything for them. Black voter turnout declined sharply in 2016, and despite the most anti-immigrant candidate in decades, Latino turnout didn’t increase very much. Republicans were effective at reducing (and in some cases, suppressing) turnout, but Democrats were also bad at increasing it.

The reason so many black and rural Americans feel like Democrats don’t care about them is that lots of Democrats don’t really care about them, or at least don’t act like they do.

Southerners, southerners, and southerners

People were rightly critical of this idiotic tweet sent by Virginia gubernatorial wannabe Corey Stewart:

[Screenshot of Corey Stewart’s tweet]

But I am glad for it. Perhaps you, like me, are a history professor teaching American history, and it’s the end of the semester, so you’re in the middle of or getting into a discussion of secession and the Civil War. This tweet is a perfect teaching moment, a distillation of everything wrong with how we talk about secession, the Confederacy, and the South.

Stewart’s comment is about the wrongly-described “Confederate” monument (actually a monument to an attempted Reconstruction-era white supremacist insurrection). Stewart describes “a Yankee” and “a Southerner” as though these are real and natural categories. The implication is that someone from the North and someone from the South would have opposite perspectives on this monument. Stewart’s understanding of “Southerner” means white supremacists who supported the Confederacy; the South’s black population, or for that matter the large white Unionist population and whites like Italians who also faced racialized violence, is completely erased.

I think I am going to start class today with this tweet.

Perhaps the most important part of this tweet, which really does an amazing job of packing so many wrong and bad assumptions into so few characters, is the “don’t matter” part. This rhetorical maneuver asserts that people who oppose white supremacist public monuments don’t care about history, southerners, or the South. People on Twitter have clever comebacks (“really? nothing is worse?”), which are fun and enjoyable, but we shouldn’t allow the rhetorical boundary-setting of this tweet to go unchallenged. Nobody is saying these monuments or the history they represent don’t matter; they mattered, and still matter, all too much.