I finished reading the latest proofs of the Better Posters book a couple of days ago. Having just done that, I appreciate this quote from Charles Darwin.
When I think of the many cases of men who have studied one subject for
years, and have persuaded themselves of the truth of the foolishest
doctrines, I feel sometimes a little frightened, whether I may not be
one of these monomaniacs.
This was in a letter to one Dr. W.B. Carpenter in 1859, about none other than Darwin’s most famous work, On the Origin of Species. Darwin wrote the letter the same month the book was released and sold out in a day. I found the quote mentioned in this article.
Re-reading my own book, more than a year after finishing a manuscript that nobody else has seen yet (besides the publisher’s staff), brings up “Did I just write something that nobody else will want to read?” thoughts.
Also: I love Darwin’s hat and think there should be a new version that evolutionary biologists can buy.
What I didn’t know was its role in my favourite space movie, the first Star Wars.
I knew the Millennium Falcon went through several redesigns. The shots inside the Falcon don’t always make sense relative to the exterior, because the Falcon was originally the ship that became the rebel blockade runner.
What I didn’t know was that a good part of the reason the design changed was that it looked just a little too much like the Eagle transporter.
May I introduce Bipes biporus, also known as the Mexican mole lizard or Belding’s mole lizard.
It’s an odd and fascinating beast, because it has arms (forelimbs) but no legs (hindlimbs). You can see its front legs very well in the picture above. They even look pretty chunky relative to the head.
But there are no obvious rear legs.
There are tiny remnants of leg bones in the back of the animal, but they are not visible just by looking at the animal.
Above is Figure 8 from Zangerl (1945).
A more recent paper (Kearney and Stuart 2004) says Blanus (another worm lizard genus) has forelimb skeletal elements but only vestiges of rear limbs. But pictures of Blanus don’t show obvious limbs like Bipes does.
Why do I say this worm lizard is like a whale? Because like whales, only the forelimbs are visible. The hindlimbs are all but lost. In some ways, the worm lizard is a more impressive specimen of evolution because its forelimbs are still obviously arms, unlike the flipper of a whale, which is so heavily modified that its relationship to our own arms is obscured.
HI EXCUSE ME, I just found out that the prof for this online course I’m taking died in 2019 and he’s technically still giving classes since he’s literally my prof for this course and I’m learning from lectures recorded before his passing
..........it’s a great class but WHAT
IDK SOMETHING ABOUT IT IS WEIRD
I mean, I guess I technically read texts written by people who’ve passed all the time, but it’s the fact that I looked up his email to send him a question and PULLED UP HIS MEMORIAM INSTEAD that just THREW ME OFF A LITTLE
...that feeling when a tenured professor is still giving classes from beyond the grave
There’s job security, then there’s this lmfao.
Also like, all dystopian “you can retire when you’re dead” jabs @ the institution aside—this is actually really sad and somebody should have realized that.
This prof is this sweet old French guy who’s just absolutely thrilled to talk paintings of snow and horses, and somehow he always manages to make it interesting, making you care about something you truly thought could not possibly be that interesting.
It’s fucking sad man wtf Why would you not tell someone that? Do you think students just don’t give a shit about the people they spend months learning from?
And like, it’s shitty that I won’t get to thank him for making all of this information so engaging and accessible
I tend to you know...actually talk to my teachers a lot?
Idk man it’s just a weird thing to find out when you’re looking for an email address.
I’m getting a little tired of people comparing teachers to reusable objects so I’m going to go ahead and mute this lmao.
It’s weird to romanticize labor the way some of you do, and it’s weird to act like it’s normal to just not tell students that their teacher’s dead, goodnight!
Emphasis added.
The last time I was in the faculty senate at UTRGV, a recurring argument was about who owned courses that were created for online teaching. At the time, I thought there was far too much time spent discussing the matter.
But this example shows exactly why that question of who controls course materials matters. It is a sharp and sad reminder that as far as many institutions are concerned, teaching does not require personal interaction if pure Skinner boxing will do. Professors do not even rise to the level of interchangeable cogs. Professors are a mere convenience once they have created content.
External links
Dead man teaching (Added 26 January 2021; The Chronicle of Higher Education caught up)
A friend of mine sent me a screenshot of a page from Abominable Science: Origins of the Yeti, Nessie, and Other Famous Cryptids by Daniel Loxton and Donald Prothero.
It reads:
Invertebrate neuroethologist Zen Faulkes noted further that DeNovo lists no editor, no editorial board, no physical address—not even a telephone number: “The whole thing looks completely dodgy, with the lack of any identifiable names being the one screaming warning to stay away from this journal. Far, far away.”
The excerpt is from this blog post about the claim of sasquatch DNA being sequenced back in 2013. (Most scientists were deeply unconvinced by this.)
I’ve published enough stuff that getting cited is usually not worth a blog post. But having blog posts cited in real physical books still tickles me and is something a little unusual and wonderful.
And I think it speaks to something that makes the rounds now and then: the role of blogging in the 2020s. People occasionally pronounce blogs “dead.” While blogging isn’t a “scene” like it was in the late 2000s, a blog has a lifespan that social media just does not. Being cited in this book is one tiny little piece of evidence of that.
There’s a graphic on Instagram from August that started making the rounds on Twitter. It says, “imagine a world run by scientists instead of politicians.”
I do not like it.
I’m not showing the picture because I don’t think it deserves more eyeballs, but for the record, the people shown are: Lawrence Krauss, Neil deGrasse Tyson, Sam Harris, Michio Kaku, Richard Dawkins, and Leonard Susskind.
First, the image shows only men. That alone marks it as completely tone deaf.
Second, several of the people shown have amply demonstrated that they do not have leadership ability. Two were investigated for sexual misconduct. One was found guilty of sexual misconduct and took money from Jeffrey Epstein. Others are combative or flaky.
Third, the men (because yeah) chosen are mostly science communicators more than scientists. Sam Harris is a pop science author, not an active scientist. He’s done very little original science, ever. The others have stronger academic bona fides, but most are better known for being on television than publishing original research.
It’s so bad, I wonder if this image is supposed to be an ironic warning of the perils of science politicians. “Meet the new old men, same as the old old men, but with less experience.”
Saw a complaint about how PLOS had created predatory journals and made publication inaccessible through its article processing charge (APC) business model.
When PLOS ONE started, there was no guarantee the APC model would work. It works because scientists choose to publish there, fully aware of the costs.
It’s not a journal’s fault that scientists use them. Scientists have options. If you don't like a business model, don’t submit there. Then, convince colleagues.
I’ve written before about how refusing to review a paper because you don’t like a journal hurts authors more than editors or publishers. I called refusing to review “shooting the hostage.”
I am being sorely tested in my resolve not to shoot the hostage.
MDPI is a publisher already short of good will for their amateurish practices. Their president last week seemed intent on burning any remaining good will by spouting pretty fascist-sounding rhetoric.
When I got an invitation to review yesterday, I legitimately couldn’t do it because I’m moving. But it was a lot easier to say “No” than it would have been otherwise.
Six months ago, I wrote about how American patients dying of COVID-19 would fight with physicians who were trying to save them because they didn’t believe the virus was real.
Six months and 154,293 deaths later, and people... People. Still. Don’t. Think. This. Is. Real.
A South Dakota ER nurse @JodiDoering says her Covid-19 patients often “don’t want to believe that Covid is real.”
“Their last dying words are, ‘This can’t be happening. It’s not real.’ And when they should be... Facetiming their families, they’re filled with anger and hatred.”
It’s sad and depressing. Particularly when we have promising news that COVID-19 vaccines look like they will work.
Update, 17 November 2020: I’m heartened to hear of at least one person who changed his mind.
He mentions hating “fake news”. He says, “I don’t think covids is really more than a flu.” I clarified, “Now you think differently though?”
He replies, “No the same. I should just take vitamins for my immune system. They (news) are making it a big deal.”
I’m shocked.
I’m at a loss for words. Here I am basically wrapped in tarp, here he is in a Covid ICU. How can you deny the validity of covid? How is this possible? Misinformation is literally killing people in mass, I think to myself.
Typically as a nurse we usually put on a face. We don’t tell our patients another patient just died. We don’t tell them what we just saw. We walk in to care for that patient as they are. We give them our full unbiased care.
I make a choice. Something I’ve never done. I say, “To be honest this is my last shift. You’re the only patient of 25 that has been able to speak to me today or is even aware I’m here.”
He’s surprised but doubtful and asks if other people are doing as well as him. I tell him I’ve never seen so many people SO very sick.
“Really?” He asks if a lot of people have died.
I’m brutally honest. I tell him in 10 years of being a nurse I’ve done more CPR and seen more people die in the last 2 weeks than I have in my entire career combined.
His tone changes, he seems to have understood the gravity of what I’m saying. He apologizes.
You are a young person. Let’s say a woman, for the sake of argument. Like many others, you are mostly working at home. You may not have a lot of your own space at home, if you’re living with family or roommates.
As part of your professional obligations, you are working with a more senior person. Let’s say a man, for the sake of argument.
Your supervisor informs you that you have to install software on your computer that allows him to turn on the camera so he can watch you.
If you are interrupted by anyone, there will be serious professional repercussions. So you may have to do this work someplace private, like your bedroom.
You’re informed that the room has to be well lit and you have to dress a certain way while you’re doing the work.
If you don’t do this, there will be serious professional repercussions.
So you have an older man demanding that a young woman let him take video of her in her home, or he’ll retaliate.
Tell me that’s not creepy.
Yet that’s exactly what is happening at universities all over North America. Professors are requiring students to install some sort of “proctoring software” for exams and tests.
Of course, unlike my hypothetical scenario above, either the student or professor could be a different gender than the one I described. I picked the genders I did because I think it makes the potential for creepiness clearer.
But the intrusiveness is a problem regardless.
I wasn’t exaggerating about dictating what you can wear. This example shows professors dictating what students can have on their heads. That’s a religious issue for some students, is it not?
That’s on top of issues like this one making the rounds on Twitter. A student got a zero on an exam because she read questions out loud. The software flagged this.
It’s not clear if the software or the instructor decided that this constituted cheating, but someone, somewhere decided that the only possible reason a student might talk during an exam was to speak to a confederate to cheat. That’s stupid.
There seems to be only one counter to pointing out these concerns.
“But they’ll cheat.”
And many professors will be quick to detail all the times they caught students cheating in one way or another.
Academic integrity is important. I get that. The degree has value because people trust that it represents a fair assessment of a student’s internalized knowledge and abilities.
But the presence of cheating alone does not justify any and all actions that professors might take in the name of “academic integrity.”
There is such a thing as “proportionate response.”
If you are worried about someone walking on your property, you put up a sign and put locks on your doors. You do not install a minefield to blow up people. Because that would not be a proportionate response to the problem.
Trusting students is hard. Some students will abuse that trust. But there is a line between thoughtful use of measures designed to say, “Cheating is not okay, so don’t do it” and an overblown invasion of students’ lives.
Anyone more worried about students cheating than they are about how to get students excited about the material and learn has already lost the battle. - Amelia Lindsay
It took almost three years to publish an article, compared to two years of doing the research on sand crabs. Unfortunately, I am not even able to check out this article, as it is paywalled for ten dollars. Of course this case is extreme, but sometimes even two, three, four months are crucial, not only for scientists’ careers but also for the impact and relevance of this research for the society that actually paid for it.
The video also features Björn Brembs, who’s consistently been one of the best commentators of academic publishing.
(This was written for a behaviour class I am teaching this semester.)
Konrad Lorenz was an important figure in the development of the science of animal behaviour. But I also want to acknowledge that he was a member of the German National Socialist party in the 1930s (Kalikow 2020). Which is to say, Konrad Lorenz was a literal Nazi.
Munz described his party affiliation as “an ugly mix of careerism and genuine enthusiasm for the Nazi regime.” Some of his writing (not necessarily his scientific articles, but his letters and the like) showed many anti-Semitic attitudes and arguments for eugenics.
Lorenz was never in the military during World War II. (Correction, 7 October 2020: Lorenz served as a military physician in Poland near the end of the war. Kalikow 2020.) He was not personally pushing people to their deaths. After the war, he said that he was never a party member. It’s not clear to me whether his attitudes ever changed.
I bring this up because there’s a tendency to talk only about scientists’ research contributions, and gloss over or ignore other things they’ve done, particularly when those actions are distasteful or horrible. We like it when people are consistent. We like it when people who create work that is useful, powerful, or enjoyable are also decent human beings.
That is, unfortunately, not always the case.
An author who created a world you love might be racist, homophobic, or transphobic. An actor you enjoy watching might end up doing a perp walk for some crime or misdemeanor. A song you love might be sung by someone who was abusive. And it can make it hard to sing that song that you love.
But we do ourselves no favours by acting as though only the science matters. It matters when someone was a bigot or a bully or whatever. Real people suffer real hurt because of those attitudes. We have to grapple with the fact that terrible people can do good science.
Part of that is owning up to the dark corners of scientific history. That’s one small part of how we treat people in science better now and in the future.
Munz T. 2011. “My goose child Martina”: The multiple uses of geese in the writings of Konrad Lorenz. Historical Studies in the Natural Sciences 41(4): 405–446. https://doi.org/10.1525/hsns.2011.41.4.405
So across the US, university towns are quickly becoming COVID-19 hotspots because campuses reopened, with crummy plans, and despite warnings for months. Given that South Texas was already a hotspot for COVID-19, with something like 20 deaths reported every day in the county during weekdays for weeks, I was convinced reopening UTRGV – mandated by the UT System very early on – would be a disaster.
Last week, I swung by campus in the middle of the day. This was the second week of class, and the university is nominally open and holding face-to-face classes.
The parking lot was as empty as I see it in the week after spring semester ends. Almost as empty as the week between Christmas and New Year’s.
I was incredibly relieved.
Somehow, our faculty and students have made this semester a de facto online semester for the campus. I don’t know how it happened, because there was no coordination, but I’m glad it did.
Today is my last day as a tenured full professor at The University of Texas Rio Grande Valley.
This is going to be a tough post to write.
Since starting at The University of Texas Pan-American, I’ve made no secret to people here that I would love a reason to move back to Canada. But it’s always been a low-level, “Wouldn’t it be nice if...?” wish. I had been looking and occasionally applying for years.
But this year, a new word kept forming in my head:
“Flee.”
I kept wondering throughout the summer, “Is it time to go, regardless of the job I have now?” At one point, I took out a lot of cash from my bank account in case I needed to leave immediately. The sort of money that many people call the “Fuck you” fund in case they have to leave an abusive partner. Things have felt that bad.
Living in the United States in 2020 has broken my belief in this country.
And I don’t think an election and a vaccine are going to fix it.
It’s not just that the current administration is awful (though it surely is). It’s how so many people have embraced the awfulness.
It’s about how the US can’t address its chronic problems. The COVID-19 pandemic didn’t create these problems but it sure as hell threw them into sharp relief.
Over at the Better Posters blog, I’ve been compiling pictures of 2020 events in the United States. They are probably more powerful than anything I might write.
I accepted a new position at McMaster University in Ontario, Canada. And doing so was the most brutal professional decision of my life. That is the only word I have to describe the decision. Brutal. There was some crying on the couch with me holding my wife in one arm and my dog in the other arm.
Giving up a tenured professorship? I mean, that is the thing that you are just not supposed to do. I like being a professor. I like doing research. I still have questions about those sand crabs and crayfish that I want to answer. It hurts to think that for all I know, I’ll never be able to hold one of those little Lepidopa in my hands again. I like my department colleagues. I like the students I work with.
But this job offer seemed to come at a “now or never” moment. Between taking the time to focus writing the Better Posters book, followed by a global COVID-19 pandemic making both field and lab research difficult, my biological data collection had practically ground to a halt. It looks like 2020 may be the first year in well over a decade that I haven’t published something. I’m not abandoning projects in mid-stream. I don’t have any graduate students who are counting on me to finish their degree. No mortgage I’m stuck with.
McMaster is teaching remotely this semester, so I will still be in Texas for a while at least. But the plan is to move back to Canada. I am anticipating massive reverse culture shock. I know that Canada is not perfect, but Canada at least looks like a functioning democracy and not like a collapsing empire.
I am not sure what this move will mean for me professionally. But I am convinced that this move will result in a better quality of life for me, my wife, and family. I want to look after myself and them. This is not just me wanting to move home.
But I am not kidding myself. There is a big leap of faith here. And in any leap of faith, you have to ask what do you believe? Do I believe I am smart enough and hard working enough and resilient enough to make this okay for me, my wife, and family?
Leaps of faith are scary.
As I mentioned a while ago, I have recently rediscovered the music of The Alarm and Big Country (both bands connected by frontman Mike Peters). And as so often happens, music helps.
There will be hurt, there will be pain
There will be a lot of tears, a lot of joy
What we have left cannot be destroyed
Time to move on, to let it bleed
What will be, will be
There is a land, there is a sea
There is a place where we can be
There is a hope, there is a dream
Mike Peters: “Sometimes, you’ve got to make the journey. We have to make that leap of faith. We have to cross that line to embrace what is happening now. We’ve crossed that line, and this is a song that lyrically encourages everybody to cross that line. And it acknowledges, ‘Yes, it’s gonna be a tough journey. There will be pain, there will be joy, there will be tears.’ Everything associated in life comes into making this particular journey.”
The death of actor Chadwick Boseman is unexpectedly hard.
It was bad enough that he died young, but it was seeing the tributes and reaction on Twitter that brought it home. What he brought with his performances meant so much to so many people – especially his work as T’Challa, the Black Panther, in four movies.
People had wanted to see a hero like him for so long.
Losing a hero is hard. Even (maybe especially) a fictitious one.
Before I saw Captain America: Civil War, I read an interview with one of the filmmakers, who nailed what Boseman brought to the role of T’Challa. Boseman’s performance didn’t blow you away: it made you lean in. When I saw the movie, that was exactly it. It wasn’t a showy performance, but it was intense and magnetic and compelling.
And what else has been clear in the last few days was that he was so much more than his acting roles and much more than T’Challa. By all accounts, he was just a decent human being.
In one of the most famous scenes in Black Panther, Killmonger says, “Is this y’all’s king?!”
Despite its decisive drubbing in Kitzmiller v. Dover, intelligent design just keeps showing up like the proverbial bad penny. The latest poking of the intelligent design helmet out of the foxhole is a paper in PNAS, spotted on Twitter.
I don’t like that an intelligent design paper was published in a journal. But nor do I like comments about the age of the author.
Guess that’s what happens when a paper about evolution by an 87-year-old physicist is reviewed by another physicist and a complex systems theorist. (here)
That the author is in his 80s is trotted out as though it’s an explanation. How? How is the author’s age at all relevant? I don’t know of any data that show people slide into intelligent design beliefs as they get older.
Don’t judge work by the age of authors. That’s ageist.