Maybe one to show your classes in the next week or two as a mood lightener (and encouragement for making that first trip to office hours)?
Courtesy of Arizona State University (check out the story behind it here):
Website Wednesday is an occasional feature in which we highlight one (or a couple) of sites from the billions floating around the Intertoobz that just might help you with your Herculean task of educating inquiring minds. Any and all suggestions for future editions are welcome.
Pedagogy Unbound: “A place for college teachers to share practical strategies for today’s classroom.”
Take something/leave something.
~On the myth of a gender gap in mathematical ability (Gawker)
~The math and cost of pennies (xkcd.com)
~Statistics and some (devastating/common) fallacies of probability–very accessible and interesting (Salon); or learn about Bayes and his famous theorem (farnamstreetblog);
~What do you know about infinity? Did you know there are different infinities? There are different infinities (Plus.maths.org and NYT)
~Check out the mathematics of sport (note the great set of links if you have interest in a particular sport) (sabermetric research);
~This one has infinity in the title, but it’s about a person teaching math in prison (Prospect)
~Check out the world’s fastest number game; can you correctly sum 15 numbers shown to you in 1.85 seconds? Even if so, you wouldn’t have won this year’s championship; I’m not sure if the video makes it more or less believable (Guardian)
~More about history and society than Math proper, it’s fascinating anyway–“A look at anti-Semitic university admissions in the USSR from the perspective of a leading mathematician” (New Criterion)
~Another history lesson–this time on Emmy Noether, “the most significant mathematician you’ve never heard of” (NYT)
~Teach yourself logic or at least gather some info about resources for doing so (Logic Matters)
~Read about the excitement over a claimed proof of a deep conjecture about whole numbers–“The usually quiet world of mathematics is abuzz with a claim that one of the most important problems in number theory has been solved. Mathematician Shinichi Mochizuki of Kyoto University in Japan has released a 500-page proof of the abc conjecture, which proposes a relationship between whole numbers — a ‘Diophantine’ problem”
~Learn why Base 12 is better, if Art DiVito didn’t make it clear to you already (Guardian)
~Fractals materialized (NYT); or read about the “father of fractals” (WSJ)
~Math-phobic parenting (WSJ, via Jen Asimow’s Math At Home Blog)
~Eleven ways shoppers go wrong in their math (Atlantic)
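One of the links above points to Bayes and his famous theorem; for the curious, here is a minimal numerical sketch of how the theorem plays out (the disease-test numbers below are illustrative, not drawn from any of the linked articles):

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
# Hypothetical disease-test example; all numbers are made up for illustration.
def bayes(prior, likelihood, false_positive_rate):
    """Posterior probability of the hypothesis given a positive result."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# A 1% base rate, a 99%-sensitive test, and a 5% false-positive rate
# still yield only about a 1-in-6 chance of disease given a positive test.
posterior = bayes(prior=0.01, likelihood=0.99, false_positive_rate=0.05)
print(round(posterior, 3))  # → 0.167
```

This is exactly the kind of result our intuition gets wrong: a “99% accurate” test with a rare condition still produces mostly false positives.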
Why literature? Because it’s good for you and tastes better than Brussels sprouts.
~Use THIS, which is glorious, to teach narrative, interpretation, personification, metaphor, whatever. Or just watch it. It’ll be a highlight of your day. Promise;
~Consider irony. Don is doing it. And in response to the article that prompted his reflections, many others did too, though they came to different conclusions about the merit of the original piece (as here and here);
~Check out this article on the top Literary Heroines of 2012 (with links to other such lists) or this list of great books from 2012 (Hologram for a King was entrancing; I read it in two sittings only because I wanted to slow the experience down a bit to enjoy it longer. Really, really great.) or this longer one (with poetry!);
~Think about translation and how it affects what you read (you read stuff in translation, right? RIGHT?)
~Read up on the various perspectives and associated controversies surrounding the latest Nobel Prize Laureate, Mo Yan (whose book Life and Death Are Wearing Me Out was, for the record, one I enjoyed greatly), and the difficult intersections of politics, language, and art;
~Have you read Lawrence Durrell’s Alexandria Quartet? I loved it when I was in college. I’ve been a little fearful to go back to it, lest it disappoint, and it hasn’t come up since, except in my own mind when Alexandria is mentioned. It was in those books that I first found C.P. Cavafy, whose work I love. And someday, I hope to visit. In the meantime, I was happy to find this;
~Learn about Wayne Booth’s helpful distinctions of narrator, implied author, and actual author (or at least the implied author part) as applied to political punditry. Or, learn about Wayne Booth. He was awesome;
~Read about epigraphs and their history, one way that books talk to each other, as Umberto Eco might put it.
~Did you know “Toni Morrison” is a pen name? Or that she did this with Rokia Traore (who put on one of the most amazing live music experiences I’ve ever had–you should check her out if and when she comes back to town)? I didn’t, until I read this;
~Imagine life as an editor. Nothing but commas everywhere. And errors;
~Read this absorbing essay about Literature and Digital Humanities. A bit of it:
At the advent of print, the humanities emerged, under the aegis of Erasmus and others, to negotiate the spread of the classical tradition out of the monasteries into private hands. Today, with the advent of the Internet, Google’s self-described project is to make the world’s information “universally accessible and useful.” Academia could have done what humanists have done throughout history and tried to add to Google’s mandate: make the texts legible and available. They could have tried to bring out the contemporary relevance that only historical context, knowledge of literary tradition, and scholarly standards can provide. But this ancient task was anathema, for the simple reason that it would have involved honest work. Much easier to remain in the safe irrelevance of mass publication in the old mode, what Kingsley Amis called “the pseudo-light it threw on non-problems.” For at least 50 years, humanities departments have been in the business of creating problems rather than solving them.
All in all, it’s fair to say that the conversion of literature into data could not have gone much worse, which does not bode well for the second, oncoming phase, where we decide what to do with the literary data we now have…
But the really great part of the essay (I think) comes in the second half when the author discusses literature as “resistance to data.” Which is another reason to love Lit.
~Read a difficult book; or, better (?), read about other people’s picks for the ten most difficult books;
~Check out these two interviews with Junot Diaz (here and here)–both great;
~Or find some other author talking about her or his book;
~Read this letter from Steinbeck to his editor about books and reading and audiences and life;
~Consider what should (and should not) be “required reading” or think about re-reading;
~Or read about a snob’s opinion of Stephen King’s work;
~Have you read any lit crit lately? Are you wondering what Terry Eagleton is up to? Or wondered why contemporary lit is “gutless” (as compared to the work of Rabindranath Tagore–do you know Tagore? You should. Interesting dude.)? Anyway, not to fear–postmodernism is dead. Unless it isn’t;
~Finally, to bring it around, you might (re)-consider the effects of literature and its limits:
When we’re practiced in sympathy it is easier for us to notice “what is not seen.” When we have tried, over and over again, to put ourselves into others’ places and to see the world from where they are standing, we’re better people, living in a more civil world. Because we’ve read Alice Adams, we might not go over the top trying to impress people the next time we’re under great social pressure and we might not be so harsh on those who do. Because our children have read, and have had read to them, stories that help them think about the perils of greed, or the importance of kindness, or the dangers of drinking from bottles marked “Drink me,” they will grow up to be more considerate and more careful of themselves and others.
It’s tempting to close with promises about how if we all just read a few more books—better books—support our local arts scene, visit museums, attend concerts, read to our children and make them take piano lessons, our problems will be solved. Surely, a society that’s grounded in civility and sympathy and learned in the humanities would not be plagued with financial irresponsibility and ethical misconduct. Surely it wouldn’t be run by politicians and reported on by journalists who use language that would have shocked Lady Chatterley. Unfortunately people who offer easy answers to complicated questions are usually trying to sell you something.
The humanities can teach us civility and sympathy, but they can’t make us perfect and they can’t fix our problems for us. They can help us be more aware of the “unseen,” but they cannot help us predict unintended consequences. There isn’t a philosophical theory or a novel or a painting or a piece of music in the world that can solve the Middle East or clean up an oil spill or make the economy recover. The best the humanities can do is to remind us that, as Auden put it, “We must love one another or die,” and then show us how to do it.
Way back last year, I had a notion of a series of posts on things to do over break that was meant to help clear out my “Instapaper” account (which currently stands at 20+ pages of material) and backlog of stuff I wanted to post and address some other items that I hadn’t managed to get to over the course of last fall. Unfortunately I didn’t get any farther than the first one.
Still, as with most things, I prefer the long view, and so, though belated, you can expect a few more of these.
One of the things I wanted to get to last semester (admittedly, in order to do some moaning and complaining about it) was security. There were a couple of incidents in the fall that had me thinking (again) about the topic (as here along with the numerous incidents around and near Truman and Kennedy King, as well as elsewhere). Remember the Faculty Council survey from 2011 and the findings (more information on the results and the context here)? If not, this is what we found:
1. 90% of respondents feel HWC is a safe environment, but roughly three-quarters of those who hold this view think it can be improved;
2. Roughly 75% of respondents did not experience any immediate or potential threat to their safety this semester; that number dropped to 65% when they were asked to consider other situations that made them feel uncomfortable about their safety;
3. A clear majority of the faculty who responded do not know how to access emergency plans and crime information.
Well, that one, at least, we can do something about. If you missed it, over the break, Armen sent out some links for everyone to review; you might also consider posting a link to one or more of these in your Blackboard site.
Finally, one last thing you should know about (and please help spread the word)–Harold Washington College has something called the Supportive Intervention Team (SIT). Its origins lie in the Clery Act’s mandate that every college have a threat assessment group with training and processes for identifying and dealing with threats that strike the necessary balance between intra-institutional transparency and student privacy. In the worst of the college-campus tragedies of recent years–Virginia Tech, Northern Illinois, etc.–subsequent investigations found that earlier interventions might have made a difference. Over the past year and a half or so, George Bickford and Michael Russell have led the development of a set of processes and protocols that have led to regular communication among security, administration (especially student services), faculty, human resources, and our Wellness staff. The process has been both educational and painstaking (I was a Faculty Council co-rep, along with Rosie and Matt Usner, last spring and summer) and extremely thoughtful. You should DEFINITELY take a gander at the SIT page and maybe make a note of the link for reporting a “Person of Concern.” If you scroll down on the SIT page, you’ll find guidelines for reporting, as well as an explanation of the process once the report is made. There is also a link to a page with helpful reminders about engaging with distressed people (students, faculty, staff, strangers–whoever).
Originally I was going to bellyache about the absence of a sign in every room with the phone number for security large enough to be read from the back of the room (maybe we should make our own in the meantime?) and the fact that the last lockdown drill and associated key distribution (that I know of) was conducted almost two years ago and was only partial even then. I would guess that these items will get more attention in light of Newtown. At least I hope they will. In the meantime, for yourself and your colleagues and your students, make sure you’re not the person who doesn’t know what to do if you need to know what to do.
Learn some science! (First, crack open a beer; I’m assuming positive effects irrespective of sex):
“We live in a society absolutely dependent on science and technology,” Carl Sagan famously quipped in 1994, “and yet have cleverly arranged things so that almost no one understands science and technology. That’s a clear prescription for disaster.” Little seems to have changed in the nearly two decades since, and although the government is now actively encouraging “citizen science,” for many “citizens” the understanding of — let alone any agreement about — what science is and does remains meager.

So, what exactly is science, what does it aspire to do, and why should we the people care? It seems like a simple question, but it’s an infinitely complex one, the answer to which is ever elusive and contentious. Gathered here are several eloquent definitions that focus on science as process rather than product, whose conduit is curiosity rather than certainty.
Salon spoke with Kelly about hiding the science behind disgust, why we’re captivated by things we find revolting, and how it can be a very dangerous thing.
One of the classic conundrums in paleoanthropology is why Neandertals went extinct while modern humans survived in the same habitat at the same time. (The phrase “modern humans,” in this context, refers to humans who were anatomically—if not behaviorally—indistinguishable from ourselves.) The two species overlapped in Europe and the Middle East between 45,000 and 35,000 years ago; at the end of that period, Neandertals were in steep decline and modern humans were thriving. What happened?…
There is no shortage of hypotheses. Some favor climate change, others a modern-human advantage derived from the use of more advanced hunting weapons or greater social cohesion. Now, several important and disparate studies are coming together to suggest another answer, or at least another good hypothesis: The dominance of modern humans could have been in part a consequence of domesticating dogs—possibly combined with a small, but key, change in human anatomy that made people better able to communicate with dogs.
It is natural for those not deeply involved in the half-century quest for the Higgs to ask why they should care about this seemingly esoteric discovery. There are three reasons.
First, it caps one of the most remarkable intellectual adventures in human history — one that anyone interested in the progress of knowledge should at least be aware of.
Second, it makes even more remarkable the precarious accident that allowed our existence to form from nothing — further proof that the universe of our senses is just the tip of a vast, largely hidden cosmic iceberg.
And finally, the effort to uncover this tiny particle represents the very best of what the process of science can offer to modern civilization.
Over the next few years, Doeleman says, he and his group will combine as many as a dozen of the world’s most sophisticated radio-astronomy installations to create “the biggest telescope in the history of humanity”—a virtual dish the size of Earth, with 2,000 times the resolution of the Hubble Space Telescope. Tonight the Event Horizon Telescope astronomers have a more limited goal: They want to catch as much light from Sagittarius A* as possible and study its polarization to learn about the black hole’s magnetic field. But eventually (if all goes well) astronomers using the fully scaled-up Event Horizon Telescope—a machine with resolution high enough to read the date on a quarter from 3,000 miles away—will see the silhouette of an object that is, in itself, unseeable.
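That “date on a quarter from 3,000 miles away” claim is checkable with back-of-the-envelope arithmetic. Here is a rough sketch, assuming the date’s characters are about 1 mm tall (my estimate, not the article’s):

```python
import math

# Convert radians to arcseconds: 1 rad = (180/pi) degrees, 3600 arcsec/degree.
RAD_TO_ARCSEC = 180 / math.pi * 3600

feature_m = 1e-3                 # ~1 mm character height on the quarter (assumption)
distance_m = 3000 * 1609.34      # 3,000 miles in meters

# Small-angle approximation: angular size = size / distance (in radians).
angle_arcsec = feature_m / distance_m * RAD_TO_ARCSEC
angle_microarcsec = angle_arcsec * 1e6
print(round(angle_microarcsec, 1))   # roughly 42.7 microarcseconds

# Hubble's visible-light resolution is roughly 0.05 arcsec; "2,000 times" that:
eht_microarcsec = 0.05 / 2000 * 1e6
print(round(eht_microarcsec, 1))     # 25.0 microarcseconds
```

Both figures land in the tens of microarcseconds, so the two claims in the passage are at least mutually consistent, given my guess at the character size.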
Imagine trying to learn biology without ever using the word “organism.” Or studying to become a botanist when the only way of referring to photosynthesis is to spell the word out, letter by painstaking letter.
For deaf students, this game of scientific Password has long been the daily classroom and laboratory experience. Words like “organism” and “photosynthesis” — to say nothing of more obscure and harder-to-spell terms — have no single widely accepted equivalent in sign language. This means that deaf students and their teachers and interpreters must improvise, making it that much harder for the students to excel in science and pursue careers in it.
The idea of building artificial life forms, whether in software or in synthetic cytoplasm, has always been controversial. Mary Shelley, almost 200 years ago, wrote a deep meditation on this theme: Frankenstein, or the Modern Prometheus. In Shelley’s time the debate was framed in terms of vitalism versus mechanism. The vitalists argued that living things are distinguished from inorganic matter by some “spark of life” or animating principle. The opposing mechanist view had its greatest early champion in René Descartes, who compared animals to clockwork automata.
Within the world of science, the doctrine of vitalism is long dead, and yet there is still resistance to the idea that life is something we can fully comprehend by disassembling an organism and cataloging its component parts. In the brash early years of molecular biology, DNA was “the blueprint of life,” a full set of instructions for building a cell…Now that we read DNA sequences quite fluently, it seems clearer that there’s more to life than the “central dogma” of molecular biology.
The idea of simulating a living cell with a computer program stands in the crossfire of this argument between reductionism and a more integrative vision of biology. On one hand, the WholeCell project makes abundantly clear that the DNA sequence by itself is not the master key to life. Even though the transfer of information from DNA to RNA to protein is a central element of the model, it is not handled as a simple mapping between alphabets. The emphasis is on molecules, not symbols.
On the other hand, the very attempt to build such a model is a declaration that life is comprehensible, that there’s nothing supernatural about it, that it can be reduced to an algorithm—a finite computational process. Everything that happens in the simulated cell arises from rules that we can enumerate and understand, for the simple reason that we wrote those rules.
I would love to believe that the success of simulation methods in biology might forge a new synthesis and put an end to philosophical bickering over these questions. I’m not holding my breath.
What made antibiotics so wildly successful was the way they attacked bacteria while sparing us. Penicillin, for example, stops many types of bacteria from building their cell walls. Our own cells are built in a fundamentally different way, and so the drug has no effect. While antibiotics can discriminate between us and them, however, they can’t discriminate between them and them–between the bacteria that are making us sick and the ones we carry when we’re healthy. When we take a pill of vancomycin, it’s like swallowing a grenade. It may kill our enemy, but it kills a lot of bystanders, too.
Using simple behavioral tests, Wright’s research team showed that like other lab-tested brooders — which so far include us, monkeys, dogs, and starlings — stressed bees tend to see the glass as half empty. While this doesn’t (and can’t) prove that bees experience human-like emotions, it does give pause. We should take seriously the possibility that it feels like something to be an insect.
The concept that current humanity could possibly be living in a computer simulation was first seriously proposed in a 2003 paper published in Philosophical Quarterly by Nick Bostrom, a philosophy professor at the University of Oxford. In the paper, he argued that at least one of three possibilities is true:
- The human species is likely to go extinct before reaching a “posthuman” stage.
- Any posthuman civilization is very unlikely to run a significant number of simulations of its evolutionary history.
- We are almost certainly living in a computer simulation.
Savage said, however, signatures of resource constraints in present-day simulations are likely to exist as well in simulations in the distant future. These constraints include the imprint of an underlying lattice if one is used to model the space-time continuum.
Is scientism defensible? Is it really true that natural science provides a satisfying and reasonably complete account of everything we see, experience, and seek to understand — of every phenomenon in the universe? And is it true that science is more capable, even singularly capable, of answering the questions that once were addressed by philosophy? This subject is too large to tackle all at once. But by looking briefly at the modern understandings of science and philosophy on which scientism rests, and examining a few case studies of the attempt to supplant philosophy entirely with science, we might get a sense of how the reach of scientism exceeds its grasp.
In the last couple of months we have seen more and more surveys popping up in our inboxes. There was the survey about the Inspector General’s Office, about Morale (where are those results?), Lecture capture cameras (ditto), and now Registration. I know many people do not fill out these surveys, which I think is a mistake. Consider this: in some small way, filling out these surveys is like voting. If you don’t vote, you don’t get to complain about decisions made later. How much happier we all would have been if the powers that be had sent a survey about branding or graduation or, most recently, no more spring hires. (Just a side comment: if we are no longer allowed spring hires, does that apply to district too? If there is a job opening, do they have to wait until fall semester to hire, or are they allowed to hire based on need and availability?)
Not to say that a survey would have changed the decisions that the money spenders made, but at least our voice would have been heard. A large complaint about this administration is the total top-down communication. I would like to think these surveys are at least an attempt to give the people who actually work with students a voice. So the next time you see a survey pop into your inbox, don’t ignore it: fill it out. Don’t pass up an opportunity to actually communicate back to the powers that be; we have so few opportunities to do so.
I originally planned on doing this way back in May, but by the time I got around to it, the place had pretty much cleared out (much like the graduation!). Still, since I promised it, and since one of your faculty council members has recently raised the topic as one in need of discussion, I thought I should post the chance for people (now or in the future) to make some suggestions about next year’s graduation while this year’s is still a good ways off.
Just in case you missed it, last spring’s was better in some ways than the year before (no stifling hot weather, no three hour wait in the parking lot) and I really, really liked the fact that faculty and students were staged in the same area together, so I had the chance to see and interact with our graduating students before the ceremony started, at least.
Even still, it’s hard to say that the ceremony was a “success”; one commenter mentioned the de facto faculty boycott of the district-wide event–I counted 24 HW faculty members processing in with the students. Others volunteered. Even including them, though, I’d guess that most expected it to be long and frustrating, and so avoided it. The students who went were, for the most part, practical in their approach. Here is the picture I took from my seat at the moment the Chancellor said, “I now officially confer your degree,” etc., and asked everyone to move their tassels from the left to the right–you know, when everyone throws their hats in the air in the movies.
You might recall that I was sitting with HW students and we were at the very back of the floor. So, all of that space in front of me is the space where the graduates from the six other colleges had been. If you look up to the stands on the far left, you can see that they are empty, too. The worst part is that the few people in the picture were all faculty marshals. If there were 50 students in the building at that point, I’d be surprised.
And if we keep increasing the number of degrees we give out, it’ll only get worse.
Finally, I kind of miss the fact that the people who earned Basic and Advanced Certificates (and GEDs) used to be included, but aren’t anymore.
And I really missed hearing the rocking HW student ensemble on the recessional.
So, in that spirit, I’d like to suggest that graduations go back to local campuses. The different colleges at universities hold separate graduations all the time. Why not us?
Failing that, I have no solution to the degree conferral issue other than universal conferral, with simultaneous later pick up/individual recognition. If we did that, the whole thing would be over in less than an hour and it would be much more celebratory.
Also, as our Valedictorian suggested, maybe some music in the parking lot to keep spirits up?
And maybe an email to the faculty announcing where to be and when, if invitations and instructions are not going to be distributed with the gowns (as they once were).
Any and all others can go in the comments. I’m sure someone will take notice…though, the board report on the event suggests that everyone was pretty happy with how things went. I suppose it’s all relative. It was certainly easy to get to a bathroom at the end of the ceremony. Beyond that, I’m not too sure I would classify it as a ‘success.’ But maybe that’s me.
For Them (Seven is a winner on the come out roll):
For You (this list goes to 11):
What else have you got?
Speaking of the union, there are two really good pieces in this month’s American Educator that you should check out:
Putting Students on the Path to Learning makes the case for Fully Guided Instruction (as opposed to Partially Guided Instruction, a.k.a. Discovery or Inquiry learning), particularly for novel information, including this interesting gem:
Researchers found that algebra students learned more by studying worked examples than by solving equivalent problems…For novices, studying worked examples seems invariably superior to discovering or constructing a solution to a problem…studying a worked example reduces the burden on working memory (because the solution only has to be comprehended, not discovered) and directs attention…toward storing the essential relations between problem-solving moves in long-term memory. Students learn to recognize which moves are required for particular problems, which is the basis for developing knowledge and skill as a problem solver.
The other is “Principles of Instruction: Research-Based Strategies that All Teachers Should Know,” and it provides an overview of 10 strategies and 17 principles of effective instruction that you probably already know but might, like me, benefit from seeing again.
Here is the second of five “How to Study” videos for your students (and you). It’s called “What Students Should Know About How People Learn.”
(The first one is here.)
You’ll know it from the glassy-eyed stares in class, the conversational non sequiturs, the muttering and complaining…and that’s just the faculty!
So, we’re here to help. All week long I’ll be posting some videos to show your students (or make available to them) that I keep running across, with other educators saying how helpful their students have found them.
Today’s is called “Beliefs that Make You Fail…or Succeed.” Check it.