Two articles crossed my path this morning: (1) a NYTimes editorial on the impact of technology and automation on middle-tier jobs (arguing that middle-tier jobs will need to combine data from automated tools with human skills of adaptation and communication); and (2) a Slate piece from a software engineer striking back against the recent "everyone should learn to code" frenzy (roughly: quality coding is hard, and people don't need to understand the details of computers to work with them).
These articles highlight two key problems. First, each of "everybody", "learn", and "code" means wildly different things in different contexts, and we tend to conflate those meanings when talking about this question in general terms. Second, the recent surge of activity around programming education for all has fixated on roughly a single answer to this question (lessons in scripting via something like Python), and that answer isn't suitable to address the challenges raised in the NYTimes piece.
The current incarnation of online coding classes can serve a particular interpretation of "everyone" and "code" well: if your job will require you to script some data or an application, knowing basic scripting is a good idea (I'm avoiding the question of instructional design and choices in many of the current tools, many of which give me great cause for concern -- that's another post for another day). Learning basic scripting will not, however, particularly help those looking to attain jobs that blend working with automated tools and applying human skills. Yes, those jobs require someone to be able to think about aspects of computation (what is possible, where potential consequences lie, etc.), but writing scripts isn't necessarily the right way to get there. Or, to be more precise, writing scripts for the kinds of general, application-centric activities that seem common in online programming classes isn't necessarily the approach that gets people there.
There's a huge challenge and opportunity here for computing educators to articulate contextualized learning goals for "everyone" who will need to blend their work with computational processes, and then to design instruction that meets those goals. In being too narrow about how we think of "everybody", "learn", and "code", we're just setting ourselves up to argue without making meaningful headway towards solving a significant problem.
Monday, August 26, 2013
Sunday, August 25, 2013
When does sabbatical start (and what is it, really)?
I'm bemused by how my thinking about sabbatical has evolved over the last several months (I am on sabbatical for the entire upcoming academic year):
- Having front-loaded my entire teaching load to the fall semester last academic year, I gave my last pre-sabbatical lecture in December. I was conscious of it being my last lecture for over 18 months. Under the "sabbatical-as-concentrated-research-time" view, part of me believed sabbatical started as soon as my final grades went in. That wasn't a great interpretation, as my spring semester was so deadline-driven (proposals, papers, committee/university service, etc.) that I got little new research done compared to my teaching-intensive semesters, and I was frustrated.
- The day of my last committee meeting on campus, I went home giddy, convinced that sabbatical had finally started. This took the "sabbatical as freedom from meetings" view, as well as the "sabbatical-as-freedom-from-commuting-to-campus" view (I live an hour's drive from campus). These views discounted the wave of "I'm really tired and need some rest before I can usefully think again" that characterized the start of summer.
- July 1st was my official first day of sabbatical, after which I could reasonably tell anyone who asked me to do any university work that I was unavailable (not that anyone did, but I still sensed power in the date). This was the "sabbatical-as-owning-my-own-time" view.
- Solid progress on new research projects in new areas in the second half of this summer has been personally rewarding. This is the "sabbatical-as-time-to-do-new-big-stuff" view.
- This weekend, I am conscious that the incoming freshmen are moving in today, classes start on Thursday, and I'm not responsible for a dang thing. I'm gloating internally at all the university and department emails that I'm not bothering to open. I've reconnected with the idea that August can actually be a relaxing and enjoyable part of summer; I could get used to that. This is a "sabbatical-as-freedom-from-death-by-a-thousand-time-cuts" view.
Reflecting from my calm August deck chair, I see how much of my early view of sabbatical was framed around the idea of "freedom". That's somewhat sad, as it encouraged me to focus on the aggravating parts of faculty life (which I do actually enjoy on the whole). It also had me thinking about the end of sabbatical--the time when I would lose that freedom--from the time it started. Focusing on the looming end of freedom made me an outright basket case for the first several weeks, when I felt a responsibility to make the most of every single minute of freedom I had, like grabbing a precious breath when coming above water.
Those early weeks of sabbatical actually weren't much fun emotionally.
With a summer of rest and reading behind me, my perspective is healthier. Sabbatical now feels like the "responsibility-to-push-myself-in-new-intellectual-directions". Yeah, that's what the official memos on how to apply for sabbatical said, but I didn't feel it in my bones before now. Now, in the week when classes are about to resume, I finally feel rested enough and mentally rejuvenated enough to start the real work of sabbatical. I'm just thankful to have gotten to this point with a full year still to go, and with enough productive work done over the summer that I know what to do in the hours when I won't be lecturing, orienting, or otherwise sitting in meetings this week. There's no more academic baggage to lose, so the true freedom of sabbatical can begin (apologies to Janis Joplin).