Reading a NYTimes piece this morning that summarized work on how people tend to dismiss those with less social power in a given situation. The article is unfortunately titled around money, even though the point of the article is much broader, looking at the implications for public policy if high-power people dismiss those below them.
I'm thinking about the impact of this in education, particularly in my own field of computing.
Much has been written about the nasty or dismissive culture of computing and its role in hampering diversity. Much has been written about how early encouragement was key for many women in the field (citing women simply because I'm familiar with writings on women in CS); this was a key component of last week's NYTimes piece on Women in Science. The surface-level connection between these themes seems clear, but I've never understood the underlying mechanism that makes computing culture somewhat abrasive. The social power piece leaves me wondering how we define social power in computing, and how that evolves over the span of a degree program or career.
My sense is that early on, we equate social power with programming ability in some abstract sense: those who can crank out code quickly, know arcane commands, and never need to reference a manual are granted power and status. The "social" power here comes from these folks being able to act as human manuals (if you can muster the nerve to approach them). As computing assignments and projects grow richer so that programming is more a mechanism and less the whole picture, new forms of social power emerge. By then, of course, many of those who struggled with the first definition of social power have dropped out.
CS educators are increasingly thinking about how to broaden the perspective of computing in intro courses: get beyond programming and into applications, requirements, ethics, etc. This is a great first step. But if social power theory applies here, then these efforts will only have impact if courses can create a similarly broad social power structure. Exercises that talk about broader issues but then have students primarily express their ideas in code won't challenge the core problem. Expanding what we talk about in classes targets an individual's perspective on computing; changing what we visibly value and celebrate in our courses targets the social communities that yield broader groups of computing professionals.
How often do we talk about instructional design for shaping communities rather than individuals? If anyone has pointers into literature on this for computing, I'd like to hear about it.
Monday, October 7, 2013
Saturday, September 21, 2013
Mediating Differences in Computing Skills in the Same Classroom
I just came across Ross Penman's post on the problems of computing education. Ross is a 14-year-old web programmer from Scotland. His description of his school's computing curriculum is depressing, particularly the descriptions of the various ways in which it inadvertently turns students off from computing. These include the impact of old software and technologies that yield inferior-looking products (in the web-design or video-production spaces) and programming tools with low ceilings (in this case, Scratch), which aren't able to hold the interest of students who grok programming and want to do more with it, faster.
The latter point particularly got to me. In our drive to provide computing education for most students (a good thing), we can't afford to alienate the students who will actually be good at it. Many comments on his post are from professional programmers advising him on how to learn on his own. That's reasonable advice for a student with Ross' abilities to teach himself, but it completely misses the point: what do you do for the students who have similar potential to Ross, but not the confidence, drive, or established interest to do this on their own? Our curricula need to engage and foster them!
A key part of the problem here is how to gracefully handle the wide variation in both teacher and student abilities in computing: the variation in student abilities is wide, and the knowledge gap between some students and their computing teachers is wide as well. How do we enable teachers to work with such a wide-ranging class, including students who are more comfortable (or skilled) than the teachers?
Teachers need curricula that they can use on an entire class, but that (a) accommodate a wide range of student abilities, and (b) are sufficiently within the teacher's comfort zone to be usable. The community seldom talks about (b), but we can't ignore its impact on adoption and effectiveness of computing education.
Our current standard early-programming solutions, such as Scratch and many of the other drag-and-drop tools, are too far skewed to the lower-middle end of (a) while achieving (b). The problem isn't drag-and-drop per se, but the limitations of the languages and tools that get created around drag-and-drop.
What if we had a curriculum (with language, software, materials, etc) that provided a simple, beginner-friendly approach to writing basic programs, but that scaled---within the same framework and tools---to more sophisticated programs that exercised more advanced computer science concepts? Teachers could focus on learning to teach the core framework; most students could learn computing at the level of the basic programs, but students who are doing well could go forward without having to go off entirely on their own (and staying within the realm of what the teacher recognizes, for various benefits).
Program by Design (PbD), which we have worked on for 20 years, was designed for the sort of smooth ramp-up that accommodates learners at all levels. The world-programming infrastructure within PbD developed the basic framework of programming animations, websites, and other reactive systems using a small set of constructs. Bootstrap adapts animations programming in world to the needs of middle-schoolers and teachers who are new to computing. In this ecosystem, a teacher can cover the core world-programming framework with the entire class. Interested students can get into more advanced computing concepts by creating animations with richer features (multiple similar characters to get into lists, etc). But these advanced students are still in the tools and approach that the teacher knows, which makes for a smoother curricular experience all around.
Some of our papers describe the underlying model, tools, and curricula. There are some rough spots in the way we've packaged these curricula in the past, which we are actively working on. But the fundamental idea and inspiration remains: our challenge as computing educators is to create tools that (a) accommodate a wide range of student abilities, and (b) are sufficiently within the teacher's comfort zone to be usable. We hope others will join us in trying to meet this challenge.
Zen Sabbaticals
I've just finished reading Natalie Goldberg's 1993 memoir "Long Quiet Highway". It focuses on her intertwined paths into writing and Zen, exploring the idea of writing as a form of practice (in contrast to traditional seated meditation). Natalie's writing turns me inward, and perhaps not surprisingly, it brought my attention to sabbatical.
Fundamentally, sabbatical is about focus. More importantly, the practice of focus. Throwing oneself into just a single project isn't necessary (though the time to do that is nice). Sabbatical is a logistical clearing of the clutter: no committee work, meetings, appointments, or grading. It's a great justification for postponing reviewing work and other forms of service that slice and dice the days and weeks. Your time becomes your own, and you're expected to do something significant with it.
And it can all go to waste if you treat it merely as shedding others' demands on you, without confronting your own time-wasting demands on yourself. For the moment, I'm thinking mostly of distraction.
A true, internalized embrace of sabbatical would see me deeply exploiting my right to push the world away. I'd read the university mailing list much less frequently, hugely slow down my response to email, and really reflect on (rather than simply check-off) things I was reading. Then, the work would start. I'd pay attention, let distracting thoughts bubble up and float away, and look for the deeper insights that lead to great research and learning. In other words, I'd approach academics as a form of Zen practice.
"Long Quiet Highway" brought this point back home: the real practice is in how you work and live. If you can't practice focused attention when given the institutional sanction and support, how can you expect to do so once sabbatical is over and the flurry resumes? Sabbatical is a sesshin (long sitting period) in disguise. Make the desk a cushion.
Time to close the email. Gassho, Natalie.
Monday, August 26, 2013
Time to unpack "everybody learn to code"
Two articles crossed my path this morning: (1) a NYTimes editorial on the impact of technology and automation on middle-tier jobs (arguing that middle-tier jobs will need to integrate using automated data with human skills of adaptation and communication); and (2) a Slate piece from a software engineer striking back on the recent "everyone should learn to code" frenzy (roughly: quality coding is hard and people don't need to understand the details of computers to work with them).
These articles help highlight two key problems. First, each of "everybody", "learn", and "code" has wildly different meanings for different contexts and we tend to conflate them when talking about this question in general terms. Second, the recent surge of activity around programming education for all has fixated on roughly a single answer to this question (lessons in scripting via something like Python), and that answer isn't suitable to address the challenges raised in the NYTimes piece.
The current incarnation of online coding classes can serve a particular interpretation of "everyone" and "code" well: if your job will require you to script some data or an application, knowing basic scripting is a good idea (I'm avoiding the question of instructional design/choices in many of the current tools, many of which give me great cause for concern -- that's another post for another day). Learning basic scripting will not, however, particularly help those looking to attain jobs that blend working with automated tools and applying human skills. Yes, those jobs require someone to be able to think about aspects of computation (what is possible, where potential consequences lie, etc), but writing scripts isn't necessarily the right way to get there. Or, to be more precise, writing scripts for the kinds of general, application-centric activities that seem to be common in online programming classes isn't necessarily the approach to get there.
There's a huge challenge and opportunity here for computing educators to articulate contextualized learning goals for "everyone" who will need to blend their work with computational processes, and then to design instruction that meets those goals. In being too narrow about how we think of "everybody", "learn", and "code", we're just setting ourselves up to argue without making meaningful headway towards solving a significant problem.
Sunday, August 25, 2013
When does sabbatical start (and what is it, really)?
I'm bemused by how my thinking about sabbatical has evolved over the last several months. (I am on sabbatical for the entire upcoming academic year):
- Having front-loaded my entire teaching load to the fall semester last academic year, I gave my last pre-sabbatical lecture in December. I was conscious of it being my last lecture for over 18 months. Under the "sabbatical-as-concentrated-research-time" view, part of me believed sabbatical started as soon as my final grades went in. That wasn't a great interpretation, as my spring semester was so deadline-driven (proposals, papers, committee/university service, etc.) that I got little new research done compared to my teaching-intensive semesters, and I was frustrated.
- The day of my last committee meeting on campus, I went home giddy, convinced that sabbatical had finally started. This took the "sabbatical as freedom from meetings" view, as well as the "sabbatical-as-freedom-from-commuting-to-campus" view (I live an hour's drive from campus). These views discounted the wave of "I'm really tired and need some rest before I can usefully think again" that characterized the start of summer.
- July 1st was my official first day of sabbatical, after which I could reasonably tell anyone who asked me to do any university work that I was unavailable (not that anyone did, but I still sensed power in the date). This was the "sabbatical-as-owning-my-own-time" view.
- Solid progress on new research projects in new areas in the second half of this summer has been personally rewarding. This is the "sabbatical-as-time-to-do-new-big-stuff" view.
- This weekend, I am conscious that the incoming freshmen are moving in today, classes start on Thursday, and I'm not responsible for a dang thing. I'm gloating internally at all the university and department emails that I'm not bothering to open. I've reconnected with the idea that August can actually be a relaxing and enjoyable part of summer; I could get used to that. This is a "sabbatical-as-freedom-from-death-by-a-thousand-time-cuts" view.
Reflecting from my calm August deck chair, I see how much of my early view of sabbatical has been framed around the idea of "freedom". That's somewhat sad, as it encouraged me to focus on the aggravating parts of faculty life (which I do actually enjoy on the whole). It also had me thinking about the end of sabbatical--the time when I would lose that freedom--from the time it started. Focusing on the looming end of freedom made me an outright basket case for the first several weeks, when I felt a responsibility to make the most of every single minute of freedom I had, like grabbing a precious breath when coming up for air.
Those early weeks of sabbatical actually weren't much fun emotionally.
With a summer of rest and reading behind me, my perspective is healthier. Sabbatical now feels like the "responsibility-to-push-myself-in-new-intellectual-directions". Yeah, that's what the official memos on how to apply for sabbatical said, but I didn't feel it in my bones before now. Now, in the week when classes are about to resume, I finally feel rested and mentally rejuvenated enough to start the real work of sabbatical. I'm just thankful to have gotten to this point with a full year still to go, and with enough productive work done over the summer that I know what to do in the times that I won't be lecturing, orienting, or otherwise sitting in meetings this week. There's no more academic baggage to lose, so the true freedom of sabbatical can begin (apologies to Janis Joplin).
Thursday, June 13, 2013
Good enough for MOOCs?
As a computer science professor who is interested in learning, it's natural that I've been thinking about and following the whole MOOC-mania. I'm partly interested from a research/tech perspective (just how much is online learning capable of, given time to develop?), and partly from a job-survival one (what's the likelihood of my line of work disappearing before I retire?).
A friend's blog post led me to a Forbes article from this week on why online ed is a bubble: summary---students go to college as much for social/experience reasons as for education, and online can't reproduce the networking and fun aspects of college. Certainly not a new argument. Certainly an argument that makes sense for some segments of colleges and of the population. But I'm not ready to dismiss the MOOC paranoia just yet.
MOOCs aren't going to wipe many colleges off the map in 5 years like an asteroid (though I'm thinking this would make a good 48-hour film project premise). The college experience is going to wither by a thousand cuts (budget cuts, staffing, time, etc.). A pending case of academic frog-boiling. College doesn't have to be great--it has to be good enough. On the educational side, MOOCs just might prove good enough for many students. What's good enough on the social side? Could an innovative entity create a social space with a bit of exclusivity (part of the college draw), clout, alumni, and many other things that together create a "good enough" social experience? Membership driven, like a fitness club?
To claim that colleges will retain their stranglehold on the young-adult social experience bets against innovation. There will always be a place for elite schools, but they don't serve the majority of students. It'll take time, but there's a lot colleges could lose and still be "good enough". Colleges need to innovate to figure out how to lower costs and still be "good enough". Other organizations will figure out how to grow to be "good enough" socially, by which time online education will have matured a lot.
Should we bet against good enough?
Sunday, May 5, 2013
Resurrecting a blog, a professor, and a course
I'm now on sabbatical, next due in the office in August 2014. Been thinking this would be a good excuse/motivation to resurrect the blog (quiet for the last 5 years), but hadn't yet found something I felt like writing about.
So much for leaving the classroom: my inaugural sabbatical post is about class sizes.
The NYTimes has an op-ed on whether class size counts. The article is about pre-college, not college classes. Roughly, the piece discusses a proposal under which high-performing teachers get additional pay in exchange for taking on larger classes. The piece doesn't discuss how much larger, but does cite a national survey that asked teachers about taking on 3 additional students in exchange for $10K. Interestingly, only 42% of teachers wanted this deal, while 47% would turn down the raise in exchange for 3 fewer students. The piece also discusses the lack of actual research on the effect of class sizes, noting that the effect may be quite different for different kinds of students.
This struck me in part because my department is potentially facing an increase in students interested in taking Computer Science next year. For the past three years, I've taught the second course in our CS sequence (OO program design and data structures), last year topping out at 235 students across two lecture sessions and 9 lab sections. And honestly, trying to find ways to give the support associated with smaller classes to that many students burned me out enough that I'm only now really figuring out what I want to do with my sabbatical.
But it has left me thinking a lot about what we associate with "small classes", what parts of that actually matter, and how we can provide it as class sizes outstrip resources.
Access to help is perhaps the biggest issue. In practice, though, help needs are not directly proportional to class sizes. How many times have I taught smaller classes (40 or so students) and not had a single student come to office hours or request appointments? Out of my 235 students, I'd estimate that roughly 25-30 actually came to my office or asked for help with any regularity (I know many more used the teaching assistants). The point is that a class of 200+ students who don't need help is a very different beast than a class of students who do need help. The op-ed raises this distinction, but at the college level, we seem to base our allocations on simple student/staff ratios.
Quality feedback on student work follows close behind. Good teaching involves showing students who don't think they need help that they still have things to learn. Unless someone is actually reading student work, we miss those opportunities for deep education. I still insist on having my staff actually read all the code that gets submitted (rather than just auto-grading against test suites), but we're losing the scale battle there.
Avoiding anonymity is another issue I worry about: even if a student never expects to seek help, large classes feel impersonal. At a time when students are trying to work out who they are and what they care about, this is problematic. Not problematic enough, however, to justify additional resources.
I will be spending at least part of my sabbatical better understanding how cognitive tutors and other computer-based learning aids could help with these problems. I don't want to fully automate my class. Being honest with myself, I'm looking for ways to mitigate the guilt of not being able to support each and every student in line with my values as a teacher. Having that support come entirely from a human teacher isn't feasible, nor, I suspect, is it necessary or even optimal. There are interesting blended human-computer instructional systems waiting to be built for teaching in large classes (lots of progress exists on the systems side, but the teacher/tool interaction seems less developed). If I can come off sabbatical more comfortable with the level of support I can provide, and rested enough to hold up my end of the bargain, I'd call it a success.