In this post, I describe a ubiquitous style of programming that, to my knowledge, has never been formally taught in the classroom.
In most programming classes, students write programs in a single language (e.g., Java, Python) and its standard library; they might use a well-documented third-party library for, say, graphics. Students fill in skeleton code templates provided by instructors or, at most, write a few chunks of code "from scratch." Specifications and interfaces are clearly defined, and assignments are graded using automated test suites to verify conformance to specs.
What I just described is necessary for introducing beginners to basic programming and software engineering concepts. But it bears little resemblance to the sorts of programming that these students must later do in the real world.
Over my past decade of programming, I’ve built research prototypes, extended open-source software projects, shipped products at startups, and engaged in formal software engineering practices at large companies. Regardless of setting, here are the typical steps that my colleagues and I take when starting a new project:
1. Forage: Find existing snippets of code to build my project upon. This might include code that I wrote in the past or that colleagues sent to me in various stages of bit-rot. If I’m lucky, then I can find a software library that does some of what I want; if I’m really lucky, then it will come with helpful documentation. Almost nobody starts coding a real-world project "from scratch" anymore; modern programmers usually scavenge parts from existing projects.
2. Tinker: Play with these pieces of existing code to assess their capabilities and limitations. This process involves compiling and running the code on various inputs, inserting "print" statements to get a feel for when certain lines execute and with what values, and then tweaking the code to see how its behavior changes and when it breaks. (A small sketch of this sort of probing appears after this list.)
(Now loop between steps 1 and 2 until I'm satisfied with my choice of building blocks for my project. Then move on to step 3.)
3. Weld: Try to attach ("weld") pieces of existing code to one another. I might spend a lot of time getting the pieces compiled and linked together due to missing or conflicting dependencies. Impedance mismatches are inevitable: Chances are, the pieces I have just welded together were never designed to "play nicely" with one another or to suit the particular needs of my project. (A sketch of a typical weld appears after this list.)
4. Grow: Hack up some hard-coded examples of my new code interfacing with existing "welded" code. At this point, my newborn code is sloppy and not at all abstracted, but that’s okay -- I just want to get things working as quickly as possible. In the process, I debug lots of idiosyncratic interactions at the seams between my code and external code. Wrestling with corner cases becomes part of my daily routine.
5. Doubt: When implementing a new feature, I often ask myself, "Do I need to code this part up all by myself, or is there some idiomatic way to accomplish my goal using the existing code base or libraries?" I don’t want to reinvent the wheel, but it can be hard to figure out whether existing code can be molded to do what I want. If I’m lucky, then I can ask the external code's authors for help; but I try not to get my hopes up because they probably didn’t design their code with my specific use case in mind. The gulf of execution is often vast: Conceptually simple features take longer than expected to implement.
6. Refactor: Notice patterns and redundancies in my code and then create abstractions to generalize, clean up, and modularize it. As I gradually refactor, the interfaces between my code and external code start to feel cleaner, and I also develop better intuitions for where to abstract next. Eventually I end up "sanding down" most of the rough edges between the code snippets that I started with in step 4. (A sketch of this grow-then-refactor cycle appears after this list.)
(Now repeat steps 4 through 6 until my project is completed.)
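To make step 2 (Tinker) concrete, here is a minimal sketch of the kind of throwaway probing I mean. It is written in Python with invented names; parse_log stands in for whatever scavenged function I'm trying to understand, and the loop simply pokes it with a few inputs to see what comes back.

```python
# A minimal "tinker" sketch. parse_log() is a stand-in for a scavenged
# function whose behavior I don't yet trust, so I run it on a few inputs
# (including an edge case) and print what happens.

def parse_log(line):
    # scavenged code, treated as a black box for now
    timestamp, _, message = line.partition(" ")
    return {"time": timestamp, "msg": message.strip()}

for sample in ["1712000000 server started",
               "1712000042 disk almost full",
               ""]:                      # deliberately include an edge case
    print("input:", repr(sample))
    try:
        result = parse_log(sample)
        print("  -> returned:", result)
    except Exception as exc:             # learn when (and how) it breaks
        print("  -> raised:", exc)
```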
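Step 3 (Weld) in miniature might look like the following sketch, again with made-up names: one scavenged piece of code produces records as tuples, the library I want to feed them into expects dicts with different field names, and neither was designed for the other, so a small adapter sits at the seam.

```python
# A minimal "welding" sketch with invented names. fetch_measurements() and
# plot_series() were never designed to work together; adapt() is the weld.

def fetch_measurements():
    # scavenged producer: yields (sensor_id, celsius) tuples
    return [("s1", 21.5), ("s2", 19.0)]

def plot_series(points):
    # hypothetical consumer: wants [{"label": ..., "value": ...}, ...]
    for p in points:
        print(f'{p["label"]}: {p["value"]}')

def adapt(measurements):
    # translate one interface into the other
    return [{"label": sensor, "value": temp} for sensor, temp in measurements]

plot_series(adapt(fetch_measurements()))
```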
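And here is a sketch of steps 4 and 6 taken together: a hard-coded first draft whose duplication eventually suggests the abstraction. The file names, column indices, and helper names are all invented; the point is only the shape of the cleanup.

```python
# A minimal "grow, then refactor" sketch (all names invented).

def load_csv(path):
    # stand-in for scavenged loading code
    with open(path) as f:
        return [line.rstrip("\n").split(",") for line in f]

# Step 4 style: copy-pasted glue, hard-coded per data source.
def summarize_temps():
    rows = load_csv("temps.csv")
    values = [float(r[1]) for r in rows]
    print("temps avg:", sum(values) / len(values))

def summarize_humidity():
    rows = load_csv("humidity.csv")
    values = [float(r[2]) for r in rows]
    print("humidity avg:", sum(values) / len(values))

# Step 6: the duplicated shape becomes one parameterized helper.
def summarize(path, column, label):
    rows = load_csv(path)
    values = [float(r[column]) for r in rows]
    print(f"{label} avg:", sum(values) / len(values))

# summarize("temps.csv", 1, "temps")
# summarize("humidity.csv", 2, "humidity")
```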
I don’t have a good name for this style of programming, so I’d appreciate any suggestions. The closest is Opportunistic Programming, a term that my colleagues and I used in our CHI 2009 paper where we studied the information foraging habits of web programmers. Also, I coined the term Research Programming in my Ph.D. dissertation, but the aforementioned six-step process is widespread outside of research labs as well. (A reader suggested the term bricolage.)
Students currently pick up these hands-on programming skills not in formal CS courses, but rather through research projects, summer internships, and hobby hacking.
One argument is that the status quo is adequate: CS curricula should focus on teaching theory, algorithm design, problem decomposition, and engineering methodologies. After all, "CS != Programming," right?
But a counterargument is that instructors should directly address how real-world programming — the most direct application of CS — is often a messy and ad-hoc endeavor; modern-day programming is more of a craft and an empirical science than a collection of mathematically beautiful formalisms.
How might instructors accomplish this goal? Perhaps via project-based curricula, peer tutoring, pair programming, one-on-one mentorship, or pedagogical code reviews. A starting point is to think about how to teach more general intellectual concepts in situ as students encounter specific portions of the six-step process described in this post. For example, what can "code welding" teach students about API design? What can refactoring teach students about modularity and testing? What can debugging teach students about the scientific method?
My previous CACM post, "Teaching Programming To A Highly Motivated Beginner," describes one attempt at this style of hands-on instruction. However, it's still unclear how to scale up this one-off experience to a classroom (or department) full of students. The main challenge is striking a delicate balance between exposing students to the nitty-gritty of real-world programming and teaching them powerful, generalizable CS principles along the way.
Taking a course in programming is like expecting to learn a new language by being handed a dictionary. I despair at the horrible state that so-called computer language courses are in. There is no teaching involved, and in many cases where I have requested assistance with my homework, it became apparent that the homework was never reviewed.
I don't find it surprising that the computer science field has such a difficult time attracting students. They are expected to spend hundreds of dollars on books and then are basically told to read them and teach themselves programming.
TDH
Well, I for one won't be taking one of your classes. Cobbling together all sorts of stuff and hoping you can create abstractions later, modularise, and refactor is NOT the way to write serious software. If I presented this way of programming to the medical regulatory authorities, I would be laughed at. At best this method could be used for script kiddies' apps on iPhones. It's all anyone seems to talk about. The REAL programming world is one of proper specification, design, and test.
Doug
I can't help thinking that there may be a place for a software craftsmanship qualification (unfortunately, it's not an "engineering" discipline yet for the most part), but I hope CS maintains its focus on CS independently of that.
Well, I am an amateur programmer, but when I studied programming I had to write whole applications, sometimes from scratch, though sometimes from templates, which were sometimes quite complete, sometimes skeletal, and which didn't always help very much. (Sometimes it's easier to work from scratch, but the templates forced us to use a certain design strategy.) In my written exam I had to write a number of classes from scratch, with nothing but paper and pen. Thus, I am not sure I relate entirely to the spoon-fed approach outlined in the article. Clearly approaches to IT education differ somewhat. Courses did teach a single language, which is why I took several courses, covering C++, C# and Java. I pretty well follow the same steps that you do, but then I was writing my own applications from day one, long before commencing formal study, and so I had already discovered these things. What I wasn't taught was the use of some of the tools that are in vogue in industry at the moment, but then I'm an amateur, so I guess that doesn't matter to me so much.
Would call this Cowboy Coding, wouldn't ya?
I teach the software "Practicum" at Franklin University, which puts sophomores and juniors on a development team with a senior as the manager/project-manager/technical lead. Each team develops an application they can complete within the semester, and demos it at the end.
The class focuses on process: the team needs to write full requirements, a design document, and a user manual; the manager needs to create a schedule for the project with milestones, assign tickets each week for the work, and keep the software in a repository accessible through SVN or Git.
The process creates more overhead than you might have in the real world for a small project, but it gives a view of how a software team really works. Initiative, out-of-the-box thinking, and communication take place in practice, not just in theory.
One of the things I stress in the real world is to "write it down" before you build it. Even for small projects I build myself, I find I save time and headaches by writing down: what I'm trying to accomplish, how I'm going to do it, and what the tools and environment are. Building on Phillip's comments, I would write down "where" to look for code snippets, "how" they would go together, and a clear statement of "what" I'm trying to accomplish.
For myself, I enjoy writing code, and I find when I don't write things down at the beginning, I tend to wander.