My Basic Technology Stack for Teaching Web Programming
November 2014 (perspective of an assistant professor)
I teach a course whose objective is to provide students with a strong foundation in the principles of Web programming. When designing this course, I chose a very basic technology stack: hand-written HTML and CSS, JavaScript with the jQuery library, Python CGI scripts on the backend, and SQLite for persistent storage.
So given the state-of-the-art in industry, why did I choose such a basic technology stack for teaching? Because I want to teach generalizable, fundamental concepts of Web programming, not tool-specific details.
I had three main requirements for a technology stack: it should sit at the lowest practical level of abstraction (of course, there will always be some abstraction), it should be easy to install and deploy, and it should have been in widespread use for many years. Thus, my stack consists of tools that meet all three requirements.
HTML and CSS
The semester begins with a crash course on writing basic HTML and CSS in a text editor. No templating languages, no styling languages that compile to CSS, no GUI-based interface builders. This setup is trivial to test locally on any computer with a Web browser. There is nothing else to install except for a free text editor. HTML and CSS aren't going anywhere soon, so this technology ages well.
JavaScript and jQuery
The only extra setup step is linking to a Web-hosted version of jQuery in the HTML file, which doesn't require any extra software installation. “Hello World” still fits on a single page. Again, this setup can be tested locally on an ordinary Web browser; no need for extra compilation steps or running a Web server on localhost.
One additional benefit of using jQuery is its declarative selector language, which further reinforces DOM concepts and somewhat segues into SQL.
Later in the course, I also use jQuery for making Ajax calls without eye-gouging, cross-browser compatibility pains.
Python and CGI
For server-side programming, I chose a plain CGI interface, which is almost as old as the Web itself. The basic idea is simple to teach: write a program, in any language, that prints the full HTTP response, header and payload, warts and all, to standard output.
The bare-bones nature of printing an HTTP response is great for illustrating HTTP concepts. It's also easy to debug since students can simply run their programs on the command line to see what it prints. Nothing is hidden behind layers of abstraction like in many Web application frameworks. What the student's program prints is exactly what gets sent over the Internet to the user's Web browser.
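As a sketch of this idea, here is a minimal CGI-style script (the HTML content is just a placeholder). Under Apache's CGI handler, whatever the script writes to standard output becomes the HTTP response: headers first, then a blank line, then the payload.

```python
#!/usr/bin/env python3
# Minimal CGI-style script: write HTTP headers, a blank line, then the
# HTML payload to standard output. Run it on the command line to see
# exactly what would be sent over the wire to the browser.

headers = "Content-Type: text/html"
body = "<html><body><h1>Hello, World!</h1></body></html>"

# Headers and payload are separated by exactly one blank line.
response = headers + "\r\n\r\n" + body
print(response)
```

Running this by hand in a terminal shows the same bytes that the Web server would relay to the browser, which is exactly what makes debugging so easy.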
Since Python has convenient built-in libraries for CGI, cookies, JSON encoding (for Ajax), and SQLite (see next section), I chose it as the backend language. But I emphasize that any language works here.
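For instance, a simple Ajax-style JSON endpoint can be sketched with only standard-library modules. (The "name" parameter is invented for illustration; CGI passes the query string to the script via the QUERY_STRING environment variable.)

```python
#!/usr/bin/env python3
# Sketch of a JSON (Ajax) endpoint as a CGI script, using only the
# Python standard library. The "name" parameter is a made-up example.
import json
import os
from urllib.parse import parse_qs

# CGI delivers the query string through an environment variable.
query = parse_qs(os.environ.get("QUERY_STRING", ""))
name = query.get("name", ["stranger"])[0]

payload = json.dumps({"greeting": "Hello, " + name})

# Same pattern as before: headers, blank line, payload.
print("Content-Type: application/json")
print()
print(payload)
```

Nothing here is framework-specific: the script parses its input by hand and prints its output explicitly, so every step is visible to students.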
Python and CGI have been around forever (in Web years), and are easy to deploy on any Apache server such as those provided by many universities and low-cost Web hosting services. Students can install MAMP to test locally on their own computers.
Why not PHP? PHP integrates better with HTML and provides a quasi-template backend language. For hacking up a simple Web app with little overhead, I'd recommend PHP. But for teaching, I don't like how PHP hides the concept of HTTP responses. There's still some behind-the-scenes magic that turns a PHP script into an HTML webpage. With ordinary CGI, there's no magic: an HTML webpage gets created as a string and explicitly printed using print statements.
SQLite
For persistent data storage, I start by teaching students to read and write ordinary text files in their Python CGI programs. That's the simplest possible approach. Then we discuss the limitations of using the filesystem as a “database,” and why ACID is good.
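A guestbook-style sketch of this file-based approach (the function and file names are made up for illustration):

```python
#!/usr/bin/env python3
# Simplest possible persistence: append each record as one line of a
# plain text file, then read them all back. No database required.

def add_entry(filename, entry):
    # Append mode creates the file on first use.
    # Caveat: two CGI processes appending at once can interleave
    # partial lines -- one motivation for a real (ACID) database.
    with open(filename, "a") as f:
        f.write(entry + "\n")

def all_entries(filename):
    try:
        with open(filename) as f:
            return [line.rstrip("\n") for line in f]
    except FileNotFoundError:
        return []  # no entries yet
```

The caveat in the comment is the hook for the ACID discussion: concurrent writes, partial failures, and ad-hoc parsing are exactly what a database solves.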
We segue into SQLite, which is (gasp!) a single file just like those they've been reading and writing all along in Python. Since Python comes with built-in SQLite support, there is no extra installation or setup. A database is a single file; no magic.
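A sketch using the built-in sqlite3 module (table and column names are invented; an in-memory database stands in here for an ordinary file like guestbook.db):

```python
#!/usr/bin/env python3
# SQLite via Python's built-in sqlite3 module: the entire database is
# one ordinary file on disk. ":memory:" is used here only so the
# sketch leaves nothing behind; a filename works the same way.
import sqlite3

conn = sqlite3.connect(":memory:")  # or sqlite3.connect("guestbook.db")
conn.execute("CREATE TABLE guestbook (name TEXT, message TEXT)")

# Parameterized queries avoid SQL injection from user input.
conn.execute("INSERT INTO guestbook VALUES (?, ?)", ("Alice", "hi!"))
conn.commit()

rows = conn.execute("SELECT name, message FROM guestbook").fetchall()
```

Because sqlite3 ships with Python, this entire workflow runs with zero installation or server administration.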
Any other database requires an enormous amount of command-line bullshittery to set up and administer. (Want to back up an SQLite database? Just copy the one file.)
What about NoSQL databases? Even though they're getting lots of traction in industry, I still want to introduce the idea of SQL in my course, so those won't work by definition.
Here's a 12-minute summary of this stack, delivered live during my Web programming class lecture on 2014-11-25.
What about Web application frameworks?
A reasonable criticism of this basic technology stack is one of authenticity. I openly admit that this stack isn't being used by top companies at the moment. So am I harming my students by explicitly not teaching them the latest and greatest Web application frameworks that they will use on the job?
Before I answer that question, I'll pose one of my own. Even if I were to teach the latest and greatest, which framework should I teach? A cursory comparison page on Wikipedia lists almost a hundred frameworks in over a dozen programming languages. I can't possibly pick one that will be a good fit for every potential job that my students might get. And Web technologies change so fast that my students will likely need to learn new ones every few years, either at their current job, or when they hop to a new job.
I'm old-school (ha!) and strongly believe that the role of formal schooling is to teach foundational concepts that are as timeless as possible. For Web programming, this means teaching the basics of the DOM, HTTP, frontend and backend programming, and persistent data storage. It doesn't mean teaching whatever buzzwords happen to be hot on the job market at the moment.
Web technologies change at a blistering pace. Read this Hacker News thread where a professional Web programmer laments the lack of job opportunities in Ruby on Rails:
Rails programmer here -- I've been working with Rails since 2008 and I've been doing remote jobs for a living since then.
Lately I can't find any work at all with Rails, what's going on? What happened? Where did all the Rails jobs went?
I remember during 2008 doing multiples jobs at the same time, I used to get lots of offers, lots of interviews and gigs, etc, things have been really nice. Nowdays I barely get any interviews, and if I'm lucky to talk to someone, they ask for Node.js/Ember experience as well, etc.
Did all the jobs went to Node.js and other languages? Or I'm just doing something wrong?
It's really depressing accumulating all the knowledge and skills I got since then and not being able to use it or find more work. Even more depressing that I can't find work and my income depends on this.
Any suggestions or recommendations welcome. I'm only looking for remote work.
Sigh, please help. :-(
Rails was the new hotness back in 2008 when 1337 haX0rz were dissing PHP, but now it's being supplanted by newer technologies such as Node.js, which itself will be supplanted by something newer in the coming years. However, foundational Web technologies such as the DOM and HTTP change much slower, on the order of a decade or more.
If my students develop a solid understanding of fundamentals, it will be far easier for them to learn whatever new frameworks are required in their future jobs, since the basic underlying concepts don't change much from year to year.
I'll end by reiterating my three main requirements for a teaching stack, and arguing why frameworks don't fit them: frameworks operate at high levels of abstraction, hiding the very HTTP and DOM fundamentals I want to expose; they are often nontrivial to install, configure, and deploy; and individual frameworks rarely stay in widespread use for many years.
To be clear, I think frameworks are crucial for building production-grade Web applications, and they are enormously powerful tools for professionals. However, I don't think they're worth teaching in a general Web programming course. (That said, for their semester-long group project, I let students use any technology that they like, so some opt to learn frameworks on their own, while others use the basic stack that I teach.)
In sum, taking my single course is no substitute for continual, on-the-job learning to stay current with the latest market trends. My main goal is to lay a solid foundation so that students can learn effectively by themselves throughout their careers.