
Porta: Profiling Software Tutorials Using Operating-System-Wide Activity Tracing

research paper summary
Porta: Profiling Software Tutorials Using Operating-System-Wide Activity Tracing. Alok Mysore and Philip J. Guo. ACM Symposium on User Interface Software and Technology (UIST), 2018.
(Best Paper Award)
It can be hard for tutorial creators to get fine-grained feedback about how learners are actually stepping through their tutorials and which parts lead to the most struggle. To provide such feedback for technical software tutorials, we introduce the idea of tutorial profiling, which is inspired by software code profiling. We prototyped this idea in a system called Porta that automatically tracks how users navigate through a tutorial webpage and what actions they take on their computer such as running shell commands, invoking compilers, and logging into remote servers. Porta surfaces this trace data in the form of profiling visualizations that augment the tutorial with heatmaps of activity hotspots and markers that expand to show event details, error messages, and embedded screencast videos of user actions. We found through a user study of 3 tutorial creators and 12 students who followed their tutorials that Porta enabled both the tutorial creators and the students to provide more specific, targeted, and actionable feedback about how to improve these tutorials. Porta opens up possibilities for performing user testing of technical documentation in a more systematic and scalable way.
@inproceedings{MysoreUIST2018,
 author = {Mysore, Alok and Guo, Philip J.},
 title = {Porta: Profiling Software Tutorials Using Operating-System-Wide Activity Tracing},
 booktitle = {Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology},
 series = {UIST '18},
 year = {2018},
}

Scientists and engineers often need to use multiple pieces of complex GUI and command-line software together in tight coordination. For instance, if you want to start building a full-stack web application in 2018, you may need to first install Node.js and the npm package manager, run a slew of npm commands to configure a custom toolchain with a CSS preprocessor and a JavaScript code bundler, adjust OS environment variables to detect all required library dependencies and execution paths, customize your IDE to hook up to that toolchain, install and configure web browser extensions for debugging, and set up a pipeline to deploy code to production servers. All these hoops to jump through just to get started!

People learn to perform these intricate software actions by reading step-by-step tutorials written by experts. However, it's notoriously hard for experts to create high-quality software tutorials that work well for a wide range of users. Why?

  • First and foremost, many suffer from expert blind spots. Since they are so intimately familiar with the subject matter they're writing about, they can't anticipate what learners don't yet know. As a result, they often skip critical details or steps in their write-ups, which leads to learner confusion.
  • Even if they create the world's highest-quality tutorial that works perfectly on their own computer, chances are that it will not work properly on some other users' computers since those users may have subtly differing versions of system libraries or OS configuration settings that are really hard to test for.
  • And even if they make a perfectly robust tutorial that works across everyone's computers today, some parts will inevitably break over time as users' installed versions of libraries, frameworks, and operating systems get upgraded in the near future. This is an especially salient problem in fast-moving domains such as web development or data science where new versions of tools come out every few months.

The most frustrating thing for tutorial creators is that there's no easy way for them to see how learners are actually using their tutorials or which parts are the most confusing. They simply throw their tutorials over the wall and hope for the best.

What if we could allow tutorial creators to get a detailed view into how learners are stepping through their tutorials? To do so, we created a macOS app called Porta (Profiling Operating-system Recordings for Tutorial Assessment), a spiritual successor to our Torta project from last year. Here's how Porta works:

  1. Start with any existing tutorial webpage, such as a web development tutorial.

  2. Recruit learners to test out this tutorial by installing the Porta app and Chrome browser extension. They can either come to a lab to do this user testing, or do it on their own computer.

  3. As each learner tries to follow this tutorial webpage, Porta automatically tracks their actions both within the web browser and across various GUI and command-line applications on their computer (a minimal sketch of this kind of tracing appears after this list). Specifically, it records:

    • a full screencast video of the test session
    • where the user is focusing on the tutorial webpage
    • when they play/pause embedded videos (tutorials sometimes embed videos to demonstrate certain actions)
    • what other webpages they open
    • what text is copied from the tutorial and then pasted into other apps
    • what command-line apps they are invoking in the terminal
    • what compiler (e.g., gcc, javac) and other toolchain commands (e.g., make) they are running from the terminal or IDE, along with what files they are compiling
    • commands run on remote servers via ssh (useful for tutorials involving cloud-based systems)
    • whether they are opening the browser developer tools or getting JavaScript errors (for web development tutorials)
  4. After the test session ends, Porta aggregates all of the recorded data and displays it as an overlay on top of the original tutorial webpage, forming a tutorial profiling visualization.
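
Before diving into the profiling interface, it may help to see roughly what the tracing in step 3 could involve. The paper describes Porta's tracing as macOS-specific and operating-system-wide; the Python sketch below is only an illustration of one ingredient of such a tracer (logging new process launches, here via the third-party psutil package), not Porta's actual implementation:

    # Rough illustration (NOT Porta's implementation) of logging new
    # process launches, one ingredient of OS-wide activity tracing.
    # Requires the third-party psutil package: pip install psutil
    import time
    import psutil

    def watch_process_launches(poll_interval=0.5):
        """Yield (create_time, name, cmdline) for each newly seen process.

        Polling like this misses very short-lived processes and ignores
        PID reuse; a real tracer would hook OS-level facilities instead.
        """
        known = {p.pid for p in psutil.process_iter()}
        while True:
            for proc in psutil.process_iter(["pid", "name", "cmdline", "create_time"]):
                if proc.info["pid"] not in known:
                    known.add(proc.info["pid"])
                    yield (proc.info["create_time"], proc.info["name"],
                           proc.info["cmdline"])
            time.sleep(poll_interval)

    if __name__ == "__main__":
        for ts, name, cmdline in watch_process_launches():
            # A tutorial profiler would filter for compilers (gcc, javac),
            # build tools (make, npm), ssh, etc., timestamp each event, and
            # later attach it to the learner's position in the tutorial.
            print(time.strftime("%H:%M:%S", time.localtime(ts)), name, cmdline)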

The profiling interface lets the tutorial creator see exactly what each learner (or a group of learners in aggregate) did while following the given tutorial. For instance, the positional heatmap shows how long the learner looked at each part of the tutorial, and event markers show instances of recorded events (e.g., command-line app invocations) that occurred while the learner was focused on that part. Clicking a marker pops up details about that event, including a screencast video recording and a log of any errors that arose.
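
To make the positional heatmap and event markers concrete, here is a hypothetical sketch of how they could be computed from viewport-focus samples. The sample format, the 200px bucket size, and the function names are illustrative assumptions, not Porta's actual data model:

    # Hypothetical sketch of two pieces of the profiling visualization:
    # (1) a dwell-time heatmap over vertical page regions, and
    # (2) anchoring a recorded event to the region the learner was
    # viewing when it fired. Data format and bucket size are assumed.
    import bisect
    from collections import defaultdict

    BUCKET_PX = 200  # vertical page region size (assumed)

    def dwell_heatmap(samples):
        """Sum seconds of focus time per page region from consecutive
        (timestamp_sec, viewport_top_px) focus samples."""
        heat = defaultdict(float)
        for (t0, y0), (t1, _) in zip(samples, samples[1:]):
            heat[int(y0 // BUCKET_PX) * BUCKET_PX] += t1 - t0
        return dict(heat)

    def region_at(samples, event_time):
        """Return the page region in focus when an event occurred,
        used to place its marker alongside the tutorial text."""
        times = [t for t, _ in samples]
        i = bisect.bisect_right(times, event_time) - 1
        return int(samples[max(i, 0)][1] // BUCKET_PX) * BUCKET_PX

    focus = [(0.0, 0), (5.0, 0), (9.0, 450), (15.0, 460)]
    print(dwell_heatmap(focus))    # {0: 9.0, 400: 6.0}
    print(region_at(focus, 10.2))  # 400

In the real interface, each recorded event (e.g., a failed compiler invocation) would then render as an expandable marker in its matched page region.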

Using this interface, tutorial creators can gain deep insights into which parts were confusing for learners and then revise those parts to clarify them. When we tested Porta with the creators of three widely-used software tutorials, our participants were surprised to find problems in their own tutorials that they had never thought about before.

In sum, Porta opens up possibilities for performing user testing of technical documentation in a more systematic and scalable way, which can in turn help improve tutorials and make complex software easier to use.


Read the full paper for details:

Porta: Profiling Software Tutorials Using Operating-System-Wide Activity Tracing. Alok Mysore and Philip J. Guo. ACM Symposium on User Interface Software and Technology (UIST), 2018.
(Best Paper Award)