Lessons Learned in the Space Sector: An Interactive Tool to Disseminate Lessons Learned to Systems Engineers
Organizations, like individuals, are expected to learn from their mistakes. Companies that successfully rely on past knowledge to inform programmatic decisions use knowledge management tools to capture and disseminate this information, often in the form of lessons learned databases. However, past mistakes continue to recur in the aerospace industry, including at NASA. Although NASA has taken measures to stress the importance of lessons learned in its organizational culture, relatively little work has been done to develop the user interface of its lessons learned database. Encouraging engineers to review lessons only goes so far when the interface itself is outdated and difficult to use. We propose that an interactive network tool is an effective way to disseminate lessons learned to novice systems engineers.
In this thesis, I begin by developing a model to represent spacecraft anomaly narratives and applying this model to the Jet Propulsion Laboratory's publicly available lessons learned database. I then create an interactive network tool and populate it with the set of modeled lessons. Next, I design an experiment to determine how novice engineers use two different knowledge management tools: the interactive network and the NASA database. I use transcripts of users' thought processes, verbalized to me during the experiment, to create a mental model of how users with access to knowledge management tools respond to engineering scenarios. From the mental model, I identify the functional strengths and weaknesses of both the interactive network and the NASA database. Finally, I discuss the results of the experiment and recommend future improvements to the interactive network tool.
We found that the interactive network was a better resource for users to make connections between topics, and that the NASA database was a better resource for users to search for specific information. Using the interactive network over the NASA database correlated with an increase in performance for the majority of the experiment, but the data we collected do not provide enough evidence for us to conclude that the interactive network is a better dissemination tool than the NASA database in all scenarios. We found that retrieving lessons learned from either tool takes time because each tool's functionality elicits new tasks from the user. Finally, we found that the top performers in the experiment used each tool's strongest features.