When reading a longer document, such as a novel, your interpretation of the text you are currently reading is informed by what has come before: your attention to particular details, wording, or ideas; whether you perceive an action in a positive or negative light; what you think about a particular character. Currently, transformer-based Large Language Models consider only the current context (typically limited to 512 to 1024 tokens, or subwords), and cannot interpret longer texts as we do. We explore the possibility of modeling the narrative arc, influencing a transformer’s context with the flow of the story up to the current point. We approach this problem from literary, mathematical, and computational viewpoints.

Please join us on October 21 from 1 to 2pm as we host Prof. Corey Brady (Peabody, Mathematics Education, Department of Teaching and Learning) for a discussion of Narrative Arc! As always, we’ll have the popcorn bar ready for snacks!