Character relationships are central to narrative understanding in both literature and screenwriting, yet differences in storytelling conventions between novels and television dramas pose distinct challenges for algorithmic modeling. This paper proposes RKGCCBA (Role Knowledge Graph-assisted Correction and Context-Block Attention), a novel model for automated character relationship modeling across narrative texts. RKGCCBA integrates a role knowledge graph, which supplies prior knowledge of inter-character relationships, with a context-block attention mechanism that dynamically focuses on the dialogue context most relevant to each utterance, improving speaker attribution accuracy. We evaluate RKGCCBA on corpora from both media (novels and TV drama scripts), conducting a cross-media comparative analysis of character relationship extraction. Experimental results show that RKGCCBA outperforms baseline methods on dialogue speaker identification in both media. Moreover, the comparative evaluation highlights key narrative differences between prose novels and scripted dramas, underscoring the importance of tailored context modeling and supporting the approach’s applicability to diverse storytelling formats.
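The context-block attention component can be pictured as attention pooling over embeddings of surrounding dialogue blocks. The sketch below is an illustrative simplification, not the paper's implementation: the function name, the scaled dot-product scoring, and the toy vectors are all assumptions introduced here to convey the general idea of weighting context blocks by relevance to a target utterance.

```python
import numpy as np

def context_block_attention(query, blocks):
    """Illustrative sketch of attention over dialogue context blocks.

    query:  (d,) embedding of the target utterance
    blocks: (n, d) embeddings of n surrounding context blocks
    Returns softmax attention weights over blocks and the pooled context vector.
    """
    # Scaled dot-product relevance scores between the utterance and each block
    scores = blocks @ query / np.sqrt(query.shape[0])
    scores = scores - scores.max()                  # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum() # softmax over blocks
    context = weights @ blocks                      # relevance-weighted pooling
    return weights, context

# Toy example (hypothetical embeddings): the second block is most similar
# to the query, so it should receive the largest attention weight.
q = np.array([1.0, 0.0, 0.0, 0.0])
B = np.array([[0.1, 0.9, 0.0, 0.0],
              [0.9, 0.1, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
w, c = context_block_attention(q, B)
```

A pooled vector of this kind would then feed a speaker classifier; the role knowledge graph could additionally bias or correct the resulting predictions with known inter-character relations.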