28th International Conference on Text, Speech and Dialogue (TSD), Erlangen, Germany, 25-28 August 2025, vol. 16030, pp. 203-215 (Full-Text Paper)
Large language models (LLMs) have demonstrated remarkable performance across a wide range of NLP tasks. However, their effectiveness in discourse parsing remains underexplored, and existing LLM-based attempts fall significantly short of the performance achieved by encoder-based models. In this study, we propose a Chain-of-Thought (CoT) prompting approach for the task of implicit discourse relation recognition (IDRR) that leverages the concept of abstract objects. We show that guiding the model to identify abstract objects within the arguments of the discourse relation systematically enhances classification performance across both Level-1 and Level-2 senses, in both monolingual and multilingual settings. Through experiments on three monolingual corpora and one multilingual corpus, covering seven languages and annotated according to PDTB 3.0, we demonstrate that our CoT-style prompting approach achieves significant improvements over previous LLM-based methods.
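To make the idea concrete, the sketch below shows what such a CoT prompt for IDRR could look like. This is an illustrative assumption, not the authors' actual prompt: the wording of the steps, the `build_cot_prompt` helper, and the example arguments are all hypothetical; only the overall structure (first identify each argument's abstract object, then infer a PDTB 3.0 sense) follows the approach described in the abstract.

```python
# Hypothetical CoT prompt template for implicit discourse relation
# recognition (IDRR). The model is first asked to name the abstract object
# (event, state, or proposition) expressed by each argument, and only then
# to classify the relation with a PDTB 3.0 Level-1 sense.

LEVEL1_SENSES = ["Temporal", "Contingency", "Comparison", "Expansion"]

def build_cot_prompt(arg1: str, arg2: str) -> str:
    """Assemble a CoT-style prompt for classifying the implicit
    discourse relation between two arguments."""
    return (
        "Task: identify the implicit discourse relation between "
        "two arguments.\n"
        f"Arg1: {arg1}\n"
        f"Arg2: {arg2}\n"
        "Step 1: State the abstract object (event, state, or "
        "proposition) described by Arg1.\n"
        "Step 2: State the abstract object described by Arg2.\n"
        "Step 3: Explain how the two abstract objects relate.\n"
        "Step 4: Answer with one PDTB 3.0 Level-1 sense from: "
        + ", ".join(LEVEL1_SENSES) + ".\n"
    )

prompt = build_cot_prompt(
    "The company reported record profits.",
    "Its stock price fell sharply.",
)
print(prompt)
```

The intermediate "abstract object" steps are what distinguishes this from a plain zero-shot classification prompt; the resulting text would be sent to any instruction-tuned LLM as a single user message.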