Abstract
Recent advances in large language models (LLMs) have opened new opportunities for personalized news recommendation. However, existing LLM-based approaches mainly focus on semantic enrichment while overlooking structural signals such as entity-level relations that are crucial for modeling user preferences. Meanwhile, graph-based methods capture structural information but often rely on sparse click-based or incomplete news–entity interactions, limiting their ability to model rich relational structures. To address these limitations, we propose PromptHG, a unified framework that integrates LLM prompting with heterogeneous graph learning for news recommendation. First, PromptHG leverages LLMs to directly synthesize entities from news titles and link articles through shared entity nodes, uncovering relationships beyond conventional interaction signals and alleviating sparsity in user behavior data. Second, we construct a heterogeneous news–entity graph that integrates user click sequences with entity-based connections, and employ a lightweight graph encoder to learn robust news and user representations, enabling the model to capture complex structural dependencies for improved preference modeling. Extensive experiments on two benchmark datasets demonstrate that PromptHG consistently outperforms strong baselines across multiple evaluation metrics, highlighting the effectiveness of LLM-guided entity synthesis and heterogeneous graph modeling for personalized news recommendation.
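To make the graph-construction idea concrete, the following is a minimal sketch (not the authors' implementation) of the first step the abstract describes: entities synthesized from news titles become shared nodes that link articles, independently of click co-occurrence. The `extract_entities` function is a hypothetical stand-in for the LLM prompting step, and all titles and entity names are illustrative.

```python
# Toy sketch of an LLM-guided news–entity graph: news titles and entities are
# nodes; an article connects to each entity extracted from its title, and two
# articles become linked whenever they share an entity node. The extraction
# step is stubbed out; in PromptHG it would be an LLM prompt over the title.
from collections import defaultdict

def extract_entities(title):
    # Hypothetical stand-in for LLM output, e.g. from a prompt like
    # "List the named entities mentioned in this headline."
    stub = {
        "Apple unveils new iPhone at Cupertino event": ["Apple", "iPhone", "Cupertino"],
        "iPhone sales boost Apple quarterly earnings": ["iPhone", "Apple"],
        "Cupertino approves new housing development": ["Cupertino"],
    }
    return stub.get(title, [])

def build_hetero_graph(titles):
    """Return news->entity edges and the article pairs linked via a shared entity."""
    news_to_entities = {t: extract_entities(t) for t in titles}
    entity_to_news = defaultdict(set)
    for title, ents in news_to_entities.items():
        for e in ents:
            entity_to_news[e].add(title)
    # Two articles are structurally related if any entity node connects both.
    linked_pairs = set()
    for articles in entity_to_news.values():
        ordered = sorted(articles)
        for i in range(len(ordered)):
            for j in range(i + 1, len(ordered)):
                linked_pairs.add((ordered[i], ordered[j]))
    return news_to_entities, linked_pairs

titles = [
    "Apple unveils new iPhone at Cupertino event",
    "iPhone sales boost Apple quarterly earnings",
    "Cupertino approves new housing development",
]
edges, pairs = build_hetero_graph(titles)
```

In this toy example the second and third articles share no clicked users and no title words with each other, yet both end up linked to the first article through the shared "Apple"/"iPhone" and "Cupertino" entity nodes, which is the kind of relation click-based graphs miss. A graph encoder would then propagate representations over these news and entity nodes.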