Course recommendation systems (CRS) are essential tools that help university students navigate their academic journey and select appropriate courses. However, most existing models function as black boxes, offering little transparency into their recommendations. This lack of explainability reduces students' trust in and willingness to adopt these systems, as students often want to understand the reasoning behind course suggestions. To address this issue, we introduce a knowledge graph-enhanced explainable course recommendation framework that provides personalized course recommendations for the upcoming semester. Our model, GGCR, integrates multiple techniques: keyword feature extraction from course descriptions, graph attention networks (GAT) to build high-quality embeddings, and gated recurrent units (GRU) to capture session-based and sequential patterns across multiple semesters. We also explore the explainability of the best-performing models. We use course embeddings to offer course-similarity-based explanations; in addition, using the attention weights of the GAT module, our approach provides more detailed path-reasoning-based explanations for each recommendation. In experiments on real-world data, the proposed GGCR model not only outperforms existing state-of-the-art models but also enhances explainability, showing that explainability need not come at the cost of accuracy. Furthermore, we assess the quality of the different types of explanations using the fidelity measure, large language models (LLMs), and human experts to ensure that our framework provides meaningful and interpretable recommendations.
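The pipeline described above (attention-weighted embeddings over a course graph, then a recurrent unit over per-semester sequences, with the attention weights available for explanation) could be sketched roughly as follows. This is a purely illustrative numpy toy with random data and made-up dimensions, not the authors' GGCR implementation; all names and shapes here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gat_layer(H, A, W, a):
    """Single-head graph attention: LeakyReLU-scored pairwise logits
    over graph edges, softmax-normalized, then a weighted feature sum.
    Returns the new embeddings and the attention matrix (usable for
    path-reasoning-style explanations)."""
    Z = H @ W                                  # projected node features (n, d_out)
    n = Z.shape[0]
    e = np.full((n, n), -np.inf)               # -inf masks non-edges in the softmax
    for i in range(n):
        for j in range(n):
            if A[i, j]:
                s = a @ np.concatenate([Z[i], Z[j]])
                e[i, j] = s if s > 0 else 0.2 * s   # LeakyReLU
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = np.where(np.isfinite(e), alpha, 0.0)
    alpha /= alpha.sum(axis=1, keepdims=True)
    return alpha @ Z, alpha

def gru_step(h, x, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU update, consuming one semester's aggregated course vector."""
    sig = lambda v: 1.0 / (1.0 + np.exp(-v))
    z = sig(Wz @ x + Uz @ h)                   # update gate
    r = sig(Wr @ x + Ur @ h)                   # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))
    return (1 - z) * h + z * h_tilde

# --- toy demo: 5 courses, 8 keyword features, 4-dim embeddings ---
n, d_in, d_out = 5, 8, 4
H = rng.normal(size=(n, d_in))                 # e.g. keyword-based course features
A = (np.eye(n) + (rng.random((n, n)) < 0.3) > 0).astype(float)  # adjacency + self-loops
W = rng.normal(size=(d_in, d_out))
a = rng.normal(size=2 * d_out)
emb, alpha = gat_layer(H, A, W, a)             # alpha rows sum to 1 over neighbors

# run the GRU over three semesters of (averaged) taken-course embeddings
h = np.zeros(d_out)
params = [rng.normal(size=(d_out, d_out)) * 0.1 for _ in range(6)]
for _ in range(3):
    x = emb[rng.integers(0, n, size=2)].mean(axis=0)
    h = gru_step(h, x, *params)
scores = emb @ h                               # next-semester recommendation scores
```

In a real system the GAT would be trained end to end with the GRU, and the per-edge `alpha` values would be traced along graph paths to produce the explanation text; here they are only computed and returned.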