The ability to assure reliability of adaptation is important in safety-critical applications. Traditional software V&V techniques cannot account for the time-evolving nature of a generic dynamic system, making them inapplicable for the assurance of adaptive computing systems. In this paper, we propose considering stability of adaptation as a heuristic measure of reliability, and present a stability monitoring system for detecting divergent learning behavior during online operation of adaptive systems. The monitoring system comprises several Lyapunov-like functions that detect distinct states in learning that bifurcate away from stable behavior. Murphy's rule, based on Dempster-Shafer theory, is applied to combine the stability information provided by the individual monitors into an easily interpretable belief representation. The proposed analysis technique is evaluated using online learning experiments based on data generated by an actual adaptive flight control system. Results indicate that the stability monitoring system detects divergence conditions and provides insight into whether the online learning converges to a stable state.
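The combination step described above can be sketched as follows. This is a minimal illustration of Murphy's rule (average the monitors' mass functions, then combine the average with itself n-1 times via Dempster's rule) over a hypothetical two-element frame {stable, divergent}; the monitor names and mass values are assumptions for illustration, not the paper's actual data.

```python
# Murphy's combination rule over the frame {stable, divergent}.
# Focal sets are frozensets; the full frame represents "uncertain".
S = frozenset({"stable"})
D = frozenset({"divergent"})
U = frozenset({"stable", "divergent"})  # total ignorance

def dempster(m1, m2):
    """Dempster's rule of combination for two mass functions."""
    combined, conflict = {}, 0.0
    for A, a in m1.items():
        for B, b in m2.items():
            C = A & B
            if not C:
                conflict += a * b          # mass assigned to the empty set
            else:
                combined[C] = combined.get(C, 0.0) + a * b
    norm = 1.0 - conflict                  # renormalize over non-conflicting mass
    return {A: v / norm for A, v in combined.items()}

def murphy(masses):
    """Murphy's rule: average the mass functions, then apply
    Dempster's rule to the average n-1 times."""
    n = len(masses)
    focal = {A for m in masses for A in m}
    avg = {A: sum(m.get(A, 0.0) for m in masses) / n for A in focal}
    result = avg
    for _ in range(n - 1):
        result = dempster(result, avg)
    return result

# Hypothetical outputs of three Lyapunov-like stability monitors:
# each assigns belief mass to "stable", "divergent", or "uncertain".
monitors = [
    {S: 0.7, U: 0.3},
    {S: 0.6, U: 0.4},
    {D: 0.5, U: 0.5},   # one monitor suspects divergence
]

fused = murphy(monitors)
for A, v in sorted(fused.items(), key=lambda kv: -kv[1]):
    print(sorted(A), round(v, 3))
```

Because Murphy's rule averages before combining, a single dissenting monitor shifts belief toward "divergent" gradually rather than being vetoed outright, which avoids the counterintuitive results Dempster's rule alone can produce under high conflict.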