A context-aware prefetching strategy for mobile computing environments (Conference)

Drakatos, S., Pissinou, N., Makki, K., et al. (2006). A context-aware prefetching strategy for mobile computing environments. 2006, 1109-1116. 10.1145/1143549.1143771

cited authors

  • Drakatos, S; Pissinou, N; Makki, K; Douligeris, C

abstract

  • In a mobile wireless environment, the latency (time delay) observed by a user before s/he receives up-to-date information may be high because of the limited available bandwidth. An efficient prefetching strategy must balance the competing goals of keeping latency low (which requires more prefetching) and reducing resource waste in a mobile environment, which is characterized by scarce bandwidth and resource-poor user devices. Current research is based on the tangent velocity approach, which is effective only within a short time interval and incurs a high cost of continuous geometric estimations. This paper proposes a cache management method that maintains the mobile terminal's cache content by prefetching data items with maximum benefit and evicting cache entries with minimum benefit. The benefit of a data item is evaluated based on the user's query context, defined as a set of constraints (predicates) that describe both the movement pattern and the information context requested by the mobile user. A context-aware cache is formed and maintained using a set of neighboring locations (which we call the prime list) restricted by the validity of the data fetched from the server. Simulation results show that the proposed strategy, applied at different levels of granularity, can greatly improve system performance in terms of cache hit ratio. Copyright 2006 ACM.
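The benefit-based prefetch/evict policy described in the abstract can be illustrated with a minimal sketch. The code below is not taken from the paper: the class name `ContextAwareCache`, the `benefit` function, and the example prime list are hypothetical stand-ins, assuming only that each candidate item has a numeric benefit under the current query context and that the cache retains the highest-benefit items.

```python
# Illustrative sketch only: ContextAwareCache, benefit, and prime_list below are
# hypothetical names, not the paper's actual data structures or formulas.

class ContextAwareCache:
    """Keep the highest-benefit data items; evict the minimum-benefit entry when full."""

    def __init__(self, capacity, benefit_fn):
        self.capacity = capacity      # number of data items the terminal can hold
        self.benefit_fn = benefit_fn  # maps (item, query_context) -> numeric benefit
        self.items = {}               # item id -> cached data item

    def prefetch(self, candidates, query_context):
        """Prefetch candidates (e.g. items for the prime list of neighboring
        locations) in order of decreasing benefit under the current context."""
        ranked = sorted(candidates,
                        key=lambda c: self.benefit_fn(c, query_context),
                        reverse=True)
        for item in ranked:
            self._insert(item, query_context)

    def _insert(self, item, query_context):
        if len(self.items) >= self.capacity:
            # Candidate victim: the cached entry with minimum benefit right now.
            victim = min(self.items,
                         key=lambda k: self.benefit_fn(self.items[k], query_context))
            if self.benefit_fn(self.items[victim], query_context) >= \
                    self.benefit_fn(item, query_context):
                return  # the new item is not worth caching
            del self.items[victim]
        self.items[item["id"]] = item


# Hypothetical benefit: higher for items closer to the user's predicted location.
def benefit(item, context):
    return 1.0 / (1.0 + abs(item["location"] - context["predicted_location"]))


cache = ContextAwareCache(capacity=2, benefit_fn=benefit)
prime_list = [{"id": i, "location": loc} for i, loc in enumerate([3, 5, 9])]
cache.prefetch(prime_list, {"predicted_location": 4})
print(sorted(cache.items))  # -> [0, 1]: the two items nearest the predicted location
```

In the paper's formulation the benefit would be derived from the query predicates (movement pattern and requested information context) and restricted by data validity; the distance-based proxy above is used purely for illustration.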

publication date

  • December 1, 2006

Digital Object Identifier (DOI)

  • 10.1145/1143549.1143771

start page

  • 1109

end page

  • 1116

volume

  • 2006