This paper explores the advantages of utilizing semantic caching to manage location-dependent data (LDD) in mobile computing. It discusses how semantic caches can reduce wireless network traffic and improve system performance through efficient query processing and cache management. By modeling LDD queries and implementing a semantic cache index, we can enhance data retrieval based on user location. The study evaluates various caching strategies and replacement policies, revealing that semantic caching outperforms traditional page caching in mobile environments.
Using Semantic Caching to Manage Location Dependent Data in Mobile Computing 2003.3.18 CS 744 Database Lab. Se-Kyoung Huh
Contents
• Background
• Semantic Cache
• Modeling LDD Query
• LDD Semantic Cache Index
• LDD Query Processing
• LDD Cache Management
• Experiment
• Conclusion
Background
• Characteristics of mobile computing
  • Large overlapping results for continuous queries
  • Frequent disconnections
• Advantages of caching data in mobile computing
  • Reduces wireless network traffic cost
  • Improves system performance
Semantic Cache
• Semantic cache vs. page cache
• Advantages of a semantic cache for LDD (location-dependent data)
  • Semantic locality is stronger than spatial locality in LDD applications
  • Enables flexible cache management
  • Semantic information remains usable during disconnections
Modeling LDD Query
• Q = "Give me the names of the hotels within 20 miles whose prices are below $100"
• Qp = (price < 100) ∩ (Lx−20 ≤ xposition ≤ Lx+20) ∩ (Ly−20 ≤ yposition ≤ Ly+20)
  • (Lx, Ly): current user position
• Assumption: the reference point is given
• The query result depends on the current user position
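The binding of the example query Qp to the user's position can be sketched as follows. This is a minimal illustration, not code from the paper; the function and attribute names (`bind_ldd_query`, `satisfies`, `xposition`, `yposition`) are assumptions.

```python
# Hypothetical sketch: binding the location-dependent query Qp to the
# user's current position (Lx, Ly). All names are illustrative.

def bind_ldd_query(lx, ly, radius=20, max_price=100):
    """Return the bound predicate Qp as closed intervals per attribute."""
    return {
        "price": (None, max_price),               # price < max_price
        "xposition": (lx - radius, lx + radius),  # Lx-20 <= x <= Lx+20
        "yposition": (ly - radius, ly + radius),  # Ly-20 <= y <= Ly+20
    }

def satisfies(row, qp):
    """Check whether a hotel tuple satisfies the bound predicate."""
    _, hi = qp["price"]
    if hi is not None and not row["price"] < hi:
        return False
    for attr in ("xposition", "yposition"):
        lo, hi = qp[attr]
        if not (lo <= row[attr] <= hi):
            return False
    return True
```

Because the bound intervals move with (Lx, Ly), the same query text yields a different predicate each time the user moves, which is exactly what makes the query location dependent.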
LDD Semantic Cache Index
• Semantic information
  • Table
  • Attribute
  • Predicate
  • Bound position
  • Time stamp
• Index for cached results
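One index entry could be represented as a record holding the fields listed above. This is a sketch only; the field names and types are assumptions mirroring the slide, not the paper's actual structure.

```python
# Hypothetical representation of one semantic cache index entry,
# mirroring the fields on the slide: table, attributes, predicate,
# bound position, and time stamp. Field names are assumed.

from dataclasses import dataclass
import time

@dataclass
class CacheSegment:
    table: str             # relation the cached tuples come from
    attributes: tuple      # projected attributes
    predicate: str         # selection condition describing the contents
    bound_position: tuple  # (Lx, Ly) the predicate was bound at
    timestamp: float       # last reference time, used by replacement

seg = CacheSegment(
    table="hotel",
    attributes=("name", "price"),
    predicate="price < 100 AND -20 <= xposition <= 20 AND -20 <= yposition <= 20",
    bound_position=(0, 0),
    timestamp=time.time(),
)
```

Storing the bound position alongside the predicate is what lets the cache manager later compare a segment's location against the user's current position and moving direction.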
LDD Query Processing
• Relationship between the query and the cache
  • If the query is fully contained in the cache
    • Answer the query from the cache
  • If the query is partly contained in the cache
    • Split the query into
      • The part satisfied by the cache (found by checking all segments in the cache)
      • The part not satisfied by the cache
    • Send only the unsatisfied part to the server
    • Coalesce all partial query results
    • Add the new query result to the cache
      • Segments must be decomposed to prevent duplicate cache segments
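The split step above can be sketched for the one-dimensional case: the part of a query interval that overlaps a cached segment is answered locally, and the remainder goes to the server. This is a simplified illustration with assumed names (`trim`, inclusive `(lo, hi)` intervals), not the paper's algorithm.

```python
# Minimal sketch of query trimming against one cached segment.
# The probe part is answered from the cache; the remainders are
# sent to the server. Intervals are inclusive (lo, hi) tuples.

def trim(query, segment):
    """Split `query` into (probe, remainders) against one cached segment."""
    qlo, qhi = query
    slo, shi = segment
    lo, hi = max(qlo, slo), min(qhi, shi)
    if lo > hi:                        # no overlap: everything goes to the server
        return None, [query]
    probe = (lo, hi)                   # overlap, answered from the cache
    remainders = []
    if qlo < lo:
        remainders.append((qlo, lo))   # left part not in the cache
    if hi < qhi:
        remainders.append((hi, qhi))   # right part not in the cache
    return probe, remainders
```

Repeating this trim over every segment in the cache yields the set of remainder queries that must be shipped to the server, after which the partial results are coalesced.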
LDD Cache Management
• Replacement principle
  • Incorporates the status of the mobile user
    • The moving direction
    • The distance from the cache segment
LDD Cache Management (cont'd)
• FAR algorithm
  • Divide the cache segments into
    • In Direction set: segments in the user's moving direction
    • Out Direction set: segments not in the user's moving direction
  • Choose the victim from the Out Direction set
  • If the Out Direction set is empty, choose the furthest segment in the In Direction set as the victim
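The FAR victim choice above can be sketched as follows. The geometry is a simplifying assumption (segments reduced to 2-D center points, direction as a vector, "in direction" decided by a dot product), and evicting the *furthest* Out Direction segment first is this sketch's interpretation; names are illustrative.

```python
# Sketch of the FAR victim selection: segments behind the user
# (Out Direction set) are evicted first; if none exist, evict the
# furthest segment ahead (In Direction set).

import math

def far_victim(segments, user_pos, direction):
    """segments: list of (segment_id, (x, y)) center points."""
    ux, uy = user_pos
    dx, dy = direction

    def ahead(pos):
        # Positive dot product => segment lies in the moving direction.
        return (pos[0] - ux) * dx + (pos[1] - uy) * dy > 0

    def dist(pos):
        return math.hypot(pos[0] - ux, pos[1] - uy)

    out_set = [s for s in segments if not ahead(s[1])]
    in_set = [s for s in segments if ahead(s[1])]
    if out_set:
        # Evict the furthest segment the user is moving away from.
        return max(out_set, key=lambda s: dist(s[1]))[0]
    # Out Direction set is empty: evict the furthest segment ahead.
    return max(in_set, key=lambda s: dist(s[1]))[0]
```

The intuition is that segments behind a moving user are the least likely to be queried again, so distance alone (as in plain LRU) is not the right eviction signal for LDD.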
Experiment
• Page caching vs. semantic caching (database neither indexed nor clustered)
  • Semantic caching performs better
    • Wireless network traffic is greatly reduced: only the required data is transferred
Experiment (cont'd)
• Page caching vs. semantic caching (cont'd) (index on x, column-wise scan clustering)
  • Page caching performs better than on the non-indexed database
    • The database no longer needs to be scanned to find a page
Experiment (cont'd)
• Page caching vs. semantic caching (cont'd) (index on x, column-wise scan clustering)
  • Page caching is sensitive to the organization of the database
Experiment (cont'd)
• Comparison of several replacement policies
  • FAR performs better than LRU or MRU
Conclusion
• Contribution
  • Proposes the semantic cache concept for mobile computing
• Weakness
  • The cache replacement policy: is it always possible to predict the user's movement direction?