
    1、Dynamic Mobile Edge Caching with Location Differentiation

    • Soaring mobile traffic has put high pressure on the paradigm of Cloud-based service provisioning, because moving a large volume of data into and out of the Cloud wirelessly consumes substantial spectrum resources and may meanwhile incur large latency. Mobile Edge Computing (MEC) emerges as a new paradigm to alleviate the capacity concern of mobile access networks.
    • In practice, however, the popularity profile of content is not only unknown, but also varying, since users' interests are constantly changing [13] and meanwhile new contents are being created.
    • it asymptotically approaches the optimal strategy in the long term. Extensive simulations based on real-world traces show that the proposed caching algorithm achieves better accuracy in hit-rate prediction and meanwhile adapts steadily to the popularity dynamics.
    • a set of ENs N = {1, 2, ..., N} is deployed with separate backhaul links connecting to the mobile core network. Each EN n is associated with a distinct location and has different characteristics in terms of content requests compared with the others. Online contents are dynamically pushed to ENs so that users' content requests can be processed with reduced latency. Each EN serves a disjoint set of mobile users.
    • we intend to perform dynamic content caching that constantly updates the files on ENs to achieve a higher long-term hit rate. To this end, contents with higher popularity at the respective locations should be proactively identified and cached, and meanwhile the less popular ones should be evicted (a minimal sketch of such an update loop follows this list).
    • For convenience, denote by F*_{n,t} the optimal caching strategy for EN n at time t.
    • we resort to the linear model
    • Hence, content refreshing can be performed during the off-peak period with minimized impact on the normal network activity
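
    The bullet points above outline a popularity-driven cache update at each EN. The sketch below is only an illustration of that idea, not the paper's algorithm: it uses a recency-weighted request count as the popularity predictor (the paper resorts to the linear model instead), and the names EdgeNodeCache, observe_slot and refresh are hypothetical.

    # Illustrative sketch only (hypothetical names; not the paper's algorithm):
    # each EN tracks recent per-content request counts, predicts near-term
    # popularity, and refreshes its cache with the predicted top-C contents,
    # e.g. during the off-peak period mentioned above.
    from collections import defaultdict, deque

    class EdgeNodeCache:
        def __init__(self, capacity, window=24):
            self.capacity = capacity              # number of files the EN can hold
            self.history = deque(maxlen=window)   # per-slot request counts
            self.cache = set()

        def observe_slot(self, requests):
            # Record one time slot of content requests (iterable of content ids).
            counts = defaultdict(int)
            for f in requests:
                counts[f] += 1
            self.history.append(counts)

        def predict_popularity(self):
            # Recency-weighted counts as a stand-in for the paper's linear predictor.
            scores = defaultdict(float)
            for age, counts in enumerate(reversed(self.history)):
                weight = 1.0 / (age + 1)          # newer slots weigh more
                for f, c in counts.items():
                    scores[f] += weight * c
            return scores

        def refresh(self):
            # Evict less popular contents and cache the predicted top-C ones.
            scores = self.predict_popularity()
            top = sorted(scores, key=scores.get, reverse=True)[:self.capacity]
            self.cache = set(top)

        def hit(self, f):
            return f in self.cache

    Running one such cache per EN reflects the location differentiation noted above: each EN refreshes from its own request history, so the cached sets can differ across locations.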

    2、Proactive Retention Aware Caching

    • Caching/replicating a popular file in different data centers minimizes the download cost
    • caching a file at multiple data centers incurs a storage cost at each of these data centers. This cost, which typically depends on the duration for which a piece of content is stored, has not been considered by previous works.
    • Right before retention expiration, the content is rewritten by virtue of a scrubbing algorithm at the expense of a P/E cycle at each write, thus causing substantial flash damage with each write/re-write.
    • Prior work [11] minimized the device damage for an isolated content-centric cache where the authors found that the optimal retention times are proportional to content popularity
    • Typically, the servers in such a cache network relay data to the leaf node either by employing a unicast transmission, such as in a wired network (e.g. data centers [1], [2]) or by multicasting data to all the leaf nodes at once.
    • higher request probability implies higher retention time
    • Owing to the temporal periodicity in user demand fluctuations
    • We assume that file request probabilities are known/can be predicted ahead of a time frame [4], [8], [9]. We assume that file m is requested at node n in each time slot independently with probability p_mn.
    • owing to periodicity in demands
    • If a file requested at slot t ∈ [T] is present in the cache, then it is served instantaneously with no additional cost; otherwise, the node forwards the request to the server. A server multicasts all the files that are requested in a slot, as the traffic is delay-intolerant. A multicast transmission of a file is received by all caches, including the ones that have not requested the file (see Figure 1).
    • Let y_mn ∈ [T] denote the retention duration, defined as the number of slots for which file m is stored at cache n starting from t = 0. Storing a file for duration y in a cache incurs a storage cost g(y) ∈ R+ (a small worked example of the resulting trade-off follows this list).
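
    To make the storage-versus-download trade-off above concrete, the sketch below brute-forces a per-file, per-cache retention duration for a single cache in isolation, ignoring the multicast coupling between caches; the function names, the quadratic storage cost and the unit miss cost are assumptions for illustration, not the paper's formulation.

    # Illustrative sketch only (not the paper's formulation): expected cost of
    # retaining a file at one cache for y slots. p is the per-slot request
    # probability p_mn, T the frame length, g the storage-cost function, and
    # c_miss the cost of fetching the file from the server on a miss.
    def expected_cost(y, p, T, g, c_miss=1.0):
        storage = g(y)                         # cost of keeping the file for y slots
        misses = c_miss * p * (T - y)          # expected requests after retention expires
        return storage + misses

    def best_retention(p, T, g, c_miss=1.0):
        # Brute-force the retention duration y in {0, ..., T} with minimal expected cost.
        return min(range(T + 1), key=lambda y: expected_cost(y, p, T, g, c_miss))

    if __name__ == "__main__":
        g = lambda y: 0.02 * y * y             # assumed convex storage cost
        for p in (0.1, 0.5, 0.9):
            print(p, best_retention(p, T=24, g=g))

    With this assumed convex storage cost the optimal retention grows with the request probability, consistent with the notes above that higher request probability implies higher retention time and that optimal retention times are proportional to content popularity [11].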
