Optimization Of Offloading Strategies In Mobile Cloud Computing

Mobile Cloud Processing

Cloud computing security has gathered a lot of attention in recent times, but some key areas of progress are still being neglected. One such area is dynamic offloading in mobile cloud computing (MCC).

With the right offloading strategy, the energy consumption and communication costs of cloud computing can be reduced. Furthermore, framing the offloading decision as a Markov decision process helps in deriving an optimal offloading policy.

Myths around cloud computing services

Several myths continue to surround the domain of cloud computing. The recent Snowden case has also stirred the hornet’s nest, with users doubting whether their data is even safe in the cloud.

The answer comes from security analysts and investors, who are the major stakeholders in the cloud computing industry.

While technologists may be able to offer some explanation about security, Artis Capital shows how venture capital firms can also provide a judicious analysis. The firm firmly believes that not all cloud computing services are the same; it is up to users to select the ones that succeed on the strength of better technology and more privacy-focused customization.

Focus on technological improvement

Cloud computing services and security can be improved if more attention is paid to solving processing-related tasks. Mobile cloud computing is one such area. In MCC, computationally demanding tasks are offloaded to the cloud, processed there, and the results are transmitted back to the mobile device.

To lower the energy consumption and cost of this process, three classes of tasks can be identified (a toy classification sketch follows the list):

1.  Tasks which can only be processed locally on the mobile device

2.  Tasks which can only be processed in the cloud

3.  Tasks which can be processed either in the cloud or on the device
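
Below is a minimal Python sketch of that three-way split. The Task fields, processing rates, and link rate are illustrative assumptions, not parameters from any particular system.

```python
from dataclasses import dataclass

# Hypothetical task model: the fields below are assumptions for illustration.
@dataclass
class Task:
    cpu_cycles: float      # computation demand
    data_bytes: float      # data to move to/from the cloud
    deadline_s: float      # latency requirement

def classify(task, local_cps=1e9, cloud_cps=1e10, link_bps=5e6):
    """Roughly sort a task into 'local', 'cloud', or 'either'.

    local_cps / cloud_cps: assumed processing rates (cycles per second).
    link_bps: assumed wireless link rate (bits per second).
    """
    local_time = task.cpu_cycles / local_cps
    cloud_time = task.cpu_cycles / cloud_cps + 8 * task.data_bytes / link_bps

    local_ok = local_time <= task.deadline_s
    cloud_ok = cloud_time <= task.deadline_s

    if local_ok and cloud_ok:
        return "either"    # class 3: can run on the device or in the cloud
    if cloud_ok:
        return "cloud"     # class 2: only feasible when offloaded
    return "local"         # class 1: keep on the device (or miss the deadline)

print(classify(Task(cpu_cycles=2e9, data_bytes=1e5, deadline_s=1.0)))  # -> "cloud"
```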

Using Markov Decision Processes (MDP)

An MDP can be employed to reduce the energy consumption as well as improve the efficiency of the entire offloading process. It lets us model the available options as single-server queues: local processing, and offloading to the cloud over a Wi-Fi hotspot or a cellular network.
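
As a toy illustration of this queueing view, the snippet below compares the mean response time of each option when it is modeled as an M/M/1 queue. The arrival and service rates are assumed values; the single-server-queue model in the actual work is more detailed.

```python
# Toy comparison of the three options, each modeled as an M/M/1 queue.
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time T = 1 / (mu - lambda) for a stable M/M/1 queue."""
    if arrival_rate >= service_rate:
        return float("inf")                # unstable queue, unbounded delay
    return 1.0 / (service_rate - arrival_rate)

# Assumed service rates in tasks/s for each option.
options = {
    "local":    2.0,   # processing on the device itself
    "wlan":     6.0,   # cloud via Wi-Fi hotspot, fast but intermittent
    "cellular": 4.0,   # cloud via cellular, always on but costly
}

arrival_rate = 1.5     # assumed offered load in tasks/s
for name, service_rate in options.items():
    print(name, round(mm1_response_time(arrival_rate, service_rate), 3))
```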

A policy is then designed to minimize a given cost function, such as the mean response time, based on the current state of the system and anticipated future tasks. The drawback in the current model is that user mobility makes WLAN hotspots only intermittently available, while the cellular connection remains costly.

To address this complication, a stochastic model can be derived for dynamic offloading. The idea is to capture the relevant performance metrics while treating intermittently available WLAN hotspots as access links. Once the tasks are queued, the analysis is carried out in the MDP framework, using the standard MDP solution techniques.
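
The sketch below sets up such an analysis on a deliberately small scale: a discrete-time, discounted-cost MDP whose state is the queue length together with WLAN availability, solved by standard value iteration. Every number (queue bound, arrival and hotspot probabilities, costs) is an assumption chosen for illustration, not a parameter from the study.

```python
import numpy as np

MAX_Q = 5                      # local queue truncated at 5 waiting tasks (assumed)
P_ARRIVAL = 0.4                # probability a new task arrives in a slot (assumed)
P_WLAN = 0.3                   # probability a WLAN hotspot is available (assumed)
ACTIONS = ("local", "wlan", "cellular")
ENERGY = {"local": 2.0, "wlan": 0.5, "cellular": 1.0}   # energy to serve one task
MONEY = {"local": 0.0, "wlan": 0.0, "cellular": 3.0}    # monetary offloading cost
HOLD = 1.0                     # holding (delay) cost per queued task per slot
GAMMA = 0.95                   # discount factor

def step_cost(q, action):
    """Delay cost of the backlog plus the cost of serving the head-of-line task."""
    service = ENERGY[action] + MONEY[action] if q > 0 else 0.0
    return HOLD * q + service

def value_iteration(iters=500):
    # State: (queue length, WLAN availability in {0, 1}).
    V = np.zeros((MAX_Q + 1, 2))
    for _ in range(iters):
        V_new = np.zeros_like(V)
        for q in range(MAX_Q + 1):
            for w in (0, 1):
                best = float("inf")
                for action in ACTIONS:
                    if action == "wlan" and w == 0:
                        continue              # cannot offload via WLAN without a hotspot
                    q_served = max(q - 1, 0)  # one task leaves if any is waiting
                    expected = 0.0
                    for arrival, p_a in ((1, P_ARRIVAL), (0, 1 - P_ARRIVAL)):
                        q_next = min(q_served + arrival, MAX_Q)
                        for w_next, p_w in ((1, P_WLAN), (0, 1 - P_WLAN)):
                            expected += p_a * p_w * V[q_next, w_next]
                    best = min(best, step_cost(q, action) + GAMMA * expected)
                V_new[q, w] = best
        V = V_new
    return V

print(value_iteration()[:, 1])   # optimal cost-to-go when a hotspot is present
```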

The Optimization

The optimization is done by deriving a new queueing-theoretic result, which allows the construction of near-optimal offloading policies. These policies take into account metrics such as delay, energy consumption, and offloading costs.
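
The actual queueing-theoretic result is not reproduced here, but the following sketch shows how a candidate policy, here a simple, assumed threshold rule, could be scored along those three metrics by simulation.

```python
import random

# Monte-Carlo scoring of a threshold rule: offload over WLAN whenever a hotspot
# is available, otherwise serve locally unless the backlog exceeds a threshold,
# in which case pay for the cellular link. The rule and all numbers below are
# illustrative stand-ins, not the policy derived in the work.
P_ARRIVAL, P_WLAN, THRESHOLD = 0.4, 0.3, 3
ENERGY = {"local": 2.0, "wlan": 0.5, "cellular": 1.0}
MONEY = {"local": 0.0, "wlan": 0.0, "cellular": 3.0}

def simulate(slots=100_000, seed=0):
    rng = random.Random(seed)
    q = served = 0
    backlog_sum = energy_sum = money_sum = 0.0
    for _ in range(slots):
        wlan = rng.random() < P_WLAN
        if q > 0:
            action = "wlan" if wlan else ("cellular" if q > THRESHOLD else "local")
            energy_sum += ENERGY[action]
            money_sum += MONEY[action]
            served += 1
            q -= 1
        if rng.random() < P_ARRIVAL:
            q += 1
        backlog_sum += q        # time-averaged backlog tracks mean delay (Little's law)
    return {
        "mean_backlog": backlog_sum / slots,
        "energy_per_task": energy_sum / max(served, 1),
        "offload_cost_per_task": money_sum / max(served, 1),
    }

print(simulate())
```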

In numerical experiments, these policies outperform existing ones by a clear margin, and the advantage persists even when additional cost structures are introduced.

Further research can target schemes that use jockeying, allowing queued tasks to be re-assigned later on.
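
A minimal sketch of what a jockeying step might look like, assuming tasks queued for the cellular link are moved to a WLAN queue when a hotspot appears; the queue contents and trigger condition are hypothetical.

```python
from collections import deque

# Illustrative jockeying step: when a WLAN hotspot comes up, tasks that were
# queued for the costly cellular link are re-assigned to the WLAN queue.
cellular_q = deque(["task-1", "task-2", "task-3"])
wlan_q = deque()

def jockey_on_hotspot(cellular_q, wlan_q, wlan_available):
    """Move waiting tasks from the cellular queue to the WLAN queue."""
    if wlan_available:
        while cellular_q:
            wlan_q.append(cellular_q.popleft())

jockey_on_hotspot(cellular_q, wlan_q, wlan_available=True)
print(list(wlan_q))   # ['task-1', 'task-2', 'task-3']
```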

Focusing more on the ‘solution’ aspect of cloud computing improves the efficiency of the system. With costs reduced on this side, more intensive security measures can be added without raising the overall cost of cloud computing solutions.