LOAD BALANCING OPTIMIZATION FOR RPL BASED EMERGENCY RESPONSE USING Q-LEARNING
Journal: MATTER: International Journal of Science and Technology (Vol. 4, No. 2)
Publication Date: 2018-07-15
Authors: A. Sebastian; S. Sivagurunathan
Pages: 74-92
Keywords: Internet of Things; RPL; Load Balancing Optimization; Disaster Response; Multi-Agent Q-Learning
Abstract
Internet of Things technology has given rise to Smart Cities, Smart Health, Smart Transport and Logistics, Smart Production and Supply Chain Management, Smart Homes, and many more. For IoT deployments, the ROLL Working Group has standardized the Routing Protocol for Low-Power and Lossy Networks (RPL) to meet urban routing requirements (RFC 5548). RPL is designed to address the needs of constrained IoT environments. RPL uses Objective Functions (based on metrics such as ETX and hop count) to optimize route selection, and researchers have proposed many new Objective Functions to optimize path selection for IoT applications. However, load balancing optimization for emergency response remains largely unexplored. In this article, we propose load balancing optimization for RPL-based emergency response using Q-learning (LBO-QL). We have tested the proposed model in Contiki OS with the Cooja simulator. The proposed model improves Packet Delivery Ratio, reduces traffic control overhead, and lowers power consumption. Hence, DODAG optimization using Q-learning is effective for disaster response operations, making optimized use of constrained resources with improved efficiency and reliability.
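The abstract describes Q-learning-driven parent selection in the RPL DODAG but does not give the formulation. As a minimal illustrative sketch (not the paper's exact model), each candidate parent can be treated as an action, with a reward that favors low link cost (ETX) and light load; the learning rate, discount factor, and reward shape below are assumptions for illustration only.

```python
import random

ALPHA = 0.5    # learning rate (illustrative value)
GAMMA = 0.8    # discount factor (illustrative value)
EPSILON = 0.1  # exploration probability for epsilon-greedy selection

def reward(etx, load):
    # Hypothetical reward: lower ETX and lower parent load score higher.
    return 1.0 / (etx + load)

def choose_parent(q, parents, epsilon=EPSILON):
    # Epsilon-greedy: mostly exploit the best-known parent, sometimes explore.
    if random.random() < epsilon:
        return random.choice(parents)
    return max(parents, key=lambda p: q.get(p, 0.0))

def update(q, parent, r):
    # One-step Q-update: Q(p) <- Q(p) + alpha * (r + gamma * max Q - Q(p)).
    best_next = max(q.values(), default=0.0)
    old = q.get(parent, 0.0)
    q[parent] = old + ALPHA * (r + GAMMA * best_next - old)
```

Repeatedly choosing a parent, observing its ETX and load, and calling `update` drives traffic toward less congested parents, which is the load-balancing effect the abstract reports.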