Authors: Vijayasekaran, G.; Duraipandian, M.
Date accessioned/available: 2023-05-01
Date issued: 2022
Citation: Vijayasekaran, G. Resource scheduling in edge computing IoT networks using hybrid deep learning algorithm / G. Vijayasekaran, M. Duraipandian // Системні дослідження та інформаційні технології : міжнародний науково-технічний журнал. – 2022. – № 3. – pp. 86-101. – Bibliography: 25 titles.
URI: https://ela.kpi.ua/handle/123456789/55153
DOI: https://doi.org/10.20535/SRIT.2308-8893.2022.3.06
Title: Resource scheduling in edge computing IoT networks using hybrid deep learning algorithm
Type: Article
Language: en
Pages: pp. 86-101
UDC: 519-62
Keywords: edge computing; cloud computing; Internet of Things (IoT); resource scheduling; deep learning

Abstract: The proliferation of the Internet of Things (IoT) and wireless sensor networks enhances data communication. The demand for data communication is rapidly increasing, which calls for the emerging edge computing paradigm. Edge computing plays a major role in IoT networks and provides computing resources close to the users. Moving services from the cloud closer to users improves their communication, storage, and networking capabilities. However, massive IoT networks require a wide range of resources for their computations. To meet this demand, resource scheduling algorithms are employed in edge computing. Statistical and machine learning-based resource scheduling algorithms have evolved over the past decade, but their performance can be improved if resource requirements are analyzed further. This research work presents a deep learning-based resource scheduling approach for edge computing IoT networks using deep bidirectional recurrent neural network (BRNN) and convolutional neural network algorithms. Before scheduling, the IoT users are categorized into clusters using a spectral clustering algorithm. Simulation analysis of the proposed model verifies its performance in terms of delay, response time, execution time, and resource utilization. Existing resource scheduling algorithms such as the genetic algorithm (GA), Improved Particle Swarm Optimization (IPSO), and LSTM-based models are compared with the proposed model to validate its superior performance.
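
Note: The abstract describes a two-stage pipeline, spectral clustering of IoT users followed by a hybrid BRNN/CNN scheduler. The Python sketch below is only a hypothetical illustration of such a pipeline, not the authors' implementation; the feature dimensions, network sizes, synthetic data, and the use of scikit-learn and Keras are assumptions for demonstration.

# Hypothetical sketch (not the authors' code): group IoT users by their
# resource-demand profiles, then score candidate edge nodes with a hybrid
# bidirectional-RNN + CNN model.
import numpy as np
from sklearn.cluster import SpectralClustering
import tensorflow as tf

rng = np.random.default_rng(0)

# Assumed synthetic data: 200 IoT users, each with a 6-feature demand vector
# (e.g., CPU, memory, bandwidth, deadline, task size, arrival rate).
user_profiles = rng.random((200, 6))

# Stage 1: spectral clustering groups users with similar demand patterns.
clusters = SpectralClustering(
    n_clusters=4, affinity="nearest_neighbors", n_neighbors=10, random_state=0
).fit_predict(user_profiles)

# Stage 2: a hybrid model; a bidirectional LSTM branch reads a short history of
# per-user resource requests, a 1-D convolutional branch extracts local patterns,
# and a dense head assigns a probability to each candidate edge node.
history_len, n_features, n_edge_nodes = 10, 6, 8
inputs = tf.keras.Input(shape=(history_len, n_features))
rnn = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(32, return_sequences=True)
)(inputs)
cnn = tf.keras.layers.Conv1D(32, kernel_size=3, padding="same", activation="relu")(inputs)
merged = tf.keras.layers.Concatenate()([rnn, cnn])
pooled = tf.keras.layers.GlobalAveragePooling1D()(merged)
scores = tf.keras.layers.Dense(n_edge_nodes, activation="softmax")(pooled)
model = tf.keras.Model(inputs, scores)
model.compile(optimizer="adam", loss="categorical_crossentropy")

# Training pairs would come from an offline optimizer or measured placements;
# random placeholders are used here purely to make the sketch runnable.
x = rng.random((500, history_len, n_features)).astype("float32")
y = tf.keras.utils.to_categorical(rng.integers(0, n_edge_nodes, 500), n_edge_nodes)
model.fit(x, y, epochs=1, batch_size=32, verbose=0)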