Deep Learning-based Concurrent Resource Allocation for Enhancing Service Response in Secure 6G Network-in-Box Users using IIoT
Network-in-box (NIB) architectures are designed to improve communication and information sharing in an ad-hoc manner with limited infrastructure support. These architectures are interoperable and are hence capable of providing services based on sixth-generation (6G) communication technologies. Resource allocation for massive machine-type communications on the 6G platform supporting NIB architectures is challenging due to its terahertz and high-throughput features. In particular, determining the capacity available for serving user requests is difficult. In this manuscript, an attuned slicing-dependent concurrent resource allocation (AS-CRA) method is formulated to improve the service reliability of 6G users in the NIB architecture. Learning-assisted slicing and a concurrent resource allocation process are jointly exploited in the proposed method to improve users' service reliability. The output of the learning process is used to classify resource allocations and map users irrespective of the limited NIB infrastructure support. Virtualization and concurrency in resource allocation are balanced based on user capacity and the network blocking rate to achieve optimal service responses. The performance of the proposed resource allocation method is verified through simulations using the following metrics: capacity (89.726%), latency (81.32%), resource utilization rate (0.963%), response ratio (92.309), and blocking rate (0.047%).
IEEE Internet of Things Journal