Title: Artificial intelligence performance evaluation for URLLC of industrial IoT applications: A review, open challenges and future directions

Authors: Abdinasir Hirsi; Lukman Audah (ORCID: 0000-0001-8880-6153); Adeb Salh; Mohammed Alhartomi; Zhili Sun; Ahmed Hammoodi; Salman Ahmed

Date available: 2025-09-12
Date issued: 2025-10
DOI: 10.1016/j.phycom.2025.102712
URI: https://dspace-cris.utar.edu.my/handle/123456789/11336
Type: Review article
Language: en-US

Abstract: The integration of fifth-generation/sixth-generation (5G/6G) ultra-reliable low-latency communication (URLLC) with industrial Internet of Things (IIoT) applications is revolutionizing Industry 4.0 and enhancing IIoT performance through artificial intelligence (AI) simulations. The critical need for low latency and high reliability in IIoT devices can be effectively addressed by leveraging AI techniques that optimize data processing and enable real-time decision-making. Traditional methods achieve some level of efficiency and safety, but AI offers significant improvements in decision-making, safety, quality prediction, and employee adoption. Integrating AI into IIoT applications enhances industrial workflows while presenting both opportunities and challenges. Machine learning (ML) and deep learning (DL) algorithms enable industrial applications to operate efficiently and intelligently. This paper outlines the requirements for reliable, low-latency communication links between IIoT devices and the primary research areas where AI algorithms can be employed, such as fault diagnosis, intelligent anomaly detection, edge computing, network performance, and intrusion detection systems in IIoT applications. Special attention is paid to the role of AI techniques in enhancing IIoT system performance and efficiency, highlighting their advantages, applications, and challenges. The current state-of-the-art challenges and future directions of AI in IIoT are discussed, providing insights for further research. Potential areas for further research include developing new techniques, integrating 5G/6G technologies, autonomous decision-making, self-optimization, addressing mission-critical applications, and shifting AI processing to the edge. This comprehensive review will benefit academics, researchers, professionals in AI and IIoT, and industries seeking to leverage AI technologies to enhance IIoT performance and efficiency. © 2025 Elsevier B.V.

Keywords: Artificial intelligence; Deep learning; IIoT applications; Industrial IoT; Machine learning; URLLC; Anomaly detection; Decision making; Edge computing; Efficiency; Fault detection; Industrial Internet of Things (IIoT); Industry 4.0; Intrusion detection; Learning algorithms; Learning systems; Optimization; Artificial intelligence techniques; Low-latency communication; Performance; Performance evaluation; Ultra-reliable low-latency communication