
Guest Editorial Security and Privacy of Federated Learning Solutions for Industrial IoT Applications / M. Shojafar, M. Mukherjee, V. Piuri, J. Abawajy. - In: IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS. - ISSN 1551-3203. - 18:5 (2022 May), pp. 3519-3521. [10.1109/tii.2021.3128972]

Guest Editorial Security and Privacy of Federated Learning Solutions for Industrial IoT Applications

V. Piuri (penultimate author)
2022

Abstract

The Industrial Internet of Things (IoT) typically consists of several thousand heterogeneous devices, such as sensors, actuators, access points, machinery, end-users' handheld equipment, and supply-chain equipment. In such an industrial environment, massive volumes of data are generated by these IoT devices, e.g., sensors monitoring the environment, reading temperature, and gauging pressure. Most of these data come from delay-sensitive and computation-intensive applications, such as real-time manufacturing and automated diagnostics, which require big data analytics with low latency. Machine learning (ML) has proven to be an efficient solution for big data analytics. The majority of ML algorithms are centralized methods: they first gather data from different users into a training dataset placed on an ML server, and then build a model that classifies new data samples by applying the ML algorithms to this training dataset. However, access to these datasets in centralized ML methods raises data-privacy concerns for users. Federated learning (FL) was designed to address part of these issues by protecting data privacy. In FL, each participant trains a shared global model locally without uploading its private data to a third-party server. Compared with conventional ML, FL can preserve data security, especially with respect to participant data during the learning process. In particular, FL can also update the server-side global model without requiring participants to provide their data. However, in FL, individual computing units may exhibit abnormal behavior, such as faulty software, hardware intrusions, unreliable communication channels, and maliciously crafted samples that poison the model. To mitigate these challenges, robust policies are required to control the learning phases in FL.
Motivated by the aforementioned issues, this special section solicits original research and practical contributions that advance the security and privacy of FL solutions for industrial IoT applications, as follows.
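The local-training-plus-aggregation loop described in the abstract can be sketched as follows. This is a minimal illustration, assuming linear least-squares clients and FedAvg-style weighted averaging; the function names `local_update` and `federated_round` are illustrative, not taken from the editorial:

```python
import numpy as np

def local_update(global_weights, data, labels, lr=0.1, epochs=5):
    """One participant trains a local linear model starting from the
    global weights; only the updated weights leave the device."""
    w = global_weights.copy()
    for _ in range(epochs):
        preds = data @ w
        grad = data.T @ (preds - labels) / len(labels)  # MSE gradient
        w -= lr * grad
    return w

def federated_round(global_weights, clients):
    """Server-side aggregation: a weighted average of the client
    updates (FedAvg); the server never sees the raw client data."""
    total = sum(len(labels) for _, labels in clients)
    updates = [local_update(global_weights, X, y) * (len(y) / total)
               for X, y in clients]
    return np.sum(updates, axis=0)

# Toy demo: two clients hold disjoint private datasets drawn from
# the same underlying linear model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(30):
    w = federated_round(w, clients)
# After enough rounds, w approaches true_w even though neither
# client ever shared its data with the server.
```

A real industrial deployment would add exactly the safeguards this special section targets, e.g., validating client updates before aggregation to resist the poisoned samples mentioned above.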
Settore ING-INF/05 - Information Processing Systems
Settore IINF-05/A - Information Processing Systems
May 2022
https://hdl.handle.net/2434/1211955
Article (author)
Files in this record:
File: Guest_Editorial_Security_and_Privacy_of_Federated_Learning_Solutions_for_Industrial_IoT_Applications.pdf
Access: restricted
Type: Publisher's version/PDF
License: no license
Size: 414.38 kB
Format: Adobe PDF
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/1034408
Citations
  • PMC: ND
  • Scopus: 7
  • Web of Science: 5
  • OpenAlex: ND