
Multiple Choice

Which of the following options correctly identifies a privacy-preserving ML technique that operates on decentralized devices with local data and shares only updates?

- Federated learning
- Differential privacy
- Secure enclaves
- Homomorphic encryption

Correct answer: Federated learning

Explanation:

Federated learning is the approach described. It trains the model directly on each device using its own local data, and then only shares the resulting updates (like gradients or weight changes) with a central aggregator. The aggregator combines these updates to improve the global model, which is then sent back to devices for another round. This keeps raw data on-device, reducing exposure and bandwidth needs, while still enabling collaborative model improvement. Privacy can be further enhanced with secure aggregation or differential privacy, but the defining idea is decentralized training with updates exchanged rather than data.

Differential privacy is a privacy guarantee applied to data or outputs, not the fundamental workflow of distributed model training. Secure enclaves provide isolated hardware for computation but don’t inherently describe decentralized devices sharing only updates. Homomorphic encryption allows computation on encrypted data but is a cryptographic technique rather than the standard federated training setup.
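The round structure described above (local training, sharing only weight deltas, central averaging) can be sketched in a few lines. This is a minimal illustration of federated averaging on a toy one-weight linear model; the function names, learning rate, and data are all hypothetical, not a real federated framework.

```python
# Toy federated averaging sketch: each client fits y = w * x on its own
# private samples and shares only a weight update, never the raw data.

def local_update(global_weight, local_data, lr=0.1):
    """On-device step: one least-squares gradient step; returns only the delta."""
    grad = sum(2 * x * (global_weight * x - y) for x, y in local_data) / len(local_data)
    return -lr * grad  # weight change leaves the device, the data does not

def aggregate(updates):
    """Central aggregator: average the client updates into one global update."""
    return sum(updates) / len(updates)

def federated_round(global_weight, client_datasets):
    updates = [local_update(global_weight, data) for data in client_datasets]
    return global_weight + aggregate(updates)

# Two clients, each holding private samples of y = 2x (data stays local).
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(50):
    w = federated_round(w, clients)
print(round(w, 2))  # converges toward the true slope 2.0
```

In a real deployment the "update" would be a gradient or weight vector for a neural network, and the aggregator could apply secure aggregation or add differential-privacy noise before averaging, as the explanation notes.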
