Firms that provide personalised services based on how other users behave put users' personal information at risk, even if hackers never gain direct access to the database. An attacker might observe an algorithm's output for real users and then reverse-engineer that output to infer a person's characteristics.
One strategy to combat this is to add random noise to an algorithm's output, so that an individual user's data cannot be inferred precisely.
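A minimal sketch of this idea, modeled on the Laplace mechanism from differential privacy (the function names and parameters here are illustrative, not from the source):

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_score(true_score, epsilon=1.0, sensitivity=1.0):
    """Release an algorithm's output with calibrated noise added.

    Smaller epsilon means more noise and stronger privacy;
    sensitivity is how much one user's data can shift the output.
    """
    return true_score + laplace_noise(sensitivity / epsilon)

# A personalised recommendation score is perturbed before release,
# so an attacker observing it learns little about any one user.
released = noisy_score(10.0, epsilon=1.0)
```

Each released value is randomly perturbed, so individual outputs become unreliable for reverse-engineering, while averages over many users remain approximately correct.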