Want a Free Amazon Halo Wearable? Just Hand Over Your Data to This Major Insurance Company

Insurance has always been one of the most obvious candidates for the use of tracking data. John Hancock, a life insurer, is partnering with Amazon to give away Halo health and fitness tracker wristbands in exchange for data, which will be used to adjust insurance premiums (through discounts and rewards).

https://onezero.medium.com/want-a-free-amazon-halo-wearable-just-hand-over-your-data-to-this-major-insurance-company-56b6430b0749

The article also mentions the potential for creating economic inequities. Privacy becomes more expensive.


Exactly. I generally find it hard to convince people to care about their privacy because they feel mass surveillance doesn't affect them (they aren't important enough).

But when I bring up the topic of insurance, they generally start to see how this may impact them. So I'm not surprised this is happening, especially since the USA doesn't have significant privacy laws.

Do you know if this is also happening in the EU?

I don't know about Europe; I've only read this article. I suppose it could happen there on the basis of consent, since all of this is explicit. I don't know whether other laws, not related to privacy, could also play a role.

I've seen that argument about not being important enough to be affected by mass surveillance, but yes, this kind of automated assessment, AI algorithms, etc., shows that you don't have to be special to be targeted. It's not as if a committee meets in a dark room, analyzes your file, and decides to go after you, which seems to be what people have in mind when they use that reasoning.

Anyway, I know it's somewhat pointless to say this on this forum, where people do worry about privacy. I also find it hard to convince others that it's an important issue.


I think there are transparency laws that require explainability of results, which is something black-box algorithms (like the most common artificial neural nets) can't provide.

Yup. Out of sight, out of mind. There's a nice paper somewhat related to this, about the steps one has to take to start caring about their privacy, called "Why Doesn't Jane Protect Her Privacy?"



Thanks for the paper, that's great.

About explainability, I think in this case the rules can be very simple: hit some exercise goals and get a discount. I was wondering about other things, like insurance regulations, but I have no idea.


https://epic.org/alert/epic_alert_27.13.html#4._

" Amazon Claims ā€˜Haloā€™ Device Will Monitor Userā€™s Voice for ā€˜Emotional Well-Beingā€™

Despite the exceptional privacy risks of biometric data collection and opaque, unproven algorithms, Amazon last week unveiled Halo, a wearable device that purports to measure ā€œtoneā€ and ā€œemotional well-beingā€ based on a userā€™s voice. According to Amazon, the device ā€œuses machine learning to analyze energy and positivity in a customerā€™s voice so they can better understand how they may sound to others[.]ā€ The device also monitors physical activity, assigns a sleep score, and can scan a userā€™s body to estimate body fat percentage and weight. In recent years, Amazon has come under fire for its development of biased and inaccurate facial surveillance tools, its marketing of home surveillance camera Ring, and its controversial partnerships with law enforcement agencies. Last year, EPIC filed a Federal Trade Commission complaint against Hirevue, an AI hiring tool that claims to evaluate ā€œcognitive ability,ā€ ā€œpsychological traits,ā€ and ā€œemotional intelligenceā€ based on videos of job candidates. EPIC has long advocated for algorithmic transparency and the adoption of the Universal Guidelines for AI."
