Discovery Group has always been a company obsessed with measurement. Steps, heart rate, blood pressure, driving style – every data point a proxy for how “good” we’re being. Now it wants to add another metric to that ledger: how we sleep.
Its new initiative, announced at a media event at its palatial head office in Sandton on Tuesday, will link wearables to its Vitality rewards ecosystem, allowing members to earn points or benefits for a healthy night’s rest. On the surface, it’s clever use of technology. Sleep is the new frontier in wellness – studies link a lack of it to everything from obesity to depression – and the idea of incentivising rest fits neatly into Discovery’s behavioural-economics playbook that has been key to much of its success.
The company’s logic is elegant: if you can get people to value sleep as much as they do steps or kale smoothies, you might bend long-term health outcomes in a cheaper, preventive direction.
But it also brings us to a deeper question about what kind of society we’re building when one of the most intimate corners of our lives – our bedrooms – becomes a data point.
Discovery’s model has always depended on surveillance dressed as self-improvement. The app rewards the jogger, discounts the broccoli eater and penalises the heavy-footed driver. You volunteer to be watched because the deal sounds good.
The trouble is that constant monitoring doesn’t stay in its lane. Once a company can track your sleep, why stop there? Why not measure stress from your smartwatch, detect mood swings from voice patterns, infer relationships from location data?
Equities analyst Irnest Kaplan made a good point to me on X on Wednesday: while noting that Discovery isn’t tracking its customers for anything “sinister” (I agree fully), he observed that sleep may be a deeply contextual – and therefore unfair – metric to reward.
Asymmetry
A parent with a newborn or a resident in a crime-plagued suburb may sleep poorly through no fault of their own. Yet under a rewards-based system, they’re penalised for circumstances beyond their control. Taken to its conclusion (mine, not Kaplan’s), those most likely to lose sleep over safety or poverty are least able to earn the perks meant to make them “healthier”.
Discovery will say participation is voluntary. But how voluntary is it when your premiums and perks hinge on compliance? Behavioural nudges can morph into quiet coercion.
By the way, we’ve already normalised this stuff: we trade privacy for convenience every day – sharing our steps with Apple, our locations with Google, and our moods with Meta and X. Surveillance capitalism has taught us to see tracking as “care”. But data collection, no matter how benevolent the intent, creates asymmetry: the company learns everything about us; we learn almost nothing about how that information is used, who accesses it or what happens if it’s breached in a cyberattack.
We’ve already lived this once. Social media promised connection; it delivered addiction and polarisation. In exchange for convenience, we handed Silicon Valley the blueprint of our personal lives – our interests, our friends and our politics. The result? An attention economy where outrage has become the fastest route to profit.
Health surveillance risks replaying this same pattern. But instead of selling outrage, it sells virtue. The mechanism is the same: track, quantify and monetise human behaviour. The danger lies in mistaking that for “care”. It’s nothing of the sort; it’s a business model designed to maximise shareholder returns.
It’s not hard to imagine where this road could lead, especially if governments begin to implement this sort of tracking technology. Contracting with a private company voluntarily is one thing; having a government do it is quite another. And it’s already happening.
China’s social-credit experiments offer a hint: the country uses vast data – from financial records to online behaviour – to rate citizens and businesses on “trustworthiness”. High scores can mean perks like easier loans; low scores can restrict travel or employment.
Critics say it enforces conformity and state control, turning surveillance into a mechanism of social discipline.
South Africa is a democracy; China isn’t. But even in democracies, the appetite for personal data is growing – think pandemic contact-tracing apps or smart-city sensors or citywide, AI-powered CCTV networks. The line between protection and intrusion blurs fast.
We should not pretend South Africa is immune. High crime rates create fertile ground for “safety tech” that quietly erodes privacy in the name of security. Insurance incentives could easily spill into public policy: safer drivers get lower premiums today; tomorrow, maybe better licence renewals or tax rebates?
Beyond privacy, there’s a moral discomfort in delegating judgment to data.
Who is really in control?
Rewarding sleep as a metric of virtue risks penalising the vulnerable – the naturally anxious person, the double-shift worker, the parent tending to a sick baby. Algorithms don’t see that context. They only see compliance.
None of this is to say Discovery is acting in bad faith. Its approach has improved millions of lives and arguably shifted the health insurance model in a positive direction. But scale matters. When a private company’s influence over daily behaviour becomes so deep that it shapes when we sleep, eat and move, we should pause to ask who is really in control.
Discovery’s idea may yet get people to sleep better. But it is also a good idea to be wide awake to the risks of too much surveillance in our lives. – © 2025 NewsCentral Media