privacy · data · subscriptions · mental health apps · anonymity · tech ethics

We are not data-whores

6 October 2026 · 9 min read

When you open a mental health or wellness app and begin entering data about your anxiety, your drinking, your mood, your sleep, your medication — you are creating something extremely valuable. Not to you. To the company that built the app.

You are creating a detailed, longitudinal, deeply personal psychological profile, attached to your name and email address, stored on servers you have no visibility into, governed by a privacy policy written by lawyers whose job is to maximise the company's latitude while minimising your legal recourse.

What happens to that profile depends on the company's business model. The most honest version is that it gets used to target you with advertising. A less honest version is that it gets sold to data brokers who aggregate it with your other digital footprints. The most concerning version is that it gets subpoenaed in legal proceedings, requested by an insurer, or exposed in a breach — and because it has your name on it, there is no ambiguity about whose crisis it describes.

This is not a hypothetical. It is the current operating model of the wellness app industry.

How the model works

The wellness app economy runs on a specific and largely invisible transaction. The user believes they're paying for a service — a meditation programme, a mood tracker, a drinking diary. What they're actually providing is much more valuable than money: intimate, sensitive, longitudinal data about their psychological state.

This data is worth money in several ways.

Advertising targeting. A user who has disclosed anxiety, disrupted sleep, and elevated alcohol consumption is a highly targetable consumer for pharmaceutical advertisers, supplement brands, insurance products, and alcohol brands that have licensed specific audience segments from the data platform. The targeting can be inferred rather than explicit — "people in this mood cluster respond to this ad category" — which keeps the operation at arm's length from the data it's exploiting.

Data brokerage. Health data, including mental health and substance use data, is bought and sold in markets that have minimal regulatory oversight in most jurisdictions. The privacy policy that says "we may share data with third-party partners" is the disclosure. The third-party partners are data brokers. The data ends up aggregated with your location data, your purchase history, your browsing behaviour, and sold onward.

Research and insurance. Several large wellness and health app companies have partnerships with insurers and pharmaceutical companies under research agreements. The data flowing through these agreements is disclosed in the privacy policy under language like "for research and analytics purposes." The research purposes can include actuarial risk modelling — assessing whether your disclosed mental health data correlates with insurance risk categories.

Legal exposure. When law enforcement or legal proceedings require data, companies that hold identified records are compelled to produce them. A wellness app that holds your name, email, and two years of mood and substance use data will hand that over when legally required to. The privacy policy will note this, in the same tone and font as everything else.

The subscription model compounds this. Monthly or annual billing requires a payment relationship — name, billing address, credit card details — attached to the account. The payment data may be processed by a third party, but the account relationship is held by the company. The identified user is now a subscriber whose mental health data is attached to a billing profile that includes verified personal information.

The specific vulnerability of substance use data

Mental health data is sensitive. Substance use data is in a category of its own.

Disclosed substance use can affect insurance premiums and coverage decisions. It can affect employment — directly through drug testing policies and indirectly through the data trails that inform background check providers. It can affect child custody proceedings. It can affect visa and immigration applications. It can affect professional licensing in regulated industries. In some jurisdictions, it can affect criminal proceedings.

A person who uses a substance use tracking app that holds their name, email, and a detailed record of their alcohol and drug use has created a document that could be adversarial to their interests in any of these contexts. They did so in good faith, seeking to understand themselves better. The potential for harm from that document exists regardless of the company's intentions, because the document is identifiable and it exists.

The standard privacy policy response to this is reassurance: "we take your privacy seriously," "we use industry-standard security," "we will notify you in the event of a breach." These assurances are not nothing. They are also not a structural guarantee. They are policy-level commitments that exist in tension with commercial incentives and that have, repeatedly, proven insufficient when those incentives are significant or when security is breached.

What "architecturally impossible" means and why it's different

ayodee was built around a different premise: that the safest data is data that doesn't exist.

Registration requires only a passphrase — four words, chosen by you, hashed before storage using a one-way cryptographic function. A hashed passphrase cannot be reversed to recover the original. It cannot be used to identify you. It is not a credential that reveals anything about who you are.
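
To illustrate the technique: ayodee's actual key-derivation function and parameters aren't published here, so the choice of scrypt and the cost settings below are assumptions, but a server-side derivation of this kind might look like this:

```typescript
import { scryptSync, timingSafeEqual } from "node:crypto";

// Illustrative sketch only. A deliberately slow, one-way KDF (scrypt here)
// turns the four-word passphrase into a fixed-length hash. An app-wide salt
// keeps the derivation deterministic, so the hash can double as the account
// lookup key; the cost parameters make brute-forcing passphrases expensive.
const APP_SALT = "app-wide-salt-value"; // assumption: a fixed, secret salt

function derivePassphraseHash(passphrase: string): Buffer {
  const normalised = passphrase.trim().toLowerCase();
  return scryptSync(normalised, APP_SALT, 32, { N: 16384, r: 8, p: 1 });
}

// Login: re-derive from the entered passphrase and compare in constant time.
function passphraseMatches(entered: string, storedHash: Buffer): boolean {
  return timingSafeEqual(derivePassphraseHash(entered), storedHash);
}
```

The property the paragraph above describes is exactly what the one-way function provides: given the stored value, no computation recovers the original words, and nothing in the value encodes who typed them.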

No name is collected. No email address is required — it's optional, used only if you choose to receive a magic link for login or diary reminders, and is not linked to any identifying profile. No date of birth. No address. No device identifier attached to a named account. No payment relationship linked to the app account — payment is processed separately by Stripe, and the code that extends your access is emailed to the address you provided for payment. That email address is never associated with your app data.

The consequence of this architecture is specific: there is no record that could be produced in response to a legal order that would identify you as the person who created it. Not because we would refuse to comply with a court order — we would comply — but because compliance would produce a passphrase hash, a set of diary entries, and a set of mood scores, with no name or identifier attached to any of them. The data exists. The person it belongs to is unknown to us.
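
To make that concrete, here is a hypothetical sketch of the only record shapes such a system would hold. The field names are illustrative, not ayodee's actual schema:

```typescript
// Hypothetical record shapes consistent with the description above.
// Note what is absent: no name, no email, no device ID, no payment ID.
interface Account {
  passphraseHash: string;  // one-way hash; the only key to the account
  accessExpiresAt: string; // ISO date, extended when a code is redeemed
}

interface DiaryEntry {
  accountHash: string;     // links the entry to a hash, not a person
  date: string;            // ISO date of the entry
  moodScore: number;
  sleepHours: number;
  note: string;            // free-text diary content
}
// A production order naming an individual has no field to match against:
// nothing in either record can be joined to a name, email, or address.
```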

This is not a privacy policy. It is a structural fact about what the system was built to hold.

The subscription model and what it requires

Most wellness apps are subscription businesses. Monthly or annual recurring revenue is the standard model — and it requires a billing relationship with the user that is, by definition, identified. You cannot charge a subscription to an anonymous user. The subscription model and genuine anonymity are structurally incompatible.

ayodee uses a pay-as-you-go model precisely because of this. Your first 14 diary entries are free. Access beyond that is extended by purchasing a code — 1 month, 3 months, or 12 months — through a separate Stripe payment flow. The code is emailed to you and entered in the app. The email address used for the payment is never associated with your diary data. The Stripe transaction is a one-off purchase, not a recurring billing relationship.
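
A sketch of how that decoupling can work in practice follows; the stores, helper logic, and code format are assumptions for illustration, not ayodee's implementation:

```typescript
import { randomBytes } from "node:crypto";

// In-memory stand-ins for the real stores, for illustration only.
const codes = new Map<string, number>();       // code -> access days
const accessUntil = new Map<string, number>(); // passphrase hash -> expiry (ms)

const PLAN_DAYS = { "1m": 30, "3m": 90, "12m": 365 } as const;

// Payment side: runs on payment success (e.g. from a Stripe webhook). The
// buyer's email is used once to deliver the code and is never stored with it.
function issueAccessCode(plan: keyof typeof PLAN_DAYS): string {
  const code = randomBytes(8).toString("hex"); // assumed code format
  codes.set(code, PLAN_DAYS[plan]);
  return code; // emailed to the payment address, then forgotten
}

// App side: the user redeems the code against their passphrase-hash account.
// The two flows never share an identifier, so the payment email and the
// diary data cannot be joined.
function redeemCode(code: string, accountHash: string): void {
  const days = codes.get(code);
  if (days === undefined) throw new Error("invalid or already-used code");
  codes.delete(code); // single use
  const from = Math.max(accessUntil.get(accountHash) ?? 0, Date.now());
  accessUntil.set(accountHash, from + days * 86_400_000);
}
```

The design point is that the code itself is the only thing that crosses between the two flows, and it carries a duration, not an identity.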

This model costs something in revenue predictability — subscriptions produce smoother cash flow than one-off purchases. That's a deliberate choice, made because the alternative requires knowing who you are, and we specifically don't want to know who you are.

The economics also work differently for the user. There is no subscription to forget to cancel. There is no annual charge that appears on your statement in a category you'd prefer wasn't there. There is no renewal that continues after you've stopped using the app. You pay for the access period you want, when you want it, and nothing else happens.

Why US server location doesn't undermine this

The servers that store ayodee data are US-based — Vercel for the application layer, Railway for the database. US jurisdiction applies.

US law includes legal mechanisms — court orders, law enforcement requests, national security letters — that can compel production of user data. This is real and worth understanding clearly.

What it doesn't change is the architectural point: what can be produced is diary entries and mood scores attached to a passphrase hash, with no identifying information linking that data to a named person. A court order compelling ayodee to produce "all data related to [named individual]" would produce nothing, because the system has no way to retrieve data by named individual. A court order compelling production of all diary entries would produce a database of diary entries with no names attached.
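
In code terms, reusing the hypothetical record shape sketched earlier, the only selection the system supports is by hash:

```typescript
type Entry = { accountHash: string; date: string; note: string };

// The only retrieval path: re-derive the hash from the passphrase the user
// alone knows (see the KDF sketch earlier), then select by that hash.
function fetchMyEntries(hashOfPassphrase: string, all: Entry[]): Entry[] {
  return all.filter((e) => e.accountHash === hashOfPassphrase);
}

// The counterpart a court order would need cannot be written: Entry has no
// name or email field, so "all data related to [named individual]" has
// nothing to match and selects the empty set.
```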

This is different from the position of an app that holds identified records and is legally compelled to produce them. The compellable data is not the same. The risk to the user is not the same.

What we're actually selling

The wellness app industry has discovered that people will pay — in money and in data — for the feeling of being helped with their most vulnerable concerns. The product that's being sold is the feeling. The data that's being collected is the real transaction.

ayodee sells access to a self-monitoring tool built on evidence-based principles. The data you generate belongs to you — not in the aspirational sense that wellness companies use that phrase, but in the structural sense that we can't access it in a form that would be useful to anyone who wanted to use it against you.

The price of access is money. A modest, one-off amount of it. The data is not part of the transaction.

This is not the dominant model. It is deliberately not the dominant model. The dominant model works very well for the companies running it and very poorly for the people using it — particularly when those people are entering the kind of sensitive, personal, potentially adversarial information that a substance use diary contains.

You should know what you're entering, and where, and what happens to it. The answer here is: a diary entry, on a server in the US, attached to nothing that identifies you, recoverable by no one who doesn't have your passphrase, owned by you in the only sense of ownership that matters.


ayodee is a 90-second daily diary for substance use, mood, and sleep. Passphrase only. No name, no email required, no subscription. Your data exists. It just can't be linked to you.
