TL;DR

A developer of the AI-powered PoopCheck app is selling a database of over 150,000 stool images collected from users. The sale raises significant privacy and data security concerns, as user data appears to be used in ways that contradict the app's stated privacy policy. The episode underscores the risks surrounding sensitive health data in AI training.

A developer of the AI stool analysis app PoopCheck is offering a database of over 150,000 user-uploaded images for sale, raising urgent questions about user privacy and data security. The sale came to light through a Reddit post by a user claiming to possess the database, and the app’s developer confirmed that the images originated from the app, highlighting potential misuse of sensitive health data.

The database, reportedly containing images from approximately 25,000 users, was advertised on Reddit by a user named Ill_Car_7351, who claimed to have collected and annotated the images from the PoopCheck app, developed by Soft All Things. The app, which analyzes stool images to provide gut health insights, features a community and leaderboard but also states in its terms that user data can be used for AI training and commercial purposes.

The Reddit poster offered the database for sale, describing it as containing highly rare, labeled images that could be valuable for machine learning, medical research, or cancer studies. When contacted, the developer Marco from Soft All Things confirmed that the images were collected via the app, which had over 150,000 shared stool images at the time of inquiry. He indicated willingness to share a sample of the data with interested parties.

Official privacy policies on the app’s store page claim no data collection, but the terms of service explicitly grant the company rights to use, reproduce, and sell user data, including images, for AI development and other commercial uses. This discrepancy between stated privacy policies and actual data rights is a key concern.

Why It Matters

This incident underscores the risks of health-related apps collecting sensitive data under the guise of privacy, only to potentially exploit it commercially. It highlights a broader issue of inadequate regulation and transparency in health tech, especially concerning AI training datasets. For users, it raises questions about how their personal health data is protected and whether consent is meaningful. For developers and regulators, it signals a need for stricter oversight and clearer privacy standards in health and AI applications.

Background

PoopCheck was launched several years ago as a gut health app that uses AI to analyze stool images, marketed with claims of privacy and security. Its privacy policy states that data is encrypted and can be deleted, but the terms of service explicitly allow the company to use and sell user data for AI and research purposes. The incident follows a broader pattern of health apps sharing or selling user data, often without clear user awareness or consent. The Reddit post revealing the sale has attracted attention for exposing how easily sensitive health data can be commodified.

“I’ve got 150k+ labeled and classified images of 💩 from roughly 25K different people. I know there’s a lot of value in it, but not sure how to move forward.”

— Ill_Car_7351

“The images were gathered from real users over the last couple of years, and we’re open to sharing a sample of the data with interested parties.”

— Marco, Soft All Things

What Remains Unclear

It remains unclear how many users were explicitly aware that their images could be used or sold for commercial purposes, and whether the app’s privacy policies adequately inform users about such data use. The legal implications of selling health data collected under the guise of privacy policies are also still uncertain, as are the potential regulatory responses.

What’s Next

Authorities and privacy advocates may investigate the sale and app’s data practices. Further disclosures from Soft All Things about their data policies and user consent are expected. The incident could prompt stricter regulations on health app data handling and AI training datasets, and users may become more cautious about sharing sensitive health information online.

Key Questions

Is my stool image data safe when I use PoopCheck?

While the app claims to encrypt data and prioritize privacy, the terms of service indicate that user images can be used and sold for AI training and commercial purposes, which may pose privacy risks.

Could my personal health data be sold without my knowledge?

Yes, according to the app’s terms, user data, including images, can be used for research and commercial purposes, and users may not be fully aware of this when they upload images.

What legal protections exist for my health data?

Legal protections vary by jurisdiction, but generally, health data is protected under privacy laws. However, app terms often grant broad rights to companies, complicating enforcement.

Will this incident lead to stricter regulations?

It is possible that regulators will scrutinize health app data practices more closely, potentially leading to new rules on data collection, transparency, and user consent.
