
Perspective

From signals to sense: what a diagnostic AI taught me about health products

Nolina Mishra

Case study

Adenomyosis diagnostic model — Google Vertex AI

Underdiagnosed reproductive condition. Frequently missed on standard imaging. Women sent home with incomplete care plans. I trained a deep learning model on ultrasound images that flagged slides incorrectly labeled by human experts.
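The label-auditing idea behind that result can be sketched generically: once a classifier is trained, its confident disagreements with the human-assigned labels become a review queue for annotators. The function and threshold below are illustrative assumptions, not the production pipeline.

```python
import numpy as np

def flag_suspect_labels(probs, labels, threshold=0.9):
    """Flag samples where the model confidently disagrees with the human label.

    probs:     (n_samples, n_classes) predicted class probabilities
    labels:    (n_samples,) integer labels assigned by human annotators
    threshold: minimum model confidence to treat a disagreement as suspect
    Returns indices of samples worth sending back for expert re-review.
    """
    probs = np.asarray(probs)
    labels = np.asarray(labels)
    predicted = probs.argmax(axis=1)
    confidence = probs.max(axis=1)
    # Disagreement alone is noise; confident disagreement is a review signal.
    suspect = (predicted != labels) & (confidence >= threshold)
    return np.flatnonzero(suspect)

# Toy example: sample 1 is labeled class 0, but the model
# predicts class 1 with 95% confidence, so it gets flagged.
probs = np.array([[0.8, 0.2], [0.05, 0.95], [0.6, 0.4]])
labels = np.array([0, 0, 1])
print(flag_suspect_labels(probs, labels))  # → [1]
```

Nothing here is specific to ultrasound: the same triage applies to any supervised dataset where annotation quality is uneven.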

The results were striking. But as the model improved, its limits became equally clear. Identifying the pattern is the beginning, not the endpoint. A diagnosis without support for the next steps still leaves patients navigating uncertainty alone.

Given the varying skill levels of clinicians, holistic input from the model could narrow the margin for error, but only if the product is designed to support decision-making, not just surface a classification.


What this taught me about AI health products

I'd been deep in the wearable health space at the same time — continuous glucose monitors, sleep trackers, recovery dashboards — managing my own metabolic health. Every device generated impressive data. But the experience was often more anxiety-producing than clarifying.

What devices give you

Glucose spike after a meal. Low HRV after poor sleep. Steps, strain, heart rate. Pattern recognition. Correlations.

What patients need

Does this spike matter for my condition? My goals? My medication? Interpretation. Context. A path to a decision.

The hardest product problem in AI health isn't accuracy. It's the layer between a signal and a decision.

This is where the opportunity is sharpest in women's health. Care protocols trail behind lived experience. Many conditions still lack holistic, evidence-based treatment plans. A model that surfaces patterns a time-pressed clinician might miss and gives patients clearer ways to participate in their own care isn't a futuristic promise — it's a product gap that exists right now.


The PMM question

The teams that build these products will face a familiar tension: the pressure to ship fast and promise big versus the discipline to scope honestly. My experience building emerging tech products in India — where teams routinely overpromise in the rush to scale — showed me how quickly credibility erodes when the product can't deliver on the narrative. In healthcare, that erosion carries real consequences.

The most useful thing a PMM can do in this space is hold the line between what the technology can do and what the user needs it to mean. That's not a safety constraint. It's the positioning work.