Google Removes Feature That Used AI to Serve Up Stranger-Sourced Health Advice

by admin477351

Google has shut down a search tool that presented users with health suggestions collected from anonymous strangers online. Known as “What People Suggest,” the feature relied on AI to curate and organize crowd-sourced perspectives from internet forums and discussions. Three insiders confirmed the tool is no longer live, and Google has since acknowledged its removal.

The feature made its debut at Google’s annual “The Check Up” event in March of last year, framed as a tool that would enrich health searches with authentic human experiences. Karen DeSalvo, who served as Google’s chief health officer at the time, wrote that users often value peer insights alongside expert medical guidance. The feature was initially rolled out to mobile users in the United States.

When pressed for details, Google attributed the shutdown to a broader effort to streamline its search interface. The spokesperson declined to link the removal to safety considerations, insisting the feature’s quality was never the issue. Despite this, the company was unable to cite any transparent public communication about the decision, offering instead a blog post that never mentioned the feature by name.

The timing of the removal places it squarely within a broader controversy about Google’s use of AI in health search. A major investigation earlier this year uncovered significant inaccuracies in Google’s AI Overview health summaries, which are served to roughly two billion users every month. In response to that investigation, Google pulled AI Overviews for certain medical topics, though it stopped well short of a comprehensive overhaul.

The company is pressing forward with its health AI agenda, with its upcoming “The Check Up” event expected to showcase innovations and partnerships in the medical technology space. But critics argue that scrapping a potentially dangerous tool without public acknowledgment is not a substitute for genuine transparency. Greater regulatory and public pressure may be necessary to force more meaningful accountability from the tech giant.
