The video, first uploaded in 2016, remains on YouTube, but a year later Liz, the niece, added an "important update."
Mari's cancer had returned, the note said, and she had died.
I found Mari's videos without looking for them when a search for a smoothie recipe opened up an algorithmic tunnel to videos that claimed to know the secret to curing cancer.
These tunnels, forged by Google searches and Facebook recommendations, connect relatively staid health and nutrition advice to fringe theories, false claims and miracle juices.
But the web of false, misleading and potentially dangerous cancer "cures" and conspiracy theories more often ensnares people reeling from bad news, groping for answers.
"People with a new cancer diagnosis are often feeling vulnerable and scared," said Renee DiResta, a researcher who studies disinformation.
The treatments for cancer, especially chemotherapy — which targets cancerous cells but can also kill or damage healthy ones — can come with significant, unpleasant side effects. Facing the horrors of such a diagnosis and treatment, some people search for information and community online.
What they find can be quite disturbing to medical professionals: home remedies that purport to cure diseases with baking soda, frankincense, silver particles.
Google and Facebook have promised to crack down on health misinformation in recent months, as links between anti-vaccine conspiracy theories and measles outbreaks become major news.
Under current law, tech companies respond to reports of harmful content on their own terms, at their own pace. The result, in the case of health misinformation? A long period during which seekers found themselves immersed in wells of dubious advice and conspiracy thinking.
They carried it into their own networks by the bucketful. In this way, the proliferation of bogus medical science in the internet age resembles a public-health crisis: The harm can be hard to calculate, and remedies cannot undo damage already done.
As recently as late April, searching "cure for cancer" on YouTube surfaced several troubling results: The sixth video, with more than 1.4 million views, said baking soda could cure cancer.
The eighth was an interview with the self-described cancer expert Leonard Coldwell, in which he explains that every cancer can be cured in weeks with a special diet that "alkalises" the body, a claim debunked by scientists. The video has more than 7 million views.
YouTube is trying to plug the holes that lead to videos like the Coldwell interview. When I ran the "cure for cancer" search again, in May, the baking soda and Coldwell videos were still online, but most of the top results came from major cancer research centres.
YouTube says it has started to treat search results for different types of topics differently: When its algorithms decide a search query is related to news or information-gathering on a topic like cancer, they will attempt to populate results with more authoritative sources.
It's tempting to think of medical misinformation as a technological problem in need of a technological solution, but that's only part of it. Humans are part of the infrastructure of the internet.
For those facing a battle with a terrifying illness, hopeful anecdotes can be powerful.
Anecdotes can turn seekers into believers, who can turn other seekers into believers. And on Facebook, those anecdotes continue to attract large audiences.
Even as Facebook works to limit the reach of anti-vaccine chatter, other medical misinformation is thriving — including bogus cancer cures.
Facebook is experimenting with ways to address health misinformation beyond vaccines. One possibility: alerting users who are invited to join a group when that group has circulated debunked hoaxes.
Until then, it's been up to users to steer their peers toward or away from bad health advice.