Over the past few days, I've asked Facebook and Twitter and the Google-owned YouTube if they're reassessing their policies around video uploads.
The response has been pitiful.
A Google spokesman told me, "Nothing to announce at this stage."
Facebook sent details of its 24-hour operation to take down and block material related to the mosque massacres, and I can appreciate that its efforts on that front have been comprehensive. And a Twitter spokesman offered that the footage posted on Friday was in violation of its community guidelines.
But in terms of changes to livestreaming policy - or even if they're considering changes - Facebook and Twitter just won't respond.
It seems they shut the stable door after the horse had bolted. And I fear that, as with other incidents, they'll simply open it again once the media fuss dies down. Even now, it's barely shut, if at all.
Livestreamed video is a relatively new addition to Facebook and Twitter; both survived fine without it. I'm struggling to see why they can't temporarily suspend it while they search for better ways to filter or moderate it.
At the moment, it feels like my kids are at risk of seeing live snuff films on Facebook, just so Mark Zuckerberg can get fractionally richer.
I don't want to see Facebook knocked down. I'm not even one of those who wants to leave Facebook. My neighbour used the social network to arrange a vigil on Sunday, and I'm sure many others did the same. In my suburb, and others, it helps to build a sense of real-life community.
But I would like to see the major social networks acknowledge that they are publishers as much as platforms, and to take on the responsibility that comes with that - both legal and moral.
Prime Minister Jacinda Ardern has heard from Facebook chief operating officer Sheryl Sandberg and has indicated she wants to talk more with the company.
But Facebook is all too experienced at deflecting politicians.
I hope Ardern is also looking to Germany, where Facebook's attention has been focused by a new law imposing fines of up to 50 million euros ($82 million) for any site that fails to delete posts featuring hate speech or fake news (our Harmful Digital Communications Act tops out at $200,000).
And I hope our lawmakers revisit their recent decision to hose down Privacy Commissioner John Edwards' request for his office - which has been grappling with Facebook - to finally get some real teeth.
On the private sector side, Spark, Vodafone, Vocus and other internet service providers have worked together to block hate sites that host the shooter's clip, or edited versions of it.
That's good, but the threat posed by 4Chan and the dark web pales beside the mainstream social media sites' inability to police their content.
So it was good to see Spark - before the Christchurch massacres - decide to pull advertising from YouTube in protest at inappropriate content targeted at children.
As another YouTube skeptic, TVNZ boss Kevin Kenrick, put it: "brands are judged by the company they keep". That's something for Google to mull as its YouTube platform continues to host an allegedly anti-Semitic channel promoted by Brenton Tarrant in his livestream.
Post-shooting, Spark boss Simon Moutter has turned up the heat again, tweeting: "Helen Clark is 100% correct in asserting that if the global social media platform companies put as much effort into algorithms for preventing the spread of hate material as they put into targeted advertising, they could easily solve the problem."
And as I type, it seems other large companies are poised to join Spark's YouTube boycott.
Hopefully, our PM will take an equally front-foot approach if she's granted an audience with Facebook's founder.
Though I'm also wary that Ardern is a high-rotate Facebook Live user herself. Like so many politicians, she finds it a great way to reach a mass audience with no mediation. Live video is a hard habit to break, even in these times.
The difference in emphasis has been interesting. Facebook said it blocked 1.5m videos in the 24 hours after the attack - 1.2m of them at the point of upload.
Our PM highlighted the number blocked at upload. In the US, TechCrunch headlined "Facebook failed to block 20% of uploaded New Zealand shooter videos."
And the blocking is still selective. A spokeswoman for YouTube owner Google says, "If a news organisation chooses to show non-graphic portions of the footage in their video, that would not violate our policy and they could appeal our rejection."
That's not good. Even edited versions risk glamourising the killer, and play into his desire for the massacre to go viral.