Rosen doubles down on blaming the media, saying "distribution was further propelled by broad reporting of the existence of a video, which may have prompted people to seek it out".
Facebook users themselves don't get off that easy. You see, they didn't help Facebook do its job properly. As Rosen explains: "During the entire live broadcast, we did not get a single user report. This matters because reports we get while a video is broadcasting live are prioritised for accelerated review."
Rosen twice repeats that the video had been viewed only 200 times while it was live, a refrain Facebook has pushed hard across its PR efforts in the aftermath of the massacre. The argument goes that it was from this base of 200 views that reworkings of the video were allowed to proliferate across the internet.
To be clear, the expectation here is that someone who stumbles upon a piece of shocking content will immediately engage their rational senses and report the matter to the very platform that is allowing that content to be distributed. I urge anyone who saw the video to think back to their first response on seeing the footage and to consider whether the thought of reporting it to Facebook even crossed their mind. Chances are, you were more likely trying your best to process what the hell was going on in your country.
Imagine the national outcry if a television or radio broadcaster only made an effort to remove objectionable content once it received a complaint from the public. That simply isn't good enough for a mainstream media channel, and it shouldn't be good enough for Facebook.
The problem with Facebook's defence is that it's little more than a rehashed version of the old line it has always used to skirt its obligations as a responsible publisher: namely, that it isn't a media company.
Rosen is careful in his choice of language. He variously describes Facebook as an "online platform" and "social network", outright avoiding the more common descriptor of a "social media company", which, of course, features the uncomfortable 'M' word right there in the middle.
This is an old line coated in a fresh veneer of nonsense.
You'd almost be forgiven for believing that Facebook had taken a few tips in partisan obfuscation from its users, who have become so good at making their hate speech look reasonable.
In much the same way that many of the most popular members of the Alt Right cover their underlying thoughts in eloquence and pseudoscience, Facebook is just reiterating the position it's always held in a new way.
The thing, however, is that we aren't falling for it anymore.
"We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published," said Prime Minister Jacinda Ardern this week.
"They are the publisher. Not just the postman. There cannot be a case of all profit no responsibility."
Which is to say we can no longer accept that Facebook is anything but a media company.