Clegg was representing Facebook in Paris at the Christchurch Call summit, where 17 countries, the European Commission and several tech companies signed the Call to Action to work together to eliminate terrorist and violent extremist content online.
In what is believed to be a world first, major tech companies Microsoft, Twitter, Facebook, Google and Amazon released a joint statement saying they would set out concrete steps to address the abuse of technology to spread terrorist content.
Facebook boss Mark Zuckerberg, who was not in Paris for the summit, has said that putting a delay on livestreaming as an added safeguard would fundamentally break the service.
Clegg echoed this sentiment, saying it was "something that is used by millions of people for good, decent, happy, innocent reasons - it's part of the suite of things we provide so people can express themselves as fully and freely as they like".
Yesterday, Facebook also said it would change its policy to restrict livestreaming for more users who have broken certain rules.
Clegg, the UK's former deputy prime minister, said if the changes had been in place on March 15, the gunman would not have been able to livestream his video.
He said the risk of another March 15 could never be fully eliminated in a "free country", so the question should be how to minimise it.
"We're trying to narrow the funnel so if someone has contravened our rules, that can lead to them being disabled from using live altogether.
In the Call to Action, the tech companies agreed to share the outcomes of their algorithms and to counter the drivers of terrorism and violent extremism, including the potential for algorithms to lead users down an online rabbit hole to radicalisation.
But Clegg said that did not happen at Facebook.
"We do not design an algorithm to send people down a rabbit hole of ever-more extreme material."
He couldn't speak for other online platforms, but he said Facebook's algorithms filtered content down to what users want to see, based on online behaviour such as the groups they follow and the posts they engage with.
"Would you produce a newspaper that actively has material that no one wants to read? Why would we produce a product no one wants to use?
"We have already taken the most advanced step in lifting the veil on all of this by introducing a feature which is 'Why am I seeing this?' You can go to what you're looking at and it will explain to you why you're seeing this."
He said Facebook also used algorithms to diminish or delete violent and hateful content, and yesterday it announced US$7.5 million in funding to develop that technology.
Clegg praised Prime Minister Jacinda Ardern for her leadership in putting the summit together.
"I really want to pay very fulsome tribute to the Prime Minister. It really wouldn't have happened without her.
"She has firmly, if always in a very civil way, pushed the tech companies to be more ambitious about coordinating with each other."
Like Ardern, Clegg was also exposed to the gunman's March 15 video.
"No one can look at something like that and not be appalled at the manner in which the terrorist clearly had planned something in order to broadcast his atrocity as widely as possible.
"I don't think any human being on the planet would want to do anything other than try and minimise that happening again in the future."
Facebook co-founder Chris Hughes has said that the company, which also owns Instagram and WhatsApp, is too powerful and should be broken up.
"Because Facebook so dominates social networking, it faces no market-based accountability," Hughes said.
"This means that every time Facebook messes up, we repeat an exhausting pattern: first outrage, then disappointment and, finally, resignation."
Clegg disagreed and said breaking up Facebook would hinder its ability to develop AI to block objectionable content.
He said Facebook was not a traditional publisher like a newspaper, which has complete control over what it publishes.
"We should be held accountable and should be responsible for policing the boundaries within which people are entitled to express themselves.
"Within those boundaries, to suggest that we can somehow be held responsible for content which is spontaneous is clearly not realistic."