The court filing said Frazier was forced to watch depraved acts as part of her job.
"For example, Plaintiff witnessed videos of: a smashed open skull with people eating from it; a woman who was kidnapped and beheaded by a cartel; a person's head being run over by a tank; a man eating the head off a rat; a fox being skinned alive; a man falling to his death off a roof that included audio of the impact of his body hitting the ground; school shootings included dead bodies of children; a politician shooting himself; backyard abortions; child abuse; and child sexual assault," it said.
Frazier is also suing TikTok's parent company ByteDance.
The documents filed in the United States District Court also revealed the huge earnings of the parent company.
"In fiscal year 2020, ByteDance made approximately (US) $34.3 billion (NZD$50.3b)
in advertising revenue. In 2019, that number was $17b, and in 2018 that number was $7.4b. ByteDance accomplished this in part due to the popularity of its Tik Tok App," it said.
The documents also said that because moderators' activity is strictly monitored, staff are under increased pressure to watch as many videos as possible.
TikTok uses software that monitors moderators online and via camera, and tells them to spend no more than 25 seconds reviewing each video.
The intense pressure moderators are placed under, along with the content they are required to watch, makes them more likely to suffer post-traumatic stress disorder, Frazier alleges.
Because of her work, Frazier said she has developed panic attacks and depression as well as symptoms associated with anxiety and post-traumatic stress disorder.
She said she has trouble sleeping and, when she does manage to sleep, suffers from nightmares about the content she has watched.
Frazier wants TikTok to compensate her and others for the psychological injuries they have suffered, and wants the court to order the company to set up a medical fund for content moderators.
She has demanded the matter be heard before a jury.
A TikTok spokesman told news.com.au in a statement that the company is continuing to expand care for its moderators.
"While we do not comment on ongoing litigation, we strive to promote a caring working environment for our employees and contractors. Our Safety team partners with third-party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally," he said.