The figures reveal that 11.6 million pieces of content related to child nudity and child sexual exploitation were taken down between July and September 2019.
For the first time, Facebook is also releasing figures for Instagram and including numbers for posts related to suicide and self-harm.
This follows a public outcry over the death of 14-year-old Molly Russell.
The teenager took her own life in 2017, and her father later found large amounts of graphic material about self-harm and suicide on her Instagram account.
In a blog, Facebook vice-president Guy Rosen said: "We remove content that depicts or encourages suicide or self-injury, including certain graphic imagery and real-time depictions that experts tell us might lead others to engage in similar behaviour.
"We place a sensitivity screen over content that doesn't violate our policies but that may be upsetting to some, including things like healed cuts or other non-graphic self-injury imagery in a context of recovery."