A researcher posing as a 13-year-old girl witnessed grooming, sexual material, racist insults and a rape threat in the virtual-reality world.
The children's charity said it was "shocked and angry" at the findings.
Head of online child safety policy Andy Burrows added that the investigation had found "a toxic combination of risks".
The BBC News researcher - using an app with a minimum age rating of 13 - visited virtual-reality rooms where avatars were simulating sex. She was shown sex toys and condoms, and approached by numerous adult men.
The metaverse is the name given to games and experiences accessed by people wearing virtual reality headsets. The technology, previously confined to gaming, could be adapted for use in many other areas - from work to play, concerts to cinema trips.
Mark Zuckerberg thinks it could be the future of the internet - so much so that he recently rebranded Facebook as Meta, with the company investing billions in developing its Oculus Quest headset.
That headset - now rebranded the Meta Quest - is thought to hold as much as 75% of the market share. It was one of these headsets that the BBC News researcher used to explore an app that forms part of the metaverse. The app, called VRChat, is an online virtual platform which users can explore with 3D avatars.
While it is not made by Facebook, it can be downloaded from an app store on Facebook's Meta Quest headset, with no age verification checks - the only requirement being a Facebook account.
The BBC News researcher created a fake profile to set up her account - and her real identity was not checked.
Inside VRChat, there are rooms where users can meet: some are innocent and everyday - such as a McDonald's restaurant, for example - but there are also pole-dancing and strip clubs.
Children mix freely with adults.
One man told our researcher that avatars can "get naked and do unspeakable things". Others talked about "erotic role-play".
Following the BBC News investigation, the NSPCC (National Society for the Prevention of Cruelty to Children) said improvements in online safety were a matter of urgency.
Mr Burrows, from the NSPCC, told us what we had found was "extraordinary".
"It's children being exposed to entirely inappropriate, really incredibly harmful experiences," he said.
He believes technology companies have learned little from mistakes made with the first generation of social media.
"This is a product that is dangerous by design, because of oversight and neglect. We are seeing products rolled out without any suggestion that safety has been considered," he said.
Meta says it does have tools that allow players to block other users, and is looking to make safety improvements "as it learns how people interact in these spaces".