Meta (formerly Facebook) is aware that virtual reality can be a "toxic environment", especially for women and minorities, and that such toxicity could pose an "existential threat" to its metaverse ambitions if it turned off "mainstream customers from the medium entirely", the media reported.
According to a Financial Times report citing an internal memo by Meta CTO Andrew Bosworth, Facebook wants its virtual worlds to have "almost Disney levels of safety".
However, Bosworth acknowledged that moderating how users speak and behave "at any meaningful scale is practically impossible", the FT report said on Friday.
Bosworth later wrote in a blog post that technology that opens up new possibilities can also be used to cause harm, and that "we must be mindful of that as we design, iterate, and bring products to market".
"Harassment in digital spaces is nothing new, and it's something we and others in the industry have been working to address for years. That work is ongoing and will likely never be finished. It's continually evolving, though its importance remains constant. It's an incredibly daunting task," he noted.
Meta has pledged $50 million for research into practical and ethical issues around its metaverse.
The social network now plans to spend at least $10 billion on metaverse-related projects this year and is changing its financial reporting to separate revenue between Facebook Reality Labs and its family of apps.
The metaverse will be a social, 3D virtual space where you can share immersive experiences with other people, even when you can't be together in person - and do things together you couldn't do in the physical world.
"Of course, there are limitations to what we can do. For example, we can't record everything that happens in VR indefinitely it would be a violation of people's privacy, and at some point, the headset would run out of memory and power," Bosworth said.
--IANS