Former Facebook staffer testifies in Meta trial


A high-profile trial against social media company Meta continued into its second day in Santa Fe, with prosecutors accusing the parent company of Facebook, Instagram and WhatsApp of intentionally targeting teens and preteens to maximize advertising revenue while exposing young users to sexual exploitation and other online dangers.

On Tuesday, jurors heard testimony from Arturo Béjar, a former senior Facebook leader who oversaw engineering and product efforts related to site integrity, security, safety and customer support.

Béjar testified that Facebook maintained a proactive internal standard for addressing user harm when he left the company in 2015. He said he returned in 2019 after his daughter received sexually explicit photos online, hoping to help drive change, but found the company had shifted to a less responsive environment.

Béjar said research and recommendations aimed at reducing harm were often ignored. “So many examples of people with good ideas for good things that would reduce harm within, as it got reviewed and went through the pipeline, would get pushed down,” he said.

During his testimony, Béjar used a car analogy to describe platform responsibility. He said people expect a car to operate safely regardless of who is driving and argued the same standard should apply to social media products. He also said parents and children share responsibility for online harm.

In 2021, Béjar conducted an internal survey called the Bad Experiences and Encounters Framework, or BEEF, which reached nearly 238,000 Instagram users between the ages of 13 and 15. The survey asked whether they had experienced various types of online harm.

According to the survey, about one in three users reported witnessing online bullying, while about one in 10 said they had personally experienced it. One in five reported seeing sexual images.

Béjar became emotional while discussing the findings, highlighting the scale of potential harm among teenage users.

“270 million teenagers on Instagram today. But it’s 1 in 10 out of 270 million kids, right? That’s half the population of the United States. When you see this number to act on it, because you’re going to, you have such a responsibility to the safety of every single one of those kids,” Béjar said.

Béjar testified that he presented the results in an email to Meta CEO Mark Zuckerberg and other top executives. He said leadership was aware of the reported harm but did not implement changes and alleged the company prioritized growth and competition with other social media platforms, including TikTok and Snapchat.

Béjar testified that although Instagram’s 2021 policy, in effect at the time of the study, stated that harmful behavior was not allowed, internal data showed such behavior continued and the policies did not adequately warn parents about potential risks.

Béjar said Meta focused on building new features and directing resources toward growth rather than addressing safety concerns. He also criticized how company leadership responded to safety issues.

“I think they (executives) really care about making people think that they care. But I think in practice they don’t care,” Béjar said. “Caring is the moment you become aware of something, you engage with it, you understand it, you work on it, you do things that make it better.”

The defense has not yet cross-examined Béjar.


