His original work in 2016 prompted the police to investigate, leading to a man being jailed over the issue. Months later, when Mr. Crawford found that other inappropriate images of children were still being posted on Facebook, he raised the issue with the company directly.
But to his surprise, the Silicon Valley giant reported him and the BBC to the police.
Facebook argued that it was just following established industry-standard procedures — and British law — by alerting the authorities to images of child exploitation, something it said it does with all such pictures.
But the move by Facebook follows criticism over how and when the company moderates content on its platform. The company has variously blocked (and then unblocked) photos deemed controversial; grappled with hate speech laws in countries like Germany; and faced intensifying criticism over the spread of fake news, particularly during the American presidential election and other votes around the world.
Most recently, a judge in Germany ruled on Tuesday that Facebook was not responsible for the distribution of an image showing a Syrian refugee taking a selfie with Chancellor Angela Merkel of Germany. Edited versions of the image that were shared on the social network falsely linked the refugee with terrorist attacks.
Google and Twitter, notably, are grappling with similar issues. Experts say these tech giants are blurring the lines between mere neutral online platforms for digital content and active media companies that must monitor what is posted online.
“There is an expectation, particularly following the fake news controversy, that platforms have a responsibility as media companies rather than neutral tech companies to ensure that what goes on their platform is appropriate, tasteful, just like a broadcaster or a newspaper,” said Adam Rendle, a London lawyer specializing in copyright and media at the law firm Taylor Wessing.
Mr. Crawford, the BBC reporter, began investigating the presence of obscene images of children on Facebook last year. He had sought to highlight the difficulty of monitoring content on the social network, and his original investigation found that pedophiles were using secret pages to share images of children. A subsequent police investigation led to one man being imprisoned for downloading indecent images.
This year, he followed up and found that there were still images on the website that appeared to breach Facebook's guidelines, which state that the social media company will remove any content that promotes sexual violence or exploitation. Mr. Crawford reported the images using Facebook's internal system, but the company took down only 18 of the 100 that he flagged.
He then contacted the social network directly to query the discrepancy, and was asked to provide examples of images that he had reported. But when he provided examples last week, the company reported Mr. Crawford and the BBC to the police.
In a video broadcast using the social network’s Facebook Live function, Mr. Crawford said on Tuesday that the images he had provided to the company were uploaded on pages with obscene comments on them or explicitly geared toward men with a sexual interest in children.
Facebook, however, said it had followed established procedures, and insisted it had removed all content that was illegal or breached its standards.
“When the BBC sent us such images we followed our industry’s standard practice and reported them,” Simon Milner, Facebook’s policy director in Britain, said in a statement.
“It is against the law for anyone to distribute images of child exploitation,” Mr. Milner said. “We also reported the child exploitation images that had been shared on our own platform. This matter is now in the hands of the authorities.”
The company filed its report with the Child Exploitation and Online Protection Center. The police unit declined to comment on whether an investigation was underway.
Facebook has said it is improving its system for reporting offensive content, but the incident has raised questions about exactly how it polices its site.
Mr. Crawford, for example, noted an apparent contradiction: Facebook's moderation system had determined that the photos did not breach the social network's guidelines, yet the company reported the BBC to the police over the same images.
“Is it that it’s illegal and their moderation isn’t working?” he asked. “Or is it that actually it’s perfectly legal content?”