MEDIA MATTERS
THE LATE BROADCAST JOURNALIST GWEN IFILL HONORED WITH BLACK HERITAGE FOREVER STAMP
By: Richard Prince, "Journal-isms"
N2Entertainment.net

The dedication ceremony for the Gwen Ifill Black Heritage Forever Stamp, honoring one of the first African-Americans to hold prominent positions in both broadcast and print journalism, takes place Jan. 30 at 11 a.m. EST, the U.S. Postal Service recently announced. Attendees are encouraged to RSVP at usps.com/gwenifillblackheritage. Metropolitan AME Church, 1518 M Street NW, Washington, DC 20005, where Ifill worshipped, is hosting the ceremony.

OTHER “JOURNAL-ISMS” NEWS:

WHY HIGH FACIAL-RECOGNITION ERROR RATES SHOULD CONCERN NEWSROOMS

“Facial-recognition systems misidentified people of color more often than white people, a landmark federal study released Thursday shows, casting new doubts on a rapidly expanding investigative technique widely used by law enforcement across the United States,” Drew Harwell reported for the Washington Post. The same technology is also being used in newsrooms.

“Asian and African American people were up to 100 times more likely to be misidentified than white men, depending on the particular algorithm and type of search,” the story continued. “Native Americans had the highest false-positive rate of all ethnicities, according to the study, which found that systems varied widely in their accuracy.

“The faces of African American women were falsely identified more often in the kinds of searches used by police investigators, where an image is compared to thousands or millions of others in hopes of identifying a suspect. . . .”

Harwell wrote, “The study could fundamentally shake one of American law enforcement’s fastest-growing tools for identifying criminal suspects and witnesses, which privacy advocates have argued is ushering in a dangerous new wave of government surveillance tools.”

But the study also has implications for the news media, according to veteran digital news executive Raju Narisetti, professor of professional practice at Columbia Journalism School and director of its Knight-Bagehot Fellowship in Economics and Business Journalism.

“In a way, this broad finding should be intimately familiar to people of color in American newsrooms who know that seeming layer of invisibility and non-recognition when it comes to leadership roles and opportunities over several decades,” Narisetti wrote by email. “Nor is it dissimilar to newsrooms [whose] staffing doesn’t reflect [the diversity of] their own current and potential audiences.

“Machines learn from humans, so the finding that so-called smart systems are prone to the same fallacies, assumptions and errors of our societies when it comes to race and gender should be unsurprising. This finding is also yet another wake-up call, especially for journalists typically prone to automatically give the benefit of the doubt to technologies, algorithms and tech-laden institutions, to be far more skeptical and questioning of the outcomes of machine learning, beyond just focusing on the more recently popular privacy concerns.”

Charlton McIlwain, professor in New York University’s Department of Media, Culture, and Communication and its Center for Critical Race and Digital Studies, noted that some newsrooms are already using facial recognition technology, at times to the detriment of people of color.

“I do believe we need to be paying more attention to how controversial and socially destructive technologies like facial recognition are being used in journalism,” McIlwain, author of the recent “Black Software: The Internet and Racial Justice, from the AfroNet to Black Lives Matter,” said by email.

“For one reason, news media — print, television, digital — traffic in, and produce troves of images that can be used to feed and train facial recognition systems, which raises questions about the complicity of news outlets in helping to develop these technologies that disproportionately and negatively affect people of color, and African Americans in particular.

“News media — often motivated by the need to ‘understand’ new technologies — also use facial technology in ways that can perpetuate those same biases and compromise personal privacy, as this recent story discussed regarding the NY Times’s use of facial recognition technology to identify event goers. Again, one of the primary concerns has to do with the ways that such practices mirror what news media and journalists have done for years — to use facial imagery of black and brown people to aid and abet criminal justice enterprises that frequently and disproportionately arrest black and brown ‘suspects’ in error. . . .”

EDITOR'S NOTE: Washington Post journalist Richard Prince occasionally submits his column "Journal-isms" to "Media Matters." Prince's "Journal-isms" originates from Washington, D.C. To read Prince's complete "Journal-isms" columns, go to http://mije.org.