On Tuesday, Jennifer Petrucelli, Stephanie Jenkins, and Rachel Antell presented documentary filmmakers at the International Documentary Association’s “Getting Real” conference with a draft of guidelines for how they could thoughtfully and ethically use generative AI in their work.

A primary concern for Petrucelli, Jenkins, and Antell, longtime documentary filmmakers and co-founders of the Archival Producers Alliance (APA), is to avoid a situation in which AI-generated images make their way into documentaries without proper disclosure, creating a false historical record.

Around the same time they made these guidelines public, a story in Futurism revealed that Netflix had already done exactly what they feared. The recently released true crime documentary What Jennifer Did uses a couple of clearly AI-generated images to help establish accused murderer Jennifer Pan as a normal, fun-loving teenaged girl.

“Jennifer was bubbly, happy, confident,” a high school friend of Pan’s says during the sequence. As he’s saying this, a series of three photos of Pan in a red dress flashes on screen. These images offer a stark contrast to how the audience sees Pan for most of the film: quiet, shaken, and under the harsh lighting of an interrogation room.

The first of the photographs in this sequence appears to be real and shows her in a red dress, smiling at the camera and throwing up the peace sign. The next two, however, seem clearly fake. One image also shows her smiling and throwing up the peace sign, but her hands are mangled in that now-signature tell of older AI image generators.

Another image, which also appears to be AI-generated, shows up in some of the promotional material for the documentary. The promotional split image shows a mugshot-style image of Pan looking coldly at the camera during her investigation on one side, and an image of her smiling ear to ear on the other. Again, the image conveys a cliché of the true crime genre: how could this seemingly normal and happy person commit such a horrible act?

As far as I could tell, the fact that these images were AI-generated is not disclosed in the movie, and the end credits do not mention that any AI tools were used during its making. Netflix did not respond to my request for comment.

“One of the things we've realized is once a piece of media exists, even if it is disclosed [that it’s AI generated], it can then be lifted out of any documentary, make its way onto the internet and into other films, and then it's forever part of the historic record,” Antell told me on a call. “If it's being represented as this is a picture of this person, then that's what's going into the historic record. And it's very hard to pull that back. I think there is a danger in that starting to happen in terms of how we understand the world and how we understand media and our relationship to it, and what we can trust.”

The APA’s guidelines are mostly common sense and are organized around four overarching principles: value of primary sources, transparency, legal considerations, and consent.

The guidelines were written and presented before the APA knew about What Jennifer Did’s use of AI, but specifically encourage filmmakers to use “more explicit forms of transparency” when generative AI is used to:
- Make a real person say or do something they did not say or do (i.e., create deepfakes)
- Alter footage, photos, or audio of a real event or place
- Generate a realistic-seeming historical scene that did not actually occur.

“As you probably can tell from the guidelines, we're not telling people don't use [generative AI tools],” Petrucelli told me. “We’re just really strongly encouraging people to be transparent about the use, and in certain cases, where appropriate, get consent to recreate things that didn't necessarily happen.”

As Petrucelli and Antell told me, using non-archival, non-documentary footage in a documentary film is nothing new. The most common example is recreations of real events, which are used sparingly in What Jennifer Did as well. The difference is that, by now, viewers understand the visual language of documentary filmmaking well enough, and specifically the device of recreations, to know they are dramatized, staged versions of what happened. Sometimes this is explained explicitly with text on screen, but even stylistically, or by virtue of the footage providing a point of view that wouldn’t be available to a documentarian, the audience understands it’s watching a recreation.

Theoretically, and as the guidelines make clear, documentaries could use generative AI to create similar visuals, perhaps even in ways that would let them tell stories they couldn’t previously.

The problem, according to Petrucelli and Antell, is that while audiences have grown savvier over time and learned how to recognize traditional recreations, generative AI images are only going to get harder to detect unless they’re disclosed as such.

As we’ve reported time and again over the last six months, the scale of generative AI is only going to make this problem worse. As an example, Antell told me how she was recently working on a film and came across a photograph the filmmakers thought was of women during the Civil War.

“It looked like it, it was a very good recreation of that,” she said. “But with a little bit of research, I could figure out that it wasn't real. It was a contemporary photograph that was made to look like it. Occasionally something can slip through, but there are so few of those, and someone goes out and they take the time to research, what was the clothing? What was the look, what were the hairstyles?”

In the future, she said, archivists will not be able to keep up with the deluge of AI-generated images.

“Archival moves at a human pace and GenAI does not move at a human pace, and so for humans to keep up with it, that's a very unlikely thing to be able to happen,” Antell said.