The Blurry Boundaries of Child Porn
Not every illicit image is equally offensive.
There are several reasons why child pornography isn’t governed by the laxer rules that regulate adult porn, but one rationale lies at the core of the law: Child porn is a record of a crime. It is illegal for a grown man to have sex with, say, a 4-year-old boy, and he doesn’t get to claim the protection of the First Amendment just because someone was photographing him while he committed the act. When jihadists behead their captives on video, the tape doesn’t change the fact that they’re murderers. It demonstrates and advertises it. And the photographer is an accessory to the crime.
Unfortunately, people have been arraigned for images that are a far cry from a man raping a toddler. Besides the “sexting” incidents, in which teenagers are charged with producing porn after flirtatiously emailing photos of themselves to each other, there are cases like the one involving 28-year-old Todd Senters, who videotaped himself having sex with his 17-year-old girlfriend. This happened in Nebraska, where the age of consent is 16, so the intercourse itself was legal. But under the federal Child Protection Act of 1984, a person in a pornographic performance is considered a child if he or she is under 18. So the tape was illicit even though the act was not—not a record of a crime, but still a crime itself. The Nebraska Supreme Court upheld Senters’ conviction in 2005.
If the definition of child porn is broader than it needs to be, the great bulk of enforcement, at the federal level at least, is aimed at material most liberal-minded Americans would consider criminal. That represents a profound legal and social shift. It’s hard to believe, but the law didn’t distinguish child pornography from other sorts of porn until the 1970s, when greater tolerance for sexually explicit material made the seediest stuff more visible, prompting a backlash. Even then, the sole federal law on the subject—the Protection of Children Against Sexual Exploitation Act, passed in 1978—merely stiffened the penalties for material that would already be illegal under the obscenity statutes. And no item was obscene if it had “serious literary, artistic, political, or scientific value.” It wasn’t until New York v. Ferber in 1982 that the U.S. Supreme Court allowed legislators to outlaw images of children that didn’t fit the strict legal definition of obscenity. Child abuse is child abuse, the unanimous court declared, even if the pictures of that abuse have artistic merit.
Since then, police have mostly aimed enforcement not at the producers of the porn but at the distributors and, more controversially, the consumers. There are three central reasons why the law pursues people who possess child porn as well as those who make and sell it:
1. To eliminate the market. In Ferber, the Court noted that while “the production of pornographic materials is a low-profile, clandestine industry,” distribution networks must be “visible” to be effective; therefore “the most expeditious if not the only practical method of law enforcement may be to dry up the market for this material.” If the market is disrupted, that reduces the incentives to create the images in the first place.
This is the most understandable rationale for the restriction. If you pay for a kiddie porn tape, you’re not just looking at images. You’re creating an incentive to make more of them and therefore to abuse more children.
In the Web era, you might be doing more than that. In 2006 The New York Times described PlayToy, a website that offered “scores of original photographs of scantily clad under-age children...often posed in ways requested by subscribers.” That wasn’t the only online community to solicit such input from its users, and it surely wasn’t the worst of them; at least the PlayToy models weren’t being molested onscreen. Media scholars have described an “active audience” that reframes, reinterprets, and even rewrites its favorite texts. This is the active audience on steroids, the ugly underside of the user-driven Web. In the worst-case scenario, such consumers are co-conspirators, morally if not legally, in the rape and abuse of children.
2. To police people’s thoughts. The Child Pornography Prevention Act of 1996 banned even computer-generated “virtual” porn that was produced without any actual kids. Such material, the law declared, “encourages a societal perception of children as sexual objects.” In other words, the measure targeted not the acts performed in front of a camera but the acts performed within people’s minds.
The Supreme Court struck down that law as overbroad in 2002, but the rationale behind it has been a constant companion to the crackdown on child porn. An argument that most researchers roundly reject when the topic is adult pornography—that viewing it incites people to commit sexual violence—is frequently cited uncritically where porn featuring children is concerned. The Supreme Court may be wary of laws that invoke a “paternalistic interest in regulating [the] mind,” as Justice Byron White once put it, but it’s clear that many activists and legislators do not share those qualms.
If that were all there was to the issue, the civil libertarian position would be fairly clear. Restrictions on purchasing child porn might be justifiable, but restrictions on merely possessing it—acquiring it for free via Kazaa, say—would not. (If the idea is to cut into child pornographers’ profits, peer-to-peer sharing might be more ally than enemy.) But there are other arguments, including one in particular that’s worth some thought:
3. To protect the privacy of the victims. “Because the child’s actions are reduced to a recording,” the attorney David P. Shouvlin wrote in the Wake Forest Law Review in 1981, “the pornography may haunt him in future years, long after the original misdeed took place. A child who has posed for a camera must go through life knowing that the recording is circulating within the mass distribution system for child pornography.” Those words evidently impressed the Supreme Court, which quoted them a year later in Ferber. The viewing, in this analysis, is itself a perpetuation of the abuse.
Such arguments undergird Masha’s Law, named for Masha Allen, a Russian orphan who was held prisoner, raped repeatedly on camera, and advertised in the kiddie porn world as “Disney World Girl.” The measure, which became law in 2006, allows adults who were victimized by pornographers as minors to sue people who download the resulting images.
Emotionally, it’s a compelling concept. And where invasion of privacy is the concern, civil remedies certainly make more sense than criminal prosecutions. But the idea opens a can of worms. If the issue is privacy, shame, and being haunted by ineradicable images, wouldn’t the same argument apply to the abused prisoners photographed at Abu Ghraib? To hostages filmed by their captors and aired on the news? To anyone humiliated in front of a camera? Should an inadvertent Internet celebrity, deeply embarrassed that people are chuckling at a clip of his light-saber dance, have standing to sue the viewers?
That last example might seem absurd, but it actually veers close to the pornography debate. Because the child porn laws set the age of maturity so high, they cover not just the victims of coercion but exhibitionists who voluntarily put photographs of themselves online. There also are people who post pictures that are salacious but don’t include the “lascivious exhibition of the genitals or pubic area” invoked by the law. They do not necessarily intend for anyone but their friends to see the photos. But the Internet doesn’t always work that way.
Consider Amanda Wenk, a teenager who became an online celebrity in 2005 after she posted pictures of herself in bikinis, tight T-shirts, and low-cut dresses on her Webshots site. She took down the photos after they attracted outside attention, but by that time they had escaped to Fark and other forums for people who like to swap online ephemera.
She wasn’t really a child at the time, but the law says she was; the images aren’t much more pornographic than a high school yearbook, but some people clearly use them as though they were Playboy centerfolds. She is presumably embarrassed by the attention, given that she tried to remove the pictures from the Web. She may well be haunted by it. Is it the role of the government to preserve her peace of mind?
The difference between what happened to Amanda Wenk and what happened to Masha Allen should be obvious. But both must, to borrow the phrase the Supreme Court quoted in Ferber, “go through life knowing that the recording is circulating within the mass distribution system for child pornography.” I’m not convinced that’s reason enough to punish the people who merely see those recordings, as opposed to the people who actively participate in the abuse of prisoners like Allen—or the inmates at Abu Ghraib.
Jesse Walker (firstname.lastname@example.org) is managing editor of reason.