10.1 Digital media law

Digital media have grown in significance and complexity since the 1990s, and problems of law have grown alongside them. Issues such as plagiarism, copyright and cyberbullying were the relatively modest original concerns, but we are now also concerned with hate speech, disinformation campaigns and information wars, as well as the domination of the digital economy by what were once a handful of faltering startups.

Digital media, as it turns out, tend to be quite different from the other two major types of traditional media (print and broadcast). As a result, laws and regulations that once applied to traditional media types do not work as well in the digital media sphere. And laws that created a free regulatory environment at the birth of the digital media industry are now being questioned in light of the corporate tech giants’ monopolistic control of global communication and commerce.

Differences in media 

Traditional printed publications and broadcast programs are “top down” systems run by experts. They are relatively scarce,* they allow only limited public access, and they are difficult to copy. Broadcasters and printing companies are entirely responsible for all content they produce. *(Scarcity, for example, is the underlying rationale for FCC regulation of broadcasting.)    

Digital media, on the other hand, are crowd-sourced systems with cheap, abundant content. They are easily accessible, freely copied, permanently recorded, and often published without filters, editing, ethics or social responsibility.

The US Supreme Court said in the foundational case, Reno v. ACLU (1997): “The special factors … justifying regulation of the broadcast media … are not present in cyberspace.”* Therefore, the court said, the internet does not need to be regulated like broadcasting.

Section 230: Congress passed Section 230 as part of the Communications Decency Act provisions of the Telecommunications Act of 1996, and it survived Reno, which struck down the Act’s indecency provisions. Section 230 says, basically, that internet service providers (e.g., social media platforms and related companies) are not liable for the content their users post. While individuals who publish on the internet were still held accountable through the same laws and regulations (libel, privacy, incitement to violence, etc.), the platforms themselves were indemnified.

DMCA: One partial exception to Section 230’s blanket indemnification is in the area of copyright. The Digital Millennium Copyright Act (DMCA) allows enforcement of infringement claims directly through the digital media companies as well as against the individual content creators. (This book’s section on copyright and digital media is found here.)

Digital Media: New challenges to the marketplace of ideas 

Cheap speech is one description of the characteristics that distinguish digital media. When traditional remedies like lawsuits for libel and invasion of privacy fail to deter people who have easy access to the media but few assets to lose, plaintiffs may use court orders to criminalize that speech, according to UCLA law professor Eugene Volokh:

While the old expensive-speech system was rightly criticized as undemocratic, the flip side was that the owners of the press had assets that were vulnerable to civil lawsuits, and those owners were thus disciplined by the risk of liability, as well as by market forces. They also had professional and business reputations that they wanted to preserve: if reporters spread something that proved to be a hoax, it could mean loss of a job… Say what you will about the old mainstream media, but it didn’t offer much of a voice to people obsessed with private grievances, or to outright kooks, or to the overly credulous spreaders of conspiracy theories… [They can act] without being much deterred by the risk of liability for libel or disclosure of private facts; because the speakers have very little money, they have little to lose from a lawsuit, and potential plaintiffs (and contingency fee lawyers) have little to gain… Damages lawsuits against those without assets are largely quixotic. [Partly as a result] The legal system’s remedy … has been increased criminalization … through anti-libel injunctions and criminal libel charges…

Is the marketplace of ideas broken? To the extent that it is, a more European-style approach to laws about freedom of speech may be worth considering, says Harrison Rosenthal:

“An ideological chasm is emerging between new First Amendment theorists and their scholarly forbearers on the philosophical justifications for hate speech protection. The new guard … is balancing the equities of First Amendment libertarianism against Fourteenth Amendment equal protection—or what the international community calls human dignity. For socio-historical reasons rooted in armed conflict, Americans tend to embrace individualism, while Europeans tend to embrace collectivism.”

Similarly, Richard Holden argues in a 2020 article that cooperation is just as important as individual rights in the marketplace:

Just because the marketplace for ideas doesn’t always work doesn’t mean it never works. There are deeply important non-economic reasons to value freedom of speech that extend beyond how much information gets revealed. But modern economics teaches us that competition in the marketplace for ideas is different than competition in the market for ordinary goods and services… [And he cautions:] We should not always conclude that odious [bad] ideas will be consigned to the dustbin of history simply through open discussion.

Digital content issues  in this section of RevComm/law include:

  • CDA Section 230 limited liability is a pivotal controversy in digital media law. Should Facebook or Twitter be responsible for what a user posts? Why or why not? Two Supreme Court cases, Gonzalez v. Google and Twitter v. Taamneh, raised this question in the 2022-23 term. In both cases, the court sided with the platforms rather than their critics.
  • Issues concerning access and de-platforming of individuals who spread false or misleading content. For instance, in the wake of the assault on the US Capitol on Jan. 6, 2021, Twitter, Facebook and Instagram permanently suspended (de-platformed) Donald Trump. He sued the tech giants in July 2021, but the suit was dismissed May 6, 2022.
    Meanwhile, laws passed in Florida and Texas (H.B. 20), which would make it illegal to de-platform someone for their political views, are being challenged in cases such as NetChoice, LLC v. Paxton.
  • Elon Musk has since taken the helm at Twitter (now “X”) and has re-platformed Trump.
  • Social media bans have targeted false, misleading and hateful statements about the imaginary dangers of coronavirus vaccines or the “theft” of the 2020 election. Are such statements protected by the First Amendment? To the extent that they include false statements about people or companies, the protection may be subject to review under the Sullivan standard.
  • Access to official Twitter accounts as a form of petitioning the government, as guaranteed under the First Amendment.  (Cases: Knight Institute v. Trump, 2017 and 2021).
  • Privacy: Do people have the right to privacy and to control information about them that is gathered by social media? New laws in Virginia and California have carved out more of this right.
  • Proposed US bans of TikTok, the Chinese-owned social media app, are part of this privacy concern, but experts say that this sort of ban doesn’t address deeper issues.
  • Also, should people have the right to “de-link” search engine paths to outdated personal information? (This is also sometimes called the “Right to Be Forgotten.”) For example, should people victimized by “revenge porn,” or depictions of actual fatalities, or malicious fabrications, be able to control their personal information? (Cases: Zeran v. AOL, 1997; Google Spain v. González, EU, 2014).
  • Defamation and digital communications: Even traditional media libel cases, such as Dominion Voting Systems v. Fox, often have digital elements. When Dominion set out to prove (under the Sullivan standard) that Fox knowingly broadcast false information, it was able to subpoena texts and emails from Fox producers and personalities like Tucker Carlson. That level of access into the thinking of broadcasters and publishers was unprecedented. Carlson and others clearly broadcast information about Dominion that they knew, or suspected, was false.
  • Defamation damages: Another question involves consequences when the loser in a libel case seeks standard bankruptcy protection, as in the case involving Alex Jones of Infowars and the parents of the massacred Sandy Hook children. The courts found that the parents were defamed by Jones and awarded millions of dollars in damages, but Jones refuses to stop his broadcasts. In earlier libel and privacy cases (Bollea v. Gawker, 2016, or James C. Green v. Alton Telegraph, 1981), small publications were driven out of business by comparable judgments.
  • Sedition: New kinds of seditious libel charges involve issues of fact concerning government operations, for example, the Michigan Attorney General’s order to take down the “Detroit Leaks” website, which alleged, without evidence, fraud in the 2020 election. This follows older time-place-manner laws that have prohibited false information about the location, time and requirements for voting. As of November 2022, the video was still online but carried a warning label.
  • Digital copyright, especially the DMCA provisions that allow regulatory take-downs and stiff fines for possible infringements, needs to be considered; and
  • Advertising and “trade disparagement” via negative remarks about competitors on websites or in online reviews are a new dimension.

 Structural issues in this course include:

  • Anti-trust (anti-monopoly) laws
  • Net neutrality and information transmission
  • Domain name services



* Broadcast regulation factors included (according to the Reno decision): the history of extensive Government regulation of broadcasting, see, e.g., Red Lion Broadcasting Co. v. FCC, 395 U.S. 367, 399-400; the scarcity of available frequencies at its inception, see, e.g., Turner Broadcasting System, Inc. v. FCC, 512 U.S. 622, 637-638; and its “invasive” nature, see Sable Communications of Cal., Inc. v. FCC, 492 U.S. 115, 128. Thus, these cases provide no basis for qualifying the level of First Amendment scrutiny that should be applied to the Internet.