The Bowling Green Civic Assembly Report

Over 2,000 people participated in our Pol.is online civic conversation in Bowling Green, Kentucky. Together they submitted over 600 statements and voted nearly a quarter-million times. These statements dealt with divisive issues such as immigration and LGBTQ rights, but overwhelmingly, participants wanted to talk about issues on which they agreed: traffic and local development, the need for improved community and commercial services, government accountability, job training, and education. This report explores the Bowling Green results and explains the methodology.

Read the Report

Middle Neighborhoods: Action Agenda for a National Movement

2018. Paul C. Brophy (ed.), Pamela Puchalski (ed.), Stephanie Sung (ed.)
Project Director: Paul C. Brophy; The American Assembly, in partnership with the Federal Reserve Bank of Richmond; and with support from Lincoln Institute of Land Policy and Healthy Neighborhoods, Inc.

"Middle Neighborhoods: Action Agenda for a National Movement" is a report that summarizes discussions from a meeting held November 15-16, 2017 in Baltimore, co-sponsored by The American Assembly and the Federal Reserve Bank of Richmond, with support from the Lincoln Institute of Land Policy and Healthy Neighborhoods Inc., a leading community development organization in Baltimore. Experts from varying disciplines and backgrounds, each familiar with the context of middle neighborhoods in cities across the United States, divided themselves into three working groups to advance issues in policy, practice, and research. Among the participants were the Mayor of Baltimore, Catherine Pugh; Philadelphia City Councilwoman Cherelle Parker; and two members of Congress, Dan Kildee (D-MI) and Dwight Evans (D-PA).

For a summary of conclusions, including highlights from each working group and next steps, view the report here.

Download: 

"Middle Neighborhoods: Action Agenda for a National Movement" (Download PDF)

Notice and Takedown in Everyday Practice

2016. Jennifer Urban, Joe Karaganis, Brianna Schofield

American Assembly and Berkeley Law

"Notice and Takedown in Everyday Practice" is a set of empirical studies of the DMCA’s notice and takedown process. Despite its importance to copyright holders, online service providers, and Internet speakers, very little empirical research has been done on how effective notice and takedown is at addressing copyright infringement, supporting the development of online services, or providing due process for notice targets.

Our report is the most in-depth research we know of to date into this system. It includes three studies that draw back the curtain on notice and takedown: using detailed surveys and interviews with more than three dozen respondents, the first study gathers information on how online service providers and rightsholders experience and practice notice and takedown on a day-to-day basis; the second study examines a random sample from over 100 million notices generated during a six-month period to see who is sending notices, why, and whether they are valid takedown requests; and the third study looks specifically at a subset of those notices that were sent to Google Image Search.

The findings suggest that whether notice and takedown “works” is highly dependent on who is using it and how it is practiced, though all respondents agreed that the Section 512 safe harbors remain fundamental to the online ecosystem.  Perhaps surprisingly in light of large-scale online infringement, a large portion of OSPs still receive relatively few notices and process them by hand. For some major players, however, the scale of online infringement has led to automated, “bot”-based systems that leave little room for human review or discretion, and in a few cases notice and takedown has been abandoned in favor of techniques such as content filtering.

The second and third studies revealed surprisingly high percentages of notices of questionable validity—mistakes are made both by "bots" and by humans. In one study, we reviewed automated notices—created, sent, and processed largely by computers. These notices overwhelmingly targeted well-known infringing sites and requested removal of major copyright holders’ assets, which may lessen concerns that mistakes would have negative effects on expression. Unfortunately, however, they also exhibited a number of flaws. One in twenty-five of the takedown requests (4.2 percent) targeted material that clearly did not match the copyrighted work, and more than a quarter (28.4 percent) raised at least one question about their validity—ranging from failure to identify the materials in dispute to targeting potentially legal uses. These percentages translate to many millions of notices in the entire set—for example, the 4.2 percent translates to about 4.5 million notices.
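The scaling from sample percentages to set-wide counts can be sketched as below. The total of roughly 108 million notices is a hypothetical figure, chosen only so that 4.2 percent lands near the ~4.5 million the report cites; the report itself says only "over 100 million" notices in the six-month period.

```python
# Sketch: extrapolating sample-based error rates to the full notice set.
# TOTAL_NOTICES is an assumption (report says "over 100 million"); 108 million
# is picked so that 4.2% matches the ~4.5 million figure cited in the text.
TOTAL_NOTICES = 108_000_000

def extrapolate(sample_rate: float, total: int = TOTAL_NOTICES) -> int:
    """Scale a rate observed in the random sample to an estimated count."""
    return round(sample_rate * total)

mismatched = extrapolate(0.042)    # requests targeting clearly non-matching material
questionable = extrapolate(0.284)  # requests raising at least one validity question

print(f"~{mismatched / 1e6:.1f} million clearly mismatched requests")
print(f"~{questionable / 1e6:.1f} million requests with validity questions")
```

The point of the exercise is that even small error rates in a random sample imply millions of questionable notices once scaled to the full set.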

The other statistical study raised further concerns. These requests tended to be sent by smaller senders—individuals and small businesses—apparently by humans rather than computers. But they exhibited even more flaws. A full seven out of ten (72 percent) presented questions about their validity. More than half of the sampled notices, all of them problematic, came from a single individual sender. Even without her notices, however, 36.8 percent were questionable. These notices often targeted social media, blogs, and personal websites, raising even greater questions about their effect on expression.

The United States Copyright Office recently issued a “Notice of Inquiry” (a formal study) into the DMCA’s notice and takedown process, with comments due on April 1, making this research very timely. The findings strongly suggest that the notice and takedown system is important, that it is under strain, and that there is no “one size fits all” approach to improving it. Based on the findings, we suggest a variety of reforms to law and practice.

Available at: 

Social Science Research Network

Action Agenda for Historic Preservation in Legacy Cities

2015. Cara Bertron (ed.)

Published by: Preservation Rightsizing Network

"Action Agenda for Historic Preservation in Legacy Cities" is a report containing a nine-point strategy to shape new approaches to preservation, to adapt existing tools and policies used by preservationists, and to promote place-based collaboration, especially in legacy cities like Newark, Detroit, and Cleveland. By offering new strategies for protecting local cultural heritage, "Action Agenda" serves as a guide for preserving the stories of Rust Belt cities and communities and making them more equitable, prosperous, and sustainable in the face of economic shifts. Using examples from Cincinnati, Buffalo, Detroit, and more, the report offers suggested next steps, potential partners from preservation and allied fields, and financing and coalition-building toolkits for urban development and preservation advocates.

Learn more about the release event here and The Assembly's work in Historic Preservation in Legacy Cities here.

Available at: 

Amazon

Downloads: 

"Action Agenda for Historic Preservation in Legacy Cities" (Download PDF)

The Rise of the Robo Notice

2015. Joe Karaganis, Jennifer Urban

Communications of the ACM, September 2015

"Rise of the Robo Notice" is a preview of our longer publication, Notice and Takedown in Everyday Practice (2016).

Here's an excerpt from the article's introduction:

Most Internet professionals have some familiarity with the “notice and takedown” process created by the 1998 U.S. Digital Millennium Copyright Act (the DMCA). Notice and takedown was conceived to serve three purposes: it created a cheap and relatively fast process for resolving copyright claims against the users of online services (short of filing a lawsuit); it established steps online services could take to avoid liability as intermediaries in those disputes—the well-known DMCA “safe harbor”; and it provided some protection for free speech and fair use by users in the form of “counter notice” procedures.

The great virtue of the notice and takedown process for online services is its proceduralism. To take the most common example, if a service reliant on user-generated content follows the statutory procedures, acts on notices, and otherwise lacks specific knowledge of user infringement on its site (the complicated “red flag” knowledge standard), it can claim safe harbor protection in the event of a lawsuit. Services can make decisions about taking down material based on substantive review and their tolerance for risk. They may also adopt technologies or practices to supplement notice and takedown, though the law makes no such demands beyond a requirement for repeat infringer policies. The resulting balance has enabled a relatively broad scope for innovation in search and user-generated content services. As one entrepreneur put it in our recent study of these issues, notice and takedown was “written into the DNA” of the Internet sector.

This basic model held for about a decade. In the last five or six years, however, the practice of notice and takedown has changed dramatically, driven by the adoption of automated notice-sending systems by rights holder groups responding to sophisticated infringing sites. As automated systems became common, the number of takedown requests increased exponentially.

For some online services, the number of complaints went from dozens or hundreds per year to hundreds of thousands or millions. In 2009, Google’s search service received fewer than 100 takedown requests. In 2014, it received 345 million requests. Although Google is the extreme outlier, other services—especially those in the copyright ‘hot zones’ around search, storage, and social media—saw order-of-magnitude increases. Many others—through luck, obscurity, or low exposure to copyright conflicts—remained within the “DMCA Classic” world of low-volume notice and takedown.

This split in the application of the law undermined the rough industry consensus about what services needed to do to keep their safe harbor protection. As automated notices overwhelmed small legal teams, targeted services lost the ability to fully vet the complaints they received. Because companies exposed themselves to high statutory penalties if they ignored valid complaints, the safest path afforded by the DMCA was to remove all targeted material. Some companies did so. Some responded by developing automated triage procedures that prioritized high-risk notices for human review (most commonly, those sent by individuals).

Others began to move beyond the statutory requirements in an effort to reach agreement with rights holder groups and, in some cases, to reassert some control over the copyright disputes on their services.

Available at: 

"Rise of the Robo Notice" at Communications of the ACM