Trusting Everybody to Work Together

In an increasingly factious world, Forsyth argues, Wikipedia's approach to collaboration and trust-building points to a brighter future.

On September 11, 2001, I woke up to the news of horrific, coordinated attacks on targets in New York City and Washington, D.C. I was working for a startup newspaper, but I was still at home when I heard the news. I eventually managed to drag myself, drained of any hope or positive outlook on the future, away from the nonstop TV coverage. Three airplanes had crashed into strategically chosen buildings full of thousands of people; the hijackers of a fourth were thwarted by heroic passengers over rural Pennsylvania. When I arrived at work, the journalists I so admired and respected seemed listless and defeated. They didn't soothe my feeling of hopelessness, but their words did help define it. They spoke of the ill-advised military actions and regressive surveillance state policies that would surely result from such an attack.

As the decade unfolded, I watched many of those worries become a reality. The responses to the attacks, as much as the attacks themselves, seemed to point to a very bleak future. Soon enough, the U.S. was involved in two wars. We would learn that one of them had been waged only after a misinformation campaign on the world stage, orchestrated by our democratically elected leaders. Meanwhile, our representatives in Congress passed laws curtailing civil liberties. When a revision to the USA PATRIOT Act was signed into law in 2006, it brought a strong expansion of the powers of our executive branch of government; but for reasons I found incomprehensible, it remained unknown for a time which senator had introduced the change (to say nothing of why he thought it was a defensible idea). Hadn't these lawmakers heard of wiki software? Were they simply unaware of the ways that software could help make sense of complex processes, or did they avoid such tools on purpose? So much for a government that valued transparent and accountable deliberative processes.

I found myself worrying about access to accurate information and trustworthy narratives. It seemed urgent that we find a better way, at local, national, and global levels, of aligning our basic values and beliefs, and of learning to tolerate differences when alignment proves impossible.

Could I trust fellow Americans with different political beliefs to observe a baseline of honest discourse? Could I trust foreigners with complaints about U.S. policy, whether legitimate or not, to express those complaints peacefully? If I wanted that to change, what actions were within my power? I wasn't so sure…but I was pretty sure these were important things to figure out.

If we couldn't do those things, could the social advances of several millennia be sustained? Again, I wasn't so sure.

Although I worked for a newspaper, I wasn't a journalist. I was responsible for the computer systems. I took pride in supporting journalists and entrepreneurs by building and refining systems that helped them do their best work. I was reading the tech news site Slashdot every day, and I volunteered for a newly launched computer recycling non-profit called Free Geek. I learned about free and open source software (FOSS), a movement founded in the 1980s not on any technical principle, but on basic beliefs about empowering software developers and users. Later, I learned to use the wiki software that enabled Free Geek volunteers and staff to rapidly document everything from technical processes to meeting minutes to jokes and cartoons.

I found few opportunities to bring FOSS or wikis into my day job, but these technology initiatives rooted in social principles nonetheless made a strong impression on me. I developed a conviction that in the long run, such initiatives might offer us all the ability to contribute to a better society. Even as I developed doubts about whether democratic values could survive in our social and government institutions, I was heartened to see similar values taking root in the tech world.

But I wasn't really at home in the tech world. My interests and skills were better suited to a career rooted in human interaction and written communication. Merely supporting others doing that work wasn’t going to do it for me. In the mid-2000s, I began volunteering for a variety of organizations, and following emerging information and discussion resources on the Internet, in the hopes of finding a more fulfilling career. It was in that context that I found Wikipedia. For several years, I had assumed that Wikipedia was pretty much complete; I didn't pay much attention to it, because I figured there was little I could contribute. But as I started learning about local happenings, including government, politics, economy, and culture, I gradually realized that Wikipedia had gaps in those areas, and I started adding pages about them.

Wikipedia spoke directly to my concerns and hopes. It drew heavily on values I knew and respected, from familiar traditions like journalism, academia, and various theories of self-governance. But Wikipedia lacked the rigidity of more established institutions. Wikipedia had a culture of "do-ocracy," in which getting stuff done was valued over formal titles, hierarchy, or perfect adherence to rules. Rules existed, but all users were encouraged to exercise their judgment in how strictly to follow them. As I came to identify as a Wikipedian, I joined in the effort to create the rules even as we built the encyclopedia. We mostly understood the limited value of specific rules that hadn't yet been put to the test. In Wikipedia's early days, I embraced the challenge of building an encyclopedia alongside unknown colleagues from far away, intent not only on building something, but also on learning something in the process.

In 2008, I attended the RecentChangesCamp conference in Palo Alto. Ehud Lamm, an academic from Tel Aviv, convened a discussion about whether wiki and its principles could help resolve conflict between Israelis and Palestinians. And so, there it was: confirmation that I wasn't alone in finding parallels and connections between the world of wiki and the most pressing problems in the rest of the world.

Within a few years, I decided Wikipedia should have a central role in my career, and that I should find ways to share that learning experience with other people and organizations. I launched a business, Wiki Strategies, with John Wallin, a friend I had met through shared interests on Wikipedia; over the years, it has been a vehicle to work with a variety of inspiring people and organizations. I enjoy the freedom to continue self-directed volunteer work, as well. As Wikipedia approaches its 20th anniversary, my sense of wonder at the accomplishments it has enabled, and my enthusiasm for its potential, remain high.

What, then, have we learned from two decades of Wikipedia? What can the processes that have produced Wikipedia teach us about collaboration and knowledge dissemination? As traditional journalism, and other institutions, struggle to keep up with shifts in culture and technology, is Wikipedia, or something like it, equipped to fill important gaps? How can Wikipedia's lessons be applied outside the pages of a wiki? Can they address the hopes and concerns I began with?

I think the answer to that last question is yes. But it's a hazy answer, an answer whose nuances have evolved over the years.

In its first decade, Wikipedia's user community grew by leaps and bounds. The software at the site's core, MediaWiki, did two important things: first, it gave users the tools to see and understand what others were doing; and second, it gave them the ability to take action, or comment, on those activities in specific, relevant, effective ways. This mix, reminiscent of the concept of "social translucence" introduced in an academic paper just prior to Wikipedia's launch, emerged organically. Some of the relevant software features existed in early wiki software; others were added when a need specific to encyclopedia composition was identified.

But even as Wikipedia grew its base of dedicated users, it had difficulty breaking through to a skeptical public. University departments were in the news for "banning" Wikipedia from term papers, declaring that it must not be used as an authoritative reference...as though anybody had ever said it was a good idea to use Wikipedia (or any encyclopedia) for such a purpose. Wikipedia became fodder for ridicule on late-night television, notably when comedian Stephen Colbert coined the term "wikiality": reality defined by consensus, rather than objective facts.

In 2007, a local historical society launched the Oregon Encyclopedia. The new site's marketers characterized its rigor by drawing a contrast to Wikipedia. The Oregon Encyclopedia would be serious, unlike Wikipedia, in which, they assured us, you might read about somebody's (presumably insignificant) "Aunt Betty." That new project galvanized Oregon Wikipedia editors to speak up in defense of our own efforts. We hastened to point out that we, unlike some encyclopedias, weren't holding our hands out for a couple million dollars in local funding, and that we routinely deleted "Aunt Betty" biographies by the dozens. I was invited onto a local public affairs radio program alongside the new project's editor in chief.

The next year, the Wikimedia Foundation hired me to head up a new program, in which we would seek, among other things, to redefine the relationship between Wikipedia and academia. We persuaded university professors to assign their students to write Wikipedia articles, instead of wringing their hands about whether to allow them to cite its contents. Academics began, both independently and in connection with our efforts, to ask ever more interesting questions about Wikipedia. Soon, a spinoff organization, the Wiki Education Foundation, was born.

As Wikipedia entered its second decade, the enthusiasm driving its user base had plateaued, even as it charged into the world's top five web properties as measured by readership. Established scholars and institutions were increasingly taking the site seriously, and asking sophisticated questions about it. Did it have value as a teaching tool? Were certain aspects of its contents worthy of the public's trust, in spite of a seemingly chaotic production model? Wikipedia's creators may never have expected this kind of impact, but here it came.

In Wikipedia's second decade, questions about the site have often grown out of questions about media and information sources writ large. Widespread concerns about "fake news" have altered a landscape in which Wikipedia was previously measured against information giants that enjoyed the public's clear and consistent trust.

If we want to explore the nature of trust between Wikipedia and its readership, we would do well to look closely at the nature of trust among Wikipedia's editors. That's where the focus was in Wikipedia's early days, when nobody really thought a Wikipedia article would ever be cited in a term paper. Software mechanisms establish a framework in which one Wikipedia editor can come to trust another. Those same mechanisms also support the site's ability to earn a reader's trust.

How to earn trust: The paths taken by Wikipedia and journalism

(Much of this section draws on James Melvin Lee, History of American Journalism, 1917, chapter 19; also on Sunset.)

The field of journalism in the early 20th century United States looked very different from Wikipedia a century later. Newspapers had become increasingly important as the growing country processed ever more information. But the public's trust in newspapers, and their numerous independent owners and publishers, had bottomed out. Forces other than the public interest, and often contrary to it, drove editorial decisions. Advertisements featured strange products boasting dubious medical benefits; personal attacks raged between rival editors, and in Oregon, one feud even escalated to a fatal duel.

Veteran newspaper editor Henry Watterson described journalism in 1900 as "without any code of ethics or system of self-restraint and self-respect. It has no sure standards of either work or duty. Its intellectual landscapes are anonymous, its moral destination confused." Watterson decried such factors as anonymity and lack of a coherent moral vision.

To any student of Wikipedia's reputation, such charges have the ring of familiarity. Wikipedia, despite its grassroots flavor, had a genesis that was in many respects more centralized than that of the U.S. news media. Many of Wikipedia's core values and policies were codified early on, on an email list and on wiki pages. Journalism, by contrast, had found its values in need of articulation and codification to respond to a growing crisis of public trust.

Market forces, along with an appetite for regulation among lawmakers and the public, quickly came to bear on newspaper publishers. In 1912, the U.S. Congress required that paid editorial content be clearly labeled as advertising; the law drew dramatic criticism from the New York Evening Post. Concerns about libel and national security drove the administration of President Theodore Roosevelt to sue several news publications. Though the suits failed, the prospect of such high-profile lawsuits surely contributed to the growing sense that standards of some kind were needed. Barratt O'Hara, Lieutenant Governor of Illinois, proposed (also without success) that his state require newspaper owners to be licensed. Civic societies took an interest in the news. The Citizens' Protective League of Denver, Colorado, for instance, pursued goals including that newspapers eschew "fake stories" and de-emphasize salacious stories of divorce, murder, suicide, and the like.

In that context, publishers joined forces to host regional conferences to address common problems, and journals were established to cover the industry. Publishers needed to earn the trust of the reading public; more mundane concerns, such as addressing the rising cost of newsprint, also motivated them.

Through this period of churn, newspaper publishers collectively wrote and committed to codes of ethics. Although they weren't codified by any central authority, these codes came to be well understood by the educated public, and by journalism and its adjacent industries. Events in recent years have undermined public confidence in the reliability of "the news"; but it is no small accomplishment that, for about a century, we did have a general shared concept of what we could and should expect from news outlets. That shared concept resulted from a fair amount of philosophical and political deliberation, which had to be retrofitted onto an existing practice of journalism. Despite some erosion, the widely shared understanding of journalism's ethical ground rules remains a valuable asset in navigating the present storm around trust in media.

Wikipedia's evolution differed from that of journalism. Its creators paid careful attention to principles from the earliest days, as they sought to create something that would take hold and survive in the aftermath of the first "Dot Com bubble." A perusal of Wikipedia's early email list entries reveals numerous philosophical discussions. Founders Jimmy Wales and Larry Sanger, along with a number of early contributors and enthusiasts, anticipated many issues Wikipedia would indeed come to face. The early articulation of, and debate around, core principles and values established a firm foundation for the project. Where journalism had arrived at formal codes of ethics, Wikipedia's editors may not have used such lofty language, but their deliberations resulted in policy documents and essays on best practices.

In an early email exchange, Sanger observed a potential distraction from Wikipedia’s core purpose of becoming an encyclopedia: he advised against taking steps that only made sense if the goal was to build "yet another web directory." Under the subject line "Controversial thoughts," Wales suggested that the primary goal of Wikipedia might be "fun for the contributors. If something cool emerges out of our playing with knowledge, all the better. But if it isn’t fun for the contributors, it will die." He went on to suggest that some workable middle ground between anarchy and total control by a core group might be the best way to evolve as vandalism inevitably became a concern. When Slashdot featured a story about Wikipedia in August 2001, the linked article by Sanger extolled the virtues of Wikipedia’s open content model. As with journalists’ concerns about the cost of newsprint, many more ephemeral, technical points were raised on the list, alongside these more philosophical issues.

These earliest philosophical debates were of a different sort than those around journalism in the early 20th century, but they were similarly important: they established a basis for trust. It wasn’t long before more familiar discussions emerged. Shortly after the Slashdot article, Krzysztof P. Jasiutowicz proposed in another email discussion, under the subject line "The future of Wikipedia," that a "Wikipedia community" had taken shape. He urged discussion of the site’s reliability and editorial model, among other topics.

These early discussions show Wikipedia's early architects grappling with issues not entirely unlike those that newspaper publishers considered a century before. But Wikipedia's model for establishing trust differs in fundamental ways from that of journalism. A journalistic publication works to establish a reputation such that its readers will believe something simply because its writers and editors say so. Wikipedia, though, advises its readers not to take its contents at face value, but to consult its source material before forming strong opinions. Wikipedia aims for, and achieves, new kinds of success, largely due to its abandonment of more traditional standards for what must be verified prior to publication.

The site’s early contributors’ perception of its needs was imperfect. That's hardly surprising, given Wikipedia's novel approach. Certain major tensions continue to impact Wikipedia’s trustworthiness. For instance, the prevalence of anonymity among its editors does not fit naturally with concerns around editor conflict of interest, and the unplanned nature of the site’s contributor base has yielded disturbing trends in its demographics. Both of these issues have their analogues in the world of more traditional publications. But careful deliberation, beginning in the earliest email discussions and continued over the years in increasingly varied venues, formats, and levels of formality, bodes well for the project, and creates space for creative solutions to such issues to emerge.

Wikipedia's lack of consistent rules is often discussed; but although it is only 20 years old, Wikipedia began, in some respects, with a head start of several centuries on the field of journalism. Journalism is but one of the giants on whose shoulders Wikipedia stands. The codification of ethics and best practices in earlier fields formed a ready foundation for Wikipedia. Many of the lessons those giants had learned, often through lengthy and painful processes, were well documented, and informed the efforts of early Wikipedians to plot the course of the online encyclopedia.

Conditions that support collaboration

It is possible for a diverse collection of humans to self-organize and collaborate to produce something of value, and on a massive scale, in just a few years. Early authors writing about Wikipedia, including Benkler, Ayers, Lih, and Broughton, all emphasized the importance of tools supporting community and communication. The size, quality, and impact of the encyclopedia also merit consideration, but the most distinctive and important aspects of Wikipedia lie in its production process.

If Wikipedia has been so effective at fostering generative collaboration, does it offer lessons about what conditions can support effective collaboration? Might it be possible to discern design principles that could help other projects replicate aspects of Wikipedia’s success? Could small adjustments to Wikipedia’s processes help it improve its support of collaborative efforts? Many have asked these questions, but a compelling answer has remained elusive.

The first wiki-based website, Ward Cunningham’s WikiWikiWeb, was created with the purpose of supporting collaboration, by helping programmers exchange ideas more freely. But with its focus on building a specific kind of product—an encyclopedia—Wikipedia had the potential to stray from that ethos. Wales’ early emphasis on the importance of fun may have helped keep some focus on the best conditions for collaboration.

With the benefit of hindsight, though, several points stand out to me. MediaWiki is the software built early on to support Wikipedia's collaborative practices, as well as to present its contents to the public. I observe five significant kinds of information that the basic MediaWiki software enables its users to access. A sophisticated Wikipedia reader can readily perceive:

  • that some content has changed

  • exactly what was changed (including access to the original version)

  • who made the change

  • when they made it

  • why they made it

When a reader has a question about the text of an article, the "view history" screen permits them to confirm whether it was added all at once or piece by piece. "Diffs" can convey exactly what changed between one revision and another, and identify which editor made the change. If that editor chose to enter an edit summary, any reader can learn something about the editor's thought process.
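
These five kinds of information are exposed not only in the browser, but also programmatically, through the MediaWiki Action API. Here is a minimal sketch, my own illustration rather than anything from Wikipedia's own tooling; it assumes Python, the third-party requests package, and an arbitrary example article title. It retrieves the "who, when, and why" of an article's recent changes:

    # A minimal sketch using the public MediaWiki Action API: retrieve the
    # editor, timestamp, and edit summary (the "why") of recent revisions.
    import requests

    API = "https://en.wikipedia.org/w/api.php"

    def recent_revisions(title, limit=5):
        """Return recent revisions of an article: ids, timestamp, editor, summary."""
        params = {
            "action": "query",
            "prop": "revisions",
            "titles": title,
            "rvprop": "ids|timestamp|user|comment",  # when, who, and why
            "rvlimit": limit,
            "format": "json",
        }
        data = requests.get(API, params=params).json()
        page = next(iter(data["query"]["pages"].values()))  # single-page result
        return page.get("revisions", [])

    for rev in recent_revisions("Encyclopedia"):  # arbitrary example article
        print(rev["timestamp"], rev["user"], "--", rev.get("comment", "(no summary)"))

The same screenful of information a reader sees on "view history" arrives here as structured data; each revision also carries IDs that can be fed to the API's compare module to reconstruct a diff.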

Armed with this information, a Wikipedia user has several ready avenues for action. A sophisticated reader, familiar with the software features outlined above, has but a short step to take to become an editor. MediaWiki software makes it easy to:

  • address the person who made a change

  • undo the changes of others

  • create changes of one’s own

A serious Wikipedia user might find these software features so familiar as to pay them little mind. But it’s no mere happy accident that they all exist in the software. When I first broached this topic with Cunningham, he agreed that Wikipedia’s early decision to create talk pages was a momentous one. He had previously felt it was appropriate to mix content and discussion together freely; and pages on the WikiWikiWeb, and other early wikis, reveal the benefits of that approach. But with Wikipedia’s added focus on final product, a separate venue for discussion was quickly identified as an important software refinement. Just a couple months after Wikipedia launched, Sanger asserted: “I think the ‘talk’ page for every page should be automatic, one of the default links in the header/footer.”

Cunningham also didn't bother to keep old versions of pages in the original wiki software, preferring to trust in the good intentions of its user community, and to assume that their contributions would tend to be more generative than destructive. Most of the capabilities listed above were not present in the original wiki software. But as Cunningham is quick to acknowledge, the ambition to build an encyclopedia introduced new needs, and some innovations introduced in its software were important. His perspective as a software designer is no different from that of a seasoned wiki editor: when others graft changes onto the core he built, he is quick to observe the advantages. In a recent conversation with me, he noted that the ability to browse a user's edit history, and to observe patterns in their work, has become an important way of getting to know who that user is.
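
That way of getting to know an editor can be illustrated concretely. Continuing the earlier sketch (again my own illustration, with a hypothetical username), the API's usercontribs list returns an editor's recent work across articles, with timestamps and edit summaries:

    # Continues the earlier sketch (reuses the requests import and API URL).
    def user_contributions(username, limit=10):
        """Return a user's recent edits: which articles, when, and why."""
        params = {
            "action": "query",
            "list": "usercontribs",
            "ucuser": username,
            "ucprop": "ids|title|timestamp|comment",
            "uclimit": limit,
            "format": "json",
        }
        return requests.get(API, params=params).json()["query"]["usercontribs"]

    for edit in user_contributions("ExampleEditor"):  # hypothetical username
        print(edit["timestamp"], edit["title"], "--", edit.get("comment", ""))

Scanning such a listing, patterns in a contributor's interests and habits emerge quickly, which is precisely the kind of orientation Cunningham describes.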

In 2016, I explained to Cunningham how I believed Wikipedia’s software supported collaboration. He drew a parallel between my idea and a decades-old design concept, the "observe-orient-decide-act" (OODA) loop, which originated in military strategic theory.

In the Internet’s first few decades, many projects aiming to generate self-sustaining collaborative activity have come and gone. In some, the absence of one or more of the conditions above seems to have played a role. For instance, for a few years I dabbled with the Open Directory Project (ODP), a predecessor of Wikipedia that built a web directory through broad-based peer production. Accustomed to wikis, I was frequently frustrated by the challenges of determining exactly what had changed and when, who had done it and why, and how I could effectively engage them in discussion if I disagreed. In my view, the challenges inherent in contributing to the ODP did not bode well for its long-term survival, especially once sites like Wikipedia made it easier to build things online.

The ability of Wikipedia users to meet such basic needs for themselves supports a largely coherent system of collaborative activity, as described by the OODA loop concept. By default, users have a great deal of autonomy; in most situations, they don’t need to rely on others to help them make observations, orient themselves, make decisions, or take action.

The central importance of a coherent production process is that it lowers the obstacles users face in creating high-quality content. The quality of content, of course, is a core element of trust; all the higher-level considerations about trust, whether in the world of journalism, Wikipedia, or elsewhere, have to do with ensuring the quality, in various respects, of the content. Even as we take an interest in the influence of conflicts of interest, editorial autonomy, contributor access to the production process, advertising, and the like, we should remember that the basic quality of the content in question has primary importance.

Trust in the context of broad collaboration

In recent centuries and in recent years, the topic of trust in traditional media has received much attention. The kind of trust under discussion, however, is narrow: it’s the trust between publisher and audience. In order to collaborate and produce high quality content, every publication involving more than one person requires another kind of trust, too: trust among content producers. Working relationships among reporters, editors, sources quoted in stories, proofreaders, etc. are all essential to producing quality content, and they all require some measure of trust.

One of Wikipedia’s core traits is that it blurs the traditional lines between producer and consumer. So with Wikipedia, the kind of trust needed within the community of producers inevitably overlaps with the audience’s trust in Wikipedia. Fortunately, the kind of trust needed to build a working relationship is one of the things supported by Wikipedia’s software, and its attention to the OODA loop.

But those same software features also support audience trust. In order to fully evaluate a Wikipedia article, a discerning reader must review article history and edit summaries, and skim through discussions, drawing conclusions about the intentions and biases of the authors.

In one sense, Wikipedia makes things more complicated and messier by blending production and publication. But in so doing, it forces us to address the issues of trust inherent in both, simultaneously, and using the same set of tools. In that sense, Wikipedia might just point the way toward a more coherent way to address issues of trust. In the articles and talk pages of Wikipedia, I have seen editors firmly committed to opposing views resolve seemingly intractable disputes; and the resulting articles are the better for it, serving readers well by helping them understand competing views. These dynamics bring my thoughts back to that 2008 discussion with Ehud Lamm, and the kind of trust that will be needed if we are to overcome violence on a global scale.

Furthermore, as is often said, trust is a two-way street. Treating someone with respect, empowering them, and showing trust in them, can often engender reciprocal trust. When Wikipedia takes steps that help its readers and contributors trust its content, it also expresses trust in them.

Throughout society, we are currently grappling with basic epistemic questions. How can we differentiate between "real" and "fake" news? What's the proper role of scientific studies in shaping policy decisions, or our day-to-day decisions? Individual judgment is a key asset in charting a path forward. A reader literate in the scientific method is better equipped to evaluate a scientific study than one who has to rely on external authorities; a television viewer well-versed in the techniques of video manipulation or rhetorical trickery will be less susceptible to deception.

Wikipedia’s structure invites individuals and institutions to build literacy skills and develop trust. To the degree that we can put Wikipedia’s tools to appropriate use, we may just have the ability to build trust throughout society, and generally make the world work better. Wikipedia doesn’t promise any of this to us; but if it gives us a nudge in the right direction, that makes it a valuable resource, and one worth protecting and exploring.

As some of the world’s largest technology platforms field tough questions about what value they ultimately provide, Wikipedia stands apart. Its idealistic roots in traditions like wiki and free and open source software, and its ability to build on the lessons of longer-standing social institutions, have served it well. Wikipedia empowers its editors and its readers, and its software encourages everyone involved to find ways to trust one another.

Those building new software intended to support and nurture collaboration would do well to study the interplay of the software features described here. This applies as much within the Wikipedia world as outside it; a mobile interface that obscures the "view history" screen, for instance, deprives the reader of a key element required for critical reading, and thereby presents only a partial view of the richness of Wikipedia's content.

As we reflect on 20 years of Wikipedia and consider what we might accomplish with it in the next 20 years, we should take the time to understand the benefits of its earliest software features. They may point the way toward a more robust and collaborative future.
