Big Tech funds the very people who are supposed to hold it accountable



Major technology companies such as Google and Meta, the parent company of Facebook, have sharply increased their charitable contributions to universities in recent years, giving them influence over academic research on crucial topics such as artificial intelligence, social media, and disinformation. According to a study by the Tech Transparency Project, Meta CEO Mark Zuckerberg alone has donated to more than 100 university campuses, either through Meta or through his personal philanthropic organization. Other tech firms are funding academic centers, awarding grants to professors, and sitting on advisory boards reserved for donors.

Silicon Valley's influence is most pronounced among computer science professors at elite institutions such as the University of California at Berkeley, the University of Toronto, Stanford, and MIT. A 2021 paper by researchers at the University of Toronto and Harvard found that, among tenure-track computer science professors at those schools whose funding sources could be identified, a majority had received money from the tech industry, including nearly 60% of AI scholars. In some contested areas the share was even higher: of the 33 professors whose funding could be traced who wrote on AI ethics for the top journals Nature and Science, all but one had received grant money from tech giants or had worked as their employees or contractors.

Academics say they are increasingly dependent on tech companies for access to the large datasets needed to study social behavior, including the spread of disinformation and hate speech. Yet both Meta and X (formerly Twitter) have restricted the flow of data to researchers, requiring them to negotiate special agreements or pay steep fees for access. The power shift became more visible when the prominent disinformation researcher Joan Donovan filed complaints against Harvard University, alleging that her removal from the Harvard Kennedy School was influenced by the personal connections of Meta executives and a $500 million grant for AI research. Harvard has denied any improper influence.

Lawrence Lessig, a professor at Harvard Law School, remarked on how successfully Big Tech has cultivated influence in academia over the past decade, noting that an extraordinary number of academics have been funded by Facebook alone. Most tech-focused academics say their work is not shaped by the companies, but interviews with two dozen professors indicate that tech companies exert "soft power" through their control of funding and data access, which can slow research and create friction between academics and their institutions.

McGill University professor Taylor Owen described the influence as subtle but pervasive. Owen experienced that corporate power firsthand when Meta's head of Canadian public policy joined the advisory board of McGill's public policy school. The executive complained that the school was publicizing Owen's research critical of the company and suggested that Meta could fund specific classes to educate journalists. After Owen objected, the school declined the offer.

Meta did not dispute the Tech Transparency Project's accounting of its grants, but spokesperson David Arnold said the company's donations to academic institutions are intended to improve understanding of its platforms' impact. He also noted that the organizations backing the Tech Transparency Project, a frequent critic of tech companies, fund academic research of their own, and he emphasized Meta's commitment to supporting rigorous, unbiased external research.


Researchers also worry that the stringent conditions attached to tech funding leave it liable to dry up. Google, for example, put millions of dollars into First Draft, a coalition focused on disinformation, but after contributing $4.5 million in one year it drastically cut its funding the next, and the group shut down in 2022. Coalition members were never explicitly told what they could or could not publish, but there was a lingering concern that unfavorable findings might jeopardize future funding.

Scholars often find it hard to turn down tech money because research opportunities and funding sources are limited. With the federal government underfunding social science research and foundations historically reluctant to pay for basic research, tech companies have become a primary source of support, and there is growing recognition that they can shape which research is promoted and emphasized. The American Association of University Professors has acknowledged past scandals in which professors with financial ties downplayed risks in their research; it welcomes outside funding but urges faculty bodies to set detailed rules and enforce them.

Beyond funding, tech companies are also restricting access to the internal data researchers rely on. Elon Musk has begun charging researchers for access to data on X that was previously free. Meta disabled accounts associated with NYU's Ad Observatory project, citing privacy concerns, and has reduced support for CrowdTangle, a social media tracking tool used by academics, after acquiring it. Even formal collaborations have run into trouble: Social Science One, a partnership between Meta and researchers, suffered delays and incomplete data releases over privacy concerns, and in another collaboration Meta and researchers clashed over how to interpret results on the impact of small experimental interventions on political polarization.

The research found that small experimental interventions, such as switching the Facebook news feed to a chronological order, had no measurable effect on political polarization. Nick Clegg, Meta's president of global affairs, promoted the findings as part of a growing body of research suggesting that Meta's platforms alone do not cause harmful polarization or have meaningful effects on these outcomes. The researchers, however, cautioned that the results do not show that Meta plays no part in deepening divisions.

Samuel Woolley, a misinformation expert at the University of Texas, sees a consistent pattern in such conflicts: companies signal a move toward systematic study of these issues, and then the effort stalls. For his 2018 book on computational propaganda, Woolley chose to forgo company data altogether, instead piecing together data from other sources, a painstaking but necessary process. He described quantitative research in the field as a challenging and disheartening endeavor.

Lawrence Lessig of Harvard, who previously led a center focused on ethics in society, is developing a system to ensure that academic research is genuinely independent. He plans to present the initiative, called the Academic Integrity Project, to the American Academy of Arts and Sciences, though he is still seeking funding for it.

