Following up on multiple hearings on Capitol Hill over Facebook’s treatment of users’ private data, Sen. Mark Warner (D-VA) released a 23-page white paper this week detailing about 20 policy proposals for regulating social media companies and big tech companies so that incidents like the Facebook-Cambridge Analytica scandal are not repeated.

Warner lauds Silicon Valley for its groundbreaking technological innovations, but says the current business model “involves offering nominally free services, but which results in consumers providing ever-more data in exchange for continued usage,” which he believes means “these tech giants” — including Apple, Amazon, Facebook, Google, and Twitter — now deserve “increased scrutiny.”

The paper outlines three ways Warner believes tech companies should be regulated: 1) to stop the spread of “disinformation” and “fake news,” 2) to better protect and facilitate consumer privacy, and 3) to encourage competition and prevent a handful of platforms from dominating the market.

“The size and reach of these platforms demand that we ensure proper oversight, transparency and effective management of technologies that in large measure undergird our social lives, our economy, and our politics,” Warner states in the paper. “Numerous opportunities exist to work with these companies, other stakeholders, and policymakers to make sure that we are adopting appropriate safeguards to ensure that this ecosystem no longer exists as the ‘Wild West’ — unmanaged and not accountable to users or broader society — and instead operates to the broader advantage of society, competition, and broad-based innovation.”

The whole paper is a call for tech companies to take responsibility for the data-sharing practices and information services they use — something Facebook has repeatedly evaded in congressional hearings and interviews.

For example, Warner believes tech companies have a “duty to clearly and conspicuously label bots, …determine origin of posts and/or accounts,” and “identify inauthentic accounts” to stop the spread of misinformation on social media.

Regarding the latter, Warner also wrote, “Failure to appropriately address inauthentic account activity — or misrepresentation of the extent of the problem — could be considered a violation of both SEC disclosure rules and/or Section 5 of the FTC Act.”

Warner says Congress should consider making “platforms liable for state-law torts (defamation, false light, public disclosure of private facts) for failure to take down deep fake or other manipulated audio/video content,” and floated the idea of a Public Interest Data Access Bill “to allow researchers to measure and audit social trends on platforms. This would ensure that problems on, and misuse of, the platforms were being evaluated by researchers and academics, helping generate data and analysis that could help inform actions by regulators or Congress.”

He suggests requiring tech companies to obtain first-party consent for data collection, giving the Federal Trade Commission (FTC) more regulatory power to “police data protection and unfair competition in digital markets,” and enacting “comprehensive (GDPR-like) data protection legislation.”

While the policy proposals Warner discusses in the paper are still just proposals, they do show lawmakers increasingly seeking out regulatory solutions to recent big data scandals like Facebook-Cambridge Analytica.

Technology think tanks had all but predicted some kind of regulation for tech companies that collect private user data, and they are already diving deep into the issues.

Research on the subject has been conducted as early as 2013, but not all experts think the focus on regulating companies’ treatment of user privacy is the key to preventing an incident like Facebook-Cambridge Analytica from happening again.

The nonpartisan Technology Policy Institute’s President and Senior Research Fellow Scott Wallsten wrote in an article for RealClearPolitics that “while the users whose data was taken were not directly harmed, anyone whose voting behavior changed because of misinformation targeted with the help of that data was harmed. This private harm aggregates to larger social harms if it affected the outcomes of any elections.”

He goes on to say that “Regardless of whether one believes European-style privacy rules would be a net benefit, they are not a response to the problem at hand. After all, strict privacy rules did not prevent similar election interference in Europe,” and that Facebook’s real responsibility is to stop the spread of misinformation campaigns — which Warner’s paper addresses.

The nonpartisan Center for Democracy and Technology (CDT) noted in a 2017 paper that to efficiently address user privacy and give more control back to users, third parties may need to “perform state-mandated impact assessments of data-management practices, advocating for users’ interests and creating greater transparency,” especially if companies fail to improve transparency and “self-regulate.”

The paper also advocates for a “multipronged approach,” arguing regulation alone will not fix the issue but that private sector transparency and media literacy among individuals are also key.

Furthermore, the CDT wrote in a June paper that not all companies collecting personal information — especially nonprofit organizations — may have the resources to handle that data responsibly.

“Many are not equipped…to implement sophisticated legal, technical, and ethical compliance regimes or to understand how certain data collection, use, and sharing activities could put the communities they serve at risk,” the paper reads.

Big tech companies like Facebook and the others mentioned by Warner may have the resources to comply with future regulations regarding data collection and user privacy, but smaller companies and small businesses may not, which could put them at a competitive disadvantage.

While regulations like the European Union’s GDPR are touted as the solution to improving individual privacy rights, concerns over whether an American version would stifle tech innovation may prevent such a comprehensive regulation from taking off in the U.S.

The CDT was unable to provide comment in response to InsideSources’ request.