
Regulation of Algorithmic Regulation Begins

A Chinese law that went into effect six months ago requires online service providers to file details of the algorithms they use with China’s centralized regulator, the Cyberspace Administration of China (CAC). In mid-August, the CAC released a list of 30 algorithms used by companies such as Alibaba, Tencent and Douyin, the Chinese version of TikTok, along with brief descriptions of their purpose. The move reflects a broader trend: proposed legislation in both the US and the EU also embraces the idea of “regulating algorithms.”

The information publicly released by the CAC is not much different from what Facebook already discloses about the algorithm governing users’ Feed, or from how YouTube describes its own algorithm. But the CAC holds far more detailed data and can intervene in a top-down manner with a great deal of discretion.

The question now is what will be done with this information.

Automated filtering and prioritization of content is an unavoidable aspect of large-scale digital content-sharing platforms. The massive amount of content available, most of which is of little interest to any given user, would quickly overwhelm them. Algorithms thus provide a hyper-scale, automated version of the editorial function that traditional media performed with human gatekeepers. The algorithms also serve the business purposes of the platforms (just as traditional editorial functions do): matching ads to targeted user types, promoting engagement, and maintaining an environment that doesn’t offend or scare away users.

China’s experience may provide a useful test of the simplistic idea that nice, impartial government regulators can be better editors than profit-motivated businesses. The CAC says it will supervise algorithms to protect workers’ and consumers’ rights and to prevent the manipulation of search results, yet manipulating search results is one of the key features of China’s public opinion management regime, and the CAC has openly identified “safeguarding ideological security” and the promotion of “positive energy” as goals of its intervention. While not all the governments seeking to regulate algorithms are authoritarian, all of them are political in nature, and the politicization of what algorithms do cannot possibly do anything but make the world a better place, we are sure.

By Milton Mueller, Professor, Georgia Institute of Technology School of Public Policy
