The WeRead Case: Discussion on Reasonable Digital Privacy Expectation

Authored by Yingying Zhu

 

March 2021

Each of us leaves a lasting digital footprint on the internet and expects the businesses we deal with to treat our digital privacy with reasonable care and consideration. Can users have a reasonable expectation of privacy in the friends they make and the books they read online? The Beijing Internet Court, in its recently released WeRead judgment, holds that the friends list and reading data in the case under dispute are not eligible for privacy protection, but are nevertheless entitled to protection as personal information.

Background

The judgment concerns a dispute between an individual, Huang, a user of a book reading app named WeRead, and the digital giant Tencent, operator of China's most successful social media platform, WeChat, and its sister app WeRead. The WeRead app aims to build an app-based reading community where people who enjoy reading can read and connect. The plaintiff Huang complained that WeRead extracted her friends list from WeChat and automatically turned those friends who were also WeRead subscribers into her connections. Huang also complained that information about the books she read, and how she felt about them, was open to all her connections without her permission, even though she intended to keep such information private. In its defense, Tencent alleged that users' friends lists and reading data were obtained with users' prior approval, and that it therefore should not be held liable for utilizing the data.

Decision of Beijing Internet Court[1]

The Beijing Internet Court (hereinafter the "BIC"), the court of first instance, decides that Huang's friends list and reading data shall not be categorized as private information, and hence are not eligible for privacy protection.

To define what constitutes private information, the BIC's reasoning rests on a classification of personal information into the following three layers:

1.     personal information that society reasonably recognizes as private, such as one's sexual orientation, sex life, medical history and undisclosed criminal records;

2.     personal information on which one may hold a defensive expectation or a utilization expectation; and

3.     general information that has no trait of privacy at all.

 

The BIC holds that, because one's friends list and reading data do not constitute private information under layer 1 of the above classification, Tencent is not liable for invasion of the plaintiff's privacy.

 

The BIC goes on to reason that one's friends list and reading data fall under layer 2 of the above classification, where the information is considered personal but not private; the emphasis of protection is therefore on giving the data subject the right to decide whether to hide or to use such information.

 

The BIC further holds that in this case the plaintiff never got the chance to decide how to deal with her personal information, because Tencent failed to give proper and transparent notice to the plaintiff and failed to obtain her affirmative consent before utilizing the information under dispute. The BIC accordingly decides that Tencent should be held liable for violating the plaintiff's legitimate interests in her personal information. The BIC's decision is based primarily on Article 43 of the Cybersecurity Law of China.[2]

Discussion

1.    What is Privacy?

According to Eric Hughes, an American mathematician, computer programmer, and cypherpunk, "Privacy is the power to selectively reveal oneself to the world."[3] Broadly speaking, privacy is the right to be let alone, or freedom from interference or intrusion. Information privacy is the right to have some control over how your personal information is collected and used.[4]

 

The Civil Code of China (2021) defines privacy as the peace of a person's private life, and the private space, private activities and private information that a person does not intend for others to know.[5]

 

As governing law, the Civil Code's definition of privacy is vague. As we know, privacy varies greatly from person to person: while one person may be comfortable showing his or her diet online, another may be embarrassed to let others know how little (or how much) he or she eats at a meal. Similarly, while one person may be at ease disclosing many details of his or her personal life to online social connections, another may feel ashamed to post anything personal on the internet. So exactly what kind of privacy does the Civil Code protect? Guidance from a concurring opinion in a US Supreme Court decision may shed some light on this.

 

2.    Reasonable Expectation of Privacy

To define the right to privacy under the Fourth Amendment,[6] US Supreme Court Justice John Marshall Harlan, in his concurring opinion in Katz,[7] formulated a "reasonable expectation of privacy" test. The test has two prongs:

1)     the person must exhibit an “actual (subjective) expectation of privacy”; and

2)     society recognizes the expectation as “reasonable.”

The Katz "reasonable expectation of privacy" test, while particularly useful for defining privacy, also provokes further questions: What is reasonable? Where do we draw the line between a "reasonable" expectation and an "unreasonable" one? These questions matter hugely in today's digital world, because every time a user creates a new account on an online platform, the user provides personal details, including name, birthdate, geographic location and personal interests. Users are entitled to know whether they can have a "reasonable expectation of privacy" in such information and whether the platform will respect that expectation.

 

3.    Exceptions to the Reasonable Expectation of Privacy

 

There are several recognized exceptions to the reasonable expectation of privacy. Under the third-party doctrine, once an individual voluntarily shares information with a third-party recipient, the individual loses any reasonable expectation of privacy in that information.[8] Under the voluntary consent doctrine, individuals lose a reasonable expectation of privacy when they consent to a search of private information.[9] Other exceptions include the following: unlawful information is not protected by law, so there can be no reasonable expectation of privacy in it,[10] and public disclosure of private information forfeits any reasonable expectation of privacy.[11]

 

4.    Where did the Court draw the Line?

 

The BIC evidently referenced the Katz test by reasoning that "the privateness in the information that one does not intend to disclose depends on a subjective intent, however, such subjective intent shall be reasonably recognized by the society."

 

The BIC then made the point that information about one's social relationships could invoke a reasonable expectation of privacy only under the following circumstances: the relationship between the data subject and certain connections is too intimate to let others know, or the disclosure of a social relationship would negatively affect the data subject's social image.

 

With respect to the book reading data, the BIC made a similar point: one could have a reasonable expectation of privacy in one's reading data only if certain reading contents fall into some private and secret region of information, or if the reading data, accumulated in certain amounts, would reflect negatively on the data subject.

 

The BIC then commented that the plaintiff's online social relationships, i.e., the listed friends, were identified by open-ID, profile and nickname, which did not reveal the real social relationship or the degree of intimacy between the plaintiff and her connections. The BIC also reviewed the contents of the plaintiff's reading data and found that neither of the two books displayed to her connections would cause any damage to the plaintiff's social image. The plaintiff's reading data therefore should not be categorized as private information, and she could have no reasonable expectation of privacy in it.

 

In a nutshell, the BIC defined "reasonable expectation of privacy" in the digital world based on the content of the information. If a piece of information contains nothing intimate and cannot reflect negatively on the data subject, then the data subject has no "reasonable expectation of privacy" in it. This content-based approach is how the BIC drew the line between private and non-private information.

 

5.    Content-based Approach is not Fair

 

The BIC's views on this issue are deeply disturbing. Returning to the definition of privacy: broadly speaking, privacy is the right to be "let alone." When a person walks into an isolated space, the person may expect to be in a state in which he or she is not observed or disturbed by other people,[12] as long as nothing illegal is going on under that roof. Applying the Katz test, this person has a reasonable expectation of privacy: the person demonstrates a subjective expectation of privacy by walking into the isolated space, and society recognizes that expectation as reasonable. Furthermore, the person's act does not fall into any of the aforesaid exceptions.

 

In solitude, a decent citizen may expect the same degree of privacy as anyone else. The right to privacy does not depend on whether something shameful is being conducted inside that isolated space, nor on the activity happening inside; rather, it depends on whether one's demonstration of an intent to be let alone is accepted by society as reasonable. Under the content-based approach, however, a decent citizen would have less expectation of privacy than someone who conducts shameful behaviour in solitude, and this apparently leads to unfair results.

 

Here is the digital-world version of the above scenario. When an individual, like the plaintiff Huang, opens an account on an online platform, like WeRead, and secures it with a password, this creates an isolated space in which the person may expect digital privacy. Applying the Katz test, this individual has a reasonable expectation of privacy: he or she demonstrates a subjective expectation of privacy by creating a password-secured account, and society recognizes that expectation as reasonable. Likewise, the person's act does not fall into any of the aforesaid exceptions.

 

This person is fully entitled to assert a digital privacy right to be "let alone." One can choose not to have any improper friends and not to read any obscene books, yet still enjoy full privacy rights over one's personal information. Being a decent netizen, in other words, should not compromise one's digital privacy rights. The content of the information stored in a password-secured account, so long as it is nothing unlawful, should not dictate whether and how the person enjoys the right to privacy.

 

The above scenario shows that the content-based approach taken by the BIC is unfair because it makes users' digital privacy rights conditional on the content of their personal information, i.e., on whether the information includes embarrassing content. This approach leads to the unfair conclusion that a decent netizen, having nothing shameful to hide, has no reasonable expectation of digital privacy.

 

Conclusion

 

With the storage and processing of exabytes of data, social media users' concerns about their privacy have risen in recent years. Incidents of illegal data use and data breaches have alarmed many users and caused them to reconsider their interaction with social media and the security of their personal data.

Disputes caused by unauthorized use of personal information over the internet have spiked in the privacy law landscape. The Beijing Internet Court's present decision, which echoes the same court's decision in the "Dou Yin (Chinese version of TikTok) collection of personal information" case,[13] is among the first few decisions by Chinese courts on this controversial issue. Significantly, the decision may affect ongoing litigation stemming from similar disputes, and other courts around the country may follow suit. It is therefore imperative to have a clearer and fairer approach to defining reasonable digital privacy expectations.

In the era of big data, the task of defining privacy in the digital world is under pressure. As Bill Gates put it: "whether it's digital cameras or satellites or just what you click on, we need to have more explicit rules — not just for governments but for private companies."[14]

 

 




[1] Beijing Internet Court, (2019) Jing 0491 Min Chu Zi No. 16142.

[2]  China Cybersecurity Law, Article 43, provides, “Where an individual finds that any network operator collects or uses his or her personal information in violation of the provisions of any law, administrative regulation or the agreement of both parties, the individual shall be entitled to request the network operator to delete his or her personal information. If the individual finds that his or her personal information collected or stored by the network operator has any error, he or she shall be entitled to request the network operator to make corrections. The network operator shall take measures to delete the information or correct the error.”

[3] Eric Hughes, The Cypherpunk Manifesto (1993), see https://www.activism.net/cypherpunk/manifesto.html.

[4] See https://iapp.org/about/what-is-privacy/.

[5] Article 1032, China Civil Code (2021).

[6] The Fourth Amendment of the US Constitution, ratified on December 15, 1791, protects the right of people “to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.”

[7] See Katz v. United States, 389 U.S. 347 (1967) (Harlan, J., concurring).

[8] See Smith v. Maryland, 442 U.S. 735, 743-44 (1979).

[9] See Katz v. United States, 389 U.S. 347 (1967).

[10] See https://civillaw.com.cn/bo/t/?id=37410.

[11] Ibid.

[12] See https://www.igi-global.com/dictionary/privacy-data-protection-towards-elderly/23405.

[13] See Beijing Internet Court, (2019) Jing 0491 Min Chu Zi No. 6694.

[14] See https://www.oipc.bc.ca/news/quote-of-the-day-bill-gates/.
