
The WeRead Case: A Discussion of Reasonable Digital Privacy Expectations

Authored by Yingying Zhu

 

March 2021

Each of us leaves a lasting digital footprint on the internet and expects the businesses we deal with to treat our digital privacy with reasonable care and consideration. Can users have a reasonable expectation of privacy in the friends they make and the books they read online? The Beijing Internet Court, in its recently released WeRead judgment, holds that the plaintiff's friends list and reading data are not eligible for privacy protection in the case under dispute, but are nevertheless entitled to protection as personal information.

Background

The judgment concerns a dispute between Huang, an individual user of a book-reading app named WeRead, and the digital giant Tencent, which operates both WeRead and its sister app WeChat, the most successful social media platform in China. The WeRead app aims to build an app-based reading community where people who enjoy reading can read and connect. The plaintiff, Huang, complained that WeRead pulled her friends list from WeChat and automatically turned those friends who were also WeRead subscribers into her connections. Huang also complained that information about the books she read, and how she felt about them, was open to all her connections without her permission, even though she intended to keep such information private. In its defense, the defendant Tencent argued that users' friends lists and reading data were obtained with the users' prior approval and that it therefore should not be held liable for its use of the data.

Decision of the Beijing Internet Court[1]

The Beijing Internet Court (hereinafter the “BIC”), the court of first instance, holds that Huang’s friends list and reading data should not be categorized as private information and are therefore not eligible for privacy protection.

To define what constitutes private information, the BIC bases its reasoning on a classification of personal information into the following three layers:

1.     personal information reasonably recognized by society as private, such as one’s sexual orientation, sex life, medical history, and undisclosed criminal records;

2.     personal information on which one may hold a defensive expectation or a utilization expectation; and

3.     general information that has no traits of privacy at all.

 

The BIC holds that, because one’s friends list and reading data do not constitute private information under layer 1 of the above classification, Tencent is not liable for invading the plaintiff’s privacy.

 

The BIC goes on to reason that one’s friends list and reading data fall under layer 2 of the above classification, where the information is considered personal but not private; the emphasis of protection is therefore on giving the data subject the right to decide whether to hide or to use such information.

 

The BIC further holds that in this case the plaintiff was not given the chance to decide how to deal with her personal information, because Tencent failed to give her proper and transparent notice and failed to obtain her affirmative consent before using the information under dispute. The BIC accordingly decides that Tencent should be held liable for violating the plaintiff’s legitimate interests in her personal information. The BIC’s decision is based primarily on Article 43 of the Cybersecurity Law of China.[2]

Discussion

1.    What Is Privacy?

According to Eric Hughes, an American mathematician, computer programmer, and cypherpunk, “Privacy is the power to selectively reveal oneself to the world.” [3] Broadly speaking, privacy is the right to be let alone, or freedom from interference or intrusion. Information privacy is the right to have some control over how your personal information is collected and used.[4]

 

The Civil Code of China (2021) defines privacy as the peace of a person’s private life and the private space, private activities, and private information that the person does not intend for others to know.[5]

 

As a governing law, the Civil Code’s definition of privacy is vague. As we know, privacy varies greatly from person to person: one person may be comfortable sharing his or her diet online, while another may be embarrassed to let others know how little (or how much) he or she eats at a meal. Similarly, one person may be at ease disclosing many details of his or her personal life to online social connections, while another may feel ashamed to post anything personal on the internet. So exactly what kind of privacy does the Civil Code protect? A concurring opinion in a US Supreme Court decision may shed some light on this.

 

2.    Reasonable Expectation of Privacy

To define the right to privacy under the Fourth Amendment,[6] US Supreme Court Justice John Marshall Harlan, in his concurring opinion in Katz,[7] formulated the “reasonable expectation of privacy” test. The test has two prongs:

1)     the person must exhibit an “actual (subjective) expectation of privacy”; and

2)     society recognizes the expectation as “reasonable.”

The Katz “reasonable expectation of privacy” test, while particularly useful for defining privacy, also provokes further questions: What is reasonable? Where should the line be drawn between a “reasonable” expectation and an “unreasonable” one? These questions matter hugely in today’s digital world, because every time a user creates a new account on an online platform, the user provides personal details, including name, birthdate, geographic location, and personal interests. Users are entitled to know whether they can have a “reasonable expectation of privacy” in such information and whether that expectation will be respected by the platform.

 

3.    Exceptions to the Reasonable Expectation of Privacy

 

There are several recognized exceptions to the reasonable expectation of privacy. Under the third-party doctrine, once an individual voluntarily shares information with a third-party recipient, the individual loses any reasonable expectation of privacy in that information.[8] Under the voluntary consent doctrine, individuals lose a reasonable expectation of privacy when they consent to a search of their private information.[9] Other exceptions include the following: unlawful information is not protected by law, so there can be no reasonable expectation of privacy in it,[10] and public disclosure of private information forfeits any reasonable expectation of privacy in it.[11]

 

4.    Where Did the Court Draw the Line?

 

The BIC evidently drew on the Katz test, reasoning that “the privateness in the information that one does not intend to disclose depends on a subjective intent, however, such subjective intent shall be reasonably recognized by the society.”

 

The BIC then made the point that information about one’s social relationships could invoke a reasonable expectation of privacy only under the following circumstances: the relationship between the data subject and certain connections is too intimate to let others know of, or the disclosure of a social relationship would negatively affect the data subject’s social image.

 

With respect to the reading data, the BIC made a similar point: one can have a reasonable expectation of privacy in one’s reading data only if certain reading content falls within the realm of private and secret information, or if the reading data, once accumulated in sufficient amounts, would reflect negatively on the data subject.

 

The BIC then commented that the plaintiff’s online social relationships, i.e., her listed friends, are identified only by open ID, profile, and nickname, which do not reveal the real social relationship or the degree of intimacy between the plaintiff and her connections. The BIC also reviewed the contents of the plaintiff’s reading data and found that neither of the two books displayed to her connections would damage the plaintiff’s social image. The plaintiff’s reading data therefore should not be categorized as private information, and she has no reasonable expectation of privacy in it.

 

In a nutshell, the BIC defined the “reasonable expectation of privacy” in the digital world based on the content of the information: if a piece of information contains nothing intimate and does not reflect negatively on the data subject, then the data subject has no “reasonable expectation of privacy” in it. This content-based approach is how the BIC drew the line between private and non-private information.

 

5.    The Content-based Approach Is Not Fair

 

The BIC’s views on this issue are deeply disturbing. To return to the definition of privacy: broadly speaking, privacy is the right to be “let alone”. It means that when a person walks into an isolated space, the person can expect not to be observed or disturbed by other people,[12] as long as nothing illegal is going on under that roof. Applying the Katz test, this person has a reasonable expectation of privacy: the person demonstrates a subjective expectation of privacy by walking into the isolated space, and society recognizes that expectation as reasonable. Furthermore, the person’s conduct does not fall within any of the aforesaid exceptions.

 

In solitude, a decent citizen can expect the same degree of privacy as anyone else. The right to privacy does not depend on whether something shameful is being done inside that isolated space, nor on what activity takes place inside. Instead, it depends on whether one’s demonstrated intent to be let alone can be accepted by society as reasonable. Under the content-based approach, however, a decent citizen would have a lesser expectation of privacy than someone who engages in shameful behaviour in solitude, and this approach plainly leads to unfair results.

 

Now consider the digital-world version of the above scenario. When an individual like the plaintiff Huang opens an account on an online platform such as WeRead and secures it with a password, this creates an isolated space in which the person can expect digital privacy. Applying the Katz test, this individual has a reasonable expectation of privacy: he or she demonstrates a subjective expectation of privacy by creating a password-secured account, and society recognizes that expectation as reasonable. Likewise, the person’s conduct does not fall within any of the aforesaid exceptions.

 

This person is fully entitled to assert a digital privacy right to be “let alone”. One can choose not to have any improper friends and not to read any obscene books, and still enjoy full privacy rights over one’s personal information. In other words, being a decent netizen should not compromise one’s digital privacy rights. The content of the information stored in a password-secured account, provided it is not unlawful, should not dictate whether and how the person enjoys the right to privacy.

 

The above scenario shows that the content-based approach taken by the BIC is not fair, because it makes users’ digital privacy rights conditional on the content of their personal information, i.e., on whether the information includes anything embarrassing. This approach leads to the unfair conclusion that a decent netizen, having nothing shameful to hide, has no reasonable expectation of digital privacy.

 

Conclusion

 

With the storage and processing of exabytes of data, social media users’ concerns about their privacy have been on the rise in recent years. Incidents of illegal use of data and data breaches have alerted many users and caused them to reconsider their interaction with social media and the security of their personal data.

Disputes over the unauthorized use of personal information on the internet have spiked in the privacy law landscape. The Beijing Internet Court’s decision, which echoes the same court’s decision in the “Douyin (the Chinese version of TikTok) collection of personal information” case,[13] is among the first few decisions by Chinese courts on this controversial issue. Significantly, the decision may affect ongoing litigation stemming from similar disputes, and other courts around the country may follow suit. It is therefore imperative to adopt a clearer and fairer approach to defining the reasonable expectation of digital privacy.

In the era of big data, the definition of privacy is under pressure in the digital world. As Bill Gates put it: “whether it’s digital cameras or satellites or just what you click on, we need to have more explicit rules — not just for governments but for private companies.”[14]

 

 




[1] Beijing Internet Court, (2019) Jing 0491 Min Chu Zi No. 16142.

[2]  China Cybersecurity Law, Article 43, provides, “Where an individual finds that any network operator collects or uses his or her personal information in violation of the provisions of any law, administrative regulation or the agreement of both parties, the individual shall be entitled to request the network operator to delete his or her personal information. If the individual finds that his or her personal information collected or stored by the network operator has any error, he or she shall be entitled to request the network operator to make corrections. The network operator shall take measures to delete the information or correct the error.”

[3] Eric Hughes, A Cypherpunk’s Manifesto (1993), see https://www.activism.net/cypherpunk/manifesto.html.

[4] See https://iapp.org/about/what-is-privacy/.

[5] Article 1032, China Civil Code (2021).

[6] The Fourth Amendment of the US Constitution, ratified on December 15, 1791, protects the right of people “to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.”

[7] See Katz v. United States, 389 U.S. 347 (1967) (Harlan, J., concurring).

[8] See Smith v. Maryland, 442 U.S. 735, 743-44 (1979).

[9] See Katz v. United States, 389 U.S. 347 (1967).

[10] See https://civillaw.com.cn/bo/t/?id=37410.

[11] Ibid.

[12] See https://www.igi-global.com/dictionary/privacy-data-protection-towards-elderly/23405.

[13] See Beijing Internet Court, (2019) Jing 0491 Min Chu Zi No. 6694.

[14] See https://www.oipc.bc.ca/news/quote-of-the-day-bill-gates/.

