
The WeRead Case: Discussion on Reasonable Digital Privacy Expectation

Authored by Yingying Zhu

 

March 2021

Each of us leaves a lasting digital footprint on the internet, and we expect the businesses we deal with to treat our digital privacy with reasonable care and consideration. Can users have a reasonable expectation of privacy in the friends they make and the books they read online? In its recently released WeRead judgment, the Beijing Internet Court held that the friends list and reading data at issue were not eligible for privacy protection, but were nevertheless entitled to protection as personal information.

Background

The judgment concerns a dispute between an individual, Huang, a user of a book-reading app named WeRead, and the digital giant Tencent, operator of WeChat, the most successful social media platform in China, and of its sister app WeRead. The WeRead app aims to build an app-based reading community where people who enjoy reading can read and connect. The plaintiff, Huang, complained that WeRead pulled her friends list from WeChat and automatically turned those friends who were also WeRead subscribers into her connections. Huang further complained that information about the books she read, and how she felt about them, was open to all her connections without her permission, although she intended to keep such information private. In its defense, Tencent alleged that users’ friends lists and reading data were obtained with users’ prior approval, and that it therefore should not be held liable for the utilization of the data.

Decision of Beijing Internet Court[1]

The Beijing Internet Court (hereinafter the “BIC”), the court of first instance, held that Huang’s friends list and reading data should not be categorized as private information and hence were not eligible for privacy protection.

To define what constitutes private information, the BIC’s reasoning rests on a classification of personal information into the following three layers:

1.     personal information reasonably recognized by society as private, such as one’s sexual orientation, sex life, medical history and undisclosed criminal records;

2.     personal information on which one may hold a defensive expectation or a utilization expectation; and

3.     general information that has no traits of privacy at all.

 

The BIC held that, because one’s friends list and reading data do not constitute private information under layer 1 of the above classification, Tencent is not liable for invasion of the plaintiff’s privacy.

 

The BIC went on to reason that one’s friends list and reading data fall under layer 2 of the above classification, where the information is considered personal but not private; the emphasis of protection is therefore on giving the data subject the right to decide whether to hide or to use such information.

 

The BIC further held that in this case the plaintiff never got the chance to decide how her personal information would be handled, because Tencent failed to give her proper and transparent notice and failed to obtain her affirmative consent before utilizing the information in dispute. The BIC accordingly held Tencent liable for violating the plaintiff’s legitimate interests in her personal information. The decision is based mainly on Article 43 of the Cybersecurity Law of China.[2]

Discussion

1.    What is Privacy?

According to Eric Hughes, an American mathematician, computer programmer, and cypherpunk, “Privacy is the power to selectively reveal oneself to the world.”[3] Broadly speaking, privacy is the right to be let alone, or freedom from interference or intrusion. Information privacy is the right to have some control over how your personal information is collected and used.[4]

 

The Civil Code of China (2021) defines privacy as the peace of a person’s private life and the private space, private activities and private information that a person does not intend for others to know.[5]

 

As a governing law, the Civil Code’s definition of privacy is vague. As we know, privacy varies greatly from person to person: while one person may be comfortable sharing his or her diet recipes online, another may be embarrassed to let others know how little (or how much) he or she eats at a meal. Similarly, while one person may be at ease disclosing many details of his or her personal life to online social connections, another may feel ashamed to post anything personal on the internet. So exactly what kind of privacy does the Civil Code protect? Guidance from a concurring opinion in a US Supreme Court decision may shed some light on this.

 

2.    Reasonable Expectation of Privacy

To define the right to privacy under the Fourth Amendment,[6] US Supreme Court Justice John Marshall Harlan, in his concurring opinion in Katz,[7] formulated a “reasonable expectation of privacy” test. The test has two prongs:

1)     the person must exhibit an “actual (subjective) expectation of privacy”; and

2)     the expectation must be one that society recognizes as “reasonable.”

The Katz “reasonable expectation of privacy” test, while particularly useful for defining privacy, also provokes further questions: What is reasonable? Where should the line be drawn between a “reasonable” expectation and one that is “unreasonable”? These questions matter hugely in today’s digital world, because every time a user creates a new account on an online platform, the user provides personal details, including name, birthdate, geographic location, and personal interests. Users are entitled to know whether they can have a “reasonable expectation of privacy” in such information and whether that expectation will be respected by the platform.

 

3.    Exceptions to the Reasonable Expectation of Privacy

 

There are several recognized exceptions to the reasonable expectation of privacy. Under the Third-Party Doctrine, once an individual voluntarily shares information with a third-party recipient, the individual loses any reasonable expectation of privacy in that information.[8] Under the Voluntary Consent Doctrine, individuals lose a reasonable expectation of privacy when they consent to a search of private information.[9] Other exceptions include the following: unlawful information is not protected by the law, so there can be no reasonable expectation of privacy in it,[10] and public disclosure of private information forfeits any reasonable expectation of privacy.[11]

 

4.    Where did the Court draw the Line?

 

The BIC evidently referenced the Katz test by reasoning that “the privateness in the information that one does not intend to disclose depends on a subjective intent, however, such subjective intent shall be reasonably recognized by the society.”

 

The BIC then held that information about one’s social relationships could invoke a reasonable expectation of privacy only under the following circumstances: the relationship between the data subject and certain connections is too intimate to be disclosed to others, or the disclosure of the relationship would negatively affect the data subject’s social image.

 

With respect to the book-reading data, the BIC made a similar point: one could have a reasonable expectation of privacy in one’s reading data only if certain reading content falls into a private and secret domain, or if the reading data, accumulated in certain amounts, would reflect negatively on the data subject.

 

The BIC then observed that the plaintiff’s online social relationships, i.e., her listed friends, were identified by open-ID, profile and nickname, which do not reveal the real social relationships or the degree of intimacy between the plaintiff and her connections. The BIC also reviewed the contents of the plaintiff’s reading data and found that neither of the two books displayed to her connections would damage the plaintiff’s social image. The plaintiff’s reading data therefore should not be categorized as private information, and hence she had no reasonable privacy expectation in the data.

 

In a nutshell, the BIC defined the “reasonable expectation of privacy” in the digital world based on the content of the information. If a piece of information contains nothing intimate and cannot reflect negatively on the data subject, then the data subject has no “reasonable expectation of privacy” in it. This content-based approach is how the BIC drew the line between privacy-related and non-privacy-related information.

 

5.    The Content-Based Approach Is Not Fair

 

The BIC’s views on this issue are deeply disturbing. Returning to the definition of privacy: broadly speaking, privacy is the right to be “let alone”. It means that when a person walks into an isolated space, that person can expect to be in a state in which he or she is not observed or disturbed by other people,[12] as long as nothing illegal is going on under that roof. Applying the Katz test, this person has a reasonable expectation of privacy: the person demonstrates a subjective expectation of privacy by walking into the isolated space, an expectation that society recognizes as reasonable. Furthermore, the person’s act does not fall within any of the aforesaid exceptions.

 

In solitude, a decent citizen can expect the same degree of privacy as anyone else. The right to privacy does not depend on whether something shameful is being conducted inside that isolated space; it does not depend on the activity that happens inside. Instead, it depends on whether one’s demonstrated intent to be let alone is accepted by society as reasonable. Under the content-based approach, however, a decent citizen would have a lesser expectation of privacy than someone who conducts shameful behaviour in solitude, and this approach plainly leads to unfair results.

 

Here is the digital-world version of the above scenario. When an individual like the plaintiff Huang opens an account on an online platform like WeRead and secures it with a password, this creates an isolated space in which the person can expect digital privacy. Applying the Katz test, this individual has a reasonable expectation of privacy: he or she demonstrates a subjective expectation of privacy by creating a password-secured account, an expectation that society recognizes as reasonable. Likewise, the person’s act does not fall within any of the aforesaid exceptions.

 

This person is fully entitled to assert a digital privacy right to be “let alone”. One can choose not to have any improper friends and not to read any obscene books, yet still enjoy full privacy rights over one’s personal information. In other words, being a decent netizen should not compromise one’s digital privacy rights. The content of the information stored in a password-secured account, so long as it is nothing unlawful, should not dictate whether and how the person enjoys the right to privacy.

 

The above scenario shows that the content-based approach taken by the BIC is not fair, because it makes users’ digital privacy rights conditional on the content of their personal information, i.e., on whether the information includes any embarrassing content. This approach leads to the unfair conclusion that a decent netizen, having nothing shameful to hide, would have no reasonable expectation of digital privacy.

 

Conclusion

 

With the storage and processing of exabytes of data, social media users’ concerns about their privacy have been on the rise in recent years. Incidents of illegal data use and data breaches have alarmed many users and caused them to reconsider their interaction with social media and the security of their personal data.

Disputes over the unauthorized use of personal information on the internet have spiked in the privacy law landscape. The Beijing Internet Court’s present decision, which echoes the same court’s decision in the “Douyin (the Chinese version of TikTok) collection of personal information” case,[13] is among the first few decisions by Chinese courts on this controversial issue. Significantly, the decision may affect ongoing litigation stemming from similar disputes, and other courts around the country may follow suit. It is therefore imperative to have a clearer and fairer approach to defining the reasonable expectation of digital privacy.

In the era of big data, the task of defining privacy in the digital world is under pressure. As Bill Gates put it: “whether it’s digital cameras or satellites or just what you click on, we need to have more explicit rules — not just for governments but for private companies.”[14]

 

 




[1] Beijing Internet Court, (2019) Jing 0491 Min Chu Zi No. 16142.

[2]  China Cybersecurity Law, Article 43, provides, “Where an individual finds that any network operator collects or uses his or her personal information in violation of the provisions of any law, administrative regulation or the agreement of both parties, the individual shall be entitled to request the network operator to delete his or her personal information. If the individual finds that his or her personal information collected or stored by the network operator has any error, he or she shall be entitled to request the network operator to make corrections. The network operator shall take measures to delete the information or correct the error.”

[3] Eric Hughes, The Cypherpunk Manifesto (1993), see https://www.activism.net/cypherpunk/manifesto.html.

[4] See https://iapp.org/about/what-is-privacy/.

[5] Article 1032, China Civil Code (2021).

[6] The Fourth Amendment of the US Constitution, ratified on December 15, 1791, protects the right of people “to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.”

[7] See Katz v. United States, 389 U.S. 347 (1967). Concurring opinion written by Justice Harlan.

[8] See Smith v. Maryland, 442 U.S. 735, 743-44 (1979).

[9] See Katz v. United States, 389 U.S. 347 (1967).

[10] See https://civillaw.com.cn/bo/t/?id=37410.

[11] Ibid.

[12] See https://www.igi-global.com/dictionary/privacy-data-protection-towards-elderly/23405.

[13] See Beijing Internet Court, (2019) Jing 0491 Min Chu Zi No. 6694.

[14] See https://www.oipc.bc.ca/news/quote-of-the-day-bill-gates/.

