
The WeRead Case: Discussion on Reasonable Digital Privacy Expectation

Authored by Yingying Zhu

 

March 2021

Each of us leaves a lasting digital footprint on the internet, and we expect the businesses we deal with to treat our digital privacy with reasonable care and consideration. Can users have a reasonable expectation of privacy in the friends they make and the books they read online? The Beijing Internet Court, in its recently released WeRead judgment, held that the friends list and reading data in the case under dispute are not eligible for privacy protection but are nevertheless entitled to protection as personal information.

Background

The judgment concerns a dispute between an individual, Huang, a user of a book reading app named WeRead, and the digital giant Tencent, the operator of WeChat, the most successful social media platform in China, and of its sister app WeRead. The WeRead app aims to build an app-based reading community where people who enjoy reading can read and connect. The plaintiff Huang complained that WeRead took her friends list from WeChat and automatically turned those friends who were also WeRead subscribers into her connections. Huang also complained that information about the books she read, and how she felt about them, was open to all her connections without her permission, although she intended to keep such information private. In its defense, the defendant Tencent alleged that users' friends lists and reading data were obtained with prior approval from users and that it therefore should not be held liable for its utilization of the data.

Decision of Beijing Internet Court[1]

The Beijing Internet Court (hereinafter the "BIC"), the court of first instance, decided that Huang's friends list and reading data shall not be categorized as private information and hence are not eligible for privacy protection.

To define what constitutes private information, the BIC reasoned on the basis of the following three-layer classification of personal information:

1.     personal information reasonably recognized by society as private, such as one's sexual orientation, sex life, medical history, and undisclosed criminal records;

2.     personal information in which one may hold a defensive expectation or a utilization expectation; and

3.     general information that has no traits of privacy at all.

 

The BIC held that, because one's friends list and reading data do not constitute private information as listed in layer 1 of the above classification, Tencent is not liable for invasion of the plaintiff's privacy.

 

The BIC went on to reason that one's friends list and reading data fall under layer 2 of the above classification, where the information is considered personal but not private; the emphasis of protection is therefore on giving the data subject the right to decide whether to hide or to use such information.

 

The BIC further held that in this case the plaintiff never got the chance to decide how her personal information would be handled, because Tencent failed to give her proper and transparent notice and failed to obtain her affirmative consent before utilizing the information under dispute. The BIC accordingly decided that Tencent should be held liable for violating the plaintiff's legitimate interests in her personal information. The decision is based primarily on Article 43 of the Cybersecurity Law of China.[2]

Discussion

1.    What is Privacy?

According to Eric Hughes, an American mathematician, computer programmer, and cypherpunk, “Privacy is the power to selectively reveal oneself to the world.” [3] Broadly speaking, privacy is the right to be let alone, or freedom from interference or intrusion. Information privacy is the right to have some control over how your personal information is collected and used.[4]

 

The Civil Code of China (2021) defines privacy as peace in a person’s private life and the private space, private activities and private information that a person does not intend for others to know.[5]

 

As a governing law, the Civil Code's definition of privacy is vague. As we know, privacy varies greatly from person to person: while one person may be comfortable with sharing his or her diet recipes online, another may be embarrassed to let others know how little (or how much) he or she eats over a meal. Similarly, while one person may be at ease disclosing many details of his or her personal life to online social connections, another may feel ashamed to post anything personal on the internet. So exactly what kind of privacy does the Civil Code protect? Some guidance from a concurring opinion in a US Supreme Court decision may shed light on this.

 

2.    Reasonable Expectation of Privacy

To define the right to privacy under the Fourth Amendment,[6] US Supreme Court Justice John Marshall Harlan, in his concurring opinion in Katz,[7] formulated a "reasonable expectation of privacy" test. The test has two prongs:

1)     the person must exhibit an “actual (subjective) expectation of privacy”; and

2)     society must be prepared to recognize the expectation as "reasonable."

The Katz "reasonable expectation of privacy" test, while particularly useful for defining privacy, also provokes further questions: What is reasonable? Where should the line be drawn between a "reasonable" expectation and an "unreasonable" one? These questions matter hugely in today's digital world, because every time a user creates a new account on an online platform, the user provides personal details, including name, birthdate, geographic location, and personal interests. Users are entitled to know whether they can have a "reasonable expectation of privacy" in such information and whether that expectation will be respected by the platform.

 

3.    Exceptions to the Reasonable Expectation of Privacy

 

There are several recognized exceptions to the reasonable expectation of privacy. Under the Third-Party Doctrine, once an individual voluntarily shares information with a third-party recipient, the individual loses any reasonable expectation of privacy in that information.[8] Under the Voluntary Consent Doctrine, individuals lose a reasonable expectation of privacy when they consent to a search of private information.[9] Other exceptions include the following: unlawful information is not protected by the law, so there can be no reasonable expectation of privacy in it,[10] and public disclosure of private information forfeits any reasonable expectation of privacy.[11]

 

4.    Where did the Court draw the Line?

 

The BIC evidently referenced the Katz test by reasoning that "the privateness of information that one does not intend to disclose depends on a subjective intent; however, such subjective intent shall be reasonably recognized by the society."

 

The BIC then made the point that information about one's social relationships can invoke a reasonable expectation of privacy only under the following circumstances: the relationship between the data subject and certain connections is too intimate to be disclosed to others, or the disclosure of the social relationship would negatively affect the data subject's social image.

 

With respect to the book reading data, the BIC made a similar point: one can have a reasonable expectation of privacy in one's reading data only if certain reading content falls within the realm of private and secret information, or if the reading data, when accumulated in sufficient amounts, would reflect negatively on the data subject.

 

The BIC then commented that the plaintiff's online social relationships, i.e., her listed friends, are identified by open ID, profile, and nickname, which do not reveal the real social relationship or the degree of intimacy between the plaintiff and her connections. The BIC also reviewed the contents of the plaintiff's reading data and found that neither of the two books displayed to her connections would damage the plaintiff's social image. The plaintiff's reading data therefore should not be categorized as private information, and she had no reasonable expectation of privacy in it.

 

In a nutshell, the BIC defined the "reasonable expectation of privacy" in the digital world based on the content of the information. If a piece of information contains nothing intimate and does not reflect negatively on the data subject, then the data subject has no "reasonable expectation of privacy" in it. This content-based approach is how the BIC drew the line between private and non-private information.

 

5.    Content-based Approach is not Fair

 

The BIC's views on this issue are deeply disturbing. Going back to the definition of privacy: broadly speaking, privacy is the right to be "let alone." It means that when a person walks into an isolated space, the person can expect to be in a state in which he or she is not observed or disturbed by other people,[12] as long as nothing illegal is going on under that roof. Applying the Katz test, this person has a reasonable expectation of privacy: the person demonstrates a subjective expectation of privacy by walking into the isolated space, and that expectation is well recognized by society as reasonable. Furthermore, the person's conduct does not fall within any of the aforesaid exceptions.

 

In solitude, a decent citizen can expect the same degree of privacy as anyone else. The right to privacy does not depend on whether something shameful is being conducted inside the isolated space, nor on what activities happen inside; it depends on whether one's demonstrated intent to be let alone is accepted by society as reasonable. Under the content-based approach, however, a decent citizen would have a weaker expectation of privacy than someone who conducts shameful behaviour in solitude, and this approach plainly leads to unfair results.

 

Here is the digital-world version of the above scenario. When an individual, like the plaintiff Huang, opens an account on an online platform, like WeRead, and secures it with a password, this creates an isolated space in which the person can expect digital privacy. Applying the Katz test, this individual has a reasonable expectation of privacy: he or she demonstrates a subjective expectation of privacy by creating a password-secured account, and that expectation is well recognized by society as reasonable. Likewise, the person's conduct does not fall within any of the aforesaid exceptions.

 

This person is fully entitled to assert a digital privacy right to be "let alone." One can choose not to have any improper friends and not to read any obscene books, yet still enjoy full privacy rights over one's personal information. In other words, being a decent netizen should not compromise one's digital privacy rights. The content of the information stored in a password-secured account, so long as it is nothing unlawful, should not dictate whether and how the person enjoys the right to privacy.

 

The above scenario shows that the content-based approach taken by the BIC is not fair, because it makes users' digital privacy rights conditional on the content of their personal information, i.e., on whether the information includes any embarrassing content. This approach leads to the unfair conclusion that a decent netizen, having nothing shameful to hide, would have no reasonable expectation of digital privacy.

 

Conclusion

 

With the storage and processing of exabytes of data, social media users’ concerns about their privacy have been on the rise in recent years. Incidents of illegal use of data and data breaches have alerted many users and caused them to reconsider their interaction with social media and the security of their personal data.

Disputes caused by unauthorized use of personal information over the internet have spiked in the privacy law landscape. The Beijing Internet Court's present decision, which echoes the same court's decision in the "Dou Yin (Chinese version of TikTok) collection of personal information" case,[13] is among the first few decisions by Chinese courts on this controversial issue. Significantly, the decision might affect ongoing litigation stemming from similar disputes, and other courts around the country might follow suit. It is therefore imperative to have a clearer and fairer approach to defining the reasonable expectation of digital privacy.

In the era of big data, the task of defining privacy in the digital world is under mounting pressure. As Bill Gates put it: "whether it's digital cameras or satellites or just what you click on, we need to have more explicit rules — not just for governments but for private companies." [14]

 

 




[1] Beijing Internet Court, (2019) Jing 0491 Min Chu Zi No. 16142.

[2]  China Cybersecurity Law, Article 43, provides, “Where an individual finds that any network operator collects or uses his or her personal information in violation of the provisions of any law, administrative regulation or the agreement of both parties, the individual shall be entitled to request the network operator to delete his or her personal information. If the individual finds that his or her personal information collected or stored by the network operator has any error, he or she shall be entitled to request the network operator to make corrections. The network operator shall take measures to delete the information or correct the error.”

[3] Eric Hughes, A Cypherpunk's Manifesto (1993), available at https://www.activism.net/cypherpunk/manifesto.html.

[4] See https://iapp.org/about/what-is-privacy/.

[5] Article 1032, China Civil Code (2021).

[6] The Fourth Amendment of the US Constitution, ratified on December 15, 1791, protects the right of people “to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.”

[7] See Katz v. United States, 389 U.S. 347 (1967) (Harlan, J., concurring).

[8] See Smith v. Maryland, 442 U.S. 735, 743-44 (1979).

[9] See Katz v. United States, 389 U.S. 347 (1967).

[10] See https://civillaw.com.cn/bo/t/?id=37410.

[11] Ibid.

[12] See https://www.igi-global.com/dictionary/privacy-data-protection-towards-elderly/23405.

[13] See Beijing Internet Court, (2019) Jing 0491 Min Chu Zi No. 6694.

[14] See https://www.oipc.bc.ca/news/quote-of-the-day-bill-gates/.

