The WeRead Case: Discussion on Reasonable Digital Privacy Expectation

Authored by Yingying Zhu


March 2021

Each of us leaves a lasting digital footprint on the internet and expects the businesses we deal with to treat our digital privacy with reasonable care and consideration. Can users have a reasonable expectation of privacy in the friends they make and the books they read online? In its recently released WeRead judgment, the Beijing Internet Court holds that friends lists and reading data are not eligible for privacy protection in the case under dispute but are nevertheless entitled to protection as personal information.


The judgment concerns a dispute between an individual, Huang, a user of the book-reading app WeRead, and the digital giant Tencent, operator of WeChat, the most successful social media platform in China, and of its sister app WeRead. The WeRead app aims to build an app-based reading community where people who enjoy reading can read and connect. The plaintiff, Huang, complained that WeRead extracted her friends list from WeChat and automatically turned those friends who were also WeRead subscribers into her connections. Huang also complained that information about the books she read and how she felt about them was open to all of her connections without her permission, even though she intended to keep such information private. In its defense, Tencent alleged that users' friends lists and reading data were obtained with prior approval from users and that it therefore should not be held liable for utilizing the data.

Decision of Beijing Internet Court[1]

The Beijing Internet Court (hereinafter the "BIC"), the court of first instance, decides that Huang's friends list and reading data shall not be categorized as private information and hence are not eligible for privacy protection.

To define what constitutes private information, the BIC's reasoning rests on classifying personal information into the following three layers:

1.     personal information reasonably recognized by society as private, such as one's sexual orientation, sex life, medical history, and undisclosed criminal records;

2.     personal information on which one may hold a defensive expectation or a utilization expectation; and

3.     general information that has no traits of privacy at all.


The BIC holds that, because one's friends list and reading data do not constitute private information as listed in layer 1 of the above classification, Tencent is not liable for invasion of the plaintiff's privacy.


The BIC goes on to reason that one's friends list and reading data fall under layer 2 of the above classification, where the information is considered personal but not private, and the emphasis of protection is therefore on giving the data subject the right to decide whether to hide or to use such information.


The BIC further holds that in this case the plaintiff never got the chance to decide how to deal with her personal information, because Tencent failed to give her proper and transparent notice and failed to obtain her affirmative consent before utilizing the information under dispute. The BIC then decides that Tencent should be held liable for violating the plaintiff's legitimate interests in her personal information. The BIC's decision is largely based on Article 43 of the Cybersecurity Law of China.[2]


1.    What is Privacy?

According to Eric Hughes, an American mathematician, computer programmer, and cypherpunk, “Privacy is the power to selectively reveal oneself to the world.” [3] Broadly speaking, privacy is the right to be let alone, or freedom from interference or intrusion. Information privacy is the right to have some control over how your personal information is collected and used.[4]


The Civil Code of China (2021) defines privacy as peace in a person’s private life and the private space, private activities and private information that a person does not intend for others to know.[5]


As governing law, the Civil Code's definition of privacy is vague. Privacy varies greatly from person to person: while one person may be comfortable sharing his or her diet online, another may be embarrassed to let others know how little (or how much) he or she eats at a meal. Similarly, while one person may be at ease disclosing many details of his or her personal life to online social connections, another may feel ashamed to post anything personal on the internet. So exactly what kind of privacy does the Civil Code protect? Guidance from a concurring opinion in a US Supreme Court decision may shed some light on this.


2.    Reasonable Expectation of Privacy

To define the right to privacy under the Fourth Amendment,[6] US Supreme Court Justice John Marshall Harlan, in his concurring opinion in Katz,[7] formulated a "reasonable expectation of privacy" test. The test has two prongs:

1)     the person must exhibit an “actual (subjective) expectation of privacy”; and

2)     society recognizes the expectation as “reasonable.”

The Katz "reasonable expectation of privacy" test, while particularly useful for defining privacy, also provokes further questions: What is reasonable? Where is the line between a "reasonable" expectation and an "unreasonable" one? These questions matter hugely in today's digital world, because every time a user creates a new account on an online platform, the user provides personal details, including name, birthdate, geographic location, and personal interests. Users are entitled to know whether they can have a "reasonable expectation of privacy" in such information and whether the platform will respect that expectation.


3.    Exceptions to the Reasonable Expectation of Privacy


There are several recognized exceptions to the reasonable expectation of privacy. Under the third-party doctrine, once an individual voluntarily shares information with a third-party recipient, the individual loses any reasonable expectation of privacy in that information.[8] Under the voluntary consent doctrine, individuals lose a reasonable expectation of privacy when they consent to a search of private information.[9] Other exceptions include the following: unlawful information is not protected by law, so there can be no reasonable expectation of privacy in it,[10] and public disclosure of private information forfeits any reasonable expectation of privacy.[11]


4.    Where Did the Court Draw the Line?


The BIC evidently referenced the Katz test, reasoning that "the privateness of information that one does not intend to disclose depends on a subjective intent; however, such subjective intent shall be reasonably recognized by the society."


The BIC then made the point that information about one's social relationships could invoke a reasonable expectation of privacy only under the following circumstances: the relationship between the data subject and certain connections is too intimate to let others know of, or the disclosure of certain social relationships would negatively affect the data subject's social image.


With respect to the book reading data, the BIC made a similar point: one could have a reasonable expectation of privacy in one's reading data only if certain reading content falls into some private and secret region of information, or if the reading data, when accumulated in sufficient amounts, would reflect negatively on the data subject.


The BIC then commented that the plaintiff's online social relationships, i.e., the listed friends, are identified by open-ID, profile, and nickname, which do not reveal the real social relationship or the degree of intimacy between the plaintiff and her connections. The BIC also went through the contents of the plaintiff's reading data and found that neither of the two books displayed to her connections would cause any damage to the plaintiff's social image. The plaintiff's reading data therefore should not be categorized as private information, and hence there was no reasonable expectation of privacy in the data.


In a nutshell, the BIC defined the "reasonable expectation of privacy" in the digital world based on the content of the information. If a piece of information contains nothing intimate and cannot reflect negatively on the data subject, then the data subject has no "reasonable expectation of privacy" in it. This content-based approach is how the BIC drew the line between private and non-private information.


5.    The Content-Based Approach Is Not Fair


The BIC's views on this issue are deeply disturbing. Return to the definition of privacy: broadly speaking, privacy is the right to be "let alone." When a person walks into an isolated space, the person can expect to be in a state in which he or she is not observed or disturbed by other people,[12] as long as nothing illegal is going on under that roof. Applying the Katz test, this person has a reasonable expectation of privacy because the person demonstrates a subjective expectation of privacy by walking into the isolated space, an expectation society recognizes as reasonable. Furthermore, the person's conduct does not fall within any of the aforesaid exceptions.


In solitude, a decent citizen can expect the same degree of privacy as anyone else. The right to privacy does not depend on whether something shameful is being done inside that isolated space; it does not depend on the activities happening inside. Instead, it depends on whether one's demonstrated intent to be let alone is accepted by society as reasonable. Under the content-based approach, however, a decent citizen would have a lesser expectation of privacy than someone who conducts shameful behavior in solitude, and this approach plainly leads to unfair results.


Here is the digital-world version of the above scenario. When an individual like the plaintiff Huang opens an account on an online platform like WeRead and secures it with a password, this creates an isolated space in which the person can expect digital privacy. Applying the Katz test, this individual has a reasonable expectation of privacy, as he or she demonstrates a subjective expectation of privacy by creating a password-secured account, an expectation society recognizes as reasonable. Likewise, the person's conduct does not fall within any of the aforesaid exceptions.


This person is fully entitled to assert a digital privacy right to be "let alone." One can choose not to have any improper friends and not to read any obscene books, yet still enjoy full privacy rights over one's personal information. Being a decent netizen, in other words, should not compromise one's digital privacy rights. The content of the information stored in a password-secured account, so long as it is nothing unlawful, should not dictate whether and how the person enjoys the right to privacy.


The above scenario shows that the content-based approach taken by the BIC is not fair, because it makes users' digital privacy rights conditional on the content of their personal information, i.e., on whether the information includes any embarrassing content. This approach leads to the unfair conclusion that a decent netizen, having nothing shameful to hide, has no reasonable expectation of digital privacy.




With the storage and processing of exabytes of data, social media users’ concerns about their privacy have been on the rise in recent years. Incidents of illegal use of data and data breaches have alerted many users and caused them to reconsider their interaction with social media and the security of their personal data.

Disputes caused by unauthorized use of personal information over the internet have spiked in the privacy law landscape. The Beijing Internet Court's present decision, which echoes the same court's decision in the "Dou Yin (Chinese version of TikTok) collection of personal information" case,[13] is among the first few decisions by Chinese courts on this controversial issue. Significantly, the decision might affect ongoing litigation stemming from similar disputes, and other courts around the country might follow suit. It is therefore imperative to have a clearer and fairer approach to defining the reasonable expectation of digital privacy.

In the era of big data, the task of defining privacy is under pressure in the digital world. As Bill Gates put it: "whether it's digital cameras or satellites or just what you click on, we need to have more explicit rules — not just for governments but for private companies." [14]



[1] Beijing Internet Court, (2019) Jing 0491 Min Chu Zi No. 16142.

[2]  China Cybersecurity Law, Article 43, provides, “Where an individual finds that any network operator collects or uses his or her personal information in violation of the provisions of any law, administrative regulation or the agreement of both parties, the individual shall be entitled to request the network operator to delete his or her personal information. If the individual finds that his or her personal information collected or stored by the network operator has any error, he or she shall be entitled to request the network operator to make corrections. The network operator shall take measures to delete the information or correct the error.”

[3] Eric Hughes, A Cypherpunk's Manifesto (1993), see

[4] See

[5] Article 1032, China Civil Code (2021).

[6] The Fourth Amendment of the US Constitution, ratified on December 15, 1791, protects the right of people “to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.”

[7]See Katz v. United States, 389 U.S. 347 (1967). Concurring opinion written by Justice Harlan.

[8] See Smith v. Maryland, 442 U.S. 735, 743-44 (1979).

[9] See Katz v. United States, 389 U.S. 347 (1967).

[10] See

[11] Ibid.

[12] See

[13] See Beijing Internet Court, (2019) Jing 0491 Min Chu Zi No. 6694.

[14] See
