
The WeRead Case: Discussion on Reasonable Digital Privacy Expectation

Authored by Yingying Zhu

 

March 2021

Each of us leaves a lasting digital footprint on the internet and expects the businesses we deal with to treat our digital privacy with reasonable care and consideration. Can users have a reasonable expectation of privacy in the friends they make and the books they read online? In its recently released WeRead judgment, the Beijing Internet Court holds that the friends list and reading data in dispute are not eligible for privacy protection, but are nevertheless entitled to protection as personal information.

Background

The judgment arises from a dispute between an individual, Huang, a user of a book reading app named WeRead, and the digital giant Tencent, the operator of China’s most successful social media platform, WeChat, and of its sister app WeRead. The WeRead app aims to build an app-based reading community where people who enjoy reading can read and connect. The plaintiff Huang complained that WeRead extracted her friends list from WeChat and automatically turned those friends who were also WeRead subscribers into her connections. Huang also complained that information about the books she read and how she felt about them was open to all her connections without her permission, although she intended to keep such information private. In its defense, Tencent alleged that users’ friends lists and reading data were obtained with the users’ prior approval and that it therefore should not be held liable for the use of the data.

Decision of Beijing Internet Court[1]

The Beijing Internet Court (hereinafter the “BIC”), the court of first instance, decides that Huang’s friends list and reading data shall not be categorized as private information and hence are not eligible for privacy protection.

To define what constitutes private information, the BIC classifies personal information into the following three layers:

1.     personal information reasonably recognized by society as private, such as one’s sexual orientation, sex life, medical history and undisclosed criminal records;

2.     personal information on which one may hold a defensive expectation or a utilization expectation; and

3.     general information that has no traits of privacy at all.

 

The BIC holds that, because one’s friends list and reading data do not constitute private information as listed in layer 1 of the above classification, Tencent is not liable for invasion of the plaintiff’s privacy.

 

The BIC goes on to reason that one’s friends list and reading data should be classified under layer 2 of the above classification, where the information is considered personal but not private; the emphasis of protection is therefore on giving the data subject the right to decide whether to hide or to use such information.

 

The BIC further holds that in this case the plaintiff did not get the chance to decide how to deal with her personal information, because Tencent failed to give proper and transparent notices to the plaintiff and failed to obtain her affirmative consent before using the information in dispute. The BIC then decides that Tencent should be held liable for violating the plaintiff’s legitimate interests in her personal information. The BIC’s decision is based primarily on Article 43 of the Cybersecurity Law of China.[2]

Discussion

1.    What is Privacy?

According to Eric Hughes, an American mathematician, computer programmer, and cypherpunk, “Privacy is the power to selectively reveal oneself to the world.” [3] Broadly speaking, privacy is the right to be let alone, or freedom from interference or intrusion. Information privacy is the right to have some control over how your personal information is collected and used.[4]

 

The Civil Code of China (2021) defines privacy as peace in a person’s private life and the private space, private activities and private information that a person does not intend for others to know.[5]

 

As the governing law, the Civil Code offers only a vague definition of privacy. As we know, privacy varies greatly from person to person: while one person may be comfortable with sharing his or her diet online, another may be embarrassed to let others know how little (or how much) he or she eats at a meal. Similarly, while one person may be at ease with disclosing many details of his or her personal life to online social connections, another may feel ashamed of posting anything personal on the internet. So exactly what kind of privacy does the Civil Code protect? Some guidance from a concurring opinion in a US Supreme Court decision might shed light on this.

 

2.    Reasonable Expectation of Privacy

To define the right to privacy under the Fourth Amendment,[6] US Supreme Court Justice John Marshall Harlan, in his concurring opinion in Katz,[7] formulated a “reasonable expectation of privacy” test. The test has two prongs:

1)     the person must exhibit an “actual (subjective) expectation of privacy”; and

2)     society recognizes the expectation as “reasonable.”

The Katz “reasonable expectation of privacy” test, while particularly useful for defining privacy, also provokes further questions: What is reasonable? Where should the line be drawn between a “reasonable” expectation and one that is “unreasonable”? These questions matter hugely in today’s digital world, because every time a user creates a new account on an online platform, the user provides personal details, including name, birthdate, geographic location and personal interests. Users are entitled to know whether they can have a “reasonable expectation of privacy” in such information and whether that expectation will be respected by the platform.

 

3.    Exceptions to the Reasonable Expectation of Privacy

 

There are several recognized exceptions to the reasonable expectation of privacy. Under the Third-Party Doctrine, once an individual entrusts information to a third party and voluntarily agrees to share it with a recipient, the individual loses any reasonable expectation of privacy in that information.[8] Under the Voluntary Consent Doctrine, individuals lose a reasonable expectation of privacy when they consent to a search of private information.[9] Other exceptions include the following: unlawful information is not protected by law, and there is therefore no reasonable expectation of privacy in it;[10] and public disclosure of private information forfeits any reasonable expectation of privacy.[11]

 

4.    Where did the Court draw the Line?

 

The BIC obviously referenced the Katz test by reasoning that “the privateness in the information that one does not intend to disclose depends on a subjective intent, however, such subjective intent shall be reasonably recognized by the society.”

 

The BIC then made the point that information about one’s social relationships can invoke a reasonable expectation of privacy only in the following circumstances: the relationship between the data subject and certain connections is too intimate to be made known to others, or the disclosure of a social relationship would negatively affect the data subject’s social image.

 

With respect to the book reading data, the BIC made a similar point: one can have a reasonable expectation of privacy in one’s reading data only if certain reading content falls within a private and secret sphere of information, or if the reading data, once accumulated in sufficient amounts, would reflect negatively on the data subject.

 

The BIC then observed that the plaintiff’s online social relationships, i.e., the listed friends, are identified by open ID, profile picture and nickname, which do not reveal the real social relationship or the degree of intimacy between the plaintiff and her connections. The BIC also reviewed the contents of the plaintiff’s reading data and found that neither of the two books displayed to her connections would cause any damage to the plaintiff’s social image. The plaintiff’s reading data therefore should not be categorized as private information, and hence there is no reasonable expectation of privacy in the data.

 

In a nutshell, the BIC defined the “reasonable expectation of privacy” in the digital world based on the content of the information at issue. If a piece of information contains nothing intimate and nothing that reflects negatively on the data subject, then the data subject has no “reasonable expectation of privacy” in that information. This content-based approach is how the BIC drew the line between private and non-private information.

 

5.    Content-based Approach is not Fair

 

The BIC’s views on this issue are deeply disturbing. Returning to the definition of privacy, broadly speaking, privacy is the right to be “let alone”. It means that when a person walks into an isolated space, the person can expect to be in a state in which he or she is not observed or disturbed by other people,[12] as long as nothing illegal is going on under that roof. Applying the Katz test, this person has a reasonable expectation of privacy: the person demonstrates a subjective expectation of privacy by walking into the isolated space, and society recognizes that expectation as reasonable. Furthermore, the person’s act does not fall within any of the aforesaid exceptions.

 

In solitude, a decent citizen can expect the same degree of privacy as anyone else. The right to privacy does not depend on whether something shameful is being conducted inside that isolated space, nor on what activity happens inside. Instead, it depends on whether one’s demonstration of an intent to be let alone is accepted by society as reasonable. Under the content-based approach, however, a decent citizen would have a lesser expectation of privacy than someone who conducts shameful behaviour in solitude, and this approach plainly leads to unfair results.

 

Here is the digital-world version of the above scenario. When an individual like the plaintiff Huang opens an account on an online platform such as WeRead and secures it with a password, this creates an isolated space in which the person can expect digital privacy. Applying the Katz test, this individual has a reasonable expectation of privacy: he or she demonstrates a subjective expectation of privacy by creating a password-secured account, and society recognizes that expectation as reasonable. Likewise, the person’s act does not fall within any of the aforesaid exceptions.

 

This person is fully entitled to assert a digital privacy right to be “let alone”. One can choose not to have any improper friends and not to read any obscene books, and still enjoy full privacy rights over one’s personal information. In other words, being a decent netizen should not compromise one’s digital privacy rights. The content of the information stored in a password-secured account, provided it is nothing unlawful, should not dictate whether and how the person enjoys the right to privacy.

 

The above scenario shows that the content-based approach taken by the BIC is unfair because it makes users’ digital privacy rights conditional on the content of their personal information, i.e., on whether the information includes embarrassing content. This approach leads to the unfair conclusion that a decent netizen, who has nothing shameful to hide, has no reasonable expectation of digital privacy.

 

Conclusion

 

With the storage and processing of exabytes of data, social media users’ concerns about their privacy have been on the rise in recent years. Incidents of illegal use of data and data breaches have alerted many users and caused them to reconsider their interaction with social media and the security of their personal data.

Disputes caused by the unauthorized use of personal information over the internet have spiked in the privacy law landscape. The Beijing Internet Court’s present decision, which echoes the same court’s decision in the “Dou Yin (Chinese version of TikTok) collection of personal information” case,[13] is among the first few decisions made by Chinese courts on this controversial issue. Significantly, the decision may affect ongoing litigation stemming from similar disputes, and other courts around the country may follow suit. It is therefore imperative to have a clearer and fairer approach to defining the reasonable expectation of digital privacy.

In the era of big data, the definition of privacy is under pressure in the digital world. As Bill Gates put it: “whether it’s digital cameras or satellites or just what you click on, we need to have more explicit rules — not just for governments but for private companies.” [14]

 

 




[1] Beijing Internet Court, (2019) Jing 0491 Min Chu Zi No. 16142.

[2]  China Cybersecurity Law, Article 43, provides, “Where an individual finds that any network operator collects or uses his or her personal information in violation of the provisions of any law, administrative regulation or the agreement of both parties, the individual shall be entitled to request the network operator to delete his or her personal information. If the individual finds that his or her personal information collected or stored by the network operator has any error, he or she shall be entitled to request the network operator to make corrections. The network operator shall take measures to delete the information or correct the error.”

[3] Eric Hughes, A Cypherpunk’s Manifesto (1993), see https://www.activism.net/cypherpunk/manifesto.html.

[4] See https://iapp.org/about/what-is-privacy/.

[5] Article 1032, China Civil Code (2021).

[6] The Fourth Amendment of the US Constitution, ratified on December 15, 1791, protects the right of people “to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.”

[7] See Katz v. United States, 389 U.S. 347 (1967), concurring opinion by Justice Harlan.

[8] See Smith v. Maryland, 442 U.S. 735, 743-44 (1979).

[9] See Katz v. United States, 389 U.S. 347 (1967).

[10] See https://civillaw.com.cn/bo/t/?id=37410.

[11] Ibid.

[12] See https://www.igi-global.com/dictionary/privacy-data-protection-towards-elderly/23405.

[13] See Beijing Internet Court, (2019) Jing 0491 Min Chu Zi No. 6694.

[14] See https://www.oipc.bc.ca/news/quote-of-the-day-bill-gates/.

