PART 3
Providers of regulated user-to-user services and regulated search services: duties of care

CHAPTER 2
Providers of user-to-user services: duties of care

Category 1 services

14 Assessment duties: user empowerment

(1) This section sets out the duties about assessments related to adult user empowerment which apply in relation to Category 1 services (in addition to the duties about risk assessments set out in section 9 and, in the case of Category 1 services likely to be accessed by children, section 11).

(2) A duty to carry out a suitable and sufficient assessment for the purposes of section 15(2) at a time set out in, or as provided by, Schedule 3.

(3) A duty to take appropriate steps to keep such an assessment up to date.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient assessment for the purposes of section 15(2) relating to the impacts of that proposed change.

(5) An assessment of a service “for the purposes of section 15(2)” means an assessment of the following matters—
(a) the user base;
(b) the incidence of relevant content on the service;
(c) the likelihood of adult users of the service encountering, by means of the service, each kind of relevant content (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;
(d) the likelihood of adult users with a certain characteristic or who are members of a certain group encountering relevant content which particularly affects them;
(e) the likelihood of functionalities of the service facilitating the presence or dissemination of relevant content, identifying and assessing those functionalities more likely to do so;
(f) the different ways in which the service is used, and the impact of such use on the likelihood of adult users encountering relevant content;
(g) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to strengthen adult users’ control over their interaction with user-generated content, and other systems and processes) may reduce or increase the likelihood of adult users encountering relevant content.

(6) In this section “relevant content” means content to which section 15(2) applies (content to which user empowerment duties set out in that provision apply).

(7) See also—
(a) section 23(9) and (10) (records of assessments), and
(b) Schedule 3 (timing of providers’ assessments).

15 User empowerment duties

(1) This section sets out the duties to empower adult users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which adult users may use or apply if they wish to increase their control over content to which this subsection applies.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to effectively—
(a) reduce the likelihood of the user encountering content to which subsection (2) applies present on the service, or
(b) alert the user to content present on the service that is a particular kind of content to which subsection (2) applies.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) (“control features”) are made available to all adult users and are easy to access.

(5) A duty to operate a service using a system or process which seeks to ensure that all registered adult users are offered the earliest possible opportunity, in relation to each control feature included in the service, to take a step indicating to the provider that—
(a) the user wishes to retain the default setting for the feature (whether that is that the feature is in use or applied, or is not in use or applied), or
(b) the user wishes to change the default setting for the feature.

(6) The duty set out in subsection (5)—
(a) continues to apply in relation to a user and a control feature for so long as the user has not yet taken a step mentioned in that subsection in relation to the feature;
(b) no longer applies in relation to a user once the user has taken such a step in relation to every control feature included in the service.

(7) A duty to include clear and accessible provisions in the terms of service specifying which control features are offered and how users may take advantage of them.

(8) A duty to summarise in the terms of service the findings of the most recent assessment of a service under section 14 (assessments related to the duty set out in subsection (2)).

(9) A duty to include in a service features which adult users may use or apply if they wish to filter out non-verified users.

(10) The features referred to in subsection (9) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to effectively—
(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and
(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.
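Read as implementation logic, subsections (5) and (6) of section 15 describe per-user, per-feature state: the duty to offer a default-setting choice persists for each control feature until the user indicates a choice, and ends only once a choice exists for every feature. The following is a minimal, purely illustrative sketch of that bookkeeping; the class and method names are hypothetical and nothing here is prescribed by the Act.

```python
# Hypothetical sketch of the choice-tracking implied by s. 15(5)-(6).
# All names are illustrative; the Act prescribes outcomes, not design.

class ControlFeaturePrompts:
    def __init__(self, control_features):
        # The control features included in the service under s. 15(2).
        self.control_features = set(control_features)
        # feature -> the user's indicated choice: "retain" or "change"
        # the default setting (s. 15(5)(a) and (b)).
        self.choices = {}

    def record_choice(self, feature, choice):
        # Record the step the user takes in relation to one feature.
        if feature not in self.control_features:
            raise ValueError(f"unknown control feature: {feature}")
        if choice not in ("retain", "change"):
            raise ValueError("choice must be 'retain' or 'change'")
        self.choices[feature] = choice

    def pending_features(self):
        # Features the user must still be prompted about (s. 15(6)(a)).
        return self.control_features - set(self.choices)

    def duty_applies(self):
        # The duty ends only once the user has taken a step for every
        # control feature included in the service (s. 15(6)(b)).
        return bool(self.pending_features())
```

On this reading, a provider would keep prompting (at the earliest possible opportunity) while `duty_applies()` is true, regardless of whether the user ultimately retains or changes each default.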

16 User empowerment duties: interpretation

(1) In determining what is proportionate for the purposes of section 15(2), the following factors, in particular, are relevant—
(a) all the findings of the most recent assessment under section 14, and
(b) the size and capacity of the provider of the service.

(2) Section 15(2) applies to content that—
(a) is regulated user-generated content in relation to the service in question, and
(b) is within subsection (3), (4) or (5).

(3) Content is within this subsection if it encourages, promotes or provides instructions for—
(a) suicide or an act of deliberate self-injury, or
(b) an eating disorder or behaviours associated with an eating disorder.

(4) Content is within this subsection if it is abusive and the abuse targets any of the following characteristics—
(a) race,
(b) religion,
(c) sex,
(d) sexual orientation,
(e) disability, or
(f) gender reassignment.

(5) Content is within this subsection if it incites hatred against people—
(a) of a particular race, religion, sex or sexual orientation,
(b) who have a disability, or
(c) who have the characteristic of gender reassignment.

(6) The duty set out in section 15(5) applies in relation to all registered adult users, not just those who begin to use a service after that duty begins to apply.

(7) In section 15 and this section—
“disability” means any physical or mental impairment;
“injury” includes poisoning;
“non-verified user” means a user who—
(a) is an individual, whether in the United Kingdom or outside it, and
(b) has not verified their identity to the provider of a service;
“race” includes colour, nationality, and ethnic or national origins.

(8) In section 15 and this section—
(a) references to features include references to functionalities and settings, and
(b) references to religion include references to a lack of religion.

(9) For the purposes of section 15 and this section, a person has the characteristic of gender reassignment if the person is proposing to undergo, is undergoing or has undergone a process (or part of a process) for the purpose of reassigning the person’s sex by changing physiological or other attributes of sex, and the reference to gender reassignment in subsection (4) is to be construed accordingly.

(10) See also, in relation to duties set out in section 15, section 22 (duties about freedom of expression and privacy).

17 Duties to protect content of democratic importance

(1) This section sets out the duties to protect content of democratic importance which apply in relation to Category 1 services.

(2) A duty to operate a service using proportionate systems and processes designed to ensure that the importance of the free expression of content of democratic importance is taken into account when making decisions about—
(a) how to treat such content (especially decisions about whether to take it down or restrict users’ access to it), and
(b) whether to take action against a user generating, uploading or sharing such content.

(3) A duty to ensure that the systems and processes mentioned in subsection (2) apply in the same way to a wide diversity of political opinion.

(4) A duty to include provisions in the terms of service specifying the policies and processes that are designed to take account of the principle mentioned in subsection (2), including, in particular, how that principle is applied to decisions mentioned in that subsection.

(5) A duty to ensure that—
(a) the provisions of the terms of service referred to in subsection (4) are clear and accessible, and
(b) those provisions are applied consistently.

(6) In determining what is proportionate for the purposes of subsection (2), the size and capacity of the provider of a service, in particular, is relevant.

(7) For the purposes of this section content is “content of democratic importance”, in relation to a user-to-user service, if—
(a) the content is—
(i) news publisher content in relation to that service, or
(ii) regulated user-generated content in relation to that service; and
(b) the content is or appears to be specifically intended to contribute to democratic political debate in the United Kingdom or a part or area of the United Kingdom.

(8) In this section, the reference to “taking action” against a user is to giving a warning to a user, or suspending or banning a user from using a service, or in any way restricting a user’s ability to use a service.

(9) For the meaning of “news publisher content” and “regulated user-generated content”, see section 55.

18 Duties to protect news publisher content

(1) This section sets out the duties to protect news publisher content which apply in relation to Category 1 services.

(2) Subject to subsections (4), (5) and (8), a duty, in relation to a service, to take the steps set out in subsection (3) before—
(a) taking action in relation to content present on the service that is news publisher content, or
(b) taking action against a user who is a recognised news publisher.

(3) The steps referred to in subsection (2) are—
(a) to give the recognised news publisher in question a notification which—
(i) specifies the action that the provider is considering taking,
(ii) gives reasons for that proposed action by reference to each relevant provision of the terms of service,
(iii) where the proposed action relates to news publisher content that is also journalistic content, explains how the provider took the importance of the free expression of journalistic content into account when deciding on the proposed action, and
(iv) specifies a reasonable period within which the recognised news publisher may make representations,
(b) to consider any representations that are made, and
(c) to notify the recognised news publisher of the decision and the reasons for it (addressing any representations made).

(4) If a provider of a service reasonably considers that the provider would incur criminal or civil liability in relation to news publisher content present on the service if it were not taken down swiftly, the provider may take down that content without having taken the steps set out in subsection (3).

(5) A provider of a service may also take down news publisher content present on the service without having taken the steps set out in subsection (3) if that content amounts to a relevant offence (see section 59 and also subsection (10) of this section).

(6) Subject to subsection (8), if a provider takes action in relation to news publisher content or against a recognised news publisher without having taken the steps set out in subsection (3), a duty to take the steps set out in subsection (7).

(7) The steps referred to in subsection (6) are—
(a) to swiftly notify the recognised news publisher in question of the action taken, giving the provider’s justification for not having first taken the steps set out in subsection (3),
(b) to specify a reasonable period within which the recognised news publisher may request that the action is reversed, and
(c) if a request is made as mentioned in paragraph (b)—
(i) to consider the request and whether the steps set out in subsection (3) should have been taken prior to the action being taken,
(ii) if the provider concludes that those steps should have been taken, to swiftly reverse the action, and
(iii) to notify the recognised news publisher of the decision and the reasons for it (addressing any reasons accompanying the request for reversal of the action).

(8) If a recognised news publisher has been banned from using a service (and the ban is still in force), the provider of the service may take action in relation to news publisher content present on the service which was generated or originally published or broadcast by the recognised news publisher without complying with the duties set out in this section.

(9) For the purposes of this section, a provider is not to be regarded as taking action in relation to news publisher content in the following circumstances—
(a) a provider takes action in relation to content which is not news publisher content, that action affects related news publisher content, the grounds for the action only relate to the content which is not news publisher content, and it is not technically feasible for the action only to relate to the content which is not news publisher content;
(b) a provider takes action against a user, and that action affects news publisher content that has been uploaded to or shared on the service by the user.

(10) Section 192 (providers’ judgements about the status of content) applies in relation to judgements by providers about whether news publisher content amounts to a relevant offence as it applies in relation to judgements about whether content is illegal content.

(11) Any provision of the terms of service has effect subject to this section.

(12) In this section—
(a) references to “news publisher content” are to content that is news publisher content in relation to the service in question;
(b) references to “taking action” against a person are to giving a warning to a person, or suspending or banning a person from using a service, or in any way restricting a person’s ability to use a service.

(13) In this section references to “taking action” in relation to content are to—
(a) taking down content,
(b) restricting users’ access to content, or
(c) adding warning labels to content, except warning labels normally encountered only by child users,
and also include references to taking any other action in relation to content on the grounds that it is content of a kind which is the subject of a relevant term of service (but not otherwise).

(14) A “relevant term of service” means a term of service which indicates to users (in whatever words) that the presence of a particular kind of content, from the time it is generated, uploaded or shared on the service, is not tolerated on the service or is tolerated but liable to result in the provider treating it in a way that makes it less likely that other users will encounter it.

(15) Taking any step set out in subsection (3) or (7) does not count as “taking action” for the purposes of this section.

(16) See—
section 19 for the meaning of “journalistic content”;
section 55 for the meaning of “news publisher content”;
section 56 for the meaning of “recognised news publisher”.
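Viewed as a process, subsections (2) to (8) of section 18 define two routes for a provider: the default notify-first route (subsection (3)), and an act-first route under the exceptions in subsections (4) and (5), which then triggers the after-the-fact steps in subsection (7). A rough, purely illustrative sketch of that branching follows; the function name, predicate parameters and return labels are all hypothetical stand-ins for a provider's own judgements and workflows.

```python
# Hypothetical sketch of the s. 18 decision flow. The boolean inputs
# stand in for the provider's own assessments and are illustrative.

def handle_news_publisher_content(publisher_banned,
                                  risks_liability,
                                  amounts_to_relevant_offence):
    # s. 18(8): where a ban on the publisher is still in force, the
    # provider may act without complying with the section 18 duties.
    if publisher_banned:
        return "act_without_section_18_steps"
    # s. 18(4)-(5): risk of criminal or civil liability, or content
    # amounting to a relevant offence, permits immediate takedown;
    # s. 18(6)-(7) then require notification, a reversal-request
    # window, and reconsideration after the fact.
    if risks_liability or amounts_to_relevant_offence:
        return "take_down_then_notify_with_reversal_window"
    # s. 18(2)-(3): otherwise notify first, give reasons, invite and
    # consider representations, then notify the decision.
    return "notify_consider_representations_then_decide"
```

The point of the sketch is the ordering: absent a subsisting ban or an exception, the subsection (3) steps must come before any action, not after it.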

Commencement Information
S. 18 not in force at Royal Assent, see s. 240(1)

19 Duties to protect journalistic content

(1) This section sets out the duties to protect journalistic content which apply in relation to Category 1 services.

The duties

(2) A duty to operate a service using proportionate systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about—
(a) how to treat such content (especially decisions about whether to take it down or restrict users’ access to it), and
(b) whether to take action against a user generating, uploading or sharing such content.

(3) A duty, in relation to a decision by a provider to take down content or to restrict access to it, to make a dedicated and expedited complaints procedure available to a person who considers the content to be journalistic content and who is—
(a) the user who generated, uploaded or shared the content on the service, or
(b) the creator of the content (see subsections (14) and (15)).

(4) A duty to make a dedicated and expedited complaints procedure available to users of a service in relation to a decision by the provider of the service to take action against a user because of content generated, uploaded or shared by the user which the user considers to be journalistic content.

(5) A duty to ensure that—
(a) if a complaint about a decision mentioned in subsection (3) is upheld, the content is swiftly reinstated on the service;
(b) if a complaint about a decision mentioned in subsection (4) is upheld, the action against the user is swiftly reversed.

(6) Subsections (3) and (4) do not require a provider to make a dedicated and expedited complaints procedure available to a recognised news publisher in relation to a decision if the provider has taken the steps set out in section 18(3) in relation to that decision.

(7) A duty to include provisions in the terms of service specifying—
(a) by what methods content present on the service is to be identified as journalistic content;
(b) how the importance of the free expression of journalistic content is to be taken into account when making decisions mentioned in subsection (2);
(c) the policies and processes for handling complaints in relation to content which is, or is considered to be, journalistic content.

(8) A duty to ensure that—
(a) the provisions of the terms of service referred to in subsection (7) are clear and accessible, and
(b) those provisions are applied consistently.

Interpretation

(9) In determining what is proportionate for the purposes of subsection (2), the size and capacity of the provider of a service, in particular, is relevant.

(10) For the purposes of this Part content is “journalistic content”, in relation to a user-to-user service, if—
(a) the content is—
(i) news publisher content in relation to that service, or
(ii) regulated user-generated content in relation to that service;
(b) the content is generated for the purposes of journalism; and
(c) the content is UK-linked.

(11) For the purposes of this section content is “UK-linked” if—
(a) United Kingdom users of the service form one of the target markets for the content (or the only target market), or
(b) the content is or is likely to be of interest to a significant number of United Kingdom users.

(12) In this section references to “taking action” against a user are to giving a warning to a user, or suspending or banning a user from using a service, or in any way restricting a user’s ability to use a service.

(13) In this section the reference to the “creator” of content is to be read in accordance with subsections (14) and (15).

(14) The creator of news publisher content is the recognised news publisher in question.

(15) The creator of content other than news publisher content is—
(a) an individual who—
(i) created the content, and
(ii) is in the United Kingdom; or
(b) an entity which—
(i) created the content, and
(ii) is incorporated or formed under the law of any part of the United Kingdom.

(16) For the meaning of “news publisher content”, “regulated user-generated content” and “recognised news publisher”, see sections 55 and 56.