
The problems with the FPB's Draft Online Regulation Policy - HSF

Foundation says document inter alia fails to guarantee the right to publication of political views

Submission in response to the Film and Publication Board’s Draft Online Regulation Policy

(Notice 182 of 2015, Gazette No. 38531)

Introduction

The Helen Suzman Foundation’s (“HSF”) mandate is to promote and defend South Africa’s constitutional democracy. The HSF’s interest in the Draft Online Regulation Policy (“Policy”) centres on ensuring that those who hold power are always accountable; that liberty is protected; that freedom of expression is not unconstitutionally curtailed; and that intentional and unintentional consequences of policies are considered. Central to our work is the defence of the Rule of Law.

The HSF welcomes the opportunity to make a submission on the Film and Publication Board’s Draft Online Regulation Policy (“the Policy Document”). The HSF sees this opportunity as a way of fostering critical, yet constructive, dialogue between civil society and government as part of the legislative process.

Summary statement of issues

Section 18(1) of the Films and Publications Act (“Act”) states that any person who intends to distribute any film, game or certain publication in the Republic of South Africa must first register with the Board and submit such film, game or publication to the Board for examination and classification. “Any person” is not defined, and neither are the entities covered nor the circumstances in which they would be subject to the Policy.

The HSF has the following concerns about the Policy Document:

- the use of vague definitions, in particular the complete absence of a definition of “certain publications”;

- the impractical requirements placed on the Film and Publication Board and Film and Publication Review Board[1];

- the limitations on freedom of speech, beyond those recognised by the Constitution and the common law;

- a failure to guarantee the right to publication of political views;

- the potential for impeding news delivery through media other than newspapers;

- a failure to recognise that greater caregiver interaction as well as the use of parental controls could meet the aims of this Policy Document;

- the powers granted to access premises and subsequent indemnity for any losses resulting from such access;

- capacity requirements and imposition of costs;

- the generally unenforceable nature of the policy, and the potential for specific victimisation.

Potential drawbacks of the draft regulations

The HSF notes the following concerns, which are elaborated upon in the body of the submission:

- Inconsistencies exist within certain sections: for example, section 5.1.1 refers to films, games and certain publications, while section 5.1.3 refers only to games and films.

- The Policy Document focuses on pre-classification of content, which means that anyone who wants to publish anything on the internet must apply to the FPB and pay the distributor’s fee. This is not practical: it limits freedom of expression in essence, and will not work for platforms such as Facebook and Twitter, to name but a few.

- The regulations contain many contradictions. Even though the FPB does not intend to regulate individuals’ personal posts, the Policy Document uses the word “persons”, which includes individuals. What is worrying is the statement that user-created content includes any publication (which covers, among other things, a drawing, picture, etc.).[2] This means that any form of media published online will fall into this category. The clause goes on to state that where user-created content is prohibited or illegal, the Board has the power to take it down and refer it to the SAPS for prosecution. Ordinary people might thus be prosecuted for content they publish online. This is over-reaching in our view.

- The limitations on freedom of speech are such that people could no longer publish online without fear of prosecution. Restriction of freedom of speech is unconstitutional, subject to few exceptions, such as hate speech.

- The Policy Document makes no mention of protection of political views.

- The Policy Document treats caregivers exercising parental control as the only protection available to children. This is not true: many different role players exist to protect children, such as the courts, social workers and legislation.

Schedule

Concerns about the rationale[3]:

- The rationale is the protection of children.

- However, this is more readily achieved by encouraging parents or guardians to use the “parental control” feature found on most platforms, together with other existing forms of child protection.

Concerns about the use of the terms “film” and “certain publication”[4]:

- Nowhere in the schedule is there a clear definition of a “certain publication”.

- Furthermore, “film” as defined in the document and in the Films and Publications Act of 1996 (“the Act”) is too broad.

Concerns about the focus on content as opposed to platform[5]:

- Each platform has its own policies and mechanisms in place to protect users. No mention is made of these, nor of the protection they afford.

- A focus on content rather than platform will inevitably create a situation in which the Constitution and the laws protecting free speech come under threat from the FPB.

- The protection of children should not be used as a slippery slope[6] to advance darker agendas, such as compliance with “community standards” or the “government’s agenda on social cohesion”. The fact that both these terms are undefined makes the possibility of abuse worse.

Explanatory Memorandum

1. Background

1.1. All stakeholders “will join hands and share the costs and responsibility for digital content classification and compliance monitoring to ensure that children are protected from exposure to disturbing and harmful content”.[7]

1.2. What is of concern is that all stakeholders will now shoulder the costs of a regulatory institution they had no part in creating.

2. Policy Development Context

2.1. The memorandum states that “…the downside to this is that there is also a proliferation of illegal content in and the abuse of social media platforms which are at times used by sexual predators to lure their child victims and people who advocate racist ideologies and therefore use these platforms to undermine the government's agenda on social cohesion”.[8]

2.2. The concern here is the failure to understand the options already available to curb abuse of social media. The drafters have clearly never lodged a complaint with a social media platform, or they would know that these are generally addressed speedily.

2.3. The attempt to introduce pre-publication censorship is wrong in principle and would be expensive and probably ineffective in practice.

2.4. The community is not as vulnerable as the policy would have us believe.[9] In particular, the principle that adults are their own best judges of their reading, listening and viewing should be respected.

3. Problems with the current framework

3.1. “More specifically, the main problem identified was the piecemeal regulatory responses to changes in technologies, markets and consumer behaviour which have the potential to create uncertainty for both consumers and industry.”[10]

3.2. The HSF believes that changes in technologies, markets and consumer behaviour are inherently unpredictable. Piecemeal change is inevitable. The draft itself provides for annual updates of the policy.

4. Key Features of the Policy Document

4.1. The key features include the following:

4.1.1. Platform-neutral regulation.[11] This raises concerns, as the platforms themselves, as well as their terms and conditions, now become subject to the policy.

4.1.1.1. The Policy Document seeks to “elevate the Act’s ‘platform-neutrality’ to ensure uniform compliance by all content distributors”.[12] This will require that “all content distributors” be subject to these requirements. This is clearly contrary to the media statements that the policy only applies to films and games.[13]

4.1.2. The Policy Document nowhere elucidates the scope of the type of content to be classified.[14]

4.1.2.1. The Policy Document recognises that “it is impractical to expect all media content…to be classified” and yet makes it the responsibility of the platform provider, in consultation with the FPB, to classify all this content.[15]

4.1.2.2. The Policy Document then provides for what can only be interpreted as an obscure exception. It states that “the obligation to classify content will not generally apply to persons uploading online content on a non-commercial basis”.[16] This could then include or exclude any NGO or political party depending on the interpretation of “not generally” and “non-commercial basis”.

4.1.2.3. Of concern is the provision that any “non-compliance” will result in the distributor being instructed to remove the offending content until such time as the FPB approves a classification and the fees for such classification have been paid.[17]

4.1.3. The title “Co-regulation and industry classification”[18] is misleading. The policy merely serves to make industry enforce the directives of the FPB.

4.1.3.1. This view seems rather one-sided when one considers the benefits and costs of this “greater role to the industry”.[19] The prefix “co” denotes an equal distribution of work and benefit. The policy itself shows this to be untrue: the benefit still accrues entirely to the FPB, while the work and the costs are imposed on publishers. Certain provisions go so far as to make publishers solely responsible for the classification process, subject to penalties.

4.1.4. Thus, the implementation and interpretation of “co-regulation” is clearly not ideal.[20] Oversight trumps guidance.

4.1.4.1. The FPB will be responsible for a range of functions that are at once minutely specific and far too broad.[21]

5. Policy Consultation Process and Consultation Participants

5.1. The Policy Document recognises that it will have an effect on “all South Africans”,[22] which is clearly more accurate than its assertions elsewhere that only limited persons and interactions will be subject to it.

5.2. The period for stakeholder consultation specified on the website[23] is too short. A longer period would allow more meaningful consultation and input.

Draft Online Regulation Policy

Definitions

- “Online content”[24] is given a very broad scope that, in the light of platform-neutrality, creates the opportunity to target online media.

- The definition of “self-generated content or user-generated content”[25] creates a situation in which content created “through open collaboration” subjects entire networks to the policy. Furthermore, the phrase “interact to create or produce or service online” could potentially make third parties liable for infringements.

- The lack of definitions fails to give substance to the regulation and its implementation.

1. Introduction

1.1. The Policy Document is silent as to the content of “certain publication”,[26] which is problematic, as this is the term by which liability attaches. It also fails to provide parameters as to the type of content to which it refers.

1.2. The identity of the “Executive authority”[27] that has called for this framework is not clear. If it were, the agenda might become clearer.

1.3. It should be noted that the “innovative regulation” envisioned herein is akin to Tom Sawyer[28] having industry paint the fence. However, cunning wit is replaced with punishment.

1.4. The Policy Document’s intention that “classification focuses on media content”[29] makes the platform and its inherent safeguards subject to the classification of the “Executive authority”.

2. Application of the Draft Policy

2.1. “This Online Regulation Policy applies to every person who distributes or exhibits online any film, game, or certain publication in the Republic of South Africa.”[30]

2.2. The concern lies with the definitions of the individual elements housed in the blanket application.

3. Objectives of the Draft Policy Document

4. Guiding Principles for Online Content Regulatory Policy[31]

4.1. These principles have been cut and pasted from the policy of another country, namely Australia, which faces an entirely different context.

4.2. Principle two[32] is too politically loaded for South Africa.

4.3. Principle three[33], however noble, does not recognise the limits of South African State capacity.

4.4. Principle four[34] would result in the news being censored before release. This is completely unacceptable. Simply exempting newspapers is not nearly enough in an age when social media are important distributors of news.

5. Policy on Online Distribution of Digital Films, Games, and Certain Publications[35]

5.1. The policy seeks to enact a procedure whereby uniform classification of online content will be ensured:

5.1.1. Any person who intends to be responsible for content will need to apply to be a distributor. This means that NGOs, such as ourselves, would need to apply, as they publish online content of public interest.

5.1.2. Any content will then need to be classified, either by someone trained to do so or by the Board. Both avenues are subject to fees. Thus, every item of content posted on the HSF website would need to be pre-approved and paid for.

5.1.3. Item 5.1.4[36] is tyrannical. It allows the Board, where it is “convenient and practical”, to enter the distributor’s premises. Furthermore, distributors must ensure that the “work of the classifiers takes place unhindered and without interference”,[37] and must indemnify the Board against any damage resulting from the classifiers’ actions.[38]

6. Online Distribution of Television Films, Games and Certain Publications

6.1. The section refers to all “digital content in the form of television films and programmes streamed online”.[39] What is not clear, however, is what constitutes such a film or programme.

6.2. Item 6.2 reveals the exact nature of “co-regulation”: “[e]ssentially, the Board must be satisfied that authorised classification systems deliver classification decisions comparable to those that might be made if content were classified by the Board’s classifiers operating under the Act”.[40]

6.3. The policy provides that where content is likely to have “a high profile release”, or has the potential to produce “controversy in another jurisdiction”, the Board has the power to classify that content itself.[41] These provisions are an unacceptable limitation on content.

6.4. Lastly, the specific provision requires that websites display all classification decisions, along with an explanation of how the classification system works and of what content is “deemed”. This is potentially problematic, as it creates an added burden on publishers, who must now host this extra information on their servers. To maintain a page detailing how every classification decision was arrived at, in the detail required, would furthermore be time-consuming and would presumably add to costs.[42]

7. Prohibition Against Child Exploitative Media Content and Classification by the Board of Self-Generated Content

7.1. What should be considered is the person who allows a child access to these platforms. If the bulk of “contact services”[43] is unclassified, parents or guardians should monitor their children.

7.2. Item 7.4 makes it abundantly clear that all content – even on private forums – will be subject to the FPB and the “Executive authority”.

7.3. Item 7.6 notes that the decision of the Classification Committee is final and binding. What about the Courts?

8. Matters the Board must consider[44]

8.1. Item 8.2 clearly contradicts item 5.1: the latter calls for “uniform classification”, while the former says this is not possible.

8.2. Item 7.6 cannot be read as a realisation of items 8.2.iv-v: the former leaves all power in the hands of the Board’s Classification Committee, while the latter require the utmost transparency in the decision-making process as well as the availability and integrity of review mechanisms.

9. Checks and Safeguards

9.1. Allowing for self-classification may result in “certain sectors of South African society” having concerns about the “acceptable balance” between content and communities.[45] Who constitutes these certain sectors, and, more importantly, who determines community needs and concerns?

10. Online Distribution Licensing Fee and Classification Fee Per Title

10.1. All online content distributors need to register with the Board. They will also have to pay the prescribed (but not yet published) online licensing fee.[46]

10.2. The prescribed (but undisclosed) fee will be paid annually and will escalate accordingly.[47]

10.3. Moreover, the Board may charge a classification fee per title.[48]

11. Complaints

11.1. Item 11.1 enables the Board to investigate “valid complaints”. Yet, there is no indication as to how it will be determined that a complaint is even prima facie valid. This is an important concern given the tyrannical powers that the Board would have in inspecting and even destroying a distributor’s property.

11.2. Although the policy makes provision for the content provider or distributor to develop their own complaint-handling mechanisms,[49] the Board still retains all authority to investigate complaints.[50]

11.3. The Board may, furthermore, of its own accord direct a provider or distributor to classify content or to review the original classification decision.[51]

12. Reviews of Classification Decisions[52]

12.1. Item 12.1 defines review proceedings as involving “the making of a new decision on the merits, which replaces the original decision”. From a legal point of view this is known as an “appeal”, since a review considers the reasoning of the decision-maker rather than the merits of the matter. The Board thus creates confusion while attempting to limit recourse even further.

12.2. Item 12.3 further notes that all voluntarily classified content is also subject to review, and thus to additional costs.

13. Audits of Industry Classification Decisions[53]

13.1. The Board is given the power to undertake “post-classification” audits of all media content that must be classified.[54] It is unclear whether this refers particularly to the media, as in the organisations responsible for the creation and/or distribution of content, or to what was previously termed “online” content.

13.2. What becomes clear is that the audit results will be used as evidence of “serious and repeated misconduct”, which will allow the Board to impose sanctions. What is not clear, however, is the procedure leading up to this decision.

14. Sanctions Regime for Industry Classifiers[55]

14.1. The question that needs to be answered is whether South Africans are indeed as vulnerable as the policy’s drafters make us out to be. Item 14.1 notes that sanctions are a last resort against those distributors that repeatedly mislead the consumer with incorrect or grossly inadequate classifications. Should the consumer not simply learn from such repeated abuse? Or should the principle be: fool me once, shame on you; fool me twice, let government protect me?

14.2. The Board will have the power to impose sanctions on the above transgressors for their repeated violations. What constitutes “repeated” will clearly be at the discretion of the Board as it is defined nowhere else.

Possible Solutions

Even though the rationale for the policy is the protection of children, there are other and better ways of protecting children. These include working with the Department of Education to introduce digital literacy tuition into school curricula, and rolling out national digital literacy public awareness campaigns to warn parents and caregivers.

There are also apps which assist parents in monitoring what their children watch and post online.

Francis Antonie

Director

13 July 2015

Footnotes:


[1] As provided for in Chapter 2 of the Films and Publications Act 65 of 1996.

[2] Page 26 (8 of 20).

[3] Page 4 pt 2.

2. The mandate of the Board can be summarised as follows:

2.1 To regulate the creation, production, possession and distribution of films, games and certain publications by way of classification;

2.2 To protect children from exposure to disturbing and harmful material and from premature exposure to adult material; and

2.3 To criminalise child pornography and the use and exposure of children to pornography

[4] Page 4 pt 3.

[5] Page 4 pt 4. Furthermore, page 5 pt 8 culminates in noting that the target is once again the content, regardless of the laws governing the platforms.

[6] Page 5 pt 6 continues the point that this is all to protect children in their interactions.

[7] Page 8 (Page 1 of the Explanatory Memorandum).

[8] Page 9 (Page 2 of the Explanatory Memorandum).

[9] Page 10 (3):

“…there continues to be a community expectation that certain media content, including digital content, be accompanied by classification information based on decisions which reflect the community's moral standards.”

The section concludes with the worst-case scenario of what is not protected by section 16 of the Constitution, and fails to address that all the examples mentioned could be solved by reducing stupidity and not by implementing “big brother”.

[10] Page 12 (5).

[11] Page 12 (5) pt 4.1.

[12] Page 13 (6).

[13] Reid, J., “Africa’s worst new Internet Censorship Law: Everything you don’t want to know – but need to”, Daily Maverick.

[14] Page 13 (6) pt 4.2.

[15] Page 14 (7).

[16] Page 14 (7).

[17] Page 14 (7).

[18] Page 13 (6) pt 4.3.

[19] Page 15 (8).

[20] Page 15 (8).

[21] Page 15 (8).

[22] Page 16 (9) pt 5.

[23] See here accessed on 26 June 2015.

[24] Page 22 (4 of 20).

[25] Page 22 (4 of 20).

[26] Page 24 (6 of 20).

[27] Page 24 (6 of 20).

[28] Mark Twain sets out, in The Adventures of Tom Sawyer, how Tom Sawyer “convinces” the other children to paint the fence, a job which was his responsibility, by presenting it as a privilege rather than a chore.

[29] Page 25 (7 of 20).

[30] Page 26 (8 of 20).

[31] Page 27 (9 of 20). It should be noted that the Guiding Principles on page 8 of the Australian Law Reform Commission’s 2012 report (Classification—Content Regulation and Convergent Media, ALRC Report 118, February 2012) were simply copied and pasted into the South African policy. This plagiarism from the Australian report puts the FPB in a bad light, as it contradicts the claim it has publicly made that the policy was drafted with “South African cultural values” in mind. The FPB says it consulted broadly before drafting the regulations and focused specifically on South African cultural values; it cannot, however, tell us with whom it consulted or whose “cultural values” were considered, which is problematic in itself given the broad spectrum of cultural diversity within our country. This was confirmed by Ms Reid.

[32] Page 27 (9 of 20) “(2) communications and media services available to South Africans should broadly reflect community standards, while recognising a diversity of views, cultures and ideas in the community”.

[33] Page 27 (9 of 20) “(3) children should be protected from material likely to harm or disturb them”.

[34] Page 27 (9 of 20) “(4) consumers should be provided with information about media content in a timely and clear manner, and with a responsive and effective means of addressing their concerns, including through complaints”.

[35] Page 28 (10 of 20).

[36] Page 28 (10 of 20).

[37] Page 28 (10 of 20).

[38] Page 28 (10 of 20) pt 5.1.6.

[39] Page 32 (14 of 20) pt 6.1.

[40] Page 32 (14 of 20) pt 6.2.

[41] Page 32 (14 of 20) pt 6.3.

[42] Page 32 (14 of 20) pt 6.5.

[43] Page 33 (15 of 20).

[44] Page 34 (16 of 20).

[45] Page 25 (17 of 20) pt 9.1.

[46] Page 35 (17 of 20) pt 10.1.

[47] Page 35 (17 of 20) pt 10.2.

[48] Page 35 (17 of 20) pt 10.3.

[49] Page 36 (18 of 20) pt 11.2-3.

[50] Page 36 (18 of 20) pt 11.4.

[51] Page 36 (18 of 20) pt 11.6.

[52] Page 36 (18 of 20).

[53] Page 37 (19 of 20).

[54] Page 37 (19 of 20) pt 13.1.

[55] Page 37 (19 of 20).

Issued by the HSF, 13 July 2015