SUMMARY: Introduction: A look at TikTok through the law. I. The "distinctive features" of TikTok. II. What regulation? Judicial interventions and actions by Authorities against TikTok. III. What awareness and what agency? A sociological perspective: prosumers, digital, encoded and convergent culture, interactivity, personalization. IV. Strategies for a conscious use of the platform: digital educational pacts and digital citizenship. References.
Introduction: A look at TikTok through the law
TikTok is a digital environment accessed by millions of users worldwide, the vast majority of whom are minors and young people1. Originally created to share very short videos featuring choreography, pranks, and karaoke, which remain the most widespread type of instant and entertaining content2, it has recently become a vehicle for messages of a cultural, philosophical, historical, literary and, lato sensu, educational nature3, as well as for political and activist4, health-related5, environmental6, and economic content7.
What still seems to be lacking in this context is an investigation through legal instruments and - an aspect of particular interest here - from the perspective of protecting the rights of minors, children, adolescents and young people in general; in other words, an approach that takes seriously the forms of interaction on this platform and, within this dimension, considers the potential vulnerability of these subjects8.
This contribution primarily aims to thoroughly examine the distinctive features of this platform and its usage, also with a view to understanding the risks arising from the lack of awareness of younger users. From this standpoint, the initial discussion aligns with the perspective of the recently approved AI Act, which, as is well known, revolves around the concept of "risk"9 and the issue of "usability" of technologies.
The analysis - which is part of an effort to promote inclusive and equal digital citizenship - will then critically examine the messages conveyed on social media, particularly focusing on "hate speech" and "sexist" content promoted by certain materials circulating on the web, which can also be widely disseminated on TikTok10. Specifically, the contribution will focus on certain measures - such as those by AgCom (the Communications Regulatory Authority)11 and Agcm (the Antitrust Authority)12 - adopted in Italy under the new "Audiovisual Media Regulation" approved in December 202313. The new rules allow the authorities to restrict the distribution of programs, videos, and commercial communications aimed at the Italian public that could harm the physical, psychological, or moral development of minors, promote racial, sexual, religious, or ethnic hatred, violate human dignity, or fail to ensure sufficient protection for consumers.
This provision, along with others that will be reviewed in the text and adopted in various parts of the world, demonstrates how TikTok today represents a space that, like other digital environments connected to the development of artificial intelligence, cannot escape legal regulation14.
Some have even gone so far as to claim that this platform is "rewriting the World"15 or, at the very least, has profoundly changed social networks16 in the "connective world"17.
Beyond specific studies on certain geopolitical contexts that are increasingly emerging, the global spread of TikTok18 makes it essential not only to understand its functioning19 but also to grasp its role in what has been defined as the "dynamics of hypersocial communication"20.
Most importantly, it is necessary to ensure its conscious use, which allows for the prevention and mitigation of risks while also recognizing its potential, going beyond the restrictive and exclusive dimension of insecurity.
As has been aptly observed, the proliferation of social networks "not only allows for the flow of information and communications, but also gives rise to new forms of sociality, fosters new types of individual and collective identity, and forms particular communities that, while a-territorial, have their own consistency, share a certain sense of belonging, and consolidate their own interactive dynamics"21.
Therefore, understanding the impact of platforms (in this case, TikTok) on these processes, as well as on the changing "mental habits", particularly of the younger generations, is not only a matter of the communication and information system on the Web, and thus of the realm of knowledge (and the powers of participation associated with it), but also of the legal experience and the role of law in relation to certain phenomena occurring on this platform.
I. The "distinctive features" of TikTok
The social network TikTok is a video-sharing platform that was launched in September 2016 and quickly spread in the United States and globally22, with a significant impact on forms of creativity and youth culture23.
It is an application primarily used by very young individuals under the age of eighteen worldwide, except - not by chance - in China, where a local alter ego, Douyin24, is used.
Both applications are Chinese, similar in their algorithms and in the strategic role they play25; indeed, the very young age of their user base is of fundamental importance: it can, in fact, facilitate the collection of data from citizens, as well as from the future ruling class, through an algorithm that determines content and directs consumption, thus implying the possibility of shaping (or even directing) the formation of personality26.
There are some differences between the platform in China and the rest of the world: in the former, the application seems to be more sophisticated, as it also integrates specific functions for Internet marketing27; in the rest of the world, however, it seems to be less developed, at least so far, being more oriented towards forms of "entertainment" and "socialization"28 compared to the Chinese version.
Unlike other social networks (Facebook, Instagram, WhatsApp, or Telegram), TikTok allows users to upload very short videos29 (although the time limits have been extended and the maximum duration can currently reach fifteen minutes) and to accelerate, modify, integrate, and filter their own videos or those uploaded to the platform30.
Moreover, the platform makes extensive use of "Artificial Intelligence" to analyze the interests and preferences expressed by the application's users in order to "personalize" their experiences and the content proposed to them31.
Sociological analyses (which will be discussed more extensively below) highlight the importance for the platform of the presence of young people from the so-called "Generation Z" - those born between 1997 and 2012 - and "Generation Alpha", i.e. those born between 2010 and 2020. These analyses track the "targeting" and "user profiling" practices imposed by the logic of marketing, which is often intertwined with the spread of social networks.
For this reason, numerous controversies surround the app, with the risk of amplifying and consolidating various phenomena such as online hate, violence, and sexism32, or the transmission of misleading messages to very young users; in short, the perception of risk and insecurity that is often associated, especially among adults, with the internet and its multiple devices, systems, and artifacts.
According to a reading of TikTok's policy, the dissemination and publication of content via the platform would be prohibited in the following areas: the sale of animals, tobacco, drugs, and weapons; pornographic material and nudity; gambling; fraud; copyrighted material; chemicals or, more generally, dangerous materials; funeral services; abortion services; terrorism, crime, threats, and violent content; harassment, bullying, and, indeed, incitement to hatred, including through hate speech.
Nevertheless, even indirectly, such content manages to circulate within this platform. It is no coincidence that various states have begun to issue measures against the platform, both through jurisprudence and via authorities. This has thus opened up a more specific issue of regulation.
II. What regulation? Judicial interventions and actions by Authorities against TikTok
As mentioned above, TikTok raises a number of legal issues that have led to decisive measures taken by states against the platform, including judicial actions.
The global popularity achieved by TikTok has required the platform to constantly negotiate with the rules, regulations, and legal frameworks of the regions in which it operates33, but more broadly, it must be evaluated in light of the Convention on the Rights of the Child (1989) and, hopefully, other recently adopted charters (for example, the "Digital Rights Charter and the Human Rights of Children and Adolescents"34).
The platform has been banned (either temporarily or permanently) for its content in several countries, including India, Indonesia, and Pakistan35.
Over time, its popularity among underage users has brought the platform under increasing scrutiny and criticism.
In Italy, for instance, on January 22, 2021, the Data Protection Authority mandated the suspension of the use of user data for which age verification had not been established36. This decision was taken following the death of a ten-year-old girl who was taking part in a challenge known among users as "blackout", in which participants attempted to suffocate themselves with a belt tightened around the neck37. The girl chose its most dangerous variant, in which participants tie a belt around their necks to measure their resistance to suffocation and lack of oxygen, with the expectation of sharing their results after the exercise.
In this case, the legally relevant aspect is the processing of data concerning minors38, as well as their use of social media, given that there is currently no mechanism for effectively verifying the age of users, or at least none that cannot be circumvented. In fact, registration on these platforms rests on a mere trust-based instrument, namely the self-declaration of legal age during the registration process on the platform itself.
These issues call for so-called parental control39. Starting from November 21, 2023, Italy has implemented a parental control system to protect minors online and to block access to websites with inappropriate content. These are specific technical measures; however, in order to ensure effective control, as will be discussed below, it is also necessary to promote an informed use that goes beyond mere technical and informational restrictions.
A second relevant case concerning Italy arose from another dangerous trend that went viral on social media, mainly in France: hence the term "French scar". The challenge consisted in violently squeezing the skin of the cheeks until bruises or "red marks" appeared on the cheekbones. Many psychologists have intervened to highlight the dangers of this trend, not only due to its effects on physical health - in some cases the bruises can last for several weeks, if not months - but especially due to the risks to mental health that such challenges may pose to adolescents.
The trend of this challenge peaked in the early months of winter 2023 and, after a series of incidents, AgCom (the Communications Regulatory Authority) ordered TikTok to remove the videos40: this was the first measure adopted by the authority under the new Regulation on Audiovisual Media approved in December 202341.
Moving to another global context, in India, on April 3, 2019, the Madras High Court explicitly requested the Indian government to ban the TikTok application because, according to the Court42, it encourages serious phenomena related to "sex offenders"43 by displaying inappropriate content to the younger segments of the population. The ban came into effect on April 17, 2019, at the same time as ByteDance (the company that developed TikTok) claimed to have removed more than six million pieces of content potentially harmful to minors. However, the ban was lifted on April 25, 2019, following an appeal from one of the app's developers.
Nevertheless, on 29 June 2020, India's Ministry of Electronics and Information Technology decided to completely remove TikTok from Indian online stores following several armed clashes between Indian and Chinese forces in the Ladakh region, stating that the application - also regarded as a weapon of "cognitive warfare" (a specification of so-called information warfare) - was considered a threat to national integrity and the protection of free Indian institutions.
In addition, the Indian government clarified that this decision - beyond its purely military nature, within the perspective of a "digital risk society" - was adopted to protect users' data and privacy. Concerns about the treatment of minors' data have also been raised by various European Data Protection Authorities.
In this regard, it is worth noting that in 2023 the Irish Data Protection Authority fined TikTok 345 million euros for breaching the rules on the protection of minors' personal data laid down in the GDPR - the General Data Protection Regulation44; a similar position was taken by the French and Dutch authorities, which expressed significant concern over the inadequate management of minors' data privacy45.
Particularly noteworthy is the recent decision of the Munich Court46, which - pursuant to the Digital Markets Act (2020) and the Digital Services Act (2022) - highlighted how the platform violates the obligation to negotiate with rights holders for the utilization of protected content on the platform itself. In particular, the German court - assuming that TikTok plays the role of an OCSSP (online content-sharing service provider) - asserted as uncontested that "large amounts of user-uploaded content are stored on the TikTok platform and then made publicly available to other users. Such content is organized by the defendant, among other things, in user profiles and through 'hashtags'. It is made accessible by the defendant for the purpose of generating profits through advertising revenue. The TikTok platform competes with other online content. Given its global reach of over one billion active users per month, the platform can be considered to play a quantitatively important role in the online content market"47.
Moreover, it should also be noted, in passing, that on September 5, 2023, the European Commission classified ByteDance (the holding company that owns TikTok) as a gatekeeper under Article 3 of the DMA Regulation, precisely because of the service offered by TikTok and its relevance in the market. ByteDance has challenged this decision: it will be necessary to wait for a definitive ruling, as TikTok's precautionary requests have so far been rejected for lack of periculum.
Returning to the Munich decision, the court recognized as proven the presence of the appellant's protected videos on the platform managed by TikTok, and held that this presence constituted an act of communication to the public under Directive 2019/790 and its implementation in Germany. According to the Court, TikTok should be held responsible for this act of communication to the public because the platform had made no effort to conclude a license with the rights holder.
III. What awareness and what agency? A sociological perspective: prosumers, digital, encoded and convergent culture, interactivity, personalization
From a purely sociological perspective, the platform belongs to the category of so-called new media platforms, i.e., those that have emerged from the rapid development of information technology, where participation, interactivity, data sharing, and the production and promotion of informative content take place through networking activities supported by software designed for them48. This has a significant impact on the behavior of minors and the protection of their rights, as well as on the new challenges regarding their education and the shaping of their "mental habits"49.
Since these are behaviors and mental habits subject to very relevant processes of change, it seems appropriate to focus on some key concepts that characterize the TikTok platform and its logics.
A first relevant concept, useful for understanding the logic of TikTok, is that of the prosumer, which combines the figures of the producer and the consumer. In effect, the consumer of new media - in general - is not limited to the passive consumption of content, but exploits the possibility of taking an active part by contributing to its creation: this is particularly evident on TikTok50.
TikTok thrives on its contents, which - unlike those of traditional media - are not created by a single production "department", but by anyone with the will and passion to do so. The result is the combination of consumer and producer: the prosumer51.
A second, by no means secondary, aspect is the "codified digital culture" that can be observed within the dense dynamics of TikTok.
Within the platform it is possible to witness - under various stimuli - the birth and establishment of different trends and linguistic or behavioral fashions, which are reflected in the offline interactions of members of society52 and, specifically, in those of younger individuals53.
The cultural product, or rather the cultural products (given their vastness and variety), offered by TikTok through the algorithm that directs its functioning, can make content go viral, content that is then picked up by the masses54 (rectius, by users55): the aim is to follow the trend, socialization and sharing, as well as the comments these generate.
Closely related to this is a third concept, that of "convergent culture"56, meaning that users have the opportunity to promote their own content: a bottom-up movement thus occurs, rather than the reverse. Such a tendency has a high media potential, as it can be "inspired" by any message or behavior. Moreover, the establishment of convergent culture rests entirely on "imitation", with both positive and negative consequences within interpersonal relationships57.
The final sociological considerations concern a fourth and fifth concept: that of "interactivity" and "personalization".
TikTok's algorithm follows a so-called push logic, suggesting content relevant to users' interests58. A pull logic, by contrast, involves an active and autonomous user searching for the content that suits them. Liking content or not interacting with it, saving it to favorites or not, playing it several times or just once, saving a sound in order to reuse it in one's own content: these are all indicators that help the algorithm choose which content to propose in the future59.
On TikTok, the words "interact" and "personalize" can be seen as synonyms.
"Interacting" with content within the platform means personalizing our own experience within the application, by sending information to the algorithm.
On the other hand, "personalizing content" means interacting with it in order to make it consistent with one's personal taste and communicative intentions.
Closely related to both of them is the concept of "consultation", in which the user indirectly poses a question to the platform through their interactions with the content, and the platform, via the algorithm, is responsible for providing a response consistent with the user's needs and desires. An input is given to the algorithm, which then, after processing and analyzing the information, provides a content-based output in the user's feed within the platform, generating what can be defined as a "process of algorithmization of the self"60.
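The "consultation" process described above, in which interactions serve as input and the personalized feed as output, can be made concrete with a deliberately simplified sketch. The signal names, weights, and tags below are hypothetical illustrations chosen for this example; they do not reproduce TikTok's actual algorithm, which is proprietary.

```python
# Toy sketch of a push-logic recommender: weighted interaction signals
# build an interest profile, which then ranks candidate content.
# All weights and tags are hypothetical assumptions, not TikTok's system.

from collections import defaultdict

# Hypothetical weights for the interaction signals mentioned in the text.
SIGNAL_WEIGHTS = {
    "like": 1.0,
    "save_favorite": 2.0,
    "replay": 1.5,
    "sound_reuse": 3.0,  # reusing a sound is treated as a strong interest signal
}

def update_profile(profile, video_tags, signal):
    """Input step of the 'consultation': each interaction sends
    information to the algorithm, reinforcing the video's tags."""
    weight = SIGNAL_WEIGHTS.get(signal, 0.0)
    for tag in video_tags:
        profile[tag] += weight
    return profile

def rank_feed(profile, candidates):
    """Output step: the algorithm pushes the content whose tags
    best match the profile built from past interactions."""
    def score(video):
        return sum(profile[tag] for tag in video["tags"])
    return sorted(candidates, key=score, reverse=True)

profile = defaultdict(float)
update_profile(profile, ["dance", "music"], "like")
update_profile(profile, ["music"], "sound_reuse")

candidates = [
    {"id": "v1", "tags": ["cooking"]},
    {"id": "v2", "tags": ["music", "dance"]},
]
feed = rank_feed(profile, candidates)
print([v["id"] for v in feed])  # → ['v2', 'v1']: music/dance content is pushed first
```

The sketch shows why, without any explicit request, repeated interactions of a given kind progressively narrow what the feed proposes, which is the mechanism underlying the "algorithmization of the self" discussed above.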
In the field of personalization, two further variables come into play: time and space, on the one hand, and production, on the other. The so-called "peripheral consumer" is free to control the time and place of consumption of the content delivered in response to the request made to the platform. Personalizing one's time and space within TikTok thus means consuming the content offered by the platform when and where one wishes.
This can be both a positive and a negative aspect: on the one hand, the user is autonomous in managing spaces and times; on the other, no one can warn the user (starting with an adult) in case of content abuse, or prevent the viewing of it, with the risk of falling into a vortex of content addiction and negative emulation (as seen with the "blackout" challenge or the "French scar"). In fact, these results point to the need to structure strategies that can promote and foster an aware use of the platform and, more generally, of social media and the tools offered by information technologies, going beyond the "risk (security) paradigm" and concretely promoting agency61, which constitutes the "social foundation of rights" for minors and young people in general62.
IV. Strategies for a conscious use of platforms: digital educational pacts and digital citizenship
There is a risk that individuals belonging to the so-called "Generation Z"63, the generation born in close contact with the emergence of new technologies, may - without critical awareness and through "hyper-mediated"64 behaviors that become a mental habit - emulate negative examples that may jeopardize their own psycho-physical well-being as well as that of others65.
The digitization process, with its numerous opportunities for interconnection and the pervasiveness of new devices, poses new and complex challenges in the educational field - challenges that schools and teachers first of all, together with families and parents, but also the entire community and citizenry, are called upon to address.
What seems to be needed is the definition of shared goals: on the one hand, certain processes focus on the creation of pacts and alliances, of which the so-called "digital educational pacts" are a significant example; on the other, the valorization of positive uses of the platform within the framework of "digital citizenship".
The "digital educational pacts" are valuable tools in the direction of prevention and education, rather than a logic of control, prohibition, and sanction associated with what has been referred to in this context and in the introductory note to the Forum as the "risk paradigm".
What is needed is the definition of shared goals, and in this context, the so-called "digital educational pacts" represent valuable tools66.
The digital educational pact can be understood as a "pact of co-responsibility" for digital education, starting with the family and opening to cooperation with other stakeholders involved in the digital education of minors.
Digital education is not limited to the management of individual family choices: experience shows that the community (schools, local authorities, and the world of associations) can join with individual families to form a collective alliance, outlining the best strategies for the digital education of children.
In fact, when children receive inconsistent messages from the adult world, they are - fundamentally - disoriented. This 'disorientation' can lead to wrong choices in online navigation. Conversely, a strong community alliance (rectius, of groups), which organizes itself to outline the best educational strategies in the digital realm, can provide a clear framework for educational choices.
A digital educational pact contains the fundamental principles of these shared choices, ensuring that families, local authorities, and the world of associations commit to the proper use of new technologies by children and teenagers, so as to guarantee their psycho-physical well-being67.
On these premises, the following programmatic guidelines for a digital educational pact can be outlined:
- Any "technophobic" vision on the part of parents must be overcome, but the transition from analog to digital must be gradual;
- Preparing the child's digital autonomy in advance, starting with shared use of devices, and gradually giving independence to the minor, who can then begin to exercise what we have defined as agency;
- Establishing clear and transparent rules for the use of the devices from the outset;
- Ensuring that families are adequately informed about the minor's use of the device and, in a sense, take on greater responsibility through the pact, which involves other families, institutions, and the world of associations.
Digital educational pacts are undoubtedly an important opportunity, as they allow synergy between families, schools, local authorities, and the world of associations, establishing clear but never oppressive rules for children; this can accompany their introduction to platforms such as TikTok and gradually guide them in their use.
This approach makes it possible to enhance the positive potential of new technologies without being consumed by the frantic race toward digitalization, on the one hand, and without falling into the negative circuits of new technologies - such as viewing content inappropriate for minors or operating in an 'inappropriate' online environment - on the other.
The undoubted advantage of a digital educational pact, in addition to the creation of shared principles and rules, is its flexibility: new families joining the pact, or the original families, will be able to redefine or establish new content as needs arise and evolve over time.
The primary goal is the psycho-physical well-being of children (and the full protection of their rights), so that they can gain awareness and maturity in the use of devices, without rushing the physiological process of growth and maturation when interacting with online content and, at the same time, learn how to practice digital skills.
This type of path seems well suited to the practices of discovery and sharing, at various levels, that platforms in general, and TikTok in particular, can make possible68. This opens the door, as mentioned in the incipit, to a conscious use of such tools fully in line with the perspective of digital citizenship69, which, in addition to risk assessment and management, is manifested in the practices of socialization and interaction enabled by the network and the digital society, thus promoting, for minors and young people in general, an encounter between knowledge (in its various forms), awareness, and agency.
That this happens even in a space ab origine intended for entertainment seems to me a significant achievement.