Dark Patterns and Addictive Designs

Authors

  • Xin Ye, The University of Hong Kong, China

1 Introduction

In today’s digital landscape, competition for user attention has transformed from a momentary request into a constant demand, creating what scholars term the “attention economy” (Bhargava & Velasquez, 2021). With the widespread adoption of attention-centered business models and persuasive technology, users’ ability to control their attention faces unprecedented challenges. On the one hand, platforms largely determine what users pay attention to, and individual attention is often manipulated in directions that platforms deem profitable. As a result, information cocoons and the spread of disinformation have gradually eroded social structures and political institutions (Colomina et al., 2021). On the other hand, individuals are becoming increasingly dependent on platforms: people struggle to control their attention and spend substantial time online. According to 2024 data, most residents of developing countries spend more than 7 hours online daily; although the figure is somewhat lower in developed countries, users in almost all countries spend more than 5 hours online each day (Statista, 2024). In this context, online addiction has become one of the most serious public health crises worldwide (World Health Organization, 2018).

In the digital age, the rise of manipulative design practices, known as “dark patterns,” has become a growing concern for regulators. Dark patterns exploit cognitive biases and vulnerabilities to influence user behavior and decision-making, threatening personal autonomy, privacy, and well-being (Narayanan et al., 2020). Addictive designs have emerged as a particularly concerning subset of dark patterns because they systematically capture and manipulate user attention to create compulsive engagement with digital platforms. Despite the growing recognition of the harm associated with addictive designs, the legal framework in the European Union (EU) has yet to fully address this issue.

This paper offers a conceptual framework for understanding and regulating addictive designs, drawing on the legal concept of “dark patterns.” It examines the relationship between addictive designs and dark patterns, arguing that the former should be classified as dark patterns due to their manipulative nature, their impact on users’ autonomy, and their harmful consequences. By analyzing the current EU regulatory framework, the paper identifies significant gaps in how attention manipulation is addressed and proposes approaches for more comprehensive regulation of addictive designs within the dark patterns framework.

2 Defining Dark Patterns

The concept of “patterns” in design originates from Christopher Alexander et al.’s (1977) influential work in architecture, in which they documented reusable solutions to recurring design problems. Alexander described patterns as capturing “the invariant property common to all places which succeed in solving the problem” (p. 14). This framework was later adopted in software engineering (Gamma et al., 1995) and user experience design (Tidwell, 2010) as a positive methodology for solving recurring design challenges. In 2010, user experience specialist Harry Brignull introduced the term “dark pattern” as a deliberate inversion of this constructive concept. While creating an online repository for people to document the deceptive designs they encountered in their daily lives, Brignull defined dark patterns as “tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something” (Deceptive Design, n.d.). Whereas traditional design patterns aim to create spaces and experiences that enhance human well-being and agency, dark patterns systematically undermine users’ autonomy and informed decision-making; thus, they constitute an unethical application of pattern thinking for business advantage rather than user benefit.

Traditionally, dark patterns have primarily focused on user interfaces (UIs) in digital environments, especially in early academic work. For example, Arunesh Mathur et al.’s (2019) research examined interface-level deceptions on e-commerce websites and identified manipulative design elements that coerce or deceive users into making unintended purchasing decisions. Their large-scale study revealed numerous instances when the strategic placement of buttons, misleading text, hidden costs, and deceptive visual cues directly manipulated user behavior through the interface. Their work speaks to the initial focus of dark pattern research on identifying specific UI techniques that undermine consumer autonomy in online shopping contexts. Expanding the theoretical framework of manipulative design, Gray et al. (2018) advanced the discourse on dark patterns by reconceptualizing them as strategic design decisions rather than isolated interface manipulations. Their research delineated five motivational categories that demonstrate the deliberate prioritization of shareholder value over user autonomy: nagging, obstruction, sneaking, interface interference, and forced action (Gray et al., 2018). This taxonomic evolution transcends the mere documentation of deceptive interface elements, instead providing analytical insight into the strategic intentions underlying such patterns. By situating dark patterns within the broader context of professional ethics and design responsibility, this framework establishes them as a significant ethical consideration embedded in design practice. With the evolution and growing sophistication of dark pattern typologies, academic research has moved beyond its initial focus on consumer protection in e-commerce to investigating dark patterns in diverse domains, such as privacy invasion and data collection (Narayanan et al., 2020), social interaction manipulation (Mildner & Savino, 2021), and attention-capture mechanisms that create addictive engagement loops (Monge Roffarello & De Russis, 2022).

Extensive research has explored dark patterns that extract money through unnecessary purchases or obtain data through deceptive privacy interfaces. However, Narayanan et al. (2020) identified a third critical goal of dark patterns, which has remained relatively understudied despite its significant implications for digital well-being and addictive technology use: capturing user attention. Recent work by Monge Roffarello and De Russis (2022) began addressing this gap by conceptualizing “attention-capture dark patterns” as designs or system “functionalit[ies] that exploit people’s psychological vulnerabilities to maximize time spent, daily visits, and/or interactions on a digital service against the person’s will” (p. 2). Building on this foundation, they developed a comprehensive typology of 11 attention-capture patterns (e.g., infinite scroll, casino pull-to-refresh, never-ending autoplay) that share five key characteristics: they exploit psychological vulnerabilities, automate user experiences, cause users to lose track of their goals, lead to a lost sense of time and control, and ultimately result in user regret (Monge Roffarello et al., 2023). Gray et al.’s (2024) comprehensive dark pattern ontology provides additional structure for understanding attention capture within the broader ecosystem of manipulative designs. In their hierarchical framework, they classify “attention capture” as a meso-level pattern under the high-level “forced action” category, defining it as a strategy that “subverts the user’s expectation that they have rational control over the time they spend using a system, instead tricking them into spending more time or other resources to continue use for longer than they otherwise would” (Gray et al., 2024, p. 19).

Despite the growing attention dark patterns have received from academics, regulators, and the public, a significant research gap persists regarding their precise definition and classification. Although regulators and policymakers have shown interest in dark patterns scholarship, they have frequently developed new domain-specific terminology for concepts already established in academic literature when creating legal guidance. This terminological divergence between regulatory and academic frameworks further complicates definitional clarity (Gray et al., 2024). In the realm of digital regulations, the concept of “dark pattern” remains ambiguous, with definitions varying across laws and regulations, even within the same jurisdiction. According to the European Commission’s guidance on the Unfair Commercial Practices Directive (UCPD), dark patterns are “a type of malicious nudging, generally incorporated into digital design interfaces.” The guidance explains that dark patterns can be data driven, personalized, or use general tools to exploit users’ heuristics and behavioral biases. In 2022, guidelines adopted by the European Data Protection Board (EDPB) made a new attempt to address dark patterns on social media platforms by identifying six categories widely used online: overloading, skipping, stirring, hindering, fickle, and left in the dark. In these guidelines, “dark patterns” are defined as interfaces and user experiences (UX) that lead users to make unintended, unwilling, and potentially harmful decisions by influencing their behavior and hindering their ability to protect their personal data and make conscious choices about data processing. Similarly focused on interfaces and user experience design, the Digital Services Act (DSA), which entered into force in November 2022, defines dark patterns as “practices” deployed on online interfaces that “materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions.” As the first EU law to ban dark patterns, the DSA emphasizes that these negative consequences stem from immoral persuasion that encourages users to perform actions they do not truly want to perform. In the United States, the Federal Trade Commission (2022) identifies dark patterns as “design practices that trick or manipulate users into making choices they would not otherwise have made, and that may cause harm.” Several states have begun legislating against dark patterns, as exemplified by the California Privacy Rights Act and the Colorado Privacy Act. However, under these laws, dark patterns are limited to UI designs, with the term awaiting further regulatory definition.

Across various regulatory regimes and legislative enactments addressing online manipulation, three common elements of dark patterns emerge despite the lack of definitional harmony. First, the nature of dark patterns is unethical and manipulative. Generally, there are two types of definitions of “manipulation.” The first approach defines it as a type of pressure or force, though not rising to the level of coercion (Noggle, n.d.). This suggests that manipulation can be distinguished from persuasion and coercion by measuring the level of pressure being exerted: moral persuasion exerts no pressure, coercion exerts maximum pressure, and manipulation falls between the two. However, the pressure approach cannot encompass the full scope of manipulation. Most scholars believe that only some forms of manipulation can be considered pressure (Noggle, n.d.), and some suggest that manipulation contains elements of both coercion and deception (Noggle, n.d.). While appealing, this hybrid account fails to explain manipulative behaviors that involve only one of these elements. The second approach regards “manipulation” as a non-rational influence, suggesting that manipulators influence others by diminishing their ability to make rational decisions (Susser et al., 2019b). The core argument of this theory treats manipulation as “hidden influence or trickery”: someone is manipulated when they unconsciously make decisions benefiting the manipulators without realizing that their decision-making process has been influenced (Susser et al., 2019a). In this account, deception is one specific type or form of manipulation (Susser et al., 2019b). One clear strength of this theory is that it establishes a clear conceptual boundary between coercion and manipulation. Thus, in the context of digital regulation, the first common element ascribed to dark patterns is that they are unethical and manipulative in the sense of exerting hidden influence to exploit users’ vulnerabilities; deception is but one way dark patterns may manipulate users.

Second, “dark patterns” are designed to achieve specific aims by affecting users’ behaviors and autonomy in decision-making. Unlike accidental design flaws, dark patterns are intentional design choices that systematically undermine user agency. These intentional designs typically exploit users’ psychological vulnerabilities through various mechanisms. Researchers have identified numerous cognitive biases that may be exploited by dark patterns. For example, Waldman (2020) discusses how anchoring can lead users to disclose more personal information after seeing others’ sharing behaviors. Similarly, biases such as hyperbolic discounting, where individuals prefer small immediate rewards over larger long-term benefits, and overchoice, where an overwhelming number of options can hinder decision-making, are often leveraged by platforms to manipulate user behavior (Waldman, 2020). Mildner et al. (2023) identified several strategies in social networking services that manipulate users’ decision-making, such as interactive hooks (e.g., infinite scrolling), social brokering (e.g., nudging users to connect with others based on similarities), and decision uncertainty (e.g., presenting confusing options that hinder clear decision-making). Although recent scholarship has established a theoretical “relationship model of cognitive biases and dark patterns,” more empirical evidence is needed to validate how specific cognitive biases are systematically exploited in manipulative designs across different contexts (Mildner et al., 2024).
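To make one of these biases concrete, the brief Python sketch below works through hyperbolic discounting using the standard one-parameter model, V = A / (1 + kD); the discount rate k and the dollar amounts are arbitrary assumptions chosen for illustration, not values drawn from the studies cited above. The sketch shows the characteristic preference reversal that engagement-driven designs can exploit: a small immediate reward beats a larger delayed one, even though the same person would prefer the larger reward when both options lie far in the future.

```python
# Illustrative sketch of hyperbolic discounting using the standard
# one-parameter model V = A / (1 + k * D). The discount rate k is an
# arbitrary assumption chosen to make the preference reversal visible.

def hyperbolic_value(amount: float, delay_days: float, k: float = 0.05) -> float:
    """Subjective present value of a reward received after a delay."""
    return amount / (1 + k * delay_days)

# Choice 1: $50 now vs. $100 in 30 days -- the small immediate reward wins.
print(hyperbolic_value(50, 0), ">", hyperbolic_value(100, 30))     # 50.0 > 40.0

# Choice 2: the same pair shifted 300 days into the future -- the preference
# reverses, and the larger delayed reward now wins.
print(hyperbolic_value(50, 300), "<", hyperbolic_value(100, 330))  # 3.1 < 5.7
```

The reversal is the crux: at the moment of choice, “one more scroll now” reliably outcompetes a user’s own longer-term goals, which is precisely the vulnerability that engagement-optimized designs lean on.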

Third, dark patterns lead to negative or harmful outcomes for consumers. As discussed above, these take various forms, from economic harm (such as unwanted subscriptions or hidden fees; Luguri & Strahilevitz, 2021) to privacy violations (Waldman, 2020) and psychological impacts (such as addiction to digital platforms; Monge Roffarello & De Russis, 2022). Though the specific harm varies across contexts, a consistent characteristic of dark patterns is their tendency to produce outcomes that benefit service providers at users’ expense (Narayanan et al., 2020). The academic literature has identified a broad spectrum of potential harms in various domains, yet legal regulations have primarily concentrated on addressing dark patterns in the contexts of privacy infringement and e-commerce manipulation, areas where consumer harm is most readily quantifiable and regulatory frameworks already exist. The regulatory approaches to dark patterns in privacy and e-commerce contexts will be examined in greater detail in Section 4.

In summary, the concept of dark patterns has evolved significantly since its introduction in 2010, expanding from a focus on deceptive user interfaces in e-commerce to encompass a broader range of manipulative design strategies in diverse digital environments. While academic definitions have developed nuanced taxonomies and conceptual frameworks, legal and regulatory approaches have often created domain-specific terminology, leading to definitional inconsistencies across jurisdictions. Nevertheless, three common elements emerge in the legal conceptualization of dark patterns: they employ manipulative techniques that exert hidden influence on users, they intentionally exploit psychological vulnerabilities and cognitive biases to affect users’ behavior and undermine their autonomy, and they produce harmful outcomes that prioritize service providers’ interests over users’ well-being. Although current regulatory frameworks primarily address dark patterns in privacy and e-commerce contexts, attention-capturing mechanisms that promote excessive engagement and potential addiction represent an equally concerning but comparatively understudied category of dark patterns.

3 Addictive Designs as a Type of Dark Pattern

Addictive designs in digital markets usually concern the features and techniques used by platforms to keep users engaged. From a broader perspective, addictive designs lead to different types of addictive behavior, such as excessive spending of time and money and compulsive online engagement (European Commission, 2024). Social media addiction is one of the most serious issues in this regard. Users’ addictive behaviors primarily stem from two aspects of human-computer interaction (HCI): addictive UI designs, such as auto-play and infinite scrolling, and recommendation systems. Even though addictive designs are set to be a key focus of EU digital legislation in 2025, their conceptualization has not been sufficiently discussed. In the following discussion, this paper argues that addictive designs should be classified as a type of dark pattern by examining three key aspects: their manipulative nature, their impact on users’ behavior and autonomy, and their harmful consequences for human well-being.

3.1 Addictive Designs as Manipulative Tools

In light of concerns that digital technologies will change individual behaviors and life choices, online manipulation theory has garnered increasing attention over the past 5 years. According to Susser et al. (2019b), online manipulation occurs when users are targeted and exploited by manipulative practices, which not only diminish their economic interests but also result in autonomy loss. Online manipulative designs usually have unconscious effects on users’ behavior as they take advantage of users’ vulnerability in the decision-making process. Online addiction, for example, is one of the expected outcomes of manipulative designs. Further, online manipulation undermines users’ autonomy by challenging both their competency and their authenticity in making their own decisions. As a result, online manipulation makes it impossible for users to make rational judgments about their choices and severely dampens users’ motivation to evaluate and modify their behaviors.

Online manipulation theory holds that information technology is an ideal medium for exerting manipulative influence (Susser et al., 2019b). First, online users’ vulnerabilities in decision-making are easily exposed by pervasive digital surveillance: all online activities are recorded as data for analysis, and online platforms that hold substantial amounts of personal data therefore understand users’ decision-making to a considerable extent. Second, the interactive nature of information technology strengthens online manipulation and allows it to adapt constantly. Online platforms may not only know users better than users know themselves; they also come to know them better with every passing day. Moreover, online manipulation usually affects users covertly. Once users become accustomed to the way they are offered information services, “the device or interface itself recedes from conscious attention, allowing [them] to focus on the tasks [they] are using it to accomplish” (Susser et al., 2019b, p. 7). This “technological transparency” blinds users to the potentially manipulative influence of these technologies, rendering such practices “invisible” and thus preventing users from taking action to avoid harm.

According to online manipulation theory, addictive designs impair personal autonomy in two ways (Susser et al., 2019b). First, addictive designs can lead online users to act toward ends that they have not chosen. This usually relates to decisions about where to devote attention: users’ attention is directed unconsciously rather than where they intend. The notification system of WeChat exemplifies this mechanism: while users initially intend to briefly respond to messages, the placement of the Discovery Page, with its red-dot notifications exploiting humans’ instinctive response to red as a signal of urgency, guides them toward additional services, such as Moments (to check friends’ posts) and Channels (short-video platforms). This design creates a stress-response cycle, compelling users to engage with these features to alleviate the anxiety it induces. Consequently, attention allocation becomes increasingly involuntary, with users frequently engaging in unplanned interactions. Despite experiencing subsequent regret, individuals often misattribute this loss of control to personal deficiencies in self-regulation instead of recognizing it as a product of intentional design manipulation. Second, addictive designs can push online users to act for reasons not authentically their own. This happens when users think that they are deciding to stay online for their own reasons while their decision-making process is in fact being manipulated. Among the most common addictive designs are recommendation systems based on emotional manipulation. Facebook, for example, has been criticized for keeping its users online by serving infuriating content (Pelley, 2021). In this case, while users believe that they are making autonomous choices about directing their attention and consuming content, these perceived choices often mask the actual factors driving their decisions. More significantly, this undermines users’ capacity to critically reflect on and understand their true motivations for engaging with apps or platforms, preventing them from evaluating whether the excessive time they spend on these programs truly results from self-directed attention.

Three common manipulative elements are widely used in addictive designs (Bhargava & Velasquez, 2021). The first is the use of intermittent variable rewards, which trigger users’ curiosity by making rewards unpredictable in terms of frequency or magnitude. For example, Pinterest’s UI design shows only a small portion of photos at the bottom of the page to entice users to scroll and reveal the full picture (Eyal, 2014). Second, platforms, particularly social networks, create reward systems that exploit psychological tendencies and needs, such as the desire for social recognition. The ubiquitous “like button” on social media exemplifies this approach. Third, UIs are designed to erode natural stopping points. Features like infinite scroll and auto-play deliver content continuously, preventing users from taking breaks or making conscious decisions to stop. The logic behind addictive designs clearly aligns with unethical persuasive practices. Empirical studies have demonstrated that persuasive technology increases users’ screen time and leads to addictive online behaviors. A 2023 study of 183 Chinese university students revealed that 44% of participants believed that smartphones occasionally or frequently negatively affected their studies or professional life (Chen et al., 2023). In interviews, participants estimated that if they could eliminate all persuasive designs from their smartphones, they might reduce their screen time by an average of 37% (ranging from 10% to 65%). The research identified multiple persuasive design features in social networking, gaming, and short video apps that were associated with extended use and habit formation.
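The structural logic of these three elements can be sketched in a few lines of code. The following Python fragment is a hypothetical illustration rather than any platform’s actual implementation: it combines a variable-ratio reward schedule (rewards arrive at unpredictable intervals) with an endless feed that never reaches a natural stopping point; the function names and the reward probability are invented for exposition.

```python
import random

REWARD_PROBABILITY = 0.3  # hypothetical tuning knob for reward frequency

def next_feed_item(item_id: int) -> dict:
    """Variable-ratio schedule: whether an item is 'rewarding' is random,
    so the user cannot predict when the next hit will arrive."""
    return {"id": item_id, "rewarding": random.random() < REWARD_PROBABILITY}

def infinite_feed():
    """An endless generator: the loop has no terminal state, which is the
    code-level analogue of infinite scroll and never-ending autoplay."""
    item_id = 0
    while True:
        yield next_feed_item(item_id)
        item_id += 1

feed = infinite_feed()
for _ in range(10):  # the session length is the user's decision; the feed never ends
    item = next(feed)
    label = "reward delivered" if item["rewarding"] else "filler -- keep scrolling"
    print(f"item {item['id']}: {label}")
```

The point of the sketch is structural: no stopping decision is ever presented to the user; it must be imposed by the user against a loop that is designed to continue indefinitely.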

Online manipulation theory reveals how addictive designs systematically undermine personal autonomy through both unconscious attention capture and manipulation. The manipulative nature of these designs is particularly concerning because users often misattribute their loss of attention control to personal failings instead of recognizing it as the result of intentional design manipulation. Understanding addictive designs as a form of online manipulation highlights the need to address not only their immediate behavioral impacts but also their deeper implications for personal autonomy and attention control in the digital age.

3.2 The Impairment of Personal Autonomy

The function of attention in personal autonomy is intuitively recognizable, although it is rarely treated as a serious topic for discussion. More precisely, attention is known to play an essential role in building consciousness, information acquisition, decision-making, and self-control.

First, attention is essential for independent decision-making. Attention is the foundation of information acquisition, providing individuals with adequate information for making decisions as autonomous agents. Whether attention is regarded as the key to determining which pieces of information reach higher consciousness or as a means of prioritizing different inputs, it plays a crucial role in information access and processing. As Waltz (2023) noted, “Attention makes information accessible or useable” (p. 4). Thus, information freedom rests on the autonomy of attention. The significance of attention in pursuing personal autonomy has increased rapidly in the information age, especially because individuals face daily information overload, which makes it harder to “make good decisions about what to look at, spend time on, believe and share” (Lorenz-Spreen et al., 2020, p. 1104).

Second, attention plays a key role in self-consciousness and identity formation. It shapes our consciousness and enables critical and creative thinking. Self-consciousness cannot exist without attention. Through the constant interplay of attention between the environment and the self, as described in Carver and Scheier’s (2012) theory of self-regulation, the information we process shapes our perceptions and constructions of reality, influencing both our behavior and our understanding of ourselves and our surroundings. What we attend to and how we attend to it shape our thoughts and ultimately make each individual unique. For an autonomous person, self-mastery implies the ability to understand both the world and oneself. This self-determination can only be achieved when attention is under control. Through countless attentional processes, individuals establish unique perceptions of the world based on their self-consciousness. Notably, the development of self-consciousness is dynamic, and shifts in attentional focus can have unexpected impacts on subsequent decision-making and self-cultivation. Hence, it is crucial that individuals maintain control over their attention; without it, they may lose direction.

Personal autonomy requires the capacity for control over one’s life. However, when a person’s attention is constantly distracted, it becomes challenging for them to make independent decisions and live authentically. Persistent online manipulation severely compromises users’ ability to control their attention. Frequent interruptions and constant instant gratification make it difficult for users to focus on tasks without immediate rewards. Consequently, sustained attention, that is, the ability to maintain focus on task goals, becomes impaired. When individuals become habituated to social media and short-form video consumption, they struggle to maintain focus on challenging tasks that require sustained effort. Yet, sustained attention is key to retaining authorship over one’s life and engaging in creative activities. Moreover, the manipulation of attention leads to an increase in “involuntary attention” at the expense of “voluntary attention.” Constantly updated personalized content, in particular, pushes users to rely increasingly on involuntary attention. This manipulation has two significant implications: attention becomes entertainment-based rather than goal-based, and it is directed toward platform-determined content rather than user-chosen priorities. The monopolization of attention undermines users’ authority over their sense of self-worth. Notably, individuals lose sovereignty over their values and choices when these are shaped by external forces rather than by autonomous decision-making. Most troublingly, even when people recognize this manipulation and its significance, they often struggle to resist its addictive pull.

Attention plays a crucial role in the pursuit of personal autonomy, facilitating information freedom, shaping self-consciousness and experiences, enabling independent thought and action, and ultimately allowing individuals to become who they aspire to be. In the information age, particularly in the context of information overload, the importance of attention to shaping ideal personal lives has grown considerably. Therefore, by manipulating attention, addictive designs fundamentally undermine personal autonomy.

3.3 Harmful Effects on Human Well-Being

In addition to impairing personal autonomy, addictive designs pose significant risks to users’ mental and physical health, particularly through the development of addictive behaviors leading to Internet addiction. In current addiction studies, the definition and nature of addiction are hotly debated. The moral or religious model typically views addiction as the result of sin and moral weakness, for which individuals should be held responsible (Cook, 2006). Generally, four main theories address the causes of addiction: (1) choice theory, which argues that individuals become addicted because they perceive the benefits to outweigh the costs; (2) disease theory, which suggests that addictive substances cause persistent pathological changes, resulting in intense cravings and diminished self-control; (3) learning theory, which posits that addiction is a learned behavior developed through positive and negative reinforcement; and (4) neurobiological theory, which emerged in the 1990s and proposes explanations for addiction based on molecular and neurological studies (Bhargava & Velasquez, 2021). These theories lead to different perspectives on the relationship between addiction and autonomy. The choice model maintains that addiction is self-destructive behavior but argues that addicts should still be regarded as autonomous. In contrast, the disease model emphasizes compulsion and loss of control, suggesting that addiction represents a state of non-autonomy or reduced autonomy (Koopmans & Sremac, 2011).

Clinically, addiction falls into two categories: substance addictions (such as drug or alcohol addiction) and behavioral addictions (such as gambling, sex, and gaming addiction). To contextualize Internet addiction within existing nosological frameworks, it is instructive to examine its conceptual alignment with gaming disorder. Gaming disorder is officially defined in the 11th Revision of the International Classification of Diseases (ICD-11) as “a pattern of gaming behavior” characterized by “impaired control over gaming, increasing priority given to gaming over other activities to the extent that gaming takes precedence over other interests and daily activities, and continuation or escalation of gaming despite the occurrence of negative consequences” (World Health Organization, n.d.). Internet addiction follows a similar pattern. According to US research, over 50% of teenagers acknowledge being distracted from important priorities such as homework while using social media, and 72% of teenagers recognize that they are being manipulated into spending more time on their devices, although many admit that they cannot resist or control their behavior (Rideout & Robb, 2018). Recent research from the Harris Poll (2024) reveals an interesting paradox in Gen Z’s relationship with social media: while 60% of Gen Z (aged 18–27 years) spend at least 4 hours daily on these platforms and 82% describe them as “addicting,” nearly half wish that platforms like TikTok (47%), Snapchat (43%), and X (formerly Twitter, 50%) had never been invented. This contrast highlights how digital platforms can be simultaneously compulsively engaging and consciously unwanted, much like other addictive behaviors.

However, there is considerable debate around the concept of “Internet addiction.” The main issue stems from the lack of a common theoretical and diagnostic model, which has hindered the official recognition of Internet addiction in the Diagnostic and Statistical Manual of Mental Disorders (DSM). Critics like Bell (2007) argue that the term itself is conceptually flawed because the Internet is a communication medium rather than a substance or activity, suggesting that being “addicted” to the Internet is as nonsensical as being addicted to language or radio waves. Instead, Cantelmi et al. (2000) prefer the term “Internet-related psychopathology” to describe these clinical conditions, which encompass pathological gambling, cybersex, game dependency, and information overload addiction.

Nevertheless, proponents of treating Internet addiction as a behavioral disorder argue that it can be diagnosed. Young (1998) suggests that Internet addiction disorder shares key characteristics with drug addiction as an impulse control disorder, particularly “the inability to control the use of something” (Musetti et al., p. 1). She identifies seven common criteria: withdrawal, tolerance, preoccupation, increased usage, centralized procurement activities, loss of other interests, and disregard for consequences. Researchers like Shapira et al. (2000) further defined problematic Internet use as including maladaptive preoccupation and irresistible extended use. Tao et al. (2010) later proposed additional diagnostic criteria, such as losing previous interests and using the Internet to escape negative moods.

Research has demonstrated that Internet addiction (including social media addiction) leads to various mental health issues, such as low self-esteem, depression, anxiety, and increased suicide risk. The reduction in sleep, physical exercise, and face-to-face social interaction associated with Internet addiction not only compromises physical health but also impairs cognitive abilities, including accurate reasoning, clear thinking, and sustained concentration (Bhargava & Velasquez, 2021). Moreover, while individuals with addiction often recognize their dissatisfaction with their life progress and diminished capacity for future planning, they typically struggle to modify their addictive behaviors independently.

Although Internet addiction is not yet officially recognized as a behavioral disorder, there is an emerging consensus that excessive Internet use should be treated as a genuine addiction. First, as mentioned above, excessive Internet use shares many characteristics with substance and behavioral addictions. Second, functional neuroimaging studies demonstrate that Internet addiction disorder leads to changes in brain structure and function. The brain areas active in drug and behavioral addictions are similarly active in individuals who develop addictive Internet use patterns and meet the diagnostic criteria for Internet addiction (Bhargava & Velasquez, 2021). Third, the molecular pathways involved in substance addiction (such as the dopaminergic brain system) have also been observed in Internet addiction disorder. This indicates that the neurobiological mechanisms of Internet addiction closely parallel those of other addictive disorders (Hou et al., 2012). Collectively, these findings suggest that Internet addiction deserves the same serious consideration as other types of addiction. Given the health issues associated with Internet addiction disorder, it requires greater attention not only in psychological and neuroscientific research but also in policy development.

Like dark patterns, addictive designs are unethical business practices that function as manipulative tools for controlling users’ attention. These designs take advantage of individuals’ vulnerabilities, such as the instinctual desire to be socially accepted, to push users to increase their screen time. By manipulating users’ attention, these designs undermine their autonomy and their ability to decide where that attention is directed. Furthermore, online users are not sufficiently alert to the manipulative influence of addictive designs. People may spend more time than expected on social media yet barely notice or understand what happens behind UI designs or algorithmic recommendation systems. Most users are unaware of the risks that these designs pose to their behaviors and lifestyles or of their harmful impact on mental and physical health. Thus, because addictive designs are manipulative tools that aim to change users’ behavior by impairing their autonomy, they should be considered a type of dark pattern.

4 The Current Legal Framework of Dark Patterns in the European Union

In 2022, research conducted by the EU’s consumer protection cooperation network showed that nearly 40% of online shopping websites rely on manipulative practices to exploit consumers’ vulnerabilities or trick them (European Commission, 2023). Currently, no single piece of legislation fully covers dark patterns, but several recent and forthcoming regulations have already taken steps to address the issue.

The cornerstone of the legal framework for regulating dark patterns is the General Data Protection Regulation (GDPR). Although it does not explicitly mention dark patterns, it forms part of the legal framework regulating them because the issue of personal data is intrinsic to their use. From the perspective of UI compliance with data protection law, Articles 5 and 25 of the GDPR set the basic rules for fairness of processing and establish the obligation to implement data protection by design, respectively. When data controllers collect personal data to implement dark patterns, they must fulfill the requirements of “fairness, lawfulness and transparency” in Article 5. Given that the principle of fairness serves an umbrella function, any data collected or processed specifically to deploy dark patterns is forbidden because dark patterns are unethical persuasive technologies characterized by manipulation and deception. According to Article 25, data controllers should implement appropriate measures to ensure data subjects’ rights and freedoms by design and by default. Under the key rules for fair design, data subjects should be provided with data processing information and options in an objective and neutral way that avoids any deceptive or manipulative language or design. Therefore, to comply with their obligations under the GDPR, data controllers should take fairness into account in terms of the amount of personal data collected and the extent of their processing. This means that data controllers should be very careful about the personal data they use for interface designs or the processes behind dark patterns.

Another core element of the GDPR is the consent principle, which guarantees users’ free choice to opt into data processing. The basic requirements for valid legal consent are defined in Article 7 and specified further in Recital 32 of the GDPR. Consent must be “freely given, specific, informed and unambiguous”; freely given consent must be voluntary. EU regulators have already noticed the prevalent use of dark patterns to obtain consent. The EDPB guidelines point out that dark patterns may hinder data subjects’ ability to make conscious choices and undermine their autonomy, pushing them to give consent. For example, color choices and content placement on interfaces could be designed to make data subjects feel anxious or guilty if they refuse to share more personal data. In its guidelines on dark patterns in social media platform interfaces, the EDPB clarifies the applicability of the GDPR’s provisions to the design and use of UIs, even though these guidelines mainly fall within the realm of data protection.

The GDPR also breaks new ground in regulating automated decision-making. The basic rules are stated in Article 22 and elaborated in the interpretative guidance of the Article 29 Working Party (WP29). According to Article 22, data subjects have the right not to be subject to automated decision-making when (1) the decision is based solely on automated processing, that is, without any human intervention, and (2) the decision produces legal effects or similarly significant effects on the data subject. One type of automated decision-making mentioned in this provision is profiling, which is defined as any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location or movements.

Considering that individuals understand the techniques involved in automated decision-making to different degrees, WP29 emphasizes the principles of fairness and transparency at all stages of processing, including when a profile is applied to make decisions affecting the individual. Building an accurate profile is foundational to providing powerful personalized content. However, Article 22 has a relatively restricted scope of application because the requirement of “legal or similarly significant effects” is difficult to meet for many algorithmic recommendations. For example, decisions to present targeted advertisements based on profiling would not be considered to have a “similarly significant effect” on consumers in most cases. One typical case that could trigger Article 22 is when platforms offer services or goods at different prices to consumers based on the analysis of their personal data, creating barriers that prevent some consumers from purchasing the products. Thus, decisions made by addictive algorithmic systems to provide personalized content fall outside the provisions of Article 22 unless they can be proven to have legal or similar effects, such as risks to consumers’ physical and mental health.
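The restrictive two-pronged test described above can be summarized schematically. The following Python sketch offers a simplified reading of Article 22 for exposition only; the field names are invented, and actual legal analysis is considerably more nuanced. It illustrates why a typical addictive content recommendation, although fully automated, usually escapes the provision, whereas personalized pricing that blocks purchases may trigger it.

```python
from dataclasses import dataclass

# Schematic reading of the GDPR Article 22 test discussed above.
# Field names are invented for illustration; this is not legal advice.

@dataclass
class AutomatedDecision:
    human_intervention: bool             # was a human meaningfully involved?
    legal_effect: bool                   # e.g., refusal of a contract
    similarly_significant_effect: bool   # e.g., proven health or price harms

def article_22_applies(d: AutomatedDecision) -> bool:
    """Both cumulative conditions must hold: (1) the decision is based
    solely on automated processing, and (2) it produces legal or
    similarly significant effects on the data subject."""
    return (not d.human_intervention) and (
        d.legal_effect or d.similarly_significant_effect
    )

# A typical personalized-content recommendation: fully automated, but its
# effects are hard to prove "legal or similarly significant."
recommendation = AutomatedDecision(False, False, False)
print(article_22_applies(recommendation))  # False -- outside Article 22

# Personalized pricing that effectively bars some consumers from purchasing:
pricing = AutomatedDecision(False, False, True)
print(article_22_applies(pricing))         # True -- Article 22 is triggered
```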

Aside from applying the principle-based provisions of the GDPR to data protection, the EU seems more inclined to address dark patterns from the perspective of consumer protection law, especially the Unfair Commercial Practices Directive (UCPD). The 2021 guidance published by the European Commission confirmed that the UCPD covers the issue of dark patterns in Section 4.2.7, which explains how the relevant provisions of the UCPD apply to all data-driven business-to-consumer commercial practices, including personalized services such as targeted advertising and recommender systems, and how dark patterns may contribute to unfair commercial practices. Notably, the guidelines emphasize that the application of the UCPD should cover the practices that “capture the consumer’s attention, which results in transactional decisions such as continuing to use the services (e.g., scrolling through a feed), to view advertising content or to click on a link.” Thus, any practices that function as dark patterns to distort consumers’ economic behavior or hinder their free transactional decisions, such as visually obscuring important information or using trick questions, breach the trader’s professional diligence requirements and constitute a misleading or aggressive practice. However, although the UCPD covers the practice of “capturing the consumer’s attention,” it merely focuses on the economic angle instead of attention protection. Consequently, the UCPD could be an effective tool for protecting consumers from being induced into unexpected overconsumption or unfair data-sharing, but it is not a solution to other attentional issues. This points to the limitations of attempting to protect personal attention within the legal framework of consumer protection law.

While the UCPD provides a framework for addressing dark patterns from a consumer protection perspective, it has limitations in its scope and focus. As a promising tool for creating a fairer digital economy while fostering innovation and ensuring competitiveness, the Digital Services Act (DSA) provides updated rules for the governance of online technology in Europe. The concept of dark patterns is clarified in Recital 67, which highlights that dark patterns used in online interfaces hinder recipients’ autonomy and their ability to make free and informed decisions. The relevant provisions laid out in Article 25 prohibit deceptive and manipulative designs in online interfaces, although they do not feature the expression “dark patterns.”

Beyond clarifying the obligations of online platform providers concerning UIs, the DSA also includes new rules for regulating recommender systems, defined in Article 3(s) as “fully or partially automated systems used by an online platform to suggest in its online interface specific information to recipients of the service or prioritize that information.” Unlike the GDPR, which establishes general principles for automated decisions, the DSA puts the spotlight specifically on algorithmic recommendation systems and explains the reasons for regulating these services. Recital 70 cites the risks of disinformation, algorithmic amplification, and online behavior change that arise when algorithmic systems are used to enhance interactions between service providers and recipients and to improve the user experience. The DSA regulates algorithmic recommendation systems by strengthening transparency requirements grounded in the informed consent principle. First, providers of online platforms should state the reasons for using parameters in their recommender systems in plain and intelligible language. Second, providers should also allow recipients to select or modify their preferred options, for example, to determine which parameter matters most in deciding the output.
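As a concrete illustration of these two transparency obligations, consider the hypothetical Python sketch below. The parameter names, plain-language explanations, and weights are all invented; the DSA prescribes outcomes, not implementations. The sketch shows a recommender whose main ranking parameters are disclosed in intelligible terms and can be reweighted by the recipient, for example to prefer recency over engagement.

```python
from typing import Dict

# Disclosed ranking parameters with plain-language explanations
# (all names and wording are invented for illustration).
PARAMETER_EXPLANATIONS: Dict[str, str] = {
    "recency":    "Newer posts are shown first.",
    "engagement": "Posts that other users interacted with rank higher.",
    "similarity": "Posts resembling what you viewed before rank higher.",
}

# Platform defaults; the recipient of the service may override them.
DEFAULT_WEIGHTS: Dict[str, float] = {"recency": 0.2, "engagement": 0.5, "similarity": 0.3}

def rank_score(item: Dict[str, float], weights: Dict[str, float]) -> float:
    """Weighted sum over the disclosed parameters -- nothing is hidden."""
    return sum(weights[name] * item[name] for name in weights)

def apply_user_preferences(overrides: Dict[str, float]) -> Dict[str, float]:
    """Let the recipient decide which parameter matters most to the output."""
    weights = dict(DEFAULT_WEIGHTS)
    weights.update(overrides)
    return weights

item = {"recency": 0.9, "engagement": 0.4, "similarity": 0.7}
for name, reason in PARAMETER_EXPLANATIONS.items():
    print(f"{name}: {reason}")

# A recipient who prefers a chronological feed to engagement-driven ranking:
weights = apply_user_preferences({"recency": 0.8, "engagement": 0.1, "similarity": 0.1})
print("score:", round(rank_score(item, weights), 3))
```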

The DSA pays particular attention to regulating very large online platforms (VLOPs) and very large online search engines (VLOSEs). In terms of recommender systems, VLOPs and VLOSEs must conduct risk assessments for any systemic risks stemming from the design or functioning of their services. Article 34 identifies four categories of risks: (1) the dissemination of illegal content, (2) negative effects on fundamental rights, including privacy, free speech, pluralism of the media, consumer protection, and non-discrimination, (3) negative effects on civic discourse, electoral processes, and public security, and (4) negative consequences for public or personal physical and mental well-being. The requirement to assess risks is a comprehensive effort to regulate the social problems arising from the development of online platforms in the attention economy. Among these provisions, the fourth category of risk is closely related to the issues of attention protection and online addiction. Recital 83 further explains how the risk to physical and mental well-being should be understood and what this category should include. First, the assessment of systemic risks should cover all stages of platform establishment, including the design process, functioning, and use. Second, the recital indicates that risks could come from manipulation and cause actual or foreseeable negative effects. Although dark patterns are not referenced vis-à-vis recommender systems in Recital 83, the recital essentially covers concerns about the use of dark patterns in algorithmic recommendations. According to Recital 67 and Article 25, any use of dark patterns in online interfaces is prohibited; when it comes to recommender systems, however, the only requirement is to conduct risk assessments to prevent the risks of dark patterns. This points to a more cautious attitude toward algorithmic systems than toward online interfaces in the DSA. Third, Recital 83 also addresses the negative effects of online interfaces that stimulate behavioral addictions. This is the first time the EU legal framework has included the issue of online addiction caused by online interfaces. Even though the recital mainly focuses on UIs rather than the full process of HCI, this is an important step toward integrating attention protection in the digital world. At the very least, VLOPs and VLOSEs should take it seriously and provide safeguards for their users, making their services less addictive and avoiding risks to public and individual physical and mental well-being. In addition to conducting risk assessments, VLOPs and VLOSEs must offer at least one version of their recommender system that is not based on profiling. This enables users to control how platforms and search engines present information, what they experience online, and how their personal data are handled.

As the foremost regulation of online platforms within the existing EU legal framework, the DSA is ambitious in its aim of establishing fairer platforms for providing digital services. It covers not only UIs and recommender systems but also the advertising-driven business models behind the attention economy. On the one hand, this business model, built on manipulative technologies, could threaten “public health, public security, civil discourse, political participation and equality.” On the other hand, it has negative effects on individuals’ lives and personal autonomy. Regarding attention protection, the DSA enables online users to take some control back from platforms by deciding “what should be paid attention to.” From interface design and algorithmic recommendations to concerns about disinformation and illegal content, the DSA is so far the most comprehensive regulation in this area and provides a new toolkit for addressing issues related to the attention economy. Thus, the DSA has the potential to lead to fairer digital platforms and to protect individuals’ personal rights in the attention economy, although plenty of work remains for EU regulators in implementing this promising regulation.

In short, the EU’s approach to regulating dark patterns reflects a complex interplay of legal instruments. The GDPR’s fairness and consent principles provide important foundations for regulating dark patterns, particularly in the realm of data protection. The fairness principle serves an “umbrella function” by fundamentally prohibiting data collection and processing for dark patterns. Complementing this, the consent principle requires that consent must be “freely given, specific, informed and unambiguous.” These principles work to prevent manipulation in data collection, ensure genuine user choice, and protect user autonomy in data-sharing decisions. However, their effectiveness is primarily limited to the data protection domain, leaving broader issues of attention manipulation and addictive designs potentially unaddressed. The UCPD, though it addresses dark patterns from a consumer protection perspective, primarily focuses on economic behavior rather than broader attention protection. The DSA marks the most significant advance as it explicitly defines dark patterns as practices that impair users’ autonomous decision-making and requires VLOPs to assess the risks posed to users’ mental well-being by behavioral addictions. However, the framework still shows crucial limitations: it takes a more cautious approach to algorithmic systems than to interface designs, focuses primarily on VLOPs/VLOSEs for rigorous oversight, and lacks specific provisions addressing the manipulative nature of attention capture in recommender systems.

5 Suggestions for Regulating Addictive Designs Within the Framework of Dark Patterns

The discussion of the EU’s current regulations related to dark patterns shows that while the EU has created a framework for addressing dark patterns, significant gaps remain in protecting users from attention manipulation and addictive designs. This article makes three suggestions regarding how to effectively address the issue of addictive designs based on the concept of dark patterns.

First, the definition and scope of dark patterns should be clarified. The lack of consensus around the concept of “dark patterns” has prevented the establishment of a comprehensive legal framework to address online manipulation. As discussed above, the scope of “dark patterns” varies widely across regulations: it mostly covers interface designs but remains ambiguous with regard to algorithmic systems and other mechanisms behind platform design and operation. In fact, the academic definition of dark patterns has already expanded from interface designs to other areas of HCI, such as applications in robotics (Lacey & Caudwell, 2019), yet the EU’s legal framework still focuses primarily on UI design. Thus, to construct a legal instrument that protects users’ attention from the harmful impact of addictive designs within the framework of “dark patterns,” the scope of the term should be broad enough to include the whole of HCI, especially UI design and recommendation algorithms. Dark patterns are unethical by nature, involving manipulation, deception, or the exploitation of psychological vulnerabilities; accordingly, they should include designs that systematically exploit attention mechanisms and cognitive biases to create addictive engagement. Dark patterns lead to negative or harmful outcomes for users, including not only economic or privacy harms but also the erosion of attention capacity, mental well-being, and the ability to make autonomous choices about digital engagement.

Second, the value of attention should be emphasized in digital regulations. Dark patterns are traditionally understood through the lens of deceptive interfaces and impairment of users’ autonomy in the decision-making process for transactions, but attention capture represents another form of autonomy violation. Attention plays a crucial role in shaping personal autonomy because it serves as the cognitive foundation for rational deliberation and sustained engagement with meaningful choices. When digital platforms employ addictive designs that systematically capture and manipulate users’ attention, they not only affect immediate choices but also impair users’ long-term ability to exercise self-control to lead ideal lives. Therefore, future digital regulations should recognize attention rights as a distinct category of protection, acknowledging that the right to control one’s attention is fundamental to preserving personal autonomy in the digital age.

Third, to address the issue of addictive designs within the conceptual framework of dark patterns, it is imperative to amend consumer protection laws. The EU’s existing consumer protection legislation was primarily designed to regulate traditional markets and fails to adequately consider the impact and risks that digital markets pose to consumer autonomy (Davida, 2024). The existing legislation regarding dark patterns mostly engages with privacy protection and e-consumer protection to prevent online users from being driven to share personal data or engage in excessive consumption. Therefore, the EU’s legal framework for regulating dark patterns must be reformed to effectively address attention and health-related harms. Specifically, the UCPD’s interpretation of unfair commercial practices should be expanded to acknowledge attention-capturing dark patterns, recognizing that manipulative designs that systematically exploit users’ attention constitute unfair practices that transcend conventional economic harm. Moreover, the DSA should explicitly include provisions addressing designs that create or reinforce addictive behaviors in its regulations on dark patterns. Risk assessment requirements should be extended beyond VLOPs to encompass all platforms that employ such patterns.

6 Contributions and Limitations

This paper contributes to the discourse on dark patterns in digital regulatory frameworks in several ways. First, it establishes a conceptual connection between addictive design practices and dark patterns at the legal and regulatory level. The analysis articulates the relationship between attention mechanisms and personal autonomy, positioning attention protection as an essential component of safeguarding individual autonomy in digital environments. By examining current EU regulations, the paper identifies critical gaps regarding attention-capture dark patterns, demonstrating that the existing legislation inadequately addresses these manipulative design practices despite their impact on users’ welfare. The paper also explores potential regulatory approaches for governing addictive designs within the dark patterns framework. However, this work has limitations. Although it establishes a theoretical basis for incorporating addictive designs into dark pattern regulatory frameworks, it does not present specific regulatory recommendations for different categories of addictive design practices. Future research should develop a classification system for the various categories of addictive designs and examine targeted regulatory mechanisms appropriate to each category’s characteristics and associated harms.

7 Conclusion

The rise of addictive designs as a form of dark pattern poses a significant threat to user autonomy, well-being, and the integrity of the digital ecosystem. Although the EU has taken important steps to address dark patterns through various legal instruments, such as the GDPR, UCPD, and DSA, the current framework still leaves significant gaps in protecting users from the harms of addictive designs. To regulate these practices effectively and safeguard users’ rights, it is crucial to clarify the definition and scope of dark patterns, recognize the fundamental role of attention in shaping personal autonomy, and amend consumer protection laws to address online manipulation in digital markets. By adopting these measures, the EU can establish a robust legal framework that effectively curbs addictive designs and protects users’ attention and well-being in the digital age.

References

Alexander, C., Ishikawa, S., & Silverstein, M. (1977). A pattern language: Towns, buildings, construction. Oxford University Press.

Bell, V. (2007). Online information, extreme communities and internet therapy: Is the internet good for our mental health? Journal of Mental Health, 16(4), 445-451.

Bhargava, V. R., & Velasquez, M. (2021). Ethics of the attention economy: The problem of social media addiction. Business Ethics Quarterly, 31(3), 321-342. https://doi.org/10.1017/beq.2020.32

Cantelmi, T., Del Miglio, C., Talli, M., & D’Andrea, A. (2000). Internet related psychopathology: Primi dati sperimentali, aspetti clinici e note critiche [Internet-related psychopathology: First experimental data, clinical aspects, and critical notes]. Giornale Italiano di Psicopatologia, 6, 40-51.

Carver, C. S., & Scheier, M. F. (2012). Attention and self-regulation: A control-theory approach to human behavior. Springer Science & Business Media.

Chen, X., Hedman, A., Distler, V., & Koenig, V. (2023). Do persuasive designs make smartphones more addictive? A mixed-methods study on Chinese university students. Computers in Human Behavior Reports, 10, Article 100299. https://doi.org/10.1016/j.chbr.2023.100299

Colomina, C., Sanchez Margalef, H., & Young, R. (2021). The impact of disinformation on democratic processes and human rights in the world. European Parliament. https://www.europarl.europa.eu/thinktank/en/document.html

Cook, C. C. (2006). Alcohol, addiction and Christian ethics (Vol. 27). Cambridge University Press.

Davida, Z. (2024). Consumer decision-making autonomy in the digital environment: Towards a new understanding of national courts’ obligation to assess ex officio violations of fair commercial practices. European Journal of Risk Regulation, 15, 1034-1051. https://doi.org/10.1017/err.2024.11

Deceptive Design. (n.d.). https://www.deceptive.design/

European Commission. (2023, March 8). Consumer protection: Manipulative online practices found on 148 out of 399 online shops screened [Press release]. https://ec.europa.eu/commission/presscorner/detail/en/ip_23_418

European Commission. (2024, October 3). Fitness check of EU consumer law on digital fairness (Commission Staff Working Document SWD(2024) 230 final).

Eyal, N. (2014). Hooked: How to build habit-forming products. Penguin UK.

Federal Trade Commission. (2022, September 14). Bringing dark patterns to light. https://www.ftc.gov/system/files/ftc_gov/pdf/P214800%20Dark%20Patterns%20Report%209.14.2022%20-%20FINAL.pdf

Gamma, E., Helm, R., Johnson, R., & Vlissides, J. (1995). Design patterns: Elements of reusable object-oriented software. Pearson Deutschland.

Gray, C. M., Kou, Y., Battles, B., Hoggatt, J., & Toombs, A. L. (2018). The dark (patterns) side of UX design. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18). Association for Computing Machinery. https://doi.org/10.1145/3173574.3174108

Gray, C. M., Santos, C. T., Bielova, N., & Mildner, T. (2024). An ontology of dark patterns knowledge: Foundations, definitions, and a pathway for shared knowledge-building. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems (CHI ’24). Association for Computing Machinery. https://doi.org/10.1145/3613904.3642436

Harris Poll. (2024, September 10). What Gen Z thinks about its social media and smartphone usage. https://theharrispoll.com/briefs/gen-z-social-media-smart-phones/

Hou, H., Jia, S., Hu, S., Fan, R., Sun, W., Sun, T., & Zhang, H. (2012). Reduced striatal dopamine transporters in people with internet addiction disorder. BioMed Research International, 2012, Article 854524. https://doi.org/10.1155/2012/854524

Koopmans, F., & Sremac, S. (2011). Addiction and autonomy: Are addicts autonomous? Nova prisutnost: časopis za intelektualna i duhovna pitanja, 9, 171-185.

Lacey, C., & Caudwell, C. (2019). Cuteness as a ‘dark pattern’ in home robots. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI ’19). IEEE Press. https://dl.acm.org/doi/abs/10.5555/3378680.3378736

Lorenz-Spreen, P., Lewandowsky, S., Sunstein, C. R., & Hertwig, R. (2020). How behavioural sciences can promote truth, autonomy, and democratic discourse online. Nature Human Behaviour, 4(11), 1102-1112. https://doi.org/10.1038/s41562-020-0889-7

Luguri, J., & Strahilevitz, L. J. (2021). Shining a light on dark patterns. Journal of Legal Analysis, 13(1), 43-70. https://doi.org/10.2139/ssrn.3431205

Mathur, A., Acar, G., Friedman, M. J., Lucherini, E., Mayer, J., Chetty, M., & Narayanan, A. (2019). Dark patterns at scale: Findings from a crawl of 11K shopping websites. Proceedings of the ACM on Human-Computer Interaction, 3(1), 1-23. https://doi.org/10.1145/3359183

Mildner, T., Inkoom, A., Malaka, R., & Niess, J. (2024). Hell is paved with good intentions: The intricate relationship between cognitive biases and dark patterns. arXiv preprint arXiv:2405.07378. https://doi.org/10.48550/arXiv.2405.07378

Mildner, T., & Savino, G. L. (2021). Ethical user interfaces: Exploring the effects of dark patterns on Facebook. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, 1-7. https://doi.org/10.48550/arXiv.2104.03010

Mildner, T., Savino, G.-L., Doyle, P. K., Cowan, B. K., & Malaka, R. (2023). About engaging and governing strategies: A thematic analysis of dark patterns in social networking services. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. ACM. https://doi.org/10.1145/3544548.3580695

Monge Roffarello, A., & De Russis, L. (2022). Towards understanding the dark patterns that steal our attention. In Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems (CHI EA ’22). Association for Computing Machinery. https://doi.org/10.1145/3491101.3519829

Monge Roffarello, A., Lukoff, K., & De Russis, L. (2023). Defining and identifying attention capture deceptive designs in digital interfaces. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23). Association for Computing Machinery. https://doi.org/10.1145/3544548.3580729

Musetti, A., et al. (2016). Challenges in internet addiction disorder: Is a diagnosis feasible or not? Frontiers in Psychology, 7, 842.

Narayanan, A., Mathur, A., Chetty, M., & Kshirsagar, M. (2020). Dark patterns: Past, present, and future: The evolution of tricky user interfaces. Queue, 18(4), 67-76. https://doi.org/10.1145/3400899.3400901

Noggle, R. (n.d.). The ethics of manipulation. Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/entries/ethics-manipulation/

Pelley, S. (2021, October 4). Facebook Whistleblower Frances Haugen details company’s misleading efforts on 60 Minutes. CBS News. https://www.cbsnews.com/news/facebook-whistleblower-frances-haugen-misinformation-public-60-minutes-2021-10-03/

Rideout, V., & Robb, M. B. (2018). Social media, social life: Teens reveal their experiences. Common Sense Media.

Shapira, N. A., et al. (2000). Psychiatric features of individuals with problematic internet use. Journal of Affective Disorders, 57(3), 267-272. https://doi.org/10.1016/S0165-0327(99)00107-X

Statista. (2024, March 8). Time spent online worldwide by region 2024. https://www.statista.com/statistics/1258232/daily-time-spent-online-worldwide/#statisticContainer

Susser, D., Roessler, B., & Nissenbaum, H. (2019a). Online manipulation: Hidden influences in a digital world. Georgetown Law Technology Review, 4, 1-25. https://doi.org/10.2139/ssrn.3306006

Susser, D., Roessler, B., & Nissenbaum, H. (2019b). Technology, autonomy, and manipulation. Internet Policy Review, 8(1), 1-25. https://doi.org/10.14763/2019.2.1410

Tao, R., Huang, X., Wang, J., Zhang, H., Zhang, Y., & Li, M. (2010). Proposed diagnostic criteria for internet addiction. Addiction, 105(3), 556–564. https://doi.org/10.1111/j.1360-0443.2009.02828.x

Tidwell, J. (2010). Designing interfaces: Patterns for effective interaction design. O’Reilly Media.

Waldman, A. E. (2020). Cognitive biases, dark patterns, and the ‘privacy paradox’. Current Opinion in Psychology, 31, 105-109. https://doi.org/10.1016/j.copsyc.2019.08.025

Watzl, S. (2023). What attention is: The priority structure account. Wiley Interdisciplinary Reviews: Cognitive Science, 14(5), Article e1632. https://doi.org/10.1002/wcs.1632

World Health Organization. (2018, September 13). Public health implications of excessive use of the internet and other communication and gaming platforms. https://www.who.int/news/item/13-09-2018-public-health-implications-of-excessive-use-of-the-internet-and-other-communication-and-gaming-platforms

World Health Organization. (n.d.). Addictive behaviours: Gaming disorder. https://www.who.int/news-room/questions-and-answers/item/addictive-behaviours-gaming-disorder

Young, K. S. (1998). Internet addiction: The emergence of a new clinical disorder. Cyberpsychology & Behavior, 1(3), 237-244. https://doi.org/10.1089/cpb.1998.1.237

Date received: December 2024

Date accepted: March 2025


Date published: 12-05-2025