A Series of Essays Dedicated to Mental Health, Data Privacy and Social Technology Reform in America
The purpose of this series of essays is to emphasize the critical role of social technology reform in advancing the United States' interests in national security and human rights. While we acknowledge the numerous benefits that technology provides to society, these essays argue that significant technology reform is necessary to maintain a well-functioning society that represents the best interests of its citizens while preserving the nation’s global leadership in a competitive marketplace.
We use TikTok as a case study to highlight the need for regulation. However, broader industry-wide regulations are essential to effectively address the challenges posed by the overuse of technology in society, all while recognizing the considerable benefits technology brings to our daily lives.
Finally, this series offers a proposed solution to the current dilemma of TikTok potentially being banned in the United States. This is the first essay in a series on technology reform.
Introduction:
Purpose of Paper:
The Founding Fathers of America intentionally made changes to our democracy incredibly difficult. The structure of our government instills the notion that a tremendous amount of bipartisan consensus, driven by public support, is required to bring about change that the nation will accept. Understandably, this creates frustration and resentment, as change becomes a tedious process that requires substantial nuance to become the law of the land.
There are times when discovering the truth is not self-evident. This is not one of those times. Mental health matters, and it is not a partisan issue. We all have differences, which is to be expected. Yet, we must collectively agree on the framework to have this debate as a nation or risk ceasing to exist as a united entity.
A nation is a set of collective principles that unite us toward a common goal. Today, we are more divided than ever. Those with differing viewpoints are often seen as adversaries rather than individuals with different life experiences.
In the past, we have solved the greatest challenges of our time by working with those we vehemently disagreed with. We have gone to the moon, prevented nuclear disasters, and ended world wars. Yet, there is something unprecedented in human history: the rise of highly addictive and dangerous social media algorithms.
There is no doubt that technology has advanced our society, ushering in a new age of innovation that can make our lives easier and more productive. However, the negative impacts are more insidious and hidden. With rapid advancements in technology, we have become more polarized and radicalized than ever.
We are still grappling with understanding the long-term effects of the internet on the social fabric of our society. Mental illness is evident throughout institutions, people, and interactions. It does not take a peer-reviewed research study to recognize this, although many scholars have eloquently laid out this argument.
We are still grappling with understanding the long-term effects of these algorithmic models. Now imagine new models that are exponentially more powerful. Imagine a system that we don’t understand. How do we create policy for something we have no idea how to address? How does our government keep up with a model that changes rapidly?
This series of essays is dedicated primarily to the American public to understand and address our mental health and privacy crisis. We will analyze this through a specific case study: TikTok, the social media application revolutionizing the world. This is of critical importance in the United States, where President Biden signed a bill to ban the application unless certain conditions are met, sparking significant public intrigue.
Although the divestiture or ban on TikTok is primarily due to national security concerns, it is essential to highlight that mental health and data privacy regulations would benefit all parties involved. Addressing mental health as a national security and public health concern can help mitigate the risks associated with TikTok and other social media platforms.
Our paper seeks to offer recommendations that ensure the well-being of our citizens while addressing the broader national security concerns of the United States. This approach recognizes that data privacy and mental health are integral to national security, creating a comprehensive strategy that serves the interests of all stakeholders involved. This paper focuses only on the post-effects of model development on society; however, it acknowledges that there may be appropriate policy measures for the pre-model development stages.
Mental Health is a state of well-being in which an individual can cope with life's stresses, work productively, and contribute to their community.
We acknowledge that all human beings have mental health needs, regardless of their race, color, religion, sex, national origin, age, disability, or genetic information.
We acknowledge that the preamble of the American Constitution promotes the general welfare of the public.
We acknowledge that although all human beings have mental health needs, it is difficult as a policy measure to create specialized solutions that cater to the needs of each individual.
We acknowledge that technology has many great uses in our society. It enables us to be more productive, makes our lives easier, builds social connections, and helps us discover ourselves better.
We acknowledge that technology also has the capacity to cause great harm to society due to its overreliance and the addictive nature of algorithms.
We acknowledge that the government represents the will of the public. When enough individuals collectively believe in a set of ideals, they become represented in our institutions.
We acknowledge that we live in a free market society—companies have the right to innovate and create revolutionary technology that transforms how we live our lives.
We acknowledge that with great power comes greater responsibility. No person, government, or institution is incorruptible. We intend to create systems that don't fall prey to the weaknesses of human nature but work with them.
Lastly, we acknowledge that mental health requires constant and ongoing monitoring. It is an imperfect process, but all human beings are inherently imperfect.
Social media platforms are intentionally designed to be addictive, capturing and retaining user attention. Their primary goal is to keep users engaged while fostering habits that perpetuate a cycle of content consumption, potentially leading to overdependence and exacerbating mental health issues.
A key feature contributing to this problem is personalized content delivery, which reinforces users' existing preferences and creates echo chambers. These isolated environments can promote extremist viewpoints and self-reinforcing ideologies. Furthermore, individuals are incentivized to produce content that garners views or builds an audience, sometimes at the expense of their own well-being.
A significant privacy and mental health concern associated with social media is its tendency to highlight only the positive aspects of users' lives, creating a misleading and incomplete portrayal of reality. This curated presentation does not reflect the day-to-day struggles individuals face, often leading to feelings of isolation, inadequacy, and stress. Prolonged social media use has been linked to these negative emotional impacts, as users frequently compare themselves to others.
Additionally, the rapid sharing of content on these platforms can exacerbate harmful behaviors, such as online harassment and the spread of false information, presenting broader societal challenges. Readers of these essays are highly encouraged to explore the works of Tristan Harris, Cal Newport, and Jonathan Haidt.
Key Stakeholders to Consider:
United States Government:
The role of the government is to establish policies that promote the responsible use of technology. It must ensure that platforms are held accountable for the data they collect on their users and prioritize the general well-being of the public. Since responsible technology use may conflict with companies' profit incentives, it is the government's duty to intervene when necessary.
TikTok (ByteDance):
As a leading social media company, TikTok holds significant influence over user experiences through its advanced algorithms. By implementing features that promote healthier usage patterns, increasing transparency, and prioritizing user safety, the platform can set a precedent for responsible operations within the industry. What sets TikTok apart from other platforms is its ownership by the Chinese technology company ByteDance, which raises additional national security concerns compared to American-owned companies.
Peer Social Media Companies:
The largest concern for social media platforms is that regulation will hamper their economic potential, a concern that is valid to a certain extent. However, if no regulations are passed at all, societal cohesion in the United States could be disrupted to a devastating degree, destabilizing the fabric of the nation.
This is why any regulation must be thoughtfully crafted to ensure that the United States maintains its distinctive global advantage while still allowing companies to foster a culture of innovation and growth. Competing platforms also share an ethical responsibility to ensure their technologies do not cause harm to their consumers.
Collaboration on best practices and the development of shared standards can help create a digital environment that fosters positive engagement rather than dependency or harm. Additionally, any legal or court cases involving TikTok will set a precedent for broader regulatory actions across the industry, drawing significant attention to these issues.
Creating a Basis for Technology Reform:
Meaningful technology reform that drives real change can only occur when issues such as mental health, data privacy, and ethical technology use are framed in a way that aligns with the interests of all stakeholders. This can be achieved by connecting these issues to areas of higher priority, such as economic productivity, national security, and societal cohesion. By linking these priorities together, we can implement reform in a more sustainable way, especially since mental health, data privacy, and ethical technology use may not rank highly on the organizational priority list for most companies.
Stakeholder Collaboration:
Extensive collaboration among all key stakeholders is needed to develop policies that address mental health and privacy concerns and produce an effective, balanced solution. These policy positions must sit at the intersection of digital well-being and each individual stakeholder's interests. The most challenging aspect of stakeholder collaboration is ensuring that businesses are universally aligned in making a change, as a disproportionate impact on any one stakeholder will delay agreement.
Public Awareness and Education:
Public awareness and education about these issues are the cornerstone of driving forward policy positions in mental health and responsible social media use. When citizens become informed about the negative effects of technology usage, advocacy groups and activists can put political and social pressure on politicians and lobbyists to push for change within corporations. If corporations are not incentivized to make changes to their platforms through external social pressure, there is little motivation to subvert profit motives to do so.
Research and Evidence-Based Policy:
Investing in research to understand the long-term effects of social media on health is extremely important. These findings will inform policy and create regulations that promote user well-being and data privacy protections grounded in evidence. Due to the rapid and evolving nature of technology, there will need to be continuous and ongoing monitoring of emerging models to help drive and develop policy positions.
Demonstrating Tangible Benefits:
For the Government: Expanding government oversight in key areas where businesses lack accountability can enhance national security.
For Social Media Companies: Prioritizing user well-being helps build brand reputation, foster user trust, and sustain long-term engagement. Companies that act ethically will attract more users, avoid regulatory penalties, and ensure a stable operating environment.
For the Public: Improved mental health and digital literacy can enhance the overall quality of life. Individuals who are aware of the impacts of social media can make more informed choices, leading to better mental health outcomes and more balanced social media use.
By framing mental health and data privacy issues in terms that resonate with the self-interest of all stakeholders, we can foster a collective commitment to technology reform. This creates a powerful impetus for change, ensuring that mental health becomes a central consideration in the development and regulation of digital technologies.
Essay #1 - To the American Government
The Importance of Regulating TikTok and Social Media for National Mental Health and Security
Introduction:
Among the challenges the government faces, there is a rapidly evolving problem in the digital landscape involving mental health, data privacy, and national security. The case of TikTok exemplifies this issue exceptionally well, as it highlights the risks associated with modern digital technology. This essay argues that it is imperative for the American government to regulate TikTok and other social media platforms in a way that addresses these concerns while still maintaining our competitive economic advantage.
Pain Point #1: Lack of Transparency and Algorithm Accountability
Transparency in algorithmic decision-making is critical for ensuring trust and fairness on social media platforms such as TikTok. Users need to understand, in layman's terms, how algorithms operate and how content is prioritized and recommended to them. A lack of transparency and accountability makes it difficult to ensure there isn't hidden bias or preferential treatment embedded in users' content feeds.
In the context of the American government, if it seeks to limit the amount of foreign influence or propaganda being disseminated to the general public, it must work in conjunction with social media companies to enable transparent algorithmic usage that promotes free speech while limiting foreign influence.
In the case of TikTok, this is especially important because it is difficult to understand the direct or indirect influence of the Chinese government on the American public without measurable algorithmic accountability.
Recommendation 1: Transparent Algorithms
Algorithmic Transparency:
Social media companies are responsible for establishing the criteria for how content is prioritized, the factors influencing recommendations, and how user interactions impact content visibility. A federal law is needed, similar to the proposed Data Transparency and Algorithm Transparency Agreement (DATA) Act of 2023-2024. However, coverage should extend to all users of social media platforms, not just the 30 million monthly active users contemplated there (though this threshold is negotiable).
Transparency Reports:
Social media platforms need to publish regular transparency reports that provide insights into the impacts of their algorithms on content distribution for both the federal government and the general public. These reports should include information on the types of content most frequently prioritized, demographic analysis, and any changes made to the algorithmic system. By explaining how these algorithms affect content visibility in layman's terms, users and regulators will gain a clearer understanding of the types of content most frequently prioritized, enabling proper assessments of their impact on mental health and digital privacy.
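As an illustration of the kind of aggregate such a public report could carry, here is a minimal sketch in Python. The category names and the input shape are hypothetical, not drawn from any actual platform's reporting pipeline:

```python
from collections import Counter
from typing import Dict, List

def transparency_summary(impressions: List[dict]) -> Dict[str, float]:
    """Return the share of recommended impressions each content
    category received: the kind of aggregate figure a public
    transparency report could publish in layman's terms."""
    counts = Counter(imp["category"] for imp in impressions)
    total = sum(counts.values())
    return {cat: round(n / total, 3) for cat, n in counts.items()}
```

A real report would add demographic breakdowns and change logs, but even a simple share-of-impressions table gives regulators and users a first measurable view of what the algorithm prioritizes.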
Long-Term Studies:
It should be mandated that social media companies conduct long-term studies on the mental health effects of their algorithms and publicly release their findings to the United States government and the public. Due to the rapid and evolving nature of social media algorithms, it is important that technology companies periodically release data related to the psychological effects of their newest models on users. There should also be regular user feedback surveys and engagement content analysis, measuring time spent on the platform against the mental health effects of social media usage.
Pain Point #2 - The Lone Wolf Problem and Radicalization
The “lone wolf” problem refers to individuals who have become radicalized and developed extremist views. These individuals then act independently to commit acts of violence. Social media platforms, such as TikTok, contribute to this problem by creating echo chambers that increase the risk of radicalized individuals resorting to violence. The previous Biden administration emphasized the risks associated with dual-use technologies, which have the potential to cause significant harm. For the incoming Trump administration, it is crucial to consider ways to prevent users from reaching this level of radicalization, ensuring that individuals with lone wolf tendencies do not become radicalized.
In the case of TikTok, this is especially important due to the power of its social media algorithms. As these algorithms become increasingly sophisticated and remain outside the jurisdiction of the United States government, there is a heightened risk that the Chinese government could spread propaganda that radicalizes American users toward anti-American sentiment.
Recommendation 2 - Intervention Steps
The government should mandate that social media companies implement features that limit screen time and notifications while warning users about the addictive nature of these platforms. These notifications should be similar to warning labels on alcohol and tobacco products, indicating the potential dangers of these platforms and the risks associated with excessive use.
To enhance privacy and protection, companies must ensure that personal data is collected legally and under the strictest conditions to prevent misuse or exploitation of individuals. There should also be automated consumption reminders: a notification that appears once a user has consumed content for an excessive stretch of time (the exact threshold is negotiable). Such reminders may help users step away from content consumption and pursue healthier habits.
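The reminder mechanism can be sketched as follows. The 60-minute threshold and the message wording are placeholder assumptions, since the exact timeframe is left negotiable:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical threshold; the exact timeframe is negotiable.
REMINDER_THRESHOLD_MINUTES = 60

@dataclass
class Session:
    minutes_watched: float = 0.0
    reminders_sent: int = 0

def record_watch_time(session: Session, minutes: float) -> Optional[str]:
    """Accumulate watch time and return a break reminder each time the
    user crosses another multiple of the threshold."""
    session.minutes_watched += minutes
    due = int(session.minutes_watched // REMINDER_THRESHOLD_MINUTES)
    if due > session.reminders_sent:
        session.reminders_sent = due
        return (f"You have been watching for about "
                f"{int(session.minutes_watched)} minutes. "
                f"Consider taking a break.")
    return None
```

The point of the sketch is that the intervention is stateful: it fires once per threshold crossed rather than nagging on every interaction, which is closer to how a warning-label approach would work in practice.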
These measures could be labeled as "intervention steps" for users, meaning there should be additional mechanisms in place to encourage users to pause and reflect before consuming new content. Further thought should be given to other appropriate intervention steps depending on the circumstances. For example, requiring a government-issued ID to access a personal social media account might be too excessive, as it could be used to build psychological profiles on users, potentially enabling misuse by government actors or social media companies.
Finally, to prevent radicalization, the government should work in conjunction with social media companies to address the types of content recommended to users. This can be achieved by altering algorithmic recommendations to prioritize not only user engagement but also diverse content exposure. When users are consistently exposed to diverse viewpoints, they can develop a more holistic understanding of various topics.
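A minimal sketch of what "engagement plus diversity" re-ranking could look like. The `diversity_weight` parameter and the topic labels are illustrative assumptions; production recommender systems are far more complex:

```python
from typing import List, Set, Tuple

# Each candidate is (item_id, topic, engagement_score); names are illustrative.
Candidate = Tuple[str, str, float]

def rerank_with_diversity(candidates: List[Candidate],
                          diversity_weight: float = 0.3) -> List[Candidate]:
    """Greedily build a feed where items from topics not yet shown get a
    boost, trading some pure engagement for broader topic exposure."""
    remaining = list(candidates)
    ranked: List[Candidate] = []
    seen_topics: Set[str] = set()
    while remaining:
        def blended(c: Candidate) -> float:
            # Items from unseen topics receive a novelty bonus.
            novelty = 1.0 if c[1] not in seen_topics else 0.0
            return (1 - diversity_weight) * c[2] + diversity_weight * novelty
        best = max(remaining, key=blended)
        remaining.remove(best)
        ranked.append(best)
        seen_topics.add(best[1])
    return ranked
```

With a weight of 0.3, a lower-engagement item from a fresh topic can outrank a second item from an already-shown topic, which is exactly the echo-chamber-breaking behavior the recommendation describes.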
Pain Point #3: Insufficient Data Privacy Regulations
The United States needs to develop more stringent data privacy standards, similar to the European Union's General Data Protection Regulation (GDPR), which represents the highest standard for data privacy and security. The European standards emerged out of necessity after World War II, when data had been used to identify and persecute members of minority groups. In response to these events, Europe took a stronger stance on data privacy as a fundamental human right.
Similar conditions could arise in the United States if societal conditions continue to degrade in a similar manner and proper technology reforms are not implemented. If the data of individuals is leaked and abused by foreign entities, it could allow for the exploitation of private lives through blackmail, cyberattacks, psychological manipulation, and other malicious purposes.
In the case of TikTok, this is especially dangerous because a foreign “adversary” with access to this type of information could potentially influence policies in the United States.
Recommendation 3: Robust Data Privacy Standards
Data Localization:
Data must be kept within local borders to reduce the risk of access by unauthorized foreign actors, even though doing so may be more costly. While there may be operational challenges with this approach, it gives the government greater control over that data and prevents foreign adversaries from accessing the data of American citizens.
Regular Audits:
There need to be regular audits of social media applications to ensure compliance with data protections and standards, conducted by third-party organizations in conjunction with the appropriate government body. If there is a lack of compliance, the appropriate fines should be imposed to ensure that social media companies adhere to the necessary measures. Without some mechanism to hold these companies accountable, there is no incentive for compliance.
Data Minimization:
Any data collected on American citizens should be limited in scope and used only for the intended purpose for which it was submitted. Companies must clearly define what type of data is being collected and the purpose for which it is being collected. There must also be clear standards to ensure that the data collected is not retained longer than necessary, in accordance with storage limitations. The deletion of data that is no longer necessary should be a common practice among companies.
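The storage-limitation principle described here can be illustrated with a small purge routine. The purposes and day counts are invented for illustration, not proposed regulatory values:

```python
from datetime import datetime, timedelta
from typing import Dict, List

# Illustrative retention windows per collection purpose, in days;
# real limits would be set by regulation, not by this sketch.
RETENTION_DAYS: Dict[str, int] = {
    "account_service": 365,
    "analytics": 90,
    "ad_personalization": 30,
}

def purge_expired(records: List[dict], now: datetime) -> List[dict]:
    """Keep only records still inside the retention window for the
    purpose they were collected for; anything else (including records
    with an unknown purpose) is dropped."""
    kept = []
    for rec in records:
        limit = RETENTION_DAYS.get(rec["purpose"], 0)
        if now - rec["collected_at"] <= timedelta(days=limit):
            kept.append(rec)
    return kept
```

Tying every record to a declared purpose, and deleting it once that purpose expires, is the operational core of data minimization: data with no stated purpose simply cannot be retained.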
Strengthening of User Rights:
There needs to be a greater emphasis on giving users more control over their data. This means that users should have authority over how their personal information is stored, processed, used, and deleted. By allowing users greater control over their data, it promotes accountability and transparency. In terms of discrimination and bias, if users feel that their data is being misused against them by certain institutional bodies, it provides them with a level of control and leverage over how their information is used.
Selling of Data to Third Parties and Removal of Consent:
The selling of data to third parties without the express consent of the user should be strictly prohibited. Users need to be in control of their data and personal information. If social media companies wish to sell user data to third-party companies, users must receive monetary compensation for doing so. In terms of removal of consent, users should have the right to revoke consent for their individual data sharing.
User Control and Customization:
There should be complete user control and customization of their content feed based on their preferences. Providing users with a way to curate their digital experience in alignment with their interests and values will reduce potential exposure to harmful content and sources in a way that empowers them. That said, there need to be built-in mechanisms to ensure users don’t fall into a content echo chamber of their own preferences. This can be achieved by monitoring algorithmic bias to ensure content diversity within the user customization process. Additionally, there should be an opt-out mechanism in place, allowing users to choose not to have their data input into the algorithmic system.
Security Measures:
While this paper is primarily focused on the data privacy aspects of TikTok and other social media platforms, it also acknowledges the pivotal role of data security, cybersecurity, and encryption standards, which are important to address as well. One key factor to consider is the implementation of a robust system of data-security breach notifications that allows users to know when their data has been compromised or leaked.
For Further Discussion:
For the purposes of this paper, we will leave the following topics for future discussion: interoperability, cross-data transfers, biometric data protection, and bias/discrimination in algorithms. Lastly, an important idea to explore in future papers is the concept of treating personal data under property law, which would represent an interesting paradigm shift. While we are not expressly advocating for treating personal data under property law at this time, we would like to compare it to other frameworks in future essays.
We acknowledge the importance of these topics; however, for the purposes of this paper, we will address them at a later date.
Pain Point #4 - Decline in Military Recruitment
Another critical issue to consider is the declining rates of military recruitment, which can be attributed to declining mental and physical health among citizens. Mental health issues, such as anxiety and depression, combined with lower physical fitness, contribute to a diminished interest in military careers. Social media can exacerbate these issues by promoting unrealistic body standards, perpetuating negative self-images, and encouraging sedentary behaviors through prolonged screen time.
In the context of TikTok and other social media platforms, it must be understood that we live in an attention economy. If our attention spans are diverted from productive means that contribute to better health, they can devolve into wasteful social media usage. Although this paper acknowledges the entertainment aspects of social media, the excessive nature of content consumption may detract from more productive use of one’s time in leading a healthier lifestyle.
Recommendation 4 - Reassessment of Health Standards
The government should promote campaigns that highlight the benefits of military service and healthier lifestyles. It must actively reassess current FDA regulations to ensure they serve the best interests of the American people. The government should be involved in creating a populace that prioritizes healthy initiatives to cultivate citizens who meet the highest standards of mental and physical fitness.
A comprehensive reassessment is necessary to identify the key factors that contribute to both physical and mental fitness and address these concerns effectively. In tandem with these efforts, the government should collaborate with social media companies to promote businesses that encourage a more active and health-conscious lifestyle. Given that there might be contentious debate on what constitutes “healthy,” we should start with the lowest common denominators—such as drinking clean water, eating unprocessed food, and maintaining regular sleep patterns—before working our way toward more contested methods.
This approach should not focus solely on military recruitment but instead aim to foster a healthier American population overall. As a downstream effect, this will naturally lead to a pool of candidates with improved mental and physical fitness, enhancing the quality of military recruitment as a byproduct.
Solution to the TikTok Divestiture-or-Ban Law
Conclusion:
There are a couple of problems with the TikTok divest-or-be-banned law. From the perspective of the United States, there is the question of national security and of a foreign "adversary" (the Chinese government) accessing the data of American citizens. This is why the American government wants ByteDance (the parent company of TikTok) to sell the platform: it fears the Chinese government may exploit the data for its own political gain. If ByteDance is unable to sell the platform, TikTok will be banned in the United States. Well founded or not, this is a concern the United States government takes seriously.
To mitigate those concerns, TikTok has proposed Project Texas. It has several central tenets, but the main idea is that a special subsidiary called TikTok U.S. Data Security (USDS), with its own leadership, would monitor all American operations while ByteDance retained overall ownership of the company. USDS would protect and monitor American data, with that data stored on Oracle Cloud infrastructure in the United States. In theory, any breaches or security concerns would be mitigated through this structure. The main issue, again, is that there is no way to guarantee that Chinese national security and surveillance laws won't bypass this structure and compel the collection of American data.
While Project Texas is a conceptual solution, we propose that the United States have full operational control of TikTok through an independent entity. This entity would have an independent board structure with no influence from ByteDance. From an ownership perspective, ByteDance could retain minority ownership in the independent entity but would have no decision-making authority. All data would remain localized and processed in the United States. To make this worthwhile for ByteDance, it would receive a royalty or a defined percentage of revenue for the intellectual property and use of the TikTok brand. In this way, ByteDance retains revenue sharing while all American data remains secure and under the control of the United States.
In order for this to work, there will need to be a massive amount of stakeholder collaboration among ByteDance, an independent U.S. board of directors, the current U.S. TikTok team, Oracle, and the United States government (President Trump, Congress, CFIUS, FTC, and the FCC).
In future essays, we will dive further into the specifics of this process and how it would work.
A Series of Essays Dedicated to Mental Health, Data Privacy and Social Technology Reform in America
The purpose of this series of essays is to emphasize the critical role of social technology reform in advancing the United States' interests in national security and human rights. While we acknowledge the numerous benefits that technology provides to society, these essays argue that significant technology reform is necessary to maintain a well-functioning society that represents the best interests of its citizens while preserving the nation’s global leadership in a competitive marketplace.
We use TikTok as a case study to highlight the need for regulation. However, broader industry-wide regulations are essential to effectively address the challenges posed by the overuse of technology in society, all while recognizing the considerable benefits technology brings to our daily lives.
Finally, offers a proposed solution to the current dilemma of TikTok potentially being banned in the United States. This is the first essay in a series of technology reform.
Introduction:
Purpose of Paper:
The Founding Fathers of America intentionally made changes to our democracy incredibly difficult. The structure of our government instills the notion that a tremendous amount of bipartisan consensus, driven by public support, is required to bring about change that the nation will accept. Understandably, this creates frustration and resentment, as change becomes a tedious process that requires substantial nuance to become the law of the land.
There are times when discovering the truth is not self-evident. This is not one of those times. Mental health matters, and it is not a partisan issue. We all have differences, which is to be expected. Yet, we must collectively agree on the framework to have this debate as a nation or risk ceasing to exist as a united entity.
A nation is a set of collective principles that unite us toward a common goal. Today, we are more divided than ever. Those with differing viewpoints are often seen as adversaries rather than individuals with different life experiences.
In the past, we have solved the greatest challenges of our time by working with those we vehemently disagreed with. We have gone to the moon, prevented nuclear disasters, and ended world wars. Yet we now face something unprecedented in human history: the rise of highly addictive and dangerous social media algorithms.
There is no doubt that technology has advanced our society, ushering in a new age of innovation that can make our lives easier and more productive. However, the negative impacts are more insidious and hidden. With rapid advancements in technology, we have become more polarized and radicalized than ever.
We are still grappling with understanding the long-term effects of the internet on the social fabric of our society. Mental illness is evident throughout institutions, people, and interactions. It does not take a peer-reviewed research study to recognize this, although many scholars have eloquently laid out this argument.
We are still grappling with understanding the long-term effects of these algorithmic models. Now imagine new models that are exponentially more powerful. Imagine a system that we don’t understand. How do we create policy for something we have no idea how to address? How does our government keep up with a model that changes rapidly?
This series of essays is dedicated primarily to helping the American public understand and address our mental health and privacy crisis. We will analyze this through a specific case study: TikTok, the social media application revolutionizing the world. This is of critical importance in the United States, where President Biden signed a bill requiring that the application be banned unless certain conditions are met, sparking significant public interest.
Although the divestiture or ban on TikTok is primarily due to national security concerns, it is essential to highlight that mental health and data privacy regulations would benefit all parties involved. Addressing mental health as a national security and public health concern can help mitigate the risks associated with TikTok and other social media platforms.
Our paper seeks to offer recommendations that ensure the well-being of our citizens while addressing the broader national security concerns of the United States. This approach recognizes that data privacy and mental health are integral to national security, creating a comprehensive strategy that serves the interests of all stakeholders involved. This paper focuses only on the post-effects of model development on society; however, it acknowledges that there may be appropriate policy measures for the pre-model development stages.
______________________________________________________________________________
Preamble:
Mental health is a state of well-being in which an individual can cope with life's stresses, work productively, and contribute to their community.
We acknowledge that all human beings have mental health needs, regardless of their race, color, religion, sex, national origin, age, disability, or genetic information.
We acknowledge that the preamble of the American Constitution promotes the general welfare of the public.
We acknowledge that although all human beings have mental health needs, it is difficult as a policy measure to create specialized solutions that cater to the needs of each individual.
We acknowledge that technology has many great uses in our society. It enables us to be more productive, makes our lives easier, builds social connections, and helps us discover ourselves better.
We acknowledge that technology also has the capacity to cause great harm to society due to its overreliance and the addictive nature of algorithms.
We acknowledge that the government represents the will of the public. When enough individuals collectively believe in a set of ideals, they become represented in our institutions.
We acknowledge that we live in a free market society—companies have the right to innovate and create revolutionary technology that transforms how we live our lives.
We acknowledge that with great power comes greater responsibility. No person, government, or institution is incorruptible. We intend to create systems that don't fall prey to the weaknesses of human nature but work with them.
Lastly, we acknowledge that mental health requires constant and ongoing monitoring. It is an imperfect process, but all human beings are inherently imperfect.
__________________________________________________________________________
Addictiveness and Mental Health Effects:
Social media platforms are intentionally designed to be addictive, capturing and retaining user attention. Their primary goal is to keep users engaged while fostering habits that perpetuate a cycle of content consumption, potentially leading to overdependence and exacerbating mental health issues.
A key feature contributing to this problem is personalized content delivery, which reinforces users' existing preferences and creates echo chambers. These isolated environments can promote extremist viewpoints and self-reinforcing ideologies. Furthermore, individuals are incentivized to produce content that garners views or builds an audience, sometimes at the expense of their own well-being.
A significant mental health concern associated with social media is its tendency to highlight only the positive aspects of users' lives, creating a misleading and incomplete portrayal of reality. This curated presentation does not reflect the day-to-day struggles individuals face, often leading to feelings of isolation, inadequacy, and stress. Prolonged social media use has been linked to these negative emotional impacts, as users frequently compare themselves to others.
Additionally, the rapid sharing of content on these platforms can exacerbate harmful behaviors, such as online harassment and the spread of false information, presenting broader societal challenges. Readers of these essays are highly encouraged to explore the works of Tristan Harris, Cal Newport, and Jonathan Haidt.
Key Stakeholders to Consider:
United States Government:
The role of the government is to establish policies that promote the responsible use of technology. It must ensure that platforms are held accountable for the data they collect on their users and prioritize the general well-being of the public. Since responsible technology use may conflict with companies' profit incentives, it is the government's duty to intervene when necessary.
TikTok (ByteDance):
As a leading social media company, TikTok holds significant influence over user experiences through its advanced algorithms. By implementing features that promote healthier usage patterns, increasing transparency, and prioritizing user safety, the platform can set a precedent for responsible operations within the industry. What sets TikTok apart from other platforms is its ownership by the Chinese technology company ByteDance, which raises additional national security concerns compared to American-owned companies.
Peer Social Media Companies:
The largest concern for social media platforms is that regulations will hamper their economic potential, a concern that may be valid to a certain extent. However, if no regulations are passed, the societal cohesion of the United States could be disrupted to a devastating degree, destabilizing the fabric of the nation.
This is why any regulation must be thoughtfully crafted to ensure that the United States maintains its distinctive global advantage while still allowing companies to foster a culture of innovation and growth. Competing platforms also share an ethical responsibility to ensure their technologies do not cause harm to their consumers.
Collaboration on best practices and the development of shared standards can help create a digital environment that fosters positive engagement rather than dependency or harm. Additionally, any legal or court cases involving TikTok will set a precedent for broader regulatory actions across the industry, drawing significant attention to these issues.
Creating a Basis for Technology Reform:
Meaningful technology reform that drives real change can only occur when issues such as mental health, data privacy, and ethical technology use are framed in ways that align with the interests of all stakeholders. This can be achieved by connecting these issues to areas of higher priority, such as economic productivity, national security, and societal cohesion. Linking these priorities together allows meaningful reform to be implemented in a more sustainable way, especially since mental health, data privacy, and ethical technology use may not rank highly on most companies' organizational priority lists.
Stakeholder Collaboration:
There needs to be a massive amount of collaboration among all key stakeholders to develop policies that address mental health and privacy concerns, creating an effective and balanced solution. These policy positions must sit at the intersection of digital well-being and each individual stakeholder’s interests. The most challenging aspect of stakeholder collaboration is ensuring that businesses are universally aligned in making a change, as a disproportionate impact on any one stakeholder will delay agreement.
Public Awareness and Education:
Public awareness and education about these issues are the cornerstone of advancing policy positions on mental health and responsible social media use. When citizens become informed about the negative effects of technology usage, advocacy groups and activists can put political and social pressure on politicians and lobbyists to push for change within corporations. Without that external social pressure, corporations have little motivation to subordinate profit motives and change their platforms on their own.
Research and Evidence-Based Policy:
Investing in research to understand the long-term effects of social media on health is extremely important. These findings will inform policy and create regulations that promote user well-being and data privacy protections grounded in evidence. Due to the rapid and evolving nature of technology, there will need to be continuous and ongoing monitoring of emerging models to help drive and develop policy positions.
Demonstrating Tangible Benefits:
For the Government: Expanding government oversight in key areas where businesses lack accountability can enhance national security.
For Social Media Companies: Prioritizing user well-being helps build brand reputation, foster user trust, and sustain long-term engagement. Companies that act ethically will attract more users, avoid regulatory penalties, and ensure a stable operating environment.
For the Public: Improved mental health and digital literacy can enhance overall quality of life. Individuals who are aware of the impacts of social media can make more informed choices, leading to better mental health outcomes and more balanced social media use.
By framing mental health and data privacy issues in terms that resonate with the self-interest of all stakeholders, we can foster a collective commitment to technology reform. This creates a powerful impetus for change, ensuring that mental health becomes a central consideration in the development and regulation of digital technologies.
Essay #1 - To the American Government
The Importance of Regulating TikTok and Social Media for National Mental Health and Security
Introduction:
Among the challenges the government faces, there is a rapidly evolving problem in the digital landscape involving mental health, data privacy, and national security. The case of TikTok exemplifies this issue exceptionally well, as it highlights the risks associated with modern digital technology. This essay argues that it is imperative for the American government to regulate TikTok and other social media platforms in a way that addresses these concerns while still maintaining our competitive economic advantage.
Pain Point #1: Lack of Transparency and Algorithm Accountability
Transparency in algorithmic decision-making is critical for ensuring trust and fairness on social media platforms such as TikTok. Users need to understand, in layman’s terms, how algorithms operate and how content is prioritized and recommended to them. A lack of transparency and accountability makes it difficult to ensure there are no hidden biases or preferences shaping users’ content feeds.
In the context of the American government, if it seeks to limit the amount of foreign influence or propaganda being disseminated to the general public, it must work in conjunction with social media companies to enable transparent algorithmic usage that promotes free speech while limiting foreign influence.
In the case of TikTok, this is especially important because it is difficult to understand the direct or indirect influence of the Chinese government on the American public without measurable algorithmic accountability.
Recommendation 1: Transparent Algorithms
Algorithmic Transparency:
Social media companies are responsible for establishing the criteria for how content is prioritized, the factors influencing recommendations, and how user interactions impact content visibility. There needs to be a federal law in place similar to the proposed Data Transparency and Algorithm Transparency Agreement (DATA) Act of 2023-2024. However, coverage should extend to all users of social media platforms, not just the 30,000,000 monthly active users covered in the proposal (though this threshold is negotiable).
Transparency Reports:
Social media platforms need to publish regular transparency reports that provide the federal government and the general public with insights into how their algorithms affect content distribution. These reports should include information on the types of content most frequently prioritized, demographic analyses, and any changes made to the algorithmic system. By explaining these effects in layman's terms, platforms will enable users and regulators to properly assess their impact on mental health and digital privacy.
Long-Term Studies:
It should be mandated that social media companies conduct long-term studies on the mental health effects of their algorithms and publicly release their findings to the United States government and the public. Due to the rapid and evolving nature of social media algorithms, it is important that technology companies periodically release data related to the psychological effects of their newest models on users. There should also be regular user feedback surveys and engagement content analysis, measuring time spent on the platform against the mental health effects of social media usage.
______________________________________________________________________________
Pain Point #2 - The Lone Wolf Problem and Radicalization
The “lone wolf” problem refers to individuals who become radicalized, develop extremist views, and then act independently to commit acts of violence. Social media platforms such as TikTok contribute to this problem by creating echo chambers that increase the risk of radicalized individuals resorting to violence. The previous Biden administration emphasized the risks associated with dual-use technologies, which have the potential to cause significant harm. For the incoming Trump administration, it is crucial to consider ways to prevent individuals with lone wolf tendencies from reaching this level of radicalization.
In the case of TikTok, this is especially important due to the power of its social media algorithms. As these algorithms become increasingly sophisticated and remain outside the jurisdiction of the United States government, there is a heightened risk that the Chinese government could spread propaganda that radicalizes American users toward anti-American sentiment.
Recommendation 2 - Intervention Steps
The government should mandate that social media companies implement features that limit screen time and notifications while warning users about the addictive nature of these platforms. These notifications should be similar to warning labels on alcohol and tobacco products, indicating the potential dangers of these platforms and the risks associated with excessive use.
To enhance privacy and protection, companies must ensure that personal data is collected legally and under the strictest conditions to prevent misuse or exploitation of individuals. There should also be automated reminders tied to the duration of content consumption, with a notification appearing once a user has consumed content for an excessive amount of time (the exact time frame is negotiable). This may help remind users to step away from their content consumption and pursue healthier habits.
These measures could be labeled "intervention steps" for users, meaning there should be additional mechanisms in place to encourage users to pause and reflect before consuming new content. Further thought should be given to other appropriate intervention steps depending on the circumstances. For example, requiring a government-issued ID to access a personal social media account would likely be excessive, as it could be used to build psychological profiles of users, potentially enabling misuse by government actors or social media companies.
Finally, to prevent radicalization, the government should work in conjunction with social media companies to address the types of content recommended to users. This can be achieved by altering algorithmic recommendations to prioritize not only user engagement but also diverse content exposure. When users are consistently exposed to diverse viewpoints, they can develop a more holistic understanding of various topics.
Pain Point #3: Insufficient Data Privacy Regulations
The United States needs to develop more stringent data privacy standards, similar to the European Union’s General Data Protection Regulation (GDPR), which represents the highest standard for data privacy and security. The European standards emerged out of necessity after World War II, when data had been used to identify and persecute members of minority groups. In response to these events, Europe took a stronger stance on data privacy as a fundamental human right.
Similar conditions could arise in the United States if societal conditions continue to degrade in a similar manner and proper technology reforms are not implemented. If the data of individuals is leaked and abused by foreign entities, it could allow for the exploitation of private lives through blackmail, cyberattacks, psychological manipulation, and other malicious purposes.
In the case of TikTok, this is especially dangerous because a foreign “adversary” with access to this type of information could potentially influence policies in the United States.
Recommendation 3: Robust Data Privacy Standards
Data Localization:
Data must be kept within local borders to reduce the risk of unauthorized access by foreign actors, even though it may be more costly to do so. While there may be operational challenges associated with this approach, it gives the government greater control over its data and prevents foreign adversaries from accessing the data of American citizens.
Regular Audits:
There need to be regular audits of social media applications to ensure compliance with data protections and standards, conducted by third-party organizations in conjunction with the appropriate government body. If there is a lack of compliance, the appropriate fines should be imposed to ensure that social media companies adhere to the necessary measures. Without some mechanism to hold these companies accountable, there is no incentive for compliance.
Data Minimization:
Any data collected on American citizens should be limited in scope and used only for the intended purpose for which it was submitted. Companies must clearly define what type of data is being collected and the purpose for which it is being collected. There must also be clear standards to ensure that the data collected is not retained longer than necessary, in accordance with storage limitations. The deletion of data that is no longer necessary should be a common practice among companies.
Strengthening of User Rights:
There needs to be a greater emphasis on giving users more control over their data. This means that users should have authority over how their personal information is stored, processed, used, and deleted. By allowing users greater control over their data, it promotes accountability and transparency. In terms of discrimination and bias, if users feel that their data is being misused against them by certain institutional bodies, it provides them with a level of control and leverage over how their information is used.
Selling of Data to Third Parties and Removal of Consent:
The selling of data to third parties without the express consent of the user should be strictly prohibited. Users need to remain in control of their data and personal information. If social media companies wish to sell user data to third-party companies, users must receive monetary compensation in return. As for removal of consent, users should have the right to revoke consent for their data sharing at any time.
User Control and Customization:
There should be complete user control and customization of their content feed based on their preferences. Providing users with a way to curate their digital experience in alignment with their interests and values will reduce potential exposure to harmful content and sources in a way that empowers them. That said, there need to be built-in mechanisms to ensure users don’t fall into a content echo chamber of their own preferences. This can be achieved by monitoring algorithmic bias to ensure content diversity within the user customization process. Additionally, there should be an opt-out mechanism in place, allowing users to choose not to have their data input into the algorithmic system.
Security Measures:
While this paper is primarily focused on the data privacy aspects of TikTok and other social media platforms, it also acknowledges the pivotal role of data security, cybersecurity, and encryption standards, which are important to address as well. One key factor to consider is the implementation of a robust system of data-security breach notifications that allows users to know when their data has been compromised or leaked.
For Further Discussion:
For the purposes of this paper, we will leave the following topics for future discussion: interoperability, cross-border data transfers, biometric data protection, and bias/discrimination in algorithms. We acknowledge the importance of these topics but will address them at a later date. Lastly, an important idea to explore in future papers is the concept of treating personal data under property law, which would represent an interesting paradigm shift. While we are not expressly advocating for that framework at this time, we would like to compare it to other frameworks in future essays.
Pain Point #4 - Decline in Military Recruitment
Another critical issue to consider is the declining rates of military recruitment, which can be attributed to declining mental and physical health among citizens. Mental health issues, such as anxiety and depression, combined with lower physical fitness, contribute to a diminished interest in military careers. Social media can exacerbate these issues by promoting unrealistic body standards, perpetuating negative self-images, and encouraging sedentary behaviors through prolonged screen time.
In the context of TikTok and other social media platforms, it must be understood that we live in an attention economy. When our attention is diverted from productive pursuits that contribute to better health, it can devolve into wasteful social media usage. Although this paper acknowledges the entertainment value of social media, excessive content consumption may detract from the more productive use of one’s time in leading a healthier lifestyle.
Recommendation 4 - Reassessment of Health Standards
The government should promote campaigns that highlight the benefits of military service and healthier lifestyles. It must actively reassess current FDA regulations to ensure they serve the best interests of the American people. The government should be involved in creating a populace that prioritizes healthy initiatives to cultivate citizens who meet the highest standards of mental and physical fitness.
A comprehensive reassessment is necessary to identify the key factors that contribute to both physical and mental fitness and address these concerns effectively. In tandem with these efforts, the government should collaborate with social media companies to promote businesses that encourage a more active and health-conscious lifestyle. Given that there might be contentious debate on what constitutes “healthy,” we should start with the lowest common denominators—such as drinking clean water, eating unprocessed food, and maintaining regular sleep patterns—before working our way toward more contested methods.
This approach should not focus solely on military recruitment but instead aim to foster a healthier American population overall. As a downstream effect, this will naturally lead to a pool of candidates with improved mental and physical fitness, enhancing the quality of military recruitment as a byproduct.
Solution to the TikTok Divestiture or Ban Law
Conclusion:
There are a couple of problems with the TikTok divest-or-be-banned law. From the perspective of the United States, there is the question of national security and of a foreign “adversary” (the Chinese government) accessing the data of American citizens. This is why the American government wants ByteDance (the parent company of TikTok) to sell the platform: it fears the Chinese government may exploit the data for its own political gain. If ByteDance is unable to sell the platform, the result will be a ban of TikTok in the United States. Whether or not these fears prove founded, the concern is a legitimate one for the United States government.
To mitigate those concerns, TikTok has proposed the idea of Project Texas⁸. Project Texas has several central tenets, but the main idea is that a special subsidiary called TikTok U.S. Data Security (USDS) would oversee all American operations with its own leadership, while ByteDance retains overall ownership of the company. USDS would protect and monitor American data, which would be stored on Oracle Cloud infrastructure in the United States. In theory, any breaches or security concerns would be mitigated through this structure. The main issue, again, is that there is no way to guarantee that ByteDance will not be compelled under Chinese national surveillance laws to bypass this structure and collect American data for China’s own national security purposes.
While this is a conceptual solution, our plan suggests that the United States needs full operational control of TikTok through an independent entity. This entity would have an independent board structure with no influence from ByteDance. From an ownership perspective, ByteDance could still retain minority ownership in this independent entity but would have no decision-making authority. All data must remain localized and processed in the United States. To make this arrangement worthwhile for ByteDance, it would receive a royalty or a defined percentage of revenue for the intellectual property and brand use of TikTok. In this way, ByteDance would still share in revenue, while all American data remains secure and within the control of the United States.
In order for this to work, there will need to be a massive amount of stakeholder collaboration among ByteDance, an independent U.S. board of directors, the current U.S. TikTok team, Oracle, and the United States government (President Trump, Congress, CFIUS, FTC, and the FCC).
In future essays, we will dive further into the specifics of this process and how it would work.
To read more, please see the following link: https://substack.com/@commonerai
References:
¹ Business of Apps. (2024). TikTok revenue and usage statistics (2024). Retrieved from https://www.businessofapps.com
² DemandSage. (2024). How many people use TikTok (2024 statistics). Retrieved from https://www.demandsage.com
³ What's The Big Data? (2024). Key TikTok statistics and trends 2024. Retrieved from https://www.whatsthebigdata.com
⁴ Social Media Management Hub. (2024). 21 TikTok stats to be aware of in 2024 (and beyond). Retrieved from https://www.socialmediamanagementhub.com
⁵ Hootsuite. (2024). 27 TikTok statistics marketers need to know in 2024. Retrieved from https://www.hootsuite.com
⁶ Tridens. (2024). 25 essential TikTok statistics you need to know in 2024. Retrieved from https://www.tridens.com
⁷ TechCrunch. (2024). TikTok revenue and usage statistics (2024). Retrieved from https://www.techcrunch.com
⁸ TikTok. (2024). About TikTok U.S. Data Security. Retrieved from https://usds.tiktok.com/usds-about/