Policy Document Series · Issue 21 of 35 · April 2026
Technology Must Serve People, Not Extract From Them
Americans have almost no legal rights over their own digital lives. There is no comprehensive federal privacy law. Tech monopolies extract hundreds of billions in value from personal data while paying nothing for it. Algorithms decide who gets hired, who gets a loan, who sees what news, with zero accountability. This platform answers with nine enforceable pillars.
The United States is the only major democracy without a comprehensive federal data privacy law. A patchwork of 20+ state laws with different standards has left Americans effectively without enforceable digital rights — while tech monopolies have built a $323 billion global data broker industry on the unconsented harvest of personal information.
The Common Good Party position is clear: technology must serve people, not extract from them. This platform addresses the full architecture of digital power through nine enforceable pillars — from a federal privacy law and children's digital safety to AI accountability, digital Fourth Amendment protections, tech antitrust, right to repair, net neutrality, digital public infrastructure, and platform worker protections. Each pillar has specific enforcement through private rights of action and penalties calibrated to 4% of global revenue. This is not anti-technology. This is about making technology serve people.
The EU has levied €5.65 billion in GDPR fines across 2,245 enforcement actions. The largest U.S. privacy fine under state law: $2.75 million, less than what Meta earns in four minutes. Algorithms deny loans, deny care, and trigger wrongful arrests with no legal accountability. Federal agencies buy location data for as little as $9,000 a year to bypass the Fourth Amendment. The status quo is a system designed to extract value from Americans while insulating the extractors from any consequence.
This platform reclaims the internet — built with public money, on open protocols, by researchers at public universities — for the people it was always meant to serve.
The failures of the current digital landscape are not the result of technology itself — they are the result of a regulatory vacuum that allowed a handful of corporations to capture nearly every layer of the digital economy. The failures cluster into four structural categories.
The surveillance business model: The real-time bidding ad-tech ecosystem broadcasts Americans' personal data — location, browsing history, health conditions — to thousands of companies in milliseconds, every time a web page loads. This is the engine of Big Tech's hundreds of billions in revenue. It was never consented to. It was imposed.
The American digital rights crisis was not inevitable. It was the result of deliberate legislative failure, sustained industry lobbying, and a Congress that defunded its own technical expertise three decades ago.
1995
Congress Defunds Its Own Technical Knowledge
Congress eliminated the Office of Technology Assessment — its independent source of technical expertise. For thirty years, legislators analyzing AI, cybersecurity, gene editing, and social media have relied on industry lobbyists for technical knowledge. The companies Congress must regulate supply the briefings Congress uses to understand those industries.
1998
COPPA — Written Before the Modern Internet
The Children's Online Privacy Protection Act was written in 1998 — before smartphones, before social media, before algorithmic feeds. It remains the primary federal law governing children's digital lives, hopelessly inadequate for a world its authors could not have imagined.
2015–2017
Net Neutrality: Enacted, Repealed
The FCC established net neutrality in 2015, protecting the open internet from ISP throttling and paid prioritization. It was repealed in 2017. The Sixth Circuit ruled in January 2025 that the FCC lacks Title II authority entirely, leaving ISPs free to throttle at will. Verizon had already throttled the Santa Clara County Fire Department's data during an active wildfire, demanding a plan upgrade.
2022–2024
The Failed Legislative History
The American Data Privacy and Protection Act passed committee 53-2 on a bipartisan basis in 2022, then died, killed by a preemption dispute and an industry lobbying blitz in which Apple, Microsoft, Facebook, and Amazon together spent $15M per quarter. The Kids Online Safety Act passed the Senate 91-3 in 2024 but never received a House vote. The pattern repeats: broad bipartisan support, industry money, and legislative death.
The United States is not without models. Every other major democracy has enacted meaningful digital rights protections. The innovation argument — that regulation stifles technology — has been tested internationally. The evidence does not support it.
| Country | Law / Framework | Key Achievement |
|---|---|---|
| EU | GDPR (General Data Protection Regulation, 2018) | €5.65B in cumulative fines; extraterritorial scope; dedicated Data Protection Authorities in every member state |
| EU | AI Act (Artificial Intelligence Act, 2024) | Risk-based tiers (banned / high-risk / limited / minimal); mandatory audits; first comprehensive AI law in the world |
| EU | DMA (Digital Markets Act, 2022) | Ex ante rules for gatekeepers; interoperability mandates; €8B+ in fines; structural requirements for dominant platforms |
| UK | Age Appropriate Design Code (children's data protection) | 91 documented product changes across 6 major platforms, including default privacy settings and disabling autoplay for minors |
| Brazil | LGPD (Lei Geral de Proteção de Dados, 2020) | Blocked Meta AI training on user data; stopped Worldcoin biometric collection; enforced data subject rights |
| Estonia | eID System (national digital identity infrastructure) | 99% of government services available online; 800M+ digital signatures; privacy-preserving by design |
| Germany | Codetermination Law (worker board representation) | Workers on corporate boards, mandatory non-compete compensation; a model for platform worker governance |
The argument that the U.S. must avoid regulation to stay ahead of China ignores that China has already regulated AI, registering 190+ models under its Algorithm Registration system, while the EU has the AI Act. The U.S. is the global outlier in having done nothing. A national framework provides legal certainty that accelerates responsible development.
The Common Good Party's digital platform addresses the full architecture of digital power. Each pillar is enforceable, specific, and calibrated to the scale of the problem. This is a framework, not a wish list.
Pillar 1 — Federal Data Privacy Law
Americans own their data. Companies are custodians, not owners. This establishes the foundational legal framework for all digital rights.
Pillar 2 — Children's Digital Safety
Teens spending 3+ hours per day on social media face twice the risk of depression. The design choices driving that outcome are not accidents; they are features. This pillar bans those features for minors.
Pillar 3 — AI Accountability
A four-tier risk-based framework, modeled on the EU AI Act but tailored to the American constitutional and legal context.
Pillar 4 — Digital Fourth Amendment
The Fourth Amendment's protection against unreasonable search and seizure cannot be rendered void by a commercial transaction. This pillar closes the data broker loophole permanently.
Pillar 5 — Tech Antitrust
Tech monopoly is not the result of superior innovation. It is the result of strategic acquisition, self-preferencing, and the use of monopoly power in one market to foreclose competition in others.
Pillar 6 — Right to Repair
Americans should be able to fix what they own. Manufacturer software locks and warranty voiding for independent repair transfer wealth from consumers to corporations and create needless waste.
Pillar 7 — Net Neutrality
The open internet is infrastructure. Broadband is a utility, not a luxury. After the Sixth Circuit stripped FCC authority in January 2025, only an act of Congress can protect net neutrality. This is that act.
Pillar 8 — Digital Public Infrastructure
The internet was built with public money. Public infrastructure should remain in the public interest. This pillar builds the open, democratic digital commons that private platforms have failed to provide.
Pillar 9 — Platform Worker Protections
DoorDash median pay: $11/hour. Post-Prop 22 California drivers: approximately $6.20/hour after expenses. Platform misclassification is not a business model innovation; it is wage theft at scale.
Most of this platform is enforcement and regulation — not spending. The major expenditures are offset by enforcement revenue and the economic value of the digital economy they enable.
Change of this scale requires phased implementation. The sequencing prioritizes executive action first, foundational legislation second, structural reform third, and infrastructure investment fourth, with full implementation reached in year five.
Phase 1 — Executive Action
Months 1–6
Phase 2 — Foundation Legislation
Months 6–18
Phase 3 — Structural Reform
Years 2–3
Phase 4 — Infrastructure & Labor
Years 3–5
Phase 5 — Full Implementation
Year 5+
The arguments against digital regulation are largely recycled from prior industries — tobacco, finance, pharmaceuticals — that resisted oversight with identical claims. Here is the evidence.
"Regulation stifles innovation."
The EU passed GDPR in 2018, the DMA in 2022, and the AI Act in 2024. It has not experienced innovation collapse. What it has experienced: platforms that must compete rather than foreclose, users with actual rights, and enforcement generating billions in fines. California has banned non-competes for 150 years and produced Silicon Valley. The real innovation killer is monopoly: research shows that tech giants' market dominance creates "kill zones" that suppress venture capital investment in adjacent markets sixfold.
"Privacy regulation hurts small businesses."
GDPR's primary weakness has been the compliance burden on small and medium enterprises, which averaged $1.7M in compliance costs. The Federal Digital Privacy Act directly addresses this with simplified compliance requirements for companies under 500 employees and $50M in revenue. Core rights — consent, deletion, no sale of sensitive data — apply to everyone. The administrative burden is calibrated to business size; the rights are not. The goal is to make compliance simple for small businesses and inescapable for large ones.
"Section 230 reform will censor free speech."
The reform is surgical. Core Section 230 protection is retained: platforms are not liable for content users post. The carveout applies only to algorithmic amplification — when a platform's recommendation engine actively promotes content to users. Hosting is protected. Algorithmic promotion is an editorial choice, not passive hosting. The threshold of 10 million monthly active users ensures small platforms and open-source communities are entirely unaffected.
"Encryption backdoors are needed for law enforcement."
72% of cryptography experts surveyed agree there is no such thing as a backdoor only the "good guys" can use. Weakening encryption weakens it for everyone, including the 80% of Americans who received a data breach notification in 2024. The Columbia and MIT "Bugs in Our Pockets" study is definitive: client-side scanning creates mass surveillance infrastructure that cannot be limited to specific targets. Strong encryption protects individuals, businesses, critical infrastructure, and national security. The answer to law enforcement needs is better traditional investigation, not broken cryptography.
"AI regulation will put the U.S. behind China."
China has already regulated AI — 190+ models registered under its Algorithm Registration system. The EU has the AI Act. The U.S. is the outlier in having enacted nothing. A national framework provides legal certainty that actually accelerates responsible development; regulatory ambiguity chills investment. The NAIRR expansion to $500M+/year ensures public research keeps pace with commercial development. Responsible AI development is a competitive advantage, not a handicap.
The statistics cited throughout this document are sourced from peer-reviewed research, government data, or established investigative reporting.
Digital rights and technology policy intersect with nearly every other position in this platform. The following cross-references identify the most significant dependencies and complementary policies.
| # | Issue | Connection |
|---|---|---|
| #2 | Taxation | The data use tax and financial transaction tax on digital trading fund the Data Dividend Fund and digital infrastructure investments. |
| #3 | Housing | Algorithmic price-fixing ban covers rent-setting algorithms like RealPage, which coordinated rent increases across landlords nationwide. |
| #12 | Criminal Justice | Predictive policing ban, facial recognition restrictions, and regulation of COMPAS-style risk assessment algorithms address racial bias in automated criminal justice tools. |
| #13 | Labor & Minimum Wage | Gig worker protections, AI in employment decisions, algorithmic wage-setting, and non-compete agreements are addressed across both issues. |
| #14 | Trade Policy | Data localization requirements and cross-border data transfer rules in trade agreements affect both privacy protections and digital market access. |
| #16 | Reproductive Rights | The sensitive data sale ban explicitly protects reproductive healthcare access. Location data from abortion clinic visits has already been commercially available and used to target patients and providers. |
| #18 | Voting Rights | Deepfake election integrity rules, political ad transparency requirements, and campaign finance digital regulations directly connect to voting rights protections. |
| #20 | Corporate Power & Antitrust | Tech antitrust provisions build on the general corporate power framework. CINA (Corporate Independent National Authority) provides independent research capacity for both issues. |
"The internet was built with public money, on open protocols, by researchers at public universities. A handful of corporations captured it. This platform reclaims it — for the people it was always meant to serve."— The Common Good Party
Sources & Citations