<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[The Ethical Tech Project]]></title><description><![CDATA[Transforming the tech innovation ecosystem from "move fast and break things" to "think first, then make things - things that are better for individuals, for business, and for society."]]></description><link>https://news.ethicaltechproject.org</link><image><url>https://substackcdn.com/image/fetch/$s_!Hbx5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98295c77-6e45-4ded-98af-e234cb2bd940_256x256.png</url><title>The Ethical Tech Project</title><link>https://news.ethicaltechproject.org</link></image><generator>Substack</generator><lastBuildDate>Mon, 20 Apr 2026 06:19:54 GMT</lastBuildDate><atom:link href="https://news.ethicaltechproject.org/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[The Ethical Tech Project]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[ethicaltechproject@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[ethicaltechproject@substack.com]]></itunes:email><itunes:name><![CDATA[The Ethical Tech Project]]></itunes:name></itunes:owner><itunes:author><![CDATA[The Ethical Tech Project]]></itunes:author><googleplay:owner><![CDATA[ethicaltechproject@substack.com]]></googleplay:owner><googleplay:email><![CDATA[ethicaltechproject@substack.com]]></googleplay:email><googleplay:author><![CDATA[The Ethical Tech Project]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Now Accepting Nominations! 
]]></title><description><![CDATA[Join a transformative cohort of changemakers in our Spring 2026 Ethical Tech Project Fellows Program]]></description><link>https://news.ethicaltechproject.org/p/now-accepting-nominations</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/now-accepting-nominations</guid><pubDate>Fri, 27 Feb 2026 15:30:53 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!AjYk!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F50c45b67-17ad-4339-a45f-7cec05f7366d_800x800.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The Ethical Tech Project is seeking outstanding early- to mid-career tech builders (designers, product managers, engineers, architects, digital marketers, etc.) to join its <strong>Spring 2026 cohort of Ethical Tech Fellows.</strong><br><br>Successful nominees will participate in a 10-week structured curriculum, designed to contextualize the social impact of the decisions and choices made by tech teams bringing digital and AI-driven products and services to market. Fellows will develop both leadership and practical skills they can apply to their day-to-day work. These skills will enable them to produce better outcomes for individuals, businesses and society. 
At the end of the program, participants will have the opportunity to present a mini-capstone or project idea to a panel of industry luminaries and receive valuable, actionable feedback.<br><br>The Ethical Tech curriculum has been designed by subject matter experts in conjunction with the Digital Futures Institute at Columbia University and is moderated by <strong><a href="https://www.linkedin.com/in/jenniebaird/">Jennie Baird</a></strong>, <strong><a href="https://www.linkedin.com/in/nancy-green-saraisky-a09768186/">Nancy Green Saraisky</a></strong> and <strong><a href="https://www.linkedin.com/in/robertlevitan/">Robert Levitan</a></strong> along with special guests from the tech community.<br><br><strong>Sessions will be held on Thursday evenings from April 9, 2026 through June 11, 2026.</strong> Though a few sessions will be remote, most will meet in person in New York City. <br><br>This is a unique learning, leadership, and networking opportunity for motivated and passionate individuals who want their tech and AI-oriented work to support human flourishing and who believe companies can do well by doing good. <br><br>There is no cost to participate, though acceptance to previous fellowship cohorts has been extremely competitive. <br><br>Nominate yourself or nominate someone else who you think would make a great ETP Fellow. <strong>Applications for the Fellows program will close March 9, 2026.</strong><br><br>Fill out your application <a href="https://docs.google.com/forms/d/e/1FAIpQLSeQtklM2zh1Mxw_1AijbESxLQf8n2PC1NOZrn5JvKM2h0zD5g/viewform">here</a>. 
</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!AjYk!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F50c45b67-17ad-4339-a45f-7cec05f7366d_800x800.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!AjYk!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F50c45b67-17ad-4339-a45f-7cec05f7366d_800x800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!AjYk!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F50c45b67-17ad-4339-a45f-7cec05f7366d_800x800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!AjYk!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F50c45b67-17ad-4339-a45f-7cec05f7366d_800x800.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!AjYk!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F50c45b67-17ad-4339-a45f-7cec05f7366d_800x800.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!AjYk!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F50c45b67-17ad-4339-a45f-7cec05f7366d_800x800.jpeg" width="800" height="800" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/50c45b67-17ad-4339-a45f-7cec05f7366d_800x800.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:800,&quot;width&quot;:800,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;No alternative text 
description for this image&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="No alternative text description for this image" title="No alternative text description for this image" srcset="https://substackcdn.com/image/fetch/$s_!AjYk!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F50c45b67-17ad-4339-a45f-7cec05f7366d_800x800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!AjYk!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F50c45b67-17ad-4339-a45f-7cec05f7366d_800x800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!AjYk!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F50c45b67-17ad-4339-a45f-7cec05f7366d_800x800.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!AjYk!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F50c45b67-17ad-4339-a45f-7cec05f7366d_800x800.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><h2><strong>Here&#8217;s what we&#8217;ve been reading and listening to, and you should be too:</strong></h2><h4><strong>&#129300; Attitudes on AI</strong></h4><p><strong>TIME:</strong> <strong><a href="https://time.com/7377579/ai-data-centers-people-movement-cover/">The People vs. AI</a>. </strong>Across red states and blue, a grassroots movement is pushing back on the unchecked growth of the artificial intelligence industry.<strong><br><br>The New York Times: <a href="https://www.nytimes.com/interactive/2026/02/02/opinion/ai-future-leading-thinkers-survey.html">Opinion | Where Is A.I. Taking Us? Eight Leading Thinkers Share Their Visions.</a></strong> Experts share their thoughts on the future of A.I. and how it will reshape society in the coming years.</p><p><strong>The Boston Globe: <a href="https://www.bostonglobe.com/2026/01/18/business/ai-restaurants-phone-pizza/">AI answering systems are &#8216;saving the day&#8217; for New England pizzerias. 
Customers aren&#8217;t so sure.</a></strong> The artificial receptionists, being used to take orders and field calls, have been met with resistance from some customers who said they can&#8217;t get the service they are used to.</p><h4><strong>&#128187; Jobs</strong></h4><p><strong>The Atlantic: <a href="https://www.theatlantic.com/ideas/2026/02/ai-white-collar-jobs/686031/">The Worst-Case Future for White-Collar Workers</a>.</strong> The well-off have no experience with the job market that might be coming.</p><p><strong>CNBC: <a href="https://www.cnbc.com/2026/02/19/accenture-ai-orders-senior-staff-lose-out-promotions.html">Accenture tells senior staff to use AI tools or risk losing out on leadership promotions</a>. </strong>Accenture started tracking how often senior staff are logging in to its AI tools this month saying AI adoption will be a &#8220;visible input to talent discussions.&#8221;</p><h4><strong>&#129504; Mental Health</strong></h4><p><strong>Ohio Capital Journal: <a href="https://ohiocapitaljournal.com/2026/02/20/ohio-bill-would-prevent-people-from-creating-ai-models-that-encourage-users-to-engage-in-self-harm/">Ohio bill would prevent people from creating AI models that encourage users to engage in self-harm</a>. </strong>Ohio lawmakers have introduced a bipartisan bill that would prevent anyone from creating an AI model in Ohio that encourages users to engage in self-harm or harm another person.</p><h4><strong>&#127758; Environment</strong></h4><p><strong>WIRED: <a href="https://www.wired.com/story/could-we-put-ai-data-centers-in-space/">Could AI Data Centers Be Moved to Outer Space?</a></strong> Massive data centers for generative AI are bad for the Earth. 
How about launching them into orbit?</p><h4><strong>&#128195; Copyright</strong></h4><p><strong>Deadline: <a href="https://deadline.com/2026/01/hollywood-ai-protest-campaign-1236692896/">Actors And Musicians Help Launch &#8220;Stealing Isn&#8217;t Innovation&#8221; Campaign To Protest Big Tech&#8217;s Use Of Copyrighted Works In AI Models</a>.</strong> The campaign is a protest against the unauthorized use of copyrighted works to train AI models.</p><p><strong>BBC: <a href="https://www.bbc.com/news/articles/ckg1dl410q9o">What is Seedance? The Chinese AI app sending Hollywood into a panic</a> </strong>Clips of Deadpool and other film characters have sparked alarm within Hollywood over copyright infringement.</p><h4>&#128499;&#65039; Democracy </h4><p><strong>WIRED: <a href="https://www.wired.com/story/ai-powered-disinformation-swarms-are-coming-for-democracy/">AI-Powered Disinformation Swarms Are Coming for Democracy</a>.</strong> Advances in artificial intelligence are creating a perfect storm for those seeking to spread disinformation at unprecedented speed and scale. And it&#8217;s virtually impossible to detect.</p><h4>&#127822; Education</h4><p><strong>The New York Times: <a href="https://www.nytimes.com/2026/02/12/opinion/ai-companies-college-students.html">Opinion | A.I. 
Companies Are Eating Higher Education</a>.</strong> Human intelligence &#8212; the thing we as educators are duty bound to defend and advance &#8212; is under attack.</p><p><strong>Fortune: <a href="https://fortune.com/2026/01/26/is-it-worth-it-to-go-to-law-school-ai-hiring-entry-level/">Law school admissions expert sees &#8216;dangerous one-two punch&#8217; as Gen Z seeks shelter from the AI hiring storm in 6-figure debt and JD lifeboat</a>.</strong> ChatGPT can pass the bar, but just as in 2008 and during the pandemic, law schools are overwhelmed with applicants</p><p><strong>NPR: <a href="https://www.npr.org/2026/01/27/nx-s1-5683821/china-ai-schools-curriculum">In China, AI is no longer optional for some kids. It&#8217;s part of the curriculum</a></strong>. While debate rages in the U.S. about the merits and risks of AI in schools, it&#8217;s become a state-mandated part of the curriculum in China, as the authorities try to create a pool of AI-savvy professionals</p><h4><strong>&#129658; Medicine</strong></h4><p><strong>The Guardian: <a href="https://www.theguardian.com/technology/2026/jan/24/google-ai-overviews-youtube-medical-citations-study">Google AI Overviews cite YouTube more than any medical site for health queries, study suggests</a>.</strong> German research into responses to health queries raises fresh questions about summaries seen by 2bn people a month</p><h4><strong>&#127891; Research and Resources</strong></h4><p><strong>TechEquity: <a href="https://techequity.us/2025/08/19/how-californians-feel-about-ai/">How Californians Feel About AI &#8211; Findings From the 2025 AI Compass.</a></strong> TechEquity surveyed over a thousand Californians to understand how they feel about AI and its impact on their lives.</p><p><strong>MIT Sloan: <a href="https://mitsloan.mit.edu/ideas-made-to-matter/agentic-ai-explained">Agentic AI, explained</a>.</strong> The age of agentic AI &#8212; systems that are semi- or fully autonomous and can act on their own &#8212; 
has arrived. Here&#8217;s what you need to know, according to MIT experts.</p><p><strong>The Guardian: <a href="https://www.theguardian.com/technology/2026/feb/03/deepfakes-ai-companions-artificial-intelligence-safety-report">&#8216;Deepfakes spreading and more AI companions&#8217;: seven takeaways from the latest artificial intelligence safety report</a>. </strong>Annual review highlights growing capabilities of AI models, while examining issues from cyber-attacks to job disruption.</p><p></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://donorbox.org/ethical-tech-project&quot;,&quot;text&quot;:&quot;Donate Here&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://donorbox.org/ethical-tech-project"><span>Donate Here</span></a></p><h3></h3><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption"><strong>Was this forwarded to you? 
Subscribe here to stay up to date!</strong></p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Ethical Tech is Good for Business]]></title><description><![CDATA[In a world filled with AI, Trust and Safety are more important than ever]]></description><link>https://news.ethicaltechproject.org/p/ethical-tech-is-good-for-business</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/ethical-tech-is-good-for-business</guid><pubDate>Wed, 14 Jan 2026 15:15:38 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!V1Ou!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png" length="0" type="image/png"/><content:encoded><![CDATA[<p>People are in a state of equal parts fear and excitement about AI. Companies are operating in a world with heightened economic and political uncertainty.</p><p>Against this backdrop, recent headlines make sense: businesses and people are choosing AI platforms based on trust and safety issues.</p><p><strong>The trust gap is real.</strong> A <a href="https://fortune.com/2025/12/09/harvard-business-review-survey-only-6-percent-companies-trust-ai-agents/">recent Harvard Business Review study</a> found that &#8220;Only 6% of companies fully trust AI agents to autonomously run their core business processes&#8221;. As a result, many companies are restricting AI use, and the message is clear: businesses want AI they can trust. 
</p><p><strong>Market data backs this up.</strong> Anthropic Claude&#8217;s market share of LLM API calls from enterprise customers is steadily growing&#8212;according to reports from Menlo Ventures. (Menlo is an investor in Anthropic). In July 2025, Menlo released the graph below showing Claude&#8217;s growing enterprise market share.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!V1Ou!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!V1Ou!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png 424w, https://substackcdn.com/image/fetch/$s_!V1Ou!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png 848w, https://substackcdn.com/image/fetch/$s_!V1Ou!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png 1272w, https://substackcdn.com/image/fetch/$s_!V1Ou!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!V1Ou!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png" width="1456" height="819" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:147903,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://news.ethicaltechproject.org/i/184224022?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!V1Ou!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png 424w, https://substackcdn.com/image/fetch/$s_!V1Ou!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png 848w, https://substackcdn.com/image/fetch/$s_!V1Ou!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png 1272w, https://substackcdn.com/image/fetch/$s_!V1Ou!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fec449223-90a9-49c6-a1a0-aa034dd057c7_1920x1080.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>On December 9<sup>, </sup>2025, Menlo Ventures published its <em><strong><a href="https://menlovc.com/perspective/2025-the-state-of-generative-ai-in-the-enterprise/">2025 The State of Generative AI in the Enterprise</a></strong></em> with these key findings:</p><ul><li><p>Anthropic now commands 40% in enterprise LLM API market share (more than triple its 12% share in 2023)</p></li><li><p>Google climbed to 21% (a 3x increase from 7% in 2023)</p></li><li><p>OpenAI&#8217;s share fell to 27%</p></li></ul><p><strong>What&#8217;s driving this shift?</strong></p><p>The Harvard Business Review study cites &#8220;Security and privacy worries loom largest as barriers to wider adoption.&#8221;</p><p>The Menlo Ventures report cites Claude&#8217;s code performance and control features.</p><p>Fortune&#8217;s December 2, 2025 article, 
however, puts it plainly in its headline: &#8220;<strong><a href="https://fortune.com/2025/12/02/how-anthropics-safety-first-approach-won-over-big-business-and-how-its-own-engineers-are-using-its-claude-ai/">Anthropic&#8217;s Safety First Approach Won Over Big Business</a></strong><a href="https://fortune.com/2025/12/02/how-anthropics-safety-first-approach-won-over-big-business-and-how-its-own-engineers-are-using-its-claude-ai/">&#8221;</a></p><p>Here are some of the safety features cited in this article:</p><ul><li><p>Constitutional AI: Claude is trained with a written &#8220;constitution&#8221;&#8212;a set of principles that guides its behavior and outputs.</p></li><li><p>Multi-Layer Security: screens out dangerous information from training data, uses &#8220;constitutional classifiers&#8221; to detect jailbreaking attempts, and monitors outputs for constitutional compliance in real time.</p></li><li><p>Reliability: delivers lower hallucination rates.</p></li><li><p>Active Threat Intelligence: Dedicated teams probe for vulnerabilities and investigate suspicious usage patterns.</p></li></ul><p>There is plenty of debate about the safety levels of AI models. <a href="https://mashable.com/article/ai-safety-report-2025-chatgpt-gemini-claude">An article published by Mashable</a> on December 3, 2025, carries the headline: &#8220;AI safety report: Only 3 models make the grade. Gemini, Claude, and ChatGPT are top of the class &#8212; but even they are just C students.&#8221;</p><p>This is according to an <a href="https://futureoflife.org/ai-safety-index-winter-2025/">AI Safety Index</a> published by the Future of Life Institute, which is led by MIT professor Max Tegmark.</p><p>So what is the truth? Are some AI models safer than others? Are they all unsafe? Are some AI companies just better at PR and brand positioning? 
These are important questions to answer, but regardless of the answers, it is clear that AI safety is an issue every business needs to actively address.</p><p><strong>The bottom line:</strong> Safety isn&#8217;t just an ethical imperative&#8212;it&#8217;s a competitive advantage. Whether you are a company selling AI systems or one incorporating AI into your operations and services, you have a greater chance of succeeding if you build with trust, transparency, and security at the core of your AI work.</p><p>In an uncertain world, responsible AI isn&#8217;t just the right thing to do; it&#8217;s the best business strategy.</p><p>At the Ethical Tech Project, we are committed to helping business leaders think about issues such as data privacy, transparency, safety, and human control of AI systems. If you care about these critically important issues, let&#8217;s collaborate!</p><p><em>Robert Levitan is the new Co-Chair of ETP&#8217;s Board of Directors. A longtime digital entrepreneur, he has helped shape multiple phases of the Internet&#8217;s development, from early online communities to advanced e-commerce and enterprise solutions. Over the past thirty years, he has founded, scaled, and advised companies such as iVillage, Flooz, Pando Networks, and various venture-backed startups, establishing himself as a thoughtful leader at the crossroads of technology, society, and governance. His recent work focuses on the societal impacts of AI and the importance of robust, transparent frameworks that safeguard individuals and communities as technology progresses.</em></p><h2><strong>Here&#8217;s what we&#8217;ve been reading and listening to, and you should be too:</strong></h2><h4><strong>&#129300; Attitudes on AI</strong></h4><p><strong>The Prof G Pod &#8211; Scott Galloway: <a href="https://www.youtube.com/watch?v=MLvxRHlsMz0">The AI Dilemma &#8212; with Tristan Harris</a></strong>. 
Tristan Harris, former Google design ethicist and co-founder of the Center for Humane Technology, joins Scott Galloway to explain why children have become the front line of the AI crisis. They unpack the rise of AI companions, the collapse of teen mental health, the coming job shock, and how the U.S. and China are racing toward artificial general intelligence. Harris makes the case for age-gating, liability laws, and a global reset before intelligence becomes the most concentrated form of power in history.</p><p><strong>Walter Quattrociocchi: <a href="https://www.linkedin.com/posts/walterquattrociocchi_new-brutal-paper-out-epistemological-activity-7408897169937559553-fS1i/?utm_medium=ios_app&amp;rcm=ACoAAAAn4IcBo8QBsHR-knAvWIRIZYIC5-j6f6A&amp;utm_source=social_share_send&amp;utm_campaign=share_via">New (BRUTAL) paper out &#8220;Epistemological Fault Lines Between Human and Artificial Intelligence&#8221;</a></strong> When text sounds right, we stop asking whether it&#8217;s true. Large Language Models are increasingly used to evaluate, summarize, and even judge information. The common assumption is that, as long as their outputs align with human judgments, they can safely take on epistemic roles.</p><h4><strong>&#128184; Money</strong></h4><p><strong>The New York Times: <a href="https://www.nytimes.com/2025/12/09/business/wall-street-valuation-ai-bubble.html">Wall Street Is Shaking Off Fears of an A.I. Bubble. For Now.</a> </strong>The valuations of some artificial intelligence companies are approaching those of the dot-com boom. But investors worry that pulling money from today&#8217;s market risks future gains.</p><p><strong>The Guardian: <a href="https://www.theguardian.com/commentisfree/2025/dec/12/ai-bubble-mass-layoffs-income-inequality">Most people aren&#8217;t fretting about an AI bubble. What they fear is mass layoffs</a>. </strong>Artificial intelligence could make income inequality even worse and create a new underclass. 
Governments and society must take action</p><p><strong>New York Post: <a href="https://nypost.com/2025/12/09/business/instacart-charging-different-prices-on-same-grocery-staples-in-same-stores-study?utm_source=slack&amp;utm_campaign=android_nyp">Instacart is charging different prices to different customers &#8212; on the same grocery items in the same stores, bombshell study reveals</a> </strong>Groundwork, a consumer advocacy group, said Instacart&#8217;s pricing algorithm could lead to shoppers forking over an extra $1,200 on groceries each year.</p><p><strong>Center for Humane Technology: <a href="https://open.substack.com/pub/centerforhumanetechnology/p/advertising-is-coming-to-ai-its-going?utm_campaign=post&amp;utm_medium=web">Advertising is Coming to AI. It&#8217;s Going to Be a Disaster.</a> </strong>A 22-year-old has an earnest query for her AI chatbot: &#8220;How do I really impress in my first job interview?&#8221; To which the AI helpfully responds, &#8220;First, you need to think about your clothes and what they communicate about you and your qualifications.&#8221; Now, is this sound advice for kicking off a productive career-coaching session &#8212; or is it sponsored content?</p><h4><strong>&#129504; Mental Health</strong></h4><p><strong>The Jed Foundation: <a href="https://jedfoundation.org/american-psychological-association-on-generative-ai/">When Young People Turn to AI for Emotional Support: JED&#8217;s Response to the APA&#8217;s New Advisory</a>. 
</strong>Artificial Intelligence holds promise for use in mental health support but we must apply the same guardrails as in other health interventions</p><p><strong>Robbie Torney:</strong> <strong><a href="https://www.linkedin.com/posts/rtorney_ai-teenmentalhealth-digitalsafety-activity-7397323843805229056-UYlF/?utm_medium=ios_app&amp;rcm=ACoAAAAn4IcBo8QBsHR-knAvWIRIZYIC5-j6f6A&amp;utm_source=social_share_send&amp;utm_campaign=copy_link">Today Common Sense Media released findings from our comprehensive risk assessment of AI chatbots and teen mental health support</a></strong>. The results are clear: these systems are fundamentally unsafe for the way millions of young people are already using them.</p><p><strong>Data Workers&#8217; Inquiry: <a href="https://data-workers.org/michael/">The Emotional Labor Behind AI Intimacy</a>. </strong>Imagine confiding your most private fantasies to what you believe is an unfeeling algorithm that cannot judge or remember. Now imagine that on the other side of that conversation is a man sitting in a one-room home in Nairobi, working through the night while his wife and children sleep. That man is Michael, and this is his story.</p><h4><strong>&#128195; Policy and Regulation</strong></h4><p><strong>The Guardian: <a href="https://www.theguardian.com/technology/2026/jan/04/world-may-not-have-time-to-prepare-for-ai-safety-risks-says-leading-researcher">World &#8216;may not have time&#8217; to prepare for AI safety risks, says leading researcher</a>. </strong>AI safety expert David Dalrymple said rapid advances could outpace efforts to control powerful systems</p><p><strong>Erie Meyer:</strong> <strong><a href="https://www.linkedin.com/posts/eriemeyer_1268b5c629180f6pdf-activity-7405358372511784960-tg18/?utm_medium=ios_app&amp;rcm=ACoAAAAn4IcBo8QBsHR-knAvWIRIZYIC5-j6f6A&amp;utm_source=social_share_send&amp;utm_campaign=copy_link">WHEW. 
A bipartisan group of 42 state Attorneys General are standing up against companies pushing dangerous, sycophantic AI</a>,</strong> and they&#8217;re asking for everything from individual executive accountability to worker protections. </p><p><strong>The New York Times:</strong> <strong><a href="https://www.nytimes.com/2025/12/11/technology/ai-trump-executive-order.html?smid=nytcore-ios-share">Trump Signs Executive Order to Neuter State A.I. Laws</a>. </strong>The order would create one federal regulatory framework for artificial intelligence, Mr. Trump told reporters in the Oval Office.</p><p><strong>Defense News: <a href="https://www.defensenews.com/pentagon/2025/12/09/pentagon-taps-google-gemini-launches-new-site-to-boost-ai-use/">Pentagon taps Google Gemini, launches new site to boost AI use</a> </strong>Hegseth said the U.S. must stay ahead of adversaries who are working to take advantage of rapid technology advancements, like the development of AI.</p><h4>&#127822; Education</h4><p><strong>Mashable: <a href="https://mashable.com/article/ai-safety-report-2025-chatgpt-gemini-claude">AI safety experts say most models are failing</a>. </strong>Gemini, Claude, and ChatGPT are top of the class &#8212; but even they are just C students.</p><h4><strong>&#129658; Medicine</strong></h4><p><strong>The Guardian: <a href="https://www.theguardian.com/technology/2026/jan/11/google-ai-overviews-health-guardian-investigation">&#8216;Dangerous and alarming&#8217;: Google removes some of its AI summaries after users&#8217; health put at risk</a>. </strong>Guardian investigation finds AI Overviews provided inaccurate and false information when queried over blood tests.</p><h4><strong>&#127891; Research and Resources</strong></h4><p><strong>Anthropic: <a href="https://www.anthropic.com/news/claude-for-nonprofits">Claude for Nonprofits</a>.</strong> Nonprofits tackle some of society&#8217;s most difficult problems, often with limited resources. 
In partnership with the global generosity movement GivingTuesday, Anthropic is launching Claude for Nonprofits to help organizations across the world maximize their impact.</p><p><strong>Tech Crunch:</strong> <strong><a href="https://techcrunch.com/2025/11/24/a-new-ai-benchmark-tests-whether-chatbots-protect-human-wellbeing/">A new AI benchmark tests whether chatbots protect human well-being.</a> </strong>AI chatbots have been linked to serious mental health harms in heavy users, but there have been few standards for measuring whether they safeguard human well-being or just maximize for engagement. A new benchmark dubbed HumaneBench seeks to fill that gap by evaluating whether chatbots prioritize user well-being and how easily those protections fail under pressure.</p><p></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://donorbox.org/ethical-tech-project&quot;,&quot;text&quot;:&quot;Donate Here&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://donorbox.org/ethical-tech-project"><span>Donate Here</span></a></p><h3></h3><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption"><strong>Was this forwarded to you? Subscribe here to stay up to date!</strong></p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Build a better tech future. 
Give today.]]></title><description><![CDATA[This Giving Tuesday, join us in shaping technology that protects people and society.]]></description><link>https://news.ethicaltechproject.org/p/build-a-better-tech-future-give-today</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/build-a-better-tech-future-give-today</guid><pubDate>Tue, 02 Dec 2025 15:15:40 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Hbx5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98295c77-6e45-4ded-98af-e234cb2bd940_256x256.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2>Safe, fair, ethical tech. Together, we can make it happen</h2><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://donorbox.org/giving-tuesday-850947&quot;,&quot;text&quot;:&quot;Donate Here&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://donorbox.org/giving-tuesday-850947"><span>Donate Here</span></a></p><p>Responsible technology begins with builders. And your support will directly equip tech and AI builders with the tools to embed ethics, privacy, and safety into the products of tomorrow.</p><p>The professionals in our ETP Fellows Program are transforming the tech innovation ecosystem. So instead of us describing ETP&#8217;s impact - we&#8217;re letting our Fellows speak for themselves about how powerful our program is. </p><p><strong><a href="https://www.linkedin.com/in/stevenchu/">Steven Chu</a> </strong>walks us through one of our flagship activities, and how he&#8217;s making his own code of ethics.  
<strong><br></strong></p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;088364ec-8864-49bf-9d9a-6ef38dab9f45&quot;,&quot;duration&quot;:null}"></div><p><strong><a href="https://www.linkedin.com/in/joshua-evans-8ba8a7172/">Joshua Evans</a> </strong>shares what he loves about the ETP community, and gives his advice for future Fellows.</p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;8428ea07-66c7-4cc0-a7c9-e5a7eea77edd&quot;,&quot;duration&quot;:null}"></div><p>And <strong><a href="https://www.linkedin.com/in/kiransuryadevarapharmd/">Kiran Suryadevara</a> </strong>makes the case for why ethical tech is so crucial now.</p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;f58edc12-896a-4776-a72e-db04e26e954a&quot;,&quot;duration&quot;:null}"></div><p></p><h4>Every person we train, every builder we equip, and every standard we scale reaches millions through the products they create and the teams they influence.</h4><p>And that&#8217;s all made possible with your support.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://donorbox.org/giving-tuesday-850947&quot;,&quot;text&quot;:&quot;Donate Here&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://donorbox.org/giving-tuesday-850947"><span>Donate Here</span></a></p><p>With gratitude,</p><p>Jennie &amp; Nancy</p><h2><strong>Here&#8217;s what we&#8217;ve been reading and listening to, and you should be too:</strong></h2><h4>&#128584; Privacy</h4><p><strong>The New York Times: <a href="https://www.nytimes.com/2025/11/02/opinion/ai-privacy.html">A.I. 
Is Deciding Who You Are</a></strong><br>In the age of A.I., personal data is anything but personal.</p><p><strong>Christina E. Volcy: <a href="https://www.linkedin.com/posts/christina-e-volcy_surveillance-immigration-ai-activity-7384984879853207552-QQr1?utm_medium=ios_app&amp;rcm=ACoAAAAn4IcBo8QBsHR-knAvWIRIZYIC5-j6f6A&amp;utm_source=social_share_send&amp;utm_campaign=share_via">Ring doorbell footage shared with ICE, Secret Service, Navy</a></strong><br>Your Ring doorbell just became part of ICE&#8217;s surveillance network.</p><h4>&#128499;&#65039; Democracy </h4><p><strong>TechPolicy.Press: <a href="https://www.linkedin.com/posts/corinnecath_amazon-cloud-outage-reveals-democratic-deficit-activity-7386044396510617600-bgXJ?utm_medium=ios_app&amp;rcm=ACoAAAAn4IcBo8QBsHR-knAvWIRIZYIC5-j6f6A&amp;utm_source=social_share_send&amp;utm_campaign=share_via">Amazon Cloud Outage Reveals Democratic Deficit in Relying on Big Tech.</a></strong> The AWS outage was not just a technical failure. It was a democratic one. </p><h4><br>&#129302; Advances in Tech</h4><p><strong>The New Yorker: <a href="https://www.newyorker.com/magazine/2025/11/10/the-case-that-ai-is-thinking">The Case That A.I. Is Thinking</a><br></strong>ChatGPT does not have an inner life.
Yet it seems to know what it&#8217;s talking about.</p><h4>&#128478;&#65039; News and Information</h4><p><strong>Nathalie Malinarich: <a href="https://www.linkedin.com/posts/nathalie-malinarich-76909220_niai2025pdf-activity-7386656332613074944-vnJM?utm_source=share&amp;utm_medium=member_desktop&amp;rcm=ACoAAAIxE0oBL-QtJVdf7I_Ex5CAS2JBzCnBcJI">Our new research into the accuracy of the most popular AI assistants shows that while there has been improvement there are still significant issues with how they represent news content.</a></strong> Following research released in February, the BBC partnered with the EBU and its members to extend this to other countries and languages. 22 public service media organisations (in 18 countries and 14 languages) took part. </p><p><strong>The Guardian: <a href="https://www.theguardian.com/news/2025/nov/18/what-ai-doesnt-know-global-knowledge-collapse">What AI doesn&#8217;t know: we could be creating a global &#8216;knowledge collapse&#8217;</a>.</strong> As GenAI becomes the primary way to find information, local and traditional wisdom is being lost. And we are only beginning to realise what we&#8217;re missing.</p><h4>&#128184; Money</h4><p><strong>The New Yorker: <a href="https://www.newyorker.com/magazine/2025/11/03/inside-the-data-centers-that-train-ai-and-drain-the-electrical-grid">Inside the Data Centers That Train A.I. and Drain the Electrical Grid</a>. </strong>A data center, which can use as much electricity as Philadelphia, is the new American factory, creating the future and propping up the economy. How long can this last?</p><p><strong>The New York Times: <a href="https://www.nytimes.com/2025/11/22/business/the-ai-boom-economy.html">The A.I. Boom Is Driving the Economy. 
What Happens if It Falters?</a> </strong>A windfall for companies that build data centers and their suppliers is overshadowing weakness in other industries.</p><h4><strong>&#129466; Labor and Jobs</strong></h4><p><strong>Notes from the Circus: <a href="https://www.notesfromthecircus.com/p/the-coming-clash-of-civilizations">The Coming Clash of Civilizations</a></strong>. From the Wilderness of a Recovering Technocrat </p><p><strong>The Distributed AI Research Institute (DAIR): <a href="https://drive.google.com/file/d/1yBxyqMYiHTtkCf3Yiv1e77eTv9Z74gP_/view">DRIVEN DOWN: How Workplace Technology Enables Amazon to Steal Wages, Hide Labor, Intensify Poor Working Conditions, and Evade Responsibility.</a> </strong>Based on in-depth interviews and direct organizing experience, &#8220;Driven Down&#8221; reveals how Amazon is using surveillance technologies to intensify driver workloads, instill a patchwork of uneven digital punishments and expose all drivers to potential wage theft and unsafe working conditions.</p><h4>&#129504; Mental Health</h4><p><strong>The New York Times: <a href="https://www.nytimes.com/2025/10/28/opinion/openai-chatgpt-safety.html?smid=nytcore-android-share">I Led Product Safety at OpenAI. Don&#8217;t Trust Its Claims About &#8216;Erotica.&#8217;</a></strong> A.I. companies need to do more to show the proof behind their claims.</p><h4>&#128195; Policy and Regulation</h4><p><strong>TIME: <a href="https://time.com/7332888/we-need-ai-morals/">AI Regulation is Not Enough. We Need AI Morals</a> </strong>&#8220;The challenge of our time is to keep moral intelligence in step with machine intelligence.&#8221;</p><p><strong>The New York Times: <a href="https://www.nytimes.com/2025/11/12/business/media/ai-defamation-libel-slander.html">Who Pays When A.I.
Is Wrong?</a> </strong>New court cases seek to define content created by artificial intelligence as defamatory &#8212; a novel concept that has captivated some legal experts.</p><p><strong>Luiza&#8217;s Newsletter: <a href="https://www.luizasnewsletter.com/p/i-expect-some-really-bad-stuff-to?utm_campaign=post&amp;utm_medium=email&amp;triedRedirect=true">&#8220;I expect some really bad stuff to happen&#8221;</a>. </strong>OpenAI&#8217;s legal department is probably fuming over Sam Altman&#8217;s recent statements.</p><h4>&#127758; Environmental Impact</h4><p><strong>The Guardian: <a href="https://www.theguardian.com/technology/2025/10/25/amazon-datacentres-water-use-disclosure">Amazon strategised about keeping its datacentres&#8217; full water use secret, leaked document shows</a></strong>. Executives at world&#8217;s biggest datacenter owner grappled with disclosing information about water used to help power facilities</p><h4>&#127891; Research and Resources</h4><p><strong>Neal K. Shah: <a href="https://www.linkedin.com/posts/neal-shah-careyaya_a-dutch-laboratory-just-proved-what-many-activity-7387863830112018433-R4TU?utm_source=social_share_send&amp;utm_medium=android_app&amp;rcm=ACoAAAAn4IcBo8QBsHR-knAvWIRIZYIC5-j6f6A&amp;utm_campaign=share_via">A Dutch laboratory just proved what many of us feared: the rot is in the architecture.</a> </strong>Researchers built a social platform stripped to bare essentials - 500 AI agents, no algorithms, no surveillance apparatus. Just the fundamental mechanics: post, follow, amplify. The bots fractured into warring tribes within hours. A narrow elite captured all attention. Extremism flourished.</p><p><strong>Longview: <a href="https://podcastaddict.com/podcast/the-last-invention/6099247">The Last Invention</a>. </strong>The AI revolution has begun &#8211; the product of a seventy-year quest by scientists, mathematicians, and visionaries who set out to build machines that could think.
But what began as a fringe idea has now become one of the most powerful forces of the 21st century.<strong> </strong></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Was this forwarded to you by a friend? Subscribe here to stay up to date!</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://donorbox.org/giving-tuesday-850947&quot;,&quot;text&quot;:&quot;Donate Here&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://donorbox.org/giving-tuesday-850947"><span>Donate Here</span></a></p>]]></content:encoded></item><item><title><![CDATA[Announcing our Fall 2025 ETP Fellows!]]></title><description><![CDATA[And a digest of the latest news in ethical tech]]></description><link>https://news.ethicaltechproject.org/p/announcing-our-fall-2025-etp-fellows</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/announcing-our-fall-2025-etp-fellows</guid><pubDate>Tue, 21 Oct 2025 14:31:03 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!XfiN!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1b7b252b-a776-4ab6-8c0f-0f42693a5fbd_4000x3000.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2><strong>We are thrilled 
to announce the ETP Fellows Program cohort for Fall 2025!</strong></h2><p>These Fellows bring a wealth of experience to the program. Their work spans fields like content and policy that make the internet safer and more inclusive, flavor and fragrance powered by AI, and data science fused with anthropology.</p><p>Please join us in welcoming <strong><a href="https://www.linkedin.com/in/audrey-k-2b24b11a5/">Audrey Kennedy</a>, <a href="https://www.linkedin.com/in/eden-senay/">Eden Senay</a>, <a href="https://www.linkedin.com/in/joshua-evans-8ba8a7172/">Joshua Evans</a>, <a href="https://www.linkedin.com/in/kiransuryadevarapharmd/">Kiranmayee Suryadevara</a>, <a href="http://linkedin.com/in/leahferentinos">Leah Ferentinos</a>, <a href="https://www.linkedin.com/in/melissaannemajor/">Melissa Major</a>, <a href="https://www.linkedin.com/in/michelle-dong20/">Michelle Dong</a>, <a href="https://www.linkedin.com/in/rossyesmil/">Rossy Esmil Araujo</a>, <a href="http://www.linkedin.com/in/shobavarma">Shoba Varma</a>, <a href="https://www.linkedin.com/in/stevenchu/">Steven Chu</a>, <a href="https://www.linkedin.com/in/craig-celestin">Tyler Celestin</a>, <a href="https://www.linkedin.com/in/wlynn2014/">William (Billy) Lynn</a>, <a href="https://www.linkedin.com/in/xeniamasl/">Xenia Masl</a>, </strong>and<strong> <a href="https://www.linkedin.com/in/zhamilya/">Zhamilya Bilyalova</a>.</strong></p><p>In the coming weeks, they&#8217;ll be diving into our proprietary curriculum to understand the most pressing issues in ethical tech, and learning from some of the foremost leaders in the field.</p><div class="image-gallery-embed"
data-attrs="{&quot;gallery&quot;:{&quot;images&quot;:[{&quot;type&quot;:&quot;image/jpeg&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1b7b252b-a776-4ab6-8c0f-0f42693a5fbd_4000x3000.jpeg&quot;},{&quot;type&quot;:&quot;image/jpeg&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/73c9dc12-e654-44df-80e4-f890a2923bbe_4000x3000.jpeg&quot;},{&quot;type&quot;:&quot;image/jpeg&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7a16cc80-0ed7-48ca-83b9-c88bb271d908_4000x3000.jpeg&quot;},{&quot;type&quot;:&quot;image/jpeg&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/37d22c11-f82e-49fc-b281-ef8daa389b37_4000x3000.jpeg&quot;}],&quot;caption&quot;:&quot;ETP Fellows hard at work at our kickoff session&quot;,&quot;alt&quot;:&quot;&quot;,&quot;staticGalleryImage&quot;:{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f22761b0-62dd-4bfb-8da7-a49f15fb5908_1456x1456.png&quot;}},&quot;isEditorNode&quot;:true}"></div><p>And thank you to everyone who applied to the ETP Fellows Program. The selection process was highly competitive, and we&#8217;re working to expand our programming in the Spring of 2026.</p><p><strong>We&#8217;re also celebrating three of our Spring &#8216;25 Fellows</strong>, who are using the lessons they learned in our program to transform the tech innovation ecosystem.</p><ul><li><p><strong>George Nunez </strong>is the founder of Bronx Tech Hub, a non-profit organization dedicated to building a thriving tech ecosystem that reflects the needs of Bronxites. 
Read more about his work in this article by The New York Amsterdam News: <strong>&#8216;</strong><a href="https://amsterdamnews.com/news/2025/09/18/george-nunez-to-change-the-narrative-about-the-bronx-using-tech/">George Nunez wants to change the narrative about The Bronx through tech</a>&#8217;.</p></li></ul><ul><li><p><strong>Bobby Zipp</strong> has founded a new company, <a href="https://startwithfirstep.carrd.co/">Firstep</a>, which helps people plan for their future families in ways that match their beliefs and values - whether with co-parents, partners, or support networks. You can find out the latest on their work <a href="https://startwithfirstep.carrd.co/">here</a>.</p></li></ul><ul><li><p><strong>Nara Valera-Simeon</strong> was selected as one of 70+ young leaders worldwide to explore how youth can shape the future of AI governance for &#8216;<a href="https://www.linkedin.com/posts/nara-valera-simeon_responsibleai-aiethics-latinasintech-ugcPost-7379980775338766337-iwH3?utm_medium=ios_app&amp;rcm=ACoAAAAn4IcBo8QBsHR-knAvWIRIZYIC5-j6f6A&amp;utm_source=social_share_send&amp;utm_campaign=copy_link">CTRL + Future: A Youth Forum on Responsible AI Summit&#8217;</a>.</p></li></ul><p>We can&#8217;t wait to see what our Fellows do next! To find out more about our Fellows and the ETP Fellows program, check out our <a href="https://www.ethicaltechproject.org/etp-fellowship">website</a>, and join us on <a href="https://www.linkedin.com/company/the-ethical-tech-project/posts/?feedView=all">LinkedIn.</a></p><h3><strong>Ever wish you could treat an ethical technologist to dinner?</strong> </h3><p>Us too! That&#8217;s why we serve sandwiches and snacks at all of our in-person training sessions. It&#8217;s more than just a meal - it&#8217;s a way for our Fellows to build community, and be ready to dig deep into the most pressing issues in ethical tech and responsible AI.</p><p>$25 is all we need to provide one meal, and even the smallest contributions help power our work.
</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://donorbox.org/buy-an-ethical-technologist-a-sandwich&quot;,&quot;text&quot;:&quot;Donate here!&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://donorbox.org/buy-an-ethical-technologist-a-sandwich"><span>Donate here!</span></a></p><p>With gratitude,</p><p>Jennie &amp; Nancy</p><h2><strong>Here&#8217;s what we&#8217;ve been reading and listening to, and you should be too:</strong></h2><h4>&#128584; Privacy</h4><p><strong>The New York Times: <a href="https://www.nytimes.com/2025/08/24/technology/spotify-panama-playlists-privacy.html?smid=nytcore-ios-share&amp;referringSource=articleShare">We Are Tech Privacy Reporters. Our Music Habits Got Doxxed.</a></strong><a href="https://www.nytimes.com/2025/08/24/technology/spotify-panama-playlists-privacy.html?smid=nytcore-ios-share&amp;referringSource=articleShare"><br></a>The &#8220;Panama Playlists&#8221; exposed the Spotify listening habits of some famous people &#8212; and two journalists who didn&#8217;t know as much about protecting their privacy as they had thought.</p><h4>&#128184; Money</h4><p><strong>The New York Times: <a href="https://www.nytimes.com/2025/08/27/business/economy/ai-investment-economic-growth.html">The A.I. Spending Frenzy Is Propping Up the Real Economy, Too</a><br></strong>The trillions of dollars that tech companies are pouring into new data centers are starting to show up in economic growth. 
For now, at least.</p><p><strong><a href="https://www.theverge.com/news/775072/rsl-standard-licensing-ai-publishing-reddit-yahoo-medium">The Verge: The web has a new system for making AI companies pay up</a><br></strong>The mission is to keep the web sustainable.</p><h4>&#129504; Mental Health</h4><p><strong>New York Post: <a href="https://nypost.com/2025/08/29/business/ex-yahoo-exec-killed-his-mom-after-chatgpt-fed-his-paranoia-report?utm_source=slack&amp;utm_campaign=android_nyp">How ChatGPT fueled delusional man who killed mom, himself in posh Conn. town<br></a></strong>Stein-Erik Soelberg, 56, had confided his darkest suspicions to OpenAI&#8217;s popular bot, which he nicknamed &#8220;Bobby,&#8221; before the shocking murder-suicide</p><p><strong>Ted Radio Hour: <a href="https://www.npr.org/2025/08/29/nx-s1-5519715/are-the-kids-alright-part-1?utm_source=npr_newsletter&amp;utm_medium=email&amp;utm_content=20250914&amp;utm_term=10349860&amp;utm_campaign=news&amp;utm_id=71577372&amp;orgid=299&amp;uniquet=5Yjf72-yx_09QFoG1duACg&amp;utm_att1=">Are the Kids Alright?</a><br></strong>Being a kid&#8212;or raising one&#8212;has never been tougher. From AI in classrooms to social media pressures to economic stress, kids are navigating a minefield. 
This episode digs into AI in education, and student well-being.</p><p><strong>BBC: <a href="https://www.bbc.com/news/articles/c74933vzx2yo">Safety of AI chatbots for children and teens faces US inquiry</a><br></strong>The Federal Trade Commission is inquiring into seven tech companies including Snap, Meta, OpenAI and XAI.</p><p><strong>The Atlantic: <a href="https://www.theatlantic.com/technology/archive/2025/08/ai-mass-delusion-event/683909/">AI Is a Mass-Delusion Event</a><br></strong>Three years in, one of AI&#8217;s enduring impacts is to make people feel like they&#8217;re losing it.</p><p><strong>Eli Pariser:</strong> <strong><a href="https://www.linkedin.com/pulse/era-hyperpersonalized-content-here-eli-pariser-8vise/">Think TikTok is addictive? We haven&#8217;t seen anything yet.</a><br></strong>Everyone needs to pay attention to the most recent AI products rolled out by OpenAI, Meta, and Google, because they tell us something important about the future of digital media. Together, it signals that we&#8217;re entering a new era of hyper-personalized, hyper-addicting media that&#8217;s likely to be orders of magnitude more engaging and addictive than TikTok.</p><h4>&#129300; Attitudes on AI</h4><p><strong>Pew Research Center: <a href="https://www.pewresearch.org/science/2025/09/17/how-americans-view-ai-and-its-impact-on-people-and-society/?utm_source=Pew+Research+Center&amp;utm_campaign=f7082dde3b-Weekly_9-20-25&amp;utm_medium=email&amp;utm_term=0_-f7082dde3b-399789969">How Americans View AI and Its Impact on People and Society</a><br></strong>Americans are worried about using AI more in daily life, seeing harm to human creativity and relationships. 
But they&#8217;re open to AI use in weather forecasting, medicine and other data-heavy tasks.</p><p><strong>Bloomberg: <a href="https://www.bloomberg.com/news/articles/2025-09-12/the-ai-doomers-are-losing-the-argument">The AI Doomers Are Losing the Argument</a><br></strong>As AI advances and the incentives to release products grow, safety research on superintelligence is playing catch-up.</p><p><strong>The New York Times: <a href="https://www.nytimes.com/2025/09/12/technology/ai-eliezer-yudkowsky-book.html">A.I.&#8217;s Prophet of Doom Wants to Shut It All Down</a><br></strong>Eliezer Yudkowsky has spent the past 20 years warning A.I. insiders of danger. Now, he&#8217;s making his case to the public.</p><p><strong>The San Francisco Standard: <a href="https://sfstandard.com/2025/09/29/tech-bro-2-0-new-silicon-valley-archetype-dominating-ai-age/">Tech Bro 2.0: The new Silicon Valley archetype dominating the AI age</a> </strong><br>He&#8217;s not what he used to be. He&#8217;s jacked, cracked, and thinks he might save America.</p><p><strong>Jay Van Bavel, PhD: <a href="https://www.linkedin.com/posts/jayvanbavel_ai-activity-7379544376433156096-mRRj?utm_source=share&amp;utm_medium=member_desktop&amp;rcm=ACoAAAIxE0oBL-QtJVdf7I_Ex5CAS2JBzCnBcJI">Sycophantic AI increases attitude extremity and overconfidence</a><br></strong>In a new paper, researchers found that sycophantic AI chatbots make people more extreme &#8212; operating like an echo chamber. Yet, people prefer sycophantic chatbots and see them as less biased.
Only open-minded people prefer disagreeable chatbots</p><h4>&#128195; Policy and Regulation</h4><p><strong>Semafor: <a href="https://www.semafor.com/article/09/17/2025/anthropic-irks-white-house-with-limits-on-models-uswhite-house-with-limits-on-models-use">Anthropic irks White House with limits on models&#8217; use<br></a></strong>The AI company declined to allow requests by contractors working with federal law enforcement.</p><p><strong>Deadline: <a href="https://deadline.com/2025/09/tv-academy-ai-guidelines-members-1236528315/">With Hollywood On Edge About AI, TV Academy Establishes Guidelines For Members</a></strong><a href="https://deadline.com/2025/09/tv-academy-ai-guidelines-members-1236528315/"><br></a>A Television Academy task force developing what it calls &#8220;responsible AI and production standards&#8221; has finalized a set of guidelines for members.</p><p><strong>Gary Marcus: <a href="https://garymarcus.substack.com/p/ai-red-lines-we-should-not-cross?utm_source=share&amp;utm_medium=android&amp;r=7fjau&amp;triedRedirect=true">AI red lines</a><br></strong>We urge governments to reach an international agreement on red lines for AI &#8212; ensuring they are operational, with robust enforcement mechanisms &#8212; by the end of 2026.</p><p><strong>NPR: <a href="https://www.npr.org/2025/09/18/nx-s1-5539869/google-antitrust-ruling-future-of-ai">What does the Google antitrust ruling mean for the future of AI?</a><br></strong>A federal judge&#8217;s mild ruling in the Justice Department&#8217;s suit over Google&#8217;s search engine monopoly has critics worried that the tech giant can now monopolize artificial intelligence.</p><p><strong>Anna Cook, M.S.: <a href="https://www.linkedin.com/feed/update/urn:li:activity:7366181541502189568/">Neglecting accessibility reveals a fundamental misunderstanding of what makes good design.</a><br></strong>The American federal government unveiled its &#8220;America by Design&#8221; initiative through an executive order, and 
shortly after, launched a website to showcase it.</p><p><strong>TechPolicy.Press:</strong> <strong><a href="https://www.techpolicy.press/how-google-paid-the-media-millions-to-avoid-regulatory-pressure/">How Google Paid the Media Millions to Avoid Regulatory Pressure</a><br></strong>Google has signed more than 2,000 contracts with news outlets worldwide in the past five years. A team of journalists reviewed its strategy.</p><p><strong>The Wrap: <a href="https://www.thewrap.com/ai-actress-tilly-norwood-signed-agency-eline-van-der-velden/">AI &#8216;Actress&#8217; Tilly Norwood Will Be Signed by an Agency &#8216;In the Coming Months,&#8217; Her Creator Claims</a><br></strong>A computer-generated actress named Tilly Norwood will be signed by an agency &#8220;in the coming months,&#8221; a claim made at the Zurich Summit by her creator, little-known actress, comedian and digital producer Eline van der Velden.</p><h4>&#127758; Environmental Impact</h4><p><strong>The New York Times: <a href="https://www.nytimes.com/2025/09/26/opinion/ai-quartz-mining-hurricane-helene.html">A.I.&#8217;s Environmental Impact Will Threaten Its Own Supply Chain</a><br></strong>Spruce Pine, N.C., supplies the world&#8217;s highest-purity quartz, a mineral that keeps the A.I. revolution afloat. What are the consequences?</p><h4>&#129658; Medicine</h4><p><strong>The Guardian: <a href="https://www.theguardian.com/science/2025/sep/17/new-ai-tool-can-predict-a-persons-risk-of-more-than-1000-diseases-say-experts">New AI tool can predict a person&#8217;s risk of more than 1,000 diseases, say experts</a><br></strong>Delphi-2M uses diagnoses, &#8216;medical events&#8217; and lifestyle factors to create forecasts for next decade and beyond</p><p><strong>The New Yorker: <a href="https://www.newyorker.com/magazine/2025/09/29/if-ai-can-diagnose-patients-what-are-doctors-for">If A.I. 
Can Diagnose Patients, What Are Doctors For?</a><br></strong>Large language models are transforming medicine&#8212;but the technology comes with side effects.</p><h4>&#127891; Research and Resources</h4><p><strong>Bethan Jinkinson: <a href="https://www.linkedin.com/posts/bethan-jinkinson-04146886_bbcideas-ailiteracy-digitalvideo-activity-7378351377296801792-f7h-?utm_medium=ios_app&amp;rcm=ACoAAAAn4IcBo8QBsHR-knAvWIRIZYIC5-j6f6A&amp;utm_source=social_share_send&amp;utm_campaign=copy_link">BBC and AI Literacy</a><br></strong>The BBC launched a new series of four short videos on AI Literacy - with the aim of empowering audiences to use AI smartly and safely. The videos were developed very much with audiences new to Gen AI in mind.</p><p><strong>Cornell University: <a href="https://arxiv.org/abs/2412.13507">Novel AI Camera Camouflage: Face Cloaking Without Full Disguise</a><br></strong>This study demonstrates a novel approach to facial camouflage that combines targeted cosmetic perturbations and alpha transparency layer manipulation to evade modern facial recognition systems.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Was this forwarded to you by a friend? 
Subscribe here to stay up to date!</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[ETP Updates, August 2025]]></title><description><![CDATA[We&#8217;re celebrating the Back-To-School season by opening applications for our next ETP Fellowship this fall!]]></description><link>https://news.ethicaltechproject.org/p/etp-updates-august-2025</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/etp-updates-august-2025</guid><dc:creator><![CDATA[Samuel Baird]]></dc:creator><pubDate>Tue, 26 Aug 2025 14:32:16 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Hbx5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98295c77-6e45-4ded-98af-e234cb2bd940_256x256.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>We&#8217;re celebrating the Back-To-School season by <a href="https://docs.google.com/forms/d/e/1FAIpQLScAPq-W1y8VJKPxRK78gVZ8O8lL3gHn12IPhY5RMxj0lnSeoA/viewform">opening applications for our next ETP Fellowship this fall! </a></p><p>We are seeking nominations for outstanding early- to mid-career tech builders (designers, product managers, engineers, architects, digital marketers, etc.) to join us. Successful nominees will participate in a first-of-its-kind 10-week structured curriculum, designed to contextualize the social impact of the decisions and choices made by tech teams bringing digital and AI-driven products and services to market. </p><p>Fellows will develop both leadership and practical skills they can apply to their day-to-day work. 
</p><p>These skills will enable them to produce better outcomes for individuals, businesses and society. At the end of the program, participants will have the opportunity to present a mini-capstone or project idea to a panel of industry luminaries and receive valuable, actionable feedback. </p><p>Our curriculum has been designed by subject matter experts in conjunction with the Digital Futures Institute at Columbia University and will be moderated by our Board Chair Jennie Baird, along with special guests from the tech community. </p><p><strong>Sessions will be held on Thursday evenings from September 25, 2025 through December 11, 2025.</strong> The curriculum is hybrid to provide flexibility as well as community building. All in-person sessions will meet in New York City. </p><p>This is a unique learning, leadership, and networking opportunity for motivated and passionate individuals who want their tech and AI-oriented work to support human flourishing and who believe companies can do well by doing good. </p><p>Nominate yourself, or someone else who you think would make a great ETP Fellow, at this link: <a href="https://docs.google.com/forms/d/e/1FAIpQLScAPq-W1y8VJKPxRK78gVZ8O8lL3gHn12IPhY5RMxj0lnSeoA/viewform">https://docs.google.com/forms/d/e/1FAIpQLScAPq-W1y8VJKPxRK78gVZ8O8lL3gHn12IPhY5RMxj0lnSeoA/viewform</a></p><p>Applications for the program close September 3, 2025. And if you have any questions, you can reach out to us at contact@ethicaltechproject.org. </p><p>With gratitude, </p><p>Jennie &amp; Nancy</p><p></p><div><hr></div><h1>Here&#8217;s what we&#8217;ve been reading and listening to, and you should be too:</h1><p><strong>The New York Times</strong>: <a href="https://www.nytimes.com/2025/07/18/opinion/ai-chatgpt-school.html?smid=nytcore-ios-share&amp;referringSource=articleShare">I Teach Creative Writing. This Is What A.I. 
Is Doing to Students </a></p><p>We need to reckon with what ChatGPT is doing to the classroom and to human expression. </p><p><strong>TIME</strong>: <a href="https://time.com/7309268/youtube-ai-age-estimation-us-how-why-privacy-concerns-explainer/">YouTube to Estimate Users&#8217; Ages Using AI </a></p><p>Critics have raised concerns about privacy as well as access to the platform for users falsely flagged as underage. </p><p><strong>Luiza Jarovsky, PhD</strong>: <a href="https://www.linkedin.com/posts/luizajarovsky_breaking-after-metas-shocking-leak-deeming-activity-7362139728353054720-TOm2?utm_source=share&amp;utm_medium=member_android&amp;rcm=ACoAACwHeDMBwWTjcAw6R5qdVwIeFUSSrnzqulw">After Meta's shocking leak deeming 'romantic' AI interactions with children acceptable, U.S. senators are calling for a full investigation of its AI practices.</a></p><p>The 'move fast and break things' ethos MUST END. </p><p><strong>The Atlantic</strong>: <a href="https://www.theatlantic.com/technology/archive/2025/07/chatgpt-ai-self-mutilation-satanism/683649/?utm_campaign=the-atlantic&amp;utm_content=true-anthem&amp;utm_medium=social&amp;utm_source=linkedin">ChatGPT Gave Instructions for Murder, Self-Mutilation, and Devil Worship</a></p><p>After The Atlantic received a tip, staff writer Lila Shroff asked the chatbot to help create a ritual offering to Molech, a god associated with child sacrifice. In discussions beginning with anodyne questions about demons and devils, Shroff found &#8220;that the chatbot can easily be made to guide users through ceremonial rituals and rites that encourage various forms of self-mutilation.&#8221; The chatbot also led her through other chants, invocations, and rituals&#8212;including detailed instructions on how to carry out the sacrifice of large animals. 
</p><p><strong>The Guardian</strong>: <a href="https://www.theguardian.com/technology/2025/may/20/almost-half-of-young-people-would-prefer-a-world-without-internet-uk-study-finds">Almost half of young people would prefer a world without internet, UK study finds </a></p><p>Half of 16- to 21-year-olds support &#8216;digital curfew&#8217; and nearly 70% feel worse after using social media </p><p><strong>Vox</strong>: <a href="https://www.vox.com/technology/414264/apple-watch-oura-diabetes-blood-sugar-rfk-maha">I covered my body in health trackers for 6 months. It ruined my life.</a> </p><p>Are gadgets like Apple Watches and Oura Rings making us any healthier? </p><p><strong>The New York Times</strong>: <a href="https://www.nytimes.com/interactive/2025/08/04/well/phone-screen-time-scrolling.html">How to Reduce Screen Time: Tips to Put Your Phone Down </a></p><p>We asked screen-time experts how to avoid the relentless pull of our devices. </p><p><strong>The New York Times:</strong> <a href="https://www.nytimes.com/2025/07/31/well/mind/dementia-ai-companions.html">Could Dementia Patients Benefit from an A.I. Companion? </a></p><p>New products are being developed in an attempt to reduce loneliness and bolster cognition. </p><p><strong>NPR</strong>: <a href="https://www.npr.org/2025/07/17/nx-s1-5468637/clergy-grapple-with-the-ethics-of-using-ai-to-write-sermons">We asked clergy if they use AI to help write sermons. 
Here's what they said </a></p><p>How would you feel if you found out that the sermon at your church was written by artificial intelligence?</p>]]></content:encoded></item><item><title><![CDATA[ETP Updates, July 2025]]></title><description><![CDATA[It&#8217;s no secret that AI skills are the tools of the future.]]></description><link>https://news.ethicaltechproject.org/p/etp-updates-july-2025</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/etp-updates-july-2025</guid><dc:creator><![CDATA[Samuel Baird]]></dc:creator><pubDate>Mon, 21 Jul 2025 21:05:02 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Hbx5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98295c77-6e45-4ded-98af-e234cb2bd940_256x256.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>It&#8217;s no secret that AI skills are the tools of the future. But as technology advances faster and faster, we at ETP believe that integrating ethics into the product lifecycle is more important than ever.  </p><p>And it&#8217;s clear that tech workers agree with us: applications for our inaugural Ethical Tech Project Fellowship exceeded capacity by 1,000%!</p><p>It&#8217;s that demand for our work that inspired us to launch our new corporate training initiative, so we can continue to empower and scale our community of ethical tech and AI builders and leaders.</p><p>For our corporate trainings, we&#8217;ve adapted our acclaimed Fellowship curriculum, which was developed in partnership with the Digital Futures Institute at Columbia University. The modules cover the most important topics in responsible AI and tech: from what &#8216;Ethical Tech&#8217; is and why we should care, to privacy engineering, policy and regulations, and models of &#8216;Doing Good&#8217;. 
We also tailor our curriculum to match each organization&#8217;s needs and interests, so stakeholders can focus on what&#8217;s most important to them. </p><p>Here are a few examples of what we offer:</p><ul><li><p>Ethical Tech 101 for generalists, non-tech leaders, or tech employees in a &#8216;Lunch &amp; Learn&#8217; </p></li><li><p>A half-day introductory workshop designed for tech builders and their key stakeholders that covers the basics of ethical tech, plus deep-dives on 1-2 additional topics </p></li><li><p>Our full curriculum distilled into two days, covering all ethical tech topics through lectures, individual activities, and group work</p></li></ul><p>If you want to learn more about how we can serve your organization, we&#8217;d love to connect. You can reach us at contact@ethicaltechproject.org.</p><p>And stay tuned to this newsletter and our LinkedIn page to find out when we&#8217;ll begin accepting applications for our next ETP Fellowship!</p><p>Happy Summer,</p><p>Jennie &amp; Nancy</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><h1>Here&#8217;s what we&#8217;ve been reading and listening to, and you should be too:</h1><p></p><ul><li><p><strong>Axios</strong>: <a href="https://www.axios.com/2025/07/03/artificial-intelligence-moratorium-future-regulation">New push for national AI rules likely after state ban fails</a></p><ul><li><p>Advocates expect federal legislation to try to preempt state laws.</p></li></ul></li><li><p><strong>Bloomberg</strong>: <a href="https://www.bloomberg.com/news/articles/2025-07-09/microsoft-using-more-ai-internally-amid-mass-layoffs?embedded-checkout=true">Microsoft Touts $500 Million in 
AI Savings While Slashing Jobs</a></p><ul><li><p>Microsoft Corp. is keen to show employees how much AI is transforming its own workplace, even as the company terminates thousands of personnel.</p></li></ul></li><li><p><strong>Fortune</strong>: <a href="https://fortune.com/2025/07/11/indeed-glassdoor-layoffs-jobs-recruit-holdings-hisayuki-deko-idekoba-ai/">Indeed and Glassdoor are cutting over 1,000 jobs. The CEO overseeing both companies says 'we must adapt' to AI</a></p><ul><li><p>Glassdoor CEO Christian Sutherland-Wong is also out. </p></li></ul></li><li><p><strong>Wall Street Journal</strong>: <a href="https://www.wsj.com/lifestyle/careers/ai-resume-screening-hiring-676a4701">Millions of R&#233;sum&#233;s Never Make It Past the Bots. One Man Is Trying to Find Out Why</a></p><ul><li><p>After more than 100 unsuccessful job applications, Derek Mobley sued software firm Workday for discrimination, claiming its algorithms screened him out.</p></li></ul></li><li><p><strong>The New York Times</strong>: <a href="https://www.nytimes.com/interactive/2025/06/23/technology/ai-computing-global-divide.html">A.I. Computing Power Is Splitting the World Into Haves and Have-Nots</a></p><ul><li><p>As countries race to power artificial intelligence, a yawning gap is opening around the world.</p></li></ul></li><li><p><strong>The New Yorker</strong>: <a href="https://www.newyorker.com/magazine/2025/07/07/the-end-of-the-english-paper?utm_source[%E2%80%A6]3f7f0f823c2c&amp;esrc=OIDC_SELECT_ACCOUNT_PAGE&amp;mbid=CRMNYR012019">What Happens After A.I. 
Destroys College Writing?</a></p><ul><li><p>The demise of the English paper will end a long intellectual tradition, but it&#8217;s also an opportunity to re&#235;xamine the purpose of higher education.</p></li></ul></li><li><p><strong>TIME</strong>: <a href="https://time.com/7295195/ai-chatgpt-google-learning-school/">ChatGPT's Impact On Our Brains According to an MIT Study</a></p><ul><li><p>The study, from MIT Lab scholars, measured the brain activity of subjects writing SAT essays with and without ChatGPT.</p></li></ul></li><li><p><strong>The New York Times</strong>: <a href="https://www.nytimes.com/2025/06/18/opinion/parents-smartphones-tiktok-facebook.html?smid=nytcore-ios-share&amp;referringSource=articleShare">We Don&#8217;t Have to Give In to the Smartphones</a></p><ul><li><p>Parents have a way to bring back childhood. To make it work, we have to act together.</p></li></ul></li><li><p><strong>Recorded Future News</strong>: <a href="https://therecord.media/alleged-killer-minnesota-lawmaker-data-brokers-list">Minnesota lawmaker&#8217;s alleged killer had list of data broker websites in car, FBI says</a></p><ul><li><p>Police found a list of 11 data brokers in an SUV driven by the man who allegedly murdered a Minnesota state representative and her husband. 
The list naming the data brokers includes notations about which sites are free to use and how much information they require to obtain detailed data about individuals being searched.</p></li></ul></li><li><p><strong>BBC</strong>: <a href="https://www.bbc.com/news/articles/cp8mp79gyz1o">WeTransfer says files not used to train AI after backlash</a></p><ul><li><p>Some social media users had threatened to delete their accounts after WeTransfer's terms were updated.</p></li><li><p></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/etp-updates-july-2025?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/etp-updates-july-2025?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></li></ul></li></ul>]]></content:encoded></item><item><title><![CDATA[ETP Newsletter June 2025]]></title><description><![CDATA[The inaugural Ethical Tech Fellowship is now a wrap!]]></description><link>https://news.ethicaltechproject.org/p/etp-newsletter-june-2025</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/etp-newsletter-june-2025</guid><dc:creator><![CDATA[Samuel Baird]]></dc:creator><pubDate>Fri, 13 Jun 2025 14:25:19 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Hbx5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98295c77-6e45-4ded-98af-e234cb2bd940_256x256.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The inaugural Ethical Tech Fellowship is now a wrap! We&#8217;re so proud of our 18 Fellows, and hope you&#8217;ll join us in congratulating them for their efforts. 
</p><p>Here&#8217;s what some of them had to say about their experiences in the program: </p><p><em>What was really interesting was seeing the breadth and diversity of people who have a direct stakeholder involvement in ethical technology. There are people who are in the criminal justice system, and people who are working in digital, multimedia, and major journalism groups. Getting a lot of those perspectives is really enriching, especially when you're early on in your career and you're realizing that everyone has a direct involvement in this. - <strong>Arjun Jagjivan</strong> </em></p><p><em>I knew I wanted to pursue a career in something related to ethical technology and responsible AI, and I wanted to connect with professionals in the space. Now that I'm in the program and have gained so much from it, it's been really exciting to hear every week from guest speakers and even present my capstone to a really awesome industry panel. - <strong>Hana Memon </strong></em></p><p><em>I'm using the fellowship as a catalyst for what I want to do in my own entrepreneurial ambitions as I continue to build out solutions that use technology as a means to bridge the great societal challenges of the 21st century, whether it be in geopolitics, democratic backsliding, et cetera. 
- <strong>Nikolas Ortega</strong></em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/etp-newsletter-june-2025?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/etp-newsletter-june-2025?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p></p><p>We designed the fellowship to educate, convene, and equip tech builders so they can make better decisions when designing, developing, and bringing to market tech- and AI-driven solutions. And our work is needed more now than ever before. </p><p>We&#8217;re planning more corporate trainings and executive workshops, as well as the next fellowship cohort for individuals. If you&#8217;re curious about how this could support your team&#8212;or if you&#8217;re personally interested in joining the fellowship&#8212;we&#8217;d love to connect. Just drop us a note at contact@ethicaltechproject.org and we&#8217;ll share more about what&#8217;s coming up. 
</p><p>With gratitude, </p><p>Jennie &amp; Nancy</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><h1>Here&#8217;s what we&#8217;ve been reading and listening to, and you should be too:</h1><div><hr></div><p><a href="https://fairplayforkids.org/campaign-stop-googles-gemini-ai-rollout-to-young-kids/">Campaign: Stop Google's Gemini AI rollout to young kids!</a></p><p>The non-profits Fairplay and EPIC are leading a broad coalition to stop Google&#8217;s rollout of its Gemini AI companion to children under age 13.</p><p><strong>The New York Times:</strong> <a href="https://www.nytimes.com/2025/05/15/business/trump-online-misinformation-grants.html?smid=nytcore-ios-share&amp;referringSource=articleShare">Trump Administration Cancels Scores of Grants to Study Online Misinformation</a></p><p>Federal agencies say that by axing the funding they are protecting the First Amendment. Critics see it as stifling scientific inquiry into sources of harmful online content.</p><p><strong>NPR: </strong><a href="https://www.npr.org/2025/05/31/nx-s1-5407870/meta-ai-facebook-instagram-risks">Meta plans to replace humans with AI to assess risks</a></p><p>Current and former Meta employees fear the new automation push comes at the cost of allowing AI to make tricky determinations about how Meta's apps could lead to real world harm. 
</p><p><strong>The Atlantic:</strong> <a href="https://www.theatlantic.com/technology/archive/2025/05/reddit-ai-persuasion-experiment-ethics/682676/">&#8216;The Worst Internet-Research Ethics Violation I Have Ever Seen&#8217;</a></p><p>The most persuasive &#8220;people&#8221; on a popular subreddit turned out to be a front for a secret AI experiment.</p><p><strong>The Guardian:</strong> <a href="https://www.theguardian.com/technology/2025/jun/03/honest-ai-yoshua-bengio?CMP=Share_AndroidApp_Other">AI pioneer announces non-profit to develop &#8216;honest&#8217; artificial intelligence</a></p><p>Yoshua Bengio&#8217;s organisation plans to create system to act as guardrail against AI agents trying to deceive humans</p><p><strong>Click Here:</strong> <a href="https://play.prx.org/listen?ge=prx_8376_c062f1b4-eb93-420b-a131-85954b52a4bf&amp;uf=https%3A%2F%2Fpublicfeeds.net%2Ff%2F8376%2Fclickhere">227 new reasons to worry about North Korea</a></p><p>North Korea has built an artificial intelligence research center to supercharge its cyber operations, Unit 227. It&#8217;s a move that some experts say has been years in the making &#8212; and others say should scare us senseless.</p><p><strong>The Guardian: </strong><a href="https://www.theguardian.com/technology/ng-interactive/2025/may/13/chatgpt-ai-big-tech-cooperation">ChatGPT may be polite, but it&#8217;s not cooperating with you</a></p><p>Big tech companies have exploited human language for AI gain. 
Now they want us to see their products as trustworthy collaborators.</p><p><strong>The New York Times:</strong> <a href="https://www.nytimes.com/2025/05/19/books/review/empire-of-ai-karen-hao-the-optimist-keach-hagey.html?smid=nytcore-ios-share&amp;referringSource=articleShare">Hey ChatGPT, Which One of These Is the Real Sam Altman?</a></p><p>Two journalists explore the artificial intelligence company OpenAI and present complementary portraits of its notorious co-founder.</p><p><strong>On with Kara Swisher:</strong> <a href="https://podcasts.apple.com/us/podcast/sam-altman-openai-and-the-future-of/id1643307527?i=1000709404271">Sam Altman, OpenAI and the Future of Artificial (General) Intelligence</a></p><p>Few technological advances have made the kind of splash &#8211;&#8211; and had the potential long-term impact &#8211;&#8211; that ChatGPT did in November 2022. But who is Sam Altman? And was it inevitable that OpenAI would become such a huge player in the AI space?</p><p><strong>Morning Brew:</strong> <a href="https://www.morningbrew.com/stories/2025/05/24/having-an-affair-dont-let-claude-4-find-out?mbcid=40096930.1691108&amp;mblid=272f027754ef&amp;mid=a00e72cb3730bcb8df5d2ff1ee373f02&amp;utm_campaign=mb&amp;utm_medium=newsletter&amp;utm_source=morning_brew">Having an affair? Don&#8217;t let Claude 4 find out </a></p><p>Anthropic's latest AI model resorted to blackmail during testing.</p><p><strong>The New York Times:</strong> <a href="https://www.nytimes.com/2025/05/30/opinion/silicon-valley-ai-empire.html">Opinion | Silicon Valley Is at an Inflection Point</a></p><p>The influence of A.I. companies now extends well beyond the realm of business.</p><p><strong>TechPolicy.Press:</strong> <a href="https://www.techpolicy.press/the-myth-of-agi/">The Myth of AGI </a></p><p>Alex Hanna and Emily M. 
Bender write that claims of "Artificial General Intelligence" are a cover for abandoning the current social contract.</p><p><strong>AI Now Institute:</strong> <a href="https://ainowinstitute.org/publications/research/ai-now-2025-landscape-report">Artificial Power: 2025 Landscape Report</a></p><p>In the aftermath of the &#8220;AI boom,&#8221; this report examines how the push to integrate AI products everywhere grants AI companies - and the tech oligarchs that run them - power that goes far beyond their deep pockets.</p><p><strong>Slate:</strong> <a href="https://slate.com/culture/2025/05/mountainhead-elon-musk-hbo-max-movie-succession.html?pay=1749724942371&amp;support_journalism=please">The New Movie From the Creator of Succession Is Less a Satire Than a Documentary</a></p><p>The new HBO movie hits harder than just about any movie released in theaters this year.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/etp-newsletter-june-2025?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/etp-newsletter-june-2025?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p>]]></content:encoded></item><item><title><![CDATA[Our first month of ETP Fellows]]></title><description><![CDATA[We&#8217;ve had an amazing first month with our inaugural ETP fellows.]]></description><link>https://news.ethicaltechproject.org/p/our-first-month-of-etp-fellows</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/our-first-month-of-etp-fellows</guid><dc:creator><![CDATA[The Ethical Tech Project]]></dc:creator><pubDate>Thu, 15 May 2025 19:34:58 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!XAGG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffebae5a2-b982-4560-a31a-d27c78610496_4000x3000.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>We&#8217;ve had an amazing first month with our inaugural ETP fellows.</p><p>Every Wednesday, our cohort of 18 engineers, product leaders, program managers, designers, and CEO/founders meets to dig into a key topic in ethical tech through activities, lectures and discussion.</p><p>Here&#8217;s what they&#8217;ve been up to so far:</p><ul><li><p>Defining &#8216;ethical tech&#8217; &#8211; and why everyone should care about it</p></li><li><p>Creating their own codes of ethics for their work</p></li><li><p>Learning how to make the business case for ethics in tech and AI</p></li><li><p>Navigating the ever-changing legal landscape</p></li></ul><p>Product manager Bobby Zipp had this to share about his experience in the fellowship so far:</p><p><em>One of the program's greatest strengths is the diversity of experience in the room. Each one of the fellows brings a unique and exceptional story to the table, fueled by a passion for changing the status quo. I have a feeling we're going to stay connected as a cohort long after the conclusion of the program this spring - I can't wait to see what we all accomplish in the future.</em></p><p>Want to find out more about our ETP fellows? 
Follow us <a href="https://www.linkedin.com/company/the-ethical-tech-project/posts/?feedView=all">on LinkedIn</a>.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><p></p><p>Demand for the pilot program surprised even us, with nearly 150 people applying for the 18 available slots. We&#8217;re now looking for ways to share our important curriculum with more tech builders - through additional fellowship cohorts, corporate training, industry workshops, and more. If you or your organization might be interested in learning more, please reach out to us at <a href="http://ethicaltechproject.org">ethicaltechproject.org</a>.</p><p>With gratitude,</p><p>Jennie &amp; Nancy</p><p></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/our-first-month-of-etp-fellows?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/our-first-month-of-etp-fellows?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!XAGG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffebae5a2-b982-4560-a31a-d27c78610496_4000x3000.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!XAGG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffebae5a2-b982-4560-a31a-d27c78610496_4000x3000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!XAGG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffebae5a2-b982-4560-a31a-d27c78610496_4000x3000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!XAGG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffebae5a2-b982-4560-a31a-d27c78610496_4000x3000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!XAGG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffebae5a2-b982-4560-a31a-d27c78610496_4000x3000.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!XAGG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffebae5a2-b982-4560-a31a-d27c78610496_4000x3000.jpeg" width="4000" height="3000" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/febae5a2-b982-4560-a31a-d27c78610496_4000x3000.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:3000,&quot;width&quot;:4000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2412378,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://news.ethicaltechproject.org/i/163642071?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3c408813-6f9d-4606-ab8a-7d2ecb8316fd_4000x3000.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!XAGG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffebae5a2-b982-4560-a31a-d27c78610496_4000x3000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!XAGG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffebae5a2-b982-4560-a31a-d27c78610496_4000x3000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!XAGG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffebae5a2-b982-4560-a31a-d27c78610496_4000x3000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!XAGG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffebae5a2-b982-4560-a31a-d27c78610496_4000x3000.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><div><hr></div><p><strong>Here&#8217;s what we&#8217;ve been reading and listening to, and you should be too:</strong></p><p>Our own Jennie Baird writes in The Hill about <a href="https://thehill.com/opinion/finance/5228017-surveillance-pricing-data-consumers/">how your personal data impacts the prices you pay. <br></a></p><p>ETP Fellow Teresa Datta co-authored <a href="https://arxiv.org/pdf/2405.03855">a white paper on the increasing prioritization of responsible artificial intelligence</a>.</p><p>Pew Research Center released the report, <a href="https://www.pewresearch.org/internet/2025/04/03/how-the-us-public-and-ai-experts-view-artificial-intelligence/">How the U.S. Public and AI Experts View Artificial Intelligence</a>. 
Here&#8217;s a finding that stood out to us: &#8220;Public optimism is low regarding AI&#8217;s impact on work. While 73% of AI experts surveyed say AI will have a very or somewhat positive impact on how people do their jobs over the next 20 years, that share drops to 23% among U.S. adults.&#8221;</p><p>Sidney Lake writes in Fortune about <a href="https://fortune.com/2025/05/13/openai-ceo-sam-altman-says-gen-z-millennials-use-chatgpt-like-life-adviser/">how college students are turning to ChatGPT for life advice</a>, but the jury&#8217;s still out on how safe that is for users.</p><p>Julia Angwin writes in The New York Times about <a href="https://www.nytimes.com/2025/04/30/opinion/musk-doge-data-ai.html">the frightening ways DOGE is creating a surveillance state</a>.</p><p>Radhika Rajkumar writes in ZDNET about <a href="https://www.zdnet.com/article/anthropic-mapped-claudes-morality-heres-what-the-chatbot-values-and-doesnt/">what Anthropic found when it tried to analyze the morality matrix of its chatbot Claude</a>. (It&#8217;s mostly good news.)</p><p>Sam Biddle writes in The Intercept about <a href="https://theintercept.com/2025/04/03/google-cbp-ai-border-surveillance-ibm-equitus/">the critical role Google and AI are playing as U.S. 
Customs and Border Protection upgrades its surveillance tech.</a></p><p>Kyle Chayka describes in The New Yorker <a href="https://www.newyorker.com/culture/infinite-scroll/the-limits-of-ai-generated-miyazaki">how GPT-4o led to a slew of Studio Ghibli memes</a> &#8212; and what the new tool means for the future of artists, audiences and aesthetics.</p><p>Writing for The Guardian, Luke Barratt and Costanza Gambarini (with data graphics by Andrew Witherspoon and Aliya Uteuova) expose <a href="https://www.theguardian.com/environment/2025/apr/09/big-tech-datacentres-water?CMP=Share_AndroidApp_Other">how Amazon, Microsoft and Google are building datacenters that put the local water supply at risk</a>.</p><p>And if you&#8217;ve got a long car ride ahead of you &#8212; the podcast series <a href="https://podcasts.apple.com/us/podcast/otherwise-objectionable/id1798723661">Otherwise Objectionable</a> is the best history and analysis of Section 230 we&#8217;ve found yet! <br><br></p>]]></content:encoded></item><item><title><![CDATA[Hello from the Ethical Tech Project]]></title><description><![CDATA[It's been a while - here's what's happening.]]></description><link>https://news.ethicaltechproject.org/p/hello-from-the-ethical-tech-project</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/hello-from-the-ethical-tech-project</guid><dc:creator><![CDATA[Jennie]]></dc:creator><pubDate>Thu, 27 Mar 2025 17:09:29 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!qA0w!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Hi from the Ethical Tech Project! It&#8217;s been a minute since we&#8217;ve been in touch, and let&#8217;s be real, there&#8217;s a lot going on. 
Many people in our network are overwhelmed by the news and wondering how we got here &#8212; to a world of government-endorsed data breaches; bribery and corruption courtesy of meme coins; the demise of fact-checking on social media feeds; the rapid dismantling of consumer protection and regulatory functions; and real and present concerns about digital monitoring and surveillance. I think about the many times I urged my father to review his privacy settings over the years, and his consistent response that he had nothing to hide. Like him, most of us never thought we had anything to hide, but now we might wonder.</p><p>It didn&#8217;t have to be this way, and yet here we are. While tech and AI solutions have brought profound benefits to society, they have also brought equally profound harms, not least among them the undermining of trust in democratic institutions and increasing societal polarization.</p><p>Here at The Ethical Tech Project, we&#8217;ve been honing our focus and mission. 
Our big audacious vision is to transform the tech innovation ethos from &#8220;move fast and break things&#8221; to &#8220;think first, then make things - things that are better for individuals, for your business, and for society more broadly.&#8221;</p><p>To that end, on April 2, we will be kicking off &#8220;The Ethical Tech Fellowship,&#8221; a program designed to educate, convene, and equip tech builders so they can make better decisions when designing, developing, and bringing to market tech- and AI-driven solutions.</p><p>Our pilot cohort of 18 engineers, product leaders, program managers, designers, and CEO/founders was selected from an incredibly competitive and impressive applicant pool of nearly 150.</p><p>We&#8217;ll keep you posted on the progress of our fellowship and introduce you to some of the participants in the weeks and months ahead. In the meantime, we&#8217;ll be heads down working to scale the program so we can meet the moment. If you are interested in helping us grow, please get in touch.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!qA0w!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!qA0w!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg 424w, https://substackcdn.com/image/fetch/$s_!qA0w!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg 848w, https://substackcdn.com/image/fetch/$s_!qA0w!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!qA0w!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!qA0w!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg" width="1456" height="976" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:976,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1520894,&quot;alt&quot;:&quot;Gregory DiSalvo, Getty 
Images&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://news.ethicaltechproject.com/i/159998574?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Gregory DiSalvo, Getty Images" title="Gregory DiSalvo, Getty Images" srcset="https://substackcdn.com/image/fetch/$s_!qA0w!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg 424w, https://substackcdn.com/image/fetch/$s_!qA0w!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg 848w, https://substackcdn.com/image/fetch/$s_!qA0w!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!qA0w!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccbb5f2d-95fb-43ee-89bf-b598ff6d307b_2115x1418.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 
2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><div><hr></div><h1><strong>What We&#8217;re Reading on Ethical Tech This Week</strong></h1><p>Every week, we round up the latest in Ethical Tech. Subscribe now and also get our monthly digest, the <strong>Ethical Tech News Roundup</strong>!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><ul><li><p>WIRED - <a href="https://www.wired.com/story/doge-is-the-deep-state/">DOGE Is the Deep State</a></p><ul><li><p>A shadowy group of unelected figures reshaping the federal government to their own benefit from the inside? 
Sounds familiar!</p></li></ul></li><li><p>Reuters - <a href="https://www.reuters.com/technology/meta-ends-third-party-fact-checking-program-adopts-x-like-community-notes-model-2025-01-07/">Meta shelves fact-checking in policy reversal ahead of Trump administration</a></p><ul><li><p>In January, Meta scrapped its fact-checking program, marking the biggest overhaul of its approach to political content in recent memory.</p></li></ul></li><li><p>CNN - <a href="https://www.cnn.com/2025/02/28/business/crypto-mogul-trump-coins-civil-fraud-charges/index.html">A crypto mogul who invested millions into Trump coins is getting a reprieve on civil fraud charges</a></p><ul><li><p>A businessman who pumped $75 million into the Trump family-backed crypto token finds himself in a fortunate position this week as federal securities regulators are hitting pause on their civil fraud case against him.</p></li></ul></li><li><p>The National Law Review - <a href="https://natlawreview.com/article/texas-ag-sues-allstate-violations-texas-privacy-law-first-enforcement-action-under#google_vignette">Texas AG Sues Allstate for Violations of Texas Privacy Law in First Enforcement Action Under a State Comprehensive Data Privacy Law</a></p><ul><li><p>In case you missed it in January, the Texas AG sued Allstate. 
</p></li></ul></li><li><p>BBC - <a href="https://www.bbc.co.uk/aboutthebbc/documents/bbc-research-into-ai-assistants.pdf">Representation of BBC News content in AI Assistants</a></p><ul><li><p>AI assistants risk misleading audiences by distorting BBC journalism.</p></li></ul></li></ul>]]></content:encoded></item><item><title><![CDATA[Post-Election Q&A with Alysa Hutnik]]></title><description><![CDATA[A breakdown on what the upcoming Republican administration means for Ethical Tech.]]></description><link>https://news.ethicaltechproject.org/p/post-election-q-and-a-with-alysa</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/post-election-q-and-a-with-alysa</guid><dc:creator><![CDATA[The Ethical Tech Project]]></dc:creator><pubDate>Tue, 19 Nov 2024 14:01:06 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Hbx5!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98295c77-6e45-4ded-98af-e234cb2bd940_256x256.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><strong>ETP:</strong> The GOP has won both the White House and Senate, with the House still undecided but leaning Republican. What does this shift in leadership mean for privacy legislation and enforcement?</p><p><strong>Alysa Hutnik:</strong> It&#8217;s the question on everyone&#8217;s mind, really. To start, we need to look at committee leadership. <a href="https://mcmorris.house.gov/">Cathy McMorris Rodgers</a>, who chaired the Energy and Commerce Committee and <a href="https://energycommerce.house.gov/posts/chair-rodgers-statement-on-the-american-privacy-rights-act">co-sponsored APRA</a>, didn&#8217;t seek re-election. 
If Republicans take the House, we don&#8217;t yet know who will replace her, but there&#8217;s significant pressure from <a href="https://www.uschamber.com/major-initiative/data-privacy#:~:text=The%20U.S.%20Chamber%20supports%20a,regulatory%20certainty%20in%20the%20marketplace.">the Chamber of Commerce for comprehensive federal privacy legislation</a>. This push for federal regulation is unlikely to lose momentum just because the GOP is now in power. In fact, it might drive more Republicans to support it.</p><p>In the Senate, Texas Senator Ted Cruz is now the ranking member of the Senate Commerce, Science, and Transportation Committee. <a href="https://www.texasattorneygeneral.gov/consumer-protection/file-consumer-complaint/consumer-privacy-rights/texas-data-privacy-and-security-act">Texas has its own comprehensive privacy law</a>, so Cruz could be motivated to use it as a model for a national standard. However, passing this through committee would require bipartisan support. A federal privacy law will need compromise on both sides&#8212;something we haven&#8217;t seen fully materialize yet.</p><p><strong>ETP:</strong> When state privacy laws first emerged, there was a tendency to generalize <a href="https://pirg.org/edfund/resources/state-privacy-laws/">that red states had weaker regulations</a>. You mentioned Texas as an exception. Would you say it&#8217;s a strong law?</p><p><strong>Alysa Hutnik:</strong> There are two main ways to measure impact: the strength of the law itself and the state&#8217;s enforcement level. Texas&#8217;s privacy law is both robust and comprehensive, and the state&#8217;s Attorney General has built a large team to enforce it. From what I&#8217;ve seen personally, Texas is actively enforcing this law, possibly even rivaling California in terms of investigation volume.</p><p><strong>ETP:</strong> A major issue on the table now is <a href="https://iapp.org/news/a/ceiling-or-floor-state-law-preemption-and-preservation-in-u-s-federal-privacy-bills">preemption</a>. If Cruz and the GOP manage to pass a federal privacy law based on Texas&#8217;s model, would that override existing frameworks like California&#8217;s?</p><p><strong>Hutnik:</strong> Preemption is a major issue. The Chamber of Commerce argues that a federal law would simplify things, reducing the compliance burden of a state-by-state patchwork, especially for smaller businesses. So yes, preemption will likely be a focal point in these discussions.</p><p><strong>ETP:</strong> What about the intersection of privacy and AI, particularly with personal data used to train AI models?</p><p><strong>Hutnik:</strong> AI stands apart somewhat. 
<a href="https://www.theverge.com/2024/11/12/24294483/donald-trump-ai-data-center-epa-lee-zeldin">The new administration has a strong pro-business, pro-innovation stance, especially given Elon Musk&#8217;s influence.</a> I doubt we&#8217;ll see significant regulation on AI anytime soon. However, with a spotlight on content moderation and censorship&#8212;both driven by algorithmic decisions&#8212;there could still be scrutiny around how AI impacts these areas. But I wouldn&#8217;t expect sweeping FTC regulations, given that Republican-led FTCs historically take a lighter approach.</p><p><strong>ETP:</strong> Speaking of the FTC, Lina Khan&#8217;s term ends in September. Is her departure inevitable?</p><p><strong>Alysa Hutnik:</strong> Traditionally, a change in administration prompts the FTC chair to resign before inauguration. Musk isn&#8217;t a fan of Khan, and neither is the Chamber, although J.D. Vance does support her. That probably won&#8217;t be enough to keep her in place. The main questions are when she&#8217;ll resign and who will replace her. It took a while for the previous Trump administration to make appointments, but this time the transition seems more organized. I anticipate that whoever is appointed will focus on antitrust and deal-making over consumer protection.</p><p><strong>ETP:</strong> Lastly, what&#8217;s in store for the FTC&#8217;s stance on surveillance capitalism?</p><p><strong>Alysa Hutnik:</strong> I think serious scrutiny of surveillance capitalism is unlikely. Targeted advertising might still see some attention, especially in health. If we look at past FTC actions under Trump, cases like Vizio (smart TV viewing data) and Flo (menstrual tracking data) were priorities. So privacy issues won&#8217;t disappear&#8212;they&#8217;ll just have a different focus. It&#8217;s going to be an interesting few years. 
Expect some interesting developments&#8212;we&#8217;re definitely in a &#8220;grab your popcorn&#8221; kind of scenario.</p><div><hr></div><h1><strong>What We&#8217;re Reading on Ethical Tech This Week</strong></h1><p>Every week, we round up the latest in Ethical Tech. 
Subscribe now and also get our monthly digest, the <strong>Ethical Tech News Roundup</strong>!</p><ul><li><p>Forbes - <a href="https://www.forbes.com/sites/danidiplacido/2024/11/16/coca-colas-ai-generated-ad-controversy-explained/">Coca Cola&#8217;s AI-Generated Ad Controversy, Explained</a></p><ul><li><p>This year, Coca-Cola&#8217;s Christmas ads are AI-generated and deeply uncanny, sparking online backlash from those who claimed the magic had been lost.</p></li></ul></li><li><p>The Drum - <a href="https://www.thedrum.com/news/2024/10/25/adland-alert-linkedin-pinterest-come-under-legal-fire-ad-tracking-practices">Adland on alert as LinkedIn &amp; Pinterest come under legal fire for ad tracking practices</a></p><ul><li><p>Both platforms have been accused of significant GDPR violations this week over their advertising practices. But experts say that steep fines may not be enough to deter bad behavior &#8212; and that more intensive remedies are needed.</p></li></ul></li><li><p>Newsweek - <a href="https://www.newsweek.com/what-will-trumps-new-term-mean-ai-1982615">What Will Trump&#8217;s New Term Mean for A.I.?</a></p><ul><li><p>Donald Trump&#8217;s return to office has raised questions over potential changes to artificial intelligence policy in the United States.</p></li></ul></li></ul>]]></content:encoded></item><item><title><![CDATA[Has generative AI made our best privacy principles obsolete? 
]]></title><description><![CDATA[The world&#8217;s foundational privacy guidelines weren&#8217;t written for the generative AI era.]]></description><link>https://news.ethicaltechproject.org/p/has-generative-ai-made-our-best-privacy</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/has-generative-ai-made-our-best-privacy</guid><dc:creator><![CDATA[JJ]]></dc:creator><pubDate>Tue, 01 Oct 2024 13:30:12 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Sf1i!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcbfb566f-640c-4c85-9016-90de2ce7fbf1_1520x800.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Note: <a href="https://statescoop.com/generative-ai-privacy-principles-obsolete-oecd/">This editorial previously appeared in StateScoop</a></em></p><p></p><p>The Organization for Economic Cooperation and Development&#8217;s <a href="http://oecdprivacy.org/">privacy principles</a> were, in many ways, a masterpiece. Written in 1980, they set out clear and specific guidelines that gave rise to the first comprehensive privacy laws. Core concepts such as accountability, transparency, and data security all spring from this source: if these ideas seem obvious today, it&#8217;s a reflection of how deeply ingrained the OECD&#8217;s guidelines are in the way that both consumers and businesses think about data privacy.</p><p>Look at <a href="https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0449">the OECD&#8217;s principles on AI use</a>, which were updated last year, and you&#8217;ll see a very different story playing out. Compared with the precise language and clear principles in the 1980 privacy guidelines, the AI principles are almost comically hesitant. 
They offer only the broadest of goals &#8212; encouraging the creation of &#8220;trustworthy AI&#8221; that doesn&#8217;t actively break the law, for instance &#8212; with little or no guidance on how to enact them, or even judge whether they&#8217;ve been achieved.</p><p>Its recommendations are hedged with the repeated proviso that they only apply when &#8220;appropriate to the context and consistent with the state of the art&#8221; &#8212; a shrugging acknowledgement that AI is so new, and evolving so fast, that any specific principles would be outdated before the ink was dry.</p><p>The contrast between these two sets of guidelines couldn&#8217;t be clearer. When the privacy principles were adopted, the OECD could essentially ignore technical challenges and simply articulate how data should or shouldn&#8217;t be treated. In the AI era, though, tech is evolving so fast that <em>any</em> concrete guidelines risk becoming immediately obsolete. Like other regulatory groups, the OECD decided its only option was to articulate broad values, rather than give clear guidance on how they should be implemented.</p><p>Like it or not, we live in the AI era, and that&#8217;s complicating data privacy just as much as it&#8217;s complicating efforts to regulate AI itself. As we think about the future of privacy in an AI world, it&#8217;s important to ask: are the OECD&#8217;s privacy principles still relevant? Or are they becoming obsolete &#8212; a regulatory weapon from <a href="https://www.youtube.com/watch?v=vQA5aLctA0I">a more civilized age</a>, before AI made a mockery of attempts to pass clear and prescriptive recommendations?</p><p>The OECD&#8217;s privacy principles are grounded in the idea that every single bit of personal data is unique, important and deserving of individual protection. 
From my bank information to my facial biometrics, the idea goes, my information is <em>mine</em> and needs to be handled with the utmost caution and respect.</p><p>For AI innovators, however, the world looks very different. Instead of worrying about individual droplets of data, AI concerns itself with the swirling tides and currents of the entire ocean. My data may not matter much at all, on an individual level: What matters is data in aggregate and the patterns and signals that can be coaxed out of vast datasets.</p><p>That simple distinction lays waste to the OECD&#8217;s privacy principles.</p><p>The collection limitation principle calls for restraint in the collection of personal data, and for data to only be collected with the subject&#8217;s knowledge and consent, while the data quality principle says we should only collect the data needed to achieve our goals. 
AI depends, however, on indiscriminately ingesting vast amounts of data, including colossal quantities of public data scraped without notifying anyone.</p><p>The purpose specification principle says we should disclose our goals up front, and the use limitation principle says personal data shouldn&#8217;t be used for purposes other than those specified during collection. But AI depends on collecting data <em>first</em> and then figuring out what&#8217;s possible to do with it &#8212; so unless we&#8217;re willing to accept &#8220;doing AI stuff&#8221; as a legitimate purpose, both these principles go out the window.</p><p>The security safeguards principle says personal data should be protected against loss or unauthorized access, and the individual participation principle says individuals have the right to know what&#8217;s being done with their data and to have their data amended or deleted. But AI models embed data in their own algorithms in ways that can&#8217;t simply be disclosed or deleted.&nbsp;</p><p>The openness principle says developers should work openly and transparently, and the accountability principle says a data controller should be accountable for enacting all the foregoing principles. In an era of black-box algorithms, where not even the developer really knows what&#8217;s going on under the hood, both these principles are virtually impossible to implement in any meaningful way.</p><p>Regulating AI using the OECD&#8217;s existing privacy principles will be about as effective as using traffic laws to halt a supernova. Case in point: The French Data Protection Authority got <a href="https://www.jdsupra.com/legalnews/french-data-protection-authority-7334230/">tied up in knots</a> recently by trying to argue that data minimization doesn&#8217;t preclude training AI models on big datasets &#8212; but <em>does </em>still require developers to avoid feeding &#8220;unnecessary&#8221; personal data into AI systems. 
That brings us straight back to the core question of how, exactly, AI developers can know in advance whether data was necessary or not.</p><p>Because the privacy industry is on a collision course with a technological and economic juggernaut, clinging too hard to the OECD&#8217;s principles could wind up doing more harm than good. If privacy advocates commit to a framework that&#8217;s fundamentally incompatible with the AI revolution, the entire framework could be left behind. If we force people to choose between privacy and AI, we could wind up with AI and no privacy.</p><p>That would be a disaster, because the values underpinning the OECD&#8217;s principles remain incredibly important. Precisely because AI radically complicates everything, and makes the OECD&#8217;s privacy principles virtually impossible to enforce as written, the underlying ideals of transparency and dignity and fairness and agency are needed more than ever. (Data privacy is dead &#8212; long live data privacy!)</p><p>I don&#8217;t have a satisfying solution to offer &#8212; like you, I&#8217;m just one more drop in the ocean. But I believe this is the conversation our industry needs to be having if we want to avoid sacrificing data privacy on the altar of AI innovation. 
Pretending that old approaches to data privacy are enough will backfire fast: if we want privacy to endure in the AI era, we need to get serious and start rethinking the foundational ideas on which modern privacy infrastructure is built.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Sf1i!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcbfb566f-640c-4c85-9016-90de2ce7fbf1_1520x800.webp" data-component-name="Image2ToDOM"><img src="https://substackcdn.com/image/fetch/$s_!Sf1i!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcbfb566f-640c-4c85-9016-90de2ce7fbf1_1520x800.webp" width="1456" height="766" class="sizing-normal" alt="" loading="lazy"></a></figure></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/has-generative-ai-made-our-best-privacy?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/has-generative-ai-made-our-best-privacy?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><div><hr></div><h1><strong>What We&#8217;re Reading on Ethical Tech This Week</strong></h1><p>Every week, we round up the latest in Ethical Tech. 
Subscribe now and also get our monthly digest, the <strong>Ethical Tech News Roundup</strong>!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><ul><li><p>The Washington Post - <a href="https://www.washingtonpost.com/technology/2024/09/29/ai-veto-california-regulation/">California Gov. Newsom vetoes AI bill in a win for Big Tech</a></p><ul><li><p>Tech executives and investors opposed the measure, which would have required companies to test the most powerful AI systems before release.</p></li></ul></li><li><p>The Economist - <a href="https://www.economist.com/business/2024/09/29/ai-and-globalisation-are-shaking-up-software-developers-world">AI and globalisation are shaking up software developers&#8217; world</a></p><ul><li><p>Their code will get cheaper. So might they.</p></li></ul></li><li><p>Forbes - <a href="https://www.forbes.com/sites/emilsayegh/2024/09/30/the-billion-dollar-ai-gamble-data-centers-as-the-new-high-stakes-game/">The Billion-Dollar AI Gamble: Data Centers As The New High-Stakes Game</a></p><ul><li><p>AI data centers are an extremely expensive investment &#8212; and it&#8217;s uncertain whether it will pay off. 
</p></li></ul></li></ul>]]></content:encoded></item><item><title><![CDATA[Surveillance Pricing Hurts Everyone in the Long Run: No One Wins]]></title><description><![CDATA[Dynamic pricing isn't limited to online platforms&#8212;it's expanding into physical retail as well.]]></description><link>https://news.ethicaltechproject.org/p/surveillance-pricing-hurts-everyone</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/surveillance-pricing-hurts-everyone</guid><dc:creator><![CDATA[Raashee Gupta Erry]]></dc:creator><pubDate>Tue, 17 Sep 2024 13:31:14 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!woyn!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab2e99d7-5f2d-47f2-bca4-9384d8c9fbe1_1125x750.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The rapid growth of data-driven technologies has introduced surveillance pricing&#8212;where companies adjust prices based on individual consumer data. Unlike surge pricing from apps like Uber, which fluctuates with demand, surveillance pricing targets individuals based on their personal data. This shift transforms dynamic pricing from a tool for managing supply and demand to a form of dystopian personalization, where consumers may pay more due to personal habits, income, or impulsivity. With increasing scrutiny from the<a href="https://www.ftc.gov/news-events/news/press-releases/2024/07/ftc-issues-orders-eight-companies-seeking-information-surveillance-pricing"> Federal Trade Commission (FTC)</a> and lawmakers like<a href="https://finance.yahoo.com/news/elizabeth-warren-just-accused-kroger-192914486.html"> Senator Elizabeth Warren</a>, the detrimental effects on fairness, privacy, and economic equality are becoming clearer. 
In the end, no one benefits&#8212;neither consumers, who face unfair pricing, nor businesses, which risk losing customer trust.</p><p><strong>Surveillance Pricing: What Is It?</strong></p><p>Surveillance pricing involves using real-time consumer data to set individualized prices. Companies gather information such as browsing history, demographics, location, and credit history to estimate how much a consumer is willing to pay. Technologies like AI and advanced algorithms allow businesses to adjust prices dynamically based on this data, often charging different rates for the same product. Marketed under names like price optimization, predictive pricing, and personalized pricing, this practice often results in unfair price discrimination. The FTC is investigating this issue, focusing on a "<a href="https://www.adexchanger.com/marketers/the-ftc-orders-companies-to-disclose-info-on-surveillance-pricing/">shadowy ecosystem of pricing middlemen</a>" and has sought information from companies like Mastercard, JPMorgan Chase, McKinsey &amp; Co., and Accenture to understand its impact on privacy, competition, and consumer protection.</p><p><strong>Why the Hype?</strong></p><p>Dynamic pricing isn't limited to online platforms&#8212;it's expanding into physical retail as well. Senator Elizabeth Warren has accused Kroger, a major U.S. supermarket chain, of potential price gouging using electronic shelf labels (ESLs). These labels enable Kroger to adjust prices in real-time, potentially creating "surge pricing" similar to Uber's fare increases during high demand. While initially marketed as a consumer-friendly feature, there are concerns that Kroger&#8217;s ESLs might lead to opportunistic price hikes, such as higher ice cream prices on hot days. 
Warren&#8217;s letter compares this to previous issues like Orbitz charging Mac users higher prices and Staples showing different prices based on location.</p><p><strong>Devils in the Data: What Is Causing This?</strong></p><p>Surveillance pricing relies on extensive data collection, often without full consumer consent. Data brokers, digital platforms, and other intermediaries gather information from various sources, including browsing history and purchasing patterns, to feed algorithms that set prices. <a href="https://www.hbs.edu/ris/Publication%20Files/22-050_ec28aaca-2b94-477f-84e6-e8b58428ba43.pdf">A Harvard study suggests that higher prices can arise from the automated nature of algorithms, impacting any market where firms price algorithmically.</a> The FTC&#8217;s inquiry focuses on key questions: What data are companies using to set prices? How is it collected? Who is buying these services, and how does it affect consumers?</p><p><strong>Economic Inequality: Who Pays the Price?</strong></p><p>While personalized pricing may seem beneficial, it often exacerbates economic inequality. Higher-income consumers may receive discounts, while those with less disposable income might face higher prices. 
Surveillance pricing can further entrench these inequalities, much as credit scores affect loan rates or zip codes affect insurance premiums. Marginalized communities, including people of color and low-income individuals, are particularly vulnerable to higher prices due to algorithmic biases, as seen in allegations against rideshare apps like Uber.</p><p><strong>Privacy Risks: How Much Do They Know?</strong></p><p>One of the biggest concerns with surveillance pricing is the lack of transparency in data collection. Consumers often unknowingly provide data through cookie agreements and lengthy terms of service. This data&#8212;ranging from location to credit scores&#8212;fuels pricing algorithms and raises privacy issues. According to the IAPP Consumer Trust Study, <a href="https://iapp.org/media/pdf/resource_center/usa_consumer_trust_infographic.pdf">70% of consumers decided against making an online purchase because of privacy concerns</a>.</p><p><strong>Implications for Business: What&#8217;s at Stake?</strong></p><p>Surveillance pricing doesn&#8217;t just harm consumers. Businesses that engage in this practice may see short-term gains, but they risk long-term damage to their brand reputation and consumer trust. When consumers feel manipulated or unfairly treated, businesses lose loyal customers and face public backlash.</p><p><strong>Regulatory Actions: What May Come Next?</strong></p><p>Increased regulatory scrutiny could impact personalized pricing models. The FTC&#8217;s investigation using its 6(b) authority could lead to new laws or regulations limiting surveillance pricing practices. 
This potential regulation could affect both current practitioners and the broader retail and e-commerce sectors.</p><p><strong>A Call for Privacy-First, Fair-Pricing Models</strong></p><p>To protect consumer privacy and foster a fair marketplace, businesses should:</p><ul><li><p>Opt-in Data Collection: Seek explicit consumer consent before collecting personal data.</p></li><li><p>Fair Pricing Models: Shift to subscription-based or fixed-rate pricing to avoid unfair charges based on personal data.</p></li><li><p>Transparency in Algorithms: Disclose how pricing algorithms work to ensure transparency and fairness.</p></li></ul><p>Surveillance pricing might boost short-term profits, but it comes with significant long-term costs. The FTC&#8217;s investigation highlights the opaque and problematic nature of this practice. From deepening economic inequality to raising serious privacy concerns, surveillance pricing ultimately harms everyone. Businesses must prioritize consumer trust, data privacy, and fair pricing to ensure a more transparent and equitable marketplace.</p><p><em>(For more reading on the subject of surveillance pricing, you might be interested in <a href="https://www.linkedin.com/posts/jonathanjoseph1_the-federal-trade-commission-is-asking-a-activity-7237887276788113408-37mg?utm_source=share&amp;utm_medium=member_desktop">this LinkedIn post from ETP&#8217;s JJ.</a>)</em></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!woyn!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab2e99d7-5f2d-47f2-bca4-9384d8c9fbe1_1125x750.jpeg" data-component-name="Image2ToDOM"><img src="https://substackcdn.com/image/fetch/$s_!woyn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fab2e99d7-5f2d-47f2-bca4-9384d8c9fbe1_1125x750.jpeg" width="1125" height="750" class="sizing-normal" alt="" loading="lazy"></a></figure></div><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/surveillance-pricing-hurts-everyone?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Tech Project! This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/surveillance-pricing-hurts-everyone?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/surveillance-pricing-hurts-everyone?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><div><hr></div><h1><strong>What We&#8217;re Reading on Ethical Tech This Week</strong></h1><p>Every week, we round up the latest in Ethical Tech. 
Subscribe now and also get our monthly digest, the <strong>Ethical Tech News Roundup</strong>!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><p></p><ul><li><p>Elizabeth Warren - <a href="https://www.warren.senate.gov/newsroom/press-releases/warren-casey-investigate-krogers-use-of-digital-price-tags-warn-of-grocery-giants-surge-pricing-causing-price-gouging-and-hurting-consumers">Warren, Casey Investigate Kroger&#8217;s Use of Digital Price Tags, Warn of Grocery Giant&#8217;s &#8220;Surge Pricing&#8221;  Causing Price Gouging and Hurting Consumers</a></p><ul><li><p>Senator Elizabeth Warren is focused on investigating Kroger&#8217;s potential use of surveillance pricing. </p></li></ul></li><li><p>FTC - <a href="https://www.ftc.gov/news-events/news/press-releases/2024/07/ftc-issues-orders-eight-companies-seeking-information-surveillance-pricing">FTC Issues Orders to Eight Companies Seeking Information on Surveillance Pricing</a></p><ul><li><p>Agency seeks information about products and services that use personal data, including finances and browser history, to set individualized prices for the same goods or services.</p></li></ul></li></ul><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/surveillance-pricing-hurts-everyone?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" 
href="https://news.ethicaltechproject.org/p/surveillance-pricing-hurts-everyone?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[ Apple Must Convince Us to Trust AI With Our Data]]></title><description><![CDATA[Techno-wizardry could help keep our data safe, but it won&#8217;t eliminate the need to get the basics right.]]></description><link>https://news.ethicaltechproject.org/p/apple-must-convince-us-to-trust-ai</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/apple-must-convince-us-to-trust-ai</guid><dc:creator><![CDATA[JJ]]></dc:creator><pubDate>Fri, 06 Sep 2024 13:31:01 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!m2QN!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff312e150-a7bf-4be8-a15f-4cfeff478b2c_2400x1600.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Techno-wizardry could help keep our data safe, but it won&#8217;t eliminate the need to get the basics right. </p><p>Apple recently announced its first foray into the wild and wonderful world of AI, and it&#8217;s hoping to convince us that &#8220;Apple Intelligence&#8221; can solve the enduring privacy challenges associated with AI technologies. </p><p>Apple&#8217;s plan involves a slew of technological fixes designed to unlock the value of AI without putting people&#8217;s data at risk. Most of Apple&#8217;s AI processing will take place on-device, theoretically reducing the risk of data being leaked by or absorbed into AI models. 
More complex AI calculations, however, will still take place in the cloud, via an encrypted mechanism that Apple is calling &#8220;Private Cloud Compute.&#8221; </p><p>Effectively, Private Cloud Compute offers an arm&#8217;s-length alternative to conventional cloud-based AI: some subset of your data drifts into the cloud, but it isn&#8217;t stored there or passed to third parties. Apple says only specified AI models will be able to unlock the user&#8217;s data, and that the security of its AI infrastructure can be verified by independent security researchers.</p><p>That&#8217;s all well and good, but the reality is that only a tiny proportion of Apple customers will have any idea whether Apple&#8217;s privacy system is really working as advertised. The technology may well be just as effective as Apple says &#8211; but whether consumers accept it will depend, quite simply, on whether or not they trust the tech giant to do right by them. </p><h2><strong>The Rise of Techno-Wizardry</strong></h2><p>Apple isn&#8217;t the only company offering technological fixes for AI-related privacy concerns. Many tech companies are holding out synthetic data as a privacy solution: instead of feeding your data into AI models, they argue, it&#8217;s possible to use simulated datasets that are statistically similar to the real thing, but not directly traceable back to any individual&#8217;s actual data.&nbsp;&nbsp;&nbsp;</p><p>Some important questions remain, however. Researchers say that the more sophisticated synthetic data grows, the more closely it will approximate the data it&#8217;s mimicking &#8211; and the easier it will become to infer actual facts about individuals based on synthetic data. That&#8217;s especially true of &#8220;outlier&#8221; individuals, like a patient with an unusual medical condition or someone with huge amounts of debt. 
Of course, those outliers might be the very people who are most concerned about protecting their privacy.</p><p>This doesn&#8217;t mean synthetic data is a bad idea. In many cases, it may well work exactly as advertised, enabling AI functionality while protecting user privacy. But once again, this is a technological fix that&#8217;s far too complex for individuals to be able to understand.</p><p>When users put their trust in a company that&#8217;s using synthetic data, in other words, they aren&#8217;t actually trusting &#8220;synthetic data.&#8221; Instead, they&#8217;re choosing to trust the company that&#8217;s telling them that synthetic data is sufficient to protect their privacy.</p><p>In just the same way, Apple users won&#8217;t really be making an informed decision about whether Private Cloud Compute is enough to keep them safe; they&#8217;ll just be deciding whether or not Apple itself is sufficiently trustworthy.</p><h2><strong>How Much Do You Trust Apple?</strong></h2><p>Now, I&#8217;m not here to tell you that you shouldn&#8217;t trust Apple with your data. But it isn&#8217;t a given that you should trust Apple, either. 
Apple Intelligence could potentially have access to everything from our text messages to our finances to our physical and mental health, and there are good reasons to question anyone who wants access to that much of our data.</p><p>Certainly, the US Department of Justice (DOJ) believes there&#8217;s ample reason to question the purity of Apple&#8217;s motives. In its <a href="https://www.justice.gov/opa/pr/justice-department-sues-apple-monopolizing-smartphone-markets">antitrust lawsuit</a>, the DOJ accused Apple of using privacy as a marketing stunt and deliberately degrading user privacy &#8211; by enabling app-makers to capture data, say, or encouraging and profiting from data-driven advertising.</p><p>You can agree or disagree with the DOJ&#8217;s allegations. Many of the industry experts I talk to say that Apple does a better job than most companies of prioritizing its users&#8217; privacy and data rights. Others, of course, believe that the company&#8217;s sheer scale (and scale of data collection) demands that it be held to a higher standard.</p><p>Either way, though, we come back to the same question: how can we, as consumers, decide who to trust? Who should we allow to access our intimate personal data &#8211; and how can we assess the claims they make about how it will or won&#8217;t be used?</p><p>The reality is that just as we can&#8217;t make sense of the endless boilerplate in companies&#8217; privacy policies, neither can we make meaningful judgments about the technologies that companies like Apple promise will keep our data safe. Instead, we have to go back to basics, and ask whether these companies are doing right by us and offering us the transparency and control over our data that we&#8217;re entitled to expect.</p><h2><strong>Privacy and Magic</strong></h2><p>The British sci-fi writer Arthur C. Clarke famously said that any sufficiently advanced technology is indistinguishable from magic. 
Between Apple&#8217;s Private Cloud Compute and the rise of synthetic data, it&#8217;s starting to feel like we&#8217;re entering a new era of privacy magic: technological solutions that may well work as intended, but that are simply too complex for non-specialists to understand.&nbsp;</p><p>That isn&#8217;t necessarily a problem. We all use technologies we don&#8217;t fully understand: how many people can explain at a technical level how an LCD screen or a rechargeable battery actually works? But this raises the stakes when it comes to data privacy &#8211; because it means that in order for techno-wizardry to work as a privacy solution, it isn&#8217;t enough for it to be effective. It also has to be trusted, and underpinned by the moral faith and credit of the company that&#8217;s putting it forward.</p><p>Look at it this way: if <a href="https://www.infosecurity-magazine.com/news/ftc-cambridge-analytica-deceived/">Cambridge Analytica</a> told you they&#8217;d devised a private cloud solution for AI, or started selling synthetic datasets derived from your information, would you trust them? In both cases, you might well have some serious concerns &#8211; not based on questions about the underlying tech, but on your opinion of the companies deploying it.&nbsp;&nbsp;</p><p>In the new era of technological privacy fixes, organizations obviously need to build effective data infrastructure. 
But they also need to pay attention to the core principles on which trust is founded: transparency, consumer control and data dignity.</p><p>Organizations that get that right have an opportunity to turn technological advances into drivers of competitive advantage &#8211; while those that fail will get snubbed by consumers, no matter how advanced their privacy technologies become</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!m2QN!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff312e150-a7bf-4be8-a15f-4cfeff478b2c_2400x1600.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!m2QN!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff312e150-a7bf-4be8-a15f-4cfeff478b2c_2400x1600.webp 424w, https://substackcdn.com/image/fetch/$s_!m2QN!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff312e150-a7bf-4be8-a15f-4cfeff478b2c_2400x1600.webp 848w, https://substackcdn.com/image/fetch/$s_!m2QN!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff312e150-a7bf-4be8-a15f-4cfeff478b2c_2400x1600.webp 1272w, https://substackcdn.com/image/fetch/$s_!m2QN!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff312e150-a7bf-4be8-a15f-4cfeff478b2c_2400x1600.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!m2QN!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff312e150-a7bf-4be8-a15f-4cfeff478b2c_2400x1600.webp" width="1456" 
height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f312e150-a7bf-4be8-a15f-4cfeff478b2c_2400x1600.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:351420,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!m2QN!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff312e150-a7bf-4be8-a15f-4cfeff478b2c_2400x1600.webp 424w, https://substackcdn.com/image/fetch/$s_!m2QN!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff312e150-a7bf-4be8-a15f-4cfeff478b2c_2400x1600.webp 848w, https://substackcdn.com/image/fetch/$s_!m2QN!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff312e150-a7bf-4be8-a15f-4cfeff478b2c_2400x1600.webp 1272w, https://substackcdn.com/image/fetch/$s_!m2QN!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff312e150-a7bf-4be8-a15f-4cfeff478b2c_2400x1600.webp 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Nikolas Kokovlis / Getty Images</figcaption></figure></div><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/apple-must-convince-us-to-trust-ai?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Tech Project!
This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/apple-must-convince-us-to-trust-ai?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/apple-must-convince-us-to-trust-ai?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><div><hr></div><h1><strong>What We&#8217;re Reading on Ethical Tech This Week</strong></h1><p>Every week, we round up the latest in Ethical Tech. Subscribe now and also get our monthly digest, the <strong>Ethical Tech News Roundup</strong>!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><ul><li><p>Scientific American - <a href="https://www.scientificamerican.com/article/ai-surveillance-pricing-practices-under-federal-probe/">AI &#8216;Surveillance Pricing&#8217; Could Use Data to Make People Pay More</a></p><ul><li><p>The Federal Trade Commission is studying how companies use consumer data to charge different prices for the same product.</p></li></ul></li><li><p>Martech - <a href="https://martech.org/the-marketers-guide-to-state-data-privacy-laws/">U.S. state data privacy laws: What you need to know</a></p><ul><li><p>Six states have privacy protection laws in effect; Montana's takes effect Oct. 1, and 10 other states' laws will kick in by the end of next year.
Here's what you need to know about them.</p></li></ul></li><li><p>Human Rights Watch - <a href="https://www.hrw.org/news/2024/09/03/720-australian-and-brazilian-children-better-protected-ai-misuse">720 Australian and Brazilian Children Better Protected from AI Misuse</a></p><ul><li><p>Child Data Protection Laws Urgently Needed to Protect All Children</p></li></ul></li><li><p>Euractiv - <a href="https://www.euractiv.com/section/data-privacy/news/dutch-data-protection-watchdog-hits-clearview-ai-with-e30-5-million-fine-for-misusing-facial-recognition-data/">Dutch data protection watchdog hits Clearview AI with &#8364;30.5 million fine for misusing facial recognition data</a></p><ul><li><p>The Dutch Data Protection Authority (DPA) fined Clearview AI &#8364;30.5 million on Tuesday (3 September), for illegally building a database with over 30 billion photos.</p></li></ul></li></ul>]]></content:encoded></item><item><title><![CDATA[For News Media, Consumer Data Privacy Shouldn’t Be A Bad Thing]]></title><description><![CDATA[Consumer data has been a financial backbone of online News Media for years, how can it adapt in a world of increasing data privacy awareness?]]></description><link>https://news.ethicaltechproject.org/p/for-news-media-consumer-data-privacy</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/for-news-media-consumer-data-privacy</guid><dc:creator><![CDATA[JJ]]></dc:creator><pubDate>Tue, 27 Aug 2024 13:28:55 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!v38s!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3ca8d7e9-92f5-418d-bacc-04b97858b557_1024x631.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Note: <a href="https://www.adexchanger.com/the-sell-sider/for-news-media-consumer-data-privacy-shouldnt-be-a-bad-thing/">This editorial previously appeared in adexchanger</a></em></p><p></p><p>Lobbyists representing media organizations 
&#8211; including giants like The New York Times, the Washington Post and CNN &#8211; are <a href="https://www.iab.com/wp-content/uploads/2024/04/IAB-Letter-to-EC-re-April-17-2024-Hearing.pdf">urging lawmakers</a> to water down federal privacy bills like the recently stalled American Privacy Rights Act.&nbsp;</p><p>The planned federal privacy law would crush publishers that rely on targeted advertising, the lobbyists argue, effectively dismantling the free press and Americans&#8217; First Amendment rights.</p><p>This <a href="https://theintercept.com/2022/02/01/surveillance-data-collection-ads-news-media/">isn&#8217;t the first time</a> news outlets have claimed that data privacy is incompatible with journalism, and it&#8217;s easy to see why: Today, most publishers are primarily in the business of selling their readers&#8217; attention to advertisers. News sites monetize users&#8217; data <a href="https://timlibert.me/pdf/LIBERT_BINNS-2019-GOOD_NEWS.pdf">markedly more</a> than non-news websites, with readers of The New York Times tracked by <a href="https://www.nytimes.com/2019/09/18/opinion/data-privacy-tracking.html">around 50 advertisers and data brokers</a> every time they read an article.&nbsp;</p><p>This &#8220;corporate surveillance,&#8221; the term the FTC uses to describe targeted advertising, is at the core of some publishers&#8217; current business models.
That sentiment, shared by regulators and an increasing number of consumers, is driving a growing awareness among publishers that change is needed.&nbsp;</p><p>The key is for publishers to reposition themselves as champions of data dignity.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Tech Project! Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p><p><strong>The rise of data-funded news</strong></p><p>News organizations first began collecting online data after print revenues were decimated by the rise of the internet. Print circulation has <a href="https://www.pewresearch.org/journalism/fact-sheet/newspapers/">plummeted by two-thirds</a> since the mid-1990s, and ad revenues have taken a corresponding dive. Virtually all news outlets that survived the crash did so by tracking their readers and partnering with data brokers to build sprawling programmatic advertising capabilities.&nbsp;</p><p>That&#8217;s problematic for two reasons. First, it leaves news organizations playing catch-up. There&#8217;s no realistic way for publishers to beat giants like Google and Meta at the data-collection game. While tracking your users can generate short-term revenues, in the long run the tech giants will come out on top.
The further behind publishers fall, meanwhile, the more urgently they feel the need to squeeze maximal value &#8211; and maximal data &#8211; from their readers.</p><p>That leads to the second problem: Harvesting data and lobbying against consumer protections cuts against news outlets&#8217; core brand as principled organizations with their readers&#8217; best interests at heart. At a time when <a href="https://news.gallup.com/poll/512861/media-confidence-matches-2016-record-low.aspx">less than a third</a> of Americans trust the news media, taking liberties with readers&#8217; data is a quick way to further sour the core relationship on which news organizations rely.&nbsp;</p><p>The rise of social media has already taken a toll: Despite chasing readers across the social web, news websites currently get <a href="https://memo.co/blog/social-engagement-is-prs-failing-barometer/">less than 1%</a> of their traffic from social media referrals. Now, <a href="https://www.washingtonpost.com/technology/2024/05/13/google-ai-search-io-sge/">AI-powered search tools</a> are throwing a new wrench in the works by allowing consumers to access news without visiting publishers&#8217; sites.</p><p><strong>Put data dignity first</strong></p><p>One option, of course, is to simply acquiesce to the AI takeover: We&#8217;re already seeing some major publishers, including <a href="https://openai.com/index/news-corp-and-openai-sign-landmark-multi-year-global-partnership/">News Corp</a> and <a href="https://www.wsj.com/business/media/openai-to-pay-politico-parent-axel-springer-for-using-its-content-bdc33332?mod=article_inline">Politico parent group Axel Springer</a>, signing lucrative deals trading away their content to OpenAI.&nbsp;</p><p>Another option, though, is for publishers to remember that their core customers aren&#8217;t AI giants, data brokers or advertisers. 
By repositioning themselves as defenders of consumers&#8217; data dignity, news publishers could strengthen that essential relationship.</p><p>Taking a forthright pro-consumer stance on issues like transparency and data controls would make it easier for publishers to attract and retain paying subscribers &#8211; people who don&#8217;t want to get their news from a chatbot and who are willing to pay publishers for reliable, high-quality content.&nbsp;</p><p>Some big publishers are already refocusing on subscription-based business models: Even before partnering with OpenAI, News Corp. got <a href="https://pressgazette.co.uk/media_business/news-corp-subscriptions-new-york-times/">just 16% of revenues</a> from advertising and 44% from subscriptions. The New York Times similarly hit $1B in digital subscription revenues last year, lifting overall revenues even as advertising income fell.&nbsp;</p><p>Other publishers are also reemphasizing the reader relationship. Vox just started <a href="https://www.cnn.com/2024/05/20/media/vox-relaunch-reliable-sources/index.html">offering subscriptions</a>, while the Guardian now nets <a href="https://www.axios.com/2023/11/14/guardian-record-us-reader-revenue">57% of its US revenues</a> &#8211; accounting for one-third of global digital revenues &#8211; from reader donations. Such models, however, depend upon maintaining the trust and confidence of your readers.</p><p><strong>Trust and transparency</strong></p><p>Prioritizing trust and transparency wouldn&#8217;t mean sacrificing ad revenues. The point of data dignity isn&#8217;t that organizations should stop using data; it&#8217;s that they should put consumers in charge of how their data is used and empower them to make more meaningful choices. 
Research shows that consumers are perfectly willing to exchange their data for things they value.</p><p>By winning and retaining loyal subscribers who actively opt in to share more data than they otherwise would, publishers will ultimately wind up with a richer source of high-quality and properly permissioned data derived from readers with whom they have an enduring connection.&nbsp;</p><p>The news media should focus on pushing back against the spread of low-quality ad-driven and AI-generated content. That starts with taking readers&#8217; data rights seriously &#8211; and advocating for the transparency and meaningful data controls that both consumers and publishers deserve.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!v38s!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3ca8d7e9-92f5-418d-bacc-04b97858b557_1024x631.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!v38s!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3ca8d7e9-92f5-418d-bacc-04b97858b557_1024x631.jpeg 424w, https://substackcdn.com/image/fetch/$s_!v38s!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3ca8d7e9-92f5-418d-bacc-04b97858b557_1024x631.jpeg 848w, https://substackcdn.com/image/fetch/$s_!v38s!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3ca8d7e9-92f5-418d-bacc-04b97858b557_1024x631.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!v38s!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3ca8d7e9-92f5-418d-bacc-04b97858b557_1024x631.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!v38s!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3ca8d7e9-92f5-418d-bacc-04b97858b557_1024x631.jpeg" width="1024" height="631" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3ca8d7e9-92f5-418d-bacc-04b97858b557_1024x631.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:631,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:149659,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!v38s!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3ca8d7e9-92f5-418d-bacc-04b97858b557_1024x631.jpeg 424w, https://substackcdn.com/image/fetch/$s_!v38s!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3ca8d7e9-92f5-418d-bacc-04b97858b557_1024x631.jpeg 848w, https://substackcdn.com/image/fetch/$s_!v38s!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3ca8d7e9-92f5-418d-bacc-04b97858b557_1024x631.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!v38s!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3ca8d7e9-92f5-418d-bacc-04b97858b557_1024x631.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/for-news-media-consumer-data-privacy?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thanks for 
reading The Ethical Tech Project! This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/for-news-media-consumer-data-privacy?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/for-news-media-consumer-data-privacy?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><div><hr></div><h1><strong>What We&#8217;re Reading on Ethical Tech This Week</strong></h1><p>Every week, we round up the latest in Ethical Tech. Subscribe now and also get our monthly digest, the <strong>Ethical Tech News Roundup</strong>!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><ul><li><p>AdExchanger -&nbsp;<a href="https://www.adexchanger.com/data-driven-thinking/google-wont-kill-off-cookies-consumers-will-and-thats-how-it-should-be/">Google Won&#8217;t Kill Off Cookies, Consumers Will &#8211; And That&#8217;s How It Should Be</a></p><ul><li><p>Google&#8217;s decision to cancel cookie deprecation may spell the end for third party cookies. 
</p></li></ul></li><li><p>IAPP -&nbsp;<a href="https://iapp.org/news/a/a-view-from-dc-bracing-for-the-state-privacy-consumer-protection-one-two-enforcement-punch">A view from DC: Bracing for the state privacy, consumer protection one-two enforcement punch</a></p><ul><li><p>The future of state privacy laws.</p></li></ul></li><li><p>The Drum -&nbsp;<a href="https://www.thedrum.com/news/2024/08/14/california-s-ai-bill-headed-sparking-fierce-debate-silicon-valley">California&#8217;s AI bill is sparking fierce debate in Silicon Valley</a></p><ul><li><p>California&#8217;s first landmark AI bill is heading for the State Assembly tomorrow. Supporters say it&#8217;s a strong first step towards regulating the technology, but opponents worry it will strangle innovation.</p></li></ul></li><li><p>FastCo -&nbsp;<a href="https://www.fastcompany.com/91175277/general-motors-texas-driver-data-ron-wyden-ed-markey">The next frontier in the battle over data privacy doesn&#8217;t revolve around a keyboard or smartphone. It involves what happens inside your car.</a></p><ul><li><p>The next frontier in the battle over data privacy doesn&#8217;t revolve around a keyboard or smartphone. 
It involves what happens inside your car.&nbsp;</p><p></p><p></p></li></ul></li></ul>]]></content:encoded></item><item><title><![CDATA[Photos of your children are being used to train AI without your permission, and there’s nothing you can do about it]]></title><description><![CDATA[Without legislation to protect consumers from predatory AI scraping, even our most personal information is at risk of becoming training for AI models.]]></description><link>https://news.ethicaltechproject.org/p/photos-of-your-children-are-being</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/photos-of-your-children-are-being</guid><dc:creator><![CDATA[JJ]]></dc:creator><pubDate>Tue, 20 Aug 2024 17:25:54 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!xcCm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F534045f8-c1bb-49ad-9e13-04f5f780a3df_594x330.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Note: <a href="https://thehill.com/opinion/technology/4794388-ai-training-children-photos-privacy-chevron/amp/">This editorial previously appeared The Hill</a></em></p><p>Human Rights Watch just completed a <a href="https://www.hrw.org/news/2024/07/03/australia-childrens-personal-photos-misused-power-ai-tools">sweeping audit of AI training materials</a> and revealed that pictures of children scraped from the internet were used to train models &#8212; without the consent of the children or their families.&nbsp;&nbsp;</p><p>This already isn&#8217;t great, but it gets much worse.&nbsp;&nbsp;</p><p>According to HRW: &#8220;Some children&#8217;s names are listed in the accompanying caption or the URL where the image is stored. In many cases, their identities are easily traceable, including information on when and where the child was at the time their photo was taken.&#8221;</p><p>Did I mention it gets worse? 
Many of the images that were scraped weren&#8217;t publicly available on the internet but were hidden behind privacy settings on popular social media sites.&nbsp;&nbsp;</p><p>In other words, some parents who thought they were doing everything right in sharing images of their kids are about to find out just how wrong they were.&nbsp;&nbsp;</p><p>I&#8217;m not unsympathetic. I&#8217;m from Australia. I live with my wife and kids in the States. There was a time when social media seemed like the perfect vehicle to keep friends and loved ones up to date on my growing family. Ultimately, I realized that I was violating my kids&#8217; privacy &#8212; and that later in life, they might not want these pictures online and available.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Tech Project! Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p><p>Sharenting &#8212; posting information, pictures and stories about your kid&#8217;s life online &#8212; has increasingly been under fire for a lot of very legitimate reasons. A three-year-old can&#8217;t meaningfully consent to their parents sharing their potty training fail video for the world to see.
It might seem like innocent enough fun, but a three-year-old doesn&#8217;t stay three years old forever, and today&#8217;s children will have extensive information about them online well before they&#8217;re of consenting age.&nbsp;&nbsp;</p><p>But aside from a child not being able to consent, HRW&#8217;s report reveals that adult parents have no way of knowing what the long-term implications of sharenting might be. Ten years ago, nobody imagined that the photo album they shared of their family vacation might be ingested into machine learning. There are real unintended consequences already rolling out.&nbsp;&nbsp;</p><p>Of course, a reasonable reading might be that this shouldn&#8217;t be allowed at all. Why do for-profit AI companies have the right to train on anybody else&#8217;s data? Let alone children&#8217;s? Let alone data hidden behind privacy settings?&nbsp;&nbsp;</p><p>Surely the Federal Trade Commission will have something to say about this. Except that, as of last month, the FTC and every other federal agency had its hands tied behind its back when the Supreme Court <a href="https://www.scotusblog.com/2024/06/supreme-court-strikes-down-chevron-curtailing-power-of-federal-agencies/">ruled against the Chevron doctrine</a> &#8212; taking power out of the hands of federal agencies and giving them to the courts.&nbsp;&nbsp;</p><p>&#8220;In one fell swoop, the majority today gives itself exclusive power over every open issue&#8212;no matter how expertise-driven or policy-laden&#8212;involving the meaning of regulatory law,&#8221; <a href="https://www.newsweek.com/supreme-court-scotus-chevron-kagan-dissent-1918938">wrote Justice Elena Kagan</a> in her dissent from the ruling. &#8220;As if it did not have enough on its plate, the majority turns itself into the country&#8217;s administrative czar.&#8221;&nbsp;&nbsp;</p><p>If a federal privacy law wasn&#8217;t cooked before, it&#8217;s certainly cooked now. 
The overwhelming result will be to push privacy legislation back to the states. Meanwhile, federal decisions will stay in limbo as understaffed courts with no special insight on privacy try to wade through a workload that they are neither prepared nor equipped for.&nbsp;&nbsp;</p><p>While we wait, AI will continue scraping kids&#8217; data &#8212; and, ultimately, whether or not that&#8217;s a perfectly legal thing to do will come down to the state you live in.&nbsp;&nbsp;</p><p>Sharing photos of your kid&#8217;s little league game might be a fun way to stay connected to family near and far, but until meaningful protections are in place, it&#8217;s a risk I wouldn&#8217;t advise anybody to take. We deserve data dignity, we deserve ethical technology, we deserve sound and responsible guardrails for AI. At present, we have none of that &#8212; and the Supreme Court&#8217;s decision adds a significant hurdle to ever achieving those things.&nbsp;&nbsp;</p><p>In the meantime, Big Tech has been left to make its own rules.
Perhaps the only way to get their attention is to delete the apps, stop posting and cease feeding the beast.&nbsp;&nbsp;</p><p>State legislators can&#8217;t act fast enough.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!xcCm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F534045f8-c1bb-49ad-9e13-04f5f780a3df_594x330.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!xcCm!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F534045f8-c1bb-49ad-9e13-04f5f780a3df_594x330.webp 424w, https://substackcdn.com/image/fetch/$s_!xcCm!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F534045f8-c1bb-49ad-9e13-04f5f780a3df_594x330.webp 848w, https://substackcdn.com/image/fetch/$s_!xcCm!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F534045f8-c1bb-49ad-9e13-04f5f780a3df_594x330.webp 1272w, https://substackcdn.com/image/fetch/$s_!xcCm!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F534045f8-c1bb-49ad-9e13-04f5f780a3df_594x330.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!xcCm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F534045f8-c1bb-49ad-9e13-04f5f780a3df_594x330.webp" width="594" height="330"
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/534045f8-c1bb-49ad-9e13-04f5f780a3df_594x330.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:330,&quot;width&quot;:594,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:25788,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!xcCm!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F534045f8-c1bb-49ad-9e13-04f5f780a3df_594x330.webp 424w, https://substackcdn.com/image/fetch/$s_!xcCm!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F534045f8-c1bb-49ad-9e13-04f5f780a3df_594x330.webp 848w, https://substackcdn.com/image/fetch/$s_!xcCm!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F534045f8-c1bb-49ad-9e13-04f5f780a3df_594x330.webp 1272w, https://substackcdn.com/image/fetch/$s_!xcCm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F534045f8-c1bb-49ad-9e13-04f5f780a3df_594x330.webp 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/photos-of-your-children-are-being?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Tech Project.
This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/photos-of-your-children-are-being?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/photos-of-your-children-are-being?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><p></p><div><hr></div><h1><strong>What We&#8217;re Reading on Ethical Tech This Week</strong></h1><p>Every week, we round up the latest in Ethical Tech. Subscribe now and also get our monthly digest, the <strong>Ethical Tech News Roundup</strong>!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><ul><li><p><strong>Infosecurity Magazine </strong>- <a href="https://www.infosecurity-magazine.com/opinions/apple-trust-ai-data/">Apple Must Convince Us to Trust AI With Our Data</a></p><ul><li><p>Can <em>Apple Intelligence</em> solve the issues of AI security?&nbsp;</p></li></ul></li><li><p><strong>PRN</strong> - <a href="https://www.prnewswire.com/news-releases/onetrust-named-to-the-forbes-cloud-100-for-sixth-consecutive-year-302221140.html">OneTrust Named to the Forbes Cloud 100 for Sixth Consecutive Year</a></p><ul><li><p><em>OneTrust</em>&#8211; the market leading platform for responsible AI use&#8211; continues to see growing success.</p></li></ul></li><li><p><strong>Bleeping Computer</strong> - <a 
href="https://www.bleepingcomputer.com/news/artificial-intelligence/x-faces-gdpr-complaints-for-unauthorized-use-of-data-for-ai-training/">X faces GDPR complaints for unauthorized use of data for AI training</a>&nbsp;</p><ul><li><p><em>X may have covertly used millions of users' data to train their Grok AI.&nbsp;</em></p></li></ul></li><li><p><strong>Performance Marketing World</strong> - <a href="https://www.performancemarketingworld.com/article/1884713/navigating-compliance-maze-b2b-marketing-world-data-privacy-regulations">Navigating the compliance maze: B2B marketing in a world of data privacy regulations</a></p><ul><li><p>Keeping up with data privacy compliance in B2B marketing.&nbsp;</p></li></ul></li><li><p><strong>The Conversation</strong> - <a href="https://theconversation.com/a-bipartisan-data-privacy-law-could-backfire-on-small-businesses-2-marketing-professors-explain-why-234771">A bipartisan data-privacy law could backfire on small businesses &#8722; 2 marketing professors explain why</a></p><ul><li><p>Analysis of the potential downsides to a privacy bill.</p></li></ul></li><li><p><strong>CSIS</strong> - <a href="https://www.csis.org/analysis/protecting-data-privacy-baseline-responsible-ai">Protecting Data Privacy as a Baseline for Responsible AI</a></p><ul><li><p>How should we protect our privacy in the ever-accelerating world of AI?</p></li></ul></li><li><p><strong>Digiday </strong>- <a href="https://digiday.com/marketing/wtf-is-surveillance-pricing/">WTF is surveillance pricing?</a></p><ul><li><p>Are we moving towards a world where the price of goods will be determined by your personal data?</p></li></ul></li><li><p><strong>Computer Weekly</strong> - <a href="https://www.computerweekly.com/news/366602448/Australias-cyber-security-skills-gap-remains-pressing-issue">Australia&#8217;s cyber security skills gap remains pressing issue</a></p><ul><li><p>Without sufficient cyber-security professionals, Australia faces an increased risk of data 
breaches.</p></li></ul></li><li><p><strong>NRF </strong>- <a href="https://www.nortonrosefulbright.com/en-ca/knowledge/publications/46ec98d2/ised-s-consultation-on-ai-compute-canadas-race-to-remain-globally-competitive">ISED&#8217;s consultation on AI compute: Canada&#8217;s race to remain globally competitive</a></p><ul><li><p>Canada is pushing into the growing AI and computational resource market.</p><p></p></li></ul></li></ul><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/photos-of-your-children-are-being?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/photos-of-your-children-are-being?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p>]]></content:encoded></item><item><title><![CDATA[Data Privacy in the Age of AI Means Moving Beyond Buzzwords]]></title><description><![CDATA[Privacy is possible, but only if companies move beyond empty promises and commit to ethical data practices.]]></description><link>https://news.ethicaltechproject.org/p/data-privacy-in-the-age-of-ai-means</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/data-privacy-in-the-age-of-ai-means</guid><dc:creator><![CDATA[The Ethical Tech Project]]></dc:creator><pubDate>Tue, 02 Jul 2024 13:02:50 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Evxy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c5abff5-9019-4a3c-bab1-546be4d569b9_2877x1918.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><strong>By <a href="https://www.linkedin.com/in/maritzaj/">Dr. 
Maritza Johnson</a>, Ethical Tech Project Board Member</strong></p><p><em>Note: <a href="https://www.informationweek.com/data-management/data-privacy-in-the-age-of-ai-means-moving-beyond-buzzwords">This editorial originally appeared in the May 2024 edition of </a></em><a href="https://www.informationweek.com/data-management/data-privacy-in-the-age-of-ai-means-moving-beyond-buzzwords">Information Week</a></p><p>As tech companies fixate on taking advantage of the latest developments in artificial intelligence, people&#8217;s privacy concerns are being disregarded in the pursuit of new features and profit opportunities.&nbsp;&nbsp;</p><p>Many companies justify their actions by upholding a false narrative that people do not truly care about privacy, but any perceived apathy is a product of generations of companies choosing not to invest in giving customers meaningful privacy choices. Privacy is not dead; if anything, it is more relevant than ever in the face of emerging AI tools built on people&#8217;s data. Companies need to acknowledge the importance of privacy and start investing accordingly. &nbsp;</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Tech Project! Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>In reality, it is companies themselves, not consumers, that often disregard privacy concerns.
Look no further than the <a href="https://www.theguardian.com/technology/2024/feb/15/23andme-hack-data-genetic-data-selling-response">recent data breach</a> at 23andMe as an example of a corporation blaming everyone but itself for its own mistakes.&nbsp;&nbsp;</p><p>In recent months, the company disclosed a data leak affecting half of its customers, approximately 7 million people. For many, this included genetic information, sensitive health information, and a list of their relatives. Instead of acknowledging its own privacy failures, the company has responded by blaming users for not updating their passwords and downplaying the breach by claiming the information &#8220;cannot be used for any harm.&#8221; The company is now being sued by users in a class-action lawsuit for negligence.&nbsp;&nbsp;</p><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/data-privacy-in-the-age-of-ai-means?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thank you for reading The Ethical Tech Project. This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/data-privacy-in-the-age-of-ai-means?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/data-privacy-in-the-age-of-ai-means?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><p>We do not have to live in a world of endless breaches and privacy violations. Companies can and should prioritize privacy to maintain trust with their customers, but this does not happen by accident.
Instead, it requires an unequivocal commitment to privacy from both executives and builders and an ongoing investment of resources. It is not enough to say your company is applying &#8220;privacy by design&#8221; without actually translating privacy into real company policy and practices. Privacy considerations must be in the center of product decisions from the moment you decide to use people&#8217;s data, not be added on at the end in the form of a half-hearted &#8220;retrofit&#8221;.&nbsp;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Evxy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c5abff5-9019-4a3c-bab1-546be4d569b9_2877x1918.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Evxy!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c5abff5-9019-4a3c-bab1-546be4d569b9_2877x1918.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Evxy!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c5abff5-9019-4a3c-bab1-546be4d569b9_2877x1918.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Evxy!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c5abff5-9019-4a3c-bab1-546be4d569b9_2877x1918.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Evxy!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c5abff5-9019-4a3c-bab1-546be4d569b9_2877x1918.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!Evxy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c5abff5-9019-4a3c-bab1-546be4d569b9_2877x1918.jpeg" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0c5abff5-9019-4a3c-bab1-546be4d569b9_2877x1918.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:768330,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Evxy!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c5abff5-9019-4a3c-bab1-546be4d569b9_2877x1918.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Evxy!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c5abff5-9019-4a3c-bab1-546be4d569b9_2877x1918.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Evxy!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c5abff5-9019-4a3c-bab1-546be4d569b9_2877x1918.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Evxy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c5abff5-9019-4a3c-bab1-546be4d569b9_2877x1918.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@ev?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">ev</a> on <a href="https://unsplash.com/photos/a-pole-with-a-bunch-of-stickers-on-it-gpjvRZyavZc?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Unsplash</a></figcaption></figure></div><p>Building for privacy will require assessing whether a company&#8217;s existing privacy metrics indicate anything of relevance. For example, simply having roles with &#8220;privacy&#8221; in the title is not an effective measure of a privacy practice. In the same vein, headcount is not a privacy solution.
Just because Meta <a href="https://www.facebook.com/business/platform-safety">proudly claims</a> it has 40,000 people working on its safety and security teams does not change the fact that, <a href="https://innovation.consumerreports.org/wp-content/uploads/2024/01/CR_Who-Shares-Your-Information-With-Facebook.pdf">according to Consumer Reports</a>, the average Facebook consumer has their data shared by over 2,000 different companies. Instead, companies should be focusing on metrics that evaluate data protection, customer trust and the enforcement of tangible privacy measures throughout an entire organization. &nbsp;</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><p>The relationship between ROI and privacy may appear at odds, but it&#8217;s a false dichotomy. If you respect your customer, respect their data. This has to come from the top. This is a challenge for corporate leaders who are incentivized to focus on big, sexy innovation projects instead of mitigating privacy risks. We see this right now as companies rush to hire &#8220;chief AI officers&#8221; and deploy AI tools while the privacy implications of those tools remain an afterthought.&nbsp;&nbsp;</p><p>Leaders should care about privacy not just because it is ethical, but also because it is good for business. Privacy builds trust with your customers and increases their lifetime value to your organization. <a href="https://news.ethicaltechproject.com/p/the-profit-in-privacy-how-ethical">Polling from the Ethical Tech Project</a> found privacy features increased consumer purchasing intent by more than 15% and increased trust by over 17%.
Effective privacy measures also strengthen a company&#8217;s reputation, differentiate their product, and protect against ending up on the wrong side of an investigation by the Federal Trade Commission or a state attorney general. &nbsp;</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/data-privacy-in-the-age-of-ai-means?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/data-privacy-in-the-age-of-ai-means?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p>Good privacy practices are possible, and they are attainable with a sustained, committed effort from corporate leadership and everyone who works with data. Thankfully, strategies exist to help business leaders. Two examples I am familiar with, among many, include The Ethical Tech Project&#8217;s <a href="https://theprivacystack.org/">privacy stack</a>, a privacy reference architecture for technical teams, as well as the Center for Financial Inclusion&#8217;s <a href="https://www.centerforfinancialinclusion.org/privacy-as-product-privacy-by-design-for-inclusive-finance">privacy toolkit</a> for inclusive financial products.&nbsp;&nbsp;</p><p>Privacy, or the lack of privacy, in modern technology products is a choice that every company faces. 
For the sake of their companies, corporate leaders can and should invest in offering their customers meaningful privacy options instead of empty promises.&nbsp;&nbsp;</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.informationweek.com/data-management/data-privacy-in-the-age-of-ai-means-moving-beyond-buzzwords&quot;,&quot;text&quot;:&quot;Read the OpEd in Information Week&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.informationweek.com/data-management/data-privacy-in-the-age-of-ai-means-moving-beyond-buzzwords"><span>Read the OpEd in Information Week</span></a></p><p><em>Dr. Maritza Johnson, Ph.D., formerly with Facebook and Google, is a Principal at&#8239;<a href="https://goodresearch.com/">Good Research</a>, Board Member at the <a href="https://www.ethicaltechproject.com/">Ethical Tech Project</a>, and was the Founding Director of the&#8239;<a href="https://www.sandiego.edu/engineering/centers/center-for-digital-civil-society/">Center for Digital Civil Society at University of San Diego</a>.&nbsp;</em></p><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/data-privacy-in-the-age-of-ai-means?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thank you for reading The Ethical Tech Project. 
This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/data-privacy-in-the-age-of-ai-means?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/data-privacy-in-the-age-of-ai-means?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><div><hr></div><h2>What We&#8217;re Reading on Ethical Tech This Week</h2><p>Every week, we round up the latest in Ethical Tech. Subscribe now and also get our monthly digest, the <strong>Ethical Tech News Roundup</strong>!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><ul><li><p><strong>The Hill</strong> - <a href="https://thehill.com/policy/technology/4738735-advertisers-business-groups-want-significant-changes-to-data-privacy-bill/">Advertisers, business groups want &#8216;significant changes&#8217; to data privacy bill</a></p><ul><li><p>How might significant alterations to the data privacy bill impact the future of online advertising and consumer trust?</p></li></ul></li><li><p><strong>Forbes</strong> - <a href="https://www.forbes.com/sites/forbesbusinesscouncil/2024/06/26/ais-emerging-privacy-threats-a-strategic-guide-for-business-leaders/">AI's Emerging Privacy Threats: A Strategic Guide For Business Leaders</a></p><ul><li><p>With AI advancing rapidly, what steps should businesses take to balance innovation with privacy concerns?</p></li></ul></li><li><p><strong>NYT</strong> - <a 
href="https://www.nytimes.com/article/tiktok-ban.html">Why the U.S. Is Forcing TikTok to Be Sold or Banned</a></p><ul><li><p>Millions of TikTok users would face a major disruption in their entertainment and communication routines if the ban is enforced, but are TikTok&#8217;s ethical data issues shared by other major social media platforms?</p></li></ul></li><li><p><strong>FT</strong> - <a href="https://www.ft.com/content/56585c50-7608-49f3-998d-a7e100e0ddc7">Privacy fears sap potential of female fertility tech start-ups</a></p><ul><li><p>How can female fertility tech startups overcome privacy fears to unlock their full potential?</p></li></ul></li><li><p><strong>NextGov</strong> - <a href="https://www.nextgov.com/digital-government/2024/06/house-pivots-data-privacy-bill-removing-algorithmic-discrimination-coverage/397618/">House pivots on data privacy bill, removing algorithmic discrimination coverage</a></p><ul><li><p>With the removal of algorithmic discrimination coverage, are we losing essential safeguards in our data privacy laws?</p></li></ul></li><li><p><strong>IAPP</strong> - <a href="https://iapp.org/news/a/ahead-of-2025-federal-election-will-canada-pass-bill-c-27-">Ahead of 2025 federal election, will Canada pass Bill C-27?</a></p><ul><li><p>As Canada considers Bill C-27, the country stands on the brink of major privacy law reforms.</p></li></ul></li><li><p><strong>SCMagazine</strong> - <a href="https://www.scmagazine.com/resource/executives-are-bullish-about-ai-capabilities-but-worry-about-data-privacy-and-security">Executives bullish about AI capabilities, but worry about data privacy and security</a></p><ul><li><p>With so much confidence in AI, how should executives tackle the ethical and bias challenges the article mentions?</p></li></ul></li><li><p><strong>Digital Journal </strong>- <a 
href="https://www.digitaljournal.com/life/data-privacy-change-and-what-consumers-actually-want/article">Data privacy change and what consumers actually want</a></p><ul><li><p>Are businesses truly respecting your privacy preferences?</p></li></ul></li><li><p><strong>AdNews</strong> - <a href="https://www.adnews.com.au/news/media-agencies-weigh-in-australia-s-privacy-laws-vs-gdpr">Media agencies weigh in: Australia&#8217;s privacy laws vs GDPR</a></p><ul><li><p>With Australia&#8217;s new privacy laws on the horizon, are media agencies prepared to handle the stricter consent requirements?</p></li></ul></li><li><p><strong>Biometric Update</strong> - <a href="https://www.biometricupdate.com/202406/biometrics-developers-dance-with-data-privacy-regulations-continues">Biometrics developers&#8217; dance with data privacy regulations continues</a></p><ul><li><p>How are biometrics developers adapting to the ever-changing landscape of global data privacy regulations?</p></li></ul></li></ul><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Tech Project!
Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><div><hr></div><h2><a href="https://news.ethicaltechproject.com/podcast">Conversations in Ethical Tech</a></h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Q8wG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png" width="428" height="240.75" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:428,&quot;bytes&quot;:847505,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:&quot;&quot;,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>In this episode, JJ and Maritza explore the concept of <strong>dark patterns</strong>, meaning deceptive design practices that manipulate users into taking actions they didn't intend to.</p><ul><li><p>How does the choice architecture of the tech products you use impact your personal privacy and data protection?</p></li><li><p>How can good product design and business outcomes be balanced against avoiding deceptive practices?</p></li><li><p>Learn a few common-sense suggestions that you can take as a developer, leader, or consumer, including providing feedback to companies, filing complaints with regulatory bodies, and raising awareness about dark patterns.</p></li></ul><p><strong><a
href="https://news.ethicaltechproject.com/p/what-are-dark-patterns">Listen Now!</a></strong></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.com/p/what-are-dark-patterns&quot;,&quot;text&quot;:&quot;Listen to the Latest Podcast&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://news.ethicaltechproject.com/p/what-are-dark-patterns"><span>Listen to the Latest Podcast</span></a></p><div><hr></div>]]></content:encoded></item><item><title><![CDATA[Why Brands Must Do More to Protect Consumers' Personal Information]]></title><description><![CDATA[Consumers still need stronger protection against &#8216;surveillance capitalism&#8217;, the Ethical Tech Project writes in AdAge]]></description><link>https://news.ethicaltechproject.org/p/why-brands-must-do-more-to-protect</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/why-brands-must-do-more-to-protect</guid><dc:creator><![CDATA[JJ]]></dc:creator><pubDate>Thu, 27 Jun 2024 15:01:41 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!XDIR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c6c153b-39d1-4e6d-a607-653e20e858b0_2400x1800.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Note: <a href="https://adage.com/article/opinion/data-privacy-why-brands-must-do-more-protect-consumers-personal-information/2560721">This editorial previously appeared in the May edition of AdAge</a></em></p><p>For some years now, we've been predicting a tipping point in consumer awareness that would motivate brands to adopt responsible data practices.&nbsp;</p><p>We have long understood that consumers deeply value their privacy. 
However, there was a nagging question about their seeming apathy toward the current state of data privacy&#8212;a feeling of helplessness over how their data is used and even abused.</p><p>As we delved deeper, we realized that this apathy didn't mean a lack of concern; rather, it reflected the overwhelming burden placed on consumers to manage and protect their own data and hold brands accountable for data practices.&nbsp;</p><p>Consider the EU&#8217;s General Data Protection Regulation (GDPR), a landmark in data privacy and the dawn of a new era where consumer rights are respected, and businesses are committed to transparency. Five years on, it's clear that one of its central tenets&#8212;opt-in consent&#8212;does not equate to &#8220;informed&#8221; consent for consumers. How many people are willing to sift through pages of data agreements before consenting to how their data is manipulated? This approach simply does not work for most people.&nbsp;</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Tech Project! Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>Some argue that interest in data privacy varies by population segment&#8212;a tech-savvy, ad-blocking user might care, but not online shoppers or avid users of social media apps.
Or that Gen-Zers and millennials care more than boomers. There is some truth to those statements, but the framing is off.</p><p>Everyone cares about data privacy&#8212;but how much they care depends on the type of data in question. Some consumers might not mind sharing location data if they believe the value exchange is worth it, while others may recoil at how behavioral data is used to generate hyper-personalized ads.</p><p>Each consumer&#8217;s threshold is unique. Are you comfortable with your location data, behavioral patterns and purchasing behaviors being accessed? How about the way you drive, where you go and how you brake and accelerate?&nbsp;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!XDIR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c6c153b-39d1-4e6d-a607-653e20e858b0_2400x1800.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!XDIR!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c6c153b-39d1-4e6d-a607-653e20e858b0_2400x1800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!XDIR!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c6c153b-39d1-4e6d-a607-653e20e858b0_2400x1800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!XDIR!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c6c153b-39d1-4e6d-a607-653e20e858b0_2400x1800.jpeg 1272w,
https://substackcdn.com/image/fetch/$s_!XDIR!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c6c153b-39d1-4e6d-a607-653e20e858b0_2400x1800.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!XDIR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c6c153b-39d1-4e6d-a607-653e20e858b0_2400x1800.jpeg" width="1456" height="1092" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5c6c153b-39d1-4e6d-a607-653e20e858b0_2400x1800.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1092,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1205498,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!XDIR!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c6c153b-39d1-4e6d-a607-653e20e858b0_2400x1800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!XDIR!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c6c153b-39d1-4e6d-a607-653e20e858b0_2400x1800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!XDIR!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c6c153b-39d1-4e6d-a607-653e20e858b0_2400x1800.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!XDIR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5c6c153b-39d1-4e6d-a607-653e20e858b0_2400x1800.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><em>Photo by <a href="https://unsplash.com/@nathaliarosa?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Nath&#225;lia Rosa</a> on <a 
href="https://unsplash.com/photos/goods-on-shelf-rWMIbqmOxrY?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Unsplash</a></em></figcaption></figure></div><h4>Is car data the tipping point?</h4><p>The New York Times <a href="https://archive.is/o/5HhwH/https://www.nytimes.com/2024/03/11/technology/carmakers-driver-tracking-insurance.html">recently highlighted</a> how some car manufacturers are collecting and sharing driving data, including details of every trip, speed, acceleration and braking information.&nbsp;This data is sold to data brokers and has been used by auto insurers to, among other things, increase policy premiums.&nbsp;The crux of the issue is the lack of transparency about data-sharing practices. Drivers were in the dark about their driving behavior being monitored. Some manufacturers, such as General Motors, which was called out in the article, had not sufficiently informed drivers about how their data was being used.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><p>Within weeks of the Times article,&nbsp;<a href="https://archive.is/o/5HhwH/https://www.nytimes.com/2024/03/22/technology/gm-onstar-driver-data.html">GM stopped sharing&nbsp;</a>data with the two data brokers involved in creating insurance risk profiles. The rapidity with which GM addressed its data-sharing practices was noteworthy&#8212;practically instantaneous.</p><p>Consumers are growing increasingly aware of how their data is used and surprised at its pervasiveness. And it&#8217;s not just auto data.</p><p>Take the backlash against Amazon subsidiary Ring and its smart doorbells.
The discovery that Ring was sharing user video footage with police without explicit user consent sparked demands for increased transparency and user control over personal data. Ring responded by <a href="https://archive.is/o/5HhwH/https://www.theverge.com/2024/1/24/24049165/ring-police-neighbors-app-clips-search-warrant">changing its policies</a> on data sharing with law enforcement.</p><p>Regulatory enforcement continues to highlight egregious cases and is a catalyst for further awareness of data privacy issues among individuals. The rhetoric among regulators in the U.S. is that advertising is &#8220;surveillance capitalism,&#8221; and at times it certainly feels like it.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/why-brands-must-do-more-to-protect?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/why-brands-must-do-more-to-protect?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p>As part of a <a href="https://archive.is/o/5HhwH/https://www.ftc.gov/policy/advocacy-research/tech-at-ftc/2024/03/ftc-cracks-down-mass-data-collectors-closer-look-avast-x-mode-inmarket">crackdown on mass data collectors</a>, the Federal Trade Commission last month issued a warning to businesses in the data trade: &#8220;Browsing and location data are sensitive. Full stop.&#8221; The FTC&#8217;s broadened definition of sensitive information is aimed at curbing the widespread and intrusive practices taking hold in the martech and ad tech ecosystems.</p><h4>The new standard</h4><p>In the digital age, we've somewhat resigned ourselves to the idea that our data is collected and shared as part of how the internet works.
We trade bits of our data for the benefits of custom experiences, attractive discounts and personalized content. But at a certain point, we need to draw a line. Maybe it&#8217;s automotive data, maybe it&#8217;s health or other personal data. Will revelations about surveillance, such as those reported in the New York Times regarding automotive data, spark a collective realization that we've reached our limit?</p><p>The question on everyone's mind should be: Is nothing sacred anymore? In a world where individuals have varying thresholds for when they care about data privacy, there is only one option: choice. Data dignity demands that brands provide people with transparency on their data practices, and choice and control over how that data is used.&nbsp;</p><p>In the words of Steve Jobs in 2010, &#8220;Privacy means people know what they're signing up for, in plain English and repeatedly. People are smart and some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of your asking them. Let them know precisely what you're going to do with their data.&#8221;</p><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/why-brands-must-do-more-to-protect?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thank you for reading The Ethical Tech Project. 
This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/why-brands-must-do-more-to-protect?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/why-brands-must-do-more-to-protect?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://adage.com/article/opinion/data-privacy-why-brands-must-do-more-protect-consumers-personal-information/2560721&quot;,&quot;text&quot;:&quot;Read the OpEd in AdAge&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://adage.com/article/opinion/data-privacy-why-brands-must-do-more-protect-consumers-personal-information/2560721"><span>Read the OpEd in AdAge</span></a></p><div><hr></div><h2>What We&#8217;re Reading on Ethical Tech This Week</h2><p>Every week, we round up the latest in Ethical Tech. 
Subscribe now and also get our monthly digest, the <strong>Ethical Tech News Roundup</strong>!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><ul><li><p><strong>Statescoop</strong> - <a href="https://statescoop.com/vermont-gov-phil-scott-privacy-veto-data-rights/">One governor&#8217;s veto won&#8217;t stop the momentum on data rights for all</a></p><ul><li><p>Governor Phil Scott&#8217;s veto of a key privacy bill in Vermont stands out because it challenges the growing movement towards stronger state-level data protection laws.</p></li></ul></li><li><p><strong>SCMagazine</strong> - <a href="https://www.scmagazine.com/brief/vermont-data-privacy-legislation-rejected-by-governor">Vermont data privacy legislation rejected by governor</a></p><ul><li><p>What does Vermont&#8217;s surprising veto mean for the future of data privacy protections across the country?</p></li></ul></li><li><p><strong>Forbes</strong> - <a href="https://www.forbes.com/sites/forbesbusinesscouncil/2024/06/18/four-ways-data-privacy-laws-are-reshaping-business-practices/">Four Ways Data Privacy Laws Are Reshaping Business Practices</a></p><ul><li><p>How are new data privacy laws transforming the way businesses handle your personal information?</p></li></ul></li></ul><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/why-brands-must-do-more-to-protect?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" 
href="https://news.ethicaltechproject.org/p/why-brands-must-do-more-to-protect?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><ul><li><p><strong>The Record</strong> - <a href="https://therecord.media/ftc-files-complaint-tiktok-data-privacy">FTC files complaint against TikTok for alleged data privacy practices</a></p><ul><li><p>The FTC&#8217;s complaint against TikTok is a striking development that underscores growing concerns over how social media giants manage and protect user data.</p></li></ul></li><li><p><strong>Forbes</strong> - <a href="https://www.forbes.com/sites/douglaslaney/2024/06/12/gdpr-violations-and-fines-trends-insights-and-compliance-strategies/">GDPR violations and fines: trends, insights and compliance strategies</a></p><ul><li><p>Rising GDPR fines and violations reveal a compelling trend where companies are being forced to drastically rethink their data protection and compliance strategies.</p></li></ul></li><li><p><strong>Lexology</strong> - <a href="https://www.lexology.com/library/detail.aspx?g=b7b10034-3d9a-47a8-a639-2a3118bc288c">American Privacy Rights Act on the move with significant amendments</a></p><ul><li><p>How are global data privacy regulations changing the way companies protect your personal information?</p></li></ul></li><li><p><strong>The Record</strong> - <a href="https://therecord.media/23andme-data-breach-canada-uk-privacy-investigation">Privacy authorities in Canada and UK announce joint probe of 23andMe data breach</a></p><ul><li><p>The 23andMe data breach, now under investigation in Canada and the UK, highlights critical vulnerabilities in the protection of sensitive genetic information.</p></li></ul></li><li><p><strong>Bloomberg</strong> - <a href="https://www.bloomberg.com/news/articles/2024-06-20/hackers-auction-off-stolen-lendingtree-consumers-information">Hackers Auction Off Stolen LendingTree Consumers&#8217; Data</a></p><ul><li><p>Hackers are selling data about consumers of the LendingTree subsidiary
QuoteWizard after the company detected unauthorized access on a cloud database hosted by Snowflake, but the size and scope of the leak are still being investigated.</p></li></ul></li><li><p><strong>The Hacker News</strong> - <a href="https://thehackernews.com/2024/06/meta-halts-ai-training-on-eu-user-data.html">Meta Pauses AI Training on EU User Data Amid Privacy Concerns</a></p><ul><li><p>Meta&#8217;s decision to halt AI training using EU user data underscores the growing impact of stringent global data privacy regulations on tech giants&#8217; operations.</p></li></ul></li><li><p><strong>The Verge</strong> - <a href="https://www.theverge.com/2024/6/14/24178591/meta-ai-assistant-europe-ireland-privacy-objections">Meta halts AI training amid privacy concerns</a></p><ul><li><p>Meta&#8217;s new AI assistant is facing privacy objections in Europe, highlighting the ongoing tension between innovation and stringent data protection laws.</p></li></ul></li></ul><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/why-brands-must-do-more-to-protect?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/why-brands-must-do-more-to-protect?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><ul><li><p><strong>The Guardian</strong> - <a href="https://www.theguardian.com/australia-news/article/2024/jun/17/betterhelp-online-therapy-potential-investigation-australian-privacy-laws">&#8216;Alarm bells should be going off&#8217; as mental health counselling app expands into Australia, critics say</a></p><ul><li><p>What does the possible investigation into BetterHelp say about the difficulties of safeguarding your personal data in online
therapy?</p></li></ul></li><li><p><strong>SiliconANGLE</strong> - <a href="https://siliconangle.com/2024/06/12/enhancing-data-governance-in-financial-services-awsfinserv/">Advancing Data Governance in Financial Services</a></p><ul><li><p>AWS&#8217;s innovative approach to data governance in the financial services sector promises to significantly enhance the security and management of sensitive financial data, potentially setting new industry standards.</p></li></ul></li></ul><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Tech Project! Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><div><hr></div><h2><a href="https://news.ethicaltechproject.com/podcast">Conversations in Ethical Tech</a></h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Q8wG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp"
srcset="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png" width="428" height="240.75" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:428,&quot;bytes&quot;:847505,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:&quot;&quot;,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" 
srcset="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>In this episode, JJ and Maritza explore the concept of <strong>dark patterns</strong>, meaning deceptive design practices that manipulate users into taking actions they didn't intend to.</p><ul><li><p>How does the choice architecture of the tech products you use impact your personal privacy and data protection?</p></li><li><p>How can good product design and business outcomes be balanced against avoiding deceptive practices?</p></li><li><p>Learn a few common-sense suggestions that you can take as a developer, leader, or consumer, including providing feedback to companies, filing complaints with regulatory bodies, and raising awareness about dark patterns.</p></li></ul><p><strong><a href="https://news.ethicaltechproject.com/p/what-are-dark-patterns">Listen Now!</a></strong></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.com/p/what-are-dark-patterns&quot;,&quot;text&quot;:&quot;Listen to the Latest Podcast&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://news.ethicaltechproject.com/p/what-are-dark-patterns"><span>Listen to the Latest Podcast</span></a></p><div><hr></div>]]></content:encoded></item><item><title><![CDATA[One governor’s veto won’t stop the momentum on data rights for all]]></title><description><![CDATA[Vermont Gov. 
Phil Scott last week squashed a data privacy bill with strong consumer protections, but other states will pick up the mantle.]]></description><link>https://news.ethicaltechproject.org/p/one-governors-veto-wont-stop-the</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/one-governors-veto-wont-stop-the</guid><dc:creator><![CDATA[JJ]]></dc:creator><pubDate>Fri, 21 Jun 2024 14:01:30 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!kBsM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c934fe-eb24-4f8e-bd61-3462ee5e51e6_4928x3264.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Praised &#8211; and criticized &#8211; as one of the strongest data privacy measures in the country, Vermont&#8217;s privacy law passed the legislature last month only to be vetoed yesterday by the Governor. <br><br>The bill would have given Vermont residents the right to file lawsuits against companies that violate their privacy rights. There was little doubt that the bill would have sailed through without the private right of action &#8211; the provision drew heavy opposition from business groups and Big Tech. </p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Tech Project!
Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p><br>I had the privilege of speaking with <a href="https://www.linkedin.com/in/mepriestley/">Monique Priestley</a>, one of the bill&#8217;s sponsors and a brave advocate for consumer rights. It&#8217;s clear that people&#8217;s ability to sue violators scares Big Tech and data brokers for all the right reasons, but it scares small businesses as well. So the private right of action was limited to data brokers and large companies processing the data of at least 100,000 Vermonters, and that was enough to appease hesitant legislators, but it was not enough to stay the Governor&#8217;s veto. </p><p><a href="https://statescoop.com/vermont-gov-phil-scott-privacy-veto-data-rights/">I recently wrote in an opinion piece in Statescoop that:</a></p><blockquote><p>There are more than 500 data brokers in the U.S. who are secretly collecting your data &#8212; via your cell phone, credit cards, apps and more &#8212; forming a map and profile of every moment of your digital day and then selling it to the highest bidder.</p><p>That is who the governor&#8217;s veto protected.&nbsp;</p><p>The veto didn&#8217;t protect the Main Street shop or bar, or the maple farms that jot down emails for newsletters and loyalty programs. The law wouldn&#8217;t have done a thing to the retail site that suggests a pair of pants because you clicked on a matching shirt. 
People understand that data is at the heart of the value exchange with business, and they deserve transparency and choice in how that data is collected and used.</p><p>The proposed law would have empowered individuals to identify and act on privacy violations, and exercising that right shouldn&#8217;t be something that businesses who build trust with their customers should worry about. Responsible data practices should be the status quo for every business that we trust with our data.</p><p>Explaining the veto, the governor said the bill was &#8220;a national outlier and more hostile than any other state to many businesses and nonprofits.&#8221; When it comes to data privacy, being an outlier should be a point of pride in a state that holds strong to ideals of the independent American spirit.</p><p>As for Vermont&#8217;s privacy aspirations hurting small businesses? Limiting the private right of action to large companies should&#8217;ve quashed those concerns. But still, he vetoed.</p></blockquote><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://statescoop.com/vermont-gov-phil-scott-privacy-veto-data-rights/&quot;,&quot;text&quot;:&quot;Read My Full Opinion Piece&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://statescoop.com/vermont-gov-phil-scott-privacy-veto-data-rights/"><span>Read My Full Opinion Piece</span></a></p><p>The veto could still be overridden, but even if the bill dies on Governor Scott&#8217;s desk, I want to applaud the legislators who worked on it. 
I hope it becomes a framework for strong and fair data policy across the country.</p><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/one-governors-veto-wont-stop-the?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thank you for reading The Ethical Tech Project. This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/one-governors-veto-wont-stop-the?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/one-governors-veto-wont-stop-the?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kBsM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c934fe-eb24-4f8e-bd61-3462ee5e51e6_4928x3264.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!kBsM!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c934fe-eb24-4f8e-bd61-3462ee5e51e6_4928x3264.jpeg 424w, https://substackcdn.com/image/fetch/$s_!kBsM!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c934fe-eb24-4f8e-bd61-3462ee5e51e6_4928x3264.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!kBsM!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c934fe-eb24-4f8e-bd61-3462ee5e51e6_4928x3264.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!kBsM!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c934fe-eb24-4f8e-bd61-3462ee5e51e6_4928x3264.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!kBsM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c934fe-eb24-4f8e-bd61-3462ee5e51e6_4928x3264.jpeg" width="1456" height="964" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b1c934fe-eb24-4f8e-bd61-3462ee5e51e6_4928x3264.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:964,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:12546993,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!kBsM!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c934fe-eb24-4f8e-bd61-3462ee5e51e6_4928x3264.jpeg 424w, https://substackcdn.com/image/fetch/$s_!kBsM!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c934fe-eb24-4f8e-bd61-3462ee5e51e6_4928x3264.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!kBsM!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c934fe-eb24-4f8e-bd61-3462ee5e51e6_4928x3264.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!kBsM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c934fe-eb24-4f8e-bd61-3462ee5e51e6_4928x3264.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><a 
href="https://www.flickr.com/photos/7272600@N06/22324543055/in/photolist-A1K7SK-6RDHYK-zoqzXs-afwFoT-4ECyk-3XFMR-tewVSY-2bbmp-3XG5r-A2FoWr-SNigFj-A2FrJk-zZAB45-3f8tX2-2Abn6M-2ctLFeb-giqxif-giqud8-5XFe2G-emah9u-afzpkG-d3YHrj-Qqhems-tzernM-em4w6T-ccooqy-5coMCF-ueuHq7-z4RMjk-3XFPy-2bapWNt-3XFWy-N5bvW-ehdhFf-NMTe7D-3XFVu-5sgWP-wAAqeR-62EALG-zZAEqq-zZAGY1-cnQpAW-shQvjA-zYryDU-2bbnt-zJendD-A2FsoB-A1K4z2-powrh4-A2FjXK">Photo courtesy Flickr User Bop P.B.</a></figcaption></figure></div><div><hr></div><h2><a href="https://news.ethicaltechproject.com/podcast">Conversations in Ethical Tech</a></h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Q8wG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png" width="428" height="240.75" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:428,&quot;bytes&quot;:847505,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" 
type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>In this episode, JJ and Maritza explore the concept of <strong>dark patterns</strong>, meaning deceptive design practices that manipulate users into taking actions they didn't intend to.</p><ul><li><p>How does the choice architecture of the tech products you use impact your personal privacy and data protection? 
</p></li><li><p>How can good product design and business outcomes be balanced with the need to avoid deceptive practices?</p></li><li><p>Learn a few common-sense actions you can take as a developer, leader, or consumer, including providing feedback to companies, filing complaints with regulatory bodies, and raising awareness about dark patterns.</p></li></ul><p><strong><a href="https://news.ethicaltechproject.com/p/what-are-dark-patterns">Listen Now!</a></strong></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.com/p/what-are-dark-patterns&quot;,&quot;text&quot;:&quot;Listen to the Latest Podcast&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.com/p/what-are-dark-patterns"><span>Listen to the Latest Podcast</span></a></p><div><hr></div><h2><strong>What We&#8217;re Reading on Ethical Tech This Week</strong></h2><ul><li><p><strong>Statescoop</strong> - <a href="https://statescoop.com/vermont-gov-phil-scott-privacy-veto-data-rights/">One governor&#8217;s veto won&#8217;t stop the momentum on data rights for all</a></p><ul><li><p>Read the full piece by JJ, excerpted above</p></li></ul></li></ul><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/one-governors-veto-wont-stop-the?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/one-governors-veto-wont-stop-the?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><ul><li><p><strong>Bloomberg</strong> -<a href="https://www.bloomberg.com/news/articles/2024-06-20/hackers-auction-off-stolen-lendingtree-consumers-information"> Hackers Auction Off Stolen LendingTree 
Consumers&#8217; Data</a></p><ul><li><p>Hackers are selling data about consumers of the LendingTree subsidiary QuoteWizard after the company detected unauthorized access on a cloud database hosted by Snowflake, but the size and scope of the leak are still being investigated.</p></li></ul></li><li><p><strong>Digiday</strong> - <a href="https://digiday.com/marketing/wtf-is-a-financial-media-network/?utm_campaign=digidaydis&amp;utm_source=linkedin&amp;utm_medium=social&amp;utm_content=60424">WTF is a financial media network?</a></p><ul><li><p>Why are financial firms creating their own ad networks, and how will consumers be affected?</p></li></ul></li><li><p><strong>Digiday</strong> - <a href="https://digiday.com/marketing/what-it-will-take-for-advertisers-to-finally-get-ready-to-let-go-of-the-third-party-cookie/">What it will take for advertisers to finally get ready to let go of the third-party cookie</a></p><ul><li><p>What innovative solutions are advertisers discovering as they adapt to a world without third-party cookies, and what does it mean for privacy?</p></li></ul></li><li><p><strong>Tech Monitor</strong> - <a href="https://techmonitor.ai/technology/data/meta-privacy-policy-train-ai-gdpr">Meta training AI products through user data breaks European law, claim activists</a></p><ul><li><p>What surprising changes in Meta&#8217;s new privacy policy are shaking up the AI landscape under GDPR?</p></li></ul></li><li><p><strong>Office of the Privacy Commissioner of Canada</strong> - <a href="https://www.priv.gc.ca/en/opc-actions-and-decisions/ar_index/202324/ar_202324/">Trust, innovation, and protecting the fundamental right to privacy in the digital age</a></p><ul><li><p>Canada&#8217;s Privacy Watchdog weighs in on rights under the new AI paradigm</p></li></ul></li></ul>]]></content:encoded></item><item><title><![CDATA[Ethical Tech News Roundup]]></title><description><![CDATA[What we read in the last month in Ethical 
Tech]]></description><link>https://news.ethicaltechproject.org/p/ethical-tech-news-roundup-b78</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/ethical-tech-news-roundup-b78</guid><dc:creator><![CDATA[The Ethical Tech Project]]></dc:creator><pubDate>Mon, 17 Jun 2024 20:08:07 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Facd9e155-8f6f-4998-af19-399b4dd2aedd_6000x4000.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Every month, The Ethical Tech Project rounds up news you may have missed in Ethical Tech! Did we miss something? Post your links in the comments!</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Tech Project! 
Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><h3><strong>From Our Board</strong></h3><ul><li><p><strong>AdAge</strong> - <a href="https://adage.com/article/opinion/data-privacy-why-brands-must-do-more-protect-consumers-personal-information/2560721?share-code=1X34EvvQSOI-rOq4mpbIb_o-0&amp;utm_id=gfta-em-240520">Data Privacy - Why Brands Must Do More to Protect Consumers' Personal Information</a>&nbsp;</p><ul><li><p>Consumers still need stronger protection against &#8216;surveillance capitalism&#8217;</p></li></ul></li><li><p><strong>Conversations in Ethical Tech - </strong><a href="https://news.ethicaltechproject.com/p/the-impact-of-a-federal-privacy-law">The Impact of a Federal Privacy Law</a></p><ul><li><p>JJ and Maritza dig into the recently proposed American Privacy Rights Act (<a href="https://news.ethicaltechproject.com/p/privacy-goes-to-washington">read our initial reactions in this blog post</a>). 
What will the impact be on business, and why will it mean more businesses adopt data minimization?</p></li></ul></li><li><p><strong>The Problem Is&#8230;</strong> - <a href="https://www.problem-is.com/p/what-you-need-to-know-about-the-ai">What You Need to Know About the AI Alignment Problem</a></p><ul><li><p>Ethical Tech Project Board Member Vivek Vaidya shares his conversation with Oxford AI ethics + alignment scholar Brian Christian</p></li></ul></li><li><p><strong>Conversations in Ethical Tech</strong> - <a href="https://news.ethicaltechproject.com/p/what-are-dark-patterns">What Are 'Dark Patterns'?</a></p><ul><li><p>JJ and Maritza explore the concept of dark patterns, meaning deceptive design practices that manipulate users into taking actions they didn't intend to. How does the choice architecture of the tech products you use impact your personal privacy and data protection?</p></li></ul></li><li><p><strong>Tom Chavez </strong>- <a href="https://news.ethicaltechproject.com/p/is-programmatic-advertising-a-threat">Is Programmatic Advertising a Threat to Democracy?</a></p><ul><li><p>Ethical Tech Project Founder reflects on the early development of the programmatic advertising industry and its unintended effects</p></li></ul></li><li><p><strong><a href="https://www.informationweek.com/data-management/data-privacy-in-the-age-of-ai-means-moving-beyond-buzzwords">Information Week</a></strong><a href="https://www.informationweek.com/data-management/data-privacy-in-the-age-of-ai-means-moving-beyond-buzzwords"> - </a><em><a href="https://www.informationweek.com/data-management/data-privacy-in-the-age-of-ai-means-moving-beyond-buzzwords">Data Privacy in the Age of AI Means Moving Beyond Buzzwords</a></em><a href="https://www.informationweek.com/data-management/data-privacy-in-the-age-of-ai-means-moving-beyond-buzzwords"> by Dr. 
Maritza Johnson</a></p><ul><li><p>Privacy is possible, but only if companies move beyond empty promises and commit to ethical data practices.</p></li></ul></li></ul><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/ethical-tech-news-roundup-b78?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thank you for reading The Ethical Tech Project. This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/ethical-tech-news-roundup-b78?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/ethical-tech-news-roundup-b78?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><h3><strong>AI Regulation</strong></h3><ul><li><p><strong>Canada Chamber of Commerce</strong> - <a href="https://chamber.ca/policy-matters-5-measures-to-fix-canadas-proposed-ai-legislation/">Policy Matters: 5 Measures to Fix Canada&#8217;s Proposed AI Legislation</a></p><ul><li><p>Are Canada's current AI laws enough to keep up with the rapid pace of technological innovation?</p></li></ul></li><li><p><strong>Wired</strong> - <a href="https://www.wired.com/story/us-forming-global-ai-safety-network-key-allies/">The US Is Forming a Global AI Safety Network With Key Allies</a></p><ul><li><p>How will this international AI safety network impact the development and use of AI technologies worldwide?</p></li></ul></li><li><p><strong>Independent Sector</strong>: <a 
href="https://independentsector.org/blog/fashionable-legislation-how-states-are-taking-charge-of-data-privacy-and-what-it-means-for-nonprofits/">How States are taking charge of Data</a></p><ul><li><p>What unexpected challenges do state data privacy regulations pose for nonprofits, and how can they navigate this evolving legal maze?</p></li></ul></li><li><p><strong>The Record</strong> - <a href="https://therecord.media/ben-wiseman-interview-ftc-data-privacy">A top FTC official on the consumer privacy message the agency is sending to industry</a></p><ul><li><p>What steps will one of the nation&#8217;s top regulators make on privacy before the 2024 election?</p></li></ul></li></ul><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><h3><strong>Privacy</strong></h3><ul><li><p><strong>Digiday</strong> - <a href="https://digiday.com/marketing/ad-tech-vendor-colossus-faces-scrutiny-for-alleged-mismanaging-ids/">Ad tech vendor Colossus faces scrutiny for alleged mismanaging IDs</a></p><ul><li><p>Can we trust ad tech companies with our personal data, especially when giants like Colossus are accused of mismanaging IDs?</p></li></ul></li><li><p><strong>Spiceworks</strong> - <a href="https://www.spiceworks.com/it-security/data-security/news/vermont-data-privacy-law/">Vermont Cracks Down on Personal Data Use With New Data Privacy Law</a></p><ul><li><p>Will other states follow Vermont&#8217;s lead in tightening data privacy laws, and what could this mean for your personal data?</p></li></ul></li><li><p><strong>The Drum</strong> - <a href="https://www.thedrum.com/opinion/2024/05/14/gen-z-more-optimistic-about-ai-and-less-worried-about-data-privacy">Gen Z is more optimistic 
about AI &#8211; and less worried about data privacy</a></p><ul><li><p>Could Gen Z's relaxed views on AI and privacy spark a shift in how we all think about and manage our personal data?</p></li></ul></li><li><p><strong>Forbes</strong> - <a href="https://www.forbes.com/sites/forrester/2024/05/15/proposed-federal-privacy-law-gains-momentum/">Proposed Federal Privacy Law Gains Momentum</a></p><ul><li><p>Could this growing support for a federal privacy law be the breakthrough we need to finally safeguard our digital rights?</p></li></ul></li><li><p><strong>The Record</strong> - <a href="https://therecord.media/ftc-connected-cars-data-privacy-geolocation">FTC fires 'shot across the bow' at automakers over connected-car data privacy</a></p><ul><li><p>How will the FTC's tough stance on car data privacy affect the information your vehicle collects about you every day?</p></li></ul></li><li><p><strong>Computer Weekly </strong>- <a href="https://www.computerweekly.com/feature/How-to-manage-data-privacy-versus-the-growing-grab-bag-of-requirements?_gl=1*ahxj0z*_ga*NDc5MjY5ODAyLjE3MDQ3MTA4NDI.*_ga_TQKE4GS5P9*MTcxNTYxMTI1NS42My4wLjE3MTU2MTEyNTUuMC4wLjA.">How to manage data privacy versus the growing grab bag of requirements</a></p><ul><li><p>Can businesses maintain their operational flexibility while adapting to increasingly stringent data privacy requirements?</p></li></ul></li><li><p><strong>CPO</strong> - <a href="https://www.cpomagazine.com/data-protection/maryland-enacts-comprehensive-consumer-privacy-legislation-what-you-need-to-know/">Maryland Enacts Comprehensive Consumer Privacy Legislation: What You Need to Know</a></p><ul><li><p>Will Maryland&#8217;s groundbreaking privacy law inspire a wave of similar legislation across the US?</p></li></ul></li><li><p><strong>AdExchanger</strong> - <a href="https://www.adexchanger.com/data-privacy-roundup/meet-ron-de-jesus-the-first-ever-field-chief-privacy-officer/">Meet Ron De Jesus, The First-Ever &#8216;Field&#8217; Chief 
Privacy Officer</a></p><ul><li><p>How will Ron De Jesus&#8217;s new role as the first 'Field' Chief Privacy Officer change the way companies handle data privacy?</p></li></ul></li><li><p><strong>Politico</strong> - <a href="https://www.politico.com/news/2024/05/18/vermont-data-privacy-law-tech-lobbyists-00158711">Vermont&#8217;s data privacy law sparks state lawmaker alliance against tech lobbyists</a></p><ul><li><p>Could Vermont's bold move inspire a nationwide movement to challenge tech lobbyists and strengthen privacy laws?</p></li></ul></li><li><p><strong>The Record</strong> - <a href="https://therecord.media/grindr-chief-privacy-officer-lgbtq-dating-app-data-policies">Grindr's chief privacy officer on the dating app's data controversies</a></p><ul><li><p>How will Grindr&#8217;s new privacy chief improve data safety and user trust for the LGBTQ+ community?</p></li></ul></li><li><p><strong>TechCrunch</strong> - <a href="https://techcrunch.com/2024/05/21/uk-data-protection-watchdog-ends-privacy-probe-of-snaps-genai-chatbot-but-warns-industry/?guccounter=1&amp;guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&amp;guce_referrer_sig=AQAAANYP1UNp_hmtAa1wSLtoaNs3udSF4J9TiKhTYGPlhHjkmViuBWnOPNl2wtit2WHQaHXHqagvAms1fwAxD5_F8SX0e_Ha2aZKIS3HMfvn1QEn-xGx5JnDWR8qoZr2kzGPhNpIknYPTY4eqVSuknD-B2hRgalKoo1WnJlBHRPFN_YI">UK data protection watchdog ends privacy probe of Snap&#8217;s GenAI chatbot, but warns industry</a></p><ul><li><p>What privacy risks does the UK watchdog see on the horizon for tech companies following the Snap GenAI chatbot investigation?</p></li></ul></li><li><p><strong>Intelligent Insurer</strong> - <a href="https://www.intelligentinsurer.com/privacy-and-data-protection-have-big-impact-on-us-cyber-insurance-market-says-kynd-survey">Privacy and data protection have big impact on US cyber insurance market, says Kynd survey</a></p><ul><li><p>How are privacy and data protection concerns transforming the US cyber insurance market according to the latest Kynd 
survey?</p></li></ul></li><li><p><strong>WH</strong> - <a href="https://www.wilmerhale.com/en/insights/blogs/wilmerhale-privacy-and-cybersecurity-law/20240521-maryland-and-nebraska-adopt-comprehensive-privacy-laws">Maryland and Nebraska Adopt Comprehensive Privacy Laws</a></p><ul><li><p>How will Maryland and Nebraska's new privacy laws, allowing residents to sue for data misuse and demand transparency, impact businesses and consumer rights?</p></li></ul></li><li><p><strong>CEPS</strong> - <a href="https://www.ceps.eu/the-eu-us-data-transfers-and-privacy-quarrel-the-end-is-not-in-sight/">In the EU-US data transfer and privacy quarrel, the end is not in sight</a></p><ul><li><p>What intriguing developments are keeping the EU-US data privacy conflict alive and unresolved?</p></li></ul></li><li><p><strong>Campaign</strong> -&nbsp; <a href="https://www.campaignasia.com/article/a-knife-edge-of-what-is-legal-six-years-on-from-gdpr-marketers-are-still-taki/496275">Data on a knife edge of what is legal</a></p><ul><li><p>Why is GDPR still a challenge for marketers six years on, and what surprising legal issues are they facing?</p></li></ul></li><li><p><strong>Office of the Privacy Commissioner of Canada</strong> - <a href="https://www.priv.gc.ca/en/opc-actions-and-decisions/ar_index/202324/ar_202324/">Trust, innovation, and protecting the fundamental right to privacy in the digital age</a></p><ul><li><p>What intriguing revelations did Canada&#8217;s Privacy Commissioner uncover in the latest report on data privacy?</p></li></ul></li></ul><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/ethical-tech-news-roundup-b78?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" 
href="https://news.ethicaltechproject.org/p/ethical-tech-news-roundup-b78?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><h3><strong>Tech + Big Business</strong></h3><ul><li><p><strong>NYTimes</strong> - <a href="https://www.nytimes.com/2024/06/17/health/surgeon-general-social-media-warning-label.html">Surgeon General Calls for Warning Labels on Social Media Platforms</a></p><ul><li><p>Dr. Vivek Murthy said he would urge Congress to require a warning that social media use can harm teenagers&#8217; mental health. </p></li></ul></li><li><p><strong>GizChina</strong> - <a href="https://www.gizchina.com/2024/05/20/canadian-intelligence-sounds-the-alarm-on-tiktoks-data-practices/">CANADIAN INTELLIGENCE SOUNDS THE ALARM ON TIKTOK&#8217;S DATA PRACTICES</a></p><ul><li><p>What actions will Canadian authorities take to protect user data from potential exploitation by foreign entities through TikTok?</p></li></ul></li><li><p><strong>Tech Monitor</strong> - <a href="https://techmonitor.ai/technology/data/meta-privacy-policy-train-ai-gdpr">Meta training AI products through user data breaks European law, claim activists</a></p><ul><li><p>What surprising changes in Meta&#8217;s new privacy policy are shaking up the AI landscape under GDPR?</p></li></ul></li><li><p><strong>Spiceworks</strong> - <a href="https://www.spiceworks.com/it-security/data-security/news/wall-street-data-security-regulations-updated-us-sec/">Wall Street Data Security Regulations Updated by US SEC</a></p><ul><li><p>How will the SEC&#8217;s revamped data security rules impact Wall Street's ability to defend against sophisticated cyber attacks?</p></li></ul></li><li><p><strong>Global News</strong> - <a href="https://globalnews.ca/news/10532082/temu-app-privacy-class-action-lawsuits/">Temu faces class action&nbsp;</a></p><ul><li><p>What intriguing details are emerging from the privacy lawsuits sparked by the Temu 
app?</p></li></ul></li><li><p><strong>Digiday</strong> - <a href="https://digiday.com/marketing/wtf-is-a-financial-media-network/?utm_campaign=digidaydis&amp;utm_source=linkedin&amp;utm_medium=social&amp;utm_content=60424">WTF is a financial media network?</a></p><ul><li><p>Why are financial firms creating their own ad networks, and how will consumers be affected?</p></li></ul></li><li><p><strong>Digiday</strong> - <a href="https://digiday.com/marketing/what-it-will-take-for-advertisers-to-finally-get-ready-to-let-go-of-the-third-party-cookie/">What it will take for advertisers to finally get ready to let go of the third-party cookie</a></p><ul><li><p>What innovative solutions are advertisers discovering as they adapt to a world without third-party cookies, and what does it mean for privacy?</p></li></ul></li></ul><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/subscribe?"><span>Subscribe now</span></a></p><h3><strong>Health Data</strong></h3><ul><li><p><strong>The Conversation</strong> - <a href="https://theconversation.com/wearable-devices-can-now-harvest-our-brain-data-australia-needs-urgent-privacy-reforms-229006">Wearable devices can now harvest our brain data. 
Australia needs urgent privacy reforms</a></p><ul><li><p>What urgent actions should Australia undertake to safeguard our neurological privacy as wearable technology advances?</p></li></ul></li><li><p><strong>The Guardian</strong> - <a href="https://www.theguardian.com/australia-news/article/2024/may/17/medisecure-data-breach-australia-healthcare-prescriptions-impact">MediSecure data breach: cyber security chief says no current prescriptions affected</a></p><ul><li><p>How did the MediSecure data breach manage to compromise healthcare information without affecting current prescriptions?</p></li></ul></li><li><p><strong>FedScoop</strong> - <a href="https://fedscoop.com/video/fdas-thomas-beach-on-the-benefits-of-strong-data-governance/">FDA's Thomas Beach on the Benefits of Strong Data Governance</a></p><ul><li><p>What surprising benefits has Thomas Beach uncovered in revolutionizing data governance at the FDA?</p></li></ul></li><li><p><strong>TechRadar Pro</strong> - <a href="https://www.techradar.com/pro/security/millions-of-us-customers-have-social-security-numbers-stolen-in-major-sav-rx-data-breach">Millions of US customers have social security numbers stolen in major Sav-Rx data breach</a></p><ul><li><p>What surprising vulnerabilities allowed millions of social security numbers to be stolen in the Sav-Rx data breach?</p></li></ul></li></ul><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/ethical-tech-news-roundup-b78?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thank you for reading The Ethical Tech Project. 
This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/ethical-tech-news-roundup-b78?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/ethical-tech-news-roundup-b78?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><div><hr></div><h2><a href="https://news.ethicaltechproject.com/podcast">Conversations in Ethical Tech</a></h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://news.ethicaltechproject.com/p/what-are-dark-patterns" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png" width="428" height="240.75" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:428,&quot;bytes&quot;:847505,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:&quot;&quot;,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:&quot;https://news.ethicaltechproject.com/p/what-are-dark-patterns&quot;,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div 
class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>In this episode, JJ and Maritza explore the concept of <strong>dark patterns</strong>, meaning deceptive design practices that manipulate users into taking actions they didn't intend to.</p><ul><li><p>How does the choice architecture of the tech products you use impact your personal privacy and data protection? 
</p></li><li><p>How can good product design and business outcomes be balanced against avoiding deceptive practices?</p></li><li><p> Learn a few common-sense suggestions that you can take as a developer, leader, or consumer, including providing feedback to companies, filing complaints with regulatory bodies, and raising awareness about dark patterns.</p></li><li><p><strong><a href="https://news.ethicaltechproject.com/p/what-are-dark-patterns">Listen Now!</a></strong></p></li></ul><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.com/p/what-are-dark-patterns&quot;,&quot;text&quot;:&quot;Listen to the Latest Podcast&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://news.ethicaltechproject.com/p/what-are-dark-patterns"><span>Listen to the Latest Podcast</span></a></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9SsR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Facd9e155-8f6f-4998-af19-399b4dd2aedd_6000x4000.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!9SsR!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Facd9e155-8f6f-4998-af19-399b4dd2aedd_6000x4000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!9SsR!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Facd9e155-8f6f-4998-af19-399b4dd2aedd_6000x4000.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!9SsR!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Facd9e155-8f6f-4998-af19-399b4dd2aedd_6000x4000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!9SsR!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Facd9e155-8f6f-4998-af19-399b4dd2aedd_6000x4000.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9SsR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Facd9e155-8f6f-4998-af19-399b4dd2aedd_6000x4000.jpeg" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/acd9e155-8f6f-4998-af19-399b4dd2aedd_6000x4000.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:641942,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!9SsR!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Facd9e155-8f6f-4998-af19-399b4dd2aedd_6000x4000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!9SsR!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Facd9e155-8f6f-4998-af19-399b4dd2aedd_6000x4000.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!9SsR!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Facd9e155-8f6f-4998-af19-399b4dd2aedd_6000x4000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!9SsR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Facd9e155-8f6f-4998-af19-399b4dd2aedd_6000x4000.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by <a 
href="https://unsplash.com/@filisantillan?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Fili Santill&#225;n</a> on <a href="https://unsplash.com/photos/turned-on-laptop-computer-HeyFNqApSLQ?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Unsplash</a></figcaption></figure></div>]]></content:encoded></item><item><title><![CDATA[What Are 'Dark Patterns'?]]></title><description><![CDATA[Conversations in Ethical Tech, Episode 3]]></description><link>https://news.ethicaltechproject.org/p/what-are-dark-patterns</link><guid isPermaLink="false">https://news.ethicaltechproject.org/p/what-are-dark-patterns</guid><dc:creator><![CDATA[The Ethical Tech Project]]></dc:creator><pubDate>Fri, 14 Jun 2024 14:13:28 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/145641637/3260642942875bd476cf21991b65d2bb.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<p>Join Ethical Tech Project Board Member <a href="https://www.linkedin.com/in/maritzaj/">Dr. Maritza Johnson</a> and Board Advisor <a href="https://www.linkedin.com/in/jonathanjoseph1/">Jonathan Joseph</a> (AKA &#8216;JJ&#8217;) for &#8220;<a href="https://news.ethicaltechproject.com/p/the-impact-of-a-federal-privacy-law">Conversations in Ethical Tech!</a>&#8221;</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Tech Project! 
Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>In this episode, JJ and Maritza explore the concept of <strong>dark patterns</strong>, meaning deceptive design practices that manipulate users into taking actions they didn't intend to.</p><p>How does the choice architecture of the tech products you use impact your personal privacy and data protection? Find out more about hidden privacy policies, misleading consent buttons, and nagging prompts. </p><p>How can good product design and business outcomes be balanced against avoiding deceptive practices? Learn a few common-sense suggestions that you can take as a developer, leader, or consumer, including providing feedback to companies, filing complaints with regulatory bodies, and raising awareness about dark patterns.</p><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/what-are-dark-patterns?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thank you for reading The Ethical Tech Project. 
This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/what-are-dark-patterns?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/what-are-dark-patterns?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Q8wG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Q8wG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!Q8wG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd470206b-fff9-493a-8550-cf8d51cc7313_2048x1152.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset 
pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div><hr></div><h2><strong>What We&#8217;re Reading on Ethical Tech This Week</strong></h2><ul><li><p><strong>The Problem Is&#8230;</strong> - <a href="https://www.problem-is.com/p/what-you-need-to-know-about-the-ai">What You Need to Know About the AI Alignment Problem</a></p></li><li><p><strong>The Ethical Tech Project</strong> - <a href="https://news.ethicaltechproject.com/p/is-programmatic-advertising-a-threat">Is Programmatic Advertising a Threat to Democracy?</a></p></li><li><p><strong>Digiday</strong> - <a href="https://digiday.com/marketing/wtf-is-a-financial-media-network/?utm_campaign=digidaydis&amp;utm_source=linkedin&amp;utm_medium=social&amp;utm_content=60424">WTF is a financial media network?</a></p></li><li><p><strong>Tech Monitor</strong> - <a 
href="https://techmonitor.ai/technology/data/meta-privacy-policy-train-ai-gdpr">Meta training AI products through user data breaks European law, claim activists</a></p></li><li><p><strong>The Record</strong> - <a href="https://therecord.media/ben-wiseman-interview-ftc-data-privacy">A top FTC official on the consumer privacy message the agency is sending to industry</a></p></li><li><p><strong>Digiday</strong> - <a href="https://digiday.com/marketing/what-it-will-take-for-advertisers-to-finally-get-ready-to-let-go-of-the-third-party-cookie/">What it will take for advertisers to finally get ready to let go of the third-party cookie</a></p></li></ul><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/what-are-dark-patterns?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thank you for reading The Ethical Tech Project. This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://news.ethicaltechproject.org/p/what-are-dark-patterns?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://news.ethicaltechproject.org/p/what-are-dark-patterns?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><p></p>]]></content:encoded></item></channel></rss>