{"id":35134,"date":"2023-08-03T08:00:42","date_gmt":"2023-08-03T07:00:42","guid":{"rendered":"https:\/\/www.redpepper.org.uk\/?p=35134"},"modified":"2023-09-28T18:12:11","modified_gmt":"2023-09-28T17:12:11","slug":"artificial-intelligence-impact-regulation-sensationalism","status":"publish","type":"post","link":"https:\/\/www.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/","title":{"rendered":"Against AI sensationalism"},"content":{"rendered":"\n<p class=\"drop-cap-paragraph\">In case you missed it: artificial intelligence (AI) will make teachers <a href=\"https:\/\/www.theguardian.com\/technology\/2023\/jul\/07\/ai-likely-to-spell-end-of-traditional-school-classroom-leading-expert-says\">redundant<\/a>, <a href=\"https:\/\/www.economist.com\/by-invitation\/2022\/09\/02\/artificial-neural-networks-are-making-strides-towards-consciousness-according-to-blaise-aguera-y-arcas\">become sentient<\/a>, and soon, <a href=\"https:\/\/www.nytimes.com\/2023\/05\/30\/technology\/ai-threat-warning.html\">wipe out humanity<\/a> as we know it. From <a href=\"https:\/\/decrypt.co\/142159\/elon-musk-warns-ai-could-disarm-humanity-achieve-world-peace\">Elon Musk<\/a>, to the godfather of AI, <a href=\"https:\/\/www.theguardian.com\/technology\/2023\/may\/02\/geoffrey-hinton-godfather-of-ai-quits-google-warns-dangers-of-machine-learning\">Geoffrey Hinton<\/a>, to Rishi Sunak\u2019s<a href=\"https:\/\/www.independent.co.uk\/news\/uk\/politics\/ai-artificial-intelligence-kill-humans-sunak-b2352099.html\"> AI advisor<\/a>, industry leaders and experts everywhere are warning about AI\u2019s mortal threat to our existence as a species.<\/p>\n\n\n\n<p>They are right about one thing: AI can be harmful. Facial recognition systems are already being used to <a href=\"https:\/\/www.bbc.co.uk\/news\/uk-england-northamptonshire-66120010\">prohibit possible protestors exercising fundamental rights<\/a>. 
Automated fraud detectors are falsely <a href=\"https:\/\/news.sky.com\/story\/thousands-face-incorrect-benefit-cuts-from-automated-fraud-detector-11651031\">cutting off thousands of people from much-needed welfare payments<\/a> and surveillance tools are <a href=\"https:\/\/www.workerinfoexchange.org\/wie-report-managed-by-bots\">being used in the workplace<\/a> to<a href=\"https:\/\/www.theguardian.com\/technology\/2018\/jan\/31\/amazon-warehouse-wristband-tracking\"> monitor workers\u2019 productivity<\/a>.<\/p>\n\n\n\n<p>Many of us might be shielded from the worst harms of AI. Wealth, social privilege or proximity to whiteness and capital mean that many are less likely to fall prey to tools of societal control and surveillance. As Virginia Eubanks <a href=\"https:\/\/us.macmillan.com\/books\/9781250074317\/automatinginequality\">puts it<\/a> \u2018many of us in the professional middle class only brush against [the invisible spider web of AI] briefly\u2026 We may have to pause a moment to extricate ourselves from its gummy grasp, but its impacts don\u2019t linger.\u2019<\/p>\n\n\n\n<p class=\"has-text-align-center rp-full-width rp-quote has-grey-color has-pale-1-background-color has-text-color has-background has-antonio-font-family\" style=\"padding-top:2%;padding-right:2%;padding-bottom:2%;padding-left:2%;font-size:clamp(1.743rem, 1.743rem + ((1vw - 0.2rem) * 1.571), 3rem);\">These systems look and move suspiciously like the government, the police and the very private sector tech leaders that claim to be the ones protecting us from harm<\/p>\n\n\n\n<p>By contrast, it is well established that the worst harms of government decisions already fall hardest on those most marginalised. Let&#8217;s take the example of drugs policing and the disproportionate impact on communities of colour. 
Though the evidence shows that Black people use drugs <a href=\"https:\/\/www.release.org.uk\/sites\/default\/files\/pdf\/publications\/Release%20-%20Race%20Disparity%20Report%20final%20version.pdf\">no more<\/a>, and possibly less, than white people, the police direct efforts to identify drug-related crimes towards communities of colour. As a consequence, the data then shows that communities of colour are more likely to be \u2018hotspots\u2019 for drugs. In this way, policing efforts to \u2018identify\u2019 the problem create a problem in the eyes of the system, and <a href=\"https:\/\/www.libertyhumanrights.org.uk\/wp-content\/uploads\/2023\/04\/HoldingOurOwn_Digital-DoubleSpreads.pdf\">the cycle of overpolicing<\/a> continues. When you automate such processes, as with predictive policing tools based on racist and classist criminal justice data, these biases are <a href=\"https:\/\/www.libertyhumanrights.org.uk\/issue\/policing-by-machine\/\">further entrenched<\/a>.<\/p>\n\n\n\n<p>Given that these systems inevitably target marginalised groups, it is difficult not to posit that governments deploy algorithms intentionally as social control mechanisms. As Sarah Valentine <a href=\"https:\/\/core.ac.uk\/reader\/216959440\" target=\"_blank\" rel=\"noopener\">writes<\/a>, \u2018algorithmic decision-making increases the state\u2019s capacity to dominate vulnerable communities by making it almost impossible to challenge system errors [and] it allows governments to hide these negative effects behind the veneer of technological infallibility.\u2019 In our drugs policing example above, the algorithm runs off data that is race-and-class blind and subsequently correlates poor people of colour with substance use. 
Anybody who has experienced police abuse or neglect at the hands of the state will be able to attest to how difficult it is to hold state institutions accountable as things currently stand, let alone an opaque algorithmic system built years prior.<\/p>\n\n\n\n<p>But you\u2019ll be hard-pressed to find this kind of analysis in the numerous articles sensationalising the existential threat of AI. Not only are AI and algorithmic decision-making already negatively impacting people, but these systems look and move suspiciously like the government, the police and the very private sector tech leaders that claim to be the ones protecting us from harm. The reality is that they\u2019re often the ones perpetrating the violence.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Regulatory loopholes<\/h2>\n\n\n\n<p>The UK government\u2019s <a href=\"https:\/\/www.gov.uk\/government\/publications\/ai-regulation-a-pro-innovation-approach\/white-paper#annexc\">AI regulation white paper<\/a> was supposed to respond to the gaps in the current legal framework governing AI. Yet it has already got off on the wrong foot, placing an inherently capitalist aim of AI \u2018innovation\u2019 at the heart of its agenda, a move that will be music to the ears of the very industries that are at the centre of the death-by-AI-robot debate. We call this \u2018AI ethics washing\u2019: the industry resisting statutory regulation while simultaneously advocating for its own, industry-created and managed, non-statutory checks and balances as proof that it is taking these issues seriously. 
For examples of this practice, just take a look at the plethora of \u2018ethical codes of conduct\u2019 springing up, steered by the potentially dangerous assumption that industry can voluntarily regulate itself.<\/p>\n\n\n\n<p>As AI governance scholars Karen Yeung, Andrew Howes and Ganna Pogrebna have acutely <a href=\"https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=3435011\">described it<\/a>: \u2018More conventional regulation, involving legally mandated regulatory standards and enforcement mechanisms, are swiftly met by protest from industry that \u2018regulation stifles innovation.\u2019 These protests assume that innovation is an unvarnished and unmitigated good, an unexamined belief that has resulted in technological innovation entrenching itself as the altar upon which cash-strapped contemporary governments worship, na\u00efvely hoping that digital innovation will create jobs [and] stimulate growth.\u2019<\/p>\n\n\n\n<p>And as\u00a0<em>Liberty<\/em> and other human rights and tech and data justice organisations <a href=\"https:\/\/publiclawproject.org.uk\/content\/uploads\/2023\/06\/AI-alternative-white-paper-in-template.pdf\">have argued<\/a>, the proposals within the government\u2019s AI regulatory proposals fail to ensure that adequate safeguards and standards are in place for use of AI, particularly by public bodies. Meanwhile, the government is weakening existing standards and stripping back protections elsewhere, from <a href=\"https:\/\/publications.parliament.uk\/pa\/cm5803\/cmpublic\/DataProtectionDigitalInformation\/memo\/DPDIB06.htm\" target=\"_blank\" rel=\"noopener\">data protection<\/a> to <a href=\"https:\/\/www.thetimes.co.uk\/article\/rishi-sunak-threat-pull-uk-out-of-echr-immigration-cnrqltpjp\" target=\"_blank\" rel=\"noopener\">human rights<\/a>. 
That is to say nothing of the government&#8217;s wider agenda to\u00a0make the United Kingdom an unsafe place for <a href=\"https:\/\/www.jcwi.org.uk\/illegal-migration-bill-2023-briefing#:~:text=The%20Bill%20stops%20survivors%20of,the%2030%2Dday%20reflection%20period.\">migrants<\/a>, those <a href=\"https:\/\/images.bpas-campaigns.org\/wp-content\/uploads\/2023\/06\/19095347\/Reforming-Abortion-Law-Position-statement.pdf\">seeking abortion<\/a><a href=\"https:\/\/www.amnesty.org.uk\/press-releases\/uk-un-view-trans-rights-much-needed-common-sense\">, trans people<\/a>, or <a href=\"https:\/\/www.libertyhumanrights.org.uk\/wp-content\/uploads\/2023\/01\/Libertys-briefing-on-the-Strikes-Minimum-Service-Levels-Bill-for-Second-Reading-in-the-House-of-Commons-Jan-2023.pdf\">workers<\/a>, to name just a few.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">What can be done?<\/h2>\n\n\n\n<p>First and foremost, we must demand strong AI regulation, based on human rights and principles of anti-oppression. That includes demanding that transparency be mandatory, that there are clear mechanisms for accountability, that the public are consulted on new tools before they are deployed by the government, that existing regulators have the expertise, funding and capacity to enforce a regulatory regime (or that a specialist regulator is created), and that people can seek redress when things go wrong. Certain AI that threatens fundamental rights, like facial recognition, should also be prohibited outright \u2013 something that the government has so far explicitly refused to do but is common elsewhere.<\/p>\n\n\n\n<p>Secondly, organise, organise and organise. 
Whether you\u2019re a <a href=\"https:\/\/www.redpepper.org.uk\/the-tech-fightback-an-interview-with-united-tech-and-allied-workers-utaw\/\">tech worker<\/a> (who can demand strong whistleblower protection policies), or a public sector worker (who can refuse to deploy algorithmic decisions), many of us have a role to play in resisting the oppressive use of AI tools. You can find out where (some) AI is being used by public bodies by looking at the Public Law Project\u2019s <a href=\"https:\/\/trackautomatedgovernment.shinyapps.io\/register\/\">Tracking Automated Government (TAG) register<\/a>. But even if you don\u2019t think you\u2019re being directly harmed, that does not mean you shouldn\u2019t be concerned. We must organise loudly alongside those who can\u2019t be so vocal.<\/p>\n\n\n\n<p>Finally, remember that you don\u2019t need to be an expert (whatever that means) to have a voice in this conversation; often it\u2019s those of us who aren\u2019t experts who are best placed to contribute to the debate. After all, AI does not arise in a vacuum, but rather is a vehicle to institute wider policy aims. That also means we cannot simply delegate responsibility to the AI engineers. As the co-founders of the AI Now Institute Meredith Whittaker and Kate Crawford explore in this fantastic <a href=\"https:\/\/www.vox.com\/podcasts\/2019\/4\/8\/18299736\/artificial-intelligence-ai-meredith-whittaker-kate-crawford-kara-swisher-decode-podcast-interview\">podcast episode<\/a>, \u2018a lot of these questions aren\u2019t AI questions! 
They aren\u2019t about: \u201care you using a deep neural net to do this,\u201d it\u2019s about: \u201cunder what policy is this implemented.\u201d [So] in the healthcare domain you would want doctors, you would want nurses unions, you would want people that understand the arcane workings of the US insurance system [\u2026] at the table, on equal footing with AI experts to actually create the practices to verify and ensure that these systems are safe and beneficial.\u2019<\/p>\n\n\n\n<p>Yes, AI is scary, but it is ultimately a reflection of our current system, and it is the introduction of AI into an already repressive environment that we must question. Our movements for justice and liberation must be continually fought for, and hard. Do not let the fearmongering around AI distract you or drive you to despair; may it mobilise you.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Emmanuelle Andrews argues that yes, AI is scary, but these systems can and must be regulated to provide greater public security and purpose<\/p>\n","protected":false},"author":29,"featured_media":38219,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[244,295],"tags":[2576],"class_list":["post-35134","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-race-racism","category-technology","tag-emmanuelle-andrews"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.1.1 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Against AI sensationalism - Red Pepper<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/\" \/>\n<meta property=\"og:locale\" content=\"en_GB\" \/>\n<meta property=\"og:type\" 
content=\"article\" \/>\n<meta property=\"og:title\" content=\"Against AI sensationalism - Red Pepper\" \/>\n<meta property=\"og:description\" content=\"Emmanuelle Andrews argues that yes, AI is scary, but these systems can and must be regulated to provide greater public security and purpose\" \/>\n<meta property=\"og:url\" content=\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/\" \/>\n<meta property=\"og:site_name\" content=\"Red Pepper\" \/>\n<meta property=\"article:published_time\" content=\"2023-08-03T07:00:42+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2023-09-28T17:12:11+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/rpdev2023.redpepper.org.uk\/wp-content\/uploads\/2023\/08\/Artificial-Intelligence.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1200\" \/>\n\t<meta property=\"og:image:height\" content=\"600\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Liam Kennedy\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Liam Kennedy\" \/>\n\t<meta name=\"twitter:label2\" content=\"Estimated reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"6 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/\"},\"author\":{\"name\":\"Liam Kennedy\",\"@id\":\"https:\/\/www.redpepper.org.uk\/#\/schema\/person\/d391b8ef0753a453a698266884d32a42\"},\"headline\":\"Against AI 
sensationalism\",\"datePublished\":\"2023-08-03T07:00:42+00:00\",\"dateModified\":\"2023-09-28T17:12:11+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/\"},\"wordCount\":1304,\"image\":{\"@id\":\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.redpepper.org.uk\/wp-content\/uploads\/2023\/08\/Artificial-Intelligence.jpg\",\"keywords\":[\"Emmanuelle Andrews\"],\"articleSection\":[\"Race and racism\",\"Technology\"],\"inLanguage\":\"en-GB\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/\",\"url\":\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/\",\"name\":\"Against AI sensationalism - Red 
Pepper\",\"isPartOf\":{\"@id\":\"https:\/\/www.redpepper.org.uk\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.redpepper.org.uk\/wp-content\/uploads\/2023\/08\/Artificial-Intelligence.jpg\",\"datePublished\":\"2023-08-03T07:00:42+00:00\",\"dateModified\":\"2023-09-28T17:12:11+00:00\",\"author\":{\"@id\":\"https:\/\/www.redpepper.org.uk\/#\/schema\/person\/d391b8ef0753a453a698266884d32a42\"},\"breadcrumb\":{\"@id\":\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#breadcrumb\"},\"inLanguage\":\"en-GB\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-GB\",\"@id\":\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#primaryimage\",\"url\":\"https:\/\/www.redpepper.org.uk\/wp-content\/uploads\/2023\/08\/Artificial-Intelligence.jpg\",\"contentUrl\":\"https:\/\/www.redpepper.org.uk\/wp-content\/uploads\/2023\/08\/Artificial-Intelligence.jpg\",\"width\":1200,\"height\":600,\"caption\":\"CREDIT: GERALT VIA WIKIMEDIA 
COMMONS\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.redpepper.org.uk\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Culture\",\"item\":\"https:\/\/www.redpepper.org.uk\/culture-media\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Technology\",\"item\":\"https:\/\/www.redpepper.org.uk\/culture-media\/technology\/\"},{\"@type\":\"ListItem\",\"position\":4,\"name\":\"Against AI sensationalism\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.redpepper.org.uk\/#website\",\"url\":\"https:\/\/www.redpepper.org.uk\/\",\"name\":\"Red Pepper\",\"description\":\"\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.redpepper.org.uk\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-GB\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.redpepper.org.uk\/#\/schema\/person\/d391b8ef0753a453a698266884d32a42\",\"name\":\"Liam Kennedy\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Against AI sensationalism - Red Pepper","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/","og_locale":"en_GB","og_type":"article","og_title":"Against AI sensationalism - Red Pepper","og_description":"Emmanuelle Andrews argues that yes, AI is scary, but these systems can and must be regulated to provide greater public security and purpose","og_url":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/","og_site_name":"Red Pepper","article_published_time":"2023-08-03T07:00:42+00:00","article_modified_time":"2023-09-28T17:12:11+00:00","og_image":[{"width":1200,"height":600,"url":"https:\/\/rpdev2023.redpepper.org.uk\/wp-content\/uploads\/2023\/08\/Artificial-Intelligence.jpg","type":"image\/jpeg"}],"author":"Liam Kennedy","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Liam Kennedy","Estimated reading time":"6 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#article","isPartOf":{"@id":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/"},"author":{"name":"Liam Kennedy","@id":"https:\/\/www.redpepper.org.uk\/#\/schema\/person\/d391b8ef0753a453a698266884d32a42"},"headline":"Against AI 
sensationalism","datePublished":"2023-08-03T07:00:42+00:00","dateModified":"2023-09-28T17:12:11+00:00","mainEntityOfPage":{"@id":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/"},"wordCount":1304,"image":{"@id":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#primaryimage"},"thumbnailUrl":"https:\/\/www.redpepper.org.uk\/wp-content\/uploads\/2023\/08\/Artificial-Intelligence.jpg","keywords":["Emmanuelle Andrews"],"articleSection":["Race and racism","Technology"],"inLanguage":"en-GB"},{"@type":"WebPage","@id":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/","url":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/","name":"Against AI sensationalism - Red Pepper","isPartOf":{"@id":"https:\/\/www.redpepper.org.uk\/#website"},"primaryImageOfPage":{"@id":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#primaryimage"},"image":{"@id":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#primaryimage"},"thumbnailUrl":"https:\/\/www.redpepper.org.uk\/wp-content\/uploads\/2023\/08\/Artificial-Intelligence.jpg","datePublished":"2023-08-03T07:00:42+00:00","dateModified":"2023-09-28T17:12:11+00:00","author":{"@id":"https:\/\/www.redpepper.org.uk\/#\/schema\/person\/d391b8ef0753a453a698266884d32a42"},"breadcrumb":{"@id":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#breadcrumb"},"inLanguage":"en-GB","potentialAction":[{"@type":"ReadAction","target":["https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensat
ionalism\/"]}]},{"@type":"ImageObject","inLanguage":"en-GB","@id":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#primaryimage","url":"https:\/\/www.redpepper.org.uk\/wp-content\/uploads\/2023\/08\/Artificial-Intelligence.jpg","contentUrl":"https:\/\/www.redpepper.org.uk\/wp-content\/uploads\/2023\/08\/Artificial-Intelligence.jpg","width":1200,"height":600,"caption":"CREDIT: GERALT VIA WIKIMEDIA COMMONS"},{"@type":"BreadcrumbList","@id":"https:\/\/rpdev2023.redpepper.org.uk\/culture-media\/technology\/artificial-intelligence-impact-regulation-sensationalism\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.redpepper.org.uk\/"},{"@type":"ListItem","position":2,"name":"Culture","item":"https:\/\/www.redpepper.org.uk\/culture-media\/"},{"@type":"ListItem","position":3,"name":"Technology","item":"https:\/\/www.redpepper.org.uk\/culture-media\/technology\/"},{"@type":"ListItem","position":4,"name":"Against AI sensationalism"}]},{"@type":"WebSite","@id":"https:\/\/www.redpepper.org.uk\/#website","url":"https:\/\/www.redpepper.org.uk\/","name":"Red Pepper","description":"","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.redpepper.org.uk\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-GB"},{"@type":"Person","@id":"https:\/\/www.redpepper.org.uk\/#\/schema\/person\/d391b8ef0753a453a698266884d32a42","name":"Liam Kennedy"}]}},"type_of_article":[{"ID":35474,"post_title":"Feature","post_content":"","post_excerpt":"","post_author":"50","post_date":"2023-09-14 20:08:05","post_date_gmt":"2023-09-14 19:08:05","post_status":"publish","comment_status":"closed","ping_status":"closed","post_password":"","post_name":"feature","to_ping":"","pinged":"","post_modified":"2023-09-14 
20:08:05","post_modified_gmt":"2023-09-14 19:08:05","post_content_filtered":"","post_parent":0,"guid":"https:\/\/www.redpepper.org.uk\/?post_type=type_of_article&#038;p=35474","menu_order":0,"post_type":"type_of_article","post_mime_type":"","comment_count":"0","comments":false,"id":35474}],"_links":{"self":[{"href":"https:\/\/www.redpepper.org.uk\/wp-json\/wp\/v2\/posts\/35134","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.redpepper.org.uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.redpepper.org.uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.redpepper.org.uk\/wp-json\/wp\/v2\/users\/29"}],"replies":[{"embeddable":true,"href":"https:\/\/www.redpepper.org.uk\/wp-json\/wp\/v2\/comments?post=35134"}],"version-history":[{"count":4,"href":"https:\/\/www.redpepper.org.uk\/wp-json\/wp\/v2\/posts\/35134\/revisions"}],"predecessor-version":[{"id":38610,"href":"https:\/\/www.redpepper.org.uk\/wp-json\/wp\/v2\/posts\/35134\/revisions\/38610"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.redpepper.org.uk\/wp-json\/wp\/v2\/media\/38219"}],"wp:attachment":[{"href":"https:\/\/www.redpepper.org.uk\/wp-json\/wp\/v2\/media?parent=35134"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.redpepper.org.uk\/wp-json\/wp\/v2\/categories?post=35134"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.redpepper.org.uk\/wp-json\/wp\/v2\/tags?post=35134"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}