In April 2022, Apple came under scrutiny when the Photos app was discovered to have blocked photos taken at sites related to the Holocaust. A team of journalists at the tech website 9to5Mac was analyzing an iPhone beta update when it found a list of 12 "sensitive locations" set to be excluded from Memories.
These locations included the Auschwitz-Birkenau concentration camp and the Berlin Holocaust Memorial. 9to5Mac described the block as an attempt to avoid the creation of "unwanted memories."
My recent study examined the comment section of the original 9to5Mac article to gauge how iPhone users and readers reacted to this news. The response illuminated a host of concerns about the ways Apple scans users' personal photographs for the automated creation of Memories.
User frustration
Memories is a feature in the Apple Photos app that automatically curates content to create personalized slideshows. Using artificial intelligence, Memories claims to identify the important people, places and events in the Photos app libraries of its users.
Many iPhone users store thousands of photographs on their devices and often even more on Apple's iCloud service. Managing this unprecedented number of personal photographs is no small task, and we increasingly rely on automated tools to help us revisit and reflect upon important moments in our lives.
News of the block raised concerns about how our interactions with the past are being shaped by the technologies we employ to remember.
The block was based on geotags, a form of metadata that identifies the geographic location where a photograph was taken. The journalists also noted that the list of blocked locations could change in future iOS updates.
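Apple has not disclosed how the filter is implemented, but a geotag-based block is conceptually simple. The sketch below, written in Swift, illustrates one way such a check could work; the coordinates, the one-kilometre exclusion radius and the function names are assumptions for illustration, not Apple's actual code.

```swift
import CoreLocation

// Hypothetical list of "sensitive locations" (coordinates are approximate;
// the radius is an assumption, since Apple has not published how its filter works).
let blockedLocations = [
    CLLocation(latitude: 50.0359, longitude: 19.1783), // Auschwitz-Birkenau
    CLLocation(latitude: 52.5139, longitude: 13.3789)  // Berlin Holocaust Memorial
]
let exclusionRadiusMeters: CLLocationDistance = 1_000

// Returns true if a photo's geotag falls within the exclusion radius of any
// blocked location, meaning it would be skipped when assembling a Memory.
func isExcludedFromMemories(photoGeotag: CLLocation) -> Bool {
    blockedLocations.contains { blocked in
        photoGeotag.distance(from: blocked) <= exclusionRadiusMeters
    }
}

// Example: a photo geotagged at the Berlin Holocaust Memorial would be filtered out.
let photo = CLLocation(latitude: 52.5140, longitude: 13.3786)
print(isExcludedFromMemories(photoGeotag: photo)) // true
```

Even a radius check this simple shows how a short list of coordinates can quietly filter an entire category of personal photographs out of an automated feature.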
iPhone users took to the comment section of the 9to5Mac article to voice their concerns, and the discussion revealed growing skepticism, alarm and dissatisfaction.
Some users voiced their frustration with the lack of customization in the Memories feature, while others equated the move with outright censorship. Many took the opportunity to speculate on what other types of content Apple may attempt to block from appearing in Memories down the line.
Given the gravity of the history targeted by this measure, some commenters considered the potential stakes of developments like these. A comment by user Paul Martin reflected on the potential impact of forgetting: "Let's not sanitize the world and forget our past or we risk repeating our mistakes."
Memory management
The move was generally seen as an overstep by Apple, which has long faced criticism for its "closed world" approach to hardware and software development. Because Apple limits access to the internal workings of its products and services, many users felt increasingly restricted by its offerings.
Kojack, another commenter on the 9to5Mac article, noted that "Apple has been doing this for years with all of their products. They want you to use it how THEY see fit."
Memories slideshows are made using machine learning, a field of artificial intelligence that uses data and algorithms in an attempt to imitate the ways humans think. Though this technology is remarkably adept at certain tasks, it can also struggle with more nuanced forms of assessment. When curating personal photographs, an algorithm may fail to recognize the aspects of an image that imbue it with meaning.
A personal connection to a photograph is not easily represented through metadata. Our feelings about certain photographs may also change throughout our lives, and this cannot be adequately reflected by a one-size-fits-all algorithm.
Should we really expect a technology like Apple Memories to identify the important photographs we wish to remember and exclude those we would rather forget?
Preventing unwelcome encounters with potentially triggering or offensive content can be seen as a form of censorship. But debates in this space usually center on restricting access to content produced by others. This development forces us to consider a scenario where we may be blocked from accessing content that we ourselves have produced.
Controlling access
Though this stop-gap solution may succeed in preventing upsetting or inappropriate representations of the Holocaust from appearing in the Memories feature, it raises concerns surrounding Big Tech's reach into our personal lives and experiences.
In the wrong hands, automated systems like Memories could identify and limit access to photographs of all kinds. For example, a similar measure could one day block users from accessing photographic evidence of violence or wrongdoing.
Because the Memories feature is included in Apple's Photos app, it has a far-reaching ability to shape how users interact with their personal photographs. This means it can influence not only what content users are encouraged to remember, but when and how they are invited to remember it.
Apple's ability and willingness to omit this important history warrants serious consideration, and highlights the need for greater oversight and regulation of tech companies.
This article is republished from The Conversation under a Creative Commons license. Read the original article.