For years, I believed that flagging photographs was a waste of time because they are rarely removed.
Google recently introduced the “Reports” list under Other contributions in the apps.
This has made it possible to check if our flags get approved or rejected.
Additionally, the European Union has required platforms to allow users to appeal when their contributions are denied or removed. As a result, users in the EU can now file an appeal when a flag on a photo is rejected.
So I felt it was time to give flagging photos another chance and investigate whether it is now worth the effort. I decided to run a small but systematic test by flagging 40 photos on Google Maps.
I chose a very distinctive sculpture, so it is easy to determine whether a photo shows it or not. The sculpture, named “He” (or “Him”), was created in 2012 by Elmgreen and Dragset and is nicknamed the *Little Mermaid Brother*. Below is my photo of it. Patient visitors can catch him winking at random. He is 1.8 meters tall and made of shiny polished stainless steel.
Find him on Google Maps here: Him statue by Elmgreen and Dragset
There are about 734 reviews posted for the sculpture, so there are also many photos. As is often the case in tourist areas, many photos get posted to the wrong place, and some people seem to believe a selfie is a helpful contribution. So I decided to systematically flag photos for these two issues.
Procedure
40 images were selected and added to this spreadsheet.
Here is a screenshot from the spreadsheet with detailed data for each of the 40 images.
For each photo, I saved the Maps link, a thumbnail, and dropdown selection boxes indicating what the image was flagged for and the outcome (Not removed, Removed after flagging, Removed after appeal).
TEST 1: Flagging 20 images not showing the sculpture.
TEST 2: Flagging 10 images taken 40+ km away from the sculpture.
TEST 3: Flagging 10 images of people posing with the sculpture.
The three test sets did not overlap, as that would have made it too difficult to track the outcomes. It took about 10 days in total to conduct the tests, with daily checks of Reports under Other contributions.
Unfortunately, the reports do not show a link or thumbnail of the photo, so it was necessary to flag in the same order as listed in the spreadsheet and to regularly try opening the image links to determine which images had been taken down. Links to removed images redirect to the place itself.
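Since this check is just a matter of opening each saved link and seeing whether it still lands on a photo page, it can in principle be scripted. Below is a minimal sketch, assuming (and this is only my assumption, not documented behavior) that a removed photo's link ends up redirecting to a URL that no longer looks like a photo page. The links are hypothetical placeholders, and the "photo" substring test is only a rough heuristic; manual checking remains the reliable method.

```python
# Sketch: check which flagged photo links still resolve to a photo page.
# Assumption (not confirmed): a removed photo's link redirects to the place
# page, so the final URL no longer looks like a photo URL.
import requests

# Hypothetical placeholders for links copied from the spreadsheet's Maps link column.
photo_links = [
    "https://maps.app.goo.gl/EXAMPLE1",
    "https://maps.app.goo.gl/EXAMPLE2",
]

for link in photo_links:
    # Follow redirects and look at where we end up.
    response = requests.get(link, allow_redirects=True, timeout=10)
    final_url = response.url
    # Heuristic only: if the final URL no longer mentions a photo,
    # the image was most likely taken down.
    status = "possibly removed" if "photo" not in final_url else "still online"
    print(f"{link} -> {final_url} ({status})")
```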
TEST 1: Not showing the sculpture
The first twenty incorrectly placed photographs were flagged. They all show something else in the neighborhood of the sculpture but do not show the sculpture itself, typically ships docked nearby and other nearby sculptures. You can find thumbnails and Maps links in the spreadsheet.
About an hour later I checked the status of each of them. 1 image was removed, and 19 were not removed. For each of the 19 photos, I submitted an appeal. I received email notifications for each of these appeals.
Seven minutes after submitting the appeals, the outcomes started arriving in my inbox. By the next morning, a total of 2 appeals had been approved. So my effort to remove 20 wrongly placed photos resulted in only the following 3 images being removed:
Outcome
1 got removed after flagging
2 got removed after appeal
Result: 3 out of 20 got removed = only 15%
Conclusion and discussion 1
Only 3 photos removed out of 20 is a 15% success rate. This is disappointing in my opinion.
The first and second removed images show places approximately 200 meters southeast of the sculpture. I don’t know where the third image was taken, but it is not from nearby.
The 17 photographs that were not removed were taken within around 100 meters of the sculpture and depict surrounding ships, views, and removed artworks.
These findings suggest that the flagging and appeal decisions are based on whether a photo was taken near the sculpture rather than on what the photo actually shows.
TEST 2: Photos from 40+ km away
Then I flagged 10 images that are clearly from another city (Copenhagen). All 10 flags were rejected within an hour, and I appealed each of them. After only 1.5 hours, 7 of these appeals had been approved and confirmed via email. Checking under Reports in the Maps app, 9 appeals were in fact approved. That is a stunning 90% success rate.
The only Copenhagen image that was not removed after appeal is a photo of The Little Mermaid, which is located 46 kilometers south of the sculpture.
You can see thumbnails of the 9 removed images in the spreadsheet under the heading TEST 2.
Outcome
0 removed after flagging
9 out of 10 removed after appeal
Result: 9 out of 10 got removed = 90%
Conclusion and discussion 2
Google’s evaluation of flags for photos posted to the wrong location probably does not use GPS metadata, but the appeal process certainly does. Hence, it is well worth taking the time to appeal if you recognize that the image is from far away and it was not removed after flagging.
Maybe the Little Mermaid photo had no GPS metadata.
TEST 3: Flagging selfies and group photos
Test 3 was designed to investigate the effectiveness of flagging photos for privacy reasons.
First I flagged the 10 selfies/privacy photos. You can see them in the spreadsheet under the heading TEST 3.
After 5 minutes the first flag was rejected. After an hour, 8 were still being checked. After 5 hours, half were still being checked. So it seems likely that checking privacy flags takes longer than checking wrong-location flags. After 21 hours, 1 image had been removed and 2 were still pending. The last one was rejected Monday morning. So I went ahead and appealed the remaining 9 photos.
They were processed fairly quickly, except for one image that took 3-4 days. It was the same image that took the longest to get rejected after the initial flagging. But it is also the most obvious violation of the privacy guidelines: it shows a 4-5 year old child next to the sculpture, looking straight into the camera. Still, it was not removed, not even after appeal.
Outcome
1 got removed after flagging
0 got removed after appeals
Result: 1 out of 10 got removed = 10%. This is disappointing.
Conclusion and discussion 3
The only privacy photo that got removed is one that shows a person standing next to the sculpture facing away from the camera!
It is ludicrous to remove a faceless photo while failing to remove the most blatant privacy violations. I believe Google Maps’ method for processing privacy flags is broken.
Overall conclusions
The algorithm used to process photos flagged for not showing the place is not very sophisticated. There does not appear to be any AI system analyzing what the photos show; the decision seems to be based only on checking the GPS data and calculating the distance from the location. This test suggests that if the distance is more than maybe 100 meters, the photo can be removed after appeal, not after merely flagging it. Keep in mind that appeals are only available in the EU. So flagging photos for the wrong location outside the EU is, in my opinion, a waste of our time. The process for handling wrong-location flags needs fixing.
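For illustration only, here is a minimal sketch of the kind of naive GPS-distance check I suspect is behind the appeal decisions. This is pure speculation on my part: the place coordinates are rough, the 100-meter threshold is just the value suggested by this test, and nothing here reflects Google’s actual implementation.

```python
# Speculative sketch: compare a photo's EXIF coordinates with the place's
# coordinates and decide based on a distance threshold. All coordinates and
# the threshold are illustrative, not Google's actual values or logic.
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in meters."""
    r = 6_371_000  # mean Earth radius in meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

PLACE = (56.036, 12.612)   # rough coordinates of the sculpture's harbour area (illustrative)
THRESHOLD_M = 100          # hypothetical cut-off suggested by this test

def would_be_removed(photo_lat, photo_lon):
    return haversine_m(photo_lat, photo_lon, *PLACE) > THRESHOLD_M

# A photo geotagged in Copenhagen, tens of kilometers away, clears the
# threshold easily, while one taken a short walk down the quay does not.
print(would_be_removed(55.692, 12.599))    # True  -> removable after appeal
print(would_be_removed(56.0355, 12.6115))  # False -> flag likely rejected
```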
Photos not showing the sculpture but showing other objects in the near vicinity will not be removed.
To avoid wasting your time, keep in mind that only photos taken far away from the location are likely to get removed, and only after an appeal.
Flagging photos for privacy reasons is not worth our time until the algorithms get fixed.
Random follow-up remarks
Keep in mind this test was initiated over a weekend; maybe flagging on working days gets processed more quickly.
I tried flagging the Little Mermaid photo once more. Based on this one retry, it is probably not worth it: the flag and the appeal were processed in only a few minutes, so the system likely remembered the first outcome.
I noticed that one appeal was not marked “Not approved”; instead, it was marked as canceled. I do not understand the difference.
This test is based on flagging in the Android app, but I noticed that flags submitted on desktop also immediately show up under Reports on mobile.
Maybe someone would care to systematically test the other kinds of flags.
We could test if crowd flagging is effective.
I have a suspicion that the system starts auto-rejecting flags from a user who flags a lot of photos, but this needs further testing.