Testing what happens when we flag photos on Google Maps

For years, I believed that flagging photographs was a waste of time because they are rarely removed.

Google recently introduced the “Reports” list under Other contributions in the apps.

This has made it possible to check if our flags get approved or rejected.

Additionally, the European Union has required platforms to allow users to appeal when their contributions are denied or removed. As a result, users in the EU can now file appeals after unsuccessful photo flags.

So I felt it was time to give flagging another chance and investigate whether it is now worth the effort. I decided to run a small but systematic test of flagging 40 photos on Google Maps.

I chose a very distinct sculpture, so it is easy to determine whether photos show the sculpture or not. The sculpture, named “He” (or “Him”), was created in 2012 by Elmgreen and Dragset. Below is my photo of the sculpture, which has the nickname *Little Mermaid Brother*. Patient visitors can experience his eyes winking at random. He is 1.8 meters high and made of shiny polished stainless steel.

Find him on Google Maps here: Him statue by Elmgreen and Dragset

There are about 734 reviews posted to the sculpture, so there are also many photos. As is often the case in tourist areas, many photos get posted to the wrong place, and some visitors seem to believe a selfie is a helpful contribution. So I decided to systematically flag photos for these two issues.

Procedure

40 images were selected and added to this spreadsheet.

Here is a screenshot from the spreadsheet with detailed data for each of the 40 images.

For each photo, I saved the Maps link, a thumbnail, and dropdown selection boxes indicating what the image was flagged for and the outcome (Not removed, Removed after flagging, Removed after appeal).

TEST 1: Flagging 20 images not showing the sculpture.
TEST 2: Flagging 10 images taken 40+ km away from the sculpture.
TEST 3: Flagging 10 images having people posing with the sculpture.

The three test sets did not overlap, as overlapping would have made it too difficult to track the outcomes. Conducting the tests took a total of about 10 days, with daily checks under Reports under Other contributions.

Unfortunately, the reports do not show a link or thumbnail of the photo, so it was necessary to do the flagging in the same order as listed in the spreadsheet and to regularly try opening the image links to determine which images had been taken down. Links to removed images redirect to the place page instead.
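The manual check described above boils down to: open each saved Maps link and see whether it still resolves to the photo, or now redirects somewhere else (which means the photo was removed). A minimal sketch in Python of that heuristic; the URLs are hypothetical placeholders, not real Google Maps URL patterns:

```python
# Sketch of the redirect check described above. The URLs below are
# made-up placeholders; the heuristic (a removed photo's link ends
# up at a different path than the original) is my assumption based
# on the behavior described in this post, not a documented API.
from urllib.parse import urlparse

def photo_was_removed(original_url: str, final_url: str) -> bool:
    """Return True if following the link ended somewhere other than
    the original photo URL, i.e. it redirected to the place page."""
    orig = urlparse(original_url)
    final = urlparse(final_url)
    # Same host and path: the photo still exists; otherwise: removed.
    return (orig.netloc, orig.path) != (final.netloc, final.path)
```

In practice the final URL would come from something like `urllib.request.urlopen(url).geturl()`, which follows redirects automatically before returning the landing URL.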

TEST 1: Not showing the sculpture

The first twenty incorrectly placed photographs were flagged. They all show something else in the neighborhood of the sculpture but do not display the sculpture itself. They typically show ships docked nearby and other nearby sculptures. You can find thumbnails and Maps links in the spreadsheet.

About an hour later I checked the status of each of them. 1 image was removed, and 19 were not removed. For each of the 19 photos, I submitted an appeal. I received email notifications for each of these appeals.

7 minutes after submitting the appeals, the appeal outcomes started arriving in my inbox. The next morning, a total of 2 appeals were approved. So my efforts to remove 20 wrongly placed photos resulted in only the following 3 images being removed:

Outcome

1 removed after flagging
2 removed after appeal
Result: 3 out of 20 removed = only 15%

Conclusion and discussion 1

Only 3 photos removed out of 20 is a 15% success rate. This is disappointing in my opinion.

The first and second removed images show places approximately 200 meters southeast of the sculpture. I don’t know where the third image was taken, but it’s not from close by.

The 17 photographs that were not removed were taken within around 100 meters of the sculpture and depict surrounding ships, views, and removed artworks.

These findings suggest that the flagging and appeals processes are based on whether a photo was taken near the sculpture rather than on what the photo actually shows.

TEST 2: Photos from 40+ km away

Then I flagged 10 images clearly taken in another city (Copenhagen). All 10 flags were rejected within an hour, and I appealed each one. After only 1.5 hours, 7 of these appeals were approved and confirmed via email. Checking under Reports in the Maps app, 9 appeals had actually been approved. That is a stunning 90% success rate.

The only Copenhagen image that was not removed after appeal is a photo showing The Little Mermaid located 46 kilometers south of the sculpture.

You can see thumbnails of the 9 removed images in the spreadsheet under the heading TEST 2.

Outcome

0 removed after flagging
9 removed after appeal
Result: 9 out of 10 removed = 90%

Conclusion and discussion 2

Google’s evaluation of flags for wrong location probably does not use GPS metadata, but the appeals process clearly does. Hence, it is well worth taking the time to appeal if you recognize that an image was taken far away and it was not removed after flagging.
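If the appeals process really does compare a photo's GPS position against the place, the core check could be as simple as a great-circle distance calculation. Here is a minimal sketch in Python; the ~100 m threshold is my guess from the Test 1 results, not anything Google has confirmed:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def appeal_should_succeed(photo_latlon, place_latlon, threshold_km=0.1):
    """Hypothetical rule: approve removal if the photo's GPS position
    is more than ~100 m from the place (threshold is an assumption)."""
    return haversine_km(*photo_latlon, *place_latlon) > threshold_km
```

Under this rule, a photo geotagged 46 km away sails past the threshold, while a photo of a nearby ship taken 80 m from the sculpture survives, which matches the outcomes of Tests 1 and 2.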

Maybe the Little Mermaid photo had no GPS metadata.

TEST 3: Flagging selfies and group photos

Test 3 was designed to investigate the effectiveness of flagging photos for privacy reasons.

First I flagged the 10 selfies/privacy photos. You can see them in the spreadsheet under the heading TEST 3.

After 5 minutes the first flag was rejected. After an hour, 8 were still being checked. After 5 hours, half were still being checked. So checking privacy flags likely takes longer than checking wrong-location flags. After 21 hours, 1 image was removed, and 2 were still pending. The last one was rejected Monday morning. So I went ahead and appealed the remaining 9 photos.

They were processed fairly quickly, except for one image that took 3-4 days. It was the same image that took the longest to get rejected at the initial flagging, yet it is also the most obvious violation of the privacy guidelines: it shows a 4-5 year old child next to the sculpture, looking straight into the camera. Still, it was not removed, not even after appeal.

Outcome

1 removed after flagging
0 removed after appeal
Result: 1 out of 10 removed = 10%. This is disappointing.
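Pulling the three outcome tallies together (the counts are taken directly from the tests above), a few lines of Python summarize the whole experiment:

```python
# Summary of the three tests, using the counts reported above.
results = {
    "TEST 1: not showing the sculpture": {"flagged": 20, "removed": 3},
    "TEST 2: photos from 40+ km away":   {"flagged": 10, "removed": 9},
    "TEST 3: selfies / privacy":         {"flagged": 10, "removed": 1},
}

for name, r in results.items():
    rate = 100 * r["removed"] / r["flagged"]
    print(f"{name}: {r['removed']}/{r['flagged']} removed ({rate:.0f}%)")
```

The spread (15%, 90%, 10%) is the whole story: only the far-away photos were removed reliably, and only via appeals.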

Conclusion and discussion 3

The only privacy photo that got removed is one that shows a person standing next to the sculpture facing away from the camera!

It is ludicrous to fail to remove the most heinous privacy offenses while removing a faceless photo. I believe Google Maps’ method for processing privacy flags is broken.

Overall conclusions

The algorithm that processes photos flagged for not showing the place is not very sophisticated. There is no AI system analyzing what the photos show. It seems to be based purely on checking the GPS data and calculating the distance from the location. This test suggests that if the distance is more than maybe 100 meters, the photo can be removed after appeal, not after just flagging. Keep in mind, appeals are only available in the EU. So flagging photos for the wrong location outside the EU is, in my opinion, a waste of our time. The process for handling wrong-location flags needs fixing.

Photos not showing the sculpture but other objects in the near vicinity will not be removed.

To avoid wasting your time, keep in mind that only photos taken far away from the location can get removed, and only after appeals.

Flagging photos for privacy reasons is not worth our time until the algorithms get fixed.

Random follow-up remarks

Keep in mind this test was initiated over a weekend; maybe flagging on working days gets processed more quickly.

I tried flagging the Little Mermaid photo once more. The flag and appeal were processed in only a few minutes, so the system likely remembered the first outcome. Based on this one test, re-flagging is probably not worth trying.

Once I noticed that an appeal was not marked “Not approved”; instead, it got canceled. I failed to understand the difference.

This test is based on flagging in the Android app. But I noticed that flagging on desktop also immediately shows up under reports on mobile.

Maybe someone would care to systematically test the other kinds of flags.

We could test if crowd flagging is effective.

I have a suspicion that the system starts auto-rejecting flags from a user if we flag a lot of photos. But this needs further testing.


@MortenCopenhagen great post :+1:


@MortenCopenhagen
Great post.
One day Google will start diverting some of its spare Gemini capacity to these simple workflows (it takes about 3 minutes to prompt a privacy check, for example, that works quite well).
Not sure why Maps wouldn’t try to be a little more collaborative in cleaning up its data.
I understand that if these images are not featured, the harm done is likely very low, ignoring data storage costs (how many people really scroll through all the photos at a location?).
I have, however, seen some featured photos with wrong locations, and reporting did not help at all. Maybe it is worth trying again now…

I would put much more effort into the relevancy of featured photos (how many photos of a location from 2-3 years back are there when it now looks completely different?).

Only so much we can ask for…


I can understand the resource issue of having AI check all new and old photos on Google Maps, but that cannot explain why Google is not allocating the resources needed to evaluate flags from users.

Users want to help and don’t like to see their efforts being ignored.


Hello @MortenCopenhagen. A week ago, I also reported some media issues, and they have been approved. Please find the screenshots attached.


Great job!

I also flagged some incorrect images (though I didn’t have much success with most of them).

By the way, tests like this really should be handled by Google’s QA team…


I’ve been watching this too, though without such a thorough test as yours.

All my reports have been for “Not a photo or video of the place”, and some have been successfully removed.

I’m not in the EU and therefore have no appeal process, but my success rate appears to be around 40%.

Will attempt to do a properly structured test and come back to this thread @MortenCopenhagen.


Some of my reports were for places that had few photos @abermans. I agree that if I had to scroll hundreds to find bad photos to report, other users are likely not seeing those photos anyway.

Are you saying you had a 100% success rate @PrasadVR? All of those reports were approved? Congratulations!!


Great, please consider designing your test so the distance issue can be confirmed or ruled out.

Great test and insights! :clap: The results really show how inconsistent Google’s flagging and removal system can be, especially when appeals are only available in certain regions. Thanks for sharing your experiment, @MortenCopenhagen.


Not even that worked for me @MortenCopenhagen. Photos showing a square with the same name in Germany, 300 km away from the pin, were not removed even after appeal.


I think this sticks with me the most; it just sounds crazy, but I can’t argue with what your test showed. Maybe it’s just not worth the compute time and resources.

I was wondering if some EXIF data or tags were applied to an image when it is first uploaded. The contents of a photo are even recognised if you Like it via the app; I made a little post about it, and it even recognises a turtle. The image is analysed initially, because a decision is made very quickly about whether it is the best photo for the POI or not. But then there is no image recognition when flagging a photo as not showing the POI?? GPS data cannot be the only factor in that decision, because that would be only a small part of the story, right?

I recently found out that editing via the Lightroom app removes the GPS data, and Google Photos won’t add the location data unless the files are backed up, which none of mine have been, as I delete them straight after upload. At least I know that GPS data is not necessary for featured photos.

Love the testing Morten


Great post @MortenCopenhagen. I have been flagging for a while; it really bothers me when guides randomly post a photo or video that has nothing to do with the POI, and also for the other reasons you mentioned.
Results… It’s an estimate, but maybe only around 10% get removed.
Today I found that someone had stolen one of my photos again, and the stolen photo is now the featured photo. This is a different process than the others; we will see what happens.


Hi @TerryPG

I’m sure you know how to report one of your Maps photos being stolen. It is indeed a different reporting process. I shared the steps here in case others need to know how:

https://www.localguidesconnect.com/t/how-to-detect-and-report-your-photo-got-stolen/400846


@MortenCopenhagen Thank you very much for your post, and congratulations on your dedication to this verification, with facts and spreadsheets.


No @tony_b, there was one rejection out of my ten recent submissions.


Such useful testing of a very important feature, @MortenCopenhagen.
I usually flag photos in reviews if people’s faces are visible. Sadly, the results are disappointing, not only in my experience but also after checking your testing results. Google should add an AI system that can check flagged photos. On top of that, we in India don’t even have the option to appeal if our request gets rejected.


That’s a really interesting investigation, @MortenCopenhagen.
