We have seen several mistakes by Google's AI when it chose the featured photo of a place or area, and most of these mistakes involved featuring a photo copied from the web (stolen ones, indeed). The AI selects them as top photos because they are professional, high-quality images found on the web. The automated filter should do a better job of avoiding copied content.
But featuring a photo like the one I share here is really something! Especially when it comes to a city. When you search for or click the city of Mashhad on the map, this photo appears on top.
This photo has every property of a bad photo! It is very low quality, stolen from the web, cropped from the original, and even has a graphical shape overlaid on it!
So, I want to ask Googlers, or any Local Guide who is an image-processing engineer: how is it even possible to feature such a poor-quality photo? Does this mean the AI filter is still too basic? I certainly understand that the system can fail sometimes, but such a big failure is not acceptable, I think.
I hope this feedback helps Google Maps engineers prepare a solution for such wrong photo selections by the system.
Indeed, I have gotten used to such wrong selections by the AI. Just check some top tourist attractions here and you will find lots of copied photos and reviews. This is especially because here we don't have the main "Add photo" and "Add review" buttons in the app, so people have less opportunity to upload their own original content. As a result, users on computers are more active, and many of them use stolen material to increase their points!
Around here, the number of stolen images is close to zero, perhaps because mobile phones with great cameras are abundant. Where I live there are a ton of tourist attractions, so photos being added to the wrong pin is a huge problem. At some pins, I see 10-20 percent misplaced photos.
Do high-resolution photos taken on a DSLR or any pro camera get priority over photos taken on a mobile camera (a decent one, like a Pixel, an iPhone, or any flagship device) to be featured at #1 under a POI or a town?
Also, I would like to know whether this AI gives preference to photos taken on Google's own Pixel phones over any other mobile or pro camera.
Modern phones produce high-quality images well beyond what is needed on Google Maps. The resolution drops a lot when someone who is unaware of copyright screenshots an image from the internet. Check on a desktop and you will see that such images cannot even fill the screen (they have black borders).
Using a DSLR is, in my opinion, impractical, overkill, and far too time-consuming. I doubt even 1 percent of user-uploaded images are from a DSLR. For uploads from business owners, the percentage is higher.
I don’t think pictures from Pixel phones get any priority, though I have never studied this. Why should they? The AI is trained to select the most helpful images.
And the AI picking images for cities really seems to like intense blue skies and aerial shots, as you can see in the pin shared by @Amiran in the opening post.
I had been using a Huawei P30 Pro for 3-4 years until three months ago, when I bought a Pixel 7a. I have not noticed it becoming any easier to get featured since I started using the new phone.
I’m not sure about the Pixel, but from what I have seen, shots taken with an iPhone have a better chance of being featured. I still doubt it is related to the brand; I think it is because of the natural properties of an unedited photo straight from an iPhone.