In the past I have submitted Google Maps edits to remove a non-existent location, and the removal was eventually reflected on the live public map, but after an unknown amount of time the bogus location would reappear. Can anyone with a personal connection to Google Maps employees help correct this? Below is one specific example, but the universal rule I'd like Google Maps to adopt going forward is this: if Google Maps accepts a Local Guide edit that invalidates data from one of the thousands of data sources Google Maps scrapes for info, then Google Maps systems need to track that correction and prevent future data loads from recreating the same garbage data from the same or similar sources a Local Guide already corrected. A rough sketch of that suppression rule follows below.
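To illustrate the idea, here is a minimal sketch of what such a "don't resurrect corrected data" check could look like, assuming a hypothetical ingestion pipeline; none of these names are real Google systems, just an outline of the concept:

```python
# Tombstones: (source_id, place_key) pairs that an accepted Local Guide
# edit has invalidated. A real system would persist these, not keep a set.
rejected_records: set[tuple[str, str]] = set()

def record_accepted_removal(source_id: str, place_key: str) -> None:
    """When an edit removing a place is accepted, tombstone the source
    record that created it so later data loads cannot recreate it."""
    rejected_records.add((source_id, place_key))

def should_ingest(source_id: str, place_key: str) -> bool:
    """Called on every record in a future data load; skips anything a
    previously accepted edit already removed."""
    return (source_id, place_key) not in rejected_records

# Example: the bogus record is tombstoned once when the edit is accepted...
record_accepted_removal("third_party_feed_42", "324 Horton Ave, San Diego, CA 92101")

# ...and a later re-scrape of the same feed is filtered out, not re-added.
assert not should_ingest("third_party_feed_42", "324 Horton Ave, San Diego, CA 92101")
```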
Here is the location link for a representative address:
324 Horton Ave, San Diego, CA 92101
The location claims to be “U.S. Army Recruiting Center - University Branch Office”
visible on Maps at
https://www.google.com/maps/place//@32.7316929,-117.1679908,20.46z?entry=ttu
Clicking the location label brings up the location info in the left frame of Maps:
- 324 Horton Ave does not exist as an address anywhere in greater San Diego County, in any ZIP code, which is verifiable on the County Recorder's office GIS website
- the map pin for the non-existent 324 Horton Ave address is displayed on Google Maps in the 2500 block of Horton Ave (undeniably wrong)
- the map pin also sits on an empty lot, per Street View and real life
- the website Google Maps associates with the place is a dead URL (HTTP 404, URL not found on this server)
- yet Google Maps recreates this bogus location on the rare occasion a submitted Suggest an Edit update is actually accepted
How can Google Maps' automated data scrapes allow such an obvious example of bad data, with flaws in three or more data fields, to become official Maps data in this alleged age of artificial intelligence (obviously marketing hype)? It would be easy for humans to code automated data-quality checks into the scraping pipeline's input-verification step before locations are added to Maps, along the lines of the sketch below.
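For illustration only, here is a rough sketch of the kind of basic checks I mean; gis_address_exists() and geocode() are stand-ins for a county parcel lookup and a geocoder, not real Google or county APIs, and only the dead-URL check uses a real (standard) library:

```python
import urllib.error
import urllib.request

KNOWN_PARCELS: set[str] = set()  # would be loaded from the County Recorder GIS

def gis_address_exists(address: str) -> bool:
    # Stand-in: a real check would query the county parcel database.
    return address in KNOWN_PARCELS

def geocode(address: str) -> tuple[float, float]:
    # Stand-in: a real check would geocode the address to expected coordinates.
    return (0.0, 0.0)

def website_is_live(url: str) -> bool:
    # Flag records whose listed website returns an error such as HTTP 404.
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status < 400
    except (urllib.error.URLError, ValueError):
        return False

def validate_place(address: str, pin: tuple[float, float], website: str) -> list[str]:
    """Return the list of failed checks; anything non-empty should block ingestion."""
    problems = []
    if not gis_address_exists(address):
        problems.append("address not in county parcel records")
    exp_lat, exp_lon = geocode(address)
    if abs(pin[0] - exp_lat) > 0.002 or abs(pin[1] - exp_lon) > 0.002:
        problems.append("pin is blocks away from the geocoded address")
    if website and not website_is_live(website):
        problems.append("listed website is dead")
    return problems
```

A record failing two or more of these checks, like the example above, could be quarantined for human review instead of being published as official Maps data.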
The Local Guides Suggest an Edit submission flow doesn't allow enough detail to categorize what the problem is, and when a submitted edit is Not Accepted, whether by automated or human review, there is no feedback or justification to the Local Guide about why the edit wasn't accepted.