Hi, after a while the link field is always empty, even though during scraping the google.com/maps..... link appears #54
Comments
Thanks for the software btw... it was great and super fast, until the links stopped working... I can use the plus code but it's not the same thing... anyways, thanks for everything, you saved me a lot of time and money. Bye |
I have the same problem. How did you resolve it? |
Hi, I wrote a Python script that does what this website does:
https://pleper.com/index.php?do=tools&sdo=cid_converter. It just transforms
the CID into a URL that ends up being the same thing. Good luck.
|
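The conversion the comment above describes can be sketched in a few lines of Python. This is a minimal guess at what such a script might look like, not the commenter's actual code, and the hex pair in the demo is purely illustrative: a Google Maps place reference often shows up as a pair like 0x...:0x..., where the second hex number, read as an unsigned integer, is the CID, and a https://maps.google.com/?cid=<decimal CID> link opens the same listing.

```python
def cid_to_url(hex_pair: str) -> str:
    """Turn a '0x...:0x...' place reference into a maps.google.com CID link.

    The part after the colon is the CID in hex; converting it to an
    unsigned decimal integer gives the value the ?cid= parameter expects.
    """
    cid_hex = hex_pair.split(":")[-1]   # keep the part after the colon
    cid = int(cid_hex, 16)              # parse hex -> decimal CID
    return f"https://maps.google.com/?cid={cid}"


if __name__ == "__main__":
    # Illustrative hex pair, not a real scraped value.
    print(cid_to_url("0x14e732fd76f0d90d:0xe5415928d6702b47"))
```

If the scraper only gives you the hex pair (or just the second hex value), this is enough to rebuild a working link until the `link` field itself is fixed.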
Holy shit, I only just learned about this thanks to you. I didn't know you can generate a Google Maps URL from the CID itself. I can work around the problem temporarily with your method. Thanks a lot ser, have a nice day! |
@lucaslw32 thanks for the report. I am on it |
fixed in v1.2.3 |
Here's the command I usually type: ./google-maps-scraper -input example-queries.txt -results restaurants-in-cyprus.csv -exit-on-inactivity 3m. I don't change the file names, because every time I did it messed up a bunch of files, creating duplicated output files and things like that. I could just rename the files afterwards, but lately the link header is always empty in every row, even though during scraping the scraped link appears in the cmd terminal for every search...