A powerful, free, and open-source Google Maps scraper for extracting business data at scale. Available as a CLI, Web UI, or REST API, and deployable to Kubernetes or AWS Lambda.
Love this project? A star helps others discover it and motivates continued development. Become a sponsor to directly support new features and maintenance.
Sponsored By
This project is made possible by our amazing sponsors
Scrap.io - Extract ALL Google Maps listings at country-scale
No keywords needed. No limits. Export millions of businesses in 2 clicks. Try it free →
Docker

# Playwright version (default)
docker pull gosom/google-maps-scraper
# Rod version (alternative)
docker pull gosom/google-maps-scraper:latest-rod
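To run the container end to end, here is a minimal sketch. It assumes the image accepts the same flags as the native binary and that the query file and output directory are bind-mounted from the host; the /data mount path is illustrative, not a path defined by the image.

# Illustrative run: mount the working directory so the container can read
# the query file and write results (the /data path is an assumption)
docker run -v "$PWD":/data gosom/google-maps-scraper \
  -input /data/example-queries.txt \
  -results /data/results.csv \
  -exit-on-inactivity 3m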
Build from Source
Requirements: Go 1.25.5+
git clone https://github.com/gosom/google-maps-scraper.git
cd google-maps-scraper
go mod download
# Playwright version (default)
go build
./google-maps-scraper -input example-queries.txt -results results.csv -exit-on-inactivity 3m
# Rod version (alternative)
go build -tags rod
./google-maps-scraper -input example-queries.txt -results results.csv -exit-on-inactivity 3m
The first run downloads the required browser dependencies (Playwright browsers or Chromium, depending on which version you built).
Features
33+ Data Points: Business name, address, phone, website, reviews, coordinates, and more
Email Extraction: Optional crawling of business websites for email addresses
Multiple Output Formats: CSV, JSON, PostgreSQL, S3, LeadsDB, or custom plugins
Proxy Support: SOCKS5, HTTP, and HTTPS proxies with authentication
Scalable Architecture: Single machine to Kubernetes cluster
REST API: Programmatic control for automation
Web UI: User-friendly browser interface (see the example after this list)
Fast Mode (Beta): Quick extraction of up to 21 results per query
AWS Lambda: Serverless execution support (experimental)
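As a rough sketch of the Web UI and REST API features, server mode can be started with the -web flag described under Configuration below; the address and data folder shown are simply the documented defaults.

# Start the embedded web server (UI and REST API) on the default address
./google-maps-scraper -web -addr :8080 -data-folder webdata
# Then open http://localhost:8080 in a browser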
Extracted Data Points
1. input_id: Internal identifier for the input query
2. link: Direct URL to the Google Maps listing
3. title: Business name
4. category: Business type (e.g., Restaurant, Hotel)
5. address: Street address
6. open_hours: Operating hours
7. popular_times: Visitor traffic patterns
8. website: Official business website
9. phone: Contact phone number
10. plus_code: Google Plus Code for the location
11. review_count: Total number of reviews
12. review_rating: Average star rating
13. reviews_per_rating: Breakdown of reviews by star rating
14. latitude: GPS latitude
15. longitude: GPS longitude
16. cid: Google's unique customer ID (CID)
17. status: Business status (e.g., open, closed, temporarily closed)
18. descriptions: Business description
19. reviews_link: Direct link to reviews
20. thumbnail: Thumbnail image URL
21. timezone: Business timezone
22. price_range: Price level ($, $$, $$$)
23. data_id: Internal Google Maps identifier
24. images: Associated image URLs
25. reservations: Reservation booking link
26. order_online: Online ordering link
27. menu: Menu link
28. owner: Owner-claimed status
29. complete_address: Full formatted address
30. about: Additional business info
31. user_reviews: Customer reviews (text, rating, timestamp)
32. emails: Extracted email addresses (requires -email flag)
33. user_reviews_extended: Extended reviews, up to ~300 (requires -extra-reviews)
34. place_id: Google's unique place ID
Custom Input IDs: Define your own IDs in the input file:
Matsuhisa Athens #!#MyCustomID
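As an illustration of custom IDs, the sketch below assumes the value after #!# is carried through to the input_id column of the output; the file name queries.txt and the second query are made up for the example.

# queries.txt: one query per line, optionally followed by #!# and a custom ID
Matsuhisa Athens #!#MyCustomID
coffee shops in Athens #!#athens-coffee-01

# Run against the file; the custom IDs should appear in the input_id column
./google-maps-scraper -input queries.txt -results results.csv -exit-on-inactivity 3m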
Configuration
Command Line Options
Usage: google-maps-scraper [options]
Core Options:
-input string Path to input file with queries (one per line)
-results string Output file path (default: stdout)
-json Output JSON instead of CSV
-depth int Max scroll depth in results (default: 10)
-c int Concurrency level (default: half of CPU cores)
Email & Reviews:
-email Extract emails from business websites
-extra-reviews Collect extended reviews (up to ~300)
Location Settings:
-lang string Language code, e.g., 'de' for German (default: "en")
-geo string Coordinates for search, e.g., '37.7749,-122.4194'
-zoom int Zoom level 0-21 (default: 15)
-radius float Search radius in meters (default: 10000)
Web Server:
-web Run web server mode
-addr string Server address (default: ":8080")
-data-folder Data folder for web runner (default: "webdata")
Database:
-dsn string PostgreSQL connection string
-produce Produce seed jobs only (requires -dsn)
Proxy:
-proxies string Comma-separated proxy list
Format: protocol://user:pass@host:port
Export:
-leadsdb-api-key Export directly to LeadsDB (get key at getleadsdb.com)
Advanced:
-exit-on-inactivity duration Exit after inactivity (e.g., '5m')
-fast-mode Quick mode with reduced data
-debug Show browser window
-writer string Custom writer plugin (format: 'dir:pluginName')
Run ./google-maps-scraper -h for the complete list.
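To tie several of these options together, here are two illustrative invocations; the query file, coordinates, and database credentials are placeholders, not values from this project.

# German-language search near Berlin, JSON output, with email extraction
./google-maps-scraper \
  -input queries.txt \
  -results results.json \
  -json \
  -email \
  -lang de \
  -geo 52.5200,13.4050 \
  -zoom 15 \
  -depth 5 \
  -c 4

# Produce seed jobs into PostgreSQL for distributed workers (DSN is a placeholder)
./google-maps-scraper -produce -input queries.txt -dsn "postgres://user:pass@localhost:5432/gmaps"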
Using Proxies
For larger scraping jobs, proxies help avoid rate limiting. Here's how to configure them: