Norzer Google Index Bot: Automate SEO Google Page Indexing Checks with Python & Google Search Console API

What is the Norzer Google Index Bot?

As a small business owner or SEO enthusiast, you know how vital it is to have your website’s pages indexed by Google. Indexed pages appear in search results, driving organic traffic and leads to your site. But manually checking indexing status in Google Search Console (GSC) can be a tedious chore—logging in, inspecting URLs one by one, and sifting through reports. That’s where the Norzer Google Index Bot steps in to simplify Google Search Console index monitoring to improve your SEO.

This free, open-source Python tool automates the process. It fetches URLs from your XML sitemap, checks their indexing status via the Google Search Console API, and delivers a clear email report—no endless logins required. With the Norzer Google Index Bot, you get actionable insights straight to your inbox, helping you stay ahead of indexing issues and keep your site visible in search results.

Norzer Index Monitor Bot screenshot

Why It Helps

Here’s why the Norzer Google Index Bot is a must-have for small businesses and SEO professionals:

  • Saves Time: No more manual GSC checks. Set it up once, and it runs in the background.
  • Proactive Alerts: URLs are sorted into “Indexed,” “Issues” (e.g., “Discovered - currently not indexed”), and “API Errors,” so you know exactly where to focus.
  • Customizable: Exclude specific URLs (like privacy pages) and define “issue” statuses in the config file.
  • Scalable: Multi-threading handles multiple URLs efficiently, with a throttle delay to respect Google’s API limits.
  • Peace of Mind: Regular reports keep you informed without lifting a finger.

Imagine launching a new blog post or product page, only to find Google hasn’t indexed it yet. Without monitoring, you could lose weeks of traffic. The Norzer Google Index Bot catches these problems early and sends you an email alert, letting you request indexing and optimize your SEO strategy fast.

Why It Beats Manual GSC Checks

Google Search Console is powerful but not user-friendly for routine checks. Manually inspecting URLs means navigating menus, submitting pages individually, and waiting for updates—a hassle for sites with dozens or hundreds of pages. The Norzer Google Index Bot outshines manual checks by:

  • Automating the entire process from start to finish.
  • Sending results in an easy-to-read email.
  • Running on your schedule (e.g., via cron).

Whether you’re a busy small business owner or an SEO consultant managing client sites, this tool frees you to focus on growth—not grunt work. Ready to streamline your SEO monitoring? Follow our step-by-step guide to set up the Norzer Google Index Bot on Ubuntu Linux!


Step-by-Step Installation Guide: Norzer Google Index Bot on Ubuntu

This guide walks you through installing the Norzer Google Index Bot on Ubuntu Linux (e.g., 20.04, 22.04, or 24.04/24.10). While adaptable to Windows, macOS, or other Linux distros, these steps focus on Ubuntu for simplicity. Basic terminal knowledge helps, but we’ve kept it beginner-friendly. The script, config file, and video tutorial are posted at the end of this post.

Prerequisites

  • Ubuntu Linux System: A working Ubuntu setup with internet access.
  • Google Search Console Access: Your site verified in GSC and a service account JSON key.
  • SMTP Server: Email credentials (e.g., Gmail or a custom SMTP like smtp.norzer.me).
  • Root or Sudo Access: For installing dependencies.

Step 1: Update Your System

Keep your system current to avoid compatibility hiccups:

sudo apt update && sudo apt upgrade -y

Step 2: Install Python 3, Pip, and SQLite3

The script needs Python 3 and the sqlite3 module (included in Python’s standard library). We’ll also install the SQLite3 command-line tool for troubleshooting.

  • Check Python: python3 -V (e.g., Python 3.x.x). If missing: sudo apt install python3 -y
  • Install tools: sudo apt install python3-pip python3-venv sqlite3 -y
  • Verify:
    • python3 -V
    • sqlite3 --version (e.g., 3.45.1)
    • pip3 --version
      Use sqlite3 sitemap_data.db "SELECT * FROM sitemap_urls;" later to inspect the database if needed.
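If you prefer Python over the sqlite3 CLI for that inspection, the same check can be scripted. A minimal sketch — the sitemap_data.db filename and sitemap_urls table match what indexmon.py (below) creates, and the relative path assumes you run it from ~/indexmon:

```python
import sqlite3

# Minimal sketch: read the URLs the bot has stored so far.
# DB_FILE and the sitemap_urls schema match indexmon.py below.
DB_FILE = "sitemap_data.db"

def list_stored_urls(db_file=DB_FILE):
    """Return every URL currently stored from the sitemap."""
    conn = sqlite3.connect(db_file)
    try:
        cursor = conn.cursor()
        # Same schema initialize_database() uses, so this also
        # works before the bot's first run
        cursor.execute("""
            CREATE TABLE IF NOT EXISTS sitemap_urls (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                url TEXT NOT NULL UNIQUE
            )
        """)
        cursor.execute("SELECT url FROM sitemap_urls ORDER BY id")
        return [row[0] for row in cursor.fetchall()]
    finally:
        conn.close()
```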

Step 3: Set Up a Virtual Environment and Install Packages

Ubuntu 24.04+ restricts global pip installs (PEP 668), so we’ll use a virtual environment.

  • Create a directory: mkdir ~/indexmon && cd ~/indexmon
  • Create a virtual environment: python3 -m venv venv
  • Activate it: source venv/bin/activate (prompt changes to (venv)).
  • Install packages: pip install requests google-auth-oauthlib google-auth-httplib2 google-api-python-client pytz (configparser ships with Python’s standard library, so it doesn’t need installing).
  • Verify: pip list (check for requests, google-api-python-client, etc.).
  • Stay activated for the next steps. Deactivate later with: deactivate
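Beyond pip list, you can confirm the packages actually resolved inside the active virtual environment. A quick sketch — the names are the distribution names from the pip install line above:

```python
from importlib.metadata import version, PackageNotFoundError

# Sanity check for Step 3: confirm each pip package installed into the
# active virtual environment. These are distribution names, as used by pip.
REQUIRED = [
    "requests",
    "google-auth-oauthlib",
    "google-auth-httplib2",
    "google-api-python-client",
    "pytz",
]

def check_packages(packages=REQUIRED):
    """Print installed versions; return the names that are missing."""
    missing = []
    for name in packages:
        try:
            print(f"{name} {version(name)}")
        except PackageNotFoundError:
            missing.append(name)
    return missing
```

Run it with the venv active; an empty return value means everything installed.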

Step 4: Set Up Google Search Console API Access

You’ll need a service account key named gsc-key.json for GSC API access.

  1. Create a Google Cloud Project:
    • Visit Google Cloud Console.
    • Click “Create Project,” name it (e.g., “Norzer Index Bot”), and create.
  2. Enable Search Console API:
    • Go to “APIs & Services” > “Library.”
    • Search “Google Search Console API” and click “Enable.”
  3. Create a Service Account:
    • Navigate to “APIs & Services” > “Credentials.”
    • Click “Create Credentials” > “Service Account.”
    • Name it (e.g., “index-bot-service”), skip optional fields, and create.
    • Under “Keys,” select “Add Key” > “Create new key” > “JSON.” Download the file.
  4. Rename and Secure the JSON File:
    • Rename it to gsc-key.json and move it to ~/indexmon/gsc-key.json.
    • Set permissions: chmod 600 ~/indexmon/gsc-key.json
  5. Grant GSC Access:
    • In Google Search Console, go to “Settings” > “Users and Permissions” > “Add User.”
    • Add the service account email (e.g., index-bot-service@your-project.iam.gserviceaccount.com) with “Full” permission.
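Before wiring up the full bot, it’s worth confirming that the key file and the Search Console permission from step 5 actually work together. A hedged sketch — the key path mirrors step 4, and sites().list() is the Search Console API call that lists the properties a credential can see:

```python
# Sketch of a one-off access check. KEY_FILE is the path from Step 4;
# use an absolute path in practice.
KEY_FILE = "gsc-key.json"
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

def extract_site_urls(response):
    """Pull property URLs out of a sites().list() API response."""
    return [entry["siteUrl"] for entry in response.get("siteEntry", [])]

def list_gsc_properties(key_file=KEY_FILE):
    """Return the GSC properties visible to the service account.

    An empty list usually means step 5 (granting the service account
    access in Search Console) was skipped.
    """
    # Imports kept local so extract_site_urls() works even without
    # the Google packages installed
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    credentials = service_account.Credentials.from_service_account_file(
        key_file, scopes=SCOPES
    )
    service = build("searchconsole", "v1", credentials=credentials,
                    cache_discovery=False)
    return extract_site_urls(service.sites().list().execute())
```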

Step 5: Download and Configure the Script

  • Save the Script (below): With the virtual environment active, save the script as indexmon.py in ~/indexmon:
    • nano indexmon.py, paste the code, save (Ctrl+O, Enter), exit (Ctrl+X).
  • Save and Edit config.ini: Create config.ini in ~/indexmon:
    • nano config.ini, paste the config, and update:
      • SITEMAP_URL: Your sitemap (e.g., https://yourdomain.com/sitemap.xml).
      • GSC_CREDENTIALS: Set to the full path of your key (e.g., /home/user/indexmon/gsc-key.json). Use an absolute path, since the script does not expand ~.
      • SMTP section: Add your SERVER, PORT, USERNAME, EMAIL_RECIPIENT, and SMTP_PASSWORD.
      • Optional: Adjust EXCLUDED_URLS, ISSUE_STATUSES, GSC_MAX_WORKERS, GSC_THROTTLE_DELAY, TIMEZONE.
    • Save and exit.
  • Optional: Use an Environment Variable for SMTP_PASSWORD:
    • For security, leave SMTP_PASSWORD blank in config.ini and set it via: export SMTP_PASSWORD="your_password_here" before running.
    • The script fetches it with os.getenv(). You can also modify the Python script itself if you prefer a different approach.
  • Secure config.ini: chmod 600 config.ini
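The password precedence described above (environment variable first, config.ini as fallback) can be sketched in a few lines. This mirrors the os.getenv() call in indexmon.py; load_smtp_password is a hypothetical helper, not part of the script:

```python
import configparser
import os

# Sketch of the precedence above: the SMTP_PASSWORD environment variable
# wins, and the config.ini value is only a fallback.
# load_smtp_password is a hypothetical helper, not part of indexmon.py.
def load_smtp_password(config_path="config.ini"):
    config = configparser.ConfigParser()
    config.read(config_path)
    file_value = config.get("SMTP", "SMTP_PASSWORD", fallback=None)
    return os.getenv("SMTP_PASSWORD", file_value)
```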

Step 6: Test the Script

  • Run it: cd ~/indexmon && python indexmon.py
  • Check indexbot.log and your email for the report.
  • Troubleshoot errors (e.g., SMTP or API issues) using the log.
  • Deactivate: deactivate

Step 7: Automate with Cron to Run Daily at 8 AM (Optional)

  • crontab -e
  • Add: 0 8 * * * /home/user/indexmon/venv/bin/python /home/user/indexmon/indexmon.py (replace /home/user with your actual home directory; cron doesn’t reliably expand ~).
  • Save and exit.

Troubleshooting

  • No Email: Verify SMTP settings in config.ini and logs.
  • API Errors: Check gsc-key.json path, permissions, and API quota (adjust GSC_THROTTLE_DELAY).
  • Permissions: Ensure gsc-key.json and config.ini are 600 (ls -l).
  • Virtual Environment: If pip fails, reactivate: source venv/bin/activate.
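When no email arrives, a standalone login test isolates SMTP problems from the rest of the bot. A minimal sketch using the same smtplib.SMTP_SSL call the script uses — check_smtp_login is a hypothetical helper; plug in the values from your config.ini [SMTP] section:

```python
import smtplib
import socket

# Standalone SMTP check, mirroring the SMTP_SSL call in indexmon.py.
# check_smtp_login is a hypothetical helper; pass your config.ini values.
def check_smtp_login(server, port, username, password, timeout=10):
    """Return (True, "ok") if an SSL login succeeds, else (False, reason)."""
    try:
        with smtplib.SMTP_SSL(server, port, timeout=timeout) as smtp:
            smtp.login(username, password)
        return True, "ok"
    except smtplib.SMTPAuthenticationError:
        return False, "bad username or password"
    except (socket.gaierror, OSError) as e:
        return False, f"connection failed: {e}"
```

A (False, "bad username or password") result points at credentials; a connection failure points at the SERVER/PORT settings or your network.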

Video Tutorial Walkthrough

Conclusion

You’re now equipped with the Norzer Google Index Bot, automating your SEO indexing checks on Ubuntu! Activate the virtual environment (source ~/indexmon/venv/bin/activate) whenever you work on it. It’s free, open-source, and ready for you to tweak! The indexmon.py code and config.ini file are below!

Questions? Hop into the free Discord server and ask.


indexmon.py (Updated: 6/5/25)

"""
Norzer Google Index Bot
------------------------
Website: https://norzer.me
Author: Ryan McCain (ryan@norzer.me)
Date: 6/9/25
Version: 3.0
"""

import configparser
import requests
import sqlite3
import xml.etree.ElementTree as ET
import smtplib
import os
import logging
import time
import pytz
from datetime import datetime
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
from google.oauth2 import service_account
from googleapiclient.discovery import build
from urllib.parse import urlparse
import fnmatch
from concurrent.futures import ThreadPoolExecutor, as_completed

# INITIAL SETUP
SMTP_SERVER = None
SMTP_PORT = None
SMTP_USERNAME = None
EMAIL_RECIPIENT = None
SMTP_PASSWORD = None

LOG_FILE = "indexbot.log"
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(levelname)s - %(message)s",
    handlers=[logging.FileHandler(LOG_FILE), logging.StreamHandler()]
)

def send_email(subject, body, smtp_server=None, smtp_port=None,
               smtp_username=None, smtp_password=None, email_recipient=None):
    smtp_server = smtp_server or SMTP_SERVER
    smtp_port = smtp_port or SMTP_PORT
    smtp_username = smtp_username or SMTP_USERNAME
    smtp_password = smtp_password or SMTP_PASSWORD
    email_recipient = email_recipient or EMAIL_RECIPIENT

    if not smtp_password:
        logging.error("❌ SMTP password not set. Email not sent.")
        return

    msg = MIMEMultipart()
    msg["From"] = smtp_username
    msg["To"] = email_recipient
    msg["Subject"] = subject
    msg.attach(MIMEText(body, "plain"))

    try:
        with smtplib.SMTP_SSL(smtp_server, smtp_port) as server:
            server.login(smtp_username, smtp_password)
            server.send_message(msg)
        logging.info("✅ Email sent successfully.")
    except Exception as e:
        logging.error(f"❌ Failed to send email: {str(e)}")

def main():
    global SMTP_SERVER, SMTP_PORT, SMTP_USERNAME, EMAIL_RECIPIENT, SMTP_PASSWORD, LOG_FILE

    config = configparser.ConfigParser()
    config_path = os.path.join(os.path.dirname(__file__), "config.ini")
    config.read(config_path)

    SITEMAP_URL = config.get("SETTINGS", "SITEMAP_URL")

    if not SITEMAP_URL:
        error_message = "❌ SITEMAP_URL is missing from the config file. Exiting."
        logging.error(error_message)
        send_email("Norzer Google Index Bot - Configuration Error", error_message)
        exit(1)

    SMTP_SERVER = config.get("SMTP", "SERVER", fallback=None)
    SMTP_PORT = config.getint("SMTP", "PORT", fallback=None)
    SMTP_USERNAME = config.get("SMTP", "USERNAME", fallback=None)
    EMAIL_RECIPIENT = config.get("SMTP", "EMAIL_RECIPIENT", fallback=None)
    SMTP_PASSWORD = os.getenv("SMTP_PASSWORD", config.get("SMTP", "SMTP_PASSWORD", fallback=None))

    # Log SMTP details, masking the password
    logging.info(f"SMTP_SERVER: {SMTP_SERVER}")
    logging.info(f"SMTP_PORT: {SMTP_PORT}")
    logging.info(f"SMTP_USERNAME: {SMTP_USERNAME}")
    logging.info(f"EMAIL_RECIPIENT: {EMAIL_RECIPIENT}")
    if SMTP_PASSWORD:
        masked_password = SMTP_PASSWORD[:5] + "*****" + SMTP_PASSWORD[-5:] if len(SMTP_PASSWORD) > 10 else "*****"
        logging.info(f"SMTP_PASSWORD: {masked_password}")
    else:
        logging.info("SMTP_PASSWORD: Not set")

    if not SMTP_PASSWORD:
        logging.error("❌ SMTP password not found in config.ini or environment variable.")

    SERVICE_ACCOUNT_FILE = config.get("SETTINGS", "GSC_CREDENTIALS", fallback=None)
    if not SERVICE_ACCOUNT_FILE or not os.path.exists(SERVICE_ACCOUNT_FILE):
        error_message = f"❌ Google service account JSON file is missing or not found: {SERVICE_ACCOUNT_FILE}"
        logging.error(error_message)
        send_email("Norzer Google Index Bot - Configuration Error", error_message)
        raise FileNotFoundError(error_message)

    SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
    DOMAIN_NAME = urlparse(SITEMAP_URL).netloc
    EXCLUDED_URLS = config.get("EXCLUDED_URLS", "urls", fallback="").split(", ")
    ISSUE_STATUSES = config.get("ISSUE_STATUSES", "statuses", fallback="").split(", ")
    LOG_FILE = os.path.join(os.path.dirname(__file__), config.get("SETTINGS", "LOG_FILE", fallback="indexbot.log"))
    logging.getLogger().handlers = [logging.FileHandler(LOG_FILE), logging.StreamHandler()]
    DB_FILE = os.path.join(os.path.dirname(__file__), "sitemap_data.db")
    GSC_MAX_WORKERS = config.getint("GOOGLE_SEARCH_CONSOLE", "GSC_MAX_WORKERS", fallback=3)
    GSC_THROTTLE_DELAY = config.getfloat("GOOGLE_SEARCH_CONSOLE", "GSC_THROTTLE_DELAY", fallback=0.3)

    DEFAULT_TIMEZONE = config.get("SETTINGS", "TIMEZONE", fallback="America/Chicago")
    try:
        LOCAL_TZ = pytz.timezone(DEFAULT_TIMEZONE)
    except pytz.UnknownTimeZoneError:
        error_message = f"❌ Invalid TIMEZONE setting: {DEFAULT_TIMEZONE}. Using default 'America/Chicago'."
        logging.error(error_message)
        send_email("Norzer Google Index Bot - Configuration Error", error_message)
        LOCAL_TZ = pytz.timezone("America/Chicago")

    def initialize_database():
        conn = sqlite3.connect(DB_FILE)
        cursor = conn.cursor()
        cursor.execute("""
            CREATE TABLE IF NOT EXISTS sitemap_urls (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                url TEXT NOT NULL UNIQUE
            )
        """)
        conn.commit()
        conn.close()

    def fetch_sitemap_urls(sitemap_url):
        logging.info(f"Fetching URLs from sitemap: {sitemap_url}")
        try:
            headers = {
                "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36",
                "Accept": "application/xml, text/xml"
            }
            response = requests.get(sitemap_url, headers=headers, timeout=10)
            response.raise_for_status()
            root = ET.fromstring(response.content)
            namespaces = {"sitemap": "http://www.sitemaps.org/schemas/sitemap/0.9"}
            urls = []

            if root.tag.endswith("sitemapindex"):
                for sitemap in root.findall("sitemap:sitemap", namespaces):
                    loc = sitemap.find("sitemap:loc", namespaces).text
                    logging.info(f"Found nested sitemap: {loc}")
                    urls.extend(fetch_sitemap_urls(loc))
            else:
                for url in root.findall("sitemap:url", namespaces):
                    loc = url.find("sitemap:loc", namespaces).text
                    urls.append(loc)

            conn = sqlite3.connect(DB_FILE)
            cursor = conn.cursor()
            for url in urls:
                try:
                    cursor.execute("INSERT OR IGNORE INTO sitemap_urls (url) VALUES (?)", (url,))
                except sqlite3.IntegrityError:
                    continue
            conn.commit()
            conn.close()
            logging.info(f"✅ Fetched and stored {len(urls)} URLs from sitemap.")
            return urls
        except Exception as e:
            error_message = f"❌ Failed to fetch sitemap {sitemap_url}: {str(e)}"
            logging.error(error_message)
            send_email("Norzer Google Index Bot - Sitemap Error", error_message)
            return []

    def is_url_excluded(url):
        return any(fnmatch.fnmatch(url, pattern) for pattern in EXCLUDED_URLS)

    def format_time(iso_time):
        if not iso_time or iso_time == "Unknown":
            return "Unknown"
        try:
            dt_utc = datetime.strptime(iso_time, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=pytz.utc)
            dt_local = dt_utc.astimezone(LOCAL_TZ)
            return dt_local.strftime("%m/%d/%y %I:%M %p %Z")
        except Exception:
            return "Unknown"

    def authenticate_gsc():
        credentials = service_account.Credentials.from_service_account_file(
            SERVICE_ACCOUNT_FILE, scopes=SCOPES
        )
        return build("searchconsole", "v1", credentials=credentials, cache_discovery=False)

    def get_indexing_status(url):
        try:
            service = authenticate_gsc()
            request = service.urlInspection().index().inspect(
                body={'inspectionUrl': url, 'siteUrl': f'sc-domain:{DOMAIN_NAME}'}
            )
            response = request.execute()
            result = response.get('inspectionResult', {}).get('indexStatusResult', {})
            coverage_state = result.get("coverageState", "Unknown")
            last_crawl = format_time(result.get("lastCrawlTime", "Unknown"))
            return coverage_state, last_crawl
        except Exception as e:
            error_str = str(e).lower()
            if "429" in error_str or "quota" in error_str:
                return "❌ API Error: Quota Exceeded", "Unknown"
            elif "timeout" in error_str:
                return "❌ API Error: Request Timed Out", "Unknown"
            elif "403" in error_str:
                return "❌ API Error: Permission Denied", "Unknown"
            elif "404" in error_str:
                return "❌ API Error: URL Not Found", "Unknown"
            elif "503" in error_str:
                return "❌ API Error: Service Unavailable", "Unknown"
            elif "500" in error_str:
                return "❌ API Error: Server Error", "Unknown"
            else:
                return f"❌ API Error: {str(e)}", "Unknown"

    def update_indexing_status():
        logging.info("Checking database for existing URLs...")
        conn = sqlite3.connect(DB_FILE)
        cursor = conn.cursor()
        cursor.execute("SELECT url FROM sitemap_urls")
        urls = [row[0] for row in cursor.fetchall()]
        conn.close()
        logging.info(f"Found {len(urls)} URLs in database.")

        logging.info("Fetching URLs from sitemap to ensure latest data...")
        urls = fetch_sitemap_urls(SITEMAP_URL)

        excluded_urls = [url for url in urls if is_url_excluded(url)]
        urls_to_check = [url for url in urls if not is_url_excluded(url)]

        logging.info(f"📜 Logging to: {os.path.abspath(LOG_FILE)}")
        logging.info(f"🌎 Using Timezone: {LOCAL_TZ.zone}")
        logging.info(f"🔄 Max Threads (Workers): {GSC_MAX_WORKERS}")
        logging.info(f"⏳ Throttle Delay (Seconds): {GSC_THROTTLE_DELAY}")
        logging.info(f"🔍 Checking indexing status for {len(urls_to_check)} URLs...\n")

        email_header = (
            f"📜 Logging to: {os.path.abspath(LOG_FILE)}\n"
            f"🌎 Timezone: {LOCAL_TZ.zone}\n"
            f"🔄 Max Threads: {GSC_MAX_WORKERS}\n"
            f"⏳ Throttle Delay: {GSC_THROTTLE_DELAY} seconds\n"
            "---------------------------------\n"
        )

        issues, indexed_urls, api_errors = [], [], []

        with ThreadPoolExecutor(max_workers=GSC_MAX_WORKERS) as executor:
            future_to_url = {executor.submit(get_indexing_status, url): url for url in urls_to_check}
            for future in as_completed(future_to_url):
                url = future_to_url[future]
                status, last_crawl = future.result()
                log_msg = f"{url} -> {status} (Last Indexed: {last_crawl})"
                logging.info(log_msg)
                if status in ISSUE_STATUSES:
                    issues.append(log_msg)
                elif "API Error" in status:
                    api_errors.append(log_msg)
                else:
                    indexed_urls.append(log_msg)
                time.sleep(GSC_THROTTLE_DELAY)

        logging.info("\n📛 EXCLUDED URLs:")
        if excluded_urls:
            for url in excluded_urls:
                logging.info(f"{url}")
        else:
            logging.info("None")

        # Prepare joined text blocks before putting into f-strings
        issues_text     = "\n".join(issues) or "None"
        indexed_text    = "\n".join(indexed_urls) or "None"
        api_errors_text = "\n".join(api_errors) or "None"
        excluded_text   = "\n".join(excluded_urls) or "None"

        email_body = (
            email_header
            + f"🚨 ISSUES:\n{issues_text}\n\n"
            + f"✅ INDEXED URLs:\n{indexed_text}\n\n"
            + f"🚫 API ERRORS:\n{api_errors_text}\n\n"
            + f"📛 EXCLUDED URLs:\n{excluded_text}\n\n"
            + "---\n"
            + "🚀 Powered by Norzer\n"
            + "✨ Helping small businesses increase leads with expert Local SEO.\n"
            + "🌐 High-performance, secure website design & hosting.\n"
            + "🔗 https://norzer.me | 📧 hello@norzer.me"
        )

        send_email(f"Norzer Google Index Report for {DOMAIN_NAME}", email_body)

    initialize_database()
    update_indexing_status()

if __name__ == "__main__":
    try:
        main()
    except Exception as e:
        error_message = (
            f"❌ Fatal Error: {str(e)}\n\n"
            f"Script terminated unexpectedly. Check the log file for details: {os.path.abspath(LOG_FILE)}"
        )
        logging.error(error_message)
        send_email("Norzer Google Index Bot - Fatal Error", error_message)

config.ini

#Norzer Google Index Bot Settings
#Website: https://norzer.me
#Author: Ryan McCain (ryan@norzer.me)
#Version 2.1
#
#Description:
#This script checks the indexing status of URLs from a website's XML sitemap
#using the Google Search Console API. It sends an email report with results,
#categorizing URLs into Indexed, Not Indexed (Issues), and Excluded URLs.

[SETTINGS]

# Secure this file by restricting permissions.
# Run the following command to allow only the owner to read/write:
# chmod 600 config.ini
# This protects sensitive credentials stored below.

# URL for the website's XML sitemap
SITEMAP_URL = https://mizakandpacetti.com/sitemap_index.xml

# Path to the Google Cloud API credentials JSON file for Search Console access
GSC_CREDENTIALS = /root/indexmon/gsc-key.json

#Log file
LOG_FILE = indexbot.log

# === Timezone Configuration ===
# Set your desired timezone to ensure correct timestamps in reports.
# Available U.S. Timezones:
# - America/New_York       (Eastern Time - ET)
# - America/Chicago        (Central Time - CT)
# - America/Denver         (Mountain Time - MT)
# - America/Phoenix        (Mountain Standard Time - MST, no DST)
# - America/Los_Angeles    (Pacific Time - PT)
# - America/Anchorage      (Alaska Time - AKST)
# - Pacific/Honolulu       (Hawaii-Aleutian Time - HST, no DST)
TIMEZONE = America/Chicago

# SMTP email settings for sending reports

[SMTP]
SERVER=smtp.norzer.me
PORT=465
USERNAME=mailer@norzer.me
EMAIL_RECIPIENT=ryan@norzer.me

# WARNING: Storing passwords in plain text is a security risk.
# Ensure this file is secured with proper permissions (chmod 600).
SMTP_PASSWORD=$$$$ma1l4444$$$$

# List of URL patterns to exclude from indexing checks.
# Any URL matching these patterns will not be checked.
# Use '*' as a wildcard (e.g., https://domain.com/tags/* to exclude all tag pages).

[EXCLUDED_URLS]
urls=https://mizakandpacetti.com/category/*, https://mizakandpacetti.com/privacy-policy-2/, https://mizakandpacetti.com/why-mizak-pacetti/

# List of indexing statuses considered as issues.
# If a URL has any of these statuses anywhere in the result, it will be categorized as an issue in the email report.
# Explanation of common statuses:
# - "Discovered - currently not indexed": Google found the page but hasn't indexed it yet.
# - "URL is unknown to Google": Google has no record of the URL.
# - "Error" / "ERROR" / "ALERT": The page has indexing issues that require attention.

[ISSUE_STATUSES]
statuses=Error, ERROR, ALERT, currently not indexed, Crawled - currently not indexed, Discovered - currently not indexed, URL is unknown to Google, Unknown

[GOOGLE_SEARCH_CONSOLE]
# Controls how many URLs are checked at the same time.
# More workers = Faster, but risks hitting Google API rate limits.
# Fewer workers = Slower, but safer.
#
# Recommended range:
# - Safe Low-End: 3 (slower, best for small sites or avoiding API limits)
# - Safe High-End: 10 (faster, but may hit rate limits)
# - Default: 5 (balanced)
GSC_MAX_WORKERS=8

# Controls how long to wait between API requests (in seconds).
# Lower = Faster, but risks hitting API rate limits.
# Higher = Safer, but slower.
#
# Recommended range:
# - Safe Low-End: 0.3 seconds (faster, but may hit API limits)
# - Safe High-End: 2.0 seconds (safer, but slower)
# - Default: 0.5 seconds (balanced)
GSC_THROTTLE_DELAY=0.3

That's it! Enjoy! 👽
