How to Protect Google Search API from Bots and Cut Costs by Almost 90%

When Bots Start Wasting Your API Budget
Many websites use the paid Google Search API, including Programmable Search Engine and the Custom Search JSON API, to power internal website search. Each search request can trigger a paid API call.
As traffic grows, automated bots and scripts can start abusing the search form, generating large volumes of useless search requests. This creates wasted API quota, unstable usage patterns, and unnecessary expenses.
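To make the cost mechanics concrete, here is an illustrative sketch of what a single paid request to the Custom Search JSON API looks like. The endpoint and parameter names (`key`, `cx`, `q`) come from Google's public API documentation; the credentials and helper name are placeholders, not part of any SDK.

```php
<?php
// Build the URL for one paid Custom Search JSON API request.
// $apiKey and $engineId are placeholder credentials you obtain from Google.
function build_search_url(string $apiKey, string $engineId, string $query): string
{
    return 'https://www.googleapis.com/customsearch/v1?' . http_build_query([
        'key' => $apiKey,    // your Google API key
        'cx'  => $engineId,  // your Programmable Search Engine ID
        'q'   => $query,     // whatever the visitor typed into the search form
    ]);
}

// Every GET to this URL counts against the paid quota, whether it comes
// from a real visitor or from a bot hammering the search form:
// $json = file_get_contents(build_search_url($key, $cx, 'user query'));
```

Because billing happens per request, anything that submits the form, human or script, spends quota.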
In our case, protecting Google Search API from bots helped reduce Custom Search API costs from $750.75 in January to $145.81 in February, $130.95 in March, and $92.38 in April.
This case study explains how to protect Google Search API from bots before automated requests reach the paid API.
The Problem
Before implementing protection, our Google Search API usage showed several clear signs of automated abuse:
- Sudden spikes in API requests
- Unstable and unpredictable quota consumption
- Increasing monthly Google Search API costs
- Repetitive and meaningless automated queries
- Suspiciously high request frequency
Investigation revealed that a large portion of API calls came from bots, scrapers, and automated scripts using the search form as a proxy to access Google Search.
Each automated request was a paid API call without any real user value. Instead of paying mainly for legitimate website searches, the API budget was being consumed by bot-generated requests.
The Solution: Protect Google Search API from Bots
We implemented bot protection at the search form level to stop automated requests before they reach Google Search API.
The key rule was simple:
A Google API request is executed only if the user passes validation.
The protection included:
- Behavioral analysis, including request timing and interaction patterns
- Filtering of automated and scripted traffic
- Detection of suspicious and repeated search patterns
- Blocking abusive requests before they can trigger paid API calls
This ensured that only real users could trigger paid search requests.
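The gating rule above can be sketched in a few lines. This is a minimal illustration, not the production implementation: both helper functions are hypothetical stand-ins, and the placeholder frequency check merely represents where behavioral analysis and the cloud verdict would plug in.

```php
<?php
// Illustrative gate: a paid Google API request is executed only if the
// visitor passes validation. is_real_user() is a stand-in for the real
// validation stack (behavioral analysis plus a cloud anti-spam verdict).
function is_real_user(array $signals): bool
{
    // Placeholder checks: a simple frequency limit and a token that a
    // JS bot detector would set in the browser.
    return ($signals['requests_last_minute'] ?? 0) < 10
        && !empty($signals['js_token']);
}

function handle_search(string $query, array $signals): string
{
    if (!is_real_user($signals)) {
        // Blocked before any paid API call is made: zero quota spent.
        return 'blocked';
    }
    // Only validated visitors trigger the paid Google Search API call.
    return 'search: ' . $query;
}
```

The key design point is the ordering: validation runs first, so rejected traffic never touches the paid API at all.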
Simple Integration Example
Below is a simplified example of how to protect Google Search API from bots. In production, the logic can be expanded depending on the website architecture, search form, and backend flow.
Step 1 - Install SDK
composer require cleantalk/php-antispam
Step 2 - Add CleanTalk Anti-Spam handler to your form handler
$apikey = ''; // get your access key at cleantalk.org (free trial)
$message = $_POST['query_input']; // get it from your form
$cleantalk_antispam = new CleantalkAntispam($apikey, '');
$cleantalk_antispam->setMessage($message);
$api_result = $cleantalk_antispam->handle();
Step 2.1 - Add JS lib to your HTML template
<script src="https://fd.cleantalk.org/ct-bot-detector-wrapper.js" defer></script>
Step 3 - Handle the result received from the cloud
if ($api_result && $api_result->allow === 0) {
die('Blocked. Spam protection OK. Reason: ' . $api_result->comment);
}
Even a basic implementation produced measurable results. For more advanced validation logic and flexible integration, you can also use the Anti-Spam API: https://cleantalk.org/help/api-antispam
For more information, see: https://github.com/CleanTalk/php-antispam
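The steps above leave one thing implicit: where the paid Google call sits relative to the verdict check. The sketch below makes that explicit. The verdict shape mirrors the snippet from Step 3 (`allow === 0` means "blocked"); the search callable is a hypothetical stand-in for the real HTTP request to https://www.googleapis.com/customsearch/v1, kept injectable so the gating logic can be exercised without network access.

```php
<?php
// Full flow in miniature: the anti-spam verdict gates the paid search call.
// $verdict is the object returned by $cleantalk_antispam->handle();
// $paid_search is a stand-in for the actual Custom Search API request.
function gated_search(?object $verdict, string $query, callable $paid_search): string
{
    if ($verdict && $verdict->allow === 0) {
        // Bot traffic stops here; no paid quota is consumed.
        return 'Blocked. Reason: ' . ($verdict->comment ?? 'automated traffic');
    }
    // Only validated requests spend a paid Custom Search API call.
    return $paid_search($query);
}
```

In production the callable would perform the HTTP request and render results; the structure guarantees that a blocked verdict short-circuits before any quota is spent.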
Results: Google Search API Costs Dropped by Up to 87.7%
After deploying protection to block bots from accessing Google Search API, the cost difference became clear:
- Monthly Custom Search API cost dropped from $750.75 in January to $145.81 in February
- Costs remained lower in the following months: $130.95 in March and $92.38 in April
- This represents an 80.6% decrease in February compared to January
- By April, monthly cost was down by 87.7% compared to January
- API usage became more stable and predictable
- Automated traffic was significantly reduced
- Analytics reflected real user behavior more accurately
In practice, we began paying mainly for legitimate user searches instead of bot-generated requests.
Additional Benefits
Beyond reducing API costs, the implementation also lowered overall server load by eliminating unnecessary automated requests before they reached the backend. It prevented the website from being used as a scraping proxy through the search form, improving overall platform stability.
The protection strengthened website security by blocking automated and abusive traffic, while also making API usage more predictable and easier to budget thanks to the removal of sudden spikes caused by bots.
For broader website anti-bot protection across forms, registrations, comments, and other entry points, the same approach can be extended beyond internal search: https://cleantalk.org/anti-spam
Conclusion
If your website uses Google Search API, protecting it from bots is critical to prevent wasted quota and unnecessary expenses. By stopping automated requests before they reach the API, we reduced Custom Search API costs from $750.75 in January to $92.38 in April, while making usage more stable and ensuring that only genuine user searches trigger paid requests.