How to Protect the Google Search API from Bots with CleanTalk Anti-Spam

When Bots Start Wasting Your API Budget
Many websites use the paid Google Search API (Programmable Search / Custom Search JSON API) to power their internal search. Each search request triggers a paid API call.
As traffic grows, automated bots and scripts often begin abusing the search form, generating large volumes of useless requests. This leads to wasted API quota and unnecessary expenses.
This case study explains how protecting Google Search API from bots significantly reduced our API usage and costs.
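To see why bots are expensive here, consider a typical unprotected search handler, which forwards every submitted query straight to the paid endpoint. A minimal sketch - the endpoint is the real Custom Search JSON API, while the key, engine ID, and helper name are placeholders:

```php
<?php
// Build a Google Custom Search JSON API request URL.
// Every request sent to this endpoint counts against the paid quota,
// whether it came from a real visitor or from a bot.
function buildSearchUrl(string $apiKey, string $cx, string $query): string
{
    return 'https://www.googleapis.com/customsearch/v1?' . http_build_query([
        'key' => $apiKey, // API key from the Google Cloud console
        'cx'  => $cx,     // Programmable Search Engine ID
        'q'   => $query,  // whatever arrived in the form - human or not
    ]);
}

// Unprotected flow: one paid call per form submission.
// $json = file_get_contents(buildSearchUrl('YOUR_KEY', 'YOUR_CX', $_POST['query_input'] ?? ''));
```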
The Problem
Before implementing protection, our Google Search API usage showed:
- Sudden spikes in API requests
- Unstable and unpredictable quota consumption
- Increasing monthly costs
- Repetitive and meaningless automated queries
- Suspiciously high request frequency
Investigation revealed that a large portion of API calls came from bots, scrapers, and automated scripts using our search form as a proxy to access Google Search.
Each automated request was a paid API call without any real user value.
The Solution - Protect Google Search API from Bots
We implemented bot protection at the search form level to stop automated requests before they reach Google Search API.
The key rule:
A Google API request is executed only if the user passes validation.
The protection included:
- Behavioral analysis (request timing, interaction patterns)
- Filtering automated and scripted traffic
- Blocking suspicious and repeated search patterns
This ensured that only real users could trigger paid API calls.
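As an illustration of the behavioral side, here is a minimal, vendor-neutral sketch of one such signal: rejecting queries that arrive faster than a human could plausibly type and submit. The threshold and in-memory store are illustrative; a real deployment combines many signals and uses persistent storage.

```php
<?php
// One behavioral signal: minimum interval between searches per client.
// $store maps a client IP to the timestamp of its last query.
// Returns true when the request looks automated and the paid API call
// should be skipped. The 2-second threshold is an illustrative default.
function isTooFast(array &$store, string $ip, float $now, float $minInterval = 2.0): bool
{
    $last = $store[$ip] ?? 0.0;
    $store[$ip] = $now;
    return ($now - $last) < $minInterval;
}
```

A human pausing a few seconds between searches passes; a script hammering the form every few hundred milliseconds is filtered out before any quota is spent.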
Simple Integration Example
Below is a simplified example of how to protect Google Search API from bots. In production, the logic can be more advanced.
Step 1 - install SDK
composer require cleantalk/php-antispam
Step 2 - add the CleanTalk Anti-Spam handler to your form handler
require_once __DIR__ . '/vendor/autoload.php'; // Composer autoloader for the SDK

$apikey = ''; // get your access key at cleantalk.org (free trial available)
$message = $_POST['query_input'] ?? ''; // the search query submitted from your form
$cleantalk_antispam = new CleantalkAntispam($apikey, '');
$cleantalk_antispam->setMessage($message);
$api_result = $cleantalk_antispam->handle();
Step 2.1 - add the JS library to your HTML template
<script src="https://fd.cleantalk.org/ct-bot-detector-wrapper.js" defer></script>
Step 3 - handle the result received from the cloud
if ($api_result && $api_result->allow === 0) {
    die('Blocked. Spam protection OK. Reason: ' . $api_result->comment);
}
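The key rule from above - spend quota only on validated visitors - can be factored into a small gate. The verdict check mirrors the handler result shown in Step 3; performGoogleSearch stands in for whatever wrapper you use around the Custom Search JSON API and is a hypothetical name:

```php
<?php
// Run the paid search only when the anti-spam verdict allows it.
// $api_result is the object returned by the handler in Step 2
// (allow === 0 means "blocked"); $search is your Google API wrapper.
function gatedSearch(?object $api_result, callable $search, string $query)
{
    if ($api_result && $api_result->allow === 0) {
        return null; // blocked: no quota is spent
    }
    return $search($query); // validated: the only path that triggers a paid call
}

// Usage (performGoogleSearch is a hypothetical wrapper):
// $results = gatedSearch($api_result, 'performGoogleSearch', $_POST['query_input'] ?? '');
```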
Even a basic implementation produced measurable results. For more information, see: https://github.com/CleanTalk/php-antispam
Results
After deploying protection to block bots from accessing Google Search API:
- API request volume dropped severalfold
- Monthly Google Search API costs dropped significantly
- Quota usage became stable and predictable
- Automated traffic was almost completely eliminated
- Analytics reflected real user behavior only
In practice, we began paying only for legitimate user searches.
Additional Benefits
Beyond reducing API costs, the implementation also:
- Lowered overall server load by filtering out automated requests before they reached the backend
- Prevented the website from being used as a scraping proxy through the search form
- Strengthened website security by blocking automated and abusive traffic
- Made API usage more predictable and easier to budget by removing bot-driven spikes
- Improved overall platform stability
Conclusion
If your website uses Google Search API, protecting it from bots is critical to prevent wasted quota and unnecessary expenses. By stopping automated requests before they reach the API, we significantly reduced costs and stabilized usage while ensuring that only genuine user searches trigger paid requests.