# Detect crawlers and bots

To detect crawlers or bots, first [pass the user agent to Hypertune](https://docs.hypertune.com/guides/pass-the-user-agent-to-hypertune).
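The user agent is typically read from the incoming request's `User-Agent` header before being passed into your Hypertune context as described in the linked guide. A minimal sketch of the header lookup (the handler shape and helper name here are illustrative assumptions, not part of the Hypertune SDK):

```typescript
// Hypothetical helper: extract the User-Agent header from a request's
// headers so it can be passed to Hypertune (see the linked guide).
// HTTP header names are case-insensitive, so normalize before comparing.
function getUserAgent(headers: Record<string, string | undefined>): string {
  const entry = Object.entries(headers).find(
    ([name]) => name.toLowerCase() === "user-agent"
  );
  return entry?.[1] ?? "";
}
```

Frameworks like Next.js or Express expose this header directly on the request object, so in practice you may not need a helper at all.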

Next, add a rule to your flag that checks whether the user agent matches a crawler or bot using a regular expression:

{% code title="regex" %}

```regex
(?i)(bot|crawler|crawl|spider|slurp|fetch|search|monitor|scraper|python|perl|php|java|wget|curl|httpclient|libwww|bingpreview|mediapartners\-google)
```

{% endcode %}
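Before adding the rule, you can sanity-check the pattern locally. One caveat if you test it in JavaScript or TypeScript: JavaScript's regex engine does not support the inline `(?i)` modifier, so the equivalent is the `i` flag on the `RegExp` constructor. A sketch (the `isBot` helper is illustrative, not part of any SDK):

```typescript
// The bot-detection pattern from above, made case-insensitive with the
// "i" flag instead of the inline (?i) modifier, which JavaScript regexes
// do not support.
const botPattern = new RegExp(
  "(bot|crawler|crawl|spider|slurp|fetch|search|monitor|scraper|python|perl|php|java|wget|curl|httpclient|libwww|bingpreview|mediapartners-google)",
  "i"
);

function isBot(userAgent: string): boolean {
  return botPattern.test(userAgent);
}

// Matches because the user agent contains "bot":
isBot("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"); // true

// A typical desktop browser user agent contains none of the keywords:
isBot("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36"); // false
```

Note that keyword lists like this catch well-behaved crawlers and common HTTP clients, but bots that spoof a browser user agent will slip through.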

<figure><img src="https://2048905609-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FWa3rQLiu4JZhBRkiyoKz%2Fuploads%2FCQh4WZsJbgYQ2WS0sObg%2FScreenshot%202025-08-23%20at%2015.41.46.png?alt=media&#x26;token=0c5248eb-33dc-4b8b-b684-773ba467a042" alt=""><figcaption></figcaption></figure>

In the example above, the rule runs before the default rule that contains the `Experiment` expression. As a result, crawlers and bots exit the flag logic immediately and never enter the experiment, which keeps exposures and analytics clean.

For this reason, we recommend keeping all targeting logic within a single flag rather than spreading it across multiple flags or hardcoding it in your application.

Hypertune makes this possible via its flexible configuration language, [Hyperlang](https://docs.hypertune.com/concepts/logic#hyperlang).
