Link11 WAAP v2.16

Enabling Passive Challenges

As described in Traffic Concepts, Reblaze includes an Active Challenge process out of the box. This is very useful for distinguishing humans from bots.

Active Challenges work well, but Passive Challenges are an even better option:

  • Active Challenges temporarily redirect the user's browser. This can affect site metrics gathered by products such as Google Analytics. (Specifically, the initial referrer information is lost.) Passive Challenges are simple pieces of Javascript. They do not redirect the user's browser; they merely ask it to solve a challenge, and then insert the Reblaze cookies.

  • Active Challenges will not occur when site content is served from a CDN. Passive Challenges can still detect bots in this situation.

Most importantly, Passive Challenges allow Reblaze to use Biometric Bot Detection—an advanced and sophisticated means of distinguishing humans from automated traffic sources.

Biometric Bot Detection

With Biometric Bot Detection, Reblaze continually gathers and analyzes statistics such as client-side I/O events triggered by the user's keyboard, mouse, scrolling, touch, zoom, device orientation and movement, and more. Based on these metrics, the platform uses Machine Learning to construct and maintain behavioral profiles of legitimate human visitors. Reblaze learns and understands how actual humans interact with the web apps it is protecting. Continuous multivariate analysis verifies that each user is conforming to expected behavioral patterns, and is thus a human user with legitimate intentions.

We recommend that all customers enable Passive Challenges if it is possible to do so. Biometric Bot Detection provides much more robust detection of automated traffic than is possible without it.

Implementation

Implementing Passive Challenges is simple. Place this Javascript code within the pages of your web applications:

<script src="/c3650cdf-216a-4ba2-80b0-9d6c540b105e58d2670b-ea0f-484e-b88c-0e2c1499ec9bd71e4b42-8570-44e3-89b6-845326fa43b6" type="text/javascript"></script>

The code snippet can go into the header or at the end of the page. The best practice is to place it within the header. This ensures that subsequent calls contain authenticating cookies.

(This matters for the first page served to a visitor. Subsequent pages will already have the authenticating cookies to include with their requests.)
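As an illustration, a page that loads the snippet from its header might look like the following (a minimal sketch; the script src is the challenge path shown above, and the rest of the markup is placeholder content):

<!DOCTYPE html>
<html>
  <head>
    <title>Example page</title>
    <!-- Passive Challenge snippet, loaded early so the Reblaze cookies are set before subsequent requests -->
    <script src="/c3650cdf-216a-4ba2-80b0-9d6c540b105e58d2670b-ea0f-484e-b88c-0e2c1499ec9bd71e4b42-8570-44e3-89b6-845326fa43b6" type="text/javascript"></script>
  </head>
  <body>
    <!-- ...page content... -->
  </body>
</html>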

Most customers set up the code snippet as a tag within their tag manager. This makes it simple to install it instantly across their entire site/application.

If desired, the script code can include async and/or defer attributes:

<script async src="/c3650cdf-216a-4ba2-80b0-9d6c540b105e58d2670b-ea0f-484e-b88c-0e2c1499ec9bd71e4b42-8570-44e3-89b6-845326fa43b6" type="text/javascript"></script>
<script defer src="/c3650cdf-216a-4ba2-80b0-9d6c540b105e58d2670b-ea0f-484e-b88c-0e2c1499ec9bd71e4b42-8570-44e3-89b6-845326fa43b6" type="text/javascript"></script>

These usually are not necessary, and their effect will depend on the placement of the script within the page. Their use is left to your discretion.

Testing

To test the implementation, use a browser to visit a page containing the Javascript snippet. Once it runs, the browser should have a cookie named rbzid.
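For example, you can check for the cookie from the browser's developer console (a quick sketch; it assumes the default cookie name rbzid and that the cookie is readable from script in your deployment; otherwise, look for rbzid in the browser's cookie storage via the developer tools):

// true if an rbzid cookie is present for the current site
document.cookie.split('; ').some(function (c) { return c.indexOf('rbzid=') === 0; });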

Disabling Active Challenges (Optional)

There are two primary situations where customers sometimes want to disable Active Challenges:

  • When a customer needs site analytics to correctly reflect all referrers. (Active Challenges can interfere with this.)

  • For API endpoints. Active Challenges are designed to verify the client's browser environment; for most API calls, there is no browser environment to verify. (For users of our Mobile SDK, this is not a problem. They can still use Active Challenges for these endpoints.)

Other than those situations, Active Challenges can be very beneficial. We recommend that you keep them enabled if possible; they automatically eliminate almost all DDoS traffic, scanning tools, and other hostile bot traffic. If you have not enabled Passive Challenges (and successfully tested them), disabling Active Challenges is not recommended.

If you wish to turn off Active Challenges, do the following:

  • For an entire site/application: remove the "Deny Bot" ACL Policy from all Profiles within the site.

  • For specific URLs/locations: remove the default "Deny Bot" ACL Policy from all Profiles that are assigned to those locations.

  • For specific traffic sources: add an ACL Policy that will 'Allow' those specific requestors. The requestors should be defined as narrowly as possible.

Turning off Active Challenges will disable the direct blocking of bots (where a requestor is blocked merely for being identified as a bot). However, automated traffic will still be excluded via all other relevant means. If you merely remove the Deny Bot ACL Policy from the relevant Profiles, then bots will still be excluded by the other active ACL Policies, Dynamic Rules, content filtering, and so on. If instead you added an "Allow" ACL Policy for specific requestors, then other ACL Policies will not block those requestors; they will be exempted from ACL Policy filtering.
