
Improving a Web Form: Fighting Back Against Bots

Over time, we’ve built a few web forms—some simple, some more complex. Unfortunately, bots love web forms. They fill them out with spam, probe for vulnerabilities, or just cause general headaches. If you’ve ever maintained a form in production, you know exactly what I mean.

To cut down on junk submissions, we’ve implemented a few strategies that have worked well for us. These aren’t foolproof, but they definitely raise the bar and help filter out a lot of automated noise.

  1. Honeypot Fields

    One of the simplest tricks: add a hidden field that normal users will never see or fill out. Bots, on the other hand, often just dump data into every available field. If this hidden field has any content on submission, you can safely reject it.

    This technique is quick to implement and effective against the lazier bots.
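    As a minimal sketch: the server-side check is just "reject if the hidden field has anything in it." The field name `website` is illustrative—pick something a bot would plausibly fill in.

```python
def is_honeypot_tripped(form_data: dict) -> bool:
    """Reject the submission if the hidden honeypot field has content.

    `website` is an illustrative field name. Hide the input with CSS
    (e.g. position it off-screen) rather than type="hidden", since
    some bots skip inputs that are explicitly marked hidden.
    """
    return bool(form_data.get("website", "").strip())

# A human never sees the field, so it arrives empty or missing;
# a bot that fills every input trips the check.
print(is_honeypot_tripped({"name": "Alice", "website": ""}))          # False
print(is_honeypot_tripped({"name": "Bot", "website": "http://spam"}))  # True
```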

  2. Timestamped Tokens

    Include a timestamp (or a hashed version of one) in a hidden field when the form is rendered. Then, validate that timestamp on submission. This helps catch bots that scrape your form and replay the same request hours or days later.

    It also helps reduce the risk of people accidentally re-submitting an outdated form after leaving a tab open.

  3. JavaScript-Generated Fields

    Add a hidden field that gets filled by JavaScript when the page loads. If a submission comes in without this value, chances are high it came from a non-browser client or an environment where JavaScript didn’t run—classic bot behavior.

    This method helps weed out scripts that skip rendering and just POST directly to your endpoint.
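    The client side is a one-liner—a script that copies a challenge value rendered into the page (say, a data attribute) into the hidden field on load. The server-side check, sketched here with illustrative names, then just compares what came back against what was issued:

```python
def passed_js_check(form_data: dict, expected: str) -> bool:
    """True if the hidden field was populated by the page's script.

    The page renders `expected` into a data attribute, and a small
    script copies it into the hidden `js_token` input on load.
    Clients that never executed JavaScript submit the field empty
    or missing. Field and parameter names are illustrative.
    """
    return form_data.get("js_token") == expected
```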

  4. User Agent + Progressive Rendering

    You can use the user agent to detect known bots (like crawlers or testing tools) and choose not to render your form at all in those cases. Or you can use progressive enhancement to only inject the form into the page via JavaScript, keeping it hidden from bots that don’t run scripts.

    Of course, bots can spoof user agents, so don’t rely solely on this method—but it’s another useful layer.
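    A cheap version of the user-agent check is a substring match against common crawler and tooling signatures—the list below is a small illustrative sample, not exhaustive, and a match should be treated as one signal among several:

```python
def looks_like_bot(user_agent: str) -> bool:
    """Substring check against common crawler/tool signatures.

    Trivially spoofable, so use the result only to decide whether to
    render the form, not as your sole defense. The signature list is
    an illustrative sample.
    """
    signatures = ("bot", "crawler", "spider", "curl", "wget",
                  "python-requests", "headlesschrome")
    ua = user_agent.lower()
    # A missing user agent is itself suspicious.
    return not ua or any(s in ua for s in signatures)
```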

  5. Duplicate Submission Detection

    Sometimes, the same user accidentally submits the form twice—maybe they reopened an old tab or hit refresh. It’s good practice to check for duplicate entries in your database and block or warn about resubmissions.

    This also catches bots that try sending the same payload over and over.

Got Any Tricks of Your Own?

These are just a few of the techniques we’ve found useful. There’s no silver bullet—realistically, you’re trying to make your form just annoying enough that bots move on to an easier target.

What anti-bot tricks have you used? I’d love to hear what’s worked for you.

10 04 25 - 11:56