

What Are Crawlers and Why Do I Want Them in My Software?

By Vound Software

Crawlers. Even the name sends shivers down your spine as it conjures up images of creepy things that hide from the light. Yet most of us take advantage of the work they do every day, and some of us could not function without them.

Those who are in the digital forensics field rely on crawlers to maximize data processing, which in turn, enables them to identify critical evidence in their cases. The link between crawler activity in your forensic search tools and your ability to solve cases involving drug trafficking, human trafficking, weapon smuggling, and child exploitation is stronger than you may think.

What are Crawlers, Anyway?

Crawlers, also known as spiders or bots, are essentially programs designed to do one thing: crawl through your data. Their job is to ingest and collect data in preparation for indexing, similar to the way ants collect your lunch during a picnic.

Much like the industrious creepy crawlies of the world, bots carry on their work behind the scenes while you are busy focusing on other things. Think of the last time you accessed the internet. Did you use a search engine to find what you were looking for? Your search was successful in part because web crawlers combed through the website you visited and indexed key terms and phrases.

The Beauty of the Index

Indexing is a process of organizing and tracking data in a way that shortens the time needed to find it again. Imagine you had a book in your hands and wanted to locate the word "wine". If the book is a picture book with only a handful of words per page, it won't take long at all to find it. Now imagine that book is a chapter book with small font and 1500 pages, and you have to identify every place the author uses the word "wine". All of a sudden, The Three Musketeers is no longer your favorite book. But what if there was an index at the back of the book that listed those pages for you? Now you can easily flip to exactly where the word "wine" appears in the story.

Indexing creates tables of data markers that databases can use to jump from one piece of data to another. This optimization shortens the time it takes to search, which means you get your data faster.
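The book analogy above maps directly onto what's often called an inverted index: a lookup table from each word to the places it appears. Here's a minimal sketch in Python (the page texts and function names are illustrative, not anything from Intella):

```python
from collections import defaultdict

def build_index(documents):
    """Map each word to the set of document IDs that contain it,
    like a book index mapping words to page numbers."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

def search(index, word):
    """Jump straight to the matching documents -- no page-by-page scan."""
    return sorted(index.get(word.lower(), set()))

# Each "page" of our tiny book:
pages = {
    1: "The musketeers ordered wine at the inn",
    2: "They rode through the night",
    3: "More wine was poured before the duel",
}

index = build_index(pages)
print(search(index, "wine"))  # -> [1, 3]
```

Building the index takes time up front, but every search afterward is a direct lookup instead of a full scan of the data, which is exactly the trade-off forensic indexing tools make.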

But What do Crawlers and Indexes Have to do with Intella?

When you add evidence to a case in Intella, Intella's crawlers set to work ingesting your data. In this process, information from the evidence is loaded into the case index. To make your life easier, Intella determines your memory settings automatically, calculating the number of crawlers assigned to your case files based on your system's configuration. It takes into account factors such as the amount of RAM and the number of CPU cores on your machine, ensuring that crawling doesn't take too many resources away from the other critical processes running there.
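To make the idea concrete, a resource-aware worker count can be derived from cores and RAM along these lines. This is a hypothetical heuristic for illustration only, not Intella's actual formula; the function name and the 4 GB-per-crawler figure are assumptions:

```python
import os

def suggest_crawler_count(ram_gb, ram_per_crawler_gb=4):
    """Hypothetical heuristic (not Intella's actual formula): cap the
    number of crawler workers by both CPU cores and available RAM,
    leaving headroom for other processes on the machine."""
    cores = os.cpu_count() or 1
    by_cpu = max(1, cores - 2)                      # reserve cores for the OS and UI
    by_ram = max(1, ram_gb // ram_per_crawler_gb)   # don't exhaust memory
    return min(by_cpu, by_ram)

print(suggest_crawler_count(ram_gb=32))
```

The key design point is taking the minimum of the two limits: adding more crawlers than the CPU or memory can sustain just makes workers compete for resources instead of finishing faster.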

What if I Want to Change the Default Crawler Settings?

We know you are busy. We know you have better things to do than wait hours or days for a case to be processed before you can start your digital investigation and review. We know that sometimes it's important to assign extra resources to the job to get it done faster, or that you may have purchased a high-end system to handle the extra workload.

Intella gives you the option of customizing your memory settings to optimize for better performance. You can increase the number of crawlers assigned to a case as well as the amount of memory allocated to each crawler.

Where Do I Go to Learn How to Adjust Those Settings?

Glad you asked. Here are a few resources available to you.

  1. Check out our training courses. You can take them remotely or in-person.
  2. Check out this forum post:
  3. Put in a support ticket and talk to our support team about getting these settings right for your team.

Even better, register here to attend our live webinar on November 5th, where tech support specialists Jon Hirsh and Carlos Montiel will walk you through everything you need to know about memory settings and crawlers when using Intella.