What's new at Detectify

The latest product updates, improvements, and new tests.

Coming Soon
July 02, 2024

Upcoming improvements to Application Scanning crawler

Linus Kingfors
Product Manager

We’ve been hard at work experimenting with how our crawlers interact with customer assets, in response to the challenges Application Security teams face today and those that may come along in the future. That’s why we’re focusing on expanding our crawling capabilities by grouping assets by purpose, optimizing the mapping of customer assets, and making it possible to continue crawling from where a previous scan concluded. These improvements will not just make our crawling more efficient; they will also allow us to go even deeper into customer assets by combining what we have learned from previous scans with the latest vulnerabilities we’ve crowdsourced from elite ethical hackers. Keep an eye out for more updates on this topic in the future.

Focus areas:

  • Grouping assets by their purpose. Web crawling today usually means the crawler inspects each component of an asset, for example by interacting with every button or text field. However, many of these components are duplicates, such as comment fields or login buttons that appear on different pages of a customer’s asset. As we look to the future of application scanning, we are exploring methods to group components by their intended outcome, such as submitting a comment or adding an item to a shopping cart (see the first sketch after this list). This will let our crawler scan more efficiently and more intelligently by allocating the right amount of resources to components that serve the same purpose.

  • Improving how we map out our customers’ assets, such as web applications. There are usually several ways to reach different parts of any given website; for example, a site’s terms-of-service page might be reached by clicking a link in the footer, by navigating a drop-down menu, or by entering the address directly in the browser address bar. Which route is best varies with the situation, so the crawler dynamically adapts and always aims for the shortest and most reliable way of reaching the destination (see the second sketch after this list). This lets it traverse websites more efficiently while building a map that can be reused in future scans, helping us understand how an asset works and, in turn, where our scanners should focus crawling and fuzzing.

  • Restoring scans from a previous state. Scanning applications takes time, which is challenging in today’s development environment where our customers ship new code several times a day. Alongside improving the efficiency of our application scanning, we are working on picking up a scan of an asset from where it previously ended (see the third sketch after this list). For some customers, this will save time by focusing on what is potentially new in their applications rather than rescanning the entire application every time they want to test their products or services.
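
To make the grouping idea concrete, here is a minimal sketch in Python. The component fields and the purpose heuristic are our own illustrative assumptions, not Detectify’s actual crawler model: it simply buckets interactive components that share a tag, type, and visible label, so a crawler could interact deeply with one representative per bucket and spend less on the duplicates.

```python
from collections import defaultdict

# Illustrative only: a toy "purpose" signature for interactive components.
# Field names and the heuristic are assumptions, not Detectify's implementation.

def purpose_key(component: dict) -> tuple:
    """Derive a coarse purpose signature for an interactive component."""
    # Components sharing a tag, type and label (e.g. every "Add to cart"
    # button) are treated as serving the same purpose, regardless of page.
    return (
        component["tag"],
        component.get("type", ""),
        component.get("label", "").strip().lower(),
    )

def group_by_purpose(components: list[dict]) -> dict[tuple, list[dict]]:
    groups = defaultdict(list)
    for component in components:
        groups[purpose_key(component)].append(component)
    return groups

if __name__ == "__main__":
    components = [
        {"tag": "button", "type": "submit", "label": "Add to cart", "page": "/product/1"},
        {"tag": "button", "type": "submit", "label": "Add to cart", "page": "/product/2"},
        {"tag": "textarea", "label": "Comment", "page": "/blog/post-1"},
        {"tag": "textarea", "label": "Comment", "page": "/blog/post-2"},
        {"tag": "input", "type": "text", "label": "Search", "page": "/"},
    ]
    for key, members in group_by_purpose(components).items():
        # A crawler could interact deeply with one representative per group.
        print(f"{key}: {len(members)} occurrence(s), representative on {members[0]['page']}")
```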

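For the mapping improvement, the sketch below shows one common way a recorded link graph can yield the shortest navigation path to a page, using a plain breadth-first search. The site map shape and the pages are hypothetical; the announcement does not describe the actual path-selection logic.

```python
from collections import deque

# Illustrative only: find the shortest navigation path using a link graph
# recorded during a previous crawl (page -> pages reachable from it).

def shortest_path(site_map: dict[str, list[str]], start: str, target: str) -> list[str] | None:
    """Breadth-first search over the recorded link graph."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        page = path[-1]
        if page == target:
            return path
        for next_page in site_map.get(page, []):
            if next_page not in visited:
                visited.add(next_page)
                queue.append(path + [next_page])
    return None

if __name__ == "__main__":
    site_map = {
        "/": ["/products", "/blog", "/terms"],
        "/products": ["/products/1", "/cart"],
        "/blog": ["/blog/post-1"],
        "/blog/post-1": ["/terms"],
    }
    # The footer link from "/" beats the longer route via the blog post.
    print(shortest_path(site_map, "/", "/terms"))  # ['/', '/terms']
```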

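Finally, resuming a scan from a previous state can be pictured as persisting the crawl frontier and the set of already-visited pages between runs. The JSON state file, function names, and the simulated crawl step below are hypothetical and meant only to show the general pattern.

```python
import json
from pathlib import Path

# Illustrative only: persist crawl progress so a later run can continue
# from where the previous scan ended instead of starting over.

STATE_FILE = Path("crawl_state.json")

def save_state(frontier: list[str], visited: set[str]) -> None:
    STATE_FILE.write_text(json.dumps({"frontier": frontier, "visited": sorted(visited)}))

def load_state() -> tuple[list[str], set[str]]:
    if STATE_FILE.exists():
        state = json.loads(STATE_FILE.read_text())
        return state["frontier"], set(state["visited"])
    # No previous scan: start from the root of the asset.
    return ["/"], set()

if __name__ == "__main__":
    frontier, visited = load_state()
    print(f"Resuming: {len(visited)} pages already covered, {len(frontier)} queued.")
    if frontier:
        # Simulate crawling one page: mark it visited and queue newly found links.
        page = frontier.pop(0)
        visited.add(page)
        discovered = [page.rstrip("/") + "/products", page.rstrip("/") + "/blog"]
        frontier.extend(u for u in discovered if u not in visited and u not in frontier)
    save_state(frontier, visited)
    print(f"Saved state: {len(visited)} visited, {len(frontier)} still to crawl next time.")
```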