
Automated tests

How we verify website security

To secure a website or a web application, one first has to understand the target application: how it works and the scope behind it. The auditor should have deep knowledge of programming and scripting languages, as well as web security.

A website security audit usually consists of two steps.  Most of the time, the first step is to launch an automated scan.  Afterwards, depending on the results and the website’s complexity, a manual penetration test follows.  A number of tools are available to simplify both the automated and manual audits and make the process efficient from a business point of view.  Automated tools help make sure the whole website is properly crawled and that no input or parameter is left unchecked.  Automated web vulnerability scanners also find a high percentage of the technical vulnerabilities and give a very good overview of the website’s structure and security status.  Thanks to automated scanners, you gain a better overview and understanding of the target website, which eases the manual penetration testing process.

For the manual security audit, one should also have a number of tools to ease the process: tools to launch fuzzing tests, tools to edit HTTP requests and review HTTP responses, a proxy to analyse the traffic, and so on.
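As a tiny illustration of the fuzzing side, the sketch below builds one test URL per payload for a single query parameter. The payload list and the parameter name are illustrative examples only, not a complete test set:

```python
from urllib.parse import urlencode

# A few classic probe strings (illustrative, far from exhaustive):
# SQL injection, reflected XSS and path traversal.
FUZZ_PAYLOADS = ["'", '"><script>alert(1)</script>', "../../etc/passwd"]

def fuzz_urls(base_url, param, payloads=FUZZ_PAYLOADS):
    """Return one URL per payload, with the payload URL-encoded
    into the given query parameter."""
    return [f"{base_url}?{urlencode({param: p})}" for p in payloads]

# Each of these URLs would then be requested and the response
# inspected for server errors or reflected input.
urls = fuzz_urls("http://example.test/search", "q")
```

In practice a fuzzer also varies headers, cookies and POST bodies, but the idea is the same: systematically feed each input point data it was not designed to handle.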

In this white paper we explain in detail how we perform a complete website security audit with the right approach.  We describe the whole process of securing a website in an easy-to-read, step-by-step format: from what needs to be done before launching an automated website vulnerability scan, up to the manual penetration testing phase.

Manual Assessment of target website or web application

Securing a website or a web application with an automated web vulnerability scanner can be a straightforward and productive process, if all the necessary pre-scan tasks and procedures are taken care of. Depending on the size and complexity of the web application, launching an automated web security scan with typical ‘out of the box’ settings may lead to a number of false positives, wasted time and frustration.

Even though web vulnerability scanning technology has improved in recent years, a good web vulnerability scanner sometimes needs to be pre-configured. Web vulnerability scanners are designed to scan a wide variety of complex, custom-made web applications.  Therefore, most of the time one needs to fine-tune the scanner to achieve correct scan results.

Before launching any kind of automated security scan, a manual assessment of the target website needs to be performed.  An automated scanner will scan every entry point in your website, including those you are likely to forget about, and test each one for a wide variety of vulnerabilities.

During the manual assessment, the following information should be obtained:

  • number of pages and files present in the website;
  • directory and file structure;
  • website’s root directory and source code;
  • structure of the URLs;
  • all the submission and other type of online forms available on the website.
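Much of this inventory can be gathered with a simple crawl. Below is a minimal sketch of the link-extraction step using only the Python standard library; a real assessment would follow the discovered links recursively and also record the forms and their fields:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links
```

Feeding each fetched page into `extract_links` and following new URLs breadth-first yields the page count and the directory and URL structure listed above.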

During the pre-scan manual assessment, apart from getting familiar with the directory structure and the number of files, find out which web technology was used to develop the target website, e.g. .NET or PHP.  A number of vulnerabilities are specific to particular technologies.  Other details to look out for when manually assessing a website are:

  • Does the website require client certificates to be accessed?
  • Is the target website using a backend database?  If yes, what type of database is it?
  • Is the database server running on the same server as the website?
  • Are all the sensitive records encrypted, and is the encryption actually adequate?
  • Are there any URL parameters or URL rewrite rules being used for site navigation?
  • When a non-existent URL is requested, does the web server return HTTP status code 404, or does it return a custom error page with HTTP status code 200?
  • Are there any particular input forms or one-time entry forms (such as CAPTCHA or single sign-on forms) that need user input during an automated scan?
  • Are there any password protected sections in the website?
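The custom error page question in particular can be answered with a quick probe: request a path that almost certainly does not exist and look at the status code. A minimal sketch follows; the `fetch` parameter exists only so the logic can be exercised without a live server:

```python
import uuid
import urllib.error
import urllib.request

def error_page_status(base_url, fetch=None):
    """Request a random, almost certainly non-existent path and
    return the HTTP status code the server responds with."""
    probe = f"{base_url.rstrip('/')}/{uuid.uuid4().hex}.html"
    if fetch is None:
        def fetch(url):
            try:
                with urllib.request.urlopen(url) as resp:
                    return resp.status
            except urllib.error.HTTPError as err:
                return err.code
    return fetch(probe)

def uses_custom_404(base_url, fetch=None):
    # A well-behaved server returns 404 here; 200 means the site
    # serves a custom error page the scanner must be told about.
    return error_page_status(base_url, fetch) == 200
```

If `uses_custom_404` returns True, the scanner needs a custom-404 rule (described in the next section), otherwise it will treat every error page as a real, crawlable page.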

Once the manual assessment is complete, we know enough about the target website to determine, before a scan is launched, whether it was properly crawled by the automated black box scanner.  If the website is not crawled properly, i.e. the scanner is unable to reach some parts or parameters of the website, the whole point of “securing the website” is invalidated.  The manual assessment goes a long way towards heading off invalid scans and false positives.  It also makes us more familiar with the website itself, which is the best way to configure the automated scanner to cover and check the entire website.

Automated black box scanning

When using the black box scanning technique, it is highly recommended to configure the following:

  • Custom 404 pages – if the server returns HTTP status code 200 when a non-existent URL is requested.
  • URL rewrite rules – if the website uses search engine friendly URLs, configure these rules to help the scanner understand the website structure so it can crawl it properly.
  • Login sequences – a login sequence trains the scanner to automatically log in to a password protected section, crawl it and scan it.
  • Pages which need manual intervention – such as pages containing CAPTCHAs.
  • Submission forms – if specific details must be entered each time a particular form is crawled, configure the scanner accordingly.
  • Scanner filters – specify files, file types or directories to be excluded from the scan.
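Scanner configuration formats differ from product to product, so purely as an illustration, here is how filter rules like the last item might be expressed and applied. The path patterns and extensions are made-up examples, not defaults of any real scanner:

```python
import fnmatch
from urllib.parse import urlparse

# Hypothetical exclusion rules: skip logout links (so the scanner
# does not end its own session) and bulky static downloads.
SCAN_FILTERS = {
    "exclude_paths": ["/logout*", "/admin/backup/*"],
    "exclude_extensions": [".pdf", ".zip"],
}

def should_scan(url, filters=SCAN_FILTERS):
    """Decide whether a crawled URL should be passed to the scan stage."""
    path = urlparse(url).path
    if any(path.lower().endswith(ext) for ext in filters["exclude_extensions"]):
        return False
    if any(fnmatch.fnmatch(path, pattern) for pattern in filters["exclude_paths"]):
        return False
    return True
```

Excluding the logout URL is a common real-world filter: without it, the scanner logs itself out mid-scan and silently misses the password protected section.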

Protect data and prevent data corruption

From time to time I notice people complaining that web vulnerability scanners are too invasive, and therefore opting not to run them against their website.  This is a bad presumption and the wrong solution: if an automated web vulnerability scanner can break your website, imagine what a malicious user can do.  The real solution is to secure your website so that it can properly handle an automated scan.

Automated web vulnerability scanners tend to perform invasive scans against the target website, since they try to input data the website was not designed to handle.  If an automated vulnerability scanner is not that invasive against a target website, then it is not really checking for all vulnerabilities and is not doing an in-depth security check.  Such security checks can and will lead to a number of unwanted results: deleted database records, a changed blog theme, garbage posts on your forum, a huge number of emails in your mailbox and, even worse, a non-functional website.  This is expected: like a malicious user, the automated black box scanner tries its best to find security holes in your website and ways to gain unauthorized access.

Therefore it is imperative that such scans are not launched against live servers.  Ideally, a replica of the live environment should be created in a test lab, so if something goes wrong, only the replica is affected.  If a test lab is not available, make sure you have recent backups: if something goes wrong, the live website can be restored and functional again in the shortest time possible.
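As a sketch of the backup step, the snippet below snapshots the site's document root into a timestamped folder. This covers files only; a real backup would also dump the database with the database's own dump tool:

```python
import pathlib
import shutil
import time

def snapshot_webroot(webroot, backup_dir):
    """Copy the site's document root into a timestamped folder under
    backup_dir, so the site can be restored after an invasive scan."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = pathlib.Path(backup_dir) / f"webroot-{stamp}"
    shutil.copytree(webroot, dest)
    return dest
```

Restoring is then just copying the snapshot back, which keeps the downtime after a destructive scan to a minimum.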

Launching the scan

Once the manual website analysis is done and the black box scanner is configured, we are ready to launch the automated scan.  If time permits, we first run a crawl of the website on its own.  Once we have confirmed that all the files have been crawled, we can safely proceed with the automated scan.

After the scan – Analyzing the results

Once the automated security scan is complete, we already have a good overview of the website’s security level. We can now look into the details of every reported vulnerability and make sure we have all the information required to fix it.  A black box scanner reports a good amount of detail about each discovered vulnerability, such as the HTTP request and response headers, the HTML response, a description of the vulnerability, and a number of web links where you can learn more about the reported vulnerability and how to fix it.
Analysing the automated scan results in detail also helps us better understand how the web application works and how the input parameters are used, giving us an idea of which tests to launch in the manual penetration test and which parameters to target.

Manual penetration test

Like the automated scan, the manual penetration test is a very important step in securing a website.  If advanced manual penetration testing tools are used properly, they ease the manual penetration test and make it more efficient.  Manual penetration testing audits your website for logical vulnerabilities.  Even though automated scans can hint at such vulnerabilities and help pinpoint them, most can only be discovered and verified manually.

Conclusion

As we can see from the above, web security is very different from network security.  As a concept, network security can be simplified to “allow the good guys in and keep the bad guys out.”  Web security is much more than that.  But don’t give up: we are here to do most of the job for you, assist you, and make the whole process easier and faster.
