

Scanning for Unsafe URLs

One thing that I have noticed in recent PCI compliance scans is that some scanning vendors are now starting to scan for "unsafe" URLs exposed to the Internet on web sites.  Typically, an "unsafe" URL could be a Tomcat status page, a JBoss management console, an Apache mod_status page, or a similar URL that you do not want exposed to the Internet--not necessarily because of a vulnerability within that URI, but because it may disclose information that you wouldn't want the general public to view.

Take a look at the Apache Software Foundation's server-status page.  The ASF doesn't sell anything online, so they probably do not need to be bothered with PCI scans, but look at some of the information the server-status page provides.  You get the Apache version number, a general overview of whether the site is running OpenSSL, mod_perl, mod_php, etc., and you get a nice listing of all the client IP addresses currently accessing the site.  If you are running an electronic commerce site, you don't want this kind of information displayed to the Internet-at-large (although you may want to be able to view it yourself--hence the BigIP iRule also referenced in this post).
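If you administer the Apache instance itself, mod_status can also be restricted at the web-server level, keeping the page visible to you but not to the public.  Here is a minimal Apache 2.2-style sketch; the allowed network is a placeholder for your own management range:

```apache
# Restrict /server-status to an internal management network (placeholder range).
<Location /server-status>
    SetHandler server-status
    Order deny,allow
    Deny from all
    Allow from 10.0.0.0/8
</Location>
```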

So, I've put together a small Perl script that people can use as a starting point; it will scan a web site for specific URLs and report back on whether or not an "unsafe" URL has been found.  The script is intended to be used as a remediation validation tool by an individual who is responsible for maintaining the security of their web sites.  I'm also including a basic iRule that can be placed on your BigIP LTM (if you use one) that will reject requests for these URLs when they hit the BigIP, so you can drop them before passing them along to the web or application server for your site.  Handling this on the BigIP lets you keep the functionality that this information provides to the administrator while restricting it from public view.

The list of URLs in these scripts is intentionally short and is not complete.  It is only meant to serve as a starting point and to keep this post relatively brief.  ColdFusion shops may want to tweak the script and iRule to include the ColdFusion Administrator, for example, and you could also tweak them to scan for or reject access to those scary FrontPage URLs.  The list of URLs that will get you flagged during a PCI scan will, unfortunately, grow quite large in a heterogeneous network.

Below is a basic iRule to get started with:

    when HTTP_REQUEST {
        switch -glob [HTTP::uri] {
            "/server-info*" {
                reject }
            "/jk-status*" {
                reject }
            "/server-status*" {
                reject }
            "/balancer-manager*" {
                reject }
            "/IISSamples*" {
                reject }
            "/MSADC*" {
                reject }
            "/IISHelp*" {
                reject }
            "/IISAdmin*" {
                reject }
            "/manager/html*" {
                reject }
            "/jsp-examples*" {
                reject }
            "/servlets-examples*" {
                reject }
            "/jmx-console*" {
                reject }
            "/status*" {
                reject }
            "/web-console*" {
                reject }
            "/admin-console*" {
                reject }
        }
    }

Below is the Perl script I wrote that will scan for these URLs and report the results, color-coded red for failures and green for successes.

#!/usr/bin/env perl
# Usage:  ./badurlscan.pl --host <host>  --port <port>

use Modern::Perl;
use Term::ANSIColor qw(:constants);
use Tie::Hash::Indexed;
use Net::SSLeay;
use LWP::UserAgent;
use Getopt::Long;

my ( $host, $port, $scheme );
my $useragent = 'techstacks.com-bad-url-scanner/v0.1.1';

my $key;
my $value;

sub usage {
  say "Usage:  badurlscan.pl --host HOSTNAME  --port PORT";
  exit 1;
}

usage() if ( ! GetOptions( "host=s" => \$host, "port=i" => \$port ) or ! defined $host or ! defined $port );

if ( $port == 443 ) {
  $scheme = 'https';
}
elsif ( $port == 8443 ) {
  $scheme = 'https';
}
else {
  $scheme = 'http';
}

tie my %bad_urls, 'Tie::Hash::Indexed';

%bad_urls = (
  'Apache mod_status page' => 'server-status',
  'Apache mod_info page' => 'server-info',
  'Apache mod_jk status page' => 'jk-status',
  'Apache mod_proxy_balancer' => 'balancer-manager',
  'IIS Samples' => 'IISsamples/',
  'IIS MSADC Directory' => 'MSADC/',
  'IIS Help' => 'IISHelp/',
  'IIS Admin' => 'IISAdmin/',
  'Tomcat Manager' => 'manager/html',
  'Tomcat JSP Examples' => 'jsp-examples/index.html',
  'Tomcat Servlet Examples' => 'servlets-examples/index.html',
  'JBoss JMX Console' => 'jmx-console',
  'JBoss Tomcat Status Page' => 'status',
  'JBoss Web Console' => 'web-console',
  'JBoss 5.x Admin Console' => 'admin-console',
  'ColdFusion Administrator' => 'CFIDE/administrator/index.cfm',
);

sub scan_for_unsafe_urls {
  print "\n";
  say "Scanning for 'Unsafe' URLs....";
  sleep 2;

  my $ua = LWP::UserAgent->new;
  $ua->agent( $useragent );

  while ( my ( $key, $value ) = each %bad_urls ) {
    my $url = "$scheme://$host:$port/$value";
    my $request  = HTTP::Request->new( GET => $url );
    my $response = $ua->request( $request );

    if ( $response->code == 200 ) {
      print RED, "  $key -- /$value found.\n", RESET;
    }
    else {
      print GREEN, "  $key -- /$value not found.\n", RESET;
    }
  }
}

scan_for_unsafe_urls();


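If you just need a quick spot check from a plain shell without installing the Perl modules, the same probe-and-classify logic can be sketched with curl.  The probe and classify helper names here are my own, and the host in the commented example is a placeholder:

```shell
# probe() returns just the HTTP status code for a URL (body discarded).
probe() {
  curl -s -o /dev/null --max-time 5 -w '%{http_code}' "$1"
}

# classify() mirrors the Perl script's red/green test: 200 means the URL
# is exposed; anything else is treated as not found.
classify() {
  if [ "$1" = "200" ]; then
    echo "found"
  else
    echo "not found"
  fi
}

# Example (placeholder host):
#   for p in server-status jmx-console web-console; do
#     echo "/$p: $(classify "$(probe "https://www.example.com/$p")")"
#   done
```

Note that a 403, a 404, or a connection failure all land in the "not found" bucket, which matches the behavior of the script above.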
Thanks for taking the time to look through this. Let me know if it was helpful or has any glaring errors!

