Wednesday, April 07, 2010

Tool review: NMAP

Nmap has been around for a few years. It's a very powerful scanning tool used to check servers or network devices for open ports and service information. You can use it for auditing or security testing. I was first introduced to Nmap at work, auditing Windows servers.

I found the ability to check a server for open ports handy, but I wasn't sure where it would be helpful besides security scans. It wasn't until much later, while troubleshooting applications, that I found the port view extremely helpful for debugging whether a server is actually listening on a port or not. It was also helpful for determining whether the local firewall on the server was blocking an application.

Now, there are other tools that offer (somewhat) similar scanning, but before Nmap I used Sysinternals' TCPView to find out about a system's ports. The problem is that TCPView's view is from the inside, not what other systems see. Nmap offers a much better real-world view of a system.
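As a rough sketch of that difference (the target address here is made up), you can compare the inside view with the outside view like this:

```shell
# Inside view: what the server itself thinks is listening
netstat -an | grep LISTEN

# Outside view: what another machine on the network actually sees
# (scan the first 1024 TCP ports; replace the address with your target)
nmap -p 1-1024
```

If a port shows up in netstat but not in the remote Nmap scan, something between the two machines (a host or network firewall) is filtering it.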

Another area where Nmap is very helpful is testing connectivity between servers or applications. I recently installed Webmin on a server (great tool), but the default port 10000 was not working. Using Nmap, I was able to tell that 10000 was being blocked somewhere on the network, because it was not blocked by the server's firewall. This is handy when you don't have access to the hardware firewalls but want at least some evidence that you're on the right track.
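A minimal sketch of that check, assuming a made-up server address:

```shell
# On the Webmin server itself: is anything listening on 10000?
netstat -ln | grep 10000

# From a remote machine: is 10000 reachable across the network?
nmap -p 10000
```

If the server shows a listener but the remote scan reports the port as "filtered", the block is somewhere on the network rather than on the host.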

The latest version of Nmap now includes a GUI and is very easy to run. Honestly, I liked the command-line version from before, but this new version really makes it easy for anyone to use.

Check it out at nmap.org; it's available for Windows and other operating systems.
Thinking too hard for the solution

At work we've been rolling out Nagios agents on our servers, which is simple enough on Linux and Windows. But we hit a problem with a few servers running in the DMZ, an isolated network that required greater security. The Nagios ports were not working; the only port that worked was 22, the standard SSH port.

Trying to figure out the problem, I looked through the system, thinking it might be an internal firewall rule, and stopped iptables, with no change in the result. It wasn't until I used Nmap to scan the servers that I realized all of the ports were closed, except 22.
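In a case like this, a quick scan from outside the DMZ tells the story (the hostname is made up, and 5666 assumes the Nagios NRPE agent's default port):

```shell
# Scan the DMZ host for SSH plus the Nagios NRPE port
nmap -p 22,5666 dmz-server.example.com
```

If only 22/tcp comes back open and 5666 shows as filtered, the block is on the network, not the host.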

At that point I asked the network engineer, and he confirmed that, yes, extra ports were being blocked on the DMZ network.

This brings up a very important point: how to troubleshoot and gather information.

First, when dealing with any new problem, always understand your environment. I can't stress this enough. I once helped a friend over the phone with a network problem at a large grade school where she was deploying a network device. The problem was she couldn't get terminal access to the device she had installed on the network. So we ran through the steps: the IP address she was given to use, ping commands, etc.

The problem turned out to be that the school administrator had given her a duplicate IP address; while she could ping the address (which was not her device), she could not get a terminal into the machine. She assumed that since it was the IP address given to her, it must be working, but in the end it was not. It's important to always double-check and know the environment.

It's not only about trusting the information you are given but also knowing what to check when something is wrong. Another example is a problem I experienced while changing a server's IP address. I followed the company standards for moving a Windows server's IP address from one subnet to another, and I worked closely with our network engineer, who had access to the switch I was connecting to.

After making the IP address change, I could not get access to the network. I saw link lights on the server, but it could not reach the network or reply to a ping. I checked all of the cables, even plugged into another port on the patch panel: nothing. I asked the network engineer three times if the switch ports were changed to the right subnet, and each time he checked and confirmed. Finally I asked my manager, who still had switch access, if there was some problem I couldn't find. He logged into the switch and found it was set to the wrong subnet.

The problem in this case is that I assumed the network engineer, who "checked" three times, had actually checked his work. It was much harder for me to verify his side since I did not have switch access, but because I had checked all possible connections and problems on the server, I could confidently say it was not an OS issue. This matters when a problem appears and you have to decide whose side should fix it; often it's a battle back and forth over where the problem lies.

In larger companies, resolving issues becomes difficult; sometimes the department you work with on a project may be halfway across the world. In my work, many of my co-workers are not local and I have limited access to remote servers. It's difficult, but I still use the same skills to tell whether a problem is on our side or theirs: I inspect the environment and then apply the same knowledge to figure out where the problem is.

In any job you need to know how to fix things, whether it's a broken computer or an issue on a project. They are very different but require similar skills: knowledge of the environment to make a decision.

Monday, April 05, 2010

Installing Squid proxy on Ubuntu 9.10 server

Recently at work we have been battling over the issue of controlling Internet access for our users. The Internet is basically open to everything, and of course in a business environment there's some abuse of that access. Taking a snapshot of usage with Wireshark connected to a port-mirrored interface on the core switch, we found that YouTube accounted for the majority of Internet traffic. We also found that less than 4% of network traffic was internally routed, which could mean we had many users accessing the other offices, or far too much Internet browsing.

Since YouTube was our biggest website and it was not business related, we got the OK to block it, but we ran into our first problem. If you check the YouTube domain name with any nslookup utility, you will find that its IP addresses span multiple ranges. We got YouTube blocked, but it was somewhat difficult, and since it was done on the firewall, it's not the recommended method.

So we looked around the open-source world for another solution.

Squid is a well-known proxy for Linux that looks a bit difficult at first but is actually easy to configure. I was really overwhelmed looking at the config file, but here I'll show you how easy it is to get running with very little work.

I'm going to give the steps using Ubuntu Server 9.10, which I think is pretty easy to get running. We'll use the package install, which saves some time and installs any other requirements at the same time.

The first part is having an Ubuntu server running; this is the easy part, it just takes some time. Remember to have OpenSSH running so we can get a remote console into the server. Also, since this is a server, we will configure it with a static IP address.

1) First, make sure your server is updated.

Run the following

sudo apt-get update (this is fast)

sudo apt-get upgrade (takes about 5 minutes depending on your Internet connection)

2) Then we will set up a static IP address. First, let's find out which network interface we need to configure.

Run ifconfig and take note of the interface that reports an IP address; usually it's "eth0".

Edit the network configuration file:

/etc/network/interfaces
Then add the following information using an editor like vi, and then save your work. Remember to save!

iface eth0 inet static
address (enter your IP address here)
netmask (enter your subnet mask here)
gateway (enter your gateway here)
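As a sketch with made-up addresses (a network here), the finished stanza might look like this:

```
auto eth0
iface eth0 inet static
```

Afterwards, run "sudo /etc/init.d/networking restart" so the change takes effect.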

3) Now install Squid from the package; note this will install Squid's requirements as well.

sudo apt-get install squid (takes about 5 minutes)

4) Now Squid is installed, and we'll go over its most important file:

/etc/squid/squid.conf

This file holds all of Squid's settings. It's very long, but we only need to edit a few parts.

First, make a backup of the file

sudo cp /etc/squid/squid.conf /etc/squid/squid.conf_backup

5) Now we'll edit the file. We're looking for a few areas in squid.conf, so we'll use the search function of vi.

Open the file in vi:

sudo vi /etc/squid/squid.conf

Then search for the first item, "TAG: http_port", using vi's search:

/TAG: http_port    (then press Enter)

For me it's at line 1022. This sets the port the proxy listens on; Squid's default is 3128, but browser proxy settings commonly use port 8080, so we will change it. Enter the following after the "TAG: http_port" comment block:

http_port 8080

6) Now search for "visible_hostname" and add a line like this one:

visible_hostname proxy

This value just gives a friendly name to the server; I kept it simple and used "proxy". For me the line was 3399.

7) Allowing access. This is the tricky part.

Now we need to know which subnet you're allowing to use the proxy. For this example we'll use the network (host addresses through

Search the squid.conf file for "TAG: acl"; it's at about line 425. Go a few pages down until you see some uncommented entries (lines without a leading # sign) and enter your details there.

For this example we'll enter the following.

acl allowhome src

This defines a new ACL named "allowhome" that matches any source address on the network.

Remember the name "allowhome" we'll need this later.

Also we'll need to define the access for "allowhome".

Search for "http_access" and once you scroll down you should see a few "allow" and "deny" entries. Enter in the following at the top of the section.

http_access allow allowhome
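Putting the two pieces together, the relevant part of squid.conf ends up looking something like this (the network is just this example's subnet):

```
acl allowhome src
http_access allow allowhome
http_access deny all
```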

Now save your file and exit.

Just to be sure it took effect, restart Squid.

sudo /etc/init.d/squid restart

Now, from your home computer, set your browser's proxy to the server's IP address and port 8080. You should be able to browse as before.
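If you want to check from a command line before touching browser settings, curl can be pointed at the proxy explicitly (the address is just this example's server):

```shell
# Fetch a page through the new proxy; -x specifies the proxy to use
curl -x http://example.com/

# Or confirm the proxy port is reachable from the client
nmap -p 8080
```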

Pretty simple!

Next time I'll post about working with blacklists, blocking certain ports or websites, and Webmin as a GUI tool for administration.

The pros and cons of working with Linux

Recently at work my duties with the Linux servers have been increasing. I'm really excited to work with Linux and gain experience, but it's also somewhat frustrating at the same time. Coming from a Windows background, I noted some things that are different about Linux, both good and bad.

Applications need to be compiled in Linux

Not all applications, but a good amount, need to be compiled and then have some scripts run to install them. It's not difficult, but when things go wrong it takes some time to figure out what's really causing the issue. At first this was the biggest problem I had switching over to Linux. Why would a software vendor ship applications that are not even ready to install?

After working with some problematic Windows applications, I started to see the light. Many times in the Windows world we were required to install applications that were questionable, not in terms of malware but in how they were developed: hard-coded names, etc. In one application we had many issues because the timeout of a child process was extremely short, so short that any lag in the network caused the entire application to crash. Perhaps in a Linux version we could have seen this in a config file and made the change?

Open Source in Linux applications

This is more of a personal preference than anything else. I've worked mostly on commercial applications, for example the Microsoft Windows family, Microsoft Exchange, or other popular applications used in business environments.

We'll see how well open source works for me in the future. So far there are some projects where it's really helping, but it's also somewhat hard to get the right support.