Without Context

Over the weekend, I tested the capabilities of my newly operational LAN.  I read an article on Lifehacker about a program designed to test the speed of your home network.  The article practically shouted "try me."  And so I did.

While my LAN should theoretically run at gigabit speed, it has been almost impossible for me to pin down what "gigabit" actually means in practice.  The only absolute I can find is that the term means 1 billion bits per second.  The problem is putting that number in context.  What throughput should a real-world gigabit network actually deliver?
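As a back-of-the-envelope check, here is what that 1-billion-bits-per-second figure works out to in file-transfer terms (a quick sketch; the raw line rate ignores protocol overhead, so real transfers will come in lower):

```python
# "Gigabit" in file-transfer terms: 1 Gbit/s = 1,000,000,000 bits/s (decimal).
bits_per_second = 1_000_000_000
bytes_per_second = bits_per_second / 8          # 125,000,000 bytes/s
mib_per_second = bytes_per_second / (1024**2)   # ~119.2 MiB/s

print(f"{bytes_per_second:,.0f} bytes/s")       # 125,000,000 bytes/s
print(f"{mib_per_second:.1f} MiB/s")            # ~119.2 MiB/s

# Time to move a 1,048,576,000-byte test file at full line rate:
file_bytes = 1_048_576_000
print(f"{file_bytes / bytes_per_second:.1f} s") # ~8.4 s, before any overhead
```

So in a perfect world, a gigabit link would move a 1 GB test file in under ten seconds.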

This program creates a file on another computer on your network and calculates the transfer statistics for you.  I sent 1 gigabyte files to my server and my HTPC.  Sending a file this large tests sustained throughput, not just burst speed.  Every computer on my network has a gigabit network card except for my server, which has a 10/100 card.  This is where I'm confused, as the results for the two machines were quite similar.

Results:

Server

                      Writing          Reading
Packet length:        1,048,576,000    1,048,576,000
Time to complete (s): 98.058           148.704
Bytes per second:     10,693,426       7,051,431
Bits per second:      85,547,408       56,411,448
Mbps:                 81.584           53.798

HTPC

                      Writing          Reading
Packet length:        1,048,576,000    1,048,576,000
Time to complete (s): 92.280           92.960
Bytes per second:     11,362,982       11,279,862
Bits per second:      90,903,856       90,238,896
Mbps:                 86.693           86.059
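For anyone who wants to sanity-check these figures, the tool's math can be reproduced from its own raw numbers.  A sketch using the server's write-test row, assuming the tool gets its "Mbps" figure by dividing bits per second by 1,048,576 (i.e. binary mebibits, which is what makes the numbers line up):

```python
# Reproduce the benchmark's derived values from the server write test.
packet_bytes = 1_048_576_000   # size of the test file, in bytes
write_seconds = 98.058         # time to complete the write

bytes_per_sec = packet_bytes / write_seconds  # ~10,693,426
bits_per_sec = bytes_per_sec * 8              # ~85,547,411
mbps = bits_per_sec / 1_048_576               # ~81.58 "Mbps" (mebibits/s)

print(f"{mbps:.3f} Mbps")
```

Plugging in the HTPC's numbers the same way reproduces its ~86.7 Mbps figure too.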

What I'd really like to find out is whether these numbers are in line with what a Cat5e LAN with a gigabit router and gigabit switch should deliver.

While I've done a fair amount of Googling, I haven't found anything that hits a home run.  Most of what I've found is out of date, or goes off on a tangent in a completely different direction.

If you happen upon this blog and are a network engineer, or otherwise knowledgeable on the subject, please let me know!