Okay, here's a quiz for those of you who are much smarter than me - that should cover a lot of ground. I've never been embarrassed to admit how stupid I am about certain things.
Is the bandwidth incurred by a file streaming off a web server related only to the size of the file, or does the resolution the file was saved in (e.g. 128k mp3, 56k mp3, etc.) also play a role when it's a streaming rather than a file downloading scenario?
I'm trying to calculate total bandwidth used in a situation where there are no tools available to track it.
But the file resolution correlates directly with the size of the file, so I'd guess it's six of one, half a dozen of the other. Either way would get you to about the same place in the end.
Also, you'd have to assume that, since it's streaming media, the user is actually streaming the entire file. They could break the stream anywhere if they don't like what they're hearing from the server. So you may only need a percentage of the overall file size x number of hits.
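A quick back-of-the-envelope sketch of that percentage idea. The file size, hit count, and average fraction streamed below are made-up placeholder numbers, not measurements:

```python
# Rough streaming-bandwidth estimate when listeners may bail out early.
# All the inputs here are hypothetical examples.

def estimate_bandwidth_mb(file_size_mb, hits, avg_fraction_streamed):
    """Total MB delivered ~= file size x hits x the average fraction
    of the file each listener actually streams before breaking off."""
    return file_size_mb * hits * avg_fraction_streamed

# e.g. a 10 MB show, 500 hits, listeners averaging 60% of the file:
print(estimate_bandwidth_mb(10, 500, 0.6))  # 3000.0 MB
```

If you genuinely have no tracking tools, the fraction-streamed figure is the one you'd have to guess at; the other two you can get from the file itself and your hit logs.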
The filesize *is* the bandwidth... same thing, basically.
What's in a file, how it's delivered, how it was encoded, etc. -- is all irrelevant to bandwidth.
Simply put, if a file is 1 MB in size, that's the bandwidth required to transfer it from one place to another.
And that's exactly how most servers track it.
To ward off the technical nits-and-lice hunters, it's true that there is some co-ordinative traffic between server and client on streaming (and any other type of) files -- but that's a matter of a few bytes being tossed back and forth. [Hey, server, you ready? Yeah, I'm ready. Good, send it. Okay, here it comes...] It doesn't amount to much -- so little, in fact, that typical server stats utilities don't bother tracking the handshaking traffic at all.
Size is measured in bytes, or kilobytes, megabytes or gigabytes.
Bandwidth is size per unit time, like kilobits per second.
(A byte is eight bits.)
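To make those units concrete, here's a small sketch converting a stream's bandwidth into a size. The 128 kbps bitrate and the 4-minute song are just illustrative numbers:

```python
# Units check: bandwidth (kilobits per second) vs size (kilobytes).
# The bitrate and duration are made-up example figures.

BITS_PER_BYTE = 8

bitrate_kbps = 128          # stream bandwidth, in kilobits/second
duration_seconds = 4 * 60   # a 4-minute song

# kilobits/s -> kilobytes/s, then multiply by how long it plays
kilobytes = bitrate_kbps / BITS_PER_BYTE * duration_seconds
print(kilobytes)  # 3840.0 kB, i.e. roughly 3.8 MB
```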
The size is important to server owners, since they are generally charged by how many bits (or bytes or megabytes) they have delivered. The server owner doesn't care if they were delivered in a quick burst or over a long period of time. You get charged in $$$ per gigabyte.
The bandwidth is important to ISPs and end users. If the bandwidth of a song is too high, it uses up all of the bandwidth in the ISP's pipe, and playback stutters. If it's too low, the fidelity of the song will be poor. Goldilocks likes it just right.
For downloads we can assume that somebody grabs the whole file, so size is the important measure.
For streaming, the key figure is the bandwidth times how many listeners you have times the average session length. That determines how many gigabytes you deliver each day.
If your users tend to stream one radio show (file) and then stop, then the size of the file is key. If they tend to stream for a certain amount of time, then bandwidth is the key.
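A rough sketch of that bandwidth-times-listeners-times-session-length arithmetic. The bitrate, listener count, and session length below are made-up figures for illustration:

```python
# Daily streaming delivery ~= bitrate x listeners x session length.
# All inputs are hypothetical example values.

def daily_gigabytes(bitrate_kbps, avg_listeners, avg_session_hours):
    """GB delivered per day for a continuously running stream."""
    kb_per_second = bitrate_kbps / 8          # kilobits -> kilobytes
    seconds = avg_session_hours * 3600
    return kb_per_second * seconds * avg_listeners / 1_000_000

# e.g. a 128 kbps stream, 50 average listeners, 2-hour sessions:
print(daily_gigabytes(128, 50, 2))  # 5.76 GB/day
```

Swap in the file-size-based estimate instead when your listeners tend to grab one show and stop, per the distinction above.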
My impression was, Christopher was looking at total bandwidth usage... and I should note my comments were in that regard.
Speed is a somewhat different matter; and of course, latency can occur if a pipe's not up to carrying all of the bandwidth demanded of it. That's generally far more a client concern, these days, than a server concern.
Gosh, Christopher -- I hope we're all not just confusing you, worse... lol!
The "file size == bandwidth" angle was pretty much what I was figuring, but since I've never written any streaming code thought it best to ask more seasoned critters. Wait, it's probably a little too close to Thanksgiving to phrase it like that...
Anyway, it was easy enough calculating the peaks, etc., from a simple data-transfer perspective (didn't worry about handshaking overhead), and that got me where I needed to be. I figure if I ever become afraid to ask dumb questions, I'll start getting more stupid with each passing day.