galzohar

Arma 2 bandwidth requirements


I remember spending a lot of time searching for this data a long time ago, but I couldn't find it anywhere - at least not in a precise form, more like "yeah, 100 MBIT worked for our X-player session". However, now that I have more experience playing on low-end connections with low player counts, I think I have this more or less figured out.

Keep in mind that everything here comes from seeing how many players can play smoothly vs. how many players need to log in before desync issues clearly appear - I never actually looked at incoming/outgoing data (as far as I know, you can't do that on a hosted server anyway).

One thing I noticed is that the amount of AI doesn't seem to have a noticeable effect on how many players I can get on the server without problems.

My conclusion so far is that you need something around 0.1 MBIT of upload per player. How did I get there?

All upload speeds were measured with speedtest.net.

My connection: 0.2~0.23 MBIT upload. Maximum players other than myself: 2, with perfectly smooth play. If 3 players log in, significant lag occurs and the game becomes almost unplayable.

A friend's connection: I'm not sure whether it was 0.4 or 0.5 MBIT upload, but the maximum number of players that could connect and play smoothly was 4 (other than the host); a 5th player connecting would cause obvious lag.

Another friend, who I think (but am not totally sure) had 0.8 MBIT, could have up to 7-8 players connect (again, I don't remember exactly - this was a very long time ago, and I usually don't have that many players connecting).
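To turn that into something you can plug your own numbers into, here is a minimal back-of-the-envelope sketch in C. It assumes the ~0.1 MBIT-per-player figure above and linear scaling - both are rough guesses from these small-scale tests, and the linearity part is exactly what is in question below:

/* players.c - rough player-count estimate from measured upload speed.
 * Assumes ~0.1 Mbit of upload per connected player (the figure above)
 * and linear scaling; both are rough guesses from small-scale tests. */
#include <stdio.h>

int main(void)
{
    double upload_mbit = 0.8;       /* measured with speedtest.net */
    double per_player_mbit = 0.1;   /* rule-of-thumb cost per player */

    int max_players = (int)(upload_mbit / per_player_mbit);
    printf("~%d players should fit on a %.2f Mbit upload\n",
           max_players, upload_mbit);
    return 0;
}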

If anyone has more data that could give us more accurate estimates of Arma 2's exact bandwidth requirements, please post it, and I will try to add it to the first post if it provides new information that everyone should have easy access to. It would be especially interesting to see whether bandwidth requirements increase non-linearly with more players, as my results might suggest.


Nah, this page (and other similar ones, such as Kelly's Heroes' great dedicated server settings guide) doesn't really mention the exact (or even estimated) amount of bandwidth a server requires in order to run with X players. That's why I made this thread, so that we can have all info of this sort in one place.


Good idea, Galzohar - this question has been asked so many times with no real definitive answer. Many people wonder how much traffic their home networks can really handle.

From my tests on my own dedicated servers, your results seem just about right.

I'd also be interested to see what others say.


The bandwidth requirement varies depending on what mission you are running. Heavily scripted missions with a lot of AI will have a higher bandwidth requirement, on average.

I was doing some testing last night with my dedicated server on LAN, setting MinBandwidth/MaxBandwidth to 92 Mbps/100 Mbps. Connecting with my client, the server reported sending anywhere from 128 Kbps to 12 Mbps, depending on MaxSizeGuaranteed/MaxSizeNonGuaranteed: left at the defaults of 512/256 they gave me the 128 Kbps, and when modified to 1500/1500 (max MTU), the upload skyrocketed to 12 Mbps. I used a MaxMsgSend setting of 16384 for all testing.

Also note that the server was typically receiving 1 Kbps to 64 Kbps from my client. The test mission was Benny's Warfare 2.066 Lite CO.

Ideally, if you can get clients to use only 128 Kbps, you should be able to get 10 clients on a 1 Mbps pipe. There are no 'defined' options for forcing clients to receive a set amount.

...Syn...



Due to the differences between clients and missions, we use 256k as a standard, and I use 512k as the limiting factor for determining how many clients can connect.

But this is just to say that we can "only" have 50 clients on our 26 MBIT line.

But if you find the "exact" answer to this one, please do tell us.

The 256k figure is from OFP times and was "confirmed" for ArmA. We still use it for our Arma 2 server.


Some rough estimates, gathered from the trial and error of other server admins, could be very useful for server admins.

Bear in mind that all figures would be rough estimates and could vary greatly based on the mission being played.

You would also see different values for average gameplay, joining in progress, and map downloading.

During normal gameplay, bandwidth usage is low; however, the bandwidth required per client spikes when a client downloads a mission (at the start of a game) or joins in progress (downloading the mission AND all the JIP data needed to get in sync with the other players).

Then on top of that, there are factors you cannot estimate, such as players having custom sounds, custom faces, squad logos, etc.


Yeah, I completely neglected the temporary stuff that only happens upon player connection or new mission loading, such as sending mission files, JIP info or custom faces, but those only happen once.

If you look at the limits I've been getting, limiting a 26 MBIT line to only 50 clients seems very restrictive. Of course, we don't really know if anything non-linear happens with more players. Maybe try to push this higher and see what happens?

Regarding mission-specific stuff: while I did notice differences in the in/out data reported by a dedicated server when you do different things in a mission or play different missions, as long as you don't do something crazy like run a tight loop with a publicVariable command, the actual requirement for smooth play doesn't seem to change too much - at least not enough to notice when playing with a small player count and low bandwidth. Whether the mission was a TDM or a high-AI-count coop mission, the maximum number of players that could fit on the same connection without lag didn't seem to really change.


Just to add some experience: last night I was hosting a single-town Warfare co-op mission I am working on, with only one other player connected. Blufor had 5 playable units synced to the Warfare module, and Opfor had 5 unplayable units synced. The max squad size was set to 12.

The lag was so bad that units were warping all over the place for the connected player. I, of course, had no lag, since I was hosting the mission. I have, I believe, a 2.0 Mb upload, which I can verify with speedtest when I get back to it. There is no reason for only 2 players to lag a server. This leads me to believe that AI count definitely affects the scenario, especially when those AI can add more units to their squads. My machine is a quad-core Phenom II at 3.0 GHz with 8 GB of dual-channel memory at 1066 MHz. CPU usage was 40-50% max on any core at any given time, and memory was only 35% utilized. So the issue is definitely not hardware capability.

Assuming AI bots affect the amount of data that must be sent, I am going to treat each bot as a player. I will first try a 1v1 human player setup with the maximum squad size set to 5, so that the total unit count, human or otherwise, does not exceed 10. I will adjust from there to see how far I can go before lag rears its ugly head.

Hopefully this will give insight into more complex mission structures like Warfare.

All my buds now have OA as well, so perhaps OA's improvements will help the situation. Thanks BIS for yet another great patch!



I highly doubt a bot takes as much bandwidth as a player, since I noticed no difference between TDM and "normal" coop missions (i.e. not Warfare with a billion AI). I wouldn't be surprised if AI units require bandwidth, but I highly doubt it's anywhere near the requirement of a player, especially considering that servers don't update positions very often for AI units that are far away from the player (hence all the complaints about "AI warping").

Another thing: if you want to check whether your server is hardware-limited, especially in AI-heavy situations, you shouldn't really look at CPU/RAM percentages, but rather at whether your FPS is dropping (though this is something you probably would have noticed if that were really your problem here). When you run incredible amounts of AI, total CPU usage % actually goes down, because the AI only runs on 1 thread, and if that thread takes too long to complete a cycle, the other running threads "wait" for it to finish and thus do less work, while your FPS drops accordingly. It's as if the AI runs on a single core and that core is the bottleneck; however, Windows usually moves that thread between cores to spread the heat around, which makes it look in the task manager like your CPU isn't working hard at all, when in fact it is limited by its single-core speed. This is very easy to test: just write a C program that runs a single infinite loop - it'll show about 25% usage on each core even though it is running as fast as your CPU allows.
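If you want to try that test yourself, something like this is all it takes (a single-threaded busy loop; watch the per-core graphs in the task manager while it runs):

/* busy.c - single-threaded infinite loop.
 * This one thread saturates exactly one core's worth of CPU time, but
 * the OS scheduler migrates it between cores, so on a quad-core the
 * per-core graphs tend to show ~25% everywhere instead of 100% on one. */
int main(void)
{
    volatile unsigned long counter = 0; /* volatile so the loop isn't optimized away */
    for (;;) {
        counter++;
    }
    return 0;
}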

Do you have any idea how many AI were actually running? I would have thought that with a 2 MBIT upload you should be able to run so many AI that your CPU would die before you noticed any network issues with 1 player. Are you sure his internet was functioning right? Did you try a blank mission, or a mission with few AI, to compare?


Apparently the 2 Mb sticking in my head was my old connection. I actually have ~7 Mb/s upload!

After more testing this evening, it seems to only happen for him when he is looking through optics at long range. I guess distant AI having their positions updated less frequently is probably the culprit; I didn't realize that was how BIS had it working. Hopefully the improved netcode of the latest patch has solved the issue. We weren't able to play Combined Ops because he hadn't finished the download yet.

Thanks for the reply!


It has been like that for as long as I can remember. It's extremely noticeable when you have a script updating a marker to the position of an enemy AI car that is on the other side of the island - the marker will almost never update, even though the car is moving just fine on the server. Once you get closer, the marker updates more often, as only then does the server actually start sending you the car's real position. It also seems the default settings don't update AI positions often enough, so when you try to snipe you see warping regardless of how good your connection is (there is a server setting to reduce this issue at the cost of bandwidth; I'm not sure how much it actually helps per unit of bandwidth used).


Any idea what that setting is, and can it be used on a hosted server in addition to a dedicated one?


No idea about hosted; it's called something like minErrToSend. You can find it in Kelly's Heroes' dedicated server configuration guide.


Yeah, I had seen those settings. I guess if I ever end up with a dedicated server those will be very handy. I have toyed with the idea of building a PC with a quad core and plenty of memory to use as a dedicated server, but at this point it would be purely for Arma 2, and I want a new video card first!

Any idea what MinErrorToSend actually does? I didn't get much out of its description. Maybe Google can shed some light...

If anyone knows how to tweak a hosted, non-dedicated server, I would love to know!


MinErrorToSend fixed my long-distance warping issues on my quad-core Q6600/4 GB LAN server. I set it to 0.005.

As far as the bandwidth discussion goes, there are too many variables to quote a quick and easy bandwidth requirement. From my experience, judging by the output of the server's #monitor command, the bandwidth used is actually quite low (on BE 2.066).

Something to keep in mind is that when you get more AI into your squad, they are managed locally on the client - the more vehicles and AI you have in your own squad, the more upload bandwidth is used, as your client needs to update the status of your units to the server.

Of course, on the server it's the other way around - the more AI stuff going on on the server, the more bandwidth is required on the download side.

In any case, with 2 people on my LAN server, I find the actual bandwidth requirements to be very low - but when hosting over an internet connection, which usually has a clamped upload speed (512 Kbps for mine), the number of AI you control locally might become an issue.

Of course I'm always interested in squeezing more performance out of the server itself, but network load isn't the primary factor (at least in my case).

One thing that helps is to set the oaserver process to a high priority - on both Windows and Linux, I set the server process to real-time priority and it helps a lot.

Again, I use BE with towns-amount=large and base/town patrols on for benchmarking, but this makes for a long and tedious process when working out tuning parameters. If anyone knows of a quicker way to load down the server in all respects, I'd love to hear it - I think it'd be a good suggestion to BIS to provide a multiplayer server benchmark tool.

Lastly, I'm still unclear as to the right values for the following settings - my choices are somewhat arbitrary (guesses), so if anyone has any wisdom regarding them, that would be great. But the bottom line, at least for me, is that network bandwidth isn't the bottleneck - it's CPU, and server performance is directly related to CPU power for the most part, at least with a low number of users (2-3).

MaxMsgSend=2048;
MaxSizeGuaranteed=1024;
MaxSizeNonGuaranteed=256;
MinBandwidth=20000000;
MaxBandwidth=1000000000;
MinErrorToSend=0.005;


Oh, one final note: I normally run a Linux server, but since the performance issues of 1.56, I started running the Windows server so I could run the 1.57 beta. It seemed faster than what I was used to with Linux, perhaps because BIS wasn't using threading optimally on Linux(?). In any case, with the general release of 1.57, I was surprised to find that the Linux server was faster, and appeared to make better/more efficient use of my 4 cores. It may be config differences, I don't know - but I'm really happy with the multi-core performance of the Linux 1.57 server...


MaxMsgSend can be an extraordinarily large number; it's the maximum number of packets the server can send. I have found that to use my paltry 2.2 Mbps upload correctly, I ended up with a somewhat large-seeming MaxMsgSend and much smaller values for both guaranteed sizes, since the two multiply together.

example:

MaxMsgSend=16384;
MaxSizeGuaranteed=128;
MaxSizeNonguaranteed=32;

With that math, the 128-byte guaranteed packets will not exceed 2,097,152 bps (right around 2 Mbps). The simple goal is to have #monitor report an upload that never exceeds 2.2 Mbps, which is what my upload speed is rated at.
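For reference, the same arithmetic as a tiny C sketch. The assumption here is the one stated above - that MaxMsgSend multiplied by MaxSizeGuaranteed caps the guaranteed traffic; whether the engine counts that product in bits or bytes per second isn't something I can confirm, so treat the units as a guess:

/* cap.c - guaranteed-traffic ceiling per the rule of thumb above.
 * cap = MaxMsgSend * MaxSizeGuaranteed. Units are an assumption:
 * read as bits/s this matches the ~2 Mbps figure quoted; if the
 * engine actually counts bytes/s, the real figure would be 8x higher. */
#include <stdio.h>

int main(void)
{
    long max_msg_send = 16384;
    long max_size_guaranteed = 128;

    long cap = max_msg_send * max_size_guaranteed;
    printf("ceiling: %ld (~%.2f Mbps if read as bits/s)\n", cap, cap / 1e6);
    return 0;
}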

More importantly, you have 20 Mbps set as your MinBandwidth and 1 Gbps set as the max. If you are gaming on a 1 Gbps LAN and all machines are connected at that speed, bandwidth will be a non-issue for you. However, I typically set MinBandwidth to 92% of the total bandwidth, to allow for overhead.

I reinstalled ArmA² when 1.54 was hot, and have never not used Linux for server duties, because that's just what Linux does better than winders. The Linux 1.57 dedicated server seems more stable than the two before it, and possibly just as fast as, or slightly faster than, 1.54. But BE 2.066 brings it to its knees - and that's on a dual-core 3 GHz Xeon CPU. The Super Powers CTI mission that BIS did doesn't work out the CPU as much, but the AI gets stuck A LOT more often.


Following with interest.

MaxSizeGuaranteed - Maximum size of a guaranteed packet in bytes (without headers). Small messages are packed into larger frames. Guaranteed messages are used for non-repetitive events like shooting. Default: 512

MaxSizeNonGuaranteed - Maximum size of a non-guaranteed packet in bytes (without headers). Non-guaranteed messages are used for repetitive updates like soldier or vehicle positions. Increasing this value may improve the bandwidth requirement, but it may also increase lag. Default: 256

I've tried editing those values for smoother gameplay - less stuttering and rubber-banding - but in doing so it also created a lot more desync. Tested on a full server running Warfare BE with 40-ish players (I can't remember the exact version when we tried it out).

From:

MaxMsgSend=1024;
MaxSizeGuaranteed=1024;
MaxSizeNonguaranteed=384;
MinErrorToSend=0.0049999999;

To:

MaxMsgSend=1024;
MaxSizeGuaranteed=256;
MaxSizeNonGuaranteed=128;
MinErrorToSend=0.001;

The result was that at the beginning of the game everything looked a lot better, movement was more fluid, and hit registration seemed better. But after a while, all players had more desync problems than before.


Cri: Yes - I once ran double the values for the Max parameters, saw similar results, and trimmed them back down.

Theoretically speaking, it makes sense that the more network activity you allow, the less time the CPU has to service the AI itself. I think the important difference between guaranteed and non-guaranteed is that non-guaranteed messages can be dropped in favor of FPS stability, whereas if you set the guaranteed values high, the server is forced to service that traffic regardless of server FPS. As with all tuning issues, there is a balance to attain, and that balance is highly dependent on what and how you run on the server. In less demanding missions, I can see how setting the guaranteed numbers high wouldn't cause any server FPS degradation at all, whereas in complex, AI-heavy missions it would. I think the reason there aren't any preset profiles for achieving that balance is that finding the right settings involves a lot of time and testing, and waiting for a complex mission (BE) to develop enough to put strain on things. Again, a server benchmarking tool would let us find our individual balance points much more quickly and would probably lead to better published standards on server tuning. Come to think of it, I might just build a mission with loads of AI and loads of waypoints for benchmarking purposes. It wouldn't be that hard to do, I suppose.

BE does indeed drag the server to its knees (FPS-wise), and CPU utilization does go down as the FPS goes down - an interesting and non-intuitive paradox, yes. I find that my BE missions max out at about 230-240% CPU usage (about 2.5 full cores), but more usually run at an average of 130-180%. The explanation given above is a good one as to how FPS can go down while CPU utilization does too. The bottom line, though, at least for me (with one or two players on the server), is that warping doesn't increase much at all as the server hits 8-9 FPS, and it is still quite playable. The worry when FPS gets that low is whether scripts will start breaking due to timing issues and latency. I normally see server FPS at 46-50 at BE start and 10-11 FPS at the end; on islands like Lingor, I see 8-9... I believe some of the tuning parameters are intended to regulate how low server FPS can go (to the degree they can), using the non-guaranteed traffic (among other things?) to prioritize operations and throttle network usage accordingly, to try to maintain optimal FPS. Yes, this is all speculation on my part.

RE: the Linux server - there were whisperings of the pthreads library making Linux somewhat slower than Windows, and I'm with you, Vertical - I would always prefer to run the Linux server over Windows. Also consider that until recently (1.54??) the Linux server wasn't threaded, while the Windows server was. As I monitor CPU core usage on my server, it just appears to me that 1.57 uses the other cores much more than before, while the Windows server doesn't seem to use the alternate cores as much - an interesting (and possibly false) observation, considering that the base Arma threading code must be pretty much the same (other than the library calls) across both Windows and Linux.

I think a heavy-AI mission for benchmarking purposes is on my to-do list now.

Thanks guys for your input. I'll be following this thread.


CPU usage goes down when FPS goes down because there are several threads running in parallel, and each thread needs to complete a certain amount of work before the next frame can start. If some threads finish first, they idle until the rest of the threads are done with that frame. If you have a lot of AI, the AI thread takes much, much longer than any other thread (AFAIK Arma 2 has, and always had, only 1 thread that deals with AI), so the other threads use very little of the CPU and instead spend most of their time waiting for the AI thread to finish what it's doing before moving on to the next frame. After all, when you have more AI in a mission, you're only increasing the AI calculations, while the other calculations remain the same. As soon as the AI calculations are limited by the speed of a single core, the AI thread can do fewer frames per second, and thus all threads drop in FPS because they need to be synced.
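Here's a toy model of that frame-sync effect in C with pthreads - purely illustrative, with made-up workloads, not actual engine code. Two worker threads meet at a barrier every "frame"; because the "AI" thread does 10x the work, the whole loop runs at its pace, and the fast thread spends most of its time idling at the barrier, which is exactly why total CPU% looks low:

/* frame_sync.c - toy model of per-frame thread syncing (not engine code).
 * Build: gcc frame_sync.c -o frame_sync -lpthread
 * The "AI" thread does 10x the work of the "render" thread; the barrier
 * forces both to the AI thread's pace, so the render thread mostly idles. */
#include <pthread.h>
#include <stdio.h>

#define FRAMES 100

static pthread_barrier_t frame_barrier;

static void burn(long units)                  /* simulate a frame's work */
{
    volatile long x = 0;
    for (long i = 0; i < units * 1000000L; i++) x++;
}

static void *worker(void *arg)
{
    long work_units = *(long *)arg;
    for (int frame = 0; frame < FRAMES; frame++) {
        burn(work_units);                     /* do this frame's work      */
        pthread_barrier_wait(&frame_barrier); /* wait for the other thread */
    }
    return NULL;
}

int main(void)
{
    long ai_work = 10, render_work = 1;       /* hypothetical workloads */
    pthread_t ai, render;

    pthread_barrier_init(&frame_barrier, NULL, 2);
    pthread_create(&ai, NULL, worker, &ai_work);
    pthread_create(&render, NULL, worker, &render_work);
    pthread_join(ai, NULL);
    pthread_join(render, NULL);
    pthread_barrier_destroy(&frame_barrier);

    puts("done - total CPU% stayed low even though the AI thread was maxed");
    return 0;
}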

The only solution for this I can hope for from BIS is making the (heavy part of the) AI calculations able to use more than 1 thread. Other than that, there's not much we can do besides getting a faster CPU (not more cores - that won't help if they are the same type and speed).

Of course that seems to have little to do with bandwidth requirements.


Yeah, but they're all related.

...Syn...


Everything in this thread is about hosting a server, so what about joining one? Will ADSL at 4 Mbps down / 0.8 Mbps up be enough for joining a server with 50 players? I need to know before I buy the game.


I have a similar connection back home and I can seemingly join servers quite fine.


I know that you can join some servers without problems, but with how many players on them? For a massive game with 50+ players, will 4 Mbps be enough? If not, what speed would be?

