micovery

Member
  • Content Count

    23
Community Reputation

2 Neutral

1 Follower

About micovery

  • Rank
    Private First Class


  1. I've been noticing that sometimes getPosATL gives me 1.#QNAN for the X and Y coordinates: [1.#QNAN, 1.#QNAN, 0]. I have not narrowed it down to know exactly when it happens. My first question here is: :icon_question: Is there any Arma 3 command I can reliably use to determine whether a variable that is supposed to be a finite SCALAR is instead 1.#QNAN? I am aware of this command: https://community.bistudio.com/wiki/finite But I have no way to test it, since I do not know how to artificially create a variable that has a 1.#QNAN value. I know that "NaN" stands for Not-A-Number ... so my second question is: :icon_question: What conditions are needed for a variable in SQF to end up being 1.#QNAN? In other languages like JavaScript you can end up with NaN when doing arithmetic on values that are not numbers.
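In the meantime, the closest thing to a guard I can come up with is checking every component with finite before trusting the position. This is a minimal, untested sketch, assuming finite behaves as the wiki describes (returns false for NaN and infinity); and, if I remember right, sqrt of a negative number should produce a NaN value you can test against:

    // untested sketch: reject positions with non-finite components
    private ["_pos"];
    _pos = getPosATL player;
    if (finite (_pos select 0) && finite (_pos select 1) && finite (_pos select 2)) then {
        // safe to use _pos here
    } else {
        diag_log format ["Non-finite position detected: %1", _pos];
    };
    // to experiment: sqrt -1 should yield NaN, so (finite (sqrt -1)) should be false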
  2. Made this simple online utility to convert a Steam ID to a BattlEye GUID: http://codepen.io/micovery/full/bNbLqL It's entirely JavaScript based. Here is the source if you are interested:

    var uid2guid = function(uid) {
      if (!uid) { return; }
      var steamId = bigInt(uid);
      var parts = [0x42, 0x45, 0, 0, 0, 0, 0, 0, 0, 0];
      for (var i = 2; i < 10; i++) {
        var res = steamId.divmod(256);
        steamId = res.quotient;
        parts[i] = res.remainder.toJSNumber();
      }
      var wordArray = CryptoJS.lib.WordArray.create(new Uint8Array(parts));
      var hash = CryptoJS.MD5(wordArray);
      return hash.toString();
    };

It uses crypto-js and BigInteger.js
  3. Gotta play around more with extDB first, so that I can make a good side-by-side comparison. The main difference I believe is that extDB relies on having a pre-defined SQL schema, queries, and procedures. Sock-rpc-stats on the other hand is schema-less ... which means it does not force a schema on you. However, at the end of the day ... the schema definition really lives in your mission code. For example, here is how I organized the information for the proof-of-concept player stats (pstats.sqf) that I included in the sample mission. { "76561198015751465_civ": { "createdAt_": 1413312842050, "updatedAt_": 1413313170677, "signature_": "gBV5BbuEFQDGR8q9JXS1fg==", "primary_weapon": "srifle_EBR_ACO_F", "primary_weapon_magazine": [ "20Rnd_762x51_Mag", 10, "true", 1, "srifle_EBR_ACO_F" ], "primary_weapon_items": [ "muzzle_snds_B", "", "optic_Arco" ], "secondary_weapon": "launch_I_Titan_short_F", "secondary_weapon_magazine": null, "secondary_weapon_items": [ "", "", "" ], "handgun_weapon": "hgun_Pistol_Signal_F", "handgun_weapon_magazine": [ "6Rnd_RedSignal_F", 4, "true", 2, "hgun_Pistol_Signal_F" ], "handgun_weapon_items": [ "", "", "" ], "pos_atl": [ 16081.4, 17047.8, 0.00139951 ], "dir": 259.019, "animation": "amovpercmstpsraswpstdnon", "damage": 0, "fatigue": 0, "uniform": "U_C_HunterBody_grn", "goggles": "G_Balaclava_combat", "headgear": "H_Booniehat_grn", "assigned_items": [ "ItemWatch", "ItemRadio", "ItemGPS", "Rangefinder" ], "backpack": "B_BergenC_blu", "vest": "V_Chestrig_blk", "magazines": [ [ "7Rnd_408_Mag", 7, "false", -1, "Vest" ], [ "20Rnd_762x51_Mag", 10, "true", 1, "srifle_EBR_ACO_F" ], [ "6Rnd_RedSignal_F", 4, "true", 2, "hgun_Pistol_Signal_F" ] ], "backpack_items": null, "vest_items": null, "uniform_items": null, "current_weapon": "hgun_Pistol_Signal_F" } }

Ty, good idea ... though if you have too many mission variables, it can be hard to manage. But, it's still useful. A mission maker for example could add a command line argument like -foo=someFile.sqf. Then, as you pointed out, he would ask for the value using a regex, or a substring, and then read the referenced file itself from within the mission in SQF. The one problem with this is that you have to be careful to not expose the values of parameters like -config, and -profile ... that'd be a pretty bad security exposure. People could read the contents of your config files ... or, if they know the location where the BattlEye config file is stored, they could even probe until they hit the "not-so-random" renaming of BEserver.cfg.
  4. Thanks for that! This project (sock-rpc-stats) is not specifically meant to be a replacement for Arma2Net. Its single purpose is to work as a stats server, with a simple JSON abstraction for storing and retrieving data. You can think of it as an alternative to other persistence solutions like extDB, and iniDB. However, sock-rpc-stats is built on top of the sock-rpc Node.js module, which could technically be a replacement for Arma2Net extensions. I have released the sock-rpc module on a different thread over at Node.js Extension for Arma 3 (sock.sqf, sock.dll, sock-rpc). Here is a break-down of the main differences between Arma2Net, and sock-rpc. I've tried to make it as factual, and unbiased as possible. For folks who are knowledgeable about Arma2Net, feel free to correct me wherever I make inaccurate statements (I will incorporate the corrections as edits).
[table=align: left, class: grid, width: 1200]
[tr] [th][/th] [th]Arma2Net[/th] [th]Node.js sock-rpc[/th] [/tr]
[tr] [td=width: 150]Architecture[/td] [td]As I understand it, Arma2Net works as a bridge between unmanaged C/C++ land, and the managed .NET land. You have to write code, and compile it into a DLL.[/td] [td]RPC stands for Remote Procedure Call. You are technically invoking remote JavaScript functions from within SQF. There is no compiling at all. You just make the code changes, and restart the RPC server. Meanwhile the game server itself can still be running.[/td] [/tr]
[tr] [td]Language[/td] [td]You can use any language that can be compiled into Microsoft's Common Intermediate Language (see the list at http://en.wikipedia.org/wiki/List_of_CLI_languages)[/td] [td]It's a Node.js application, so you write the code in JavaScript. JavaScript has been around since the release of Netscape Navigator 2.0 in September 1995 ... and nearly 20 years later, it's still going.[/td] [/tr]
[tr] [td]callExtension buffer size limitation[/td] [td]In Arma 3, the callExtension response buffer size is 10240 bytes. With Arma2Net, I think you have to manually work around this limitation.[/td] [td]When using sock-rpc, this is handled transparently for you. As a developer, you don't have to worry about it. You simply worry about writing your mission's SQF code, and JavaScript functions.[/td] [/tr]
[tr] [td]Debugging[/td] [td]When using Arma2Net (or any extension) in Arma 3, debugging is not easy (at least on Windows). You have to write print statements to trace the logic you are debugging. Back in Arma 2, you could attach Visual Studio to the game process, load the symbols for the DLLs, and literally step through the extension code while the game was waiting for the callExtension invocation to exit. This does not work anymore in Arma 3; maybe the BIS developers have put in some traps to prevent hackers from attaching debuggers to the game client. Today, once you attach a debugger to A3, and stop at a breakpoint ... the game simply crashes. On Linux, you can still attach GDB to the server process ... without it crashing.[/td] [td]With sock-rpc, debugging is a breeze, or at least as easy as in any Node.js application. Node.js (based on Google's V8 engine) comes with its own built-in debugger. You can start a node application in debug mode, and step through the code, or attach a remote debugger to it. Some popular ones are node-inspector, and the debugger inside IntelliJ's WebStorm IDE (available at https://www.jetbrains.com/webstorm/)[/td] [/tr]
[tr] [td]Performance[/td] [td]While Arma2Net acts as a bridge, the performance of the callExtension invocations is not hindered much ... as the call never leaves the Arma process.[/td] [td]With sock-rpc, the calls technically exit the Arma process, and go over TCP to the RPC server Node.js process. This adds some overhead; however, when the RPC server listens on the loop-back address (127.0.0.1), most TCP stacks handle the routing internally, so it's not a big hit. I would not recommend running the RPC server on a different machine.[/td] [/tr]
[tr] [td]Ease of use, and community support[/td] [td]Well, you have to know your way around the .NET libraries, and be able to write code in one of the CLI languages. There is a lot of community support, as .NET is very popular. Folks coming from a traditional OO (object oriented) background/education tend to gravitate towards this.[/td] [td]JavaScript is very easy to learn for beginners. There is no lack of support there either ... Node.js, though not as old as .NET, is very popular as well. Also, you get to re-use any NPM (https://www.npmjs.org/) module you want.[/td] [/tr]
[tr] [td]Cross platform support[/td] [td]With Arma2Net, Windows is the main platform being targeted. Writing Arma2Net "addins" that work on Linux can be tricky. Not sure if anyone has gotten it to work on Linux. Maybe using Wine, or Mono?[/td] [td]sock-rpc supports both Windows, and Linux (sock.dll, and sock.so) ... your actual application code is in JavaScript, which itself is platform agnostic. Node.js is officially supported on both Windows, and Linux (http://nodejs.org/download/)[/td] [/tr]
[tr] [td]Addins support[/td] [td]This is a concept specific to Arma2Net. It supports bridging to multiple "addins".[/td] [td]sock-rpc does not have the concept of "addins" ... but that does not mean it's not possible. You can technically expose functionality from multiple independent Node.js modules, by creating a wrapper module that "requires" the dependent modules, and registers their functions as RPC calls.[/td] [/tr]
[tr] [td]Client side SQF libraries[/td] [td]With Arma2Net, you have to deal with the low-level "callExtension" calls directly.[/td] [td]sock-rpc provides a layer of abstraction with a single SQF function (sock_rpc) that you can call from anywhere (client or server). If invoked from the client side, the call is automatically routed to the server side, where the sock.dll/sock.so extension is normally installed.[/td] [/tr]
[/table]
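To make that last row a bit more concrete: from the mission side, the abstraction looks roughly like this. It's an untested sketch reusing the sock_rpc call shape from the release notes further down this page; "echo" is just whatever JavaScript function you registered on the Node.js side:

    // untested sketch: call a registered JavaScript RPC function from SQF
    [] spawn {
        private ["_response"];
        _response = ["echo", ["hello", "world"], false] call sock_rpc;
        diag_log _response;
    };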
  5. Alright, so a new release of sock-rpc-stats, v0.0.5, is now available on NPM. Here is the change-log:
    0.0.5
      - Defect: Fix major problem where the interval function was always disabled
      - Defect: Fix issue with default values 0, and empty string, being treated as undefined
    0.0.4
      - Enhancement: add support for pop, push, shift, and unshift array operations
      - Enhancement: add support for JavaScript/JSON dot notation on keys
    0.0.3
      - Defect: Fixed issue with line endings for Linux
    0.0.2
      - Enhancement: meta function added for viewing scope meta-data
    0.0.1
      - Initial release
Also, added support in the Client SQF API within the sock-rpc-stats.mission for the new array manipulation functions, as well as a wrapper/adapter for iniDB. That should make it easier to port missions that are using iniDB (I used that for Wasteland). Hopefully I can keep working on more tools for Arma 3 :) I took the conversation about porting Wasteland over to their forums ... please use this thread over there now: http://forums.a3wasteland.com/index.php?topic=565.msg3468#msg3468 I have forked their Wasteland Altis source, and ported it to use sock-rpc-stats with minimal changes.
  6. Ok, on the topic of porting Wasteland ... looks like iniDB has the concept of accessing separate files as independent databases, as well as a concept of sections within a database. In order to map that data model to the JSON data-model, I think the easiest way would be to do something like this: { "database1": { "section1": { }, "section2": { } }, "database2": { "section1": { } } } This way each "database" becomes a scope, and the underlying sections become keys within the scope. In order to make it easier to manipulate and access the data, I will add support for JavaScript-style dot-notation in the stats_set, and stats_get functions. e.g. ["player1", ".money.bank", 100] call stats_set; ["player1", ".money.cash", 200] call stats_set; ["player2", ".vest.items[0]", "ItemRadio"] call stats_set; Would turn into: { "player1": { "money": { "bank": 100, "cash": 200 } }, "player2": { "vest": { "items": ["ItemRadio"] } } } I also noticed that iniDB has a function for deleting a database ... which seems kind of dangerous to me, even with backups enabled. So I think I will not allow an entire scope to be deleted. However, a "section" under a scope can be deleted simply by setting it to nil, like this: ["database", "section1", nil] call stats_set; I'll update this thread once I make the "dot-notation" improvement to the Node.js module. I will need help with provisioning a box to host the Wasteland mission for live testing.
  7. Thanks, I have not integrated it with Wasteland ... but it should be technically feasible. I can lend some help. Where do I grab the Wasteland mission from, is it this: https://github.com/A3Wasteland/Release_Files ? Edit: Going through the source, it seems that it would not be too complex. Looks like it's using iniDB, and all calls for setting/getting values go through two main functions: iniDB_read, and iniDB_write. I think all it would take is to take the functions in the file fn_inidb_custom.sqf, and simply forward the calls to stats_get, and stats_set as needed (I could be underestimating the work).
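Something along these lines is what I have in mind for fn_inidb_custom.sqf. This is an untested sketch, and I'm assuming iniDB_read is called as [file, section, key, default] and iniDB_write as [file, section, key, value] (the real iniDB signatures may differ); it maps each "file" to a scope, and each section/key pair to a dot-notation key:

    // untested sketch: forward iniDB-style calls to sock-rpc-stats
    // assumption: [file, section, key, default] call iniDB_read;
    //             [file, section, key, value]   call iniDB_write;
    iniDB_read = {
        private ["_file", "_section", "_key"];
        _file    = _this select 0;
        _section = _this select 1;
        _key     = _this select 2;
        if (count _this > 3) then {
            // a default value was supplied
            [_file, format [".%1.%2", _section, _key], _this select 3] call stats_get;
        } else {
            [_file, format [".%1.%2", _section, _key]] call stats_get;
        };
    };

    iniDB_write = {
        private ["_file", "_section", "_key", "_value"];
        _file    = _this select 0;
        _section = _this select 1;
        _key     = _this select 2;
        _value   = _this select 3;
        [_file, format [".%1.%2", _section, _key], _value] call stats_set;
    };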
  8. Thanks Tupolov. Been checking out the ALiVE mod, looks awesome.
  9. What is this?
It's a stats/persistence system for Linux or Windows Arma 3 dedicated servers. The following database/storage systems are supported:
  • MongoDB
  • CouchDB
  • Cassandra
  • MySQL
  • Redis
  • File-System (if you don't have a database)
At its core, the stats server exposes a key-value store that has this kind of structure: { "player1_id" : { "name": "player1", "money": 1000 }, "player2_id" : { "name": "player2", "money": 3000 } } Essentially, it is a big hash-map, with smaller nested hash-maps. The nested hash-maps are referred to as scopes. There is no limit to how many scopes you can have, or what a scope is. It's up to you. One possible design pattern is to have a scope for each player, using their UID as the scope name. Another pattern is to have a scope for each player/faction combination (e.g. "player1_civ", "player1_bluefor", "player1_west", etc.). The only restriction is that all scope names, and keys must be strings, but values can be pretty much anything (except an in-game object).

Prerequisites
  • Visual C++ Runtime (http://www.microsoft.com/en-us/download/details.aspx?id=26999) (if using Windows)
  • lib32stdc++6, libc6-i386, lib32gcc1 (if using Linux)
  • Node.js (http://nodejs.org/download/)

Architecture
It's made of three parts:
  • Stats server, implemented using Node.js (source and documentation at: https://bitbucket.org/micovery/sock-rpc-stats)
  • sock extension for Arma 3 (source and documentation at: https://bitbucket.org/micovery/sock.dll)
  • Sample mission that contains the Client SQF API (source and documentation at: https://bitbucket.org/micovery/sock-rpc-stats.mission)

The Client SQF API
It has two main functions, stats_set and stats_get, for setting and getting key-value pairs. Here are a few examples:
    //set key1,value1 within "scope"
    ["scope", "key1", "value1"] call stats_set;
    //set a single key-value pair
    ["scope", ["key1", "value1"]] call stats_set;
    //set multiple key-value pairs
    ["scope", ["key1", "value1"], ["key2", "value2"]] call stats_set;
    //get the value for "key1"
    ["scope", "key1"] call stats_get;
    //get the value for "key1", or return "default1" if not found
    ["scope", "key1", "default1"] call stats_get;
    //get the values for all keys within "scope"
    ["scope"] call stats_get;
    //get the values for "key1", "key2", and "key3" (note no default value given for "key3")
    ["scope", ["key1", "default1"], ["key2", "default2"], ["key3"]] call stats_get;
See the full Client SQF API documentation here.

REPL for managing the key-value store
Unless you know what you are doing, I would not recommend manipulating the data directly at the database level. To make the job easier, I have included an interactive REPL (Read-Eval-Print Loop) that allows you to manipulate the data with simple set/get functions, in the same way as the Client SQF API. See the REPL documentation here.

Fast read/write
Once the data for a scope is read from the underlying storage system, it is cached in memory (until flushed). All read/write operations performed through the Client SQF API happen exclusively in-memory. By default, the data from memory is persisted to the underlying storage system every minute (you can modify the interval). However, it only persists the scopes that were modified in-memory since the last save.

Automatic backups
When the data for a scope is read from the underlying storage system, a backup is automatically created (you can disable this). This is mainly to help mitigate situations where a hacker tampers with the stats of the players in-game. The list of existing backups for a scope is kept within its meta-data. Server administrators can use the meta command in the REPL to view this information. The backups are stored in the same way as any other scope. The scope name for backups follows this format: originalScope + "." + timeStamp + ".bak" For now, restoring data from backup has to be done manually using the set/get commands. (I will be adding a restore command to the REPL to help with bulk restore.)

Extra goodies in the sample mission
As a proof of concept, I implemented a player stats system, and included it in the sample mission. The player stats system (pstats.sqf) saves the following information when a player disconnects:
  • primary weapon (including items, and magazine with ammo count)
  • secondary weapon (including items, and magazine with ammo count)
  • handgun weapon (including items, and magazine with ammo count)
  • position (above terrain level)
  • stance
  • damage
  • fatigue
  • goggles
  • headgear
  • assigned items (Compass, GPS, Radio, Watch, Map, etc.)
  • backpack (with magazines and items)
  • vest (with magazines and items)
  • uniform (with magazines and items)
  • selected weapon
The saved stats are keyed by player UID, and faction. When the player reconnects, his faction-specific stats are loaded, and restored. As long as the player is in-game, the saved stats stay in memory (on the Node.js Stats Server) ... but on disconnect, his stats are flushed.

Configuration Guide
See the full instructions here for how to set up the Node.js Stats Server, the sock extension, and the sample mission. Here are a couple of demo videos I recorded following the steps in the configuration guide. The first video shows me installing, and playing around with, the REPL of the Node.js Stats Server (connected to a remote MongoDB instance). The second video shows me configuring the Arma 3 dedicated server to use the sample mission, and the sock extension for connecting to the Node.js Stats Server.
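For example, the per-player/per-faction scope pattern described above looks roughly like this in mission code. This is an illustrative sketch (not the actual pstats.sqf), just combining the stats_set/stats_get calls from the Client SQF API examples with a "UID_faction" scope naming convention:

    // illustrative sketch: save and restore a couple of values per player/faction scope
    // (if running client-side, wrap this in a spawn, per the sock_rpc caveat noted below)
    private ["_scope"];
    _scope = format ["%1_%2", getPlayerUID player, toLower (faction player)];

    // save position and direction for this player/faction combination
    [_scope, ["pos_atl", getPosATL player], ["dir", getDir player]] call stats_set;

    // later, restore them (falling back to the current values if nothing was saved yet)
    private ["_pos", "_dir"];
    _pos = [_scope, "pos_atl", getPosATL player] call stats_get;
    _dir = [_scope, "dir", getDir player] call stats_get;
    player setPosATL _pos;
    player setDir _dir;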
  10. New versions of both the sock-rpc Node.js module, and the Arma 3 sample mission are now available (v1.0.0, and v0.0.2 respectively).
Server side changes
On the server side, the callback mechanism for the JavaScript RPC functions has changed. The callback function is now passed as the last argument. If you are expecting variable arguments in your RPC function, you must pop the last argument to get the callback. e.g.
    rpc.register("echo", function () {
      var args = Array.prototype.slice.call(arguments, 0);
      var callback = args.pop();
      callback(null, args);
    });
Also, notice that when invoking the callback, the signature is now: callback(err, result) This is more in line with the callback standards used throughout Node.js.
Client side changes
On the client side, I have improved the sock.sqf library to allow calling the sock_rpc function from client-side SQF as well. Behind the scenes, it uses publicVariableServer, and publicVariableClient to pass the request, and response around. The one caveat is that when invoking the sock_rpc function from client-side SQF, you have to do it within a scheduled environment, as it uses SQF sleep to wait for the server's response. e.g.
    [] spawn {
      private ["_response"];
      _response = ["echo", ["arg1", "arg2"], false] call sock_rpc;
      diag_log _response;
    };
If you are invoking the sock_rpc function on the server side, the "scheduled environment" restriction does not apply.
  11. I have ported the library to Linux. If you want to build it yourself, on Ubuntu 14.04 (64bit):
    sudo apt-get update
    sudo apt-get install git build-essential g++-multilib
    git clone https://micovery@bitbucket.org/micovery/sock.dll.git
    cd sock.dll
    make
If you want to use the pre-built binary, on Ubuntu 14.04 (64bit):
    sudo apt-get install lib32stdc++6 libc6-i386 lib32gcc1
    wget https://bitbucket.org/micovery/sock.dll/raw/v0.0.2/Release/sock.so
It's nearly the exact same code-base as the Windows version (just added a few #ifdef __linux here and there). I did a bit of testing on the Linux version to make sure it actually worked, but that was pretty much it. Enjoy
  12. Great that you got it working! I've never set up an Arma 3 server on a Linux box, but I'm at home programming on Linux, and sticking to POSIX. I'll give these instructions a try this weekend: https://community.bistudio.com/wiki/Arma_3_Dedicated_Server#Instructions_.28Linux_o.2Fs.29 If it works well, then you can have a sock.so lib soon :)
  13. That's awesome. I saw a post about the cheap SSL on Hacker News (https://news.ycombinator.com/) the other day and was going to post back here. Looks like you were way ahead :-)
  14. Pretty nice! Good job. I have a couple of security concerns, and user experience suggestions:
  1. Enable HTTPS (even this forum is using HTTP, so I can't safely use it on public networks)
  2. Use OpenID Connect, and store a minimal amount of personal user information
  3. The squad XML URL should be a first-class item in the UI experience. Right now it looks like it's tucked away within the members panel.
  15. Thanks! The sock.dll extension, and the sock.sqf library manage that transparently for you ... I have documented the internals at https://bitbucket.org/micovery/sock.dll#markdown-header-sock-sqf-protocol ... You can pretty much retrieve a response of any size (or rather, up to the maximum length that fits inside an SQF string).