- Created Thursday, February 4th 2016 @ 20:08:12
I have created a bot, but the ML model I am using is 20 MB; the compressed zip file is thus about 6 MB.
When I attempt to upload it, I get a "file too large" error.
What file size limits are enforced here?
- Created Friday, February 5th 2016 @ 14:04:09
I guess up to 2 MB is supported.
- Created Friday, February 5th 2016 @ 17:57:27
I think it is 2048 KB zipped. I have successfully uploaded a bot a bit over 2000 KB zipped, and a bot of 2166 KB was rejected. I don't know whether there are also restrictions on the uncompressed size.
- Created Friday, February 5th 2016 @ 19:19:03
Hm... can I ask for some help, then? I've tried switching to shorter variable names, and I could remove some of the parentheses and spaces, but the logic remains the bulk of the code. I have 81 one-vs-all classifiers, and I can't reduce the size of the model significantly without losing precision (it won't be as good).
here's what one of the 81 classifiers looks like: http://pastebin.com/BwifuBBD
Do you have any advice on how I can compress this further, or are there any plans to relax the bot size limit in the near future?
- Created Friday, February 5th 2016 @ 19:48:20
I'm kind of surprised that 20 MB of text doesn't compress better than that. First and simplest suggestion: are you compressing at maximum compression in whatever program you're using to create the zip?
My second suggestion is to trim the precision of your values. You currently have something like 15 digits after the decimal point. I would bet dollars to donuts that your model would give nearly the same results if you only saved 5 or 6 digits, and that would cut the file down dramatically. Looking at that file, it's mostly doubles, so saving 50-60% on each one should shrink things a huge amount.
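For example, here is a quick Python sketch of the trimming idea; the regex and the 6-significant-digit choice are just illustrative, and you'd want to sanity-check the regex against your actual source format:

```python
import re

def trim_floats(text: str, digits: int = 6) -> str:
    """Round every decimal literal in `text` to `digits` significant digits."""
    def repl(match: re.Match) -> str:
        return f"{float(match.group(0)):.{digits}g}"
    # Matches literals like 0.123456789012345, -2.71828, or 1.5e-07.
    return re.sub(r"-?\d+\.\d+(?:[eE][-+]?\d+)?", repl, text)

print(trim_floats("0.123456789012345 * x + -2.718281828459045"))
# -> 0.123457 * x + -2.71828
```

Run it over the generated model source once and diff the bot's output against the original to confirm the precision loss is harmless.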
- Created Friday, February 5th 2016 @ 19:49:39
Oh, one other thing to double-check: make sure you aren't including any of your locally compiled binaries and intermediate files. They aren't necessary and are (relative to the source) pretty huge.
- Created Friday, February 5th 2016 @ 20:28:24
Will try trimming.
The source file is currently at 18.3 MB; compressing at ultra gives an output file of 6.5 MB (just that one file, ignoring the rest of the scaffolding around it).
Removing extra parentheses and spaces around the ternary operators has reduced the compressed size to just under 6 MB.
- Created Friday, February 5th 2016 @ 20:37:02
You might also want to try a tar.gz. I was able to compress my source an extra 33% by using 7-Zip to create a tar and then (again using 7-Zip) applying gzip's ultra compression. Granted, my entire uncompressed code base is only 78 KB, so I don't know if it will scale.
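If you'd rather script it than click through 7-Zip, Python's standard library can produce the same kind of archive; the file names here are placeholders for your real bot source:

```python
import tarfile
from pathlib import Path

# Stand-in for a large generated source file (hypothetical name).
src = Path("MyBot.cs")
src.write_text("double w = 0.123456;\n" * 1000)

# "w:gz" writes gzip-compressed tar; level 9 is maximum compression,
# roughly equivalent to 7-Zip's "ultra" gzip setting.
with tarfile.open("bot.tar.gz", "w:gz", compresslevel=9) as archive:
    archive.add(src)

print(Path("bot.tar.gz").stat().st_size < src.stat().st_size)  # True
```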
- Created Friday, February 5th 2016 @ 23:58:36
Truncating the values to 6 decimal places reduces the compressed size to 3 MB. Still too large for this competition.
To the moderators: are there any plans to relax the file size limit? I could try a worse model, but it would still have to be the equivalent of 81 models in one, limiting each model to about 25 KB... which is bound to perform badly, given that I am feeding in 270 binary features.
- Created Friday, February 12th 2016 @ 14:57:42
Can you separate the learning part from the game-playing part? That would solve the problem. You could also make two projects inside one solution to keep them related.
- Created Tuesday, February 16th 2016 @ 09:32:45
"truncating the values down to 6 decimal places reduces the compressed size down to 3Mb. Still too large for this competition." Well, if you store the values in a binary format, it'll probably be much smaller. Say you keep 6 decimals, so a value looks like 1.345678: that's 8 bytes per value as text, not counting the overhead of the surrounding formatting. Stored as a binary float, it's 4 bytes with no overhead and no truncation. It'll also be much faster to load, which can be another advantage in this competition.
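A minimal sketch of the text-vs-binary difference, in Python for illustration (the weights are made up; `struct` packs each value as a 4-byte IEEE 754 single-precision float):

```python
import struct

weights = [0.123456, -2.718281, 1.414213]

# Text encoding: 8-9 bytes per value plus separators.
text_blob = " ".join(f"{w:.6f}" for w in weights).encode()

# Binary encoding: exactly 4 bytes per value.
binary_blob = struct.pack(f"{len(weights)}f", *weights)

print(len(text_blob), len(binary_blob))  # 27 12

# Loading back is a single call, no text parsing:
restored = struct.unpack(f"{len(binary_blob) // 4}f", binary_blob)
```

Compressed, the binary form usually wins too, since there are no repeated separators or exponent markers to encode.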