NetLinx Compiler Running Very Slow
RicardoSiqueira
Posts: 373
Up to today, most of my projects would compile within seconds; now it takes minutes, even though I have a very fast laptop workstation. I've been running the latest version of Studio for a while, and this slowdown only started recently. Are any of you experiencing this? Any ideas on how to solve this problem?
Windows 7 Professional 64-bit
Dell Precision laptop: 16 GB RAM / 750 GB HDD / Intel i7 quad-core
NetLinx Studio: v3.4.908
Compiler: 2.5.2.300
Comments
Edited: it's the devicesdkrt.jar file. I ran out and looked.
I have also deleted the files suggested by ericmedley (snapirouter and devicesdkrt.jar). Then I uninstalled Studio, rebooted, reinstalled Studio, and rebooted again, but the compiler is still slow.
Maybe a recent Windows 7 update. I will dig deeper...
Have you installed/re-configured any antivirus software recently?
Conclusion: Carbonite does not play well with the AMX NetLinx compiler. A large project that takes 8 seconds to compile without Carbonite takes 4 minutes when Carbonite is running. Even when you turn Carbonite off, its service processes keep running in the background, and the only way to get the NetLinx compiler working properly is to manually disable and kill the Carbonite services mentioned above.
I will have to find a way to automate this somehow in Windows...
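One rough way to script it would be a batch file run from an elevated prompt. The service and process names below are placeholders, since I don't know what Carbonite's actual names are on your install (list yours with "sc query" or Task Manager):

    @echo off
    rem Stop Carbonite's background service before compiling.
    rem "CarboniteService" is a placeholder -- find the real name with:
    rem   sc query state= all | findstr /i carbonite
    net stop "CarboniteService"

    rem Kill any leftover Carbonite process (image name is also a placeholder)
    taskkill /f /im Carbonite.exe

    rem ...compile in NetLinx Studio, then restart the service:
    rem net start "CarboniteService"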
Thanks for the replies to this thread, and especially to nickm for pointing me to the issue.
They screw up TPD4 too.
Try setting your work files in a location that Carbonite isn't backing up, like a scratch partition. Drag the finished bits, when you're happy with them, to a partition that IS backed up.
Personally, I don't feel these services work well in our space. If you think about it, the actual file backups needed are hardly a blip on the radar as far as file size is concerned. An entire year's code would easily fit on a typical thumb drive. But having the ability to go through your year's work at the development level and step through the changes using a familiar coding environment is invaluable. Just seeing a file dated such-and-so means nothing unless you take very detailed and awesome notes.
But being able to put two .axi files side by side and compare between versions is much more useful.
So, while I do have my own server, which is continually backed up and so forth, I use version-control software (in my case Git) as my cloud backup. Any of the current versioning solutions get the gig done. Like most things, folks prefer their particular flavor, but all that aside, even clunky old SVN gets the job done.
I would probably make most Linux minions angry if they saw that I version all my files (.axi, .jar, module .tko, project files, and even UI files, IR files, and documentation like PDFs and Word docs). The only things I don't version are the .tko, .tkn, and .src files generated by NetLinx Studio.
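For anyone copying this setup, a minimal .gitignore along those lines might look like the sketch below. Note that a blanket *.tko rule would also catch the module .tko files you DO want to keep, so those have to be whitelisted explicitly (the module name here is just an example):

    # NetLinx Studio build artifacts, regenerated on every compile
    *.tkn
    *.src
    *.tko
    # Keep distributed module .tko files despite the rule above
    !MyModule_dr1_0.tko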
This keeps my processor clean too, I can get to the files anywhere, and I can even manage other people's access to my code when need be.
I'd highly recommend looking into it. Your other PC files, mp3s, pics of your dog, and whatnot can just be backed up some other way.
Can you tell Carbonite to ignore certain files? Maybe ignoring *.tkn *.src *.tko would let it compile.
I think Carbonite will let you ignore directories. But either way, it's a lot of maintenance if you ask me. Git does that quickly and efficiently.
+1 for Bitbucket, with their free private repositories for up to 5 people. They support both Git and Mercurial, and I successfully use both for AMX and other less-popular automation products.
Git/Hg are extremely sophisticated tools but you only need a handful of commands to benefit. "git init ." in your project directory will start you off. Bitbucket will give you some commands to copy and paste to hook it up to their server. Then at the end of each day's work run "git add .", "git commit" and "git push". Boom, you have an off-site backup that will let you jump to the end of any day's work. Bitbucket will give you a neat web interface where you can compare exactly what changes you made to which files on which days.
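Spelled out as a sketch, with an example project path and a placeholder remote URL (Bitbucket's setup page gives you the real one):

    rem One-time setup in your project directory
    cd C:\Jobs\MyProject
    git init .
    rem Placeholder URL -- copy the real one from Bitbucket
    git remote add origin https://bitbucket.org/youruser/myproject.git

    rem End-of-day routine
    git add .
    git commit -m "End of day"
    git push -u origin master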
Totally worth it.
I have all of the AMX jobs I am currently working on in my Dropbox folder. Then I use a separate folder for the git repos. The git repos reference the Dropbox folder as a remote repo, so the git database info does not get in Dropbox's way. I have never had a problem compiling within the Dropbox folder either.
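The exact arrangement isn't spelled out here, but the usual pattern for using a Dropbox folder as a git remote is a bare repo inside Dropbox, something like this sketch (all paths are examples):

    rem One-time: a bare repo inside Dropbox acts as the remote
    git init --bare "%USERPROFILE%\Dropbox\git\myjob.git"

    rem In the working repo, hook it up and push
    cd C:\Jobs\MyJob
    git init .
    git remote add dropbox "%USERPROFILE%\Dropbox\git\myjob.git"
    git add .
    git commit -m "Initial import"
    git push dropbox master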
I might try to go 100% git and move the code out of Dropbox, but I'm not sure I want to fetch a fresh version of the Bitbucket repo every time I switch computers.
+1 more for Bitbucket, though with Hg here
You can actually get more free users, up to 8 total, by inviting at least three other users as well. I am slowly migrating my existing code base to Bitbucket/Hg from my old, poorly organized SVN repository. That process will probably get sped along a bit over the next few weeks, though, because the server the SVN repo lives on is dying and I don't really want to set it up again on the new server that's on the way.
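For what it's worth, Mercurial's bundled convert extension can handle that kind of SVN migration. A rough sketch with placeholder paths (it needs the Subversion Python bindings installed to read the source repo):

    rem Enable the extension once in your mercurial.ini / .hgrc:
    rem   [extensions]
    rem   convert =
    hg convert http://oldserver/svn/myrepo C:\hg\myrepo
    cd C:\hg\myrepo
    hg update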
I think a use case like yours is what SparkleShare is going for (except it's self-hosted). I haven't tried it yet.