
Netlinx Compiler Running very slow

Up until today, most of my projects would compile within seconds; now they take minutes. The compiler is running very slow, even though I have a very fast laptop workstation. I've been running the latest version of Studio for a while and this slowdown only started recently. Are any of you experiencing this? Any ideas on how to solve it?

Windows 7 Professional, 64-bit
Dell Precision laptop: 16 GB RAM / 750 GB HD / Intel i7 quad-core
Netlinx Studio: V3.4.908
Compiler: 2.5.2.300

Comments

  • nickm Posts: 152
    Can't say that I'm currently experiencing this, but let me ask you this: are you compiling from a folder that is constantly synced (such as Dropbox), or from a network folder you've made available offline on your local machine? These types of directories can have extremely adverse effects on the compiler.
  • ericmedley Posts: 4,177
    Do a search on your drives for duplicate copies of the snapirouter file (a quick search command is at the end of this post). If you have duplicates, delete them all, empty your recycle bin, then restart and reinstall NS. There is a second file that gets corrupted too, but I'm not near my laptop to look up its name. Perhaps someone else remembers... but do find that other file as well.

    Edited: it's the devicesdkrt.jar file. I ran out and looked.
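
    If it helps, here's roughly how I hunt for them from a command prompt (tweak the wildcard patterns if the file names on your install differ):

        REM search the whole C: drive for stray copies of the two files
        REM (repeat for any other drives you have)
        dir /s /b C:\*snapirouter*
        dir /s /b C:\*devicesdkrt.jar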
  • It is a local folder. I do have Carbonite remote backup though. I wonder if the Carbonite service is slowing it down... I did install Carbonite last week...
  • nickm Posts: 152
    Bingo. Try disabling Carbonite, reboot and then compile a project. I would bet that's the culprit.

    Sent from my Nexus 5 using Tapatalk
  • It is not Carbonite. I fully disabled it and rebooted, and the compiler still takes forever on any project that took seconds before. My large projects that used to compile in 15 or 20 seconds are now taking 5 minutes.

    I have also deleted the files suggested by ericmedley (snapirouter and devicesdkrt.jar). Then I uninstalled Studio, rebooted, reinstalled Studio and rebooted again, but the compiler is still slow.

    Maybe a recent Windows 7 update. I will dig deeper...
  • Joe Hebert Posts: 2,159
    The compiler is running very slow.

    Have you installed/re-configured any antivirus software recently?
  • No. I use Microsoft Security Essentials. It was installed over a year ago, when I got the laptop. It does have auto-update on...
  • nickm was right. I spent a few hours today disabling Windows 7 services, and when both CarboniteService and CarboniteUI.exe*32 are manually disabled, the NetLinx compiler works properly again. Now compiling large projects takes 5 to 10 seconds, as it should. I thought that turning Carbonite remote backup off would be enough, but the Carbonite service still runs in the background all the time, even though I've selected BACKUP OFF in Carbonite's control panel. This is a bummer, as I can't use Carbonite with NetLinx. The whole idea of getting Carbonite was to automatically back up NetLinx projects off-site, so if something happened to my computer or the external USB drive, I would still be able to recover the projects.

    Conclusion: Carbonite does not play well with the AMX NetLinx Compiler. A large project that takes 8 seconds to compile without Carbonite takes 4 minutes to compile when Carbonite is running. Again, even when you turn Carbonite off, its service processes keep running in the background, and the only way to get the NetLinx Compiler working properly is to manually disable and kill the Carbonite services mentioned above.

    I will have to find a way to automate this in Windows somehow; a rough batch sketch is below.

    Thanks for the replies to this thread, especially to nickm for pointing me to the issue.
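
    In case it's useful to anyone else, this is the sort of batch file I'm going to try (the service and process names are what I see on my machine and the install path is a guess, so double-check yours; run from an elevated prompt):

        REM stop-carbonite.bat: pause Carbonite before a NetLinx session
        net stop "CarboniteService"
        taskkill /F /IM CarboniteUI.exe

        REM start-carbonite.bat: bring backups back when done
        net start "CarboniteService"
        start "" "C:\Program Files (x86)\Carbonite\Carbonite Backup\CarboniteUI.exe"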
  • nickm Posts: 152
    Glad I was able to help.

    Sent from my Nexus 5 using Tapatalk
  • vining Posts: 4,368
    This is a bummer, as I can't use Carbonite with Netlinx.
    Leave it off while working and then turn it back on when you're done. As nickm mentioned, the same is true with Dropbox and probably any other automatic file-sync program. Plus, turning it off protects you if you do something really stupid that totally screws up your code or deletes it, since you can always get the good copy back from central storage or a linked PC; just save it and it gets the newest time stamp. If you leave it on, you screw up the file not only on your PC but everywhere. When you're comfortable with what you have, turn it back on. Of course, I think they all have some sort of version control, so even if you do screw up the files everywhere you're not totally screwed.

    They screw up TPD4 too.
  • John Nagy Posts: 1,742
    The issue seems to be that Carbonite is trying to copy the files while they are actually still changing, and it fights with Studio as it tries to lock each file for backup.
    Try keeping your work files in a location that Carbonite isn't backing up, like a scratch partition. When you're happy with them, drag the finished bits to a partition that IS backed up.
  • ericmedley Posts: 4,177
    Re: Carbonite et al,

    Personally, I don't feel these services work well in our space. If you think about it, the actual file backups we need are hardly a blip on the radar as far as file size is concerned; an entire year's code would easily fit on a typical thumb drive. But having the ability to go through a year's work at the development level and step through the changes in a familiar coding environment is invaluable. Just seeing a file dated such-and-so means nothing unless you take very detailed and awesome notes.

    Being able to put two AXIs side by side and compare between versions is much more useful.

    So, while I do have my own server which is continually backed up and so forth, I use versioning software (in my case Git) as my cloud backup. Any of the current versioning solutions will get the job done. Like most things, folks prefer their particular flavor, but even clunky old SVN works fine.

    I would probably make most Linux minions angry if they saw that I version all my files (AXI, JAR, module TKO, project files, and even UI files, IR files, and documentation like PDFs and Word docs). The only things I don't version are the TKO, TKN and SRC files generated by NS (see the ignore-file sketch at the end of this post).

    This keeps my processor clean too, and I can get to the files anywhere and even manage allowing other people access to my code when need be.

    I'd highly recommend looking into it. Your other PC files, MP3s, pics of your dog and whatnot can just be backed up some other way.
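
    In practice that policy boils down to a tiny ignore file. Something like this from Git Bash in the project folder (MyProject.tko is a made-up name; use whatever NS actually generates for your master):

        # keep sources and compiled modules, drop the NS-generated master output
        printf '%s\n' '*.tkn' '*.src' 'MyProject.tko' > .gitignore
        git add .gitignore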
  • travis Posts: 180
    Are you hosting the repositories in-house or using some service like github?



    Can you tell Carbonite to ignore certain files? Maybe ignoring *.tkn *.src *.tko would let it compile.
  • ericmedley Posts: 4,177
    travis wrote: »
    Are you hosting the repositories in-house or using some service like github?



    Can you tell Carbonite to ignore certain files? Maybe ignoring *.tkn *.src *.tko would let it compile.
    I happen to use Bitbucket. They can handle most flavors of versioning.

    I think Carbonite will let you ignore directories. But either way it's a lot of maintenance if you ask me. Git does that quickly and efficiently.
  • tomk Posts: 24
    ericmedley wrote: »
    I happen to use Bitbucket. They can handle most flavors of versioning.

    +1 for Bitbucket with their free private repositories for up to 5 people. They support both git and mercurial and I successfully use both for AMX and other less-popular automation products.

    Git/Hg are extremely sophisticated tools but you only need a handful of commands to benefit. "git init ." in your project directory will start you off. Bitbucket will give you some commands to copy and paste to hook it up to their server. Then at the end of each day's work run "git add .", "git commit" and "git push" (the full sequence is sketched at the end of this post). Boom, you have an off-site backup that will let you jump to the end of any day's work. Bitbucket will give you a neat web interface where you can compare exactly what changes you made to which files on which days.

    Totally worth it.
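
    Roughly, the whole loop looks like this (the repository URL is a placeholder; Bitbucket shows you the real one when you create the repo):

        # one-time setup in the project folder
        git init .
        git remote add origin https://bitbucket.org/youruser/yourproject.git
        # end-of-day routine (after the first push, plain "git push" is enough)
        git add .
        git commit -m "end of day"
        git push -u origin master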
  • the8thst Posts: 470
    I'm a Bitbucket + git fan as well for version control and one of my backups, but I still prefer Dropbox to keep my computers in sync.
    I have all of the AMX jobs I am currently working on in the Dropbox folder. Then I use a separate folder for the git repos; the git repos reference the Dropbox folder as a remote repo, so the git database info does not get in the way of Dropbox (a rough sketch of one way to set this up is below). I have never had a problem compiling within the Dropbox folder either.

    I might try to go 100% git and move code out of dropbox, but I'm not sure I want to fetch a fresh version of the bitbucket repo every time I switch computers.

    Sent from my XT1058 using Tapatalk
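
    For anyone who wants to copy the idea, the key is keeping the git database outside the synced folder. git's --separate-git-dir option is one way to wire that up (paths and the URL below are only examples):

        # work tree lives in Dropbox, the actual repository lives elsewhere;
        # only a tiny .git pointer file ends up in the synced folder
        git init --separate-git-dir ~/git-databases/jobname.git ~/Dropbox/AMX/jobname
        cd ~/Dropbox/AMX/jobname
        git remote add origin https://bitbucket.org/youruser/jobname.git
        git add . && git commit -m "initial import" && git push -u origin master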
  • rfletcher Posts: 217
    tomk wrote: »
    +1 for Bitbucket with their free private repositories for up to 5 people. They support both git and mercurial and I successfully use both for AMX and other less-popular automation products.

    Git/Hg are extremely sophisticated tools but you only need a handful of commands to benefit. "git init ." in your project directory will start you off. Bitbucket will give you some commands to copy and paste to hook it up to their server. Then at the end of each day's work run "git add .", "git commit" and "git push". Boom, you have an off-site backup that will let you jump to the end of any day's work. Bitbucket will give you a neat web interface where you can compare exactly what changes you made to which files on which days.

    Totally worth it.

    +1 more for Bitbucket, though with Hg here :)

    You can actually get more free users, up to 8 total, by inviting at least three other users as well. I am slowly working on migrating all my existing code base to Bitbucket/Hg from my old, poorly organized SVN repository. That process is probably going to get sped along a bit over the next few weeks, though, because the server the SVN repo lives on is dying and I don't really want to set it up again on the new server that's on the way.
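
    The conversion itself shouldn't be too painful; Mercurial ships with a convert extension that can pull in the SVN history (repo names and URLs below are placeholders, and converting from SVN needs the Subversion Python bindings installed):

        # enable the extension once in ~/.hgrc (Mercurial.ini on Windows):
        #   [extensions]
        #   convert =
        hg convert https://oldserver/svn/myrepo migrated-repo
        cd migrated-repo
        hg push https://bitbucket.org/youruser/migrated-repo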
  • travis Posts: 180
    the8thst wrote: »
    I'm a bitbucket + git fan as well for versioning control and 1 of my issue backups, but I still prefer dropbox to keep my computers in sync.
    I have all of the AMX jobs I am currently working on in the dropbox folder. Then I use a separate folder for git repos. The git repos reference the dropbox folder as a remote repo so the git database info does not get in the way of dropbox. I have never had a problem compiling within the dropbox folder either.

    I might try to go 100% git and move code out of dropbox, but I'm not sure I want to fetch a fresh version of the bitbucket repo every time I switch computers.

    Sent from my XT1058 using Tapatalk

    I think a use case like yours is what SparkleShare is going for (except it's self-hosted). I haven't tried it yet.
  • the8thst Posts: 470
    Thanks for the recommendation. I will check it out when I have a little free time.