
Run MPM from network share? #1889

Open
spotopolis opened this issue Jun 20, 2018 · 4 comments

Comments

@spotopolis

spotopolis commented Jun 20, 2018

Hi. I started looking into this back in v2.7.2.4.

I stopped working on it because the pace of releases picked up, and every update broke the changes I had made.

What I want to be able to do is have a single, unified location on my network share that I can keep current and updated with Git, without having to tweak each machine it runs on.

The idea was to have just the .bat file on the target machine, which would then point to the network location where the program was stored.

That way, if there was a .ps1 miner file or some other update, I only had to maintain the one folder and all of my systems would then use that update. The only per-machine state was stats and logs, which I stored in individual folders named after each rig: under ./Logs and ./Stats it would create RIG1, RIG2, etc. based on $WorkerName.

It worked for the most part, but no matter how much I dug into the .ps1 files, the machines would all use the same stats and log folders and mess each other up, even though I tried to get them to create separate logs based on the rig name in the .bat file that launched them.

Is there an easy way to implement this change? To allow a single location where, depending on the rig name, each rig creates its own unique folder for its logs and stats?
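For reference, the per-rig layout described above could be sketched in PowerShell roughly like this ($WorkerName and the ./Logs, ./Stats folders follow the post; the snippet is a hypothetical sketch, not existing MPM code):

```powershell
# Hypothetical sketch, not current MPM code: derive per-rig folders
# from the worker name passed in from the .bat file.
param([string]$WorkerName = $env:COMPUTERNAME)

$LogDir  = Join-Path ".\Logs"  $WorkerName   # e.g. .\Logs\RIG1
$StatDir = Join-Path ".\Stats" $WorkerName   # e.g. .\Stats\RIG1

# -Force makes this a no-op if the folders already exist
New-Item -ItemType Directory -Force -Path $LogDir, $StatDir | Out-Null
```

The hard part, as the thread goes on to discuss, is threading those per-rig paths through every place the code reads or writes stats and logs.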

@grantemsley
Contributor

Unfortunately no, the script expects that it's running on only one system. What you could do, if you have a lot of systems to update, is keep it on a network share and have a script copy from that share to each machine on a regular basis.
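A minimal version of such a copy script might look like this (the share and destination paths are placeholders; it could be run from Task Scheduler or at boot):

```powershell
# Hypothetical sync sketch: mirror MPM from the share to this machine.
# /MIR mirrors the tree; /XD keeps the per-machine Logs and Stats
# folders from being overwritten; /R and /W limit retries on locked files.
robocopy \\server\mpm C:\MPM /MIR /XD Logs Stats /R:2 /W:5
```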

Or use something like the update-dev.ps1 script from my repo that automatically pulls down changes from git, clears stats for updated miners, and downloads new binaries all at once. That's been my solution to keeping all my machines updated.

Or you might be able to do some magic with mklink to make each machine seem to have its own stats folder - not sure how that works out with network drives.
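If someone wants to try the mklink route, one heavily-hedged sketch (the folder names and \\server\mpm are placeholders, and symlink behavior over SMB varies by configuration):

```bat
rem Hypothetical sketch only - run once per rig from an elevated prompt.
rem Keeps per-rig Stats/Logs local while linking shared code folders.
rem Symlinks to UNC paths may also require:
rem   fsutil behavior set SymlinkEvaluation R2L:1 R2R:1
mkdir C:\MPM\Stats C:\MPM\Logs
mklink /D C:\MPM\Miners \\server\mpm\Miners
mklink /D C:\MPM\Pools  \\server\mpm\Pools
copy \\server\mpm\*.ps1 C:\MPM\
```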

@UselessGuru
Contributor

UselessGuru commented Jun 21, 2018

Is there an easy way to implement this change? To allow a single location, but depending on rig name, it creates its own unique folder for that rig's logs and stats?

Not with the current codebase. All paths are hardcoded.
In the future we might add support for this. I could imagine a new command line parameter which would be inserted in all the relevant paths in

  • Get-Stat
  • Set-Stat
  • Write-Log
  • etc. (these are the functions that come to mind without further code inspection).
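Such a parameter might be threaded through the path-building code roughly like this (a sketch only; the -WorkerName parameter and the folder layout are assumptions, not current MPM code):

```powershell
# Sketch only - the -WorkerName parameter and layout are assumptions,
# not existing MPM code.
param([string]$WorkerName = "")

function Get-StatDir {
    # An empty worker name keeps the legacy flat .\Stats layout;
    # otherwise stats land in .\Stats\<WorkerName>\
    if ($WorkerName) { Join-Path "Stats" $WorkerName } else { "Stats" }
}

# e.g. inside Set-Stat / Get-Stat ($StatName being the stat's file name):
$Path = Join-Path (Get-StatDir) "$StatName.txt"
```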

@spotopolis
Author

Yeah. I did a lot of editing to get it able to run from a single network location on each machine. It was a lot of trial and error: looking up the reported errors, finding them in Notepad++, and figuring out what was needed next.

I actually learned a decent amount about MPM and powershell in general from it.

I had it all working. Just the bat file on the local machine pointing to the network share and it would run with no issues, but as soon as I fired it up on another machine, they would fight over the log and stat files and then I would have messed up profits and algos would want to rebench.

So I came to the conclusion that I needed to have MPM create a new stat folder based on the $WorkerName in the .bat file. Then I could run it on however many machines I wanted from one location, but no matter how many of the lines referencing those two folders I modified, they only ever wrote to one folder.

@Spudz76

Spudz76 commented Jul 6, 2018

You will also run into silly network share locking problems.

Such as: git can't write new files because "everyone" has them open with opportunistic locking (running an exe off a share keeps it locked until all clients exit). So it's basically a whole new problem instead of a solution - all miners have to shut down and release the files before the share can update, wait for the update, then jump back on the files.

Best to treat the share location as a mirror and just have each client box rsync the contents whenever it restarts. Then nobody locks any of the files on the share server at any time, except maybe git during updates.
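On Windows, that mirror-then-launch pattern could be a small wrapper like this (paths, the script name, and the -WorkerName parameter are assumptions for illustration):

```bat
@echo off
rem Hypothetical per-rig launcher: mirror the share locally, then run
rem from the local copy so nothing holds files open on the share.
robocopy \\server\mpm C:\MPM /MIR /XD Logs Stats /R:2 /W:5
cd /d C:\MPM
powershell -ExecutionPolicy Bypass -File .\MultiPoolMiner.ps1 -WorkerName %COMPUTERNAME%
```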
