This repository has been archived by the owner on Jun 26, 2023. It is now read-only.

Commit

Merge pull request #70 from MITLibraries/retire-scripts
Disable Scheduled Jobs
cabutlermit committed Apr 10, 2023
2 parents 013046f + 00ca83e commit 9eea4b9
Showing 8 changed files with 133 additions and 119 deletions.
107 changes: 62 additions & 45 deletions README.md
@@ -1,30 +1,43 @@
# About this project
This repository contains scripts related to the Alma migration, and is primarily used in the alma-sftp-ec2 instance.

## Important note: How to update files on the alma-sftp-ec2 instance.
This repository contains scripts related to the Alma migration, and is primarily used in the alma-sftp-ec2 instance.

## IMPORTANT NOTE 1

This app is due for retirement. The scripts have all been migrated to our AWS Organization as standalone containers. The EC2 instance is currently turned off, and this repo will eventually be archived, but in the meantime this change disables any cron jobs that might get pushed to the EC2 instance if it is accidentally turned back on.

## Important note 2: How to update files on the alma-sftp-ec2 instance.

Files in this repo are synced to the alma-sftp-ec2 instance on the first deployment of the machine and, from there, are manually synced when needed via `git pull` as the gituser (see the example below).
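A manual sync on the instance might look like the following. This is a sketch: the `sudo su - gituser` step is an assumption, and the repo path is taken from the cron scripts in this repo.

```bash
# Sketch of a manual sync, assuming the repo lives at the path used by the cron scripts.
sudo su - gituser
cd /home/gituser/alma-scripts
git pull
```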

# Directories
## Cron.d
## Directories

### Cron.d

This directory contains cron jobs, added to the cron.d directory on alma-sftp-ec2 automatically.
Cron job files MUST end in a newline.
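For reference, a cron.d file in this repo looks like the following. This mirrors `cron.d/patron-load` as it existed before this change; the trailing blank line is the required newline.

```bash
# Cron jobs for patron load
#
# Daily at 0301
01 03 * * * gituser /home/gituser/alma-scripts/scripts/Patron-load.sh
#
#make sure to keep required newline at end of file -

```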

## scripts
### scripts

This directory contains the scripts that need to run on alma-sftp-ec2.
Scripts should be made executable (`chmod +x`) in order to run successfully as cron jobs.
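For example (a sketch, run from the repo root on the instance):

```bash
# Mark all shell scripts in the scripts directory as executable.
chmod +x scripts/*.sh
```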

## SSM parameter store usage with alma-scripts
### SSM parameter store usage with alma-scripts

* SSM parameters in the /apps/alma-sftp/ namespace are accessible by scripts in the alma-scripts repo
* Parameters should be placed in the parameter store by developers in that path
* Secret parameters should be created with type `SecureString` and `Use the default KMS key for this account or specify a customer-managed key for this account.` (a read example follows this list)
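For illustration, a script on the instance could read such a parameter with the AWS CLI. The parameter name below is hypothetical; only the `/apps/alma-sftp/` namespace is documented above.

```bash
# Hypothetical parameter name, shown only to illustrate the namespace and decryption flag.
aws ssm get-parameter \
  --name /apps/alma-sftp/EXAMPLE_PARAMETER \
  --with-decryption \
  --query Parameter.Value \
  --output text
```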

## SES usage within alma-scripts
### SES usage within alma-scripts

* Emails from the SES service must come from `noreply@libraries.mit.edu` for this app (an example call follows)
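The cron scripts in this repo send their completion notices with a call like the one below; the recipient, subject, and log path here are copied from the patron load script and are illustrative only.

```bash
aws ses send-email \
  --region us-east-1 \
  --from noreply@libraries.mit.edu \
  --to lib-alma-notifications@mit.edu \
  --subject "Patronload $WORKSPACE Job Completed" \
  --text file:///home/gituser/logs/patron-load.log
```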

## Development
### Development

The following env variables are required and should be set as follows in a `.env` file
for local development:
```

```bash
WORKSPACE=dev
SSM_PATH=/dev/
```
@@ -34,54 +47,58 @@ Additional env variables may be required depending on the work being done. Check
needed.

If a multi-line value, such as a private key, is needed in the `.env` file, use single quotes:
```

```bash
SAP_DROPBOX_KEY='-----BEGIN RSA PRIVATE KEY-----
many
lines
-----END RSA PRIVATE KEY-----'
```

### Using Moto for local development
#### Using Moto for local development

Certain SSM parameters are needed for the SAP invoices process; however, we don't currently have a dev instance of SSM to work with. [Moto](https://github.com/spulec/moto) should be used in [Standalone Server Mode](https://github.com/spulec/moto#stand-alone-server-mode) during local development to mimic these required SSM parameters rather than using the stage or prod SSM Parameter Store.

To use:
1. Start moto in standalone server mode with `pipenv run moto_server`
2. Add `SSM_ENDPOINT_URL=http://localhost:5000` to your `.env` file (Note: be sure to comment this out before running tests or they will fail)
3. Start a Python shell and initialize the SSM client:
```
pipenv run python
from llama.ssm import SSM
ssm = SSM()
```
4. Check logging output to confirm that ssm was initialized with endpoint=http://localhost:5000
5. Still in the Python shell, create initial required values (only one for now):
```
ssm.update_parameter_value("/dev/SAP_SEQUENCE", "1001,20210722000000,ser", "StringList")
```

### Creating sample SAP data

1. Start moto in standalone server mode with `pipenv run moto_server`
2. Add `SSM_ENDPOINT_URL=http://localhost:5000` to your `.env` file (Note: be sure to comment this out before running tests or they will fail)
3. Start a Python shell and initialize the SSM client:

```bash
pipenv run python
from llama.ssm import SSM
ssm = SSM()
```

4. Check logging output to confirm that ssm was initialized with endpoint=http://localhost:5000
5. Still in the Python shell, create initial required values (only one for now):

```
ssm.update_parameter_value("/dev/SAP_SEQUENCE", "1001,20210722000000,ser", "StringList")
```
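To double-check that the value landed in the local Moto server, you can also query it with the AWS CLI pointed at the same endpoint. This is a sketch; it assumes placeholder AWS credentials and a region are configured for the CLI.

```bash
# Query the local Moto SSM server for the parameter created above.
aws ssm get-parameter \
  --endpoint-url http://localhost:5000 \
  --name /dev/SAP_SEQUENCE \
  --query Parameter.Value \
  --output text
```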

#### Creating sample SAP data

Running the SAP Invoices process during local development or on staging requires that
there be sample invoices ready to be paid in the Alma sandbox. To simplify this, there
is a CLI command that will create four sample invoices in the sandbox. To do this:
1. Make sure the following variables are set in your `.env`:
```
WORKSPACE=dev
SSM_PATH=/dev/ (you don't need any SSM parameters created locally for this command,
but config still requires a path in env)
ALMA_API_URL=<the Alma API base URL>
ALMA_API_ACQ_READ_WRITE_KEY=<the SANDBOX Alma Acq read/write key>
```
2. Run `pipenv run llama create-sandbox-sap-data`. You should get a final log message
saying there are four invoices ready for manual approval in Alma.
3. Go to the Alma sandbox UI > Acquisitions module > Review (Invoice) > Unassigned
tab. There should be four invoices listed whose numbers start with TestSAPInvoice.
4. For each of those invoices, click on it and then click "Save and Continue". They
will now show up in the Waiting For Approval Invoices queue.
5. From that queue, using the three dots to the right of each invoice, choose "Edit"
and then click "Approve" in the upper right corner.
6. Once the invoices have been approved, they are ready to be paid and will be
retrieved and processed using the llama sap-invoices CLI command.

1. Make sure the following variables are set in your `.env`:

```bash
WORKSPACE=dev
SSM_PATH=/dev/
ALMA_API_URL=<the Alma API base URL>
ALMA_API_ACQ_READ_WRITE_KEY=<the SANDBOX Alma Acq read/write key>
```

2. Run `pipenv run llama create-sandbox-sap-data`. You should get a final log message saying there are four invoices ready for manual approval in Alma.
3. Go to the Alma sandbox UI > Acquisitions module > Review (Invoice) > Unassigned tab. There should be four invoices listed whose numbers start with TestSAPInvoice.
4. For each of those invoices, click on it and then click "Save and Continue". They will now show up in the Waiting For Approval Invoices queue.
5. From that queue, using the three dots to the right of each invoice, choose "Edit" and then click "Approve" in the upper right corner.
6. Once the invoices have been approved, they are ready to be paid and will be retrieved and processed using the llama sap-invoices CLI command.

Note that sample invoices will remain in the Alma sandbox in the "Waiting to be Sent"
status until a "real", "final" sap-invoices process has been run, at which point they
will be marked as paid and new sample invoices will need to be created.
3 changes: 1 addition & 2 deletions cron.d/credit-card-slips
@@ -1,7 +1,6 @@
# Cron jobs for credit card slips
#
# Daily at 0801
01 08 * * * gituser /home/gituser/alma-scripts/scripts/Credit-Card-Slips.sh
# 01 08 * * * gituser /home/gituser/alma-scripts/scripts/Credit-Card-Slips.sh
#
#make sure to keep required newline at end of file -

3 changes: 1 addition & 2 deletions cron.d/patron-load
@@ -1,7 +1,6 @@
# Cron jobs for patron load
#
# Daily at 0301
01 03 * * * gituser /home/gituser/alma-scripts/scripts/Patron-load.sh
# 01 03 * * * gituser /home/gituser/alma-scripts/scripts/Patron-load.sh
#
#make sure to keep required newline at end of file -

3 changes: 1 addition & 2 deletions cron.d/timdex-exports
@@ -1,7 +1,7 @@
# Cron jobs for TIMDEX exports
#
# Daily at 0853
53 08 * * * gituser /home/gituser/alma-scripts/scripts/Update-Export.sh
# 53 08 * * * gituser /home/gituser/alma-scripts/scripts/Update-Export.sh
#
# # At 0030 on the 2nd of the month (runs on full export from the 1st of the
# # month)
@@ -10,4 +10,3 @@
#30 0 5 * * gituser /home/gituser/alma-scripts/scripts/Full-Export.sh

#make sure to keep required newline at end of file -

26 changes: 13 additions & 13 deletions scripts/Credit-Card-Slips.sh
@@ -1,18 +1,18 @@
#!/bin/bash
#source the environment variables here so that the script runs correctly for the cron user
source /etc/profile
# #!/bin/bash
# #source the environment variables here so that the script runs correctly for the cron user
# source /etc/profile

#make the logs dir if it doesn't already exist
mkdir /home/gituser/logs
# #make the logs dir if it doesn't already exist
# mkdir /home/gituser/logs

#change to the alma-scripts directory
cd /home/gituser/alma-scripts
# #change to the alma-scripts directory
# cd /home/gituser/alma-scripts

#install the dependencies
/usr/local/bin/pipenv install
# #install the dependencies
# /usr/local/bin/pipenv install

#run the update, which automatically only uses the current days files
# IF its the PROD instance, send it to the prod email address
[[ $WORKSPACE == "prod" ]] && /usr/local/bin/pipenv run llama cc-slips --source_email noreply@libraries.mit.edu --recipient_email ils-lib@mit.edu --recipient_email monoacq@mit.edu > /home/gituser/logs/credit-card-slips.log 2>&1 || /usr/local/bin/pipenv run llama cc-slips --source_email noreply@libraries.mit.edu --recipient_email lib-alma-notifications@mit.edu > /home/gituser/logs/credit-card-slips.log 2>&1
# #run the update, which automatically only uses the current days files
# # IF its the PROD instance, send it to the prod email address
# [[ $WORKSPACE == "prod" ]] && /usr/local/bin/pipenv run llama cc-slips --source_email noreply@libraries.mit.edu --recipient_email ils-lib@mit.edu --recipient_email monoacq@mit.edu > /home/gituser/logs/credit-card-slips.log 2>&1 || /usr/local/bin/pipenv run llama cc-slips --source_email noreply@libraries.mit.edu --recipient_email lib-alma-notifications@mit.edu > /home/gituser/logs/credit-card-slips.log 2>&1

aws ses send-email --region us-east-1 --from noreply@libraries.mit.edu --to lib-alma-notifications@mit.edu --subject "Creditcardslips $WORKSPACE Job Completed" --text file:///home/gituser/logs/credit-card-slips.log
# aws ses send-email --region us-east-1 --from noreply@libraries.mit.edu --to lib-alma-notifications@mit.edu --subject "Creditcardslips $WORKSPACE Job Completed" --text file:///home/gituser/logs/credit-card-slips.log
30 changes: 15 additions & 15 deletions scripts/Full-Export.sh
@@ -1,21 +1,21 @@
#!/bin/bash
#source the environment variables here so that the script runs correctly for the cron user
source /etc/profile
# #!/bin/bash
# #source the environment variables here so that the script runs correctly for the cron user
# source /etc/profile

#make the logs dir if it doesn't already exist
mkdir /home/gituser/logs
# #make the logs dir if it doesn't already exist
# mkdir /home/gituser/logs

#change to the alma-scripts directory
cd /home/gituser/alma-scripts
# #change to the alma-scripts directory
# cd /home/gituser/alma-scripts

#Get the current date, but dont use the day, we hard code that to use the 1st
CURRENT_DATE="$(date +"%Y%m")01"
# #Get the current date, but dont use the day, we hard code that to use the 1st
# CURRENT_DATE="$(date +"%Y%m")01"

#install the dependencies
/usr/local/bin/pipenv install
# #install the dependencies
# /usr/local/bin/pipenv install

#run the full update, this should only happen on the second of the month for files generated on the first.
/usr/local/bin/pipenv run llama concat-timdex-export --export_type FULL --date "$CURRENT_DATE" > /home/gituser/logs/timdex-concat.log 2>&1
# #run the full update, this should only happen on the second of the month for files generated on the first.
# /usr/local/bin/pipenv run llama concat-timdex-export --export_type FULL --date "$CURRENT_DATE" > /home/gituser/logs/timdex-concat.log 2>&1

# IF its the PROD instance, send it to the prod email address, otherwise, send to just our dev emails
[[ $WORKSPACE == "prod" ]] && aws ses send-email --region us-east-1 --from noreply@libraries.mit.edu --to lib-alma-timdex-notifications@mit.edu --subject "PROD FULL Concat Job Completed" --text file:///home/gituser/logs/timdex-concat.log || aws ses send-email --region us-east-1 --from noreply@libraries.mit.edu --to lib-alma-notifications@mit.edu --subject "TESTING FULL Concat Job Completed" --text file:///home/gituser/logs/timdex-concat.log
# # IF its the PROD instance, send it to the prod email address, otherwise, send to just our dev emails
# [[ $WORKSPACE == "prod" ]] && aws ses send-email --region us-east-1 --from noreply@libraries.mit.edu --to lib-alma-timdex-notifications@mit.edu --subject "PROD FULL Concat Job Completed" --text file:///home/gituser/logs/timdex-concat.log || aws ses send-email --region us-east-1 --from noreply@libraries.mit.edu --to lib-alma-notifications@mit.edu --subject "TESTING FULL Concat Job Completed" --text file:///home/gituser/logs/timdex-concat.log
54 changes: 27 additions & 27 deletions scripts/Patron-load.sh
@@ -1,38 +1,38 @@
#!/bin/bash
#source the environment variables here so that the script runs correctly for the cron user
source /etc/profile
# #!/bin/bash
# #source the environment variables here so that the script runs correctly for the cron user
# source /etc/profile

#make the logs dir if it doesn't already exist
mkdir /home/gituser/logs
# #make the logs dir if it doesn't already exist
# mkdir /home/gituser/logs

#change to the alma-scripts directory
cd /home/gituser/alma-scripts/patronload
# #change to the alma-scripts directory
# cd /home/gituser/alma-scripts/patronload

#install the dependencies
/usr/local/bin/pipenv install
# #install the dependencies
# /usr/local/bin/pipenv install

#Run the staff load
/usr/local/bin/pipenv run python staff.py > /home/gituser/logs/patron-load.log 2>&1
# #Run the staff load
# /usr/local/bin/pipenv run python staff.py > /home/gituser/logs/patron-load.log 2>&1

#Run the student load
/usr/local/bin/pipenv run python student.py >> /home/gituser/logs/patron-load.log 2>&1
# #Run the student load
# /usr/local/bin/pipenv run python student.py >> /home/gituser/logs/patron-load.log 2>&1

#make the folder if it doesn't already exist, this perl script errors out without the folder
mkdir SEND
# #make the folder if it doesn't already exist, this perl script errors out without the folder
# mkdir SEND

#Run the "zip function" that also does a diff and makes sure the staff files take precedence
perl scripts/pack_all_records.pl >> /home/gituser/logs/patron-load.log 2>&1
# #Run the "zip function" that also does a diff and makes sure the staff files take precedence
# perl scripts/pack_all_records.pl >> /home/gituser/logs/patron-load.log 2>&1

#Delete existing zips if they exist, we cant double up zip files if an alma run fails, files, once used, are renamed to .old
aws s3 rm s3://$ALMA_BUCKET/exlibris/PatronLoad/ --exclude "*" --include "*.zip" >> /home/gituser/logs/patron-load.log 2>&1
# #Delete existing zips if they exist, we cant double up zip files if an alma run fails, files, once used, are renamed to .old
# aws s3 rm s3://$ALMA_BUCKET/exlibris/PatronLoad/ --exclude "*" --include "*.zip" >> /home/gituser/logs/patron-load.log 2>&1

#Sync to the s3 bucket s3://alma-sftp-prod/exlibris/PatronLoad/
# MV also deletes the zip files if they are succesfully copied
aws s3 mv SEND/ s3://$ALMA_BUCKET/exlibris/PatronLoad/ --exclude "*" --include "*.zip" --recursive >> /home/gituser/logs/patron-load.log 2>&1
# #Sync to the s3 bucket s3://alma-sftp-prod/exlibris/PatronLoad/
# # MV also deletes the zip files if they are succesfully copied
# aws s3 mv SEND/ s3://$ALMA_BUCKET/exlibris/PatronLoad/ --exclude "*" --include "*.zip" --recursive >> /home/gituser/logs/patron-load.log 2>&1

#Send notify of job completion
aws ses send-email --region us-east-1 --from noreply@libraries.mit.edu --to lib-alma-notifications@mit.edu --subject "Patronload $WORKSPACE Job Completed" --text file:///home/gituser/logs/patron-load.log
# #Send notify of job completion
# aws ses send-email --region us-east-1 --from noreply@libraries.mit.edu --to lib-alma-notifications@mit.edu --subject "Patronload $WORKSPACE Job Completed" --text file:///home/gituser/logs/patron-load.log

# Remove the "rejects" files from the filesystem
#rm rejects_students_script.txt
#rm rejects_staff_script.txt
# # Remove the "rejects" files from the filesystem
# #rm rejects_students_script.txt
# #rm rejects_staff_script.txt
26 changes: 13 additions & 13 deletions scripts/Update-Export.sh
@@ -1,18 +1,18 @@
#!/bin/bash
#source the environment variables here so that the script runs correctly for the cron user
source /etc/profile
# #!/bin/bash
# #source the environment variables here so that the script runs correctly for the cron user
# source /etc/profile

#make the logs dir if it doesn't already exist
mkdir /home/gituser/logs
# #make the logs dir if it doesn't already exist
# mkdir /home/gituser/logs

#change to the alma-scripts directory
cd /home/gituser/alma-scripts
# #change to the alma-scripts directory
# cd /home/gituser/alma-scripts

#install the dependencies
/usr/local/bin/pipenv install
# #install the dependencies
# /usr/local/bin/pipenv install

#run the update, which automatically only uses the current days files
/usr/local/bin/pipenv run llama concat-timdex-export --export_type UPDATE > /home/gituser/logs/timdex-concat.log 2>&1
# #run the update, which automatically only uses the current days files
# /usr/local/bin/pipenv run llama concat-timdex-export --export_type UPDATE > /home/gituser/logs/timdex-concat.log 2>&1

# IF its the PROD instance, send it to the prod email address, otherwise, send to just our dev emails
[[ $WORKSPACE == "prod" ]] && aws ses send-email --region us-east-1 --from noreply@libraries.mit.edu --to lib-alma-timdex-notifications@mit.edu --subject "PROD UPDATE Concat Job Completed" --text file:///home/gituser/logs/timdex-concat.log || aws ses send-email --region us-east-1 --from noreply@libraries.mit.edu --to lib-alma-notifications@mit.edu --subject "TESTING UPDATE Concat Job Completed" --text file:///home/gituser/logs/timdex-concat.log
# # IF its the PROD instance, send it to the prod email address, otherwise, send to just our dev emails
# [[ $WORKSPACE == "prod" ]] && aws ses send-email --region us-east-1 --from noreply@libraries.mit.edu --to lib-alma-timdex-notifications@mit.edu --subject "PROD UPDATE Concat Job Completed" --text file:///home/gituser/logs/timdex-concat.log || aws ses send-email --region us-east-1 --from noreply@libraries.mit.edu --to lib-alma-notifications@mit.edu --subject "TESTING UPDATE Concat Job Completed" --text file:///home/gituser/logs/timdex-concat.log
