# Import script #3
Couple of questions:
Yes. We're migrating away from Stormpath. I've built a proof-of-concept Stormpath export script that'll get us data in the described format. We'll get a big-ass folder with 600,000 users in it. It'll then be split into small chunks by a shell script.

Yes, that would be great. It keeps things flexible. We need to know which user imports failed, though: there might be 2 users that end up with the same "tag", for instance. Expecting that to be rare, we'll resolve those issues manually.

Yup. The desired "output" is just an illustration. I think I took the format of the

Yes, there might be extra fields in the input that have to be ignored as well.
All right, let's have that format as the payload for user creation (

Not sure which option from 2. would be easier to wrap in
About generating passwords: do we care about what they are, or do we expect users to change them later?
I suppose we can pipeline with multiple smaller tools, written in whatever is more convenient, then some bash glue around that.
Passwords can be set to random/safe ones... We will mass-email the users that they might need to change their password when they connect from a new device.
My thinking is that starting a node process for each of 600k files is a bit wasteful (it takes 50 ms just to start, do nothing and exit; about the same as reading a file, parsing the JSON, serializing it and printing it out):

Though doing it this way is a lot less code. But maybe piping together a bunch of event emitters isn't that much code either, and I'd rather write js than bash :) Let's see…
I guess it's so small that you can also have the nodejs conversion script do the sending of a single request and report on stdout. Then simply find/xargs the whole directory through the script > output.json; shouldn't require a lot of
Implement an import script, with input data organized as described below.
All users are stored in a directory. This directory will contain a lot of JSON files, named after the username (`<username>.json`). Each JSON file represents a single user. It contains an object with the following fields (all being of type string):

- … `"Facebook"` or `"Email"`
- `givenName` … equals `"Facebook"`
We want to import those into ganomede-directory with the following transformation:

- `input.username`
- `randomPassword()`
- `input.username`
- `tagizer(input.username)`
- `input.email`
- `input.middleName`
- `APP_ID`, a constant provided as an env variable or CLI argument (required)

`randomPassword()` is a function that generates a safe random password. `tagizer()` is a function that generates the unambiguous version of a username. See the ganomede-tagizer micro-library.

### Example input data
`cat NamilleX07.json`

`cat 05lala61.json`
### Corresponding output data