Scan + merge #140

Open
char0n opened this issue Mar 19, 2019 · 12 comments

Comments

@char0n

char0n commented Mar 19, 2019

Version

  • i18next: 15.0.6
  • i18next-scanner: 2.10.1

Is it possible to use the scanner in a merge mode? I'll describe what I mean. I already have some .json files translated, so they contain keys and translated values. I want to use the scanner to scan the codebase for new keys that should be added. If there are any, it adds the new keys to the existing translation files. All the old keys remain in the translation files, whether they are still used or not.

Basically this covers the use case of progressively using the scanner to add new keys from the source code to the translation files in an automated way. I haven't found a way to do that; removeUnusedKeys just doesn't have any effect. Can you please advise whether this is something the scanner can already do?

Thank you

@andreidiaconescu

andreidiaconescu commented Mar 22, 2019

Hello,
I have a similar problem.
I have existing translation files which already contain translations, and I want to extract all keys and merge them where a translation already exists.
But the scanner always overwrites all translation values with __STRING_NOT_TRANSLATED__.
Did you manage to find a fix for this issue?

@char0n
Author

char0n commented Mar 23, 2019

Hi @andreidiaconescu,

I've managed to solve it. Check out this repository: https://github.com/char0n/i18next-test. When creating the config for the scanner, you can provide a custom flush function which controls how the data is flushed into files. See this configuration (lines 33-34 are the key) for more information. The algorithm can be modified to accommodate your requirements if they differ.
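
Roughly, the key part is merging what the scanner collected with whatever already exists on disk, so existing values survive. A minimal sketch (not the exact code from the linked repo; the function name and arguments are illustrative):

const fs = require('fs');

// Sketch: merge the keys the scanner collected for one language/namespace
// with the translation file already on disk, so existing values survive.
function mergeWithExisting(scanned, filePath) {
  let existing = {};
  try {
    existing = JSON.parse(fs.readFileSync(filePath, 'utf8'));
  } catch (e) {
    // No existing file yet: fall back to the scanned keys only
  }
  // Existing translations win; only brand-new keys get the scanned default value
  return { ...scanned, ...existing };
}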

@andreidiaconescu

andreidiaconescu commented Mar 25, 2019

Hello @char0n,
thank you, I tried the solution you proposed:

'use strict';

var fs = require('fs');
var chalk = require('chalk');

const eol = require('eol');
const path = require('path');
const VirtualFile = require('vinyl');


function flush(done) {
  const { parser } = this;
  const { options } = parser;

  // Flush to resource store
  const resStore = parser.get({ sort: options.sort });
  const { jsonIndent } = options.resource;
  const lineEnding = String(options.resource.lineEnding).toLowerCase();

  Object.keys(resStore).forEach((lng) => {
    const namespaces = resStore[lng];

    Object.keys(namespaces).forEach((ns) => {
      const resPath = parser.formatResourceSavePath(lng, ns);
      let resContent;
      try {
        resContent = JSON.parse(
          fs.readFileSync(
            fs.realpathSync(path.join('src', 'locales', resPath))
          ).toString('utf-8')
        );
      } catch (e) {
        resContent = {};
      }
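      // Spread order matters: values already present in the existing file (resContent) override the freshly scanned defaults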
      const obj = { ...namespaces[ns], ...resContent };
      let text = JSON.stringify(obj, null, jsonIndent) + '\n';

      if (lineEnding === 'auto') {
        text = eol.auto(text);
      } else if (lineEnding === '\r\n' || lineEnding === 'crlf') {
        text = eol.crlf(text);
      } else if (lineEnding === '\n' || lineEnding === 'lf') {
        text = eol.lf(text);
      } else if (lineEnding === '\r' || lineEnding === 'cr') {
        text = eol.cr(text);
      } else { // Defaults to LF
        text = eol.lf(text);
      }

      let contents = null;

      try {
        // "Buffer.from(string[, encoding])" is added in Node.js v5.10.0
        contents = Buffer.from(text);
      } catch (e) {
        // Fallback to "new Buffer(string[, encoding])" which is deprecated since Node.js v6.0.0
        contents = new Buffer(text);
      }

      this.push(new VirtualFile({
        path: resPath,
        contents: contents
      }));
    });
  });

  done();
}

module.exports = {
    input: [
        'src/app/**/*.{ts,html,htm}',
        '!**/node_modules/**',
    ],
    output: './tmp',
    options: {
        debug: true,
        sort: true,
        removeUnusedKeys: false,
        func: {
            list: ['i18nService.translate', 'mi18nService.translate'],
            extensions: ['.ts', '.html', '.htm']
        },
        attr: {
            list: ['[attx]', '[qwe]', '[mkey]', '[akey]'],
            extensions: ['.html', '.htm']
        },
        trans: false,
        lngs: ['fr', 'nl', 'en'],
        ns: [
            'translation',
        ],
        defaultLng: 'fr',
        defaultNs: 'translation',
        defaultValue: '__STRING_NOT_TRANSLATED__',
        resource: {
            loadPath: 'tmp/i18n/translations.{{lng}}.json',
            savePath: 'tmp/i18n/translations.{{lng}}.json',
            jsonIndent: 2,
            lineEnding: '\n'
        },
        nsSeparator: false, // namespace separator
        keySeparator: false, // key separator
        interpolation: {
            prefix: '{{',
            suffix: '}}'
        },
    },
    transform: function customTransform(file, enc, done) {
        "use strict";

        console.log(`Parsing file: file=${JSON.stringify(file.relative)}`);

        const parser = this.parser;
        // console.log('+++parser.get()+++', parser.get());
        // console.log('+++parser.get({ sort: true })+++', parser.get({ sort: true }));
        // process.exit(0);

        const content = fs.readFileSync(file.path, enc);
        let count = 0;

        parser.parseAttrFromString(content, (key, options) => {
            console.log(`Found attribute value to be translated: ${key}`);
            key = key.replace(/^'/, '').replace(/'$/, '');
            console.log(`Key after trimming single quotes: ${key}`);
            parser.set(key, Object.assign({}, options, {
                nsSeparator: false,
                keySeparator: false
            }));

            // parser.set(key, 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaa');
            ++count;
        });

        if (count > 0) {
            console.log(`i18next-scanner: count=${chalk.cyan(count)}, file=${chalk.yellow(JSON.stringify(file.relative))}`);
        }
        done();
    },
    flush,
};

  • I run

./node_modules/i18next-scanner/bin/cli.js --config i18next-scanner.config.js

  • The existing translation values are still being overwritten each time with __STRING_NOT_TRANSLATED__.
  • So for me it is not working.

@char0n
Author

char0n commented Mar 25, 2019

If you clone the repo, install the dependencies, and run npm run i18n:extract:scanner, you can see that only new keys are added and the already translated content remains untouched. So for me it does what I wanted. Maybe you need something slightly different from what I described in my first comment on this issue.

@Justkant

Justkant commented Aug 9, 2019

@andreidiaconescu You probably just need to make this line, fs.realpathSync(path.join('src', 'locales', resPath)), match your output config; in your case that would be fs.realpathSync(path.join('tmp', resPath)).
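
In the flush function posted above, that change would look roughly like this:

// Before: reads the existing file from a hard-coded src/locales directory
// resContent = JSON.parse(fs.readFileSync(fs.realpathSync(path.join('src', 'locales', resPath))).toString('utf-8'));

// After: read from the directory the scanner actually writes to (output: './tmp')
resContent = JSON.parse(
  fs.readFileSync(
    fs.realpathSync(path.join('tmp', resPath))
  ).toString('utf-8')
);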

@Justkant

Hi, I improved the flush function a bit for my use case (gist):

  • It always overwrites the defaultLng (to keep the JSON file up to date with the defaultMessages in the code; by the way, it seems I wouldn't need a JSON file for the defaultLng at all if I used defaultMessage in my code).
  • It also supports the removeUnusedKeys option (a rough sketch of both changes follows below).
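
Roughly, the merge step with those two changes plugs into the flush function shown earlier in this thread and looks like this (simplified sketch with flat keys, not the gist verbatim):

// Simplified sketch of the merge step (flat keys, as with keySeparator: false)
const scanned = namespaces[ns];                   // keys the parser found in the code
const isDefaultLng = lng === options.defaultLng;

let obj = isDefaultLng
  ? { ...resContent, ...scanned }                 // defaultLng: scanned defaults always win
  : { ...scanned, ...resContent };                // other languages: existing translations win

if (options.removeUnusedKeys) {
  // Drop keys that no longer appear anywhere in the scanned source code
  obj = Object.fromEntries(
    Object.entries(obj).filter(([key]) => key in scanned)
  );
}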

@char0n
Author

char0n commented Aug 16, 2019

@Justkant nice work. Thanks for sharing!

@illuminist

I just noticed that using output in combination with loadPath/savePath causes the parser to load the JSON files incorrectly, which makes the parser write a new data file instead of merging with the old one.

Changing output to ./ and setting loadPath and savePath so that they concatenate with output instead fixed the issue.
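
For example, starting from the config posted earlier in this thread, it would become something like this (sketch):

module.exports = {
  // ...
  output: './',                 // base directory only
  options: {
    // ...
    resource: {
      // per the comment above, loadPath/savePath concatenate with `output`
      loadPath: 'tmp/i18n/translations.{{lng}}.json',
      savePath: 'tmp/i18n/translations.{{lng}}.json',
      jsonIndent: 2,
      lineEnding: '\n'
    },
  },
};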

@softerboy

Thank you @Justkant. Works like a charm )

@transmissions11

@illuminist Thank you so much for finding that. Saved me hours!

@luca-saggese

@illuminist: thanks it worked!

@Drecu

Drecu commented Aug 27, 2021

In my case I just had to omit the output part and use only loadPath and savePath to get this "merge behaviour", but thanks to @illuminist for pointing me in the right direction!
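
A config along these lines (the paths and globs here are only illustrative, not from my project) matches that description:

module.exports = {
  input: ['src/**/*.{js,jsx,ts,tsx}'],   // illustrative input globs
  // no `output` entry
  options: {
    // ...
    resource: {
      loadPath: 'public/locales/{{lng}}/{{ns}}.json',
      savePath: 'public/locales/{{lng}}/{{ns}}.json',
    },
  },
};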

goooooouwa added a commit to goooooouwa/goplan-web that referenced this issue Jul 21, 2022