glsl-tokenizer

Maps GLSL string data into GLSL tokens, either synchronously or using a streaming API.

var tokenString = require('glsl-tokenizer/string')
var tokenStream = require('glsl-tokenizer/stream')
var fs = require('fs')

// Synchronously:
var tokens = tokenString(fs.readFileSync('some.glsl', 'utf8'))

// Streaming API:
fs.createReadStream('some.glsl')
  .pipe(tokenStream())
  .on('data', function(token) {
    console.log(token.data, token.position, token.type)
  })

API

tokens = require('glsl-tokenizer/string')(src, [opt])

Returns an array of token objects for the GLSL source string src.

You can specify the opt.version string to use a different set of keywords/builtins, such as '300 es' for WebGL2. Otherwise, GLSL 100 (WebGL1) is assumed.

var tokens = tokenizer(src, {
  version: '300 es'
})

stream = require('glsl-tokenizer/stream')([opt])

Returns a stream that emits a 'data' event for each token parsed, with the token object as the payload.

As above, you can specify opt.version.
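
For example, tokenizing a WebGL2 shader from a stream (a minimal sketch; 'shader300.glsl' is a hypothetical file name):

var tokenStream = require('glsl-tokenizer/stream')
var fs = require('fs')

// Pass the version option when constructing the stream, then consume
// tokens as they are emitted.
fs.createReadStream('shader300.glsl')
  .pipe(tokenStream({ version: '300 es' }))
  .on('data', function (token) {
    console.log(token.type, JSON.stringify(token.data))
  })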

Tokens

{ 'type': TOKEN_TYPE
, 'data': "string of constituent data"
, 'position': integer position within the GLSL source
, 'line': line number within the GLSL source
, 'column': column number within the GLSL source }

The available token types are:

  • block-comment: /* ... */
  • line-comment: // ... \n
  • preprocessor: # ... \n
  • operator: Any operator. If it looks like punctuation, it's an operator.
  • float: Optionally suffixed with f
  • ident: User defined identifier.
  • builtin: Builtin function.
  • eof: Emitted on end; data will === '(eof)'.
  • integer: Integer literals, e.g. 0 or 0xFF.
  • whitespace: Runs of spaces, tabs, and newlines.
  • keyword: GLSL keywords, type names, and qualifiers, e.g. void, vec4, uniform.
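
As an illustrative sketch (the shader string below is made up), you can tokenize a snippet and tally the non-whitespace tokens by type:

var tokenString = require('glsl-tokenizer/string')

var src = [
  'precision mediump float;',
  'void main() {',
  '  gl_FragColor = vec4(1.0);',
  '}'
].join('\n')

// Count how many tokens of each type appear, skipping whitespace.
var counts = {}
tokenString(src).forEach(function (token) {
  if (token.type === 'whitespace') return
  counts[token.type] = (counts[token.type] || 0) + 1
})

console.log(counts)
// e.g. { keyword: ..., ident: ..., builtin: ..., operator: ..., float: ..., eof: 1 }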

License

MIT, see LICENSE.md for further information.