This repository has been archived by the owner on Dec 27, 2019. It is now read-only.

multiple streams and stream needs to be piped to writable #38

Open
eljefedelrodeodeljefe opened this issue Oct 11, 2017 · 1 comment


eljefedelrodeodeljefe commented Oct 11, 2017

Hi,

I am struggling to see why the test below does not work. Maybe I am getting something wrong, but right now I assume something quirky is happening in the stream implementation. Maybe some events aren't being passed through.

In any case, needing to pipe to a writable just to get the 'end' event seems odd.

I am running on Node 8.

The use case here is an array of queries that get run recursively. I am creating new instances of the client and streams wherever possible.

var helper = require('./helper')
var QueryStream = require('../')
var concat = require('concat-stream')
var pg = require('pg')

var Transform = require('stream').Transform

var mapper = new Transform({ objectMode: true })

mapper._transform = function (obj, enc, cb) {
  console.log('will see data a couple of times')

  this.push(obj)
  cb(null)
}

helper('mapper', function (client) {
  it('works', function (done) {
    exec(client, () => {
      // commenting this in makes the test pass
      // return done()
      const client1 = new pg.Client()

      client1.connect(() => {
        exec(client1, () => {
          done()
        })
      })
    })
  })
})

function exec (client, cb) {
  var stream = new QueryStream('SELECT * FROM generate_series(0, 201) num', [], { highWaterMark: 100, batchSize: 50 })
  stream.on('end', function () {
    // console.log('stream end')
    return cb()
  })
  client.query(stream)
  // stream.pipe(mapper).pipe(concat(function (res) {
  //   cb()
  // }))

  // lib actually needs to pipe to a writable stream
  stream.pipe(mapper)
}

Any help appreciated.

@eljefedelrodeodeljefe eljefedelrodeodeljefe changed the title multiple streams and sream needs to be piped to writable multiple streams and stream needs to be piped to writable Oct 11, 2017

jnikles commented May 17, 2019

@eljefedelrodeodeljefe to my knowledge a transform stream is only an interim state in the streaming pipeline. The stream must be consumed somewhere for you to receive the end event. This could be either a piped writable or just a simple .on('data'..) listener. It's not a bug in this library
