Description:
When simulating an event stream with JMeter and using it as a source in Siddhi, the application works for a short time, but RAM usage grows until execution stops.
I have tried running the code with a database, without a database, and with a partition to process the events one by one.
Affected Siddhi Version:
Latest WSO2 Stream Processor release.
OS, DB, other environment details and versions:
The virtual machine where this is set up has 8 GB of RAM and 4 CPUs.
Steps to reproduce:
This is the source stream definition:
@source(type='http',
        receiver.url='http://172.23.3.22:8007/insertSweetProduction',
        basic.auth.enabled='false',
        @map(type='json', @attributes(tipoDato='$.tipoDato', fecha='$.fecha', valor='$.valor', servicio='$.servicio')))
define stream insertSweetProduction (tipoDato string, fecha string, valor double, servicio string);
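For reference, a minimal sketch of the JSON payload that JMeter would POST to this HTTP source, assuming the flat structure implied by the root-level JSONPath mappings (`$.tipoDato`, etc.); the field values are illustrative, not taken from the real simulation:

```python
import json

# Illustrative event matching the insertSweetProduction attribute mapping.
# Field names come from the stream definition; the values are made up.
event = {
    "tipoDato": "produccion",
    "fecha": "2019-01-01 10:00:00",
    "valor": 12.5,
    "servicio": "sweetFactory",
}

# Serialize to the JSON body the HTTP source would receive.
payload = json.dumps(event)
print(payload)
```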
This is the sink stream:
@sink(type='file',
      @map(type='json', @attributes(tipoDato='$.tipoDato', fecha='$.fecha', valor='$.valor', servicio='$.servicio')),
      file.uri='/dev/null')
define stream fileSweetProduction (tipoDato string, fecha string, valor double, servicio string);
And this is the query executed to copy from one stream to another:
@info(name='query2')
from insertSweetProduction
select tipoDato,fecha,valor,servicio
insert into fileSweetProduction;
The expected result is that the WSO2 worker shows all events being processed and inserted into the sink stream. In JMeter I am simulating one user sending 6,000 events over one hour, and memory usage keeps growing until the simulation stops. With a partition the memory usage improved a lot, but the run still ended in failure. All I can think is that this is a coding problem, but I cannot find anything that would cause it.
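For context on the load: 6,000 events over one hour works out to roughly 1.7 events per second, which is modest, so steady memory growth at this rate points at accumulation over time rather than raw throughput. A hypothetical Python stand-in for the JMeter thread is sketched below; the URL and event fields mirror the source definition above, and the request is built but not actually sent here:

```python
import json
import urllib.request

TOTAL_EVENTS = 6000
DURATION_S = 3600  # one hour

# Gap needed between events to spread 6000 of them evenly over an hour.
interval_s = DURATION_S / TOTAL_EVENTS

# Hypothetical request mirroring the HTTP source definition; not sent here.
event = {
    "tipoDato": "produccion",
    "fecha": "2019-01-01 10:00:00",
    "valor": 12.5,
    "servicio": "sweetFactory",
}
req = urllib.request.Request(
    "http://172.23.3.22:8007/insertSweetProduction",
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

print(f"interval between events: {interval_s:.1f} s")
# To drive the load: loop TOTAL_EVENTS times, calling
# urllib.request.urlopen(req) then time.sleep(interval_s) each iteration.
```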