
[FEATURE] Add USDT Volume to the alert (Optional) #10

Closed
Majenunez opened this issue Jul 8, 2021 · 18 comments

Comments

@Majenunez

Thank you very much for this excellent work.

Would it be possible to add the USDT volume currently being traded to the alert? It would be very useful, since it is an important parameter for deciding whether or not to enter the market. It could be optional.

Regards

@brianleect
Owner

I took a look at the REST API and it is possible to get the volume, but a different endpoint will have to be called.


Based on Binance's stated rate limit of 1200 request weight per minute: if the 24h ticker is called every 1s, we end up with a weight of 2400 per minute, which is over the limit. Aiming for the minimum interval for getting the information, it can theoretically be called every 2s.

I'll have to restructure the program to cater to this endpoint for volume data but it should be quite doable.

@brianleect
Owner


I have updated a demo version which should fit your requirements, in theory. I have yet to test it, but it should work. It compares intervals by determining the % change in volume from the previous interval and sends that information along with the normal updates. You can access the test script in 'TEST_VOL_UPDATE'.

However, after implementing the code I realized that Binance's use of a rolling window for 24h volume might make the comparison slightly flawed, since volume changes from 24h ago affect the numbers used in the calculation. A large % change in volume for an interval might simply be due to especially low trading volume 24h before the time of comparison.

Do give the new code a try and let me know if you face any issues and whether the values obtained are accurate. A possible alternative would be to use WebSockets, but I'm sadly not very familiar with how to properly extract the required information from them.

I'll try running the code perhaps tomorrow as well to see if it works.
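For reference, a minimal sketch of the interval comparison described above, given a dict of {symbol: 24h quoteVolume} pulled from the 24h ticker endpoint on each extraction interval. The names are illustrative, not the actual TEST_VOL_UPDATE code:

# Sketch only: the real script's data structures may differ.
prev_volumes = {}

def volume_change_pct(current_volumes):
    """% change in 24h quote volume versus the previous extraction interval."""
    changes = {}
    for symbol, vol in current_volumes.items():
        prev = prev_volumes.get(symbol)
        if prev:  # skip the first interval and zero values
            changes[symbol] = round((vol - prev) / prev * 100, 2)
        prev_volumes[symbol] = vol
    return changes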

@Majenunez
Author


Okay, that's good news. I will test it and let you know.

I mean, in theory it should not exceed Binance's declared requests-per-minute limit then.

@Majenunez
Author


I got this error:

Traceback (most recent call last):
File ".\price_vol_tracker.py", line 137, in
getPercentageChange(asset)
File ".\price_vol_tracker.py", line 91, in getPercentageChange
change = round((asset_dict['price'][-1] - asset_dict['price'][-1-int(data_points/EXTRACT_INTERVAL)]) / asset_dict['price'][-1],5)
ZeroDivisionError: float division by zero

For these parameters:

intervals = ['5s', '15s','30s','1m'] # Try to ensure extract interval does not clash
outlier_param = {'5s':0.01,'15s':0.02,'30s':0.03,'1m':0.05}

@velajos

velajos commented Jul 10, 2021


Hi Brian, I hope you are well.

I don't understand the Binance API limit issue.

On my VPS server I have some scripts running that use the Binance Spot and Futures APIs and I have never had any problems with them.

I wonder if I could run into problems using your code, since you mention that your program is almost at the limit. If so, how could I reduce the number of requests with some kind of delay? It doesn't matter if I miss some pumps or dumps; it's more important for me to avoid having my IP restricted because of too many requests.

Or, how could the requests be optimized so that a restriction by Binance is avoided as much as possible?

Thanks in advance for the clarification.

@brianleect
Owner


Thanks for testing the code. I tried testing it myself as well, and it appears the error is caused by the BCCUSDT pair's lastPrice being returned as 0.


On a quick search, it appears that BCC actually stood for Bitconnect, which isn't even offered for trading on Binance.

It appears the pair randomly pops up and is not consistently received as part of the output. When it did work, however, it turned out that not only is the delisted BCC pair part of the output, there are quite a few other pairs as well.

Delisted: BCCBTC
Added symbol: BTCUSDT
Added symbol: ETHUSDT
Delisted: HSRBTC
Delisted: OAXETH
Delisted: DNTETH
Delisted: MCOETH
Delisted: ICNETH

^ Part of the output when testing the filtering code; there are apparently over 20-30 delisted pairs that come with the output.

I have modified the code to skip pairs detected to have a price of $0 as a solution for now. It seems like the simplest implementation to me. Do let me know if you feel there is a better way of approaching this issue.
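For illustration, a hedged sketch of that zero-price guard (the repo's actual loop and variable names may differ):

import requests

# /api/v3/ticker/price returns a list of {"symbol": ..., "price": ...}
tickers = requests.get("https://api.binance.com/api/v3/ticker/price").json()

for ticker in tickers:
    if float(ticker["price"]) == 0:  # delisted pairs such as BCCUSDT can report 0
        continue                     # skipping them avoids the division by zero above
    # ... track ticker["price"] for ticker["symbol"] as usual ...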

@brianleect
Owner

brianleect commented Jul 10, 2021

Also @Majenunez, on the topic of viewing the % change in volume: I have implemented it in the latest version, which you should be able to download from the TEST_VOL folder, same as the previous update.


I have tested it with the default parameters and it appears to work thus far. The values being output have an especially large range, from -640k to +30k. A possible explanation is that there was high activity 24h prior, resulting in a relatively large drop in volume (-640k), or little to no activity prior, resulting in a large increase (+30k).

So while it does work, I'm a little unsure of the value it provides in assessing trader movements, due to the limited scope for comparison. Do let me know your thoughts on this matter.

Cheers!

@brianleect
Owner

brianleect commented Jul 10, 2021


The original version, which only looks at price, is far from hitting the rate limit as far as I'm aware. It currently sends 1 request per second with a weight of 2 per request, so on a per-minute basis it only uses 120 weight, which is 10% of the maximum weight/min indicated in the Binance API docs (1200 weight).

However, if you are looking to use the TEST_VOL version of the program, each request has a weight of 40, so at a rate of 1 request per 2s it hits the maximum of 1200 weight per minute, which might pose a problem for your current setup. I created a separate parameter labelled 'EXTRACT_INTERVAL' with a default of '5s', which works out to 12 requests/min = 480 weight/min, around 40% of the allowed weight. It should not be a problem to run it concurrently with your other scripts.

At this point I do not see much utility in the TEST_VOL version until a more suitable point of reference can be determined that would more accurately and consistently reflect trading movements.

You can refer to https://binance-docs.github.io/apidocs/spot/en/#market-data-endpoints for more information on the weights of each endpoint if you are interested.
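As a rough illustration of the weight arithmetic above (figures taken from the numbers quoted in this thread and the Binance docs):

WEIGHT_LIMIT_PER_MIN = 1200

def weight_per_minute(request_weight, interval_s):
    # Weight consumed per minute when polling every interval_s seconds.
    return request_weight * (60 / interval_s)

print(weight_per_minute(2, 1))   # original price tracker: 120 weight/min (~10%)
print(weight_per_minute(40, 2))  # TEST_VOL polled every 2s: 1200 weight/min (at the cap)
print(weight_per_minute(40, 5))  # TEST_VOL default '5s': 480 weight/min (~40%)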

@velajos

velajos commented Jul 10, 2021


I am not really interested in TEST_VOL, although it is very interesting and I could use it in the future.

I have a doubt: is the Futures API considered independent of the Spot API?

I mean, if you run one script for the Spot market and another for the Futures market, would the number of requests be added together, or would they be independent?

@brianleect
Owner


Based on what is indicated in the Binance API FAQ, the request counts are added together as long as you are running the scripts under the same IP address. But given the original version's current usage of 120 weight/minute, I believe it should not pose an issue. To answer your question: if you were to run two instances of the script concurrently, one for spot and one for futures, it should incur 240 weight/minute, 20% of the maximum allowed 1200.

If you are looking at the TEST_VOL version, I have added an EXTRACT_INTERVAL parameter for it as well, so you could theoretically set it to less frequent intervals such as '15s' or even '1m' based on your requirements. However, I believe this would mean missing short-term pumps under '1m', while still capturing pumps on the larger timeframes specified.

@velajos

velajos commented Jul 11, 2021


OK, I understand the API issue.

I understand that the default extract interval in the original code is 1s.

Could the EXTRACT_INTERVAL parameter you added to the TEST_VOL version also be added to the original version?

In my case I am only interested in detections from 1m upwards, so this way I could optimize the number of requests.

With this value of 1m in my case, what would be the ideal extraction interval?

@brianleect
Owner

brianleect commented Jul 12, 2021


I have added the ability to customize the extraction interval based on your request. I have tested it with a '15s' interval and so far there doesn't seem to be any issue.


For your case, where only '1m' detections are of interest, the ideal extraction interval would be '1m', with outlier intervals that are multiples of '1m'. If you wish to adjust to any other timeframe, my advice is simply to use the smallest interval as the base for EXTRACT_INTERVAL.

Do let me know if you face any problems with the updated version. I also fixed a bug where it indicated a PUMP when the change was actually negative.
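As an illustration of that advice, a possible configuration for the '1m'-and-up case (the outlier thresholds here are placeholder values, not recommendations):

intervals = ['1m', '3m', '5m', '15m']  # all multiples of the smallest interval
outlier_param = {'1m':0.015, '3m':0.02, '5m':0.03, '15m':0.03}
EXTRACT_INTERVAL = '1m'  # use the smallest monitored interval as the extraction base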

@velajos

velajos commented Jul 12, 2021


Thanks Brian. I am currently testing, and eventually it gets stuck on the following message:

"DUMP: TCTUSDT / Change: -1.16 % / Price: 0.023277 Interval: 1m
Time taken to extract and append: 6.69248366355896
Extracting after 15 s"

I then have to interrupt the process and restart it to get it running again. I think it may be due to failures in my internet connection, but the code should probably try to reconnect after some specific period of time to avoid getting stuck at that point. Update: this situation occurred three times during the day.

Another bug occurs when too many notifications accumulate for the Telegram message and I get the following error:

"DUMP: VTHOUSDT / Change: -1.09 % / Price: 0.005849 Interval: 1m
DUMP: STORJUSDT / Change: -1.15 % / Price: 0.9025 Interval: 1m
DUMP: MANAUSDT / Change: -1.03 % / Price: 0.6822 Interval: 1m
Telegram bot error
Telegram bot error
Telegram bot error
Telegram bot error
Telegram bot error
DUMP: BONDUSDT / Change: -2.26 % / Price: 28.76 Interval: 3m
DUMP: VTHOUSDT / Change: -1.09 % / Price: 0.005849 Interval: 1m"

I don't know if configuring a minimum pause between each notification can fix it.
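A minimal sketch of that minimum-pause idea, assuming a hypothetical send_telegram(msg) helper in place of the repo's actual sender:

import time

MIN_SEND_GAP = 1.0  # seconds between Telegram messages, to space out bursts of alerts
_last_sent = 0.0

def send_telegram(msg):
    print(msg)  # placeholder; the real Telegram bot call would go here

def send_with_pause(msg):
    global _last_sent
    wait = MIN_SEND_GAP - (time.time() - _last_sent)
    if wait > 0:
        time.sleep(wait)
    send_telegram(msg)
    _last_sent = time.time()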

@velajos

velajos commented Jul 12, 2021

A few minutes ago I received the following error:

Error: HTTPSConnectionPool(host='api.binance.com', port=443):

"PUMP: ACMUSDT / Change: 3.28 /% Price: 6.765 Interval: 10m
PUMP: SLPUSDT / Change: 4.28 /% Price: 0.2918 Interval: 30m
Time taken to extract and append: 5.10305118560791
Extracting after 60 s
Time taken to extract and append: 2.422328472137451
Extracting after 60 s
Time taken to extract and append: 6.2276291847229
Extracting after 60 s
Time taken to extract and append: 2.349924325942993
Extracting after 60 s
Error: HTTPSConnectionPool(host='api.binance.com', port=443): Max retries exceeded with url: /api/v3/ticker/price (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x000002100F3D63C8>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed'))
Retrying in 1s
Time taken to extract and append: 15.77428674697876
Extracting after 60 s
Time taken to extract and append: 6.378751993179321
Extracting after 60 s"

This is the setup I am using:

intervals = ['1m', '3m', '5m', '10m', '15m', '30m']
outlier_param = {'1m':0.015,'3m':0.02,'5m':0.03,'10m':0.03,'15m':0.03,'30m':0.04}
pairs_of_interest = ['USDT'] # Other options include 'BTC' , 'ETH'
watchlist = [] # E.g. ['ADAUSDT', 'ETHUSDT'] # Note that if watchlist has pairs, ONLY pairs in watchlist will be monitored

FUTURE_ENABLED = False # Determine whether to look at future markets
DUMP_ENABLED = True # Determine whether to look at DUMP
MIN_ALERT_INTERVAL = '90s' # Minimum interval between alerts for SAME pair
RESET_INTERVAL = '3h' # Interval for clearing array to prevent MEM ERROR
PRINT_DEBUG = True # If false we do not print messages
EXTRACT_INTERVAL = '60s' # Minimum interval is 2s but recommend 5s and above due to 10054 errors

Update: looking at the operation log I noticed that this error has occurred twice during the day.
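A sketch of the reconnect behaviour suggested earlier, wrapping the ticker request in a retry loop with capped exponential backoff (assumes the requests library; not the repo's actual retry code):

import time
import requests

def fetch_with_retry(url, max_wait=60):
    wait = 1
    while True:
        try:
            return requests.get(url, timeout=10).json()
        except requests.exceptions.RequestException as e:
            print(f"Error: {e}\nRetrying in {wait}s")
            time.sleep(wait)
            wait = min(wait * 2, max_wait)  # back off up to max_wait seconds

prices = fetch_with_retry("https://api.binance.com/api/v3/ticker/price")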

@brianleect
Owner

@Majenunez, any updates on how the new version is going for you? I'll be closing this thread in a few days if everything's good.

@Majenunez
Author


Currently everything is working properly. Thank you very much

@brianleect
Owner

Thanks for the update! Closing the thread.

@velajos

velajos commented Aug 3, 2021

Hi @brianleect, could you add the USDT volume that Binance always shows, which is independent of the candles? I think it is the volume for the last 24 hours.

Not the percentage change in volume; I mean just the amount of USDT being traded at the moment.

For example:

"DUMP: TCTUSDT / Change: -1.16 % / Price: 0.023277 / Volume (USDT): 14.16M / Interval: 5m"

"PUMP: BTCUSDT / Change: 3.25 % / Price: 38536 / Volume (USDT): 20.14B / Interval: 30m"
