In the third step, where events are extracted, the program stops running but reports no error #22
Comments
Hi Jiameng, I'm Xuelong; I can help you with this. As you know, "s00" in the script represents the estimated noise, and it is calculated by a command in the script.

Besides, we have a developing version, AQuA2, which should be faster and more accurate. We plan to write a paper on it this year. If you would like to use it before our publication, you could send a request to my advisor at yug@vt.edu and promise to keep it within your own lab; then I can send you the package.
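The exact command is elided above. As a generic illustration only (not AQuA's actual s00 computation), a robust noise estimate is often derived from frame-to-frame differences, which cancel the slow signal while keeping the noise. A minimal Python sketch:

```python
import statistics

def estimate_noise_std(trace):
    """Rough noise estimate for a 1-D fluorescence trace.

    Uses the median absolute frame-to-frame difference, scaled so the
    result matches the standard deviation for Gaussian noise:
    std ~ median(|diff|) / (0.6745 * sqrt(2)).
    Generic sketch only; not AQuA's s00 computation.
    """
    diffs = [abs(b - a) for a, b in zip(trace, trace[1:])]
    mad = statistics.median(diffs)
    return mad / (0.6745 * 2 ** 0.5)
```

The differencing step makes the estimate insensitive to slowly varying calcium signal, which is why estimators of this family are common for fluorescence data.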
Hello Xuelong. I have followed your suggestion and modified the definition of s00 on line 17 of spgtw.m. I tried to search for the function 'graphCutMex' in the AQuA folder, but only found an annotated m-file (AQuA/master/tools/+aoIBFS/graphCutMex.m). Also, thank you for inviting us to use AQuA2. I am discussing the contents of the official email with my advisor.
I guess the root cause is still the denoising of the video: AQuA itself estimates the noise again and uses it to detect signals. When the estimated noise is very small, AQuA may identify very large signals, and many steps become very time-consuming. Besides, "GraphCutMex" is a compiled C++ function for solving the min-cut problem; the stagnation in step 3 is caused by an input graph that is too large. There are several ways to solve it:
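For context on why an over-large graph stalls this step: by the max-flow/min-cut theorem, the min cut can be found via max flow, and GraphCutMex is a compiled C++ solver for exactly this. A deliberately naive Edmonds-Karp sketch in Python makes the scaling visible; its O(V·E²) behaviour is why runtime blows up with graph size. The function name `max_flow` and the dict-of-dicts graph format here are my own, for illustration only:

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp max flow; the returned value equals the min-cut
    capacity.  capacity[u][v] is the capacity of edge u->v.
    Illustration only; GraphCutMex is a far faster compiled solver."""
    # Build the residual graph, adding zero-capacity reverse edges.
    residual = {u: dict(vs) for u, vs in capacity.items()}
    for u, vs in capacity.items():
        for v in vs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow  # no augmenting path left
        # Find the bottleneck capacity along the path.
        bottleneck = float('inf')
        v = sink
        while parent[v] is not None:
            bottleneck = min(bottleneck, residual[parent[v]][v])
            v = parent[v]
        # Push the bottleneck flow, updating residual capacities.
        v = sink
        while parent[v] is not None:
            u = parent[v]
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
            v = u
        flow += bottleneck
```

In AQuA's step 3 the graph is built from super-voxels, so anything that shrinks the graph (fewer, larger super-pixels; shorter or downsampled videos) directly shortens this solve.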
Thank you very much for your detailed guidance. I will try to modify the code according to your suggestions; I hope this isn't too much trouble. I am very interested in AQuA2, but my advisor and I are considering how to formally commit to "keeping it within our own lab". Do you have any suggestions?
Just send an email to my advisor at yug@vt.edu saying "I promise not to share AQuA2 with other people." The major concern is that AQuA2 is not published yet.
Hello Xuelong. I spent one day testing, mainly on Way 3. Unfortunately, adjusting those parameters did not resolve the problem of the GraphCutMex function being very slow at certain times. Also, as expected, the run took more time in the third step, "event". I only tried tuning the initial values of some variables in the code (like spSz = 200) three times, and it took an insane amount of time.

For Way 1, I can be sure that AQuA runs smoothly if I don't use DeepCAD to denoise the images, but I'm still stuck on this point. Here is my reasoning: when I used AQuA to analyze raw footage without denoising, I got 284k seeds and 23k events (after "clean", "Merge", and "Recon") with default parameters. However, when I analyzed the denoised video, adjusting only the threshold (other parameters unchanged), the seeds reached 290k but the number of events was only 1.1k. This gives me reason to believe that denoising is meaningful for AQuA's analysis results.

For Way 2, I downsampled the video from 40 frames per second to 5, and also reduced 32-bit to 8-bit, but this did not fundamentally solve the problem of AQuA's low efficiency.

This is really frustrating. I think I'll have to apply the not-yet-published AQuA2 to analyze calcium events in astrocytes.
You can send an email to my advisor, and then I can send you the developing AQuA2. By the way, how large is your dataset? And if you do not need propagation information, maybe we can adjust the GitHub code to skip step 3.
Each of my videos is 10 minutes long, with 2-4 cells in the field of view. I won't skip step 3, because it contains important communication information that might be the focus of my analysis.
Would it be convenient for you to send me one small crop of the data (like a recording of 1 minute, or half a minute) so that I can check the code and try to adjust the parameters?
Thank you for your support. These materials are not yet public, and I need to ask my supervisor, which may take some time. Please understand; I'm grateful for your help.
Sure. |
Hello Xuelong. A question came to me this morning. For example, for the high-SNR video "ExVivoSuppRaw" provided on https://github.com/yu-lab-vt/AQuA (download: https://drive.google.com/file/d/13tNSFQ1BFV__42TY0lZbHd1VYTRfNyfD/view), what method was used to preprocess the image? Perhaps the problem I encountered is actually very simple: the preprocessing method I chose is not suitable for AQuA.
For this data, we don't do any preprocessing and use AQuA directly. For different data, we may adjust parameters in the GUI.
I may not have been clear about what I meant. Does this picture show the original image? Is it really this clean without any noise removal or motion correction?
Yes, it is raw data |
I also encountered the same situation. It has now been running for 4 hours in the third step, and the final output of the command line is shown in the figure. Was this problem solved in the end? Could you guide me on how to deal with it? Thank you very much. @xujiameng @XuelongMi
I need more information to locate the issue you met. Could you send me example data that reproduces this issue? Or you can just try the method Jiameng used, that is:
I have already taken this measure; the situation I am describing is the result after modifying the code.
It is so strange. |
In addition, my data is also the result of DeepCAD denoising. Could this be the source of the problem? Is it possible to modify the code to accommodate denoised data? The signal-to-noise ratio of my original images is so low that conventional filtering methods are ineffective, so I consider DeepCAD denoising a necessary step.
It is possible, but right now I cannot locate what happened in the code. If it is convenient, could you send me example data (a small crop is enough, as long as the issue is reproduced)?
I have modified the code as you said and re-run it. The program is still running; this is the current output, and it hasn't changed for about 10 minutes. Can you locate the problem? If you need data, how can I send it to you? May I know your email? The output ends with `Reading data`.
Example data would be helpful. I promise this data will only be used to fix this issue. My email is mixl18@vt.edu; you can send it through Google Drive.
By the way, I generally know where the problem is: lines 122 and 123.
"I strongly discourage importing videos processed by DeepCAD into AQuA. Although from a human perspective the information contained in the video increases after preprocessing, in our tests any preprocessing method affects the video information. We tested six preprocessing methods: DeepCAD, NoRMCorre, DeepCAD+NoRMCorre, median filtering, Image Stabilizer (in ImageJ), and median filtering+Image Stabilizer, and evaluated the information changes before and after preprocessing using two indicators: peak signal-to-noise ratio (PSNR) and structural similarity (SSIM).

PSNR measures image quality as the ratio, expressed in decibels (dB), between the peak signal energy and the mean square error between the original and the processed image. The higher the PSNR, the better: above 30 dB usually indicates no obvious distortion, and 40 dB can be considered visually lossless.

SSIM is another image-quality indicator, designed around the human visual system's (HVS) sensitivity to structural information. SSIM assumes that image quality mainly depends on the similarity of brightness, contrast, and structure. It usually ranges between 0 and 1; the closer to 1, the more similar the two images. Generally, an SSIM above 0.75 indicates high quality/similarity.

Our results show that both high-complexity preprocessing (such as DeepCAD+NoRMCorre) and low-complexity preprocessing (such as median filtering+Image Stabilizer) change the valid information contained in the original video. To take an imperfect analogy: if we filter a normal electrocardiogram signal, the area under its curve and other derived parameters will usually change. But for astrocytes we don't know whether this information change is what we want (even though, from a human perspective, the video becomes clearer and the subcellular structure more accurate). In the paper "Accurate quantification of astrocyte and neurotransmitter fluorescence dynamics for single-cell and population-level physiology", no preprocessing other than Gaussian filtering was performed on the video. You can also see similar preprocessing in other related papers. I hope this helps you."
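The PSNR definition quoted above can be sketched in a few lines. This is illustrative Python, not code from the tests described; images are taken as flat pixel lists and the `peak` default of 255 is an assumption (SSIM is omitted because it needs windowed local statistics):

```python
import math

def psnr(original, processed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-size images,
    given as flat lists of pixel values.  Higher is better; identical
    images give infinite PSNR."""
    mse = sum((a - b) ** 2 for a, b in zip(original, processed)) / len(original)
    if mse == 0:
        return float('inf')  # no difference at all
    return 10.0 * math.log10(peak ** 2 / mse)
```

Comparing the raw video against its preprocessed version with a metric like this is how a pipeline step's information change can be quantified, as the comment above describes.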
@XuelongMi "Also, when using AQuA to analyze calcium signals in astrocytes, some of the calculated parameters sometimes contain NaN or Inf (such as Landmark event_away_from_landmark_landmark, Propagation offset_one_direction_ratio_Anterior, etc.). How should I deal with these abnormal values? Should I use mean interpolation, or simply not analyze the calcium signals that contain them?"
Please let me check it tomorrow. |
@xujiameng Hi Jiameng, for "Propagation offset_one_direction_ratio_Anterior", the value is calculated as "propagation offset in one direction" / sum("propagation offsets in all directions"). So if the propagation offsets in all directions are 0, the ratio will be NaN. You can just set those ratios to 0.
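The rule above (return 0 for every direction when all directional offsets are 0, instead of the undefined 0/0) can be sketched as follows. This is illustrative Python, not AQuA code, and the function name `direction_ratios` is my own:

```python
def direction_ratios(offsets):
    """Per-direction propagation ratio: offset / sum(offsets).

    When an event does not propagate at all (every offset is 0) the
    ratio is undefined (0/0), so return 0.0 for every direction
    instead of NaN, as suggested in the comment above."""
    total = sum(offsets)
    if total == 0:
        return [0.0] * len(offsets)
    return [o / total for o in offsets]
```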
Thank you for your reply. In the analysis of 40 astrocytes, AQuA detected a total of 6.5k calcium events. The missing rate of the "onset/offset_one_direction-ratio" parameters is around 10%, that of the "event_toward/away_landmark" parameters around 0.005%, and that of the "event_towardlandmark_beforereaching" / "event_away_from_landmark_after_reaching" parameters around 0.006%. The former accounts for a higher proportion, so fixed-value interpolation better preserves the original distribution of the data; the latter cases are rare, so it is better to ignore those samples. Thank you very much for your suggestion.
Hello AQuA developers, thank you very much for developing such a convenient calcium-event analysis tool. I am a PhD student in biomedical engineering, focusing on the analysis of calcium events in astrocytes.
I have recently been using AQuA to analyze calcium events, but I have encountered some problems. First, my environment: for stability, I am using MATLAB 2018a (versions above 2020 have problems when drawing ROIs) on Windows 10 (MATLAB on Linux cannot run AQuA), with 128 GB of memory. Before analyzing the astrocyte TIFF files, I use the DeepCAD tool for image denoising and then NoRMCorre for motion correction.
In my experience, AQuA completes a full task in about 4 hours of normal operation on my computer. However, the run often stops at step 3, where events are extracted: no error is reported, and the program does not continue. At the same time, the memory is not released.
When the prompt "The maximum variable value allowed by the program was exceeded." appeared, I modified the code in spgtw.m, inserting the following commands at line 40 to keep the program running, because thrMax occasionally becomes Inf, which causes the command "thrVec = 0:thrMax;" to fail.
if isinf(thrMax)
    % thrMax is Inf because the noise estimate s00 is (near) zero:
    % fill missing dF values, bump the noise floor, and derive thrMax
    % from the 99.9th percentile of dF instead.
    dFip = fillmissing(dFip,'nearest');
    s00 = s00 + 1;
    thrMax = ceil(quantile(dFip(:),0.999)/s00);
    fprintf('thrMax = %d\n',thrMax)
end
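To make the logic of that MATLAB fallback explicit, here is a Python mirror of it. This is a sketch, not repository code, and it uses a simple nearest-rank percentile where MATLAB's `quantile` interpolates, so results can differ slightly:

```python
import math

def fallback_thr_max(dF_values, s00):
    """When thrMax comes out as Inf (noise estimate s00 near zero),
    bump the noise floor and derive thrMax from the 99.9th-percentile
    dF instead.  Illustrative mirror of the MATLAB fix above."""
    s00 = s00 + 1  # avoid dividing by a near-zero noise estimate
    vals = sorted(dF_values)
    # simple nearest-rank 99.9th percentile
    idx = min(len(vals) - 1, round(0.999 * (len(vals) - 1)))
    return math.ceil(vals[idx] / s00)
```

Capping the threshold count this way keeps the later per-threshold loops finite even when the noise estimate collapses on heavily denoised data.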
But when I don't modify s00 (that is, when snrThr is Inf), the program has the same problem I mentioned: AQuA neither continues to run nor reports an error.
I can't locate what is wrong now; please help!