
Question of dtw #1

Closed

skyWalker1997 opened this issue Nov 20, 2018 · 15 comments

Comments
@skyWalker1997

skyWalker1997 commented Nov 20, 2018

I installed all dependencies and tried to run this program, but in util/distance.py the import

from distances.dtw.dtw import dynamic_time_warping as dtw

does not work: it simply cannot find distances.dtw.dtw. How can I fix it?
I guessed that distances/dtw/setup.py might help, but it fails too:

```
dtw.c:623:31: fatal error: numpy/arrayobject.h: No such file or directory
 #include "numpy/arrayobject.h"
                               ^
```

I am using Python 3.6.7 on CentOS 7.2 with GCC 4.8.5.
Thanks a lot!

@hfawaz
Owner

hfawaz commented Nov 20, 2018

Hello,

I think you need to run the shell script utils/build-cython.sh

Just run the script using ./utils/build-cython.sh in order to generate the C files.

Let me know if this helps.

Thank you for your interest in the work and good luck !!
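In case the Cython build keeps failing, the same recurrence can be sanity-checked with a plain-Python fallback. This is a generic textbook DTW (squared-Euclidean point cost, full warping window), not the repository's `dynamic_time_warping` function:

```python
# Minimal pure-Python dynamic time warping (textbook dynamic-programming
# recurrence), usable as a slow stand-in while the Cython extension is broken.
def dtw_distance(a, b):
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = best accumulated cost aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]
```

A quick check: two identical series have distance 0, and so do two series that differ only by repeated (time-stretched) points, which is exactly what DTW's elastic alignment buys over Euclidean distance.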

@skyWalker1997
Author


It works terrifically! I just used yum to install some dependencies and everything now works.
BTW, you used DTW, and there is a knn.py in your code. Did you change the distance measure in KNN to DTW? I have not seen how you use KNN in the paper. And why KNN rather than SVM (SVC)?

@hfawaz
Owner

hfawaz commented Nov 22, 2018

Great, happy to help.

As for KNN, we used only the get_neighbors method from knn.py, in order to get the nearest neighbors from the DTW inter-dataset similarity matrix.

Section V.C in the paper explains how we used the nearest neighbor to extract the similar datasets from the matrix.

Hope this clears things up !

Do not hesitate if you have more questions.
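To make the lookup described above concrete, here is an illustrative sketch of what a get_neighbors-style call over a precomputed inter-dataset DTW distance matrix does. The dataset names and distances below are made up for the example, and this is a generic reimplementation, not the repository's knn.py:

```python
# Illustrative nearest-neighbor lookup over a precomputed distance matrix:
# given a target dataset, return the k datasets with the smallest DTW
# distance to it (most similar first). Names/values here are invented.
def get_neighbors(dist_matrix, dataset_names, target, k):
    i = dataset_names.index(target)
    # Pair every other dataset with its distance to the target; skip self.
    pairs = [(dist_matrix[i][j], name)
             for j, name in enumerate(dataset_names) if j != i]
    pairs.sort()  # ascending distance = most similar first
    return [name for _, name in pairs[:k]]

# Toy symmetric inter-dataset distance matrix (hypothetical values).
names = ["Coffee", "Beef", "Car", "ECG200"]
dist = [
    [0.0, 2.1, 0.7, 3.0],
    [2.1, 0.0, 1.5, 0.9],
    [0.7, 1.5, 0.0, 2.2],
    [3.0, 0.9, 2.2, 0.0],
]
```

With this toy matrix, the two most similar datasets to "Coffee" would be "Car" (0.7) and "Beef" (2.1).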

@sunshinesun555

'results-bake-off-train-test-split-ucr.csv' does not exist?

@hfawaz
Owner

hfawaz commented Dec 11, 2018

Yes, you do not need that file; just set add_bake_off=False on line 754.

Let me know if this solves the issue.

@SophieZhou

Thanks very much, the work is very good. As I saw in your paper, the data was normalized first, but I do not see any corresponding code in the project.

@hfawaz
Owner

hfawaz commented Feb 11, 2019

Thank you for your interest in our work.
Yes, in fact it was normalized by the dataset providers.
However, you can use this code to normalize each sequence (time series) on its own.
Let me know if you need more help.
Regards,
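For readers landing here, per-sequence z-normalization of the kind mentioned above can be sketched as follows. This is a generic snippet, not necessarily the exact code being referred to:

```python
import math

# Per-sequence z-normalization: rescale each time series independently to
# zero mean and unit variance, a common preprocessing step for UCR data.
def z_normalize(series):
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    std = math.sqrt(var)
    if std == 0:            # constant series: return zeros rather than divide by 0
        return [0.0] * n
    return [(x - mean) / std for x in series]
```

Each sequence is normalized on its own, so series recorded at different offsets or scales become comparable before computing DTW distances.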

@SophieZhou


You mean the data was normalized by the dataset providers before we downloaded it?

I normalized each sequence on its own, but the model performed poorly, with low accuracy, especially on the validation data.

@hfawaz
Owner

hfawaz commented Feb 13, 2019

> you mean the data was normalized by the dataset providers before we download the data?

Yes

> I normalized the data each sequence alone. But the model was poor with low accuracy esp. in validation data.

Perhaps the problem is not enough data? I would be happy to help if you can share more about the problem.

@SophieZhou


Thanks very much. I have read the description of the data, and it was indeed already normalized. Thanks again.

@hfawaz
Owner

hfawaz commented Feb 14, 2019

You are welcome, do not hesitate to open an issue for any problem you encounter.
Best regards.

@hfawaz hfawaz closed this as completed Feb 14, 2019
@hfawaz hfawaz mentioned this issue Jul 22, 2019
@MarcoCode23


Hello!
Congratulations on your work!
Running that command, my output contains lines such as:

- ../anaconda3/lib/python3.7/site-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: ../bigdata18-master/distances/dtw/dtw.pyx
- dtw.c:626:10: fatal error: numpy/arrayobject.h: No such file or directory

Should I also install Python 2 to solve the problem?

@hfawaz
Owner

hfawaz commented Dec 11, 2019

Hi, thanks for your interest in the project.
I do not believe Python 2 is the solution here; if anything, I think it would generate more errors, so you should stick with Python 3.
Did you install Cython?

@MarcoCode23

MarcoCode23 commented Dec 11, 2019

Yes, I did.
First, I tried my conda environment and received that output.
Then, I tried without that environment and received the same output.

Here is the whole output:

```
rm: cannot remove '*.so': No such file or directory
rm: cannot remove '__pycache__': No such file or directory
Compiling dtw.pyx because it changed.
[1/1] Cythonizing dtw.pyx
../.local/lib/python3.6/site-packages/Cython/Compiler/Main.py:369: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: ../bigdata18-master/distances/dtw/dtw.pyx
  tree = Parsing.p_module(s, pxd, full_module_name)
running build_ext
building 'dtw' extension
creating build
creating build/temp.linux-x86_64-3.6
x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.6m -c dtw.c -o build/temp.linux-x86_64-3.6/dtw.o
dtw.c:611:10: fatal error: numpy/arrayobject.h: No such file or directory
 #include "numpy/arrayobject.h"
          ^~~~~~~~~~~~~~~~~~~~~
compilation terminated.
error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
./utils/build-cython.sh: line 12: cd: ../shapeDTWefficient: No such file or directory
rm: cannot remove 'shapeDTWefficient.c': No such file or directory
rm: cannot remove '*.so': No such file or directory
rm: cannot remove '__pycache__': No such file or directory
running build_ext
building 'dtw' extension
creating build
creating build/temp.linux-x86_64-3.6
x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.6m -c dtw.c -o build/temp.linux-x86_64-3.6/dtw.o
dtw.c:611:10: fatal error: numpy/arrayobject.h: No such file or directory
 #include "numpy/arrayobject.h"
          ^~~~~~~~~~~~~~~~~~~~~
compilation terminated.
error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
```

@hfawaz
Owner

hfawaz commented Dec 11, 2019

I am not sure how to resolve this; I think it has to do with the C compiler setup on your machine.


5 participants