
compile unable to handle subfolder #62

Closed
kennethlimcp opened this issue Jun 14, 2014 · 5 comments

@kennethlimcp
Contributor

This problem only arises with Spark-CLI, since the Web IDE never allowed sub-folders.

e:\SDcompile\SD
e:\SDcompile\Spark-CardInfo_FRAMrw.cpp
attempting to compile firmware
pushing file: e:\SDcompile\SD
pushing file: e:\SDcompile\Spark-CardInfo_FRAMrw.cpp

stream.js:94
throw er; // Unhandled stream error in pipe.
^
Error: EISDIR, read

@dmiddlecamp
Contributor

Hmm, files in subdirectories are okay, but it's probably mad because the subdir itself is getting included instead of its contents.
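For context, EISDIR is what Node throws when a directory entry is opened and read as if it were a regular file, which matches the "pushing file: e:\SDcompile\SD" line above. A minimal sketch of the kind of directory filtering that avoids it (a hypothetical helper, not the actual spark-cli source):

    // collectFiles.js -- hypothetical sketch, not spark-cli code
    var fs = require('fs');
    var path = require('path');

    // Recursively gather regular files, skipping directory entries so a
    // directory is never piped as if it were a file (the EISDIR case above).
    function collectFiles(dir) {
      var results = [];
      fs.readdirSync(dir).forEach(function (name) {
        var full = path.join(dir, name);
        var stat = fs.statSync(full);
        if (stat.isDirectory()) {
          results = results.concat(collectFiles(full)); // recurse into subfolders
        } else if (stat.isFile()) {
          results.push(full); // only real files get pushed
        }
      });
      return results;
    }

    console.log(collectFiles(process.cwd()));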

@alexanderweiss

Yup, I'm having the same issue. A workaround is to list the files in spark.include, but that "promotes" them to the project's root folder in the cloud when compiling. So if I was doing #include "mylib/stuff.h" and added mylib/stuff.h to spark.include, it wouldn't work; I'd have to write #include "stuff.h" instead. But that's probably a separate issue.
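For illustration (the names are hypothetical), the workaround looks like this: the project has mylib/stuff.h and mylib/stuff.cpp, and spark.include lists them explicitly:

    mylib/stuff.h
    mylib/stuff.cpp

Because the cloud build flattens those paths next to the main .ino, the sketch then has to say #include "stuff.h" rather than #include "mylib/stuff.h", which is exactly the "promotion" described above.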

@m-mcgowan
Contributor

I hit the same problem today, and also the issue with needing some files in an include folder. I'm wondering if the CLI should take a JSON file so that the build details can be configured (a rough sketch follows the list below):

  • root directory (assume cwd?)
  • files/directories to include in the compile
  • include directories (for mylib/stuff.h case)
  • list of libraries to bring in so code can compile against the existing suite of online libraries.
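A rough sketch of what such a file could look like (the file name, keys, and values are made up here; nothing like this exists in the CLI yet):

    {
      "root": ".",
      "include": ["application.ino", "src/**/*.cpp", "src/**/*.h"],
      "includeDirs": ["mylib"],
      "libraries": ["SD"]
    }

Each key maps to one of the bullets above: the build root, the files to send, extra include directories so "mylib/stuff.h" resolves, and online libraries to compile against.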

@dmiddlecamp dmiddlecamp modified the milestone: 0.4.4 Oct 13, 2014
@towynlin
Contributor

towynlin commented Nov 7, 2014

Just adding a 👍 because I helped a user overcome this issue today.

@dmiddlecamp
Contributor

added support for globs in 0.4.6, so to include subfolders use the following spark.include file:

**/*.h
**/*.cpp
**/*.c
**/*.ino

Glob support is still off by default to avoid confusion / breaking changes.
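For example, given a hypothetical layout like

    application.ino
    SD/SD.h
    SD/SD.cpp
    SD/utility/Sd2Card.h
    SD/utility/Sd2Card.cpp

those globs (placed in spark.include) make the CLI pick up the files inside the SD subfolder as well.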
