Metadata not in correct format - in v3.7.2 only #470

Closed
sueyacoub opened this Issue Aug 20, 2014 · 9 comments

sueyacoub commented Aug 20, 2014

Ever since this commit (IMO), we have been getting the following runtime error:

   The file #{content_filename} appears to start with a metadata section (three or five dashes at the top) but it does not seem to be in the correct format.

We use YAML headers on each page, in the following format:

   ---
   filename: <filename>
   <custom-layout-type>: <layout-name>
   ---

We haven't changed how the metadata headers are formatted. Has anything changed in how the yaml header / metadata should be formatted?

To avoid this error, we have had to pin the nanoc version in the Gemfile to 3.7.1 and uninstall version 3.7.2 of the nanoc gem from our local machines.
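
For reference, pinning the version in the Gemfile looks like this (illustrative snippet):

   # Gemfile: pin nanoc to the last known-good release
   gem 'nanoc', '3.7.1'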

Thanks in advance.

gpakosz (Member) commented Aug 20, 2014

> Has anything changed in how the yaml header / metadata should be formatted?

Yes, #463 changed the way content is separated from metadata. However, we didn't notice any regression when running the tests.

Could you please copy-paste the exact content of the offending file? Or, ideally, upload the offending file somewhere?

gpakosz added the waiting label Aug 20, 2014

gpakosz self-assigned this Aug 20, 2014

ddfreyne added the bug label Aug 20, 2014

Fjan (Contributor) commented Aug 21, 2014

It fails on any file that has \r\n line endings instead of \n line endings. I narrowed down the problem to this regex that is new in 3.7.2:

   pieces = data.split(/^(-{5}|-{3})[ \t]*\n/)

One way to fix it would be:

   pieces = data.split(/^(-{3,5})[ \t]*\r?\n/,5)

Note that the last parameter to split limits the number of pieces, so there is no need to join them back together later; joining could potentially mess up a file that has '---' somewhere in the middle.
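
For illustration, here is a minimal standalone reproduction of the behaviour described above (the file content is hypothetical; this is plain Ruby, not nanoc's actual parsing code):

   # A file saved with CR-LF (Windows) line endings:
   data = "---\r\nfilename: example\r\n---\r\nbody\r\n"

   # The 3.7.2 regex requires a bare \n right after the dashes, so nothing is split off:
   data.split(/^(-{5}|-{3})[ \t]*\n/)
   # => ["---\r\nfilename: example\r\n---\r\nbody\r\n"]

   # Allowing an optional \r before the newline restores the expected split:
   data.split(/^(-{3,5})[ \t]*\r?\n/, 5)
   # => ["", "---", "filename: example\r\n", "---", "body\r\n"]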

gpakosz (Member) commented Aug 21, 2014

@Fjan please see my comments on #471.

gpakosz removed the waiting label Aug 21, 2014

kraftkern commented Aug 27, 2014

Hi @gpakosz,

I have this error all over the place, but in files that actually have NO CONTENT, i.e. files with metadata but no content part. This worked great in previous versions and I have lots of files like that :) This is a much-needed feature I have always used.
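
For context, a metadata-only file splits into fewer pieces than one with a body, which is presumably what trips the new format check (illustrative Ruby using the 3.7.2 regex; the file content is hypothetical):

   # Metadata only, LF line endings, no content part after the closing dashes:
   data = "---\ntitle: example\n---\n"

   data.split(/^(-{5}|-{3})[ \t]*\n/)
   # => ["", "---", "title: example\n", "---"]
   # Only four pieces come back: there is no trailing content piece.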

gpakosz (Member) commented Aug 27, 2014

@kraftkern we believe we have identified the regression. Can you confirm your files are in CR-LF mode (DOS line endings)?

kraftkern commented Aug 28, 2014

Hey @gpakosz! Thanks for the quick reply. No, I can't confirm that. All files are UTF-8 with LF-only line endings (I'm working on a Mac). Are they supposed to be CR-LF?

kraftkern commented Aug 28, 2014

Here's an example (full content with all new lines):

[screenshot: index_md_ nanonick__git__master]

P.S.: I got the site working again with nanoc 3.6.1

gpakosz added a commit to gpakosz/nanoc that referenced this issue Aug 28, 2014

gpakosz added this to the 3.7.3 milestone Aug 28, 2014

gpakosz (Member) commented Aug 28, 2014

@kraftkern reproduced and fixed it in the PR. Thank you.

ddfreyne (Member) commented Aug 31, 2014

Fixed in #471 (3.7.3).

ddfreyne closed this Aug 31, 2014
