missing newlines in md file #342

Closed
piccolbo opened this Issue · 6 comments

2 participants

@piccolbo

Hi, I've been using knitr for a while, and suddenly all newlines in code chunks are gone. All code chunks are externalized and tidy is off.

This is a code chunk

## @knitr input
  input.size = if(rmr.options.get('backend') == "local") 10^4 else 10^6
  data = keyval(rep(list(1), input.size),as.list(1:input.size), vectorized = TRUE)
  input = to.dfs(data)

And this is in md

input.size = if(rmr.options.get('backend') == "local") 10^4 else 10^6  data = keyval(rep(list(1), input.size),as.list(1:input.size), vectorized = TRUE)  input = to.dfs(data)
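(For context, an externalized chunk like the one above is normally pulled into the Rmd with knitr's read_chunk() plus an empty chunk of the same label. A minimal sketch of that wiring, assuming the chunk lives in a file named chunks.R, which is not stated in this issue:)

    ```{r setup, include=FALSE}
    library(knitr)
    opts_chunk$set(tidy = FALSE)  # tidy is off, as described above
    read_chunk('chunks.R')        # file containing the ## @knitr input chunk
    ```

    ```{r input}
    ```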

I have everything in git, both Rmd and md, so I am looking at diffs and scratching my head.

apc-2:docs antonio$ git diff introduction-to-vectorized-API.Rmd | head -30
apc-2:docs antonio$ git diff introduction-to-vectorized-API.md | head -30
diff --git a/rmr/pkg/docs/introduction-to-vectorized-API.md b/rmr/pkg/docs/introduction-to-vectorized-API.md
index 945b31e..8faf5bd 100644
--- a/rmr/pkg/docs/introduction-to-vectorized-API.md
+++ b/rmr/pkg/docs/introduction-to-vectorized-API.md
@@ -21,285 +21,166 @@ Because of the inefficiency of the native R serialization on small objects, even
 First let's create some input. Input size is arbitrary but it is the one used in the automated tests included in the package and to obtain the timings. Here we are assuming a `"hadoop"` backend. By setting `vectorized` to `TRUE` we instruct `keyval` to consider its arguments collections of keys and values, not individual ones. At the same time `to.dfs` automatically switches to a simplified serialization format. The nice thing is that we don't have to remember if a data set was written out with one or the other serialization, as they are compatible on the read side. For testing purposes  we need to create a data set in this format to maximize the benefits of the `vectorized` option.



-  input.size = if(rmr.options.get('backend') == "local") 10^4 else 10^6
-  data = keyval(rep(list(1), input.size),as.list(1:input.size), vectorized = TRUE)
-  input = to.dfs(data)
+  input.size = if(rmr.options.get('backend') == "local") 10^4 else 10^6  data = keyval(rep(list(1), input.size),as.list(1:input.size), vectorized = TRUE)  input = to.dfs(data)



-
-
 ### Read it back
 The simplest possible task is to read back what we just wrote out, and we know how to do that already.

-
 ```r
     from.dfs(input)
 ```

-
-
 In the next code block we switch on vectorization. That is, instead of reading in a list of key-value pairs we are going to have list of vectorized key-value pairs, that is every pair contains a list of keys and a list of values of the same length. The `vectorized` argument can be set to an integer as well for more precise control of how many keys and values should be stored in a key-value pair. With this change alone, from my limited, standalone mode testing we have achieved an almost 7X speed-up (raw timing data is in [`vectorized-API.R`](../tests/vectorized-API.R))
apc-2:docs antonio$ 
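(For reference, here is the chunk from the quoted docs assembled into one runnable, commented snippet. The final vectorized read-back call is an assumption based on the prose in the diff, not code shown in this issue:)

    # Write: vectorized = TRUE tells keyval its arguments are collections of
    # keys and values; to.dfs then uses the simplified serialization format.
    input.size = if (rmr.options.get('backend') == "local") 10^4 else 10^6
    data = keyval(rep(list(1), input.size), as.list(1:input.size), vectorized = TRUE)
    input = to.dfs(data)

    # Read it back; both serialization formats are compatible on the read side.
    from.dfs(input)

    # Per the quoted docs, vectorized can also be set on the read side (TRUE, or
    # an integer controlling how many keys/values per pair); signature assumed:
    from.dfs(input, vectorized = TRUE)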
@yihui
Owner

Can you try the latest version (0.7) on GitHub? devtools::install_github('knitr', 'yihui')

I cannot reproduce your problem with either v0.7 on GitHub or 0.6.3 on CRAN.

@piccolbo

That's what I am running, 0.7

@piccolbo

I checked in several examples; for instance, look here:
https://github.com/RevolutionAnalytics/RHadoop/tree/dev/rmr/pkg/docs
files: tutorial.{Rmd,md,html}

If you can't repro, let me know what more I should provide: logs, a function to debug, anything.

@yihui
Owner

Oh, wait, I see the problem now. I must have introduced a bug in 992cea6. I'll fix it soon. Thanks!

@yihui closed this in b18786d
@yihui referenced this issue from commit 413bf22: news about #342
@yihui
Owner

should be fixed now; thanks!

@piccolbo

It is indeed! Thanks
