<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml" lang="" xml:lang="">
<head>
<meta charset="utf-8" />
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<title>Chapter 6 Non-translated functions with spark_apply | Using Spark from R for performance with arbitrary code</title>
<meta name="description" content="This bookdown publication attempts to provide practical insights into using the sparklyr interface to gain the benefits of Apache Spark while still retaining the ability to use R code organized in custom-built functions and packages." />
<meta name="generator" content="bookdown 0.16 and GitBook 2.6.7" />
<meta property="og:title" content="Chapter 6 Non-translated functions with spark_apply | Using Spark from R for performance with arbitrary code" />
<meta property="og:type" content="book" />
<meta property="og:description" content="This bookdown publication attempts to provide practical insights into using the sparklyr interface to gain the benefits of Apache Spark while still retaining the ability to use R code organized in custom-built functions and packages." />
<meta name="twitter:card" content="summary" />
<meta name="twitter:title" content="Chapter 6 Non-translated functions with spark_apply | Using Spark from R for performance with arbitrary code" />
<meta name="twitter:description" content="This bookdown publication attempts to provide practical insights into using the sparklyr interface to gain the benefits of Apache Spark while still retaining the ability to use R code organized in custom-built functions and packages." />
<meta name="author" content="Jozef Hajnala" />
<meta name="date" content="2020-02-20" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<meta name="apple-mobile-web-app-capable" content="yes" />
<meta name="apple-mobile-web-app-status-bar-style" content="black" />
<link rel="prev" href="communication-between-spark-and-sparklyr.html"/>
<link rel="next" href="constructing-functions-by-piping-dplyr-verbs.html"/>
<script src="libs/jquery-2.2.3/jquery.min.js"></script>
<link href="libs/gitbook-2.6.7/css/style.css" rel="stylesheet" />
<link href="libs/gitbook-2.6.7/css/plugin-table.css" rel="stylesheet" />
<link href="libs/gitbook-2.6.7/css/plugin-bookdown.css" rel="stylesheet" />
<link href="libs/gitbook-2.6.7/css/plugin-highlight.css" rel="stylesheet" />
<link href="libs/gitbook-2.6.7/css/plugin-search.css" rel="stylesheet" />
<link href="libs/gitbook-2.6.7/css/plugin-fontsettings.css" rel="stylesheet" />
<link href="libs/gitbook-2.6.7/css/plugin-clipboard.css" rel="stylesheet" />
<!-- Global site tag (gtag.js) - Google Analytics -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-1149069-22"></script>
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'UA-1149069-22');
</script>
<style type="text/css">
a.sourceLine { display: inline-block; line-height: 1.25; }
a.sourceLine { pointer-events: none; color: inherit; text-decoration: inherit; }
a.sourceLine:empty { height: 1.2em; }
.sourceCode { overflow: visible; }
code.sourceCode { white-space: pre; position: relative; }
pre.sourceCode { margin: 0; }
@media screen {
div.sourceCode { overflow: auto; }
}
@media print {
code.sourceCode { white-space: pre-wrap; }
a.sourceLine { text-indent: -1em; padding-left: 1em; }
}
pre.numberSource a.sourceLine
{ position: relative; left: -4em; }
pre.numberSource a.sourceLine::before
{ content: attr(data-line-number);
position: relative; left: -1em; text-align: right; vertical-align: baseline;
border: none; pointer-events: all; display: inline-block;
-webkit-touch-callout: none; -webkit-user-select: none;
-khtml-user-select: none; -moz-user-select: none;
-ms-user-select: none; user-select: none;
padding: 0 4px; width: 4em;
color: #aaaaaa;
}
pre.numberSource { margin-left: 3em; border-left: 1px solid #aaaaaa; padding-left: 4px; }
div.sourceCode
{ }
@media screen {
a.sourceLine::before { text-decoration: underline; }
}
code span.al { color: #ff0000; font-weight: bold; } /* Alert */
code span.an { color: #60a0b0; font-weight: bold; font-style: italic; } /* Annotation */
code span.at { color: #7d9029; } /* Attribute */
code span.bn { color: #40a070; } /* BaseN */
code span.bu { } /* BuiltIn */
code span.cf { color: #007020; font-weight: bold; } /* ControlFlow */
code span.ch { color: #4070a0; } /* Char */
code span.cn { color: #880000; } /* Constant */
code span.co { color: #60a0b0; font-style: italic; } /* Comment */
code span.cv { color: #60a0b0; font-weight: bold; font-style: italic; } /* CommentVar */
code span.do { color: #ba2121; font-style: italic; } /* Documentation */
code span.dt { color: #902000; } /* DataType */
code span.dv { color: #40a070; } /* DecVal */
code span.er { color: #ff0000; font-weight: bold; } /* Error */
code span.ex { } /* Extension */
code span.fl { color: #40a070; } /* Float */
code span.fu { color: #06287e; } /* Function */
code span.im { } /* Import */
code span.in { color: #60a0b0; font-weight: bold; font-style: italic; } /* Information */
code span.kw { color: #007020; font-weight: bold; } /* Keyword */
code span.op { color: #666666; } /* Operator */
code span.ot { color: #007020; } /* Other */
code span.pp { color: #bc7a00; } /* Preprocessor */
code span.sc { color: #4070a0; } /* SpecialChar */
code span.ss { color: #bb6688; } /* SpecialString */
code span.st { color: #4070a0; } /* String */
code span.va { color: #19177c; } /* Variable */
code span.vs { color: #4070a0; } /* VerbatimString */
code span.wa { color: #60a0b0; font-weight: bold; font-style: italic; } /* Warning */
</style>
<link rel="stylesheet" href="static/css/style.css" type="text/css" />
</head>
<body>
<div class="book without-animation with-summary font-size-2 font-family-1" data-basepath=".">
<div class="book-summary">
<nav role="navigation">
<ul class="summary">
<li><a href="./index.html">Using Spark from R for performance</a></li>
<li class="divider"></li>
<li class="chapter" data-level="1" data-path="index.html"><a href="index.html"><i class="fa fa-check"></i><b>1</b> Welcome</a><ul>
<li class="chapter" data-level="1.1" data-path="index.html"><a href="index.html#what-will-you-find-in-this-book"><i class="fa fa-check"></i><b>1.1</b> What will you find in this book</a></li>
<li class="chapter" data-level="1.2" data-path="index.html"><a href="index.html#book-sources"><i class="fa fa-check"></i><b>1.2</b> Book sources</a></li>
<li class="chapter" data-level="1.3" data-path="index.html"><a href="index.html#acknowledgments"><i class="fa fa-check"></i><b>1.3</b> Acknowledgments</a></li>
</ul></li>
<li class="chapter" data-level="2" data-path="setting-up-spark-with-r-and-sparklyr.html"><a href="setting-up-spark-with-r-and-sparklyr.html"><i class="fa fa-check"></i><b>2</b> Setting up Spark with R and sparklyr</a><ul>
<li class="chapter" data-level="2.1" data-path="setting-up-spark-with-r-and-sparklyr.html"><a href="setting-up-spark-with-r-and-sparklyr.html#interactive-manual-installation"><i class="fa fa-check"></i><b>2.1</b> Interactive manual installation</a></li>
</ul></li>
<li class="chapter" data-level="3" data-path="using-a-ready-made-docker-image.html"><a href="using-a-ready-made-docker-image.html"><i class="fa fa-check"></i><b>3</b> Using a ready-made Docker Image</a><ul>
<li class="chapter" data-level="3.1" data-path="using-a-ready-made-docker-image.html"><a href="using-a-ready-made-docker-image.html#installing-docker"><i class="fa fa-check"></i><b>3.1</b> Installing Docker</a></li>
<li class="chapter" data-level="3.2" data-path="using-a-ready-made-docker-image.html"><a href="using-a-ready-made-docker-image.html#using-the-docker-image-with-r"><i class="fa fa-check"></i><b>3.2</b> Using the Docker image with R</a><ul>
<li class="chapter" data-level="3.2.1" data-path="using-a-ready-made-docker-image.html"><a href="using-a-ready-made-docker-image.html#interactively-with-rstudio"><i class="fa fa-check"></i><b>3.2.1</b> Interactively with RStudio</a></li>
<li class="chapter" data-level="3.2.2" data-path="using-a-ready-made-docker-image.html"><a href="using-a-ready-made-docker-image.html#interactively-with-the-r-console"><i class="fa fa-check"></i><b>3.2.2</b> Interactively with the R console</a></li>
<li class="chapter" data-level="3.2.3" data-path="using-a-ready-made-docker-image.html"><a href="using-a-ready-made-docker-image.html#running-an-example-r-script"><i class="fa fa-check"></i><b>3.2.3</b> Running an example R script</a></li>
</ul></li>
<li class="chapter" data-level="3.3" data-path="using-a-ready-made-docker-image.html"><a href="using-a-ready-made-docker-image.html#interactively-with-the-spark-shell"><i class="fa fa-check"></i><b>3.3</b> Interactively with the Spark shell</a></li>
</ul></li>
<li class="chapter" data-level="4" data-path="connecting-and-using-a-local-spark-instance.html"><a href="connecting-and-using-a-local-spark-instance.html"><i class="fa fa-check"></i><b>4</b> Connecting and using a local Spark instance</a><ul>
<li class="chapter" data-level="4.1" data-path="connecting-and-using-a-local-spark-instance.html"><a href="connecting-and-using-a-local-spark-instance.html#packages-and-data"><i class="fa fa-check"></i><b>4.1</b> Packages and data</a></li>
<li class="chapter" data-level="4.2" data-path="connecting-and-using-a-local-spark-instance.html"><a href="connecting-and-using-a-local-spark-instance.html#connecting-to-spark-and-providing-it-with-data"><i class="fa fa-check"></i><b>4.2</b> Connecting to Spark and providing it with data</a></li>
<li class="chapter" data-level="4.3" data-path="connecting-and-using-a-local-spark-instance.html"><a href="connecting-and-using-a-local-spark-instance.html#first-glance-at-the-data"><i class="fa fa-check"></i><b>4.3</b> First glance at the data</a></li>
</ul></li>
<li class="chapter" data-level="5" data-path="communication-between-spark-and-sparklyr.html"><a href="communication-between-spark-and-sparklyr.html"><i class="fa fa-check"></i><b>5</b> Communication between Spark and sparklyr</a><ul>
<li class="chapter" data-level="5.1" data-path="communication-between-spark-and-sparklyr.html"><a href="communication-between-spark-and-sparklyr.html#sparklyr-as-a-spark-interface-provider"><i class="fa fa-check"></i><b>5.1</b> Sparklyr as a Spark interface provider</a></li>
<li class="chapter" data-level="5.2" data-path="communication-between-spark-and-sparklyr.html"><a href="communication-between-spark-and-sparklyr.html#an-r-function-translated-to-spark-sql"><i class="fa fa-check"></i><b>5.2</b> An R function translated to Spark SQL</a><ul>
<li class="chapter" data-level="5.2.1" data-path="communication-between-spark-and-sparklyr.html"><a href="communication-between-spark-and-sparklyr.html#how-does-spark-know-the-r-function-tolower"><i class="fa fa-check"></i><b>5.2.1</b> How does Spark know the R function <code>tolower()</code>?</a></li>
</ul></li>
<li class="chapter" data-level="5.3" data-path="communication-between-spark-and-sparklyr.html"><a href="communication-between-spark-and-sparklyr.html#an-r-function-not-translated-to-spark-sql"><i class="fa fa-check"></i><b>5.3</b> An R function not translated to Spark SQL</a></li>
<li class="chapter" data-level="5.4" data-path="communication-between-spark-and-sparklyr.html"><a href="communication-between-spark-and-sparklyr.html#a-hive-built-in-function-not-existing-in-r"><i class="fa fa-check"></i><b>5.4</b> A Hive built-in function not existing in R</a></li>
<li class="chapter" data-level="5.5" data-path="communication-between-spark-and-sparklyr.html"><a href="communication-between-spark-and-sparklyr.html#using-non-translated-functions-with-sparklyr"><i class="fa fa-check"></i><b>5.5</b> Using non-translated functions with sparklyr</a></li>
</ul></li>
<li class="chapter" data-level="6" data-path="non-translated-functions-with-spark-apply.html"><a href="non-translated-functions-with-spark-apply.html"><i class="fa fa-check"></i><b>6</b> Non-translated functions with spark_apply</a><ul>
<li class="chapter" data-level="6.1" data-path="non-translated-functions-with-spark-apply.html"><a href="non-translated-functions-with-spark-apply.html#what-is-so-important-about-this-distinction"><i class="fa fa-check"></i><b>6.1</b> What is so important about this distinction?</a></li>
<li class="chapter" data-level="6.2" data-path="non-translated-functions-with-spark-apply.html"><a href="non-translated-functions-with-spark-apply.html#what-happens-when-we-use-custom-functions-with-spark_apply"><i class="fa fa-check"></i><b>6.2</b> What happens when we use custom functions with <code>spark_apply()</code></a></li>
<li class="chapter" data-level="6.3" data-path="non-translated-functions-with-spark-apply.html"><a href="non-translated-functions-with-spark-apply.html#what-happens-when-we-use-translated-or-hive-built-in-functions"><i class="fa fa-check"></i><b>6.3</b> What happens when we use translated or Hive built-in functions</a></li>
<li class="chapter" data-level="6.4" data-path="non-translated-functions-with-spark-apply.html"><a href="non-translated-functions-with-spark-apply.html#which-r-functionality-is-currently-translated-and-built-in-to-hive"><i class="fa fa-check"></i><b>6.4</b> Which R functionality is currently translated and built-in to Hive</a></li>
<li class="chapter" data-level="6.5" data-path="non-translated-functions-with-spark-apply.html"><a href="non-translated-functions-with-spark-apply.html#making-serialization-faster-with-apache-arrow"><i class="fa fa-check"></i><b>6.5</b> Making serialization faster with Apache Arrow</a><ul>
<li class="chapter" data-level="6.5.1" data-path="non-translated-functions-with-spark-apply.html"><a href="non-translated-functions-with-spark-apply.html#what-is-apache-arrow-and-how-it-improves-performance"><i class="fa fa-check"></i><b>6.5.1</b> What is Apache Arrow and how it improves performance</a></li>
</ul></li>
<li class="chapter" data-level="6.6" data-path="non-translated-functions-with-spark-apply.html"><a href="non-translated-functions-with-spark-apply.html#conclusion-take-home-messages"><i class="fa fa-check"></i><b>6.6</b> Conclusion, take-home messages</a></li>
<li class="chapter" data-level="6.7" data-path="non-translated-functions-with-spark-apply.html"><a href="non-translated-functions-with-spark-apply.html#but-we-still-need-arbitrary-functions-to-run-fast"><i class="fa fa-check"></i><b>6.7</b> But we still need arbitrary functions to run fast</a></li>
</ul></li>
<li class="chapter" data-level="7" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html"><i class="fa fa-check"></i><b>7</b> Constructing functions by piping dplyr verbs</a><ul>
<li class="chapter" data-level="7.1" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html#r-functions-as-combinations-of-dplyr-verbs-and-spark"><i class="fa fa-check"></i><b>7.1</b> R functions as combinations of dplyr verbs and Spark</a><ul>
<li class="chapter" data-level="7.1.1" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html#trying-it-with-base-r-functions"><i class="fa fa-check"></i><b>7.1.1</b> Trying it with base R functions</a></li>
<li class="chapter" data-level="7.1.2" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html#using-a-combination-of-supported-dplyr-verbs-and-operations"><i class="fa fa-check"></i><b>7.1.2</b> Using a combination of supported dplyr verbs and operations</a></li>
<li class="chapter" data-level="7.1.3" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html#investigating-the-sql-translation-and-its-spark-plan"><i class="fa fa-check"></i><b>7.1.3</b> Investigating the SQL translation and its Spark plan</a></li>
</ul></li>
<li class="chapter" data-level="7.2" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html#a-more-complex-use-case---joins-group-bys-and-aggregations"><i class="fa fa-check"></i><b>7.2</b> A more complex use case - Joins, group bys, and aggregations</a></li>
<li class="chapter" data-level="7.3" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html#using-the-functions-with-local-versus-remote-datasets"><i class="fa fa-check"></i><b>7.3</b> Using the functions with local versus remote datasets</a><ul>
<li class="chapter" data-level="7.3.1" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html#unified-front-end-different-back-ends"><i class="fa fa-check"></i><b>7.3.1</b> Unified front-end, different back-ends</a></li>
<li class="chapter" data-level="7.3.2" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html#differences-in-na-and-nan-values"><i class="fa fa-check"></i><b>7.3.2</b> Differences in <code>NA</code> and <code>NaN</code> values</a></li>
<li class="chapter" data-level="7.3.3" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html#dates-times-and-time-zones"><i class="fa fa-check"></i><b>7.3.3</b> Dates, times and time zones</a></li>
<li class="chapter" data-level="7.3.4" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html#joins"><i class="fa fa-check"></i><b>7.3.4</b> Joins</a></li>
<li class="chapter" data-level="7.3.5" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html#portability-of-used-methods"><i class="fa fa-check"></i><b>7.3.5</b> Portability of used methods</a></li>
</ul></li>
<li class="chapter" data-level="7.4" data-path="constructing-functions-by-piping-dplyr-verbs.html"><a href="constructing-functions-by-piping-dplyr-verbs.html#conclusion-take-home-messages-1"><i class="fa fa-check"></i><b>7.4</b> Conclusion, take-home messages</a></li>
</ul></li>
<li class="chapter" data-level="8" data-path="constructing-sql-and-executing-it-with-spark.html"><a href="constructing-sql-and-executing-it-with-spark.html"><i class="fa fa-check"></i><b>8</b> Constructing SQL and executing it with Spark</a><ul>
<li class="chapter" data-level="8.1" data-path="constructing-sql-and-executing-it-with-spark.html"><a href="constructing-sql-and-executing-it-with-spark.html#r-functions-as-spark-sql-generators"><i class="fa fa-check"></i><b>8.1</b> R functions as Spark SQL generators</a></li>
<li class="chapter" data-level="8.2" data-path="constructing-sql-and-executing-it-with-spark.html"><a href="constructing-sql-and-executing-it-with-spark.html#executing-the-generated-queries-via-spark"><i class="fa fa-check"></i><b>8.2</b> Executing the generated queries via Spark</a><ul>
<li class="chapter" data-level="8.2.1" data-path="constructing-sql-and-executing-it-with-spark.html"><a href="constructing-sql-and-executing-it-with-spark.html#using-dbi-as-the-interface"><i class="fa fa-check"></i><b>8.2.1</b> Using DBI as the interface</a></li>
<li class="chapter" data-level="8.2.2" data-path="constructing-sql-and-executing-it-with-spark.html"><a href="constructing-sql-and-executing-it-with-spark.html#invoking-sql-on-a-spark-session-object"><i class="fa fa-check"></i><b>8.2.2</b> Invoking sql on a Spark session object</a></li>
<li class="chapter" data-level="8.2.3" data-path="constructing-sql-and-executing-it-with-spark.html"><a href="constructing-sql-and-executing-it-with-spark.html#using-tbl-with-dbplyrs-sql"><i class="fa fa-check"></i><b>8.2.3</b> Using tbl with dbplyr’s sql</a></li>
<li class="chapter" data-level="8.2.4" data-path="constructing-sql-and-executing-it-with-spark.html"><a href="constructing-sql-and-executing-it-with-spark.html#wrapping-the-tbl-approach-into-functions"><i class="fa fa-check"></i><b>8.2.4</b> Wrapping the tbl approach into functions</a></li>
</ul></li>
<li class="chapter" data-level="8.3" data-path="constructing-sql-and-executing-it-with-spark.html"><a href="constructing-sql-and-executing-it-with-spark.html#where-sql-can-be-better-than-dbplyr-translation"><i class="fa fa-check"></i><b>8.3</b> Where SQL can be better than dbplyr translation</a><ul>
<li class="chapter" data-level="8.3.1" data-path="constructing-sql-and-executing-it-with-spark.html"><a href="constructing-sql-and-executing-it-with-spark.html#when-a-translation-is-not-there"><i class="fa fa-check"></i><b>8.3.1</b> When a translation is not there</a></li>
<li class="chapter" data-level="8.3.2" data-path="constructing-sql-and-executing-it-with-spark.html"><a href="constructing-sql-and-executing-it-with-spark.html#when-translation-does-not-provide-expected-results"><i class="fa fa-check"></i><b>8.3.2</b> When translation does not provide expected results</a></li>
<li class="chapter" data-level="8.3.3" data-path="constructing-sql-and-executing-it-with-spark.html"><a href="constructing-sql-and-executing-it-with-spark.html#when-portability-is-important"><i class="fa fa-check"></i><b>8.3.3</b> When portability is important</a></li>
</ul></li>
</ul></li>
<li class="chapter" data-level="9" data-path="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html"><a href="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html"><i class="fa fa-check"></i><b>9</b> Using the lower-level invoke API to manipulate Spark’s Java objects from R</a><ul>
<li class="chapter" data-level="9.1" data-path="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html"><a href="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html#the-invoke-api-of-sparklyr"><i class="fa fa-check"></i><b>9.1</b> The invoke() API of sparklyr</a></li>
<li class="chapter" data-level="9.2" data-path="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html"><a href="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html#getting-started-with-the-invoke-api"><i class="fa fa-check"></i><b>9.2</b> Getting started with the invoke API</a></li>
<li class="chapter" data-level="9.3" data-path="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html"><a href="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html#grouping-and-aggregation-with-invoke-chains"><i class="fa fa-check"></i><b>9.3</b> Grouping and aggregation with invoke chains</a><ul>
<li class="chapter" data-level="9.3.1" data-path="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html"><a href="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html#what-is-all-that-extra-code"><i class="fa fa-check"></i><b>9.3.1</b> What is all that extra code?</a></li>
</ul></li>
<li class="chapter" data-level="9.4" data-path="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html"><a href="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html#wrapping-the-invocations-into-r-functions"><i class="fa fa-check"></i><b>9.4</b> Wrapping the invocations into R functions</a></li>
<li class="chapter" data-level="9.5" data-path="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html"><a href="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html#reconstructing-variable-normalization"><i class="fa fa-check"></i><b>9.5</b> Reconstructing variable normalization</a></li>
<li class="chapter" data-level="9.6" data-path="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html"><a href="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html#where-invoke-can-be-better-than-dplyr-translation-or-sql"><i class="fa fa-check"></i><b>9.6</b> Where invoke can be better than dplyr translation or SQL</a></li>
<li class="chapter" data-level="9.7" data-path="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html"><a href="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html#conclusion"><i class="fa fa-check"></i><b>9.7</b> Conclusion</a></li>
</ul></li>
<li class="chapter" data-level="10" data-path="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><a href="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><i class="fa fa-check"></i><b>10</b> Exploring the invoke API from R with Java reflection and examining invokes with logs</a><ul>
<li class="chapter" data-level="10.1" data-path="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><a href="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html#examining-available-methods-from-r"><i class="fa fa-check"></i><b>10.1</b> Examining available methods from R</a></li>
<li class="chapter" data-level="10.2" data-path="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><a href="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html#using-the-java-reflection-api-to-list-the-available-methods"><i class="fa fa-check"></i><b>10.2</b> Using the Java reflection API to list the available methods</a></li>
<li class="chapter" data-level="10.3" data-path="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><a href="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html#investigating-dataset-and-sparkcontext-class-methods"><i class="fa fa-check"></i><b>10.3</b> Investigating DataSet and SparkContext class methods</a><ul>
<li class="chapter" data-level="10.3.1" data-path="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><a href="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html#using-helpers-to-explore-the-methods"><i class="fa fa-check"></i><b>10.3.1</b> Using helpers to explore the methods</a></li>
<li class="chapter" data-level="10.3.2" data-path="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><a href="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html#unexported-helpers-provided-by-sparklyr"><i class="fa fa-check"></i><b>10.3.2</b> Unexported helpers provided by sparklyr</a></li>
</ul></li>
<li class="chapter" data-level="10.4" data-path="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><a href="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html#how-sparklyr-communicates-with-spark-invoke-logging"><i class="fa fa-check"></i><b>10.4</b> How sparklyr communicates with Spark, invoke logging</a><ul>
<li class="chapter" data-level="10.4.1" data-path="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><a href="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html#using-dplyr-verbs-translated-with-dbplyr"><i class="fa fa-check"></i><b>10.4.1</b> Using dplyr verbs translated with dbplyr</a></li>
<li class="chapter" data-level="10.4.2" data-path="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><a href="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html#using-dbi-to-send-queries"><i class="fa fa-check"></i><b>10.4.2</b> Using DBI to send queries</a></li>
<li class="chapter" data-level="10.4.3" data-path="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><a href="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html#using-the-invoke-interface"><i class="fa fa-check"></i><b>10.4.3</b> Using the invoke interface</a></li>
<li class="chapter" data-level="10.4.4" data-path="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><a href="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html#redirecting-the-invoke-logs"><i class="fa fa-check"></i><b>10.4.4</b> Redirecting the invoke logs</a></li>
</ul></li>
<li class="chapter" data-level="10.5" data-path="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html"><a href="exploring-the-invoke-api-from-r-with-java-reflection-and-examining-invokes-with-logs.html#conclusion-1"><i class="fa fa-check"></i><b>10.5</b> Conclusion</a></li>
</ul></li>
<li class="chapter" data-level="11" data-path="combining-approaches-into-lazy-datasets.html"><a href="combining-approaches-into-lazy-datasets.html"><i class="fa fa-check"></i><b>11</b> Combining approaches into lazy datasets</a></li>
<li class="chapter" data-level="12" data-path="references-and-resources.html"><a href="references-and-resources.html"><i class="fa fa-check"></i><b>12</b> References and resources</a><ul>
<li class="chapter" data-level="12.1" data-path="references-and-resources.html"><a href="references-and-resources.html#online-resources"><i class="fa fa-check"></i><b>12.1</b> Online resources</a><ul>
<li class="chapter" data-level="12.1.1" data-path="references-and-resources.html"><a href="references-and-resources.html#getting-started-with-sparklyr"><i class="fa fa-check"></i><b>12.1.1</b> Getting started with sparklyr</a></li>
<li class="chapter" data-level="12.1.2" data-path="references-and-resources.html"><a href="references-and-resources.html#dbi-spark-sql-hive"><i class="fa fa-check"></i><b>12.1.2</b> DBI, Spark SQL, Hive</a></li>
<li class="chapter" data-level="12.1.3" data-path="references-and-resources.html"><a href="references-and-resources.html#docker"><i class="fa fa-check"></i><b>12.1.3</b> Docker</a></li>
<li class="chapter" data-level="12.1.4" data-path="references-and-resources.html"><a href="references-and-resources.html#spark-api-java-scala-and-friends"><i class="fa fa-check"></i><b>12.1.4</b> Spark API, Java, Scala and friends</a></li>
<li class="chapter" data-level="12.1.5" data-path="references-and-resources.html"><a href="references-and-resources.html#apache-arrow"><i class="fa fa-check"></i><b>12.1.5</b> Apache Arrow</a></li>
</ul></li>
<li class="chapter" data-level="12.2" data-path="references-and-resources.html"><a href="references-and-resources.html#physical-books"><i class="fa fa-check"></i><b>12.2</b> Physical Books</a></li>
</ul></li>
<li class="chapter" data-level="13" data-path="appendices.html"><a href="appendices.html"><i class="fa fa-check"></i><b>13</b> Appendices</a><ul>
<li class="chapter" data-level="13.1" data-path="appendices.html"><a href="appendices.html#r-session-information"><i class="fa fa-check"></i><b>13.1</b> R session information</a></li>
<li class="chapter" data-level="13.2" data-path="appendices.html"><a href="appendices.html#setup-of-apache-arrow"><i class="fa fa-check"></i><b>13.2</b> Setup of Apache Arrow</a></li>
</ul></li>
<li class="divider"></li>
<li><a href="https://github.com/rstudio/bookdown" target="blank">Published with bookdown</a></li>
</ul>
</nav>
</div>
<div class="book-body">
<div class="body-inner">
<div class="book-header" role="navigation">
<h1>
<i class="fa fa-circle-o-notch fa-spin"></i><a href="./">Using Spark from R for performance with arbitrary code</a>
</h1>
</div>
<div class="page-wrapper" tabindex="-1" role="main">
<div class="page-inner">
<section class="normal" id="section-">
<div id="non-translated-functions-with-spark_apply" class="section level1">
<h1><span class="header-section-number">Chapter 6</span> Non-translated functions with spark_apply</h1>
<script src="static/js/highcharts.js"></script>
<script src="static/js/highcharts-more.js"></script>
<p>In this chapter, we will look into the <code>spark_apply()</code> interface and how it communicates with the Spark instance. As an example, we will rewrite the function from the <a href="communication-between-spark-and-sparklyr.html">previous chapter</a> so that we can use the non-translated <code>casefold()</code> function with Spark.</p>
<div class="sourceCode" id="cb33"><pre class="sourceCode r"><code class="sourceCode r"><a class="sourceLine" id="cb33-1" data-line-number="1"><span class="co"># Define a custom function using `casefold`</span></a>
<a class="sourceLine" id="cb33-2" data-line-number="2">fun_r_custom <-<span class="st"> </span><span class="cf">function</span>(tbl, colName) {</a>
<a class="sourceLine" id="cb33-3" data-line-number="3"> tbl[[colName]] <-<span class="st"> </span><span class="kw">casefold</span>(tbl[[colName]], <span class="dt">upper =</span> <span class="ot">FALSE</span>)</a>
<a class="sourceLine" id="cb33-4" data-line-number="4"> tbl</a>
<a class="sourceLine" id="cb33-5" data-line-number="5">}</a></code></pre></div>
<p>Now, we can use the <code>spark_apply()</code> interface to execute our custom function on a Spark DataFrame, providing the name of the column on which we want to apply the function as the <code>context</code> argument.</p>
<div class="sourceCode" id="cb34"><pre class="sourceCode r"><code class="sourceCode r"><a class="sourceLine" id="cb34-1" data-line-number="1"><span class="co"># Execute on Spark via `spark_apply`:</span></a>
<a class="sourceLine" id="cb34-2" data-line-number="2"><span class="kw">head</span>(tbl_weather) <span class="op">%>%</span><span class="st"> </span></a>
<a class="sourceLine" id="cb34-3" data-line-number="3"><span class="st"> </span><span class="kw">spark_apply</span>(fun_r_custom, <span class="dt">context =</span> {colName <-<span class="st"> "origin"</span>})</a></code></pre></div>
<pre><code>## # Source: spark<?> [?? x 16]
## id origin year month day hour temp dewp humid wind_dir wind_speed
## <int> <chr> <int> <int> <int> <int> <dbl> <dbl> <dbl> <dbl> <dbl>
## 1 1 ewr 2013 1 1 1 39.0 26.1 59.4 270 10.4
## 2 2 ewr 2013 1 1 2 39.0 27.0 61.6 250 8.06
## 3 3 ewr 2013 1 1 3 39.0 28.0 64.4 240 11.5
## 4 4 ewr 2013 1 1 4 39.9 28.0 62.2 250 12.7
## 5 5 ewr 2013 1 1 5 39.0 28.0 64.4 260 12.7
## 6 6 ewr 2013 1 1 6 37.9 28.0 67.2 240 11.5
## # … with 5 more variables: wind_gust <dbl>, precip <dbl>, pressure <dbl>,
## # visib <dbl>, time_hour <dttm></code></pre>
<div id="what-is-so-important-about-this-distinction" class="section level2">
<h2><span class="header-section-number">6.1</span> What is so important about this distinction?</h2>
<p>We have now shown that we can also send code that was not translated by <span class="rpackage">dbplyr</span> to Spark and get it executed without issues using <code>spark_apply()</code>. So what is the catch and where does the importance of the meaning of the word <em>interface</em> come in?</p>
<p>Let us quickly examine the performance of the three operations:</p>
<ul>
<li>using a Hive built-in function directly,</li>
<li>using a function translated by <span class="rpackage">dbplyr</span>, and</li>
<li>using <code>spark_apply()</code></li>
</ul>
<div class="sourceCode" id="cb36"><pre class="sourceCode r"><code class="sourceCode r"><a class="sourceLine" id="cb36-1" data-line-number="1">microbenchmark<span class="op">::</span><span class="kw">microbenchmark</span>(</a>
<a class="sourceLine" id="cb36-2" data-line-number="2"> <span class="dt">times =</span> <span class="dv">10</span>,</a>
<a class="sourceLine" id="cb36-3" data-line-number="3"> <span class="dt">hive_builtin =</span> <span class="kw">fun_hive_builtin</span>(tbl_weather, origin) <span class="op">%>%</span><span class="st"> </span><span class="kw">collect</span>(),</a>
<a class="sourceLine" id="cb36-4" data-line-number="4"> <span class="dt">translated_dplyr =</span> <span class="kw">fun_implemented</span>(tbl_weather, origin) <span class="op">%>%</span><span class="st"> </span><span class="kw">collect</span>(),</a>
<a class="sourceLine" id="cb36-5" data-line-number="5"> <span class="dt">spark_apply =</span> <span class="kw">spark_apply</span>(tbl_weather, fun_r_custom, <span class="dt">context =</span> {colName <-<span class="st"> "origin"</span>}) <span class="op">%>%</span><span class="st"> </span><span class="kw">collect</span>()</a>
<a class="sourceLine" id="cb36-6" data-line-number="6">)</a></code></pre></div>
<script type="text/javascript">
$(function () {
$('#r201-01-bench-spark-apply').highcharts({
title: {
text: "Simple column transformation on a small dataset"
},
yAxis: {
title: {
text: "time (milliseconds)"
},
min: 0
},
credits: {
enabled: false
},
exporting: {
enabled: false
},
plotOptions: {
series: {
label: {
enabled: false
},
turboThreshold: 0,
marker: {
symbol: "circle"
},
showInLegend: false
},
treemap: {
layoutAlgorithm: "squarified"
},
boxplot: {
fillColor: "#C9E4FF",
lineWidth: 0.5,
medianWidth: 1,
stemDashStyle: "dot",
stemWidth: 1,
whiskerLength: "40%",
whiskerWidth: 1
}
},
chart: {
type: "column"
},
xAxis: {
type: "category",
categories: ""
},
series: [
{
g2: null,
data: [
{
name: "hive_builtin",
low: 396,
q1: 430,
median: 461,
q3: 486,
high: 495
},
{
name: "translated_dplyr",
low: 407,
q1: 431,
median: 462.5,
q3: 501,
high: 511
},
{
name: "spark_apply",
low: 372653,
q1: 374472,
median: 376849,
q3: 381262,
high: 381262
}
],
type: "boxplot",
id: null,
color: "blue",
name: "Simple column transformation on a small dataset"
}
]
}
);
});
</script>
<div id="r201-01-bench-spark-apply">
</div>
<p>We can see that</p>
<ul>
<li>the operations executed via the SQL translation mechanism of <span class="rpackage">dbplyr</span> (both the Hive built-in function and an R function with an available SQL translation) completed in around <em>0.5 seconds</em>, while</li>
<li>the operation via <code>spark_apply()</code> took orders of magnitude longer, more than <em>6 minutes</em>.</li>
</ul>
<p>Note that the absolute values here will vary based on the setup and the infrastructure; the important message is in the relative differences, not in the absolute timings.</p>
</div>
<div id="what-happens-when-we-use-custom-functions-with-spark_apply" class="section level2">
<h2><span class="header-section-number">6.2</span> What happens when we use custom functions with <code>spark_apply()</code></h2>
<p>We can now see that the operation with <code>spark_apply()</code> is extremely slow compared to the other two. The key to understanding the difference is to examine how the custom transformations of data using R functions are performed within <code>spark_apply()</code>.</p>
<p>In simplified terms, this happens in a few steps:</p>
<ol style="list-style-type: decimal">
<li>the data is moved in row format from Spark into the R process through a socket connection. This is inefficient, as multiple data types need to be deserialized for each row</li>
<li>the data gets converted to columnar format since this is how R data frames are implemented</li>
<li>the R functions are applied to compute the results</li>
<li>the results are converted back to row format, serialized row by row, and sent back to Spark over the socket connection</li>
</ol>
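<p>We can observe this per-partition behavior directly. As a minimal sketch, assuming an active connection <code>sc</code>, the applied function receives each partition's rows as a plain R data frame:</p>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r"># spark_apply() calls the R function once per partition,
# passing that partition's rows as an R data.frame
sdf_len(sc, 10, repartition = 2) %>%
  spark_apply(function(df) data.frame(rows_in_partition = nrow(df)))
# one result row per partition, each reporting that partition's row count</code></pre></div>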
</div>
<div id="what-happens-when-we-use-translated-or-hive-built-in-functions" class="section level2">
<h2><span class="header-section-number">6.3</span> What happens when we use translated or Hive built-in functions</h2>
<p>When using functions that can be translated to Spark SQL, the process is very different:</p>
<ol style="list-style-type: decimal">
<li>The call is translated to Spark SQL using the <span class="rpackage">dbplyr</span> backend</li>
<li>The constructed query is sent to Spark for execution using DBI</li>
<li>The SQL is executed within Spark only when <code>collect()</code> or <code>compute()</code> is called</li>
<li>The results are sent to the R session only when <code>collect()</code> is called</li>
</ol>
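<p>We can inspect the translation step ourselves with <code>show_query()</code>, which renders the Spark SQL that would be sent without executing it. A small sketch, assuming <code>fun_implemented()</code> from the previous chapter:</p>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r"># Render the SQL generated by dbplyr without executing it
fun_implemented(tbl_weather, origin) %>%
  show_query()</code></pre></div>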
<p>This means that the transfer of data only happens once and only when <code>collect()</code> is called, which saves a vast amount of overhead.</p>
</div>
<div id="which-r-functionality-is-currently-translated-and-built-in-to-hive" class="section level2">
<h2><span class="header-section-number">6.4</span> Which R functionality is currently translated or built into Hive</h2>
<p>An important question to answer with regard to performance is therefore how much functionality is available through the fast <span class="rpackage">dbplyr</span> backend. As seen above, these features fall into two groups:</p>
<ol style="list-style-type: decimal">
<li><p>R functions translatable to Spark SQL via <span class="rpackage">dbplyr</span>. The full list of such functions is available on <a href="https://spark.rstudio.com/dplyr/#sql-translation">RStudio’s sparklyr website</a></p></li>
<li><p>Hive built-in functions that get translated as they are and can be evaluated by Spark. The full list is available on the <a href="https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF">Hive Operators and User-Defined Functions</a> website.</p></li>
</ol>
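<p>For a quick check of whether a particular call has a <span class="rpackage">dbplyr</span> translation, we can use <code>dbplyr::translate_sql()</code> with a simulated Hive connection. Note that calls unknown to <span class="rpackage">dbplyr</span> are passed through verbatim, which is exactly the mechanism that makes Hive built-in functions usable:</p>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r">library(dbplyr)
# tolower() has a translation to SQL
translate_sql(tolower(origin), con = simulate_hive())
# functions without a translation are passed through as-is,
# so Hive built-ins such as lower() still work
translate_sql(lower(origin), con = simulate_hive())</code></pre></div>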
</div>
<div id="making-serialization-faster-with-apache-arrow" class="section level2">
<h2><span class="header-section-number">6.5</span> Making serialization faster with Apache Arrow</h2>
<div id="what-is-apache-arrow-and-how-it-improves-performance" class="section level3">
<h3><span class="header-section-number">6.5.1</span> What is Apache Arrow and how it improves performance</h3>
<p>Our benchmarks have shown that using <code>spark_apply()</code> does not scale well: the performance penalty caused by serialization, deserialization, and data transfer is too high.</p>
<p>To partially mitigate this we can take advantage of <a href="https://arrow.apache.org/">Apache Arrow</a>, a cross-language development platform for in-memory data that specifies a standardized language-independent columnar memory format for flat and hierarchical data.</p>
<p>With Arrow support in sparklyr, the row-format to column-format conversion is performed in parallel within Spark. The data is then transferred through the socket without custom serialization, and all the R process needs to do is copy the data from the socket into its heap, transform it, and copy it back to the socket connection.</p>
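<p>Enabling Arrow requires no change to the <code>spark_apply()</code> call itself; with recent versions of sparklyr (1.0 and later), Arrow is used automatically once the <span class="rpackage">arrow</span> package is installed and attached:</p>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r"># attaching the arrow package is enough for sparklyr to switch
# to Arrow-based (de)serialization in spark_apply()
library(arrow)
head(tbl_weather) %>%
  spark_apply(fun_r_custom, context = {colName <- "origin"})</code></pre></div>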
<p>This makes the process significantly faster:</p>
<div class="sourceCode" id="cb37"><pre class="sourceCode r"><code class="sourceCode r"><a class="sourceLine" id="cb37-1" data-line-number="1">microbenchmark<span class="op">::</span><span class="kw">microbenchmark</span>(</a>
<a class="sourceLine" id="cb37-2" data-line-number="2"> <span class="dt">times =</span> <span class="dv">10</span>, </a>
<a class="sourceLine" id="cb37-3" data-line-number="3"> <span class="dt">setup =</span> <span class="kw">library</span>(arrow),</a>
<a class="sourceLine" id="cb37-4" data-line-number="4"> <span class="dt">hive_builtin =</span> <span class="kw">fun_hive_builtin</span>(tbl_weather, origin) <span class="op">%>%</span><span class="st"> </span><span class="kw">collect</span>(),</a>
<a class="sourceLine" id="cb37-5" data-line-number="5"> <span class="dt">translated_dplyr =</span> <span class="kw">fun_implemented</span>(tbl_weather, origin) <span class="op">%>%</span><span class="st"> </span><span class="kw">collect</span>(),</a>
<a class="sourceLine" id="cb37-6" data-line-number="6"> <span class="dt">spark_apply_arrow =</span> <span class="kw">spark_apply</span>(tbl_weather, fun_r_custom, <span class="dt">context =</span> {colName <-<span class="st"> "origin"</span>}) <span class="op">%>%</span><span class="st"> </span><span class="kw">collect</span>()</a>
<a class="sourceLine" id="cb37-7" data-line-number="7">)</a></code></pre></div>
<p>We can see that the timing of <code>spark_apply()</code> decreased from more than 6 minutes to around 4.5 seconds, a very significant performance boost. Compared to the other methods, however, we still see an order-of-magnitude difference.</p>
<script type="text/javascript">
$(function () {
$('#r201-02-bench-spark-apply').highcharts({
title: {
text: "Simple column transformation on a small dataset"
},
yAxis: {
title: {
text: "time (milliseconds)"
},
min: 0
},
credits: {
enabled: false
},
exporting: {
enabled: false
},
plotOptions: {
series: {
label: {
enabled: false
},
turboThreshold: 0,
marker: {
symbol: "circle"
},
showInLegend: false
},
treemap: {
layoutAlgorithm: "squarified"
},
boxplot: {
fillColor: "#C9E4FF",
lineWidth: 0.5,
medianWidth: 1,
stemDashStyle: "dot",
stemWidth: 1,
whiskerLength: "40%",
whiskerWidth: 1
}
},
chart: {
type: "column"
},
xAxis: {
type: "category",
categories: ""
},
series: [
{
g2: null,
data: [
{
name: "hive_builtin",
low: 494,
q1: 510,
median: 524.5,
q3: 544,
high: 577
},
{
name: "translated_dplyr",
low: 439,
q1: 460,
median: 556.5,
q3: 571,
high: 572
},
{
name: "spark_apply_arrow",
low: 4491,
q1: 4498,
median: 4526.5,
q3: 4571,
high: 4571
}
],
type: "boxplot",
id: null,
color: "blue",
name: "Simple column transformation on a small dataset"
}
]
}
);
});
</script>
<div id="r201-02-bench-spark-apply">
</div>
</div>
</div>
<div id="conclusion-take-home-messages" class="section level2">
<h2><span class="header-section-number">6.6</span> Conclusion, take-home messages</h2>
<p>Adding Arrow to the mix significantly improved the performance of our example code, but it is still quite slow compared to the native approach. Based on the above, we can conclude that:</p>
<ul>
<li>Performance benefits are present mainly when all the computation is performed within Spark and R serves merely as a “messaging agent”, sending commands to Spark to be executed</li>
<li>If serialization and transfer of larger objects are involved, performance is strongly impacted.</li>
</ul>
<p>The take-home message from this exercise is that</p>
<ul>
<li>We should strive to use only R code that can be executed within the Spark instance. If we need to retrieve data, it should ideally be data that was heavily aggregated within Spark beforehand, so that only a small amount is transferred to the R session.</li>
</ul>
</div>
<div id="but-we-still-need-arbitrary-functions-to-run-fast" class="section level2">
<h2><span class="header-section-number">6.7</span> But we still need arbitrary functions to run fast</h2>
<p>In the next chapters, we will investigate a few options that allow us to retain the performance of Spark while still being able to write arbitrary R functions by using methods already implemented and available in the Spark API.</p>
<ol style="list-style-type: decimal">
<li><a href="constructing-functions-by-piping-dplyr-verbs.html">Rewriting the functions as collections of dplyr verbs that all support translation to Spark SQL</a></li>
<li><a href="constructing-sql-and-executing-it-with-spark.html">Rewriting the functions into Spark SQL and executing them via Spark</a></li>
<li><a href="using-the-lower-level-invoke-api-to-manipulate-sparks-java-objects-from-r.html">Rewriting the functions as a series of Scala method invocations</a></li>
</ol>
</div>
</div>
</section>
</div>
</div>
</div>
<a href="communication-between-spark-and-sparklyr.html" class="navigation navigation-prev " aria-label="Previous page"><i class="fa fa-angle-left"></i></a>
<a href="constructing-functions-by-piping-dplyr-verbs.html" class="navigation navigation-next " aria-label="Next page"><i class="fa fa-angle-right"></i></a>
</div>
</div>
<script src="libs/gitbook-2.6.7/js/app.min.js"></script>
<script src="libs/gitbook-2.6.7/js/lunr.js"></script>
<script src="libs/gitbook-2.6.7/js/clipboard.min.js"></script>
<script src="libs/gitbook-2.6.7/js/plugin-search.js"></script>
<script src="libs/gitbook-2.6.7/js/plugin-sharing.js"></script>
<script src="libs/gitbook-2.6.7/js/plugin-fontsettings.js"></script>
<script src="libs/gitbook-2.6.7/js/plugin-bookdown.js"></script>
<script src="libs/gitbook-2.6.7/js/jquery.highlight.js"></script>
<script src="libs/gitbook-2.6.7/js/plugin-clipboard.js"></script>
<script>
gitbook.require(["gitbook"], function(gitbook) {
gitbook.start({
"sharing": {
"github": false,
"facebook": true,
"twitter": true,
"linkedin": false,
"weibo": false,
"instapaper": false,
"vk": false,
"all": ["facebook", "twitter", "linkedin", "weibo", "instapaper"]
},
"fontsettings": {
"theme": "white",
"family": "sans",
"size": 2
},
"edit": {
"link": null,
"text": null
},
"history": {
"link": null,
"text": null
},
"view": {
"link": null,
"text": null
},
"download": null,
"toc": {
"collapse": "subsection"
}
});
});
</script>
</body>
</html>