defmodule Code do
@moduledoc ~S"""
Utilities for managing code compilation, code evaluation, and code loading.
This module complements Erlang's [`:code` module](`:code`)
to add behaviour which is specific to Elixir. Almost all of the functions in this module
have global side effects on the behaviour of Elixir.
## Working with files
This module contains three functions for compiling and evaluating files.
Here is a summary of them and their behaviour:
* `require_file/2` - compiles a file and tracks its name. It does not
compile the file again if it has been previously required.
* `compile_file/2` - compiles a file without tracking its name. Compiles the
file multiple times when invoked multiple times.
* `eval_file/2` - evaluates the file contents without tracking its name. It
returns the result of the last expression in the file, instead of the modules
defined in it. Evaluated files do not trigger the compilation tracers described
in the next section.
In a nutshell, the first must be used when you want to keep track of the files
handled by the system, to prevent the same file from being compiled multiple
times. This is common in scripts.
`compile_file/2` must be used when you are interested in the modules defined in a
file, without tracking. `eval_file/2` should be used when you are interested in
the result of evaluating the file rather than the modules it defines.
The functions above work with Elixir source. If you want to work
with modules compiled to bytecode, which have the `.beam` extension
and are typically found below the _build directory of a Mix project,
see the functions in Erlang's [`:code`](`:code`) module.
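As a sketch of the difference between the evaluating and requiring functions
(the temporary script path here is only for illustration):

```elixir
# Contrast eval_file/2 with require_file/2 using a throwaway script.
path = Path.join(System.tmp_dir!(), "code_doc_example.exs")
File.write!(path, "1 + 2")

# eval_file/2 returns the result of the last expression:
{result, _binding} = Code.eval_file(path)
result
#=> 3

# require_file/2 tracks the file name; requiring it a second
# time is a no-op and returns nil:
Code.require_file(path)
Code.require_file(path)
#=> nil

File.rm!(path)
```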
## Code loading on the Erlang VM
Erlang has two modes to load code: interactive and embedded.
By default, the Erlang VM runs in interactive mode, where modules
are loaded as needed. In embedded mode the opposite happens, as all
modules need to be loaded upfront or explicitly.
You can use `ensure_loaded/1` (as well as `ensure_loaded?/1` and
`ensure_loaded!/1`) to check whether a module is loaded before using it
and act accordingly.
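For example, both variants shown with modules that ship with Elixir:

```elixir
# ensure_loaded/1 returns a tagged tuple:
{:module, String} = Code.ensure_loaded(String)
{:error, :nofile} = Code.ensure_loaded(DoesNotExist)

# ensure_loaded?/1 is the boolean convenience variant:
true = Code.ensure_loaded?(String)
false = Code.ensure_loaded?(DoesNotExist)
```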
## `ensure_compiled/1` and `ensure_compiled!/1`
Elixir also includes `ensure_compiled/1` and `ensure_compiled!/1`
functions that are a superset of `ensure_loaded/1`.
Since Elixir's compilation happens in parallel, in some situations
you may need to use a module that was not yet compiled, therefore
it can't even be loaded.
When invoked, `ensure_compiled/1` and `ensure_compiled!/1` halt the
compilation of the caller until the module becomes available. Note
the distinction between `ensure_compiled/1` and `ensure_compiled!/1`
is important: if you are using `ensure_compiled!/1`, you are
indicating to the compiler that you can only continue if said module
is available.
If you are using `Code.ensure_compiled/1`, you are implying you may
continue without the module and therefore Elixir may return
`{:error, :unavailable}` for cases where the module is not yet available
(but may be available later on).
For those reasons, developers must typically use `Code.ensure_compiled!/1`.
In particular, do not do this:
    case Code.ensure_compiled(module) do
      {:module, _} -> module
      {:error, _} -> raise ...
    end
Finally, note you only need `ensure_compiled!/1` to check for modules
being defined within the same project. It does not apply to modules from
dependencies as dependencies are always compiled upfront.
In most cases, `ensure_loaded/1` is enough. `ensure_compiled!/1`
must be used in rare cases, usually involving macros that need to
invoke a module for callback information. The use of `ensure_compiled/1`
is even less likely.
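To illustrate the return shapes when no parallel compilation is in
progress (a minimal sketch):

```elixir
# ensure_compiled!/1 returns the module name or raises:
GenServer = Code.ensure_compiled!(GenServer)

# ensure_compiled/1 returns a tagged tuple instead:
{:module, GenServer} = Code.ensure_compiled(GenServer)
{:error, :nofile} = Code.ensure_compiled(DoesNotExist)
```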
## Compilation tracers
Elixir supports compilation tracers, which allows modules to observe constructs
handled by the Elixir compiler when compiling files. A tracer is a module
that implements the `trace/2` function. The function receives the event name
as first argument and `Macro.Env` as second and it must return `:ok`. It is
very important for a tracer to do as little work as possible synchronously
and dispatch the bulk of the work to a separate process. **Slow tracers will
slow down compilation**.
You can configure your list of tracers via `put_compiler_option/2`. The
following events are available to tracers:
* `:start` - (since v1.11.0) invoked whenever the compiler starts to trace
a new lexical context, such as a new file. Keep in mind the compiler runs
in parallel, so multiple files may invoke `:start` and run at the same
time. The value of the `lexical_tracker` of the macro environment, albeit
opaque, can be used to uniquely identify the environment.
* `:stop` - (since v1.11.0) invoked whenever the compiler stops tracing a
new lexical context, such as a new file.
* `{:import, meta, module, opts}` - traced whenever `module` is imported.
`meta` is the import AST metadata and `opts` are the import options.
* `{:imported_function, meta, module, name, arity}` and
`{:imported_macro, meta, module, name, arity}` - traced whenever an
imported function or macro is invoked. `meta` is the call AST metadata,
`module` is the module the import is from, followed by the `name` and `arity`
of the imported function/macro.
* `{:alias, meta, alias, as, opts}` - traced whenever `alias` is aliased
to `as`. `meta` is the alias AST metadata and `opts` are the alias options.
* `{:alias_expansion, meta, as, alias}` - traced whenever there is an alias
expansion for a previously defined `alias`, i.e. when the user writes `as`
which is expanded to `alias`. `meta` is the alias expansion AST metadata.
* `{:alias_reference, meta, module}` - traced whenever there is an alias
in the code, i.e. whenever the user writes `MyModule.Foo.Bar` in the code,
regardless of whether it was expanded or not.
* `{:require, meta, module, opts}` - traced whenever `module` is required.
`meta` is the require AST metadata and `opts` are the require options.
* `{:struct_expansion, meta, module, keys}` - traced whenever `module`'s struct
is expanded. `meta` is the struct AST metadata and `keys` are the keys being
used by the expansion.
* `{:remote_function, meta, module, name, arity}` and
`{:remote_macro, meta, module, name, arity}` - traced whenever a remote
function or macro is referenced. `meta` is the call AST metadata, `module`
is the invoked module, followed by the `name` and `arity`.
* `{:local_function, meta, name, arity}` and
`{:local_macro, meta, name, arity}` - traced whenever a local
function or macro is referenced. `meta` is the call AST metadata, followed by
the `name` and `arity`.
* `{:compile_env, app, path, return}` - traced whenever `Application.compile_env/3`
or `Application.compile_env!/2` are called. `app` is an atom, `path` is a list
of keys to traverse in the application environment and `return` is either
`{:ok, value}` or `:error`.
* `{:on_module, bytecode, :none}` - (since v1.11.0) traced whenever a module
is defined. This is equivalent to the `@after_compile` callback and invoked
after any `@after_compile` in the given module. The third element is currently
`:none` but it may provide more metadata in the future. It is best to ignore
it at the moment.
The `:tracers` compiler option can be combined with the `:parser_options`
compiler option to enrich the metadata of the traced events above.
New events may be added at any time in the future, therefore it is advised
for the `trace/2` function to have a "catch-all" clause.
Below is an example tracer that prints all remote function invocations:
    defmodule MyTracer do
      def trace({:remote_function, _meta, module, name, arity}, env) do
        IO.puts "#{env.file}:#{env.line} #{inspect(module)}.#{name}/#{arity}"
        :ok
      end

      def trace(_event, _env) do
        :ok
      end
    end
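A tracer is registered via `put_compiler_option/2` and only applies to code
compiled afterwards. A self-contained registration sketch, using a minimal
no-op tracer (`NoopTracer` is a placeholder name):

```elixir
# A minimal tracer that accepts every event and does nothing:
defmodule NoopTracer do
  def trace(_event, _env), do: :ok
end

# Register the tracer; it only affects code compiled from now on:
Code.put_compiler_option(:tracers, [NoopTracer])

# Compiling code now invokes NoopTracer.trace/2 for each event:
Code.compile_string(~s[String.upcase("hello")])

# Reset when done:
Code.put_compiler_option(:tracers, [])
```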
"""
@typedoc """
A list with all variable bindings.
The binding keys are usually atoms, but they may be a tuple for variables
defined in a different context.
"""
@type binding :: [{atom() | tuple(), any}]
@boolean_compiler_options [
:docs,
:debug_info,
:ignore_module_conflict,
:relative_paths,
:warnings_as_errors
]
@list_compiler_options [:no_warn_undefined, :tracers, :parser_options]
@available_compiler_options @boolean_compiler_options ++ @list_compiler_options
@doc """
Lists all required files.
## Examples
Code.require_file("../eex/test/eex_test.exs")
List.first(Code.required_files()) =~ "eex_test.exs"
#=> true
"""
@doc since: "1.7.0"
@spec required_files() :: [binary]
def required_files do
:elixir_code_server.call(:required)
end
@deprecated "Use Code.required_files/0 instead"
@doc false
def loaded_files do
required_files()
end
@doc false
@deprecated "Use Code.Fragment.cursor_context/2 instead"
def cursor_context(code, options \\ []) do
Code.Fragment.cursor_context(code, options)
end
@doc """
Removes files from the required files list.
The modules defined in the file are not removed;
calling this function only removes them from the list,
allowing them to be required again.
## Examples
# Require EEx test code
Code.require_file("../eex/test/eex_test.exs")
# Now unrequire all files
Code.unrequire_files(Code.required_files())
# Note that modules are still available
function_exported?(EExTest.Compiled, :before_compile, 0)
#=> true
"""
@doc since: "1.7.0"
@spec unrequire_files([binary]) :: :ok
def unrequire_files(files) when is_list(files) do
:elixir_code_server.cast({:unrequire_files, files})
end
@deprecated "Use Code.unrequire_files/1 instead"
@doc false
def unload_files(files) do
unrequire_files(files)
end
@doc """
Appends a path to the end of the Erlang VM code path list.
This is the list of directories the Erlang VM uses for
finding module code.
The path is expanded with `Path.expand/1` before being appended.
If this path does not exist, an error is returned.
## Examples
Code.append_path(".")
#=> true
Code.append_path("/does_not_exist")
#=> {:error, :bad_directory}
"""
@spec append_path(Path.t()) :: true | {:error, :bad_directory}
def append_path(path) do
:code.add_pathz(to_charlist(Path.expand(path)))
end
@doc """
Prepends a path to the beginning of the Erlang VM code path list.
This is the list of directories the Erlang VM uses for finding
module code.
The path is expanded with `Path.expand/1` before being prepended.
If this path does not exist, an error is returned.
## Examples
Code.prepend_path(".")
#=> true
Code.prepend_path("/does_not_exist")
#=> {:error, :bad_directory}
"""
@spec prepend_path(Path.t()) :: true | {:error, :bad_directory}
def prepend_path(path) do
:code.add_patha(to_charlist(Path.expand(path)))
end
@doc """
Deletes a path from the Erlang VM code path list. This is the list of
directories the Erlang VM uses for finding module code.
The path is expanded with `Path.expand/1` before being deleted. If the
path does not exist, this function returns `false`.
## Examples
Code.prepend_path(".")
Code.delete_path(".")
#=> true
Code.delete_path("/does_not_exist")
#=> false
"""
@spec delete_path(Path.t()) :: boolean
def delete_path(path) do
:code.del_path(to_charlist(Path.expand(path)))
end
@doc """
Evaluates the contents given by `string`.
The `binding` argument is a list of variable bindings.
The `opts` argument is a keyword list of environment options.
**Warning**: `string` can be any Elixir code and will be executed with
the same privileges as the Erlang VM: this means that such code could
compromise the machine (for example by executing system commands).
Don't use `eval_string/3` with untrusted input (such as strings coming
from the network).
## Options
Options can be:
* `:file` - the file to be considered in the evaluation
* `:line` - the line on which the script starts
Additionally, you may also pass an environment as second argument,
so the evaluation happens within that environment. However, if the evaluated
code requires or compiles another file, the environment given to this function
will not apply to said files.
Returns a tuple of the form `{value, binding}`, where `value` is the value
returned from evaluating `string`. If an error occurs while evaluating
`string` an exception will be raised.
`binding` is a list with all variable bindings after evaluating `string`.
The binding keys are usually atoms, but they may be a tuple for variables
defined in a different context.
## Examples
iex> {result, binding} = Code.eval_string("a + b", [a: 1, b: 2], file: __ENV__.file, line: __ENV__.line)
iex> result
3
iex> Enum.sort(binding)
[a: 1, b: 2]
iex> {result, binding} = Code.eval_string("c = a + b", [a: 1, b: 2], __ENV__)
iex> result
3
iex> Enum.sort(binding)
[a: 1, b: 2, c: 3]
iex> {result, binding} = Code.eval_string("a = a + b", [a: 1, b: 2])
iex> result
3
iex> Enum.sort(binding)
[a: 3, b: 2]
For convenience, you can pass `__ENV__/0` as the `opts` argument and
all imports, requires and aliases defined in the current environment
will be automatically carried over:
iex> {result, binding} = Code.eval_string("a + b", [a: 1, b: 2], __ENV__)
iex> result
3
iex> Enum.sort(binding)
[a: 1, b: 2]
"""
@spec eval_string(List.Chars.t(), binding, Macro.Env.t() | keyword) :: {term, binding}
def eval_string(string, binding \\ [], opts \\ [])
def eval_string(string, binding, %Macro.Env{} = env) do
validated_eval_string(string, binding, env)
end
def eval_string(string, binding, opts) when is_list(opts) do
validated_eval_string(string, binding, opts)
end
defp validated_eval_string(string, binding, opts_or_env) do
%{line: line, file: file} = env = :elixir.env_for_eval(opts_or_env)
forms = :elixir.string_to_quoted!(to_charlist(string), line, 1, file, [])
{value, binding, _env} = :elixir.eval_forms(forms, binding, env)
{value, binding}
end
@doc ~S"""
Formats the given code `string`.
The formatter receives a string representing Elixir code and
returns iodata representing the formatted code according to
pre-defined rules.
## Options
* `:file` - the file which contains the string, used for error
reporting
* `:line` - the line the string starts, used for error reporting
* `:line_length` - the line length to aim for when formatting
the document. Defaults to 98. Note this value is used as
guideline but there are situations where it is not enforced.
See the "Line length" section below for more information
* `:locals_without_parens` - a keyword list of name and arity
pairs that should be kept without parens whenever possible.
The arity may be the atom `:*`, which implies all arities of
that name. The formatter already includes a list of functions
and this option augments this list.
* `:force_do_end_blocks` (since v1.9.0) - when `true`, converts all
inline usages of `do: ...`, `else: ...` and friends into `do`-`end`
blocks. Defaults to `false`. Note that this option is convergent:
once you set it to `true`, **all keywords** will be converted.
If you set it to `false` later on, `do`-`end` blocks won't be
converted back to keywords.
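A basic invocation looks like this; note the result is iodata, not a binary:

```elixir
"foo( 1,[2,3] )"
|> Code.format_string!()
|> IO.iodata_to_binary()
#=> "foo(1, [2, 3])"
```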
## Design principles
The formatter was designed under three principles.
First, the formatter never changes the semantics of the code by
default. This means the input AST and the output AST are equivalent.
The second principle is to provide as little configuration as possible.
This eases the formatter adoption by removing contention points while
making sure a single style is followed consistently by the community as
a whole.
The formatter does not hard code names. The formatter will not behave
specially because a function is named `defmodule`, `def`, or the like. This
principle mirrors Elixir's goal of being an extensible language where
developers can extend the language with new constructs as if they were
part of the language. When it is absolutely necessary to change behaviour
based on the name, this behaviour should be configurable, such as the
`:locals_without_parens` option.
## Running the formatter
The formatter attempts to fit the most it can on a single line and
introduces line breaks wherever possible when it cannot.
In some cases, this may lead to undesired formatting. Therefore, **some
code generated by the formatter may not be aesthetically pleasing and
may require explicit intervention from the developer**. That's why we
do not recommend running the formatter blindly in an existing codebase.
Instead, you should format and sanity-check each formatted file.
For example, the formatter may break a long function definition over
multiple clauses:
    def my_function(
          %User{name: name, age: age, ...},
          arg1,
          arg2
        ) do
      ...
    end

While the code above is completely valid, you may prefer to match on
the struct variables inside the function body in order to keep the
definition on a single line:

    def my_function(%User{} = user, arg1, arg2) do
      %{name: name, age: age, ...} = user
      ...
    end
In some situations, you can use the fact the formatter does not generate
elegant code as a hint for refactoring. Take this code:
    def board?(board_id, %User{} = user, available_permissions, required_permissions) do
      Tracker.OrganizationMembers.user_in_organization?(user.id, board.organization_id) and
        required_permissions == Enum.to_list(MapSet.intersection(MapSet.new(required_permissions), MapSet.new(available_permissions)))
    end

The code above has very long lines and running the formatter is not going
to address this issue. In fact, the formatter may make it more obvious that
you have complex expressions:

    def board?(board_id, %User{} = user, available_permissions, required_permissions) do
      Tracker.OrganizationMembers.user_in_organization?(user.id, board.organization_id) and
        required_permissions ==
          Enum.to_list(
            MapSet.intersection(
              MapSet.new(required_permissions),
              MapSet.new(available_permissions)
            )
          )
    end

Take such cases as a suggestion that your code should be refactored:

    def board?(board_id, %User{} = user, available_permissions, required_permissions) do
      Tracker.OrganizationMembers.user_in_organization?(user.id, board.organization_id) and
        matching_permissions?(required_permissions, available_permissions)
    end

    defp matching_permissions?(required_permissions, available_permissions) do
      intersection =
        required_permissions
        |> MapSet.new()
        |> MapSet.intersection(MapSet.new(available_permissions))
        |> Enum.to_list()

      required_permissions == intersection
    end
To sum it up: since the formatter cannot change the semantics of your
code, sometimes it is necessary to tweak or refactor the code to get
optimal formatting. To help better understand how to control the formatter,
we describe in the next sections the cases where the formatter keeps the
user encoding and how to control multiline expressions.
## Line length
Another point about the formatter is that the `:line_length` configuration
is a guideline. In many cases, it is not possible for the formatter to break
your code apart, which means it will go over the line length. For example,
if you have a long string:
    "this is a very long string that will go over the line length"

The formatter doesn't know how to break it apart without changing the
code's underlying syntax representation, so it is up to you to step in:

    "this is a very long string " <>
      "that will go over the line length"
The string concatenation makes the code fit on a single line and also
gives more options to the formatter.
This may also appear in do/end blocks, where the `do` keyword (or `->`)
may go over the line length because there is no opportunity for the
formatter to introduce a line break in a readable way. For example,
if you do:
    case very_long_expression() do
    end

And only the `do` keyword is above the line length, Elixir **will not**
emit this:

    case very_long_expression()
    do
    end
So it prefers to not touch the line at all and leave `do` above the
line limit.
## Keeping user's formatting
The formatter respects the input format in some cases. Those are
listed below:
* Insignificant digits in numbers are kept as is. The formatter
however always inserts underscores for decimal numbers with more
than 5 digits and converts hexadecimal digits to uppercase
* Strings, charlists, atoms and sigils are kept as is. No character
is automatically escaped or unescaped. The choice of delimiter is
also respected from the input
* Newlines inside blocks are kept as in the input except for:
1) expressions that take multiple lines will always have an empty
line before and after and 2) empty lines are always squeezed
together into a single empty line
* The choice between `:do` keyword and `do`-`end` blocks is left
to the user
* Lists, tuples, bitstrings, maps, structs and function calls will be
broken into multiple lines if they are followed by a newline in the
opening bracket and preceded by a new line in the closing bracket
* Newlines before certain operators (such as the pipeline operators)
and before other operators (such as comparison operators)
The behaviours above are not guaranteed. We may remove or add new
rules in the future. The goal of documenting them is to provide better
understanding on what to expect from the formatter.
### Multi-line lists, maps, tuples, and the like
You can force lists, tuples, bitstrings, maps, structs and function
calls to have one entry per line by adding a newline after the opening
bracket and a new line before the closing bracket lines. For example:
    [
      foo,
      bar
    ]

If there are no newlines around the brackets, then the formatter will
try to fit everything on a single line, such that the snippet below

    [foo,
     bar]

will be formatted as

    [foo, bar]

You can also force function calls and keywords to be rendered on multiple
lines by having each entry on its own line:

    defstruct name: nil,
              age: 0
The code above will be kept with one keyword entry per line by the
formatter. To avoid that, just squash everything into a single line.
### Parens and no parens in function calls
Elixir has two syntaxes for function calls. With parens and no parens.
By default, Elixir will add parens to all calls except for:
1. calls that have `do`-`end` blocks
2. local calls without parens where the name and arity of the local
call is also listed under `:locals_without_parens` (except for
calls with arity 0, where the compiler always requires parens)
The choice of parens and no parens also affects indentation. When a
function call with parens doesn't fit on the same line, the formatter
introduces a newline around parens and indents the arguments with two
spaces:
    some_call(
      arg1,
      arg2,
      arg3
    )

On the other hand, function calls without parens are always indented
by the function call length itself, like this:

    some_call arg1,
              arg2,
              arg3

If the last argument is a data structure, such as maps and lists, and
the beginning of the data structure fits on the same line as the function
call, then no indentation happens; this allows code like this:

    Enum.reduce(some_collection, initial_value, fn element, acc ->
      # code
    end)

    some_function_without_parens %{
      foo: :bar,
      baz: :bat
    }
## Code comments
The formatter also handles code comments in a way to guarantee a space
is always added between the beginning of the comment (#) and the next
character.
The formatter also extracts all trailing comments to their previous line.
For example, the code below
    hello #world

will be rewritten to

    # world
    hello
Because code comments are handled apart from the code representation (AST),
there are some situations where code comments are seen as ambiguous by the
code formatter. For example, the comment in the anonymous function below
    fn
      arg1 ->
        body1
        # comment

      arg2 ->
        body2
    end

and in this one

    fn
      arg1 ->
        body1

      # comment
      arg2 ->
        body2
    end
are considered equivalent (the nesting is discarded alongside most of
user formatting). In such cases, the code formatter will always format to
the latter.
## Newlines
The formatter converts all newlines in code from `\r\n` to `\n`.
"""
@doc since: "1.6.0"
@spec format_string!(binary, keyword) :: iodata
def format_string!(string, opts \\ []) when is_binary(string) and is_list(opts) do
line_length = Keyword.get(opts, :line_length, 98)
to_quoted_opts =
[
unescape: false,
warn_on_unnecessary_quotes: false,
literal_encoder: &{:ok, {:__block__, &2, [&1]}},
token_metadata: true
] ++ opts
{forms, comments} = string_to_quoted_with_comments!(string, to_quoted_opts)
to_algebra_opts =
[
comments: comments
] ++ opts
doc = Code.Formatter.to_algebra(forms, to_algebra_opts)
Inspect.Algebra.format(doc, line_length)
end
@doc """
Formats a file.
See `format_string!/2` for more information on code formatting and
available options.
"""
@doc since: "1.6.0"
@spec format_file!(binary, keyword) :: iodata
def format_file!(file, opts \\ []) when is_binary(file) and is_list(opts) do
string = File.read!(file)
formatted = format_string!(string, [file: file, line: 1] ++ opts)
[formatted, ?\n]
end
@doc """
Evaluates the quoted contents.
**Warning**: Calling this function inside a macro is considered bad
practice as it will attempt to evaluate runtime values at compile time.
Macro arguments are typically transformed by unquoting them into the
returned quoted expressions (instead of evaluated).
See `eval_string/3` for a description of `binding` and `opts`.
## Examples
iex> contents = quote(do: var!(a) + var!(b))
iex> {result, binding} = Code.eval_quoted(contents, [a: 1, b: 2], file: __ENV__.file, line: __ENV__.line)
iex> result
3
iex> Enum.sort(binding)
[a: 1, b: 2]
For convenience, you can pass `__ENV__/0` as the `opts` argument and
all options will be automatically extracted from the current environment:
iex> contents = quote(do: var!(a) + var!(b))
iex> {result, binding} = Code.eval_quoted(contents, [a: 1, b: 2], __ENV__)
iex> result
3
iex> Enum.sort(binding)
[a: 1, b: 2]
"""
@spec eval_quoted(Macro.t(), binding, Macro.Env.t() | keyword) :: {term, binding}
def eval_quoted(quoted, binding \\ [], opts \\ [])
def eval_quoted(quoted, binding, %Macro.Env{} = env) do
{value, binding, _env} = :elixir.eval_quoted(quoted, binding, :elixir.env_for_eval(env))
{value, binding}
end
def eval_quoted(quoted, binding, opts) when is_list(opts) do
{value, binding, _env} = :elixir.eval_quoted(quoted, binding, :elixir.env_for_eval(opts))
{value, binding}
end
@doc ~S"""
Converts the given string to its quoted form.
Returns `{:ok, quoted_form}` if it succeeds,
`{:error, {meta, message_info, token}}` otherwise.
## Options
* `:file` - the filename to be reported in case of parsing errors.
Defaults to `"nofile"`.
* `:line` - the starting line of the string being parsed.
Defaults to 1.
* `:column` - (since v1.11.0) the starting column of the string being parsed.
Defaults to 1.
* `:columns` - when `true`, attach a `:column` key to the quoted
metadata. Defaults to `false`.
* `:unescape` (since v1.10.0) - when `false`, preserves escaped sequences.
For example, `"null byte\\t\\x00"` will be kept as is instead of being
converted to a bitstring literal. Note if you set this option to false, the
resulting AST is no longer valid, but it can be useful to analyze/transform
source code, typically in combination with `quoted_to_algebra/2`.
Defaults to `true`.
* `:existing_atoms_only` - when `true`, raises an error
when non-existing atoms are found by the tokenizer.
Defaults to `false`.
* `:token_metadata` (since v1.10.0) - when `true`, includes token-related
metadata in the expression AST, such as metadata for `do` and `end`
tokens, for closing tokens, end of expressions, as well as delimiters
for sigils. See `t:Macro.metadata/0`. Defaults to `false`.
* `:literal_encoder` (since v1.10.0) - how to encode literals in the AST.
It must be a function that receives two arguments, the literal and its
metadata, and it must return `{:ok, ast :: Macro.t}` or
`{:error, reason :: binary}`. If you return anything other than the
literal itself as the `term`, then the AST is no longer valid. This
option may still be useful for textual analysis of the source code.
* `:static_atoms_encoder` - the static atom encoder function, see
"The `:static_atoms_encoder` function" section below. Note this
option overrides the `:existing_atoms_only` behaviour for static
atoms but `:existing_atoms_only` is still used for dynamic atoms,
such as atoms with interpolations.
* `:warn_on_unnecessary_quotes` - when `false`, does not warn
when atoms, keywords or calls have unnecessary quotes on
them. Defaults to `true`.
## `Macro.to_string/2`
The opposite of converting a string to its quoted form is
`Macro.to_string/2`, which converts a quoted form to a string/binary
representation.
## The `:static_atoms_encoder` function
When `static_atoms_encoder: &my_encoder/2` is passed as an argument,
`my_encoder/2` is called every time the tokenizer needs to create a
"static" atom. Static atoms are atoms in the AST that function as
aliases, remote calls, local calls, variable names, regular atoms
and keyword lists.
The encoder function will receive the atom name (as a binary) and a
keyword list with the current file, line and column. It must return
`{:ok, token :: term} | {:error, reason :: binary}`.
The encoder function is supposed to create an atom from the given
string. To produce a valid AST, it is required to return `{:ok, term}`,
where `term` is an atom. It is possible to return something other than an atom;
in that case, however, the AST is no longer "valid", in that it cannot
be used to compile or evaluate Elixir code. A use case for this is
if you want to use the Elixir parser in a user-facing situation, but
you don't want to exhaust the atom table.
The atom encoder is not called for *all* atoms that are present in
the AST. It won't be invoked for the following atoms:
* operators (`:+`, `:-`, and so on)
* syntax keywords (`fn`, `do`, `else`, and so on)
* atoms containing interpolation (`:"#{1 + 1} is two"`), as these
atoms are constructed at runtime.
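As a sketch, an encoder that refuses to create new atoms could look as
follows (the error message is illustrative):

```elixir
# Only accept strings that already correspond to existing atoms;
# otherwise return an error instead of filling the atom table.
encoder = fn string, _meta ->
  try do
    {:ok, String.to_existing_atom(string)}
  rescue
    ArgumentError -> {:error, "unsafe atom: " <> string}
  end
end

Code.string_to_quoted("length + 1", static_atoms_encoder: encoder)
```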
"""
@spec string_to_quoted(List.Chars.t(), keyword) ::
{:ok, Macro.t()} | {:error, {location :: keyword, binary | {binary, binary}, binary}}
def string_to_quoted(string, opts \\ []) when is_list(opts) do
file = Keyword.get(opts, :file, "nofile")
line = Keyword.get(opts, :line, 1)
column = Keyword.get(opts, :column, 1)
case :elixir.string_to_tokens(to_charlist(string), line, column, file, opts) do
{:ok, tokens} ->
:elixir.tokens_to_quoted(tokens, file, opts)
{:error, _error_msg} = error ->
error
end
end
@doc """
Converts the given string to its quoted form.
It returns the AST if it succeeds, and raises an exception otherwise.
The exception is a `TokenMissingError` in case a token is missing
(usually because the expression is incomplete), and a `SyntaxError` otherwise.
Check `string_to_quoted/2` for options information.
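For example:

```elixir
Code.string_to_quoted!("1 + 2")
#=> {:+, [line: 1], [1, 2]}
```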
"""
@spec string_to_quoted!(List.Chars.t(), keyword) :: Macro.t()
def string_to_quoted!(string, opts \\ []) when is_list(opts) do
file = Keyword.get(opts, :file, "nofile")
line = Keyword.get(opts, :line, 1)
column = Keyword.get(opts, :column, 1)
:elixir.string_to_quoted!(to_charlist(string), line, column, file, opts)
end
@doc """
Converts the given string to its quoted form and a list of comments.
This function is useful when performing textual changes to the source code,
while preserving information like comment and literal positions.
Returns `{:ok, quoted_form, comments}` if it succeeds,
`{:error, {line, error, token}}` otherwise.
Comments are maps with the following fields:
* `:line` - The line number of the source code
* `:text` - The full text of the comment, including the leading `#`
* `:previous_eol_count` - How many end of lines there are between the comment and the previous AST node or comment
* `:next_eol_count` - How many end of lines there are between the comment and the next AST node or comment
Check `string_to_quoted/2` for options information.
## Examples
iex> Code.string_to_quoted_with_comments("\""
...> :foo
...>
...> # Hello, world!
...>
...>
...> # Some more comments!
...> "\"")
{:ok, :foo, [
%{line: 3, column: 1, previous_eol_count: 2, next_eol_count: 3, text: "\# Hello, world!"},
%{line: 6, column: 1, previous_eol_count: 3, next_eol_count: 1, text: "\# Some more comments!"}
]}
iex> Code.string_to_quoted_with_comments(":foo # :bar")
{:ok, :foo, [
%{line: 1, column: 6, previous_eol_count: 0, next_eol_count: 0, text: "\# :bar"}
]}
"""
@doc since: "1.13.0"
@spec string_to_quoted_with_comments(List.Chars.t(), keyword) ::
{:ok, Macro.t(), list(map())} | {:error, {location :: keyword, term, term}}
def string_to_quoted_with_comments(string, opts \\ []) when is_list(opts) do
charlist = to_charlist(string)
file = Keyword.get(opts, :file, "nofile")
line = Keyword.get(opts, :line, 1)
column = Keyword.get(opts, :column, 1)
Process.put(:code_formatter_comments, [])
opts = [preserve_comments: &preserve_comments/5] ++ opts
with {:ok, tokens} <- :elixir.string_to_tokens(charlist, line, column, file, opts),
{:ok, forms} <- :elixir.tokens_to_quoted(tokens, file, opts) do
comments = Enum.reverse(Process.get(:code_formatter_comments))
{:ok, forms, comments}
end
after
Process.delete(:code_formatter_comments)
end
@doc """
Converts the given string to its quoted form and a list of comments.
Returns the AST and a list of comments if it succeeds, raises an exception
otherwise. The exception is a `TokenMissingError` in case a token is missing
(usually because the expression is incomplete), `SyntaxError` otherwise.
Check `string_to_quoted/2` for options information.