for i n { ... }
Dyon supports a short For loop for counters starting at 0 and incremented until the counter is greater than or equal to a pre-evaluated expression:
for i len(list) { println(list[i]) }
This For loop is approximately 2.9x faster when running on the AST than the equivalent traditional For loop:
n := len(list)
for i := 0; i < n; i += 1 { println(list[i]) }
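For readers unfamiliar with Dyon, here is a minimal Python sketch of the short For loop's semantics (this is an illustration, not Dyon itself): the bound expression is evaluated once before the loop starts, and the counter runs from 0 up to, but not including, that bound.

```python
# Sketch of `for i len(list) { println(list[i]) }`:
# the bound is computed once, then i counts 0, 1, ..., n - 1.
items = ["a", "b", "c"]

n = len(items)          # pre-evaluated bound
for i in range(n):
    print(items[i])
```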
Inferring range by indexing

The expression len(list) can be inferred from the index list[i] in the body:
for i { println(list[i]) }
When nesting loops this way, you have to order items in a list in the same order, for example:
for i, j, k {
    println(list[i][j][k]) // i, j, k must be used in the same order
}
However, as long as you don't depend on the previous indices, you can write it any way you like:
sum i, j { list[i] - list[j] }
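A hedged Python sketch of that Sum loop (again an illustration of the semantics, not Dyon): both bounds are inferred as len(list) from the indexing in the body, and the two counters range independently, so every ordered pair (i, j) is visited.

```python
# Sketch of `sum i, j { list[i] - list[j] }`: since every pair
# (i, j) is matched by the mirrored pair (j, i), the total is zero.
xs = [1.0, 2.0, 4.0]

total = sum(xs[i] - xs[j]
            for i in range(len(xs))
            for j in range(len(xs)))
```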
Specify range

With index start and end:
for i [2, len(list)) { println(list[i]) }
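The bracket notation is a half-open interval: the start is inclusive and the end is exclusive. A Python sketch of the same iteration (illustrative only):

```python
# Sketch of `for i [2, len(list)) { println(list[i]) }`:
# i takes the values 2, 3, 4 for a five-element list.
xs = [10, 20, 30, 40, 50]

visited = []
for i in range(2, len(xs)):
    visited.append(xs[i])
```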
Packed loops

When nesting loops of the same kind, you can pack them together by separating the indices with ",":
for i, j, k { println(list[i][j][k]) }
You can also write ranges in the packed version:
for i n, j [i+1, n) { ... }
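A packed loop whose inner range depends on the outer index visits each unordered pair exactly once. A Python sketch of `for i n, j [i+1, n)` (illustrative, not Dyon):

```python
# Sketch of the triangular iteration: every pair (i, j) with
# i < j is produced once, giving n * (n - 1) / 2 pairs in total.
n = 4
pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
```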
Examples

Computing the output of a neural network:
fn run__tensor_input(tensor: [[[f64]]], input: [f64]) -> {
    input := input
    for i {
        input = sift j {
            sigmoid(∑ k { tensor[i][j][k] * input[k] })
        }
    }
    return clone(input)
}
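For reference, a minimal Python sketch of what this forward pass computes (the function and variable names here are illustrative, not part of Dyon): each layer of the weight tensor maps the current vector through a matrix, applying a sigmoid to every output component; the layer and neuron counts are inferred from the tensor's shape, as the packed loop does in Dyon.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def run_tensor_input(tensor, vec):
    # tensor[i][j][k]: weight from input k to neuron j in layer i
    for layer in tensor:
        vec = [sigmoid(sum(w * x for w, x in zip(row, vec)))
               for row in layer]
    return vec
```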
Computing the energy of a system of N physical bodies:
fn energy(bodies: [{}]) -> f64 {
    n := len(bodies)
    return ∑ i n {
        bodies[i].vel · bodies[i].vel * bodies[i].mass / 2.0
        - bodies[i].mass * ∑ j [i+1, n) {
            bodies[j].mass / |bodies[i].pos - bodies[j].pos|
        }
    }
}
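A Python sketch of the same computation (illustrative names and dict layout, assuming vel and pos are 3-vectors): kinetic energy of each body, minus the pairwise potential, with the inner range [i+1, n) counting each pair once.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def energy(bodies):
    n = len(bodies)
    e = 0.0
    for i in range(n):
        # kinetic term: |v|^2 * m / 2
        e += dot(bodies[i]["vel"], bodies[i]["vel"]) * bodies[i]["mass"] / 2.0
        # pairwise potential, each pair counted once via j in [i+1, n)
        for j in range(i + 1, n):
            dist = math.dist(bodies[i]["pos"], bodies[j]["pos"])
            e -= bodies[i]["mass"] * bodies[j]["mass"] / dist
    return e
```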
Setting all weights in a neural network to random values:
fn randomize__tensor(mut tensor: [[[f64]]]) {
    for i, j, k {
        tensor[i][j][k] = random()
    }
}
Motivation

This is designed for: