
# ElsevierSoftwareX/SOFTX-D-17-00013 forked from MatteoRagni/cas-rb

Mr.CAS - A Minimalistic (pure) Ruby CAS for Fast Prototyping and Code Generation. To cite this software publication: http://www.sciencedirect.com/science/article/pii/S2352711017300146

# Mr.CAS

## Introduction

An extremely simple expression graph that handles only differentiation, and simplification one step ahead in the graph.

## Example

Given the Rosenbrock function, find the optimum of such a function.
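As a quick numeric sanity check (plain Ruby, independent of Mr.CAS), the Rosenbrock function with the parameters used below (`a = 1`, `b = 100`) has its global minimum at `(1, 1)`, where it evaluates to zero:

```ruby
# Rosenbrock function, hand-coded with a = 1 and b = 100
rosenbrock = ->(x, y) { (1.0 - x)**2 + 100.0 * (y - x**2)**2 }

puts rosenbrock.call(1.0, 1.0)  # => 0.0 (the global minimum)
puts rosenbrock.call(0.0, 0.0)  # => 1.0
```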

## Execution

### Installation

First of all, it is necessary to install and load the `Mr.CAS` gem.

`gem install Mr.CAS`
`require 'Mr.CAS'`

### Define a function

Now we can define the Rosenbrock function, which has two variables (`x` and `y`) and two constants (`a` and `b`, with default values of 1 and 100 in this case). The variable `f` will contain our function:

```ruby
x, y = CAS::vars :x, :y
a, b = CAS::const 1.0, 100.0

f = ((a - x) ** 2) + b * ((y - (x ** 2)) ** 2)
```

We can print this function as follows:

```ruby
puts "f = #{f}"
# => ((1.0 - x)^2 + (100.0 * (y - x^2)^2))
```

### Derivatives

To find the minimum we need to find the point at which the gradient of the function is equal to zero.

The two derivatives are:

```ruby
dfx = f.diff(x).simplify
dfy = f.diff(y).simplify

puts "df/dx = #{dfx}"
# => df/dx = ((((1 - x) * 2) * -1) + ((((y - x^2) * 2) * -(x * 2)) * 100.0))

puts "df/dy = #{dfy}"
# => df/dy = (((y - x^2) * 2) * 100.0)
```

### Substitutions

Now, from the second equation it is quite evident that

```
y = x^2 = g(x)
```

and thus we can perform on the first a substitution:

```ruby
g = (x ** 2)
dfx = dfx.subs({y => g}).simplify

puts "df/dx = #{dfx}"
# => df/dx = (((1 - x) * 2) * -1)
```

### Meta-Programming (a Newton algorithm...)

That is something quite simple to solve (x = 1). Let's use this formulation for some meta-programming anyway! The stationary point is at the root of the function `dfx`, which depends on only one variable:

`puts "Arguments: #{dfx.args}"`

We can write a Newton method. The solution is found iteratively solving the recursive equation:

```
x[k + 1] = x[k] - ( f(x[k]) / f'(x[k]) )
```
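As a minimal numeric sketch of this recursion (plain Ruby, no symbolic machinery; `f` and its derivative are written by hand), applied to `f(x) = 2x - 2`, the hand-expanded `df/dx` from above, whose root is `x = 1`:

```ruby
# Numeric Newton iteration: x[k+1] = x[k] - f(x[k]) / f'(x[k])
f  = ->(x) { 2.0 * x - 2.0 }  # df/dx from above, expanded by hand
df = ->(_) { 2.0 }            # its derivative is constant

x = 3.0
10.times { x -= f.call(x) / df.call(x) }
puts x  # => 1.0 (a linear f converges in a single step)
```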

Let's write an algorithm that takes the symbolic expression and creates its own method (e.g. using `Proc` objects). Here is an example with our function:

```ruby
unused = (dfx/dfx.diff(x)).simplify
puts "unused(x) = #{unused}"
# => unused(x) = ((((1 - x) * 2) * -1)) / (((-1 * 2) * -1))

unused_proc = unused.as_proc
puts "unused(12.0) = #{unused_proc.call({"x" => 12.0})}"
# => unused(12.0) = 11.0
```

First of all, let's write a function that contains the algorithm. It takes as input a symbolic function of one variable, and returns the value of the variable at which the function is equal to zero:

```ruby
def newton(f, init=3.0, tol=1E-8, kmax=100)
  k = 0

  x   = f.args
  s   = {x.name => init}      # <- This is the solution dictionary

  fp  = f.as_proc             # <- This is a parsed object
  df  = f.diff(x).simplify    # <- This is still symbolic
  res = (f/df).as_proc        # <- This is a parsed object

  f0 = fp.call(s)
  k = 0

  loop do
    # Algorithm
    s[x.name] = s[x.name] - res.call(s)
    # Tolerance check
    f1        = fp.call(s)
    if (f1 - f0).abs < tol
      break
    else
      f0 = f1
    end

    # Iterations check
    k = k + 1
    break if k > kmax
  end
  puts "Solution after #{k} iterations: #{x} = #{s[x.name]}, f(#{x}) = #{f0}"
  return s[x.name]
end
```

Transforming the symbolic function into a `Proc` makes the code more efficient. It is not mandatory, but highly suggested for iterative algorithms.
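The intuition can be sketched in plain Ruby (an illustration under assumed semantics, not the actual Mr.CAS internals): evaluating a symbolic tree dispatches on every node at every call, while a `Proc` built once is plain Ruby code afterwards:

```ruby
# A tiny tree interpreter vs. a compiled Proc (illustrative only;
# this is NOT the actual Mr.CAS representation)
tree = [:sub, [:mul, 2.0, :x], 2.0]  # encodes 2x - 2

evaluate = lambda do |node, env|
  return env[node] if node == :x          # variable lookup
  return node unless node.is_a?(Array)    # numeric leaf
  op, l, r = node
  a, b = evaluate.call(l, env), evaluate.call(r, env)
  op == :mul ? a * b : a - b              # dispatch on every node
end

compiled = ->(x) { 2.0 * x - 2.0 }        # the same expression as a Proc

puts evaluate.call(tree, x: 12.0)  # => 22.0
puts compiled.call(12.0)           # => 22.0
```

Inside a loop, the `Proc` skips the per-node dispatch entirely.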

Let's call our new function, using as argument the derivative of the function that we want to optimize:

```ruby
x_opt = newton(dfx)
# => Solution after 1 iterations: x = 1.0, f(x) = -0.0
```

We can use the solution for `x` to get the value of `y`:

```ruby
puts "Optimum in #{x} = #{x_opt} and #{y} = #{g.call({x => x_opt})}"
# => Optimum in x = 1.0 and y = 1.0
```
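We can double-check this optimum numerically (plain Ruby, gradient expanded by hand, independent of Mr.CAS): both partial derivatives of the Rosenbrock function vanish at `(1, 1)`:

```ruby
# Hand-expanded gradient of the Rosenbrock function (a = 1, b = 100)
grad_x = ->(x, y) { -2.0 * (1.0 - x) - 400.0 * x * (y - x**2) }
grad_y = ->(x, y) { 200.0 * (y - x**2) }

puts grad_x.call(1.0, 1.0).zero?  # => true
puts grad_y.call(1.0, 1.0).zero?  # => true
```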

## Disclaimer

This is a work in progress and mainly a proof of concept. What is really missing is a way to perform better simplifications of the graph.

## Full example code

```ruby
require 'Mr.CAS'

# Define the function
x, y = CAS::vars "x", "y"
a, b = CAS::const 1.0, 100.0

f = ((a - x) ** 2) + b * ((y - (x ** 2)) ** 2)

puts "f = #{f}"

# Differentiate wrt variables
dfx = f.diff(x).simplify
dfy = f.diff(y).simplify

puts "df/dx = #{dfx}"
puts "df/dy = #{dfy}"

# Perform substitutions
g = (x ** 2)
dfx = dfx.subs({y => g}).simplify
puts "df/dx = #{dfx}"

# Arguments of a function
puts "Arguments: #{dfx.args}"

# Metaprogramming: create a Proc from symbolic math
unused = (dfx/dfx.diff(x)).simplify
puts "unused(x) = #{unused}"
unused_proc = unused.as_proc

# Testing the Proc. The feed must use strings as keys to identify
# the variables
puts "unused(12.0) = #{unused_proc.call({"x" => 12.0})}"

# We will not use this Proc; instead we will let the
# algorithm create its own

# Newton algorithm on the fly, that creates its own formula!
def newton(f, init=3.0, tol=1E-8, kmax=100)
  k = 0

  x   = f.args
  s   = {x.name => init}      # <- This is the solution dictionary

  fp  = f.as_proc             # <- This is a parsed object
  df  = f.diff(x).simplify    # <- This is still symbolic
  res = (f/df).as_proc        # <- This is a parsed object

  f0 = fp.call(s)
  k = 0

  loop do
    # Algorithm
    s[x.name] = s[x.name] - res.call(s)
    # Tolerance check
    f1        = fp.call(s)
    if (f1 - f0).abs < tol
      break
    else
      f0 = f1
    end

    # Iterations check
    k = k + 1
    break if k > kmax
  end
  puts "Solution after #{k} iterations: #{x} = #{s[x.name]}, f(#{x}) = #{f0}"
  return s[x.name]
end

# Let's call our shining algorithm:
x_opt = newton(dfx)

# Let's see the final result for both:
puts "Optimum in #{x} = #{x_opt} and #{y} = #{g.call({x => x_opt})}"
```

## To do

#### Simple

- substitute `CAS::Invert` with `CAS::Neg`
- substitute `CAS::Condition` with `CAS::Expression`
- move code generation to external plugins that can be loaded on demand
- make `CAS::Sum` and `CAS::Prod` inherit from `CAS::NaryOp` instead of `CAS::BinaryOp`

#### Complex

- implement `CAS::VectorVariable` and `CAS::MatrixVariable`
- replace `CAS::Op` and `CAS::BinaryOp` with a single `CAS::Op` equal to `CAS::NaryOp`

##### Extremely complex

- implement the Risch algorithm for symbolic integration