[SR-14113] Support `_read` and `_modify` accessor differentiation #54401
Comments
Comment by Pedro N. Rodriguez (JIRA): Hi @dan-zheng. In https://bugs.swift.org/browse/TF-1078, a workaround was recommended to address this issue. Using the v0.7-rc2 Ubuntu CPU-only toolchain, the following code fails to compile with an error saying that the function is not differentiable:
```swift
import TensorFlow

extension Array {
    // …
}

extension Array where Element: Differentiable {
    // …
}

struct PathCalc: Differentiable {
    // …
    @differentiable(wrt: (Rates, Lambda))
    // …
    var L = Array<Float>(repeating: 0.0, count: N)
    for n in 0...Nmat-1 {
        for i in n+1...N-1 {
            // …
        }
    }
    // …
}

struct Portfolio: Differentiable {
    @noDerivative let N: Int
    // …
    @differentiable(wrt: (self, Rates, Lambda))
    // …
    let pathCalcModel = PathCalc(N: N, Nmat: Nmat, Delta: Delta, Z: Z)
    var S = Array<Float>(repeating: 0.0, count: N)
    var b = Float(1.0)
    for n in Nmat...N-1 {
        // …
    }
    var v = Float(0)
    for i in 0...Nopt-1 {
        // …
    }
    return v
    // …
}

let Nmat = 40
let rates = Array<Float>(repeating: 0.05, count: N)
let portfolioModel = Portfolio(N: N, Nmat: Nmat, Nopt: Nopt, Delta: 0.25, /* … */)
let (g_model, g_rates, g_lambdas) = gradient(at: portfolioModel, rates, lambdas) { /* … */ }
```
You have to call the workaround's `.updated(at:with:)` function explicitly; it does not override subscripting, so assignments through `[]` will not work automatically. Here are updated versions of the two `for` loops that need the workaround:

```swift
for i in n+1...N-1 {
    let lam = Lambda[i - n - 1]
    let con1 = Delta * lam
    v = v + (con1 * Rates[i]) / (1.0 + Delta * L[i])
    let first = con1 * v
    let second = lam * (sqez - 0.5 * con1)
    let vrat = exp(first + second)
    // L[i] = Rates[i] * vrat
    L = L.updated(at: i, with: Rates[i] * vrat)
}
```

and

```swift
for n in Nmat...N-1 {
    b = b / (1.0 + Delta * L[n])
    s = s + Delta * b
    // B[n] = b
    B = B.updated(at: n, with: b)
    // S[n] = s
    S = S.updated(at: n, with: s)
}
```
Also, I needed to add an update to the workaround; I'll go comment on TF-1078 with that change.
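For reference, the TF-1078 workaround under discussion is a functional `updated(at:with:)` method paired with a custom derivative. The exact code lives in that JIRA thread; what follows is only a minimal sketch of the pattern, assuming a toolchain of that era where `Array`'s `TangentVector` is an `Array.DifferentiableView` (the bounds checks and pullback details here are illustrative, not the canonical workaround):

```swift
import TensorFlow

extension Array where Element: Differentiable {
    // Functional replacement for `array[index] = newValue`: returns a
    // fresh array instead of mutating through `subscript._modify`,
    // which the differentiation transform cannot handle yet.
    @differentiable(wrt: (self, newValue))
    func updated(at index: Int, with newValue: Element) -> [Element] {
        var result = self
        result[index] = newValue
        return result
    }

    @derivative(of: updated)
    func vjpUpdated(at index: Int, with newValue: Element)
        -> (value: [Element],
            pullback: (TangentVector) -> (TangentVector, Element.TangentVector))
    {
        let value = updated(at: index, with: newValue)
        return (value, { v in
            var dSelf = v
            // The cotangent at `index` flows entirely to `newValue`;
            // the element that was overwritten receives zero.
            let dNewValue = index < v.base.count ? v.base[index] : .zero
            if index < dSelf.base.count { dSelf.base[index] = .zero }
            return (dSelf, dNewValue)
        })
    }
}
```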
Comment by Pedro N. Rodriguez (JIRA): @porterchild, many thanks for reviewing the code. After implementing the suggested changes, the code runs smoothly.
@swift-ci create |
@dan-zheng I do not think it's possible to find bug TF-129, given that the subproject died and (maybe) some bugs transferred over the Apple Sync System. Since you authored this, could you explain that bug in greater detail or show an SR bug it maps to? |
@philipturner According to the description of this issue, TF-129 tracked supporting differentiation of `inout` arguments. I couldn't find a currently existing bug corresponding to TF-129, but `inout` argument differentiation is supported (implementation) and doesn't block this issue.
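To illustrate what that support enables: on recent toolchains with the `_Differentiation` module (where the attribute takes `reverse`), a function mutating an `inout` parameter can participate in differentiation, with the parameter acting as both an input and a result. A minimal sketch; the function names here are illustrative, not from this issue:

```swift
import _Differentiation

// Mutates `x` in place. With `inout` differentiation support, the
// pullback maps the cotangent of `x`'s final value back to the
// cotangents of its initial value and of `factor`.
func scaleInPlace(_ x: inout Float, by factor: Float) {
    x *= factor
}

@differentiable(reverse)
func cube(_ x: Float) -> Float {
    var y = x
    scaleInPlace(&y, by: x * x)  // differentiable mutation through `inout`
    return y
}

print(gradient(at: 2, of: cube))  // 12.0, i.e. d/dx x^3 = 3x^2 at x = 2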
Additional Detail from JIRA

md5: fc0ab68747451d6712b8e1289f2c8b68

Sub-Tasks:
- `Array.subscript._modify`

blocks:

is blocked by:
- `inout` parameters

relates to:
- `begin_apply` (for a `modify` accessor)

Issue Description:
Support differentiation of coroutines. `read` and `modify` accessors are coroutines. SIL has dedicated coroutine function types: https://github.com/apple/swift/blob/master/docs/SIL.rst#coroutine-types

Consider adding subtasks when starting work.

`modify` accessor differentiation is blocked by TF-129: `inout` argument differentiation. `modify` accessors have `inout` arguments.