Commit a457fbc

Add const checking for elements of tuple formals (#16041)

This PR adjusts the compiler to emit errors when a `const` element of a tuple formal is modified. Prior to this PR, the compiler would compile programs such as the following without incident:

```chapel
record r { var x: int = 0; }

proc modifyDefaultIntentFormal(tup) {
  tup[1] = new r(128);
}

proc test() {
  const a = (0, new r(0));
  modifyDefaultIntentFormal(a);
  writeln(a);
}
test();
```

The call to `modifyDefaultIntentFormal` incorrectly modifies the second element of the tuple formal `tup`. Because `tup` has the default argument intent, its second element should be passed by `const ref` (because it is a record), so any modification of `tup[1]` within the body of `modifyDefaultIntentFormal` should emit an error.

---

- Add a function `checkTupleFormalUses` to `lateConstCheck`. It checks that the constness of `const ref` tuple elements is not violated: if a field should be const but is set to `QUAL_REF` by `cullOverReferences`, emit an error. The reported location is an imprecise guess; future work should make this reporting as precise as possible and provide more context for the use.
- Use a set of `FnSymbol*` in the `for_formals_actuals` loop in `lateConstCheck` so that `const ref` elements of tuple formals are checked once per function instead of once per call.
- Add a function `checkTupleFormalToActual` to `lateConstCheck`. It checks ref/ref-if-modified elements of tuple formals: if the element is `ref` (or inferred to be `ref`) and the actual element is const, emit an error. If the element is ref-if-modified, provide an imprecise hint about where it is used. Future work should make this hint more precise.
- Infer the concrete intent of blank tuple formals that contain single/sync/atomic elements to be `INTENT_REF_MAYBE_CONST`. The blank intent for single/sync/atomic elements is `ref`, but it does not make sense to infer the intent of the entire tuple as `INTENT_REF` if it contains such an element, so mimic what we do for blank-intent tuples containing arrays.
- Fix a bug in `CullRefCtx::collectTuplesAndRefMaybeConstArgs` that caused the compiler to collect a _reference_ to a tuple for analysis instead of the tuple itself.
- Adjust `CullRefCtx::checkTupleCastCall` to collect the LHS of a tuple cast call as a dependency only if it contains `ref` elements. If the LHS is a value tuple, the compiler need not be concerned with how it is used.
- In `CullRefCtx::checkAccessorLikeCall`, do not add the LHS as a dependency if it is known to be constant. Any misuse of a const LHS should already have been reported in a previous pass.
- In `CullRefCtx::checkRefMaybeConstFormal`, escape early if the function being called has the flag `FLAG_REF_TO_CONST_WHEN_CONST_THIS` and the actual is a `const` method receiver. Otherwise, we add a false dependency that the compiler may consider a "setting" use.
- Add debugging tools to `lateConstCheck`.
- Clean up `lateConstCheck` by moving the logic that decides which calls to skip into a function `isCallToSkip`. Additional "light" refactoring might help with integrating new code into the rest of the pass.
- Adjust the IO proc `<~>` for writing channels so that the element to write is `const`.
- Adjust the tests `types/tuple/dinan/pining_for_lisp.chpl` and `pining_for_lisp_ver2.chpl` so that their `car` and `cdr` functions return `const ref` instead of `ref`.
- Add two tests to `types/tuple/const` that verify that the constness of tuple elements is respected for blank/const tuples of almost every element type. Future work should adjust these tests for tuples of sync/single when those are permitted in tuples, and for tuples containing `dmap`, if possible.

---

Future work:

- Adjust `cullOverReferences` to store the field index in addition to `ASTNode*` in the `reasonNotConst` map, and use this information to print more accurate error messages in `checkTupleFormalUses` and `checkTupleFormalToActual`.
- Better integrate error reporting for tuples with the existing error reporting in `lateConstCheck`.
- Do const checking for tuple elements of tuples.
- Add tests checking constness for tuples containing `sync` and `single` when appropriate.

---

Testing:

- [x] ALL on linux64 when CHPL_COMM=none
- [x] ALL on linux64 when CHPL_COMM=gasnet

---

Reviewed by @mppf. Thanks!
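The element-wise rule these checks enforce can be modeled outside the compiler. Below is a minimal C++ sketch using hypothetical stand-in types (not the compiler's AST classes): an element of a default-intent tuple formal is a legal write target only if it is passed by `ref`.

```cpp
#include <cassert>
#include <vector>

// Hypothetical stand-ins for compiler concepts; the real pass works on
// Chapel AST symbols and qualifiers, not these structs.
enum class ElemQual { Value, Ref, ConstRef };

struct TupleFormal {
  std::vector<ElemQual> elems;  // qualifier of each tuple element
};

// Mirrors the rule checkTupleFormalUses enforces: an element that is not
// passed by ref may not be the target of a "setting" use.
bool isLegalElementWrite(const TupleFormal& tup, std::size_t idx) {
  return tup.elems.at(idx) == ElemQual::Ref;
}
```

In the motivating example, a blank-intent tuple formal holding `(int, record)` would be modeled as `(Value, ConstRef)`, so a write to element 1 is rejected.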
1 parent e7fb57e commit a457fbc

17 files changed: +1182 -109 lines

compiler/include/resolveIntents.h

Lines changed: 1 addition & 0 deletions
```diff
@@ -26,6 +26,7 @@
 #include "expr.h"
 
 IntentTag blankIntentForType(Type* t);
+IntentTag constIntentForType(Type* t);
 IntentTag concreteIntent(IntentTag existingIntent, Type* t);
 IntentTag concreteIntentForArg(ArgSymbol* arg);
 void resolveArgIntent(ArgSymbol* arg);
```

compiler/passes/resolveIntents.cpp

Lines changed: 33 additions & 2 deletions
```diff
@@ -25,7 +25,7 @@
 
 bool intentsResolved = false;
 
-static IntentTag constIntentForType(Type* t) {
+IntentTag constIntentForType(Type* t) {
   if (is_bool_type(t) ||
       is_int_type(t) ||
       is_uint_type(t) ||
```
```diff
@@ -85,6 +85,31 @@ bool isTupleContainingRefMaybeConst(Type* t)
   return false;
 }
 
+static bool isTupleContainingTypeWithFlag(Type* t, Flag f) {
+  AggregateType* at = toAggregateType(t->getValType());
+
+  if (at != NULL && at->symbol->hasFlag(FLAG_TUPLE)) {
+    for_fields(field, at) {
+      Type* ft = field->type->getValType();
+      if (ft->symbol->hasFlag(f))
+        return true;
+    }
+  }
+
+  return false;
+}
+
+static bool isTupleContainingSyncType(Type* t) {
+  return isTupleContainingTypeWithFlag(t, FLAG_SYNC);
+}
+
+static bool isTupleContainingSingleType(Type* t) {
+  return isTupleContainingTypeWithFlag(t, FLAG_SINGLE);
+}
+
+static bool isTupleContainingAtomicType(Type* t) {
+  return isTupleContainingTypeWithFlag(t, FLAG_ATOMIC_TYPE);
+}
 
 IntentTag blankIntentForType(Type* t) {
   IntentTag retval = INTENT_BLANK;
```
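Stripped of compiler-specific types, the new helper is a flag scan over a tuple's fields. Here is a self-contained C++ approximation using a mock type representation (assumed purely for illustration; the real code walks `AggregateType` fields and `Symbol` flags):

```cpp
#include <cassert>
#include <set>
#include <string>
#include <vector>

// Mock model: a type carries a set of flag names; a tuple type also
// lists the types of its fields.
struct MockType {
  std::set<std::string> flags;
  std::vector<MockType> fields;  // populated only for tuple types
};

// Approximates isTupleContainingTypeWithFlag: true iff t is a tuple and
// some field's type carries flag f.
bool tupleContainsFlag(const MockType& t, const std::string& f) {
  if (t.flags.count("TUPLE") == 0) return false;
  for (const MockType& field : t.fields)
    if (field.flags.count(f) > 0)
      return true;
  return false;
}
```

The three thin wrappers in the diff then reduce to calls like `tupleContainsFlag(t, "SYNC")`.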
```diff
@@ -93,8 +118,14 @@ IntentTag blankIntentForType(Type* t) {
              t->symbol->hasFlag(FLAG_DEFAULT_INTENT_IS_REF)) {
     retval = INTENT_REF;
 
+  // Blank intent for sync/single/atomic types is ref, but we can't mark
+  // the entire tuple as INTENT_REF because that has a special meaning.
+  // So go ahead and mark the tuple as INTENT_REF_MAYBE_CONST instead.
   } else if (t->symbol->hasFlag(FLAG_DEFAULT_INTENT_IS_REF_MAYBE_CONST)
-             || isTupleContainingRefMaybeConst(t)) {
+             || isTupleContainingRefMaybeConst(t)
+             || isTupleContainingSyncType(t)
+             || isTupleContainingSingleType(t)
+             || isTupleContainingAtomicType(t)) {
     retval = INTENT_REF_MAYBE_CONST;
 
   } else if (isManagedPtrType(t)) {
```
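A much-simplified model of the resulting intent decision, viewing a tuple as a list of per-element blank intents (this collapses the real `IntentTag` logic considerably and the enum values here are illustrative):

```cpp
#include <cassert>
#include <vector>

// Illustrative intent model, not the compiler's IntentTag enum.
enum class Intent { Ref, RefMaybeConst, ConstRef };

// Mirrors the rule added above: if any element's blank intent is
// ref-like (sync/single/atomic) or ref-maybe-const (arrays), the whole
// tuple becomes RefMaybeConst rather than Ref, because INTENT_REF on a
// tuple carries a special meaning.
Intent blankIntentForTuple(const std::vector<Intent>& elemIntents) {
  for (Intent i : elemIntents)
    if (i == Intent::Ref || i == Intent::RefMaybeConst)
      return Intent::RefMaybeConst;
  return Intent::ConstRef;
}
```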

compiler/resolution/cullOverReferences.cpp

Lines changed: 75 additions & 16 deletions
```diff
@@ -67,9 +67,12 @@ static const int breakOnId2 = 0;
 static const int breakOnId3 = 0;
 
 #define DEBUG_SYMBOL(sym) \
-  if (sym->id == breakOnId1 || sym->id == breakOnId2 || sym->id == breakOnId3) { \
-    gdbShouldBreakHere(); \
-  }
+  do { \
+    if (sym->id == breakOnId1 || sym->id == breakOnId2 || \
+        sym->id == breakOnId3) { \
+      gdbShouldBreakHere(); \
+    } \
+  } while (0)
 
 static const int trace_all = 0;
 static const int trace_usr = 0;
```
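This change applies the standard `do { ... } while (0)` macro idiom: the expansion becomes a single statement that requires a trailing semicolon and composes safely with `if`/`else`. A generic demonstration of the idiom (a hypothetical counter macro, not the compiler's):

```cpp
#include <cassert>

// Without the do/while(0) wrapper, a braced multi-statement macro
// followed by ';' inside `if (...) MACRO(); else ...` either fails to
// compile or pairs the else with the wrong if. Wrapped, it is one
// statement and behaves like a function call.
#define BUMP_IF_POSITIVE(n, counter) \
  do {                               \
    if ((n) > 0) {                   \
      ++(counter);                   \
    }                                \
  } while (0)

int countPositives(const int* xs, int len) {
  int count = 0;
  for (int i = 0; i < len; i++)
    if (xs[i] != 0)
      BUMP_IF_POSITIVE(xs[i], count);  // legal even as a bare if-body
    else
      continue;
  return count;
}
```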
```diff
@@ -641,6 +644,8 @@ void CullRefCtx::propagateConstnessToMethodReceivers(void) {
     if (fn != call->resolvedFunction())
       continue;
 
+    DEBUG_SYMBOL(call);
+
     SymExpr* thisActual = toSymExpr(call->get(1));
     Symbol* actualSymbol = thisActual->symbol();
```

```diff
@@ -700,12 +705,12 @@ void CullRefCtx::collectTuplesAndRefMaybeConstArgs(void) {
     if (arg->defPoint->parentSymbol->hasFlag(FLAG_TUPLE_CAST_FN))
       continue;
 
-    // If the argument is a tuple with reference fields, explore it...
+    // If the formal is a tuple with reference fields, explore it...
     AggregateType* argAt = toAggregateType(arg->getValType());
     if (argAt && argAt->symbol->hasFlag(FLAG_TUPLE) &&
         containsReferenceFields(argAt)) {
 
-      AggregateType* tupleType = toAggregateType(arg->type);
+      AggregateType* tupleType = argAt;
       int fieldIndex = 1;
 
       // Collect reference or tuple fields for later.
```
```diff
@@ -876,6 +881,7 @@ bool CullRefCtx::checkLoopForDependencies(GraphNode node, LoopInfo &info,
 // the SymExpr in a ref field? If so, add the tuple as a dependency.
 bool CullRefCtx::checkBuildTupleCall(SymExpr* se, CallExpr* call,
                                      GraphNode node, bool &revisit) {
+  Symbol* sym = se->symbol();
 
   if (FnSymbol* calledFn = call->resolvedFunction()) {
     if (!calledFn->hasFlag(FLAG_BUILD_TUPLE))
```
```diff
@@ -912,6 +918,8 @@
 
       // If it is a ref field, collect it for later.
       if (tupleField->isRef()) {
+
+        DEBUG_SYMBOL(sym);
         DEBUG_SYMBOL(lhsSymbol);
 
         GraphNode srcNode = makeNode(lhsSymbol, j);
```
```diff
@@ -967,23 +975,39 @@
 // consider it const if the result of the tuple cast is const.
 bool CullRefCtx::checkTupleCastCall(SymExpr* se, CallExpr* call,
                                     GraphNode node, bool &revisit) {
+  Symbol* sym = se->symbol();
 
   if (FnSymbol* calledFn = call->resolvedFunction()) {
     if (!calledFn->hasFlag(FLAG_TUPLE_CAST_FN))
       return false;
 
     CallExpr* move = toCallExpr(call->parentExpr);
 
-    if (move && move->isPrimitive(PRIM_MOVE)) {
+    if (move != NULL && move->isPrimitive(PRIM_MOVE)) {
       SymExpr* lhs = toSymExpr(move->get(1));
       Symbol* lhsSymbol = lhs->symbol();
-      GraphNode srcNode = makeNode(lhsSymbol, node.fieldIndex);
 
-      collectedSymbols.push_back(srcNode);
-      addDependency(revisitGraph, srcNode, node);
-      revisit = true;
+      AggregateType* lhsAt = toAggregateType(lhsSymbol->type);
 
-      return true;
+      // AFAIK, only internal (tuple -> tuple) casts that adjust tuple ref
+      // levels will be labeled with the flag FLAG_TUPLE_CAST_FN.
+      INT_ASSERT(lhsAt != NULL);
+      INT_ASSERT(lhsAt->symbol->hasFlag(FLAG_TUPLE));
+
+      // Only add the LHS as a dependency if it contains reference fields.
+      // If not, then it's a cast to a value tuple and we don't care
+      // about it.
+      if (containsReferenceFields(lhsAt)) {
+        DEBUG_SYMBOL(sym);
+        DEBUG_SYMBOL(lhsSymbol);
+
+        GraphNode srcNode = makeNode(lhsSymbol, node.fieldIndex);
+        collectedSymbols.push_back(srcNode);
+        addDependency(revisitGraph, srcNode, node);
+        revisit = true;
+
+        return true;
+      }
     }
   }
 
```
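Conceptually, `cullOverReferences` builds a dependency graph over (symbol, field-index) nodes and revisits them until constness settles. The adjusted `checkTupleCastCall` adds an edge only when the cast result is a tuple with ref fields. A toy C++ model of that conditional edge insertion (all names here are illustrative, not the compiler's):

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Toy model of the revisit graph: a node is (symbol id, field index);
// an edge records "this node's constness depends on that node".
struct Node { int symId; int fieldIndex; };

bool containsRefFields(const std::vector<bool>& fieldIsRef) {
  for (bool r : fieldIsRef)
    if (r) return true;
  return false;
}

// Mirrors the adjusted logic: the cast result (LHS) becomes a dependency
// only when it is a tuple with ref fields; a cast to a value tuple
// cannot leak a mutable view of the source, so no edge is needed.
bool maybeAddCastDependency(const std::vector<bool>& lhsFieldIsRef,
                            Node lhs, Node src,
                            std::vector<std::pair<Node, Node>>& edges) {
  if (!containsRefFields(lhsFieldIsRef)) return false;
  edges.push_back({lhs, src});
  return true;
}
```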
```diff
@@ -1013,12 +1037,24 @@ bool CullRefCtx::checkAccessorLikeCall(SymExpr* se, CallExpr* call,
       Symbol* lhsSymbol = lhs->symbol();
 
       if (lhsSymbol->isRef() && lhsSymbol != sym) {
-        GraphNode srcNode = makeNode(lhsSymbol, node.fieldIndex);
-        collectedSymbols.push_back(srcNode);
-        addDependency(revisitGraph, srcNode, node);
-        revisit = true;
 
-        return true;
+        // Don't worry about tracking the destination if it's constant or
+        // marked with the flag FLAG_REF_TO_CONST.
+        // TODO: Also check against `isConstValWillNotChange`?
+        if (lhsSymbol->hasFlag(FLAG_REF_TO_CONST) ||
+            lhsSymbol->isConstant() ||
+            QualifiedType::qualifierIsConst(lhsSymbol->qual)) {
+          return false;
+        } else {
+          GraphNode srcNode = makeNode(lhsSymbol, node.fieldIndex);
+          collectedSymbols.push_back(srcNode);
+          addDependency(revisitGraph, srcNode, node);
+          revisit = true;
+
+          DEBUG_SYMBOL(call);
+
+          return true;
+        }
       }
     }
   }
```
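The new branch is an early filter: a destination already known to be const is skipped, since any misuse of it was reported by an earlier pass and tracking it would only add noise to the dependency graph. A small C++ sketch of the filtering predicate, with an illustrative symbol struct standing in for the compiler's `Symbol`:

```cpp
#include <cassert>
#include <set>
#include <string>

// Illustrative stand-in for the symbol tests used in the pass.
struct Sym {
  bool isConst = false;
  std::set<std::string> flags;
};

// Mirrors the filter added to checkAccessorLikeCall: a const destination
// (or one flagged REF_TO_CONST) is not tracked as a dependency.
bool shouldTrackDestination(const Sym& lhs) {
  if (lhs.flags.count("REF_TO_CONST") > 0) return false;
  if (lhs.isConst) return false;
  return true;
}
```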
```diff
@@ -1031,6 +1067,7 @@
 // function.
 bool CullRefCtx::checkRefMaybeConstFormal(SymExpr* se, CallExpr* call,
                                           GraphNode node, bool &revisit) {
+  Symbol* sym = node.variable;
 
   if (FnSymbol* calledFn = call->resolvedFunction()) {
     if (calledFn->hasFlag(FLAG_BUILD_TUPLE))
```
```diff
@@ -1042,6 +1079,25 @@ bool CullRefCtx::checkRefMaybeConstFormal(SymExpr* se, CallExpr* call,
     if (formal->intent != INTENT_REF_MAYBE_CONST)
       return false;
 
+    // @dlongnecke-cray:
+    //
+    // If formal is receiver marked INTENT_REF_MAYBE_CONST (e.g. array or
+    // tuple containing array), and the function to be called is marked
+    // FLAG_REF_TO_CONST_WHEN_CONST_THIS, go ahead and leave if the actual
+    // is constant. Otherwise we can create a situation where the actual
+    // receiver depends on the formal when being marked const. If the
+    // function returns a "non-const" access of the formal (e.g. via a
+    // this call), the compiler views that as an escaping reference and
+    // will transitively mark the actual receiver as non-const (when we
+    // know that it _is_ const). Things get even weirder when the function
+    // being called is an iterator.
+    if (calledFn->hasFlag(FLAG_REF_TO_CONST_WHEN_CONST_THIS) &&
+        formal->hasFlag(FLAG_ARG_THIS)) {
+      if (se->qualType().isConst()) {
+        return false;
+      }
+    }
+
     // No need to add formal to `collectedSymbols` because we already did
     // so in `collectRefMaybeConstArgs`.
```
```diff
@@ -1052,6 +1108,9 @@
     addDependency(revisitGraph, srcNode, node);
     revisit = true;
 
+    DEBUG_SYMBOL(formal);
+    DEBUG_SYMBOL(sym);
+
     return true;
   }
```