Fix some minor issues in the DFG's profiling of heap accesses
https://bugs.webkit.org/show_bug.cgi?id=113010

Reviewed by Geoffrey Garen.

1) If a CodeBlock gets jettisoned by GC, we should count the exit sites.

2) If a CodeBlock clears a structure stub during GC, it should record this, and
the DFG should prefer to not inline that access (i.e. treat it as if it had an
exit site).

3) If a PutById was seen by the baseline JIT, and the JIT attempted to cache it,
but it chose not to, then assume that it will take slow path.

4) If we frequently exited because of a structure check on a weak constant,
don't try to inline that access in the future.

5) Treat all exits that were counted as being frequent.

81% speed-up on Octane/gbemu. Small speed-ups elsewhere, and no regressions.
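Concretely, the new bookkeeping amounts to: the baseline stub remembers whether GC blew it away, and the DFG consults both that bit and the structure-check exit kinds before inlining a heap access. Below is a minimal, self-contained sketch of that policy using simplified stand-in types (StubInfo, ExitProfile, and shouldInlineHeapAccess are illustrative names, not the JSC classes that appear in the diff):

    // Illustrative model only: simplified stand-ins for JSC's StructureStubInfo,
    // the exit profile, and the DFG parser's inlining decision.
    enum ExitKind { BadCache, BadWeakConstantCache };

    struct StubInfo {
        bool seen : 1;       // the baseline JIT saw this access at least once
        bool resetByGC : 1;  // GC cleared the cached structure(s) out of the stub
        StubInfo() : seen(false), resetByGC(false) { }
    };

    struct ExitProfile {
        unsigned badCacheExits = 0;
        unsigned badWeakConstantCacheExits = 0;
        bool hasExitSite(ExitKind kind) const
        {
            return (kind == BadCache ? badCacheExits : badWeakConstantCacheExits) > 0;
        }
    };

    // Mirrors the spirit of handleGetById()/parseBlock() below: only inline the
    // heap access if the stub is intact and no structure-check exit (of either
    // kind) has been recorded against this bytecode.
    bool shouldInlineHeapAccess(const StubInfo& stub, const ExitProfile& exits)
    {
        if (!stub.seen || stub.resetByGC)
            return false; // treat as taking the slow path
        if (exits.hasExitSite(BadCache) || exits.hasExitSite(BadWeakConstantCache))
            return false; // we already exited here; don't re-inline the access
        return true;
    }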

* bytecode/CodeBlock.cpp:
(JSC::CodeBlock::finalizeUnconditionally):
(JSC):
(JSC::CodeBlock::resetStubDuringGCInternal):
(JSC::CodeBlock::reoptimize):
(JSC::CodeBlock::jettison):
(JSC::ProgramCodeBlock::jettisonImpl):
(JSC::EvalCodeBlock::jettisonImpl):
(JSC::FunctionCodeBlock::jettisonImpl):
(JSC::CodeBlock::tallyFrequentExitSites):
* bytecode/CodeBlock.h:
(CodeBlock):
(JSC::CodeBlock::tallyFrequentExitSites):
(ProgramCodeBlock):
(EvalCodeBlock):
(FunctionCodeBlock):
* bytecode/GetByIdStatus.cpp:
(JSC::GetByIdStatus::computeFor):
* bytecode/PutByIdStatus.cpp:
(JSC::PutByIdStatus::computeFor):
* bytecode/StructureStubInfo.h:
(JSC::StructureStubInfo::StructureStubInfo):
(StructureStubInfo):
* dfg/DFGByteCodeParser.cpp:
(JSC::DFG::ByteCodeParser::handleGetById):
(JSC::DFG::ByteCodeParser::parseBlock):
* dfg/DFGOSRExit.cpp:
(JSC::DFG::OSRExit::considerAddingAsFrequentExitSiteSlow):
* dfg/DFGOSRExit.h:
(JSC::DFG::OSRExit::considerAddingAsFrequentExitSite):
(OSRExit):
* jit/JITStubs.cpp:
(JSC::DEFINE_STUB_FUNCTION):
* runtime/Options.h:
(JSC):



Canonical link: https://commits.webkit.org/131439@main
git-svn-id: https://svn.webkit.org/repository/webkit/trunk@146669 268f45cc-cd09-0410-ab3c-d52691b4dbfc
pizlonator committed Mar 22, 2013
1 parent 994684e commit 4bbc20944b754cebf8002673f9d669dc62a41746
Showing 11 changed files with 113 additions and 48 deletions.
@@ -1,3 +1,62 @@
2013-03-21 Filip Pizlo <fpizlo@apple.com>

Fix some minor issues in the DFG's profiling of heap accesses
https://bugs.webkit.org/show_bug.cgi?id=113010

Reviewed by Geoffrey Garen.

1) If a CodeBlock gets jettisoned by GC, we should count the exit sites.

2) If a CodeBlock clears a structure stub during GC, it should record this, and
the DFG should prefer to not inline that access (i.e. treat it as if it had an
exit site).

3) If a PutById was seen by the baseline JIT, and the JIT attempted to cache it,
but it chose not to, then assume that it will take slow path.

4) If we frequently exited because of a structure check on a weak constant,
don't try to inline that access in the future.

5) Treat all exits that were counted as being frequent.

81% speed-up on Octane/gbemu. Small speed-ups elsewhere, and no regressions.

* bytecode/CodeBlock.cpp:
(JSC::CodeBlock::finalizeUnconditionally):
(JSC):
(JSC::CodeBlock::resetStubDuringGCInternal):
(JSC::CodeBlock::reoptimize):
(JSC::CodeBlock::jettison):
(JSC::ProgramCodeBlock::jettisonImpl):
(JSC::EvalCodeBlock::jettisonImpl):
(JSC::FunctionCodeBlock::jettisonImpl):
(JSC::CodeBlock::tallyFrequentExitSites):
* bytecode/CodeBlock.h:
(CodeBlock):
(JSC::CodeBlock::tallyFrequentExitSites):
(ProgramCodeBlock):
(EvalCodeBlock):
(FunctionCodeBlock):
* bytecode/GetByIdStatus.cpp:
(JSC::GetByIdStatus::computeFor):
* bytecode/PutByIdStatus.cpp:
(JSC::PutByIdStatus::computeFor):
* bytecode/StructureStubInfo.h:
(JSC::StructureStubInfo::StructureStubInfo):
(StructureStubInfo):
* dfg/DFGByteCodeParser.cpp:
(JSC::DFG::ByteCodeParser::handleGetById):
(JSC::DFG::ByteCodeParser::parseBlock):
* dfg/DFGOSRExit.cpp:
(JSC::DFG::OSRExit::considerAddingAsFrequentExitSiteSlow):
* dfg/DFGOSRExit.h:
(JSC::DFG::OSRExit::considerAddingAsFrequentExitSite):
(OSRExit):
* jit/JITStubs.cpp:
(JSC::DEFINE_STUB_FUNCTION):
* runtime/Options.h:
(JSC):

2013-03-22 Filip Pizlo <fpizlo@apple.com>

DFG folding of PutById to SimpleReplace should consider the specialized function case
@@ -2338,10 +2338,6 @@ void CodeBlock::finalizeUnconditionally()
if (verboseUnlinking)
dataLog(*this, " has dead weak references, jettisoning during GC.\n");

// Make sure that the baseline JIT knows that it should re-warm-up before
// optimizing.
alternative()->optimizeAfterWarmUp();

if (DFG::shouldShowDisassembly()) {
dataLog(*this, " will be jettisoned because of the following dead references:\n");
for (unsigned i = 0; i < m_dfgData->transitions.size(); ++i) {
@@ -2427,7 +2423,7 @@ void CodeBlock::finalizeUnconditionally()
if (stubInfo.visitWeakReferences())
continue;

resetStubInternal(repatchBuffer, stubInfo);
resetStubDuringGCInternal(repatchBuffer, stubInfo);
}
}
#endif
@@ -2465,6 +2461,12 @@ void CodeBlock::resetStubInternal(RepatchBuffer& repatchBuffer, StructureStubInf

stubInfo.reset();
}

void CodeBlock::resetStubDuringGCInternal(RepatchBuffer& repatchBuffer, StructureStubInfo& stubInfo)
{
resetStubInternal(repatchBuffer, stubInfo);
stubInfo.resetByGC = true;
}
#endif

void CodeBlock::stronglyVisitStrongReferences(SlotVisitor& visitor)
@@ -2842,12 +2844,10 @@ void CodeBlock::reoptimize()
{
ASSERT(replacement() != this);
ASSERT(replacement()->alternative() == this);
replacement()->tallyFrequentExitSites();
if (DFG::shouldShowDisassembly())
dataLog(*replacement(), " will be jettisoned due to reoptimization of ", *this, ".\n");
replacement()->jettison();
countReoptimization();
optimizeAfterWarmUp();
}

CodeBlock* ProgramCodeBlock::replacement()
@@ -2906,30 +2906,29 @@ DFG::CapabilityLevel FunctionCodeBlock::canCompileWithDFGInternal()
return DFG::canCompileFunctionForCall(this);
}

void ProgramCodeBlock::jettison()
void CodeBlock::jettison()
{
ASSERT(JITCode::isOptimizingJIT(getJITType()));
ASSERT(this == replacement());
alternative()->optimizeAfterWarmUp();
tallyFrequentExitSites();
if (DFG::shouldShowDisassembly())
dataLog("Jettisoning ", *this, ".\n");
jettisonImpl();
}

void ProgramCodeBlock::jettisonImpl()
{
static_cast<ProgramExecutable*>(ownerExecutable())->jettisonOptimizedCode(*globalData());
}

void EvalCodeBlock::jettison()
void EvalCodeBlock::jettisonImpl()
{
ASSERT(JITCode::isOptimizingJIT(getJITType()));
ASSERT(this == replacement());
if (DFG::shouldShowDisassembly())
dataLog("Jettisoning ", *this, ".\n");
static_cast<EvalExecutable*>(ownerExecutable())->jettisonOptimizedCode(*globalData());
}

void FunctionCodeBlock::jettison()
void FunctionCodeBlock::jettisonImpl()
{
ASSERT(JITCode::isOptimizingJIT(getJITType()));
ASSERT(this == replacement());
if (DFG::shouldShowDisassembly())
dataLog("Jettisoning ", *this, ".\n");
static_cast<FunctionExecutable*>(ownerExecutable())->jettisonOptimizedCodeFor(*globalData(), m_isConstructor ? CodeForConstruct : CodeForCall);
}
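
The jettison() changes in this hunk are a template-method refactoring rather than a behavioral change on their own: the shared bookkeeping (re-warming the baseline alternative and tallying frequent exit sites, per points 1 and 5 above) now lives in CodeBlock::jettison(), and each CodeBlock kind only supplies its unlinking step. A rough outline of the resulting shape (simplified; the real declarations are in CodeBlock.h):

    // Outline only: the non-virtual jettison() does the shared work, then
    // defers to a per-kind virtual hook that unlinks the optimized code.
    class CodeBlock {
    public:
        void jettison();                 // optimizeAfterWarmUp + tallyFrequentExitSites + jettisonImpl()
    protected:
        virtual void jettisonImpl() = 0; // ProgramCodeBlock / EvalCodeBlock / FunctionCodeBlock
    };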

@@ -3272,7 +3271,7 @@ void CodeBlock::tallyFrequentExitSites()
for (unsigned i = 0; i < m_dfgData->osrExit.size(); ++i) {
DFG::OSRExit& exit = m_dfgData->osrExit[i];

if (!exit.considerAddingAsFrequentExitSite(this, profiledBlock))
if (!exit.considerAddingAsFrequentExitSite(profiledBlock))
continue;

#if DFG_ENABLE(DEBUG_VERBOSE)
@@ -437,7 +437,7 @@ namespace JSC {
JITCode::JITType getJITType() const { return m_jitCode.jitType(); }
ExecutableMemoryHandle* executableMemory() { return getJITCode().getExecutableMemory(); }
virtual JSObject* compileOptimized(ExecState*, JSScope*, unsigned bytecodeIndex) = 0;
virtual void jettison() = 0;
void jettison();
enum JITCompilationResult { AlreadyCompiled, CouldNotCompile, CompiledSuccessfully };
JITCompilationResult jitCompile(ExecState* exec)
{
@@ -1051,10 +1051,17 @@ namespace JSC {
protected:
#if ENABLE(JIT)
virtual bool jitCompileImpl(ExecState*) = 0;
virtual void jettisonImpl() = 0;
#endif
virtual void visitWeakReferences(SlotVisitor&);
virtual void finalizeUnconditionally();

#if ENABLE(DFG_JIT)
void tallyFrequentExitSites();
#else
void tallyFrequentExitSites() { }
#endif

private:
friend class DFGCodeBlocks;

@@ -1064,11 +1071,6 @@ namespace JSC {
ClosureCallStubRoutine* findClosureCallForReturnPC(ReturnAddressPtr);
#endif

#if ENABLE(DFG_JIT)
void tallyFrequentExitSites();
#else
void tallyFrequentExitSites() { }
#endif
#if ENABLE(VALUE_PROFILER)
void updateAllPredictionsAndCountLiveness(OperationInProgress, unsigned& numberOfLiveNonArgumentValueProfiles, unsigned& numberOfSamplesInProfiles);
#endif
@@ -1145,6 +1147,7 @@ namespace JSC {

#if ENABLE(JIT)
void resetStubInternal(RepatchBuffer&, StructureStubInfo&);
void resetStubDuringGCInternal(RepatchBuffer&, StructureStubInfo&);
#endif
WriteBarrier<UnlinkedCodeBlock> m_unlinkedCode;
int m_numParameters;
@@ -1320,7 +1323,7 @@ namespace JSC {
#if ENABLE(JIT)
protected:
virtual JSObject* compileOptimized(ExecState*, JSScope*, unsigned bytecodeIndex);
virtual void jettison();
virtual void jettisonImpl();
virtual bool jitCompileImpl(ExecState*);
virtual CodeBlock* replacement();
virtual DFG::CapabilityLevel canCompileWithDFGInternal();
@@ -1345,7 +1348,7 @@ namespace JSC {
#if ENABLE(JIT)
protected:
virtual JSObject* compileOptimized(ExecState*, JSScope*, unsigned bytecodeIndex);
virtual void jettison();
virtual void jettisonImpl();
virtual bool jitCompileImpl(ExecState*);
virtual CodeBlock* replacement();
virtual DFG::CapabilityLevel canCompileWithDFGInternal();
@@ -1370,7 +1373,7 @@ namespace JSC {
#if ENABLE(JIT)
protected:
virtual JSObject* compileOptimized(ExecState*, JSScope*, unsigned bytecodeIndex);
virtual void jettison();
virtual void jettisonImpl();
virtual bool jitCompileImpl(ExecState*);
virtual CodeBlock* replacement();
virtual DFG::CapabilityLevel canCompileWithDFGInternal();
@@ -125,6 +125,9 @@ GetByIdStatus GetByIdStatus::computeFor(CodeBlock* profiledBlock, unsigned bytec
if (!stubInfo.seen)
return computeFromLLInt(profiledBlock, bytecodeIndex, ident);

if (stubInfo.resetByGC)
return GetByIdStatus(TakesSlowPath, true);

PolymorphicAccessStructureList* list;
int listSize;
switch (stubInfo.accessType) {
@@ -94,9 +94,13 @@ PutByIdStatus PutByIdStatus::computeFor(CodeBlock* profiledBlock, unsigned bytec
if (!stubInfo.seen)
return computeFromLLInt(profiledBlock, bytecodeIndex, ident);

if (stubInfo.resetByGC)
return PutByIdStatus(TakesSlowPath, 0, 0, 0, invalidOffset);

switch (stubInfo.accessType) {
case access_unset:
return computeFromLLInt(profiledBlock, bytecodeIndex, ident);
// If the JIT saw it but didn't optimize it, then assume that this takes slow path.
return PutByIdStatus(TakesSlowPath, 0, 0, 0, invalidOffset);

case access_put_by_id_replace: {
PropertyOffset offset = stubInfo.u.putByIdReplace.baseObjectStructure->get(
@@ -97,6 +97,7 @@ struct StructureStubInfo {
StructureStubInfo()
: accessType(access_unset)
, seen(false)
, resetByGC(false)
{
}

@@ -200,7 +201,8 @@ struct StructureStubInfo {
unsigned bytecodeIndex;

int8_t accessType;
int8_t seen;
bool seen : 1;
bool resetByGC : 1;

#if ENABLE(DFG_JIT)
CodeOrigin codeOrigin;
@@ -1708,7 +1708,8 @@ void ByteCodeParser::handleGetById(
const GetByIdStatus& getByIdStatus)
{
if (!getByIdStatus.isSimple()
|| m_inlineStackTop->m_exitProfile.hasExitSite(m_currentIndex, BadCache)) {
|| m_inlineStackTop->m_exitProfile.hasExitSite(m_currentIndex, BadCache)
|| m_inlineStackTop->m_exitProfile.hasExitSite(m_currentIndex, BadWeakConstantCache)) {
set(destinationOperand,
addToGraph(
getByIdStatus.makesCalls() ? GetByIdFlush : GetById,
@@ -2608,7 +2609,9 @@ bool ByteCodeParser::parseBlock(unsigned limit)
canCountAsInlined = false;
}

bool hasExitSite = m_inlineStackTop->m_exitProfile.hasExitSite(m_currentIndex, BadCache);
bool hasExitSite =
m_inlineStackTop->m_exitProfile.hasExitSite(m_currentIndex, BadCache)
|| m_inlineStackTop->m_exitProfile.hasExitSite(m_currentIndex, BadWeakConstantCache);

if (!hasExitSite && putByIdStatus.isSimpleReplace()) {
addToGraph(CheckStructure, OpInfo(m_graph.addStructureSet(putByIdStatus.oldStructure())), base);
@@ -72,11 +72,8 @@ void OSRExit::correctJump(LinkBuffer& linkBuffer)
m_patchableCodeOffset = linkBuffer.offsetOf(label);
}

bool OSRExit::considerAddingAsFrequentExitSiteSlow(CodeBlock* dfgCodeBlock, CodeBlock* profiledCodeBlock)
bool OSRExit::considerAddingAsFrequentExitSiteSlow(CodeBlock* profiledCodeBlock)
{
if (static_cast<double>(m_count) / dfgCodeBlock->osrExitCounter() <= Options::osrExitProminenceForFrequentExitSite())
return false;

FrequentExitSite exitSite;

if (m_kind == ArgumentsEscaped) {
@@ -100,11 +100,11 @@ struct OSRExit {
ExitKind m_kind;
uint32_t m_count;

bool considerAddingAsFrequentExitSite(CodeBlock* dfgCodeBlock, CodeBlock* profiledCodeBlock)
bool considerAddingAsFrequentExitSite(CodeBlock* profiledCodeBlock)
{
if (!m_count || !exitKindIsCountable(m_kind))
return false;
return considerAddingAsFrequentExitSiteSlow(dfgCodeBlock, profiledCodeBlock);
return considerAddingAsFrequentExitSiteSlow(profiledCodeBlock);
}

void setPatchableCodeOffset(MacroAssembler::PatchableJump);
@@ -118,7 +118,7 @@ struct OSRExit {
RefPtr<ValueRecoveryOverride> m_valueRecoveryOverride;

private:
bool considerAddingAsFrequentExitSiteSlow(CodeBlock* dfgCodeBlock, CodeBlock* profiledCodeBlock);
bool considerAddingAsFrequentExitSiteSlow(CodeBlock* profiledCodeBlock);
};

struct SpeculationFailureDebugInfo {
@@ -1456,10 +1456,8 @@ DEFINE_STUB_FUNCTION(void, op_put_by_id)
stackFrame.args[0].jsValue().put(callFrame, ident, stackFrame.args[2].jsValue(), slot);
if (accessType == static_cast<AccessType>(stubInfo->accessType)) {
if (!stubInfo->seenOnce())
stubInfo->setSeen();
else
tryCachePutByID(callFrame, codeBlock, STUB_RETURN_ADDRESS, stackFrame.args[0].jsValue(), slot, stubInfo, false);
stubInfo->setSeen();
tryCachePutByID(callFrame, codeBlock, STUB_RETURN_ADDRESS, stackFrame.args[0].jsValue(), slot, stubInfo, false);
}
CHECK_FOR_EXCEPTION_AT_END();
@@ -1482,10 +1480,8 @@ DEFINE_STUB_FUNCTION(void, op_put_by_id_direct)
asObject(baseValue)->putDirect(callFrame->globalData(), ident, stackFrame.args[2].jsValue(), slot);
if (accessType == static_cast<AccessType>(stubInfo->accessType)) {
if (!stubInfo->seenOnce())
stubInfo->setSeen();
else
tryCachePutByID(callFrame, codeBlock, STUB_RETURN_ADDRESS, stackFrame.args[0].jsValue(), slot, stubInfo, true);
stubInfo->setSeen();
tryCachePutByID(callFrame, codeBlock, STUB_RETURN_ADDRESS, stackFrame.args[0].jsValue(), slot, stubInfo, true);
}
CHECK_FOR_EXCEPTION_AT_END();
@@ -111,7 +111,6 @@ namespace JSC {
v(unsigned, likelyToTakeSlowCaseMinimumCount, 100) \
v(unsigned, couldTakeSlowCaseMinimumCount, 10) \
\
v(double, osrExitProminenceForFrequentExitSite, 0.3) \
v(unsigned, osrExitCountForReoptimization, 100) \
v(unsigned, osrExitCountForReoptimizationFromLoop, 5) \
\
