author    Koundinya Veluri <kouvel@users.noreply.github.com>  2019-01-11 18:02:10 -0800
committer GitHub <noreply@github.com>  2019-01-11 18:02:10 -0800
commit    37b9d85941c39cfdce2a2ea877388ab1ab630c68
tree      004009a0f73f752ecd338a7460473e861609db21 /tests
parent    834f8d9bd3ee5f0095c91e334ed4565a1a740fee
Patch vtable slots and similar when tiering is enabled (#21292)
For a method eligible for code versioning and vtable slot backpatch:
- It does not have a precode (`HasPrecode()` returns false)
- It does not have a stable entry point (`HasStableEntryPoint()` returns false)
- A call to the method may be:
- An indirect call through the `MethodTable`'s backpatchable vtable slot
- A direct call to a backpatchable `FuncPtrStub`, perhaps through a `JumpStub`
- For interface methods, an indirect call through the virtual stub dispatch (VSD) indirection cell to a backpatchable `DispatchStub` or a `ResolveStub` that refers to a backpatchable `ResolveCacheEntry`
- The purpose is that typical calls to the method incur no additional overhead when code versioning is enabled
Recording and backpatching slots:
- In order for all vtable slots for the method to be backpatchable:
- A vtable slot initially points to the `MethodDesc`'s temporary entry point, even when the method is inherited by a derived type (the slot's value is not copied from the parent)
- The temporary entry point always points to the prestub and is never backpatched, in order to be able to discover new vtable slots through which the method may be called
- The prestub, as part of `DoBackpatch()`, records any slots that are transitioned from the temporary entry point to the method's at-the-time current, non-prestub entry point
- Any further changes to the method's entry point cause recorded slots to be backpatched in `BackpatchEntryPointSlots()`
- In order for the `FuncPtrStub` to be backpatchable:
- After the `FuncPtrStub` is created and exposed, it is patched to point to the method's at-the-time current entry point if necessary
- Any further changes to the method's entry point cause the `FuncPtrStub` to be backpatched in `BackpatchEntryPointSlots()`
- In order for VSD entities to be backpatchable:
- A `DispatchStub`'s entry point target is aligned and recorded for backpatching in `BackpatchEntryPointSlots()`
- The `DispatchStub` was modified on x86 and x64 such that the entry point target is aligned to a pointer to make it backpatchable
- A `ResolveCacheEntry`'s entry point target is recorded for backpatching in `BackpatchEntryPointSlots()`
Slot lifetime and management of recorded slots:
- A slot is recorded in the `LoaderAllocator` in which the slot is allocated, see `RecordAndBackpatchEntryPointSlot()`
- An inherited slot that has a shorter lifetime than the `MethodDesc`, when recorded, needs to be accessible by the `MethodDesc` for backpatching, so the dependent `LoaderAllocator` with the slot to backpatch is also recorded in the `MethodDesc`'s `LoaderAllocator`, see `MethodDescBackpatchInfo::AddDependentLoaderAllocator_Locked()`
- At the end of a `LoaderAllocator`'s lifetime, the `LoaderAllocator` is unregistered from dependency `LoaderAllocators`, see `MethodDescBackpatchInfoTracker::ClearDependencyMethodDescEntryPointSlots()`
- When a `MethodDesc`'s entry point changes, backpatching also includes iterating over recorded dependent `LoaderAllocators` to backpatch the relevant slots recorded there, see `BackpatchEntryPointSlots()`
Synchronization between entry point changes and backpatching slots:
- A global lock is used to ensure that all recorded backpatchable slots corresponding to a `MethodDesc` point to the same entry point, see `DoBackpatch()` and `BackpatchEntryPointSlots()` for examples
Due to startup time perf issues:
- `IsEligibleForTieredCompilation()` is called more frequently with this change and in hotter paths. I chose to use a `MethodDesc` flag to store that information for fast retrieval. The flag is initialized by `DetermineAndSetIsEligibleForTieredCompilation()`.
- Initially, I experimented with allowing a method versionable with vtable slot backpatch to have a precode, and allocated a new precode that would also serve as the stable entry point when a direct call is necessary. That would also make recording a new slot optional: in the event of an OOM, the slot could just point to the stable entry point. However, there are a large number of such methods and the allocations were slowing down startup. So, I had to eliminate precodes for methods versionable with vtable slot backpatch, which in turn means that recording slots is necessary for versionability.
Diffstat (limited to 'tests')
-rw-r--r-- | tests/src/baseservices/TieredCompilation/TieredVtableMethodTests.cs | 274
-rw-r--r-- | tests/src/baseservices/TieredCompilation/TieredVtableMethodTests.csproj | 22
2 files changed, 296 insertions, 0 deletions
diff --git a/tests/src/baseservices/TieredCompilation/TieredVtableMethodTests.cs b/tests/src/baseservices/TieredCompilation/TieredVtableMethodTests.cs
new file mode 100644
index 0000000000..039f68399c
--- /dev/null
+++ b/tests/src/baseservices/TieredCompilation/TieredVtableMethodTests.cs
@@ -0,0 +1,274 @@
+// Licensed to the .NET Foundation under one or more agreements.
+// The .NET Foundation licenses this file to you under the MIT license.
+// See the LICENSE file in the project root for more information.
+
+using System;
+using System.Reflection;
+using System.Reflection.Emit;
+using System.Runtime.CompilerServices;
+using System.Text;
+using System.Threading;
+
+public static class TieredVtableMethodTests
+{
+    private const int CallCountPerIteration = 8;
+
+    private static StringBuilder s_expectedCallSequence = new StringBuilder();
+    private static StringBuilder s_actualCallSequence = new StringBuilder();
+
+    private static int Main()
+    {
+        const int Pass = 100, Fail = 101;
+
+        var baseObj = new Base();
+        var derivedObj = new Derived();
+        var derivedForDevirtualizationObj = new DerivedForDevirtualization();
+
+        PromoteToTier1(
+            () => CallVirtualMethod(baseObj),
+            () => CallVirtualMethod(derivedObj),
+            () => CallGenericVirtualMethodWithValueType(baseObj),
+            () => CallGenericVirtualMethodWithValueType(derivedObj),
+            () => CallGenericVirtualMethodWithReferenceType(baseObj),
+            () => CallGenericVirtualMethodWithReferenceType(derivedObj),
+            () => CallVirtualMethodForDevirtualization(derivedForDevirtualizationObj),
+            () => CallInterfaceVirtualMethodPolymorhpic(baseObj),
+            () => CallInterfaceVirtualMethodPolymorhpic(derivedObj));
+
+        for (int i = 0; i < 4; ++i)
+        {
+            CallVirtualMethod(baseObj, CallCountPerIteration);
+            CallVirtualMethod(derivedObj, CallCountPerIteration);
+            CallGenericVirtualMethodWithValueType(baseObj, CallCountPerIteration);
+            CallGenericVirtualMethodWithValueType(derivedObj, CallCountPerIteration);
+            CallGenericVirtualMethodWithReferenceType(baseObj, CallCountPerIteration);
+            CallGenericVirtualMethodWithReferenceType(derivedObj, CallCountPerIteration);
+            CallVirtualMethodForDevirtualization(derivedForDevirtualizationObj, CallCountPerIteration);
+            CallInterfaceVirtualMethodMonomorphicOnBase(baseObj, CallCountPerIteration);
+            CallInterfaceVirtualMethodMonomorphicOnDerived(derivedObj, CallCountPerIteration);
+            CallInterfaceVirtualMethodPolymorhpic(baseObj, CallCountPerIteration);
+            CallInterfaceVirtualMethodPolymorhpic(derivedObj, CallCountPerIteration);
+
+            for (int j = 0; j < 2; ++j)
+            {
+                RunCollectibleIterations();
+
+                GC.Collect();
+                GC.WaitForPendingFinalizers();
+                GC.WaitForPendingFinalizers();
+            }
+        }
+
+        if (s_actualCallSequence.Equals(s_expectedCallSequence))
+        {
+            return Pass;
+        }
+
+        Console.WriteLine($"Expected: {s_expectedCallSequence}");
+        Console.WriteLine($"Actual: {s_actualCallSequence}");
+        return Fail;
+    }
+
+    /// Creates a collectible type deriving from <see cref="Base"/> similar to <see cref="Derived"/>. The collectible derived
+    /// type inherits vtable slots from the base. After multiple iterations of the test, the collectible type will be collected
+    /// and replaced with another new collectible type. This is used to cover vtable slot backpatching and cleanup of recorded
+    /// slots in collectible types.
+    [MethodImpl(MethodImplOptions.NoInlining)]
+    private static void RunCollectibleIterations()
+    {
+        Base collectibleDerivedObj = CreateCollectibleDerived();
+
+        PromoteToTier1(
+            () => CallVirtualMethod(collectibleDerivedObj),
+            () => CallGenericVirtualMethodWithValueType(collectibleDerivedObj),
+            () => CallGenericVirtualMethodWithReferenceType(collectibleDerivedObj),
+            () => CallInterfaceVirtualMethodPolymorhpic(collectibleDerivedObj));
+
+        CallVirtualMethod(collectibleDerivedObj, CallCountPerIteration);
+        CallGenericVirtualMethodWithValueType(collectibleDerivedObj, CallCountPerIteration);
+        CallGenericVirtualMethodWithReferenceType(collectibleDerivedObj, CallCountPerIteration);
+        CallInterfaceVirtualMethodPolymorhpic(collectibleDerivedObj, CallCountPerIteration);
+    }
+
+    public interface IBase
+    {
+        void InterfaceVirtualMethod();
+    }
+
+    public class Base : IBase
+    {
+        [MethodImpl(MethodImplOptions.NoInlining)]
+        public virtual void VirtualMethod()
+        {
+            s_actualCallSequence.Append("v ");
+        }
+
+        [MethodImpl(MethodImplOptions.NoInlining)]
+        public virtual void GenericVirtualMethod<T>(T t)
+        {
+            s_actualCallSequence.Append(typeof(T).IsValueType ? "gvv " : "gvr ");
+        }
+
+        [MethodImpl(MethodImplOptions.NoInlining)]
+        public virtual void VirtualMethodForDevirtualization()
+        {
+            s_actualCallSequence.Append("vd ");
+        }
+
+        [MethodImpl(MethodImplOptions.NoInlining)]
+        public virtual void InterfaceVirtualMethod()
+        {
+            s_actualCallSequence.Append("iv ");
+        }
+    }
+
+    private class Derived : Base
+    {
+        // Prevent this type from sharing the vtable chunk from the base
+        public virtual void VirtualMethod2()
+        {
+        }
+    }
+
+    // Derived type that is sealed for testing devirtualization of calls to inherited virtual methods
+    private sealed class DerivedForDevirtualization : Derived
+    {
+        // Prevent this type from sharing the vtable chunk from the base
+        public override void VirtualMethod()
+        {
+        }
+    }
+
+    [MethodImpl(MethodImplOptions.NoInlining)]
+    private static void CallVirtualMethod(Base obj, int count = 1)
+    {
+        for (int i = 0; i < count; ++i)
+        {
+            s_expectedCallSequence.Append("v ");
+            obj.VirtualMethod();
+        }
+    }
+
+    [MethodImpl(MethodImplOptions.NoInlining)]
+    private static void CallGenericVirtualMethodWithValueType(Base obj, int count = 1)
+    {
+        for (int i = 0; i < count; ++i)
+        {
+            s_expectedCallSequence.Append("gvv ");
+            obj.GenericVirtualMethod(0);
+        }
+    }
+
+    [MethodImpl(MethodImplOptions.NoInlining)]
+    private static void CallGenericVirtualMethodWithReferenceType(Base obj, int count = 1)
+    {
+        var objArg = new object();
+        for (int i = 0; i < count; ++i)
+        {
+            s_expectedCallSequence.Append("gvr ");
+            obj.GenericVirtualMethod(objArg);
+        }
+    }
+
+    /// The virtual call in this method may be devirtualized because <see cref="DerivedForDevirtualization"/> is sealed
+    [MethodImpl(MethodImplOptions.NoInlining)]
+    private static void CallVirtualMethodForDevirtualization(DerivedForDevirtualization obj, int count = 1)
+    {
+        for (int i = 0; i < count; ++i)
+        {
+            s_expectedCallSequence.Append("vd ");
+            obj.VirtualMethodForDevirtualization();
+        }
+    }
+
+    /// The interface call site in this method is monomorphic on <see cref="Base"/> and is used to cover dispatch stub
+    /// backpatching
+    [MethodImpl(MethodImplOptions.NoInlining)]
+    private static void CallInterfaceVirtualMethodMonomorphicOnBase(IBase obj, int count = 1)
+    {
+        for (int i = 0; i < count; ++i)
+        {
+            s_expectedCallSequence.Append("iv ");
+            obj.InterfaceVirtualMethod();
+        }
+    }
+
+    /// The interface call site in this method is monomorphic on <see cref="Derived"/> and is used to cover dispatch stub
+    /// backpatching
+    [MethodImpl(MethodImplOptions.NoInlining)]
+    private static void CallInterfaceVirtualMethodMonomorphicOnDerived(IBase obj, int count = 1)
+    {
+        for (int i = 0; i < count; ++i)
+        {
+            s_expectedCallSequence.Append("iv ");
+            obj.InterfaceVirtualMethod();
+        }
+    }
+
+    // The call site in this method is polymorphic and is used to cover resolve cache entry backpatching
+    [MethodImpl(MethodImplOptions.NoInlining)]
+    private static void CallInterfaceVirtualMethodPolymorhpic(IBase obj, int count = 1)
+    {
+        for (int i = 0; i < count; ++i)
+        {
+            s_expectedCallSequence.Append("iv ");
+            obj.InterfaceVirtualMethod();
+        }
+    }
+
+    private static ulong s_collectibleIndex = 0;
+
+    private static Base CreateCollectibleDerived()
+    {
+        ulong collectibleIndex = s_collectibleIndex++;
+
+        var ab =
+            AssemblyBuilder.DefineDynamicAssembly(
+                new AssemblyName($"CollectibleDerivedAssembly{collectibleIndex}"),
+                AssemblyBuilderAccess.RunAndCollect);
+        var mob = ab.DefineDynamicModule($"CollectibleDerivedModule{collectibleIndex}");
+        var tb =
+            mob.DefineType(
+                $"CollectibleDerived{collectibleIndex}",
+                TypeAttributes.Class | TypeAttributes.Public,
+                typeof(Base));
+
+        /// Add a virtual method to prevent this type from sharing the vtable chunk from the base, similarly to what is done in
+        /// <see cref="Derived"/>
+        {
+            var mb =
+                tb.DefineMethod(
+                    "VirtualMethod2",
+                    MethodAttributes.Public | MethodAttributes.Virtual | MethodAttributes.NewSlot);
+            var ilg = mb.GetILGenerator();
+            ilg.Emit(OpCodes.Ret);
+        }
+
+        return (Base)Activator.CreateInstance(tb.CreateTypeInfo());
+    }
+
+    [MethodImpl(MethodImplOptions.NoInlining)]
+    private static void PromoteToTier1(params Action[] actions)
+    {
+        // Call the methods once to register a call each for call counting
+        foreach (Action action in actions)
+        {
+            action();
+        }
+
+        // Allow time for call counting to begin
+        Thread.Sleep(500);
+
+        // Call the methods enough times to trigger tier 1 promotion
+        for (int i = 0; i < 100; ++i)
+        {
+            foreach (Action action in actions)
+            {
+                action();
+            }
+        }
+
+        // Allow time for the methods to be jitted at tier 1
+        Thread.Sleep(Math.Max(500, 100 * actions.Length));
+    }
+}
diff --git a/tests/src/baseservices/TieredCompilation/TieredVtableMethodTests.csproj b/tests/src/baseservices/TieredCompilation/TieredVtableMethodTests.csproj
new file mode 100644
index 0000000000..5c64a51cd6
--- /dev/null
+++ b/tests/src/baseservices/TieredCompilation/TieredVtableMethodTests.csproj
@@ -0,0 +1,22 @@
+<?xml version="1.0" encoding="utf-8"?>
+<Project ToolsVersion="12.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
+  <Import Project="$([MSBuild]::GetDirectoryNameOfFileAbove($(MSBuildThisFileDirectory), dir.props))\dir.props" />
+  <PropertyGroup>
+    <Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
+    <Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
+    <ProjectGuid>{DF8B8A76-BC07-4A5F-BD74-1B5D79B94E92}</ProjectGuid>
+    <OutputType>Exe</OutputType>
+    <LangVersion>latest</LangVersion>
+    <AllowUnsafeBlocks>true</AllowUnsafeBlocks>
+    <CLRTestPriority>0</CLRTestPriority>
+  </PropertyGroup>
+  <!-- Default configurations to help VS understand the configurations -->
+  <PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'Debug|x64'">
+  </PropertyGroup>
+  <PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'Release|x64'">
+  </PropertyGroup>
+  <ItemGroup>
+    <Compile Include="TieredVtableMethodTests.cs" />
+  </ItemGroup>
+  <Import Project="$([MSBuild]::GetDirectoryNameOfFileAbove($(MSBuildThisFileDirectory), dir.targets))\dir.targets" />
+</Project>