author | Noah Falk <noahfalk@users.noreply.github.com> | 2018-02-21 15:15:09 -0800 |
---|---|---|
committer | GitHub <noreply@github.com> | 2018-02-21 15:15:09 -0800 |
commit | 17541a4655715b68219ca974a07af5e6a985acb1 (patch) | |
tree | ae2e285518578dd9e69a0ccd4cf7a1136e279dee | |
parent | 287cff2d0908876b4dedac61d7faa871b5fbe21e (diff) | |
Add more benchmarks to JitBench (#16353)
* Add more JitBench Benchmarks
a) JitBench now includes several more benchmarks and a refactored runner that makes it easier to add more.
b) A new JitBench/unofficial_dotnet SDK-style build project offers a simpler developer workflow at the command line and in VS when working with these benchmarks.
c) Miscellaneous build issues with local dotnet builds in the test tree now have workarounds, but official CI builds still don't support them.
d) run-xunit-perf.py now supports running a specific assembly from the testBinLoc folder, intended for selecting one binary from the folder of binaries produced by the .NET Core SDK project build. However, until the CI build can build projects that way, this xunit-perf support is unused.
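The assembly-selection behavior described in (d) can be sketched in a few lines of Python. This is a hedged re-statement of the logic this commit adds to run-xunit-perf.py, not the script itself: the function name `benchmark_entries` is mine, and only the dictionary shape mirrors the change.

```python
import os

def benchmark_entries(assembly_name=None, is_library=False):
    """Build the list of benchmark entries the runner walks.

    Mirrors the new run-xunit-perf.py logic: a specific assembly yields a
    single entry keyed by its base name (with -library for .dll files);
    otherwise everything in the testBinLoc directory is run.
    """
    if assembly_name is not None:
        name, ext = os.path.splitext(assembly_name)
        return [{'directory': '',
                 'extraFlags': '-library' if ext == '.dll' else '',
                 'benchname': name}]
    return [{'directory': '', 'extraFlags': '-library' if is_library else ''}]
```

For example, `benchmark_entries('JitBench.dll')` produces one entry with `benchname` set to `JitBench` and `-library` in its extra flags, so the directory walk later only picks up files whose base name matches.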
35 files changed, 3249 insertions, 872 deletions
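One detail worth calling out before the diff: the new CscBenchmark setup below retargets a private copy of csc.dll by rewriting its csc.runtimeconfig.json with hand-assembled strings. A hedged Python sketch of the same document (function name mine, structure copied from the diff) makes the shape of that file easier to see:

```python
import json

def make_runtimeconfig(runtime_version, tfm="netcoreapp2.0"):
    """Build csc.runtimeconfig.json contents pinning csc.dll to a specific
    Microsoft.NETCore.App version, as CscBenchmark.SetupCscBinDir does by
    writing the JSON line by line."""
    return json.dumps({
        "runtimeOptions": {
            "tfm": tfm,
            "framework": {
                "name": "Microsoft.NETCore.App",
                "version": runtime_version,
            },
        },
    }, indent=2)
```

Writing this file into a private copy of the Roslyn `bincore` directory is what lets the benchmark run csc against an arbitrary runtime version without touching the installed SDK.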
diff --git a/dependencies.props b/dependencies.props
index 9576834d13..ab1e43a319 100644
--- a/dependencies.props
+++ b/dependencies.props
@@ -39,6 +39,7 @@
     <XunitConsoleNetcorePackageVersion>1.0.2-prerelease-00177</XunitConsoleNetcorePackageVersion>
     <XunitPerformanceApiPackageVersion>1.0.0-beta-build0015</XunitPerformanceApiPackageVersion>
     <MicrosoftDiagnosticsTracingTraceEventPackageVersion>2.0.4</MicrosoftDiagnosticsTracingTraceEventPackageVersion>
+    <CommandLineParserVersion>2.1.1</CommandLineParserVersion>
     <VCRuntimeVersion>1.2.0</VCRuntimeVersion>
     <!-- Scenario tests install this version of Microsoft.NetCore.App, then patch coreclr binaries
          via xcopy. At the moment it is
diff --git a/tests/dir.props b/tests/dir.props
index 66aa422cc3..b29983cfbe 100644
--- a/tests/dir.props
+++ b/tests/dir.props
@@ -14,17 +14,27 @@
     <RoslynPackageName>Microsoft.Net.ToolsetCompilers</RoslynPackageName>
   </PropertyGroup>
-  <!--
-      Switching to the .NET Core version of the BuildTools tasks seems to break numerous scenarios, such as VS intellisense and resource designer
-      as well as runnning the build on mono. Until we can get these sorted out we will continue using the .NET 4.6 version of the tasks.
-  -->
+  <!--
+      Switching to the .NET Core version of the BuildTools tasks seems to break numerous scenarios, such as VS intellisense and resource designer
+      as well as running the build on mono. Until we can get these sorted out we will continue using the .NET 4.5 version of the tasks.
+
+      It also breaks building any C# project with dotnet.exe on Windows. The windows version of BuildTools doesn't appear to download the .Net Core
+      Roslyn NuGet package which has a Csc task supporting OverrideToolHost, but the default BuildTools CSharpCore targets file does specify
+      OverrideToolHost. The result is that building anything in C# when RunningOnCore=true on Windows fails in Csc task with a parameter not supported error.
+
+      Given that the Windows Core scenario is pretty broken on BuildTools it is currently configured not to use BuildTools at all. This allows ad-hoc usage
+      of the dotnet tool to work in the test tree as a developer convenience. Its not clear that we should invest in improving BuildTools Core support and
+      instead we could just move to using dotnet officially.
+  -->
   <PropertyGroup>
     <RunningOnCore>false</RunningOnCore>
     <RunningOnCore Condition="'$(MSBuildRuntimeType)' == 'Core'">true</RunningOnCore>
     <BuildToolsTargetsDesktop>false</BuildToolsTargetsDesktop>
     <BuildToolsTargetsDesktop Condition="'$(RunningOnCore)' != 'true'">true</BuildToolsTargetsDesktop>
     <BuildToolsTargets45>$(BuildToolsTargetsDesktop)</BuildToolsTargets45>
-    <RunningOnUnix Condition="('$(RunningOnUnix)' == '') And ('$(MSBuildRuntimeType)' == 'Core')">true</RunningOnUnix>
+    <RunningOnUnix Condition="('$(RunningOnUnix)' == '') And ('$(MSBuildRuntimeType)' == 'Core') And ('$(OS)'!='Windows_NT')">true</RunningOnUnix>
+    <UseBuildTools>true</UseBuildTools>
+    <UseBuildTools Condition="'$(OS)'=='Windows_NT' And '$(RunningOnCore)' == 'true'">false</UseBuildTools>
   </PropertyGroup>

   <!-- Common repo directories -->
@@ -37,7 +47,7 @@
     <DotnetCliPath Condition="'$(DotnetCliPath)'==''">$(ToolsDir)dotnetcli\</DotnetCliPath>
     <BuildToolsTaskDir Condition="'$(BuildToolsTargets45)' == 'true'">$(ToolsDir)net46\</BuildToolsTaskDir>
     <OverrideToolHost Condition="'$(OS)' != 'Windows_NT'">$(DotnetCliPath)dotnet</OverrideToolHost>
-    <CSharpCoreTargetsPath Condition="'$(BuildToolsTargetsDesktop)' != 'true'">$(ToolsDir)\Microsoft.CSharp.Core.targets</CSharpCoreTargetsPath>
+    <CSharpCoreTargetsPath Condition="('$(BuildToolsTargetsDesktop)' != 'true') And ('$(UseBuildTools)'=='true')">$(ToolsDir)\Microsoft.CSharp.Core.targets</CSharpCoreTargetsPath>
     <!-- We don't use any of MSBuild's resolution logic for resolving the framework, so just set these two
          properties to any folder that exists to skip the GenerateReferenceAssemblyPaths task (not target) and
          to prevent it from outputting a warning (MSB3644). -->
     <_TargetFrameworkDirectories Condition="'$(BuildToolsTargetsDesktop)' != 'true'">$(MSBuildThisFileDirectory)/Documentation</_TargetFrameworkDirectories>
@@ -77,7 +87,7 @@
   </PropertyGroup>

   <!-- Import Build tools common props file where repo-independent properties are found -->
-  <Import Condition="Exists('$(ToolsDir)Build.Common.props')" Project="$(ToolsDir)Build.Common.props" />
+  <Import Condition="Exists('$(ToolsDir)Build.Common.props') And '$(UseBuildTools)'=='true'" Project="$(ToolsDir)Build.Common.props" />

   <!-- Provides properties for dependency versions and configures dependency verification/auto-upgrade. -->
   <Import Project="$(ProjectDir)..\dependencies.props" />
diff --git a/tests/scripts/run-xunit-perf.py b/tests/scripts/run-xunit-perf.py
index a214b412ac..6e0a0bade3 100755
--- a/tests/scripts/run-xunit-perf.py
+++ b/tests/scripts/run-xunit-perf.py
@@ -18,6 +18,7 @@
 description = 'Tool to run coreclr perf tests'
 parser = argparse.ArgumentParser(description=description)
 parser.add_argument('-testBinLoc', dest='coreclrPerf', default=None, required=True)
+parser.add_argument('-assemblyName', dest='assemblyName', default=None)
 parser.add_argument('-arch', dest='arch', default='x64', choices=['x64', 'x86'])
 parser.add_argument('-os', dest='operatingSystem', default=sys.platform, choices=['Windows_NT', 'Ubuntu16.04', 'Ubuntu14.04', 'OSX', sys.platform])
 parser.add_argument('-configuration', dest='configuration', default='Release', choices=['Release', 'Checked', 'Debug'])
@@ -72,6 +73,8 @@ def validate_args(args):
     coreclrPerf = os.path.join(os.getcwd(), args.coreclrPerf)
     validate_arg(coreclrPerf, lambda item: os.path.isdir(item))
+    if(args.assemblyName != None):
+        validate_arg(args.assemblyName, lambda item: os.path.isfile(os.path.join(coreclrPerf, item)))
     if args.benchviewPath is not None:
         validate_arg(args.benchviewPath, lambda item: os.path.isdir(item))
@@ -84,6 +87,7 @@ def validate_args(args):
     log('jitName: %s' % args.jitName)
     log('optLevel: %s' % args.optLevel)
     log('coreclrPerf: %s' % coreclrPerf)
+    log('assemblyName: %s' % args.assemblyName)
     log('better: %s' % args.better)
     log('runType: %s' % args.runType)
     log('configuration: %s' % args.configuration)
@@ -104,7 +108,7 @@ def validate_args(args):
     log('collectionFlags: %s' % args.collectionFlags)
     log('uploadToBenchview: %s' % args.uploadToBenchview)

-    return (coreclrPerf, args.arch, args.operatingSystem, args.configuration, args.jitName, args.optLevel, args.runType, args.outputDir, args.stabilityPrefix, args.isScenarioTest, args.benchviewPath, args.isPgoOptimized, args.benchviewGroup, args.hasWarmupRun, args.collectionFlags, args.library, args.uploadToBenchview, args.better, args.sliceNumber, args.sliceConfigFile)
+    return (coreclrPerf, args.assemblyName, args.arch, args.operatingSystem, args.configuration, args.jitName, args.optLevel, args.runType, args.outputDir, args.stabilityPrefix, args.isScenarioTest, args.benchviewPath, args.isPgoOptimized, args.benchviewGroup, args.hasWarmupRun, args.collectionFlags, args.library, args.uploadToBenchview, args.better, args.sliceNumber, args.sliceConfigFile)

 def log(message):
     """ Print logging information
@@ -411,7 +415,7 @@ def main(args):
         log("Python 3.5 or newer is required")
         return 1

-    coreclrPerf, arch, operatingSystem, configuration, jitName, optLevel, runType, outputDir, stabilityPrefix, isScenarioTest, benchviewPath, isPgoOptimized, benchviewGroup, hasWarmupRun, collectionFlags, isLibrary, uploadToBenchview, better, sliceNumber, sliceConfigFile = validate_args(args)
+    coreclrPerf, assemblyName, arch, operatingSystem, configuration, jitName, optLevel, runType, outputDir, stabilityPrefix, isScenarioTest, benchviewPath, isPgoOptimized, benchviewGroup, hasWarmupRun, collectionFlags, isLibrary, uploadToBenchview, better, sliceNumber, sliceConfigFile = validate_args(args)

     platform = sys.platform
     python = 'py'
@@ -474,8 +478,13 @@ def main(args):
         for benchmark in data["slices"][sliceNumber]["folders"]:
             benchmarks += [benchmark]
+    # If slice was not specified, either:
+    #  - run a specific indicated benchmark assembly in coreclrPerf directory if assemblyName is set
+    #  - otherwise run everything in the coreclrPerf directory.
+    elif assemblyName != None:
+        name,ext = os.path.splitext(assemblyName)
+        benchmarks = [{'directory' : '', 'extraFlags': '-library' if ext == '.dll' else '', 'benchname': name}]
     else:
-        # If slice was not specified, run everything in the coreclrPerf directory. Set benchmarks to an empty string
         benchmarks = [{ 'directory' : '', 'extraFlags': '-library' if isLibrary else ''}]

     testFileExt = 'dll' if isLibrary else 'exe'
@@ -493,10 +502,9 @@ def main(args):
         for root, dirs, files in os.walk(testPath):
             for f in files:
-                if f.endswith(testFileExt):
+                benchname, ext = os.path.splitext(f)
+                if f.endswith(testFileExt) and ((not 'benchname' in benchmark) or benchmark['benchname'] == benchname):
                     totalBenchmarks += 1
-                    benchname, ext = os.path.splitext(f)
-
                     benchmarkOutputDir = os.path.join(sandboxOutputDir, 'Scenarios') if isScenarioTest else os.path.join(sandboxOutputDir, 'Microbenchmarks')
                     benchmarkOutputDir = os.path.join(benchmarkOutputDir, etwCollection, benchname)
diff --git a/tests/src/dirs.proj b/tests/src/dirs.proj
index c70e91f5f0..e1f8005f2b 100644
--- a/tests/src/dirs.proj
+++ b/tests/src/dirs.proj
@@ -28,6 +28,7 @@
     <DisabledProjects Include="JIT\superpmi\superpmicollect.csproj" Condition="('$(BuildTestsAgainstPackages)' == 'true') Or ('$(BuildOS)' != 'Windows_NT')" />
     <DisabledProjects Include="JIT\config\**" />
     <DisabledProjects Include="Performance\performance.csproj" />
+    <DisabledProjects Include="Performance\Scenario\JitBench\unofficial_dotnet\JitBench.csproj" /> <!-- no official build support for SDK-style netcoreapp2.0 projects -->
     <DisabledProjects Include="Loader\classloader\generics\regressions\DD117522\Test.csproj" />
     <DisabledProjects Include="Loader\classloader\generics\GenericMethods\VSW491668.csproj" /> <!-- issue 5501 -->
     <DisabledProjects Include="tracing\eventpipetrace\**" /> <!-- issue 15924 -->
diff --git a/tests/src/performance/Scenario/JitBench/Benchmarks/BuildHelloWorldBenchmark.cs b/tests/src/performance/Scenario/JitBench/Benchmarks/BuildHelloWorldBenchmark.cs
new file mode 100644
index 0000000000..0215fa3dad
--- /dev/null
+++ b/tests/src/performance/Scenario/JitBench/Benchmarks/BuildHelloWorldBenchmark.cs
@@ -0,0 +1,47 @@
+using System;
+using System.Collections.Generic;
+using System.IO;
+using System.Text;
+using System.Threading.Tasks;
+
+namespace JitBench
+{
+    public class BuildHelloWorldBenchmark : Benchmark
+    {
+        public BuildHelloWorldBenchmark() : base("Dotnet_Build_HelloWorld") { }
+
+        public override async Task Setup(DotNetInstallation dotNetInstall, string intermediateOutputDir, bool useExistingSetup, ITestOutputHelper output)
+        {
+            using (var setupSection = new IndentedTestOutputHelper("Setup " + Name, output))
+            {
+                await SetupHelloWorldProject(dotNetInstall.DotNetExe, intermediateOutputDir, useExistingSetup, setupSection);
+            }
+        }
+
+        protected async Task SetupHelloWorldProject(string dotNetExePath, string intermediateOutputDir, bool useExistingSetup, ITestOutputHelper output)
+        {
+            string helloWorldProjectDir = Path.Combine(intermediateOutputDir, "helloworld");
+            //the 'exePath' gets passed as an argument to dotnet.exe
+            //in this case it isn't an executable at all, its a CLI command
+            //a little cheap, but it works
+            ExePath = "build";
+            WorkingDirPath = helloWorldProjectDir;
+
+            // This disables using the shared build server. I was told using it interferes with the ability to delete folders after the
+            // test is complete though I haven't encountered that particular issue myself. I imagine this meaningfully changes the
+            // performance of this benchmark, so if we ever want to do real perf testing on the shared scenario we have to resolve this
+            // issue another way.
+            EnvironmentVariables["UseSharedCompilation"] = "false";
+
+            if(!useExistingSetup)
+            {
+                FileTasks.DeleteDirectory(helloWorldProjectDir, output);
+                FileTasks.CreateDirectory(helloWorldProjectDir, output);
+                await new ProcessRunner(dotNetExePath, "new console")
+                    .WithWorkingDirectory(helloWorldProjectDir)
+                    .WithLog(output)
+                    .Run();
+            }
+        }
+    }
+}
diff --git a/tests/src/performance/Scenario/JitBench/Benchmarks/CscBenchmark.cs b/tests/src/performance/Scenario/JitBench/Benchmarks/CscBenchmark.cs
new file mode 100644
index 0000000000..a738ba415c
--- /dev/null
+++ b/tests/src/performance/Scenario/JitBench/Benchmarks/CscBenchmark.cs
@@ -0,0 +1,53 @@
+using System;
+using System.Collections.Generic;
+using System.IO;
+using System.Text;
+using System.Threading.Tasks;
+
+namespace JitBench
+{
+    public abstract class CscBenchmark : Benchmark
+    {
+        public CscBenchmark(string name) : base(name) { }
+
+        public override async Task Setup(DotNetInstallation dotNetInstall, string intermediateOutputDir, bool useExistingSetup, ITestOutputHelper output)
+        {
+            using (var setupSection = new IndentedTestOutputHelper("Setup " + Name, output))
+            {
+                SetupCscBinDir(dotNetInstall.SdkDir, dotNetInstall.FrameworkVersion, intermediateOutputDir, useExistingSetup, setupSection);
+                await SetupSourceToCompile(intermediateOutputDir, dotNetInstall.FrameworkDir, useExistingSetup, setupSection);
+            }
+        }
+
+        protected void SetupCscBinDir(string sdkDirPath, string runtimeVersion, string intermediateOutputDir, bool useExistingSetup, ITestOutputHelper output)
+        {
+            // copy the SDK version of csc into a private directory so we can safely retarget it
+            string cscBinaryDirPath = Path.Combine(sdkDirPath, "Roslyn", "bincore");
+            string localCscDir = Path.Combine(intermediateOutputDir, "csc");
+            ExePath = Path.Combine(localCscDir, "csc.dll");
+
+            if(useExistingSetup)
+            {
+                return;
+            }
+
+            FileTasks.DirectoryCopy(cscBinaryDirPath, localCscDir, output);
+            //overwrite csc.runtimeconfig.json to point at the runtime version we want to use
+            string runtimeConfigPath = Path.Combine(localCscDir, "csc.runtimeconfig.json");
+            File.Delete(runtimeConfigPath);
+            File.WriteAllLines(runtimeConfigPath, new string[] {
+                "{",
+                "  \"runtimeOptions\": {",
+                "    \"tfm\": \"netcoreapp2.0\",",
+                "    \"framework\": {",
+                "        \"name\": \"Microsoft.NETCore.App\",",
+                "        \"version\": \"" + runtimeVersion + "\"",
+                "    }",
+                "  }",
+                "}"
+            });
+        }
+
+        protected abstract Task SetupSourceToCompile(string intermediateOutputDir, string runtimeDirPath, bool useExistingSetup, ITestOutputHelper output);
+    }
+}
diff --git a/tests/src/performance/Scenario/JitBench/Benchmarks/CscHelloWorldBenchmark.cs b/tests/src/performance/Scenario/JitBench/Benchmarks/CscHelloWorldBenchmark.cs
new file mode 100644
index 0000000000..aead67a025
--- /dev/null
+++ b/tests/src/performance/Scenario/JitBench/Benchmarks/CscHelloWorldBenchmark.cs
@@ -0,0 +1,47 @@
+using System;
+using System.Collections.Generic;
+using System.IO;
+using System.Runtime.InteropServices;
+using System.Text;
+using System.Threading.Tasks;
+
+namespace JitBench
+{
+    class CscHelloWorldBenchmark : CscBenchmark
+    {
+        public CscHelloWorldBenchmark() : base("Csc_Hello_World")
+        {
+        }
+
+#pragma warning disable CS1998 // Async method lacks 'await' operators and will run synchronously
+        protected override async Task SetupSourceToCompile(string intermediateOutputDir, string runtimeDirPath, bool useExistingSetup, ITestOutputHelper output)
+#pragma warning restore CS1998
+        {
+            string helloWorldDir = Path.Combine(intermediateOutputDir, "helloWorldSource");
+            string helloWorldPath = Path.Combine(helloWorldDir, "hello.cs");
+            string systemPrivateCoreLibPath = Path.Combine(runtimeDirPath, "System.Private.CoreLib.dll");
+            string systemRuntimePath = Path.Combine(runtimeDirPath, "System.Runtime.dll");
+            string systemConsolePath = Path.Combine(runtimeDirPath, "System.Console.dll");
+            CommandLineArguments = "hello.cs /nostdlib /r:" + systemPrivateCoreLibPath + " /r:" + systemRuntimePath + " /r:" + systemConsolePath;
+            WorkingDirPath = helloWorldDir;
+            if(useExistingSetup)
+            {
+                return;
+            }
+
+            FileTasks.DeleteDirectory(helloWorldDir, output);
+            FileTasks.CreateDirectory(helloWorldDir, output);
+            File.WriteAllLines(helloWorldPath, new string[]
+            {
+                "using System;",
+                "public static class Program",
+                "{",
+                "    public static void Main(string[] args)",
+                "    {",
+                "        Console.WriteLine(\"Hello World!\");",
+                "    }",
+                "}"
+            });
+        }
+    }
+}
diff --git a/tests/src/performance/Scenario/JitBench/Benchmarks/CscRoslynSourceBenchmark.cs b/tests/src/performance/Scenario/JitBench/Benchmarks/CscRoslynSourceBenchmark.cs
new file mode 100644
index 0000000000..92f91ae1d7
--- /dev/null
+++ b/tests/src/performance/Scenario/JitBench/Benchmarks/CscRoslynSourceBenchmark.cs
@@ -0,0 +1,32 @@
+using System;
+using System.Collections.Generic;
+using System.IO;
+using System.Runtime.InteropServices;
+using System.Text;
+using System.Threading.Tasks;
+
+namespace JitBench
+{
+    class CscRoslynSourceBenchmark : CscBenchmark
+    {
+        public CscRoslynSourceBenchmark() : base("Csc_Roslyn_Source")
+        {
+        }
+
+        protected override async Task SetupSourceToCompile(string intermediateOutputDir, string runtimeDirPath, bool useExistingSetup, ITestOutputHelper output)
+        {
+            string cscSourceDownloadLink = "https://roslyninfra.blob.core.windows.net/perf-artifacts/CodeAnalysisRepro" +
+                (RuntimeInformation.IsOSPlatform(OSPlatform.Windows) ? ".zip" : ".tar.gz");
+            string sourceDownloadDir = Path.Combine(intermediateOutputDir, "roslynSource");
+            string sourceDir = Path.Combine(sourceDownloadDir, "CodeAnalysisRepro");
+            CommandLineArguments = "@repro.rsp";
+            WorkingDirPath = sourceDir;
+            if(useExistingSetup)
+            {
+                return;
+            }
+
+            await FileTasks.DownloadAndUnzip(cscSourceDownloadLink, sourceDownloadDir, output);
+        }
+    }
+}
diff --git a/tests/src/performance/Scenario/JitBench/Benchmarks/MusicStoreBenchmark.cs b/tests/src/performance/Scenario/JitBench/Benchmarks/MusicStoreBenchmark.cs
new file mode 100644
index 0000000000..f8bc04f1bb
--- /dev/null
+++ b/tests/src/performance/Scenario/JitBench/Benchmarks/MusicStoreBenchmark.cs
@@ -0,0 +1,232 @@
+using System;
+using System.IO;
+using System.Text.RegularExpressions;
+using System.Threading.Tasks;
+using Microsoft.Xunit.Performance.Api;
+
+namespace JitBench
+{
+    class MusicStoreBenchmark : Benchmark
+    {
+        public MusicStoreBenchmark() : base("MusicStore")
+        {
+        }
+
+        public override async Task Setup(DotNetInstallation dotNetInstall, string outputDir, bool useExistingSetup, ITestOutputHelper output)
+        {
+            if(!useExistingSetup)
+            {
+                using (var setupSection = new IndentedTestOutputHelper("Setup " + Name, output))
+                {
+                    await DownloadAndExtractJitBenchRepo(outputDir, setupSection);
+                    await CreateStore(dotNetInstall, outputDir, setupSection);
+                    await Publish(dotNetInstall, outputDir, setupSection);
+                }
+            }
+            string musicStoreSrcDirectory = GetMusicStoreSrcDirectory(outputDir);
+            string tfm = DotNetSetup.GetTargetFrameworkMonikerForFrameworkVersion(dotNetInstall.FrameworkVersion);
+            ExePath = "MusicStore.dll";
+            WorkingDirPath = GetMusicStorePublishDirectory(outputDir, tfm);
+            EnvironmentVariables.Add("DOTNET_SHARED_STORE", GetMusicStoreStoreDir(outputDir));
+        }
+
+        async Task DownloadAndExtractJitBenchRepo(string outputDir, ITestOutputHelper output)
+        {
+            // If the repo already exists, we delete it and extract it again.
+            string jitBenchRepoRootDir = GetJitBenchRepoRootDir(outputDir);
+            FileTasks.DeleteDirectory(jitBenchRepoRootDir, output);
+
+            string localJitBenchRepo = GetLocalJitBenchRepoDirectory();
+            if (localJitBenchRepo == null)
+            {
+                var url = $"{JitBenchRepoUrl}/archive/{JitBenchCommitSha1Id}.zip";
+                FileTasks.DeleteDirectory(jitBenchRepoRootDir + "_temp", output);
+                await FileTasks.DownloadAndUnzip(url, jitBenchRepoRootDir+"_temp", output);
+                FileTasks.MoveDirectory(Path.Combine(jitBenchRepoRootDir + "_temp", $"JitBench-{JitBenchCommitSha1Id}"), jitBenchRepoRootDir, output);
+            }
+            else
+            {
+                if (!Directory.Exists(localJitBenchRepo))
+                {
+                    throw new Exception("Local JitBench repo " + localJitBenchRepo + " does not exist");
+                }
+                FileTasks.DirectoryCopy(localJitBenchRepo, jitBenchRepoRootDir, output);
+            }
+        }
+
+        private static async Task CreateStore(DotNetInstallation dotNetInstall, string outputDir, ITestOutputHelper output)
+        {
+            string tfm = DotNetSetup.GetTargetFrameworkMonikerForFrameworkVersion(dotNetInstall.FrameworkVersion);
+            string rid = $"win7-{dotNetInstall.Architecture}";
+            string storeDirName = ".store";
+            await new ProcessRunner("powershell.exe", $".\\AspNet-GenerateStore.ps1 -InstallDir {storeDirName} -Architecture {dotNetInstall.Architecture} -Runtime {rid}")
+                .WithWorkingDirectory(GetJitBenchRepoRootDir(outputDir))
+                .WithEnvironmentVariable("PATH", $"{dotNetInstall.DotNetDir};{Environment.GetEnvironmentVariable("PATH")}")
+                .WithEnvironmentVariable("DOTNET_MULTILEVEL_LOOKUP", "0")
+                .WithEnvironmentVariable("JITBENCH_TARGET_FRAMEWORK_MONIKER", tfm)
+                .WithEnvironmentVariable("JITBENCH_FRAMEWORK_VERSION", dotNetInstall.FrameworkVersion)
+                .WithLog(output)
+                .Run();
+        }
+
+        private static async Task<string> Publish(DotNetInstallation dotNetInstall, string outputDir, ITestOutputHelper output)
+        {
+            string tfm = DotNetSetup.GetTargetFrameworkMonikerForFrameworkVersion(dotNetInstall.FrameworkVersion);
+            string publishDir = GetMusicStorePublishDirectory(outputDir, tfm);
+            string manifestPath = Path.Combine(GetMusicStoreStoreDir(outputDir), dotNetInstall.Architecture, tfm, "artifact.xml");
+            FileTasks.DeleteDirectory(publishDir, output);
+            string dotNetExePath = dotNetInstall.DotNetExe;
+            await new ProcessRunner(dotNetExePath, $"publish -c Release -f {tfm} --manifest {manifestPath}")
+                .WithWorkingDirectory(GetMusicStoreSrcDirectory(outputDir))
+                .WithEnvironmentVariable("DOTNET_MULTILEVEL_LOOKUP", "0")
+                .WithEnvironmentVariable("JITBENCH_ASPNET_VERSION", "2.0")
+                .WithEnvironmentVariable("JITBENCH_TARGET_FRAMEWORK_MONIKER", tfm)
+                .WithEnvironmentVariable("JITBENCH_TARGET_FRAMEWORK_VERSION", dotNetInstall.FrameworkVersion)
+                .WithEnvironmentVariable("UseSharedCompilation", "false")
+                .WithLog(output)
+                .Run();
+            return publishDir;
+        }
+
+        public override Metric[] GetDefaultDisplayMetrics()
+        {
+            return new Metric[]
+            {
+                StartupMetric,
+                FirstRequestMetric,
+                MedianResponseMetric
+            };
+        }
+
+        protected override IterationResult RecordIterationMetrics(ScenarioExecutionResult scenarioIteration, string stdout, string stderr, ITestOutputHelper output)
+        {
+            IterationResult result = base.RecordIterationMetrics(scenarioIteration, stdout, stderr, output);
+            AddConsoleMetrics(result, stdout, output);
+            return result;
+        }
+
+        void AddConsoleMetrics(IterationResult result, string stdout, ITestOutputHelper output)
+        {
+            output.WriteLine("Processing iteration results.");
+
+            double? startupTime = null;
+            double? firstRequestTime = null;
+            double? steadyStateMedianTime = null;
+
+            using (var reader = new StringReader(stdout))
+            {
+                string line;
+                while ((line = reader.ReadLine()) != null)
+                {
+                    Match match = Regex.Match(line, @"^Server start \(ms\): \s*(\d+)\s*$");
+                    if (match.Success && match.Groups.Count == 2)
+                    {
+                        startupTime = Convert.ToDouble(match.Groups[1].Value);
+                        continue;
+                    }
+
+                    match = Regex.Match(line, @"^1st Request \(ms\): \s*(\d+)\s*$");
+                    if (match.Success && match.Groups.Count == 2)
+                    {
+                        firstRequestTime = Convert.ToDouble(match.Groups[1].Value);
+                        continue;
+                    }
+
+                    //the steady state output chart looks like:
+                    //   Requests   Aggregate Time(ms)   Req/s   Req Min(ms)   Req Mean(ms)   Req Median(ms)   Req Max(ms)   SEM(%)
+                    // ----------   ------------------   -----   -----------   ------------   --------------   -----------   ------
+                    //    2-  100                 5729  252.60          3.01           3.96             3.79          9.81     1.86
+                    //  101-  250                 6321  253.76          3.40           3.94             3.84          5.25     0.85
+                    //  ... many more rows ...
+
+                    //                           Requests        Agg       req/s        min          mean           median         max          SEM
+                    match = Regex.Match(line, @"^\s*\d+-\s*\d+ \s* \d+ \s* \d+\.\d+ \s* \d+\.\d+ \s* (\d+\.\d+) \s* (\d+\.\d+) \s* \d+\.\d+ \s* \d+\.\d+$");
+                    if (match.Success && match.Groups.Count == 3)
+                    {
+                        //many lines will match, but the final values of these variables will be from the last batch which is presumably the
+                        //best measurement of steady state performance
+                        steadyStateMedianTime = Convert.ToDouble(match.Groups[2].Value);
+                        continue;
+                    }
+                }
+            }
+
+            if (!startupTime.HasValue)
+                throw new FormatException("Startup time was not found.");
+            if (!firstRequestTime.HasValue)
+                throw new FormatException("First Request time was not found.");
+            if (!steadyStateMedianTime.HasValue)
+                throw new FormatException("Steady state median response time not found.");
+
+            result.Measurements.Add(StartupMetric, startupTime.Value);
+            result.Measurements.Add(FirstRequestMetric, firstRequestTime.Value);
+            result.Measurements.Add(MedianResponseMetric, steadyStateMedianTime.Value);
+
+            output.WriteLine($"Server started in {startupTime}ms");
+            output.WriteLine($"Request took {firstRequestTime}ms");
+            output.WriteLine($"Median steady state response {steadyStateMedianTime.Value}ms");
+        }
+
+        /// <summary>
+        /// When serializing the result data to benchview this is called to determine if any of the metrics should be reported differently
+        /// than they were collected. MusicStore uses this to collect several measurements in each iteration, then present those measurements
+        /// to benchview as if each was the Duration metric of a distinct scenario test with its own set of iterations.
+        /// </summary>
+        public override bool TryGetBenchviewCustomMetricReporting(Metric originalMetric, out Metric newMetric, out string newScenarioModelName)
+        {
+            if(originalMetric.Equals(StartupMetric))
+            {
+                newScenarioModelName = "Startup";
+            }
+            else if (originalMetric.Equals(FirstRequestMetric))
+            {
+                newScenarioModelName = "First Request";
+            }
+            else if (originalMetric.Equals(MedianResponseMetric))
+            {
+                newScenarioModelName = "Median Response";
+            }
+            else
+            {
+                return base.TryGetBenchviewCustomMetricReporting(originalMetric, out newMetric, out newScenarioModelName);
+            }
+            newMetric = Metric.ElapsedTimeMilliseconds;
+            return true;
+        }
+
+        static string GetJitBenchRepoRootDir(string outputDir)
+        {
+            return Path.Combine(outputDir, "J");
+        }
+
+        static string GetMusicStoreSrcDirectory(string outputDir)
+        {
+            return Path.Combine(GetJitBenchRepoRootDir(outputDir), "src", "MusicStore");
+        }
+
+        static string GetMusicStorePublishDirectory(string outputDir, string tfm)
+        {
+            return Path.Combine(GetMusicStoreSrcDirectory(outputDir), "bin", "Release", tfm, "publish");
+        }
+
+        static string GetMusicStoreStoreDir(string outputDir)
+        {
+            return Path.Combine(GetJitBenchRepoRootDir(outputDir), StoreDirName);
+        }
+
+        string GetLocalJitBenchRepoDirectory()
+        {
+            return Environment.GetEnvironmentVariable("MUSICSTORE_PRIVATE_REPO");
+        }
+
+        private const string JitBenchRepoUrl = "https://github.com/aspnet/JitBench";
+        private const string JitBenchCommitSha1Id = "6e1327b633e2d7d45f4c13f498fc27698ea5735a";
+        private const string EnvironmentFileName = "JitBenchEnvironment.txt";
+        private const string StoreDirName = ".store";
+        private readonly Metric StartupMetric = new Metric("Startup", "ms");
+        private readonly Metric FirstRequestMetric = new Metric("First Request", "ms");
+        private readonly Metric MedianResponseMetric = new Metric("Median Response", "ms");
+        private readonly Metric MeanResponseMetric = new Metric("Mean Response", "ms");
+    }
+}
diff --git a/tests/src/performance/Scenario/JitBench/IterationData.cs b/tests/src/performance/Scenario/JitBench/IterationData.cs
deleted file mode 100644
index 82b5b3127e..0000000000
--- a/tests/src/performance/Scenario/JitBench/IterationData.cs
+++ /dev/null
@@ -1,26 +0,0 @@
-// Licensed to the .NET Foundation under one or more agreements.
-// The .NET Foundation licenses this file to you under the MIT license.
-// See the LICENSE file in the project root for more information.
-
-using Microsoft.Xunit.Performance.Api;
-
-namespace JitBench
-{
-    /// <summary>
-    /// Interface used to buffer each scenario iteration/run for later post processing.
-    /// </summary>
-    internal class IterationData
-    {
-        public ScenarioExecutionResult ScenarioExecutionResult { get; set; }
-
-        public string StandardOutput { get; set; }
-
-        public double StartupTime { get; set; }
-
-        public double FirstRequestTime { get; set; }
-
-        public double SteadystateTime { get; set; }
-
-        public double SteadystateMedianTime { get; set; }
-    }
-}
diff --git a/tests/src/performance/Scenario/JitBench/JitBench.csproj b/tests/src/performance/Scenario/JitBench/JitBench.csproj
index d8d2f872f7..2e384d183e 100644
--- a/tests/src/performance/Scenario/JitBench/JitBench.csproj
+++ b/tests/src/performance/Scenario/JitBench/JitBench.csproj
@@ -29,9 +29,9 @@
     </CodeAnalysisDependentAssemblyPaths>
   </ItemGroup>
   <ItemGroup>
-    <Compile Include="IterationData.cs" />
-    <Compile Include="JitBenchHarness.cs" />
-    <Compile Include="JitBenchHarnessOptions.cs" />
+    <Compile Include="Runner\*.cs" />
+    <Compile Include="Benchmarks\*.cs" />
+    <Compile Include="Utilities\*.cs" />
     <Compile Include="$(BaseIntermediateOutputPath)AutoGeneratedVersioningConstants.cs" />
   </ItemGroup>
   <ItemGroup>
diff --git a/tests/src/performance/Scenario/JitBench/JitBenchHarness.cs b/tests/src/performance/Scenario/JitBench/JitBenchHarness.cs
deleted file mode 100644
index 50211f28df..0000000000
--- a/tests/src/performance/Scenario/JitBench/JitBenchHarness.cs
+++ /dev/null
@@ -1,691 +0,0 @@
-// Licensed to the .NET Foundation under one or more agreements.
-// The .NET Foundation licenses this file to you under the MIT license.
-// See the LICENSE file in the project root for more information.
-
-using Microsoft.Xunit.Performance.Api;
-using Microsoft.Xunit.Performance.Api.Profilers.Etw;
-using System;
-using System.Collections.Generic;
-using System.Diagnostics;
-using System.IO;
-using System.IO.Compression;
-using System.Linq;
-using System.Net.Http;
-using System.Text;
-using System.Text.RegularExpressions;
-
-namespace JitBench
-{
-    class JitBenchHarness
-    {
-        static void Main(string[] args)
-        {
-            // The flag below is set to false to prevent the VBCSCompiler.exe hanging around
-            // after the performance execution finished and preventing the deletion of the folder.
-            Environment.SetEnvironmentVariable("UseSharedCompilation", "false");
-
-            var options = JitBenchHarnessOptions.Parse(args);
-
-            SetupStatics(options);
-
-            using (var h = new XunitPerformanceHarness(args))
-            {
-                ProcessStartInfo startInfo = options.UseExistingSetup ? UseExistingSetup() : CreateNewSetup();
-
-                string scenarioName = "MusicStore";
-
-                if (!startInfo.Environment.ContainsKey("DOTNET_MULTILEVEL_LOOKUP"))
-                    throw new InvalidOperationException("DOTNET_MULTILEVEL_LOOKUP was not defined.");
-                if (startInfo.Environment["DOTNET_MULTILEVEL_LOOKUP"] != "0")
-                    throw new InvalidOperationException("DOTNET_MULTILEVEL_LOOKUP was not set to 0.");
-
-                if (options.EnableTiering)
-                {
-                    startInfo.Environment.Add("COMPlus_EXPERIMENTAL_TieredCompilation", "1");
-                    scenarioName += " Tiering";
-                }
-
-                if (options.Minopts)
-                {
-                    startInfo.Environment.Add("COMPlus_JITMinOpts", "1");
-                    scenarioName += " Minopts";
-                }
-
-                if (options.DisableR2R)
-                {
-                    startInfo.Environment.Add("COMPlus_ReadyToRun", "0");
-                    scenarioName += " NoR2R";
-                }
-
-                if (options.DisableNgen)
-                {
-                    startInfo.Environment.Add("COMPlus_ZapDisable", "1");
-                    scenarioName += " NoNgen";
-                }
-
-                PrintHeader($"Running scenario '{scenarioName}'");
-
-                var program = new JitBenchHarness();
-                try
-                {
-                    var scenarioConfiguration = new ScenarioTestConfiguration(TimeSpan.FromMilliseconds(60000), startInfo)
-                    {
-                        Iterations = (int)options.Iterations,
-                        PreIterationDelegate = program.PreIteration,
-                        PostIterationDelegate = program.PostIteration,
-                        Scenario = new ScenarioBenchmark("JitBench"),
-                    };
-                    var processesOfInterest = new string[] {
-                        "dotnet.exe",
-                    };
-                    var modulesOfInterest = new string[] {
-                        "Anonymously Hosted DynamicMethods Assembly",
-                        "clrjit.dll",
-                        "coreclr.dll",
-                        "dotnet.exe",
-                        "MusicStore.dll",
-                        "ntoskrnl.exe",
-                        "System.Private.CoreLib.dll",
-                        "Unknown",
-                    };
-
-                    if (!File.Exists(startInfo.FileName))
-                        throw new FileNotFoundException(startInfo.FileName);
-                    if (!Directory.Exists(startInfo.WorkingDirectory))
-                        throw new DirectoryNotFoundException(startInfo.WorkingDirectory);
-
-                    h.RunScenario(scenarioConfiguration, teardownDelegate: (ScenarioBenchmark scenarioBenchmark) =>
-                    {
-                        program.PostRun(scenarioBenchmark, "MusicStore", processesOfInterest, modulesOfInterest);
-                    });
-                }
-                catch
-                {
-                    Console.WriteLine(program.StandardOutput);
-                    Console.WriteLine(program.StandardError);
-                    throw;
-                }
-            }
-        }
-
-        public JitBenchHarness()
-        {
-            _stdout = new StringBuilder();
-            _stderr = new StringBuilder();
-            IterationsData = new List<IterationData>();
-        }
-
-        public string StandardOutput => _stdout.ToString();
-
-        public string StandardError => _stderr.ToString();
-
-        private static void SetupStatics(JitBenchHarnessOptions options)
-        {
-            s_temporaryDirectory = options.IntermediateOutputDirectory;
-            s_targetArchitecture = options.TargetArchitecture;
-            if (string.IsNullOrWhiteSpace(s_targetArchitecture))
-                throw new ArgumentNullException("Unspecified target architecture.");
-
-            // J == JitBench folder. By reducing the length of the directory
-            // name we attempt to reduce the chances of hitting PATH length
-            // problems we have been hitting in the lab.
-            // The changes we have done have reduced it in this way:
-            // C:\Jenkins\workspace\perf_scenario---5b001a46\bin\sandbox\JitBench\JitBench-dev
-            // C:\j\workspace\perf_scenario---5b001a46\bin\sandbox\JitBench\JitBench-dev
-            // C:\j\w\perf_scenario---5b001a46\bin\sandbox\JitBench\JitBench-dev
-            // C:\j\w\perf_scenario---5b001a46\bin\sandbox\J
-            s_jitBenchDevDirectory = Path.Combine(s_temporaryDirectory, "J");
-            s_dotnetProcessFileName = Path.Combine(s_jitBenchDevDirectory, ".dotnet", "dotnet.exe");
-            s_musicStoreDirectory = Path.Combine(s_jitBenchDevDirectory, "src", "MusicStore");
-
-            s_localJitBenchRepo = options.LocalJitBenchRepo;
-            if(s_localJitBenchRepo != null && !Directory.Exists(s_localJitBenchRepo))
-            {
-                throw new Exception("Requested local JitBench repo " + s_localJitBenchRepo + " does not exist");
-            }
-        }
-
-        private static void DownloadAndExtractJitBenchRepo()
-        {
-            // If the repo already exists, we delete it and extract it again.
-            if (Directory.Exists(s_jitBenchDevDirectory))
-                Directory.Delete(s_jitBenchDevDirectory, true);
-
-            if (s_localJitBenchRepo == null)
-            {
-                using (var client = new HttpClient())
-                {
-                    var archiveName = $"{JitBenchCommitSha1Id}.zip";
-                    var url = $"{JitBenchRepoUrl}/archive/{archiveName}";
-                    var zipFile = Path.Combine(s_temporaryDirectory, archiveName);
-
-                    using (FileStream tmpzip = File.Create(zipFile))
-                    {
-                        using (Stream stream = client.GetStreamAsync(url).Result)
-                            stream.CopyTo(tmpzip);
-                        tmpzip.Flush();
-                    }
-
-                    // This step will create s_JitBenchDevDirectory.
- ZipFile.ExtractToDirectory(zipFile, s_temporaryDirectory); - Directory.Move(Path.Combine(s_temporaryDirectory, $"JitBench-{JitBenchCommitSha1Id}"), s_jitBenchDevDirectory); - } - } - else - { - DirectoryCopy(s_localJitBenchRepo, s_jitBenchDevDirectory); - } - } - - private static void DirectoryCopy(string sourceDir, string destDir) - { - DirectoryInfo dir = new DirectoryInfo(sourceDir); - - DirectoryInfo[] dirs = dir.GetDirectories(); - if (!Directory.Exists(destDir)) - { - Directory.CreateDirectory(destDir); - } - - FileInfo[] files = dir.GetFiles(); - foreach (FileInfo file in files) - { - string temppath = Path.Combine(destDir, file.Name); - file.CopyTo(temppath, false); - } - - foreach (DirectoryInfo subdir in dirs) - { - string temppath = Path.Combine(destDir, subdir.Name); - DirectoryCopy(subdir.FullName, temppath); - } - } - - private static IDictionary<string, string> SetupJitBench() - { - // This step generates some environment variables needed later. - string coreclrPrivateBinDir = Directory.GetCurrentDirectory(); - var psi = new ProcessStartInfo() - { - WorkingDirectory = s_jitBenchDevDirectory, - FileName = "powershell.exe", - Arguments = $"-Command \".\\RunBenchmark.ps1 " + - $"-SetupOnly " + - $"-Architecture {s_targetArchitecture} " + - $"-Rid win7-{s_targetArchitecture} " + - $"-FrameworkVersion: {VersioningConstants.MicrosoftNetCoreAppPackageVersion} " + - $"-PrivateCoreClrBinDirPath {coreclrPrivateBinDir} " + - $"; gi env:PATH, env:JITBENCH_*, env:DOTNET_* | %{{ \\\"$($_.Name)=$($_.Value)\\\" }} 1>>{EnvironmentFileName}\"" - }; - - LaunchProcess(psi, 1800000); - - // Return the generated environment variables. 
- IDictionary<string, string> environment = new Dictionary<string, string>(); - return GetEnvironment(environment, Path.Combine(s_jitBenchDevDirectory, EnvironmentFileName)); - } - - private static IDictionary<string, string> GetEnvironment(IDictionary<string, string> environment, string fileName) - { - foreach (var line in File.ReadLines(fileName)) - { - if (string.IsNullOrWhiteSpace(line)) - continue; - - string[] pair = line.Split(new char[] { '=' }, 2); - if (pair.Length != 2) - throw new InvalidOperationException($"AspNet-GenerateStore.ps1 did not generate the expected environment variable {pair}"); - - string key = pair[0].ToUpperInvariant(); - string value = pair[1]; - if (!environment.ContainsKey(key)) - environment.Add(key,value); - else - environment[key] = value; - } - - return environment; - } - - // Return an environment with the downloaded dotnet on the path. - private static IDictionary<string, string> GetInitialEnvironment() - { - // TODO: This is currently hardcoded, but we could probably pull it from the powershell cmdlet call. 
- var dotnetPath = Path.Combine(s_jitBenchDevDirectory, ".dotnet"); - var dotnetexe = Path.Combine(dotnetPath, "dotnet.exe"); - if (!File.Exists(dotnetexe)) - throw new FileNotFoundException(dotnetexe); - - var environment = new Dictionary<string, string> { - { "DOTNET_MULTILEVEL_LOOKUP", "0" }, - { "PATH", $"{dotnetPath};{Environment.GetEnvironmentVariable("PATH")}" } - }; - - return environment; - } - - private static ProcessStartInfo CreateJitBenchStartInfo(IDictionary<string, string> environment) - { - var psi = new ProcessStartInfo - { - Arguments = "MusicStore.dll", - FileName = s_dotnetProcessFileName, - RedirectStandardError = true, - RedirectStandardOutput = true, - WorkingDirectory = Path.Combine(s_musicStoreDirectory, "bin", "Release", environment["JITBENCH_TARGET_FRAMEWORK_MONIKER"], "publish"), - }; - - foreach (KeyValuePair<string, string> pair in environment) - psi.Environment.Add(pair.Key, pair.Value); - - return psi; - } - - private static ProcessStartInfo UseExistingSetup() - { - PrintHeader("Using existing SETUP"); - - IDictionary<string, string> environment = GetInitialEnvironment(); - environment = GetEnvironment(environment, Path.Combine(s_jitBenchDevDirectory, EnvironmentFileName)); - ValidateEnvironment(environment); - return CreateJitBenchStartInfo(environment); - } - - private static ProcessStartInfo CreateNewSetup() - { - PrintHeader("Starting SETUP"); - DownloadAndExtractJitBenchRepo(); - IDictionary<string, string> environment = SetupJitBench(); - ValidateEnvironment(environment); - return CreateJitBenchStartInfo(environment); - } - - private static void ValidateEnvironment(IDictionary<string, string> environment) - { - var expectedVariables = new string[] { - "DOTNET_MULTILEVEL_LOOKUP", - "PATH", - "DOTNET_SHARED_STORE", - "JITBENCH_TARGET_FRAMEWORK_MONIKER" - }; - if (expectedVariables.Except(environment.Keys, StringComparer.Ordinal).Any()) - throw new Exception("Missing expected environment variables."); - - 
Console.WriteLine("**********************************************************************"); - foreach (var env in expectedVariables) - Console.WriteLine($" {env}={environment[env]}"); - Console.WriteLine("**********************************************************************"); - } - - private const string JitBenchRepoUrl = "https://github.com/aspnet/JitBench"; - private const string JitBenchCommitSha1Id = "6e1327b633e2d7d45f4c13f498fc27698ea5735a"; - private const string EnvironmentFileName = "JitBenchEnvironment.txt"; - - private void PreIteration(ScenarioTest scenario) - { - PrintHeader("Setting up data standard output/error process handlers."); - - _stderr.Clear(); - _stdout.Clear(); - - if (scenario.Process.StartInfo.RedirectStandardError) - { - scenario.Process.ErrorDataReceived += (object sender, DataReceivedEventArgs errorLine) => - { - if (!string.IsNullOrEmpty(errorLine.Data)) - _stderr.AppendLine(errorLine.Data); - }; - } - - if (scenario.Process.StartInfo.RedirectStandardInput) - throw new NotImplementedException("RedirectStandardInput has not been implemented yet."); - - if (scenario.Process.StartInfo.RedirectStandardOutput) - { - scenario.Process.OutputDataReceived += (object sender, DataReceivedEventArgs outputLine) => - { - if (!string.IsNullOrEmpty(outputLine.Data)) - _stdout.AppendLine(outputLine.Data); - }; - } - } - - private void PostIteration(ScenarioExecutionResult scenarioExecutionResult) - { - PrintHeader("Processing iteration results."); - - double? startupTime = null; - double? firstRequestTime = null; - double? steadyStateAverageTime = null; - double? 
steadyStateMedianTime = null; - - using (var reader = new StringReader(_stdout.ToString())) - { - string line; - while ((line = reader.ReadLine()) != null) - { - Match match = Regex.Match(line, @"^Server start \(ms\): \s*(\d+)\s*$"); - if (match.Success && match.Groups.Count == 2) - { - startupTime = Convert.ToDouble(match.Groups[1].Value); - continue; - } - - match = Regex.Match(line, @"^1st Request \(ms\): \s*(\d+)\s*$"); - if (match.Success && match.Groups.Count == 2) - { - firstRequestTime = Convert.ToDouble(match.Groups[1].Value); - continue; - } - - //the steady state output chart looks like: - // Requests Aggregate Time(ms) Req/s Req Min(ms) Req Mean(ms) Req Median(ms) Req Max(ms) SEM(%) - // ---------- ------------------ ----- ----------- ------------ -------------- ----------- ------ - // 2- 100 5729 252.60 3.01 3.96 3.79 9.81 1.86 - // 101- 250 6321 253.76 3.40 3.94 3.84 5.25 0.85 - // ... many more rows ... - - // Requests Agg req/s min mean median max SEM - match = Regex.Match(line, @"^\s*\d+-\s*\d+ \s* \d+ \s* \d+\.\d+ \s* \d+\.\d+ \s* (\d+\.\d+) \s* (\d+\.\d+) \s* \d+\.\d+ \s* \d+\.\d+$"); - if (match.Success && match.Groups.Count == 3) - { - //many lines will match, but the final values of these variables will be from the last batch which is presumably the - //best measurement of steady state performance - steadyStateAverageTime = Convert.ToDouble(match.Groups[1].Value); - steadyStateMedianTime = Convert.ToDouble(match.Groups[2].Value); - continue; - } - } - } - - if (!startupTime.HasValue) - throw new Exception("Startup time was not found."); - if (!firstRequestTime.HasValue) - throw new Exception("First Request time was not found."); - if (!steadyStateAverageTime.HasValue) - throw new Exception("Steady state average response time not found."); - if (!steadyStateMedianTime.HasValue) - throw new Exception("Steady state median response time not found."); - - IterationsData.Add(new IterationData - { - ScenarioExecutionResult = scenarioExecutionResult, 
- StandardOutput = _stdout.ToString(), - StartupTime = startupTime.Value, - FirstRequestTime = firstRequestTime.Value, - SteadystateTime = steadyStateAverageTime.Value, - SteadystateMedianTime = steadyStateMedianTime.Value, - }); - - PrintRunningStepInformation($"({IterationsData.Count}) Server started in {IterationsData.Last().StartupTime}ms"); - PrintRunningStepInformation($"({IterationsData.Count}) Request took {IterationsData.Last().FirstRequestTime}ms"); - PrintRunningStepInformation($"({IterationsData.Count}) Cold start time (server start + first request time): {IterationsData.Last().StartupTime + IterationsData.Last().FirstRequestTime}ms"); - PrintRunningStepInformation($"({IterationsData.Count}) Average steady state response {IterationsData.Last().SteadystateTime}ms"); - PrintRunningStepInformation($"({IterationsData.Count}) Median steady state response {IterationsData.Last().SteadystateMedianTime}ms"); - - _stdout.Clear(); - _stderr.Clear(); - } - - private void PostRun( - ScenarioBenchmark scenarioBenchmark, - string scenarioTestModelName, - IReadOnlyCollection<string> processesOfInterest, - IReadOnlyCollection<string> modulesOfInterest) - { - PrintHeader("Post-Processing scenario data."); - - foreach (var iter in IterationsData) - { - var scenarioExecutionResult = iter.ScenarioExecutionResult; - var scenarioTestModel = scenarioBenchmark.Tests - .SingleOrDefault(t => t.Name == scenarioTestModelName); - - if (scenarioTestModel == null) - { - scenarioTestModel = new ScenarioTestModel(scenarioTestModelName); - scenarioBenchmark.Tests.Add(scenarioTestModel); - - // Add measured metrics to each test. 
- scenarioTestModel.Performance.Metrics.Add(ElapsedTimeMilliseconds); - } - - scenarioTestModel.Performance.IterationModels.Add(new IterationModel - { - Iteration = new Dictionary<string, double> { - { ElapsedTimeMilliseconds.Name, (scenarioExecutionResult.ProcessExitInfo.ExitTime - scenarioExecutionResult.ProcessExitInfo.StartTime).TotalMilliseconds}, - } - }); - - // Create (measured) test entries for this scenario. - var startup = scenarioBenchmark.Tests - .SingleOrDefault(t => t.Name == "Startup" && t.Namespace == scenarioTestModel.Name); - if (startup == null) - { - startup = new ScenarioTestModel("Startup") - { - Namespace = scenarioTestModel.Name, - }; - scenarioBenchmark.Tests.Add(startup); - - // Add measured metrics to each test. - startup.Performance.Metrics.Add(ElapsedTimeMilliseconds); - } - - var firstRequest = scenarioBenchmark.Tests - .SingleOrDefault(t => t.Name == "First Request" && t.Namespace == scenarioTestModel.Name); - if (firstRequest == null) - { - firstRequest = new ScenarioTestModel("First Request") - { - Namespace = scenarioTestModel.Name, - }; - scenarioBenchmark.Tests.Add(firstRequest); - - // Add measured metrics to each test. - firstRequest.Performance.Metrics.Add(ElapsedTimeMilliseconds); - } - - var medianResponse = scenarioBenchmark.Tests - .SingleOrDefault(t => t.Name == "Median Response" && t.Namespace == scenarioTestModel.Name); - if (medianResponse == null) - { - medianResponse = new ScenarioTestModel("Median Response") - { - Namespace = scenarioTestModel.Name, - }; - scenarioBenchmark.Tests.Add(medianResponse); - - // Add measured metrics to each test. 
- medianResponse.Performance.Metrics.Add(ElapsedTimeMilliseconds); - } - - startup.Performance.IterationModels.Add(new IterationModel - { - Iteration = new Dictionary<string, double> { - { ElapsedTimeMilliseconds.Name, iter.StartupTime }, - }, - }); - - firstRequest.Performance.IterationModels.Add(new IterationModel - { - Iteration = new Dictionary<string, double> { - { ElapsedTimeMilliseconds.Name, iter.FirstRequestTime }, - }, - }); - - medianResponse.Performance.IterationModels.Add(new IterationModel - { - Iteration = new Dictionary<string, double> { - { ElapsedTimeMilliseconds.Name, iter.SteadystateMedianTime }, - }, - }); - - if (!string.IsNullOrWhiteSpace(iter.ScenarioExecutionResult.EventLogFileName) && - File.Exists(iter.ScenarioExecutionResult.EventLogFileName)) - { - // Adding ETW data. - scenarioBenchmark = AddEtwData( - scenarioBenchmark, iter.ScenarioExecutionResult, processesOfInterest, modulesOfInterest); - } - } - } - - private static ScenarioBenchmark AddEtwData( - ScenarioBenchmark scenarioBenchmark, - ScenarioExecutionResult scenarioExecutionResult, - IReadOnlyCollection<string> processesOfInterest, - IReadOnlyCollection<string> modulesOfInterest) - { - var metricModels = scenarioExecutionResult.PerformanceMonitorCounters - .Select(pmc => new MetricModel - { - DisplayName = pmc.DisplayName, - Name = pmc.Name, - Unit = pmc.Unit, - }); - - // Get the list of processes of interest. - Console.WriteLine($"Parsing: {scenarioExecutionResult.EventLogFileName}"); - var processes = new SimpleTraceEventParser().GetProfileData(scenarioExecutionResult); - - // Extract the Pmc data for each one of the processes. 
- foreach (var process in processes) - { - if (!processesOfInterest.Any(p => p.Equals(process.Name, StringComparison.OrdinalIgnoreCase))) - continue; - - var processTest = scenarioBenchmark.Tests - .SingleOrDefault(t => t.Name == process.Name && t.Namespace == ""); - if (processTest == null) - { - processTest = new ScenarioTestModel(process.Name) - { - Namespace = "", - }; - scenarioBenchmark.Tests.Add(processTest); - - // Add metrics definitions. - processTest.Performance.Metrics.Add(ElapsedTimeMilliseconds); - processTest.Performance.Metrics.AddRange(metricModels); - } - - var processIterationModel = new IterationModel - { - Iteration = new Dictionary<string, double>() - }; - processTest.Performance.IterationModels.Add(processIterationModel); - - processIterationModel.Iteration.Add( - ElapsedTimeMilliseconds.Name, process.LifeSpan.Duration.TotalMilliseconds); - - // Add process metrics values. - foreach (var pmcData in process.PerformanceMonitorCounterData) - processIterationModel.Iteration.Add(pmcData.Key.Name, pmcData.Value); - - foreach (var module in process.Modules) - { - var moduleName = Path.GetFileName(module.FullName); - if (modulesOfInterest.Any(m => m.Equals(moduleName, StringComparison.OrdinalIgnoreCase))) - { - var moduleTestName = $"{moduleName}"; - var moduleTest = scenarioBenchmark.Tests - .SingleOrDefault(t => t.Name == moduleTestName && t.Namespace == process.Name); - - if (moduleTest == null) - { - moduleTest = new ScenarioTestModel(moduleTestName) - { - Namespace = process.Name, - Separator = "!", - }; - scenarioBenchmark.Tests.Add(moduleTest); - - // Add metrics definitions. - moduleTest.Performance.Metrics.AddRange(metricModels); - } - - var moduleIterationModel = new IterationModel - { - Iteration = new Dictionary<string, double>() - }; - moduleTest.Performance.IterationModels.Add(moduleIterationModel); - - // 5. Add module metrics values. 
- foreach (var pmcData in module.PerformanceMonitorCounterData) - moduleIterationModel.Iteration.Add(pmcData.Key.Name, pmcData.Value); - } - } - } - - return scenarioBenchmark; - } - - private static void LaunchProcess(ProcessStartInfo processStartInfo, int timeoutMilliseconds, IDictionary<string, string> environment = null) - { - Console.WriteLine(); - Console.WriteLine($"{System.Security.Principal.WindowsIdentity.GetCurrent().Name}@{Environment.MachineName} \"{processStartInfo.WorkingDirectory}\""); - Console.WriteLine($"[{DateTime.Now}] $ {processStartInfo.FileName} {processStartInfo.Arguments}"); - - if (environment != null) - { - foreach (KeyValuePair<string, string> pair in environment) - { - if (!processStartInfo.Environment.ContainsKey(pair.Key)) - processStartInfo.Environment.Add(pair.Key, pair.Value); - else - processStartInfo.Environment[pair.Key] = pair.Value; - } - } - - using (var p = new System.Diagnostics.Process { StartInfo = processStartInfo }) - { - p.Start(); - if (p.WaitForExit(timeoutMilliseconds) == false) - { - // FIXME: What about clean/kill child processes? 
- p.Kill(); - throw new TimeoutException($"The process '{processStartInfo.FileName} {processStartInfo.Arguments}' timed out."); - } - - if (p.ExitCode != 0) - throw new Exception($"{processStartInfo.FileName} exited with error code {p.ExitCode}"); - } - } - - private static void PrintHeader(string message) - { - Console.WriteLine(); - Console.WriteLine("**********************************************************************"); - Console.WriteLine($"** [{DateTime.Now}] {message}"); - Console.WriteLine("**********************************************************************"); - } - - private static void PrintRunningStepInformation(string message) - { - Console.WriteLine($"-- {message}"); - } - - private List<IterationData> IterationsData { get; } - - private static MetricModel ElapsedTimeMilliseconds { get; } = new MetricModel - { - DisplayName = "Duration", - Name = "Duration", - Unit = "ms", - }; - -#if DEBUG - private const int NumberOfIterations = 2; -#else - private const int NumberOfIterations = 11; -#endif - private readonly StringBuilder _stdout; - private readonly StringBuilder _stderr; - - private static string s_temporaryDirectory; - private static string s_jitBenchDevDirectory; - private static string s_dotnetProcessFileName; - private static string s_musicStoreDirectory; - private static string s_targetArchitecture; - private static string s_localJitBenchRepo; - } -} diff --git a/tests/src/performance/Scenario/JitBench/JitBenchHarnessOptions.cs b/tests/src/performance/Scenario/JitBench/JitBenchHarnessOptions.cs deleted file mode 100644 index a8b7b37626..0000000000 --- a/tests/src/performance/Scenario/JitBench/JitBenchHarnessOptions.cs +++ /dev/null @@ -1,139 +0,0 @@ -// Licensed to the .NET Foundation under one or more agreements. -// The .NET Foundation licenses this file to you under the MIT license. -// See the LICENSE file in the project root for more information. 
- -using CommandLine; -using CommandLine.Text; -using Microsoft.Xunit.Performance.Api; -using System; -using System.IO; -using System.Linq; -using System.Reflection; - -namespace JitBench -{ - /// <summary> - /// Provides an interface to parse the command line arguments passed to the JitBench harness. - /// </summary> - internal sealed class JitBenchHarnessOptions - { - public JitBenchHarnessOptions() - { - _tempDirectory = Directory.GetCurrentDirectory(); - _iterations = 11; - } - - [Option("use-existing-setup", Required = false, HelpText = "Use existing setup.")] - public Boolean UseExistingSetup { get; set; } - - [Option("local-jitbench-repo", Required = false, HelpText = "Optional path to a local JitBench repo enlistment to use instead of downloading from github")] - public string LocalJitBenchRepo { get; set; } - - [Option("tiering", Required = false, HelpText = "Enable tiered jit.")] - public Boolean EnableTiering { get; set; } - - [Option("minopts", Required = false, HelpText = "Force jit to use minopt codegen.")] - public Boolean Minopts { get; set; } - - [Option("disable-r2r", Required = false, HelpText = "Disable loading of R2R images.")] - public Boolean DisableR2R { get; set; } - - [Option("disable-ngen", Required = false, HelpText = "Disable loading of ngen images.")] - public Boolean DisableNgen { get; set; } - - [Option("iterations", Required = false, HelpText = "Number of iterations to run.")] - public uint Iterations { get { return _iterations; } set { _iterations = value; } } - - [Option('o', Required = false, HelpText = "Specifies the intermediate output directory name.")] - public string IntermediateOutputDirectory - { - get { return _tempDirectory; } - - set - { - if (string.IsNullOrWhiteSpace(value)) - throw new InvalidOperationException("The intermediate output directory name cannot be null, empty or white space."); - - if (value.Any(c => Path.GetInvalidPathChars().Contains(c))) - throw new InvalidOperationException("Specified intermediate 
output directory name contains invalid path characters."); - - _tempDirectory = Path.IsPathRooted(value) ? value : Path.GetFullPath(value); - Directory.CreateDirectory(_tempDirectory); - } - } - - [Option("target-architecture", Required = true, HelpText = "JitBench target architecture (It must match the built product that was copied into sandbox).")] - public string TargetArchitecture { get; set; } - - public static JitBenchHarnessOptions Parse(string[] args) - { - using (var parser = new Parser((settings) => { - settings.CaseInsensitiveEnumValues = true; - settings.CaseSensitive = false; - settings.HelpWriter = new StringWriter(); - settings.IgnoreUnknownArguments = true; - })) - { - JitBenchHarnessOptions options = null; - parser.ParseArguments<JitBenchHarnessOptions>(args) - .WithParsed(parsed => options = parsed) - .WithNotParsed(errors => { - foreach (Error error in errors) - { - switch (error.Tag) - { - case ErrorType.MissingValueOptionError: - throw new ArgumentException( - $"Missing value option for command line argument '{(error as MissingValueOptionError).NameInfo.NameText}'"); - case ErrorType.HelpRequestedError: - Console.WriteLine(Usage()); - Environment.Exit(0); - break; - case ErrorType.VersionRequestedError: - Console.WriteLine(new AssemblyName(typeof(JitBenchHarnessOptions).GetTypeInfo().Assembly.FullName).Version); - Environment.Exit(0); - break; - case ErrorType.BadFormatTokenError: - case ErrorType.UnknownOptionError: - case ErrorType.MissingRequiredOptionError: - throw new ArgumentException( - $"Missing required command line argument '{(error as MissingRequiredOptionError).NameInfo.NameText}'"); - case ErrorType.MutuallyExclusiveSetError: - case ErrorType.BadFormatConversionError: - case ErrorType.SequenceOutOfRangeError: - case ErrorType.RepeatedOptionError: - case ErrorType.NoVerbSelectedError: - case ErrorType.BadVerbSelectedError: - case ErrorType.HelpVerbRequestedError: - break; - } - } - }); - return options; - } - } - - public static 
string Usage() - { - var parser = new Parser((parserSettings) => { - parserSettings.CaseInsensitiveEnumValues = true; - parserSettings.CaseSensitive = false; - parserSettings.EnableDashDash = true; - parserSettings.HelpWriter = new StringWriter(); - parserSettings.IgnoreUnknownArguments = true; - }); - - var helpTextString = new HelpText { - AddDashesToOption = true, - AddEnumValuesToHelpText = true, - AdditionalNewLineAfterOption = false, - Heading = "JitBenchHarness", - MaximumDisplayWidth = 80, - }.AddOptions(parser.ParseArguments<JitBenchHarnessOptions>(new string[] { "--help" })).ToString(); - return helpTextString; - } - - private string _tempDirectory; - private uint _iterations; - } -} diff --git a/tests/src/performance/Scenario/JitBench/Properties/launchSettings.json b/tests/src/performance/Scenario/JitBench/Properties/launchSettings.json new file mode 100644 index 0000000000..00a2e20e4c --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Properties/launchSettings.json @@ -0,0 +1,8 @@ +{ + "profiles": { + "JitBench": { + "commandName": "Project", + "commandLineArgs": "--perf:outputdir F:\\github\\coreclr\\bin\\sandbox_logs\\Scenarios\\On\\JitBench --perf:runid Perf-On --target-architecture x64 --perf:collect BranchMispredictions+CacheMisses+InstructionRetired" + } + } +}
\ No newline at end of file diff --git a/tests/src/performance/Scenario/JitBench/Runner/Benchmark.cs b/tests/src/performance/Scenario/JitBench/Runner/Benchmark.cs new file mode 100644 index 0000000000..b6b18acc7f --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Runner/Benchmark.cs @@ -0,0 +1,219 @@ +using System; +using System.Collections.Generic; +using System.Diagnostics; +using System.IO; +using System.Linq; +using System.Text; +using System.Threading.Tasks; +using Microsoft.Xunit.Performance.Api; +using Microsoft.Xunit.Performance.Api.Profilers.Etw; + +namespace JitBench +{ + public abstract class Benchmark + { + public Benchmark(string name) + { + Name = name; + EnvironmentVariables = new Dictionary<string, string>(); + } + + public string Name { get; private set; } + public string ExePath { get; protected set; } + public string WorkingDirPath { get; protected set; } + public string CommandLineArguments { get; protected set; } + public Dictionary<string, string> EnvironmentVariables { get; private set; } + + public BenchmarkRunResult[] Run(TestRun run, ITestOutputHelper output) + { + using (var runSectionOutput = new IndentedTestOutputHelper($"Run {Name} iterations", output)) + { + return MeasureIterations(run, runSectionOutput); + } + } + + public abstract Task Setup(DotNetInstallation dotnetInstall, string intermediateOutputDir, bool useExistingSetup, ITestOutputHelper output); + + public virtual Metric[] GetDefaultDisplayMetrics() + { + return new Metric[] { Metric.ElapsedTimeMilliseconds }; + } + + BenchmarkRunResult[] MeasureIterations(TestRun run, ITestOutputHelper output) + { + List<BenchmarkRunResult> results = new List<BenchmarkRunResult>(); + foreach (BenchmarkConfiguration config in run.Configurations) + { + results.Add(MeasureIterations(run, config, output)); + } + return results.ToArray(); + } + + BenchmarkRunResult MeasureIterations(TestRun run, BenchmarkConfiguration config, ITestOutputHelper output) + { + // The 
XunitPerformanceHarness is hardcoded to log to the console. It would be nice if the output was configurable somehow + // but in lieu of that we can redirect all console output with light hackery. using (var redirector = new ConsoleRedirector(output)) { // XunitPerformanceHarness expects to do the raw commandline parsing itself, but I really don't like that its default collection + // metric requires the use of ETW. Getting an admin console or admin VS instance isn't where most people start, it's + // a small nuisance, and for these tests it's often not needed/adds non-trivial overhead. I set the default to stopwatch if the + // perf:collect argument hasn't been specified, but that sadly requires that I pre-parse, interpret, and then re-format all the + // args to make that change :( + // + // In TestRun.ValidateMetricNames() I pre-check if ETW is going to be needed and give an error there rather than doing all the + // test setup (~1 minute?) and then giving the error after the user has probably wandered away. That also relies on some of this + // replicated command line parsing. 
+ string[] args = new string[] { "--perf:collect", string.Join("+", run.MetricNames), "--perf:outputdir", run.OutputDir, "--perf:runid", run.BenchviewRunId }; + using (var harness = new XunitPerformanceHarness(args)) + { + ProcessStartInfo startInfo = new ProcessStartInfo(run.DotNetInstallation.DotNetExe, ExePath + " " + CommandLineArguments); + startInfo.WorkingDirectory = WorkingDirPath; + startInfo.RedirectStandardError = true; + startInfo.RedirectStandardOutput = true; + foreach (KeyValuePair<string, string> kv in config.EnvironmentVariables) + { + startInfo.Environment[kv.Key] = kv.Value; + } + foreach (KeyValuePair<string, string> kv in EnvironmentVariables) + { + startInfo.Environment[kv.Key] = kv.Value; + } + startInfo.Environment["DOTNET_MULTILEVEL_LOOKUP"] = "0"; + + BenchmarkRunResult result = new BenchmarkRunResult(this, config); + StringBuilder stderr = new StringBuilder(); + StringBuilder stdout = new StringBuilder(); + var scenarioConfiguration = new ScenarioTestConfiguration(TimeSpan.FromMilliseconds(60000), startInfo) + { + //XUnitPerformanceHarness writes files to disk starting with {runid}-{ScenarioBenchmarkName}-{TestName} + TestName = (Name + "-" + config.Name).Replace(' ', '_'), + Scenario = new ScenarioBenchmark("JitBench"), + Iterations = run.Iterations, + PreIterationDelegate = scenario => + { + stderr.Clear(); + stdout.Clear(); + scenario.Process.ErrorDataReceived += (object sender, DataReceivedEventArgs errorLine) => + { + if(!string.IsNullOrEmpty(errorLine.Data)) + { + stderr.AppendLine(errorLine.Data); + redirector.WriteLine("STDERROR: " + errorLine.Data); + } + }; + scenario.Process.OutputDataReceived += (object sender, DataReceivedEventArgs outputLine) => + { + stdout.AppendLine(outputLine.Data); + redirector.WriteLine(outputLine.Data); + }; + }, + PostIterationDelegate = scenarioResult => + { + result.IterationResults.Add(RecordIterationMetrics(scenarioResult, stdout.ToString(), stderr.ToString(), redirector)); + } + }; + 
harness.RunScenario(scenarioConfiguration, sb => { BenchviewResultExporter.ConvertRunResult(sb, result); }); + return result; + } + } + } + + protected virtual IterationResult RecordIterationMetrics(ScenarioExecutionResult scenarioIteration, string stdout, string stderr, ITestOutputHelper output) + { + IterationResult iterationResult = new IterationResult(); + int elapsedMs = (int)(scenarioIteration.ProcessExitInfo.ExitTime - scenarioIteration.ProcessExitInfo.StartTime).TotalMilliseconds; + iterationResult.Measurements.Add(Metric.ElapsedTimeMilliseconds, elapsedMs); + if (!string.IsNullOrWhiteSpace(scenarioIteration.EventLogFileName) && File.Exists(scenarioIteration.EventLogFileName)) + { + AddEtwData(iterationResult, scenarioIteration, output); + } + return iterationResult; + } + + protected static void AddEtwData( + IterationResult iteration, + ScenarioExecutionResult scenarioExecutionResult, + ITestOutputHelper output) + { + string[] modulesOfInterest = new string[] { + "Anonymously Hosted DynamicMethods Assembly", + "clrjit.dll", + "coreclr.dll", + "dotnet.exe", + "MusicStore.dll", + "ntoskrnl.exe", + "System.Private.CoreLib.dll", + "Unknown", + }; + + // Get the list of processes of interest. + try + { + var processes = new SimpleTraceEventParser().GetProfileData(scenarioExecutionResult); + + // Extract the Pmc data for each one of the processes. + foreach (var process in processes) + { + if (process.Id != scenarioExecutionResult.ProcessExitInfo.ProcessId) + continue; + + iteration.Measurements.Add(new Metric($"PMC/{process.Name}/Duration", "ms"), + process.LifeSpan.Duration.TotalMilliseconds); + + // Add process metrics values. 
+ foreach (var pmcData in process.PerformanceMonitorCounterData) + iteration.Measurements.Add(new Metric($"PMC/{process.Name}/{pmcData.Key.Name}", pmcData.Key.Unit), pmcData.Value); + + foreach (var module in process.Modules) + { + var moduleName = Path.GetFileName(module.FullName); + if (modulesOfInterest.Any(m => m.Equals(moduleName, StringComparison.OrdinalIgnoreCase))) + { + foreach (var pmcData in module.PerformanceMonitorCounterData) + { + Metric m = new Metric($"PMC/{process.Name}!{moduleName}/{pmcData.Key.Name}", pmcData.Key.Unit); + // Sometimes the etw parser gives duplicate module entries which leads to duplicate keys + // but I haven't hunted down the reason. For now it is first one wins. + if (!iteration.Measurements.ContainsKey(m)) + { + iteration.Measurements.Add(m, pmcData.Value); + } + } + + } + } + } + } + catch (InvalidOperationException e) + { + output.WriteLine("Error while processing ETW log: " + scenarioExecutionResult.EventLogFileName); + output.WriteLine(e.ToString()); + } + } + + /// <summary> + /// When serializing the result data to benchview this is called to determine if any of the metrics should be reported differently + /// than they were collected. We use this to collect several measurements in each iteration, then present those measurements + /// to benchview as if each was a distinct test model with its own set of iterations of a single measurement. 
+ /// </summary> + public virtual bool TryGetBenchviewCustomMetricReporting(Metric originalMetric, out Metric newMetric, out string newScenarioModelName) + { + if (originalMetric.Name.StartsWith("PMC/")) + { + int prefixLength = "PMC/".Length; + int secondSlash = originalMetric.Name.IndexOf('/', prefixLength); + newScenarioModelName = originalMetric.Name.Substring(prefixLength, secondSlash - prefixLength); + string newMetricName = originalMetric.Name.Substring(secondSlash+1); + newMetric = new Metric(newMetricName, originalMetric.Unit); + return true; + } + else + { + newMetric = default(Metric); + newScenarioModelName = null; + return false; + } + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Runner/BenchmarkConfiguration.cs b/tests/src/performance/Scenario/JitBench/Runner/BenchmarkConfiguration.cs new file mode 100644 index 0000000000..42aa2e2698 --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Runner/BenchmarkConfiguration.cs @@ -0,0 +1,52 @@ +using System; +using System.Collections.Generic; +using System.Text; + +namespace JitBench +{ + public class BenchmarkConfiguration + { + public BenchmarkConfiguration() + { + Name = "Default"; + EnvironmentVariables = new Dictionary<string, string>(); + } + public bool IsDefault { get { return Name == "Default"; } } + public string Name { get; set; } + public Dictionary<string, string> EnvironmentVariables { get; private set; } + + public BenchmarkConfiguration WithTiering() + { + return WithModifier("Tiering", "COMPLUS_EXPERIMENTAL_TieredCompilation", "1"); + } + + public BenchmarkConfiguration WithMinOpts() + { + return WithModifier("Minopts", "COMPLUS_JitMinOpts", "1"); + } + + public BenchmarkConfiguration WithNoR2R() + { + return WithModifier("NoR2R", "COMPlus_ReadyToRun", "0"); + } + + public BenchmarkConfiguration WithNoNgen() + { + return WithModifier("NoNgen", "COMPLUS_ZapDisable", "1"); + } + + private BenchmarkConfiguration WithModifier(string modifier, string variableName, string 
variableValue) + { + if (IsDefault) + { + Name = modifier; + } + else + { + Name += " " + modifier; + } + EnvironmentVariables.Add(variableName, variableValue); + return this; + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Runner/BenchmarkRunResult.cs b/tests/src/performance/Scenario/JitBench/Runner/BenchmarkRunResult.cs new file mode 100644 index 0000000000..06e2fac644 --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Runner/BenchmarkRunResult.cs @@ -0,0 +1,47 @@ +using System; +using System.Collections.Generic; +using System.Text; + +namespace JitBench +{ + public class BenchmarkRunResult + { + public BenchmarkRunResult(Benchmark benchmark, BenchmarkConfiguration configuration) + { + Benchmark = benchmark; + Configuration = configuration; + IterationResults = new List<IterationResult>(); + } + + public Benchmark Benchmark { get; private set; } + public BenchmarkConfiguration Configuration { get; private set; } + public List<IterationResult> IterationResults { get; private set; } + } + + public class IterationResult + { + public IterationResult() + { + Measurements = new Dictionary<Metric, double>(); + } + public Dictionary<Metric, double> Measurements { get; private set; } + } + + public struct Metric + { + public Metric(string name, string unit) + { + Name = name; + Unit = unit; + } + public string Name { get; private set; } + public string Unit { get; private set; } + + public static readonly Metric ElapsedTimeMilliseconds = new Metric("Duration", "ms"); + + public override string ToString() + { + return $"{Name}({Unit})"; + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Runner/BenchviewResultExporter.cs b/tests/src/performance/Scenario/JitBench/Runner/BenchviewResultExporter.cs new file mode 100644 index 0000000000..b5d3691e00 --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Runner/BenchviewResultExporter.cs @@ -0,0 +1,95 @@ +using System; +using System.Collections.Generic; +using System.IO; +using 
Microsoft.Xunit.Performance.Api; + +namespace JitBench +{ + public static class BenchviewResultExporter + { + public static void ConvertRunResult(ScenarioBenchmark scenario, BenchmarkRunResult runResult) + { + scenario.Tests = new List<ScenarioTestModel>(); + scenario.Tests.AddRange(ConvertRunResult(runResult)); + } + + static ScenarioTestModel[] ConvertRunResult(BenchmarkRunResult runResult) + { + List<ScenarioTestModel> testModels = new List<ScenarioTestModel>(); + string name = runResult.Benchmark.Name; + List<Metric> metrics = CollectMetrics(runResult); + foreach (Metric m in metrics.ToArray()) + { + if(runResult.Benchmark.TryGetBenchviewCustomMetricReporting(m, out Metric newMetric, out string newScenarioModelName)) + { + metrics.Remove(m); + testModels.Add(ConvertRunResult(runResult, new Metric[] { newMetric }, oldMetric => m.Equals(oldMetric) ? newMetric : default(Metric), name, newScenarioModelName)); + } + } + testModels.Insert(0, ConvertRunResult(runResult, metrics, oldMetric => metrics.Contains(oldMetric) ? 
oldMetric : default(Metric), null, name)); + return testModels.ToArray(); + } + + static ScenarioTestModel ConvertRunResult(BenchmarkRunResult runResult, IEnumerable<Metric> metrics, Func<Metric,Metric> metricMapping, string scenarioModelNamespace, string scenarioModelName) + { + var testModel = new ScenarioTestModel(scenarioModelName); + testModel.Namespace = scenarioModelNamespace; + testModel.Performance = new PerformanceModel(); + testModel.Performance.Metrics = new List<MetricModel>(); + testModel.Performance.IterationModels = new List<IterationModel>(); + foreach (var iterationResult in runResult.IterationResults) + { + testModel.Performance.IterationModels.Add(ConvertIterationResult(iterationResult, metricMapping)); + } + foreach (var metric in metrics) + { + testModel.Performance.Metrics.Add(new MetricModel() + { + DisplayName = metric.Name, + Name = metric.Name, + Unit = metric.Unit + }); + } + return testModel; + } + + static List<Metric> CollectMetrics(BenchmarkRunResult runResult) + { + List<Metric> metrics = new List<Metric>(); + foreach(IterationResult iterationResult in runResult.IterationResults) + { + foreach (KeyValuePair<Metric, double> measurement in iterationResult.Measurements) + { + if (!metrics.Contains(measurement.Key)) + { + metrics.Add(measurement.Key); + } + } + } + return metrics; + } + + /// <summary> + /// Converts IterationResult into Benchview's IterationModel, remapping and filtering the metrics reported + /// </summary> + static IterationModel ConvertIterationResult(IterationResult iterationResult, Func<Metric, Metric> metricMapping) + { + IterationModel iterationModel = new IterationModel(); + iterationModel.Iteration = new Dictionary<string, double>(); + foreach(KeyValuePair<Metric,double> measurement in iterationResult.Measurements) + { + Metric finalMetric = metricMapping(measurement.Key); + if(!finalMetric.Equals(default(Metric))) + { + iterationModel.Iteration.Add(finalMetric.Name, measurement.Value); + } + } + return 
iterationModel; + } + + private static string GetFilePathWithoutExtension(string outputDir, string runId, ScenarioBenchmark benchmark) + { + return Path.Combine(outputDir, $"{runId}-{benchmark.Name}"); + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Runner/CommandLineOptions.cs b/tests/src/performance/Scenario/JitBench/Runner/CommandLineOptions.cs new file mode 100644 index 0000000000..de6e3a286d --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Runner/CommandLineOptions.cs @@ -0,0 +1,181 @@ +using CommandLine; +using CommandLine.Text; +using System; +using System.Collections.Generic; +using System.IO; +using System.Linq; +using System.Reflection; + +namespace JitBench +{ + // Licensed to the .NET Foundation under one or more agreements. + // The .NET Foundation licenses this file to you under the MIT license. + // See the LICENSE file in the project root for more information. + + + + /// <summary> + /// Provides an interface to parse the command line arguments passed to the TieredJitBench harness. + /// </summary> + internal sealed class CommandLineOptions + { + public CommandLineOptions() { } + + [Option("use-existing-setup", Required = false, HelpText = "Use existing setup for all benchmarks.")] + public Boolean UseExistingSetup { get; set; } + + [Option("coreclr-bin-dir", Required = false, HelpText = "Copy private CoreCLR binaries from this directory. 
(The binaries must match target-architecture)")] + public string CoreCLRBinaryDir { get; set; } + + [Option("dotnet-framework-version", Required = false, HelpText = "The version of dotnet on which private CoreCLR binaries will be overlaid")] + public string DotnetFrameworkVersion { get; set; } + + [Option("dotnet-sdk-version", Required = false, HelpText = "The version of dotnet SDK to install for this test")] + public string DotnetSdkVersion { get; set; } + + [Option("configs", Required = false, Separator=',', HelpText = "A comma-separated list of all configurations that the benchmarks will be run with. The options are: Default, Tiering, Minopts, NoR2R, and NoNgen. " + + "If not specified this defaults to a list containing only Default.")] + public IEnumerable<string> Configs { get; set; } + + [Option("iterations", Required = false, HelpText = "Number of iterations to run.")] + public uint Iterations { get; set; } + + [Option("perf:outputdir", Required = false, HelpText = "Specifies the output directory name.")] + public string OutputDirectory { get; set; } + + [Option("target-architecture", Required = false, HelpText = "The architecture of the binaries being tested.")] + public string TargetArchitecture { get; set; } + + [Option("benchmark", Required=false, HelpText = "A semicolon-separated list of benchmarks to run")] + public string BenchmarkName { get; set; } + + [Option("perf:runid", Required = false, HelpText = "User defined id given to the performance harness.")] + public string RunId + { + get { return _runid; } + + set + { + if (string.IsNullOrWhiteSpace(value)) + { + throw new Exception("The RunId cannot be null, empty or white space."); + } + + if (value.Any(c => Path.GetInvalidFileNameChars().Contains(c))) + { + throw new Exception("Specified RunId contains invalid file name characters."); + } + + _runid = value; + } + } + + /* + * Provider & Reader + * + * --perf:collect [metric1[+metric2[+...]]] + * + * default + * Set by the test author (This is the default 
behavior if no option is specified. It will also enable ETW to capture some of the Microsoft-Windows-DotNETRuntime tasks). + * + * stopwatch + * Capture elapsed time using a Stopwatch (It does not require ETW). + * + * BranchMispredictions|CacheMisses|InstructionRetired + * These are performance metric counters and require ETW. + * + * gcapi + * It currently enables "Allocation Size on Benchmark Execution Thread" and it is only available through ETW. + * + * Examples + * --perf:collect default + * Collect metrics specified in the test source code by using xUnit Performance API attributes + * + * --perf:collect BranchMispredictions+CacheMisses+InstructionRetired + * Collects PMC metrics + * + * --perf:collect stopwatch + * Collects duration + * + * --perf:collect default+BranchMispredictions+CacheMisses+InstructionRetired+gcapi + * '+' implies union of all specified options + */ + [Option("perf:collect", Required = false, Separator = '+', Hidden = true, + HelpText = "The metrics to be collected.")] + public IEnumerable<string> MetricNames { get; set; } + + public static CommandLineOptions Parse(string[] args) + { + using (var parser = new Parser((settings) => + { + settings.CaseInsensitiveEnumValues = true; + settings.CaseSensitive = false; + settings.HelpWriter = new StringWriter(); + settings.IgnoreUnknownArguments = true; + })) + { + CommandLineOptions options = null; + parser.ParseArguments<CommandLineOptions>(args) + .WithParsed(parsed => options = parsed) + .WithNotParsed(errors => + { + foreach (Error error in errors) + { + switch (error.Tag) + { + case ErrorType.MissingValueOptionError: + throw new ArgumentException( + $"Missing value option for command line argument '{(error as MissingValueOptionError).NameInfo.NameText}'"); + case ErrorType.HelpRequestedError: + Console.WriteLine(Usage()); + Environment.Exit(0); + break; + case ErrorType.VersionRequestedError: + Console.WriteLine(new 
AssemblyName(typeof(CommandLineOptions).GetTypeInfo().Assembly.FullName).Version); + Environment.Exit(0); + break; + case ErrorType.BadFormatTokenError: + case ErrorType.UnknownOptionError: + case ErrorType.MissingRequiredOptionError: + throw new ArgumentException( + $"Missing required command line argument '{(error as MissingRequiredOptionError).NameInfo.NameText}'"); + case ErrorType.MutuallyExclusiveSetError: + case ErrorType.BadFormatConversionError: + case ErrorType.SequenceOutOfRangeError: + case ErrorType.RepeatedOptionError: + case ErrorType.NoVerbSelectedError: + case ErrorType.BadVerbSelectedError: + case ErrorType.HelpVerbRequestedError: + break; + } + } + }); + return options; + } + } + + public static string Usage() + { + var parser = new Parser((parserSettings) => + { + parserSettings.CaseInsensitiveEnumValues = true; + parserSettings.CaseSensitive = false; + parserSettings.EnableDashDash = true; + parserSettings.HelpWriter = new StringWriter(); + parserSettings.IgnoreUnknownArguments = true; + }); + + var helpTextString = new HelpText + { + AddDashesToOption = true, + AddEnumValuesToHelpText = true, + AdditionalNewLineAfterOption = false, + Heading = "JitBench", + MaximumDisplayWidth = 80, + }.AddOptions(parser.ParseArguments<CommandLineOptions>(new string[] { "--help" })).ToString(); + return helpTextString; + } + + private string _runid; + } +} diff --git a/tests/src/performance/Scenario/JitBench/Runner/Program.cs b/tests/src/performance/Scenario/JitBench/Runner/Program.cs new file mode 100644 index 0000000000..9b81c94118 --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Runner/Program.cs @@ -0,0 +1,221 @@ +using System; +using System.Collections.Generic; +using System.IO; +using System.Linq; +using System.Reflection; +using System.Runtime.InteropServices; +using System.Threading; +using System.Threading.Tasks; + +namespace JitBench +{ + class Program + { + public static void Main(string[] args) + { + CommandLineOptions options = 
CommandLineOptions.Parse(args); + TestRun testRun = ConfigureTestRun(options); + + ConsoleTestOutputHelper console = new ConsoleTestOutputHelper(); + string logPath = Path.Combine(testRun.OutputDir, "JitBench_log.txt"); + FileTestOutputHelper logOutput = new FileTestOutputHelper(logPath); + + testRun.WriteConfiguration(console); + testRun.WriteConfiguration(logOutput); + console.WriteLine(""); + console.WriteLine(""); + console.WriteLine("Benchmark run in progress..."); + console.WriteLine("Verbose log: " + logPath); + console.WriteLine(""); + + testRun.Run(logOutput); + testRun.WriteBenchmarkResults(console); + } + + static TestRun ConfigureTestRun(CommandLineOptions options) + { + TestRun run = new TestRun() + { + OutputDir = GetInitialWorkingDir(), + DotnetFrameworkVersion = JitBench.VersioningConstants.MicrosoftNetCoreAppPackageVersion, + Iterations = 11 + }; + + if(options.OutputDirectory != null) + { + run.OutputDir = options.OutputDirectory; + } + + if(options.CoreCLRBinaryDir != null) + { + if(!Directory.Exists(options.CoreCLRBinaryDir)) + { + throw new Exception("coreclr-bin-dir directory " + options.CoreCLRBinaryDir + " does not exist"); + } + run.PrivateCoreCLRBinDir = options.CoreCLRBinaryDir; + } + else + { + string coreRootEnv = Environment.GetEnvironmentVariable("CORE_ROOT"); + if (coreRootEnv != null) + { + if (!Directory.Exists(coreRootEnv)) + { + throw new Exception("CORE_ROOT directory " + coreRootEnv + " does not exist"); + } + run.PrivateCoreCLRBinDir = coreRootEnv; + } + else + { + //maybe we've got private coreclr binaries in our current directory? Use those if so. 
+ string currentDirectory = Directory.GetCurrentDirectory(); + if(File.Exists(Path.Combine(currentDirectory, "System.Private.CoreLib.dll"))) + { + run.PrivateCoreCLRBinDir = currentDirectory; + } + else + { + // don't use private CoreCLR binaries + } + } + } + + if(options.DotnetFrameworkVersion != null) + { + run.DotnetFrameworkVersion = options.DotnetFrameworkVersion; + } + + if(options.DotnetSdkVersion != null) + { + run.DotnetSdkVersion = options.DotnetSdkVersion; + } + else + { + run.DotnetSdkVersion = DotNetSetup.GetCompatibleDefaultSDKVersionForRuntimeVersion(run.DotnetFrameworkVersion); + } + + + if(options.TargetArchitecture != null) + { + if(options.TargetArchitecture.Equals("x64", StringComparison.OrdinalIgnoreCase)) + { + run.Architecture = Architecture.X64; + } + else if(options.TargetArchitecture.Equals("x86", StringComparison.OrdinalIgnoreCase)) + { + run.Architecture = Architecture.X86; + } + else + { + throw new Exception("Unrecognized architecture " + options.TargetArchitecture); + } + } + else + { + run.Architecture = RuntimeInformation.ProcessArchitecture; + } + + if(options.Iterations > 0) + { + run.Iterations = (int)options.Iterations; + } + + run.UseExistingSetup = options.UseExistingSetup; + run.BenchviewRunId = options.RunId ?? 
"Unofficial"; + run.MetricNames.AddRange(options.MetricNames); + run.Benchmarks.AddRange(GetBenchmarkSelection(options)); + run.Configurations.AddRange(GetBenchmarkConfigurations(options)); + + return run; + } + + static string GetInitialWorkingDir() + { + string timestamp = DateTime.Now.ToString("yyyy\\_MM\\_dd\\_hh\\_mm\\_ss\\_ffff"); + return Path.Combine(Path.GetTempPath(), "JitBench_" + timestamp); + } + + static IEnumerable<Benchmark> GetBenchmarkSelection(CommandLineOptions options) + { + if(options.BenchmarkName == null) + { + return GetAllBenchmarks(); + } + else + { + string[] names = options.BenchmarkName.Split(';'); + return GetAllBenchmarks().Where(b => names.Any(n => n.Equals(b.Name, StringComparison.OrdinalIgnoreCase))); + } + } + + static IEnumerable<Benchmark> GetAllBenchmarks() + { + IEnumerable<Type> benchmarkTypes = typeof(Program).GetTypeInfo().Assembly.GetTypes().Where(t => typeof(Benchmark).IsAssignableFrom(t)); + foreach (Type bt in benchmarkTypes) + { + ConstructorInfo c = bt.GetConstructor(Type.EmptyTypes); + if (c != null) + { + yield return (Benchmark)c.Invoke(null); + } + } + } + + static IEnumerable<BenchmarkConfiguration> GetBenchmarkConfigurations(CommandLineOptions options) + { + string tieredEnv = Environment.GetEnvironmentVariable("COMPLUS_EXPERIMENTAL_TieredCompilation"); + string minoptsEnv = Environment.GetEnvironmentVariable("COMPLUS_JitMinopts"); + string r2rEnv = Environment.GetEnvironmentVariable("COMPLUS_ReadyToRun"); + string ngenEnv = Environment.GetEnvironmentVariable("COMPLUS_ZapDisable"); + BenchmarkConfiguration envConfig = new BenchmarkConfiguration(); + if(tieredEnv != null && tieredEnv != "0") + { + envConfig.WithTiering(); + } + if (minoptsEnv != null && minoptsEnv != "0") + { + envConfig.WithMinOpts(); + } + if(r2rEnv != null && r2rEnv != "1") + { + envConfig.WithNoR2R(); + } + if(ngenEnv != null && ngenEnv != "0") + { + envConfig.WithNoNgen(); + } + + string[] configNames = options.Configs.Distinct().ToArray(); 
+ if (!envConfig.IsDefault && configNames.Length != 0) + { + throw new Exception("ERROR: Benchmarks cannot be configured via both environment variables and the --configs command line option at the same time. Use one or the other."); + } + if (configNames.Length == 0) + { + yield return envConfig; + yield break; + } + + BenchmarkConfiguration[] possibleConfigs = new BenchmarkConfiguration[] + { + new BenchmarkConfiguration(), + new BenchmarkConfiguration().WithTiering(), + new BenchmarkConfiguration().WithMinOpts(), + new BenchmarkConfiguration().WithNoR2R(), + new BenchmarkConfiguration().WithNoNgen() + }; + foreach(string configName in configNames) + { + BenchmarkConfiguration config = possibleConfigs.Where(c => c.Name.Equals(configName, StringComparison.OrdinalIgnoreCase)).FirstOrDefault(); + if(config == null) + { + throw new ArgumentException("Unrecognized config value: " + configName); + } + else + { + yield return config; + } + } + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Runner/Statistics.cs b/tests/src/performance/Scenario/JitBench/Runner/Statistics.cs new file mode 100644 index 0000000000..3121441a9f --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Runner/Statistics.cs @@ -0,0 +1,71 @@ +using System; +using System.Collections.Generic; +using System.Linq; +using System.Text; + +namespace JitBench +{ + public static class Statistics + { + public static double SampleStandardDeviation(this IEnumerable<double> data) + { + int n = data.Count(); + double sampleMean = data.Average(); + return Math.Sqrt(data.Select(x => (x - sampleMean) * (x - sampleMean)).Sum() / (n - 1)); + } + + public static double StandardError(this IEnumerable<double> data) + { + int n = data.Count(); + return SampleStandardDeviation(data) / Math.Sqrt(n); + } + + public static double MarginOfError95(this IEnumerable<double> data) + { + return StandardError(data) * 1.96; + } + + public static double Median(this IEnumerable<double> data) + { + double[] dataArr = 
data.ToArray(); + Array.Sort(dataArr); + if(dataArr.Length % 2 == 1) + { + return dataArr[dataArr.Length / 2]; + } + else + { + int midpoint = dataArr.Length / 2; + return (dataArr[midpoint-1] + dataArr[midpoint]) / 2; + } + } + + public static double Quartile1(this IEnumerable<double> data) + { + double[] dataArr = data.ToArray(); + Array.Sort(dataArr); + if (dataArr.Length % 2 == 1) + { + return Median(dataArr.Take(dataArr.Length / 2 + 1)); + } + else + { + return Median(dataArr.Take(dataArr.Length / 2)); + } + } + + public static double Quartile3(this IEnumerable<double> data) + { + double[] dataArr = data.ToArray(); + Array.Sort(dataArr); + if (dataArr.Length % 2 == 1) + { + return Median(dataArr.Skip(dataArr.Length / 2 )); + } + else + { + return Median(dataArr.Skip(dataArr.Length / 2)); + } + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Runner/TestRun.cs b/tests/src/performance/Scenario/JitBench/Runner/TestRun.cs new file mode 100644 index 0000000000..492bcd57b0 --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Runner/TestRun.cs @@ -0,0 +1,281 @@ +using System; +using System.Collections.Generic; +using System.IO; +using System.Linq; +using System.Runtime.InteropServices; +using System.Text; +using System.Threading.Tasks; +using Microsoft.Diagnostics.Tracing.Session; + +namespace JitBench +{ + public class TestRun + { + public TestRun() + { + Benchmarks = new List<Benchmark>(); + Configurations = new List<BenchmarkConfiguration>(); + BenchmarkRunResults = new List<BenchmarkRunResult>(); + MetricNames = new List<string>(); + } + + public bool UseExistingSetup { get; set; } + public string DotnetFrameworkVersion { get; set; } + public string DotnetSdkVersion { get; set; } + public string PrivateCoreCLRBinDir { get; set; } + public Architecture Architecture { get; set; } + public string OutputDir { get; set; } + public int Iterations { get; set; } + public List<Benchmark> Benchmarks { get; } + public List<BenchmarkConfiguration> 
Configurations { get; private set; } + public List<string> MetricNames { get; private set; } + public string BenchviewRunId { get; set; } + public DotNetInstallation DotNetInstallation { get; private set; } + public List<BenchmarkRunResult> BenchmarkRunResults { get; private set; } + + + public void Run(ITestOutputHelper output) + { + CheckConfiguration(); + SetupBenchmarks(output).Wait(); + RunBenchmarks(output); + WriteBenchmarkResults(output); + } + + public void CheckConfiguration() + { + ValidateOutputDir(); + ValidateMetrics(); + } + + private void ValidateMetrics() + { + var validCollectionOptions = new[] { + "default", + "gcapi", + "stopwatch", + "BranchMispredictions", + "CacheMisses", + "InstructionRetired", + }; + var reducedList = MetricNames.Distinct(StringComparer.OrdinalIgnoreCase); + var isSubset = !reducedList.Except(validCollectionOptions, StringComparer.OrdinalIgnoreCase).Any(); + + if (!isSubset) + { + var errorMessage = $"Valid collection metrics are: {string.Join("|", validCollectionOptions)}"; + throw new InvalidOperationException(errorMessage); + } + + MetricNames = reducedList.Count() > 0 ? new List<string>(reducedList) : new List<string> { "stopwatch" }; + + if (MetricNames.Any(n => !n.Equals("stopwatch"))) + { + if (TraceEventSession.IsElevated() != true) + { + throw new UnauthorizedAccessException("The application is required to run as Administrator in order to capture kernel data"); + } + } + } + + private void ValidateOutputDir() + { + if (string.IsNullOrWhiteSpace(OutputDir)) + throw new InvalidOperationException("The output directory name cannot be null, empty or white space."); + + if (OutputDir.Any(c => Path.GetInvalidPathChars().Contains(c))) + throw new InvalidOperationException($"Specified output directory {OutputDir} contains invalid path characters."); + + OutputDir = Path.IsPathRooted(OutputDir) ? 
OutputDir : Path.GetFullPath(OutputDir); + if (OutputDir.Length > 80) + { + throw new InvalidOperationException($"The output directory path {OutputDir} is too long (>80 characters). Tests writing here may trigger errors because of path length limits"); + } + try + { + Directory.CreateDirectory(OutputDir); + } + catch (IOException e) + { + throw new Exception($"Unable to create output directory {OutputDir}: {e.Message}", e); + } + } + + public void WriteConfiguration(ITestOutputHelper output) + { + output.WriteLine(""); + output.WriteLine(" === CONFIGURATION ==="); + output.WriteLine(""); + output.WriteLine("DotnetFrameworkVersion: " + DotnetFrameworkVersion); + output.WriteLine("DotnetSdkVersion: " + DotnetSdkVersion); + output.WriteLine("PrivateCoreCLRBinDir: " + PrivateCoreCLRBinDir); + output.WriteLine("Architecture: " + Architecture); + output.WriteLine("OutputDir: " + OutputDir); + output.WriteLine("Iterations: " + Iterations); + output.WriteLine("UseExistingSetup: " + UseExistingSetup); + output.WriteLine("Configurations: " + string.Join(",", Configurations.Select(c => c.Name))); + } + + async Task SetupBenchmarks(ITestOutputHelper output) + { + output.WriteLine(""); + output.WriteLine(" === SETUP ==="); + output.WriteLine(""); + + if(UseExistingSetup) + { + output.WriteLine("UseExistingSetup is TRUE. 
Setup will be skipped."); + } + await PrepareDotNet(output); + foreach (Benchmark benchmark in Benchmarks) + { + await benchmark.Setup(DotNetInstallation, OutputDir, UseExistingSetup, output); + } + } + + async Task PrepareDotNet(ITestOutputHelper output) + { + if (!UseExistingSetup) + { + DotNetSetup setup = new DotNetSetup(Path.Combine(OutputDir, ".dotnet")) + .WithSdkVersion(DotnetSdkVersion) + .WithArchitecture(Architecture); + if(DotnetFrameworkVersion != "use-sdk") + { + setup.WithFrameworkVersion(DotnetFrameworkVersion); + } + if (PrivateCoreCLRBinDir != null) + { + setup.WithPrivateRuntimeBinaryOverlay(PrivateCoreCLRBinDir); + } + DotNetInstallation = await setup.Run(output); + } + else + { + DotNetInstallation = new DotNetInstallation(Path.Combine(OutputDir, ".dotnet"), DotnetFrameworkVersion, DotnetSdkVersion, Architecture); + } + } + + void RunBenchmarks(ITestOutputHelper output) + { + output.WriteLine(""); + output.WriteLine(" === EXECUTION ==="); + output.WriteLine(""); + foreach (Benchmark benchmark in Benchmarks) + { + BenchmarkRunResults.AddRange(benchmark.Run(this, output)); + } + } + + public void WriteBenchmarkResults(ITestOutputHelper output) + { + output.WriteLine(""); + output.WriteLine(" === RESULTS ==="); + output.WriteLine(""); + WriteBenchmarkResultsTable((b, m) => b.GetDefaultDisplayMetrics().Any(metric => metric.Equals(m)), output); + } + + void WriteBenchmarkResultsTable(Func<Benchmark,Metric, bool> primaryMetricSelector, ITestOutputHelper output) + { + List<ResultTableRowModel> rows = BuildRowModels(primaryMetricSelector); + List<ResultTableColumn> columns = BuildColumns(); + List<List<string>> formattedCells = new List<List<string>>(); + List<string> headerCells = new List<string>(); + foreach(var column in columns) + { + headerCells.Add(column.Heading); + } + formattedCells.Add(headerCells); + foreach(var row in rows) + { + List<string> rowFormattedCells = new List<string>(); + foreach(var column in columns) + { + 
rowFormattedCells.Add(column.CellFormatter(row)); + } + formattedCells.Add(rowFormattedCells); + } + StringBuilder headerRow = new StringBuilder(); + StringBuilder headerRowUnderline = new StringBuilder(); + StringBuilder rowFormat = new StringBuilder(); + for (int j = 0; j < columns.Count; j++) + { + int columnWidth = Enumerable.Range(0, formattedCells.Count).Select(i => formattedCells[i][j].Length).Max(); + int hw = headerCells[j].Length; + headerRow.Append(headerCells[j].PadLeft(hw + (columnWidth - hw) / 2).PadRight(columnWidth + 2)); + headerRowUnderline.Append(new string('-', columnWidth) + " "); + rowFormat.Append("{" + j + "," + columnWidth + "} "); + } + output.WriteLine(headerRow.ToString()); + output.WriteLine(headerRowUnderline.ToString()); + for(int i = 1; i < formattedCells.Count; i++) + { + output.WriteLine(string.Format(rowFormat.ToString(), formattedCells[i].ToArray())); + } + } + + List<ResultTableRowModel> BuildRowModels(Func<Benchmark, Metric, bool> primaryMetricSelector) + { + List<ResultTableRowModel> rows = new List<ResultTableRowModel>(); + foreach (Benchmark benchmark in Benchmarks) + { + BenchmarkRunResult canonResult = BenchmarkRunResults.Where(r => r.Benchmark == benchmark).FirstOrDefault(); + if (canonResult == null || canonResult.IterationResults == null || canonResult.IterationResults.Count == 0) + { + continue; + } + IterationResult canonIteration = canonResult.IterationResults[0]; + foreach (Metric metric in canonIteration.Measurements.Keys) + { + if (primaryMetricSelector(benchmark, metric)) + { + rows.Add(new ResultTableRowModel() { Benchmark = benchmark, Metric = metric }); + } + } + } + return rows; + } + + List<ResultTableColumn> BuildColumns() + { + List<ResultTableColumn> columns = new List<ResultTableColumn>(); + ResultTableColumn benchmarkColumn = new ResultTableColumn(); + benchmarkColumn.Heading = "Benchmark"; + benchmarkColumn.CellFormatter = row => row.Benchmark.Name; + columns.Add(benchmarkColumn); + ResultTableColumn 
metricNameColumn = new ResultTableColumn(); + metricNameColumn.Heading = "Metric"; + metricNameColumn.CellFormatter = row => $"{row.Metric.Name} ({row.Metric.Unit})"; + columns.Add(metricNameColumn); + foreach(BenchmarkConfiguration config in Configurations) + { + ResultTableColumn column = new ResultTableColumn(); + column.Heading = config.Name; + column.CellFormatter = row => + { + var runResult = BenchmarkRunResults.Where(r => r.Benchmark == row.Benchmark && r.Configuration == config).Single(); + var measurements = runResult.IterationResults.Skip(1).Select(r => r.Measurements.Where(kv => kv.Key.Equals(row.Metric)).Single()).Select(kv => kv.Value); + double median = measurements.Median(); + double q1 = measurements.Quartile1(); + double q3 = measurements.Quartile3(); + int digits = Math.Min(Math.Max(0, (int)Math.Ceiling(-Math.Log10(q3-q1) + 1)), 15); + return $"{Math.Round(median, digits)} ({Math.Round(q1, digits)}-{Math.Round(q3, digits)})"; + }; + columns.Add(column); + } + return columns; + } + + class ResultTableRowModel + { + public Benchmark Benchmark; + public Metric Metric; + } + + class ResultTableColumn + { + public string Heading; + public Func<ResultTableRowModel, string> CellFormatter; + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Utilities/ConsoleRedirector.cs b/tests/src/performance/Scenario/JitBench/Utilities/ConsoleRedirector.cs new file mode 100644 index 0000000000..372516976c --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Utilities/ConsoleRedirector.cs @@ -0,0 +1,49 @@ +using System; +using System.Collections.Generic; +using System.IO; +using System.Text; + +namespace JitBench +{ + /// <summary> + /// Diverts console output into a ITestOutputHelper + /// </summary> + public class ConsoleRedirector : IDisposable, ITestOutputHelper + { + ITestOutputHelper _output; + TextWriter _originalConsoleOut; + MemoryStream _bufferedConsoleStream; + StreamWriter _bufferedConsoleWriter; + + public 
ConsoleRedirector(ITestOutputHelper output) + { + _output = output; + _originalConsoleOut = Console.Out; + _bufferedConsoleStream = new MemoryStream(); + Console.SetOut(_bufferedConsoleWriter = new StreamWriter(_bufferedConsoleStream)); + } + + public void Dispose() + { + Console.SetOut(_originalConsoleOut); + if(_output != null) + { + _bufferedConsoleWriter.Flush(); + StreamReader reader = new StreamReader(_bufferedConsoleStream); + _bufferedConsoleStream.Seek(0, SeekOrigin.Begin); + while (true) + { + string line = reader.ReadLine(); + if (line == null) + break; + _output.WriteLine(line); + } + } + } + + public void WriteLine(string line) + { + _bufferedConsoleWriter.WriteLine(line); + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Utilities/ConsoleTestOutputHelper.cs b/tests/src/performance/Scenario/JitBench/Utilities/ConsoleTestOutputHelper.cs new file mode 100644 index 0000000000..61eb053e0c --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Utilities/ConsoleTestOutputHelper.cs @@ -0,0 +1,12 @@ +using System; + +namespace JitBench +{ + public class ConsoleTestOutputHelper : ITestOutputHelper + { + public void WriteLine(string message) + { + Console.WriteLine(message); + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Utilities/DotNetSetup.cs b/tests/src/performance/Scenario/JitBench/Utilities/DotNetSetup.cs new file mode 100644 index 0000000000..9f8a0da3c1 --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Utilities/DotNetSetup.cs @@ -0,0 +1,365 @@ +using System; +using System.IO; +using System.IO.Compression; +using System.Net; +using System.Runtime.InteropServices; +using System.Threading; +using System.Threading.Tasks; + +namespace JitBench +{ + public class DotNetSetup + { + public DotNetSetup(string dotNetDirPath) + { + DotNetDirPath = dotNetDirPath; + OS = DefaultOSPlatform; + Architecture = RuntimeInformation.OSArchitecture; + AzureFeed = DefaultAzureFeed; + } + + public string DotNetDirPath { get; set; } + 
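The ConsoleRedirector above swaps Console.Out for a buffered writer and replays the captured text when disposed. A minimal standalone sketch of the same capture-and-replay pattern (a simplified illustration, not the JitBench type itself):

```csharp
using System;
using System.IO;

class CaptureDemo
{
    static void Main()
    {
        TextWriter original = Console.Out;
        var buffer = new StringWriter();
        Console.SetOut(buffer);                    // divert console output into the buffer
        try
        {
            Console.WriteLine("captured line");    // goes into the buffer, not the console
        }
        finally
        {
            Console.SetOut(original);              // always restore the real console
        }
        // replay everything that was captured, line by line
        foreach (string line in buffer.ToString()
                 .Split(new[] { Environment.NewLine }, StringSplitOptions.RemoveEmptyEntries))
        {
            Console.WriteLine("> " + line);
        }
    }
}
```

Restoring the original writer in a finally block mirrors what ConsoleRedirector.Dispose does, so output is not silently lost if the captured code throws.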
public string FrameworkVersion { get; set; } + public string SdkVersion { get; set; } + public OSPlatform OS { get; set; } + public Architecture Architecture { get; set; } + public string AzureFeed { get; set; } + public string PrivateRuntimeBinaryDirPath { get; set; } + + public DotNetSetup WithFrameworkVersion(string version) + { + FrameworkVersion = version; + return this; + } + + public DotNetSetup WithSdkVersion(string version) + { + SdkVersion = version; + return this; + } + + public DotNetSetup WithArchitecture(Architecture architecture) + { + Architecture = architecture; + return this; + } + + public DotNetSetup WithOS(OSPlatform os) + { + OS = os; + return this; + } + + public DotNetSetup WithPrivateRuntimeBinaryOverlay(string privateRuntimeBinaryDirPath) + { + PrivateRuntimeBinaryDirPath = privateRuntimeBinaryDirPath; + return this; + } + + public string GetFrameworkDownloadLink() + { + if(FrameworkVersion == null) + { + return null; + } + else + { + return GetFrameworkDownloadLink(AzureFeed, FrameworkVersion, OS, Architecture); + } + } + + public string GetSDKDownloadLink() + { + if(SdkVersion == null) + { + return null; + } + else + { + return GetSDKDownloadLink(AzureFeed, SdkVersion, OS, Architecture); + } + } + + async public Task<DotNetInstallation> Run(ITestOutputHelper output) + { + using (var acquireOutput = new IndentedTestOutputHelper("Acquiring DotNet", output)) + { + string remoteSdkPath = GetSDKDownloadLink(); + if(remoteSdkPath != null) + { + await FileTasks.DownloadAndUnzip(remoteSdkPath, DotNetDirPath, acquireOutput); + } + string remoteRuntimePath = GetFrameworkDownloadLink(); + if(remoteRuntimePath != null) + { + await FileTasks.DownloadAndUnzip(remoteRuntimePath, DotNetDirPath, acquireOutput); + + // the SDK may have included another runtime version, but to help prevent mistakes + // where a test might run against a different version than we intended all other + // versions will be deleted. 
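The fluent WithXxx methods above all return this, so a setup can be configured in one chained expression. A hypothetical usage sketch, assuming the JitBench types defined in this file; the install path and version strings are illustrative placeholders, not values taken from the source:

```csharp
using System.Runtime.InteropServices;
using System.Threading.Tasks;

// Assumes the JitBench namespace types (DotNetSetup, DotNetInstallation,
// ConsoleTestOutputHelper). Paths and versions below are placeholders.
class SetupDemo
{
    static async Task Main()
    {
        DotNetInstallation install = await new DotNetSetup(@"C:\jitbench\.dotnet")
            .WithSdkVersion("2.2.0-preview1-007558")
            .WithFrameworkVersion("2.1.0")                      // placeholder version
            .WithArchitecture(Architecture.X64)
            .WithPrivateRuntimeBinaryOverlay(@"C:\coreclr\bin\Product\Windows_NT.x64.Release")
            .Run(new ConsoleTestOutputHelper());
        System.Console.WriteLine(install.FrameworkDir);
    }
}
```

Note that Run performs network downloads from the Azure feed, so a sketch like this only succeeds with connectivity to DefaultAzureFeed.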
+ string mnappDirPath = Path.Combine(DotNetDirPath, "shared", "Microsoft.NETCore.App"); + foreach (string dir in Directory.GetDirectories(mnappDirPath)) + { + string versionDir = Path.GetFileName(dir); + if (versionDir != FrameworkVersion) + { + FileTasks.DeleteDirectory(dir, acquireOutput); + } + } + } + string actualFrameworkVersion = FrameworkVersion; + if (actualFrameworkVersion == null) + { + //if Framework version is being inferred from an SDK then snoop the filesystem to see what got installed + foreach (string dirPath in Directory.EnumerateDirectories(Path.Combine(DotNetDirPath, "shared", "Microsoft.NETCore.App"))) + { + actualFrameworkVersion = Path.GetFileName(dirPath); + break; + } + } + + + DotNetInstallation result = new DotNetInstallation(DotNetDirPath, actualFrameworkVersion, SdkVersion, Architecture); + acquireOutput.WriteLine("Dotnet path: " + result.DotNetExe); + if (!File.Exists(result.DotNetExe)) + { + throw new FileNotFoundException(result.DotNetExe + " not found"); + } + if (result.SdkVersion != null) + { + if (!Directory.Exists(result.SdkDir)) + { + throw new DirectoryNotFoundException("Sdk directory " + result.SdkDir + " not found"); + } + } + if (result.FrameworkVersion != null) + { + if (!Directory.Exists(result.FrameworkDir)) + { + throw new DirectoryNotFoundException("Framework directory " + result.FrameworkDir + " not found"); + } + + //overlay private binaries if needed + if (PrivateRuntimeBinaryDirPath != null) + { + foreach (string fileName in GetPrivateRuntimeOverlayBinaryNames(OS)) + { + string backupPath = Path.Combine(result.FrameworkDir, fileName + ".original"); + string overwritePath = Path.Combine(result.FrameworkDir, fileName); + string privateBinPath = Path.Combine(PrivateRuntimeBinaryDirPath, fileName); + if (!File.Exists(backupPath)) + { + File.Copy(overwritePath, backupPath); + } + if (!File.Exists(privateBinPath)) + { + throw new FileNotFoundException("Private binary " + privateBinPath + " not found"); + } + 
File.Copy(privateBinPath, overwritePath, true); + } + } + } + return result; + } + } + + public static string DefaultFrameworkVersion { get { return "2.0.0"; } } + + public static string DefaultAzureFeed { get { return "https://dotnetcli.azureedge.net/dotnet"; } } + + public static OSPlatform DefaultOSPlatform + { + get + { + if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows)) + return OSPlatform.Windows; + if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux)) + return OSPlatform.Linux; + if (RuntimeInformation.IsOSPlatform(OSPlatform.OSX)) + return OSPlatform.OSX; + throw new Exception("Unable to detect current OS"); + } + } + + public static string GetNormalizedOSName(OSPlatform os) + { + if (os == OSPlatform.Windows) + { + return "win"; + } + else if (os == OSPlatform.Linux) + { + return "linux"; + } + else if (os == OSPlatform.OSX) + { + return "osx"; + } + else + { + throw new Exception("OS " + os + " wasn't recognized. No normalized name for dotnet download is available"); + } + } + + + + public static string GetRuntimeDownloadLink(string version, Architecture arch) + { + return GetFrameworkDownloadLink(DefaultAzureFeed, version, DefaultOSPlatform, arch); + } + + public static string GetFrameworkDownloadLink(string azureFeed, string version, OSPlatform os, Architecture arch) + { + return GetRuntimeDownloadLink(azureFeed, version, GetNormalizedOSName(os), DotNetInstallation.GetNormalizedArchitectureName(arch)); + } + + public static string GetRuntimeDownloadLink(string azureFeed, string version, string os, string arch) + { + return string.Format("{0}/Runtime/{1}/dotnet-runtime-{1}-{2}-{3}.zip", azureFeed, version, os, arch); + } + + public static string GetSDKDownloadLink(string version, Architecture arch) + { + return GetSDKDownloadLink(DefaultAzureFeed, version, DefaultOSPlatform, arch); + } + + public static string GetSDKDownloadLink(string azureFeed, string version, OSPlatform os, Architecture arch) + { + return GetSDKDownloadLink(azureFeed, version, 
GetNormalizedOSName(os), DotNetInstallation.GetNormalizedArchitectureName(arch)); + } + + public static string GetSDKDownloadLink(string azureFeed, string version, string os, string arch) + { + return string.Format("{0}/Sdk/{1}/dotnet-sdk-{1}-{2}-{3}.zip", azureFeed, version, os, arch); + } + + public static string GetTargetFrameworkMonikerForFrameworkVersion(string runtimeVersion) + { + if(runtimeVersion.StartsWith("2.0")) + { + return "netcoreapp2.0"; + } + else if(runtimeVersion.StartsWith("2.1")) + { + return "netcoreapp2.1"; + } + else + { + throw new NotSupportedException("Version " + runtimeVersion + " doesn't have a known TFM"); + } + } + + public static string GetCompatibleDefaultSDKVersionForRuntimeVersion(string runtimeVersion) + { + return GetCompatibleDefaultSDKVersionForRuntimeTFM( + GetTargetFrameworkMonikerForFrameworkVersion(runtimeVersion)); + } + + public static string GetCompatibleDefaultSDKVersionForRuntimeTFM(string targetFrameworkMoniker) + { + if (targetFrameworkMoniker == "netcoreapp2.0") + { + return "2.0.0"; + } + else if (targetFrameworkMoniker == "netcoreapp2.1") + { + return "2.2.0-preview1-007558"; + } + else + { + throw new Exception("No compatible SDK version has been designated for TFM: " + targetFrameworkMoniker); + } + } + + public static string[] GetPrivateRuntimeOverlayBinaryNames(OSPlatform os) + { + return new string[] + { + GetNativeDllNameConvention("coreclr", os), + GetNativeDllNameConvention("clrjit", os), + GetNativeDllNameConvention("mscordaccore", os), + GetNativeDllNameConvention("mscordbi", os), + GetNativeDllNameConvention("sos", os), + "sos.NETCore.dll", + GetNativeDllNameConvention("clretwrc", os), + "System.Private.CoreLib.dll", + "mscorrc.debug.dll", + "mscorrc.dll" + }; + } + + private static string GetNativeDllNameConvention(string baseName, OSPlatform os) + { + if(os == OSPlatform.Windows) + { + return baseName + ".dll"; + } + else + { + return "lib" + baseName; + } + } + + + } + + public class 
DotNetInstallation + { + public DotNetInstallation(string dotNetDir, string frameworkVersion, string sdkVersion, Architecture architecture) + { + DotNetDir = dotNetDir; + FrameworkVersion = frameworkVersion; + SdkVersion = sdkVersion; + Architecture = GetNormalizedArchitectureName(architecture); + DotNetExe = Path.Combine(DotNetDir, "dotnet" + (RuntimeInformation.IsOSPlatform(OSPlatform.Windows) ? ".exe" : "")); + if(frameworkVersion != null) + { + FrameworkDir = GetFrameworkDir(dotNetDir, frameworkVersion); + } + if(sdkVersion != null) + { + SdkDir = GetSDKDir(dotNetDir, sdkVersion); + } + } + + public string DotNetExe { get; } + public string DotNetDir { get; } + public string FrameworkDir { get; } + public string FrameworkVersion { get; } + public string SdkDir { get; } + public string SdkVersion { get; } + public string Architecture { get; } + + public static string GetMNAppDir(string dotNetDir) + { + return Path.Combine(dotNetDir, "shared", "Microsoft.NETCore.App"); + } + + public static string GetFrameworkDir(string dotNetDir, string frameworkVersion) + { + return Path.Combine(GetMNAppDir(dotNetDir), frameworkVersion); + } + + public static string GetSDKDir(string dotNetDir, string sdkVersion) + { + return Path.Combine(dotNetDir, "sdk", sdkVersion); + } + + public static string GetNormalizedArchitectureName(Architecture arch) + { + if (arch == System.Runtime.InteropServices.Architecture.X64) + { + return "x64"; + } + else if (arch == System.Runtime.InteropServices.Architecture.X86) + { + return "x86"; + } + else + { + throw new Exception("Architecture " + arch + " wasn't recognized. 
No normalized name for dotnet download is available"); + } + } + } + +} diff --git a/tests/src/performance/Scenario/JitBench/Utilities/FileTasks.cs b/tests/src/performance/Scenario/JitBench/Utilities/FileTasks.cs new file mode 100644 index 0000000000..5e9efa2ffb --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Utilities/FileTasks.cs @@ -0,0 +1,237 @@ +using System; +using System.Collections.Generic; +using System.IO; +using System.IO.Compression; +using System.Net; +using System.Net.Http; +using System.Runtime.InteropServices; +using System.Text; +using System.Threading; +using System.Threading.Tasks; + +namespace JitBench +{ + public static class FileTasks + { + public async static Task DownloadAndUnzip(string remotePath, string localExpandedDirPath, ITestOutputHelper output, bool deleteTempFiles=true) + { + string tempFileNameBase = Guid.NewGuid().ToString(); + string tempDownloadPath = Path.Combine(Path.GetTempPath(), tempFileNameBase + Path.GetExtension(remotePath)); + Download(remotePath, tempDownloadPath, output); + await Unzip(tempDownloadPath, localExpandedDirPath, output, true); + } + + public static void Download(string remotePath, string localPath, ITestOutputHelper output) + { + output.WriteLine("Downloading: " + remotePath + " -> " + localPath); + Directory.CreateDirectory(Path.GetDirectoryName(localPath)); + using (var client = new HttpClient()) + { + using (FileStream localStream = File.Create(localPath)) + { + using (Stream stream = client.GetStreamAsync(remotePath).Result) + stream.CopyTo(localStream); + localStream.Flush(); + } + } + } + + public static async Task Unzip(string zipPath, string expandedDirPath, ITestOutputHelper output, bool deleteZippedFiles=true, string tempTarPath=null) + { + if (zipPath.EndsWith(".zip")) + { + await FileTasks.UnWinZip(zipPath, expandedDirPath, output); + if (deleteZippedFiles) + { + File.Delete(zipPath); + } + } + else if (zipPath.EndsWith(".tar.gz")) + { + bool deleteTar = deleteZippedFiles; + 
if(tempTarPath == null) + { + string tempFileNameBase = Guid.NewGuid().ToString(); + tempTarPath = Path.Combine(Path.GetTempPath(), tempFileNameBase + ".tar"); + deleteTar = true; + } + await UnGZip(zipPath, tempTarPath, output); + await UnTar(tempTarPath, expandedDirPath, output); + if(deleteZippedFiles) + { + File.Delete(zipPath); + } + if(deleteTar) + { + File.Delete(tempTarPath); + } + } + else + { + output.WriteLine("Unsupported compression format: " + zipPath); + throw new NotSupportedException("Unsupported compression format: " + zipPath); + } + } + + public static async Task UnWinZip(string zipPath, string expandedDirPath, ITestOutputHelper output) + { + output.WriteLine("Unzipping: " + zipPath + " -> " + expandedDirPath); + using (FileStream zipStream = File.OpenRead(zipPath)) + { + ZipArchive zip = new ZipArchive(zipStream); + foreach (ZipArchiveEntry entry in zip.Entries) + { + if(entry.CompressedLength == 0) + { + continue; + } + string extractedFilePath = Path.Combine(expandedDirPath, entry.FullName); + Directory.CreateDirectory(Path.GetDirectoryName(extractedFilePath)); + using (Stream zipFileStream = entry.Open()) + { + using (FileStream extractedFileStream = File.OpenWrite(extractedFilePath)) + { + await zipFileStream.CopyToAsync(extractedFileStream); + } + } + } + } + } + + public async static Task UnGZip(string gzipPath, string expandedFilePath, ITestOutputHelper output) + { + output.WriteLine("Unzipping: " + gzipPath + " -> " + expandedFilePath); + using (FileStream gzipStream = File.OpenRead(gzipPath)) + { + using (GZipStream expandedStream = new GZipStream(gzipStream, CompressionMode.Decompress)) + { + using (FileStream targetFileStream = File.OpenWrite(expandedFilePath)) + { + await expandedStream.CopyToAsync(targetFileStream); + } + } + } + } + + public async static Task UnTar(string tarPath, string expandedDirPath, ITestOutputHelper output) + { + Directory.CreateDirectory(expandedDirPath); + string tarToolPath = null; + if 
(RuntimeInformation.IsOSPlatform(OSPlatform.Linux)) + { + tarToolPath = "/bin/tar"; + } + else if (RuntimeInformation.IsOSPlatform(OSPlatform.OSX)) + { + tarToolPath = "/usr/bin/tar"; + } + else + { + throw new NotSupportedException("Unknown where this OS stores the tar executable"); + } + + await new ProcessRunner(tarToolPath, "-xf " + tarPath). + WithWorkingDirectory(expandedDirPath). + WithLog(output). + WithExpectedExitCode(0). + Run(); + } + + public static void DirectoryCopy(string sourceDir, string destDir, ITestOutputHelper output = null, bool overwrite = true) + { + if(output != null) + { + output.WriteLine("Copying " + sourceDir + " -> " + destDir); + } + + DirectoryInfo dir = new DirectoryInfo(sourceDir); + + DirectoryInfo[] dirs = dir.GetDirectories(); + if (!Directory.Exists(destDir)) + { + Directory.CreateDirectory(destDir); + } + + FileInfo[] files = dir.GetFiles(); + foreach (FileInfo file in files) + { + string temppath = Path.Combine(destDir, file.Name); + file.CopyTo(temppath, overwrite); + } + + foreach (DirectoryInfo subdir in dirs) + { + string temppath = Path.Combine(destDir, subdir.Name); + DirectoryCopy(subdir.FullName, temppath, null, overwrite); + } + } + + public static void DeleteDirectory(string path, ITestOutputHelper output) + { + output.WriteLine("Deleting " + path); + int retries = 10; + for(int i = 0; i < retries; i++) + { + if(!Directory.Exists(path)) + { + return; + } + try + { + Directory.Delete(path, true); + return; + } + catch(IOException e) when (i < retries-1) + { + output.WriteLine($" Attempt #{i+1} failed: {e.Message}"); + } + catch(UnauthorizedAccessException e) when (i < retries - 1) + { + output.WriteLine($" Attempt #{i + 1} failed: {e.Message}"); + } + // if something has a transient lock on the file waiting may resolve the issue + Thread.Sleep((i+1) * 10); + } + } + + public static void MoveDirectory(string sourceDirName, string destDirName, ITestOutputHelper output) + { + if (output != null) + { + 
output.WriteLine("Moving " + sourceDirName + " -> " + destDirName); + } + int retries = 10; + for (int i = 0; i < retries; i++) + { + if (!Directory.Exists(sourceDirName) && Directory.Exists(destDirName)) + { + return; + } + try + { + Directory.Move(sourceDirName, destDirName); + return; + } + catch (IOException e) when (i < retries - 1) + { + output.WriteLine($" Attempt #{i + 1} failed: {e.Message}"); + } + catch (UnauthorizedAccessException e) when (i < retries - 1) + { + output.WriteLine($" Attempt #{i + 1} failed: {e.Message}"); + } + // if something has a transient lock on the file waiting may resolve the issue + Thread.Sleep((i + 1) * 10); + } + } + + public static void CreateDirectory(string path, ITestOutputHelper output) + { + output.WriteLine("Creating " + path); + if (!Directory.Exists(path)) + { + Directory.CreateDirectory(path); + } + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Utilities/FileTestOutputHelper.cs b/tests/src/performance/Scenario/JitBench/Utilities/FileTestOutputHelper.cs new file mode 100644 index 0000000000..b5cb6f9804 --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Utilities/FileTestOutputHelper.cs @@ -0,0 +1,43 @@ +using System; +using System.Collections.Generic; +using System.IO; +using System.Linq; +using System.Text; +using System.Threading.Tasks; + +namespace JitBench +{ + /// <summary> + /// An ITestOutputHelper implementation that logs to a file + /// </summary> + public class FileTestOutputHelper : ITestOutputHelper, IDisposable + { + readonly StreamWriter _logWriter; + readonly object _lock; + + public FileTestOutputHelper(string logFilePath, FileMode fileMode = FileMode.Create) + { + Directory.CreateDirectory(Path.GetDirectoryName(logFilePath)); + FileStream fs = new FileStream(logFilePath, fileMode); + _logWriter = new StreamWriter(fs); + _logWriter.AutoFlush = true; + _lock = new object(); + } + + public void WriteLine(string message) + { + lock (_lock) + { + _logWriter.WriteLine(message); + } 
+ } + + public void Dispose() + { + lock (_lock) + { + _logWriter.Dispose(); + } + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Utilities/IProcessLogger.cs b/tests/src/performance/Scenario/JitBench/Utilities/IProcessLogger.cs new file mode 100644 index 0000000000..06754b03e7 --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Utilities/IProcessLogger.cs @@ -0,0 +1,25 @@ +namespace JitBench +{ + public enum ProcessStream + { + StandardIn = 0, + StandardOut = 1, + StandardError = 2, + MaxStreams = 3 + } + + public enum KillReason + { + TimedOut, + Unknown + } + + public interface IProcessLogger + { + void ProcessExited(ProcessRunner runner); + void ProcessKilled(ProcessRunner runner, KillReason reason); + void ProcessStarted(ProcessRunner runner); + void Write(ProcessRunner runner, string data, ProcessStream stream); + void WriteLine(ProcessRunner runner, string data, ProcessStream stream); + } +} diff --git a/tests/src/performance/Scenario/JitBench/Utilities/ITestOutputHelper.cs b/tests/src/performance/Scenario/JitBench/Utilities/ITestOutputHelper.cs new file mode 100644 index 0000000000..7dcebdfdef --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Utilities/ITestOutputHelper.cs @@ -0,0 +1,7 @@ +namespace JitBench +{ + public interface ITestOutputHelper + { + void WriteLine(string line); + } +} diff --git a/tests/src/performance/Scenario/JitBench/Utilities/IndentedTestOutputHelper.cs b/tests/src/performance/Scenario/JitBench/Utilities/IndentedTestOutputHelper.cs new file mode 100644 index 0000000000..6121400935 --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Utilities/IndentedTestOutputHelper.cs @@ -0,0 +1,43 @@ +using System; +using System.Collections.Generic; +using System.Linq; +using System.Text; +using System.Threading.Tasks; + +namespace JitBench +{ + /// <summary> + /// An implementation of ITestOutputHelper that adds one indent level to + /// the start of each line + /// </summary> + public class 
IndentedTestOutputHelper : ITestOutputHelper, IDisposable + { + readonly string _indentText; + readonly ITestOutputHelper _output; + readonly string _closingBrace; + + public IndentedTestOutputHelper(string header, ITestOutputHelper innerOutput) : + this(header, innerOutput, " ", "{", "}" + Environment.NewLine) + { + } + + public IndentedTestOutputHelper(string header, ITestOutputHelper innerOutput, string indentText, string openingBrace, string closingBrace) + { + _output = innerOutput; + _indentText = indentText; + _closingBrace = closingBrace; + _output.WriteLine(header); + _output.WriteLine(openingBrace); + } + + public void Dispose() + { + _output.WriteLine(_closingBrace); + } + + public void WriteLine(string message) + { + _output.WriteLine(_indentText + message); + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Utilities/ProcessRunner.cs b/tests/src/performance/Scenario/JitBench/Utilities/ProcessRunner.cs new file mode 100644 index 0000000000..60d30e1af6 --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Utilities/ProcessRunner.cs @@ -0,0 +1,451 @@ +using System; +using System.Collections.Generic; +using System.Diagnostics; +using System.IO; +using System.Text; +using System.Threading; +using System.Threading.Tasks; + +namespace JitBench +{ + /// <summary> + /// Executes a process and logs the output + /// </summary> + /// <remarks> + /// The intended lifecycle is: + /// a) Create a new ProcessRunner + /// b) Use the various WithXXX methods to modify the configuration of the process to launch + /// c) await Run() to start the process and wait for it to terminate. Configuration + /// changes are no longer possible + /// d) While waiting for Run(), optionally call Kill() one or more times. This will expedite + /// the termination of the process but there is no guarantee the process is terminated by + /// the time Kill() returns. 
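The lifecycle described above (construct, configure with WithXXX, await Run(), optionally Kill()) might be exercised as in this sketch, assuming the JitBench types defined in this file; the executable and arguments are illustrative:

```csharp
using System;
using System.Threading.Tasks;

// Assumes the JitBench namespace types (ProcessRunner, ConsoleTestOutputHelper).
class RunnerDemo
{
    static async Task Main()
    {
        var runner = new ProcessRunner("dotnet", "--info")
            .WithWorkingDirectory(Environment.CurrentDirectory)
            .WithTimeout(TimeSpan.FromMinutes(2))
            .WithExpectedExitCode(0)                 // Run throws on any other exit code
            .WithLog(new ConsoleTestOutputHelper()); // echo stdout/stderr to the console
        int exitCode = await runner.Run();           // Start() + WaitForExit()
        Console.WriteLine("exit code: " + exitCode);
    }
}
```

WithExpectedExitCode(0) makes Run throw if the process exits with any other code, which turns a silent benchmark failure into a loud one.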
+ /// + /// Although the entire API of this type has been designed to be thread-safe, it's typical that + /// only calls to Kill() and property getters invoked within the logging callbacks will be called + /// asynchronously. + /// </remarks> + public class ProcessRunner + { + // All of the locals might be accessed from multiple threads and need to be read/written under + // the _lock. We also use the lock to synchronize property access on the process object. + // + // Be careful not to cause deadlocks by calling the logging callbacks with the lock held. + // The logger has its own lock and it will hold that lock when it calls into property getters + // on this type. + object _lock = new object(); + + List<IProcessLogger> _loggers; + Process _p; + DateTime _startTime; + TimeSpan _timeout; + ITestOutputHelper _traceOutput; + int? _expectedExitCode; + TaskCompletionSource<Process> _waitForProcessStartTaskSource; + Task<int> _waitForExitTask; + Task _timeoutProcessTask; + Task _readStdOutTask; + Task _readStdErrTask; + CancellationTokenSource _cancelSource; + private string _replayCommand; + private KillReason? 
_killReason; + + public ProcessRunner(string exePath, string arguments, string replayCommand = null) + { + ProcessStartInfo psi = new ProcessStartInfo(); + psi.FileName = exePath; + psi.Arguments = arguments; + psi.UseShellExecute = false; + psi.RedirectStandardInput = true; + psi.RedirectStandardOutput = true; + psi.RedirectStandardError = true; + psi.CreateNoWindow = true; + + lock (_lock) + { + _p = new Process(); + _p.StartInfo = psi; + _p.EnableRaisingEvents = false; + _loggers = new List<IProcessLogger>(); + _timeout = TimeSpan.FromMinutes(10); + _cancelSource = new CancellationTokenSource(); + _killReason = null; + _waitForProcessStartTaskSource = new TaskCompletionSource<Process>(); + Task<Process> startTask = _waitForProcessStartTaskSource.Task; + + // unfortunately we can't use the default Process stream reading because it only returns full lines and we have scenarios + // that need to receive the output before the newline character is written + _readStdOutTask = startTask.ContinueWith(t => + { + ReadStreamToLoggers(_p.StandardOutput, ProcessStream.StandardOut, _cancelSource.Token); + }, + _cancelSource.Token, TaskContinuationOptions.LongRunning, TaskScheduler.Default); + + _readStdErrTask = startTask.ContinueWith(t => + { + ReadStreamToLoggers(_p.StandardError, ProcessStream.StandardError, _cancelSource.Token); + }, + _cancelSource.Token, TaskContinuationOptions.LongRunning, TaskScheduler.Default); + + _timeoutProcessTask = startTask.ContinueWith(t => + { + Task.Delay(_timeout, _cancelSource.Token).ContinueWith(t2 => Kill(KillReason.TimedOut), TaskContinuationOptions.NotOnCanceled); + }, + _cancelSource.Token, TaskContinuationOptions.LongRunning, TaskScheduler.Default); + + _waitForExitTask = InternalWaitForExit(startTask, _readStdOutTask, _readStdErrTask); + + if (replayCommand == null) + { + _replayCommand = ExePath + " " + Arguments; + } + else + { + _replayCommand = replayCommand; + } + } + } + + public string ReplayCommand + { + get { lock (_lock) { 
return _replayCommand; } } + } + + public ProcessRunner WithEnvironmentVariable(string key, string value) + { + lock (_lock) + { + _p.StartInfo.Environment[key] = value; + } + return this; + } + + public ProcessRunner WithEnvironment(IDictionary<string,string> environmentVariables) + { + lock (_lock) + { + if(environmentVariables != null) + { + foreach (KeyValuePair<string, string> kv in environmentVariables) + { + _p.StartInfo.Environment[kv.Key] = kv.Value; + } + } + } + return this; + } + + public ProcessRunner WithWorkingDirectory(string workingDirectory) + { + lock (_lock) + { + _p.StartInfo.WorkingDirectory = workingDirectory; + } + return this; + } + + public ProcessRunner WithLog(IProcessLogger logger) + { + lock (_lock) + { + _loggers.Add(logger); + } + return this; + } + + public ProcessRunner WithLog(ITestOutputHelper output) + { + lock (_lock) + { + _loggers.Add(new TestOutputProcessLogger(output)); + } + return this; + } + + public ProcessRunner WithDiagnosticTracing(ITestOutputHelper traceOutput) + { + lock (_lock) + { + _traceOutput = traceOutput; + } + return this; + } + + public IProcessLogger[] Loggers + { + get { lock (_lock) { return _loggers.ToArray(); } } + } + + public ProcessRunner WithTimeout(TimeSpan timeout) + { + lock (_lock) + { + _timeout = timeout; + } + return this; + } + + public ProcessRunner WithExpectedExitCode(int expectedExitCode) + { + lock (_lock) + { + _expectedExitCode = expectedExitCode; + } + return this; + } + + public string ExePath + { + get { lock (_lock) { return _p.StartInfo.FileName; } } + } + + public string Arguments + { + get { lock (_lock) { return _p.StartInfo.Arguments; } } + } + + public string WorkingDirectory + { + get { lock (_lock) { return _p.StartInfo.WorkingDirectory; } } + } + + public int ProcessId + { + get { lock (_lock) { return _p.Id; } } + } + + public Dictionary<string,string> EnvironmentVariables + { + get { lock (_lock) { return new Dictionary<string, string>(_p.StartInfo.Environment); } } + 
} + + public bool IsStarted + { + get { lock (_lock) { return _waitForProcessStartTaskSource.Task.IsCompleted; } } + } + + public DateTime StartTime + { + get { lock (_lock) { return _startTime; } } + } + + public int ExitCode + { + get { lock (_lock) { return _p.ExitCode; } } + } + + public void StandardInputWriteLine(string line) + { + IProcessLogger[] loggers = null; + StreamWriter inputStream = null; + lock (_lock) + { + loggers = _loggers.ToArray(); + inputStream = _p.StandardInput; + } + foreach (IProcessLogger logger in loggers) + { + logger.WriteLine(this, line, ProcessStream.StandardIn); + } + inputStream.WriteLine(line); + } + + public Task<int> Run() + { + Start(); + return WaitForExit(); + } + + public Task<int> WaitForExit() + { + lock (_lock) + { + return _waitForExitTask; + } + } + + public ProcessRunner Start() + { + Process p = null; + lock (_lock) + { + p = _p; + } + // this is safe to call on multiple threads, it only launches the process once + bool started = p.Start(); + + IProcessLogger[] loggers = null; + lock (_lock) + { + // only the first thread to get here will initialize this state + if (!_waitForProcessStartTaskSource.Task.IsCompleted) + { + loggers = _loggers.ToArray(); + _startTime = DateTime.Now; + _waitForProcessStartTaskSource.SetResult(_p); + } + } + + // only the first thread that entered the lock above will run this + if (loggers != null) + { + foreach (IProcessLogger logger in loggers) + { + logger.ProcessStarted(this); + } + } + + return this; + } + + private void ReadStreamToLoggers(StreamReader reader, ProcessStream stream, CancellationToken cancelToken) + { + IProcessLogger[] loggers = Loggers; + + // for the best efficiency we want to read in chunks, but if the underlying stream isn't + // going to timeout partial reads then we have to fall back to reading one character at a time + int readChunkSize = 1; + if (reader.BaseStream.CanTimeout) + { + readChunkSize = 1000; + } + + char[] buffer = new char[readChunkSize]; + bool 
lastCharWasCarriageReturn = false; + do + { + int charsRead = 0; + int lastStartLine = 0; + charsRead = reader.ReadBlock(buffer, 0, readChunkSize); + + // this lock keeps the standard out/error streams from being intermixed + lock (loggers) + { + for (int i = 0; i < charsRead; i++) + { + // eat the \n after a \r, if any + bool isNewLine = buffer[i] == '\n'; + bool isCarriageReturn = buffer[i] == '\r'; + if (lastCharWasCarriageReturn && isNewLine) + { + lastStartLine++; + lastCharWasCarriageReturn = false; + continue; + } + lastCharWasCarriageReturn = isCarriageReturn; + if (isCarriageReturn || isNewLine) + { + string line = new string(buffer, lastStartLine, i - lastStartLine); + lastStartLine = i + 1; + foreach (IProcessLogger logger in loggers) + { + logger.WriteLine(this, line, stream); + } + } + } + + // flush any fractional line + if (charsRead > lastStartLine) + { + string line = new string(buffer, lastStartLine, charsRead - lastStartLine); + foreach (IProcessLogger logger in loggers) + { + logger.Write(this, line, stream); + } + } + } + } + while (!reader.EndOfStream && !cancelToken.IsCancellationRequested); + } + + public void Kill(KillReason reason = KillReason.Unknown) + { + IProcessLogger[] loggers = null; + Process p = null; + lock (_lock) + { + if (_waitForExitTask.IsCompleted) + { + return; + } + if (_killReason.HasValue) + { + return; + } + _killReason = reason; + if (!_p.HasExited) + { + p = _p; + } + + loggers = _loggers.ToArray(); + _cancelSource.Cancel(); + } + + if (p != null) + { + // it's possible the process could exit just after we check so + // we still have to handle the InvalidOperationException that + // can be thrown. 
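The race called out in the comment above (a process can exit between a HasExited check and Kill) can be illustrated in isolation; this helper is a hypothetical sketch, not part of the JitBench sources:

```csharp
using System;
using System.Diagnostics;

static class ProcessExtensions
{
    // Process.Kill can race with normal process exit: if the process has
    // already terminated, Kill throws InvalidOperationException, which is
    // benign in a best-effort shutdown and can be swallowed.
    public static void TryKill(this Process p)
    {
        try
        {
            if (!p.HasExited)
                p.Kill();
        }
        catch (InvalidOperationException)
        {
            // the process exited between the HasExited check and the Kill call
        }
    }
}
```

The JitBench Kill method applies the same pattern while also recording the KillReason and notifying its loggers.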
+ try + { + p.Kill(); + } + catch (InvalidOperationException) { } + } + + foreach (IProcessLogger logger in loggers) + { + logger.ProcessKilled(this, reason); + } + } + + private async Task<int> InternalWaitForExit(Task<Process> startProcessTask, Task stdOutTask, Task stdErrTask) + { + DebugTrace("starting InternalWaitForExit"); + Process p = await startProcessTask; + DebugTrace("InternalWaitForExit {0} '{1}'", p.Id, _replayCommand); + + Task processExit = Task.Factory.StartNew(() => + { + DebugTrace("starting Process.WaitForExit {0}", p.Id); + p.WaitForExit(); + DebugTrace("ending Process.WaitForExit {0}", p.Id); + }, + TaskCreationOptions.LongRunning); + + DebugTrace("awaiting process {0} exit, stdOut, and stdErr", p.Id); + await Task.WhenAll(processExit, stdOutTask, stdErrTask); + DebugTrace("await process {0} exit, stdOut, and stdErr complete", p.Id); + + foreach (IProcessLogger logger in Loggers) + { + logger.ProcessExited(this); + } + + lock (_lock) + { + if (_expectedExitCode.HasValue && p.ExitCode != _expectedExitCode.Value) + { + throw new Exception("Process returned exit code " + p.ExitCode + ", expected " + _expectedExitCode.Value + Environment.NewLine + + "Command Line: " + ReplayCommand + Environment.NewLine + + "Working Directory: " + WorkingDirectory); + } + DebugTrace("InternalWaitForExit {0} returning {1}", p.Id, p.ExitCode); + return p.ExitCode; + } + } + + private void DebugTrace(string format, params object[] args) + { + lock (_lock) + { + if (_traceOutput != null) + { + string message = string.Format("TRACE: " + format, args); + _traceOutput.WriteLine(message); + } + } + } + } +} diff --git a/tests/src/performance/Scenario/JitBench/Utilities/TestOutputProcessLogger.cs b/tests/src/performance/Scenario/JitBench/Utilities/TestOutputProcessLogger.cs new file mode 100644 index 0000000000..63c50a5645 --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/Utilities/TestOutputProcessLogger.cs @@ -0,0 +1,141 @@ +using System; +using 
System.Collections; +using System.Collections.Generic; +using System.Linq; +using System.Text; +using System.Xml; + +namespace JitBench +{ + public class TestOutputProcessLogger : IProcessLogger + { + string _timeFormat = "mm\\:ss\\.fff"; + ITestOutputHelper _output; + StringBuilder[] _lineBuffers; + + public TestOutputProcessLogger(ITestOutputHelper output) + { + _output = output; + _lineBuffers = new StringBuilder[(int)ProcessStream.MaxStreams]; + } + + public void ProcessStarted(ProcessRunner runner) + { + lock (this) + { + _output.WriteLine("Running Process: " + runner.ReplayCommand); + _output.WriteLine("Working Directory: " + runner.WorkingDirectory); + IEnumerable<KeyValuePair<string,string>> additionalEnvVars = + runner.EnvironmentVariables.Where(kv => Environment.GetEnvironmentVariable(kv.Key) != kv.Value); + + if(additionalEnvVars.Any()) + { + _output.WriteLine("Additional Environment Variables: " + + string.Join(", ", additionalEnvVars.Select(kv => kv.Key + "=" + kv.Value))); + } + _output.WriteLine("{"); + } + } + + public virtual void Write(ProcessRunner runner, string data, ProcessStream stream) + { + lock (this) + { + AppendToLineBuffer(runner, stream, data); + } + } + + public virtual void WriteLine(ProcessRunner runner, string data, ProcessStream stream) + { + lock (this) + { + StringBuilder lineBuffer = AppendToLineBuffer(runner, stream, data); + //Ensure all output is written even if it isn't a full line before we log input + if (stream == ProcessStream.StandardIn) + { + FlushOutput(); + } + _output.WriteLine(lineBuffer.ToString()); + _lineBuffers[(int)stream] = null; + } + } + + public virtual void ProcessExited(ProcessRunner runner) + { + lock (this) + { + TimeSpan offset = runner.StartTime - DateTime.Now; + _output.WriteLine("}"); + _output.WriteLine("Exit code: " + runner.ExitCode + " ( " + offset.ToString(_timeFormat) + " elapsed)"); + _output.WriteLine(""); + } + } + + public void ProcessKilled(ProcessRunner runner, KillReason reason) + { + 
lock (this) + { + TimeSpan offset = runner.StartTime - DateTime.Now; + string reasonText = ""; + if (reason == KillReason.TimedOut) + { + reasonText = "Process timed out"; + } + else if (reason == KillReason.Unknown) + { + reasonText = "Kill() was called"; + } + _output.WriteLine(" Killing process: " + offset.ToString(_timeFormat) + ": " + reasonText); + } + } + + protected void FlushOutput() + { + if (_lineBuffers[(int)ProcessStream.StandardOut] != null) + { + _output.WriteLine(_lineBuffers[(int)ProcessStream.StandardOut].ToString()); + _lineBuffers[(int)ProcessStream.StandardOut] = null; + } + if (_lineBuffers[(int)ProcessStream.StandardError] != null) + { + _output.WriteLine(_lineBuffers[(int)ProcessStream.StandardError].ToString()); + _lineBuffers[(int)ProcessStream.StandardError] = null; + } + } + + private StringBuilder AppendToLineBuffer(ProcessRunner runner, ProcessStream stream, string data) + { + StringBuilder lineBuffer = _lineBuffers[(int)stream]; + if (lineBuffer == null) + { + TimeSpan offset = runner.StartTime - DateTime.Now; + lineBuffer = new StringBuilder(); + lineBuffer.Append(" "); + if (stream == ProcessStream.StandardError) + { + lineBuffer.Append("STDERROR: "); + } + else if (stream == ProcessStream.StandardIn) + { + lineBuffer.Append("STDIN: "); + } + lineBuffer.Append(offset.ToString(_timeFormat)); + lineBuffer.Append(": "); + _lineBuffers[(int)stream] = lineBuffer; + } + + // xunit has a bug where a non-printable character isn't properly escaped when + // it is written into the xml results which ultimately results in + // the xml being improperly truncated. For example MDbg has a test case that prints + // \0 and dotnet tools print \u001B to colorize their console output. 
+            foreach (char c in data)
+            {
+                if (!char.IsControl(c))
+                {
+                    lineBuffer.Append(c);
+                }
+            }
+            return lineBuffer;
+        }
+    }
+}
diff --git a/tests/src/performance/Scenario/JitBench/unofficial_dotnet/JitBench.csproj b/tests/src/performance/Scenario/JitBench/unofficial_dotnet/JitBench.csproj
new file mode 100644
index 0000000000..009949ab1b
--- /dev/null
+++ b/tests/src/performance/Scenario/JitBench/unofficial_dotnet/JitBench.csproj
@@ -0,0 +1,58 @@
+<Project Sdk="Microsoft.NET.Sdk">
+
+  <!-- The common test dirs props expects Platform to be empty in order to initialize it, and
+       by default this project style appears to set it to "AnyCPU", so we need to clear it -->
+  <PropertyGroup>
+    <Platform></Platform>
+    <Platforms>AnyCPU;x64</Platforms>
+    <Configuration></Configuration>
+  </PropertyGroup>
+
+  <Import Project="$([MSBuild]::GetDirectoryNameOfFileAbove($(MSBuildThisFileDirectory), dir.props))\dir.props" />
+  <PropertyGroup>
+    <OutputType>Exe</OutputType>
+    <TargetFramework>netcoreapp2.0</TargetFramework>
+
+    <!-- the common test dirs props pushes all tests to use a more recent NuGetTargetMoniker (NTM) unless we explicitly opt out -->
+    <NuGetTargetMoniker>.NETCoreApp,Version=v2.0</NuGetTargetMoniker>
+    <NuGetTargetMonikerShort>netcoreapp2.0</NuGetTargetMonikerShort>
+  </PropertyGroup>
+
+  <ItemGroup>
+    <PackageReference Include="CommandLineParser">
+      <Version>$(CommandLineParserVersion)</Version>
+    </PackageReference>
+    <PackageReference Include="Microsoft.Diagnostics.Tracing.TraceEvent">
+      <Version>$(MicrosoftDiagnosticsTracingTraceEventPackageVersion)</Version>
+    </PackageReference>
+    <PackageReference Include="xunit.performance.api">
+      <Version>$(XunitPerformanceApiPackageVersion)</Version>
+    </PackageReference>
+    <PackageReference Include="xunit.performance.core">
+      <Version>$(XunitPerformanceApiPackageVersion)</Version>
+    </PackageReference>
+    <PackageReference Include="xunit.performance.execution">
+      <Version>$(XunitPerformanceApiPackageVersion)</Version>
+    </PackageReference>
+    
<PackageReference Include="xunit.performance.metrics"> + <Version>$(XunitPerformanceApiPackageVersion)</Version> + </PackageReference> + </ItemGroup> + + <ItemGroup> + <Compile Include="..\**\*.cs" /> + </ItemGroup> + + <ItemGroup> + <!-- BaselineMicrosoftNetCoreAppPackageVersion comes from dependencies.props in the root of the coreclr tree --> + <VersioningConstantsLines Include="namespace JitBench { public static class VersioningConstants { public static string MicrosoftNetCoreAppPackageVersion="$(BaselineMicrosoftNetCoreAppPackageVersion)"%3B } }" /> + <Compile Include="$(BaseIntermediateOutputPath)AutoGeneratedVersioningConstants.cs" /> + </ItemGroup> + + <Target Name="GenerateVersioningConstantsFile" BeforeTargets="CoreCompile"> + <WriteLinesToFile File="$(BaseIntermediateOutputPath)AutoGeneratedVersioningConstants.cs" Lines="@(VersioningConstantsLines)" Overwrite="true" Encoding="Unicode" /> + </Target> + + <!-- The CoreCLR test build system requires a target named RestorePackage in order to do BatchRestore --> + <Target Name="RestorePackage" DependsOnTargets="Restore" /> +</Project> diff --git a/tests/src/performance/Scenario/JitBench/unofficial_dotnet/Properties/launchSettings.json b/tests/src/performance/Scenario/JitBench/unofficial_dotnet/Properties/launchSettings.json new file mode 100644 index 0000000000..756237378e --- /dev/null +++ b/tests/src/performance/Scenario/JitBench/unofficial_dotnet/Properties/launchSettings.json @@ -0,0 +1,8 @@ +{ + "profiles": { + "JitBench": { + "commandName": "Project", + "commandLineArgs": "--iterations 3 --benchmark MusicStore" + } + } +}
\ No newline at end of file
diff --git a/tests/src/performance/Scenario/JitBench/unofficial_dotnet/README.md b/tests/src/performance/Scenario/JitBench/unofficial_dotnet/README.md
new file mode 100644
index 0000000000..fd86d7bf37
--- /dev/null
+++ b/tests/src/performance/Scenario/JitBench/unofficial_dotnet/README.md
@@ -0,0 +1,188 @@
+# JitBench #
+
+JitBench is a collection of scenario benchmarks that were originally designed to do performance testing of the tiered jitting feature. They can be easily run for ad-hoc investigation or as part of automated performance testing.
+
+
+## Running the test (ad-hoc) ##
+
+Execute 'dotnet run' in this directory. The test should eventually produce output like this:
+
+
+    === CONFIGURATION ===
+
+    DotnetFrameworkVersion: 2.1.0-preview2-26131-06
+    DotnetSdkVersion:       2.2.0-preview1-007558
+    PrivateCoreCLRBinDir:
+    Architecture:           X64
+    OutputDir:              C:\Users\noahfalk\AppData\Local\Temp\JitBench_2018_02_12_05_16_34_0611
+    Iterations:             3
+    UseExistingSetup:       True
+    Configurations:         Default
+
+
+    Benchmark run in progress...
+    Verbose log: C:\Users\noahfalk\AppData\Local\Temp\JitBench_2018_02_12_05_16_34_0611\JitBench_log.txt
+
+
+    === RESULTS ===
+
+    Benchmark               Metric               Default
+    ----------------------- -------------------- -----------
+    Dotnet_Build_HelloWorld Duration (ms)        1322.5+-2.9
+    Csc_Hello_World         Duration (ms)        782+-105
+    Csc_Roslyn_Source       Duration (ms)        2858+-16
+    MusicStore              Startup (ms)         703.5+-6.9
+    MusicStore              First Request (ms)   636+-11
+    MusicStore              Median Response (ms) 89+-0
+
+
+By default, the test downloads versions of the .NET framework and SDK from Azure, downloads the various workloads that will run in the benchmarks, and then executes each benchmark multiple times in the default configuration. The results are tabulated and displayed in the output.
+
+
+Examples of more customized ways the benchmark can be run (see the command line help for other options not shown here):
+
+**Run with a private CoreCLR build instead of a downloaded one**
+
+    dotnet.exe run -- --coreclr-bin-dir F:\github\coreclr\bin\Product\Windows_NT.x64.Release
+
+**Run multiple configurations for comparison**
+
+    dotnet.exe run -- --configs Default,Tiering,Minopts
+
+    ...
+    === RESULTS ===
+
+    Benchmark               Metric               Default   Tiering     Minopts
+    ----------------------- -------------------- --------- ----------- --------
+    Dotnet_Build_HelloWorld Duration (ms)        1368+-66  1227+-2     1188+-37
+    Csc_Hello_World         Duration (ms)        648+-41   542+-9.8    518+-7.8
+    Csc_Roslyn_Source       Duration (ms)        2806+-185 3130+-50    2842+-68
+    MusicStore              Startup (ms)         716+-15   633.5+-4.9  628+-21
+    MusicStore              First Request (ms)   626+-53   482.5+-0.98 456+-17
+    MusicStore              Median Response (ms) 89+-0     89+-0       89+-0
+
+**Run only a specific benchmark**
+
+    dotnet.exe run -- --benchmark Dotnet_Build_HelloWorld
+
+    ...
+    === RESULTS ===
+
+    Benchmark               Metric        Default
+    ----------------------- ------------- --------
+    Dotnet_Build_HelloWorld Duration (ms) 1391+-25
+
+**Run with ETW collection enabled**
+
+    dotnet.exe run -- --perf:collect BranchMispredictions+CacheMisses+InstructionRetired
+
+ETL traces will show up in the output directory here: <run\_id\>-JitBench-<benchmark\_name\>-<config\_name\>-traces\\<run\_id\>-JitBench-<benchmark\_name\>-<config\_name\>(#).etl
+
+**Run without repeating all the setup steps (for a faster inner dev loop)**
+
+    dotnet.exe run -- --use-existing-setup
+
+**Run with fewer iterations (faster inner dev loop but error bounds increase)**
+
+    dotnet.exe run -- --iterations 3
+
+**Run with a specific output directory**
+
+    dotnet.exe run -- --perf:outputdir C:\temp\JitBench\_results
+
+## Adding a new Benchmark ##
+
+In the Benchmarks folder, create a new .cs file that implements a class deriving from Benchmark.
Provide a name for the benchmark in the constructor and implement the abstract Setup() method. In Setup, do whatever you need to do to acquire files specific to your benchmark and then set the properties
+
+- ExePath
+- WorkingDirPath
+- EnvironmentVariables (optional)
+
+to determine what process will be invoked later when the benchmark runs. BuildHelloWorldBenchmark.cs is a simple example if you need a template to copy. MusicStore is a bit more sophisticated and shows gathering custom metrics + customizing the Benchview output.
+
+## Automation ##
+
+This is how we are currently set up to run the test in CI and then retrieve its results.
+
+**Setup:**
+
+1. Create a directory with all the runtime and framework binaries in it (currently called the sandbox directory)
+2. Build the JitBench executable with msbuild (this occurs as part of the test build)
+3. Set any COMPLUS variables that will modify the run
+4. Invoke the test with the command line: _sandbox\_dir_\\corerun.exe --perf:outputdir _output\_dir_ --perf:runid _run\_id_ --target-architecture x64 --perf:collect _metrics_
+
+**Results:**
+
+For each benchmark in the benchmark suite, the test will write out a set of result files in the _output\_dir_:
+
+- <run\_id\>-JitBench-<benchmark\_name\>-<config\_name\>.csv
+- <run\_id\>-JitBench-<benchmark\_name\>-<config\_name\>.md
+- <run\_id\>-JitBench-<benchmark\_name\>-<config\_name\>.xml
+
+If ETW was enabled, there will also be a set of ETW traces for each process execution in the test:
+
+- <run\_id\>-JitBench-<benchmark\_name\>-<config\_name\>-traces\\<run\_id\>-JitBench-<benchmark\_name\>-<config\_name\>(#).etl
+
+For example:
+
+    02/15/2018 09:07 PM    <DIR>          Perf-On-JitBench-Csc_Hello_World-Default-traces
+    02/15/2018 09:07 PM             2,766 Perf-On-JitBench-Csc_Hello_World-Default.csv
+    02/15/2018 09:07 PM             3,801 Perf-On-JitBench-Csc_Hello_World-Default.md
+    02/15/2018 09:07 PM            11,610 Perf-On-JitBench-Csc_Hello_World-Default.xml
+    02/15/2018 09:08 PM    <DIR>
Perf-On-JitBench-Csc_Roslyn_Source-Default-traces + 02/15/2018 09:08 PM 2,856 Perf-On-JitBench-Csc_Roslyn_Source-Default.csv + 02/15/2018 09:08 PM 3,851 Perf-On-JitBench-Csc_Roslyn_Source-Default.md + 02/15/2018 09:08 PM 11,716 Perf-On-JitBench-Csc_Roslyn_Source-Default.xml + 02/15/2018 08:48 PM <DIR> Perf-On-JitBench-Dotnet_Build_HelloWorld_Default-traces + 02/15/2018 08:48 PM 2,901 Perf-On-JitBench-Dotnet_Build_HelloWorld_Default.csv + 02/15/2018 08:48 PM 4,001 Perf-On-JitBench-Dotnet_Build_HelloWorld_Default.md + 02/15/2018 08:48 PM 11,777 Perf-On-JitBench-Dotnet_Build_HelloWorld_Default.xml + 02/15/2018 09:08 PM <DIR> Perf-On-JitBench-MusicStore-Default-traces + 02/15/2018 09:09 PM 3,511 Perf-On-JitBench-MusicStore-Default.csv + 02/15/2018 09:09 PM 5,543 Perf-On-JitBench-MusicStore-Default.md + 02/15/2018 09:09 PM 15,965 Perf-On-JitBench-MusicStore-Default.xml + +The result files use standard XUnitPerformanceHarness formatting. Typical metrics content from the csv when ETW is enabled looks like this: + + JitBench Metric Unit Iterations Average STDEV.S Min Max + MusicStore Duration ms 2 2146 4.242640687 2143 2149 + MusicStore/dotnet.exe Duration ms 2 2136.8458 3.163030054 2134.6092 2139.0824 + MusicStore/dotnet.exe BranchMispredictions count 2 57272320 147711.7782 57167872 57376768 + MusicStore/dotnet.exe CacheMisses count 2 47482880 78200.35314 47427584 47538176 + MusicStore/dotnet.exe InstructionRetired count 2 9266000000 32526911.93 9243000000 9289000000 + MusicStore/dotnet.exe!Anonymously Hosted DynamicMethods Assembly BranchMispredictions count 2 0 0 0 0 + MusicStore/dotnet.exe!Anonymously Hosted DynamicMethods Assembly CacheMisses count 2 0 0 0 0 + MusicStore/dotnet.exe!Anonymously Hosted DynamicMethods Assembly InstructionRetired count 2 0 0 0 0 + MusicStore/dotnet.exe!clrjit.dll BranchMispredictions count 2 23300096 147711.7782 23195648 23404544 + MusicStore/dotnet.exe!clrjit.dll CacheMisses count 2 9240576 330179.2688 9007104 9474048 + 
MusicStore/dotnet.exe!clrjit.dll InstructionRetired count 2 1562500000 48790367.9 1528000000 1597000000 + MusicStore/dotnet.exe!coreclr.dll BranchMispredictions count 2 15316992 298319.8657 15106048 15527936 + MusicStore/dotnet.exe!coreclr.dll CacheMisses count 2 16463872 49237.25939 16429056 16498688 + MusicStore/dotnet.exe!coreclr.dll InstructionRetired count 2 2432500000 58689862.84 2391000000 2474000000 + MusicStore/dotnet.exe!dotnet.exe BranchMispredictions count 2 0 0 0 0 + MusicStore/dotnet.exe!dotnet.exe CacheMisses count 2 0 0 0 0 + MusicStore/dotnet.exe!dotnet.exe InstructionRetired count 2 0 0 0 0 + MusicStore/dotnet.exe!MusicStore.dll BranchMispredictions count 2 0 0 0 0 + MusicStore/dotnet.exe!MusicStore.dll CacheMisses count 2 0 0 0 0 + MusicStore/dotnet.exe!MusicStore.dll InstructionRetired count 2 0 0 0 0 + MusicStore/dotnet.exe!ntoskrnl.exe BranchMispredictions count 2 12146688 112956.0657 12066816 12226560 + MusicStore/dotnet.exe!ntoskrnl.exe CacheMisses count 2 12468224 57926.18751 12427264 12509184 + MusicStore/dotnet.exe!ntoskrnl.exe InstructionRetired count 2 4192000000 15556349.19 4181000000 4203000000 + MusicStore/dotnet.exe!System.Private.CoreLib.dll BranchMispredictions count 2 489472 31859.40313 466944 512000 + MusicStore/dotnet.exe!System.Private.CoreLib.dll CacheMisses count 2 1196032 92681.90002 1130496 1261568 + MusicStore/dotnet.exe!System.Private.CoreLib.dll InstructionRetired count 2 85000000 7071067.812 80000000 90000000 + MusicStore/dotnet.exe!Unknown BranchMispredictions count 2 1380352 28963.09376 1359872 1400832 + MusicStore/dotnet.exe!Unknown CacheMisses count 2 2377728 228808.4407 2215936 2539520 + MusicStore/dotnet.exe!Unknown InstructionRetired count 2 129500000 707106.7812 129000000 130000000 + MusicStore/First Request Duration ms 2 872 24.04163056 855 889 + MusicStore/Median Response Duration ms 2 88.8 0 88.8 88.8 + MusicStore/Startup Duration ms 2 887 35.35533906 862 912 + +## Why is this project in a folder marked 
'unofficial'? ##
+
+CoreCLR CI machines don't currently support building netcoreapp2.0 projects authored with the new msbuild SDK authoring style, so the repo build uses the JitBench.csproj one directory higher, not the one in this directory. If you try to build the project in this directory in CI, you get this error:
+
+    C:\Program Files\dotnet\sdk\1.1.0\Sdks\Microsoft.NET.Sdk\build\Microsoft.NET.TargetFrameworkInference.targets(112,5): error : The current .NET SDK does not support targeting .NET Core 2.0. Either target .NET Core 1.1 or lower, or use a version of the .NET SDK that supports .NET Core 2.0. [D:\j\workspace\x64_checked_w---eac6a79c\tests\src\performance\Scenario\JitBench\JitBench.csproj]
+
+I assume the CI machines have fairly old SDK tools installed, but I didn't have enough time to keep investigating these build issues. From what I can tell, if you have a .NET Core 2.0+ SDK installed on your machine, this build works fine from the command line and from VS.
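As a rough illustration of the "Adding a new Benchmark" workflow described in the README above, a new benchmark class might look like the sketch below. The `Benchmark` base class shown here is a simplified stand-in so the sketch compiles on its own (the real harness's abstract `Setup()` takes harness-specific parameters not shown); the `HelloWorldBenchmark` class, its paths, and the particular COMPLUS variable are hypothetical examples. Only the constructor-name convention and the `ExePath`, `WorkingDirPath`, and `EnvironmentVariables` property names are taken from the README.

```csharp
using System.Collections.Generic;

namespace JitBench
{
    // Simplified stand-in for the harness's Benchmark base class.
    public abstract class Benchmark
    {
        protected Benchmark(string name) { Name = name; }

        public string Name { get; }

        // Setup() fills these in to determine what process the runner launches.
        public string ExePath { get; set; }
        public string WorkingDirPath { get; set; }
        public IDictionary<string, string> EnvironmentVariables { get; } =
            new Dictionary<string, string>();

        // In the real harness this takes additional setup parameters.
        public abstract void Setup();
    }

    // Hypothetical benchmark: runs a locally staged HelloWorld.dll.
    public class HelloWorldBenchmark : Benchmark
    {
        public HelloWorldBenchmark() : base("HelloWorld") { }

        public override void Setup()
        {
            // Acquire whatever files the benchmark needs, then point the
            // runner at the process to invoke when the benchmark executes.
            ExePath = "HelloWorld.dll";
            WorkingDirPath = @"C:\temp\HelloWorldApp";
            EnvironmentVariables["COMPlus_TieredCompilation"] = "1";
        }
    }
}
```

BuildHelloWorldBenchmark.cs in the Benchmarks folder is the real template to copy; this sketch only shows the shape of the contract.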