Integrate Grpc.Tools into msbuild system (C# only)#13207

Merged
jtattermusch merged 21 commits into grpc:master from kkm000:package-grpc-tools
Oct 17, 2018

Conversation

@kkm000
Contributor

@kkm000 kkm000 commented Nov 1, 2017

When Grpc.Tools is added to a C# project, the project recognizes .proto files. These .proto files are compiled into .cs files placed in the object directory (under obj/...); they are not shown in Visual Studio, but are available to IntelliSense nevertheless.

The scripts support both "classic" projects and the new dotnet-style projects. In line with the latter (where .cs files dropped into the directory are picked up and compiled automatically), .proto files are handled the same way: placing a .proto file into the folder is enough to compile it, and to compile its protoc outputs into the project's target assembly.

The "classic" project requires adding .proto files one by one, and they are assigned the correct build action automatically. To add proto files to an msbuild script manually, use the 'ProtoBuf' item type.
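To illustrate the 'ProtoBuf' item type, a manual declaration in a project file might look like the following sketch (the path "protos\greeter.proto" is made up for the example; it is not a file shipped with the package):

```xml
<!-- Illustrative .csproj fragment: declares one proto file with the
     'ProtoBuf' build action. The file path is an example only. -->
<ItemGroup>
  <ProtoBuf Include="protos\greeter.proto" />
</ItemGroup>
```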

For native code C++ projects, there is no added support beyond just dropping the compiler as before. For all other project types, an error message is generated on an attempted build. More C++ support is on the way.

Dependency change: the package now adds a dependency on Grpc.Tools of a matching version for managed projects only. This is naturally skipped for native C++, as the managed runtime is not required.

Also packaged are proto includes for well-known types.


Validated and confirmed working:

  • Windows (VS, classic C#, dotnet C#, C++)
  • Linux/Mono (both classic C# and dotnet C#)
  • Linux/dotnet (dotnet C# project only)

There are quite a few questions I wanted to discuss. If anyone could help me with the proper decisions, I would appreciate it.

  1. The capitalization of generated .cs files. When I compile fooBar.proto, the resulting .cs file is called FooBar.cs. Is there a way around that? This does not seem to make sense in a build like this, where the .cs files are treated as, at best, temporary. I can certainly script around the issue in the msbuild scripts, but why have it in the first place?

  2. If x.proto does not contain any RPC services, the corresponding xGrpc.cs is not generated (FWIW, the cpp plugin does not do that). This is not good for the build. While it is certainly possible to check whether the file is there after compilation, it is impossible to learn whether it was actually produced by that compilation. In other words, if I take a service definition out of x.proto after having previously compiled it, I end up with a stale xGrpc.cs that is still added to the compilation. Of course, it is possible to write some msbuild incantations around this issue as well, but it would be much cleaner to just create a nearly empty xGrpc.cs stub even if no services are defined. So the options I am asking about here are:

  • a) Leave it as is and script some very ugly transformations around it.
  • b) Add a plugin option to always generate the stub.
  • c) In fact, forget the option and just always generate the stub.
  3. I want to package the cpp plugins also, and support integrating gRPC into the build process (I already have it packaged internally). This will cover the build process in Visual Studio. Should we do that?

  4. I am adding a bunch of files to the root of src/csharp that only go into the nuget package. Is it better to place them into a subdirectory? Is a Grpc.Tools subdirectory good?


/cc @jskeet @hcoona
Closes #4805
/* #13098 */

@thelinuxfoundation

Thank you for your pull request. Before we can look at your contribution, we need to ensure all contributors are covered by a Contributor License Agreement.

After the following items are addressed, please respond with a new comment here, and the automated system will re-verify.

Regards,
The Linux Foundation CLA GitHub bot

@kkm000
Contributor Author

kkm000 commented Nov 1, 2017

@thelinuxfoundation: I signed these 4 pages of impassable legalese! Might as well have been rot13. My firstborn is yours!

@kkm000 kkm000 force-pushed the package-grpc-tools branch from cb967e6 to b81878c Compare November 1, 2017 02:42
@kkm000
Contributor Author

kkm000 commented Nov 3, 2017

Hi @jtattermusch, I am not pushing for a quicker review, especially given that probably very few people around have experience with msbuild, and I would be surprised to find a reviewer for the VS project XAML object model integration (the new xml property file in my CL). @jskeet, can you help with this, maybe? I remember discussing that with you a couple of years ago.

The main thing I want somewhat expedited feedback on, and you are probably the best person to give an answer, is whether you think I am on the right track at all. I would then continue working on this integration project by hacking protoc and the grpc codegen to add options to produce predictable file names, and by reworking the examples and the docs. It seems I can carve out a couple of hours a day these weeks to finish this! We can do similar packaging for the protoc project too.

It would also be super amazing to somehow test these build scripts. The build part can reasonably easily be tested on Travis and AppVeyor, if you are using them. What external public cloud services are you using/approving for use? I see only links to sponge, which I obviously do not see (not any more :)). VS integration is probably not easily testable, though.

Thanks!

@jskeet
Contributor

jskeet commented Nov 3, 2017

I don't have any time to look at this at the moment. I can try to get to it next week, but I'm travelling to a conference (having been away this week) so it may be the week after before I can allocate any time to this.

@kkm000
Contributor Author

kkm000 commented Nov 3, 2017

@jskeet, thank you very much, appreciate it. Naturally there is no rush at all!

@kkm000
Contributor Author

kkm000 commented Nov 4, 2017

Xref #6085 /cc @aniongithub. Holy Guacamole, this MsBuild integration archeology uncovers quite a history! :)

@kkm000
Contributor Author

kkm000 commented Nov 17, 2017

@jskeet, @jtattermusch: Do you think it would be possible to allocate some time to look at this?

The current packaging scheme of gRPC is quite lacking. I do not think I have an extremely complex project, and yet I am crippled by the way the tools are packaged. I have a C++ server and many C# clients. To compile a proto file with a service definition for the clients, I need, first of all, protoc (from the tools package), which is rather impossible to discover (I am just defining a property in my build file like ProtoCompiler=../../packages/Grpc.Tools.1.7.1/.... and trying not to forget to update the path with upgrades; this is a no-go for .NET SDK projects, though). Second of all, I need the standard proto imports (include/google/protobuf/), and they are not packaged anywhere at all, as far as I looked, so I just checked them into my project and maintain a copy :-/. The only way I can let the clients get at the interface is to precompile the proto and put the assembly into a NuGet package on the internal feed. This is not a bad solution per se, but I think the support for .proto files should be as simple as just dropping them into an MsBuild project, as easy as it is in a BUILD file.

I understand I am most likely preaching to the choir, but I am just corroborating that the packaging and build support does need a radical rework, and currently barely delivers. I will be more than happy to do all the work (in fact, I did most of it to support gRPC internally here). I need help from the maintainers with locating binary artifacts for multiple systems, and perhaps a certain degree of commitment so that this attempt at the solution does not sink this time. I am totally determined to support packaging and build for multiple platforms, ideally all platforms and frameworks where the gRPC runtime is supported.

@jskeet
Contributor

jskeet commented Nov 17, 2017

I may have time to look next week, but I can't make any promises. I would point out that the standard protos are in Google.Protobuf.Tools though.

@kkm000
Contributor Author

kkm000 commented Nov 17, 2017

@jskeet, thanks!

As for the standard protos, thanks for the pointer. Unfortunately, they are about as discoverable by the build as the gRPC packaged tools (namely, not at all), so the amount of maintenance when using the package is about the same if not greater. It's just one directory of small files. You see, it all kinda works together in the end, but not without picking one's poison first.

@jskeet
Contributor

jskeet commented Nov 20, 2017

Okay, having had more of a look, a few questions:

  1. Dependencies

Dependency change: the package now adds a dependency on Grpc.Tools of a matching version for managed projects only

Is that dependency added as a build-only dependency? If I'm building a library, I don't want to end up with a dependency on Grpc.Tools in my resulting library or nuget package.

  2. Source presence

There are two times at which I'd definitely want the generated source to be available:

  • When debugging locally; I assume this is okay as they'll still be present in the obj file
  • When debugging code that depends on a library that includes the protos. There are multiple ways of getting at the source code for code that you're debugging; the one I'm tending towards these days is SourceLink, but that usually expects that the source code is persisted somewhere. Would you expect the source code for the protos to be bundled into the nuget package?
  3. Well-known type dependencies

I'd prefer them to only be in Google.Protobuf.Tools, personally (where they are already). With a single source of truth, there's less chance of ending up with conflicting definitions. I think it would be reasonable to have a dependency on Google.Protobuf.Tools though.

  4. Protobuf-only tooling

A lot of users may well use protobuf without gRPC. Would it be feasible to have an msbuild task that enables that scenario, and somehow automatically extend it for gRPC scenarios if Grpc.Core is in the dependencies? In particular, I don't want to end up with an msbuild task that stops us from doing anything sensible with proto-only projects.

  5. Options

There's currently an option for generating internal types, and in the future I expect there to be an async option. Is there a simple way for us to allow options to be set in the project file?

  6. Rebuilding

Am I right in saying that msbuild will basically associate the input proto file with the output cs file, and assume that it needs to rerun protoc if and only if the proto has changed? I assume so, but I thought it worth checking.
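For context, the conventional MSBuild mechanism for this kind of up-to-date check is target-level Inputs/Outputs; the following is a generic sketch only, with target, item, and command names that are illustrative rather than necessarily what this PR implements:

```xml
<!-- Generic MSBuild incremental-build sketch: the target (and each
     per-item Exec batch) is skipped when the output .cs file is newer
     than its input .proto file. All names here are illustrative only. -->
<Target Name="CompileProtos"
        Inputs="@(ProtoBuf)"
        Outputs="@(ProtoBuf->'$(IntermediateOutputPath)%(Filename).cs')">
  <Exec Command="protoc --csharp_out=$(IntermediateOutputPath) %(ProtoBuf.Identity)" />
</Target>
```

With this shape, MSBuild compares timestamps of the declared inputs and outputs, so protoc is rerun if and only if a .proto is newer than its generated .cs.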

@jtattermusch jtattermusch self-assigned this Nov 20, 2017
@jtattermusch jtattermusch self-requested a review November 20, 2017 16:17
Contributor

Ignoring the change of location from /tools to /build, are there any limitations in using Grpc.Tools in the "old" way
(e.g. just using a user-provided script and invoking the binaries manually)? I think it is pretty important to make sure that the old workflow still works for users who don't want to change the way they build their project (I think it's fair to require users to update the path from /tools/... to /build/..., but not more).
If the "VS integration" turns out not to work well for some users (I think there will always be such a scenario...), is it possible to disable it so it doesn't interfere with the manual .proto build process?

Contributor Author

@kkm000 kkm000 Nov 20, 2017

nit: it's not just VS integration, it's a complete integration for VS, standalone MSBuild, and the .NET SDK on multiple platforms and mono (I am testing on Windows and Linux).

The short answer is no, just change "tools" to "build".

The long answer is that, unfortunately, there is no "old" way to use the tools without jumping through hoops and encoding explicit paths to the grpc tools. The problem is that there are, again, two different ways of placing NuGet packages, one corresponding to the "old" (non-.NET SDK) projects, where NuGet is a separate tool. The packages are (by default) placed into the /packages subdirectory of the solution. This is less of a problem when the packaged artifacts are libraries, as they are just added with a relative path to the project references. Here's an example from an "old" style project (which is what we currently use, with very few exceptions):

    <Reference Include="Grpc.Core, Version=1.0.0.0, Culture=neutral, PublicKeyToken=d754f35622e28bad, processorArchitecture=MSIL">
      <HintPath>..\..\packages\Grpc.Core.1.7.1\lib\net45\Grpc.Core.dll</HintPath>
    </Reference>

The <HintPath> is a misnomer; it is the actual location where the library should be found. Of course, moving projects is a headache, and switching the target version of the framework is a double headache (note the \net45\ part, which might or might not work with a different framework). These entries are created by the NuGet extension inside Visual Studio, and are quite brittle.

The "new" build scripts integrate NuGet into the build process. So in a "new" style project, there is a reference to NuGet package ID only, and the build process figures out the correct library folder, if there are multiple. Here's how it looks:

  <ItemGroup>
    <PackageReference Include="Grpc.Core" Version="1.7.1" />
  </ItemGroup>

(happily, VS2017 supports that for the "old" projects, but I digress).

The tools as packaged are hardly useful, because they are not discoverable by the build process. First of all, the solution-level packages folder is only a default. The user can change the location in their Nuget.config so that, for example, all packages from all builds are placed under the same directory and shared. This would make the path to the tools totally disconnected from, and absolutely undiscoverable by, the build. While I can think of a doubly horrific hack of traversing the tree up, looking for a package directory, finding one (or more) that matches the Grpc.Core.* pattern, then maybe selecting the one with the highest version (but oh boy, how??), this is brittle by itself and will also break with the common package location.

The worse news is that the common location is actually the default in .NET SDK projects. Here's a sample project under Linux that I used while developing these scripts (version 1.9.9.5 is just a fake large version to pick the package off my local feed):

kkm@yupana:~/work/dotnet/hwapp$ cat hwapp.csproj
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp2.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Google.Protobuf" Version="3.4.1" />
    <PackageReference Include="Grpc.Tools" Version="1.9.9.5" />
  </ItemGroup>

</Project>
kkm@yupana:~/work/dotnet/hwapp$ dotnet nuget locals global-packages --list
info : global-packages: /home/kkm/.nuget/packages/
kkm@yupana:~/work/dotnet/hwapp$ ll /home/kkm/.nuget/packages/grpc.tools
total 4.0K
drwxrwxr-x 3 kkm kkm 4.0K Oct 31 17:11 1.9.9.5/

So you see where the guy is hiding. NuGet is absolutely not doing any good here; if I wanted to compile some .proto files using a script, I would much prefer to download a .tar.gz file with the tools and other required files than to try to parse the output of dotnet nuget locals global-packages --list in a script (and even then the problem of multiple versions does not go away).

All in all, the point of this "long answer" exposition is to show that the packaging, quite unfortunately, simply cannot get worse than it is. It is currently rather dysfunctional: a quite inconvenient .zip file with extra metadata in it (a .nupkg is a .zip) that is better unpacked into a known location than pulled automatically from within a project using the tools.

(As a side note, the correct way to handle the simplest case of tool discovery is to include a .props or .targets file that sets a variable based on the predefined property $(MSBuildThisFileDirectory), which is set to the location of the current project file, be it the root or an imported one. .props is for settings the user can override in the project; .targets is for things we want more control over.)
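A minimal sketch of such a package-shipped .props file (the ProtoCompilerPath property name mirrors the hand-maintained file quoted elsewhere in this thread; the relative layout is illustrative only):

```xml
<!-- Sketch of a .props file shipped inside the package. Because this file
     is imported from inside the unpacked package, $(MSBuildThisFileDirectory)
     resolves to the package's own build/ directory, so no hard-coded
     package path or version is needed. Layout is illustrative only. -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <ProtoCompilerPath Condition=" '$(ProtoCompilerPath)' == '' ">$(MSBuildThisFileDirectory)..\tools\windows_x86\</ProtoCompilerPath>
  </PropertyGroup>
</Project>
```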

Contributor

Thanks for the comment, I do agree that the "old" way is hardly sufficient, but I'll try to rephrase my question:

  1. let's say someone has Grpc.Tools installed in their project, and that project has some .proto files that were previously being built using some kind of script (that the user wrote themselves)
  2. the .proto files don't necessarily have the right directory structure; they are just somewhere in the project directory, because until now it didn't really matter.
  3. after upgrading to the Grpc.Tools from this PR, the automatic build action for the .proto files present in the project will kick in and likely fail. And maybe the user isn't ready to restructure the project to fix those build issues, or there might be some issues that are tricky to solve.

My question is basically: what can we do to prevent users' existing projects from breaking once they upgrade the Grpc.Tools package? Is there a simple way to disable the .proto build actions?
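One conventional shape for such an opt-out would be a guard condition on the package's automatic item inclusion; the property name below is purely hypothetical and is not something this PR defines:

```xml
<!-- Hypothetical opt-out sketch: a user sets a property in the .csproj
     to suppress the automatic inclusion of .proto files. The property
     name DisableProtoBufAutoInclude is made up for illustration. -->
<ItemGroup Condition=" '$(DisableProtoBufAutoInclude)' != 'true' ">
  <ProtoBuf Include="**\*.proto" />
</ItemGroup>
```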

Contributor Author

@kkm000 kkm000 Nov 21, 2017

I am sorry, I seem to be failing to communicate properly how dire the problem actually is. That someone is me, and everyone else out there. That is the whole point: compilation necessarily breaks on every package upgrade, because the version is part of the path. We cannot prevent that breakage, and could not from day one. Here's the Git history of the file imported by other projects for the last 3 upgrades. All changes you see are necessarily manual:

$ git show -G packages -U1 --minimal cde0e90 5c74249 811223e -- ProtoBuf.targets
commit cde0e90cff406827d16e49c48fc33c448f284327
Author: kkm <[email protected]>
Date:   Thu Oct 20 18:44:11 2016 -0700

    Upgrade gRPC package version to 1.0.0

diff --git a/cssrc/ProtoBuf.targets b/cssrc/ProtoBuf.targets
index 7a420a7..351575d 100644
--- a/cssrc/ProtoBuf.targets
+++ b/cssrc/ProtoBuf.targets
@@ -6,4 +6,4 @@
   <PropertyGroup>
-    <ProtoCompilerPath Condition=" '$(ProtoCompilerPath)' == '' ">..\..\packages\Grpc.Tools.0.15.0\tools\windows_x86\</ProtoCompilerPath>
-    <ProtoCsPluginPath Condition=" '$(ProtoCsPluginPath)' == '' ">..\..\packages\Grpc.Tools.0.15.0\tools\windows_x86\</ProtoCsPluginPath>
+    <ProtoCompilerPath Condition=" '$(ProtoCompilerPath)' == '' ">..\..\packages\Grpc.Tools.1.0.0\tools\windows_x86\</ProtoCompilerPath>
+    <ProtoCsPluginPath Condition=" '$(ProtoCsPluginPath)' == '' ">..\..\packages\Grpc.Tools.1.0.0\tools\windows_x86\</ProtoCsPluginPath>
   </PropertyGroup>

commit 5c74249d264c322b7cc5c90d7013f0e4aa446b77
Author: kkm <[email protected]>
Date:   Sat Oct 21 20:35:38 2017 -0700

    Update gRPC and protobuf packages for C# projects

diff --git a/cssrc/ProtoBuf.targets b/cssrc/ProtoBuf.targets
index 351575d..84aabc9 100644
--- a/cssrc/ProtoBuf.targets
+++ b/cssrc/ProtoBuf.targets
@@ -6,4 +6,4 @@
   <PropertyGroup>
-    <ProtoCompilerPath Condition=" '$(ProtoCompilerPath)' == '' ">..\..\packages\Grpc.Tools.1.0.0\tools\windows_x86\</ProtoCompilerPath>
-    <ProtoCsPluginPath Condition=" '$(ProtoCsPluginPath)' == '' ">..\..\packages\Grpc.Tools.1.0.0\tools\windows_x86\</ProtoCsPluginPath>
+    <ProtoCompilerPath Condition=" '$(ProtoCompilerPath)' == '' ">..\..\packages\Grpc.Tools.1.6.1\tools\windows_x86\</ProtoCompilerPath>
+    <ProtoCsPluginPath Condition=" '$(ProtoCsPluginPath)' == '' ">..\..\packages\Grpc.Tools.1.6.1\tools\windows_x86\</ProtoCsPluginPath>
   </PropertyGroup>

commit 811223eede7ac4177e0bc64fc45070dd8a87d66b
Author: kkm <[email protected]>
Date:   Tue Nov 14 19:57:04 2017 -0800

    Upgrade gRPC packages to 1.7.1 (fixed 2 leaks)

diff --git a/cssrc/ProtoBuf.targets b/cssrc/ProtoBuf.targets
index 354e81d..6c2fbf8 100644
--- a/cssrc/ProtoBuf.targets
+++ b/cssrc/ProtoBuf.targets
@@ -6,4 +6,4 @@
   <PropertyGroup>
-    <ProtoCompilerPath Condition=" '$(ProtoCompilerPath)' == '' ">..\..\packages\Grpc.Tools.1.6.1\tools\windows_x86\</ProtoCompilerPath>
-    <ProtoCsPluginPath Condition=" '$(ProtoCsPluginPath)' == '' ">..\..\packages\Grpc.Tools.1.6.1\tools\windows_x86\</ProtoCsPluginPath>
+    <ProtoCompilerPath Condition=" '$(ProtoCompilerPath)' == '' ">..\..\packages\Grpc.Tools.1.7.1\tools\windows_x86\</ProtoCompilerPath>
+    <ProtoCsPluginPath Condition=" '$(ProtoCsPluginPath)' == '' ">..\..\packages\Grpc.Tools.1.7.1\tools\windows_x86\</ProtoCsPluginPath>
   </PropertyGroup>

Except that those who use the .NET SDK are in a worse position: in the "old" project types we use here, I can more or less guess where packages are relative to the project, and commit this file for other team members to use. Users of the .NET SDK are out of luck, though: for them the path is not ../packages or ../../packages or ../../../packages, but, in addition to having the version in it, starts with something like C:/users/kkm/... on my machine vs C:/users/jtattermush/... on your machine. I cannot even check it in for others to use.

The only workaround I know of is to use the nuget command (a real command; this is not available during build by default) with a few switches that tell nuget essentially "download this package as if you were curl(1), then unzip it to this directory as if you were unzip(1), and please do not add the version to the extract path". Essentially, download and unpack a zip file. That works for nuget.exe on Windows and Mono; dotnet nuget does not support these switches (but curl and unzip are usually available on the platforms where you cannot run nuget.exe). I do not really know how other people work around the problem with the gRPC/Proto tooling packages in real life, but I can feel their pain.

Contributor Author

@kkm000 kkm000 Nov 21, 2017

To give a short answer: if someone wants, for some reason, to continue using Grpc.Tools this way, they will have to replace "tools" with "build" once, in addition to the version that they have to replace with every upgrade anyway. No big deal at all.

Contributor

@kkm000 I see, but wouldn't all the .proto files under the project directory automatically be picked up for building by the new MSBuild rules, without me having a chance to opt out (except by uninstalling Grpc.Tools from my project)? Or is there an explicit rule I need to add to my .csproj file to make the automagic .proto compilation kick in? (If so, then there's no problem.) I haven't actually tried playing with the new version of the Grpc.Tools package yet, hence my question - I might be missing something.

Contributor Author

@kkm000 kkm000 Nov 22, 2017

@jtattermusch Yes, absolutely. Inside Visual Studio, changing a .cs file's build action from "Compile" to "None" in its properties makes VS generate a fragment in the project file (equivalent to):

  <Compile Remove="foo.cs" />
  <None Include="foo.cs" />

The same happens with proto files, which are included by default but can be excluded specifically per-file.

Without VS (the .NET SDK alone, e.g. on Linux) there is no need to add the removed files to <None> (adding to the None collection is done only so that VS continues to show the file in the project tree), so you would just write, e.g.

  <ProtoBuf Remove="*_test.proto" />

to exclude a specific pattern, or, why not,

  <ProtoBuf Remove="@(ProtoBuf)" />
  <ProtoBuf Include="only_one.proto" />

In the end, I would prefer the defaults to be in step with the Microsoft vision for these projects (as I grok it), but it is naturally controllable at the level of individual files.

Contributor

Great, I think that's very acceptable; thanks for the explanation. I think the "way to disable" should be part of the documentation, along with the way to use it. Does src/csharp/Grpc.Tools/README.md sound like a good place for some initial Grpc.Tools docs?

Contributor

If upgrading an existing package radically changes the default build process, that sounds like it's not great on the compatibility front.

Can I just check I understand properly: you're currently considering adding this to Grpc.Tools. Any reason not to create a new package of Grpc.Tools.MSBuild (or Google.Protobuf.Tools.MSBuild)? Yes, moving the tool locations in Grpc.Tools is already disruptive, but it's less so than requiring each project to add extra lines to disable something. Having the MSBuild task separate allows for separate versioning etc as well.

Contributor

nit: should the Grpc.Tools.* files go under a Grpc.Tools/ directory?

Contributor Author

I expected that, and was going to ask for your suggestion as to the best place to put them. Yes, absolutely, I'll move them, thanks!

Contributor

@jtattermusch jtattermusch Nov 20, 2017

if there are multiple .proto files with dependencies, will they build correctly? It looks like this is building .proto files one by one in separate protoc.exe invocations (which I think will fail for .proto files with dependencies).

Contributor Author

@kkm000 kkm000 Nov 20, 2017

Might not, as written, when they are at different levels of a common subtree. I did keep that in mind while writing these scripts, but I am sure this should be the next step; you can treat this as a proof of concept. This is in a way a complex matter. Ideally, we should gather all .proto files within the project, find the common root for them, and use that with one -I (another being the common include root), passing all protos to a single protoc invocation. The common root, however, is only a guess; the user should be able to specify the root explicitly. Also, this might not work if the files come from a different location, not under the project root. Another scenario is when the user wants to compile multiple sets of proto files (e.g., some from another location, such as ../../otherproj/{x,y}.proto, and some from this project).

So my thought is, we should "automagically" guess the "default" proto root (e.g., the project root is not a bad guess), but allow more advanced cases by batching (an MSBuild term) invocations per collection of proto files. This is easily doable using MSBuild metadata, where said metadata would explicitly provide a root for a given file. For example,

<ProtoBuf Include="subdir1/this_proj_file1.proto;subdir2/this_proj_file2.proto" />
<ProtoBuf Include="../other/other_proj_file1.proto;other_proj_file2.proto">
  <ProtoRoot>../other/</ProtoRoot>
</ProtoBuf>

should result in two protoc invocations: the first compiling the */this_proj* files with the default root (equal to the location of the .csproj file), and the second with the -I path explicitly specified by the ProtoRoot metadata on the two other files (maybe converted to a full path in case protoc does not digest relative paths as specified).

So my goal is to (1) make the compilation totally automatic by default, in line with the "new" MS approach (where you just put files beside the .csproj and they are picked up by the build automatically), while at the same time (2) allowing better control over differently-rooted proto "collections" (for lack of a better term).
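As a sketch of the batching idea above (the target name and protoc arguments are illustrative, not what the PR ships): referencing an item metadata expression such as %(ProtoBuf.ProtoRoot) in a task attribute makes MSBuild partition @(ProtoBuf) by distinct ProtoRoot values and run the task once per partition.

```xml
<!-- MSBuild batching sketch: protoc runs once per distinct ProtoRoot
     value, receiving all files of that batch together. Names here are
     illustrative only. -->
<Target Name="CompileProtoBatches">
  <Exec Command="protoc -I %(ProtoBuf.ProtoRoot) --csharp_out=$(IntermediateOutputPath) @(ProtoBuf->'%(FullPath)', ' ')" />
</Target>
```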

Can you think of use cases that won't be covered?

/cc @jskeet please comment on this one too.

Contributor

What is the minimum Visual Studio version for this .targets file to work? 2013, 2015, 2017?

Contributor Author

I specifically avoided using any MSBuild features beyond those in v4.0, so the build part will likely work even with VS2010, if anyone is antiquated enough to still use it. The XAML object model file is required for "new" style .NET SDK projects within VS; without it, proto files will not show up in Visual Studio at all. Using this file has no effect on older builds, as they simply ignore the new collection. The AvailableItemName collection is another way to tell VS to show files in the tree, used for the "old" style .csproj; it is certainly recognized by VS2012, but I am not sure about VS2010.

Stand-alone SDK projects or standalone invocations of MSBuild do not recognize any of these special collections.

Contributor Author

@kkm000 kkm000 Nov 21, 2017

I think I was wrong about VS2010--did it support .NET4.0 at all? Yes it did.

Question: what is the minimum supported build tools version for gRPC in .NET? I am asking this after following up on protocolbuffers/protobuf#3843. In case we proceed with a build task DLL for protoc, we only need to make sure the task can be loaded into MSBuild v4.0, v12.0, v14.0, and v15.0 for net45+, and v15.0 for netstandard1.3+ (the latter on multiple platforms).

For reference, v4.0 is used by VS2010 aka VS10.0 and VS2012 aka VS11.0; v12.0 by VS2013 aka VS12.0; v14.0 by VS2015 aka VS14.0; and v15.0 by VS2017 aka VS15.0. Microsoft certainly made their versioning easy for us so we can have even more fun! :-/

Contributor

For gRPC as a user (= user downloads gRPC nugets and uses them in a project) we are supporting VS2013+ (.NET4.5+) and mono 4.0+ (things might actually work with mono 3.X unofficially).

For gRPC as a developer, you need VS2017 or dotnet CLI core as we are using C#6 and the "new" .csproj XML projects, but that's a different story.

Contributor Author

Good to know, thanks. I'll make sure these all work.

@jtattermusch jtattermusch left a comment


I quite like this, thanks for the contribution! I had some questions though (see comments inline).


@jtattermusch: Should it be "Google" or "gRPC authors"? It varies.


"gRPC authors" is correct. We probably didn't update all the occurrences.


kkm000 commented Nov 20, 2017

@jskeet wrote:

Dependency change: the package now adds a dependency on Grpc.Tools of a matching version for managed projects only

Is that dependency added as a build-only dependency? If I'm building a library, I don't want to end up with a dependency on Grpc.Tools in my resulting library or nuget package.

Ugh, did I say Grpc.Tools? This is actually Grpc.Core, sorry about the confusion!

Mutatis mutandis, yes, if you build the package from proto files, you will end up with a runtime dependency on Grpc.Core. Can you think of a scenario where that would be undesired (assuming there are some services defined; or is the user rather expected to use Protobuf tools alone)? The built assembly will not work without it (and the Google.Protobuf runtime package).

Source presence
There are two times at which I'd definitely want the generated source to be available:
When debugging locally; I assume this is okay, as they'll still be present under the obj directory

Yes, both the debugger and IntelliSense know the files (you can hit F12 to go to the generated source).

When debugging code that depends on a library that includes the protos. There are multiple ways of getting at the source code for code that you're debugging; the one I'm tending towards these days is SourceLink, but that usually expects that the source code is persisted somewhere. Would you expect the source code for the protos to be bundled into the nuget package?

This is not what I planned for. I am not familiar with SourceLink (I'll figure it out). I just tried $ dotnet pack --include-source on my little toy project, and the generated sources are not included in the .symbols.nupkg package (neither were the proto files). I am sure it should not be hard to tap into the packaging path in the NET.SDK scripts to include them.

A question here would be where to place them. As is, the files are generated under e.g. obj/Release/netstandard1.5, with their corresponding directories relative to the -I root. I do not want to lump them all into a single directory, as file names might conflict. Also, since the PDB records file locations, I think it is not that easy to move them even if the directory structure is preserved. Rewriting the PDB somehow? I do not know if there are tools for that.

Do you think we can leave that for iteration 2? We can advance very far already with simple enough changes. I can look at packaging the proto files with the symbols+sources package in this version.

We should not generate the files under the source tree, however, due to the greediness of NET.SDK file discovery. As soon as we place them even into a subdirectory, the .csproj will discover them by default, and would yell at us for trying to include the file in the Compile collection twice (yes, I've been yelled at for that). Not that it cannot be remedied, but this should be an advanced option, well thought out. How likely is it that someone will need access to the generated sources when using a sources+symbols package, anyway? Indeed, your case, as the gRPC developer, is quite exceptional.
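
To illustrate the double-include hazard: the .NET SDK implicitly adds `**/*.cs` to the Compile collection, so generated files placed under the project tree get picked up alongside any explicit include, producing the duplicate-item error described above. A sketch of the opt-outs a user would otherwise have to write (the property and item names are the SDK's standard ones; the `Generated` folder is hypothetical):

```xml
<!-- Sketch: escape hatches for the implicit **/*.cs glob of SDK projects. -->
<PropertyGroup>
  <!-- Turn off implicit Compile item discovery entirely... -->
  <EnableDefaultCompileItems>false</EnableDefaultCompileItems>
</PropertyGroup>
<ItemGroup>
  <!-- ...or surgically exclude just a (hypothetical) generated subtree. -->
  <Compile Remove="Generated\**\*.cs" />
</ItemGroup>
```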

Well-known type dependencies
I'd prefer them to only be in Google.Protobuf.Tools, personally (where they are already). With a single source of truth, there's less chance of ending up with conflicting definitions. I think it would be reasonable to have a dependency on Google.Protobuf.Tools though.

Oh, no question here, of course I would prefer that!!! Can you help make this happen? This would require Google.Protobuf.Tools packaging changes, very much in line with what I am doing in this CL. The main problem is that the tools are not discoverable (read my "long answer" to @jtattermusch). In an ideal world, we could package the gRPC codegen only, and the packages would know how to find each other and build with or without gRPC. But this requires tight cooperation with the protobuf team. @xfxyjwf's answer to my question today made me think we have some work to do in this direction.

But I certainly do not want this project to die out again. My opinion is we should just go ahead and include the files in this version, while reworking the protoc compiler packaging too, to end up where we both, if I understand correctly, want to end up.

Protobuf-only tooling
A lot of users may well use protobuf without gRPC. Would it be feasible to have an msbuild task that enables that scenario, and somehow automatically extend it for gRPC scenarios if Grpc.Core is in the dependencies? In particular, I don't want to end up with an msbuild task that stops us from doing anything sensible with proto-only projects.

I am not sure I understand the problem here (they say where the wise one does not see a solution, a fool does not see a problem, so forgive me :)).

In addition to my above comment about repackaging proto tools, one thing that is hard to tackle is knowing in advance whether the protonameGrpc.cs file will be generated. The C++ codegen just creates a (nearly empty) file for a proto without services. I think the most sensible thing to do is to change the C# codegen to match this behavior. This will enable dependency checking in the build to work correctly. As written, the script always causes the proto to recompile on every build, as the *Grpc.cs files are missing.

I could script around that, but there is no sound solution. What if someone refactors their proto files and moves the service definition? We could end up with a stale *Grpc.cs that still defines the old service, and compile errors (hopefully!).

Options
There's currently an option for generating internal types, and in the future I expect there to be an async option. Is there a simple way for us to allow options to be set in the project file?

Easy-peasy in the simplest case. See the MSBuild attributes in my answer to @jtattermusch.

But more complex cases are harder to handle. What if you compile a set of protobuf files that refer to each other, and set the option to different values? I do not think we can express that on the command line. In this case, we can probably (a) produce an error, or (b) let one value trump the other, (c) maybe with a warning. Which would you go with?
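
In the simplest case, per-file options could be expressed as item metadata along these lines. GrpcServices appears later in this thread as an actual knob; "Access" is an assumed name for the internal-types option, shown only to illustrate the shape, not the final option surface:

```xml
<!-- Illustrative sketch: per-file codegen options as metadata on the
     ProtoBuf item. Metadata names here are assumptions. -->
<ItemGroup>
  <!-- Generate message types only, with internal visibility. -->
  <ProtoBuf Include="internal_messages.proto" Access="Internal" GrpcServices="None" />
  <!-- Defaults: public types, gRPC client and server stubs. -->
  <ProtoBuf Include="service.proto" />
</ItemGroup>
```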

Rebuilding
Am I right in saying that msbuild will basically associate the input proto file with the output cs file, and assume that it needs to rerun protoc if and only if the proto has changed? I assume so, but I thought it worth checking.

Yes. Since the .cs files are added to the Compile collection, MSBuild transitively checks their dependencies as well.


kkm000 commented Nov 20, 2017

@jskeet: SourceLink looks amazing, thanks for bringing me up to date on it. Looks like a better alternative to source packages!

It seems that the files it pulls should be under source control and pushed up to the server. The generated files generally should not be. This is the interesting part.


kkm000 commented Nov 21, 2017

@jtattermusch, @jskeet: The question now is to use an MSBuild task or not.

@xfxyjwf suggested using the dep file generated by protoc. I like the idea, because it provides real, minimal, granular dependencies. Otherwise we have to assume that the compilation depends on every *.proto file in every directory supplied by -I. This also solves the problem of file name mangling. It likely also solves the problem of the missing xGrpc.cs file when x.proto has no services. And then the .sh script that guesses the Unix-like OS can go away too. This is a win all round.

However, this requires writing an MSBuild task, as scripting that inside MSBuild XML is nearly impossible. This certainly raises compatibility issues.

I played with MSBuild tasks a bit yesterday, and I can compile and package a task DLL so that it works on all versions of MSBuild and VS from 2010 on, and also with the .NET SDK on Linux (I have not yet tested Mono, but it uses the same MSBuild sources as VS, so I do not expect problems there). Essentially, the task can be packaged in two different builds: one for netstandard1.3 and another for net40. The first is below the requirements for gRPC on a machine, and the latter depends only on assemblies that come with the .NET runtime itself, not even on a particular version of VS (that was my biggest concern, as these MSBuild runtime DLLs are even named differently!).
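
The dual-build packaging described above can be sketched as a multi-targeted task project (the framework monikers are the ones discussed; everything else about the project file is a placeholder):

```xml
<!-- Sketch of the task assembly's project file: one binary for the full
     framework (net40, loadable by desktop/Mono MSBuild from v4.0 on) and
     one for netstandard1.3 (loadable by the cross-platform MSBuild that
     ships with the .NET SDK). -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFrameworks>net40;netstandard1.3</TargetFrameworks>
  </PropertyGroup>
</Project>
```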

Do you have any opinion on this, pros and cons?


jskeet commented Nov 21, 2017

MSBuild task vs not: It feels like it's an upfront hit but one that would pay dividends over time. There may be various opportunities which are just a little too expensive if we have to do everything within the XML, but which would be easy from C#.

One other aspect that might be relevant: would using an MSBuild task allow you to run protoc once across all .proto files? There are probably some problematic cases there, but also potentially useful ones.

Protobuf-only tooling: I would have started with this, personally. Indeed, it may be all that's required in the long run. I'd start with a task that turns a .proto file into a .cs file, and doesn't know anything about gRPC. A second iteration would check whether the project has any gRPC dependencies, and generates the gRPC code if so. (Or even uses an explicit option for it.) What I don't want is a situation where we can't later add that task because we'd then have two tasks fighting for who gets to build things with a .proto file.

Debugging: The lack of SourceLink support would probably stop me from using this for my current work, but it's a reasonable thing to defer. (I believe it does have options for source files to be in the package though, as well as other files being in source control.)


kkm000 commented Nov 21, 2017

@jskeet:

MSBuild task vs not: It feels like it's an upfront hit but one that would pay dividends over time.

Agree. So the task it is!

One other aspect that might be relevant: would using an MSBuild task allow you to run protoc once across all .proto files?

Since <Exec> is also a task, everything applies to either. Batching is an MSBuild feature, so MSBuild calls the task as many times as needed, as determined by the scripts around it.
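
A minimal sketch of the batching mechanics: a %(...) metadata reference in a task attribute is what partitions the item list, and MSBuild then invokes the task once per distinct metadata value, whether the task is <Exec> or a custom one (ProtoCompile and OutputDir here are hypothetical names):

```xml
<!-- Sketch: MSBuild target batching. Because OutputDir is referenced via
     %(...), this target runs the (hypothetical) ProtoCompile task once for
     each distinct OutputDir value among the ProtoBuf items. -->
<Target Name="CompileProtos">
  <ProtoCompile Sources="@(ProtoBuf)"
                OutputDir="%(ProtoBuf.OutputDir)" />
</Target>
```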

Protobuf-only tooling: I would have started with this, personally. Indeed, it may be all that's required in the long run. I'd start with a task that turns a .proto file into a .cs file, and doesn't know anything about gRPC.

Since a task (of this type, derived from ToolTask) is concerned only with running a tool, there should be only one protoc task that takes the same parameters as the protoc beneath it does. Specify the gRPC plugin, its options and output directory, and get the stubs file.

My current plan is to package the tools in one package but with separate .props and .targets files, as if they came from separate packages, one for protoc and another for gRPC plugins. This way it would be easier to transplant the protoc part into the Protobuf.Tools package. The most important part is tools cross-detection, and it will be much easier to debug in a single package (as long as nothing actually depends on their being in the single package).

Debugging: The lack of SourceLink support would probably stop me from using this for my current work, but it's a reasonable thing to defer. (I believe it does have options for source files to be in the package though, as well as other files being in source control.)

You know that tool much better than I do. I agree this is for the next iteration, probably what I have is already quite a bite to swallow. If you can tell me what needs to be done to the generated files so that they are recognized by the tool, I'll probably figure out a way to do that.


jskeet commented Nov 22, 2017

I actually know very little about the details of SourceLink, but we can discover what to do later on. (ctaggart has been very helpful in the past.)

I'm fine with there being multiple tasks in a single package. I do think it would be best if it were in Google.Protobuf.Tools (or even Google.Protobuf.MSBuild) from the start, though. While it's fairly natural for a gRPC user to refer to protobuf tools, it's odd for a Protobuf-only user to need a gRPC package. (Again, we need to make sure that the package doesn't end up in end-project dependencies etc., but I'm assuming that's easy enough to do.)

I think you've answered all my questions at this point, so I'll leave the details to the gRPC and Protobuf teams :)

@jtattermusch

@kkm000 the plan is for the change to land in the next release, v1.17.0 (the branch will be cut around Nov 20 and the release will go out at the beginning of Dec). Until then, the package will be available in nightly builds on packages.grpc.io.

Before the branch cut, we should update the documentation and do some cleanup; the example update PR should be merged after the packages are pushed to nuget.org (but it should be in a ready-to-merge state before then).


kkm000 commented Oct 18, 2018

@jtattermusch,

I had similar problems in the past (back in the legacy nuget days); that's why the original version of Grpc.Tools.nuspec had full paths including the destination filename. I also vaguely remember the / and \ distinction was important.

Yep, I had a similar experience. I had an internal package containing a text file called DEPLOY for internal tooling, and also had trouble putting it in place. This issue is different, I believe. I'll look even closer into this when we are done with this release. Let's wait and see if the NuGet guys respond.

Knowing that some versions of the SDK may silently mess with our packaging is kinda unnerving. Glad we have daily builds that are easy to eyeball, but automated verification that the tools behave sanely would be helpful, too. I'll keep that on my list. I want to make sure that (1) we do not accidentally release a package with messed-up paths, and (2) someone who gets the code and builds their own (modified, presumably) version with their SDK gets a consistent result. (1) is a pretty major point, (2) much less so. I have no idea if anyone in the wild would roll their own modified Tools package; it seems not very likely. Overall, (2) is safe to postpone until there is a bug report, IMO.

the example update PR should be merged after the packages are pushed to nuget.org

But before the version is tagged with a Git tag, right? So users who git checkout v1.17.0 get examples pointing to the new package. Just trying to grok the release sequence; this part is a bit circular in my mind. Should I make a copy/cherry-pick PR targeting the branch after it's cut, in order to get the examples into it, and then separately into master after the packages are shipped? Or is the release branch merged back into master, so that would happen automagically, and the examples PR should target only the 1.17 branch?

Examples are so independent from the rest of the versioning and branching of the core development that it often makes sense to put them in their own repo. Maybe that's our case, too.

(but it should be in ready-to-merge state before then).

The examples are in the PR #14684, ready for a review.

@jtattermusch

@kkm000 the best way is probably to create the examples PR against v1.17.x once it exists (i.e. approx. 2 weeks before the actual release, when the version is git-tagged v1.17.0 and the nugets are pushed), merge it into the release branch, and then integrate the example updates into the master branch.
For now we can review #14684 against master, and later close it and create a copy of that PR against v1.17.x.


kkm000 commented Oct 19, 2018

@jtattermusch Sure, that makes the most sense. FYI, it's easy to change the target; there's not even a need to close and reopen a PR. The Edit button to the right of the title turns the target name after "Wants to merge into..." into a drop-down.


jurby commented Nov 14, 2018

@jtattermusch @kkm000 May I ask when you plan to release this version (1.17) on nuget.org? Alternatively, could you make a release of 1.17-dev, please? Thank you very much.


kkm000 commented Nov 15, 2018

@jurby, as far as I can see, the branch has not been cut yet. You can pull the package from https://packages.grpc.io/ (click in the BuildId column to get to the files). I'd be grateful if you could report any problems you find!

@gilescope

Tried the dev nuget package. It seemed to work well on OS X and Windows :-) .
The Linux x86 nuget protoc segfaulted on Alpine :-( (After apk add protobuf, the protoc on the path worked fine, but the nuget protoc still segfaulted.)

(FROM microsoft/dotnet:2.1-sdk-alpine)

@Harald-Improbable

Tried the dev nuget package. It seemed to work well on OS X and Windows :-) .
The Linux x86 nuget protoc segfaulted on Alpine :-( (After apk add protobuf, the protoc on the path worked fine, but the nuget protoc still segfaulted.)

(FROM microsoft/dotnet:2.1-sdk-alpine)

If it helps any: gdb backtrace is

#0  0x000000000066790b in ?? ()
#1  0x00000000006306bb in ?? ()
#2  0x0000000000631735 in ?? ()
#3  0x00007ffff7dc4d47 in __pthread_once_full () from /lib64/ld-linux-x86-64.so.2
#4  0x0000000000001000 in ?? ()
#5  0x00007ffff7dc4cc6 in pthread_mutexattr_settype () from /lib64/ld-linux-x86-64.so.2
#6  0x000000000097f9b8 in ?? ()
#7  0x0000000000000000 in ?? ()

@jtattermusch

@Harald-Improbable can you run
ldd path/to/protoc and paste the output here?


kkm000 commented Nov 20, 2018

@Harald-Improbable, @gilescope, @jtattermusch: AFAIK, Alpine is quite a squashed distribution, designed to minimize the Docker image footprint for production on disk-strained systems. If I were choosing one for development, I'd go with Ubuntu; then you can deploy the built stuff with the Alpine image. It might also be worth opening an issue against the protocolbuffers/protobuf repo.

It looks like the gRPC plugin worked, or at least loaded correctly, when the distro's protoc was used, assuming all the package defaults were used. @Harald-Improbable, do you:

  • have any services declared in your proto files?
  • have perchance disabled gRPC plugin by setting GrpcServices="None" on .proto files?

Also, to clarify, @jtattermusch is probably asking for the ldd output from our packaged protoc, the one in the linux_x64 subdirectory of the package (the backtrace indicates you are on x64, but that is probably the only thing one can infer from it, lacking debug symbols). The file is at ~/.nuget/packages/grpc.tools/1.17.0-dev/tools/linux_x64/protoc, unless you have configured NuGet differently in your setup.

@jtattermusch

FTR, see #17255 which has a similar problem for grpc_csharp_ext.

@jtattermusch

@Harald-Improbable
I got this output:

ldd /root/.nuget/packages/grpc.tools/1.15.0/tools/linux_x64/protoc
	/lib64/ld-linux-x86-64.so.2 (0x7fdcf778e000)
	libm.so.6 => /lib64/ld-linux-x86-64.so.2 (0x7fdcf778e000)
	libpthread.so.0 => /lib64/ld-linux-x86-64.so.2 (0x7fdcf778e000)
	libc.so.6 => /lib64/ld-linux-x86-64.so.2 (0x7fdcf778e000)
Error loading shared library ld-linux-x86-64.so.2: No such file or directory (needed by /root/.nuget/packages/grpc.tools/1.15.0/tools/linux_x64/protoc)

Based on that, installing libc6-compat should be enough to fix the problem: apk add libc6-compat (it should solve the missing ld-linux-x86-64.so.2 problem)

@Harald-Improbable

@jtattermusch We do use libc6-compat in our docker config, but unfortunately it's still dying:

FROM microsoft/dotnet:2.1-sdk-alpine
RUN apk update && apk add libc6-compat

COPY ./csharp /src
WORKDIR /src
RUN dotnet build -c Release

@Harald-Improbable

@kkm000 Thanks for the advice; we followed your suggestion and went with Ubuntu.


kkm000 commented Nov 29, 2018

@Harald-Improbable, sure, as long as it's for build only, why worry about a few extra MB of image size? musl is not easy to combine with glibc.

FWIW, I came across this: https://github.com/frol/docker-alpine-glibc. It looks maintained (glibc is 2.28, the latest, I believe). But this still inevitably introduces some custom bits that haven't conceivably received the same level of testing as either of the base libraries. So yeah, I'm with you; better to stay on the safe side.


hagen93 commented Dec 6, 2018

A problem similar to what is described with Alpine above also seems to be present when running under Windows Docker (Nanoserver SAC2016, microsoft/dotnet:2.2-sdk). After debugging it with Process Explorer, it looks like protoc crashes while trying to load from SysWOW64, which is unavailable in Nanoserver and therefore also unavailable in all the dotnet Docker images for Windows. Is there a fix for this?


kkm000 commented Dec 6, 2018

@hagen93: Uh-oh. So it looks like nanoserver does not support 32-bit executables? Let me investigate, thanks for the heads-up!


kkm000 commented Dec 7, 2018

@hagen93: Thank you, that is an excellent catch! This is actually the case! https://docs.microsoft.com/windows-server/get-started/getting-started-with-nano-server#important-differences-in-nano-server, second bullet point:

  • Only 64-bit applications, tools, and agents are supported.

I'll send in a fix today.

/cc @jtattermusch


kkm000 commented Dec 7, 2018

@hagen93, you may temporarily work around the issue by either setting the environment variable PROTOBUF_TOOLS_CPU=x64, or passing it as a property with the same value to dotnet build or dotnet pack (not possible with dotnet run), e.g.,

dotnet build -v:n --no-incremental -p:PROTOBUF_TOOLS_CPU=x64

I just opened #17437 for the fix. I would really appreciate it if you could confirm that the new package, once available, works in this image without this workaround.


hagen93 commented Dec 7, 2018

@kkm000 Sure, when and where can I find the new package? As a workaround I already built a custom .NET Core image based on Server Core instead of Nanoserver, where SysWOW64 is available, but the workaround you are suggesting looks quite a bit nicer than that :)


kkm000 commented Dec 7, 2018

@hagen93, that's not under my control; I'm just a contributor. I hope 1.17.2, but I do not understand how point releases are done. I'll @ you when it's available. You may also subscribe to watch #17437, if you do not mind the extra notification fuzz. :)


hagen93 commented Dec 7, 2018

Alright 😄 An @ would do nicely. I'll take the new version for a spin when I hear from you 😄


hagen93 commented Dec 21, 2018

@kkm000 FYI: the workaround you suggested above does not seem to work. I tried both the env var and the msbuild property, but no luck. Just in case anyone else tries to use it 😄.

@lock lock bot locked as resolved and limited conversation to collaborators Mar 21, 2019

Labels

cla: yes lang/C# release notes: yes Indicates if PR needs to be in release notes
