
Implement 3rd-party (envoy) proto generation mechanism #25272

Closed
lidizheng wants to merge 1 commit into grpc:master from lidizheng:xds-proto

Conversation

@lidizheng
Contributor

@lidizheng lidizheng commented Jan 26, 2021

This PR improves our infrastructure code to support 3rd-party proto dependencies and allows them to generate C++ stubs via protoc compilation. It works for both Bazel and CMake.

The 3rd party proto libraries supported by this PR:

  • envoyproxy/data-plane-api
  • googleapis/googleapis
  • cncf/udpa
  • envoyproxy/protoc-gen-validate
  • census-instrumentation/opencensus-proto
  • ProtoBuf's well known protos (switched them to this path)

Bazel

For Bazel compilation, the general idea is to let Bazel treat envoy-api as a sub-repository, done by injecting a WORKSPACE file for it. But to avoid circular dependencies, we need to explicitly depend on the transitive libraries. Most of the transitive libraries are already included in gRPC as Git submodules, except opencensus-proto, which is needed by Envoy's opencensus.proto. The added grpc_deps are:

  • googleapis
  • re2
  • gazelle
  • opencensus-proto
  • protoc-gen-validate
  • io_bazel_rules_go (update required by protoc-gen-validate)
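The injected-WORKSPACE idea above can be sketched as follows. This is an illustrative fragment, not the PR's exact wiring: the repository name and path are assumptions.

```starlark
# WORKSPACE (sketch): once a WORKSPACE file has been injected into the
# third_party/envoy-api submodule directory, plain local_repository can
# expose it as the @envoy_api external repository.
local_repository(
    name = "envoy_api",
    path = "third_party/envoy-api",
)
```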

One extra setup step is added for googleapis in WORKSPACE, so it exposes the APIs for a subset of languages instead of all of them.
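That googleapis setup presumably relies on its `switched_rules_by_language` repository rule; a hedged sketch (the exact flags used by the PR may differ):

```starlark
load("@com_google_googleapis//:repository_rules.bzl", "switched_rules_by_language")

# Only generate C++ (and gRPC C++) rules for googleapis protos,
# instead of rules for every supported language.
switched_rules_by_language(
    name = "com_google_googleapis_imports",
    cc = True,
    grpc = True,
)
```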

The current version of re2 fails ASAN (see log). So, this PR disables the ODR check in ASAN (feel free to suggest other ways).

One caveat is that proto generation will fail on Windows RBE, since it doesn't have good Golang support (see log). Golang tooling is required by protoc-gen-validate, a ProtoBuf plugin written by the Envoy team, and it needs to be executed for every single Envoy proto file. CC @jtattermusch do you have more context about Golang on Windows RBE?

Overall, Bazel has generally good external library support. Follow the error log, fix the complaints one by one, and it builds magically in the end.

ProtoBuf Inconsistent Path Problem

Without Bazel, we have to interact with protoc directly, and it is very picky about paths. One needs to find the right combination of working directory (pwd), import directory (-I), and import path (how one proto is meant to be imported). For example, say we have 2 proto files, each in a different 3rd-party library:

```proto
// third_party/lib_a/a.proto
...
message NotImportant {}
```

```proto
// third_party/lib_b/b.proto
import "a.proto";
...
```
| Proto file location | Current working directory | Expected descriptor name for "a.proto" | Generated descriptor name | How "b.pb.h" imports "a.pb.h" | Generated "a.pb.h" header path |
| --- | --- | --- | --- | --- | --- |
| Untouched | `.` | descriptor_table_a_proto | descriptor_table_third_party_lib_a_a_proto ✗ | `#include "a.pb.h"` | $OUTPUT/third_party/lib_a/a.pb.h ✗ |
| Untouched | `third_party` | descriptor_table_a_proto | descriptor_table_lib_a_a_proto ✗ | `#include "a.pb.h"` | $OUTPUT/lib_a/a.pb.h ✗ |
| Both copied to project root | `.` | descriptor_table_a_proto | descriptor_table_a_proto ✓ | `#include "a.pb.h"` | $OUTPUT/a.pb.h ✓ |
| Only b.proto copied to project root | `.` | descriptor_table_a_proto | descriptor_table_third_party_lib_a_proto ✗ | `#include "a.pb.h"` | $OUTPUT/third_party/lib_a/a.pb.h ✗ |

As shown above, the ProtoBuf C++ generated code brute-force translates the import line import "third_party/lib_a/a.proto"; into how its dependency headers should be included and how its dependency descriptor symbol should be named, yet it generates the descriptor name of the file itself from its path relative to the working directory. The inconsistent header file path is annoying but does not block the build. However, the difference in descriptor names causes the build to fail deterministically. This issue troubles many ProtoBuf adopters, as it blocks people from using multiple repos to store proto files.
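The descriptor-name clash can be reproduced with a simplified model. Real protoc escapes '/', '.', and '_' into hex sequences rather than plain underscores, but the mismatch mechanics are the same:

```python
def descriptor_symbol(import_path):
    """Simplified model of how protoc derives the C++ descriptor symbol
    from the path a proto file is compiled (or imported) under."""
    return "descriptor_table_" + import_path.replace("/", "_").replace(".", "_")

# a.proto compiled from the repo root embeds its full relative path:
producer = descriptor_symbol("third_party/lib_a/a.proto")
# b.pb.cc, which contains `import "a.proto"`, references the symbol derived
# from the bare import path -- the two never match unless the paths agree:
consumer = descriptor_symbol("a.proto")
assert producer != consumer  # mismatch -> undefined symbol at link time
```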

To make protoc work, we can simulate how it is intended to be used inside Google: gather the build metadata, find the expected import path of each file, and rebuild the directory layout, like a monorepo, or like Bazel.
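A hypothetical staging helper illustrating that idea (`stage_protos` and its mapping are made up for illustration; this is not code from this PR):

```python
import pathlib
import shutil

def stage_protos(location_to_import_path, staging_root):
    """Copy each proto from where it lives on disk to its canonical import
    path under one staging root, so a single `protoc -I <staging_root>`
    sees every file exactly where its importers expect it."""
    for file_location, import_path in location_to_import_path.items():
        dest = pathlib.Path(staging_root) / import_path
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy(file_location, dest)
```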

gRPC Buildgen

The core update is done within extract_metadata_from_bazel_xml.py:

  • Allow external library labels to be counted as dependencies;
  • Treat external libraries as normal libraries, so the script can traverse to find all transitive dependencies;
  • Map external library Bazel file labels to actual file locations.

(By removing all the intermediate dependencies, we lose the information needed to build those files in order. But the script uses topological sort to create a workable build sequence... What's the rationale behind it?)
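The topological-sort step mentioned above can be sketched with Kahn's algorithm. This is a generic sketch, not the script's actual implementation:

```python
from collections import deque

def build_order(deps):
    """Return a build sequence in which every target comes after all of its
    dependencies. `deps` maps each target to the list of targets it needs."""
    remaining = {target: len(ds) for target, ds in deps.items()}
    dependents = {target: [] for target in deps}
    for target, ds in deps.items():
        for dep in ds:
            dependents[dep].append(target)
    # Start with targets that have no unbuilt dependencies.
    ready = deque(t for t, n in remaining.items() if n == 0)
    order = []
    while ready:
        target = ready.popleft()
        order.append(target)
        for dependent in dependents[target]:
            remaining[dependent] -= 1
            if remaining[dependent] == 0:
                ready.append(dependent)
    if len(order) != len(deps):
        raise ValueError("dependency cycle detected")
    return order
```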

The buildgen code is fairly complex; to understand how it works, I polished parts of it as I read. This modernized some function calls and added type annotations (not all). There was one logical change: moving the pre-build logic from mako_renderer.py to generated_projects.py, because the pre-build logic isn't really using "mako" and will only be executed once, so there is not much benefit to wrapping it in a process pool.

Also, there were 2 tiny fixes: jobset.py now prints str instead of bytes, so when run_tests.py fails the result is more readable; and the PyYAML installation finally moved into the sanity Docker container, sorry for delaying it for so long.

CMake

The updated ProtoBuf generation logic is:

  1. Create a cmake/build/protos directory (or /protos);
  2. Copy proto files to cmake/build/protos;
  3. Generate C++ files via protoc as needed.

Notably, in step 2, the copy is done with file(COPY instead of add_custom_command, so the proto files are always copied regardless of whether they are imported. This is not as elegant as a per-file dependency, because updating the proto files could trigger a large-scale rebuild. On the other hand, the per-file dependency is not hard to implement (as follows). However, since we removed the dependency information and I have yet to grasp the essence of the build-order sort code, the PR currently uses the brute-force-copy solution. (@jtattermusch Please help)

    add_custom_command(
        OUTPUT  <%text>${_gRPC_PROTO_SRCS_DIR}/${REL_DIR}</%text>
        COMMAND <%text>${CMAKE_COMMAND}</%text>
        ARGS    -E make_directory
                <%text>${_gRPC_PROTO_SRCS_DIR}/${REL_DIR}</%text>
        WORKING_DIRECTORY <%text>${CMAKE_CURRENT_SOURCE_DIR}</%text>
        VERBATIM)
    add_custom_command(
        OUTPUT  <%text>${_gRPC_PROTO_SRCS_DIR}/${IMPORT_PATH}</%text>
        COMMAND <%text>${CMAKE_COMMAND}</%text>
        ARGS    -E copy
                <%text>${CMAKE_CURRENT_SOURCE_DIR}/${FILE_LOCATION}</%text>
                <%text>${_gRPC_PROTO_SRCS_DIR}/${IMPORT_PATH}</%text>
        DEPENDS <%text>${CMAKE_CURRENT_SOURCE_DIR}/${FILE_LOCATION}</%text>
        WORKING_DIRECTORY <%text>${CMAKE_CURRENT_SOURCE_DIR}</%text>
        VERBATIM)
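For contrast, the brute-force approach the PR currently uses boils down to a configure-time copy of whole proto trees. This is an illustrative sketch; the paths and variable names are assumptions, not the template's exact output:

```cmake
# Configure-time, unconditional copy: simple, but editing any proto can
# trigger a broad rebuild because no per-file dependency edges exist.
file(COPY ${CMAKE_CURRENT_SOURCE_DIR}/third_party/envoy-api/envoy
     DESTINATION ${_gRPC_PROTO_SRCS_DIR})
```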

xDS End2End Test

I made our C++ xDS end2end test the final examination of this PR. I replaced all the xDS proto dependencies from our copied version under src/proto/grpc/testing/xds with their real, complete versions. Since the protos were manually moved and trimmed, there were some inaccuracies in message naming.

Old Compilers

The current ProtoBuf library (v3.14.0) has a bug that prevents the generated C++ code from building on gcc 4.9, gcc 5.3, clang 3.6, and clang 3.7. I filed an issue with them: protocolbuffers/protobuf#8240, but have not yet received a response. By upgrading to their head, we regain compatibility with gcc 4.9 and gcc 5.3, but ProtoBuf is still incompatible with clang 3.6 and clang 3.7...


This change is Reviewable

@lidizheng
Contributor Author

@gnossen PTAL at the Bazel changes and the Python script updates.
@veblush PTAL at the CMake related changes and if I made some errors in "buildgen" scripts.

@gnossen
Contributor

gnossen commented Feb 4, 2021

Current version of re2 failed ASAN (see log). So, this PR disabled ODR check in ASAN (feel free to suggest other ways).

This isn't a workable solution. This will lead to nondeterminism and segfaults.

Contributor Author

@lidizheng lidizheng left a comment

Thanks for the quick review, and for catching the two re2 bugs!

Contributor

@gnossen gnossen left a comment

LGTM on the Bazel changes. I can't speak for the CMake changes though.

Contributor

@jtattermusch jtattermusch left a comment

There's some controversy here :-). Two main points:

  • let's split the PR into refactoring and cleanup
  • come up with a cleaner and more readable way to update tools/buildgen/extract_metadata_from_bazel_xml.py logic.

Contributor Author

@lidizheng lidizheng left a comment

Thanks for the detailed review! PTALA @jtattermusch

@lidizheng
Contributor Author

@jtattermusch PTALA. The changes have been rebased into this PR.

@lidizheng lidizheng requested a review from jtattermusch March 18, 2021 19:29
Contributor

@jtattermusch jtattermusch left a comment

Thanks for the rebase.
I reviewed and I think the overall approach looks ok, but some cleanup is still needed.
I'm particularly hesitant about the introduction of the "external_library" element in build.yaml (since it goes against our past effort to remove stuff from build.yaml).

Other than that, I left a bunch of comments.


@rem Just before installing gRPC, wipe out contents of all the submodules to simulate
@rem a standalone build from an archive
git submodule deinit --all --force
Contributor

nit: why this change?

Contributor Author

deinit leaves an empty folder, like third_party/udpa. But the submodule polling mechanism only works if the submodule's folder doesn't exist. Alternatively, we could validate the integrity of each submodule, but that would be much more complex, since the source archive contains no Git information.

com_envoyproxy_protoc_gen_validate='third_party/protoc-gen-validate/',
opencensus_proto='third_party/opencensus-proto/src/')
# The external repositories that we are currently able to support the metadata
# extraction. We aim to support arbitrary external libraries in the long term.
Contributor

"We aim to support arbitrary external libraries in the long term" - i'm not convinced that is actually what we want.

Contributor Author

Removed.

# extraction. We aim to support arbitrary external libraries in the long term.
# This variable should be removed eventually.
RE_MATCH_SUPPORTED_EXTERNAL_LIBS = re.compile(
f'^@({"|".join(EXTERNAL_LIB_LABEL_PATH_MAP.keys())}).*$')
Contributor

I think constructing a huge regex from EXTERNAL_LIB_LABEL_PATH_MAP is actually quite fragile and hard to understand, and it doesn't really buy you much (I'm pretty sure the speed gain from this is negligible since the number of source files we have is relatively small).
The keys are simply prefixes such as "@envoy_api", so you can match them with a simple for loop without needing complicated regexes constructed from dictionaries, and the result is going to be more readable, with no noticeable speed impact.
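A sketch of the suggested prefix loop. The map here is abbreviated and the paths are illustrative, not the script's full contents:

```python
# Keys are Bazel repository names; values are checkout paths (abbreviated).
EXTERNAL_LIB_LABEL_PATH_MAP = {
    "envoy_api": "third_party/envoy-api/",
    "com_google_googleapis": "third_party/googleapis/",
}

def is_supported_external_label(label):
    """Plain prefix check, replacing the regex built from dictionary keys."""
    return any(
        label.startswith("@" + repo_name)
        for repo_name in EXTERNAL_LIB_LABEL_PATH_MAP
    )
```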

Contributor Author

Done. Makes sense.

xml_tree = _fetch_raw_http_archives()
libraries = _parse_http_archives(xml_tree)
libraries.sort(key=lambda x: x.destination)
print(yaml.dump(dict(external_libraries=list(map(asdict, libraries)))))
Contributor

also, you're ambitiously calling this the new entity external_library but it's unclear what is the future plan with it.
Is it intended to be used only for importing .proto files (in which case it should be e.g. external_proto_library), or are you somehow planning to use it for other libraries in the future as well? If so, since there's no detailed plan for that, it's unclear to me whether it would actually work and whether it would be useful (more research would be needed).

Contributor Author

I'm keeping it as external_library since we might move upb/xxhash into this list. WDYT?


% for external_library in external_libraries:
# Setup external proto library at ${external_library.destination}
if (NOT EXISTS <%text>${CMAKE_CURRENT_SOURCE_DIR}</%text>/${external_library.destination})
Contributor

+1 for checking the submodule is already there and skipping the download if so. Perhaps we should add a comment that explains why exactly you're doing this?

Contributor Author

Adding several comments, describing how the if condition works and what the download function is doing.

# ``.proto`` files
# ``FILE_LOCATION``
# The relative path of the ``.proto`` file to the project root
# ``IMPORT_PATH``
Contributor

nit: you're calling this arg "import_path" which would suggest that it's a directory name, but from the generated files I can see it's the full file name (not just the path).

Contributor Author

It's the ProtoBuf import path, where the source file itself thinks it belongs. Feel free to suggest alternatives.

'opencensus_proto',
'com_envoyproxy_protoc_gen_validate',
'com_google_googleapis',
'com_github_cncf_udpa',
Contributor

why is this under BAZEL_ONLY_DEPS? it seems there's a third_party/udpa submodule so it should be possible to check that the bazel dep and the submodule are at the same commit. Same for other "external proto" libraries.

Contributor Author

Removed. TIL this is an exemption list; I thought it was adding extra checks.

Contributor Author

@lidizheng lidizheng left a comment

Sorry for dragging this out for so long. For the last 2 weeks I was trying to get fault injection/xDS config dump/admin interface done before the release cut.

I understand your concern about adding the "external_library" element to "build.yaml", but actually, with the previous pattern, "external_library" lived in a transient "build.yaml" placed in "/tmp". By moving the logic into "extract_metadata_from_bazel_xml.py", we are now adding the new field to build_autogenerated.yaml instead.

Feel free to suggest alternatives!

On a higher level, this PR will face a tough issue during import to google3 (same as #25762). None of the Envoy source code is accepted as a Core component, but gRPC is a Core component. I guess the eventual result is that we will hide the xDS protos in google3 the same as #25762. This requires further investigation.

BuildDict = Dict[str, BuildMetadata]
BuildYaml = Dict[str, Any]

import build_cleaner
Contributor Author

Done.


PROJECT_ROOT = os.path.join(os.path.dirname(os.path.abspath(__file__)), '..',
'..')
os.chdir(PROJECT_ROOT)
Contributor Author

Removed. With this line, I hoped this script could be invoked from anywhere. But that seems like a useless feature.


# ``FILE_LOCATION``
# The relative path of the ``.proto`` file to the project root
# ``IMPORT_PATH``
# The proto import path that itself expected to be placed in
Contributor Author

Updated to:

  #   ``IMPORT_PATH``
  #     The proto import path that itself expected to be placed in. For
  #     example, a "bar.proto" file wants to be imported as
  #     `import "foo/bar.proto"`. Then we should place it under
  #     "<ProtoBuf_Include_Path>/foo/bar.proto" instead of
  #     "<ProtoBuf_Include_Path>/third_party/foo/bar.proto". This ensures
  #     correct symbol being generated and C++ include path being correct.
  #     More info can be found at https://github.com/grpc/grpc/pull/25762.
  #

file(RELATIVE_PATH REL_FIL <%text>${_gRPC_PROTO_SRCS_DIR}</%text> <%text>${ABS_FIL}</%text>)
get_filename_component(REL_DIR <%text>${REL_FIL}</%text> DIRECTORY)
set(RELFIL_WE "<%text>${REL_DIR}/${FIL_WE}</%text>")
# copy the proto file to a centralized location
Contributor Author

This is due to the import path issue I mentioned in the PR description.

The copy happens during the configuration phase of the CMake build, and the generation happens during the actual build phase. This means that before the first protoc command is invoked, all protos will already be placed in the right location.
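In other words, the two phases could look roughly like this. This is a sketch with illustrative paths; the protoc variable name and output paths are assumptions, not the PR's exact template output:

```cmake
# Configure phase: stage the proto where its importers expect it.
file(COPY ${CMAKE_CURRENT_SOURCE_DIR}/third_party/lib_a/a.proto
     DESTINATION ${_gRPC_PROTO_SRCS_DIR})

# Build phase: protoc only runs after configuration, by which point the
# staging directory is already complete.
add_custom_command(
    OUTPUT ${_gRPC_PROTO_SRCS_DIR}/a.pb.cc ${_gRPC_PROTO_SRCS_DIR}/a.pb.h
    COMMAND ${_gRPC_PROTOBUF_PROTOC_EXECUTABLE}
    ARGS --cpp_out=${_gRPC_PROTO_SRCS_DIR} -I ${_gRPC_PROTO_SRCS_DIR} a.proto
    DEPENDS ${_gRPC_PROTO_SRCS_DIR}/a.proto
    WORKING_DIRECTORY ${_gRPC_PROTO_SRCS_DIR}
    VERBATIM)
```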

# list since we want to avoid building sources that are already
# built by our dependencies
exclude_deps.update(bazel_rules[dep]['_TRANSITIVE_DEPS'])
exclude_deps.add(dep)
Contributor Author

@lidizheng lidizheng Mar 30, 2021

Reverted to your version.

I don't think it's worth either of our time to prove it right or wrong. It's a trade-off between CPU cycles and cleanliness. I think both are correct as long as they generate correct results within acceptable time.

@lidizheng
Contributor Author

@jtattermusch The tests should be green, and it is in sync with master again. PTALA.

Contributor

@jtattermusch jtattermusch left a comment

LGTM, thanks for bearing with me and addressing all the comments I raised.

set(_download_archive_TEMPORARY_DIR ${CMAKE_BINARY_DIR}/http_archives)
file(MAKE_DIRECTORY ${_download_archive_TEMPORARY_DIR})

# This is bascially Bazel's http_archive.
Contributor

nit: bascially => basically

Contributor Author

Done. Good finding!

"mac",
"posix",
"windows"
"posix"
Contributor

nit: why this change? (also below in this file)

Contributor Author

This updates the admin_services_end2end_test, which uses Envoy protos on Windows, requiring us to add Golang to our existing Windows RBE environment. But I haven't had time to experiment with that yet.

src.replace(
f"@{external_library_name}//",
EXTERNAL_PROTO_LIBRARIES[external_library_name].
proto_prefix).replace(':', "/"))
Contributor

nit: inconsistent ' and " in replace(':', "/")

Contributor Author

Updated.

# If this http archive is not one of the external proto libraries,
# we don't want to include it as a CMake target
continue
for url in http_archive["urls"]:
Contributor

strictly speaking, duplicating the external_library entries when there are multiple URLs is not correct, but I can understand that it makes the logic in CMakeLists.txt simpler to implement (though it also makes it a bit harder to understand).

Since the sha256 and strip_prefix for the mirrored urls always need to be the same, I think it would be better for the external_library entity in build_autogenerated.yaml to allow the URL to be a list of URLs (this is not a problem at all for the yaml file). That way, the "metadata" in build_autogenerated.yaml would be more correct (because you'd keep the clear mapping between http_archives and "external_library" entries). The duplicated download_archive rules can then trivially be generated in CMakeLists.txt.template (and at that point it would also be easier to understand the logic behind it - you could even add an explanatory comment for cases where "urls" contains multiple entries).
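Concretely, the suggested shape of the entry might look like this. This is a hypothetical build_autogenerated.yaml fragment; the field names, `<commit>`, and `<sha256>` values are placeholders:

```yaml
external_libraries:
- destination: third_party/envoy-api
  strip_prefix: data-plane-api-<commit>   # placeholder
  hash: <sha256>                          # placeholder
  urls:                                   # mirror first, upstream second
  - https://storage.googleapis.com/grpc-bazel-mirror/github.com/envoyproxy/data-plane-api/archive/<commit>.tar.gz
  - https://github.com/envoyproxy/data-plane-api/archive/<commit>.tar.gz
```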

Contributor Author

Done. Updated external_library to contain multiple URLs, and added comments where there are multiple download URLs for one library.

Contributor Author

@lidizheng lidizheng left a comment

Thanks for the review! All comments addressed, PTALA. @jtattermusch


Contributor

@jtattermusch jtattermusch left a comment

Still LGTM. Thanks!

* Works for both CMake and Bazel
* Instrument xds_end2end_test with protos from envoy-api
* Fixed the Bazel repo naming with re2
* Refactor the build metadata compute algorithm
    * Now, in one pass, it generates all the build metadata needed within
    this script, including the transitive deps and the sources/headers
    expanded without intermediate dependencies.
* Temporarily remove the removal of third_party dependencies
* Spin-off changes to sub PRs
* Add a fetching mechanism for external proto libs
* Accommodate Windows' CMake behavior difference
* Allow CMakefile to fetch external library
    * The URL is parsed from grpc_deps
    * Allow multiple download URLs for extra stability
@stale

stale bot commented Jul 13, 2021

This issue/PR has been automatically marked as stale because it has not had any update (including commits, comments, labels, milestones, etc) for 30 days. It will be closed automatically if no further update occurs in 7 days. Thank you for your contributions!

@lidizheng
Contributor Author

Still working on the internal parts of this PR, moving xDS protos into Core components.

@lidizheng
Contributor Author

It's been a year; the internal component change is about to complete.

@lidizheng
Contributor Author

We still want to remove the copied proto files. I have a rebased, all-in-one version in #29214. Learning from this PR, I hope this time I can make the review less painful, so all the code changes are split into:

  1. [xDS Proto] Update Bazel build dependencies for Envoy API #29219
  2. [xDS Proto] Enhence gRPC buildgen for 3rd party proto compilation #29220
  3. [xDS Proto] Update code to use Envoy protos instead of copy #29221
  4. [xDS Proto] rm -rf src/proto/grpc/testing/xds #29222

@markdroth
Member

I think that almost all of the changes from this PR have been merged in other ways at this point, and I've just merged #37863 as a proof-of-concept that this works, so we no longer need this PR.

@markdroth markdroth closed this Oct 11, 2024