# general
b
Hi, is it possible to provide `.proto` file dependencies to a `protobuf_sources` target? Basically, I want to compile some proto files, but not their import dependencies, which live in a different directory.
Something like -
protobuf_sources(
    name="test-protos",
    sources=["test/protos/v1/*.proto"],
    dependencies=[":proto-deps"]
)

file(
    name="proto-deps",
    source="test-dep/protos/v1/options.proto"
)
h
You can specify `dependencies=` on a `protobuf_sources` target to depend on other `protobuf_sources`. You don't need that intermediate `file` target.
Or am I misunderstanding what you're trying to do?
I think those deps are also inferred automatically anyway
b
Correct, those dependencies are inferred automatically. But I don't want Pants to generate code for the dependencies (the protos in `import` statements).
A protoc equivalent would be
protoc -I={dep.proto,src.proto} --python_out=. src.proto
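A note on the flags there: `-I`/`--proto_path` takes include *directories*, not files, and protoc generates code only for the `.proto` files passed as positional arguments. A corrected sketch, assuming the directory layout from the earlier snippet:

```shell
# test-dep/protos is include-only: imports inside src.proto resolve
# against it, but no code is generated for options.proto because it is
# not listed as a positional argument.
protoc \
  -I=test/protos \
  -I=test-dep/protos \
  --python_out=generated \
  test/protos/v1/src.proto
```

That include-vs-compile split is exactly the distinction the feature request would need to expose.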
e
It looks like you're out of luck. Unfortunately, we have no shared code here, but I checked the four protobuf generators (Go, Scala, Java & Python) and, if I read correctly, none of them passes include args, nor can be made to via options. This seems reasonable to support one way or another, so you may want to go ahead and file a feature request issue.
b
Thanks John. I'll file a request
Related, is it possible to use `http_source` with `protobuf_sources`? That's really the end goal here. The source protos import protos which are published separately. It would be nice to provide a zip file as the `http_source` and include it in the proto path.
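As an aside, the mechanics behind that end goal are small. Here is a minimal, Pants-free sketch (all paths and names hypothetical) of taking a published archive's bytes and extracting only the `.proto` files into an include-only directory:

```python
import io
import tempfile
import zipfile
from pathlib import Path

def extract_protos(archive_bytes: bytes, dest: Path) -> list[str]:
    """Extract only the .proto files from a zip archive into dest.

    These files are meant to be importable (on the proto path) but never
    code-generated themselves.
    """
    dest.mkdir(parents=True, exist_ok=True)
    extracted = []
    with zipfile.ZipFile(io.BytesIO(archive_bytes)) as zf:
        for name in zf.namelist():
            if name.endswith(".proto"):
                zf.extract(name, dest)
                extracted.append(name)
    return sorted(extracted)

# Stand-in for the published archive that http_source would fetch:
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("common/protos/v1/options.proto", 'syntax = "proto3";\n')
    zf.writestr("README.md", "not a proto")

with tempfile.TemporaryDirectory() as tmp:
    print(extract_protos(buf.getvalue(), Path(tmp)))
# -> ['common/protos/v1/options.proto']
```

In a real plugin the bytes would come from the `http_source` download, and the extracted files would land in a digest rather than on local disk.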
e
I'm not sure, but I'd keep the feature request focused: some way to split the distinction between include vs. compile. You can add as an aside that that's the end goal, though, and ask this same question there. Hopefully that's enough to prompt whoever takes this up to consider it and either include the feature or design with it in mind, asking for the `http_source` support to be broken out as a separate feature request.
h
I'm not sure passing `-I` is a solution here, unless you're willing to construct that flag value manually. So I understand the problem better: what is the downside of generating the dependencies?
b
We have common protos that are packaged and distributed as zip files. Moreover, they are compiled to a language and the resulting artifact is also distributed (jar, wheel, etc.). Now, a Pants repo can have protos that import these common protos, and also take a dependency on the precompiled common-protos Python package. The repo has to provide the common protos in `protobuf_sources` to compile the local protos, which means the common protos get compiled as well. And since we already have an import dependency on the precompiled Python package, the gRPC runtime will complain about a conflict. This may be an edge case, but I think it is a valid one.
👋 Got some time to get back to this. I realize I may need a custom plugin to accomplish what I want. I would appreciate any help in pointing me in the correct direction.
I'm thinking something like this -
protobuf_remote_dependencies(
    name="remote-deps",
    remote_packages=["some-archive", "some-other-archive"]
)

protobuf_sources(
    sources=["**/*.proto"],
    dependencies=[":remote-deps"]
)
So when I run `test` or `export-codegen` on the protobuf target, the engine recognizes the `remote-deps` dependency and executes my plugin. The plugin should generate `.proto` files and make them available as dependencies to the `protobuf_sources` target.
I'm thinking a codegen plugin that goes from `SomeSourceField` -> `ProtobufSourceField`?
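For what it's worth, that direction matches how Pants models codegen: you subclass `GenerateSourcesRequest` with `input`/`output` field types and register a rule that returns `GeneratedSources`. A Python-style pseudocode sketch of the shape (the class and field names are hypothetical, the rule body is elided, and this is untested against any particular Pants version):

```
# register.py (hypothetical plugin module)
from pants.backend.codegen.protobuf.target_types import ProtobufSourceField
from pants.engine.rules import collect_rules, rule
from pants.engine.target import GeneratedSources, GenerateSourcesRequest
from pants.engine.unions import UnionRule

class GenerateProtobufFromRemoteArchiveRequest(GenerateSourcesRequest):
    input = RemoteProtobufArchiveField  # hypothetical custom source field
    output = ProtobufSourceField

@rule
async def fetch_and_unpack_protos(
    request: GenerateProtobufFromRemoteArchiveRequest,
) -> GeneratedSources:
    # 1. download the archives named by the field
    # 2. extract the .proto files into a digest
    # 3. return them as GeneratedSources so that dependents
    #    see ordinary ProtobufSourceField sources
    ...

def rules():
    return [
        *collect_rules(),
        UnionRule(GenerateSourcesRequest, GenerateProtobufFromRemoteArchiveRequest),
    ]
```

With that union rule registered, a `protobuf_sources` target that depends on the custom target should pick up the generated `.proto` files automatically.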