# plugins
a
So I'm going to start writing a (set of?) plugins that might seem a little out of order, and I'm not sure where to start. Basically, I've got a set of things that need to happen:
1. Do some 'codegen' to inline Python code
2. Do some more codegen to generate YAML files based on some targets/fields in the BUILD files & environment vars
3. Do some more codegen to create another YAML file referencing the other YAML files
4. Deploy this to 'the cloud' using a binary blob
5. Run a subset of tests against this code in 'the cloud'
6. Also have the ability to deploy this manually
Of course, I hope to leverage the whole Pants 'don't repeat work' feature to skip steps #1 to #4 if the code doesn't change. I think I've got a grasp on doing the codegen, but I can't picture how to make deploying the code work as a dependency of the tests. And it's been nearly six months since I last used Pants.
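Roughly, I imagine the BUILD plumbing for that pipeline looking something like the sketch below. To be clear, every target type and field name here is invented for illustration; none of these are real Pants builtins, and each would have to be defined by one of the plugins:
```python
# Hypothetical BUILD file sketching the pipeline above.
python_sources(name="lib")

# Step 1: codegen that inlines the Python sources into a single file.
inlined_python(
    name="inlined",
    dependencies=[":lib"],
)

# Step 2: a resource.yaml generated from target fields plus environment vars.
rtime_resource(
    name="resource",
    smart_id="my-contract",
    dependencies=[":inlined"],
)

# Step 3: a manifest.yaml referencing one or more resource.yaml files.
rtime_manifest(
    name="manifest",
    dependencies=[":resource"],
)
```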
h
šŸ‘‹
Can you explain step 1 a little further? What is the input and what is the output?
Actually never mind, if you've got a grasp on the codegen already
So I guess, can you explain steps 4 and 5 in more detail?
a
been poking around and I've got no grasp on anything šŸ˜†
So, basically one of our runtimes has a few restrictions:
• No imports
• One file
• Magical globals
But we need to share parts of the code across different systems, so we need to work around this. We're going to do that with code-inlining. So, to upload the code to the system we need to:
1. Inline the various Python sources into one file.
2. Modify a `resource.yaml` file that has an ID for the file, a pointer to the filename, and some other unique stuff.
3. Create a `manifest.yaml` file that references things inside multiple `resource.yaml` files around the place (how this works I have no idea).
4. Use a provided 3rd-party binary blob that can be run on the command line to upload the code: `./thing upload path.to.manifest.yaml`
Then we want to test the stuff we uploaded using `pytest` and `behave`.
Trying to whip up a POC tonight to take to the decision makers to prove some of this can be done. So I've started writing a Pants rule to read the `resource.yaml` on disk, modify it as necessary, and then spit out a new one.
Which I guess should be simple? The `GenerateSourcesRequest` gives me a `Snapshot(... files=(src/python/foo/resource.yaml)...)`; I just need to read that, and then dump the result into a `FileContent`, yeah?
šŸ’„ easy
```python
import yaml
from dataclasses import dataclass

from pants.engine.rules import collect_rules, rule, Get, MultiGet
from pants.engine.unions import UnionRule
from pants.engine.target import GenerateSourcesRequest, GeneratedSources
from pants.engine.fs import FileContent, Snapshot, Digest, DigestContents, CreateDigest
from pants.engine.environment import EnvironmentRequest, Environment

from .target_types import SmartId, SmartContractSrc, RTIMEResourceSrc

@dataclass(frozen=True)
class RTIMEConfigRequest:
    pass


@dataclass(frozen=True)
class RTIMEConfig:
    resource_prefix: str


class GenerateYamlFromResourceYamlRequest(GenerateSourcesRequest):
    input = SmartContractSrc
    output = RTIMEResourceSrc


@rule
async def get_tmv_resource_prefix(request: RTIMEConfigRequest) -> RTIMEConfig:
    # Ask the engine for just the env var we need, not the full environment.
    req = EnvironmentRequest(["RTIME_RESOURCE_PREFIX"])
    env = await Get(Environment, EnvironmentRequest, req)

    resource_prefix = env.get("RTIME_RESOURCE_PREFIX")
    if not resource_prefix:
        raise ValueError("RTIME_RESOURCE_PREFIX must be set in the environment")

    return RTIMEConfig(resource_prefix=resource_prefix)


@rule
async def generate_resource_yaml(request: GenerateYamlFromResourceYamlRequest) -> GeneratedSources:
    # Fetch the config and the input file's contents concurrently.
    rtime_config, digest_contents = await MultiGet(
        Get(RTIMEConfig, RTIMEConfigRequest()),
        Get(DigestContents, Digest, request.protocol_sources.digest),
    )
    src_file = digest_contents[0]

    sc_id = request.protocol_target[SmartId]

    resource = {
        "type": "SMART__VERSION",
        "id": f"{rtime_config.resource_prefix}{sc_id.value}",
        "payload": src_file.content.decode('ascii')
    }

    content = FileContent(src_file.path, yaml.dump(resource, indent=2).encode())

    # Materialize the new file as a Digest, then a Snapshot, for the engine.
    digest = await Get(Digest, CreateDigest([content]))
    snapshot = await Get(Snapshot, Digest, digest)

    return GeneratedSources(snapshot)


def rules():
    return [
        *collect_rules(),
        UnionRule(GenerateSourcesRequest, GenerateYamlFromResourceYamlRequest),
    ]
```
I forgot how easy this stuff is once you've found the right `Get` call in the codebase.
Finding that call is another matter. But I've written a (very) primitive inliner and YAML generator in under 100 lines between them.
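For the curious, a naive inliner really can be tiny. Here's a minimal sketch of the idea, assuming intra-project imports can simply be stripped and the module bodies concatenated; the `foo` package name is made up, and a real inliner would also have to worry about module ordering and name clashes:
```python
import re

# Matches imports of our own (hypothetical) 'foo' package, which must be
# stripped because the target runtime forbids imports entirely.
_INTERNAL_IMPORT = re.compile(r"^\s*(from\s+foo[\w.]*\s+import\s+|import\s+foo[\w.]*)")

def inline(module_sources: list[str]) -> str:
    """Concatenate module sources into one file, dropping internal imports."""
    chunks = []
    for src in module_sources:
        kept = [line for line in src.splitlines() if not _INTERNAL_IMPORT.match(line)]
        chunks.append("\n".join(kept))
    return "\n\n".join(chunks)
```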
h
Oh wow, this is a pretty unique use case, fascinating!
Which was the right `Get` call, and what would have helped find it more easily? We'd like to improve the docs for all this.
āž• 1
a
I think, to put it bluntly, all of it could be improved. Once I worked out what data I needed, it was kinda easy to use GitHub to find out what Request I needed to make to get it. Like working out that `DigestContents` would give me the file contents. Then working out that `request` had a `Digest` tucked away in it to get what I wanted. Or `EnvironmentRequest` to get a subset of the environment, which I only knew about from my previous forays into plugin development.
If it were me, I'd probably consider making a custom documentation browser where you can browse the available request/response dataclasses for rules and read what they do (e.g. `Environment`), and have it output a list of valid uses. E.g. you can get one with an `EnvironmentRequest` or `CompleteEnvironmentRequest`, and it's used as an argument to a `Process`.
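In other words, the docs entry for `Environment` would ideally point you straight at a pattern like this (a sketch; the rule body and env var name are invented for illustration):
```python
from pants.engine.environment import Environment, EnvironmentRequest
from pants.engine.process import Process, ProcessResult
from pants.engine.rules import Get, rule


@rule
async def demo_env_to_process() -> ProcessResult:
    # Get one with an EnvironmentRequest...
    env = await Get(Environment, EnvironmentRequest(["RTIME_RESOURCE_PREFIX"]))
    # ...and use it as an argument to a Process.
    return await Get(
        ProcessResult,
        Process(
            argv=("/bin/env",),
            env=env,
            description="Demonstrate threading env vars into a Process",
        ),
    )
```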
āž• 1
h
A good first step (from Andreas) that we haven't advertised enough yet is `./pants help DigestContents`, for example; it will show you all the rules that return it, etc.
šŸ’” 1
a
And I think, as a 'plugin developer', there's a bit of a curve to 'get it', but once you 'get' it, you get it, and there's this feeling of "oh shit, I can do... everything?" And then it's just a case of working out the steps to get there.
āž• 1
šŸ’Æ 1
(Mostly, at least.) I still need to make my `test` targets, which depend on my `publish` targets, deploy the deployable before running the tests. That part seems alien to me.
Although I managed to work out how to make `package` run `package` six months ago, so maybe it's not so alien.
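For what it's worth, here's a minimal sketch of how the deploy step itself might be modeled so the engine can skip it when nothing changed. Everything here is an assumption: the `DeployRequest`/`DeployResult` types, the sandbox-relative paths, and the idea that the `./thing` binary ships inside the input digest. Wiring this in as a dependency of the tests is the part it doesn't solve:
```python
from dataclasses import dataclass

from pants.engine.fs import Digest
from pants.engine.process import Process, ProcessResult
from pants.engine.rules import Get, rule


@dataclass(frozen=True)
class DeployRequest:
    # Digest containing manifest.yaml, the resource.yamls, and the uploader binary.
    input_digest: Digest
    manifest_path: str


@dataclass(frozen=True)
class DeployResult:
    stdout: str


@rule
async def deploy_manifest(request: DeployRequest) -> DeployResult:
    # Because the inputs are a Digest, an unchanged manifest should hit the
    # cache and the upload gets skipped. (A real deploy with side effects may
    # instead want an uncacheable process.)
    result = await Get(
        ProcessResult,
        Process(
            argv=("./thing", "upload", request.manifest_path),
            input_digest=request.input_digest,
            description=f"Upload {request.manifest_path}",
        ),
    )
    return DeployResult(stdout=result.stdout.decode())
```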