I have a repository_rule that checks whether the local system has a database running and fully migrated.
# impl: query the Flyway migration history and record the result
result = ctx.execute([
    "mysql", "-u", "root", "--protocol", "tcp",
    "-e", "select * from %s.flyway_schema_history" % ctx.attr.dbname,
])
ctx.file(
    "local_checksum",
    """
{RETURN_CODE}
{STDERR}
{STDOUT}
""".format(
        RETURN_CODE = result.return_code,
        STDERR = result.stderr,
        STDOUT = result.stdout,
    ),
)
...
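For context, here is a minimal sketch of how that fragment might sit inside the full implementation function; the dbname fallback and the BUILD file export are assumptions I've added so the repository's output is referenceable, not part of the original:

def _local_database(ctx):
    # Assumption: fall back to the repository name when dbname is omitted
    # (matching the attr doc below).
    dbname = ctx.attr.dbname or ctx.name
    result = ctx.execute([
        "mysql", "-u", "root", "--protocol", "tcp",
        "-e", "select * from %s.flyway_schema_history" % dbname,
    ])
    ctx.file(
        "local_checksum",
        "{}\n{}\n{}\n".format(result.return_code, result.stderr, result.stdout),
    )
    # Assumption: export the checksum so other targets can depend on it.
    ctx.file("BUILD", 'exports_files(["local_checksum"])\n')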
# Rule Def
local_database = repository_rule(
    implementation = _local_database,
    local = True,
    configure = True,
    attrs = {
        "datasource_configuration": attr.label(providers = [DataSourceConnectionInfo]),
        "dbname": attr.string(doc = """
If omitted, will be the name of the repository.
"""),
        "migrations": attr.label_list(allow_files = True),
    },
)
The local_checksum file is recalculated, and does its job, whenever the dependency graph changes (as stated in the docs: https://docs.bazel.build/versions/master/skylark/repository_rules.html#when-is-the-implementation-function-executed).
But since the database is not managed by Bazel, is there any way to force this specific rule to run every time Bazel is invoked, to ensure all dependencies are available?
After some sleep I cobbled something together. I am still looking for a better answer; I would think there is a first-class way to solve this.
I created a Bazel wrapper at tools/bazel:
#!/bin/bash
set -e
echo "`date`
Generated by tools/bazel" > .bazelexec.stamp
# from https://github.com/grpc/grpc/blob/master/tools/bazel
exec -a "$0" "${BAZEL_REAL}" "$@"
and then I added to the rule an attribute for watching that file (BAZEL_REAL is set by wrapper-aware launchers such as Bazelisk when they find a tools/bazel script):
local_database = repository_rule(
    implementation = _local_database,
    local = True,
    configure = True,
    attrs = {
        "datasource_configuration": attr.label(providers = [DataSourceConnectionInfo]),
        "dbname": attr.string(doc = """
If omitted, will be the name of the repository.
"""),
        "migrations": attr.label_list(allow_files = True),
        "recalculate_when": attr.label_list(allow_files = True, doc = """
Files to watch; when they change, this repository rule is re-run.
You can add a tools/bazel script to your workspace that writes a date to a file on
every Bazel invocation, so the migrator checks on each run whether someone changed
the database.
"""),
    },
)
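For illustration, the WORKSPACE wiring might look like the following; this is a sketch, and the load path, repository name, and labels are assumptions, not from the original setup:

# WORKSPACE (hypothetical wiring; all labels are illustrative)
load("//tools:local_database.bzl", "local_database")

local_database(
    name = "local_db",
    datasource_configuration = "//db:datasource",
    migrations = ["//db:V1__init.sql"],
    # The stamp file written by tools/bazel on every invocation.
    recalculate_when = ["//:.bazelexec.stamp"],
)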
and lastly, in the rule I resolve paths for those files, so the repository treats them as dependencies and re-runs when they change:
# If you don't do something with the file, the rule does not recalculate.
# ctx.path() resolves each label, which registers the file as a dependency
# of this repository.
[ctx.path(file) for file in ctx.attr.recalculate_when]

# Total mysql hack for now... need a java tool which dumps the content of a
# table for different databases.
result = ctx.execute(
    ["mysql", "-u", "root", "--protocol", "tcp", "-e", "show databases"],
)
ctx.file(
    "local_database_checksum",
    """
{RETURN_CODE}
{STDERR}
{STDOUT}
""".format(
        RETURN_CODE = result.return_code,
        STDERR = result.stderr,
        STDOUT = result.stdout,
    ),
)
Now, every time I run a build: if the databases change, the local_database_checksum file changes, which can trigger other rules to rebuild (in my case I am generating jOOQ classes, so my query reads the tables); if the databases and tables are stable, nothing is rebuilt.
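To make that concrete, a downstream target only needs the checksum among its inputs to be invalidated when the database changes. A minimal sketch, assuming the repository exports the file from a BUILD file as in the earlier sketch, and with a hypothetical generator tool:

# BUILD (consumer side; the jooq_generator target is hypothetical)
genrule(
    name = "generate_jooq",
    srcs = ["@local_db//:local_database_checksum"],
    outs = ["jooq_classes.srcjar"],
    # Listing the checksum as an input re-runs this action whenever the
    # database contents change.
    cmd = "$(location //tools:jooq_generator) --out $@",
    tools = ["//tools:jooq_generator"],
)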