Composer automations test#102

Open
larsespenssb wants to merge 2 commits into main from composer-automations-test

Conversation

@larsespenssb
Contributor

No description provided.


Copilot AI left a comment


Pull request overview

Adds a minimal Cloud Composer / Apache Airflow “smoke test” setup to the repo: a simple DAG plus a helper script to upload the DAG file to a Composer GCS bucket.

Changes:

  • Introduces an example Airflow DAG (god_dag) that prints a greeting on an hourly schedule.
  • Adds a small GCS upload utility to push the DAG file into the Composer bucket’s dags/ path.
  • Adds apache-airflow to project dependencies.

Reviewed changes

Copilot reviewed 3 out of 4 changed files in this pull request and generated 3 comments.

Reviewed files:

  • src/cloud_composer/upload_dag.py: Adds a script/function to upload a local DAG file to a Composer GCS bucket.
  • src/cloud_composer/god_dag.py: Adds a demo DAG definition intended to run hourly in Airflow/Composer.
  • pyproject.toml: Adds the apache-airflow dependency required to author the DAG.


In src/cloud_composer/god_dag.py:

greet()

god_dag()

Copilot AI Apr 16, 2026


The module calls god_dag() but does not keep a reference to the returned DAG object. Airflow DAG discovery typically relies on a DAG instance being present in module globals; without assignment, the DAG may not be registered/loaded by the scheduler.

Suggested change
god_dag()
god_dag_instance = god_dag()
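For context on this review comment: Airflow's classic DagBag loader scans a parsed module's global namespace for DAG instances, so a DAG object that is created but never bound to a name can be invisible to the scheduler. The snippet below is a minimal, Airflow-free simulation of that discovery rule; the `DAG` class, `god_dag` function, and `module_globals` dict are illustrative stand-ins, not the PR's actual code.

```python
# Simulate Airflow's DagBag discovery: scan a module's globals for DAG objects.
class DAG:
    def __init__(self, dag_id: str):
        self.dag_id = dag_id

def god_dag() -> DAG:
    return DAG("god_dag")

module_globals = {}

# Return value discarded: nothing lands in the module namespace,
# so a globals scan finds no DAG.
god_dag()
found = [v for v in module_globals.values() if isinstance(v, DAG)]
assert found == []

# With an assignment, the instance is reachable from module globals
# and a DagBag-style scan picks it up.
module_globals["god_dag_instance"] = god_dag()
found = [v for v in module_globals.values() if isinstance(v, DAG)]
assert len(found) == 1 and found[0].dag_id == "god_dag"
```

Note that newer Airflow releases can auto-register DAGs created via the `@dag` decorator or a `with DAG(...)` context manager, which is why the reviewer hedges with "typically"; the explicit assignment is the portable fix.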

@@ -0,0 +1,19 @@
from pathlib import Path

from google.cloud import storage

Copilot AI Apr 16, 2026


This module directly imports and uses google.cloud.storage, but google-cloud-storage is not declared as a direct dependency in pyproject.toml (it only appears to be present transitively in the lockfile). Please add google-cloud-storage explicitly to avoid future breakage if transitive dependencies change.
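A sketch of the suggested dependency declaration, assuming a PEP 621-style `[project]` table in pyproject.toml; the exact table layout in this repo is not shown in the PR, so treat this as illustrative:

```toml
[project]
dependencies = [
    "apache-airflow",
    # Direct dependency: imported by src/cloud_composer/upload_dag.py
    "google-cloud-storage",
]
```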

Comment on lines +10 to +15 of src/cloud_composer/upload_dag.py
def upload_file(bucket_name: str, source_file_path: Path, destination_blob_name: str) -> None:
storage_client = storage.Client()
bucket = storage_client.bucket(bucket_name)
blob = bucket.blob(destination_blob_name)
blob.upload_from_filename(str(source_file_path))
print(f"File {source_file_path} uploaded to {destination_blob_name}.")

Copilot AI Apr 16, 2026


New upload behavior is introduced via upload_file(...), but there are no tests covering its expected interactions (e.g., that it calls the GCS client with the right bucket/blob and uploads the intended local file). Consider adding a unit test that mocks storage.Client so this can be validated without hitting GCP.
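One way to sketch such a test with `unittest.mock`, assuming `upload_file` has the signature shown in the diff above. The function is inlined here and the `storage` module is a `MagicMock` stand-in for `google.cloud.storage`, so the example runs without GCP credentials or the google-cloud-storage package installed; in the real test you would patch `upload_dag.storage` instead.

```python
from pathlib import Path
from unittest import mock

# Stand-in for `from google.cloud import storage`, so no GCP call is made.
storage = mock.MagicMock()

def upload_file(bucket_name: str, source_file_path: Path, destination_blob_name: str) -> None:
    # Same call sequence as the function under review.
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(str(source_file_path))

def test_upload_file_uses_expected_bucket_and_blob() -> None:
    storage.reset_mock()
    upload_file("my-composer-bucket", Path("god_dag.py"), "dags/god_dag.py")

    client = storage.Client.return_value
    # Right bucket, right blob name, right local file.
    client.bucket.assert_called_once_with("my-composer-bucket")
    client.bucket.return_value.blob.assert_called_once_with("dags/god_dag.py")
    blob = client.bucket.return_value.blob.return_value
    blob.upload_from_filename.assert_called_once_with("god_dag.py")

test_upload_file_uses_expected_bucket_and_blob()
```

The bucket name and paths are placeholders; the point is that the mock records the bucket/blob/upload call chain, which the assertions then verify.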


3 participants