
workflow

Here are 3,307 public repositories matching this topic...

rpauli
rpauli commented Jan 24, 2022

Description

I am using Airflow to move data periodically to our data lake and noticed that the MySQLToS3Operator has templated fields while the DynamoDBToS3Operator doesn't. I found a semi-awkward workaround, but thought templated fields would be nice.
I suppose an implementation could be as simple as adding

    template_fields = (
        's3_bucket',
        's3_key',
    )

to the operator.
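To illustrate what templating on those fields buys you, here is a minimal sketch in plain Python (no Airflow import): the names listed in `template_fields` are rendered against the task context before execution. The class and render helper below are hypothetical stand-ins that mimic the mechanism with `string.Template`, not Airflow's actual Jinja-based implementation.

```python
# Hypothetical stand-in for an operator with templated fields.
from string import Template


class DynamoDBToS3OperatorSketch:
    # Airflow scans this tuple and renders each named attribute before
    # execute(); here we mimic that with string.Template for illustration.
    template_fields = ("s3_bucket", "s3_key")

    def __init__(self, s3_bucket, s3_key):
        self.s3_bucket = s3_bucket
        self.s3_key = s3_key

    def render_template_fields(self, context):
        # Substitute context values into every declared template field.
        for name in self.template_fields:
            raw = getattr(self, name)
            setattr(self, name, Template(raw).substitute(context))


op = DynamoDBToS3OperatorSketch(
    s3_bucket="my-datalake",
    s3_key="exports/$ds/dump.json",  # $ds stands in for Airflow's {{ ds }}
)
op.render_template_fields({"ds": "2022-01-24"})
print(op.s3_key)  # exports/2022-01-24/dump.json
```

With real Airflow, declaring the tuple on the operator class is all that is needed; the scheduler renders the fields from the DAG run context automatically.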

kvnkho
kvnkho commented Dec 15, 2021

Current behavior

You get an error if you try to upload a file under a name that already exists:

    azure.core.exceptions.ResourceExistsError: The specified blob already exists.
    RequestId:5bef0cf1-b01e-002e-6

Proposed behavior

The task should take in an overwrite argument and pass it to [this line](https://github.com/PrefectHQ/prefect/blob/6cd24b023411980842fa77e6c0ca2ced47eeb83e/src/prefect/

sryza
sryza commented Jan 18, 2022

https://dagster.slack.com/archives/C01U954MEER/p1642163479382400:

Hi, I'm experiencing a bug when trying to write an ALS model from pyspark.ml.recommendation to S3 and read it back in when this takes place within a dynamically executed graph (i.e. via dynamic mapping). I wrote a custom IO manager using the pattern f's3a://{self.s3_bucket}/{key}' as _uri_for_key, similar to the one currently imp
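The URI pattern described in the report can be sketched as follows, without pyspark or dagster imports. The IO-manager-like class and its constructor are hypothetical; only the f"s3a://{bucket}/{key}" scheme mirrors what the report quotes.

```python
# Hypothetical sketch of the custom IO manager's URI construction.
class SparkModelIOManagerSketch:
    def __init__(self, s3_bucket):
        self.s3_bucket = s3_bucket

    def _uri_for_key(self, key):
        # Under dynamic mapping, `key` must stay unique per mapped output,
        # otherwise writes and reads from different branches collide at
        # the same s3a:// URI.
        return f"s3a://{self.s3_bucket}/{key}"


mgr = SparkModelIOManagerSketch("my-bucket")
print(mgr._uri_for_key("als_model/partition_3"))
# s3a://my-bucket/als_model/partition_3
```

The s3a:// scheme is what Spark's Hadoop S3A connector expects, which is why model.save()/load() round-trips through such a URI rather than a plain s3:// path.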

