# apache-spark


Apache Spark is an open-source, distributed, general-purpose cluster-computing framework. It provides an interface for programming entire clusters with implicit data parallelism and fault tolerance.
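The "implicit data parallelism" model can be illustrated in plain Python: input data is split into partitions, a map step runs independently on each partition, and the partial results are merged by a reduction. This is a conceptual sketch of the model only, not Spark's API; in Spark the partitions live across a cluster and failed tasks are re-executed automatically.

```python
from collections import Counter
from functools import reduce

def word_count(lines, num_partitions=4):
    # "Partition" the input, as a cluster would shard it across executors.
    partitions = [lines[i::num_partitions] for i in range(num_partitions)]
    # Map phase: count words independently within each partition.
    partial = [Counter(w for line in part for w in line.split())
               for part in partitions]
    # Reduce phase: merge the per-partition counts into one result.
    return reduce(lambda a, b: a + b, partial, Counter())
```

In Spark itself the same computation is typically expressed with `flatMap`, `map`, and `reduceByKey` over an RDD or with DataFrame aggregations; the partitioning and merging happen implicitly.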

Here are 1,011 public repositories matching this topic...

alikefia commented Feb 8, 2021

Willingness to contribute

  • Yes. I can contribute this feature independently.

Proposal Summary

By default, artifacts are stored in ./mlruns (the hard-coded constant DEFAULT_LOCAL_FILE_AND_ARTIFACT_PATH). The idea is to make this behaviour configurable by setting an environment variable.

A similar mechanism already exists, but it is not generalized:

_TRACKING_DIR_ENV_VAR = "MLFLOW_TRAC
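The proposal above can be sketched as follows. This is a hypothetical illustration, not MLflow's actual implementation: the variable name MLFLOW_ARTIFACT_DIR and the helper resolve_artifact_path are made up for the example; only the default path and the constant name come from the issue.

```python
import os

# Hard-coded default from the issue; in MLflow this is the constant
# DEFAULT_LOCAL_FILE_AND_ARTIFACT_PATH.
DEFAULT_LOCAL_FILE_AND_ARTIFACT_PATH = "./mlruns"

def resolve_artifact_path():
    # Proposed behaviour: let an environment variable override the
    # default, falling back to the hard-coded constant when unset.
    # MLFLOW_ARTIFACT_DIR is an illustrative name, not MLflow's API.
    return os.environ.get("MLFLOW_ARTIFACT_DIR",
                          DEFAULT_LOCAL_FILE_AND_ARTIFACT_PATH)
```

This mirrors the existing tracking-directory mechanism the issue points to, generalized to the artifact location.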

Created by Matei Zaharia

Released May 26, 2014

Repository: apache/spark
Website: spark.apache.org
Wikipedia: Apache Spark

Related Topics

hadoop, scala