13 changes: 10 additions & 3 deletions google/cloud/dataproc_spark_connect/session.py
@@ -1214,10 +1214,17 @@ def addArtifacts(
         if pypi:
             artifacts = PyPiArtifacts(set(artifact))
             logger.debug("Making addArtifact call to install packages")
-            self.addArtifact(
-                artifacts.write_packages_config(self._active_s8s_session_uuid),
-                file=True,
+            config_path = artifacts.write_packages_config(
+                self._active_s8s_session_uuid
             )
+            try:
+                self.addArtifact(config_path, file=True)
+            finally:
+                try:
+                    os.remove(config_path)
+                    os.rmdir(os.path.dirname(config_path))
Member:
Let's not delete the parent directory even if it is empty. If, in the future, write_packages_config creates the file directly under the /tmp directory (instead of the current session-specific one), we could end up deleting the /tmp directory itself (once it is empty).

I believe @tim-u was working on replacing this entire logic with a direct run-command call (supported in the latest Spark release); in that case these temporary files won't be needed at all.

ajma marked this conversation as resolved.

+                except OSError:
+                    pass
         else:
             super().addArtifacts(
                 *artifact, pyfile=pyfile, archive=archive, file=file
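A minimal sketch of the cleanup the reviewer's comment suggests, assuming the only change is dropping the os.rmdir call so the parent directory is never touched. The helper name _upload_packages_config is hypothetical and not part of this PR:

```python
import os


def _upload_packages_config(session, config_path: str) -> None:
    # Hypothetical helper sketching the reviewer's suggestion: upload the
    # PyPI packages config, then delete only the temporary file itself.
    # Leaving the parent directory alone means that, even if a future
    # write_packages_config wrote the file directly under /tmp, this cleanup
    # could never remove /tmp itself once it happened to be empty.
    try:
        session.addArtifact(config_path, file=True)
    finally:
        try:
            os.remove(config_path)
        except OSError:
            pass
```

Compared with the version in the diff, this leaves at most one stray empty directory per session behind, in exchange for never removing a directory the session does not own.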