
How do I pass Python variables from a %python cell to a %sh shell script in an Azure Databricks notebook?

notebook example

1 Answer


In my experience, there are two workarounds for passing a Python variable to a Bash script in your scenario.

Here is my sample code, using Python 3 in a notebook.

  1. To pass a small amount of data via an environment variable, within the same shell session of the Azure Databricks notebook:

    %python
    import os
    l = ['A', 'B', 'C', 'D']
    os.environ['LIST'] = ' '.join(l)
    print(os.getenv('LIST'))
    
    %%bash
    for i in $LIST
    do
      echo $i
    done
    

It works, as shown in the figure below.

[screenshot: notebook output of the environment-variable approach]

  2. To pass larger data via a file in the current working directory of the Azure Databricks notebook:

    %python
    # 'l' is defined in the earlier cell: l = ['A', 'B', 'C', 'D']
    with open('varL.txt', 'w') as f:
      for elem in l:
        f.write(elem + '\n')
    # the with block closes the file automatically
    
    %%bash
    pwd
    ls -lh varL.txt
    echo '=======show content=========='
    cat varL.txt
    echo '=====result of script========'
    for i in $(cat varL.txt)
    do
      echo $i
    done
    
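One caveat with `for i in $(cat varL.txt)`: it word-splits on all whitespace, so a value containing a space would be broken into pieces. A `while read` loop is safer when that can happen. A minimal sketch, using the same one-value-per-line layout but with a hypothetical value `B C` to show the difference:

```shell
#!/usr/bin/env bash
# Sample file in the same one-value-per-line layout (values are hypothetical).
printf '%s\n' 'A' 'B C' 'D' > varL.txt

# Read line by line instead of word-splitting, so 'B C' stays intact.
while IFS= read -r line; do
  echo "got: $line"
done < varL.txt
# prints:
#   got: A
#   got: B C
#   got: D
```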

It works, as shown in the figure below.

[screenshot: notebook output of the file-based approach]


2 Comments

Thanks @Peter for your explanation!
How would one do this without hardcoding the name of the environment variable? In cmd 2 of the first approach, you hardcode the name LIST in $LIST. Is there any way the name LIST could be stored in a Python variable and evaluated within the bash script? Like x = 'LIST', and then something like for i in $x in the bash script. In my experience, this doesn't work.
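Plain `$x` indeed only expands to the string `LIST`, but bash does support looking a variable up by a name held in another variable, via indirect expansion `${!x}`. A sketch of one way this can work (the names `LIST` and `X` are hypothetical; the Python cell would export both of them with `os.environ`):

```shell
#!/usr/bin/env bash
# Hypothetical setup: LIST holds the data, X holds the *name* "LIST".
export LIST='A B C D'
export X='LIST'

# ${!X} expands to the value of the variable whose name is stored in X.
for i in ${!X}; do
  echo "$i"
done
# prints A, B, C, D, one per line
```

Note that `${!x}` is a bash feature; it is not available in plain POSIX sh.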
