How to curl/wget a script, run it with arguments, and make its functions available locally?

This function uses either curl or wget to download a script and execute it with additional arguments:

wexec() {
    local url=$1
    shift
    if command -v curl >/dev/null 2>&1; then
        # Quote the URL and forward the remaining arguments unchanged
        curl -s "$url" | bash -s -- "$@"
    elif command -v wget >/dev/null 2>&1; then
        wget -qO- "$url" | bash -s -- "$@"
    else
        echo "Neither curl nor wget found." >&2
        return 1
    fi
}
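
For reference, this is how the function would be called (a usage sketch, assuming the example URL below serves the script shown further down):

wexec http://example.org/my-remote-script.sh a1 a2 a3
# prints: Arguments: a1 a2 a3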

I want to download the following script from a server and run it locally with the passed arguments:

#!/usr/bin/env bash
hello() {
    echo "Hello World!"
}
echo "Arguments: $@"

I also want the hello function to be available in the local environment, but it isn't, because it is defined in a new shell (the bash process at the receiving end of the pipe), not in the current one. So calling wexec http://example.org/my-remote-script.sh a1 a2 a3; hello successfully prints Arguments: a1 a2 a3, but then fails with hello: command not found.
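
The problem can be reproduced locally without a server; this is only a minimal sketch of the behaviour, using a hard-coded function definition in place of the remote script:

# The function is defined inside the bash process fed by the pipe,
# so it disappears as soon as that process exits.
echo 'hello() { echo "Hello World!"; }' | bash -s -- a1 a2 a3
hello   # -> hello: command not found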

Is there a way to send arguments from the local environment and still receive the functions from the remote script?