
In a bash script, I've stored the URL output by a previous command in a variable, $DESTINATION_URL, and I want to use that variable in a curl command.

If I use the $DESTINATION_URL variable, the curl command fails.

If I run the same curl command with the URL pasted in literally, it works fine. It seems like the & is causing a problem, but I can't see why.

Example below:

ha@hadoop-fullslot1:~$ echo $DESTINATION_URL
http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true


ha@hadoop-fullslot1:~$ curl -v -s -i -X PUT -T $SOURCE "$DESTINATION_URL"
* About to connect() to hadoop-fullslot1 port 50075 (#0)
*   Trying 10.1.3.39... connected
 HTTP/1.1bhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: hadoop-fullslot1:50075
> Accept: */*
> Content-Length: 1907377
> Expect: 100-continue
>
* Empty reply from server
* Connection #0 to host hadoop-fullslot1 left intact
* Closing connection #0


ha@hadoop-fullslot1:~$ curl -v -s -i -X PUT -T $SOURCE "http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true"
* About to connect() to hadoop-fullslot1 port 50075 (#0)
*   Trying 10.1.3.39... connected
> PUT /webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true HTTP/1.1
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: hadoop-fullslot1:50075
> Accept: */*
> Content-Length: 1907377
> Expect: 100-continue
>
< HTTP/1.1 100 Continue
HTTP/1.1 100 Continue

* We are completely uploaded and fine
< HTTP/1.1 201 Created
HTTP/1.1 201 Created
< Cache-Control: no-cache
Cache-Control: no-cache
< Expires: Fri, 26 Apr 2013 09:01:38 GMT
Expires: Fri, 26 Apr 2013 09:01:38 GMT
< Date: Fri, 26 Apr 2013 09:01:38 GMT
Date: Fri, 26 Apr 2013 09:01:38 GMT
< Pragma: no-cache
Pragma: no-cache
< Expires: Fri, 26 Apr 2013 09:01:38 GMT
Expires: Fri, 26 Apr 2013 09:01:38 GMT
< Date: Fri, 26 Apr 2013 09:01:38 GMT
Date: Fri, 26 Apr 2013 09:01:38 GMT
< Pragma: no-cache
Pragma: no-cache
< Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar
Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar
< Content-Type: application/octet-stream
Content-Type: application/octet-stream
< Content-Length: 0
Content-Length: 0
< Server: Jetty(6.1.26.cloudera.2)
Server: Jetty(6.1.26.cloudera.2)

<
* Connection #0 to host hadoop-fullslot1 left intact
* Closing connection #0
ha@hadoop-fullslot1:~$

2 Answers

Your variable contains something more than just the URL (garbage). My guess is a stray CR (carriage return) byte or similar: notice how " HTTP/1.1" gets printed at the start of the line in the failing trace, although it should appear to the right of the URL. A CR returns the cursor to column 0, so the " HTTP/1.1" that follows the URL overwrites the beginning of the request line.
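
If that's what happened, stripping carriage returns from the variable before calling curl should fix it. A minimal sketch using bash pattern substitution (assuming the URL was captured from CRLF-terminated output):

    # Remove any carriage returns the variable picked up; $'\r' is a literal CR byte
    DESTINATION_URL=${DESTINATION_URL//$'\r'/}

    curl -v -s -i -X PUT -T "$SOURCE" "$DESTINATION_URL"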

6 Comments

To check whether this is the case, try something like `echo "|${DESTINATION_URL}|"` so you see any whitespace.
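A quick demo of the tell (hypothetical value): with a trailing CR, the closing | jumps back to column 0 and overwrites the opening one, so it never shows up at the end of the line:

    url=$'http://example/x\r'   # hypothetical value with a trailing CR
    echo "|${url}|"             # terminal shows: |http://example/x  (no closing |)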
There are no special characters in the URL, because the curl command works fine if I put the URL directly in the command (instead of using the variable).
We get "HTTP/1.1 100 Continue" because we try to send out data before the redirect. - See more
You're missing the point. Look at the debug log in the failing case and how it shows the URL it used. Compare to the working case. Figure out the difference!
Examine the exact contents of the variable: printf "%s" "$DESTINATION_URL" | od -c
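A self-contained demo of what to look for (hypothetical value): a stray CR shows up as \r at the end of the od -c dump:

    url=$'http://example/path?a=1&b=2\r'    # hypothetical tainted value
    printf '%s' "$url" | od -c | tail -n 2  # last data line ends with ...  2  \r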

Use single quotes ' instead of double quotes ".

1 Comment

I have to use double quotes because I want to pass a variable as a parameter to the curl command.
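
For reference, a minimal sketch of the difference (hypothetical URL; single quotes suppress variable expansion, which is why they can't work here):

    url='http://example/path?a=1&b=2'   # hypothetical URL containing &

    echo '$url'    # prints the literal text $url — no expansion inside single quotes
    echo "$url"    # prints the URL; the & is still protected from the shell
    echo $url      # unquoted: expands, but is subject to word splitting/globbing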
