force URLencoding script

João

Hi.
I'm trying to figure out how to force URL encoding in my Python 2.4.3
environment, receiving the data as an input argument, but I'm really at a
loss here.

What am I doing wrong?

#!/usr/bin/env python

import sys
from urllib import urlencode, urlopen
from urllib2 import Request
import urlparse

destination = sys.argv[1]
msg = sys.argv[2] #Will I have problems with this one if the input is multiline?

# the browser identifies itself using the User-Agent header
# after creating the Request object, it's possible to pass in a dictionary of headers
user_agent = 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322)' # force the request to be identified as IE 5.5....
headers = { ’User-Agent’ : user_agent }

# force no proxy

authentication = 'UID=22541&PW=gdyb21LQTcIANtvYMT7QVQ==&'
# force Unicode display format
message = u'M=%s&' % msg
dest_number = 'N=%s' % destination
data = authentication + message + dest_number

url = 'http://10.112.28.221:38080/GwHTTPin/sendtext'
print 'Encoded URL:', url

#get full URL adding ? to it, followed by the encoded values
#full_url = url + '?' url_values
#should I force url_values = urllib.urlencode(data) instead?
full_url = urllib2.Request(url, data, headers)

response = urllib2.urlopen(full_url) #.urlopen works transparently with proxies which do not require authentication

processed = urllib.open(full_url)
 
r0g

João said:
Someone please?


Haven't seen your original post yet mate, usenet can be flaky like that,
might have been a good idea to quote your original post!

Roger.
 
João

Haven't seen your original post yet mate, usenet can be flaky like that,
might have been a good idea to quote your original post!

Roger.

Thanks Roger.

#!/usr/bin/env python

import sys
from urllib import urlencode, urlopen
from urllib2 import Request
import urlparse

destination = sys.argv[1]
msg = sys.argv[2] #Will I have problems with this one if the input is multiline?

# the browser identifies itself using the User-Agent header
# after creating the Request object, it's possible to pass in a dictionary of headers
user_agent = 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322)' # force the request to be identified as IE 5.5....
headers = { ’User-Agent’ : user_agent }

# force no proxy

authentication = 'UID=22541&PW=gdyb21LQTcIANtvYMT7QVQ==&'
# force Unicode display format
message = u'M=%s&' % msg
dest_number = 'N=%s' % destination
data = authentication + message + dest_number

url = 'http://10.112.28.221:38080/GwHTTPin/sendtext'
print 'Encoded URL:', url

#get full URL adding ? to it, followed by the encoded values
#full_url = url + '?' url_values
#should I force url_values = urllib.urlencode(data) instead?
full_url = urllib2.Request(url, data, headers)

response = urllib2.urlopen(full_url) #.urlopen works transparently with proxies which do not require authentication

processed = urllib.open(full_url)
 
r0g

João said:
Thanks Roger.


headers = { ’User-Agent’ : user_agent }

Those quotes need to be either " or ' i.e.

headers = { 'User-Agent' : user_agent }


If that doesn't sort it then it would be helpful to see a traceback or,
if it is not crashing, get a description of how it is failing to do what
you expect.

Cheers,

Roger.
 
João

Those quotes need to be either " or ' i.e.

headers = { 'User-Agent' : user_agent }

If that doesn't sort it then it would be helpful to see a traceback or,
if it is not crashing, get a description of how it is failing to do what
you expect.

Cheers,

Roger.

Weird, I had the correct quotes in my script; maybe I pasted them in with
some error.
This is my current script,

(And though I got it working with subprocess.call, I don't know how to
do a pure Python version.

for the following data,
authentication = "UID=somestring&"
message = 'PROBLEM severity High: OperatorX Plat1(locationY) global Succ. : 94.470000%'
dest_number = 'XXXXXXXXXXX'

url_values = urlencode({'M':message})
enc_data = authentication + url_values + dest_number


I'm getting null for
full_url = Request(url, enc_data, headers)

and thus,
response = urlopen(full_url).read()
returns,
TypeError: <exceptions.TypeError instance at 0x2b4d88ec6440>

)

#!/usr/bin/env python
import sys
import os
import subprocess
from urllib import urlencode, urlopen
from urllib2 import Request

destination = sys.argv[1]
msg = sys.argv[2]

authentication = "UID=somestring&"
dest_number = "&N=%s" % destination

message = '%s' % msg
url_values = urlencode({'M':message})

enc_data = authentication + url_values + dest_number

url = 'http://10.112.28.221:38080/GwHTTPin/sendtext'

subprocess.call('echo "%s" | /usr/bin/POST -P "%s"' % (enc_data, url), shell=True)
 
r0g

João said:
for the following data,
authentication = "UID=somestring&"
message = 'PROBLEM severity High: OperatorX Plat1(locationY) global Succ. : 94.470000%'
dest_number = 'XXXXXXXXXXX'

url_values = urlencode({'M':message})
enc_data = authentication + url_values + dest_number


I'm getting null for
full_url = Request(url, enc_data, headers)

and thus,
response = urlopen(full_url).read()
returns,
TypeError: <exceptions.TypeError instance at 0x2b4d88ec6440>

)


Are you sure it's returning a null and not just some other unexpected
type?

I think your problem may be that you are passing a urllib2 class to
urllib(1)'s urlopen. Try using urllib2's urlopen instead e.g.

import urllib2
request_object = urllib2.Request('http://www.example.com')
response = urllib2.urlopen(request_object)
the_page = response.read()

Roger.
 
João

Are you sure it's returning a null and not just some other unexpected
type?

I think your problem may be that you are passing a urllib2 class to
urllib(1)'s urlopen. Try using urllib2's urlopen instead e.g.

import urllib2
request_object = urllib2.Request('http://www.example.com')
response = urllib2.urlopen(request_object)
the_page = response.read()

Roger.

Thanks Roger.
I think it's a null because I did a print(full_url) right after the
Request.
I tried
request_object = urllib2.Request('http://www.example.com')
print(request_object)

but when printing I get: <urllib2.Request instance at 0x2afaa2fe3f80>

I've read about Python 2.4 not playing well with proxies even with no
proxy activated.
Any suggestion?

Thanks again
 
João

EDIT:

About the proxy.
That's why I'm using the '-P' in the POST call.
/usr/bin/POST -P
 
r0g

João said:
Thanks Roger.
I think it's a null because I did a print(full_url) right after the
Request.
I tried
request_object = urllib2.Request('http://www.example.com')
print(request_object)

but when printing I get: <urllib2.Request instance at 0x2afaa2fe3f80>

Hi João,

That's exactly what you want, an object that is an instance of the
Request class. That object doesn't do anything by itself, you still need
to a) Connect to the server and request that URL and b) Read the data
from the server.

a) To connect to the web server and initialize the request you need to
call urllib2.urlopen() with the Request object you just created and
assign the result to a name e.g.
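(A minimal sketch, reusing the request_object from the snippet above:)

response = urllib2.urlopen(request_object)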


That will give you an object (response) that you can call the .read()
method of to get the web page data.
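(Again just a sketch, continuing from the response above:)

the_page = response.read()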



If that doesn't make sense or seem to work for you then please try
reading the following website from top to bottom before taking any
further steps...

http://www.voidspace.org.uk/python/articles/urllib2.shtml

I've read about Python 2.4 not playing well with proxies even with no
proxy activated.
Any suggestion?

I doubt any language can play well with proxies if there are none so I
doubt it's a factor ;)

Good luck,

Roger.
 
João

Hi João,

That's exactly what you want, an object that is an instance of the
Request class. That object doesn't do anything by itself, you still need
to a) Connect to the server and request that URL and b) Read the data
from the server.

a) To connect to the web server and initialize the request you need to
call urllib2.urlopen() with the Request object you just created and
assign the result to a name e.g.


That will give you an object (response) that you can call the .read()
method of to get the web page data.


If that doesn't make sense or seem to work for you then please try
reading the following website from top to bottom before taking any
further steps...

http://www.voidspace.org.uk/python/articles/urllib2.shtml




I doubt any language can play well with proxies if there are none so I
doubt it's a factor ;)

Good luck,

Roger.

lol.
I expressed myself poorly. I meant I'd read about some issues getting
Request + urlopen working when there's a proxy involved (like in my case),
even when activating a no_proxy configuration, something like:

proxy_support = urllib.ProxyHandler({})
opener = urllib.build_opener(proxy_support)
urllib.install_opener(opener)

But I don't know how to use it :(
 
r0g

João said:
lol.
I expressed myself poorly. I meant I'd read about some issues getting
Request + urlopen working when there's a proxy involved (like in my case),
even when activating a no_proxy configuration, something like:

proxy_support = urllib.ProxyHandler({})
opener = urllib.build_opener(proxy_support)
urllib.install_opener(opener)

But I don't know how to use it :(


That is how you use it IIRC, this installs the proxy handler into urllib
and subsequent objects you subclass from urllib will use the custom handler.

From what I can tell, you should be using urllib2 though, not urllib.
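In urllib2 terms that would presumably look something like this (just a
sketch; passing an empty dict tells ProxyHandler to use no proxies at all):

import urllib2
proxy_support = urllib2.ProxyHandler({})   # empty dict = bypass any proxy
opener = urllib2.build_opener(proxy_support)
urllib2.install_opener(opener)             # later urllib2.urlopen calls use this opener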

Let's take a step back. You had the following line...

request_object = urllib2.Request('http://www.example.com')

....You printed it and it showed that you had created a Request object
right. Now what happens when you type...

response = urllib2.urlopen(request_object)
print response

?

Roger.
 
João

That is how you use it IIRC, this installs the proxy handler into urllib
and subsequent objects you subclass from urllib will use the custom handler.

From what I can tell, you should be using urllib2 though, not urllib.

Let's take a step back. You had the following line...

request_object = urllib2.Request('http://www.example.com')

...You printed it and it showed that you had created a Request object
right. Now what happens when you type...

response = urllib2.urlopen(request_object)
print response

?

Roger.

Thanks for the patience Roger.
Your explanation opened my eyes.

I finally got it to work, and it turned out I didn't have to include
any custom proxy handler to avoid our proxy.
It ended up being such a small and simple block of code after getting the
pieces together..


#!/usr/bin/env python

import sys
from urllib2 import Request, urlopen
from urllib import urlencode


authentication = "UID=124675&PW=gdberishyb8LcIANtvT89QVQ==&"
url = 'http://10.112.28.221:38080/GwHTTPin/sendtext'
encoded_data = urlencode({'M':sys.argv[2],
'N':sys.argv[1]})

# Was having problem with this one, shouldn't have tried to pass the
# authentication as an urlencode parameter because it was breaking the
# password string!
sent_data = authentication + encoded_data

full_url = Request(url,sent_data)
response = urlopen(full_url).read()
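For what it's worth, the likely reason urlencode was breaking the
authentication string: it percent-escapes '=' and '&', so the trailing '=='
of the PW value would get mangled. A quick illustration, reusing the PW
value quoted earlier in the thread:

from urllib import urlencode
print urlencode({'PW': 'gdyb21LQTcIANtvYMT7QVQ=='})
# prints: PW=gdyb21LQTcIANtvYMT7QVQ%3D%3D  (the trailing '==' becomes %3D%3D)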
 
