mechanize - Error 403: Request disallowed by robots.txt in Python
I am trying to fill out a form using mechanize in Python. When I run the code, I get this error:

Error 403: request disallowed by robots.txt

I went through previously answered questions on a similar issue and saw that adding br.set_handle_robots(False) should fix it, but I am still getting the same error. What am I missing here?
```python
import re
import mechanize
from mechanize import Browser

br = mechanize.Browser()
br.set_handle_equiv(False)
br.set_handle_robots(False)
br.addheaders = [('User-agent', 'Mozilla/5.0 (X11; Linux x86_64; rv:18.0)Gecko/20100101 Firefox/18.0 (compatible;)'),
                 ('Accept', '*/*')]

text = "1500103233"
browser = Browser()
browser.open("http://kuhs.ac.in/results.htm")
browser.select_form(nr=0)
browser['stream'] = ['Medical']
browser['level'] = ['UG']
browser['course'] = ['MBBS']
browser['scheme'] = ['MBBS 2015 Admissions']
browser['year'] = ['Ist Year MBBS']
browser['examination'] = ['First Professional MBBS Degree Regular(2015 Admissions) Examinations,August2016']
browser['reg no'] = text
response = browser.submit()
```
- You set `br = mechanize.Browser()` and disable robots handling on it, but then you open the page with a second, freshly created instance, `browser = Browser()`. That new instance still has the default robots.txt handling enabled, which is why the 403 persists (see the sketch below).
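A minimal sketch of that pitfall and the fix, reusing the names from the question's code:

```python
import mechanize
from mechanize import Browser

br = mechanize.Browser()
br.set_handle_robots(False)       # robots.txt disabled on br ...

browser = Browser()               # ... but this fresh instance still obeys robots.txt
# browser.open("http://kuhs.ac.in/results.htm")   # -> Error 403: request disallowed by robots.txt

browser.set_handle_robots(False)  # configure the instance you actually use
browser.open("http://kuhs.ac.in/results.htm")     # now the request goes through
```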
- The link: http://kuhs.ac.in/results.htm. If you look at its page source, you will see that the actual form is served from another URL: src="http://14.139.185.148/kms/index.php/results/create" (a sketch for extracting it follows).
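If you would rather extract that embedded URL programmatically than read the source by hand, a quick sketch (the regex here is just an illustration, not part of the original answer):

```python
import re
import mechanize

br = mechanize.Browser()
br.set_handle_robots(False)
html = br.open("http://kuhs.ac.in/results.htm").read()
# list every src="..." attribute found in the page source
print re.findall(r'src="([^"]+)"', html)
```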
- From the page source you can also see the real names of the form controls. In this case, next to `Stream</label>` the control is declared as `name="results[streamid]"`, so the name is results[streamid], not stream (see the inspection sketch after this point).
So, you can try:

```python
import mechanize

br = mechanize.Browser()
br.set_handle_equiv(False)
br.set_handle_robots(False)
br.addheaders = [('User-agent', 'Mozilla/5.0 (X11; Linux x86_64; rv:18.0)Gecko/20100101 Firefox/18.0 (compatible;)'),
                 ('Accept', '*/*')]

text = "1500103233"
br.open("http://14.139.185.148/kms/index.php/results/create").read()

for forms in br.forms():
    print forms

br.select_form(nr=0)
br['results[streamid]'] = ['1',]  # Medical
# etc..
response = br.submit()
print response.read()
```
You can also see here: Submitting a form with mechanize (TypeError: ListControl, must set a sequence).
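The short version of that linked question: mechanize list controls must be assigned a sequence, not a bare string. Continuing from the code above:

```python
br['results[streamid]'] = ['1']    # OK: the value is wrapped in a list
# br['results[streamid]'] = '1'    # TypeError: ListControl, must set a sequence
```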
Hope this helps, it works for me!