Using Python's Requests library to navigate webpages / click buttons


I'm new to web programming and have begun looking into using Python to automate some manual processes. I'm trying to log into a site, click through drop-down menus to select settings, and run a report.

I've found the widely acclaimed Requests library: http://docs.python-requests.org/en/latest/user/advanced/#request-and-response-objects and have been trying to figure out how to use it.

I've logged in using bpbp's answer on this page: How to use Python to login to a webpage and retrieve cookies for later usage?
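In case it helps to see what that looks like, here is a minimal sketch of session-based login with Requests. The URL and form-field names below are placeholders; the real ones come from the site's login form:

    import requests

    # Placeholder URL and field names -- inspect the site's login <form> for the real ones
    LOGIN_URL = "https://example.com/login"
    credentials = {
        "username": "my_user",   # keys must match the <input name="..."> attributes
        "password": "my_pass",
    }

    # A Session keeps the login cookies, so later requests through it stay authenticated
    session = requests.Session()
    response = session.post(LOGIN_URL, data=credentials)
    response.raise_for_status()  # raise an error if the login request itself failed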

My understanding of "clicking" a button is that I write a post() command that mimics the click: Python - clicking a JavaScript button
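Continuing that idea, here is a hypothetical sketch of what such a POST could look like. The endpoint and form-field names are made up; the real ones come from watching the request in the browser's developer tools (Network tab) when the button is clicked:

    import requests

    session = requests.Session()
    # ... log in first, as in the sketch above ...

    # Placeholder endpoint and form fields -- copy the real ones from the Network tab
    REPORT_URL = "https://example.com/reports/run"
    form_data = {
        "report_date": "2014-06-30",   # whatever the date drop-down would have submitted
        "action": "save_and_run",      # placeholder for the button's name/value pair
    }

    result = session.post(REPORT_URL, data=form_data)
    print(result.status_code)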

My question (since I'm new to web programming and to this library) is how to go about pulling the data I need in order to figure out how to construct these commands. I've been looking at [requestobject].headers, .text, etc. Examples would be great.
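For what it's worth, this is the kind of poking around I mean; these attributes are all part of Requests' documented Response object, and the URL is just a stand-in:

    import requests

    response = requests.get("https://example.com/report-page")  # stand-in URL

    print(response.status_code)      # 200, 302, 403, ...
    print(response.headers)          # dict-like server headers: content type, redirects, etc.
    print(response.cookies)          # cookies the server set on this response
    print(response.text[:1000])      # start of the HTML -- look for <form> and <input> tags
    print(response.request.headers)  # what was actually sent, handy for comparing with the browser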

As always, thanks for any help!

EDIT: To make the question more concrete, I'm having trouble interacting with different aspects of the web page. The following image shows what I'm trying to do:

[example image]

I'm on a web page that looks like this. There is a drop-down menu with clickable dates that can be changed. The goal is to automate changing the date to the most recent one, "click" 'Save and Run', and download the report when it has finished running.

The solution I have found is Selenium. If it weren't a JavaScript-heavy website I would try Mechanize, but here you need to render the JavaScript and inject JavaScript... like Selenium does.

Upside: you can record your actions in Firefox (using the Selenium IDE) and export those actions to Python. Downside: the code has to open a browser window to run.
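As a rough, hypothetical sketch of what the exported or hand-written Selenium code can look like (the URL and element ids are placeholders; the recorder fills in the real locators for you):

    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import Select

    driver = webdriver.Firefox()                    # opens a visible browser window
    driver.get("https://example.com/report-page")   # placeholder URL

    # Pick the newest date from the drop-down (assumes the first option is the most recent)
    date_dropdown = Select(driver.find_element(By.ID, "date_select"))  # placeholder id
    date_dropdown.select_by_index(0)

    # "Click" the Save and Run button
    driver.find_element(By.ID, "save_and_run").click()                 # placeholder id

    # ... wait for the report to finish, then grab the download link ...
    driver.quit()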

