Welcome!
This is the community forum for my apps Pythonista and Editorial.
For individual support questions, you can also send an email. If you have a very short question or just want to say hello — I'm @olemoritz on Twitter.
Apple system status
-
Hi @omz,
Please find my code below and help me figure out what the problem is.
from bs4 import BeautifulSoup
import requests
import console
import sys
import notification

url = "http://www.apple.com/support/systemstatus/"
resp = requests.get(url)
html_doc = resp.text
soup = BeautifulSoup(html_doc)
console.clear()
console.set_font('Futura', 16)
down = 0
up = 0
for id in soup.find_all('key_title'):
    if id('li_issue')[0] == 'Issue':
        console.set_color(1, 0.5, 0)
        down += 1
    else:
        console.set_color(0, 0, 0)
        up += 1
    print id.text
console.set_color(0, 0, 0)
print '_______________________'
console.set_color(1, 0.5, 0)
print 'Issue:', down
console.set_color(0, 0, 0)
print 'Normal:', up
runagain = notification.schedule('Checking System Status', 3600, 'default', 'pythonista://SystemsStatus?action=run')
-
print '_' * 23 is the same as print '_______________________'
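In other words, Python's `*` operator repeats a string, so both spellings build the same 23-character separator:

```python
# String repetition builds the separator without counting underscores by hand
sep = '_' * 23
print(sep)        # _______________________
print(len(sep))   # 23
print(sep == '_______________________')  # True
```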
-
@ccc Sorry, I couldn't understand...
-
replace print '_______________________' with print '_' * 23
-
@ccc & @filippocld I think you guys are misunderstanding. I am not getting the expected output from the above code.
I am getting the output below instead.
-
The problem in your code is twofold:
-
requests.get('http://www.apple.com/support/systemstatus/') returns the source of the page, but Apple uses JavaScript to fiddle with that source when loading it into the browser, so the source you get doesn't correspond to what you actually see on the screen. I'm not knowledgeable enough in Python to know how to fetch the JS-generated source.
-
You aren't fetching the correct elements from the page source with BeautifulSoup. Instead of targeting key_title and li_issue (the latter of which doesn't even exist on the page), you need something like this:
for id in soup.find_all('p', 'matrix_p'):
    if not 'allgood' in id.previous_sibling['class']:
        console.set_color(0, 0.5, 0)
        down += 1
    else:
        console.set_color(0, 0, 0)
        up += 1
    print(id.text)
(apologies if this is not exactly correct; I worked on it yesterday and didn't save my test script so I'm trying to reconstruct it from memory)
But even with that, you'll still run into #1 as a problem.
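To see what that selector logic does without hitting the network (and without the console coloring), here is a self-contained sketch run against a hand-made snippet. The HTML below is only my guess at the page's structure, not the real source:

```python
from bs4 import BeautifulSoup

# Hand-made HTML mimicking the structure the loop above expects: each
# status <p> follows a container whose class list contains 'allgood'
# when the service is fine. (This structure is an assumption.)
html_doc = (
    '<div class="section allgood"></div><p class="matrix_p">iCloud Mail</p>'
    '<div class="section"></div><p class="matrix_p">iMessage</p>'
)

soup = BeautifulSoup(html_doc, 'html.parser')
down = up = 0
for p in soup.find_all('p', 'matrix_p'):
    if 'allgood' not in p.previous_sibling['class']:
        down += 1   # service has an issue
    else:
        up += 1     # service is fine
    print(p.text)

print('Issue:', down)    # Issue: 1
print('Normal:', up)     # Normal: 1
```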
-
-
Hi @roosterboy Sorry, I am still not getting output.
-
And you won't, until you solve the first of the two problems I mentioned. The source returned by the call to requests.get('http://www.apple.com/support/systemstatus/') simply doesn't contain the info you are looking for because Apple updates the page with JavaScript after the browser loads. You have to somehow get that updated source and I'm not sure how to do that with Python. But once you can do that, then the code in my second point above should work.
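Once you do have the post-JavaScript data (for instance, if you can find the data feed the page itself loads), the counting part is straightforward. A sketch against a made-up feed; the field names and structure here are pure assumptions, not Apple's real format:

```python
import json

# Made-up status feed in roughly the shape a status page might load;
# every field name here is an assumption, not Apple's real format.
feed = json.loads('''
{"services": [
    {"name": "iCloud Mail", "status": "allgood"},
    {"name": "iMessage",    "status": "issue"}
]}
''')

down = sum(1 for s in feed['services'] if s['status'] != 'allgood')
up = len(feed['services']) - down
print('Issue:', down)     # Issue: 1
print('Normal:', up)      # Normal: 1
```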
-
So... The problem in 1) is that Requests gives you the HTML source code of Apple's web page, when what you are really looking for is the result of the (JavaScript) execution of that web page. In the days before Pythonista had a Location module, we hacked together a few different solutions for obtaining the results of the execution of a web page. This might help in your quest for a solution.
http://omz-forums.appspot.com/pythonista/post/6119538122817536
-
@ccc @roosterboy I did everything you guys told me. But I am still not getting the output.
-
roosterboy wrote: "The source returned by the call to requests.get('http://www.apple.com/support/systemstatus/') simply doesn't contain the info you are looking for."
It is impossible to find a needle in a haystack if the haystack that you are searching thru contains no needles.
ccc wrote: "what you are really looking for is the results of the (JavaScript) execution of that web page"
That is the haystack that you are not currently searching thru that contains needles.
-
Check out http://www.packtpub.com/article/web-scraping-with-python-part-2
Hence, instead of relying on a library that generates HTTP requests, we need a library that behaves as a real web browser...
Selenium and Windmill might be worth checking out.