Save a webpage for offline use
Regarding webView.load_url(): is there a way to download the webpage content and display the downloaded page?
I have a WebView in my script that shows a wiki page, but I want it to work offline. So I guess it would download the webpage when the script first runs, but after that it should use the downloaded copy if it is still up to date.
You can download the HTML file and save it with:

import urllib2

page = urllib2.urlopen("http://example.com")
with open("out.html", "w") as outfile:
    outfile.write(page.read())
You can now load the contents of this file into a WebView. I don't know if load_url supports local files, but you can always use load_html. This will load the file we saved in the last example into a WebView:

import ui

with open("out.html", "r") as infile:
    html = infile.read()

wv = ui.WebView()
wv.load_html(html)
wv.present()
If you don't need the HTML to persist between two separate runs of the script, you don't have to save the file and then read it back in; you can just keep it in a variable.
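Combining the two snippets into the "use it if it is still up to date" behaviour the question asks for, here is a minimal sketch. The 24-hour maximum age and the get_page name are my own choices, and "up to date" is judged only by the cached file's modification time, not by anything the server reports:

```python
import os
import time

try:
    from urllib2 import urlopen          # Python 2 / older Pythonista
except ImportError:
    from urllib.request import urlopen   # Python 3

def get_page(url, cache_path, max_age_seconds=24 * 3600):
    """Return cached HTML if it is fresh enough, else download and cache it."""
    if os.path.exists(cache_path):
        age = time.time() - os.path.getmtime(cache_path)
        if age < max_age_seconds:
            with open(cache_path, "r") as infile:
                return infile.read()
    data = urlopen(url).read()
    if isinstance(data, bytes):          # urlopen returns bytes on Python 3
        data = data.decode("utf-8")
    with open(cache_path, "w") as outfile:
        outfile.write(data)
    return data
```

The result can then be passed straight to wv.load_html(), whether it came from the cache or the network.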
@ccc you named it with a .pu extension, little typo there :) Not a big deal, but GitHub doesn't syntax-highlight it properly like that.
load_url does support local files, if you give it an absolute path (os.path.abspath). For pages with multiple resources (images, scripts, etc.), relative paths in the HTML file then work too.
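A sketch of that load_url variant, assuming Pythonista's ui module (the import guard just lets the snippet run outside Pythonista; exactly how load_url treats a bare path may depend on the Pythonista version):

```python
import os

try:
    import ui                   # available inside Pythonista only
except ImportError:
    ui = None

# absolute path to the page saved by the earlier snippet
page_path = os.path.abspath("out.html")

if ui is not None:
    wv = ui.WebView()
    wv.load_url(page_path)      # per the post above, an absolute path works
    wv.present()
```

The advantage over load_html is that images, scripts and other resources saved next to out.html resolve via their relative paths.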
Thanks guys, just what I was looking for.