Welcome!
This is the community forum for my apps Pythonista and Editorial.
For individual support questions, you can also send an email. If you have a very short question or just want to say hello — I'm @olemoritz on Twitter.
load from url issue
-
```python
# coding: utf-8
import ui
import feedparser
import urllib2
import webbrowser

class MyView (object):
    def __init__(self):
        x, y = ui.get_screen_size()
        self.url_list = []
        url = 'http://appshopper.com/feed/?mode=featured&filter=price&platform=ios'
        self.feed = feedparser.parse(url)
        tblview = ui.TableView()
        tblview.name = 'AppShopper'
        tblview.data_source = self
        tblview.delegate = self
        self.segview = ui.SegmentedControl(frame=((x/2)-125, -45, 250, 30))
        self.segview.segments = ['New', 'Updates', 'Price Drop']
        tblview.add_subview(self.segview)
        naview = ui.NavigationView(tblview)
        naview.present(hide_title_bar=True)

    def tableview_number_of_rows(self, tableview, section):
        return len(self.feed['entries'])

    def tableview_cell_for_row(self, tableview, section, row):
        feed = self.feed
        html = feed['entries'][row]['summary_detail']['value']
        self.url_list.append(html[html.find('href')+6:html.find('iTunes')-2])
        title = feed['entries'][row]['title']
        thmburl = html[html.find('http', 0, 100):html.find('png', 0, 100)+3]
        beg = html.find('Price')
        end = html.find(',', beg, beg+50)
        price = html[beg+11:end]
        cell = ui.TableViewCell('subtitle')
        cell.text_label.number_of_lines = 0
        cell.text_label.font = ('<system-bold>', 12.0)
        cell.text_label.text = title
        thumb = ui.ImageView()
        ui.delay(thumb.load_from_url(thmburl), 0)
        cell.image_view.image = thumb.image
        cell.detail_text_label.text = price
        return cell

    def tableview_did_select(self, tableview, section, row):
        webbrowser.open(self.url_list[row])

    def scrollview_did_scroll(self, scrollview):
        segmentindex = self.segview.selected_index
        scry = scrollview.content_offset[1]
        if scrollview.tracking:
            if scry < -75:
                self.segview.enabled = True
                self.segview.y = scry+30
                if scry < -76 and scry > -85:
                    self.segview.selected_index = 0
                elif scry < -86 and scry > -95:
                    self.segview.selected_index = 1
                elif scry < -96 and scry > -105:
                    self.segview.selected_index = 2
            else:
                self.segview.y = -45
                self.segview.enabled = False
        else:
            pass

MyView()
```
It's probably a user error but when the code is ran, the images aren't loaded until the cells are scrolled out of view. Any help appreciated!
-
The issue, I think, is that ui.delay is loading the image while the rest of the code continues... so the first time around, thumb.image doesn't exist yet.
You might instead have ui.delay call a function which loads and then sets the image, i.e. something along the lines of (starting where you had the delay):

```python
        def load_and_show_image():
            thumb.load_from_url(thmburl)
            cell.image_view.image = thumb.image
        ui.delay(load_and_show_image, 0)
        cell.detail_text_label.text = price
        return cell
```
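The key point is that ui.delay wants the function object itself, not the result of calling it. A toy illustration of the difference, using a stand-in scheduler rather than the real ui module:

```python
scheduled = []   # what our stand-in scheduler has been given to run later
log = []

def delay(func, seconds):
    # Stand-in for ui.delay: it just records the callable to run later.
    scheduled.append(func)

def load():
    log.append('loaded')

delay(load(), 0)  # bug: load() runs immediately; delay() receives None
delay(load, 0)    # correct: the function object itself gets scheduled
```

In the buggy call, the "work" happens synchronously before the scheduler ever sees it, which is exactly why the image isn't ready when the cell is first drawn.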
As an aside... this code will try to reload the image every time the cell scrolls into view. I suspect the images never actually change, so this is pretty wasteful of network resources, and on a slow connection you might experience lots of lag.
A better option might be to spawn a background thread that fills out an instance-variable list of images, populated once, starting when the view is created. The cell-for-row function would still look much like the one above, except that instead of loading from a URL, you'd load from the cached list. If the background cache filler hasn't gotten to a particular row yet, you could either load from the URL as above, or keep polling until the image has loaded (perhaps exiting early if the cell is no longer on screen... actually, I'm not sure what happens when you hold a reference to a cell that has scrolled out of view; does it get deleted?). For the polling option, you might also want a way of pushing URLs to the top of the queue.

Also, note that in your code, url_list will grow and grow and grow as you scroll forward and back... I think you would want to initialize it to the length of feed['entries'] and then assign directly by row rather than appending, or else parse all the URLs once during __init__() rather than once per row. Given that you've already parsed the feed, that should be pretty quick.
-
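To make the cache idea above a bit more concrete, here is a minimal sketch. The `fetch` callable is a hypothetical stand-in for whatever actually downloads an image (in Pythonista it would wrap the ui image-loading call). Sizing the list once up front and assigning by row also avoids the ever-growing url_list problem:

```python
import threading

class ImageCache(object):
    def __init__(self, urls, fetch):
        # One slot per feed entry, sized once -- assign by row, never append.
        self.urls = urls
        self.fetch = fetch          # hypothetical downloader: url -> image
        self.images = [None] * len(urls)
        self.lock = threading.Lock()

    def fill(self):
        # Run once in a background thread, started when the view is created.
        for row, url in enumerate(self.urls):
            img = self.fetch(url)
            with self.lock:
                self.images[row] = img

    def get(self, row):
        # Called from tableview_cell_for_row: use the cached image if the
        # background filler has reached this row, otherwise fetch directly.
        with self.lock:
            img = self.images[row]
        if img is None:
            img = self.fetch(self.urls[row])
            with self.lock:
                self.images[row] = img
        return img
```

You'd kick off the fill with something like threading.Thread(target=cache.fill).start() in __init__, and call cache.get(row) from tableview_cell_for_row; either way, get() always returns a usable image.
-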
You could also consider using the BeautifulSoup module provided in Pythonista to simplify HTML parsing:
```python
import bs4

soup = bs4.BeautifulSoup(html)
app_url = soup.a['href']
thmburl = soup.img['src']
price = soup.find('b', text='Price:').next_sibling.strip().rstrip(',')
```