Asynchronous network requests
I'm not sure how to do this in Python, and specifically in Pythonista: I want to make 4 asynchronous (and concurrent) requests and parse the response of each one (assigning each to a specific variable). Is there a concept of a callback in Python?
async_requests.get('http://google.com', google_callback)
async_requests.get('http://bing.com', bing_callback)
async_requests.get('http://yahoo.com', yahoo_callback)
async_requests.get('http://facebook.com', facebook_callback)
threaded_network_requests.py. It does not feature the callbacks, but they could be added. There is a performance difference between the serial and parallel runs.
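The script itself isn't reproduced here, but a rough sketch of the same threaded idea with the callbacks added might look like this (names such as async_get are illustrative, not from threaded_network_requests.py, and urllib stands in for requests so the example is standard-library only):

```python
import threading
import urllib.request

def fetch(url):
    # Blocking fetch of a URL body; requests.get(url).text would work too.
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def async_get(url, callback, fetch=fetch):
    # Run the blocking fetch on its own thread, then hand the response
    # body to the callback. Returns the thread so the caller can join() it.
    def worker():
        callback(fetch(url))
    t = threading.Thread(target=worker)
    t.start()
    return t

if __name__ == "__main__":
    results = {}
    threads = [
        async_get("http://google.com", lambda body: results.update(google=body)),
        async_get("http://bing.com", lambda body: results.update(bing=body)),
    ]
    for t in threads:
        t.join()  # wait until every callback has fired
    print(sorted(results))
```

Each callback here just stashes the body in a shared dict, but it could equally well parse the response first, which is essentially what the procedural version further down does.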
Cool! Glanced at it and it looks like what I needed - I'll dig in more a bit later. Thanks!
Yup! This is exactly what I was looking for: concurrent network requests. I implemented it without callbacks and instead did it procedurally; after the request returned, I passed the data to a different (new) method in the thread subclass:
def run(self):
    data = requests.get(self.url).text
    results[self.key] = self.parse(data)

def parse(self, responseText):
    # stuff here
    return stuff
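For context, a runnable version of this pattern might look like the following; the class name and setup code are guesses, since only run() and parse() were quoted, the parse body is a placeholder, and urllib stands in for requests to keep the sketch standard-library only:

```python
import threading
import urllib.request

results = {}  # shared dict the threads write into, one key per request

class FetchThread(threading.Thread):
    # Hypothetical wrapper around the quoted run()/parse() methods.
    def __init__(self, key, url):
        super().__init__()
        self.key = key
        self.url = url

    def fetch(self):
        # The original used requests.get(self.url).text here.
        with urllib.request.urlopen(self.url, timeout=10) as resp:
            return resp.read().decode("utf-8", errors="replace")

    def run(self):
        results[self.key] = self.parse(self.fetch())

    def parse(self, responseText):
        # Placeholder for the real parsing: keep the first 100 characters.
        return responseText[:100]

if __name__ == "__main__":
    threads = [FetchThread(key, url) for key, url in (
        ("google", "http://google.com"),
        ("bing", "http://bing.com"),
    )]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # all requests run concurrently; join waits for them
    print(results)
```

Starting all threads before joining any of them is what makes the requests overlap; start()/join() in the same loop would serialize them.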