Python Open Multiple URL Connections - urllib2
I'm trying to create a Python script that opens multiple connections at once, similar to having multiple tabs open in a browser. More explicitly: using urllib2, my code opens five connections to the API one after another, but I want all 5 requests to start at the same time instead of in succession. I have looked into libraries like Twisted, but I don't know how to use them to help me.
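The sequential version presumably looks something like the sketch below (hypothetical: the URLs are placeholders, and the `fetch` parameter is added only so the loop can be exercised without a network; the compatibility import lets it run under Python 3 as well):

```python
try:
    import urllib2  # Python 2, as in the question
    urlopen = urllib2.urlopen
except ImportError:
    from urllib.request import urlopen  # Python 3 fallback

def fetch_sequentially(urls, fetch=None):
    # Each call blocks until its response arrives, so the
    # requests run one after another, not simultaneously.
    fetch = fetch or (lambda u: urlopen(u).read())
    return [fetch(u) for u in urls]

# bodies = fetch_sequentially(['http://example.com/api?q=%d' % i for i in range(5)])
```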
You can create an HTTPHandler subclass to process HTTP responses as they arrive (asynchronous as far as your code's control flow is concerned, not when it comes to the actual network operations).
import urllib2

class MyHttpHandler(urllib2.HTTPHandler):
    def http_response(self, request, response):
        # Called for each response; print the body line by line.
        for l in response:
            print l
        return response

u = urllib2.build_opener(MyHttpHandler())
for i in range(5):  # note: range(1, 5) would only issue 4 requests
    u.open('http://stackoverflow.com')
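The handler approach above still issues the requests one after another. To actually have all five in flight at once, a common alternative (not part of the original answer) is one thread per request. A minimal sketch, assuming the same stackoverflow.com URL; the `fetch` parameter is an addition so the function can be tested without a network, and the compatibility import covers Python 3:

```python
import threading

try:
    import urllib2  # Python 2, as in the question
    urlopen = urllib2.urlopen
except ImportError:
    from urllib.request import urlopen  # Python 3 fallback

def fetch_all(urls, fetch=None):
    """Open every URL in its own thread; return the bodies in order."""
    fetch = fetch or (lambda u: urlopen(u).read())
    results = [None] * len(urls)

    def worker(i, url):
        results[i] = fetch(url)

    threads = [threading.Thread(target=worker, args=(i, u))
               for i, u in enumerate(urls)]
    for t in threads:
        t.start()   # all requests are now in flight at once
    for t in threads:
        t.join()    # wait for every response
    return results

# bodies = fetch_all(['http://stackoverflow.com'] * 5)
```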
Use the shell.
#!/bin/bash
python chuck.py "request 1" &
python chuck.py "request 2" &
python chuck.py "request 3" &
python chuck.py "request 4" &
python chuck.py "request 5" &
wait  # block until all five background jobs finish
This runs 5 copies of your program in parallel, using as many cores and CPUs as the OS will give it. And, as a bonus, there is no programming with subprocess or threading or anything else.
If all 5 are supposed to do somehow different things, then you'll have to provide some kind of arguments or options. Look into argparse for a way to gather command-line arguments.
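For instance, a minimal argparse setup for the hypothetical chuck.py (the `request` argument name is an assumption; any option scheme works):

```python
import argparse

def parse_args(argv=None):
    # Hypothetical CLI for chuck.py: one positional "request" argument.
    parser = argparse.ArgumentParser(description='Issue one API request.')
    parser.add_argument('request', help='the request string to send')
    return parser.parse_args(argv)

if __name__ == '__main__':
    args = parse_args()
    print('handling %s' % args.request)
```

Each shell line such as `python chuck.py "request 1"` then delivers its own argument to one copy of the program.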