Python Open Multiple URL Connections - urllib2

I'm trying to create a Python script that will let me load up multiple connections, similar to having multiple tabs open in a browser. More explicitly, I have code like this:

Using urllib2, I load up multiple connections to the API, but the calls run in succession; I want to make all 5 calls at the same time instead. I have looked into things like Twisted and tidy, but I don't know how to use them to help me.

Thanks, Solomon


You can subclass urllib2.HTTPHandler to process HTTP responses as they arrive (asynchronous as far as your code is concerned, not when it comes to the actual network operations).

Try this:

import urllib2

class MyHttpHandler(urllib2.HTTPHandler):
    def http_response(self, request, response):
        # Called for every response the opener receives
        for l in response:
            print l
        return response

u = urllib2.build_opener(MyHttpHandler())
for i in range(5):
    # Placeholder URL -- substitute your own API endpoint
    u.open('http://example.com/api?page=%d' % i)
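For actual concurrency, a common approach is a thread pool. A minimal sketch, assuming Python 3 (where urllib2 became urllib.request) and a placeholder endpoint that is not from the original question:

```python
from concurrent.futures import ThreadPoolExecutor
import urllib.request

def fetch(url):
    # Open one URL and return its body; exceptions surface when the
    # pool collects results
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read()

def fetch_all(urls, fetcher=fetch, workers=5):
    # Issue up to `workers` requests at the same time; results come
    # back in the same order as `urls`
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetcher, urls))

# Usage (placeholder URLs):
# bodies = fetch_all(["http://example.com/api?page=%d" % i for i in range(5)])
```

Taking the fetch function as a parameter keeps the concurrency pattern separate from the network call, so you can swap in whatever request logic your API needs.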

Use the shell.

python yourscript.py "request 1" &
python yourscript.py "request 2" &
python yourscript.py "request 3" &
python yourscript.py "request 4" &
python yourscript.py "request 5" &

This will run 5 copies of your program. It will tie up as many cores and CPUs as it can. And -- bonus -- no programming using subprocess or threading or anything.

If all 5 are supposed to do somehow different things, then you'll have to provide some kind of arguments or options. Look into argparse for a way to gather command-line arguments.
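A minimal argparse sketch for that (the script name and the --timeout option are made up for illustration):

```python
import argparse

def parse_args(argv=None):
    # argv=None makes argparse read sys.argv; passing a list is handy
    # for testing
    parser = argparse.ArgumentParser(description="Make one API request")
    parser.add_argument("request", help="which request this copy should make")
    parser.add_argument("--timeout", type=int, default=10,
                        help="socket timeout in seconds")
    return parser.parse_args(argv)

# Each shell-launched copy would then be started as e.g.:
#   python yourscript.py "request 1" --timeout 5 &
```

Each of the 5 background copies gets its own positional argument, so the same program can do 5 different things.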
