jjnguy - 1 year ago
HTTP Question

How do you Programmatically Download a Webpage in Java

I would like to be able to fetch a web page's HTML and save it to a String, so I can do some processing on it. Also, how could I handle various types of compression?

How would I go about doing that using Java?

Answer Source

Here's some tested code using Java's URL class. I'd recommend doing a better job than I do here of handling the exceptions, or passing them up the call stack, though.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.MalformedURLException;
import java.net.URL;

public static void main(String[] args) {
    URL url;
    InputStream is = null;
    BufferedReader br;
    String line;

    try {
        url = new URL("http://stackoverflow.com/");
        is = url.openStream();  // throws an IOException
        br = new BufferedReader(new InputStreamReader(is));

        while ((line = br.readLine()) != null) {
            System.out.println(line);
        }
    } catch (MalformedURLException mue) {
        mue.printStackTrace();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    } finally {
        try {
            if (is != null) is.close();
        } catch (IOException ioe) {
            // nothing to see here
        }
    }
}
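As for the compression part of the question: the code above doesn't handle it. One common approach (a sketch, not the only way) is to wrap the response stream based on the server's `Content-Encoding` header, using the standard `java.util.zip` classes. The class and method names below (`CompressedFetch`, `decode`) are my own for illustration:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.GZIPInputStream;
import java.util.zip.InflaterInputStream;

public class CompressedFetch {

    // Wrap the raw body stream according to the Content-Encoding header value.
    // "gzip" and "deflate" are the encodings you're most likely to see;
    // anything else (including null, meaning no compression) passes through.
    public static InputStream decode(InputStream raw, String encoding) throws IOException {
        if ("gzip".equalsIgnoreCase(encoding)) {
            return new GZIPInputStream(raw);
        } else if ("deflate".equalsIgnoreCase(encoding)) {
            // Per the HTTP spec "deflate" means zlib-wrapped data, which is
            // what InflaterInputStream expects by default. (Some servers send
            // raw deflate instead; handling that needs an Inflater with nowrap.)
            return new InflaterInputStream(raw);
        }
        return raw;
    }
}
```

To use it with a connection, you would ask for compression and then let the header decide the wrapping, roughly:

    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestProperty("Accept-Encoding", "gzip, deflate");
    InputStream body = CompressedFetch.decode(conn.getInputStream(), conn.getContentEncoding());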