piotr · 4 years ago · 79
C++ Question

Is it possible to get the content of an external web page that is produced by PHP and has parameters (after the question mark) in its URL?

I'd like to programmatically save to a file (in C++ or PHP) the source code of a web page as I see it in my browser (I'm using Mozilla under Windows). When the content of the web page physically exists as a file on the server, this is easy to do with e.g. the curl program or PHP's curl library. However, I'm not very familiar with either, and the standard way I use them fails for web pages reached through addresses of the form:


So, I mean web pages produced by some PHP script that takes parameters: change the parameters and you see different content. Is it possible to write code in C++ or PHP (or a command line for the curl program) that copies the source code of such a web page into a file or a variable?

Thanks for any help.

The specific web page I'm testing is

With the curl program I'm just testing the command line
curl -O url_address
which, for the web page above, returns "the error page."
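One likely pitfall here: in most shells, `?` and `&` are special characters, so a URL with query parameters must be quoted, and `-O` derives the output filename from the URL, which breaks for such addresses. A minimal sketch (the URL below is a placeholder, not the actual page being tested):

```shell
# Quote the URL so the shell doesn't interpret '?' and '&';
# use -o (not -O) to name the output file yourself, and -L to
# follow any redirects the server issues.
curl -L -o page.html 'http://example.com/show.php?id=5&lang=en'
```

With single quotes around the URL, the shell passes the whole address, query string included, to curl unchanged.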

Answer Source

If the page doesn't use cookies, you could just do something like this (assuming you have lynx installed):

$webpage = shell_exec("lynx -source 'http://XXXX'");

which fetches a single page and captures its source in the $webpage variable. If the page does use cookies, you can still use lynx, but that's beyond what I can help with (I'm a basic lynx user).

You can also do it with a simple

$webpage = file_get_contents("http://xxxxx"); 

but that gives you no control over things like cookies or request headers.
Edit: file_get_contents doesn't appear to work with the requested URL; lynx does, however.
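When a plain fetch returns an error page while a browser shows the real content, one common cause is that the server inspects the User-Agent header. As a hedged guess (the actual cause for this particular page is unknown), you can make curl present itself as a browser:

```shell
# Send a browser-like User-Agent header; some PHP pages serve an
# error page to clients that don't identify as browsers.
# The URL is a placeholder.
curl -L -A 'Mozilla/5.0 (Windows NT 10.0; rv:109.0) Gecko/20100101 Firefox/115.0' \
     -o page.html 'http://example.com/show.php?id=5&lang=en'
```

If that works, the same idea carries over to PHP's curl extension by setting the equivalent user-agent option before performing the request.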
