Ken,
I tested your script on my server, and it behaved the same way: it worked on the
command line and failed via the web server. All I did was add "/usr" to your path
to perl and it worked (my guess is the web server execs the script through the
shebang line, while running it by hand as "perl script.pl" never consults that path):
#!/usr/bin/perl
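
If perl ever lives somewhere else on a future server, a portable alternative is to
let env find it on the PATH. Just a general note, not part of the fix above:

#!/usr/bin/env perl
# env searches $PATH for perl, so the script isn't tied to one install location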
Roy
On 11/23/09 8:17 AM, "Ken Irwin" <[log in to unmask]> wrote:
> Hi all,
>
> I'm moving to a new web server and struggling to get it configured properly.
> The problem of the moment: getting a Perl CGI script to fetch another web page in
> the background and make decisions based on its content. On the old server I
> used an antique Perl script called "hcat" (from the Pelican
> book <http://oreilly.com/openbook/webclient/ch04.html>); I've also tried curl
> and LWP::Simple.
>
> In all three cases, I get the same behavior: it works just fine on the command
> line, but when called by the web server through a CGI script, the LWP request (or
> other socket connection) returns no results. It sounds like a permissions thing,
> but I don't know what kind of permissions setting to tinker with. In the test
> script below, my command line outputs:
>
> Content-type: text/plain
> Getting URL: http://www.npr.org
> 885 lines
>
> Whereas the web output just says "Getting URL: http://www.npr.org" - and
> doesn't even get to the "Couldn't get" error message.
>
> Any clue how I can make use of a web page's contents from w/in a CGI script?
> (The actual application has to do with exporting data from our catalog, but I
> need to work out the basic mechanism first.)
>
> Here's the script I'm using.
>
> #!/bin/perl
> use LWP::Simple;
> print "Content-type: text/plain\n\n";
> my $url = "http://www.npr.org";
> print "Getting URL: $url\n";
> my $content = get $url;
> die "Couldn't get $url" unless defined $content;
> my @lines = split /\n/, $content;
> print "\n\n", scalar(@lines), " lines\n\n";
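>
> I suppose I could swap LWP::Simple for LWP::UserAgent so a failure at least
> reports an HTTP status line instead of a bare undef; something like this sketch
> (untested, and the 10-second timeout is an arbitrary choice):
>
> use LWP::UserAgent;
> print "Content-type: text/plain\n\n";
> my $ua = LWP::UserAgent->new(timeout => 10);
> my $response = $ua->get("http://www.npr.org");
> if ($response->is_success) {
>     print "Got ", length($response->decoded_content), " bytes\n";
> } else {
>     # status_line gives something like "500 Can't connect ..." on failure
>     print "Request failed: ", $response->status_line, "\n";
> }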
>
> Any ideas?
>
> Thanks
> Ken
>