I'm porting some Amazon Web Services code from their Perl examples to newLISP (of course). To retrieve the info from Amazon, a 206-character URL is generated; below is the string formed from newLISP, with my SubscriptionId X'ed out:
> (length "http://webservices.amazon.com/onca/xml?Service=AWSECommerceService&Operation=ItemLookup&SubscriptionId=XXXXXXXXXXXXXXXXXXXX&ItemId=052164481X&MerchantId=All&ResponseGroup=OfferFull&OfferPage=1&Condition=All")
206
Trying to get the URL gives:
> (get-url "http://webservices.amazon.com/onca/xml?Service=AWSECommerceService&Operation=ItemLookup&SubscriptionId=XXXXXXXXXXXXXXXXXXXX&ItemId=052164481X&MerchantId=All&ResponseGroup=OfferFull&OfferPage=1&Condition=All")
"ERR: bad formed URL"
>
I see in nl-web.c the code:
int parseUrl(char* url, char* host, int* port, char* path)
{
    char* colonPtr;
    char* slashPtr;
    int len;

    /* trim trailing whitespace like '\r\n' from url */
    if((len = strlen(url)) > 127) return(FALSE);
    while(*(url + len) <= ' ' && len > 0)
    {
        *(url + len) = 0;
        len--;
    }
    ...
and
    /* parse URL for parameters */
    if(parseUrl(url, host, &port, path) == FALSE)
        return(stuffString(ERROR_BAD_URL));
    ...
so I suspect the long URL is the problem.
I haven't looked further into what would be required to accept longer URLs, but could this be done easily (in which case I could try coding it)? Or would it need string buffer structures etc., or different system calls?
Regards
Nigel
Yes, there is a limitation of 127 characters per URL+query. I will post a corrected version later today.
Lutz
Thanks Lutz and Happy New Year (to all).
I'll post some code when I've something useful.
It will hopefully be a CGI or Tk/Tcl-interfaced program to maintain a personal library "database", with ISBNs being converted to full entries.
Regards
Nigel