
Depending on what exactly you're trying to do, you could just fetch the JSON data that populates the table:
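As a rough illustration of that approach, here is a minimal PHP sketch; the endpoint URL below is only a placeholder, the real one is whatever shows up in the Network tab (see the comment below):

// Hypothetical JSON endpoint -- replace with the URL found in the Network tab.
$url = 'https://example.com/data/table.json';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of echoing it
$json = curl_exec($ch);
curl_close($ch);

$data = json_decode($json, true); // decode into an associative array
print_r($data);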

@m33ts4k0z: No problem. I found it by using the "Network" tab of Chrome's Developer Tools (The same is possible in other browsers as well). Since document.ready was involved I guessed that the page probably loaded the information externally.

php - Using cURL on a website with javascript - Stack Overflow

javascript php html curl

If you inspect a page in a browser, it is very likely that its DOM has been manipulated by JavaScript, and so it differs from what Mechanize would see. Mechanize does not process JavaScript and can only get the raw initial HTML that the website sends to the user. I recommend using a tool like cURL to get the raw HTML (as Mechanize would see it) and then inspecting that version in a browser to decide what you want to pick out later using Mechanize.

I tried curling your page and you are right that there is no src associated with the img tags. That is probably done to prevent scraping. You can examine the accompanying JavaScript and see if there is any relation you can use to figure out the source URLs from the data you are able to retrieve.

parsing - How to parse javascript generated urls in Ruby? - Stack Over...

ruby parsing mechanize

You can block unspoofed cURL requests in PHP by checking the User-Agent. As far as I know, none of the search engine crawlers have "curl" in their user-agent string, so this shouldn't block them.

if(stripos($_SERVER['HTTP_USER_AGENT'],'curl') !== false) {
    http_response_code(403); //FORBIDDEN
    exit;
}

Note that changing the User Agent string of a cURL request is trivial, so someone could easily bypass this.

javascript - Prevent cURL requests from my website - Stack Overflow

javascript php curl screen-scraping

The problem is that the click is actually performing a postback using JavaScript. With the limitations of PHP and cURL you will need to inspect the HTTP headers (GET, POST and cookies) being sent by the browser and emulate them, bearing in mind that some values might be session dependent. Right now I don't have time to do this for you, but I know it can be quite tricky with ASP.NET websites in some cases. There might be easier ways to do it, but that's what it will always come down to, because that's what actually happens.
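Purely as a hedged sketch of what that emulation can look like in PHP, assuming a typical ASP.NET postback driven by __VIEWSTATE/__EVENTVALIDATION hidden fields (the URL, control name and parsing patterns are placeholders; read the real values from the page and the browser's network log):

// 1. GET the page to obtain the session cookie and the hidden ASP.NET fields.
$ch = curl_init('https://example.com/page.aspx'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEJAR, 'cookies.txt');  // save the session cookie
curl_setopt($ch, CURLOPT_COOKIEFILE, 'cookies.txt'); // and send it back later
$html = curl_exec($ch);

// 2. Pull the session-dependent hidden fields out of the HTML (rough regexes,
//    adjust to the real markup).
preg_match('/id="__VIEWSTATE" value="([^"]*)"/', $html, $vs);
preg_match('/id="__EVENTVALIDATION" value="([^"]*)"/', $html, $ev);

// 3. POST them back, emulating what the javascript __doPostBack() call sends.
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    '__EVENTTARGET'     => 'ctl00$SomeLink',  // hypothetical control name
    '__EVENTARGUMENT'   => '',
    '__VIEWSTATE'       => $vs[1],
    '__EVENTVALIDATION' => $ev[1],
)));
$result = curl_exec($ch);
curl_close($ch);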

If you weren't tied to PHP, a whole world of options opens up - for example, the aggregator in the project I'm working on is actually capable of executing (controlled) JavaScript specifically for these kinds of tasks/pages (albeit on a grander scale).

curl - How to interact with page elements while crawling a website wit...

php curl web-scraping

It depends what the website is. JavaScript and jQuery alone cannot be used because of the browser's same-origin (cross-domain) policy. You could perhaps use a combination of cURL and AJAX to achieve something similar.
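One common way to combine them is a small PHP proxy on your own domain that does the cURL request server-side, so the browser's AJAX call never leaves your origin; a minimal sketch with placeholder URLs:

// proxy.php -- fetch the remote page server-side, so the browser's
// cross-domain restriction never applies to the AJAX call.
$remote = 'https://example.com/login'; // placeholder for the other website

$ch = curl_init($remote);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
echo curl_exec($ch);
curl_close($ch);

// The page's jQuery then only talks to your own domain, e.g.:
//   $.get('proxy.php', function (html) { /* do something with it */ });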

I think you might need to provide a little more information about the site, and exactly why you'd want to do this...

php - Automatically log in on another website at the click of a button...

php javascript jquery curl

Firstly, you have to perform two requests: the first for the initial download and the second for passing the cookie.

Secondly, this cookie is set by JavaScript, and cURL can't handle such cookies. So you have to handle this cookie manually: parse its value from the HTML and pass it to the next cURL request. You can do that by setting the CURLOPT_COOKIE option.
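A rough sketch of that manual handling, assuming purely for illustration that the page sets the cookie with an inline document.cookie = "token=..." assignment (the cookie name and the parsing pattern will differ on the real page):

// First request: fetch the page whose JavaScript sets the cookie.
$ch = curl_init('https://example.com/'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);

// Parse the cookie value out of the inline script (hypothetical pattern --
// look at the real page source to see how the cookie is actually built).
preg_match('/document\.cookie\s*=\s*"token=([^";]+)/', $html, $m);
$jsCookie = 'token=' . $m[1];

// Second request: pass the cookie back manually via CURLOPT_COOKIE.
curl_setopt($ch, CURLOPT_URL, 'https://example.com/protected'); // placeholder
curl_setopt($ch, CURLOPT_COOKIE, $jsCookie);
$page = curl_exec($ch);
curl_close($ch);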

CURLOPT_COOKIE did the trick for me :) Thanks a lot!

Shameless plug: if there are extra HTTP cookies present besides the JavaScript cookie, you may use Cookie Jar Writer (github.com/hindmost/cookiejarwriter) in conjunction with the CURLOPT_COOKIEFILE/CURLOPT_COOKIEJAR options. This PHP class allows you to add custom cookie variables to a cookie jar file within a sequence of cURL requests.

javascript - PHP cURL pass Cookie to website - Stack Overflow

javascript php cookies curl fopen

I don't know what you're trying to do, but let me tell you this: cURL is a tool which can open a website and return the output. It can't do anything else, like executing JavaScript. Also, JavaScript is not allowed to call URLs on a different origin than the website it is currently running on, so you can't make an AJAX request to api.smsgatewayhub.com unless your website is api.smsgatewayhub.com. I don't know what exactly you mean by 'execute a link'.

I did check this doc first, but I'm not sure which code needs to be replaced for our need. Would be grateful if you can guide me on the same (I checked with the customer support team and they declined to respond). Thanks a lot!

Well, I don't really know the script. For a start, maybe you should clean up the sample code a bit (indent it, add more whitespace). Then you may be able to read the code. A PHP developer should be able to read other people's PHP code; if you can't because it has too little whitespace or isn't indented, then do what's necessary to make it more readable. I'd have to do that too.

javascript - how to execute a link in ajax called php script - Stack O...

javascript php ajax curl

You probably want to look at something like PhantomJS for this. PhantomJS is a headless WebKit browser. It has its own API that lets you "script" behavior, so you can tell PhantomJS to load the page and dump out the data you need. For more information check this: Using cURL on a website with javascript

Won't this hit the Same Origin restriction?

Try with the cURL method in PHP.

javascript - How get data from PHP page using jQuery? - Stack Overflow

javascript php jquery ajax json

cURL will only get you the markup of the page. It won't load any additional resources or process the page. You probably want to look at something like PhantomJS for this. PhantomJS is a headless WebKit browser. It has its own API that lets you "script" behavior. So you can tell PhantomJS to load the page and dump out the data you need.

Thanks for your answer. I will need to run this as JavaScript and then save the dump to a PHP variable using the exec command. Is that correct?

You don't have to, actually. You can run it directly from the command line. But if you are using this to display it on a website, then yes you can use exec from PHP.
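For example, assuming a PhantomJS script called dump.js that loads the target URL and prints the rendered HTML to stdout, the PHP side could be as simple as:

// Run PhantomJS and capture whatever the script prints to stdout.
// dump.js is the hypothetical script that loads the URL and writes the
// rendered page (or the extracted data) to standard output.
$url    = 'https://example.com/';
$output = shell_exec('phantomjs dump.js ' . escapeshellarg($url));

echo $output; // or parse/store it from here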

php - Using cURL on a website with javascript - Stack Overflow

javascript php html curl

The videos do not appear because they are loaded with javascript, which is not executed by curl. I don't know about the youtube api, but to get the data from the website you will need to analyze the javascript it uses and replicate the requests it makes.

Is there a way to get the page with the JavaScript executed in cURL, like it's being browsed in a browser?

@r10 You could emulate a browser, but it should be easier to just view source, figure out what javascript is retrieving the videos, and translate that.

I'm not sure, but I think this is the code they call as JavaScript: pastebin.com/57EtLjJy When I request these URLs, it's giving me the JS files. Do you have any sample which shows how I can get them running?

@r10 Java Rhino can execute javascript, but you'll need to simulate a browser to actually see the results.

Getting results from youtube trends hot videos with php curl - Stack O...

php curl youtube-api youtube-data-api

$ch = curl_init();                              // initialise the cURL handle
$url_c = "$ser/ans.do";
$query = "q=$q&id=$id&cid=$cid&oid=$oid";
$url_p = "$ser/Quiz.do?qId=$qid";

curl_setopt($ch, CURLOPT_URL, $url_c);
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Content-Type: application/x-www-form-urlencoded", "Accept: */*"));
curl_setopt($ch, CURLOPT_USERAGENT, $agent);
curl_setopt($ch, CURLOPT_ENCODING, "gzip,deflate");
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_REFERER, $url_p);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $query);

$html = curl_exec($ch);

javascript - Want to POST data to a website using cURL But their are n...

javascript curl

I am not sure about the nature of Roblox, however what you describe here is called web crawling, and there is no single language for it; most of them are suitable. What I would do first is check whether Roblox provides any usable APIs, which exist to help developers such as yourself fetch the data they need in a more user-friendly form such as JSON, which you can easily consume in any language.

In case an API is not available, you can try to fetch the data as plain text with tools such as curl or text-based web browsers, in order to determine whether an HTML parser will suffice or the website requires something more advanced such as a JavaScript interpreter. For the latter there are headless browsers such as PhantomJS (also usable from the command line just like curl, with full JS support). It is preferable to limit yourself to just fetching the page, parsing the HTML and getting the data you need, rather than using a full headless-browser solution such as PhantomJS, as the latter has the potential to slow things down and is generally more complex.

For the sake of simplicity, since you mentioned that your final result is to make a webserver which serves the data, I would go the following way:

The above steps are in case you don't need to go into the complexities of a full-blown browser; but if you do, you should find comfort in PhantomJS and try to use it standalone (JavaScript) to fetch your data, or search for a PHP interface to communicate with PhantomJS. The steps are similar in their approach: fetch the page and parse the HTML in it in order to get the required data.

  • Since you are already on a webserver (LEMP/LAMP) you are in fact already able to present a webpage to your devices online, so simply do step 2, save the result to the database (MySQL) and generate a page matching your need (see the sketch below). Note that PHP runs only when the user loads the page on which it resides, therefore if you need periodic checks, just use cron jobs to schedule tasks at certain times and rerun your PHP scripts.
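As a very rough PHP sketch of that fetch-parse-store flow (the URL, the XPath selector and the table schema are placeholders for whatever the real page and your database look like):

// 1. Fetch the page as plain text with cURL.
$ch = curl_init('https://example.com/rates'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
curl_close($ch);

// 2. Parse the HTML and pull out the value you need.
$dom = new DOMDocument();
@$dom->loadHTML($html);                       // silence warnings from messy markup
$xpath = new DOMXPath($dom);
$node  = $xpath->query('//span[@id="rate"]')->item(0); // hypothetical selector
$rate  = $node ? trim($node->textContent) : null;

// 3. Store it in MySQL so a page or cron job can graph it later.
$pdo = new PDO('mysql:host=localhost;dbname=crawler', 'user', 'pass');
$pdo->prepare('INSERT INTO rates (value, fetched_at) VALUES (?, NOW())')
    ->execute(array($rate));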

Note 1: the steps above are very general, since you did not specify your background in this field. They simply describe how web crawling works in general.

Note 2: If you wish to make your service accessible outside of your network, you should configure your webserver (LEMP/LAMP) to listen on port 80 (usually the default) and then provide your users with your outside IP address. If your IP changes dynamically you can use free services such as NO-IP, or maybe this. There are other, more complex solutions such as renting a domain name.

php - Web Crawling Conversion Rates and Graphing Them - Stack Overflow

php sql web-crawler currency roblox

If you want to replicate the form submit with your curl command, then it won't work for the page you mentioned. This page intercepts the form submit with JavaScript, so the form submission doesn't reach the server at all. Unless you can add POST-request processing to the server-side code, you won't be able to get the expected results from your curl command on this website.

However, if you just need a result similar to what you show on a page, you may curl http://gothere.sg/maps/geo directly instead of surework.com.
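A hedged PHP sketch of calling that endpoint directly; the query parameter below is only a guess for illustration, the real parameters are whatever the page's JavaScript sends (check the request in your browser's developer tools):

// Call the endpoint directly instead of posting the form page.
// 'q' and the postal code are placeholder assumptions for illustration only.
$postal = '123456';
$ch = curl_init('http://gothere.sg/maps/geo?q=' . urlencode($postal));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);
echo $response;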

Is there any other alternative besides using curl? Or can I modify the existing contents of the postalcode.html file to make the curl command work?

It's not the page that is responsible for the curl response, it's the web server. You need to add a piece of code which will respond to your POST request. Another option may be to call http://gothere.sg/maps/geo instead of surework, like it's done in the JS on the page. I've updated the answer with this.

cURL POST & GET command not working - Stack Overflow

curl

Since a cURL request is just an HTTP request, your server can't differentiate it unless you limit access to certain URLs or check the referrer URL and filter out anything not referred locally. An example of how to build such a check can be found here:
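A minimal sketch of such a referrer check in PHP; as the comments below point out, the Referer header is client-supplied and trivially spoofed, so treat this as a speed bump rather than real protection:

// Reject requests whose Referer doesn't point at our own site.
// The header can easily be faked by the client.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

if (stripos($referer, 'example.com') === false) { // placeholder domain
    http_response_code(403); // FORBIDDEN
    exit;
}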

I can use any referer I want when sending a request. It's simply another header

I didn't say it would be impossible to spoof; I said it was a viable option and one of the few, if not the only, ways to filter an incoming HTTP request. Not sure why people downvote something that is valid and helpful advice. Instead of downvoting, why not post a better solution?

Well, the answer is just not correct. The correct answer is: you cannot.

javascript - Prevent cURL requests from my website - Stack Overflow

javascript php curl screen-scraping

Instead of using Live HTTP Headers, why don't you use Chrome Developer Tools? Use this to make the console persistent and get the POST params.

Thanks for the reply... I still didn't get the POST fields. Chrome is just like the Firebug addon, which I also used for this problem. Any other suggestion?

Did you make it persistent using the link I gave? If yes, then it should have logged the POST request in the Network tab and you can easily see the params.

Yes, I did what you told me. I got all the .css and .js files along with the same query string as above. What has to be done next?

When you click submit on the quiz, do you see anything new in the Network tab? That should be the POST request you are looking for.

Yeah, I got a new entry in the Network tab. I copied it as cURL and got this: curl "xyz.com/ans.do?q=Xee&id=3443&cid=1&oid=2" -X POST -H "Origin: xyz.com" -H "Accept-Encoding: gzip,deflate,sdch" -H "Host: www.xyz.com" -H "Accept-Language: en-US,en;q=0.8" -H "User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.12 Safari/537.36" -H "Accept: */*" -H "Referer: xyz.com/Quiz.do?qId=291241" -H "Cookie: JSESSIONID=70C5774AF6D906976BAC954C7619704F; city=-; region=-; id=3add2560-0e07-4c69-b7f9-f829bc47ad21;" which is the same.

javascript - Want to POST data to a website using cURL But their are n...

javascript curl