A GET request is marginally less secure than a POST request. Neither offers true "security" by itself; switching to POST will not magically make your website noticeably more secure against malicious attacks. However, using GET requests can make an otherwise secure application insecure.
The mantra that you "must not use GET requests to make changes" is still very much valid, but it has little to do with malicious behaviour. Login forms are the ones most sensitive to being sent with the wrong request type.
This is the real reason you should use POST requests for changing data. Search spiders will follow every link on your website, but will not submit random forms they find.
Web accelerators are worse than search spiders, because they run on the client's machine and "click" all links in the context of the logged-in user. Thus, an application that uses a GET request to delete things, even one that requires an administrator, will happily obey the orders of the (non-malicious!) web accelerator and delete everything it sees.
The only scenario in which POST is slightly less susceptible is that many websites that aren't under the attacker's control (say, a third-party forum) allow embedding arbitrary images (letting the attacker inject an arbitrary GET request), but prevent all ways of injecting an arbitrary POST request, whether automatic or manual.
One might argue that web accelerators are an example of a confused deputy attack, but that's just a matter of definition. If anything, a malicious attacker has no control over this, so it's hardly an attack, even if the deputy is confused.
Proxy servers are likely to log GET URLs in their entirety, without stripping the query string. POST request parameters are not normally logged. Cookies are unlikely to be logged in either case.
This is a very weak argument in favour of POST. Firstly, unencrypted traffic can be logged in its entirety; a malicious proxy already has everything it needs. Secondly, the request parameters are of limited use to an attacker: what they really need are the cookies, so if the only thing they have is proxy logs, they are unlikely to be able to attack either a GET or a POST URL.
There is one exception for login requests: these tend to contain the user's password. Saving it in the proxy log opens up a vector of attack that is absent in the case of POST. However, login over plain HTTP is inherently insecure anyway.
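To make the logging difference concrete, here is a minimal sketch (the endpoint and credentials are hypothetical) showing that a login over GET puts the password into the very string that proxies and access logs record, whereas with POST the same parameters travel in the request body and a typical log line records only `POST /login`:

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical login endpoint; with GET, the credentials become part of the URL.
params = {"user": "alice", "password": "hunter2"}
get_url = "http://example.com/login?" + urlencode(params)

# A proxy log line typically contains the full request URL:
logged = get_url

# The password is trivially recoverable from the logged URL:
recovered = parse_qs(urlparse(logged).query)["password"][0]

# With POST, `params` would be sent in the request body instead, and the
# logged URL would be just "http://example.com/login".
```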
Caching proxies might retain GET responses, but not POST responses. Having said that, GET responses can be made non-cacheable with less effort than converting the URL to a POST handler.
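As a sketch of that "less effort" claim: the `Cache-Control`, `Pragma`, and `Expires` headers are real HTTP headers, but the dict-based response below is a simplified stand-in for whatever header API your framework exposes.

```python
# Mark a GET response as non-cacheable. "no-store" forbids caches
# (including proxies) from retaining the response; the legacy
# Pragma/Expires pair covers older HTTP/1.0 intermediaries.
def make_uncacheable(headers: dict) -> dict:
    headers = dict(headers)
    headers["Cache-Control"] = "no-store"
    headers["Pragma"] = "no-cache"
    headers["Expires"] = "0"
    return headers

response_headers = make_uncacheable({"Content-Type": "text/html"})
```

Three header lines versus rewriting the handler and every link that points at it.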
If the user were to navigate to a third-party website from the page served in response to a GET request, that third-party website gets to see all the GET request parameters (in the Referer header).
This belongs to the category of "reveals request parameters to a third party", whose severity depends on what is present in those parameters. POST requests are naturally immune to this; however, to exploit the GET request, a hacker would need to insert a link to their own website into the server's response.
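One mitigation worth knowing about: the (real) `Referrer-Policy` response header tells the browser not to send the referring URL, query string included, when the user follows a link to another site. The dict-based response below is a simplified stand-in for your framework's header API.

```python
# Suppress the Referer header so third-party sites never see the
# GET parameters of the page the user came from.
def suppress_referrer(headers: dict) -> dict:
    headers = dict(headers)
    headers["Referrer-Policy"] = "no-referrer"
    return headers

hdrs = suppress_referrer({"Content-Type": "text/html"})
```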
This is very similar to the "proxy logs" argument: GET requests are stored in the browser history along with their parameters. The attacker can easily obtain these if they have physical access to the machine.
The browser will retry a GET request as soon as the user hits "refresh". It might do that when restoring tabs after shutdown. Any action (say, a payment) will thus be repeated without warning.
The browser will not retry a POST request without a warning.
This is a good reason to use only POST requests for changing data, but it has nothing to do with malicious behaviour and, hence, security.
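The standard way to keep a refresh from repeating an action is the Post/Redirect/Get pattern, sketched below. The handler shape and the `processed` set are illustrative, not any particular framework's API; in practice the duplicate check would live in your database.

```python
processed = set()  # idempotency: remember payment tokens we have handled

def handle_payment_post(form):
    token = form["token"]
    if token not in processed:      # ignore duplicate submissions
        processed.add(token)
        # ... charge the card here ...
    # Respond with a redirect (HTTP 303) instead of a page, so that
    # "refresh" re-issues a harmless GET of the confirmation page
    # rather than resubmitting the POST.
    return 303, {"Location": "/payment/confirmation"}

status, headers = handle_payment_post({"token": "abc123"})
handle_payment_post({"token": "abc123"})  # duplicate: no second charge
```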
If your site performs sensitive operations, you really need someone who knows what they're doing, because this can't be covered in a single answer. You need to use HTTPS, HSTS, CSP, mitigate SQL injection, script injection (XSS), CSRF, and a gazillion other things that may be specific to your platform (like the mass assignment vulnerability in various frameworks: ASP.NET MVC, Ruby on Rails, etc.). There is no single thing that will make the difference between "secure" (not exploitable) and "not secure".
Over HTTPS, POST data is encoded, but could URLs be sniffed by a 3rd party?
No, they can't be sniffed. But the URLs will be stored in the browser history.
Would it be fair to say that best practice is to avoid placing sensitive data in POST or GET parameters altogether, and to use server-side code to handle sensitive information instead?
That depends on how sensitive it is, or more specifically, in what way. Obviously the client will see it. Anyone with physical access to the client's computer will see it. The client can spoof it when sending it back to you. If those matter, then yes, keep the sensitive data on the server and don't let it leave.
ahem, CSRF is just as possible with POST.
@Lotus Notes, it is very slightly more difficult, but you don't need any kind of XSS. POST requests are being sent all the time all over the place, and don't forget that a CSRF request can be sourced from any website, XSS not included.
No, you have to get somebody else with privileges to submit it, as opposed to a GET, which will be silently fetched by the browser. Considering that every POST form should be protected with a verifiable source hash, and there's no such mechanism for a GET link, your point is silly.
Well, you could add a hash to all your GET requests exactly the same way you add it to POST forms... But you should still not use GET for anything that modifies data.
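For reference, the "verifiable source hash" discussed above is usually an HMAC-based CSRF token tied to the user's session. A minimal sketch, where `SECRET_KEY` and the session id are placeholders for whatever your framework provides:

```python
import hashlib
import hmac
import secrets

SECRET_KEY = secrets.token_bytes(32)  # server-side secret, never sent to clients

def csrf_token(session_id: str) -> str:
    # Derive a token from the session id; only the server can compute it.
    return hmac.new(SECRET_KEY, session_id.encode(), hashlib.sha256).hexdigest()

def verify_csrf(session_id: str, token: str) -> bool:
    # Constant-time comparison avoids leaking the token via timing.
    return hmac.compare_digest(csrf_token(session_id), token)
```

Embed `csrf_token(session_id)` in every state-changing form (or, as noted, even in GET links), and reject any request whose token fails `verify_csrf()`; an attacker on another site cannot forge a matching token without the secret.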