+deb3day Posted May 2, 2010 Share Posted May 2, 2010 (edited) What is this all about? Is this for real or is this some kind of SPAM? I'm trying to log an event I did yesterday, and when I click on the event I get a page I've never seen before which says:

Why are you here? You've been redirected to this page because the site has noticed some unusual activity. You may start using the site after showing that you are human. Please follow the instructions below. (has one of those things where you type the skewed words and a submit button) If you are human we apologize for the brief intermission. If you are not human, 011100000110110001100101011000010111001101100101001000000111001101110100011011110111000000100000011000010110001001110101011100110110100101101110011001110010000001110100011010000110010100100000011001100110010101100001011101000111010101110010011001010111001100100000011011110110011000100000011101000110100001101001011100110010000001110011011010010111010001100101

Is this for real or some kind of scam? I'm not clicking on any page I don't recognize!!! I am getting this on two different events from yesterday.

addendum: Now I'm getting it on everything I try to log. Is this because of too many people trying to log their 10 year anniversary days this weekend???? Edited May 2, 2010 by deb3day Link to comment
+McBack Posted May 2, 2010 Share Posted May 2, 2010 Strange with the binary text "please stop abusing the features of this site". What's the purpose of it? If something automated is disturbing the page, is it easier to get the software/script, whatever, to stop doing it if the text is coded? Strange. ??? Link to comment
ao318 Posted May 2, 2010 Share Posted May 2, 2010 (edited) This was addressed in another thread a while back. If you are clicking through too many caches too quickly, the site thinks you might be a computer program scraping info from it, so it locks you out for a certain amount of time. If you were doing a lot of things on the site quickly, that may be what the software thought was happening. The only remedy right now is to wait until you can get back in. You might want to try and contact GC.com, but they are out for the weekend. Good luck edited for spelling Edited May 2, 2010 by ao318 Link to comment
+deb3day Posted May 2, 2010 Author Share Posted May 2, 2010 I was logging yesterday's finds. I finally relented and answered the thing and it let me back in to continue logging. Link to comment
+Bear and Ragged Posted May 3, 2010 Share Posted May 3, 2010 It was added to stop people scraping the site. When it was added you were shut out for a length of time. After a few moans on the forum GS added the ability to prove you were human to get back in. Link to comment
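Nobody outside Groundspeak knows the actual trigger, but the general technique described above is a per-user rate limiter: count requests in a sliding time window, and past some threshold stop serving pages until the user solves a CAPTCHA. A minimal sketch in Python, with every number and name invented for illustration:

```python
import time
from collections import deque

class BotCheck:
    """Sliding-window rate limiter: too many hits -> demand a CAPTCHA.

    A sketch only -- the thresholds and behavior of the real site are
    not public, so max_hits and window here are made up.
    """

    def __init__(self, max_hits=30, window=60.0):
        self.max_hits = max_hits   # invented threshold: hits allowed per window
        self.window = window       # window length in seconds
        self.hits = deque()        # timestamps of recent requests
        self.locked = False

    def allow(self, now=None):
        """Record one request; return False if the user should see the CAPTCHA page."""
        now = time.monotonic() if now is None else now
        # Drop hits that have aged out of the window.
        while self.hits and now - self.hits[0] > self.window:
            self.hits.popleft()
        if self.locked:
            return False
        self.hits.append(now)
        if len(self.hits) > self.max_hits:
            self.locked = True     # redirect to the "are you human?" page
            return False
        return True

    def captcha_solved(self):
        """Solving the CAPTCHA clears the lock, matching the behavior in the thread."""
        self.locked = False
        self.hits.clear()
```

The "prove you were human" step added after the forum complaints corresponds to `captcha_solved()`: before that, a tripped check simply meant waiting until the window aged out.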
+Markwell Posted May 3, 2010 Share Posted May 3, 2010 I was logging yesterday's finds. I finally relented and answered the thing and it let me back in to continue logging. You might want to be sure that you don't have any scripts or other items that are running scripts to scrape the site. If you don't have any (typically they are Greasemonkey add-ons or GSAK macros) then it's likely that you might have been opening too many windows at one time. If THAT'S not the case, then maybe there was a glitch in the system. Link to comment
+Chrysalides Posted May 3, 2010 Share Posted May 3, 2010 0111000001101100011001010110000101110011011001010010000001110011011101000110111101110000001000000110000101100010011101010111001101101001011011100110011100100000 01110100011010000110010100100000011001100110010101100001011101000111010101110010 01100101011100110010000001101111011001100010000001110100011010000110100101110011 0010000001110011011010010111010001100101 "please stop abusing the features of this site" I'm not pleased to see GreaseMonkey blamed again. Link to comment
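For anyone wondering how these messages get decoded: the string is plain ASCII, eight bits per character. A quick Python sketch (the sample below is just the opening words of the message quoted above):

```python
def decode_binary(bits: str) -> str:
    """Decode a string of 8-bit ASCII codes into text, ignoring whitespace."""
    bits = "".join(bits.split())  # tolerate line breaks in the pasted binary
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

# First twelve characters of the message from the "are you human?" page.
message = (
    "01110000011011000110010101100001011100110110010100100000"
    "0111001101110100011011110111000000100000"
)
print(decode_binary(message))  # -> "please stop "
```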
+Markwell Posted May 3, 2010 Share Posted May 3, 2010 (edited) I did not say that GreaseMonkey was to blame. Re-read it: You might want to be sure that you don't have any scripts or other items that are running scripts to scrape the site. If you don't have any (typically they are Greasemonkey add-ons or GSAK macros) then it's likely that you might have been opening too many windows at one time. If THAT'S not the case, then maybe there was a glitch in the system. I am definitely not specifically blaming GreaseMonkey - I said "scripts". It can hardly be denied that most of the scripts that run on the site are likely either GSAK macros or GreaseMonkey scripts, and that it is possible that older versions of scripts that hit the site hard are still in use. I would also say that first checking for ANY scripts - either GSAK or GreaseMonkey - is a logical first step in working out why a user may or may not be hitting the server. Right? Edited May 3, 2010 by Markwell Link to comment
+Chrysalides Posted May 3, 2010 Share Posted May 3, 2010 Right? Concede. My knee, it jerks. Something about Jeremy's snarky reply when it first happened still rubs me the wrong way. I'll go chill out. I stipulate that all those who managed to trigger the bot check innocently did so for some other reason than running a GM script. Groundspeak has stated that they are not willing to give details on what triggers the check, for good reasons, and so we're left to speculate. Bookmarking a whole bunch of geocaches, then right click -> Open all in tabs? Using field notes with the log contents already filled in, and clicking Submit the moment the window opens, too many times? Maybe. I don't know. If we can figure out what kind of legitimate human behavior could trigger the check, it can help Groundspeak minimize the false positives. Link to comment
+benh57 Posted May 4, 2010 Share Posted May 4, 2010 This was addressed in another thread awhile back. If you are clicking too many caches too quickly, it thinks you might be a computer program scraping info from the site so it locks you out for a certain amount of time. If you are doing a lot of stuff on one page this is what the software might have thought. The only remedy right now is to wait until you can get back in. You might want to try and contact GC.com but they are out for the weekend. No, the only remedy is to follow the instructions and fill out the CAPTCHA form that the OP talked about... Link to comment
jholly Posted May 4, 2010 Share Posted May 4, 2010 Right? Concede. My knee, it jerks. Something about Jeremy's snarky reply when it first happened still rubs me the wrong way. I'll go chill out. I stipulate that all those who managed to trigger the bot check innocently did so for some other reason than running a GM script. Groundspeak has stated that they are not willing to give details on what triggers the check, for good reasons, and so we're left to speculate. Bookmarking a whole bunch of geocaches, then right click -> Open all in tabs? Using field notes with the log contents already filled in, and clicking Submit the moment the window opens, too many times? Maybe. I don't know. If we can figure out what kind of legitimate human behavior could trigger the check, it can help Groundspeak minimize the false positives. I haven't hit the trip wire with my cachemate logging macro yet, and sometimes I click pretty fast. Link to comment
+Isonzo Karst Posted May 4, 2010 Share Posted May 4, 2010 I managed to trip it today. I was manually opening pages and logging. The captcha worked fine and I continued after only a couple seconds interruption. If it's working to stop the screen scraping, I can handle occasionally having my humanity questioned. Link to comment
+Chrysalides Posted May 4, 2010 Share Posted May 4, 2010 (edited) I managed to trip it today. I was manually opening pages and logging. If you don't mind, some details please: - How many logs? - What exactly is the workflow? Were you picking caches from the Google map view, then opening it to log? Thanks. Every time this thread pops to the top, I'm reminded of this song (Wikipedia link). Edited May 4, 2010 by Chrysalides Link to comment
Jeremy Posted May 4, 2010 Share Posted May 4, 2010 0111000001101100011001010110000101110011011001010010000001110011011101000110111101110000001000000110000101100010011101010111001101101001011011100110011100100000 01110100011010000110010100100000011001100110010101100001011101000111010101110010 01100101011100110010000001101111011001100010000001110100011010000110100101110011 0010000001110011011010010111010001100101 "please stop abusing the features of this site" I'm not pleased to see GreaseMonkey blamed again. Actually it says "if you are a robot" before the binary and it is in binary for robots to read. Learn to love the snark. I'm not putting it in my pocket. Link to comment
+Chrysalides Posted May 4, 2010 Share Posted May 4, 2010 As a developer, I love to see humor in pages like the ones in "are you human?". I thought the one about super powers was hilarious. The decoded binary was a little disappointing though. I think robots might appreciate a little haiku. Like: Hammering the site, Is against the TOU. Please cease and desist. In binary, or hex, of course. Link to comment
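Encoding a haiku the other way around is a one-liner each for binary and hex. A small sketch in Python (the haiku line is just the one proposed above):

```python
def to_binary(text: str) -> str:
    """Encode ASCII text as a string of 8-bit binary codes."""
    return "".join(f"{ord(c):08b}" for c in text)

def to_hex(text: str) -> str:
    """Encode ASCII text as lowercase hex, two digits per character."""
    return text.encode("ascii").hex()

haiku_line = "Hammering the site,"
print(to_binary(haiku_line))
print(to_hex(haiku_line))
```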
+Markwell Posted May 5, 2010 Share Posted May 5, 2010 Here's a much better one: 010101000110100001100101001000000111001101101111011101010110111001100100001000000110111101100110001000000111001101100011011100100110000101110000011010010110111001100111000011010000101001001001011100110010000001101100011010010110101101100101001000000110111001100001011010010110110001110011001000000110111101101110001000000110000100100000011000110110100001100001011011000110101101100010011011110110000101110010011001000000110100001010010010000110111101110000011001010010000001111001011011110111010100100000011000010111001001100101001000000110100001110101011011010110000101101110 Link to comment
+Original A1 Posted May 13, 2010 Share Posted May 13, 2010 Humour a thickie - what on earth is "scraping" other than something I'd do to my arms or legs when out caching?!! Link to comment
+Chrysalides Posted May 13, 2010 Share Posted May 13, 2010 Scraping refers to extracting data automatically from the site and saving it. Because it is automated, it is possible for it to hit many different pages in a short period of time, creating a huge load on the servers. Though it is often referred to as "screen scraping" over here, I believe it is more correctly called "web scraping" (having written actual screen scrapers in serial console days). Link to comment
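To make the term concrete: a scraper is essentially a loop that fetches pages and pulls data out of the HTML. The sketch below uses only Python's standard library, and the URL list and title extraction are purely illustrative, not how any real geocaching tool works. The point is that without the polite delay, a loop like this can request pages far faster than any human clicking links, which is exactly the load the throttling is meant to stop:

```python
import re
import time
import urllib.request

def extract_title(html: str) -> str:
    """Pull the <title> text out of an HTML page (a crude, illustrative parser)."""
    match = re.search(r"<title>(.*?)</title>", html, re.S | re.I)
    return match.group(1).strip() if match else ""

def scrape_titles(urls, delay=2.0):
    """Fetch each page and save its title -- web scraping in miniature."""
    titles = []
    for url in urls:
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", "replace")
        titles.append(extract_title(html))
        time.sleep(delay)  # the polite delay that abusive scrapers skip
    return titles

# Hypothetical page list -- a real scraper would generate thousands of these.
# pages = ["https://example.com/page1", "https://example.com/page2"]
# print(scrape_titles(pages))
```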
jholly Posted May 13, 2010 Share Posted May 13, 2010 (edited) Humour a thickie - what on earth is "scraping" other than something I'd do to my arms or legs when out caching?!! scraping Edit: Darn, Chrysalides won this time Edited May 13, 2010 by jholly Link to comment
knowschad Posted May 13, 2010 Share Posted May 13, 2010 Here's a much better one: 010101000110100001100101001000000111001101101111011101010110111001100100001000000110111101100110001000000111001101100011011100100110000101110000011010010110111001100111000011010000101001001001011100110010000001101100011010010110101101100101001000000110111001100001011010010110110001110011001000000110111101101110001000000110000100100000011000110110100001100001011011000110101101100010011011110110000101110010011001000000110100001010010010000110111101110000011001010010000001111001011011110111010100100000011000010111001001100101001000000110100001110101011011010110000101101110 Yup, that's a good'un! Link to comment
BCProspectors Posted May 13, 2010 Share Posted May 13, 2010 (edited) Something about Jeremy's snarky reply when it first happened still rubs me the wrong way.I remember that. A Geocacher who didn't speak English fluently encountered the screen and posted about it in the forums in broken English, explaining that he was logging the caches "by hand." Jeremy decided "by hand" was a euphemism for an automated script and since then most of the Geocaching community uses the term "manually." Edited May 14, 2010 by BCProspectors Link to comment
+Eartha Posted May 13, 2010 Share Posted May 13, 2010 I got the same message when I was marking a very, very long list of Trackables as "missing". I was opening TB page after TB page after geocoin page, and the site just wanted to make sure I was for real. I waited a while, and went back to it with no problem. Now I'll mark a bunch, then wait, then do another bunch. Link to comment
+gatoller Posted May 13, 2010 Share Posted May 13, 2010 This is very clearly an attempt to stop a specific Android app's activity. The author has been told his app violates the site's terms of service, but he refuses to stop what he's doing. Link to comment
+Lil Devil Posted May 14, 2010 Share Posted May 14, 2010 This is very clearly an attempt to stop a specific Android app's activity. Doubtful. That specific app (which I use) acts more like a browser than a robot. It only reads a single page from geocaching.com upon a user action, much like using a web browser. Click a link or button ==> load a page. Robots, on the other hand, will load many pages automatically. The tightening of the throttling code is more likely an attempt to stop many of the other sites that either re-display geocache info from geocaching.com, or provide statistics about find counts. Link to comment