Page coming up asking if I'm "human"?


deb3day


What is this all about? Is this for real, or is it some kind of spam? I'm trying to log an event I attended yesterday, and when I click on the event I get a page I've never seen before that says:

 

Why are you here?

You've been redirected to this page because the site has noticed some unusual activity.

 

You may start using the site after showing that you are human.

 

Please follow the instructions below.

 

(It has one of those boxes where you type in the skewed words, plus a Submit button.)

 

If you are human we apologize for the brief intermission.

 

If you are not human, 011100000110110001100101011000010111001101100101001000000111001101110100011011110111000000100000011000010110001001110101011100110110100101101110011001110010000001110100011010000110010100100000011001100110010101100001011101000111010101110010011001010111001100100000011011110110011000100000011101000110100001101001011100110010000001110011011010010111010001100101

 

Is this for real or some kind of scam? I'm not clicking on any page I don't recognize!!!

 

I am getting this on two different events from yesterday.

 

addendum:

Now I'm getting it on everything I try to log. Is this because of too many people trying to log their 10-year anniversary days this weekend????

Edited by deb3day
Link to comment

Strange that the binary text decodes to "please stop abusing the features of this site".

What's the purpose of it? If something automated is disturbing the page, is it easier to get the software or script to stop doing it if the text is encoded?

 

Strange. ???

Link to comment

This was addressed in another thread a while back. If you click through too many cache pages too quickly, the site thinks you might be a computer program scraping info, so it locks you out for a certain amount of time. If you were doing a lot of things in quick succession, that may be what the software flagged.

 

The only remedy right now is to wait until you can get back in. You might want to try to contact GC.com, but they are out for the weekend.

 

Good luck

 

edited for spelling

Edited by ao318
Link to comment

I was logging yesterday's finds. I finally relented and answered the thing and it let me back in to continue logging.

You might want to be sure that you don't have any scripts or other add-ons that scrape the site. If you don't have any (typically they are Greasemonkey add-ons or GSAK macros), then it's likely that you were opening too many windows at one time. If THAT'S not the case, then maybe there was a glitch in the system.

Link to comment
01110000011011000110010101100001011100110110010100100000011100110111010001101111
01110000001000000110000101100010011101010111001101101001011011100110011100100000
01110100011010000110010100100000011001100110010101100001011101000111010101110010
01100101011100110010000001101111011001100010000001110100011010000110100101110011
0010000001110011011010010111010001100101

 

"please stop abusing the features of this site"

 

I'm not pleased to see GreaseMonkey blamed again.

Link to comment

I did not say that GreaseMonkey was to blame. Re-read it:

 

You might want to be sure that you don't have any scripts or other add-ons that scrape the site. If you don't have any (typically they are Greasemonkey add-ons or GSAK macros), then it's likely that you were opening too many windows at one time. If THAT'S not the case, then maybe there was a glitch in the system.

 

I am definitely not specifically blaming GreaseMonkey - I said "scripts". It can hardly be denied that most of the scripts that run against the site are likely either GSAK macros or GreaseMonkey scripts, and it is possible that older versions of scripts that hit the site hard are still in use. Checking for ANY scripts - either GSAK or GreaseMonkey - is a logical first step in figuring out why a user may be hitting the server.

 

Right?

Edited by Markwell
Link to comment
Right?

Concede. My knee, it jerks. Something about Jeremy's snarky reply when it first happened still rubs me the wrong way. I'll go chill out.

 

I stipulate that all those who managed to trigger the bot check innocently did so for some other reason than running a GM script. Groundspeak has stated that they are not willing to give details on what triggers the check, for good reasons, and so we're left to speculate. Bookmarking a whole bunch of geocaches, then right click -> Open all in tabs? Using field notes with the log contents already filled in, and clicking Submit the moment the window opens, too many times? Maybe. I don't know. If we can figure out what kind of legitimate human behavior could trigger the check, it can help Groundspeak minimize the false positives.

Link to comment

This was addressed in another thread a while back. If you click through too many cache pages too quickly, the site thinks you might be a computer program scraping info, so it locks you out for a certain amount of time. If you were doing a lot of things in quick succession, that may be what the software flagged.

 

The only remedy right now is to wait until you can get back in. You might want to try to contact GC.com, but they are out for the weekend.

 

No, the only remedy is to follow the instructions and fill out the CAPTCHA form that the OP talked about...

Link to comment
Right?

Concede. My knee, it jerks. Something about Jeremy's snarky reply when it first happened still rubs me the wrong way. I'll go chill out.

 

I stipulate that all those who managed to trigger the bot check innocently did so for some other reason than running a GM script. Groundspeak has stated that they are not willing to give details on what triggers the check, for good reasons, and so we're left to speculate. Bookmarking a whole bunch of geocaches, then right click -> Open all in tabs? Using field notes with the log contents already filled in, and clicking Submit the moment the window opens, too many times? Maybe. I don't know. If we can figure out what kind of legitimate human behavior could trigger the check, it can help Groundspeak minimize the false positives.

 

I haven't hit the trip wire with my cachematelogging macro yet, and sometimes I click pretty fast.

Link to comment

I managed to trip it today. I was manually opening pages and logging.

 

The captcha worked fine and I continued after only a couple seconds interruption.

 

If it's working to stop the screen scraping, I can handle occasionally having my humanity questioned.

Link to comment
01110000011011000110010101100001011100110110010100100000011100110111010001101111
01110000001000000110000101100010011101010111001101101001011011100110011100100000
01110100011010000110010100100000011001100110010101100001011101000111010101110010
01100101011100110010000001101111011001100010000001110100011010000110100101110011
0010000001110011011010010111010001100101

 

"please stop abusing the features of this site"

 

I'm not pleased to see GreaseMonkey blamed again.

 

Actually, it says "If you are not human" before the binary, and it is in binary for robots to read.

 

Learn to love the snark. I'm not putting it in my pocket.

Link to comment

As a developer, I love to see humor in pages like the "are you human?" one. I thought the one about super powers was hilarious. The decoded binary was a little disappointing, though. I think robots might appreciate a little haiku. Like:

 

Hammering the site,

Is against the TOU.

Please cease and desist.

 

In binary, or hex, of course.
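
 

Encoding it is trivial, for what it's worth. A minimal Python sketch, assuming plain 8-bit ASCII like the messages above:

# Encode a haiku as a binary string and as hex.
haiku = "Hammering the site,\nIs against the TOU.\nPlease cease and desist."
as_bits = "".join(format(ord(c), "08b") for c in haiku)  # 8 bits per character
as_hex = haiku.encode("ascii").hex()                     # two hex digits per character
print(as_bits)
print(as_hex)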

Link to comment

Here's a much better one:

 

010101000110100001100101001000000111001101101111011101010110111001100100001000000110111101100110001000000111001101100011011100100110000101110000011010010110111001100111000011010000101001001001011100110010000001101100011010010110101101100101001000000110111001100001011010010110110001110011001000000110111101101110001000000110000100100000011000110110100001100001011011000110101101100010011011110110000101110010011001000000110100001010010010000110111101110000011001010010000001111001011011110111010100100000011000010111001001100101001000000110100001110101011011010110000101101110

Link to comment

Scraping refers to extracting data automatically from the site and saving it. Because it is automated, it is possible for it to hit many different pages in a short period of time, creating a huge load on the servers.

 

Though it is often referred to as "screen scraping" over here, I believe it is more correctly called "web scraping" (having written actual screen scrapers in serial console days).
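
 

For what it's worth, throttles like this usually boil down to counting requests per user over a short window. Here is a hypothetical sketch of a sliding-window check - the limit and window are made-up numbers, not anything Groundspeak has published:

import time
from collections import deque

# Hypothetical sliding-window throttle: flag a user who loads more than
# `limit` pages within `window` seconds. Numbers are illustrative only.
class BotCheck:
    def __init__(self, limit=30, window=60.0):
        self.limit = limit
        self.window = window
        self.hits = {}  # username -> deque of request timestamps

    def is_suspicious(self, user):
        now = time.time()
        q = self.hits.setdefault(user, deque())
        q.append(now)
        while q and now - q[0] > self.window:
            q.popleft()  # discard requests older than the window
        return len(q) > self.limit  # too many recent hits -> show the CAPTCHA page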

Link to comment
Here's a much better one:

 

010101000110100001100101001000000111001101101111011101010110111001100100001000000110111101100110001000000111001101100011011100100110000101110000011010010110111001100111000011010000101001001001011100110010000001101100011010010110101101100101001000000110111001100001011010010110110001110011001000000110111101101110001000000110000100100000011000110110100001100001011011000110101101100010011011110110000101110010011001000000110100001010010010000110111101110000011001010010000001111001011011110111010100100000011000010111001001100101001000000110100001110101011011010110000101101110

 

:blink: Yup, that's a good'un!
Link to comment
Something about Jeremy's snarky reply when it first happened still rubs me the wrong way.
I remember that. A Geocacher who didn't speak English fluently encountered the screen and posted about it in the forums in broken English, explaining that he was logging the caches "by hand." Jeremy decided "by hand" was a euphemism for an automated script, and since then most of the Geocaching community uses the term "manually."

Edited by BCProspectors
Link to comment

I got the same message when I was marking a very, very long list of Trackables as "missing". I was opening TB page after TB page after geocoin page, and the site just wanted to make sure I was for real. I waited a while, and went back to it with no problem. Now I'll mark a bunch, then wait, then do another bunch.

Link to comment
This is very clearly an attempt to stop a specific Android app's activity.

Doubtful. That specific app (which I use) acts more like a browser than a robot. It only reads a single page from geocaching.com upon a user action, much like using a web browser. Click a link or button ==> load a page.

 

Robots, on the other hand, will load many pages automatically.

 

The tightening of the throttling code is more likely an attempt to stop many of the other sites that either re-display geocache info from geocaching.com, or provide statistics about find counts.

Link to comment
This topic is now closed to further replies.