
PQ Search Radius


planetrobert


Would it be possible to have pocket queries do one of the following two things...

 

1) EXCLUDE caches that are included in another PQ of yours, selected by using a dropdown menu or something like that.

 

2) have an inner search radius where you could say, for example... within 500 miles but at least 135 miles away...

Link to comment

I agree. It seems a shame that the server has to do the work for the duplicates without the user being able to extend the radius to get more caches. I maintain my GSAK database covering Washington state, southern BC, and northern Oregon. I have trouble filling the areas in a band outside of Seattle and Portland.

 

It would also be nice to "update this list of waypoints". I don't know how that would be implemented, but as more caches are placed in an area, the radius covered by the 500-cache limit shrinks, leaving caches that were previously in a pocket query to age without updates once they fall outside that limit. I have to be careful when I go caching in the outer areas to check the date a cache was last updated in a pocket query.

 

Also, I had been asking for only 'active' caches. That was a mistake, because then I never got the logs that archived the caches. I corrected my queries for that, but it reduces the radius that I have available. This works as advertised, but I hadn't understood the significance.

 

Actually the pocket queries are a great feature and I appreciate having them.

 

Nudecacher

Link to comment
It would also be nice to "update this list of waypoints".

Try selecting "Updated in the last 7 days"

2 problems...

 

1) You don't know if a cache that you already have has been archived. Updated does NOT mean logged.

 

2) You don't get new log entries, especially DNFs that indicate a problem.

Link to comment
Also, I had been asking for only 'active' caches. That was a mistake, because then I never got the logs that archived the caches. I corrected my queries for that, but it reduces the radius that I have available. This works as advertised, but I hadn't understood the significance.

No, you're still not getting ARCHIVED caches, just the "temp disabled" ones. So if you have a mix of stale and new data, you will still possibly have caches showing in your file that are now archived. Another good reason to use PQs for their intended purpose, planning the day's caching, instead of trying to use them to keep an offline database of all the caches in the state/region/whatever.

Link to comment
Another good reason to use PQs for their intended purpose, planning the day's caching, instead of trying to use them to keep an offline database of all the caches in the state/region/whatever.

What are you talking about?!? Keeping an off-line database is part of the intended use. Here is the quote out of the license agreement that we agree to when we use PQs: "Licensee may modify the Data and merge other data sets with the Data for Licensee's own internal use." What the license prohibits is sharing such a database with others or marketing it in some way. But you are free to do whatever you want beyond that.

Link to comment
Would it be possible to have pocket queries do one of the following two things...

 

1) EXCLUDE caches that are included in another PQ of yours, selected by using a dropdown menu or something like that.

 

2) have an inner search radius where you could say, for example... within 500 miles but at least 135 miles away...

Answering your literal question, I'll offer that you can accomplish the same end result with GPSBabel - the first using the duplicate filter and the second using the radius exclusion filter. But you have to get the data from geocaching.com. That is, I suspect, your unasked question.[1]

 

gpsbabel -i gpx -f bigquery -f myfinds -x duplicate,shortname,all -o gpx -F yettofind

 

(you can, of course, specify multiple files on either side of that...)

 

gpsbabel -i gpx -f bigquery -x radius,lat=36,lon=-87,distance=500 -x radius,lat=36,lon=-87,distance=135,exclude -o gpx -F donut

 

would give you, in the output file donut, those caches within 500 miles but more than 135 miles from the coords given.

 

 

[1] To answer your unasked question, I suggest merging multiple PQs (again, quite easy with GPSBabel) that are based on ranges of difficulty or placement dates to get the area you're interested in.
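
For example (a sketch; the file names are hypothetical), merging three date-banded PQs into one file is just a matter of naming several inputs:

gpsbabel -i gpx -f pq_2002.gpx -f pq_2003.gpx -f pq_2004.gpx -o gpx -F merged.gpx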

Link to comment
  But you have to get the data from geocaching.com.  That is, I suspect, your unasked question.

The asked question clearly refers to PQs before they are sent, not afterwards. I too want more power in specifying PQs.

 

Robert, what would it take for Jeremy to install your great software on the PQ server and to give us this sort of capability within the PQ before they are sent to us. If he whines about command line prompts, then would you be willing to help him get this capability inserted? You might have to build a DLL for him or some such thing.

Link to comment

 

Robert, what would it take for Jeremy to install your great software on the PQ server and to give us this sort of capability within the PQ before they are sent to us. If he whines about command line prompts, then would you be willing to help him get this capability inserted? You might have to build a DLL for him or some such thing.

I've offered a couple of times. The conversation went nowhere. The filters don't use any huge rocket science math - it's all textbook stuff, but admittedly a little heady - so it could certainly be coded up without actually _using_ GPSBabel. (If he were to use GPSBabel anywhere, I think it would be a much more natural fit to offer "download this as Mapsend/Mapsource/S&T/the other zillion formats" than to use it for the queries themselves.)

 

So basically we remain in a "whatever it takes to get the data by the rules" mode and rely on tools to process it on our side. That's not terrible (at least not to those of us willing to use such tools) but I agree it could be made nicer. I also have some understanding of the treadmill the house is on; he could burn a lot of development time and system resources on this kind of thing and still have an endless line of wish lists. I probably have at least as many finds as the average person in this conversation, spread over a large distance, and with even a modicum of planning I have never really found the limit of 5x500 a day to be a problem. (I can think of things I'd much rather the site focus on, like delivering four pages in a row without database errors...) Yes, sometimes it does take a couple of cycles to refine what you really want, but between preview on the new PQ page and good tools locally, it's really not that bad.

 

Even after rereading the original question, I can't see it's a slam dunk that the question is actually "how do I request a pocket query from geocaching.com that contains only..." instead of "I have a PQ and want the following from it..." and I saw no reason to mumble about passive voice instead of offering a solution to both since others were fixating on the latter. I've certainly needed the technique described on a single PQ. (I wanted caches in the area but NOT downtown and I've wanted to prepare PQs for group hunts that contained only those that were new to everyone.) So it really is a valid question either way...and it's achievable without server-side changes.

Link to comment
Answering your literal question, I'll offer that you can accomplish the same end result with GPSBabel - the first using the duplicate filter and the second using the radius exclusion filter. But you have to get the data from geocaching.com. That is, I suspect, your unasked question.

I have GPSBabel, cmconvert, and bmgpx, and I do radius things and such and manipulate the data every which way from Thursday with these; I am sort of a data junkie.

 

The reason for this request was more an efficiency thing. As it is, I border hop between 2 states a lot. If I were just in Oregon it wouldn't be an issue, since using 1 PQ every day I can do a weekly update of all the caches in Oregon based on date placed. Now I also do California caching, and basically, rather than needing 7 PQs for the state, I would like to have, say, 3 PQs where the first is closest to home, the second picks up where that left off, and the third where the second ended; that would pretty much cover me for my travels.

 

I basically want a simpler and easier way to get the data, with less draw on the servers.

 

Also, setting up PQs by dates takes MANY page loads to get right.

Link to comment
...with even a modicum of planning I have never really found the limit of 5x500 a day to be a problem.

I agree when it comes to local stuff.

 

But try taking a trip that takes you through a few metropolitan areas when all you need are the caches within 2 miles of the freeway corridor. It takes a few days of 5x500 just to get a handful. By the end, some of the first set have been disabled, archived, or moved locations, etc.

Link to comment

I agree that takes additional planning and does have some excuses involved.

 

I wonder if daily "uber queries" could be provided (computed once per day or even less often) that contained _all_ the caches in a given state and relied on local s/w to further filter them.

 

Or go the other way and provide unbounded "light PQs" that contained only the barest of data (probably requiring no external JOINs beyond the core database, and thus ripping along at the speed of the SELECT FROM):

<wpt lat="35.9476" lon="-86.877533">
  <time>2004-04-12T00:00:00</time>
  <name>GCJ644</name>
  <desc><![CDATA[WET and SLIMIE]]></desc>
  <url>131012.html</url>
  <groundspeak:cache xmlns:groundspeak="http://www.groundspeak.com/cache/1/0">
    <groundspeak:type>Traditional cache</groundspeak:type>
    <groundspeak:container>Not chosen</groundspeak:container>
    <groundspeak:difficulty>2.5</groundspeak:difficulty>
    <groundspeak:terrain>2</groundspeak:terrain>
  </groundspeak:cache>
</wpt>

that could then be filtered locally, with the resulting list of GC#'s sent BACK to the server to fetch the "full" version of just those caches. That would save the server the effort of calculating caches along a route, extensive queries/joins, etc.
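
As a sketch of the local half of that with an existing tool (the light file and all names here are hypothetical): trim the light file to a radius, then dump the survivors to a flat CSV whose Name column would be the list of GC#'s to send back:

gpsbabel -i gpx -f wa_light.gpx -x radius,lat=47.60,lon=-122.33,distance=50 -o unicsv -F shortlist.csv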

 

We programmer types can speculate on ways to improve things, but it's not like we've ever really had the problems involved explained to us nor have we been asked for help. So we just have to find ways to work with "whatever it takes to get the data by the rules" mode and rely on tools to process it on our side.
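
As a concrete example of that for the freeway-corridor case above: GPSBabel's arc filter trims a file to points within a given distance of a polyline, so given a statewide file and a text file of lat/lon points along the freeway, one pair per line (all file names hypothetical):

gpsbabel -i gpx -f state.gpx -x arc,file=i5_points.txt,distance=2 -o gpx -F corridor.gpx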

 

Besides, when you're travelling those kinds of distances, no matter how fresh your data is, you should expect it to be stale by the time you're there anyway.

Link to comment

Well, I've continued working on this and am changing the way I do the pocket queries. It should give me better results, but unfortunately it won't lighten the server load any or improve anyone else's response.

 

Before, I had been trying to have maximal information on caches in an area in my offline database for spur of the moment opportunities as I traveled without internet access. That was impossible to maintain anyway, as the maximal radius of a query shrinks over time as new caches are added to an area and the fringe caches in the database are no longer updated by a periodic query. This problem was multiplied, as there were several distinct areas where I travel. The problem is much greater when the area of interest is metropolitan with a high cache density and more than 500 caches. I had to work out overlapping circles of varying unknown radii and was never sure of the data for specific areas. Well, I've abandoned that strategy as unreliable.

 

An additional reason that I want to build a large offline database is that I'm planning a 2000 mile trip that will go to a point almost 1000 miles from home as the crow flies, and back. I want to compare routes to cache along by using the ARC filter feature in GSAK and to get a visual picture of cache densities by exporting them to TOPO!GPS USA. I estimate that there are about 9000 caches in the area of interest. Again, I won't have internet access during the trip, so I must capture all the data that I need in advance.

 

I discovered that if I maintain the query definitions offline, re-enter them each time to run once, and then delete them, they run much faster, since they have never run before. That helps me, to the disadvantage of everyone who does periodic queries. I can get 5 full-sized queries a day almost on demand that way.

 

I maintain the 500-cache radius from various points in the region and use the TOPO! densities to estimate areas that haven't been covered. Then I sort the GSAK database on "Last Update" and also look for circles that need refreshing. With this information I can optimize my pocket query choices. I'm just doing the optimization by eye now, but perhaps I could develop a query definition spreadsheet to automate the process. Now the twenty-query-definition limit doesn't apply at all, only the 5 query runs a day.

 

Once I have the offline database and have decided on the routes that are interesting, I export GPX files to load into the GPS during the trip and HTML datasets to run through Plucker for my Palm Pilot. I use the GPS and Palm Pilot while caching, and need a laptop at layovers in the evenings to extract tracks and logs, reload the GPS, and swap Plucker files. Plucker updates are very slow for large numbers of HTML files, but I haven't figured out a more efficient way to format the data for the Palm Pilot. I really like having the log history with the cache descriptions while caching. I don't really need the nearest-caches information, as I use the GPS's nearest-waypoint functionality for that while we are caching.
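
A possible shortcut for the Plucker step (a sketch, with a hypothetical merged file named trip.gpx): GPSBabel's html output renders a whole GPX file as a single HTML page, and its logs option includes the Groundspeak log history, so Plucker would have only one file to convert:

gpsbabel -i gpx -f trip.gpx -o html,logs -F trip.html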

 

While I can get what I need from pocket queries here, it might be possible to enhance the server to pre-calculate such data to serve as static pages, instead of needing dynamic queries. I'll have to think about that a bit.

 

Nudecacher

Link to comment

nudecacher

 

A couple of things that might simplify your procedure.

 

1. Roboform: This allows you to save forms (PQs) and use this data to refill a form at a later time. Saves time and eliminates mistakes.

 

2. CacheMate: Although the layout is a bit different than what you get with Plucker, you can have up to 10 log entries. It is a much quicker way to create and upload a database to your Palm device.

 

Hope this helps

Brian

Link to comment
Well, I've continued working on this and am changing the way I do the pocket queries. It should give me better results, but unfortunately it won't lighten the server load any or improve anyone else's response.

 

... Rest removed for brevity

 

I maintain my own database of every cache in Washington State. I do this with nine pocket queries that do not overlap at all. Instead of using location, I use date hidden as the criterion to pull them down, adjusting each date range until it's just under the maximum.

 

Currently my queries look like this:

 

(500) WA - xx-xx-xx thru 06-30-02

(500) WA - 07-01-02 thru 12-31-02

(500) WA - 01-01-03 thru 04-30-03

(500) WA - 05-01-03 thru 08-15-03

(500) WA - 08-16-03 thru 12-15-03

(500) WA - 12-16-03 thru 03-15-04

(500) WA - 03-16-04 thru 05-15-04

(500) WA - 05-16-04 thru 07-15-04

(500) WA - 07-16-04 thru xx-xx-xx

 

I have these staggered to send me a couple a day.

 

For the older queries, once a week is fine; midrange, twice a week; and the newest one runs every day, since it's the one most likely to change. My data is never more than a week old.

 

When that final query approaches 500 caches, I'll go through the list and adjust the dates again, as the older ones only get smaller when old caches get archived. They never grow.
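
For what it's worth, a set like this is also easy to stitch back together locally with GPSBabel (a sketch; the file names are hypothetical). The duplicate filter drops the extra copy of any cache that lands in two date bands right after a boundary adjustment:

gpsbabel -i gpx -f wa1.gpx -f wa2.gpx -f wa3.gpx -f wa4.gpx -f wa5.gpx -f wa6.gpx -f wa7.gpx -f wa8.gpx -f wa9.gpx -x duplicate,shortname -o gpx -F wa_all.gpx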

Link to comment