
Keeping other databases (GSAK, etc.) updated


W8TTS


Eric from Groundspeak suggested that I post this here as it affects all geocachers.

 

I've noticed that when a cache is "archived", that information isn't passed on in any Pocket Queries (PQs) except for the special "My Finds" PQ. Because it isn't included in PQs, programs that people use to automatically maintain their data, such as GSAK, never receive the archived status, and you aren't aware that the cache has been archived. I'm aware that the information can be had through Notifications, but it should also be available through PQs for automated updates. I also realize that I can query the caches on the geocaching.com web site, but when you're going out for a day of caching and there are hundreds to look at, it's better to have it all correct in front of you in a program like GSAK.

 

I proposed that they allow archived caches to be included in PQs if the user desires, thereby letting programs like GSAK update that information automatically.

 

Here is my original question to them:

 

> It seems very unusual that when a cache is archived it can't be downloaded in

> a Pocket Query. Not being able to do this causes other programs' (GSAK, etc.)

> databases to become inaccurate, i.e. caches still shown as active when they are

> archived. I know that by using "Instanotify" I can be notified when a cache is

> archived, but shouldn't the archiving of caches be automated as other updates

> are?

>

> Please consider adding a way, such as with the "My Finds" Pocket Query, to access

> the caches that have been archived.

 

Comments?

 

Ted (W8TTS)


The key problem is that you would need all archived caches for all time in each PQ because GSAK is an offline database that has no way to know what's what except through PQ's.

 

It can update things off of your all finds PQ since that does include archived caches.

 

It's really not viable without a different way of interacting with this site. What you would need instead is an entirely different offline format that queries this site directly. That can be done, but Groundspeak guards our cache data carefully. It would take long and successful negotiations to open that up to 3rd party software.

 

That said... they could send a special "Archived" PQ matching your PQ search for archive changes since the last PQ. GSAK would never be accurate if you miss one, though.


Personally I have yet to figure out GSAK, but I totally agree. I have noticed that some caches in my GSAK database show active, but I know they are archived.

 

I know there is probably a workaround. I just have to figure out GSAK before figuring out the workaround.

 

Your request IMO is reasonable and makes perfect sense.

 

Why couldn't it somehow send the archived caches since your last PQ? Or maybe set up a special PQ that would pick up these archived caches to update your database.

 

I don't know, but i like your idea. <_<


If for example you run a PQ with "Updated in the past 7 days" checked and you also selected archived, then you would have those archived in the past seven days. It wouldn't help with those archived in the past, but it's a place to start.


Plan B: have GSAK default to working only with current PQs, rather than older PQ info updated with new PQ info...

Of course this is what Groundspeak would prefer. Pocket Queries were not created with the intention that you would maintain an offline database of all the caches in your area of interest. Tools like GSAK came along later, and people began using them to maintain their databases. So of course those people want a way to remove the archived caches from their database (or at least mark them archived). TPTB have repeatedly said that they won't provide this capability, yet they continue to refer people who ask for it to the forums. The only thing the forum regulars can do is describe how they maintain their offline databases.

I basically get a whole bunch of PQs over the course of the week to cover the areas I generally cache in. This downloads every cache in the area (although I suppose I could leave out the caches I've already found and just run the My Finds query every so often to update those). Each week I run a GSAK filter to show the caches in my area that haven't been updated in a week. These are the archived caches. I usually delete them, but if it's a cache I've found or Did Not Find, I'll mark it archived and leave it in my database for statistical reasons.


Been asked for many times. The fact is that TPTB just don't support the creation and use of offline databases. Jeremy himself has stated this. They have very good business reasons for you to only use the freshest data that is available and that is online. Also, they have good reasons for wanting you to visit the site regularly. The intent of a PQ is for you to load caches into your GPSr and go caching - not to maintain a long term database of your own. Sorry.

The key problem is that you would need all archived caches for all time in each PQ because GSAK is an offline database that has no way to know what's what except through PQ's.

I don't think that is necessarily the case at all. For example, I maintain an offline database of the caches in my state which is updated weekly via PQs. So my database is continually refreshed and updated. The proposal would benefit me and I see value in it.

 

Regards

Andrew



It's trivial to provide the information needed to shoot down old data in a PQ. The approach and justification have been presented many, many times in the forums and it's sad that Eric from Groundspeak is sending you here when it's a Groundspeak decision.

 

If you never cache > 500 caches away from a computer, the whole "offline databases are bad" stance holds water. In areas where 500 caches of limited type/found status/terrain/container is walking distance (we'll even say biking distance) the "pull the day's worth of PQs before you hunt" thing falls down badly.

 

The argument - as presented in the forums a gazillion times - typically goes something like:

 

User: I need to know when caches go away.

TheHouse: No you don't. Pull the caches just before you go.

User: My PQ didn't run/wasn't delivered/2500 caches gets me an 11 mile circle/the site was dead/etc.

TheHouse: Providing info on archived caches is bad. People might use it to trample a cache that's no longer there.

User: But if people don't know the cache isn't there, they may trample the area looking for a cache that's not there. I don't want to hunt caches that aren't there. In fact, I very much want to NOT hunt caches that aren't there, but can't do that with the tools provided, as the site won't tell me when a cache went away.

TheHouse: Not a problem if you pull the caches just before you leave.

User: But I'm on a cruise/trip/away from my computer/ > 11 miles from home, etc.

I don't need detailed information on where the cache isn't; I just need to know that "GC1234 was in the PQ and now it's not" so I can scratch it off the list or remove it from the three ring binder. There's even a flag in the PQ to say the cache was archived, so if it told me that GC1234 was archived since the last time I requested it, how would that be bad?
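That "was in the PQ and now it's not" check is just set arithmetic over cache codes. A hypothetical sketch (the cache codes here are made up for illustration):

```python
# Compare the cache codes from two runs of the same PQ: anything present
# last time but missing now was archived (or fell out of the query) and
# can be scratched off the list.

def vanished_caches(previous_run, current_run):
    """Return codes present in the previous run but absent from the current one."""
    return sorted(set(previous_run) - set(current_run))

last_week = ["GC1234", "GC2345", "GC3456"]
this_week = ["GC2345", "GC3456", "GC4567"]

print(vanished_caches(last_week, this_week))  # ['GC1234']
```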

 

 

This conversation has been played out regularly since the advent of PQs. It's kind of like the "we won't tell you what buildings you can't photograph, but we'll hassle you if you photograph them" conversations with the NSA.

The key problem is that you would need all archived caches for all time in each PQ because GSAK is an offline database that has no way to know what's what except through PQ's.

I don't think that is necessarily the case at all. For example, I maintain an offline database of the caches in my state which is updated weekly via PQs. So my database is continually refreshed and updated. The proposal would benefit me and I see value in it.

 

Regards

Andrew

 

Keep in mind that a PQ is not connected to GSAK. There is no way for GC.com to know what's in your GSAK database. To update it, you need info on all relevant caches in your search area. All means all that have ever been. That's because there is no way for GC.com to know what caches GSAK is concerned about. That's why I said what I said.

 

If GSAK could send some info to GC.com that says "I have a list of these caches and need updates on them", then, as RobertLipe says, it's fairly straightforward. But you need that bidirectional communication, and it doesn't exist. Per Jeremy, though, they do have the ability and have used it with limited partners.


Been asked for many times. The fact is that TPTB just don't support the creation and use of offline databases. Jeremy himself has stated this. They have very good business reasons for you to only use the freshest data that is available and that is online. Also, they have good reasons for wanting you to visit the site regularly. The intent of a PQ is for you to load caches into your GPSr and go caching - not to maintain a long term database of your own. Sorry.

 

But if I load bad data into my Palm or GPSr because I can't have correct data available the way I want it, go caching, can't find what I was looking for, and then come home to find that the caches were archived, I'm going to be unhappy, not with the cache hider, but with geocaching.com for not letting me have the information in a usable fashion.

 

Basically I'm the end user. A paying end user. And, my wishes need to be at least addressed.


Been asked for many times. The fact is that TPTB just don't support the creation and use of offline databases. Jeremy himself has stated this. They have very good business reasons for you to only use the freshest data that is available and that is online. Also, they have good reasons for wanting you to visit the site regularly. The intent of a PQ is for you to load caches into your GPSr and go caching - not to maintain a long term database of your own. Sorry.

 

But if I load bad data into my Palm or GPSr because I can't have correct data available the way I want it, go caching, can't find what I was looking for, and then come home to find that the caches were archived, I'm going to be unhappy, not with the cache hider, but with geocaching.com for not letting me have the information in a usable fashion.

 

Basically I'm the end user. A paying end user. And, my wishes need to be at least addressed.

 

I agree with you. It would be a lot better to have direct access to the GC database so you could synchronize directly with the "master" database to create your own smaller version for your area (or where you are going caching).

 

Unfortunately it appears that Groundspeak is more interested in the bottom line than in their product. Too bad they have a monopoly on the "sport" (or at least on the data, which truly belongs to the users).


....

But if I load bad data into my Palm or GPSr because I can't have correct data available the way I want it, go caching, can't find what I was looking for, and then come home to find that the caches were archived, I'm going to be unhappy, not with the cache hider, but with geocaching.com for not letting me have the information in a usable fashion.

 

Basically I'm the end user. A paying end user. And, my wishes need to be at least addressed.

Interestingly - that scenario is exactly why TPTB prefer you to use the online fresh data and not your own database.

 

And they have addressed it - you simply do not like the answer.

 

Would I like full and complete access to the data? Sure absolutely - who wouldn't?

 

 

oh - wait a minute - I do - All I have to do is use the web.......and they generously allow me to download up to 2500 waypoints a day.

 

I believe the current efforts are aimed at making that real-time updated data more easily accessed by a range of mobile devices: cell phones, wi-fi and the like. Meanwhile, I can use PQs to get more data than I could possibly ever use for a day of caching. You can pull data for a targeted area. You can pull data for a specified route. You can change your mind about the area or route up to 5 times per day. You can use the PQs to grab only those with 2/2 ratings. You can limit it to micros. Next week, when you feel like it, you can pull a list of 3/3 traditionals in the same area. I don't really understand the "need" to have every possible cache page description available at your fingertips at all times. Heaven forbid you should miss doing one, or drive by one without stopping. The beauty of it is that they will still be there and available, just listed in the next PQ.

 

My thoughts - my opinion. I respect that you are a customer (as am I), but we all have to respect that a significant portion of their operating budget comes from advertising, and that only works if they can find a way to get you to visit the site regularly. Otherwise those monthly fees might have to be much higher.


 

If GSAK could send some info to GC.com that says "I have a list of these caches and need updates on them", then, as RobertLipe says, it's fairly straightforward. But you need the bidirectional communication.

You don't need bidirectional comms. A PQ knows when it was last run and it knows which caches within its selection query have been archived since it was last run. Those records would return with the archived="True" attribute set. It could return either the entire cache page (remember, you can get it with an AFPQ anyway...) or even a subset of the data. As little as

 

<wpt lat="0" lon="0">
  <name>GC1234</name>
  <Groundspeak:cache id="4321" archived="True" />
</wpt>

 

(there may be some extra syntactic sugar needed to keep the validators and DTDs happy) would be enough to communicate that GC1234 went away since the last time this PQ was run. The 'archived' attribute is already there in the PQ.

 

Web services not required.
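For what it's worth, consuming such a stripped-down record is equally simple on the client side. A sketch in Python; the namespace URI and element names below are placeholders for illustration, not Groundspeak's actual schema:

```python
import xml.etree.ElementTree as ET

# Sketch of how a client could consume a stripped-down archived record.
# The namespace URI here is illustrative, not the real GPX extension schema.
GPX = """<gpx xmlns:gs="http://example.com/cache/1/0">
  <wpt lat="0" lon="0">
    <name>GC1234</name>
    <gs:cache id="4321" archived="True" />
  </wpt>
</gpx>"""

NS = {"gs": "http://example.com/cache/1/0"}

def archived_codes(gpx_text):
    """Collect the <name> of every waypoint flagged archived="True"."""
    root = ET.fromstring(gpx_text)
    codes = []
    for wpt in root.findall("wpt"):
        cache = wpt.find("gs:cache", NS)
        if cache is not None and cache.get("archived") == "True":
            codes.append(wpt.findtext("name"))
    return codes

print(archived_codes(GPX))  # ['GC1234']
```

An offline tool could feed the returned codes straight into its own "mark archived" step.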


Been asked for many times. The fact is that TPTB just don't support the creation and use of offline databases. Jeremy himself has stated this. They have very good business reasons for you to only use the freshest data that is available and that is online. Also, they have good reasons for wanting you to visit the site regularly. The intent of a PQ is for you to load caches into your GPSr and go caching - not to maintain a long term database of your own. Sorry.

 

But if I load bad data into my Palm or GPSr because I can't have correct data available the way I want it, go caching, can't find what I was looking for, and then come home to find that the caches were archived, I'm going to be unhappy, not with the cache hider, but with geocaching.com for not letting me have the information in a usable fashion.

 

Basically I'm the end user. A paying end user. And, my wishes need to be at least addressed.

The only reason you load bad data is because you didn't do one last step, the "Last .gpx update" date filter, before you send the information to your GPSr and your Palm.

 

Easy Peasey!


I don't understand why no one wants archived info available.

 

Before placing a hide I would want to know if a similar one has been there, and if so, what problems or issues might have caused it to be archived.

 

You can do this online with the maps. Ask and someone will tell you how. I have done it before, so I know it is possible. Do you really want to get all 100 archived caches within 20 miles of your home every time you run a PQ? And then have to remember NOT to send them to the GPSr when you download?

 

Not me. I filter my GSAK by date after every upload. Any cache whose date does not match the date of the PQ I just pulled is "ticked" in the little check box and deleted using the Waypoint - Delete Waypoint - All User Ticked options.

 

It takes about a minute and I don't have to worry about having old data with me by accident. If they sent me archived caches, then I would need to run a different filter.
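The date-match purge described above boils down to one comparison per cache. A hypothetical sketch (not GSAK's actual internals; the data is invented):

```python
from datetime import date

# Hypothetical sketch of the purge step: after a fresh PQ loads, any cache
# whose "last GPX update" date doesn't match the PQ's run date was missing
# from that PQ, so it gets ticked for deletion.

def stale_caches(db, pq_run_date):
    """Return codes whose last GPX update predates the PQ that just ran."""
    return sorted(code for code, updated in db.items() if updated != pq_run_date)

db = {
    "GC1111": date(2007, 7, 26),  # refreshed by today's PQ
    "GC2222": date(2007, 7, 19),  # untouched: archived or out of range
}
print(stale_caches(db, date(2007, 7, 26)))  # ['GC2222']
```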

 

A PQ only sends you 500 caches. Asking for archived caches could mean you don't get all the live ones in the area. That would suck. It already is kinda sad that I only get 10 miles worth of caches when I pull some zipcodes. I would be really annoyed to find that 10% of my 500 caches were archived ones. So I personally would never use the archived option if it was available, as I do not need that information. If a cache is now missing from my PQ, I can assume one of two things: 1) it is archived, or 2) it is now outside the 500-cache circle. Either way, I can delete it from GSAK as old data and then go caching knowing I have the most up-to-date info I can get.

-Jen


I don't understand why no one wants archived info available.

 

Before placing a hide I would want to know if a similar one has been there, and if so, what problems or issues might have caused it to be archived.

 

You can do this online with the maps. Ask and someone will tell you how. I have done it before, so I know it is possible. Do you really want to get all 100 archived caches within 20 miles of your home every time you run a PQ? And then have to remember NOT to send them to the GPSr when you download?

Yes you can do this with maps. They are not searchable though.

 

I'm sure something could be worked out to not send an archived cache to the GPS.


My GSAK database covers 40 miles from my house. All I would need is one PQ requesting caches up to 40 miles from my house that had been archived since the last time I ran the archive PQ. You would probably never have very many if you ran it every week or so. I would be happy to get the freshest data off the web site if it would give me complete data. As it stands now, you can only get the last five logs; when the sixth log gives the corrected coordinates, that is a problem. I would like a PQ choice to get fewer caches with more past logs: maybe 5 logs/500 caches, 10 logs/250 caches, 20 logs/125 caches, all logs/50 caches. I could still get more caches than I could find in a week with just one day's worth of Pocket Queries.


Two things:

An archived cache could be output just once unless requested again via PQs.

 

If you go to the filters in GSAK, there is an option for "Last GPX update". I set this to one week prior, as I get all caches in my area each week. This doesn't mark them as archived but filters them out. A reverse of this filter could be made, allowing someone to change those caches to archived.


After reading everything in this thread about this issue, and agreeing with points on both sides of the line, it came to me.

 

Let's add new options to PQs.

  • Under That (and), add "is Archived". If this box is checked, the PQ becomes runnable only once a week (not daily).
  • Allow no more than 3 PQs with the 'archived' box checked per week.

 

Does this sound acceptable as a solution to this issue?

 

The Steaks


After reading everything in this thread about this issue, and agreeing with points on both sides of the line, it came to me.

 

Let's add new options to PQs.

  • Under That (and), add "is Archived". If this box is checked, the PQ becomes runnable only once a week (not daily).
  • Allow no more than 3 PQs with the 'archived' box checked per week.

Does this sound acceptable as a solution to this issue?

 

The Steaks

 

Works for me! Basically because it answers a need.

 

Ted (W8TTS)


The only reason you load bad data is because you didn't do one last step, the "Last .gpx update" date filter, before you send the information to your GPSr and your Palm.

 

Easy Peasey!

 

First, I don't load bad data, but incomplete data. And, that's only because I don't have access to the complete story, i.e. archived caches.

 

Explain "Last .gpx update" date filter thing to me again.

 

Ted


In GSAK, after you have refreshed your database with new PQs, on the Filter Dialogue box, go to the "Dates" tab. Now, go down to the one that says "Last .gpx Update."

 

Set it for "Not during the last '1' day" or the last '2' days, or whatever works for the PQs you just got.

 

All the caches that are returned in that filter will be the ones that did not update. Depending on how "stale" the data is in your database, this could be a short list . . . or a long list. :laughing:

 

You can either delete all the waypoints in that filter, or you can "visit" each of the cache pages to see which have been Archived, and which have been Temporarily Disabled, before you Delete them, or move them to an "Archived" database, if you care to have one. :laughing:

 

At any rate, after you get them out of your database, you will have "fresh" data to send to your GPSr. It was "fresh" when the PQ ran. After that, with each hour, it becomes "less fresh." :laughing:


My solution to this problem is that I use an instanotify for each time a cache is archived within 50 miles of my house. To set it up, go to "set up notifications" on your http://www.geocaching.com/my/ page; then, on the instanotifications page, go to the bottom of your list and select "Create a new notification." From there you can create Archive notices, along with Publish notices so you could have a chance at FTF, but you have to create one for every cache type. I can't find a how-to page on doing this, so post here if you have any questions.

 

When you have instanotify set up, you can check your email box for when caches get archived or published. When you see an archive notification, you can go directly to that cache page and download the GPX of that cache into the pocket query directory that you download PQs to from emails. When you run "get data from email" in GSAK, that information will automatically load into GSAK.

 

I know this is a low-tech way to do this, but until GSAK can read these emails and automatically archive the caches in them, or Groundspeak automatically sends GPX files to your email box for this type of notification so GSAK can get them as attachments, you have to use this instanotify method.
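Until then, the email-scanning step could in principle be scripted. A minimal Python sketch; the subject-line format below is purely an assumption for illustration, not the actual notification format:

```python
import re

# A low-tech sketch of scanning instanotify mail subjects. The subject format
# used here is an assumption for illustration only; adjust the pattern to
# match whatever the real notification emails actually look like.

SUBJECT_RE = re.compile(r"\[GEO\] Notify: (?P<event>\w+) .* \((?P<code>GC\w+)\)")

def parse_notification(subject):
    """Return (event, cache code) from a notification subject, or None."""
    m = SUBJECT_RE.match(subject)
    return (m.group("event"), m.group("code")) if m else None

print(parse_notification("[GEO] Notify: Archive Dick Cheney Cache (GCXCFV)"))
# ('Archive', 'GCXCFV')
```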


I see the last couple of postings do mention other ways of keeping the other database programs up to date. My issue comes when I have over 10,000 caches in my database (all in WA); I can only fully update it every month or so. If each file is limited to 500 waypoints, the easy solution is to run a PQ of the archived caches and let it all update that way. Plus, there are caches that have been archived that had some really good containers and areas.

 

The Steaks


I see the last couple of postings do mention other ways of keeping the other database programs up to date. My issue comes when I have over 10,000 caches in my database (all in WA); I can only fully update it every month or so. If each file is limited to 500 waypoints, the easy solution is to run a PQ of the archived caches and let it all update that way. Plus, there are caches that have been archived that had some really good containers and areas.

As I have said many times before, "Why do you need so much data?" :huh:

 

You can't possibly go hunting for all 10,000 caches. All the data you need is on this website. Why do you personally have to keep all that data? :blink: Can't you just keep your local area, and areas you travel to/through frequently updated?

 

I only update the areas in my GSAK database I think I will be traveling through or to when I am headed that direction. Otherwise I will have stale data. And, no one wants to have stale data, depending on why a cache went from "active" to "Archived" in one step . . . <_<

 

Is there some reason you have to have all the caches in Washington state? What about those just across the border in Oregon and Idaho? Maybe some of those should be included, as well . . . :P


The only reason you load bad data is because you didn't do one last step, the "Last .gpx update" date filter, before you send the information to your GPSr and your Palm.

 

Easy Peasey!

 

First, I don't load bad data, but incomplete data. And, that's only because I don't have access to the complete story, i.e. archived caches.

 

Explain "Last .gpx update" date filter thing to me again.

 

Ted

I suspect that using the Last .gpx update filter in GSAK to remove archived caches is hard to follow because it is counter-intuitive to what you are trying to do by maintaining an offline database. You probably think that by maintaining an offline database you are doing Groundspeak a favor by reducing the load on their servers. You're also probably trying to have pocket queries that only return caches whose state has recently changed or that have been recently logged, thinking that this is less of a load than getting all the caches you are interested in every time.

 

Groundspeak may not see it that way. First of all, they would like you to visit the website to get the latest information and not rely solely on Pocket Queries. Even though the Google ads are suppressed for premium members, this may not always be the case. Groundspeak might like traffic to the site to be as high as possible to show the market the potential of advertising here. While using an offline database doesn't directly take money away, it could indirectly affect Groundspeak revenue.

Second, while there is no objection to using third-party tools and databases for your personal geocaching, Groundspeak is very sensitive to people who might use this to compete with Groundspeak, perhaps by providing a service that Groundspeak itself is considering. An example might be an improved mapping application. Groundspeak allows you to download 2500 caches per day - up to 17,500 a week. This is more than enough for anyone's personal use. If you were able to maintain an offline database by just getting the caches that have changed, people, by their nature, would still set up their PQs to return 500 caches, and the number of caches they maintain in their offline databases would balloon to a significant portion of the Geocaching.com database. While Groundspeak can monitor for people using screen-scraping bots to collect data, someone using legitimate PQ techniques to grab the database is harder to stop. Even if they detected who was abusing PQs and cancelled that account, that person could simply register a new account and continue to abuse the PQs. Sure, Groundspeak can go after anyone who violates the extra agreement you make when you sign up for Pocket Queries; it is just not so convenient.

 

Given this, the "approved" method for keeping an offline database is to set up your PQs to return all the caches in some area of interest. If you are getting more than 500 caches, you should divide the PQs by date placed rather than using a bunch of overlapping circles. If your area takes 10 PQs, for example, you will be able to update the area every other day. You can then use GSAK to find the caches in that area that haven't been updated in the past two days. These would be the archived caches. It's pretty easy, once you accept Groundspeak's reasoning.
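Splitting by date placed works because a placed date never changes, so the buckets stay stable as new caches only ever land at the end of the range. A sketch of how the date ranges fall out (the data is invented for illustration):

```python
from datetime import date, timedelta

# Sketch of the date-placed split: sort caches by placed date, cut the
# sorted list into runs of at most 500, and use each run's first and last
# dates as one PQ's "date placed" range.

def pq_date_ranges(placed_dates, limit=500):
    """Yield (start, end) placed-date ranges covering at most `limit` caches each."""
    dates = sorted(placed_dates)
    for i in range(0, len(dates), limit):
        chunk = dates[i:i + limit]
        yield chunk[0], chunk[-1]

placed = [date(2005, 1, 1) + timedelta(days=i) for i in range(1200)]
print(len(list(pq_date_ranges(placed))))  # 3 PQs cover all 1200 caches
```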


Given this, the "approved" method for keeping an offline database is to set up your PQs to return all the caches in some area of interest. If you are getting more than 500 caches, you should divide the PQs by date placed rather than using a bunch of overlapping circles. If your area takes 10 PQs, for example, you will be able to update the area every other day. You can then use GSAK to find the caches in that area that haven't been updated in the past two days. These would be the archived caches. It's pretty easy, once you accept Groundspeak's reasoning.

 

First, I do use PQs by Date Placed and it works well, except for those that were archived.

 

Here's the fallacy in the "no updates in two days" train of thought. You state: "You can then use GSAK to find the caches in that area that haven't been updated in the past two days. These would be the archived caches. It's pretty easy, once you accept Groundspeak's reasoning." The two-day rule, or actually any number of days since the last update, is not a good way to do this. Why?

 

I looked back at my finds and found a number that had a Last GPX date much earlier than my find. For example, when I found Dick Cheney Cache (GCXCFV), its GPX date was 7/1/2007 and I found it on 7/26/2007. If I'd used your method, I would have assumed the cache was archived and not looked for it. But I looked for it and found it.

 

And Rock Clingers Clinging Here (GCXC51): last GPX date 6/10/2007, found on 7/26/2007.

 

And Lost Dog (GCMWD3): last GPX date 7/1/2007, found on 7/26/2007.

 

And, so on . . .

 

So, if I'd used your two days of no updates to assume that a cache was archived, I wouldn't have looked for these.

 

I'm curious, who "approved" this method?


After reading everything in this thread about this issue, and agreeing with points on both sides of the line, it came to me.

 

Let's add new options to PQs.

  • Under That (and), add "is Archived". If this box is checked, the PQ becomes runnable only once a week (not daily).
  • Allow no more than 3 PQs with the 'archived' box checked per week.

Does this sound acceptable as a solution to this issue?

 

The Steaks

 

Works for me! Basically because it answers a need.

 

Ted (W8TTS)

 

Basically - I would not hold your breath waiting for this feature to come along. It has been suggested for a long time and would clearly be easy to implement. Yet it does not happen.

 

TPTB just do not support the creation and/or use of offline databases. Never have. They want you to revisit the site and download fresh data. If you do that, there should never be a "need" to have archived caches included.


After I find a cache, I mark it as Found in GSAK and Move it to my "Found" database.

 

I don't keep all the caches in the same "Default" database. That would be unwieldy. I also have a separate database for the "Puzzle" caches. I don't want to accidentally load Puzzle cache coordinates into my GPSr. I only move them from the Puzzle database after I have used the "Corrected Coordinates" feature in GSAK.

 

There is a way to do this and it really isn't that hard . . . really. <_<

 

Edit for typo . . .


Basically - I would not hold your breath waiting for this feature to come along. It has been suggested for a long time and would clearly be easy to implement. Yet it does not happen.

 

TPTB just do not support the creation and/or use of offline databases. Never have. They want you to revisit the site and download fresh data. If you do that, there should never be a "need" to have archived caches included.

 

If this is their true policy, it's a very poor one. When we go out for a day of caching, we don't have time to visit online every cache page in the area we're going to attempt in person. Having correct information is VERY important, even in a game like geocaching. It would actually benefit them to provide the most up-to-date information they can, even through PQs.


After I find a cache, I mark it as Found in GSAK and Move it to my "Found" database.

 

I don't keep all the caches in the same "Default" database. That would be unwieldy. I also have a separate database for the "Puzzle" caches. I don't want to accidentally load Puzzle cache coordinates into my GPSr. I only move them from the Puzzle database after I have used the "Corrected Coordinates" feature in GSAK.

 

There is a way to do this and it really isn't that hard . . . really. <_<

 

Edit for typo . . .

 

Good point. But, not the point of this thread. We're trying to solve the archived cache problem. I update GSAK and geocaching.com at the same time using the "Update from CacheMate" macro.

Edited by W8TTS
Link to comment

As I think has been said many times in this thread, TPTB will not give you what you want. After a cache has been Archived, Groundspeak doesn't want to send out data about it, except in the All Finds PQs.

 

The best you can do is to figure out how to use the tool you are already using -- GSAK -- to solve your "stale data" problem. <_<

Link to comment

My GSAK database covers 40 miles from my house. All I would need is to run one PQ requesting archived caches up to 40 miles from my house that had been archived since the last time I ran the archive PQ. You would probably never have very many if you ran it every week or so. I would be happy to get the freshest data off the web site if it would give me complete data. As it stands now you can only get the last five logs. When the sixth log gives the corrected coordinates, it is a problem. I would like to have a PQ choice to get fewer caches with more past logs: maybe 5 logs/500 caches, 10 logs/250 caches, 20 logs/125 caches, all logs/50 caches. I could still get more caches than I could find in a week with just one day's worth of Pocket Queries.

 

Mine covers 50. And, I agree a PQ that you could run once a week would probably be sufficient. If they can do it for the "My Finds" PQ, then it should be a no-brainer to do it for an "Archived" PQ.

Link to comment

As I have said many times before, "Why do you need so much data?" :huh:

 

Basically so that you can manipulate the data the way you want to.

 

Can't you just keep your local area, and areas you travel to/through frequently updated?

 

Sure you can, if you want to limit yourself. There's a couple of us here that will pick up on a day's notice and maybe drive a couple hundred miles away to cache. We run a PQ, update GSAK, upload our GPSr's and Palms, and away we go. But, we want the most accurate data we can get, and digging through the web pages looking at each cache just isn't an option. All we want is accurate data.

 

I only update the areas in my GSAK database I think I will be traveling through or to when I am headed that direction. Otherwise I will have stale data. And, no one wants to have stale data, depending on why a cache went from "active" to "Archived" in one step . . . :blink:

 

And, if a cache is archived and already in your GSAK database you won't know it, and you'll go hunting an "archived" cache. Or, rely on the fallacy of "not updated in the past two days" and miss an active cache.

 

Is there some reason you have to have all the caches in Washington state? What about those just across the border in Oregon and Idaho? Maybe some of those should be included, as well . . . :P

 

Maybe. He may travel all over the state and want to know where the active caches are. :ph34r:<_<

Edited by W8TTS
Link to comment

Is there some reason you have to have all the caches in Washington state? What about those just across the border in Oregon and Idaho? Maybe some of those should be included, as well . . . <_<

 

Maybe. He may travel all over the state and want to know where the active caches are. :blink::huh:

Actually, the WA State PQ does capture some of the caches just across the border, considering the radius of the search pattern.

Link to comment

In GSAK, after you have refreshed your database with new PQs, on the Filter dialog box, go to the "Dates" tab. Now, go down to the one that says "Last .gpx Update."

 

Set it for "Not during the last '1' day" or the last '2' days, or whatever works for the PQs you just got.

 

All the caches that are returned in that filter will be the ones that did not update. Depending on how "stale" the data is in your database, this could be a short list . . . or a long list. :P

 

You can either delete all the waypoints in that filter, or you can "visit" each of the cache pages to see which have been Archived, and which have been Temporarily Disabled, before you Delete them, or move them to an "Archived" database, if you care to have one. :ph34r:

 

At any rate, after you get them out of your database, you will have "fresh" data to send to your GPSr. It was "fresh" when the PQ ran. After that, with each hour, it becomes "less fresh." :huh:
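The "Last .gpx Update" filter described above amounts to a simple date cutoff. Here is a minimal Python sketch of the same idea; the cache codes, dates, and dict layout are invented for illustration and are not GSAK's actual schema:

```python
from datetime import date, timedelta

# Toy stand-in for a GSAK database: each cache code maps to the date of
# the last GPX file that mentioned it. All names and dates are invented.
caches = {
    "GCXCFV": date(2007, 7, 1),
    "GCXC51": date(2007, 6, 10),
    "GCFRESH": date(2007, 7, 26),
}

def stale(db, today, days=2):
    """Return cache codes whose last GPX update is more than `days` days old."""
    cutoff = today - timedelta(days=days)
    return sorted(code for code, updated in db.items() if updated < cutoff)

# On 7/27/2007 a two-day cutoff flags both long-idle caches as candidates
# for deletion, even though (as discussed below) an idle cache may still
# be active if the PQs did not cover every cache in the area.
print(stale(caches, today=date(2007, 7, 27)))
```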

 

Here's the fallacy in this train of thought of not being updated in two days. It's been stated that "You can then use GSAK to find the caches in that area that haven't been updated in the past two days. These would be the archived caches. It's pretty easy, once you accept Groundspeak's reasoning." This two-day rule, or actually any number of days since the last update, is not a good way to do this. Why?

 

I looked back at my finds and found a number that had a Last GPX date that was much earlier than my find. For example, when I found Dick Cheney Cache (GCXCFV) its GPX date was 7/1/2007 and I found it on 7/26/2007. If I used your method I would have thought the cache was archived and not looked for it. But, I looked for it and found it.

 

And, Rock Clingers Clinging Here (GCXC51): last GPX date 6/10/2007; I found it on 7/26/2007.

 

And, Lost Dog (GCMWD3): last GPX date 7/1/2007; I found it on 7/26/2007.

 

And, so on . . .

 

So, if I'd used your two days of no updates to assume that the cache was archived, I wouldn't have looked for these.

 

What you are assuming is that every active cache is updated every day or two, and this just isn't true. They're only updated when someone finds it and logs it, or when there is some maintenance on it. Rock Clingers Clinging Here wasn't updated for a month and a half; it wasn't archived, it was still there and active.

Link to comment

Is there some reason you have to have all the caches in Washington state? What about those just across the border in Oregon and Idaho? Maybe some of those should be included, as well . . . <_<

 

Maybe. He may travel all over the state and want to know where the active caches are. :blink::huh:

Actually, the WA State PQ does capture some of the caches just across the border, considering the radius of the search pattern.

 

Yep. It all depends on how close to the border you are.

Link to comment
And, if a cache is archived and already in your GSAK database you won't know it, and you'll go hunting an "archived" cache. Or, rely on the fallacy of "not updated in the past two days" and miss an active cache.

 

I think you totally misunderstand this.......

 

If you run exactly the same query on day 1 and then again on day 42, any caches that were in GSAK and did not get updated by the exact same PQ can be weeded out as archived. That is, they were there, and because a PQ contains no archived caches, you can conclude they are now archived.
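The same-query-twice reasoning is just a set difference. A short Python sketch (the cache codes are invented):

```python
# Sketch of the "same PQ twice" argument: any cache present in the old
# snapshot but absent from a fresh run of the identical query can be
# presumed archived, because PQs never return archived caches.
day_1_pq = {"GC1AAA", "GC1BBB", "GC1CCC", "GC1DDD"}
day_42_pq = {"GC1AAA", "GC1CCC", "GC1DDD"}

presumed_archived = day_1_pq - day_42_pq
print(presumed_archived)  # GC1BBB was dropped, so it is presumed archived
```

Note that this only holds if both runs really are the identical query over the identical area; a cache that merely fell outside the second query's filters would be flagged just the same.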

Link to comment
Sure you can, if you want to limit yourself. There's a couple of us here that will pick up on a day's notice and maybe drive a couple hundred miles away to cache. We run a PQ, update GSAK, upload our GPSr's and Palms, and away we go. But, we want the most accurate data we can get, and digging through the web pages looking at each cache just isn't an option. All we want is accurate data.

 

I really don't understand this - I do this all the time and have never once had a need for the archive data.

 

I run a PQ, load it into GSAK, upload to PDA and GPSr, and off I go. No chance of stale data, because I clear out any old data as I load the new in.

Link to comment

Two things:

An archived cache could be output just once unless requested again via PQs.

 

If you go to the filters in GSAK there is an option for "Last GPX update". I set this to one week prior, as I get all caches in my area each week. This doesn't mark them as archived but filters them out. A reverse filter could be made of this, allowing someone to change caches to archived.

 

But, what if there was no activity on that cache in the past week? You could lose out on an active cache. I found one that hadn't been updated (Last GPX date) for a month and a half. I feel that the Last GPX Date thing doesn't hold water.

Link to comment

My solution to this problem is to use an instanotify each time a cache is archived within 50 miles of my house. To set it up, go to "set up notifications" on your http://www.geocaching.com/my/ page. Then, on the instanotifications page, go to the bottom of your list and select "Create a new notification." From there you can create Archive notices along with Publish notices (so you could have a chance at FTF), but you have to create one for every cache type. I can't find a how-to page on doing this, so post here if you have any questions.

 

With Instanotify you can check your email box for when caches get archived or published. When you see an archive notification, you can go directly to that cache page and download the GPX of that cache into the same directory that you download PQs to from emails. When you run "Get data from email" in GSAK, that information will automatically load into GSAK.

 

I know this is a low-tech way to do this, but until GSAK can read these emails and automatically archive the caches named in them, or until Groundspeak automatically sends GPXs to your email box for this type of notification so GSAK can get them as attachments, you have to use this instanotify method.

 

Very low tech. <_< And, in this day and age, not the way to do things. We're using satellites and GPSr's to find something, and relying on a notify message to manually keep something up to date. IMHO, not the way to go.

 

If you run a PQ that asks for updates in the last 7 days, then it should include the archived caches. That's an update, isn't it?

Link to comment
And, if a cache is archived and already in your GSAK database you won't know it, and you'll go hunting an "archived" cache. Or, rely on the fallacy of "not updated in the past two days" and miss an active cache.

 

I think you totally misunderstand this.......

 

If you run exactly the same query on day 1 and then again on day 42, any caches that were in GSAK and did not get updated by the exact same PQ can be weeded out as archived. That is, they were there, and because a PQ contains no archived caches, you can conclude they are now archived.

 

Wrong. Your logic is completely wrong. I have found a cache that wasn't updated in a month and a half: 45 days. You would have walked right past it and not found it. A cache is updated when something happens to it: it's found and logged, or some maintenance is performed. If nothing happens, no update. In the winter, some caches might not be found for a month or two or three.

Edited by W8TTS
Link to comment
Sure you can, if you want to limit yourself. There's a couple of us here that will pick up on a day's notice and maybe drive a couple hundred miles away to cache. We run a PQ, update GSAK, upload our GPSr's and Palms, and away we go. But, we want the most accurate data we can get, and digging through the web pages looking at each cache just isn't an option. All we want is accurate data.

 

I really don't understand this - I do this all the time and have never once had a need for the archive data.

 

I run a PQ, load it into GSAK, upload to PDA and GPSr, and off I go. No chance of stale data, because I clear out any old data as I load the new in.

 

There's always a chance of stale data. But, you do it your way, I'll do it mine, and we'll both find caches. I may find some you don't, and you may find some that I don't.

Link to comment

Given this, the "approved" method for keeping an offline database is to set up your PQs to return all the caches in some area of interest. If you are getting more than 500 caches, you should divide the PQs by date placed rather than using a bunch of overlapping circles. If your area takes 10 PQs, for example, you will be able to update the area every other day. You can then use GSAK to find the caches in that area that haven't been updated in the past two days. These would be the archived caches. It's pretty easy, once you accept Groundspeak's reasoning.
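Dividing an area's caches into PQs by date placed can be sketched like this. The 500-cache cap is the PQ limit mentioned above; everything else (the helper name, the list of placement dates) is made up for illustration:

```python
# Toy sketch of splitting an area's caches into PQs by "date placed" so
# that each query returns at most the 500-cache PQ limit.
def split_by_date_placed(placed_dates, cap=500):
    """Group sorted placement dates into (first, last) ranges of <= cap caches."""
    placed = sorted(placed_dates)
    return [(chunk[0], chunk[-1])
            for chunk in (placed[i:i + cap] for i in range(0, len(placed), cap))]
```

Each returned (first, last) pair becomes the "placed between" date range for one PQ, so together the PQs cover every cache in the area exactly once with no overlapping circles.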

 

First, I do use PQ's by Date Placed and it works well, except for those that were archived.

 

Here's the fallacy in this train of thought of not being updated in two days. You state "You can then use GSAK to find the caches in that area that haven't been updated in the past two days. These would be the archived caches. It's pretty easy, once you accept Groundspeak's reasoning." The two-day rule, or actually any number of days since the last update, is not a good way to do this. Why?

 

I looked back at my finds and found a number that had a Last GPX date that was much earlier than my find. For example, when I found Dick Cheney Cache (GCXCFV) its GPX date was 7/1/2007 and I found it on 7/26/2007. If I used your method I would have thought the cache was archived and not looked for it. But, I looked for it and found it.

 

And, Rock Clingers Clinging Here (GCXC51): last GPX date 6/10/2007; I found it on 7/26/2007.

 

And, Lost Dog (GCMWD3): last GPX date 7/1/2007; I found it on 7/26/2007.

 

And, so on . . .

 

So, if I'd used your two days of no updates to assume that the cache was archived, I wouldn't have looked for these.

 

I'm curious, who "approved" this method?

When you get your PQ you get ALL the caches in the area - whether they have changed or not. When you load the GPX file into GSAK, GSAK records the date of the GPX file for each cache in its database - whether or not anything changed. This is the date of the last GPX update. So if you get every cache in a GPX file once every other day, then any cache that has a last GPX update date more than two days old was not in one of the GPX files you loaded in the last two days. Since you have asked for every cache, the only caches that have not been updated are the archived caches.

 

You can run Pocket Queries to look for something other than all the caches. Then you can filter in GSAK for just the caches that match the criteria in your PQs AND have not had a GPX file update in the last N days and remove those archived caches. But you run the risk of looking for some other cache that was archived that you did not ask for in your pocket query.

 

I put the word "approved" in quotation marks because there is of course no officially approved method. This is the method that I approved because it works for me.

 

Just to let you know, I think Jeremy is being stubborn not allowing for a pocket query that returns at least those caches archived in the last 7 days. However, since he doesn't seem to want to give in on this, I thought if I explained what his reasons might be, you could accept the workaround that everyone is giving you.

Link to comment

 

When you get your PQ you get ALL the caches in the area - whether they have changed or not. When you load the GPX file into GSAK, GSAK records the date of the GPX file for each cache in its database - whether or not anything changed. This is the date of the last GPX update. So if you get every cache in a GPX file once every other day, then any cache that has a last GPX update date more than two days old was not in one of the GPX files you loaded in the last two days. Since you have asked for every cache, the only caches that have not been updated are the archived caches.

 

You can run Pocket Queries to look for something other than all the caches. Then you can filter in GSAK for just the caches that match the criteria in your PQs AND have not had a GPX file update in the last N days and remove those archived caches. But you run the risk of looking for some other cache that was archived that you did not ask for in your pocket query.

 

I put the word "approved" in quotation marks because there is of course no officially approved method. This is the method that I approved because it works for me.

 

Just to let you know, I think Jeremy is being stubborn not allowing for a pocket query that returns at least those caches archived in the last 7 days. However, since he doesn't seem to want to give in on this, I thought if I explained what his reasons might be, you could accept the workaround that everyone is giving you.

 

I don't completely reload my GSAK database, I just apply updates. And, I only select "updated in the past 7 days" in my PQ's that run for selected add dates.

 

Like I told someone else, your method works for you and mine works for me. We'll both find caches. All I want is to see those caches that are archived at least once and then maybe on request.

Link to comment
Guest
This topic is now closed to further replies.