
New Member Here! Question About Pq's


mantis7


I had a question / suggestion regarding pocket queries. With many of the newer GPS units being able to store 1000 (and more with SD cards) waypoints, would an increase of PQ size to 1000 be out of the question?

 

I realize we can run 5 searches in a day, but for my needs I would rather run one 2500-cache query per day, or at least have the option of running one 2500 query one day, two 1250 queries the next, five 500-item searches the day after that...and so on.

 

Also, as a general comment on pocket queries, they seem a little hidden in terms of navigation. It is such an important feature that I would put it in a more visible spot. (I just go to /pocket, but it took the forums to show me that.)

 

Anyhow, I only joined a few days ago and have found a couple of caches already...my girlfriend is really enjoying this as well. Awesome job on the web site. Keep up the great work!

 

Mantis7

If you can tell us more about WHY you want so many caches in a PQ, maybe we can help you with a work-around.

 

That is kind of why I posted this in the first place. I think, rather than a "work-around," it should be a feature of the premium membership. But I will do my best to explain. Keep in mind (as Hemlock bluntly pointed out) I am a new member. (I've only owned my GPS for a few days.) But here goes:

 

In the area I generally drive, there are over 5000 caches... My GPS unit holds maps from Seattle, Washington all the way down to Southern California. It can hold 1000 waypoints (and some units are coming out that can hold many more via SD cards).

 

Most of my caches are going to be "caches of opportunity." That is to say, I will be driving around, have an extra hour or so and then think "HMMM let me check my GPS to see what caches are around."

 

I realize that I can do five separate 500-cache PQ's. But this seems awkward to me when I just want the 2500 nearest caches. I only want to update once a week or so, and I don't really want to have to import/download/deal with five separate files when I could just use one.

 

As Hemlock suggested, of course I can't do 500 in a day. He even lays claim to doing 70 in one weekend (48 hours). That is amazing. That is more than one cache an hour with no sleep. Bravo. I will never do that many. So according to his logic, why don't we limit PQ's to 70 per weekend?

 

All I am saying is that rather than five separate pocket queries (per day) of 500 each, I'd rather HAVE THE OPTION to do one big one. If there is a "workaround," that would be fantastic. In asking this question originally I did not intend to be rude or critical. I was (as a brand new member) merely sharing my thoughts in what I assumed was the appropriate forum. Maybe I took it wrong, but I thought Hemlock's response was rude (especially for a "Volunteer Reviewer" who lives in my area).

 

In making my suggestion I was only trying to be helpful and provide my feedback. I apologize to you both if this was an inappropriate topic.

Edited by mantis7

Hi mantis7,

 

Before we start referring to you as "your highness"....

 

- 70 in a weekend is nothing to sneeze at, but check around the forums and see what the current record is.

- Set up a PQ starting in downtown Oakland and find all traditional/multi caches where the terrain level is 2.5 or higher and the cache container size is regular or larger. How big is the resulting circle?

- How many PQ's does it take to get 5000 caches?

- Do you use a PDA to house the web pages? Or will you print out all the pages?

 

Sorry if I sound a bit terse here, I think this thread started on the wrong foot.

 

Try multiple PQ's.

- determine your area of operations

-- find the center and radius of that area.

- Set the queries up by "placed between" dates, using your centerpoint and radius as determined above.

-- the first PQ would start on 1/1/2000; then change the end date and preview the results, adjusting the end date until the query yields just under 500 caches.

-- use the last stop date as the start date on the new PQ, repeat the process.

-- keep going until you get past today's date.

- load it all into GSAK

- Carry around a laptop so you can download the section of town you are in to your GPS. From the sounds of it, you won't even get what you need in a 1000 waypoint GPS.
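The date-split recipe above can be sketched as a few lines of Python. This is only an illustration under assumed data: it pretends you already have a list of placed dates for the caches in your circle, whereas on the site itself you get the same effect by previewing each query and nudging the end date:

```python
from datetime import date, timedelta

def split_by_placed_date(placed_dates, max_per_pq=500):
    """Group sorted 'placed' dates into consecutive ranges of <= max_per_pq caches."""
    placed_dates = sorted(placed_dates)
    ranges = []
    for i in range(0, len(placed_dates), max_per_pq):
        chunk = placed_dates[i:i + max_per_pq]
        # Each (start, end) pair becomes one PQ's "placed between" filter.
        ranges.append((chunk[0], chunk[-1]))
    return ranges

# Hypothetical area: 1200 caches, one placed per day since 1/1/2000.
dates = [date(2000, 1, 1) + timedelta(days=i) for i in range(1200)]
for start, end in split_by_placed_date(dates):
    print(start, "->", end)  # three ranges: 500 + 500 + 200 caches
```

One wrinkle worth noting: if two consecutive ranges were to share a boundary date, the two PQs would overlap on caches placed that day, so in practice you would start the next range on the following day.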

Before we start referring to you as "your highness"....

 

I did not set this as my title, I swear! I don't know who did but it was obviously someone "higher" up.... LOL (I'm serious)

 

70 in a weekend is nothing to sneeze at, but check around the forums and see what the current record is.

 

I never said it was something to sneeze at. Read my post. I said I would "never" do that many. It is over one per hour with no sleep, so I agree with this point.

 

- How many PQ's does it take to get 5000 caches?

 

Well, not counting duplicates it takes 10, if my math is correct. But counting duplicates, I don't know the answer.

 

 

Do you use a PDA to house the web pages? Or will you print out all the pages?

 

I never mentioned either in my previous post, and it isn't all that relevant to my original post....See further down for my explanation.

 

Sorry if I sound a bit terse here, I think this thread started on the wrong foot.

 

You don't sound terse to me! I am glad you responded...because the more I think about it, the better my idea seems, and there haven't been any reasons in this thread (except maybe AZ Cardinal) to make me think otherwise. See further down for my brief explanation.

 

Try multiple PQ's.

 

This is the reason I made this thread, to avoid having multiple PQ's and .gpx files...See further down....

 

- determine your area of operations

-- find the center and radius of that area.

- Set the queries up by "placed between" dates, using your centerpoint and radius as determined above.

-- the first PQ would start on 1/1/2000; then change the end date and preview the results, adjusting the end date until the query yields just under 500 caches.

-- use the last stop date as the start date on the new PQ, repeat the process.

-- keep going until you get past today's date.

- load it all into GSAK

 

This is the type of workaround that my suggestion would eliminate. Please read further down, we're almost there. Your above example would give you 5 different PQ's and 5 different GPX files (if you did this 5 times). My idea would save time and be much easier.

 

Carry around a laptop so you can download the section of town you are in to your GPS. From the sounds of it, you won't even get what you need in a 1000 waypoint GPS.

 

I plan on getting a laptop. But again, this doesn't address the issue at hand, which is PQ's. Besides, it is an expensive option that not everyone can afford. My idea would work for everyone, even those who can't afford a laptop.

 

Okay, I tried to respond to all of your thoughts. Please read the following idea and answer the question at the end. I will try to be more clear, so here goes:

 

*****

 

As it is now, we get five 500 result PQ's per day, totalling 2500 results per day.

 

My idea is very simple. I am not even suggesting raising the number of PQ's (5) or total daily results (2500).

 

What I am suggesting is simply allowing users to determine how they would like their 2500 results to be divided up. Here are several examples:

 

a) User A would like to keep using the current system (5 PQ's totalling 2500 results)

b) User B would like 3 PQ's (two of 1000 and one of 500) (2500 total)

c) User C would like 1 PQ of 2500 (2500 total)

d) User D would like 2 PQ's of 1250 (2500 total)

 

and so on...

 

As long as you do not exceed a) 5 PQ's in one day or b) 2500 results in one day.
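The rule behind these examples reduces to a two-part check. A sketch, where the 5-PQ and 2500-result ceilings are the limits stated in this post:

```python
# Proposed flexible allotment: any split of the daily quota is allowed
# as long as it stays within both limits quoted in this thread.
MAX_PQS_PER_DAY = 5
MAX_RESULTS_PER_DAY = 2500

def allocation_is_valid(pq_sizes):
    """Check a proposed day's PQ split against both daily limits."""
    return (len(pq_sizes) <= MAX_PQS_PER_DAY
            and sum(pq_sizes) <= MAX_RESULTS_PER_DAY)

print(allocation_is_valid([500] * 5))          # user A: current system -> True
print(allocation_is_valid([1000, 1000, 500]))  # user B -> True
print(allocation_is_valid([2500]))             # user C -> True
print(allocation_is_valid([1250, 1250]))       # user D -> True
print(allocation_is_valid([500] * 6))          # six PQs -> False
```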

 

Real world example:

 

So, with this example...the user who wanted the 2500 closest traditional caches that they haven't found yet, that are still active and not archived or temporarily disabled, could do so in ONE STEP. They would use their entire PQ allotment for that day, but they would have one GPX file to download and deal with and be assured of getting no duplicates. This would be one easy step.

 

Now if you have followed the thread thus far, please answer this simple question:

 

Can you please describe to me the current process (having to do 5 separate PQ's of 500 max results each and be assured of no duplicates) whereby the same user would get the 2500 closest caches from Oakland with the same criteria as the "real world example"?

 

It seems to me it would be 5 complicated steps that produce 5 GPX files, which would then need to be imported 5 separate times into GSAK...and it seems that my suggestion would eliminate several steps for many users, both at the PQ stage and at the GSAK stage.

 

If not please tell me what I am doing wrong.

 

Thanks for the help!
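As a side note on the five-file workflow being debated here: the download step aside, the five GPX files could at least be merged into one before import. A minimal sketch, assuming plain GPX 1.0 waypoint files (real PQ files carry Groundspeak extensions that this ignores), deduplicating on the GC code in each waypoint's name element:

```python
import xml.etree.ElementTree as ET

NS = "http://www.topografix.com/GPX/1/0"

def merge_gpx(paths, out_path):
    """Combine waypoints from several GPX files, skipping duplicate names."""
    ET.register_namespace("", NS)
    merged = ET.Element(f"{{{NS}}}gpx", version="1.0")
    seen = set()
    for path in paths:
        for wpt in ET.parse(path).getroot().iter(f"{{{NS}}}wpt"):
            name = wpt.findtext(f"{{{NS}}}name")  # the GC code
            if name not in seen:
                seen.add(name)
                merged.append(wpt)
    ET.ElementTree(merged).write(out_path, xml_declaration=True, encoding="utf-8")
```

Something like `merge_gpx(["pq1.gpx", "pq2.gpx"], "all.gpx")` would then leave a single file for one GSAK import (the file names here are hypothetical).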

Edited by mantis7

My thought is that a PQ with 2500 caches in it would be almost 25Meg if not zipped. You know that there would be folks who don't zip their files and then complain because it takes so long to download or they don't receive the file because the service provider limits the size of attachments.

My thought is that a PQ with 2500 caches in it would be almost 25Meg if not zipped. You know that there would be folks who don't zip their files and then complain because it takes so long to download or they don't receive the file because the service provider limits the size of attachments.

 

There is a very simple solution to this. In fact it's not really a solution, but rather a system which is already (for the most part) in place.

 

Let's assume your math is correct, and that a 2500-result PQ NOT ZIPPED is indeed 25 megabytes. I have never downloaded one that isn't zipped, so I do not know how big they are...but I am assuming that your number is valid.

 

A 500 result PQ would then be about 5 megabytes, right? (Not zipped).

 

Well, a 2500 result PQ *ZIPPED* would be less than 4 MB according to my math.

 

(751 KB per 500-result PQ zipped, times 5)

 

So as you can see, a 2500-result PQ would actually take up LESS space (zipped) than a 500-result non-zipped one.

 

This makes the solution obvious: If this idea were to be implemented, a .gpx file with more than 500 results would have to be sent zipped.

 

I don't think this would be much of a problem though...as the default now is to send the files zipped, and I would imagine most users prefer their files sent in zip format.

 

BUT if anyone should want PQ's sent in NON-ZIP format, fine. Just as long as the GPX file doesn't have more than 500 results (in other words, the same as it is now).
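For what it's worth, the arithmetic above holds together under its own assumptions (25 MB unzipped per 2500 caches, and the 751 KB zipped figure for one 500-cache file):

```python
# Numbers are the ones quoted in this thread, not measured values.
unzipped_500_mb = 25 / 5    # claimed 25 MB per 2500 caches -> 5 MB per 500
zipped_500_kb = 751         # observed zipped size of one 500-cache PQ

zipped_2500_mb = 5 * zipped_500_kb / 1024
print(round(zipped_2500_mb, 2))          # 3.67 -- under the "less than 4 MB" estimate
print(zipped_2500_mb < unzipped_500_mb)  # True: a zipped 2500 beats one unzipped 500
```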

 

Would that work for you?

Edited by mantis7

Well, let's start with the fact that I agree with you. I would love to have the ability to generate all that I needed in one neatly packaged PQ.

 

For casual caching, I do not need that. Emphasis on the word "I". When I first started caching, my "not found" radius in Las Vegas was 35 miles. That pales in comparison to your scenario. Last I checked, the radius in San Jose was 10.2 miles for terrain level 3 and below. I'm not sure they have any terrain level 3 or higher by desert mountain standards, but that's getting off topic. Anyway... I needed a 70-mile circle to get what I needed. I had to run multiple PQ's. That was before I learned of the date-placed method. So... after I found some caches in the hills around Las Vegas and picked at the urban micros, my not-found circle of 500 is over 100 miles. It will take you a couple years to get there, I do realize that.

 

If your area of operations is kept small, then the work is easier. If you try to extend it way out (just in case you go there), then the work gets exponentially harder. Experience will teach you that. If you end up in an area you didn't expect and you miss a few caches, then so what. Experience teaches you that too.

 

So, addressing the workaround: in order to get the PQs you want, you will either need to run multiple days' worth - five today, five tomorrow, etc. Using the date-placed method, you will get few, if any, duplicates. You load them into GSAK in bulk, and you will have them all in one nice little database. Having that database on a laptop will allow you to download them to your GPS and have the offline capability of reading the page for each cache. We won't talk about WAP and such quite yet.

 

Or, you can plan better. Set up some PQ's for areas you might go to and don't set a day for them to run. That way, if you find out that you will be in Concord tomorrow, you can turn on the Concord and Vallejo PQs today and have that info in a few minutes (usually). You might even get some caches in Benicia if you cross that bridge.

 

I know it's sometimes frustrating to know you just passed a cache, but hey, it's part of the learning process. We can't get them all in one day. I'm sure the Bay Area has more caches planted in a week than any person can keep up with.

 

As a heads up...it's also frustrating when you go into an area and get all the caches, say Foster City, then get home and find that someone planted a new cache there and it was approved before you got home.

 

And for clarification...

The "nothing to sneeze at" comment was an effort not to de-emphasize Hemlock's efforts. My best weekend was 87 caches. Some were terrain level 4's. I got 2 hours of sleep in two very intense days. It also took a week to get them all logged in.

 

The current record for 24 hrs is 246. More than one per 6 minutes. Yes, it can be done. My personal speed record is 10 in 45 minutes, but that was more coincidence than planning. There are other threads about lame micros.

 

So, why do I still agree with your suggestion? I like road trips. Because of the increased cache density, it takes more PQs to make the same trip. What would help me even more is linear PQs. Those would be a PQ for a line; in your case, it might be as simple as Vallejo to Sacramento.

 

BTW: Welcome to the addiction!

Would that work for you?

Slow down young grasshopper

 

Another issue that TPTB must consider is bandwidth. Sending 2000 caches a day to a person who will not be using them is overhead. Increased overhead = increased fees to the customer. Yes, they do that now in separate PQs. But... most of us don't do it, and it isn't wise from the bigger perspective.

 

Give it some time, you will learn that there is more to this than meets the eye.


Thanks for the reply , MooseMob!

 

I also want to make something clear. I _LOVE_ PQ's. What I mean is, I never intended to badmouth them or say they weren't useful, or that I "needed" a PQ of more than 500 results or anything like that. Which is sort of why I was surprised by the first reply, but anyway...I digress...

 

I read your response and I will definitely start planning better.

 

The current record for 24 hrs is 246. More than one per 6 minutes. Yes, it can be done. My personal speed record is 10 in 45 minutes, but that was more coincidence than planning. There are other threads about lame micros.

 

Dang! That's crazy LMAO. I'm not in this to set any records...the only reason I posted this suggestion is because I saw the web page / suggestion forums, and I had just found out about PQ's and started playing with them.

 

So, addressing the workaround: in order to get the PQs you want, you will either need to run multiple days' worth - five today, five tomorrow, etc. Using the date-placed method, you will get few, if any, duplicates. You load them into GSAK in bulk, and you will have them all in one nice little database. Having that database on a laptop will allow you to download them to your GPS and have the offline capability of reading the page for each cache. We won't talk about WAP and such quite yet.

 

Okay! This is what I will do for now (until they implement MY idea muahahahahaha). But seriously, thanks for your very helpful replies. I definitely want to get a laptop. I am going to Europe for a month (in May) and plan on finding at least one cache in every country I visit, so having a laptop will make things much easier. I am a computer junkie, but haven't had the need for a laptop...UNTIL NOW!

 

Another issue that TPTB must consider is bandwidth. Sending 2000 caches a day to a person who will not be using them is overhead. Increased overhead = increased fees to the customer. Yes, they do that now in separate PQs. But... most of us don't do it, and it isn't wise from the bigger perspective.

 

Well, I disagree with you on this one. In fact you seem to contradict yourself here, because just one quote ago you told me to do five 500-result searches a day... But here is why I disagree with you: anyone who wants the 2500 results will get them, one way or the other. And what good are they unless they're updated? So you will have to update them, which means doing LOTS of searches and LOTS of importing with GSAK. I would think it would be much easier to do ONE block of 2500 at a go.

 

Anyway thanks for actually reading / responding to my idea. And thanks for the hearty welcome.

 

Mantis7

Edited by mantis7
Thanks again for all the advice and welcoming!

As the posts cross paths on the internet...

 

Great... many of us are still lobbying for larger PQs for certain situations. For routine use, 500 works well for most of us most of the time.

 

I am jealous about the Europe thing. I retired from the Reserves 3 months before the first geocache was ever placed, and now those opportunities are very limited.

 

Don't forget the main reason people like to geocache...

 

We geocache because it's fun ... let's keep it that way.

 

(I think I may have just found a new sig line.. kewl!)

I am jealous about the Europe thing. I retired from the Reserves 3 months prior to the first geocaching was ever placed, and now those opportunities are very limited.

 

I know, I feel really lucky. I mean, it is almost May and I just lucked out and found out about geocaching JUST in time for my trip...I think it will add so much depth and fun to my trip...I can't wait.

 

We geocache because it's fun ... let's keep it that way.

 

True of every sport indeed, and true of life itself. Why bother if it isn't fun?

 

OH BTW - If you want to send a TB off to Europe, you could try sending one up toward Reno / Lake Tahoe / Northern California...I could try and grab it and take it with me...

 

Mantis7


Cool! I have a TB, Farrah, that has been in the South Bay for a month. Someone has it right now, and I think the tag may have gotten separated from the item.

 

Then I was also thinking of welding a TB tag to a 2-foot piece of railroad track and starting that out. How far do you think a 40-pound TB would travel?

 

Edit: correct word.

Edited by Moose Mob

One of the other projects that Jeremy is working on (although I haven't heard when, if ever, it will happen) is caches within a proximity of a given route. There are several work-arounds for that as well, but they involve multiple pocket queries. This sticky thread details the progress being made, and this page deals with one of the current solutions. Again - this solution is post-pocket query.

 

Something else to consider is what type of caches you like to hunt. The suggestions on date ranges make it next to impossible to get overlap in your PQs. But you can also break the PQs down by difficulty and terrain, or by size, or by type.

 

My own personal example: if I'm doing a road trip, I've probably got my kids with me, and I'll be on a short time frame. I'll be looking for caches with a terrain lower than 3.5, and I'll be grabbing single stage traditional ones. My kids won't tolerate a three mile walk, and my SWMBO won't allow me the time to do a long multi-stage.

 

Since you're thinking about spur-of-the-moment caching, you might consider limiting your criteria a little further to what you like to find spur-of-the-moment.

Before we start referring to you as "your highness"....

I did not set this as my title, I swear! I don't know who did but it was obviously someone "higher" up.... LOL (I'm serious)

That's funny. Wish I could find out the details behind how/why that happened.

 

But you can change it if you have not grown fond of it by now.

 

Recent Very Good Explanation of How To Change Avatar Text (Forum Title)

Another issue that TPTB must consider is bandwidth. Sending 2000 caches a day to a person who will not be using them is overhead. Increased overhead = increased fees to the customer. Yes, they do that now in separate PQs. But... most of us don't do it, and it isn't wise from the bigger perspective.

 

Well, I disagree with you on this one. In fact you seem to contradict yourself here, because just one quote ago you told me to do five 500-result searches a day... But here is why I disagree with you: anyone who wants the 2500 results will get them, one way or the other, ...

See, the thing is, right now not everyone creates PQs that give them 2500 caches. I suspect that most people are like me. They have their regular PQ run one day per week to give them the closest 500.

 

If there was an option to get 2500 caches in that PQ, I'd probably take it. I wouldn't need it, and it wouldn't even be extremely useful to me. In fact, it would cause me more work because I couldn't just dump it into my 3+. I would still take the larger PQ, and I bet that many others would also. Therefore, bandwidth would be needlessly increased.

Edited by sbell111
See, the thing is, right now not everyone creates PQs that give them 2500 caches. I suspect that most people are like me. They have their regular PQ run one day per week to give them the closest 500.

 

I can't pretend to know what other people do, BUT I can tell you what other people have told ME to do. I have been told to set up PQ's that specify date ranges, and to do several of them to make sure I get no duplicates. Multiple people have told me this is how they do it, and that this is how I should do it. Now, once I get all of this data, it is useless unless it is updated, so basically the only way I know of to keep it updated is to keep running all of these PQ's over and over again...

 

Not sure where you live but take a look at this:

 

in my zip code (95409) there are over 5000 caches.

 

In quite a few zip codes nearby me there are well over 5000 caches (San Francisco, Oakland, Sacramento, Sonoma, Vacaville, etc..)

 

But I think this is the most telling statement of yours:

 

If there was an option to get 2500 caches in that PQ, I'd probably take it.

 

Of course you would, especially if you had more than 500 caches nearby. That is why I think this idea is a pretty good one, because it seems like something that people would use.

 

I wouldn't need it and it wouldn't even be extremely useful to me. In fact, it would cause me more work because I couldn't just dump it into my 3+. I would still take the larger PQ and I bet that many others would also. Therefore, bandwidth would be needlessly increased.

 

I have never said anyone "needs" this feature. In fact we don't "need" PQ's at all; we could enter caches manually. I don't see how it would cause more work...because it would lead to fewer GPX files to download / import...

 

As far as the bandwidth, I do agree with you on that. It would increase traffic for sure. BUT I don't have enough information to say whether it would be significant or cause a problem at present. One thing I can be sure of, though, is that bandwidth is increasing in availability. DSL speeds and server capabilities are only going up, so at some point I think the increased bandwidth will be a drop in the proverbial bucket.

 

Cheers!

in my zip code (95409) there are over 5000 caches.

 

In quite a few zip codes nearby me there are well over 5000 caches (San Francisco, Oakland, Sacramento, Sonoma, Vacaville, etc..)

That's not even close to true. The standard search based on a zip code goes out 100 miles, or 31,000 square miles. The actual zip codes themselves have maybe 100 or so caches.

 

Even a delivery driver who doesn't know where his day's route will take him still won't cover 31,000 square miles.

 

I work in the southern part of the San Francisco Bay Area. I don't know where a typical day will take me, but I can logically narrow it down to the area of 2 PQs of about 400 caches each. On the few days I know I will travel outside that area, it only takes a few minutes to run another PQ and a few more minutes to Spin and Pluck it.

 

Besides that, I'm not a FTF hound, so I don't care if my PQs are 2 or 3 days old.

That's not even close to true. The standard search based on a zip code goes out 100 miles, or 31,000 square miles. The actual zip codes themselves have maybe 100 or so caches.

 

You're right. What I meant to say was "when you do a search on my zip code, you get over 5000 results."

 

(The default is a 100-mile radius...)

 

I did not mean to say there were 5000 caches within the exact zipcode of 95409, so I apologize if that caused any confusion.

 

Let me rephrase:

 

Within 100 miles of EACH of the following cities there are well over 5000 caches:

 

Santa Rosa, Sonoma, San Francisco, Vacaville, Sacramento, Oakland...etc....

 

*Some of these caches will of course be duplicates, but the point I was trying to make is that it is a VERY dense area for caches.

 

Edit: I think I had Sonoma in the list like 3 times LOL

Edited by mantis7
That's not even close to true. The standard search based on a zip code goes out 100 miles, or 31,000 square miles.

 

Oh, and since we're nitpicking...you might want to check your math.

 

(31,415.9...) is a closer number, but it's not exact. LOL. (Just messin' with ya')
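For the record, the figure being nitpicked is just the area of a circle, pi times r squared, for a 100-mile radius:

```python
import math

radius_miles = 100
area_sq_miles = math.pi * radius_miles ** 2
print(round(area_sq_miles, 1))  # 31415.9 -- "31,000 square miles" rounded down
```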

Edited by mantis7

I thought 500 was overly generous.......

 

I would like a daily download of all caches within 100 miles of me too, but the numbers stagger my imagination... why would I need huge amounts of email capacity - new GPS, memory cards for the PDA... oh wait - that's right - I forgot - I have less than 120 in that range...

I thought 500 was overly generous.......

 

I would like a daily download of all caches within 100 miles of me too, but the numbers stagger my imagination... why would I need huge amounts of email capacity - new GPS, memory cards for the PDA... oh wait - that's right - I forgot - I have less than 120 in that range...

The reason I like to have a large number (like, for example, 2500) is to be able to download them into MapSend and browse around, looking for groupings of caches in an interesting location. When I find an interesting-looking location, I set my center in GSAK, filter a small subset of caches in that area, and download them onto my GPS and Palm.

 

--Marky

This topic is now closed to further replies.