Re: Report Site Issues Here
Posted: 17 June 17 12:03 pm
A wall of text.
There are some 3,627 geocaches in NSW that are not archived. If I set up a query to grab them all, I have no issues at all with either the GPX or the ZIP. The GPX file is just under 30MB; the ZIP file is just under 5MB.
There are some 1,930 geocaches in Tasmania that are not archived. Again, if I set up a query to grab them all, I have no issues with either format. The GPX file is just under 27MB; the ZIP file is just over 5MB.
The difference is logs: the Tasmanian geocaches are found much more frequently.
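The size gap between the two formats comes down to how well verbose XML compresses. As a rough sketch (the file names here are placeholders, not actual GCA downloads), you can reproduce the GPX-to-ZIP shrinkage yourself in Python:

```python
import zipfile

# Compress a GPX file (verbose, highly repetitive XML) into a ZIP
# archive. The path arguments are placeholders for whatever your
# My Query download produced.
def gpx_to_zip(gpx_path: str, zip_path: str) -> None:
    with zipfile.ZipFile(zip_path, "w",
                         compression=zipfile.ZIP_DEFLATED) as zf:
        zf.write(gpx_path, arcname=gpx_path)
```

Because GPX repeats the same tag names and log text structures thousands of times, deflate routinely achieves the roughly 5:1 ratios quoted above.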
The download of the GPX and ZIP files from the state lists was part of the original implementation, when we had very few geocaches per state and did not yet have the My Query function. While the data was small and the ability to filter geocaches was unavailable, this was a good approach.
As the number of geocaches grows, the "just give me everything" approach has caused a problem with the size of the data being returned from those lists as GPX files, etc. We are now requiring you to get the data via a My Query. If we continue to maintain both output lists we will run the machine out of RAM (memory). If someone would like to donate between $8K and $10K we could buy an additional database server. We would also like an additional $1,500 per year for the hosting and bandwidth costs. Pause for a donation to appear... ... ... No. Didn't think so.
This is how simple it is:
1. Go to http://geocaching.com.au/my/query/new and click Create New Query
2. Set Number of items to return to 10000
3. Give it a meaningful name
4. Tick Is Available, and tick Is Unavailable (if you want)
5. For State, select (AU) Tasmania
6. For Country, select Australia
7. Click the "new" button at the bottom of the page
You're done setting the query up. Now you can use the:
- Count link to see how many geocaches there are
- GPX link to get the GPX file
- ZIP link to get the ZIP file
So, having taken roughly 10 seconds to set up the query, you can now run it any time you like to get the whole state of Tasmania.
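Once the GPX file is downloaded, you can sanity-check it against the Count link. A minimal sketch in Python (the file name is a placeholder, and GCA's actual GPX namespace may differ, so the count is done namespace-agnostically):

```python
import xml.etree.ElementTree as ET

# Count <wpt> waypoint elements in a GPX file, one per geocache,
# regardless of which XML namespace the file declares.
def count_waypoints(gpx_path: str) -> int:
    tree = ET.parse(gpx_path)
    return sum(1 for el in tree.getroot().iter()
               if el.tag.rsplit("}", 1)[-1] == "wpt")
```

If the number this returns matches the Count link, your download is complete.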
I don't recommend you do this every day, as that will exhaust our bandwidth, and then no-one gets anything until the new month starts.
The Chicken Little sky is falling scenario is tired and worn out.
The accusations of denying Tasmanians access to the entire state's data are unneeded passive-aggressive behaviour and are tired and worn out.
The "why don't you do everything I want" is tired and worn out.
The volunteers at Geocaching Australia put in tens of thousands of dollars of free time over hundreds of hours per year (I average between 750 and 1,000 hours of support at this site per year; at my usual rate of $100 per hour, this is between $75K and $100K of free service the site gets). We will provide free and open access to the data as per the mantra of Geocaching Australia.
Free: You will need an account to create a My Query, but there is no cost.
Open: Access to the data is (until we run out of memory) unrestricted via a My Query and the new GCA API.
Most of the GCA API calls are restricted to groups of 500, some with pagination and some without.
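For the calls that do support pagination, the client side is an ordinary loop in groups of 500. A sketch of that pattern (fetch_page here is a stand-in for whichever API call your app makes; the parameter names offset and count are assumptions, not GCA's documented names):

```python
PAGE_SIZE = 500  # the per-call group size mentioned above

# Collect results from a paginated API that returns at most
# PAGE_SIZE items per call. fetch_page is a stand-in for the
# real API call your app would make.
def fetch_all(fetch_page, limit=10_000):
    results = []
    offset = 0
    while offset < limit:
        page = fetch_page(offset=offset, count=PAGE_SIZE)
        results.extend(page)
        if len(page) < PAGE_SIZE:
            break  # a short page means we reached the end
        offset += PAGE_SIZE
    return results
```

For the calls without pagination, only the first 500 results are available and there is nothing a client loop can do about it.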
If you're in the field and using an app that uses the API, I doubt you would need more than 500 geocaches from where you are right then and there; but if you use the nearest search, you can have tens of thousands returned to you.
If you are using a My Query via an app, you are restricted to 500 geocaches via the JSON call. Because JSON output is not compressed (it's plain text), 500 is a limit we set to minimise bandwidth while still maximising the number of nearby geocaches you might like to find. You cannot do more than 500 in a day, so there is no intent to give you 10,000 geocaches via the My Query API JSON response.
If you are not using an app and the API, the My Query should return what you need so you can store the results offline (such as in GSAK). For example, I use a macro in GSAK which gets my list of queries and returns the results for the query I select. At the moment it returns all 671 geocaches in Victoria that I have not found, as a GPX file. This is not restricted, but it would be preferable to get the ZIP file to minimise bandwidth.
No one can have the whole database at once, on demand. I have stated before and I will state again: if you are collecting stats, we can work with anyone to provide the data as needed without sending gigabytes of geocaches and logs at once.
In very simple terms, you can get what you want and what you need; you just cannot have custom code that runs to the detriment of the rest of the users and the site. There are many ways to get the data. If you tell us what you need, we can try to accommodate that. Demanding change without consideration of the server, the users and the site will be summarily dismissed.