Report Site Issues Here

Discussion about the Geocaching Australia web site
caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Re: Report Site Issues Here

Post by caughtatwork » 17 June 17 12:03 pm

A wall of text.

There are some 3,627 geocaches in NSW that are not archived.
If I set up a query to grab them all I have no issues at all. Both GPX and ZIP.
The GPX file is just under 30MB. The ZIP file is just under 5MB.

There are some 1,930 geocaches in Tasmania that are not archived.
If I set up a query to grab them all I have no issues at all. Both GPX and ZIP.
The GPX file is just under 27MB. The ZIP file is just over 5MB.

The difference is logs. The Tasmanian geocaches are found much more frequently.

The download of the GPX and ZIP files from the state lists was an original implementation from when we had very few geocaches per state and, at that time, no My Query function. While the data was small and there was no way to filter geocaches, that was a good approach.

As the number of geocaches grows, the "just give me everything" approach has caused a problem with the size of the data being returned from those lists for the GPX files, etc. We are now forcing you to get the data via a My Query. If we continue to maintain both output lists we run the machine out of RAM (memory). If someone would like to donate between $8K and $10K we could buy an additional database server. We would also like an additional $1,500 per year for the hosting and bandwidth costs. Pause for a donation to appear... ... ... No. Didn't think so.

This is how simple it is:
Click Create New Query (or go directly to http://geocaching.com.au/my/query/new)
Set Number of items to return to 10000
Give it a meaningful name
Tick Is Available, tick Is Unavailable (if you want)
State select (AU) Tasmania
Country select Australia
Click the "new" button at the bottom of the page

You're done setting up the query.
Now you can use the:
Count link to see how many geocaches there are
GPX link to get the GPX file
ZIP link to get the ZIP file

So, having taken roughly 10 seconds to set up a query, you can now run it any time you like to get the whole state of Tasmania.
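
If you want to automate that occasional refresh from a script rather than the browser, the same GPX and ZIP links can be fetched with curl. A minimal sketch, where the query URL placeholder and the exported cookies.txt session file are assumptions about your own setup rather than documented site behaviour:

# Sketch only: fetch the ZIP output of a saved My Query from the command line.
# Copy the real ZIP link from your My Query page into QUERY_URL and export a
# logged-in browser session into cookies.txt first.
QUERY_URL="http://geocaching.com.au/my/query/..."
curl -s --cookie cookies.txt -o tasmania.zip "$QUERY_URL"
unzip -o tasmania.zip    # the GPX inside loads into GSAK or a GPS as usual

Grabbing the ZIP rather than the raw GPX keeps the bandwidth down for everyone.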

I don't recommend you do this every day, as that will cause us to exhaust our bandwidth and then no-one gets anything until the new month starts.

The Chicken Little sky is falling scenario is tired and worn out.
The accusations of denying Tasmanians access to the entire state data are unneeded passive-aggressive behaviour and are tired and worn out.
The "why don't you do everything I want" is tired and worn out.

The volunteers at Geocaching Australia put in tens of thousands of dollars' worth of free time over hundreds of hours per year (I average between 750 and 1,000 hours of support at this site per year; at my usual rate of $100 per hour that is between $75K and $100K of free service the site gets). We will provide free and open access to the data as per the mantra of Geocaching Australia.

Free: You will need an account to create a My Query, but there is no cost.
Open: Access to the data is (until we run out of memory) unrestricted via a My Query and the new GCA API.

Most of the GCA API calls are restricted to groups of 500, some with pagination and some without.

If you're in the field and using an app that uses the API, I doubt you would need more than 500 geocaches from where you are right then and there, but if you use the nearest search, you can have tens of thousands returned to you.
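
For app developers, retrieving a larger result set from one of the paginated calls means walking it 500 at a time. A rough sketch of that loop; the endpoint path, the skip parameter and the .geocaches key are made up for illustration, so check the GCA API documentation for the real names:

# Illustrative only: page through an API call 500 geocaches at a time.
# The URL, "skip" parameter and ".geocaches" key are hypothetical.
skip=0
while true; do
    curl -s "http://geocaching.com.au/api/...?skip=$skip" -o page.json
    [ "$(jq '.geocaches | length' page.json)" -eq 0 ] && break   # no more pages
    cat page.json >> all_pages.json
    skip=$((skip + 500))
done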

If you are using a My Query via the app you are restricted to 500 geocaches via the JSON call. Because JSON is not compressed output (it's plain text), 500 is a limit we set to minimise the bandwidth but maximise the number of geocaches you might like to find that are near you. You cannot do more than 500 in a day, so there is no intent to give you 10,000 geocaches via the My Query API JSON response.

If you are not using an app and the API, the My Query should return what you need so you can store the geocaches offline (such as in GSAK). For example, I use a macro in GSAK which gets my list of queries and returns the results for the query I select. At the moment it returns all 671 geocaches in Victoria that I have not found. That comes in as a GPX file. This is not restricted, but it would be preferable to get the ZIP file to minimise bandwidth.

No one can have the whole database at once, on demand. I have stated before and I will state again: if you are collecting stats, we can work with anyone to provide the data as needed without sending GBs of geocaches and logs to them at once.

In very simple terms, you can get what you want and what you need; you just cannot have custom code that runs to the detriment of the rest of the users and the site. There are many ways to get the data. If you tell us what you need we can try to accommodate that. Demanding change without consideration of the server, the users and the site will be summarily dismissed.

caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Re: Report Site Issues Here

Post by caughtatwork » 17 June 17 12:11 pm

MavEtJu wrote:
CraigRat wrote:We pay for x GB of bandwidth.

We have to pay if that is exceeded.
Have you considered turning on gzip compression at the HTTP level? It reduces the GPX/JSON data by a factor of four.

[~/dev/RubiksCubeGuide] edwin@MavvieMac>ls -lh
-rw-r--r-- 1 edwin staff 1.4M 17 Jun 10:17 test.gpx
-rw-r--r-- 1 edwin staff 689K 17 Jun 10:21 test.json
[~/dev/RubiksCubeGuide] edwin@MavvieMac>gzip test.*
[~/dev/RubiksCubeGuide] edwin@MavvieMac>ls -lh
-rw-r--r-- 1 edwin staff 332K 17 Jun 10:17 test.gpx.gz
-rw-r--r-- 1 edwin staff 171K 17 Jun 10:21 test.json.gz

Edwin
We do not use the GZIP compression built in to the server when we deliver files. It is all handled in the application.

MavEtJu
Posts: 486
Joined: 07 January 15 9:15 pm
Twitter: @mavetju
Location: Caringbah

Re: Report Site Issues Here

Post by MavEtJu » 17 June 17 12:46 pm

caughtatwork wrote:
We do not use the GZIP compression built in to the server when we deliver files. It is all handled in the application.
Give it a try; it's called mod_deflate on Apache 2.x. It will perform the compression at the HTTP level instead of at the application level. See http://www.simonwhatley.co.uk/how-to-co ... components for the way I configured it in the past.
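
On a Debian-style Apache 2.x install it only takes a few commands. A sketch of how I would enable it; the config file name and the exact MIME type list are my assumptions, so adjust them to the site's setup:

# Enable mod_deflate and compress the text-based responses
# (GPX is XML, plus JSON and HTML). Paths assume a Debian layout.
sudo a2enmod deflate
echo 'AddOutputFilterByType DEFLATE text/html text/xml application/xml application/json' | sudo tee /etc/apache2/conf-available/compress-text.conf
sudo a2enconf compress-text
sudo service apache2 reload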

Edwin

caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Re: Report Site Issues Here

Post by caughtatwork » 17 June 17 12:49 pm

MavEtJu wrote:
caughtatwork wrote:
We do not use the GZIP compression built in to the server when we deliver files. It is all handled in the application.
Give it a try; it's called mod_deflate on Apache 2.x. It will perform the compression at the HTTP level instead of at the application level. See http://www.simonwhatley.co.uk/how-to-co ... components for the way I configured it in the past.

Edwin
Again, you are making assumptions about how the data is stored and prepared for delivery. This is not possible without a major rewrite. The point is moot because you can't have that volume of data and I shan't be re-enabling those links. Please use the correct tools for the correct job.

MavEtJu
Posts: 486
Joined: 07 January 15 9:15 pm
Twitter: @mavetju
Location: Caringbah

Re: Report Site Issues Here

Post by MavEtJu » 17 June 17 1:17 pm

caughtatwork wrote:
Again, you are making assumptions about how the data is stored and prepared for delivery. This is not possible without a major rewrite. The point is moot because you can't have that volume of data and I shan't be re-enabling those links. Please use the correct tools for the correct job.
This was in relation to the note about the bandwidth usage for the queries, not about the nuking of the GPX links.

Edwin

WazzaAndWenches
5000 or more caches found
Posts: 395
Joined: 08 April 07 10:28 pm
Location: Echuca, Vic

Re: Report Site Issues Here

Post by WazzaAndWenches » 02 July 17 11:05 pm

I had a discussion with a non-GCA cacher at an event last night and have probably convinced her to take a serious look at GCA caching. As a bit of research to help her see the benefits of this listing site, I checked her general stats page with the idea of emailing the page to her, and found a probable fault with the stats. As she has never logged a hide or find on a GCA cache, her verbosity stats show a longest log of 0 words and a shortest log of 9,999,999 words.

caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Re: Report Site Issues Here

Post by caughtatwork » 03 July 17 9:17 am

What's the geocacher's name, please? It's 100% impossible to troubleshoot a problem without knowing where the problem is happening or which geocache it is showing up on. Thanks.

WazzaAndWenches
5000 or more caches found
Posts: 395
Joined: 08 April 07 10:28 pm
Location: Echuca, Vic

Re: Report Site Issues Here

Post by WazzaAndWenches » 03 July 17 10:14 am


caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Re: Report Site Issues Here

Post by caughtatwork » 04 July 17 8:43 pm

Thanks. All fixed.

WazzaAndWenches
5000 or more caches found
Posts: 395
Joined: 08 April 07 10:28 pm
Location: Echuca, Vic

Re: Report Site Issues Here

Post by WazzaAndWenches » 05 July 17 10:17 am

Great work, as always.

Another one (groan)... GA10378 has a formatting issue. The source code seems to have a very, very long URL for an image and it's causing the cache description to format incorrectly.

"...and they are injecting money into the community through visiting the local cafe, roadhouse or staying at the caravan park.<img alt="" src="..."

A job for the faeries, or should I contact the cache owner directly?

caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Re: Report Site Issues Here

Post by caughtatwork » 05 July 17 10:48 am

WazzaAndWenches wrote:Great work, as always.

Another one (groan)... GA10378 has a formatting issue. The source code seems to have a very, very long URL for an image and it's causing the cache description to format incorrectly.

"...and they are injecting money into the community through visiting the local cafe, roadhouse or staying at the caravan park.<img alt="" src="..."

A job for the faeries, or should I contact the cache owner directly?
Fixed.

spatialriq
5500 or more caches found
Posts: 326
Joined: 12 May 11 11:52 pm
Location: Perth

Re: Report Site Issues Here

Post by spatialriq » 07 July 17 1:30 pm

Just wondering if we're any closer to getting the forum posts displayed by date of post, as opposed to only by geocachers' names.

I'm forever scrolling through trying to work out which threads are the latest. 8-[

CraigRat
850 or more found!!!
Posts: 7015
Joined: 23 August 04 3:17 pm
Twitter: CraigRat
Facebook: http://facebook.com/CraigRat
Location: Launceston, TAS

Re: Report Site Issues Here

Post by CraigRat » 07 July 17 1:32 pm

Short answer is no.

Sorry.

Does the 'view new posts' link work for you?

spatialriq
5500 or more caches found
Posts: 326
Joined: 12 May 11 11:52 pm
Location: Perth

Re: Report Site Issues Here

Post by spatialriq » 07 July 17 4:03 pm

CraigRat wrote:Does the 'view new posts' link work for you?
Ah yes, hadn't noticed that before :)

I usually just click on "View active topics".

Thanks!

WazzaAndWenches
5000 or more caches found
Posts: 395
Joined: 08 April 07 10:28 pm
Location: Echuca, Vic

Re: Report Site Issues Here

Post by WazzaAndWenches » 18 July 17 10:43 am

I noticed a post in the secret Senate forum a short time ago with a post time of 0946 18/7/17, yet the actual time was 0926 18/7/17. A known fault, or just one of those mysterious computer things that we should ignore until the new forum software is up and running?
