Report Site Issues Here

Discussion about the Geocaching Australia web site
Laighside Legends
10000 or more caches found
Posts: 1304
Joined: 05 October 10 10:20 pm
Location: Australia

Re: Report Site Issues Here

Post by Laighside Legends » 07 June 17 4:02 pm

Yes, it seems to work now, although it still says the zones are submitted for review even though they are accepted automatically?

caughtatwork
Posts: 17015
Joined: 17 May 04 12:11 pm
Location: Melbourne

Re: Report Site Issues Here

Post by caughtatwork » 07 June 17 4:47 pm

We turned off reviewing a long time ago but it looks like the message has never been removed.

Richary
8000 or more caches found
Posts: 4189
Joined: 04 February 04 10:55 pm
Location: Waitara, Sydney

Re: Report Site Issues Here

Post by Richary » 12 June 17 8:47 pm

I am still not quite sure where all my USA finds came from (probably old GC locationless style ones) but this state seems to have an interesting name.

1 (Iadho)

caughtatwork
Posts: 17015
Joined: 17 May 04 12:11 pm
Location: Melbourne

Re: Report Site Issues Here

Post by caughtatwork » 13 June 17 6:42 pm

Yes, locationless were an issue. They have now been tidied, as has our appalling spelling.

MavEtJu
Posts: 486
Joined: 07 January 15 9:15 pm
Twitter: @mavetju
Location: Caringbah

Re: Report Site Issues Here

Post by MavEtJu » 15 June 17 11:01 am

Of my last four attempts to download the Tasmania GPX file, three failed with memory errors and one succeeded:

[~/dev/GC/Geocube] edwin@MavvieMac>ls -lah files/tas.gpx
-rw-r--r--@ 1 edwin staff 4.4M 15 Jun 09:57 files/tas.gpx
[~/dev/GC/Geocube] edwin@MavvieMac>tail -4 files/tas.gpx
<p><span style="font-size:14px"><span style="font-family:arial,helvetica,sans-serif">To log a find on the Geocaching Ausralia website, you will need to include a picture of the CORS, along with your GPS receiver and preferably yourself.  You are encouraged to leave a description of your journey in your log to help others in finding the CORS. </span></span></p>
]]></groundspeak:long_description>
<br />
<b>Fatal error</b>: Allowed memory size of 33554432 bytes exhausted (tried to allocate 16384 bytes) in <b>/var/www/site/gca/include/gca_database.php</b> on line <b>30</b><br />

URL used: geocaching.com.au/caches/available/gca/au/tas.gpx
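For reference, the "33554432 bytes" in that fatal error is the server's PHP memory_limit, which works out to 32 MB; a quick shell check of the arithmetic (nothing site-specific):

```shell
# Convert the limit quoted in the PHP fatal error from bytes to megabytes
echo $((33554432 / 1024 / 1024))   # prints 32
```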

Edwin

caughtatwork
Posts: 17015
Joined: 17 May 04 12:11 pm
Location: Melbourne

Re: Report Site Issues Here

Post by caughtatwork » 15 June 17 10:42 pm

Set up a smaller download via the My Query function. The state downloads are too large and will be removed shortly.

MavEtJu
Posts: 486
Joined: 07 January 15 9:15 pm
Twitter: @mavetju
Location: Caringbah

Re: Report Site Issues Here

Post by MavEtJu » 15 June 17 11:11 pm

caughtatwork wrote:Set a smaller download via the my query function. The downloads are too large and will be removed shortly.
If they are going to be removed, please test in advance whether the queries can download the required data. Currently the NSW data cannot be downloaded because a single day contains more than the maximum of 500 entries.

In the meantime, will the people in Tasmania be denied access to the data for their state?

Edwin

MavEtJu
Posts: 486
Joined: 07 January 15 9:15 pm
Twitter: @mavetju
Location: Caringbah

Re: Report Site Issues Here

Post by MavEtJu » 16 June 17 9:15 am

MavEtJu wrote:Currently the NSW data cannot be downloaded because there is one single day contains more than the maximum number of 500 entries.
That is only for the JSON data, not for the GPX data. But still...

caughtatwork
Posts: 17015
Joined: 17 May 04 12:11 pm
Location: Melbourne

Re: Report Site Issues Here

Post by caughtatwork » 16 June 17 2:59 pm

The state files are now too large and will run the machine out of memory. I have removed the ability to download GPX and ZIP state files from the caches list page. You can still download the data but you will need to create a query and there is a link on the state list pages to take you to your My Query page.

This is a pleasing outcome of more and more geocaches being listed, but we do not have the infrastructure to support endless, full state downloads.

If you are using JSON, please stop. That is deprecated in favour of the new API and will be removed without notice.

MavEtJu
Posts: 486
Joined: 07 January 15 9:15 pm
Twitter: @mavetju
Location: Caringbah

Re: Report Site Issues Here

Post by MavEtJu » 16 June 17 10:30 pm

caughtatwork wrote:The state files are now too large and will run the machine out of memory. I have removed the ability to download GPX and ZIP state files from the caches list page. You can still download the data but you will need to create a query and there is a link on the state list pages to take you to your My Query page.
Maybe, as a public service, these batches of 500-cache queries for each state could be provided by GCA instead of everybody making their own?

Edwin

Richary
8000 or more caches found
Posts: 4189
Joined: 04 February 04 10:55 pm
Location: Waitara, Sydney

Re: Report Site Issues Here

Post by Richary » 16 June 17 11:04 pm

MavEtJu wrote:Maybe as a public service these batch of queries of 500 caches for each state could be provided by GCA instead of everybody making their own?

Edwin
Same argument I have with GC - why can't I just download a premade query of every cache in NSW? Yes, it's getting close to 20,000, and the argument is that some states in the USA will be much, much higher. But with unlimited download quotas and high speeds these days, why not? How many people would like a download of every cache in their state from GC? Instead I have to set up (at the moment) 16 PQs with date ranges, all generated separately by their server, when they could just provide one link for an Australia-NSW PQ that is generated nightly and downloadable with no load on their system apart from the one-off generation at midnight their time.

I am not sure of the behind the scenes issues here that would stop GCA doing the same, as I have my own PQ that generates all NSW caches for me, and emails them to me every Thursday morning.

Sol de Lune
Posts: 1943
Joined: 09 December 07 7:08 pm
Location: Lucas, Ballarat Victoria

Re: Report Site Issues Here

Post by Sol de Lune » 17 June 17 3:10 am

MavEtJu wrote:
caughtatwork wrote:The state files are now too large and will run the machine out of memory. I have removed the ability to download GPX and ZIP state files from the caches list page. You can still download the data but you will need to create a query and there is a link on the state list pages to take you to your My Query page.
Maybe as a public service these batch of queries of 500 caches for each state could be provided by GCA instead of everybody making their own?

Edwin
A public service? WTF? People are quite capable of calling their own PQs for the areas they want to cache in... and if you think about it, that is probably what they want to do anyway. Why? Because they can control the areas they want to call caches in. Any 'batch' of 500 caches may not completely cover the area they want to cache in. What's the point of that?

To be perfectly blunt, as the GCA cacher with the most (unique) finds in Australia, I find it hard to fathom how you post the info you do without expecting any backlash. Looking at your stats, you have found a total of 319 Geocaching Australia caches in 2.5 years, so why would you need a listing of every GCA cache in NSW when there is every likelihood that you will never find them? The thought of all that is beyond me. Add to that your post about Tasmanian caches and how "In the meantime, will the people in Tasmania be denied to the data of their state", and I wonder if you are really thinking about the good of the Geocaching Australia game, or just grandstanding to push your own personal barrow.

Tasmania is one of Geocaching Australia's hotspots, and they have been doing fine for the last 10 years. Your comment about them being denied is both insulting and simply rude. It shows you have no idea how folk go about caching; rather, you want to push your own agenda about how you think the Geocaching Australia site should be run.

Instead of running against us, as you have certainly done, why not help us to promote our product.....because up to now you have been nothing but a PITA.

And please, don't play the "I'm only trying to help" card, because that ran out with me a long time ago. Do you remember the 'I Hate Muggles' response from last year?

If you want to keep arguing the point, I'm up for it. My goal is to promote Geocaching Australia without trying to blindside people with knowledge of programming/coding/hacking... what is your goal?

CraigRat
850 or more found!!!
Posts: 7015
Joined: 23 August 04 3:17 pm
Twitter: CraigRat
Facebook: http://facebook.com/CraigRat
Location: Launceston, TAS

Re: Report Site Issues Here

Post by CraigRat » 17 June 17 9:14 am

We pay for x GB of bandwidth.

We have to pay if that is exceeded.

crew 153
9000 or more caches found
Posts: 1099
Joined: 09 October 04 7:51 pm
Location: Calamvale, Brisbane

Re: Report Site Issues Here

Post by crew 153 » 17 June 17 11:13 am

In answer to Richary about maintaining his NSW database:

I maintain my NSW DB (and GCA) with 9 PQs weekly. I have them set up with different date ranges and set them for caches which have had logs in the last 7 days. Each PQ is set to provide about 600-700 caches to cater for weekly fluctuations (and school holidays).

MavEtJu
Posts: 486
Joined: 07 January 15 9:15 pm
Twitter: @mavetju
Location: Caringbah

Re: Report Site Issues Here

Post by MavEtJu » 17 June 17 11:32 am

CraigRat wrote:We pay for x GB of bandwidth.

We have to pay if that is exceeded.
Have you considered turning on gzip compression at the HTTP level? It reduces the GPX/JSON data by roughly a factor of four:

[~/dev/RubiksCubeGuide] edwin@MavvieMac>ls -lh
-rw-r--r-- 1 edwin staff 1.4M 17 Jun 10:17 test.gpx
-rw-r--r-- 1 edwin staff 689K 17 Jun 10:21 test.json
[~/dev/RubiksCubeGuide] edwin@MavvieMac>gzip test.*
[~/dev/RubiksCubeGuide] edwin@MavvieMac>ls -lh
-rw-r--r-- 1 edwin staff 332K 17 Jun 10:17 test.gpx.gz
-rw-r--r-- 1 edwin staff 171K 17 Jun 10:21 test.json.gz
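The ratio is easy to reproduce locally: GPX is highly repetitive XML, so it compresses well. A minimal sketch with synthetic waypoint data (the file name and contents here are made up for illustration; assumes standard gzip and coreutils):

```shell
# Build a sample file of repetitive GPX-style XML (synthetic waypoints)
for i in $(seq 1 5000); do
  printf '<wpt lat="-42.0" lon="147.0"><name>GA%04d</name></wpt>\n' "$i"
done > sample.gpx

gzip -k sample.gpx                 # -k keeps the original alongside the .gz
orig=$(wc -c < sample.gpx)
comp=$(wc -c < sample.gpx.gz)
echo "original: $orig bytes, compressed: $comp bytes"
```

Real cache data lands closer to the 4x shown above; fully synthetic repetition like this compresses even harder. Server-side it is usually a small config change: mod_deflate's AddOutputFilterByType on Apache, or `gzip on;` plus `gzip_types` on nginx.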

Edwin
