Geocache/GPX Data

Discussion about the Geocaching Australia web site
vk3jap
Posts: 50
Joined: 25 May 08 3:07 pm
Location: Macedon Ranges,Vic,Au

Geocache/GPX Data

Post by vk3jap » 07 March 09 1:24 pm

I've been thinking about the process of getting geocache data, be it GC or GCA, and whether it could be done better. Personally I now use GSAK exclusively, but of course the value of GSAK is tied to the freshness of your cache information. I'm now using the hidden date ranges to good effect, and whilst this is a great workaround for the 500 cache limit, I'm still left thinking.. some questions..

1. Has anyone written up a decent article explaining how the GC cache information trickles into GCA? Is it just a bunch of queries contributed by various members with premium accounts, or..?

2. What is the reason we can't grab GPX information from GCA containing both GC and GCA caches? The information is clearly held in the site DB. Aside from the possible issue of a user having cache information that is a day or two old, is there another reason?

3. Has anyone shared their pocket query output via a web server? i.e. do your hidden date range queries, then post them on a website. The GPX does have a flag saying whether the QUERIER has found the cache, but a simple Perl script would fix that so the files could be used by all.
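(Something like this one-liner is what I have in mind - untested, and assuming the found flag is just the <sym>Geocache Found</sym> element Groundspeak writes into the file:

# swap the "found" symbol back to a plain geocache symbol in every GPX file (keeps .bak copies)
perl -pi.bak -e 's/<sym>Geocache Found<\/sym>/<sym>Geocache<\/sym>/g' *.gpx

i.e. every "found" symbol goes back to a plain geocache symbol, with .bak copies of the originals kept just in case.)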

I did some searching, but surely there are many other posts on this point - could someone post the URLs here please?

Aside from those pointed questions, I'd appreciate comments from others on the subject. With so many premium cachers doing so many pocket queries, surely we could have a better system. i.e. check this URL://viccaches.gpx on Tuesday and Friday afternoon and you will get ONE file with all VIC caches listed, import that into GSAK - and then we could do the same with NSW, QLD etc.
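On the GSAK user's side it needn't be more than something like this little Perl fetcher (the URL is made up purely to illustrate the idea; mirror() comes with the stock LWP::Simple module and only re-downloads when the published file has changed):

#!/usr/bin/perl
# Sketch of the "check this URL on Tuesday and Friday" idea from the user's side.
use strict;
use warnings;
use LWP::Simple qw(mirror);

my $url  = 'http://example.org/viccaches.gpx';   # hypothetical address - wherever the file ends up
my $file = 'viccaches.gpx';

my $rc = mirror($url, $file);   # only downloads if the server copy is newer than our local one
if ($rc == 200) {
    print "Fresh copy saved as $file - load it into GSAK.\n";
} elsif ($rc == 304) {
    print "No change since last download.\n";
} else {
    print "Download failed (HTTP $rc).\n";
}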

User avatar
squalid
2700 or more caches found
Posts: 255
Joined: 06 February 04 12:36 pm
Location: Melbourne

Post by squalid » 07 March 09 2:37 pm

This is similar to what roblisa.com used to provide, which was gold for new cachers. I would like to see such files hosted on GCA if legally and politically possible. The whole 500 limit thing makes it hard for noobs to get going. I'm not too sure about copyright issues though - there's no explicit copyright message, although Groundspeak do claim authorship within the file:

" <desc>Geocache file generated by Groundspeak</desc>
<author>Groundspeak</author>
<email>contact@groundspeak.com</email> "

By the way - I AM NOT A LAWYER, so quote me at your own risk

Pros:
Bandwidth saving - GC seem to have the 500 limit to cut down on workload (DB and network); off-loading to GCA helps here.

Cons:
Extra bandwidth for GCA, but no DB overhead
Potential revenue loss for GC - less need to become a premium member, although I'd keep mine up for overseas stuff.

User avatar
caughtatwork
Posts: 17016
Joined: 17 May 04 12:11 pm
Location: Melbourne
Contact:

Re: Geocache/GPX Data

Post by caughtatwork » 07 March 09 3:28 pm

vk3jap wrote:1. Has anyone written up a decent article explaining how the GC cache information trickles into GCA - is it just a bunch of queries contributed from various members with premium accounts or..?
No. If we did, it would publish how it's done, and that would not be a good thing for the GC / GCA relationship. You can trust the faeries to do what they do.
vk3jap wrote:2. What is the reason why we can't grab GPX information from GCA containing both GC and GCA caches.. the information is clearly held in the site DB. Aside from the possible issue with a user having cache information that is a day or two old is there another reason?
Groundspeak data, which is owned by the cache creator, is not ours to distribute. Note that apart from the name of the cache and a few other trivial things we don't display that data. Nothing that we do display is copyrightable, so we are not in breach of their ToS.
vk3jap wrote:3. Has anyone shared their pocket query output via a web server.? i.e. do your hidden date range queries then post them on a website. The GPX does have a flag saying if the QUERIER has found the cache but a simple perl script would fix that such that they can be used by all.
That's against the Groundspeak terms of use which the GPX file creator signed up to.
vk3jap wrote:With so many premium cachers doing so many pocket queries surely we could have a better system. i.e. check this URL://viccaches.gpx on Tuesday and Friday afternoon and you will get ONE file with all VIC caches listed, import that into GSAK - and then we could do the same with NSW, QLD etc etc
Groundspeak have said they won't do this, but won't give the reasons.

User avatar
caughtatwork
Posts: 17016
Joined: 17 May 04 12:11 pm
Location: Melbourne
Contact:

Post by caughtatwork » 07 March 09 3:30 pm

squalid wrote:This is similar to what roblisa.com used to provide, which was gold for new cachers. I would like to see such files hosted on GCA if legally and politically possible.
Not possible. The receivers of GPX files agree they won't do what you are suggesting, and if they do they are in breach of the ToS and can have their account cut off. I doubt there would be a legal pursuit (especially as we are in different countries), but we are simply not going to attempt to breach their ToS in this way.

vk3jap
Posts: 50
Joined: 25 May 08 3:07 pm
Location: Macedon Ranges,Vic,Au

Post by vk3jap » 07 March 09 3:44 pm

Thanks C@W .. I think you've pretty much closed the thread with those words! No, No, No and No.. but your points are valid, so thanks for chiming in.

A great scenario of how to take a game, make it commercial and control the world. Profit seems to be the ruler of all logic at HQ, at least that's what one might assume, standing on the outside looking in.

Anyone else thinking of posting, please do.. I'm interested to hear other opinions/interest etc.

It'd be great to see ideas and thoughts on how something might be possible/do-able, as opposed to the multitude of reasons why not, so please be constructive. If you've got a negative, just PM or email it to me at vk3jap@vk3jap.net and at some point I'll collate and dot-point the reasons why this hasn't happened to date and still hasn't - for history's sake that's important to know.

User avatar
squalid
2700 or more caches found
Posts: 255
Joined: 06 February 04 12:36 pm
Location: Melbourne

Post by squalid » 07 March 09 5:11 pm

BitTorrent?

Each time one of us creates the full VIC build, share it on BitTorrent, or, for something more group or friend oriented, OneSwarm.

e.g. the latest Victoria build: vic-2009-03-08.gpx.

Not too hard to load into GSAK, and generate a regular PQ to create updates.

vk3jap
Posts: 50
Joined: 25 May 08 3:07 pm
Location: Macedon Ranges,Vic,Au

Post by vk3jap » 07 March 09 5:26 pm

I'm happy to donate space on geotech.vk3jap.net - plenty of room there.

Just playing in Perl now to see how I can remove the header of the GPX file so that we can take the 8 PQs with associated waypoints and make them one file. Then it's just a matter of removing the <sym>Geocache Found</sym> and replacing it with <sym>Geocache</sym> so that all users get an untainted feed.

Here's what I'm thinking...

1. Get RAW PQ data sent to a dummy email account, hopefully all on one day - like Friday each week.
2. Use a program like formail/procmail or something similar to strip the attachments into one folder.
3. Process each file and, where you find <sym>Geocache Found</sym>, replace it with <sym>Geocache</sym>.
4. Remove the following header block from every file other than the first.
<gpx xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="1.0" creator="Groundspeak Pocket Query" xsi:schemaLocation="http://www.topografix.com/GPX/1/0 http://www.topografix.com/GPX/1/0/gpx.xsd http://www.groundspeak.com/cache/1/0 http://www.groundspeak.com/cache/1/0/cache.xsd" xmlns="http://www.topografix.com/GPX/1/0">
<name>HQ-VIC-4</name>
<desc>Geocache file generated by Groundspeak</desc>
<author>Groundspeak</author>
<email>contact@groundspeak.com</email>
<time>2009-03-05T17:05:34.9001113-08:00</time>
<keywords>cache, geocache, groundspeak</keywords>
<bounds minlat="-39.134117" minlon="141.031667" maxlat="-34.600767" maxlon="149.626117" />
5. A bit more massaging, and then we have one final GPX file for all of VIC, unbiased towards any user, usable by all, with all child waypoints in it.
6. FTP upload to http://geotech.vk3jap.net somewhere
7. Add a date/time stamp to that webpage so users know when the data was last updated.

Bob = uncle..

At least that's what I'm working on right now. As I'm not a coder, I'm having a hard time with step 4 at the moment.. help welcome (rough attempt below). I'd like to do it with sed/grep or Perl if possible.
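Here's the skeleton I've cobbled together so far - completely untested, so treat it as a sketch only. It assumes every PQ file looks like the header quoted in step 4: everything before the first <wpt ...> element is header, and the only thing after the last waypoint is the closing </gpx> tag. The merged file just keeps the first file's header (including its <bounds>), which I'm assuming GSAK won't mind.

#!/usr/bin/perl
# merge_pqs.pl - merge several Pocket Query GPX files into one "untainted" file.
# Usage: perl merge_pqs.pl HQ-VIC-1.gpx HQ-VIC-2.gpx ... > vic-merged.gpx
use strict;
use warnings;

die "Usage: $0 file1.gpx file2.gpx ... > merged.gpx\n" unless @ARGV;

my $first = 1;
for my $file (@ARGV) {
    open my $fh, '<', $file or die "Can't open $file: $!\n";
    my $in_body = $first;    # first file: keep its header as-is
    while (my $line = <$fh>) {
        # step 3: nobody's finds should be marked in the shared feed
        $line =~ s/<sym>Geocache Found<\/sym>/<sym>Geocache<\/sym>/g;

        # step 4: skip everything before the first waypoint in files 2..N
        $in_body = 1 if !$in_body && $line =~ /<wpt\b/;
        next unless $in_body;

        # hold back the closing tag - we only want one, at the very end
        next if $line =~ /^\s*<\/gpx>\s*$/;

        print $line;
    }
    close $fh;
    $first = 0;
}

# close the one merged document
print "</gpx>\n";

If someone who actually codes can sanity-check that, I'd appreciate it.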

User avatar
ideology
Posts: 2763
Joined: 28 March 03 4:01 pm
Location: Sydney
Contact:

Post by ideology » 07 March 09 6:29 pm

vk3jap wrote:Anyone else thinking of posting please do still.. interested to hear other opinion/interest etc.

it's be great to see ideas and thoughts on how something might be possible/do-able, as opposed to the multitude of reasons why not etc, please be constructive.
You'll always bash up against the terms of use issue. The most constructive thing we could think of was to set up, and encourage people to list their caches on, a site that promotes sharing cache information rather than locking it down.

(Having said that, we're having some fun corresponding with the NSW Minister for Transport and CityRail on a similar issue: http://www.smh.com.au/news/digital-life ... 37210.html - it looks like sanity will prevail!)

User avatar
CraigRat
850 or more found!!!
Posts: 7015
Joined: 23 August 04 3:17 pm
Twitter: CraigRat
Facebook: http://facebook.com/CraigRat
Location: Launceston, TAS
Contact:

Post by CraigRat » 07 March 09 6:54 pm

...and...
Yes, all the above COULD be done, but if you are planning to, can it please be discussed away from these forums.... let's not see how far it can be pushed!

vk3jap
Posts: 50
Joined: 25 May 08 3:07 pm
Location: Macedon Ranges,Vic,Au

Post by vk3jap » 07 March 09 7:17 pm

Just had another look at the ToS - it's been a while since I read them verbatim..

For those who love 'em: http://www.geocaching.com/about/termsofuse.aspx

I can't see how taking cache information elements, collating them into one file, and inserting a copyright notice honouring the original provider(s) of the content could be a breach of the ToS.

I just spoke with my sister, who's a legal beagle. She's read the ToS, understands the objective I'm talking about, and says we're fine. She did say that if we were changing cache names or ratings etc, or doing something to bring GC.com and/or its members into disrepute, then that would pose an issue of libel etc.

We're really not pushing any boundaries here. I'd be keen to restrict access to the site to premium members only, as I do believe that even if we make it easier to get the information, one should still pay for premium access to GC.

User avatar
CraigRat
850 or more found!!!
Posts: 7015
Joined: 23 August 04 3:17 pm
Twitter: CraigRat
Facebook: http://facebook.com/CraigRat
Location: Launceston, TAS
Contact:

Post by CraigRat » 07 March 09 8:00 pm

vk3jap wrote:I can't see how taking cache information elements, collating them into one file, inserting a copyright notice honouring the original provider(s) of content could be a breach of the ToS.
That's all fine, but be prepared to get a letter from them.

What part of:
You may not reproduce or retransmit the Site Materials, in whole or in part, in any manner, without the prior written consent of the owner of such materials, except as follows: You may make a single copy of the Site Materials solely for Your personal, noncommercial use, but such copying must be consistent with any applicable additional terms and conditions and You must preserve any copyright, trademark, or other notices contained in or associated with such Site Materials. You may not distribute such copies to others, whether or not in electronic form and whether or not for a charge or other consideration, without prior written consent of the owner of such materials. If you have any questions, contact us at contact@groundspeak.com.
makes you think you can legally disseminate the data?

Just curious.
(Not arguing with your intent, I'm just making sure you go into it with all the information - I'd hate to see something happen.)

EDIT: Had a bit about taking Premium Membership revenue off them, but I just re-read your post above.

(I've been the recipient of a letter in the past BTW)
Last edited by CraigRat on 07 March 09 8:35 pm, edited 1 time in total.

User avatar
Papa Bear_Left
800 or more hollow logs searched
Posts: 2573
Joined: 03 April 03 12:28 am
Location: Kalamunda, WA
Contact:

Post by Papa Bear_Left » 07 March 09 8:20 pm

Hmmm.. the listings are owned by the cache owners, I think? (That allows the site to avoid any problems arising from cache placements: "We're just a listing site, not the owners of the data or the caches")

So, if we had written permission from a majority of Aussie cache owners, we could re-list that information here?

Hey, if any of the fairies want to do that, you hereby have permission to reproduce any and all Bear_Left caches! :)

(Disclaimer: IANAL, IAABD!)

User avatar
mtrax
Posts: 1974
Joined: 19 December 06 9:57 am
Location: Weston Creek, Canberra

Post by mtrax » 10 March 09 3:18 pm

Maybe you can tackle it another way..
e.g. have a tick box or some disclaimer: by joining GCA you agree to publish/copy caches listed on GCA with their full details.

User avatar
Mr Router
1500 or more caches found
Posts: 2782
Joined: 22 May 05 11:59 am
Location: Bathurst

Post by Mr Router » 10 March 09 4:18 pm

Papa Bear_Left wrote:"We're just a listing site, not the owners of the data or the caches")
So why do we pay to retrieve the data :shock: :?

User avatar
caughtatwork
Posts: 17016
Joined: 17 May 04 12:11 pm
Location: Melbourne
Contact:

Post by caughtatwork » 10 March 09 4:54 pm

Mr Router wrote:
Papa Bear_Left wrote:"We're just a listing site, not the owners of the data or the caches")
So why do we pay to retrieve the data :shock: :?
You only pay for the convenience of getting lots of data at the same time. You are free to get the data page by page as long as you don't use a bot to scrape their site.

Locked