Geocache/GPX Data
I've been thinking about the process of getting geocache data, be it GC or GCA, and wondering whether it could be done better. Personally I now use GSAK exclusively, but of course the value of GSAK is tied to the freshness of your cache information. I'm now using the hidden-date-range trick to good effect, and while it's a great workaround for the 500-cache limit, I'm still left wondering. Some questions:
1. Has anyone written up a decent article explaining how the GC cache information trickles into GCA? Is it just a bunch of queries contributed by various members with premium accounts, or something else?
2. Why can't we grab GPX information from GCA containing both GC and GCA caches? The information is clearly held in the site DB. Aside from the possibility of a user having cache information that is a day or two old, is there another reason?
3. Has anyone shared their pocket query output via a web server? i.e. run your hidden-date-range queries, then post them on a website. The GPX does have a flag saying whether the querier has found the cache, but a simple Perl script could fix that so the files are usable by all.
I did some searching, but surely there are many other posts on this point; could someone post URLs here please?
Aside from those pointed questions, I'd appreciate comments from others on the subject. With so many premium cachers running so many pocket queries, surely we could have a better system. i.e. check a URL like viccaches.gpx on Tuesday and Friday afternoon and you get ONE file with all VIC caches listed, import that into GSAK - and then we could do the same with NSW, QLD, etc.
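On point 3, the "simple Perl script" really can be a one-line substitution. A rough sketch in shell/sed (the file names here are made up for illustration; a toy input stands in for a real pocket query):

```shell
# Toy stand-in for a real pocket query file (hypothetical name).
printf '%s\n' '<sym>Geocache Found</sym>' '<sym>Geocache</sym>' > mypq.gpx

# Replace the querier's personal "found" marker so the file is
# equally useful to cachers who have not found those caches.
sed 's|<sym>Geocache Found</sym>|<sym>Geocache</sym>|g' mypq.gpx > shared.gpx
```

The same substitution runs unchanged over a full-size PQ file, since it only touches the `<sym>` elements.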
This is similar to what roblisa.com used to provide, which was gold for new cachers. I would like to see such files hosted on GCA if legally and politically possible. The whole 500-cache limit makes it hard for noobs to get going. I'm not too sure about the copyright issues, though - there's no explicit copyright message, although Groundspeak do claim authorship within the file:
<desc>Geocache file generated by Groundspeak</desc>
<author>Groundspeak</author>
<email>contact@groundspeak.com</email>
By the way - I AM NOT A LAWYER, so quote me at your own risk
Pros:
Bandwidth saving - GC seem to have the 500 limit to cut down on workload (DB and network); off-loading to GCA helps here.
Cons:
Extra bandwidth for GCA, but no DB overhead
Potential revenue loss for GC - less need to become a premium member, although I'd keep mine up for overseas stuff.
- caughtatwork
- Posts: 17025
- Joined: 17 May 04 12:11 pm
- Location: Melbourne
- Contact:
Re: Geocache/GPX Data
vk3jap wrote: 1. Has anyone written up a decent article explaining how the GC cache information trickles into GCA - is it just a bunch of queries contributed from various members with premium accounts or..?
No. If we did, it would publish how it's done, and that would not be a good thing for the GC / GCA relationship. You can trust the faeries to do what they do.
vk3jap wrote: 2. What is the reason why we can't grab GPX information from GCA containing both GC and GCA caches.. the information is clearly held in the site DB.
Groundspeak data, which is owned by the cache creator, is not ours to distribute. Note that apart from the name of the cache and a few other trivial things, we don't display that data. Nothing that we do display is copyrightable, so we are not in breach of their ToS.
vk3jap wrote: 3. Has anyone shared their pocket query output via a web server?
That's against the Groundspeak terms of use, which the GPX file creator signed up to.
vk3jap wrote: With so many premium cachers doing so many pocket queries surely we could have a better system.
Groundspeak have said they won't do this, but won't give the reasons.
squalid wrote: This is similar to what roblisa.com used to provide, which was gold for new cachers. I would like to see such files hosted on GCA if legally and politically possible.
Not possible. The recipients of GPX files say they won't do what you are suggesting, and if they do, they are in breach of the ToS and can have their accounts cut off. I doubt there would be legal pursuit (especially as we are in different countries), but we are simply not going to attempt to breach their ToS in this way.
Thanks C@W - I think you've pretty much closed the thread with those words: No, No, No and No. But your points are valid, so thanks for chiming in.
A great example of how to take a game, make it commercial and control the world; profit seems to be the ruler of all logic at HQ, at least that's what one might assume standing on the outside looking in.
Anyone else thinking of posting, please do still - I'm interested to hear other opinions, interest, etc.
It'd be great to see ideas and thoughts on how something might be possible/doable, as opposed to the multitude of reasons why not, so please be constructive. If you've got a negative, just PM or email it to me at vk3jap@vk3jap.net and at some point I'll collate and dot-point the reasons why this hasn't happened to date and still isn't - for history's sake, that's important to know.
I'm happy to donate space on geotech.vk3jap.net - plenty of room there.
Just playing in Perl now to see how I can remove the headers of the GPX files, so that we can take the 8 PQs with their associated waypoints and make them one file. Then it's just a matter of removing <sym>Geocache Found</sym> and replacing it with <sym>Geocache</sym>, so that all users get an untainted feed.
Here's what I'm thinking...
1. Get the raw PQ data sent to a dummy email account - hopefully all in one day, like Friday each week.
2. Use a program like formail/procmail to strip the attachments into one folder.
3. Process each file, and wherever you find <sym>Geocache Found</sym>, replace it with <sym>Geocache</sym>.
4. Remove the following lines from the head sections of all files other than the first:
<gpx xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="1.0" creator="Groundspeak Pocket Query" xsi:schemaLocation="http://www.topografix.com/GPX/1/0 http://www.topografix.com/GPX/1/0/gpx.xsd http://www.groundspeak.com/cache/1/0 http://www.groundspeak.com/cache/1/0/cache.xsd" xmlns="http://www.topografix.com/GPX/1/0">
<name>HQ-VIC-4</name>
<desc>Geocache file generated by Groundspeak</desc>
<author>Groundspeak</author>
<email>contact@groundspeak.com</email>
<time>2009-03-05T17:05:34.9001113-08:00</time>
<keywords>cache, geocache, groundspeak</keywords>
<bounds minlat="-39.134117" minlon="141.031667" maxlat="-34.600767" maxlon="149.626117" />
5. A bit more massaging, and then we have one final GPX file for all of VIC - unbiased for any user, usable by all, with all child waypoints in it.
6. FTP upload to http://geotech.vk3jap.net somewhere
7. Add a date/time stamp on that webpage so users know the data is being updated.
Bob's your uncle.
At least that's what I'm working on right now. As I'm not a coder, I'm having a hard time with step 4 at the moment - help welcome. I'd like to do it with sed/grep or Perl if possible.
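Steps 3-5 can be sketched in shell/sed, one of the tools asked about. This is only a rough sketch: it builds two toy pocket-query files to demonstrate on (the file names and contents are made up; real PQ files are much larger), and it assumes each waypoint spans multiple lines as Groundspeak's output does.

```shell
# Two toy pocket-query files standing in for real PQ downloads.
cat > pq1.gpx <<'EOF'
<?xml version="1.0" encoding="utf-8"?>
<gpx version="1.0" creator="Groundspeak Pocket Query">
<name>HQ-VIC-1</name>
<wpt lat="-37.8" lon="144.9">
<name>GC11111</name>
<sym>Geocache Found</sym>
</wpt>
</gpx>
EOF
cat > pq2.gpx <<'EOF'
<?xml version="1.0" encoding="utf-8"?>
<gpx version="1.0" creator="Groundspeak Pocket Query">
<name>HQ-VIC-2</name>
<wpt lat="-37.9" lon="145.0">
<name>GC22222</name>
<sym>Geocache</sym>
</wpt>
</gpx>
EOF

# Step 4: keep the first file's header, dropping only its closing tag...
sed '/<\/gpx>/d' pq1.gpx > viccaches.gpx
# ...and from every later file keep only the waypoint records.
sed -n '/<wpt/,/<\/wpt>/p' pq2.gpx >> viccaches.gpx
echo '</gpx>' >> viccaches.gpx

# Step 3: scrub the querier's "found" markers so the feed is untainted.
sed 's|<sym>Geocache Found</sym>|<sym>Geocache</sym>|g' viccaches.gpx > tmp.gpx
mv tmp.gpx viccaches.gpx
```

For a real run you'd loop the middle `sed -n` over all seven remaining PQ files; a proper XML parser (e.g. Perl's XML::Twig) would be more robust than line-oriented sed, but this shows the shape of the job.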
vk3jap wrote: Anyone else thinking of posting please do still.. interested to hear other opinion/interest etc. it's be great to see ideas and thoughts on how something might be possible/do-able, as opposed to the multitude of reasons why not etc, please be constructive.
You'll always bash up against the terms-of-use issue. The most constructive thing we could think of was to set up a site that promotes sharing cache information rather than locking it down, and to encourage people to list their caches there.
(having said that, we're having some fun corresponding with the NSW Minister for Transport and CityRail on a similar issue: http://www.smh.com.au/news/digital-life ... 37210.html it looks like sanity will prevail!)
- CraigRat
- 850 or more found!!!
- Posts: 7015
- Joined: 23 August 04 3:17 pm
- Twitter: CraigRat
- Facebook: http://facebook.com/CraigRat
- Location: Launceston, TAS
- Contact:
Just had another look at the ToS - it's been a while since I read them verbatim.
For those who love 'em: http://www.geocaching.com/about/termsofuse.aspx
I can't see how taking cache information elements, collating them into one file, and inserting a copyright notice honouring the original provider(s) of the content could be a breach of the ToS.
I just spoke with my sister, who's a legal beagle; she's read the ToS and understands the objective I'm talking about, and she says we're fine. She did say that if we were changing cache names or ratings, or doing something to bring GC.com and/or its members into disrepute, then that could pose an issue of libel.
We're really not pushing any boundaries here. I'd be keen to restrict access to the site to premium members only, as I believe that even if we make it easier to get the information, one should still pay for premium access to GC.
vk3jap wrote: I can't see how taking cache information elements, collating them into one file, inserting a copyright notice honouring the original provider(s) of content could be a breach of the ToS.
That's all fine, but be prepared to get a letter from them.
What part of:
You may not reproduce or retransmit the Site Materials, in whole or in part, in any manner, without the prior written consent of the owner of such materials, except as follows: You may make a single copy of the Site Materials solely for Your personal, noncommercial use, but such copying must be consistent with any applicable additional terms and conditions and You must preserve any copyright, trademark, or other notices contained in or associated with such Site Materials. You may not distribute such copies to others, whether or not in electronic form and whether or not for a charge or other consideration, without prior written consent of the owner of such materials. If you have any questions, contact us at contact@groundspeak.com.
makes you think you can legally disseminate the data?
Just curious.
(not arguing with your intent, but I'm just making sure you go in to it with all the information, I'd hate to see something happen)
EDIT: Had a bit about taking Premium Membership revenue off them, but I just re-read your post above.
(I've been the recipient of a letter in the past BTW)
Last edited by CraigRat on 07 March 09 8:35 pm, edited 1 time in total.
- Papa Bear_Left
- 800 or more hollow logs searched
- Posts: 2573
- Joined: 03 April 03 12:28 am
- Location: Kalamunda, WA
- Contact:
Hmmm.. the listings are owned by the cache owners, I think? (That allows the site to avoid any problems arising from cache placements: "We're just a listing site, not the owners of the data or the caches")
So, if we had written permission from a majority of Aussie cache owners, we could re-list that information here?
Hey, if any of the fairies want to do that, you hereby have permission to reproduce any and all Bear_Left caches!
(Disclaimer: IANAL, IAABD!)
Papa Bear_Left wrote: "We're just a listing site, not the owners of the data or the caches"
Mr Router wrote: So why do we pay to retrieve the data?
You only pay for the convenience of getting lots of data at the same time. You are free to get the data page by page, as long as you don't use a bot to scrape their site.