funding a dedicated server

Geocaching Australia governance issues
CraigRat
850 or more found!!!
Posts: 7015
Joined: 23 August 04 3:17 pm
Twitter: CraigRat
Facebook: http://facebook.com/CraigRat
Location: Launceston, TAS

Post by CraigRat » 12 August 08 8:42 am

I'm beginning to think that a SMALL set of ads might become a necessity.
Opt-in is dangerous, as it requires action on the user's behalf; perhaps opt-out would be a good compromise?

The ads back in the bad old days were those huge garish 468x60 image banners with blinking text, and they were pretty naff, whereas nowadays the Google ads etc. are a lot more subtle, less intrusive and, most of all, more relevant to the content being served up.

I have been contemplating something similar since the last time we discussed potential sponsorship:
Approach a few national hiking/bushwalking/hardware vendors whose demographics our users fit into, and offer something like adding 'This Daily Cache Update email is brought to you by CompanyX' to our daily cache updates, or perhaps something in the footer or the forum... perhaps different rates for more prominent locations?

Something unobtrusive, but that could generate some revenue.

If we could get $500 to $1k from 1 or 2 big chains then it would make a HUGE difference to what we could do. And it means we would probably need to come up with only a few hundred some other way...

caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Post by caughtatwork » 12 August 08 9:44 am

mtrax wrote:is there only one option, e.g. the $5,000 p.a. version, or is there a cheaper option?

is it possible to have a list of options so we can get some "group" to vote and move this forward?
i.e. if we can put forward a proposal to the community, perhaps some opportunities may pop up if some of the "components" of the proposal can be donated etc.

I'm not sure if H/W is required, but a one-off purchase I think can be achieved; it's the ongoing co-lo housing of the server that is the big issue. Is co-lo a package which includes bandwidth?

so we have three components:
1. new H/W, maybe 1 or 2 boxes
2. housing of the server (annual fee)
3. bandwidth
In the past we have looked at what we needed.
At the moment we are on a machine with 2GB RAM and two dual-core 2.8GHz CPUs, but it is shared with some other sites that I own, so it's not all ours.
We would need to at least match (or better) that system specification to make a difference.

My limited scoping out of a dedicated server with sufficient RAM, CPU and bandwidth runs to about $300 per month (give or take), which is around $4,000 per year. If we took two smaller machines and ran a dual-server environment of one web server and one database machine, the cost of each server would be reduced, but there would be two of them. That's how I came up with the amount of about $5,000 per year.

Shared hosting is simply not a suitable alternative.

There is also the issue of the forum software. phpBB is free, but it is not terribly well optimised, and this is probably a big chunk of our problem. Investigations of the DB access when the forum is being hit are quite horrific. We could move to a better, commercial package such as vBulletin, but there are licence fees for vBulletin, which I think are a couple of hundred a year.
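
If anyone wants to see it for themselves, switching on MySQL's slow query log while the forum is busy tells the story quickly enough. Something along these lines (MySQL 5.0-era options; the log path is only an example):

[mysqld]
# log anything slower than 2 seconds, plus queries doing full scans
log_slow_queries = /var/log/mysql/slow.log
long_query_time = 2
log_queries_not_using_indexes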

Buying a server and paying the colo fees along with bandwidth costs is, I think, going to far exceed anything we can fund from donations. An HP DL320 or DL360 is going to cost about $3,000 to $4,000, plus the colo fee and bandwidth costs. It also means that we would have an "asset" which would be owned by everyone and no-one. If someone decides to leave the GCA community, do they expect their part of the hardware (or its equivalent depreciated value) back?

Having a dedicated server administered by a service provider also means that they worry about heating, air-conditioning, power, UPS, etc. These are things we would have no ability to fund or administer if we were to purchase our own server.

Bandwidth costs for a colo solution are (from memory) substantial.

Using a dedicated server from a provider reduces the TCO as everything is bundled together.

I like your thoughts though. If you can provide some examples of equipment and providers that would meet our technical requirements as well as the costs then all options are open.

mtrax
Posts: 1974
Joined: 19 December 06 9:57 am
Location: Weston Creek, Canberra

Post by mtrax » 12 August 08 10:04 am

Just going back to the "hosted" option: how difficult would it be to set up a trial migration to a hosted server?
e.g. some are about $6/month, if it's feasible to set up a new site for the purposes of testing the capacity.
I understand the performance may be an issue, but there's nothing like a real-world test.
Perhaps some hosting sites can provide a VM rather than a shared site, which can dedicate memory and CPU to that host. I'll try to ring a few today.
PS: here is one site I found: http://www.host1.com.au/colocation.html

Note: I found a hosting site which allows a 90-day trial, so if the test can be done in that time period it should only cost about $30.

Gee we need a slush fund for such projects.

mtrax
Posts: 1974
Joined: 19 December 06 9:57 am
Location: Weston Creek, Canberra

Post by mtrax » 12 August 08 10:51 am

Maybe we should put this another way: work out how much we can afford and see what we can get for that.
e.g. if we can easily raise $1,000 per year, then that will be our budget.

caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Post by caughtatwork » 12 August 08 10:51 am

No hope of moving to a shared server environment. Not even as a trial.

I have been on shared servers before (refer my website http://www.caughtatwork.net).

I blew every limit they imposed, and my site is not exactly a hard-driving DB-based engine either.

I had to move to a VPS (Virtual Private Server) and the costs went from about $240 a year to $500 a year.

The VPS is a 3GHz dual-core Xeon processor which is shared between a number of accounts. I get 1GB of burst memory dedicated to me alone.

When running some of the graphs for GCA data it hits the ceiling on both CPU and RAM.

So a shared server option is simply not worth the time and effort. It will, from experience, prove to be a useless exercise.

From your example at http://www.host1.com.au/colocation.html, the colo price is $220 a month. Multiplied by 12 that's over $2,600, and we would still need to provide our own server. That's the sort of price we can get a half-decent dedicated server for per year (give or take a thousand bucks).

Keep the suggestions coming though. I'd like to explore as many options as we can.

strong-arm
500 or more caches logged
Posts: 42
Joined: 30 November 07 11:06 am
Location: Carseldine, Brisbane

Post by strong-arm » 12 August 08 12:43 pm

(Moved my post from other thread).

What about marking older content (say, any threads that haven't had a response in 3 months) as static: pull them out of the database as dynamic content and turn them into static pages so they are still searchable etc. You could have an option to search "live" threads or "static" threads, with a warning that a static search would take longer as a low-priority job. Not sure how feasible that would be, or how much work turning the existing dynamic content into static would be, as you'd have to re-run it monthly to "archive" threads with no responses in 3 months. A sketch of what I mean is below.
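
Completely untested, and I'm only guessing at phpBB3-style table and column names (phpbb_topics, phpbb_posts, phpbb_users), but the monthly archive job could look something like this:

<?php
// archive_old_threads.php -- monthly cron job sketch (untested).
// Table/column names assume a phpBB3-style schema and may not match.
$db = new mysqli('localhost', 'forum_user', 'secret', 'forum');
$cutoff = time() - 90 * 24 * 60 * 60;   // ~3 months without a reply

$topics = $db->query(
    "SELECT topic_id, topic_title FROM phpbb_topics
      WHERE topic_last_post_time < $cutoff");

while ($t = $topics->fetch_assoc()) {
    $posts = $db->query(
        "SELECT u.username, p.post_time, p.post_text
           FROM phpbb_posts p
           JOIN phpbb_users u ON u.user_id = p.poster_id
          WHERE p.topic_id = " . (int)$t['topic_id'] . "
          ORDER BY p.post_time");

    $html = '<h1>' . htmlspecialchars($t['topic_title']) . '</h1>';
    while ($p = $posts->fetch_assoc()) {
        $html .= '<p><b>' . htmlspecialchars($p['username']) . '</b> '
               . date('d M y g:i a', $p['post_time']) . '<br>'
               . $p['post_text'] . '</p>';
    }

    // write the static page; a later step could delete or move the live
    // rows so the hot tables stay small
    file_put_contents("archive/topic_{$t['topic_id']}.html", $html);
}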

--

I had an additional thought while moving the post. What about only updating the graphs once per day, starting at say 1am, and doing a batch update for everyone? Store the results as static images that can be served out instead of dynamic content. Mark the graphs as unavailable between 1am and 6am, or however long it takes to run them all.
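
As a rough sketch of the shape of that batch job (the table, columns and the bar-drawing are invented placeholders, not GCA's real graph code):

<?php
// build_graphs.php -- run from cron at 1am, e.g.:
//   0 1 * * * php /var/www/gca/cron/build_graphs.php
$db = new mysqli('localhost', 'gca_user', 'secret', 'gca');

$cachers = $db->query("SELECT cacher_id, finds FROM cachers");
while ($row = $cachers->fetch_assoc()) {
    // stand-in for the real rendering: one green bar scaled by find count
    $img = imagecreatetruecolor(400, 50);
    $bg  = imagecolorallocate($img, 255, 255, 255);
    imagefill($img, 0, 0, $bg);
    $bar = imagecolorallocate($img, 0, 128, 0);
    imagefilledrectangle($img, 0, 10, min(399, (int)$row['finds']), 40, $bar);

    // pages then link to this file instead of generating a graph per hit
    imagepng($img, "/var/www/gca/static/graphs/{$row['cacher_id']}.png");
    imagedestroy($img);
}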

caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Post by caughtatwork » 12 August 08 1:02 pm

strong-arm wrote:I had an additional thought while moving the post. What about only updating the graphs once per day, starting at say 1am, and doing a batch update for everyone? Store the results as static images that can be served out instead of dynamic content. Mark the graphs as unavailable between 1am and 6am, or however long it takes to run them all.
Non-cacher graphs already do this.
Cacher graphs don't ... yet.
We can continue to performance-tune the system, as we have been for about 18 months now, but sooner or later we are going to have to upgrade.
It's like busting out of your pants as you grow: you can hold your breath or eat less, but it's not a long-term solution.
Great suggestion though.

strong-arm
500 or more caches logged
Posts: 42
Joined: 30 November 07 11:06 am
Location: Carseldine, Brisbane

Post by strong-arm » 12 August 08 1:58 pm

Hey, it's always possible to lose weight and go backwards and fit into those old pants... :D

Although it would be a massive chunk of work, changing the entire site to serve static content instead of dynamic would have to improve performance. I think sites like Slashdot do something like this for their "main" story page: write it to a location, then cache and serve that page until it's updated again.

So the dashboard, cacher pages etc. would only update whenever some back-end process rewrites the static content and clears the cache. High-traffic, computation- or DB-heavy pages could be updated once per day; e.g. the dashboard could update nightly when it receives new feed data, and the same with cacher pages. Something like the sketch below.
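
In rough PHP (invented paths; just the pattern, not real GCA code), the back-end rewrite could be as simple as:

<?php
// rebuild_dashboard.php -- back-end job sketch; all paths are made up.
// Render the page once, then let the web server hand it out as a plain
// file so visitors never touch PHP or the database.
ob_start();
include '/var/www/gca/pages/dashboard.php';   // the existing dynamic page, assumed
$html = ob_get_clean();

// write to a temp file and rename: the swap is atomic, so no visitor
// ever sees a half-written page
file_put_contents('/var/www/gca/static/dashboard.html.tmp', $html);
rename('/var/www/gca/static/dashboard.html.tmp',
       '/var/www/gca/static/dashboard.html');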

Again, I've no clue how feasible this might be :)

Geof
450 or more roots tripped over
Posts: 1232
Joined: 10 August 04 12:26 pm
Location: Yarra Ranges

Post by Geof » 12 August 08 4:00 pm

Did we ever establish a person/persons/entity to accept funds for GCA?

Where funding comes from is a side issue until we have that.

caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Post by caughtatwork » 12 August 08 4:04 pm

Geof wrote:Did we ever establish a person/persons/entity to accept funds for GCA?

Where funding comes from is a side issue until we have that.
I believe that the Treasurer of the Tasmanian Geocaching Association has agreed to act on behalf of GCA in accepting and holding the donations.

caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Post by caughtatwork » 12 August 08 4:12 pm

strong-arm wrote:Hey, it's always possible to lose weight and go backwards and fit into those old pants... :D

Although it would be a massive chunk of work, changing the entire site to serve static content instead of dynamic would have to improve performance. I think sites like Slashdot do something like this for their "main" story page: write it to a location, then cache and serve that page until it's updated again.

So the dashboard, cacher pages etc. would only update whenever some back-end process rewrites the static content and clears the cache. High-traffic, computation- or DB-heavy pages could be updated once per day; e.g. the dashboard could update nightly when it receives new feed data, and the same with cacher pages.

Again, I've no clue how feasible this might be :)
Very true. I have the ban stick ready to be wielded :-)

But seriously, the points you make are valid, and to a great extent we already do this. The dashboards and the front page ladder are all cached: they are generated hourly and then served as static pages. RSS feeds are done the same way, as are the stats graphs (excluding cacher graphs).

Geobirthdays and the GC Hidden Ranges are generated dynamically on the first hit of the day and then served statically for the rest of the day.
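
For the curious, the shape of that trick is roughly this (simplified, not the actual GCA code; the include is a placeholder for the expensive page build):

<?php
// Simplified once-a-day cache pattern.
$cache = '/var/www/static/geobirthdays-' . date('Ymd') . '.html';

if (!file_exists($cache)) {
    // the first hit of the day pays the full cost of building the page
    ob_start();
    include 'build_geobirthdays.php';   // placeholder
    file_put_contents($cache, ob_get_clean());
}

readfile($cache);   // every later hit that day is just a file read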

We could go the whole hog and create the same solution for all stats, but then we would get inundated with queries as to why the data is out of date. We have tried to strike a balance between everything and nothing.

The simple fact is that there are more and more people using the site and that as use grows we are exceeding our machine capacity.

Last year we had more than a million visits to the site. Not in the same league as GC, but still not a trivial amount either.

mtrax
Posts: 1974
Joined: 19 December 06 9:57 am
Location: Weston Creek, Canberra

Post by mtrax » 13 August 08 7:39 am

So I wonder what our realistic budget is?
e.g. if there are 1,000 GCA members, would a $5/yr membership fee, or something near that, be a reasonable guess?
Failing that, are there some fallback options? Either scale back the functions or, as I mentioned earlier, create a tiered system, moving some of the grunt work to some back-end nodes.
Can we put a request out to people to see if an ISP will donate a colo spot?

caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Post by caughtatwork » 13 August 08 9:56 am

mtrax wrote:So I wonder what our realistic budget is?
e.g. if there are 1,000 GCA members, would a $5/yr membership fee, or something near that, be a reasonable guess?
Failing that, are there some fallback options? Either scale back the functions or, as I mentioned earlier, create a tiered system, moving some of the grunt work to some back-end nodes.
Can we put a request out to people to see if an ISP will donate a colo spot?
If you could that would be much appreciated.

mtrax
Posts: 1974
Joined: 19 December 06 9:57 am
Location: Weston Creek, Canberra

Post by mtrax » 13 August 08 11:32 am

Before I go and canvass my local ISP and others, let's assume they accept a subsidy arrangement, i.e. a discount on colo: what would be our most likely budget?

Based on a realistic number of current GCA users, contributing something like $5-$10 per year?

caughtatwork
Posts: 17013
Joined: 17 May 04 12:11 pm
Location: Melbourne

Post by caughtatwork » 13 August 08 12:04 pm

A poll conducted earlier this year is indicative of intent.
http://forum.geocaching.com.au/viewtopic.php?t=9826

80 people indicated a total of $2,370

Of course, when it comes to hand-in-pocket time, some may back out, some may offer more, and others may join the fray.

So I'd guess a figure of about $2,000 would be right.

You have the essential numbers with regards to RAM and CPU. Bandwidth I can't answer for as I don't have any numbers I consider satisfactory.

About 2 years ago (early 2006) we were chewing through about 15-20GB of download per month.

I would say that the volume will have increased significantly since then and we're probably around 30-40GB per month.

Would those numbers help start the discussions?
