355412 Website SEO help and/or PERL

In Progress Posted Oct 1, 2009 Paid on delivery

There may be more than one way to approach this: either programming a standalone program or modifying my existing CGI Perl script. Or you tell me.

I have a vacation rental property listing site, and for some reason I cannot get Google to spider and index all of the rental listing pages. There are approximately 150 separate STATIC PAGE listings, each one with its own URL.

i.e. [url removed, login to view] [url removed, login to view] [url removed, login to view] etc. etc.

These are flat listings - there is no database call.

I would like a program that will index my site with ALL of the subdirectories listed, and that I can run at any time to recreate the XML sitemap whenever properties are added or deleted, so that I can resubmit the updated sitemap for Google to re-index.

I have tried generating an XML sitemap online, but even that won't pick up all of the subdirectories.
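For reference, the core of such a tool is small: walk the web root, collect every static HTML page, and emit one `<url>` entry per file in sitemaps.org format. A minimal sketch follows, written in Python for illustration (the same walk-and-emit logic ports directly to a Perl script); `DOC_ROOT` and `BASE_URL` are hypothetical placeholders, not details from this posting:

```python
import os

DOC_ROOT = "/var/www/html"           # assumed web root (hypothetical)
BASE_URL = "http://www.example.com"  # assumed site domain (hypothetical)


def build_sitemap(doc_root, base_url):
    """Walk doc_root for static .html/.htm files and return sitemap XML."""
    entries = []
    for dirpath, _dirnames, filenames in os.walk(doc_root):
        for name in sorted(filenames):
            if not name.endswith((".html", ".htm")):
                continue
            # Convert the file path to a URL relative to the web root.
            rel = os.path.relpath(os.path.join(dirpath, name), doc_root)
            url = base_url + "/" + rel.replace(os.sep, "/")
            entries.append("  <url><loc>%s</loc></url>" % url)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )


if __name__ == "__main__":
    # Re-run this any time listings are added or removed, then resubmit.
    with open("sitemap.xml", "w") as f:
        f.write(build_sitemap(DOC_ROOT, BASE_URL))
```

Because the listings are flat files with no database call, a filesystem walk like this is enough; no HTTP crawling is needed.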

The whole reason for this? Since the Caffeine update I have dropped from number 1 on over 35 keyword SERPs; now I am in the 50s to 100s or worse. I had been number one for several years (like 8 years) and held many more top-10 spots for keywords totally relevant/related to my website. Now I am toast. Gee, thanks, Caffeine, for your "new and improved" algorithm.

If you have any other ideas, let me know.

PM me for the website and to ask me further details so that you can provide me with your proposal.

-Mike

Odd Jobs Perl SEO

Project ID: #2101245

About the project

Remote project Active Jul 11, 2012