Code: Select all
<?php
// The header charset must match the encoding declared in the XML prolog below
header("Content-Type: text/xml; charset=UTF-8");
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\r\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\r\n";
// One file path per line; FILE_IGNORE_NEW_LINES strips the line endings so
// FILE_SKIP_EMPTY_LINES can actually drop blank lines
$fileLines = file('FileList.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($fileLines as $nameLine)
{
    echo ' <url>' . "\r\n";
    // Escape the file name so characters such as & stay valid XML
    echo '  <loc>http://' . $_SERVER["HTTP_HOST"] . '/' . htmlspecialchars($nameLine) . '</loc>' . "\r\n";
    echo '  <lastmod>' . date("Y-m-d") . '</lastmod>' . "\r\n";
    echo '  <changefreq>weekly</changefreq>' . "\r\n";
    echo '  <priority>0.5</priority>' . "\r\n";
    echo ' </url>' . "\r\n";
}
echo '</urlset>';
?>
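For reference, FileList.txt is just one file path per line, relative to the folder sitemap.php sits in. The names below are made-up examples:

Code: Select all
index.php
about.html
news/latest.html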
In your .htaccess file add:
RewriteEngine on
RewriteRule ^sitemap\.xml$ sitemap.php [NC,L]
Then give Google etc. your sitemap link as http://yoursite.com/sitemap.xml and save sitemap.php in the site root. Because the script builds URLs from $_SERVER["HTTP_HOST"], every domain pointing at that same folder gets its own unique sitemap.
Could be an interesting extension or addition to WB. As a WB addition it could be improved to read the folder contents directly, letting you specify in WB which file types to scan for and which files to exclude, or WB could just generate FileList.txt with the files it would normally put in the sitemap. That would make for a much smaller sitemap file too; a rough sketch of the folder-scanning idea follows.
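As a sketch only: something like this could generate FileList.txt itself. The $includeTypes and $excludeFiles settings are invented placeholders here, not anything WB actually provides:

Code: Select all
<?php
// Invented example settings; a real WB addition would presumably expose these in the admin area
$includeTypes = array('html', 'htm', 'php');        // file types to scan for
$excludeFiles = array('sitemap.php', 'config.php'); // files to exclude

$found = array();
foreach (scandir(dirname(__FILE__)) as $name) {
    if (!is_file(dirname(__FILE__) . '/' . $name)) continue; // skip folders and dot entries
    if (in_array($name, $excludeFiles)) continue;            // honour the exclude list
    $ext = strtolower(pathinfo($name, PATHINFO_EXTENSION));
    if (in_array($ext, $includeTypes)) {
        $found[] = $name;
    }
}
file_put_contents('FileList.txt', implode("\r\n", $found) . "\r\n");
?>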