Mike C. Fletcher
Attached is a simple little script. It automates downloading nightly
CVS tarballs for multiple projects from SourceForge (you do back up your
project's CVS, don't you? And if you're like me, you find going through
the web interface to get the tarball for each project a pain). It's
by no means a particularly complex script, but there's no reason for everyone
(well, everyone with lots of projects to maintain) to re-invent the wheel.
At the moment the projects to download, and the directory to download
them to, are coded as variables in the script. If you pass in
command-line arguments, it'll treat each one as a project name and download
to the current directory.
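For example, to grab snapshots of just two of the projects (names taken from the list in the script) into whatever directory you happen to be in:

```shell
python sourceforgebackup.py simpleparse pyopengl
```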
Just for the record, on Win2K, to schedule weekly downloads of the set
of projects encoded in the script:
Y:\>at 06:47 /interactive /EVERY:W c:\bin\lang\py23\python.exe
d:\pylive\sourceforgebackup.py
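The cron equivalent on a unix box would be something along these lines (the interpreter and script paths here are hypothetical; adjust to taste):

```shell
# crontab entry: run the backup script every Wednesday at 06:47
47 6 * * 3 /usr/bin/python /home/mcfletch/sourceforgebackup.py
```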
Have fun all,
Mike
_______________________________________
Mike C. Fletcher
Designer, VR Plumber, Coder
http://members.rogers.com/mcfletch/
"""Example script to automate downloading of SF project cvs backups
This script is neither extensively tested nor warranted in any
way; use at your own risk!
Assumes you have bzip2 and gzip on your path. The script uses
bzip2 -d to confirm that the downloaded file really is the archive
and not an HTML error page.
Usage:
Set downloadDirectory to the directory where you
want to store the snapshots of the CVS repositories.
Set projects to a sequence of unix project names you
want to back up from SourceForge.
Then set up a cron or at job that runs the script every week
or so to keep your backups up to date. If everything
goes correctly, nothing will be written to stderr; if something
fails, messages will be written there.
XXX Realistically, the CVS snapshot *should* include everything
from all previous snapshots, so the script should likely have a way
to kill off the old copies once it confirms that the new copies
are functional. I'm just a pack-rat.
"""
import time, os, urllib, sys, traceback
downloadDirectory = 'y:\\pending backups'
projects = [
    'simpleparse',
    'pyopengl',
    'ttfquery',
    'wxappbar',
    'twistedsnmp',
    'pyvrml97',
    'wxpycolors',
    'pydispatcher',
    'wxpypropdist',
    'conflictsolver',
    'pyspelling',
    'resourcepackage',
    'pytable',
    'basicproperty',
]
def retrieve( projectName ):
    """Given a projectName, retrieve and store download to downloadDirectory"""
    os.chdir( downloadDirectory )
    url = "http://cvs.sourceforge.net/cvstarballs/%(projectName)s-cvsroot.tar.bz2"%locals()
    date = time.strftime( '%Y-%m-%d' )
    fileName = '%(projectName)s-%(date)s-cvsroot.tar.bz2'%locals()
    file = os.path.join( downloadDirectory, fileName )
    def doDots( current, block, total ):
        # progress callback for urlretrieve: one dot per block retrieved
        sys.stdout.write( '.' )
    print 'Retrieving:\n %(url)s\nInto:\n %(fileName)s'%locals()
    urllib.urlretrieve( url, file, doDots )
    print
    print 'Decompressing bzip format'
    # bzip2 -d doubles as a sanity check: it fails if we were
    # served an HTML error page instead of the archive
    if os.system( 'bzip2 -f -d %(fileName)s'%locals() ):
        raise IOError( """Failure unpacking %(fileName)s"""%locals())
    print 'Recompressing in gzip format'
    tarFile = os.path.splitext(fileName)[0]
    if os.system( 'gzip -f -9 %(tarFile)s'%locals() ):
        raise IOError( """Failure gzipping %(fileName)s"""%locals())
    print 'Finished'
if __name__ == "__main__":
    if sys.argv[1:]:
        # in this mode, download passed projects to current directory
        todo = sys.argv[1:]
        downloadDirectory = '.'
    else:
        todo = projects
    failed = []
    for project in todo:
        try:
            retrieve( project )
        except Exception, err:
            failed.append( project )
            traceback.print_exc()
    if failed:
        sys.stderr.write( """\n\nWARNING: Failures downloading these projects:\n %s"""%(
            ", ".join( failed ),
        ))
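The pruning the docstring's XXX note describes could be sketched roughly like this (pruneOld is a hypothetical helper, not part of the posted script; it assumes the fileName pattern used above and keeps only the newest dated snapshot per project):

```python
import os, re

def pruneOld( directory, projectName ):
    """Delete all but the newest dated snapshot of projectName in directory.

    Hypothetical sketch: matches the <project>-YYYY-MM-DD-cvsroot.tar.{bz2,gz}
    names the script produces; ISO dates sort lexicographically, so the
    last entry after sorting is the newest snapshot.
    """
    pattern = re.compile(
        r'^%s-(\d{4}-\d{2}-\d{2})-cvsroot\.tar\.(?:bz2|gz)$' % re.escape( projectName )
    )
    snapshots = sorted(
        name for name in os.listdir( directory ) if pattern.match( name )
    )
    # remove everything except the newest snapshot
    for name in snapshots[:-1]:
        os.remove( os.path.join( directory, name ) )
    return snapshots[-1:]
```

You'd presumably only call this after retrieve() succeeds, so a failed download never deletes the last good copy.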