Excuse me? Buy tools for something any Pythonista can do in seventeen lines?
You'll need the pyodbc module, which you should have anyway because it rocks.
    import csv
    import pyodbc

    # Hardcoded connect string -- substitute your own driver, server, and credentials
    cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=myserver;DATABASE=mydb;UID=me;PWD=secret')
    curs = cnxn.cursor()
    curs.execute('SELECT name FROM sys.tables')
    table_names = [row.name for row in curs.fetchall()]
    for tblname in table_names:
        curs.execute('SELECT * FROM %s' % tblname)
        with open(tblname + '.csv', 'wb') as outfile:  # on Python 3, use 'w', newline=''
            writer = csv.writer(outfile)
            writer.writerows(curs)
Go on, get more sophisticated with the hardcoded connect string, etc.
AND A NOTE: If the ultimate destination of your .csv is to be imported into another database, you'd better distinguish between empty strings and NULLs. To do that, write out the rows like this instead:

    for row in curs:
        writer.writerow(['NULL' if r is None else r for r in row])
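A quick sanity check of that substitution, using an in-memory buffer instead of a real .csv file (the sample rows are made up):

```python
import csv
import io

# Two rows: one with a SQL NULL (None), one with an empty string.
rows = [(1, None), (2, '')]

buf = io.StringIO()
writer = csv.writer(buf)
for row in rows:
    # Same substitution as above: NULLs become the literal string 'NULL'
    writer.writerow(['NULL' if r is None else r for r in row])

print(buf.getvalue())
```

The NULL row comes out as "1,NULL" while the empty-string row comes out as "2," -- so the importing database can tell them apart.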
AND, if you end up importing this .csv into MySQL, you'll want to set ESCAPED BY '' in your LOAD DATA statement, or else backslashes will start mucking up your field boundaries. (Thanks to Shannon -jj Behrens for saving my sanity on that one.) Here's my script to consume the files:
    for fn in *.csv
    do
        tbl=$(basename "$fn" .csv)   # table name = file name minus .csv
        mysql myinst -e "SET FOREIGN_KEY_CHECKS=0;
            DELETE FROM $tbl;
            LOAD DATA LOCAL INFILE '$fn'
            INTO TABLE $tbl
            COLUMNS TERMINATED BY ','
            OPTIONALLY ENCLOSED BY '\"'
            ESCAPED BY '';"
    done
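Why ESCAPED BY ''? Python's csv module passes backslashes through verbatim (it protects special characters with double quotes instead), while LOAD DATA's default escape character is the backslash, so MySQL would treat a backslash in your data as escaping whatever follows it. A quick look at what csv actually emits (sample data is made up):

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf)
# A field that ends in a backslash -- csv.writer does not escape it.
writer.writerow(['C:\\temp\\', 'next field'])

print(buf.getvalue())
# With MySQL's default escape character, that trailing backslash would
# escape the comma and merge the two fields; ESCAPED BY '' avoids that.
```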