Excuse me? Buy tools for something any Pythonista can do in seventeen lines?
You'll need the pyodbc module, which you should have anyway because it rocks.
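(If you don't have it yet, pip install pyodbc will get it; you'll also need an ODBC driver for SQL Server on the machine.)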
import csv
import pyodbc

cnxn = pyodbc.connect(
    '''DRIVER={SQL Server};
       SERVER=localhost;
       DATABASE=mydb;
       UID=myname;
       PWD=mypwd''')
curs = cnxn.cursor()

def write_table(tblname):
    with open(tblname + '.csv', 'wb') as outfile:
        writer = csv.writer(outfile)
        curs.execute('SELECT * FROM %s' % tblname)
        writer.writerows(curs.fetchall())

curs.execute('SELECT name FROM sys.tables')
table_names = curs.fetchall()
for table_name in table_names:
    write_table(table_name.name)
Go on, get more sophisticated with the hardcoded connect string, etc.
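For example, here's a minimal sketch that pulls the connection pieces from environment variables instead (the variable names DB_SERVER, DB_NAME, DB_UID, and DB_PWD are just placeholders I made up -- use whatever suits your shop):

import os   # pyodbc is already imported above

cnxn = pyodbc.connect(
    'DRIVER={SQL Server};SERVER=%s;DATABASE=%s;UID=%s;PWD=%s'
    % (os.environ['DB_SERVER'],   # placeholder environment variable names
       os.environ['DB_NAME'],
       os.environ['DB_UID'],
       os.environ['DB_PWD']))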
AND A NOTE: If the ultimate destination of your .csv is to be imported into another database, you'd better distinguish between empty strings and NULLs. To do that, replace

    writer.writerows(curs.fetchall())

with

    for row in curs:
        writer.writerow(['NULL' if r is None else r
                         for r in row])
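So, in context, write_table winds up looking like this (same function as above, just with the NULL-aware loop swapped in):

def write_table(tblname):
    with open(tblname + '.csv', 'wb') as outfile:
        writer = csv.writer(outfile)
        curs.execute('SELECT * FROM %s' % tblname)
        # write the literal string NULL for database NULLs so they
        # don't get confused with empty strings on the way back in
        for row in curs:
            writer.writerow(['NULL' if r is None else r
                             for r in row])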
AND, if you end up importing this .csv into MySQL, you'll want to set ESCAPED BY '' in your LOAD DATA statement, or else backslashes will start mucking up your field boundaries. (Thanks to Shannon -jj Behrens for saving my sanity on that one). Here's my script to consume the files:
for fn in `ls *.csv`
do
    tbl=${fn%.*}
    mysql myinst -e "SET FOREIGN_KEY_CHECKS=0;
        DELETE FROM $tbl;
        LOAD DATA LOCAL INFILE '$fn'
        INTO TABLE $tbl
        COLUMNS TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '\"'
        ESCAPED BY '';"
done
2 comments:
Nice. Even better, add this before the writerows call:
writer.writerow([_[0] for _ in curs.description])
Catherine--
Long time no chat--why not just use SSIS from Microsoft? It works pretty well cross-platform.