python - Running a large query in MySQL
I need to grab the rows in a database that match a list of 175,000 items, and convert the results to a CSV file (which I will later parse and analyze with a Python script). The issues that come to mind are: can you feed such a large list of items into a Workbench SQL query (there is not enough memory to copy it)? Will the network support such a large data transfer? What else don't I know about? What is a smart way to query and fetch this large an amount of data? I'm using MySQL Workbench on Windows / Windows Server, but I'm open to trying a better interface option.
A simple (but not practical in my case) query would look like:
SELECT * FROM database WHERE date >= '2017-06-01 00:00:00' AND date <= '2017-07-01 00:00:00' AND instr IN ('ab123', 'azx0456', 'rtpz888')
*There should be about 10,000,000 records (rows) between the two specified dates. *The "instr IN (...)" part would require a list of 175,000 unique items.
- Import the instr filter values into a separate table, for example a table xx with a column named instr, and use a subquery:
SELECT * FROM database WHERE date >= '2017-06-01 00:00:00' AND date <= '2017-07-01 00:00:00' AND instr IN (SELECT instr FROM xx)
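Since the 175,000 values are too large to paste into Workbench, the filter table itself can be populated from Python in batches. A minimal sketch, assuming a DB-API connection (e.g. PyMySQL) and the table/column names xx and instr from the answer; the batch size and VARCHAR length are illustrative assumptions:

```python
def chunked(seq, size):
    """Split a long list into insert-sized batches."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

def load_filter(conn, items, batch_size=5000):
    """Bulk-insert the instrument IDs into the filter table xx.

    `conn` is assumed to be a DB-API connection (e.g. from pymysql.connect);
    nothing here is specific to one driver.
    """
    with conn.cursor() as cur:
        cur.execute(
            "CREATE TABLE IF NOT EXISTS xx (instr VARCHAR(32) PRIMARY KEY)"
        )
        for batch in chunked(items, batch_size):
            # INSERT IGNORE skips duplicates among the 175,000 values
            cur.executemany(
                "INSERT IGNORE INTO xx (instr) VALUES (%s)",
                [(v,) for v in batch],
            )
    conn.commit()

# The batching helper alone, with no database needed:
batches = list(chunked([f"id{i}" for i in range(175000)], 5000))
```

Loading the list once and letting MySQL do the filtering server-side avoids both the Workbench memory problem and shipping the 175,000-value list with every query.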
I have not been using SQL for a while, but this should be fine. For the exporting part: SELECT * FROM database ... INTO OUTFILE "aa.txt" ....
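For the later parsing step: by default, SELECT ... INTO OUTFILE writes tab-delimited text, which Python's csv module reads directly. A small sketch, where the column layout (date, instr, price) is an assumption for illustration, not from the original post:

```python
import csv
import io

# Sample of what an INTO OUTFILE result might look like (tab-delimited,
# one row per line). In practice you would open("aa.txt") instead.
sample = (
    "2017-06-01 09:30:00\tab123\t10.5\n"
    "2017-06-01 09:31:00\trtpz888\t11.2\n"
)

# csv.reader with a tab delimiter parses each line into its columns.
rows = [
    {"date": date, "instr": instr, "price": float(price)}
    for date, instr, price in csv.reader(io.StringIO(sample), delimiter="\t")
]
```

If a comma-separated file is preferred, the export statement can specify FIELDS TERMINATED BY ',' instead of relying on the tab default.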