python - Running a large query in mysql -


I need to grab rows in a database that contain items matching any of 175,000 items, and convert the results to a CSV file (which a Python script will later parse and analyze). Issues that come to mind: can you even paste such a large list of items into a MySQL Workbench SQL query (there is not enough memory to copy it)? Can the network support such a large data transfer? Are there other things I don't know about? What is a smart way to query and fetch this large amount of data? I am using MySQL Workbench on Windows / Windows Server, but I am open to trying a better interface.

A simple (but not practical in my case) query format:

select * from database where date >= '2017-06-01 00:00:00' and date <= '2017-07-01 00:00:00' and instr in ('ab123', 'azx0456', 'rtpz888')

*There should be about 10,000,000 records (rows) between the two specified dates. *The "instr in (...)" part would require a list of 175,000 unique items.

  1. Import the instr filter list into a separate table, for example table xx with column name instr.
  2. select * from database where date >= '2017-06-01 00:00:00' and date <= '2017-07-01 00:00:00' and instr in (select instr from xx)
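The two steps above can be sketched in Python. This is a minimal illustration using the standard-library sqlite3 module as a stand-in for MySQL (the SQL is the same shape); the table names `mytable` and `xx` and the sample rows are assumptions, not from the original post. In real MySQL you would bulk-load the 175,000 items into `xx` with LOAD DATA INFILE rather than row-by-row inserts.

```python
import sqlite3

# SQLite in-memory database stands in for the MySQL server.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The main data table: ~10,000,000 rows in the real case, a handful here.
cur.execute("CREATE TABLE mytable (date TEXT, instr TEXT, value REAL)")
cur.executemany(
    "INSERT INTO mytable VALUES (?, ?, ?)",
    [
        ("2017-06-05 10:00:00", "ab123",   1.0),
        ("2017-06-06 11:00:00", "zzz999",  2.0),  # not in the filter list
        ("2017-06-07 12:00:00", "rtpz888", 3.0),
        ("2017-08-01 09:00:00", "ab123",   4.0),  # outside the date range
    ],
)

# Step 1: load the 175,000-item filter list into its own table.
cur.execute("CREATE TABLE xx (instr TEXT PRIMARY KEY)")
cur.executemany("INSERT INTO xx VALUES (?)",
                [("ab123",), ("azx0456",), ("rtpz888",)])

# Step 2: filter via a subquery instead of a huge literal IN list.
rows = cur.execute(
    """SELECT * FROM mytable
       WHERE date >= '2017-06-01 00:00:00'
         AND date <= '2017-07-01 00:00:00'
         AND instr IN (SELECT instr FROM xx)"""
).fetchall()
print(rows)  # only the two rows that match both the dates and the filter
```

An index (here the PRIMARY KEY on `xx.instr`) matters at 175,000 items: it lets the server probe the filter table instead of scanning it for every candidate row.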

I have not been using SQL for a while, but this should be fine. For the exporting part: select * from database ... into outfile "aa.txt" ....
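If SELECT ... INTO OUTFILE is not an option (it writes to the server's filesystem, not the client's), an alternative is to stream the result set from Python in chunks and write the CSV client-side. A sketch of that pattern, again with sqlite3 standing in for MySQL (with a MySQL driver you would use a server-side cursor so the 10M rows are not buffered in client memory; table name and sample data are assumptions):

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE mytable (date TEXT, instr TEXT, value REAL)")
cur.executemany(
    "INSERT INTO mytable VALUES (?, ?, ?)",
    [("2017-06-%02d 00:00:00" % (i + 1), "ab123", float(i)) for i in range(10)],
)

cur.execute("SELECT date, instr, value FROM mytable")

# In practice: open("aa.csv", "w", newline="") instead of an in-memory buffer.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["date", "instr", "value"])

while True:
    chunk = cur.fetchmany(1000)  # tune chunk size to available memory
    if not chunk:
        break
    writer.writerows(chunk)
```

The fetchmany() loop keeps only one chunk of rows in memory at a time, which sidesteps both the Workbench memory concern and any single giant transfer.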

