sql - How to make a batch insert with ColdFusion having more than 1000 records?


I have a spreadsheet containing around 3000 records, and I need to insert this data into a new table. In this case a batch insert mechanism seems like a good fit.

So I tried a simple example:

    <cfquery datasource="cse">
        INSERT INTO names
        VALUES
        <cfloop from="1" to="3000" index="i">
            ('#i#')
            <cfif i lt 3000>,</cfif>
        </cfloop>
    </cfquery>

But SQL Server 2008 only allows 1000 rows per insert at a time, so I am getting an error.

So how can I make separate batches of 999 records each and execute them one at a time?

You can use the BULK INSERT statement, which should cope with extremely large datasets.

The data needs to be in a CSV file, and you'll have to create a variable for the file location.

    <cfquery datasource="cse">
        BULK INSERT names
        FROM '#variables.scsvlocation#'
    </cfquery>
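If your CSV uses comma field separators and newline row endings, you can state that explicitly with the WITH clause. A minimal sketch, assuming the file path in variables.scsvlocation is reachable from the SQL Server machine and that your login has bulk-insert permission:

    <cfquery datasource="cse">
        BULK INSERT names
        FROM '#variables.scsvlocation#'
        WITH (
            FIELDTERMINATOR = ',',  -- column separator in the CSV
            ROWTERMINATOR = '\n',   -- line ending between rows
            FIRSTROW = 2            -- skip a header row, if the CSV has one
        )
    </cfquery>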

If you have a reason not to use BULK INSERT and want to break the insert down into loops of 999, you'll have to work out how many 'records' are in the dataset and divide by 999 to get the number of times you'd have to loop over it. A sketch of that approach is shown below.
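A minimal sketch of that batching approach, assuming the same cse datasource and a single-column names table; the 3000-row range stands in for your spreadsheet data:

    <cfset totalRows = 3000>
    <cfset batchSize = 999>
    <!--- number of batches needed, rounding up --->
    <cfset batchCount = ceiling(totalRows / batchSize)>

    <cfloop from="1" to="#batchCount#" index="batch">
        <cfset startRow = ((batch - 1) * batchSize) + 1>
        <cfset endRow = min(batch * batchSize, totalRows)>
        <cfquery datasource="cse">
            INSERT INTO names
            VALUES
            <cfloop from="#startRow#" to="#endRow#" index="i">
                ('#i#')
                <cfif i lt endRow>,</cfif>
            </cfloop>
        </cfquery>
    </cfloop>

Each pass sends at most 999 rows, which stays under SQL Server's 1000-row limit for a single VALUES list.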

