sql – Inserting a large number of rows from CSV through a procedure

Question:

There is a task of uploading a CSV file via the web interface. During the upload, client-side checks verify that the data in the file is correct.
We currently insert millions of rows by calling a single-row insert procedure; that is, the procedure is called once for each line in the file.

Schematic view of the procedure:

procedure example(iValue1 IN Table.column1%TYPE,
                  iValue2 IN Table.column2%TYPE,
                  iValue3 IN Table.column3%TYPE) is
begin
  insert into table(column1, column2, column3) values (iValue1, iValue2, iValue3);
end;

Calling this procedure millions of times is prohibitively slow. How can you change the approach to speed up the insertion process?

Answer:

If you cannot use SQL*Loader or an insert from an external table, then you can try a bulk insert with FORALL. The block below compares a row-by-row loop against FORALL on the same 100,000-row collection:
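For reference, the external-table route usually looks like the sketch below. The names here (directory object csv_dir, file data.csv, table products_ext) are assumptions for illustration; the directory object must already exist and point to a location the database can read.

```sql
-- Hypothetical external table over the uploaded CSV file.
-- Assumes a directory object csv_dir and a file data.csv in it.
CREATE TABLE products_ext (
  product_id     NUMBER,
  product_name   VARCHAR2(100),
  effective_date DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('data.csv')
);

-- A single set-based statement then loads the whole file:
INSERT INTO products SELECT * FROM products_ext;
```

Because the load is one INSERT ... SELECT, there is no per-row PL/SQL call at all, which is why this option is listed first.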

DECLARE
  TYPE prod_tab IS TABLE OF products%ROWTYPE;
  products_tab prod_tab := prod_tab();
  start_time   NUMBER;
  end_time     NUMBER;
BEGIN
  -- Populate a collection - 100000 rows
  SELECT * BULK COLLECT INTO products_tab FROM products;

  EXECUTE IMMEDIATE 'TRUNCATE TABLE products';
  start_time := DBMS_UTILITY.get_time;
  -- Conventional insert: one SQL engine call per row
  FOR i IN products_tab.FIRST .. products_tab.LAST LOOP
    INSERT INTO products (product_id, product_name, effective_date)
    VALUES (products_tab(i).product_id, products_tab(i).product_name,
            products_tab(i).effective_date);
  END LOOP;
  end_time := DBMS_UTILITY.get_time;
  DBMS_OUTPUT.PUT_LINE('Conventional Insert: ' || TO_CHAR(end_time - start_time));

  EXECUTE IMMEDIATE 'TRUNCATE TABLE products';
  start_time := DBMS_UTILITY.get_time;
  -- Bulk insert: the whole collection is sent to the SQL engine in one call
  FORALL i IN products_tab.FIRST .. products_tab.LAST
    INSERT INTO products VALUES products_tab(i);
  end_time := DBMS_UTILITY.get_time;
  DBMS_OUTPUT.PUT_LINE('Bulk Insert: ' || TO_CHAR(end_time - start_time));
  COMMIT;
END;
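Applied to the original question, the same idea means replacing the single-row procedure with one that accepts collections, so the caller passes a whole batch of parsed CSV values per call. A minimal sketch, assuming the built-in SYS.ODCIVARCHAR2LIST collection type is acceptable for the column values and using the hypothetical names example_bulk and table_name:

```sql
-- Sketch of a batch version of the asker's procedure. The web layer
-- parses the CSV and passes the three columns as parallel collections.
CREATE OR REPLACE PROCEDURE example_bulk(
  iValues1 IN SYS.ODCIVARCHAR2LIST,
  iValues2 IN SYS.ODCIVARCHAR2LIST,
  iValues3 IN SYS.ODCIVARCHAR2LIST
) IS
BEGIN
  -- One FORALL call inserts the entire batch in a single SQL engine round trip
  FORALL i IN 1 .. iValues1.COUNT
    INSERT INTO table_name (column1, column2, column3)
    VALUES (iValues1(i), iValues2(i), iValues3(i));
END;
```

Calling this once per few thousand rows instead of once per row removes most of the PL/SQL-to-SQL context-switch overhead, which is the main cost of the original approach.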

Examples of using BULK COLLECT and FORALL …
