Question:
The task is to load a CSV file through a web interface. During upload, client-side checks verify that the data in the file is valid.
Currently, millions of rows are inserted by calling an insert procedure one row at a time; that is, the procedure is invoked once for each line of the file.
Schematic view of the procedure:
procedure example(iValue1 IN Table.column1%TYPE,
                  iValue2 IN Table.column2%TYPE,
                  iValue3 IN Table.column3%TYPE) is
begin
  insert into Table(column1, column2, column3) values (iValue1, iValue2, iValue3);
end;
Calling this procedure millions of times is unacceptably slow. How can the approach be changed to speed up the insertion process?
Answer:
If it is not possible to use SQL*Loader or an insert from an External Table, you can try a bulk insert with FORALL:
DECLARE
  TYPE prod_tab IS TABLE OF products%ROWTYPE;
  products_tab prod_tab := prod_tab();
  start_time   NUMBER;
  end_time     NUMBER;
BEGIN
  -- Populate a collection - 100,000 rows
  SELECT * BULK COLLECT INTO products_tab FROM products;

  EXECUTE IMMEDIATE 'TRUNCATE TABLE products';

  -- Conventional insert: one context switch per row
  start_time := DBMS_UTILITY.get_time;
  FOR i IN products_tab.first .. products_tab.last LOOP
    INSERT INTO products (product_id, product_name, effective_date)
    VALUES (products_tab(i).product_id,
            products_tab(i).product_name,
            products_tab(i).effective_date);
  END LOOP;
  end_time := DBMS_UTILITY.get_time;
  DBMS_OUTPUT.PUT_LINE('Conventional Insert: ' || TO_CHAR(end_time - start_time));

  EXECUTE IMMEDIATE 'TRUNCATE TABLE products';

  -- Bulk insert: one context switch for the whole collection
  start_time := DBMS_UTILITY.get_time;
  FORALL i IN products_tab.first .. products_tab.last
    INSERT INTO products VALUES products_tab(i);
  end_time := DBMS_UTILITY.get_time;
  DBMS_OUTPUT.PUT_LINE('Bulk Insert: ' || TO_CHAR(end_time - start_time));

  COMMIT;
END;
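Applied to the original task, the same FORALL technique means changing the procedure to accept collections instead of scalars, so the web layer calls it once per batch rather than once per row. A sketch, assuming the schematic table from the question (the package, type, and parameter names here are illustrative, not from the original code):

```sql
CREATE OR REPLACE PACKAGE example_pkg IS
  -- One collection type per column of the schematic table
  TYPE col1_tab IS TABLE OF Table.column1%TYPE INDEX BY PLS_INTEGER;
  TYPE col2_tab IS TABLE OF Table.column2%TYPE INDEX BY PLS_INTEGER;
  TYPE col3_tab IS TABLE OF Table.column3%TYPE INDEX BY PLS_INTEGER;

  PROCEDURE example_bulk(iValues1 IN col1_tab,
                         iValues2 IN col2_tab,
                         iValues3 IN col3_tab);
END example_pkg;
/

CREATE OR REPLACE PACKAGE BODY example_pkg IS
  PROCEDURE example_bulk(iValues1 IN col1_tab,
                         iValues2 IN col2_tab,
                         iValues3 IN col3_tab) IS
  BEGIN
    -- A single call inserts the whole batch with one context switch
    FORALL i IN 1 .. iValues1.COUNT
      INSERT INTO Table (column1, column2, column3)
      VALUES (iValues1(i), iValues2(i), iValues3(i));
  END example_bulk;
END example_pkg;
/
```

The caller would then accumulate parsed CSV lines into the three collections and invoke example_bulk once every batch (say, every 10,000 lines), which removes the per-row call and context-switch overhead.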
Examples of using BULK COLLECT and FORALL…
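For reference, the External Table route mentioned above avoids per-row PL/SQL entirely: the uploaded CSV is exposed as a read-only table and loaded with a single set-based statement. A sketch, assuming the file is available in a server directory (the directory path, file name, and column definitions are illustrative):

```sql
-- Directory object pointing at where the uploaded CSV is stored
CREATE OR REPLACE DIRECTORY csv_dir AS '/data/uploads';

-- External table mapped onto the CSV file (schematic column list)
CREATE TABLE table_ext (
  column1 VARCHAR2(100),
  column2 VARCHAR2(100),
  column3 VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('upload.csv')
);

-- One set-based load instead of millions of procedure calls
INSERT /*+ APPEND */ INTO Table (column1, column2, column3)
SELECT column1, column2, column3 FROM table_ext;
```

This keeps the load inside the database engine; the validation currently done client-side can stay as it is, since the file contents are checked before upload.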