Bulk Bind Reminder: ORA-06502: PL/SQL: numeric or value error: Bulk Bind: Truncated Bind


I recently had a good reminder of one of the subtleties of PL/SQL bulk fetches and inserts. I kept getting this error on the FETCH:

ORA-06502: PL/SQL: numeric or value error: Bulk Bind: Truncated Bind  

The procedure's purpose is to bulk insert data from a temporary staging table into the real load table, whose columns are sized to fit the final data. Simple, right? Except when I use the wrong table type.

My problem?

The columns I was selecting in my cursor were wider than the fields of the table type I was fetching into.

I defined my cursor using a query on the temporary staging table (larger VARCHAR2 columns), not the real table (exact-fit VARCHAR2, NUMBER and DATE columns).
But I defined the table type that receives the cursor results on my real table:


CURSOR c_dl IS
   SELECT ...
          ...
          ...
     FROM my_staging_table stg
    WHERE action IN ('INSERT', 'UPDATE');

TYPE my_load_table_tt IS TABLE OF my_load_table%ROWTYPE
   INDEX BY PLS_INTEGER;

load_data_arr my_load_table_tt;


BEGIN
   OPEN c_dl;
   LOOP
      FETCH c_dl BULK COLLECT INTO load_data_arr LIMIT 1000;  -- Error happens here!
      EXIT WHEN load_data_arr.COUNT = 0;
      FORALL ii IN 1 .. load_data_arr.COUNT
         INSERT INTO my_load_table
         ...
         ...
   END LOOP;
   CLOSE c_dl;

The Bulk Bind: Truncated Bind error occurs because the cursor's results, with their bigger elements, cannot fit into the defined array/table type; the staging columns are simply too wide for the load table's fields.
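To make the failure concrete, here is a minimal, self-contained sketch (table and column names are hypothetical, not from my actual code) that reproduces it:

CREATE TABLE demo_stg  ( val VARCHAR2(100) );
CREATE TABLE demo_load ( val VARCHAR2(10) );

INSERT INTO demo_stg VALUES ( RPAD('x', 50, 'x') );  -- 50 characters: fits staging only

DECLARE
   CURSOR c IS SELECT val FROM demo_stg;   -- VARCHAR2(100) elements
   TYPE demo_load_tt IS TABLE OF demo_load%ROWTYPE
      INDEX BY PLS_INTEGER;
   arr demo_load_tt;                       -- VARCHAR2(10) targets
BEGIN
   OPEN c;
   FETCH c BULK COLLECT INTO arr;          -- ORA-06502 ... Bulk Bind: Truncated Bind
   CLOSE c;
END;
/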

To correct this, I changed the definition of the table type to be a table of the staging table's %ROWTYPE. Then the FETCH proceeds fine, as does the INSERT.
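In sketch form, the corrected declaration looks like this (type name assumed):

TYPE my_staging_table_tt IS TABLE OF my_staging_table%ROWTYPE
   INDEX BY PLS_INTEGER;

load_data_arr my_staging_table_tt;

Declaring the type as TABLE OF c_dl%ROWTYPE is another option worth knowing; it tracks the cursor's select list exactly, so the collection can never drift out of sync with the query.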

But wait, why didn't the INSERT fail, since I am inserting from a (potentially) large-column cursor into the exact-fit table?

The INSERT proceeds because the elements being inserted fit: the transforms and validations (performed earlier in the code) have ensured that all elements are indeed the size they should be for the exact-fit table. So the INSERT does not complain.
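For example, one such validation pass might look something like this (the column name and size are hypothetical):

UPDATE my_staging_table
   SET cust_name = SUBSTRB(TRIM(cust_name), 1, 40)  -- trim, then cut to the load column's width
 WHERE action IN ('INSERT', 'UPDATE');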

LOG ERRORS REJECT LIMIT UNLIMITED

Now, if one column in one row had been too large, or a date had the wrong format, the entire INSERT would have failed, unless I used the LOG ERRORS clause, which I did.

If one or more records are bad, I want to load all the good records and single out the bad ones. To do this, I used the LOG ERRORS REJECT LIMIT UNLIMITED clause at the end of the INSERT statement:

INSERT INTO my_load_table
VALUES ( load_data_arr(ii).col1,
         load_data_arr(ii).col2,
         load_data_arr(ii).col3,
         ...
         load_data_arr(ii).coln )
LOG ERRORS REJECT LIMIT UNLIMITED;

This LOG ERRORS REJECT LIMIT UNLIMITED clause ensures that all clean rows insert, and all "bad" rows get logged to an error table, in my case called ERR$_MY_LOAD_TABLE.
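The rejected rows can then be inspected (and reprocessed) straight from that table; the ORA_ERR_% columns are added automatically by DBMS_ERRLOG. For example:

SELECT ora_err_number$,   -- Oracle error code for the rejected row
       ora_err_mesg$,     -- full error message text
       ora_err_optyp$,    -- operation type ('I' for insert)
       col1, col2         -- the rejected values, stored as VARCHAR2
  FROM err$_my_load_table;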

I created the error log table previously, using the command:

BEGIN
   DBMS_ERRLOG.create_error_log (dml_table_name => 'MY_LOAD_TABLE');
END;
/
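If the default ERR$_ name doesn't suit, CREATE_ERROR_LOG also accepts an explicit name; for example (the error table name here is hypothetical):

BEGIN
   DBMS_ERRLOG.create_error_log (
      dml_table_name     => 'MY_LOAD_TABLE',
      err_log_table_name => 'MY_LOAD_TABLE_ERR');
END;
/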

We define the staging tables with generous VARCHAR2s on purpose. We want to allow leeway in the incoming data, knowing we can strip punctuation, perform lookups, format dates, and perform other transformations in the validation stage. That means all of that validation is on us, to make sure the data is in shape for the final insert. It's just one way of doing things; there are many other approaches.
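To illustrate the idea, a hypothetical staging/load pair in that style might look like:

CREATE TABLE my_staging_table (        -- generous: accepts messy incoming data
   action    VARCHAR2(30),
   cust_name VARCHAR2(4000),
   load_date VARCHAR2(4000)            -- raw date text, parsed during validation
);

CREATE TABLE my_load_table (           -- exact fit: the real target
   cust_name VARCHAR2(40),
   load_date DATE
);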

My reminder, especially when writing things quickly, is to pay attention to the nuances of the features in use. I won't make that mistake for a while now :).

For a full treatment of bulk processing with BULK COLLECT and FORALL, see Steven Feuerstein's article PL/SQL 101: Bulk Processing with BULK COLLECT and FORALL.
