I am trying to load 100 million records with 18 fields each from a 15 GB CSV file into Oracle using SQL*Loader, but only about 89 million records were inserted; the rest were not, and the log shows no errors. I set every field to CHAR in the SQL*Loader control file and in the Oracle table to avoid data type conflicts. This is my control (.ctl) file:
options (skip=1, parallel=true, errors=5000)
load data
characterset UTF8
infile 'file.csv'
append
into table table_name
fields terminated by ','
trailing nullcols (
name char(200),
family char(200)
...
)
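In case the missing rows are being silently rejected or discarded, a variant of the infile clause that names the bad and discard files explicitly (a sketch; the .bad and .dsc file names are placeholders I chose, the rest of the control file stays the same) would let me see where they went:

-- records rejected because of errors go to the badfile,
-- records filtered out without raising an error go to the discardfile
infile 'file.csv'
badfile 'file.bad'
discardfile 'file.dsc'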
The SQL*Loader command is:
sqlldr userid=user/pass@ip:port/sid control=control_file.ctl
data=file_name.csv log=log_file_name.log
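For the same reason, the bad and discard files can also be named on the command line (bad= and discard= are standard sqlldr parameters; the file names here are placeholders). Afterwards, the "Total logical records read / rejected / discarded" summary at the bottom of the log accounts for every record:

sqlldr userid=user/pass@ip:port/sid control=control_file.ctl data=file_name.csv bad=file_name.bad discard=file_name.dsc log=log_file_name.log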
My tablespace options are: size 77.2 GB, autoextend on, maxsize unlimited. Why does this happen? Is it a memory capacity issue, or some other option in Oracle or SQL*Loader?
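To check whether it is a space problem rather than a loader problem, this is the sanity check I can run (table_name is my table; the second query needs access to the DBA views):

-- rows that actually landed in the table
select count(*) from table_name;

-- free space remaining per tablespace
select tablespace_name,
       round(sum(bytes)/1024/1024/1024, 1) as free_gb
from dba_free_space
group by tablespace_name;

If free_gb is still large and the count stays at roughly 89 million across runs, the limit is more likely in the loader settings (errors, bad/discard handling) than in the tablespace.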