
I'm about to develop an app to load data from a file into the DB. I have used SQL*Loader to do that through the command line and it works great, but now I want a GUI for that process.

I was thinking about calling SQL*Loader from Java with Runtime.exec(), but I think that would limit my program to Windows only, so no portability.

I would like to know if there's a more efficient/native way to load data from a file into the DB through Java. Something like:

FILE -> JAVA -> ORACLE.

Any advice is most welcome!

Cheers,

3 Answers


You can use JDBC to connect to Oracle and manage the data coming from the file. Here is a tutorial on JDBC: http://www.cs.ubc.ca/~ramesh/cpsc304/tutorial/JDBC/jdbc1.html

Edited after response:

  • SQL*Loader works on other platforms as well (maybe under a different name), so if you insist on avoiding JDBC, you can write a front-end to SQL*Loader.
  • You can improve the performance of a JDBC bulk upload with update batching, as sketched below.
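
Here is a minimal sketch of JDBC update batching. The connection URL, credentials, table, columns, and file name are illustrative, not from the question; it assumes the Oracle JDBC driver is on the classpath:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class BatchLoader {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
                 PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO my_table (col1, col2) VALUES (?, ?)")) {
                conn.setAutoCommit(false);              // commit once at the end
                int count = 0;
                try (BufferedReader in = new BufferedReader(new FileReader("data.csv"))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        String[] fields = line.split(",");
                        ps.setString(1, fields[0]);
                        ps.setString(2, fields[1]);
                        ps.addBatch();                  // queue the row instead of sending it
                        if (++count % 1000 == 0) {
                            ps.executeBatch();          // send 1000 rows in one round trip
                        }
                    }
                }
                ps.executeBatch();                      // flush the final partial batch
                conn.commit();
            }
        }
    }

Batching amortizes the network round trip over many rows, which is the main cost the per-row INSERT approach pays.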

3 Comments

I know how to use JDBC, thanks for your answer. But what I'm trying to find is a very optimized way to load data. I was trying to find something like an API to load directly into Oracle from Java.
Thanks very much. I insist on avoiding JDBC because I think it is a slow way to load lots of data into the DB; correct me if I'm wrong. Sometimes I have to load files of approx. 5 GB (22 million records). Right now I have a front-end to SQL*Loader, but I have to check which OS the app is running on so I can call the correct command line (again, correct me if I'm wrong). I'll check out update batching and see if that helps me. Thanks!!
I have a requirement like this. Is JDBC well optimized enough to do this? I don't want to go for Hibernate, JPA, iBATIS, or MyBatis. Please help!

Managing a remote process and having it depend on an Oracle utility being installed on the OS is a maintenance headache. I used associative arrays and a package function instead. Define types in the package for VARCHAR2, NUMBER, etc. input arrays. Then (basically!) build parallel arrays for each column using readLine().split(",") on the input files. Pass the arrays to a package function that uses Oracle bulk processing in a single round trip:

my_package.do_my_insert( field1_array, field2_array, field3_array ... )

-- inside the package body:
forall idx in field1_array.first .. field1_array.last
   insert into table1 ( field1, field2, field3 ... )
   values ( field1_array(idx), field2_array(idx), field3_array(idx) ... );

(There is a lot of missing code above for how to create an Oracle array type in JDBC, but that's documented elsewhere by Oracle.)
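
As a rough illustration of the JDBC side, here is a minimal sketch that passes the arrays to the package procedure. It assumes a SQL-level collection type (CREATE TYPE varchar2_tab AS TABLE OF VARCHAR2(4000)) rather than a package-level associative array, since that is the variant OracleConnection.createOracleArray accepts; the type name, procedure name, and connection details are illustrative:

    import java.sql.Array;
    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import oracle.jdbc.OracleConnection;

    public class ArrayInsert {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password")) {
                // In reality these come from readLine().split(",") over the file.
                String[] field1 = { "a", "b", "c" };
                String[] field2 = { "1", "2", "3" };

                OracleConnection oconn = conn.unwrap(OracleConnection.class);
                Array arr1 = oconn.createOracleArray("VARCHAR2_TAB", field1);
                Array arr2 = oconn.createOracleArray("VARCHAR2_TAB", field2);

                try (CallableStatement cs =
                         conn.prepareCall("{ call my_package.do_my_insert(?, ?) }")) {
                    cs.setArray(1, arr1);
                    cs.setArray(2, arr2);
                    cs.execute();   // one round trip; FORALL does the bulk insert
                }
            }
        }
    }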

Easily a 10x performance improvement over SQL batching.



There are multiple ways to load a file into a DB via Java:

  1. Read the file, parse the data, construct a query like INSERT ALL INTO table_name (col1, col2, ...) VALUES (val1, val2, ...) SELECT * FROM dual; and insert it as a batch.
  2. Read the file, parse the data, map it into Java objects, and build a list. Then iterate over the list and insert into the database via JPA/ORM.
  3. Read the file, parse the data, convert it into XML, and send the XML to a PL/SQL procedure to load it.
  4. Invoke SQL*Loader from Java (see the sketch below).
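
For option 4, here is a minimal sketch using ProcessBuilder, which avoids hard-coding a Windows-specific command line; sqlldr is the executable name on both Windows and Unix installations, and the credentials and control file here are illustrative:

    import java.io.IOException;

    public class SqlLoaderLauncher {
        public static void main(String[] args) throws IOException, InterruptedException {
            // Assumes sqlldr is on the PATH and load.ctl describes the load.
            ProcessBuilder pb = new ProcessBuilder(
                    "sqlldr",
                    "userid=user/password@ORCL",   // illustrative credentials
                    "control=load.ctl",            // control file describing the load
                    "log=load.log");
            pb.inheritIO();                        // forward sqlldr output to this console
            int exit = pb.start().waitFor();
            System.out.println("sqlldr exited with code " + exit);
        }
    }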

